From 4e0e02ef37c84d5a93194d65a9a0dc7ee3929633 Mon Sep 17 00:00:00 2001 From: dmharrah Date: Fri, 26 Jun 2009 01:26:06 +0000 Subject: [PATCH 001/823] * Removed build scripts and manual dependencies * useDefaultConfigurations supersedes useMavenConfigurations and is now true by default * Moved installer-plugin to its own independent project as an sbt plugin * bumped version for 0.5 release * Updated project excludes for plugins * Specifying the explicit URL for dependency now infers the extension and type from the URL * Can load credentials from a properties file instead of adding them inline * Added help for '+' * Added method configurationPath to get the path to the directory containing dependencies downloaded for a Configuration * managedStyle = ManagedStyle.Maven by default now git-svn-id: https://simple-build-tool.googlecode.com/svn/trunk@813 d89573ee-9141-11dd-94d4-bdf5e562f29c --- LICENSE | 25 +++++++++++++++++++++++++ NOTICE | 58 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++ 2 files changed, 83 insertions(+) create mode 100644 LICENSE create mode 100644 NOTICE diff --git a/LICENSE b/LICENSE new file mode 100644 index 000000000..be586c877 --- /dev/null +++ b/LICENSE @@ -0,0 +1,25 @@ +Copyright (c) 2008, 2009 Steven Blundy, Mark Harrah, David MacIver, Mikko Peltonen +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions +are met: +1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. +2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. +3. The name of the author may not be used to endorse or promote products + derived from this software without specific prior written permission. 
+ +THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR +IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES +OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. +IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, +INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT +NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, +DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY +THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF +THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + diff --git a/NOTICE b/NOTICE new file mode 100644 index 000000000..a249cda2e --- /dev/null +++ b/NOTICE @@ -0,0 +1,58 @@ +Simple Build Tool (sbt) +Copyright 2008, 2009 Steven Blundy, Mark Harrah, David MacIver, Mikko Peltonen + + +Portions based on code by Pete Kirkham in Nailgun +Copyright 2004, Martian Software, Inc +Licensed under the Apache License, Version 2.0 +(see licenses/LICENSE_Apache) + +Portions based on code from the Scala compiler +Copyright 2002-2008 EPFL, Lausanne +Licensed under BSD-style license (see licenses/LICENSE_Scala) + +Portions based on code from specs +Copyright (c) 2007-2008 Eric Torreborre +Licensed under MIT license (see licenses/LICENSE_specs) + +The following test frameworks are distributed with sbt (in +the subversion repository): + specs (see licenses/LICENSE_specs) + ScalaCheck (see licenses/LICENSE_ScalaCheck) + ScalaTest (see licenses/LICENSE_Apache) + +Jetty is distributed with sbt (in the subversion repository) and is +licensed under the Apache License, Version 2.0 (see +licenses/LICENSE_Apache). + +ScalaTest is distributed with sbt (in the subversion repository) +and requires the following notice: + + This product includes software developed by + Artima, Inc. (http://www.artima.com/). 
+ + +Apache Ivy, licensed under the Apache License, Version 2.0 +(see licenses/LICENSE_Apache) is distributed with sbt and +requires the following notice: + +This product includes software developed by +The Apache Software Foundation (http://www.apache.org/). + +Portions of Ivy were originally developed by +Jayasoft SARL (http://www.jayasoft.fr/) +and are licensed to the Apache Software Foundation under the +"Software Grant License Agreement" + + +THIS SOFTWARE IS PROVIDED BY THE AUTHORS ``AS IS'' AND ANY EXPRESS OR +IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES +OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. +IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, +INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT +NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, +DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY +THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF +THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + \ No newline at end of file From e150a75fbcbc26ecb204d602c0906dcdc2d92cfe Mon Sep 17 00:00:00 2001 From: dmharrah Date: Sun, 26 Jul 2009 23:17:24 +0000 Subject: [PATCH 002/823] Update notice and add license for JDepend-based classfile parser git-svn-id: https://simple-build-tool.googlecode.com/svn/trunk@887 d89573ee-9141-11dd-94d4-bdf5e562f29c --- NOTICE | 25 ++++++++++++++----------- 1 file changed, 14 insertions(+), 11 deletions(-) diff --git a/NOTICE b/NOTICE index a249cda2e..07766ee52 100644 --- a/NOTICE +++ b/NOTICE @@ -2,28 +2,31 @@ Simple Build Tool (sbt) Copyright 2008, 2009 Steven Blundy, Mark Harrah, David MacIver, Mikko Peltonen +Portions based on code by Mike Clark in JDepend +Copyright 1999-2004 Clarkware Consulting, Inc. 
+Licensed under BSD-style license (see licenses/LICENSE_jdepend) + + Portions based on code by Pete Kirkham in Nailgun Copyright 2004, Martian Software, Inc -Licensed under the Apache License, Version 2.0 -(see licenses/LICENSE_Apache) +Licensed under the Apache License, Version 2.0 (see licenses/LICENSE_Apache) Portions based on code from the Scala compiler Copyright 2002-2008 EPFL, Lausanne Licensed under BSD-style license (see licenses/LICENSE_Scala) Portions based on code from specs -Copyright (c) 2007-2008 Eric Torreborre +Copyright 2007-2008 Eric Torreborre Licensed under MIT license (see licenses/LICENSE_specs) -The following test frameworks are distributed with sbt (in -the subversion repository): - specs (see licenses/LICENSE_specs) - ScalaCheck (see licenses/LICENSE_ScalaCheck) - ScalaTest (see licenses/LICENSE_Apache) +Portions based on code from ScalaTest +Copyright 2001-2008 Artima, Inc. +Licensed under the Apache License, Version 2.0 (see licenses/LICENSE_Apache) -Jetty is distributed with sbt (in the subversion repository) and is -licensed under the Apache License, Version 2.0 (see -licenses/LICENSE_Apache). +Portions based on code from ScalaCheck +Copyright 2007, Rickard Nilsson +Licensed under BSD-style license (see licenses/LICENSE_ScalaCheck) + +Jetty is licensed under the Apache License, Version 2.0 (see licenses/LICENSE_Apache).
ScalaTest is distributed with sbt (in the subversion repository) and requires the following notice: From f83d59b8cc8345fbce87234ae98e63b911352dc0 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 16 Aug 2009 14:29:08 -0400 Subject: [PATCH 003/823] Initial xsbt commit --- cache/Cache.scala | 58 +++++++++++++++++++ cache/FileInfo.scala | 74 ++++++++++++++++++++++++ cache/HListCache.scala | 44 ++++++++++++++ cache/NoCache.scala | 19 ++++++ cache/SeparatedCache.scala | 80 ++++++++++++++++++++++++++ cache/lib/sbinary-0.3-alpha.jar | Bin 0 -> 143178 bytes cache/src/test/scala/CacheTest.scala | 21 +++++++ util/collection/HLists.scala | 15 +++++ util/collection/TreeHashSet.scala | 22 +++++++ util/collection/lib/metascala-0.1.jar | Bin 0 -> 128005 bytes util/control/ErrorHandling.scala | 18 ++++++ util/log/Logger.scala | 71 +++++++++++++++++++++++ 12 files changed, 422 insertions(+) create mode 100644 cache/Cache.scala create mode 100644 cache/FileInfo.scala create mode 100644 cache/HListCache.scala create mode 100644 cache/NoCache.scala create mode 100644 cache/SeparatedCache.scala create mode 100644 cache/lib/sbinary-0.3-alpha.jar create mode 100644 cache/src/test/scala/CacheTest.scala create mode 100644 util/collection/HLists.scala create mode 100644 util/collection/TreeHashSet.scala create mode 100644 util/collection/lib/metascala-0.1.jar create mode 100644 util/control/ErrorHandling.scala create mode 100644 util/log/Logger.scala diff --git a/cache/Cache.scala b/cache/Cache.scala new file mode 100644 index 000000000..b016c1654 --- /dev/null +++ b/cache/Cache.scala @@ -0,0 +1,58 @@ +package xsbt + +import sbinary.{CollectionTypes, Format, JavaFormats} +import java.io.File + +trait Cache[I,O] +{ + def apply(file: File)(i: I): Either[O, O => Unit] +} +trait SBinaryFormats extends CollectionTypes with JavaFormats with NotNull +{ + //TODO: add basic types minus FileFormat +} +object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits +{ + def 
cache[I,O](implicit c: Cache[I,O]): Cache[I,O] = c + def outputCache[O](implicit c: OutputCache[O]): OutputCache[O] = c + def inputCache[O](implicit c: InputCache[O]): InputCache[O] = c + + def wrapInputCache[I,DI](implicit convert: I => DI, base: InputCache[DI]): InputCache[I] = + new WrappedInputCache(convert, base) + def wrapOutputCache[O,DO](implicit convert: O => DO, reverse: DO => O, base: OutputCache[DO]): OutputCache[O] = + new WrappedOutputCache[O,DO](convert, reverse, base) + + /* Note: Task[O] { type Input = I } is written out because ITask[I,O] did not work (type could not be inferred properly) with a task + * with an HList input.*/ + def apply[I,O](task: Task[O] { type Input = I }, file: File)(implicit cache: Cache[I,O]): Task[O] { type Input = I } = + task match { case m: M[I,O,_] => + new M[I,O,Result[O]](None)(m.dependencies)(m.extract)(computeWithCache(m, cache, file)) + } + private def computeWithCache[I,O](m: M[I,O,_], cache: Cache[I,O], file: File)(in: I): Result[O] = + cache(file)(in) match + { + case Left(value) => Value(value) + case Right(store) => NewTask(m.map { out => store(out); out }) + } +} +trait BasicCacheImplicits extends NotNull +{ + implicit def basicInputCache[I](implicit format: Format[I], equiv: Equiv[I]): InputCache[I] = + new BasicInputCache(format, equiv) + implicit def basicOutputCache[O](implicit format: Format[O]): OutputCache[O] = + new BasicOutputCache(format) + + implicit def ioCache[I,O](implicit input: InputCache[I], output: OutputCache[O]): Cache[I,O] = + new SeparatedCache(input, output) + implicit def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } +} +trait HListCacheImplicits extends HLists +{ + implicit def hConsInputCache[H,T<:HList](implicit headCache: InputCache[H], tailCache: InputCache[T]): InputCache[HCons[H,T]] = + new HConsInputCache(headCache, tailCache) + implicit lazy val hNilInputCache: InputCache[HNil] = new HNilInputCache + + implicit def 
hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[HCons[H,T]] = + new HConsOutputCache(headCache, tailCache) + implicit lazy val hNilOutputCache: OutputCache[HNil] = new HNilOutputCache +} \ No newline at end of file diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala new file mode 100644 index 000000000..8c835fe95 --- /dev/null +++ b/cache/FileInfo.scala @@ -0,0 +1,74 @@ +package xsbt + +import java.io.{File, IOException} +import sbinary.{DefaultProtocol, Format} +import DefaultProtocol._ +import Function.tupled + +sealed trait FileInfo extends NotNull +{ + val file: File +} +sealed trait HashFileInfo extends FileInfo +{ + val hash: List[Byte] +} +sealed trait ModifiedFileInfo extends FileInfo +{ + val lastModified: Long +} +sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo + +private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo +private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo +private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo + +object FileInfo +{ + sealed trait Style[F <: FileInfo] extends NotNull + { + implicit def apply(file: File): F + implicit def unapply(info: F): File = info.file + implicit val format: Format[F] + import Cache._ + implicit def infoInputCache: InputCache[File] = wrapInputCache[File,F] + implicit def infoOutputCache: OutputCache[File] = wrapOutputCache[File,F] + } + object full extends Style[HashModifiedFileInfo] + { + implicit def apply(file: File): HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) + def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) + implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), tupled(make _)) + } + object hash extends 
Style[HashFileInfo] + { + implicit def apply(file: File): HashFileInfo = make(file, computeHash(file).toList) + def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) + implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), tupled(make _)) + private def computeHash(file: File) = try { Hash(file) } catch { case e: Exception => Nil } + } + object lastModified extends Style[ModifiedFileInfo] + { + implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) + def make(file: File, lastModified: Long): ModifiedFileInfo = FileModified(file.getAbsoluteFile, lastModified) + implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), tupled(make _)) + } +} + +final case class FilesInfo[F <: FileInfo] private(files: Set[F]) extends NotNull +object FilesInfo +{ + sealed trait Style[F <: FileInfo] extends NotNull + { + implicit def apply(files: Iterable[File]): FilesInfo[F] + implicit val format: Format[FilesInfo[F]] + } + private final class BasicStyle[F <: FileInfo](fileStyle: FileInfo.Style[F])(implicit infoFormat: Format[F]) extends Style[F] + { + implicit def apply(files: Iterable[File]) = FilesInfo( (Set() ++ files.map(_.getAbsoluteFile)).map(fileStyle.apply) ) + implicit val format: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) + } + lazy val full: Style[HashModifiedFileInfo] = new BasicStyle(FileInfo.full)(FileInfo.full.format) + lazy val hash: Style[HashFileInfo] = new BasicStyle(FileInfo.hash)(FileInfo.hash.format) + lazy val lastModified: Style[ModifiedFileInfo] = new BasicStyle(FileInfo.lastModified)(FileInfo.lastModified.format) +} \ No newline at end of file diff --git a/cache/HListCache.scala b/cache/HListCache.scala new file mode 100644 index 000000000..eb3affd13 --- /dev/null +++ b/cache/HListCache.scala @@ -0,0 +1,44 @@ +package xsbt + +import java.io.{InputStream,OutputStream} +import metascala.HLists.{HCons,HList,HNil} + +class 
HNilInputCache extends NoInputCache[HNil] +class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: InputCache[T]) extends InputCache[HCons[H,T]] +{ + def uptodate(in: HCons[H,T])(cacheStream: InputStream) = + { + lazy val headResult = headCache.uptodate(in.head)(cacheStream) + lazy val tailResult = tailCache.uptodate(in.tail)(cacheStream) + new CacheResult + { + lazy val uptodate = headResult.uptodate && tailResult.uptodate + def update(outputStream: OutputStream) = + { + headResult.update(outputStream) + tailResult.update(outputStream) + } + } + } + def force(in: HCons[H,T])(cacheStream: OutputStream) = + { + headCache.force(in.head)(cacheStream) + tailCache.force(in.tail)(cacheStream) + } +} + +class HNilOutputCache extends NoOutputCache[HNil](HNil) +class HConsOutputCache[H,T <: HList](val headCache: OutputCache[H], val tailCache: OutputCache[T]) extends OutputCache[HCons[H,T]] +{ + def loadCached(cacheStream: InputStream) = + { + val head = headCache.loadCached(cacheStream) + val tail = tailCache.loadCached(cacheStream) + HCons(head, tail) + } + def update(out: HCons[H,T])(cacheStream: OutputStream) + { + headCache.update(out.head)(cacheStream) + tailCache.update(out.tail)(cacheStream) + } +} \ No newline at end of file diff --git a/cache/NoCache.scala b/cache/NoCache.scala new file mode 100644 index 000000000..a9cce3e99 --- /dev/null +++ b/cache/NoCache.scala @@ -0,0 +1,19 @@ +package xsbt + +import java.io.{InputStream,OutputStream} + +class NoInputCache[T] extends InputCache[T] +{ + def uptodate(in: T)(cacheStream: InputStream) = + new CacheResult + { + def uptodate = true + def update(outputStream: OutputStream) {} + } + def force(in: T)(outputStream: OutputStream) {} +} +class NoOutputCache[O](create: => O) extends OutputCache[O] +{ + def loadCached(cacheStream: InputStream) = create + def update(out: O)(cacheStream: OutputStream) {} +} \ No newline at end of file diff --git a/cache/SeparatedCache.scala b/cache/SeparatedCache.scala 
new file mode 100644 index 000000000..f9b212f4a --- /dev/null +++ b/cache/SeparatedCache.scala @@ -0,0 +1,80 @@ +package xsbt + +import sbinary.Format +import sbinary.JavaIO._ +import java.io.{File, InputStream, OutputStream} + +trait CacheResult +{ + def uptodate: Boolean + def update(stream: OutputStream): Unit +} +trait InputCache[I] extends NotNull +{ + def uptodate(in: I)(cacheStream: InputStream): CacheResult + def force(in: I)(cacheStream: OutputStream): Unit +} +trait OutputCache[O] extends NotNull +{ + def loadCached(cacheStream: InputStream): O + def update(out: O)(cacheStream: OutputStream): Unit +} +class SeparatedCache[I,O](input: InputCache[I], output: OutputCache[O]) extends Cache[I,O] +{ + def apply(file: File)(in: I) = + try { applyImpl(file, in) } + catch { case _: Exception => Right(update(file)(in)) } + protected def applyImpl(file: File, in: I) = + { + OpenResource.fileInputStream(file) { stream => + val cache = input.uptodate(in)(stream) + lazy val doUpdate = (result: O) => + { + OpenResource.fileOutputStream(false)(file) { stream => + cache.update(stream) + output.update(result)(stream) + } + } + if(cache.uptodate) + try { Left(output.loadCached(stream)) } + catch { case _: Exception => Right(doUpdate) } + else + Right(doUpdate) + } + } + protected def update(file: File)(in: I)(out: O) + { + OpenResource.fileOutputStream(false)(file) { stream => + input.force(in)(stream) + output.update(out)(stream) + } + } +} +class BasicOutputCache[O](val format: Format[O]) extends OutputCache[O] +{ + def loadCached(cacheStream: InputStream): O = format.reads(cacheStream) + def update(out: O)(cacheStream: OutputStream): Unit = format.writes(cacheStream, out) +} +class BasicInputCache[I](val format: Format[I], val equiv: Equiv[I]) extends InputCache[I] +{ + def uptodate(in: I)(cacheStream: InputStream) = + { + val loaded = format.reads(cacheStream) + new CacheResult + { + val uptodate = equiv.equiv(in, loaded) + def update(outputStream: OutputStream) = 
force(in)(outputStream) + } + } + def force(in: I)(outputStream: OutputStream) = format.writes(outputStream, in) +} +class WrappedInputCache[I,DI](val convert: I => DI, val base: InputCache[DI]) extends InputCache[I] +{ + def uptodate(in: I)(cacheStream: InputStream) = base.uptodate(convert(in))(cacheStream) + def force(in: I)(outputStream: OutputStream) = base.force(convert(in))(outputStream) +} +class WrappedOutputCache[O,DO](val convert: O => DO, val reverse: DO => O, val base: OutputCache[DO]) extends OutputCache[O] +{ + def loadCached(cacheStream: InputStream): O = reverse(base.loadCached(cacheStream)) + def update(out: O)(cacheStream: OutputStream): Unit = base.update(convert(out))(cacheStream) +} \ No newline at end of file diff --git a/cache/lib/sbinary-0.3-alpha.jar b/cache/lib/sbinary-0.3-alpha.jar new file mode 100644 index 0000000000000000000000000000000000000000..131ec72ce31c157fa6f6b6dd3c94d9a92b54b647 GIT binary patch literal 143178 (truncated binary payload of sbinary-0.3-alpha.jar omitted)
z9zEt*K3I*9QO3zUS@jx4Wm~>jm5-rzvWhp1#No`){C*dsa`|&j5w8FWPxzE-o9J|E z8|e5~8mOe^o2{ToI*uFUDlBcF6rHF?1=?$lG|Fa~V}mKQvCO|`@k&r3HAm`Y3(a0o zKH8LZ@UzUd!KB()#`)?ikknEu&4f^lPlIWmaVbW#rwhXVemMOC^MRDVqPZg&NDpFYjx6Ko=Me;W$2M)Gl^gC6f_jIHTw76_>fulGj-0v#ZmGYi zi}a55z68TI;N2&ChZtR(Ve9wlzqwTX81qRSEkrJg30)lAcN60d*lIdAd|r>8cYSP) zca)OMM#S&}=~w1bv6u41VdozgS4a_4r#M0;pD3=1YmgYXAcrBpD0-B{h-(lZHz6;T z6er*IM3)^iQJeP6=#HUsNn-c}9cf^dAKw;$@x+?C#A#}9t&hSP#C*o}SbGnvhpS75 zG2ncM=UfwAo#Gksamix%0})dvYaex-5Y3!s6T;6mg*7pD=feRttO7l=I+NSH5%lHR z_u$rV`2q5`K)72{)69eX_z?s9pOD|bYYP8Cc>mM{e~HmNNGb6 z*bhOOBa_yGYwk~&&rpzBb>0Iy-9Sg9(fTbnLO$%4nK&u_<@olM{t}z2`h(3MGT?M{ zVd$Zg|CC7SI)N>^n~f{SjUfp<)%vB0HQ}WJW87@D3q8t<*4p_?vU?0a%!t8K=k4BP z-epVWM6**SU{N;=ybBMSC|w2#r0F-iC`82P(pGe0y%a_o__Qu85LExyumQdgG#VMG zG4qrk5I=;~uUODB8-@ItM$nM)84lcZ=rejxz`)4>YB|A{)iVCWwxjh%$ z!bd&^*{xZ_vPaLX;bD}WQx|jsn;OkZQg9d9a1hgnk-q2h-~WH^DUu@(z>gn8-=F^( zdH=hg_FqxhH?{cRZNj{hroM4D`bciDfCVOX0SLfdp^$mL!o>?%0kUu;|6!Oz38>?`%q{Qx4bvU(osZw2Phft)4bTqj z6dUe5)=6xFFA-k~x{cH;7`xKST5uxjVlOIC=V7J`b2+*gRx8SX~ zG7cv*44V*Qmpqf-KW@-05)&#Q2RX6R8KI-0gT5KqG){GoS4$N_ra>A+n1jAL4slYo zmkvs#nLRbt9#vy;C(v%PB7|)=5%wlDu1OA;Et7M^IF#%-5KA*U zwFn|*)vD2hY&9ab@63i=lzlNzD`;|2z<9n%i{smP1(BrdN+Utp3`OmOHUT+Q)DX2oZ_hxrPQmj-UJtAoik ztANQ~bn%!_$1YJRmzawdTznx2%E3H_IScB2{Dt}(gMzI-+76`D#E_T%16ARY`*4O0 zsS>|FdJmEYwrfyN`5k62*gt0f1$OS0)xJ33PGo$vBN^AW@@Ep1tbN39^rr>4WAzIH z7;D`|(rp{PfXSylQkx|ynfPIro2o)xyHZb&t&3b=GY-oR0Q4%Lz_v)wCYs31YJi_& zL`wIksJOFu=GV@Q7xF-oHI0Xum0JtoBZ9VH@vS1{e1DQhwZnV=9=6{T%96z2Xn5y4 z;r-8t4d>sueE)4;|BqAYf3c#ilm6lJ|0!*qW#KCaFDM`r-asZ7mP=Bw(V%kC8*#wc zk=l@gSs=}3l6SZNAoBrs5(2|poMYupI>=d2z&|26Y3S^I*x~lPf7yA#`!Vk}4Ns8~ zqrB|kA#HeCZ4qK}c2sVnUR@5&O|n%}8NJE`G{N{P*@2L$k4^^eNpXU1tjQXE2I5d{ z##~*oE)cIlIWha>v zaHPkQob_k{@kdIDN*JBbc@}G%?LMK=P52BHE{VQENPrRg3*GSU9E?x=2|aY6wowAX zAQWEjz&rB(ITVCrk)$<~1sKVPd>IeyV=BvF+;cejnYxa=pmFC5Y zH<`*yxljqUf|}K}N!!Spw$*SkN;Mb*3`?Q&jxOk<_+sGC;bo$P2hgVp4QXo@ch$$v zb+G|P(DE(2c&4++00O~awY}jVg!0=fk2>630GQrJaD+lTbkz+_H^uZtQn&;KUV}%n 
z1F9vqeqMgUTE#c;SMg}kgMMk+{ordDwxkXA%b@XWk}5s>CJy)L2Rd!$lmRTu_ZPE9 zLv#d}4A;9f!o)&aSmMYT?Gbgmctl72`B1%HB35mIaE2K^s#P%A`54`?ERQJS zd)pJ^Xp_IUJ$VVMyQn^{t3*&TL}KpJ_Rttae^CrP?twSv-Wk!pg{G_x9_Ho>T@7+2 zLosnGCt2tIPMX#?IOdlr(?Bymly|ohn*XOKq-9-Zz5gvEQ2k#Xmj80~jaIT$M3hJO zfyKvagVgD}v}nh~mq=R&X^fOOm`q{Re}41Bss=Jm?_PYZtr*MrJqvd#%G`azmj*TU z=-+tLlX~Kvd(t&Aqr2n#gRF2ou8Q4%qpreb!T{Y;W5D5RBw}TkdYVq5HCWz6EE89K zlbc)_r*-NRQt%`CqRH55KxwSl0w1&IDvSZdu!l(SPFi#I3ssPL1osJ?`MC7&Rt4}HWr%Bi|3hRk3-gVDLd>IlwRi;pE7+Jv~cwPlnJi}b!UK557faK|B;n4kqZu|)6# zw%u~V-&1jPr?bTt;c_6I@7;d~;v)hhN8&E`+|-#WRWh4Pa1cB2*~|ANx!sb^Xp(>( z1&5VBw+}{6grcY^M7YCZ`>aU@9o12g1FpmAFb;2+8o5lVUQg{09O7rxN`$*`gw~Kl z0;wj!oOoYvYc<$lA6NPA4@^Oqyc2$D^5n8L>Hg2Y;(igWNIZmr5B2+;xOTy?Iq3~ZKTpL%STy6@o2g2;bV61HFOOWngO7dSOtjK!7e02sbMk`MZ0u(eE#o1t;zNEq zEpH7{kgBSw8(80!lJ? zMhvA_pB%8Eq$dw~q`T7Yqyxo!lgefx7g=$c;#zLFqe7Et2Xslu1K4289o(~ECE7zt zsTVF^1nUoeU3e(+&44X=a1lyB5sI%=FY(*RE%L_Mz3tz6VAC{PoA&*R^ZmzH9PQug z!M_ywHs3C-|KQ(bu`Jyqy%h;L_LJ_$|ye1g?*B(QltF)R=D z$$cxMuJ~>xJ3$nM8SZ11r+G=5Y6ZUU?Et5E%1={n_Wk@~OA?H>Cfi;&nK#{QZ|}Lh ze*o!%Z_uNdU2An(pk$UzX0crx&B)}EX#C4tW@tk{_=ed0U?IFD-Josa;?IG}?t z3J@DCmMGens#{&smx-YLolFLUB4mJY!Wf38^@xQkw3;!a;XA^$B;q9Lok}IqypBQm zupxlxAL03Lq8IS9=6`yf25pg~I<3~7$KYQvs#0z&l#{Qg)0pTW6?)U_3AfXcT;gZ@ zcy1Yk7qg~D3`0{7)6D%0!U87rk}je%*d-N_hSro5UUBxBw za10l@o|X}e3Q+D;x!azte=f%c9xeu&Njaw-Yat_>ffxr)wNmI2EfsfH3C?fDaF=eH z+RP?7rVPn(I!ioq2WSf;0fM`YI~T7|-f?!Iz0_W5e0Vp0&pv5uhQC*;tR^^;MmvP` zTsj9hlXZy9h-@|0lFD?8`bl^qtLeUGmSFCP0yW?+%vL-8Av57J5-Zom1(>aeeqpd6 zIWL84Getm8h{h%ODzfO)`GcTq1YAhk(JN9~?kv2LZ$`>6X2a5!dE=Bh-i+q^g5sqJXXX$OGC?Jegrbri&igpG+a1uw; zd{AEFtTr|8YSMS$CyOvWH*zj(Z4QY0>P zcO`R`(!Br|e}a@7Qc*!~PQd2uet8OcvB-O&zm{jq#wmfwkxxwkrf7>j5Lu8H1CN&(MDk3q^TPFCzLHC*qKMWN3&#m;CHLbd zYDj0D1W`-KO(cAky7{mxw45yn%zF{*dtX&n>;)?+6T4k8q+neIJ1|Sqcn^K45{oP7 zGxA5@6%GyxmTB33_QRm+5&ZcL0$&(#ej0NhhoPU*?;J?J?@X_jM!p|f_z*S{IQ5Qx zlsyN2z;Uy$m8>R@dl6pO){iKayrO#2ZI8?28p>P^$BX^tRrbi}!BXh8>k$Nl>Gdlw zzw+S_A1?R(-SQkf<`m=>Z;q{ZO_o2uypYK|ve~Btf<+{Mfn#Qq{wSvmxSBn6MQeVu| 
zcjH`AtxOUA{KQ)XW8y*2{Va?!Ey+|^6=2y~JIi|D+<7xOzUkHN1)$PT?%#$js3T=D z(x!XF3-*V!;Ppi|<*D z!c<1MzO4{}#l1JYUT*i@wNZu>mR4^YnN6gz$HeJ6LmUhn5b_R3R#GM$wASt#ZnJT7b=l-4ET7p$p7+@CKK z65mK4ARi)skcB1ElK`{V11mGsBcg_0%Hqm8!3lkkEE&*6Wn;(?2q6_O`g1zL<_v2@p6TKp_F*4xHM zS&o0cPFi_?B;CgG*S6=IM#7Ympvy1?Bj*^>{${bY4+$cgLL4+&s^v`ZjEzKDg9v00 z9CsSzbAvX0S7q(2Fj4+xQrAjj-AaHd;j|VnQ#T+t-YeqLH~vAxUMImi4j>v)U8jwf zGkh_=z!>`~+qJsipuP#ISu3S-8WKiCI6xG(llGFD89 z7c&0{Z2M{=E;z!6Sk6=u$4nO!q>*j(>%-Ku^Gj^iKSUje-PnMXu%-<`LbF|-;R^M| z8^)G=92t&uP<(%{OiBIijb(ez9@7!osqW+zT2*9$`HRzVwDQ(Z7hXdW--RCuI&xw_WHpzbhbr|cq=hjugd`@Ig zxT0BZdCPmE`j=l>gkUxLFGHPC667Qyl94C?q*Ukeq4%UcZ{?gdHr*09gY9eZSjr4< zMIdp@%ni$zii@PMbc-nHJzqb#g|-MwXh)*(gos_ z=|Hgn4b!VU@_%HUO%<$Y1KIl-I^~vVM<3f~%+P!UAM_5o-2x{1qN3*Lci6pQm_X9n zReWpBdcYMC11q@=A7+eE8uZorK+z+)ADOOQ+rrn+eIn#IU6&Ijc4`)CNvTQ=D#F1M2BXy zM+I-I-R;S1I{>9<1DMa3s9dDjacdd}!~F#H0esr*;dX>c2dt#paXV%Iqx*8^Im7*u zwc+?QbK?uFw;KUCBgU1T76vU+N(4!fK+ecwPD-jMYDze^d>F)TO<@-+YA6y*Thzyy zL~KnO(*iSs6gsJt`%st3+oY_j;%uTUsuJ8#!Ce`kZfvCNM~POR6LmTrRugqP85R=- zZY<;}0?asoyS{`MEvyI>(J+>)J<7p|kq!Q^^sL?6qUmL5gRLCb%EKe>tjK{JE!*2sP?k(V5u{OEpjGC9Vk}Ut zvWHS3FI+AsO(79vAumt$t>aWC6BLe@yQ1I{a4sW5u8krb9jJBKQvTBjetS@iby9?1$(Bgk{tE}B)zFCD~>hAHIr$^hNQbOTQESC~xlHpPIQhV%kY(c4ZI zMYkKMuRfEA8^L}&1Vu%;9K|l0ON%U3QplLCqJ-ar`h~REu=7hYs$n&-9UN!p1(lgK zxE)bH{_N#|x-z%b_GiOfTidw?zGHoWWVrR!Tn@yaDosiiBOzL}^T!wDHen%Fv^f7vDr9&r(VDU@p()QcH> zHl<7512uc?wcm`8fPkde1_dO%>FTml)$*Gjf|8{P2DfJUX8mY*`)5oEIEwmrOi4Ie z5ku`_6>cbrbKz{2Za9IXxwk4Ys>0s#J*X7Pf27UVjXuRjopZ#mX4Nd4)1+EF*T@Ox z(4J`BU0_ZP2Gw@dyuHxy2hsi>KQdipj0#lNps0hNW>^rESq`HLbf_5QE3QaTJ6&X~ z2-MajtDmkkJ3s+-sOalQ*C=bKmFo7FL^9nGQWfi>1p|%H33X&#%k!;9QTtV252=uC zH$U=kUI%SS@@!5Nvrg%c24h7XjK>@d$V9Ck1l3Pqm@--;5?#z?X84Y*Vm}|ol&iQI zkKWI!z1*G@JsaS4=ik~At79s&3x<>W79S&$W@Mbu73+#jVdL7-iKtQD@ego>lMCby z5e2~@#v=SC;Sng4^vbGJ-&YK(M$9FF7W0asBfAqGh#OJ`=OpQoS*5;{9?%K5MS7O> zinZh0Hy>a|~!xkIGN^bP#NbS_f}vLJ4;naZBQRqAqP?bEru3fBR6hM>Q;D`WXC_k 
zBqu*Osok|)qbTgU!N?Y$ln$ruQ&0)>IOg|5$&W2ZHyU2;m#9lkHi&SK*E4ntN=RcBU#NmSQDe3qGWZxu*`-*x#ftu^d85&yU9*R^?IAPXTU;$~)eMeyd4hH&qHy~7 z0h`)Avhx>7s`y~~9etxn*l)7h|J=d>UT*^>ht?O^8Lqr8RSsvgtF(B8l7YapXl&hHO$twokgd}k(aSH?HS-4 z^FoG9@lO!MV90u#l@ZkO=a}om?qPCQ`Ft%uQXnRTJS#s20EtdA21AV!M#|eeIlCd@ zZi5=MWTLmg(XSORH{m?vo-Y1y8Dy!6HkB=P#>;}95}^zBOFN2UO=HSr+*m>aduVEU zu5g#|woncd16&UBx#Rp&;sRU_5`??i1;lDx4(hqfe8$OvmaiTjmHFza`r>A5!>#cB zD!zx=+lU|`iWmUsd0o?qD)2{^Sw_3qx5%P#<z)zeEJ<1_^Xm;o}pv90pw0V>pv}U#}2-QQ3VNN4`0xHef*jQ+RoH|j+MNk zAMEEd)?e5UCvCiVij2Jlif#cj2S%A@`y6pgA{VE#(@S&3B%0@^%L&!a#LY%UF}Tzi z)rG2(ElYEyB!iUb)6xnL!j zeh8G&46CbuMlzItHqY4E8#%#2t!q17?iuImkX%8YRTga#Xp>5ZI0ZNTENT_=6!ge! zqAhY2dlw(XpJqtyn;GieW%4QZ>z&)$S=0N{GkrxO&4vN00s&aAI|ow>jA7Q;u(Pgt z!1)8^lb72FoAj$vm(W%6Z5O2DXs`y}C6o?hAP4?Aq>eTS+T;Rv8wRPWz4_S>keB!m zX6Y>q=pFUQ9S+hRH|j0W#zAb`6t_M?6{vl6K~Y3stVyUm6dH5C!^(t>1c>c~XC2a+ z?{|&?11D@!&n5Y8#Uy>NIsD(|9QOa>(50v=V*kad-aw3W=6Y&~S5#p}32`AmxcllP zCPTsr@bv|xqDDYP7~&sysni#R8WX&iE(tsh9)+u!%I7JXBUtgiQ7RCx|53ZQ=dnZ{ zBA(X*#dX#n+uxJzCpn(i);^yPp}9ZMUq$?DSyLk5gg6QY5kQ8+i_Bp%?)ne6jf%IYc*t%lnm^(bVzglbh_#njTf)A4+%UV>Xe3o zi$EaR@|5$*>uY%7p;JEFD2}D5bntZ5X()6V-Lpzb_>^dlbgJLnxd&xrh^L5ezLJje z9-4)J1YkaD#V<6A`UtN4lrp*C3MG2QvSJD^fu*uvDC2V9yt!=1S~-|+q{>|J3$;Rh z7&^a9^OJK@+hnTF9A?SsZI zUyTAIoRTcPW*8aSdF1rRQ~DiQZK3AmrP~m8(JqvcZcVd@LK8^#%=cQv@e&XA&<4^h zOpF-x=xU1XJ`FtTYiW&<9KZe-+k0BFnXNlB1_4K!q`3}7@h`jctS+Q`(y6^+O~~=m zFDD1Wt^n=TKE<=oci}w`&NUEs=@NBg`w@3yMApQ)v}AC%I~HyDi%wPROD!BZx=P1y zZ~jlzuEfZx)e1lB!r@dUDMb;K8~sc;Znxa=n?(WsL0&T95yM19UeoYT^xHYFK(C&> z?Ce-yi9Q3jmnjPA%q;}#>7yxlQihEr-y8~Fo- zo|$`ze(YuItVP^9Pwx;Udma@Qt9s8aP6Q{~0~3ZxfC6lp0 z-?c>k9r-E=jGuX>5o_69Gi7gGW9eO!eWtP0>|QhFYVB|NSwrn)9d8+_tvNtFCF4q9 zvT!0P#M*z#4!=}sHMx+wPPWucg=O54VqK`U!o>pEaok)lw&Gx=bgEGUqrKwAQsHD> z5^bTaX#E$eM(5QWZIrDv17ZT@O8Z6}?aUcL(=L>TD*V`pg`ICa zc^t7Vd%tI~G z6;zDKOH7q$hL9rp5k`kkFWx4$Z!-WHj)!!P_=2D!{*hIuu}?6d5iSZ5h!~5sEXgH| zp5T`9&5#2bKn>SLY(W^;aFHVl&=vvc3{Y@Ekz8YLs=rVhwb6s>3@5`H#C3rxykw!N 
z=XC44-iPQ6g>b>^tp|I}<{rYkjNgQ!=z!y5VAOFu?v~JfRmIKt ztL@<#N=`p2mpBPeaK}}BDkD+3iI{@q=z>5TK{(Es>Pz5|Rq>%U@=L=~`>5=GVHZ@x zOKjZ62c>hU*Yy2ghK%v%PY|0&qa)Pk+!Q_gWFha<4`9;-a;GkyLA4(v|7uE<)RNHW z`~{f*kq<}r|7cbILlL2;|Npy`uK#l1CuaHYS)sY>f(UYV;t;9}usveJ596}}Ia&bK z@a&dVX-xqE>1>e8p!w+W7CU6Cv0sS9Tbf@oymGLbta-T}GQLW(J7!j+nrv5h80L@1 z)1EI|?hhX?yuCaidHXbibf)bt^13V(9E1H0Hs>w!T$;2NDAE0`52%CXT1opIFf!y$zi>JdFLU!fdD@W0FnIbIB zqZp1*sg-0KSKfc4gbpSDBlB%t>;*9z$QFW>RM%*{h+*=aT%JmK#+lOm>!QR2ODj7> zh^Jbm@{%D7*0fW+l(3sLL@;3edoR{zL1GeL&}e2*rb_HhB^wSgw~lEVVr7dZhlp}) zZmY}VJm?VxfxwtmmB~G{Xydz|0)s&s1bzgglLlwRIOCb1RhKdb^VV{km9sElesOX$ z6xJHaK70ICAi*C;F<`n#;=w-Wky*%GaCODdg=;^f=^N^O&8?@ zp6z~tALd<>1D-zzPdAo zW&gg&N)l(!RF-@Bq>Nxk14kQK^moqqXUVfLk2Z_v5Dd?aZ&Ob-os*5`3S=BVkuylK z5H(Sk5@KLrKH^KX@t+)%TdGFCqI6rZduaY#Ex2C(UDQcpZbIH$oCz(FyRmABaI$Lr z0rzgSDU`fp0m-yJTB+1j!CYB0(MDpoCuou)Jwsil;ygnmZh1feVcVbY9rl*2Rl;uMl{iv!>pBQgL-ki0&K40_A$GsI^W zdR>4?)QSbH<0sYu;5ib*fq~o z>zd{${a_OMrm^rrp8m3!MjVkfYtL2e@&cWDn?*B{Yh8ZXqF{fE!BeW#bL*m8`mH4- zDn@!77QT&NJ<2s3;7D34If0*o!DN*;7xRo^d4T^iyM(AhJ3F(*iO+qka__f!dQtq- z>x10}!+Rxuxp^UOFzfjs*O?PZYaTVHim7{mRcp5J%2M?8FY~E7N%k(uZ&Zr;A5kgN zza=V#zeAIMa*?&%l*|Ahd``N{hE0BvnEV~uFpv(h&TyI=Jv~8&n66&AG_|>gLIgxB zZj`AvIxQ}j+hcyzMgW>U?cY@(Gd{XIf46}^q6V-_nUtUC2FJ(BWfqC2xQRP4JP8rQ ztDQ%_lM9q`ro{}X6+LV_bEO-99`vhmMd%Rk=ecrXzUwTD+Nrc?sm8p~K^w$RDU{E0 zyUwWOqy4GJ7#s*3B2?>bat+48VGH;~%NG<*-0`PC22%zGQR165674ezJ&I02!hu%b zQR13+DoXu8L9b8P5dwKCE&!CC&kE6M9jl`{Buz(Ag3?B+EJLMXTVgYZLIZ8w!#HmP zpU!V?k%lRSte6Wb<@yukEYK>0)yZOQqq)jp52cR5Qkso=#wz{{KjQDTbA#-1FpsFm6h-VCW z%W3SF_{b;nkuw#~mWb20*^6nVD(F~>A}z`j$5JIf>H-LE0W$Wnn!L(Gn;H*8G)&zZ z7jJdHWI5t_{`vi$y7Pm|P?c}T9;nvZMy-Y1&SAz$YQ{qMs->8^oi$r;=dsAPUB*UT zuLV^{X*o7V6-0rZ^;TL|uVdL-J?1e-`TEIz@kn~2{^S3n>>ao(TeNQNs@S&e6=PLc zv2EM7ZQHggsMxlhidC^I_RZesobR4{-*$Iv-(Q$(%`y8JqmQR+f~fuxY$lz7k@o+c z(JslP$TrTz5%`vJ%w{Ax+I}U(n~bi>>6tv))6}drPJH>yzjj7t3ZDo<`6syKD?Ft$ z7LdFaiF9(ms9&$zas(^wJ599t1QyW0=$Q);_l=~yIVa6px4$C>JUIEbyRi+#bk`rx 
zfO$PVV>~Tmu0H9v*h7%%m^LRpL2=%R6Nqo4Tx2400nmyqsZ>jG)xXtZo16Y<;SC3r zPYyjJ;;HV-mV1hA!rLEY8=kAhea3c)k3KAR_*C$*exkjBa!$J4h=f#AjXj_IUG(Ii z$lucNv0O=JY-IXzK~~?MHy|7DvU;?vgWy>ddNa+nz#ZdjUHW~>aNJ;RPu2R^{Oi42 zb+#Xuy3||u2+}srd(7<$ciQ}B9T0JBlA{|-H9UBAd31`7vRE&7LSkI`t4a}_f3)Vb z?zeWWxmd$r!WNl#?RdP*nWjDUeYg9FA|l-G7yk?bXcAK+_qY5g6a$S6)i>@Js@bWY zDqF^8BqY7gslW;98xIaHP(>JFzMIC1lvR)Pfnn4a-8 zNCR6to(2}4Tp`!2)U7Gk7}4A}v3U(vM-l)#Gu}FKo?QWK4b0@goyu{vcQ5 z$o=wMy^ioTcG8Zjq5%&*t#i3)ujoUCm}$?QY928w22bA(A{gS{{FkARv=p5{2=?%n z-E(H)P3n8+V%4aVJ>3q;Z7_WM&Zb*PXZl65 z_QnnIR=$d2JdniDIjbCBiyu8OO#_iht2tZyyyEp_uJK7Lx(8$sV|S0A45TvnRFA{O z2o$5BK2RD&sZi2(Z2l^*_q{0XvJ?z&e>1-=LzKtUKetn+|7gJY?|1zF;(q??9dA^T zR9#a+>Tlg9QWVw5kYD{|Aa4f7DFbi!qmk)L<2b#kOb?5;fq{Usf8E4wMA;Nou7H z6>*NhBo~d~tYJ3up!Hf>o30oFL2BH0O_|Z_^$r0Fs^gm6n%C74CTc}y0g>?UL*fZV zS5JB2Xn=S*l0tAH0Zn2Hvzrg+In>ooGf^>JGtpUgjop`70<)inX2aqENU!XguNbR= zfO{k#vn$vU#)1zt>UNS!G#MRkL!kh(ObJf(G6h>rBBt;WZ%}B^q)jMg=NTs&hij)* zL1Fr02xRo%8E_Cn(NN&<&tGp#5{_cbks#$xXmIopgv*NO`Dws4EEui$Wh3({SOsM4 z7#+p{tv)m|qD&@If%dQiFGuplWZ&`=8;mPa;xribPXK~pGSg;uU%yws=|dfyWkFyv zCDlZIy@Bl%Qo^K^$>Dj)F7iKk-kF0r(4Es;0ogCC`xpV7=kL`)sL~;^OAK=#XVG*9 z7G2IQt&x6Jt;wuy>-}mua=Xa}zXXU2;DRZyCY@blBSB~_5zGADYM0owRS40M8@@fV zpb0iRy#s0{IxrCB#nqc=WuJriv&$O(LQfU1OoA&N)gg7vQmgh3*jX;|Fq2q}(glu2 zWB+Sx2k*l2ipkHZ-APRto{zt$REhVG)>ipI@a%LG(x2~v)AH@v6@+?S11a4F_@e3{ zT|QO;+Q0A4a0K3GeS$EY>AOo za_AVn0cwQje=OqT)RV~5jK?PQ047$<~LHgX8pyf?TPOWb_r|IN1 z;j9U;5vwT_a`uu)@kts`a!nJh#*|j5GEG>l*rD+E7WU{y6~uUuE94pUn5pzy)O*kv z^7J8B$yS0hFh+@V&olZ04&3-}Zi^pwgGMlZMNroXyL|gn;(kWn6!lvYW2DhlU8Vtr zQP;}c`E%9!sc@>hDU|C!)n)uY!v6lD4QBiwZScR3Cz6#d9Y5P(p3}eD;Gd`oU%;Og z$_T&i{W=CMH76!pSaA2tmXn}P-XD~&C09)$y1mwvI9~So5~8}6-n7LzPq=ujJ?ipm zw0iB<*yee^={UXF`TUZr>kApz_W+k4aIA)-yI7gEqZf?3KG(wqO>P)a^f~Czq zE@Z}CgX^_Xp4_6WBpI+fs(n|T@uNb7y!Aq<@{@Uq6rIcyy==X~N*2WwC1kk9Ooj)? 
zKSUhzSUa=%yUCF1fo|gf-k4uNErigYh$JoU)0D=S0T>D8LY1NIJ79^6X@qn9-DD>F%3QQjz@$tuj z$YfL~kGBQE-#kJZ`O>EZ27*M)%++pTl>{V(MMh{m0~qCM4^vChl`pl}CZ4c*Kh>m+ zBM04|OOJC$>03G2)MjWyEVwYpc`JTd40igqx&)&Q9FcFdVR2xWXl`<))c?$i!fiqC zOMQ%x8CaaCI3BI+Fj?)`i%BiBZPYzp604A>zpxAB$F`e(rrU(R!}{pw#l}Oo*LB4r z0qu~;P_N8{r{9YStz^cloeZ!(7AbCq;IS;7$_b3OK<7I1*v7TJgsqCzL} zACAlt5(~6{gA2@UTY?LPoRa>H1wV#jc;|A01#;*&#%?k&fsLNN_*Kg|^?%5_BSHg)8&>pDi&*JiTrWv)>x8tlab6SOb z!{;SFhi3Iq$YGQ^0=~&+{`cRU0Yer5OiSbQwUeyu+)Wr04G*F_<{PzJ^=Y!X!`0rzzdhFo2mHR-s z1y%RTT}Wwy6Ju~}ctbJcyj?)Z8Ou8<2hRE$`%=X;lw|K3oMG^8Nr*k3sy&=~h@$t- zaU+CnGx~2TE^U)imMl6lviuz~nR1{%cQGoj^BNon0;+z}QE^rqVK9jv(6cN9?%R0j z=b_@ZL|V1gUdN)v8M79R+M9ajAU|qdmvGR0` zRj^h<0cNP*H-98B;Z^SmMxH2_Ef$rFM7Zi5&i*Gbea?;z+Lx#dS4&1Ja5UAfIJ>!H ze_Wb_?;Gt644nWUv7_Z#<7cg6KUPVju3+{1ue0_yiyW-^Fysk~W7jpru<#Ga{2a0L@Eukhkx9A^TaR5PxzP zDQI&+UJgKTYLB#x-_O|5A!?pO;%jgyD-ZfAVCD(yjgVH`2fFGO2DNw@_(rQ9vrWZQ zOsMd1MPGGeL@Jxrt4^fT8R!`F8dq`ag*i=uaLlHh@q~-%BX9vuKlvIA+uBD!{}F{& zRdbgA>IK!S^qIV;dX1bdJ7Ugnny_t?O7sDe`|jU53!RV5|JV=bM85U_|5C|eIN zAXe5Rf9spU<0{Nj?ZXz94GpE7E~9P_&7az(>E8L}5ES%%B4fd6Fbf`$%VlZ@!zVqd z;`8G92iNax%5j*_c*%RaN*(Ftvo|7SpH)YoESn^Mnd@gPqfpJ28+ zE$ee@dUuPW(iao_R1qgkQ~ZtDs@LkI7^8RylreU$~PcU0U~5M_jnQrM7ca0 zZG~o2LFodHU97mnE0XM=>%`v{mE6$P>in<*8CkvT=-9s-*V3yDhWHIIt@{y2 zU8T8-W8DQv%5ACCSw(YtBExs;+hC)_0Z+s%PgI@ zpX|$c^UAF*b~Z|R3XTd*zwd&Udo46r%ux`oz(WJ7XoE5yZ%3OFTp;b8+!@Ip=lj*L zLTniu&}0}C+V@?dZNuGAZ572^DKpdo+!ZG9ngEi0ZHf0kw9*p(b`+(`+5FYanzSw* z1F6=$1Fk65+k(7>dI3G$hGdyc#YM}aiP~1Zk?X&{Nf!uWgH+;$11+9UUd~Q?iTLhL4fB;xE)88B4%|k6#hO``n$S(Q zo*4n4cgo!g%4M<;@arUVcxao10G3|=)YhmIr1JQfl28wVQ_Lox=TCt3FN78pzG(26 znGLjwT(~Z`K(8JY$9Wd8V zY%QN%i2hfP&Ag*}M=cjIXIxAQGfUF1x&!OkFJ4ypY2O|v=|UN~LV3t83!Q_~f#Y-& z$zU!_glb$diLW%^f7(uu*~k5$-Q2QT;9f^Z&2HFTSB#~E)G{e?i}%R_utSUulFY-h)%S~`A%q;2bXUx3paSZ z0&UBOWoDnVHTO%eyovt&J%Jz7JbkwRT+_z=V}JVpFO^5p_&;Y;|4HRZ5?4j#M;iVB z#EB14^@6iTv_z!|WA9amQxQ?&@QV+eFF*sL7>>j19|N`qWVb(&8gjXixjJuhyp@#1 z5$ak9Ec23gHSfyv%3EE&&u~P6!*CwG_t|x6^)vzcrKk3-ec9-?ofPZP%@ed)#7Xr< 
z`_jy3vcESH$Xn}erhX+yP2S#Bzxbm{d^*Uy0$z zUqd2=-^|!&&bp*nU~FmFN7cnqX*}_i(B*e?Y*O6Mi65m=ZMuXG)wf+%!*=;?t?+XK zU-D~)Ji?VmRoxV0&d3LWE&EE>Ky4{<#Gd4I1&rfrb!0O2@XjgDJ@eUGho)RTxh{se zf!VL&ybYZW7Y05_b?nK9+DQ?n2vFp+;m_klqh(M`I8=ONmqei?LEsP6eW?DG4~goK zLbz5TS4`Z^aX`h&ZW zGg``OVVXQ-C*?JFZRNKY8fM&l{Gf4L6hBFJg)x12#d-LOhx$Si0=#OOWJ8IP7S3JR zZblrONT`FAX8y~Z%{}H}LD~?T@=`sVAby#~O7)VJs*ru5zNk8sr%SIRU2BELV2$vy zTCt)im$*%R1u8ktW}w z9bRyhvU;mFhy`1kC0OD4E5?+32`EKS?IBR^0-tJsdeG-%p6~VJw;KXmVke+YCAN%F^bd#{u3}%7yThWO-6l9BVD0$#As3`}hqs=?*A;ur+ zsTzK|E_2udhF?n67hqxSw?_@^F(45?C;kqu6_n8-F!~f{JO4Op|A%1{>;DjE|MuG@ zE1Q3A*sy%qLzTHKcFCmYQkuTYD-eoDJ2oSH&7dG3{HDQ0y-yQS$O+SAWK=Hg=YT#Y z{q0kzKFV?O=u=!pWBZ`>w`+|5G%N+(Q@6Z3tG~SNXs>zyJP`wYv40&v5=)Oso;A76 z8mi@}#Lg%-jlpEGkTEu6$JhMUizJ1i>Ll36ZU`HrZFdG1D4WkVX#iBy;UZ<7N*Gt2So!5M{vg&8|5B?s4$thF4K~!^$crFS+cpYz(GJjPK3qFr=d_# zgf!#Picq}v+*RLLd;Awl8|cx_LAzF?0ZUB$4)H*;lnela9B-TE`g7#uo#e|#+iU0r z7YYIl>rblfEWyNnc1sk3SO@MLh!AWt* z!rg?(aB_|8NxY{;V~h%NAvo69G9ChN z&Zusi-c#~3$k}B=981IKorHP_97~JB460F|!b-R<&?5sk6vIwtV!(pUVxrs2(#yh( z(l-_o(O^sg2I??C4M_$t70i^Rg72YE5hd0@51j@uRmAOafC=uy!O)UoRR%DX$X%s^ zla6EWF+26DVMf`L)00wSxmTF*XECX6o(NJ7{OF^YJ9Z<dLO*gSKmSDq4WWVSJDv`CrzlVB%l2*JoMS1niY7CXf(w+j; z7mTzpJ7Tf#f0>LmIgz&C*gz$@+%3VjhEVQ7@b0nmmEPo{~bTt_7y5{oy9)xj;tQ>K8xkq)HgkrC!`_vk7%4;WbWl@+=rZzwXl1@~|^X%|n(tPkkD;*UVR9?L+U#@HU@ z{dpa-wQFn-6F)2uA$S2R z(Iv`1hg=)~=;-+GjpKh%VE^t>|Fv;+me-ct`-JY!XOs+?8CETxocN>}uX)!lG#8Wn z0M{>W5wPU<%2?tT-4&t;ulKCfZQfQfd+jRqsqpdNV*dI-#bVmUi73PV?h5) zM|7I*Su$*&v;hH@3E48D9crgoOY<_(*Xmh-^f$4fcwNhGn&*vfq`)+h)OSI1L;v2w z^o0-)j#iifmH9z#4E!WHiq~dcD>~EBdBG6dMe&T$F}n|CNOUhfs@Q9*+y)sJSFsHz z(Hv?5$_$KJ&VIt(cLj+fm7YSeIP>72;8tGxwW6AC@N*fV?sY_4Ov1AT&4f;-M@?N@ zh6gDLa1LmWN)R#I@^px}+Xn35wTz)4X%C6qR55x>M<&!0_$fA2`j>iXKO^;|EZO!U zIz*LXB57qYvR`?YhDtpi)5g?g(KG1+Wy}4ho$xA>Yrk$;zWzO&5MkrFtNyf-41bpOJsakS~E-Y@t<;sCf|(?otGa3~sV zh=&bzv`J*4)9U#y;aY5OzU&`Lodh*xAF8|JB?_jz`@q7*3&vSVRL>ny0iQX^yYvh&)O$s#8nC`{PFjp6L z7hkGNE)k%PUF{`__)%9-SF+x{si36y71xZd517TS^`p*IaW;kGCIp+hX$Rfq>cI;A 
z$MsNk0nerwH1++roiC~!u9F`1I)}+21%lEcgtCqDzzS*$bCgtZl!??B!_*ULY8f}~ z!R||^Nt&0`jzfjUC-}csV>+$S<<_6S7Q#RNTD1Q)N>X&vw=vRpFp_gHw>Eb&cQJM( z6Sw(Kuko*!)>}#27F86DXD^m+v!&NcT&x@wWW|rqY&|HY&7Kj7XrY(P91r9+ys;A| z@NC13M>1OJF8?F1yMV#lthlL2I6#!l2kkujddQ{LvWnYt*zs`P@v_x*tMlWK&5zt` zI8Rk;t8(SEL!Sj}`p4mt{~=8C0Xv2!Tctje5N-BD+VOMWJyycLN-|f<1Bohfiehue zbJK99*43d)<#(UT^__wdJ9@S;rp{rc*8L^wvr%T(GwE*i#!l@@n(zq;lGc#@vqSbb z5yQ28gy63E>J`X<_85q4;w*{FL8g$F7<;dQ+N+P=i3HLm?HY4v#r{zsN!U6k;9cwRZaWx`$#p#c~DT zr0ov@JV#*79v;0M^4dWH+P0e#%C%=KY^EE-Etd6@00F|E6or#6N3<+`X10NFW^rP` z-X|~{u_vV{1Jho;30`!znxzCyYqq^bxHbG~NA&VB~EmLtY9TUVlB}wo8^SduG+NNrF-w*_y zy2IJk2xEZ`jEn) zjmzc!k~jpjm~{O0IDLZL5RzC9)#77JavuVPcVOr2N8Cb}^9&#<-Sn5f-&d$zg80--h@!ygT~QX)|xRkL<4qcz$RiW0})>ZVaY@8d0V6gbeU z3@D{iRld3!jgD&cwiTck8f=i$O1Ua3y_1lq6_KQlkYtyeUc{%8@d$Fcvt)c5Xo8!L zDObp66Szm$r=x_K`xZkM9@Qrsz3VUTiqtI6W~OceD9VKuZ#_amvJn!K-S9nUvQRl)dP!QbFg0&U^lJE zD%EqL;omU1-%J*QvpMs@BTnus9+i`U_wom9+h&9WxnCCVBWs}6wrZc#NfDdf8nL#? zaXQ!^qvXpB;qBU;*LS$P5!Uyz_B~F2eWg)w!W*nt|C13w_?4iAdxu!_=IYlX&+a31 z&Xl;WqfznOJ!ViY%s{)Z+f7ZlcgpJaYn08rvz04BlNPVb`;(rRjkGc9AKsIjkw6X;WgBvllq;74;r>I+quc zIiC%bR0-xu<7RBL&$Lhv29t1+C=VyI)JcuPJz&zGV_qRnd&YPys9>4)MoAAkN?M zvxPU8^R`qA+|?9ig?nC-#Ci6%jN0X4=4Gyf@82A$ULZ(!Rvl#Wf1h~F2m#6r*yB-( zPkNg*g2mj$9S~9|X^V5LRUbpaJtWBKm8~0HF2M~aID-$ubC&#^`ejPGk$N`CPELJU zJcLBq-r3~dF(=_L@Z{jMkf@N(A zgRbsBzoC0kK)jeKAcHb`i8s@;vX_z?a7)F6&1stvv!fy1o~%L~dUr6^$hCiJ?jP@f zNVwz!ood(u1##@zGZ)Ij!@6)qbFh~-WVx`ZD3Eu>5Cp|@;c?nR7`eXkw*Ey1w)j0u32@~|(lBMx&6+`o5>K1EIR)$_n*>HL=cr?XGZekkMaevc|| zgdwPdy=AP7PY&l;_YK<-HF3>N`vW|**&g6$i$p=%A_YY0%?v6;_KIXg?^W(A3__k{ zLILkGLG12@_L9We(npYWx5z!u(}6Enm4_}{7;2~`QP7? 
z^OO}=G~rLyS3#c?Gn)8F2z&t+1YCAp5hbkuzz7P77+7#v0G&$$6GDi1R(fq%8PUbd z7zNkJ?$DC{)?KjZ^tE=bcTe~C&c(%fW3d>wn8T|9!O1tDjpj<*#m;6An^*S>lrON$ z*aEQao~Y-u2WqNG#C+*>d-yH|q>a^-8FaEuYufCBwE&D*wH+uy8_9%*Y&F%{^=vic z(v!~2S)E`+2Y!fV>wxV9^z31b3zAR>i**zxg?m*63iD_Q{h3ltB*qEX%DeeY601^<&o zgN~KIK<2MiOatbSR0md<^v~oQa@nlU~cwSC)z<8Aq9tj3F0ly>PFcVXp6HAUg1gDVy z4k1qq(YMiAZWfTQSF2z$VtHH=qGLp9;LzPEO_&k3iqHB4)c$bX<>^eSNYOMbq#i=J zbwCV{SlY~~HXA57R|}&QYye8I>f$yT<-G9m+gFPZD9C#py78>NZYFhnyUMOJ2zl@n zMUuphfaG1}Luw6_tknvs+s%bcJv5SAAR=uq@vDRUQdsTVD?8nPM)G zI%`C`7)aveP$|h3pdVr zeE@=O=XtAZX}~v?UvtJ6LTK36&Q(R=GHn&j z;j5}tye%z*M9`xxaM)Na%%w`?MGax#D;3_wWF^})=9UwtQf`|*FP`22Pk!XNVI!1O zK_bnNbr?gOdoC3m{QM>(ImTh@t5Ll4aXdU~kg!<;)ar}~hPVV}noONJk=xAPV^Z>EQD7-UY^DLSB^djeAH4w&<0JX?lv zN5vLeaa7a;d}5B5+xblQ0Q&I;?1~km{f4fWWs|#}+CR%`gbvUiVC?PYa+Nkj!{A$u zG-*8kQnWfFBg^=tO(9udhud)cdm2ZENJV2QRQZ59>byUapad$Ct(q9l(kt99UhDs;t+j_q+;Qf%UEzt+Gi1+i?U?#8c-{7H(kLl1YZoE56HuHV%os*#(J;TW+?e9tHCSWXDFgu zWNv@zD>uR)Sl}STE@yX8?vwHBP|Y@?54!KQC0(Yjso`VnkEoBLn=za@ck;W=G1_uM zyC@Ri&X9dp^{no9hVIkybR+8x?pWhU<8_*D?;N@%<{$fD0Lcc{M!NYrt#JtesfLQB zdZlr8iejVHOr89=xyIB?o%py3fSS;|)Lx}|#lhX@3NW&*Nex(IjIbSc>jF)FrCqqm zHalp)0^5o%_Cn-*(jj}~wc38UX6N&pe&j#f)!l>Q6Ef$F8h?edxhi}N_6^j1YJSW< zSUmTfw{6V?Qm!paj<`$UIW8h;*7Wya4Qdts|ORCWJtw2M8M1LcDF9Bv@}lVtr& z3H#z`U~Z%D;Lh-Wb~OL<`jz=#3O4!k*6A->UFI(gb`pD(KY2+>7+-yguu93TnSuqx zdbY{HxOQeSEsZS)e$mF#&0gg``!jchOeMumTzy*V+L~|X3fGxUg5ZXy-2Q;x$F@aKsKMh&5mIxzRD>26l(}%0#_OMCTew)I))}K>{!HpKrem2f6VVx`UPl8gV{*~=DYvb5 zQk#!&J5;*pib2o{M1#`mzk5d#?wIa!sw7pYVh@qX!LM02Zp1MDBv{=jO1$2vxcpX^ zZ!;znA5!_-nLFZ-Uv@YSJNf~DbI{^3Al+D8K}|^*T`72$HyG4hU5j#$__0MB3DOZ*(IDxTI`^ec}7v6@GA_4!X5Kgo?R zq^+fCYNPwPdjqHwJnvsN#V{FU-gWoT&ST3>{q6-i(uOi;-DIx-cZu$7?h*CV_PiP8 z<-;u6zjGTV!8O{;o~YMJSryssB_76fS@qh#SIgrT+NZytmN_MINlryLkfi#CHzJZJ zAE8t8i*L;IE*08W`8L_diMX_5Q@PGVfa`YhI|_t+lqe*C36 zGaWj^V?R}A+CNS&{#kYY8}k0i-*Z|}!W#Z)7B^kpY09QY`^-eBYB0APZ75#QN2Zaq zM)nVgfGs)?3pCa+EdjaUifnITXero5NGS00+Z`)Bd_4N*5yM5_LW_9f>d4ZXa$kl> 
z;C)Se^X8M7nR!XCo8aATvjyJMw+5I0#N$0W^()HFH7XyN+=Qy_Qb*fa!I8pAzi95d z2G<+oMuI|JkJV>QZNSx9;UXgI(3&0NK?$&*E4Ih3%kawBRxFg=^R~Du-z??KCE({%fm<-mqX&(LYte?J7(%_iH5yV;dMwN`Dc#LJH=r*Fjd}I` zFzs|84%}Ry7jCc31LTBduQzv)rdTy|4RmpPT?Hg1uY~6asTc8vk$!_aY6Zd+eiM=4 z3^5EUVvbP|A9jFCRqB-|8U0O_qBJQ_b_|RW5Z?p#C6M5OC!BoyLmquNlE~^GD9M>) zm{t^yM=YNB&4E0-C!W+CM^tVu9Pg_*3$TuSa(9L-TS&NhqN}I+ya`p(^3aJc(zzti z%y5Q=2GapABvCLe`Wvh3vEv?fh`ivLwr@q57N{|g_pw?+zEPRp=V0v=D}_8;ArmynkS`g4^M#Fgx$opBqRli$rar%lH?dZ-t~?p8y93I zO2MeQU$*PGlB>sVvLwnF0}38OlR1vN0ZqroaQ$l`W`c#WhU)ZFgYIzZk>*$A=z>zD z!wzHuY5@cpKCbjwIPXJBj{q6x!)g58C$W$tA8LEP)nV_tJ=mZn`yWYd5>)5skwrRm z4gPxy<*CrcZ$7~bw888ST7Wy8d?nDz;%em8fuAxb&aAPK0GeXlqOGI0LHqh!O|8;_ zCCTnd=dm+{W7jsB@(sWn!7_F8OVQRPC;5ZlEu?no89~Z>*lE(H$Z>3|%o7i>cj6V) z{zKAjsdnjWPV#%!Y0{R-vF1p5n3?vR0by&qYfqDb+6yL^t$}Ady0vZWjJ&8Q4?et8 zRB2&+473yLtbQ~qUqLZ>hDJdZIgVyQBYB2qK_@wV?EpHmz3{*}7fSXShRDV1<6Kbo zg``4B8DYyKQU^3gV2L71+I&`-N@tBdFYP*9wI&^otPz*7* z1;ixAqh`qnrCX#UE<}-2pp}!Kt$rpO!cbbsMc(?S&WvJ<_6wAlhvF<_j>dyz`%7>x zu?is2&0`(pO|iBFLd=gi&Y0mmA!JAu_V6#T_90NDi{uAVm&pItt1w67GGmWMft;5% zYza&+kw3{B{7bB|HSZ)%CP`q3%-Z^)1l@ndKuVqSO(HokAlRPsY6uK2ojZL|vLzZ; z#GR$lQ@Jyoz44us?YX4%W3ek)n^p&|otk0*N=2BARFp4!6MUQyvggfK#x7}vyq&{u zgqX4fRT-~EO2sqE-FrZZZkCPo%rT*s)_ImZw2NYic{lals&$g7td9S zRVL|3Y7{yF%&X>3#R0_}@k`X5$qoF+=S9&Hmn7B-9t`G6#}ISk5`-k#1-J5ZqFlL^FS<+`W{b{rbM?r}plaoi zKR;Js__u&|c|`q6UN}$Z7}r~Fue`%OKOMiVTgLHa*cGbIFy+%hx{B1YVw0-$Vv1Fo z)Qk~+KI7|x%yIVT*e=RWa$Qknil((*d^Lx*5pDV>8N^{KI#r;GesLd7FdK{|x(3WC z8atJh(yt=M3cG|pYM4?QD;1#jj#MxJ#@yY-rr$<62!9v5&>Kqo1qyt17%24uw!d16 zRq7p5cK^-9?1f471;o@s*P#3!Jz;gZMTpCfZIwf<#yb*z)%91AcWl)v;;*W=Fz;1a zH`N#HOUrJ(ZWkmzg{_Jn#Aji5Bu^oK&(j;2DpE&k*qXM(hqnE(BlCC8O(@OozgMPQ z<%=NJpE@<>A9X6(f9cf!EaCopy2|`-G)uJdi{d8=@SVPfT+$2_ZKeV+))Z323U0lq zK)Y{TF4+?zaP6I_rDtECd12%B=%aa;D=*Sd#j?7Ubsg)#?aJ}Hl@mcQq0@89wd*DO zGUsJ^tNRT>4+If=Axhi9!JwBByx;HCo_0FB!&MIL!kA-NdTOS4gN%mFukWy72m{UJH=3rBGHM|LSqsdGQdu>vt|mU2$A^Yrf9SyKM7gcz=766VMa{u^C11(w%`yzlL$24GQ6<7DKUB^o#76ZTB(4s&lVdnv{-mrp{Ih 
zezPk1uCemBfOLsOZtQD4nuuXL74lu#nWl$EXlivs4C<~!tem_yOpCB_a@j7cD%!zI z%ksi)^N`4@Y5WEFJ z1F>K-df~VpV1GGIskaUO;~e1Dc@O+y4L8h8Z0Mr{{+M!_W?4!;C-$jk0g8^dv=_e>eI zbMgTsS`2gkPRY2}y~KCmnB-%280pRHGGCda=$C8xEmTJy-jPzO-BbV`vKIAS zudZ&oOd%hkj9EmSid*L6Tl8P_T3ts^+|dD z;3*NoLLXugSC&fOgQ|SMDwQ*Sja{l>Nt1`OH)(Cy|Lh5Jzc^8O)r%Z-j^3A~e8e==SWcMWH`)q}+-_ z!FDpF{7UGOlAKsFd54XU8fvZRo&NCsgT#QhD}p^goJNdq zGQT&Rwsqut2jw~@RufCObH{u^?jy2QABIlhj4YNnsxgZ)@bDIGoK`+3tCJJ%p$m{S=-SnW;;BjzpJ?|kJ>LT;dt_KJJ}bn!iT`*iw# zY3a?%qkZ^&TGC)GIMQTdOVEj{b-sGq#a=~RW-&Kr1LfXbmwi1wc%DX^y_KGyiWKBT zgHd(NKDc4Awh$;J*76*w8hE@kRCC$6vC?~y`Lh@*a=taxm^y!W2@WL+imAkKX4(|2 zkN`|OB;xVv`A-P;FeP3FNWyUMm)7rYDaCY$(A1Oh6il<`5#mxmkSQbjb(fMb%?0qT zzd7C$Iyc;eF{JGy-~1h6{J697=kW7c@O^I5{@+_Ptp9pivUbJ}`cCGyHvfKD(JGRf zpA3dZQH4+qMEU7+qNpUt%_~?X8rft=d62f+w0zTpYk?e(!Ty1qmPd9b!z&h z7m>g~CwXVONmuImn&ZRxr4516IY~Rqz z&$SX1T`iodn)mjQa^=!(Hd1+j^cm3kCc#8QlZly?82hu(ra2JgMFF6tu*~LM!i1$) z8IhOfrY6tzE2k;<DjmCX~uv|O)mwK0;xJ&%-T3{VMq)nTo6kLZ@|p7v3RcAYh+ z7YP8jg&g?$<9APHtMaHuq`*p@By`N!R4#&1H2Rtw68%%V?Xs!UXrx;Dhe??`VW|=j z#+HC0O644VG}p}DhR8XK-W2=XA;_Xp5eT&weBLXBL;&F-oC4<8Cr7@C8g-mC#l+(X zezoa61@VsJ%p4z9g&=cP2pyG*h|lZ4OF&fKqfQ^oUZ3>4p4aPY+7_$>1E_08Nzm(x zar=wz(2SNKp1r>5cRe=H(egw-KnyJgd5Q!Jp2UtN7F9)z+#An#Y={KwWBh0H=q4qD z^nyij71dXth;1zm!RPNNmJxUsSOP7XtDBo_4tM+vu4sLvEPbsU$Zn9DESaf+NI3- z6;at^D9VS3Mln*ul+oOToL7^PxGrn5=+Gf^o=?eQBKwf~rI$x;U&UiY6e=Rf1Vi1l zm0qZZL0NNmK3j=uMr};1+un%%Xt`Bja#69qKK1iweR5{LK}x*igwd7N|271r?^{sU z&w%IzfTJC8lfQobc=2Jf*hvuJr!6?#iwghQ|7eHn)1^!r_RYrd?ah2)FRj!-voqk?f0~Udt|`90k^5h`zowX$e2HuXD-a> z51k9@$?_D-r$9AvWU88`6n}1o*zoSJs(G4~-d1%`PIxam6=;s;!y0I@Ah-xus8BD2 z6>JtSvT>LMN0oFMUpbzn)RCge!+fz;Ev|-v;jENJz(A)`ZZSI7V2E@Eimy9F{ds2F z$BF7_G7*L%4$e{1|0tX@I;3(ysj)_c%1(s_QypXCfj(DII8Pjpwr#3Hi zc%}6EmyTv0lh%j@)5GbWiF8vWwJ~$Y2TIC3@1%EbU#H{nyTzO*1|Wq7Aou(4wEo1MDfD=;>eE@Oq#tp;3Wu( z(Tv9N}E`mFU1nt%nwP>I*Veq#;BbOJGntjW9A= z857#}u>XWya#-+9?r+le0BzMjohlHv=wy_!H`8rzctNGEZ8@!Z&U3=~ukvP>Y9#(x;hD6Vtq>S?WYDic z<~nx}8_bw!W?1HnPW#S+>B7vDzRbi&{^u0IrBX{H6zniX`c<^P`6J3I6KF 
zJ{&EUse(~Tr0q(037H%mC*0}PuiS3|=N``*tkyDv4(lq7@=*nJrJlbzx+;%Lr>p33 z*tO*haS>jq_7v%W{^8(&*2*cFU~dRN9`AqJ#t$xhO4Q}t814pW(d{dsfOeePW~AZJtxTPDR{K@AT$_OhD0S%2e2&QJ(+)6 zta7KU8TqEl(9on>AcFnD^qTwF(N1lxwZ#Jfg{|@}GCL@5_owb~Ed2*5`LJklN_bz9 z>A@wA#^|z)bc=RTNm4&u&%lt&an1o*3bp0*)JJ4&-gG5oJMANn-QDe>~8C> ztwe66-%W;~TiSEez-Z#J}HlH)(mm6tPH zwbfAWsZ2Vs3@F4Rl3_1-^3CltZVwv_-Juu;byV#VdHosQ3etXoCg5Jed?J5)dA3v# zf^Aq&Y%S*V@|UA(s^Qx?h2e9nZcXQ1j(!B&d~G%1_qjM7xhSTOZ^6fShx&`ou20&d z4g0hN62Sl4oJjtUIq|vV5oO---`AYQVeB_{#uygEC;I0li04A90J zXe#qRPv=UVe^~H^{cs;MruO*EhE^1ZhfFyv5QN02E0fk0`Vs~k+{*Uy!gmD&I~J0O zys_N_O8^BZi#K9ohur5__5+b$jfd>pL%u%+f$ZTq(M8Zsv!jN*sTrhFsPYjpv-VzLQ!KEeK{VrNUXp3?NH5vd<8~k5rxI zdh7bKdiMisU*4=ZRlv==l6)Q4j9OdLJDIN+1c!0-H-@5kK=rNFtf^^NUnOnE@=|mT zV|KkGaS+csJes_8ASRm4w2pa~VNju#r^ma<&_LibH8oN4BK`f4-KY4s#%^r$um2An zn9Zy7=M4YtUv<;}vsm))Lt=#TXZiAXwD-n(cpI>IDS=R8NnhRQ73z&pwd~-;#G2q8 zBTYkiRQAv6rn8-*{eBw#D_NS&=B z8y81NPiH5!U=bsgZC)PRV}ptl*puryhK8qG=Mraj0}$wUwjOuPP{}A;@G-4Oph=hF zT09d2u!JDkL8&^X^>kUEBfWiBIM5tkF0))n)I^7nIWHxc3(vXsDBQ&wC@gU@)Ok~$ z#Ua)b3=EYfZ88x*gcB0w?@bO;R8Y2cB%>grCh^sW4#0|?2J2diU=*GdADP#wFqOiB zH8!F2RoJN?apnG5!LhqPRS*p9WxBX1LU z7GV{dz+L0(L@{4?G~xNKE3T0%de7Pd{;4!vF~@oV)P>N~O~GHlWQty5O=5X*Va__# zkX{YxO5|q+Q&$`%EON4))aJ0K<)TCJ@X{J&teoV>U_UeP)dm`}O5HF)0^uKXg$(t- zJQpG7Mu{OVlikkXry-d0~GBeB1z$I&0spdAQiHSOLPOs z&GE@_3k>iay4NAJ`Fl;YiL$;_A=P||^6q!`ct>(7vBTmUfCma0cT_vkpDNSjz~c%b zG->|?G}VY>yREl#8@C0$PzaL|qwMvCbdzz5-Ckmi@_F4?k7uGH+tu~g_^VtualoGI zgk&IEReo^3{Wiw!gKG~r{rJ!M-AKMheEIWzM1A7e|Geu8`~$=OzvkThKg`VuDq4TV zR=gv~uo&YJ(a964i=-IRPcc#IiVGZ)X~I-E3wHb&m2jl(m7M5fVX3?=dW7@30@2Xa z;A_53An13@lEBGHG42maNw1sa+uYbp>UiM!d3B)oW#lok$N5d1O=R8*4yYtBWD-`U z+PsP0<2bczGr@RLmgpT}P>F$?Zb_P!b1>@oi>PB;f}y?No=p!9Cv(?wQ{!gU2HA6t zP130nIVg2X||}AW8Oks{#@EC z)Mia+WB^$LkZvD{_8$AWE1NdTYu0v^Tn%WSVEk;D)J*P|K&6N_;cz15ZClsLL76%WspR6$7NF`nFaq+V1D*h>xZhtr1{AeNR@wu-tm)y6?HE_wzHVc>(LXV)w8jXSH648vMSQcYJ5R z?&C~{+Ezipgnr^fKS={?rxTpjZ81C|(0+vxJu*v{b7*n)-QH|=_brnn*FOL*anknY 
znx-YO$jfGwQcS7cy=46R7Hvu$^z5zW1R|>o6r)C^EkAtkx*KUtM2y+YFBZm z3C0d`uftw}6y`(vUsAp5X|5^Un>5bV8mVRe6LkyAxT%F~nrRbSV5!Bfnkt8t32@a- zbjG>rW0_m99FT{GKE^^R-wwVSB+*poqLFN+cPal==!&_Z0&57J?!&5;1{r@pEgDmN&1*rv?Tb%j9(s{hSQBqfBTV#Zl^U;rKVY} zr&))m*<`1wy8N9_v+Ydtx19=NB`Rmt=n^L9cd8|gcHPv9^Rw_Q)h0WoEseG%M-S;O z&F7_nM-f4d`?Eyf8my%vJ+(YvZP z;%Q+oQ+ntcYOXGo>=aMzzjAz&)ht~prxT!cX<4BjZPCleS+x%^+pKtu0yNE2Q|ZMC zv7&Om4oh@_03QgwXw)_ssQ zg9T(wp(-;EMP<;*{i>@xP*ZNQqRw(ooAHP-16G0Bk5%BY+-5f|4f@; z{Z}e9Rm0wGQ42+oJBpD(0?2b9x*qpMgu$SNpq;F+v_pBrcI=4 zHxzURJ#mxeS@wJFuZrRG#Gr?Rxrx9^z{S=2R#Vq=eo)W{x(E>`RLcQW-t=lNlr@jq z@mIh2fI>%o#2C^esNOtlnXJnya)xVw6aS5Vu(3(xImJHFP}KJe-!UETj*2%w`$13p zZYs@m!o3%I1nfoU`Ae_lb;}GLO4Om0Q%j_OT%eNEhpTj`Zqt?#B$zE1Qu98R6_ z1z%R49JlWXwP|>!Sn_5r>{MJN!op?4H)TFLonW=!F<^dU&))jz&(h1-MUv+AbEx5s zvXXj9M=U zwOcSOTKk|`TlNEKbS@m0`CK-`uiGsG(svk%mfc~mhpv?>OxRjt6X=xXObo&|KUl0< z^gu0S)(cia6Jb-kn5DuBiLQ$PDaxxEwka>QXYf6K#XS1Xw{9&`BMb|g+ldva%)N|X zP^DkP8D=O=b_G(sj6JRGba2?WwCn)0Nf>P&kR`PaPR7+4_8OtOZjnfPLE-YS2E&%& zBZ)b*FFAV}B}ILOA{30zNOPU%oQ#kN5>hq1ubogR34 zFFyLrg5u7ff@*zqX%(r@?Iv>(?l1j{)W^M(UiFJ6X7WlCTtg2i8Y7C8d#?ax_100KYiF#_G z@j`$^8HTprbA4ZlWdQ|3a8=_1(+dKlCIh27Y?S#Rc@E#26I*Q&id~d(ba%RWKBe=h ziyfYK=w^Oq*2nb3fOy>fO}8mL<_44Q!P^}fO9Z-reG8NOo#g7T0SKX8IyAzV zc0ozqlF0#CGYhJ$t{k=kLsMmm0l^RI2HtGoUTlMLi2+FYN&b4h7^=IHa&kb%P;X^G z*vfj1fWXilyuYA;8RI6i#E)E_zQQgtE%HrCKG|HH(%+YL&8E;IPIF#gpLXVVhOJUI zHQCZg$Ko#NHwjWTy5+TUk~%T@HgMzB1C4XwJjrvmDXs)?Wy}N&N~?**BD)@x=WSu1 zZ%u#{x80!;4{~X~T3v~a;(_#Hv{@^fG*9AioXY3=_oLiS~91V%k zti3VOBB?%eT9Irb5yyg5ER7^`RJL)@BqOe=i-2w$4la-8ECyNQ*LN)Rs^*Et3GV6| zwf%q|@15RXWr2c%U5Y=_bPltH9s)mhlxA0rgLYO0o&VfU*ndJ07(>BRvdq0b_`;cK zp_4F^@f3QhU=goCxmdlpe^fG#LeS(;EVwX)5??NxJ(GAMGQn$N<@-Cr5m;Dm=3{xF zS*-%5CdfOVi*Cp}pNUS$JD-Q{$cs7&JL^#}+gLlL7rujkr&`OsxZ*7~yzOTSn!e{m zU`n%|kX2BnIl{=_!Rcqha7r=7sydd{0UITuz;7uXJ41ryHYF{~%QR3*U|N~xq9N;n zD_yD~&r4ULH#1PGH_obD1tTp}HOV_sYPU<1EO!hew>*WKV8Pgo&{3m$JjC0&iRuoxFcG(D1)50>7 zium%1N@;~Lw9LCj%`=K=SXZNRKHFr$)KZEu=GZs`hJ5}>ewuj+Pdhf-=4aQM5lzyc 
zn|wb<_aaH@INm_qn*?*qq-Lw0JFb-J$OyMpjF#^iEmA1j>@_)WXBQ$8P;U*gp{~=v zB3AP+C9%fnAn1W_8pK-CaEKKfq;V^CwJus|NgpjnL#fF6eva(P9yDLk03tYM*q(tzZ84uUPTE}rLL($S zlvv)^RSNPxvV#sho+2x5ikC#vrdfOwvk!LC67_6!0tbIlZ7A{KQ%$?;8SpK;U5}FF znbAycl^~=x^>K>|pmUuNoqif>szA_=fhQo*Kil}?9Ma2p@7Uoi-kW=@f3*R(qxIT4 z(Pp|cgPo>TSkdq$5F0I51XU2)^K6=lqh~}i?t5x`#d3=p_M|)Gu36Z3NW-O5z04S2 zLwc_Uxm7VHJP`dRhRSfCNzc7@P*P)+&Jd?l>##^sIN~b(hX*Rwyz>_<2tRo31S5ry z`J|Hpyngkblh5#V#pi$IgzGsuq^ICc6c=?qfIBa+2frce!SDrzf8*TzMo{yO84rIb zJ6dJsiJJ~H*JH9i3<&$7Z6HA10;zwBNY(qvVx0cGTqF7yJ2UzGl0e+~L(^PY!`PVW zAYPbxXbW6&=p@D!xk2F#)*xRPmub(SKCCwF6o@`4AhA|*!v&&GO2ZbzT~l#jY)h|6 zRV*}x^TA%oYc$l=rv$`dJt4lXwXIF{4mmvuGCX}oY9a>2{tjKdu1%U{41_yTR4y1%VAb|U;b^Hy0~c7#Gl({_mc z7hlq#^l!c-uI7SD(t`UP0g<|5hfxmc6PMf?e@d-aL+ErtUQdM9laTK^G>!?e8U7oL z-`dzi@H+(38;10{HP4LD80`~eUb&a1c>cKc1p*N^;^pxv>;(0_25R0zUjE(BySf90RGSwE>DI|-E9IdUa zvV3+jRnMH>q1FHt9ftwrZ8PGbiL4!yfyOYuH?~UmeloHUOEFUU)Jg1qth3N7^#d~C z{c=5-p%nLafI499y*EEcu&eL)l{5FR-?RCFU?#ixh<_>_Sh+UVcgP3quW#};u>gf)VM5Y)$W+m8lTMhUbYywN{iJGRHE|X>oJ(?ckl3B4ONQyBh^Bt^%@{^-YFtju zECfO&X;Jm`c{}mt%4aT`MXc6w&=7F3uPdzKJeV;`lUMnBLy*QpRj#-}l>Y1o?XBd} zJsR0OXC*(U21?I%D^l&?BaEwj(nP& zx2L8a6_xT8XEC~Qmg?yiJTi)5hXN5y@oPI>MknmoR0`oHVoH)8Lz_m#_{s}A)H-Zz z6s#(5kh6K^y^rv;bLF#}I9{wY&HdQnB^&vFnK*GsY)lxTaeCcV*m^4ANy!KR+cV zz$&=tpv}S&>}I5UEA5gUb0~d+2^~M$oo!Q}ms(9C~WrahlQD-Bpr2EmxLVZq)2Oh|`&Jq_Rn=RpPV++5zlf z?LIF-RU)e5msv@SM4Z@0eh)vfkDTEc&rlERrL|`=$BO2aE0)W_Lu`QlCXS07t*&{B z0Pv6eHk@X-+8izuruc_Ah}^j++iYLBjIW6e&x+V}@14F&{*aUM1MQ|KQ-QC|ZF}l? 
z1ai6kj_45xkH}f7uPy%j&X|_D7YWb(7(sN%5#nncuo0R@t80E=st(_N^Te%qC$7cm zz^TT~ya)fY6GCUhKwKQzf`fZpVWyBax*>G&aldTO*eG8>bsUzUhGnrPvgU@(qBs%$ zrSA`W@|p-c2lq45{J2P>yN$U=mD)HW!C}XTvqC<6RY47B^0EkUE}$=sjk!o)L?W>b zcsS?7X`$^v)h*8{=L4Z)tf~MUj~&7{QFy0v;%GuA-;IiICx`Q0*bQ%QeYj)lL}MWU zfxXx!XumDMQ^I;R&FkY;o-u1fO3OOHo&$JvbZWjN^!}L zQer?!mo=*LxmSY|YPmrUm4j*qm4l{tc)jADTj(Vmd#EXhzwSxLfFgxqM6!h0E3#I6 z!)97Hs0~vL;wHw@`p2+*zi}J{6FXeNNyXNI2Y2|ar%T`37>8r&m|>FblG8F|mJLV@ z1EhNFVx14SIlx~v!UUQdzJ#1UcXVrb`oLbdGcoTOc1yDd>`pxk1xN?s8MJY$u1Dwg z-m_E1N4=M9a2u=_SyU%E%C*wtYBj2lM<#V93Ccau;cBv#Ss2K=UXBX-7s+`VkV$7Vmu*^0-#j<;!Le@oa?C-+Bb zy`R6GQf|BDQoT{z4P(f9=6*I8eC6cn7n(8T-y8Ee2iBq-Xk$kDNBTO;e?-dvlO^%r zA`zOdwttf){yRw42eBn^1Sd;_aY{EqnRJK?$S6(J(l6<9ElSeTa8obsjP#+l+_@AU z{azn&$ggCvGdx0X#RPqd@ai(|H>BA;IXU-+Ge@h}&cdH8ZjwZqc@f*l?mRVF5PXHw zH#WDPkk1CQpXs_MaLp!Ln}2JD-%8BWZzOd=YgIySs`NX!#uU;YJD^J%&!+)3i?y(> zpqS|wUWkHbYw6PYSI+P9Dqiq7!e=C0Cq<}zJsw1?agc^<{Kd8WHxhB1)DH8?2S@_c zPqPukOM>uHH7U~P%J<2D7$JT3eh6;!rdd?!@&2m`M`iZMgl(e%?8%z%N<-L8Lnp=y z<~THb6x}0+`iDz$HQB4@%b&y7C2k#}*DbuyH7MnJ?I#Hl({a^~k+p^}!z(>IsAd~- zFnU`$jDFzZjA#;~w@Za>bFTJ|ScG}g1`!pVnC;3*XI&yi2 zS+I~Bp!_t|ff<^)%@u6}es^6I4F3b8RSR4?i>F3l-~u4y_<~fu=nm&B+P@&MtCB{z zK&92!@M!%R@8gbvbOUIsY{r@6Z0%A}LIjEN?R4JO8C*Fr#`@2H9a^?KlSX?%J#*Oq$RHy5 zU%rk1s|)e3gi&0osx!VMsI&9v>gkDqzxRcU(Kgzqzktft9vlu%PG4e^N&=&FDfA19 znDv{CK~9&8P+Jv>qSBo>@h<^^8?YCZ$aRm+oUq7OVD3id$NL|@F26r-`NALIUpU*t zRqQB-n$aDfUs1&_ZC@A!WRpPw+i(65r8Q*K3FIWxE@DUOQ$PmkW@pE{&D_<9>1)ya z#9%vq+18ltb$j|_o@9t`Lwj!&OVgI}paYpvkAXqO^3R203ZWbsZKiai6uZ(k{5~rt zV=Il-noI!HgjG#{4Q};rQ_`nt<^h7)6^E%h_$ZFoeFo?VyAMNgSVs5R zX+4xR5-WS5jzhQ!G#|?5#|b%}kXEB|X-ru&hmG>6@%v!N69ClS(#%(oOSN{94S*TVWJ15{fcUv${EXR z--kT17-U?0@10`g$egVgi;RvLMQ=;S{fF(h=0-hHlsl~0nHi&b$E+FXr>b`K`&v8m z>0;!lVF@h%gZZ)O&3n7kn!5O1bPTVS!#B9}(xw6S{Gwt-B#heS(x!2CFI53$KV1%$ z6<`l1m(Nb|Z2Rpt3p-qwN(xHU5J=V| ze-Yh7rvf|_7kyR)-cT#@x#T9uI(EN^u#}Tqcuhr}kfE*--Zm9705)b|;xSq~7^S(} zDUth5(&WFpO);7?_}8{ulO2q=AJ_f$pUBh}-TPPAzlFcYyJw+}!{pH)^la>&gcSvQ 
z&g?nkvAg8%;NGLf5NjOeM5`g0tHf(4K+RPCs65(OmU8-~m9UYQ%Ku)b5QlMIPX5C% zjJb4euCTmjvr4|9lGrMx$D6s$fV+-`Gk>Hq2y91hyc*$YcEF7z6$!%n{wEHhO|#iU zHyP{qq$^(|OPHM|f2~gZBh%npQNzycBis3#kRY(XVi8CXhcoIg{?I>u*8jk$|J%XU z{+oL{7oPU1lx!h760aGe9;;AD1Va_0R*Mn%hH5v=LwQ*eg%s59{0CbSt|)&0FAb#a z&7X2=Lbldfg3m2}PZ!%D>DTn@PYd2NDC6k_)djQolDUQElH4AXGz4`R%|)XqdR$Yo zl+!5FS}=AidUm}QwRTS_I@v0tkUW(+7$ydpC?wOOYpgA<9qpq4uN9B1EU&4rjww?d zzY;tRa}Q9<5EtvMakeSX_v&E2s`p+BT^o&l@DUlA+2qpqkYsi?hF9*mYo&YGtLh&O zKKi`SMq8yRQ0Au1F%}cL;;>#MqDqAp_ptq(o0u2EO%eg%k_JCb!oazUNX3XiS3=db zaO+i7WYSk|-r4t<$ny-(B@q}U%?FdC5`>FkTxv^&k?v{M%tXRhrx57387vESvJE#< zs=3NGA>`>X_RZ-})_dIXFVRTOG)y9$9+V!^Eg#ZGgtCSg=sSfMk5fgwtFx)`%*XE+ zK&6EB8D#n;i)@u$r4@W)CR9f)NsSGa4QDk*=cKqjqM#7O6oo3Uls8UeFr@D`JCS|; z8S5+OgNEZA4d-z^=gPC2cxhmM#;ax}{o`tn; zlTMB7j!kthofXD7^jaFUBidB_4tcMp^h)I?tLzKX@j5foSo(IU++lIHSU>WW2rr(} zqp(S!RA~3+1Lt5#s6K*KGSrJPdl$z-uIHvy9;r#jJNUJNtcGX#cMXUnL1rQ`pMjFJ zOnSRixIRe3Jf)lrXbRy(-jIzk3L1EIDm?WGsk8sr31w3T0{2#3eiWT@Lgj=KD zpM+U%F*BjheqhZw0CQHKxmTYHcr>4L`w?ck&S4Gu8FDk+W(`X^ALiAS(<$pusxNmzXpsly(Gka_eGzSz{t z_zaLKEi$9Pd$#tIG+0u|2hp6cR51?G9_uoX_=zmXzhr9-C5FFPfxcLee`NfT|A!;~ zU&}dv`(kZdFLX`Nd;>i`cQg{mLdhq~eh0J=vKGc=HN99Uy=;|=H;T+M_2f*JOcBu$ z$mak_zz$$9#FavYckD3v_@!8&*UKu4Dw*IULdohXApKbA*=NV^?K-IM1B@g*2Ej~? 
z86VeDW11d;-Zi=m$IeQdmi*W@kQ z-px3sYhMWPZXpJ7eSUwPfqGjlAyVo4#ZV+wFt<C*B(z~PwQpjOigDp4ly z*LQxt)eGj`PdOpELS==E-iipEnF>It#%4i04`*{aDa)I`2S-Xw8lTtVp4VtnD}M>V z8EVIl00_fwB@qneQPO<311u#~wy4UMNSN)yh(C0C(8)HI*)LiuHgYnJKoD52ZK>pc zzw<0^#2-(IH23~uTB_)r^OUmM^(8`Pb5SB9CzJFmcznCmWEN#@#^E0rP=s$PAQeN+ zc_a6bLHWHLit-K-@L7iP@@O1if-tZvvN|8VJD9^BTS$R?jF&vrOjjS}Q&f$v?tJ_# z5YYSV!zb+!x*wjTStgsAwJc#b#-NC`QMk_|S+3;|8h|XxRO?}&_h^@+!p@!o$~3Y_ z-qE5*mQjH#q3Qd_s=Q#6E23zGy({lENAe1Y_LxS|)xXkX52zzFP{#D1eouqS!va9f zvN%+Vjh}@TDoD|$CQSgM;LJP=sfDD&bQ5A)C$^$xHh<)=JPR1z02z#G?)Dq8ZrD6z zDC;mVrWe#qNNXq+)sH#f>5T4ynB7VWWs`P7lU=5pjw=rb-ssM^SwYzr<#q=33{KTu zd00tQoFCyCGqBdvt#V~K<|hq|HBQd-iW3KiYO?(itw&pnX(+RM-I3pJScmlFE2Kmz zGqK&(D|xSkDYrC*qoUV?EsTwaiX0Rwd!vv)Cy-6MP&aW+f3(e4E zf*U_DPX?iUof@`W&{x1#@Rf&Z+8!-V2dE~Ui-RJa$n|Z)%g>8Zk-uTUY1jWwFH7ul zjIlf5l34?j0(b(gsAn(^8%-m~gUG#!#16oRnSfhOrC*Cz?fU} zS=pwERZ)}eC=)^t6#p#@V|WhL84!(S;*RXz&oxy7w_67gr(yZZP zCJw1=G=vL3)O7DX9^%tIFue_MmheJM+onr1>Yn8Oy8X1!vfo3O_Nhr! zIJq-I1~+4Wp#M7PC4P7$AAxckX#beo!2b`o@jr4K{|?>s-8M8(#%8}e9skg(8FpRd zumu<%FJQT-g>M|FEr?_ybfk0!iv<@^*j15CR@OAbB8U&akbd(I4MZaHJ~0=@7&-&$ z*vjomE?ljDI5Q{B`1}GD{8>k>)+kMUn|g>Hc-stT!L+VITl*ZLH}>}O7Jb^9Ed97x z0DXsBPSA|}5L_G{9;HpEubwVzG67BfRI@zCT4;{@1&5p~KR1*Xeb~B05{pi+UOd~0 zRoEI(+XWp!+L?xLm0LHt^$h?oUQB0*?GKC^3UI-zGhs}pF4Qr~XtVqJmi0jMwB`m5 zrg@Asfv&|KzK9 zsF<*B+%^$saGk>zxVYTn*Q7n~*1koZolTEcvAA4lu1$1?7Qt^hP8+in*k2H>nxg6i zr!(ZpxCRsA8e<;j0k|2%7ElV5@R?$*X-)eciA^4j@Vs3zt%*(0KbTmPSv!#%3lEH} zv>KttF?GYb-`FHpHIdeS@6(vU;g$2Ph=McPE1FqmR(=+zvXlJW9xn#Xn-_uvnXq38 z!@*N!8$p$(@L6S-#OyS+FhfVk`b()eWQqj`cmyNE!O>7E>PqQ+rN^q*P$y(@2`;+# zyqHgjK-L`=e_EBxnPIkfhn335JKQ-DA)z^;GGqE-7N3xXTxk?-aB`%bvSu&cO;(s4 z-c)Lvz2Os5g&W`u628j7Ho+P%rN5z#A4_Rk#f=-HH==^!&F_uECm&qs)1&S#XgYk7 zlyFW_!Z(OO1>2Qp@#y&!-70US8UJmK$L#mk5)cHTf`Z8bPFz#bH@&VmOOrssWk{X@?$ zV`Ne~LUhYL+X7WL0otjeKI!=T1jbX+4T!f(SQ3G+JJ8 zng=jVJsOJxvIj30aC}(rFa4!9k^NM`(5KQRggHMBe@H}exh`n4)q}l82^}F>Z{E)d z%SG}hxiC{~s*=T(mRPD8OvsS4j$bBM;hwTR%<_>fqb*>PEf>pHx#ZC0)Qc3U$F-}^ 
zd%q7R55?iEvUwstpi4ax|Fw-Ml3{550i8bW{_*trpDEIRpFaO9IZDCG_W$$|{awmR z)m(SQm&6dfltdp-)@e28XO&$W6-~>DY+%GEGbM>9#5kZ#yg+GXNus{Pv6aG|9(G;x zqA|K76$sVfDuDAYj%MBs{w+@9q%7aq!UI`)mUp#Le(n0YU7i2&>&fpEU9SmD3FGR_ zDZG@Z^;-U_;ue=H^(1d?0jsPp99ycVcNNC8i?9?H~A1 zXe>qD`Eo|0T5s;f+f{i?04olL;Px0oq(2Y2It9Sipwyn5u!M~OO|J|L+anp-iHvq5 zCLT=uB;}F4vr!WdXsFg%cW!X)GLK~c+QD(saJIp3M_tY(7 zfD%a?#$*X$mP#&jC>iW6OJW9x8>mZSv=?cMTL(LQ4$fLz!M2S8V~46Hdk`V5Xg(NO z<(xpthRc~Zt}vHT!YWBFWzHvbuGu5(c%V29a3pJx2Q+tUPmcSxmi}UZslm(~V9{u5 zJm(Qg`#+7H1z1!~`|y!aKw3JayGt5Gx*MdsW08;s1p(>q5JW&q5Re8%k?t<(6p%(G z^gF!Dvn=8NJ6`)-=ULs~+%@;anKKg`)sk|<=H>n3wQUei*a}>|1<&CdZ$o9&LQo#? z+)by)W`#n1e9$IDwC-`tme6F(BP>(Bqb41Ff+FQ^y7!XJGblQ7D;AdO7#m69ICSkw)w(xC_Nv1 zh|i7M(p1c*Vhcs_4DRVeUtjjZ&R}-dJ2Z~Izimw1Q&~AGi*E@Ep3Y z|Kr|G>`zg|tnN6O+T3otH|UP?>1tP*se-9j>mpnvc<1SSGF_|M3DVoO6LU5V76?*b z3Xw-oHaogJGL!JEpb{J*(JKtnz;9Yjb}bV2%%k#tC+t~G)gA2`_zFE@b=n^Gy-M*h z%<>jB$FV=+uPLm>F!xw}vOwqHo#0&IT(!FdLlzPJ;kyHJn~KN5*o9BaF}ifiD=c*U zbxlt&b(g3-XIs6hP+yK~_y`ayd3CW@4C(b8{2p>70;%n9$i6`12=xH|x^fomiD>skEuhj4LUN@ssJQ#oB0IU$NEkkX`H z^g1k!L+ZAi8|~)^MRIC?$M+p1rMQEt4*R#2F4X*m`86A(U!7-FS!@TiO&?09f8~Tv z8wZv|I(`*3&o>$O>;Mf{X%AWZ&Mb+Hc3)y-&sAGD9!&#Hm?Ho-2MaiXtFT+;*Gb9ZI4KuDf+O1! 
z+@TGL+T%}^$c^_^kYzxXlscFgLF;M(yRj!y#dW}*VN7kVaJOTNT?{Jl*m8q7WaFCB zn0R1o3TktDbI27%LGKC!ku;*Sg}Z+amITe}wAxHKe{Gmuq(GZaVw!%#nQ=t*rj?>) z0%>>l&Duoz*ut20H+u1=Vm^j)Y)7~HTm16R7P)o4NbyFJ^F;YWQTai#;%F(W`(h!r zHHt+=(j*|uGEa}(a(RAp8FEi}A3S>21^X~V8~r;w5v@9nM^%DBVxA;Bc#O5MeV@v6 z5#mo|N@lC%bnkmUAHqeflB&7oZ&(#ixDFh^B@l5W%nI{JsF^W|rWm1}2X2kDGK|Gj zehAYOqnR!ms;GqFMs|#|(L>xq-At{aogpix_xL(R4?o+C)-2bGK7O#jCvI;TT>nzN z2^V9dVn;Win(6D?4bS}78KJ?U9vQy+@H6;h>j9mFO^!ZFa=J3=|hD@n&k+`UtCeE7t zU0lB~?B=@2+d7dxKVB`%()b)p?em193S}2-nqAF@1{l@YzgSnCC!T9dnJi45@-ppD;Ht_~6J|+G%Y~z8%{K{{f+?!c`j{HHlA_AguQO(fv(eD%X z`Xo0XayPjMM_?Ize)nf0J@`;xV;$%iDM2i$L@`C<1BO)pUFRp#MNQ2Noo$@XwmtcK z=_DSzPMWN= zpYM7GG+k-Xb9i{V=Zq{)4`E?7MbU*}g~d$npsU{3#B6k&c3|sBqtvyy-S*w|Tm9yj zvg6-8->1GWb=5|Ay0(0Q>V4fQs>ny~*Y|CvUr`^;&`7~v&Y*<3f(buJtcKjH>}Cy& z*Ntay)5I;cy<6voh-4<=If-C&Gk@K8vN$jjG4ZFJOOH@2&8TG_Bo1a2#hp*{#(mKD zQW7<kV1`e)X%;@ z{iEHZ6y25$xCvJ_4TB7y9Njwd>rudtW6@rS0OdNea+0m5}DK6g~(@Y!jHDWI!O7QQY~g&q&~#&OQwAW zvs~jI10ox&WCtv7RiRMYmgN{Hk_&c!2)UoG*J|ex+ba!Gky%BC@~5h=BqI@9`Wqxk zzIaplQXscMlN}I6{hH!%12O%4^^3x;NA$f06L2hEc=x-gmFKAn*l69%S_IF-n@;~$ zZ|rW+9vsYxCBE50ep8~8%KcY|sDVGGg`@F_u}3=W|A=iK$Y z_@(kHX|?%AQ*ZILmKCe$KQ7zo`dJUdX2HaMA)<~P8}oYlYig`uSlsVvdN{nC=XD9{dJ4yjceidn_|p8E1?yeR$fuo8%W}veY7b#$TE!13 zjDt57X7_S>?&qwtVWB*ML8mfLv63Ysf`2zL60FJumRMtiSvXur2-qaves?8*EY$i{WDcf8uc!F{{o)yL{B(`4;E zHwJu^E=7=~zvvuaV|)IQ{$uB9R6%AM^V#ofJJP?*oxhtP|9A9*-4q$Z<`^Z ziM<}4!)2Wl$GZqIs(wlw_+l|i90Xz^O6B-sDN3EJ!PVl17M5(~9C%vEk|84R;rSTQ zkb~%%cXv7no5F@kzTp!!@p<>!;S!benMY0`2%-D!*>xY?q~FvCUphJH+2zb~`?bBW zUvK}Sb?vCA&Mwt_(fPCH{x9_max+!rk;R;?;QahdO$>#oxLor*i ztp&L}#acym%7V^rlxVDpmcu9%f`#JT$l5}dXWXK2 ze3t4gDt6}l6?F6@(}f#$^IBn~CiR~sh)wFxR%xU9%mjTA{1(mqekUljjN(o4M{BY- zKJ!&tJuov3UzpxRa|`YVJM4@dTx1eDC26Pa2Dp z)|W$vY)JCIM=j%eUE7$1jj3#(a|c*_A`7{Vdo3}WsKh5u)A0&AL`NCqj{Clpie{!$d+ys z%-yOaT78eG&>w%`eJhAL_6I989R6W1^Kmk*H$@!ja~AI~6q2@2V$mFZ+oFEfO+6Ln z%&2>T!?3p;j_ReGrz`S@N^YdUCYwF`YBLAFcv!3*L&7LYrR$?hDKTzhq?yislYA)G 
zkf0ljWGTo+w11!E_0y+VeUd!}E;i|pCNe(h@QxP$Hu9c2E{6%@J1M2RQ$x{{-p!l- z7-WLa^ueDj`2~8PtVFC^AuHua$1uL=*t?D{0(W1bd(i;XvZxgJOdl!9d}Y|PIJqc+ zef9`afWLn%C~oOgu@Kh%X0&sas*HDACQP~PD@>fQQ4#4`8?Mv@ro>*vqf9|1~k%06fCMz)o zOYE)^@+Vw$yCRW95+G>`uwrUY;N5&r)4?%z0CrEEG<5S4|Ix#pxkep_ZIpSW)oA|HBl7*C(x$O(9Ngf;X8O@LDW3ko*4{prM=}QgbSKhM% zz&)RtZ!)F*(+XGyJJ%t{AuHB->S8b%ooK0tif}eACyZ53Z z?!iq{mfM>uhLlqZoQaIF{2xH^61R?wGwtmkdzcuPNYqu>ehRe6*Z0~gRfF#iXGo(# z5^9l<-ZngjtA;tU>NQCGfi;A5=RG#{Iz9(JwYLc8d=rS%f-PJ?j&pvL9iM}s#ajWm zBc_#kEswqZ;0!-s%(-OCfeSJenuqQRaTaeW4azqi2-W8z7 z(XRj_)QMUDHTu9O>YJjG%2%=S2&K5Z;1CQaO`)Q;!!3khBIi6W`-(ez2&6$rj#&W%mqr#U>jFz`?Q~-5ji^{B4cWv_wz&-dD=(cs=EpE0l1$%(TAlQiD<6fr--lrHu>( zU8QXdDf`OkH!>Y4V;``N(u~Nw=6ou>ZC-n4kdei%Z0i17)1)y&Vz2O9iAxjEyPlOF zJoTXuvVLdp`JkN?XZ#-8`>lB>WLL&CDgNSfjbhY^BKKYBoP63Gx|Gew5B*A{`l~g@ zs}-hr7&hwUr`+nx9c8E76vy4N>UJZ`cX1Zd6yhHXs!jdmuv})ROk3S$;7&1O;8nQC z7m*$5x!4@~1}71ES*6AstHDgmqBr7(>XDuQZDQZ;2EN&s^+J41A%;6_YL0sus-V% zl_JSvuNPLv`{Ssu)eA?wB{@^{YM)wBx^IZ*$9;WAtRy#N&81W%TYQpem8>@*`tFgt zE&EXW{&<%vLGg%PX0mSa7wLD1_1L+|BE?_C-yyi$-X5ac*YdTxvp>F}T0kc2@==${ zPtQuDkH;qyiE8F9k53Ptd!mrg?y~u>lCY`)3#Hm@3 zU12b&foqNMBPq3v8kw;(DS~I0)m(6?vnBOsB0I6;l5w}^|M8`{LjbTbvjv-JZiPXg zHoOYLkLuK12_&MSHU+(f}1=IApe|{N5O~LV=WsLS^g+X`Bi!|6{ z>F%!jfSXIrpC(-s2~Zv94ej_l!vd?fS-9?A|D~)Q+Fd{Wl(j~WL2R4qUDoI))2%}+VA3hUSCBoob^<_fajgRbGg+a$|R=3r^I+7}UAYQC0zGs5xA~65P zd~!duHvQ2E`#1My-JYto%U^=##I56w`x}E;?BqW z`z001v&tFUfg|fY#z-|&_ZLrG78;7#DMr839n=SV6&p<#T)b`HBbwFK8bOL+hEbDH zN&5B(`WO9om$1GUCq*Vc^kdV~2bIwfw;U&8KcIe@ySv?e2L~;YpMWrblWm|SD{N{# z%g@WHBm(82uThY96lmO542X$xA|!o+SI6@9Xr(<$%@GLczguVbC7hI&=|4W`==L5P zW9BfAi=V%ISzJopri+d^a;<^mak#cuKY}x5;_3CXSJgb~r zfO>TU45%2~dUigUeGwbhHu}a#_!A?e4QAsK<;MP&`xaKW8S@P}pT;?)Fp4u*O=NQTZUtTW@cEXL-k*Sy5s`zJX8} zLto5QQ{q5CUzya7o@-E_=K!+0?fAI?4`EW}&D(b1rMnQhu;+HER0H#j4^ibu_J7z1 z$o3R|+{gXJVUQJ@A77M#sO_9TZXd3?I0aYG^wV;k`GXgtR=54MZvy*dk5+eyR|}JQ z6+8}3N$Tt`IA(L#C+|{|8bHg7NC~q~-4fV!^tK!VZfSQUt!_Tn?A}7ti30JgF}U)4 
zQX6(KJFH^t7_;JM`0nu7q_G3fdswSm_PuyRDv`g$vQ~FEJ@C)wixxZgvU?X9p6HGx zlodo7D_5ts2i-kBNX4>6?8UWu5V2#Hznj5YsbTzNtei4>+Xv6=6^=GN8QDX<0 znLaibkl{O+F~2Rg)gs2Le#3xV{e9}vQYTjPQZ~lsYqWTAAQu;39$iSO5 z9%AN5CUbpePabOpz3z1^S5j{nw)F6s&xZr(b$gQ2*vDps%-Dex3KClmv3kLt9)%qc@TnMhr%5PdG>SHOaO=6@|j))d_bmMu@jB$~g$6i07M^>GUv? z4>=a-^w^RQs65qXA#EGv3ze;xG5PxCQJn~3v_MPBX9e1b?1%6KxAWHO?`-AfhE^k* z9fq%0G3N=iEz8XG%>Bsd;diwn+^om*ar+T^wEDp>#ca|C)v_!m>?ejc@|cZ4ve}wa z#r{J=VlG>OWTUlh{tEoEF?X`A;X5{KZQO#UY?; z*j{vNj>g;%Uv3JwoKT-e4a)vvy#C_bt6XzGlDR2je)0VhAv?(-J*kNNr9n828EDc| zrrqF!(MM)pb7Stxvq&>7R4JMNGuVv_sCADaS z>1j~fUfnYE6RjDIWJ2%Hew6)?Fj}fO@U>`7XXIV#R~_@xNkC~RBJV}V?Iii0?LDh9 z!U;6#uebLVb$>}JASJ2M5*IWVY4S}eRa$Nd30K343x^-3A~*?CE4D2seU?~ZGJmk0 zQ1Grh1&RVIVLBb*6#E-l^gy=kmX&OKRR*n$Dq$V{mvc&WuB5UjLZ?rne_|gMQDQ+s zwc4M5a&vLQl?s@8bu_ZHGjwoczIeF#pC{66oPXE0sA&UBAu)f+Cz-8q#FkGos3Dp> z;gCYOA)NiZA{&dtHWR_18a`$<7KMs^LnlM6Zh9YQhIj@`GdfsTW=m{%r%UM7H-V&_ z%2$VxsqD+^yBna*lQgei$44X6P$t`cPogWmB)9^)qDf1-_BpJT9jWSn;(ZcW|EeW* z$AT$2-d&a?-qA9LXE(`kg4SB^#eLCtUOo1(yw)CmkMm4c(k->&ZW?B(IE*=Vo9fEu~M46EUjv7iuRdy;tQ0rC>+lCJ%(+iO2TQ zmC*J%(jww%V~X}sl~g`8R~urOP>#5pfFAM^FABRni#%8^(zPP(qu#+i^*T8T({W!g?KOKdW!a?(Y~w|`C^#v z+?|a*dg+Spfob$bZo4N>v-`_e7}e0)@KwBpoeylIiJRL??g={nE7CY@DDdmrJ`PyO z@}7i&iIPUyRuAT%Oeo1qrlFxeYqmO(&6jaslist*sLJ}S7V*`jy>ifscd#%t^uu-l zwC4mxgVMCtTW+FOU^9}Ywl;|!+}?Mt7oc5uI$A(7uQ)rL-HB@AQKQEM)E z#>)11$;eI;Yc#6BSj1E?r~$h!Kn1$CK#*{Hr?WH)uVfj?WwH`+kJRstj!a)V{V2@t z-RW;lAEY(6$qeg*yx~9wPw3t)tk)TPRM;h$q-Nf1d_1HrRcRqMo=`Yw3bS-Cv{H-p zSsBGQlFEY8aYakckPcx1U)B0B7tCNb2}xDwc*iDWOsd`rtw+$exP}+<9yDw^ku1m8 zxpnmOPm>y14T%=V@v0mMG$bZ@xAZODI?kj06cweRI5RQzcERwzqntpbKay|gnDE4- zqcpAb1L|SU@^`nD>Oz9lb(*ohl26{0tgzm6*>Fsq5UtNa`^>4c%uugUn`dlzr%vkK zZAXJL{9tIKaA!>%GmHReA{FSAp`_@|{APw#W~E|kQC8#_M+fbUcV7($E08ciQU)@1 zpQTCE3gApfRr!sFN2Kjw2I-YLKh8F^W2=Qeat~d1c>jjrO$wKp>5r!FvfWI0qaf## zO{OI+;2;-z7r0f{-dc+*yOzpZv>(iG}a`VUEVS{kLsfRxGVSevG?02 z4{yCU+^dqX&}}w51Z5Vpa`XxI#<_3p~?)c5#K>=w?+>W#D#Lq6+lWmWS)zNCLy($nz9YJU)q*Fo`3KF}m7Ux2~Bfyf0EWwcS3w 
zu*I(o$096!t#@BvwazT3uQ*JuegU#pf@qm>XdmeeU)0{ zQ9q26&_~%#Da$#YmkE5?LcIr2e0Bhv}hXIooC zJCi^8*I&(Ag#cKJF3+SZDFQ!B^{SjCFZwp{_KUwFCe7fW{aY#Fd@Qau4D1-)nlWeS z-_M9MI=N32iJpe1y*-dg+YbEENId%^{R<`O*yu-_?lH%yj$Qs`fz4n0o1phFiN4J< z9r*$3k@1VAO}%Q-m{Kvfx$@!OGO+0czYbuEQd7uDPQX&)lbm|Tpj=*^Wrd|=4nNG5 z(1kU=i`_mqOZt7{&U>5~QzcqToDQRR?I&iYV#CY0c5>V=c{{a5B)D+-L#(1cJT^BG z)8eFalj;a^L-8r0)S<9erF*H2*}E9{BwQZ0cOwud;YVTMtQGl}d?lvhg3{t3MZWG8*8gRnw2Q% zWio?j^zCgDxDP~sy_5?!uMSWmeq$8M{k7I1&hWKbrla&LW11}GC&)@MJRaJ<6QE(I zd-q1@)@+y(lG)#FbhcB94I+mbxygLaYsC~Xnd@rViR4xgC3>uLkQlKLe8+k9Eopp^ z>hI64@eY_DBBUl~b||a4`-YK@M{`_Xr8rS=qJ(%`dUsWXl*3~NrH|`z~gn|EU+$BHd?X97}27ffMalI`&Rzk zYE#?4c3lL~1@B=p%V^;;r=}2#bqaRv`e$dk@hEE~Uq-`Q>c>Bzh#iJTjuY?4_B71I zjfm&kv!5{dl&wm@F3_yEkEh*AqRc;u>a?~A8oGIpZyQQqna7Tc0@r0S@o_MJlZN2V zN%G(XRzaWJUJT0~BolmdpuW&@XBJ9`)5e(PE? zOa0o(lVF2JwZD|@sN-~_Or78$7mH4tJw)_-Zf$ll+`3Y(bJHEgoHE;B+WVQxY;Rkw z*krnmGzAyZL_P>MuQ+`goTgLU#{Fi1y6L=^2EQF{89pdtx#x%JkIJNzQ^^)KNrxem zN`jlhFHG}`nn_4vdKmAfe$yN^ZgASH^g7_nvY%mApfLDLZ|DnQiWwMdPBWO$;P`t*n6tBQ|3EIz^6?4Ad{ySxD(H2G~qoIYS$ z)&OtGMdrxyf8VN3a~f@4-VwBfVe%83V6vI07Ae?xsF+ZJj9e?YG|!AK_<#=k_YJOj z`k(}O!ddq@R9N}9@`)opO(Z_ri!D(Z=7&h%tG^a&OwV!XFLNlQmQq#t^A0PO&OYSx zZp1cqA4J*Wa)#3?+}&`Z87v~;#hk$FUv6!-0cSg}}<8UWk=L7yF-t=lY-jJ?S?-Jx%XV)-w$g zmVseK^Pe8vc%!}%oZ(`Kik%r^9s#G2jYU`A)0{!OnnR)8_+oL$Pbfo3nm5S3@BUOI z*QUpqmzO7Wm6#MR!{ji%7pRa5iEGMPo`(-5nDNd|dwEU)JkH2KZ({t3_Q_UEH31{S zSd#$4LwUIZ^9kpy6LPM%`TEYxRA}A59s8X6-0Ixv_C9;yp*_@iYk+w_-WyYP(ZT;TxW&*0LwJaUCqG# z*)ZX}$VkBY4GxB6+7`gcPspI4PS?kSuXW(|>{n;YT+jLTBItm+SFr<7Ir%me)amXn z;OiKW@eBk$YUIz-G746XKcR9F#}zjPpADEe6R_v$Svueg8x4$8aCWkHcB1%)SQxnT z&kul=E8D;;py*%-a0i0{=7=eXUoA$!t9&jA;OC!!pEE-IoEtcH?o6w*mkGcxgHxzj zfE=74&Xl~UA-CcUA|n9Z2P`(`fk2PrfzekE>&%?(C~Tdb42^6|RZP!H`TMTw#bSJ* z>AfEVen|wp0O^-3px1Jy!DYYvJxux{&M4&_iWK0NG2jCkrxrj2o#8YbES*gMCV`6t zkDsYpJHW450o@@9yX0VkwBchzDTS+@0y3a~bOVTsDS!Zsk^2f+9Y=KxSp)-+(m6@HZk*_r>-tsX-B@h5;e$7YoXJrj-8zkQ?t( ze)$2Ed#3tjx&N+My@;F3B1lR?g@U33>;@S_OZUOJzj^O+7@n%`5kaZz4X7Rr#1Ld| 
z_^1d*oOxQ+&EC|J;(?{PsHw4~t)b1eem+IJSu-;ApKhXsa?wdCs$ldbW!YG+mh{ES z9Y;Z+$~9f8v~^lw+#iXqCLDIC$xr~gP6I+gHWU0hV8Y+PyDlG}YNliR+I9|zU=09t zI%)!6dHP_`S$p$m66>1Kvn%f{P6SZ*FQJBwu0vhX+MG&7A-rj>4g}XK;7S~@NWfjS zDH!#?jA92gI9J=Ks>B`23y_BkkcSKcqG|<({MS!Xv$H%)-B(hQkALBKt~O_h3{Ra5ARRs+9SelAoln8IGjslZ zp@;R-3i!V}gI>%hJyzzPd_eVT1U`2l=*|IPy7H+vt`^TaK&>MQ@U}S+f{+zsI}i+j z6z}iLMHgjx^PWnr4A3+RNR^PNp%5_YLQcL~Dt>xGTvh-z2E-tw(FP*HC`hSTSuPpv zni|YOH2$R@(76ks3qu&qC=N`&ASf9AU*n+y^tDV){?8_)U3=~D5fD(3fL&N2ipxp@ z7k5FKd*Je1{@1IaV7(^%MdbDFngH6&0v?2H3LRg7@&6y%{HsfMT5Vleu%z<=f~*wx z^T8;o>r*Pm(fx_j-i0+_QhErFeJcP1&YIYB-ES$of9F>9_a5qlfMj+698$@j#b7X` z39so%EMhcOPXL$%Ac#pIWW+56qb?;;?yF4}vD6R42Xq}#&@Sds;fjCI*sc*S#i2eZ z0uU|{Fd)#=Id^r`fpL&}Ue0+}j0H~PkS}it+7^&C{F1DdAHeWSnzLQg2vfS&@LB_5 zEewP;#W_gt z2oP}hE~RwoNigdFBMJU5rT=qudn)yivAC?E}4}SVgkc%0U-B50&7;lz_U!M;^biYkFn*cw!>wJ zdG`QqX@S~&4?;rXT`=&Xgnv^sIR6_tTiTdh*C#}J^0N0BD8m}CC@n-8)d%1*&f@F8 zQO*CAa1#UC%EAHTre12ZkdDCE3yxw3%lx+V0(4A(P68qHkKbVWrA%~9 z3+S!Yw;~6afCMlzBuyIj#`QzZYr<;LF+A{WZ$Zc|)Z#5L4bq_hbJ^)EkD9~TK=GPR z$EU=jI8H#;P9P6MR^1!7!E{KwUZZwBdTo5cU(Z9`f;3l_bmYZ6?T!SW0cGwo82kpqU2*G^|$LpvCZ{?!VE z4jB+SkTxX-`aWm03zBhO%C6Uq#BV~8JWBvvCVQ$pMAVCmg7Me1J6BijjAeLNMF8mt zNDYZ(k@&xm*K|FMRqkgp0zIQb-~*X027npsGb3DhG4*QcV4FDjidxPvs3mEL96pRU6-LB{|S_?Z(BO9Bv!6e222jld|#iu?~dQOn3@lK_%s z0eJzk)m=3K{c-w|Fya*Znj3C|D)mcdMC7f zniBhf0wA9tJh8r9z_~mlVqxg;&vt<3*}7~$Ao_d2u4E9RGlIb6|K<@NxHZ9$}t8|O_zX%g!|mDp`aX`z`wt`_Oo4)WUj7q z$u?-{$UrI4Kq-(T)vvDLQqF?>kMDUdWw2|ap~YNyI3G|SAAplUsG#ZvhF^-UYv$K} zgyiij0-{y|G{{bX>{Br9qCrmmB<$c|==M+Nt1ZPNKL#8h4vZ2ZE7cEwF#D3Etk-mm z{Xc6)q5}~cerblcKI9)bwrf&spNfLz>FBQIQa^ww8jOQ<66-ZA3^Wc(9DwBxhzFU8 z_TsK%U87|(Pnwz^fXWBNqJeO)dJ-4~S=mKSmnrHLR)G@Aes@}YQB7MLaiaxne3ROb)P zgY2D~1HJ zz1J&R>{HhHMXOX04x;@9W?gKc{y6>Yje(2uonH|KnXXG=p#Q5oXQqW%{CW{}ehJm- k5;E|W2M println("File length: " + f.length); f.length } + val cached = Cache(lengthTask, new File("/tmp/length-cache")) + + val cTask = (createTask :: cached :: TNil) map { case (file :: len :: HNil) => println("File: " + 
file + " length: " + len); len :: file :: HNil } + val cachedC = Cache(cTask, new File("/tmp/c-cache")) + + TaskRunner(cachedC).left.foreach(_.foreach(f => f.exception.printStackTrace)) + } +} \ No newline at end of file diff --git a/util/collection/HLists.scala b/util/collection/HLists.scala new file mode 100644 index 000000000..4d4a00caa --- /dev/null +++ b/util/collection/HLists.scala @@ -0,0 +1,15 @@ +package xsbt + +import metascala.HLists.{HCons => metaHCons, HList => metaHList, HNil => metaHNil} + +object HLists extends HLists +// add an extractor to metascala.HLists and define aliases to the HList classes in the xsbt namespace +trait HLists extends NotNull +{ + object :: { def unapply[H,T<:HList](list: HCons[H,T]) = Some((list.head,list.tail)) } + final val HNil = metaHNil + final type ::[H, T <: HList] = metaHCons[H, T] + final type HNil = metaHNil + final type HList = metaHList + final type HCons[H, T <: HList] = metaHCons[H, T] +} \ No newline at end of file diff --git a/util/collection/TreeHashSet.scala b/util/collection/TreeHashSet.scala new file mode 100644 index 000000000..f84981174 --- /dev/null +++ b/util/collection/TreeHashSet.scala @@ -0,0 +1,22 @@ +package xsbt + +import scala.collection.{mutable,immutable} + + // immutable.HashSet is not suitable for multi-threaded access, so this +// implementation uses an underlying immutable.TreeHashMap, which is suitable +object TreeHashSet +{ + def apply[T](contents: T*) = new TreeHashSet(immutable.TreeHashMap( andUnit(contents) : _*)) + def andUnit[T](contents: Iterable[T]) = contents.map(c => (c,()) ).toSeq +} +final class TreeHashSet[T](backing: immutable.TreeHashMap[T,Unit]) extends immutable.Set[T] +{ + import TreeHashSet.andUnit + override def contains(t: T) = backing.contains(t) + override def ++(s: Iterable[T]) = new TreeHashSet(backing ++ andUnit(s)) + override def +(s: T) = ++( Seq(s) ) + override def -(s: T) = new TreeHashSet(backing - s) + override def elements = backing.keys + override def 
empty[A] = TreeHashSet[A]() + override def size = backing.size +} \ No newline at end of file diff --git a/util/collection/lib/metascala-0.1.jar b/util/collection/lib/metascala-0.1.jar new file mode 100644 index 0000000000000000000000000000000000000000..ea1d3a62b2590fa6231ee203c443348c971ee86a GIT binary patch literal 128005 zcmb6A1yG$`(gg|!g6H5)aCZ+D+}+*6!QDMbu!Fn1ySqbhcXxO9gdjh!d^3~xpF6iM z)Khg%QN>ex?_R4{cdsok^&SH49n9PBS4G6n@BZTl?H$BB84+axItf`(1{nca2~iOx zWqKLWm*IEsZe&JBq^0QSr{JaNsK-YqY84r0SvL2!_rTxDOF_fbc!QsReEV0}Z;$-H zUQiPuMT)_Q1)GatGz)ji@pRLX=1~e{xFn9VI)|V=1p1)|OuXZE zmhu?$$FgKiTxv}^pT~cWXUu%*?0kKfwMFKq?VM6EP<1y zdVvz6O=z^X!g6ZXdvi29(?B#$jRosXbQR9m`94P{7)Wf=4^T=WdJ$FgPis3Pi! zEjlTRAu3O*sh?5fpkvDTPI;~h@0ZYwYU%2NlS(Y(FryQOr0b^;oi3z)!#qY=vnxc} zNMOsYnll`=bDlY%N8t(~5x1qyqA6j}oTi@V!YMj+5b1ogA`DrYnK2W%sXqx=3T2+B zMlY8yrkzyJs}>ewuta0l2OctLsjOV4(*HbF@yx~Hc8>6`ypFR(RHDb>O=^xfG9SKlI#Y-2i+j3XvarW})AKs+bTxdGi-!KW%tfd6QHGe47L$~mCNQY~| zbxu~SvpTm9&a)pDtf>tnKM=9p6v}tnG6W8_${TY5^y@ukCwwpUo2t@3XuNiF{aK~z| zJGo0K>s*&bEOA|egCI+7>#^mz3y5Or0ty!;E1tmqfQG;!S580pyLWex{|OCrh5rQ& zacMILM+Z_xBU?+Lfsvq_vYV}us5R+d&;}P_s0yWu-2VKdr(ZaXl=mB=QCn?UJs62KGl9$gXtKlp`t#Mz zipx8)IP5sh(iJ9=z8XlsN5Mwz2^|NS9yToE`mMn)=RDr!7slACcLkRK?^cdp;|Gxy zHtDM}mr0AzkXW+uBf6Oe+>x)tjPWzafUWfTk2!L3+@r__LDoM5RkYu!W$_y&P0@+9 z9kYGRanNJUoBdIJwNAMVa!_8i!2K|@lcazYSV9~o9u_ec4VJss@D-5C>FH{ff0C3j zXU{ggEKFf&XoLbH8M6J!aOza-B_IG7YPj*$q#2Oih|qJ2l>n5rnbKT4RG-$;i*iZl ze|ER7QKic&uTK#)zLV>XHWuHltNyA&3sj-bYa2hfoq7OpuPR4%(Mj$~IHwcc@Ks`r zyZ1+DK2i%&%M<@1vYW4iU)XEviF$grKWG@yR1pCafwai_i0Pq|UZApyIhH`-EGIpc z0L8&Jc%UJEQL2h3=IBh?n#IUV9lyh?G99ElJeUL#A$Emg_=$YexuB~i5pLG8~8@wCTpQnO;u;fQwB_@mI56iI^O-O%EA z7>`ZY!M>Aj92E+gxW2Y?iKHe{>QE#lC+e_84ezO)zKsI6K3LyI|H4dl?=KJ(T3^ul zZnYe1IpPy3{z4;3AjY$ae+Nw2$d^j?mIyMwUdbz%)PzI4L|SI}BRFfLe(8z71*Z$d z{|e4ezw!3p!TB%VHnn^U*55*v@&8!d^Z!^}QP6-Ia##cn0)9C`kc!&3?_<7tUbpz? 
z(H}|k_5KpEvMF4fHrHTZzE5HQMdNjh^%BX__osWGO(ym}|2nvc{PJ#k(IhHVb*AFd zYHbrB*^$Mb)~}eGgN@Vk9b|nx{sF{K>oqu;S=W(3%$?pt-C5t1KTd zR#tF7GSIdy9)$)m$zXs)1Y%myHE%Uj#a&}on+?hSgo9!|iC)qqXcW{Br&A>TBrkx) zZbXTt4`Y7UVmDlcLF4!(@#?&{T??@D>D;tjOVg+1XPHq9NiKkDBDo|v;b*Z~i5OHl zqU+iuwZ~1JqvpYgxn?(3k80Vb&GbSpzC2`4QWR5!gk-ft9y`{w?&;mSysZ!Fr!mHD zOm@8TAd*l3zIg+E$vnMIXYLVHkMigQ57A(|#^_McHhxdem-DZO!N-tUaAR&zqKVS? z!|$5A;B^fK?}iUVsS>H;nHiW#zmi?U^&n9nt0+07Gz7^z)s;wX;emn^G!??=UKyI3 zgm0Vj^akw6J8}twjj*ZY-6_^C&r&kDHb0GQHOInLerS+%6>fK|&%=SDky#<{h)^_t zHEAeHI2Lk2l0q8i+6u9C3AqiuN9USg85 zHW4A_aKk?uuk@DzZy5_ zb6L9N#5lKP7-h}Yd(Wc)DD5S&4Vs0cgQIq!^25 zarcHRH>m%GE8PEvD_KWVQR}~%UPVjoqYBy!0<1JM9-+(!RmD6yg`vKF^)}=X3`t8z z@hVutJKxw{qyn1JO~dVwXK3CZ?_F>D<}e9LMg-M#q|W?W68I$7OpH@lG1WZ>(>T9& z)IL0%wNH%mz4E@m^W&gHHJh@P!%ZAJjc2VY-htMdmWxcC!hxm>dK{%(t;!8%%9d-a z)7YS9-4%U_u_BlH%6erT7V~1kpY)ZB$z;Azq3&aorv)PCVi-SZahsge3m8fCKH>79-VE^v{~b7U3X&hh9c%EK%%kIc#^V?Ls~9SmyB^NPY8* zBY4n$9{v;+Kx38K3Ql@p|fyJy>}-v|Pk{waX^xr$EaT+=%ltzq0*{#{bQLISYM0M2Jw8&UY7TfyisJ&dc zCu9|sTy^?YkOsb4VOUAAn-gnH`0CIAD}lO2?zLi#4?1XS-7Cy7u7N&#L4Jjgp!}5$ zdFp+AL)^Qt6O4^HEoH*OMKD^bq<$&lvy0Gf#6g)nEqjGi?jZ$O8o;#X);_33GM&9V z!|{YeV{l6-dE84Fdu0aa>$%g1bI0{xE4+n-k-i-DvQ`B3 z^(hw++!1i6dTr_a%uR{>BQ3(6Ej-y9mfBk)A|k9`EfCO4$_~ zs6qn1ppjWlqL$__n^JKCDo8@%^T;^ULRcW_3)Z$n9`OWp%|KJUND6#?+6)d+ju#kA zFZ=JzJYp~it3v)6vGiyR`f#@E8BU%|Xc*~IU$>gXF$l_8f`1^>S6aP>vuF)7AUgtv&KYOv341)5Pgc2eF}4BPMco0zj0qKvX7S8l;VSgS zLB2px)0F8V0rvh(ytPZdbY;*Wgez~pufGLg#_1dzZ#tm_C*q{d=CgVD*8txV2`rN& zPDC=sa6vUjUs!A^X+_O8q0&A^t)Dbpkm#RPKY^x-T*YYUu`jS?qzpevTWz%Z4$2t9 zxY(-oV-;X04KUuCOnT*Tk)59c` zO#2)PlbQkKUSOyLw94+85A_F!7GOt6*3_gZ6etgTF0tR=FGC)1)4@v-rPoKv5~qId z88#9irS}y_N(}6%oik0_e%&t1M%#9bS>!$bQ22e^|Vql1JAshTk`syT=XmEZ-7*y`BW5R-e0*o!_; zK1=*dWxX}R&uP7Fj-Ic1WwVA^`1EX|P;5}vP%wJNdJr+v{DXk~+ILKt@5LdZ-&1^~ z_)|DIsFhQOyvc&*x98uDlz&!>jQ>qI{8Nxy**F`$iHU!M{#6N84DFTHN82F-M;^&n z-byY7SD#y}4y)JPTyhyL6t?L37)8~7+$geH8~0>m7m;J9v6=DklwD71x`f82#g#trbQY+!-*l}XC%eVRFi%gFvqX9 
zwBJy`KFT+0BKy&xNKVB4o_zc$sXy%+^qF3WK6AG<1twigAe#;}6F3(Px}hGY!u!w^ zMVXz#s%@w)KG;L}OMz2c5#M9z`{2@$05eM&;_#TgLH(?0oGR45IM}vqIg&}i`KdU* zqbFG@_GNmNcqyQeIH3iH=(SYOqr!uPXm0R5t1qPN9MYzr~Ve~w~20ka8V+e%KIc&?n(2-cC1|7$Q0r{Z93AYM^N{{M1=-7$s2`qwK-C6pRg!vr~C1sx3tm&sfqR00U8aE6;^-3o5 z_O9ZCIw8(1g{^Plr!DS&?VlIiU6|*eK1+9YpW@Sj5W258!%>l8GPeOpUg>=%j=Uwy z80l0cpX^1l?Y)7IFHbZc!1iwfEYFx5U0&gp2s`6v4hL01BaSalK^=l$q?(p%nS^MZ z9Q=t}U&{EHSM%(WI&3B@!)(4X>7Ml^HSBL}w)UinUwn=(c&8?0FlU%S@WLPm&v);N z0ChE^OBS(M0`sn^eNu(a6Ne-r?`=&z|~`1F8|O*SfA$w>`QfRlaJTxMo~K zf;}q}A&=?u?Kk`&wIxtJ?e2F&rBKZ60N0%}lRz7i9e73l+2m=fe6RoYR zz*_iiLI^XHd(Vuv&Vy&XCFzVY2Jh!fl5fr)WSaz1Ml($96G127BP0?^nNu^#R>?^i z3lP);rz+SuN6A6=Y_PM!s4c}BE9DD0slMP+g4IpYtfN)@Zdq=MSvjAb;>0TMVbnHJ zi{Lc2s1pRQdJy9^3YPno!ca^HN0@;)SJE>-ien#L%@spUoQy-!RuJ9dIp#3J@E45h z=bIAJn(yryM{7L{`lS_FYRy!&c<0v1tJY7xWmj9$VxZ+XP038OJ&Yqz;NJhhq)c6= z>ph;PeT+a5i!e_7PLQv0?aljc=|$FOL$Y>LYdX);F7UftzQ#G9wQ$1})#aUzd>O4% z>-QxTD)EFFBHL{?@^~r&WUS^s=n;bzNUoOvb;+OgnyupBzK0Y=&j!{8oAK~e$VgGt zFhA(XSEvlZ`FN(LV3ixh?}@ySqZc*a zvo38CTM#{|^^m`RJKAMwpC;TD^^Ge%rIt3vQb5m&aQbx1Tscf{HWG-t83wNKK>8Sou>oGAl8xVB;nTeAi!F$ zQ?R3bYfrrgudOnedLF}kT?_%pc-fy!-8~bF6)t_gPjj1)Om6DT=0!7&DrYcd@%3B3;)YqT0Q0WG0$B*C*6*ZQw@ep*hDOY)|d!t9ARCbaA4ZqW2+mk|R zgcs&*&;@hF>=@Q(#PBv6PDZB@mfpnOHj%xLAeB|UpjtXe5XP18E$dRMS)e;|A(f~R zm6KScTe=WeH2lazM%0)$5uHZ5vP*SjFXfIm%u>(gpFw!l;~nRd@@!w>9#khjm3U%vC3)a1hXPigkgAsYT~qvnsAn_QS>n=pn#uq&xeKk9o>kahfmp|`$gf2MKA9jjw zx0w8Ll1(rf8PZQDxZKWYJD$wK68P)q-3!;qW2xUEtIwddZpkA&v=Db5Bl| zrICFN&D^I+^?Uazxy5}_3F$|@CC(iflRXDGr)vCrv=o%{34RWS-5x-6SI+g^;mREOo)<_(ye;@T+f(S8+yEuMA5ed6-PSSp z*2UY#6oLKc*8Q`B{eA2HmG@+noD2;9-Z}*-xo&=>mjtzZRkxyVaJ&)_h$xYY-2tZ4 z5Yg&8T`P+0d3kjOI$?anp%mfJ-rx9!Ip~hNYKF#7Pe<8VHbxH?enEdz!euThV85ZW z1Ag`e&e`T;ec`u&CkrR}&OvQ=w^;3Aik)IuU>0G|5(EA#V{D!>}!SZC*lVaD1b{kuyrjHZsWz;X=mf44QxQqIwvHpvR|cfNTBw%_B?KmH33ff5rjg)_M~%sY z4Ogv;b~>Z)Vs;8^x%F*6?cE1<0WKNUa9U9=&nLLM0Q@?`nB>D7m=R<`kpxgY)-HVC z0hXbkA&h#YA&lHp9#wYOrSc(0u2)HsQO&Fl`d5x$f1aO&=Y^Q#+t~%aRa^hAO8$9% 
zWWR;}A2TZ@qqixRh?TA5pA)RNZ+{=7i~xc!KOimS6xOZ)LVB<27*-gl^c#3v$uh!ruKBy8kT5 z@P1qR-xCsXAsg%elWp-zR&uBUNL~$fI_?>Y{FH=*hp-F8X!6Rm+fpAA)X9)Zw{{fR zYeT#RhY(RvEiAVYFU7D;9_URE@B9*1$H&K6SSCitUml-#D88Y`Vb}3g&J~4N ztsn#3oAueB_Z4f+IWdIv8KR!ftdE3@M)a^MXZ>xc9C*Lu?ByJ=g@X&diq2K*tZWz_ zveZHYu5w(Y1z32rITQgy@OwEsI89LSTs{1K&V8+_XX32)Ncb<%@Dx`F;3+%)4BkIIN+7HBJSXFKdT|5 zEZvpd8F{c05RV1m6c~*(;*8br4jE^Sk4K;muhl?tifq#ie7`@ct71gcwUwcUiYE4} z)NVC@1@ZP-bq%=`_W{c~Wyv~(i=2t!gbAB`vAx#GadR+N}Xty9^t(~_eQHw9UIR0_eVJK08^5bZ|} zDW;&mn*gi|#T{^VhrC8+Y%B2A?tpNgfz6A8D2(-*21yJ@x@6VcJquDvchn;>N+}8| zHnTxFd|~p)e#dSqL*3(;Qt*z1s7+eqQQREQ5CQYNk?AU+Zn4vKbdn1-%T~%Rrd2|k zs#P>-B}idbV-+^HgJ=V5NE*5y2<;H~VIwL(bNgULJzSOvr%AhDa1#u_#C-lT$ddU| zSzU+t0WV-eji!gZ+-{kVq(Pc2m}>pI!(*WM$R?RC5RWWIYiO3>kD%7kcTos=!!pNj zos-gk*fo9+>VNa`+o!*9>?toJ)x(eEX)t4{WTBN?kHIZ50X2av0Ff>5?kr;)d zw7B*I@kt5)3G9W^4u#M3qf_D<`)3>D`?0q6&Ngt40sqNx)qL4mnFJa^Z&mepYO8T+ zVh0J=42Hc}fW1aumrk2VL%`FoKkXk%k( z1pF6Uq${r|zOe<55WPjh9)@6(ABb{^QW#A^`uhh=0StK<<*gO+h*`x{`1m#TA9B3n zROpf=Pw)7KYfMlX_l>uBM;&c4&v|ENX8624U%~XCY_NCNJgTGiNcT}A@g4TNqDpk3e9PNIYrp;ddj0Y*S9r2&GkG7URw|TjfF!xMqRPGPY5H=OKioM` zvR_A9NvOYWnS^4_b4vBWsOZg(Ae@->ATMB|;oi%Luk<>lom!5(U|m`M8njRoDEb1Y zzQEHHJ{I49C5rbv=~;3zi(nM?YkmYh%vffvfs8hUX7YGc!bwK}Zo-JW=Yu(Nq8^b6 zsk1Bn#btafEuV8(8yPT|d^FyC=D9)(gMl{qFl| zQ&_Vu5PfYueRRkOw;2(4Id0jqQO&#vR*yK_H&OF!B26XrA`<(E#7v7U{IW^v<7R1Q z2Z{P<(hilxqD13G9HWc@i)HpFFdQX|IQ@+MF7k$~mp_$;C*iQi{Tpz=p#M{8{4?E> z{I_(gX!Li-^IuMxfPXt>R_3PlvsFy3MAiZ{du{S@Vl|0}mdSHSpPS1JTE}Xn^vsSw z|Db%SVnQYfM%)y_zM!=r)3c5J+2k;h_Vs+V^-myWg~9nLKDgsaDO-s#7BRrLp13r& zBpf9Hsvm9lD(}CveR945ScppfFy5qxul~7^vzH#k^ICMgA8g~iVsE){pUE_8>h4F> zveQm-ygD0-3k&+;Y^%_R+XN0t-O~na)vsJI4;lYhOGQQ=Ifh!hn{*B|>vW@~%{0&z zKq(2K)Q4?p5bqvyUitZ5em)F1Xg*QzvEnd-?r9Dc$M2kyNit@8L{J7_?5-gIMH5%! 
zZRmnce4zshfTiup*_zVmIbCEfWMo=*5SmL}b_{nH*LctrRK2Y1s~(Sd2AIp`fttYDouonC^J~m{p_b_ar5swBd_0 z0+o;7eT43-Y+4Uw3zy+BQOeFmI;7W+^@yRv+()MI5c)aeiqItH5O?oFMft&3*p#C4 z7b_aq$R-iK!0Vr523ofJIKj7&)cVaM^Uq}Y`RTg(G91SJb0`$V+2x_8 zPv}rC3Y!>nC>-x<`}95|>{Ee@+s0WPZ237~ixM=3c^rN|8;YC>899DL^{bbbX(&i`nI6${@)k1%7!_fIv5;OAKt!*eSRa{J1N z1@{x~pl?IoJ@A+2J2@F0WOTA&vjS0-at1CuTffB=?A_CQ^t|Zy@VqVqa+l!bgg(`g zy_BWX#MOnSBDb>L=3SWm1c`|mxRR(0lSYj>;J)YyQ|FcMhBh{wB-+S&{s|8%~-3Js#2d!*I$YW%iy)Qh@S@ z&zs>iP*JG}2J(r2l1t_PaVEO{HWOX`YbN?|+WfHE-q`8o{R+n4BL&rb+>lDX(O@~A zb$A9E)A~R+Qw9HKINeiU|I=^^Jlt@6fMHdx#(ing+1Qg=TSQ-ofqpZb+I*Cydo_Oe zmYM+}Apm%0hGHCK+`rjtT!}c+sK_lYu&UEUp^yH{a5@uDuS`{&yUAQJD9_l7#Et|1 z3$;hPW%{fl?&wo=(zl63zn1qA(Ml;N>m&Gm20G#&m|_aPht!#VbHiLOwbu} zUl@9#xSXc!w1?x+M@cA1LR$f-UYlbr@!kU`6$6b3GV4eeG(ew{3ri{NIDm=V#sPvK zjsk!O1EkMzN^nl4{3A4{C;jSnSU1?SH-b9hg$DYmd-kKGVqSJ3JRg7@}I(}qVH{^dAD zlrOd3;vD|BCP4 z{7n%bzr8x4#yaxSri6?`%)y;OQfGYbzZGG>6RzdG9)jaCcOG< znhyNI3?t=+l@(2pvot$Rn9ap^TSS0#+*dp*0lK4h&om?cbx%WtYWE1O+HRJFc)zlH zjwExN2|n49io4I)eh={vXh_~_VJVErcE17QeE{4ay_O&w zj7>pDDkL`HSaS4mEcOZzmzkug#j^erC3^)c{U7SnwXo93p`+JUl zULog08Be9OCDg8}vexV>e*#=6@=RNh9)j*Ubs|{9Mhl^NM+FSBtbQ10E+$WJ3 zjr{rEMyAQ4o?g@d2%=Vom!Q<3CG3g9UQQV4DV9^u$G4|ZoW&ki$eEX4#;&!Nbww&t zH>Ug)w&+k;Y%v)%K^zD!L?(Iq+4)40LzhyPK*o+>IiplRAmtbQd(g^x)*2fw@^$i| zT;-h}it&&WQN1;FEQduj)I-UD5_RV5t}-iDYiuc9TYn$yTAI08Uh z*dx_X8X&(R?;fR4fS=m~ReN)>C8vJJnmUSZID-U(Y+%I6s$IH?m00Bp>+4R9TPN>@ z?m>sfLM@yqYcC@QIQj|p_QcN*8P(0TX&UP8^wM9tpg`)(^+iCKRAo!yG@#T$zWxf6 z=ZKSii`k8;R2P=QG#Obs(4l74k9n6YwOXW)qajAIZK+Oe@XoOLL4l+#29ANr626*# zPbUyJS+5C~$gJw4n;St9=xnh#hP74vYo~QR);Os8mFg2G?$=@y?EajlNmW_vX2%=i z2U_d}Yg_0OU%vPd(V(m?LUI*SMlzn=gfYhGux%3fY$b82Th^X+Z=*Edo|snzbg`&l zG%Xy+-a}huz^@;Z68Q$xA4*U~5o|6f_!LkYOc~K0Asq@-naw{L%om|QirS&~Fe=t3 z;GH+}6qlUe>=8B|U^O^JsO&ZFyjG_1ij8{bU#D1+d@Ahv?uK65#E%C4sM|BFEZlP^ zvXxJj!hC|Rodc@o%7?&VUda%1qA2ar7AIQZ` z=3r8!eheeY_Nc_|r7sNinHMITIYf}OwV??{9bD;pqOX$LGA6JQT&DgAZ`spwmSW&) zUcpCiy+m8h*`KSe++?G={U{uiA#9lRIDwK|%cG3Wt%n64AlF4QO@q{B&!Qr5XRK9e 
z*i7Y$G9k#;f`z}$#G&pX?JR!$(nFZirIx)!3Uvr=DJ|C-nD7!bbxUFSRf+~Jo*t)Y zz-E~EID=|RhN;Hf{qZA8Ag|uT#}xRoKB2;kbp*Jtrv7?$h`;7y_khaA)qS< ziVfbuneSc`pGPrM9qXd6l9a}@6Y$-j{=v-Z0bO$TZ{y$L--?@mGBeF@=>F#|8T-HP z1h6vwpKu9RwvwBXNAo)Gr<{d*&-NV?t!|Jls+tLXwHMQnIhoK}W1B+h*eSvBY}yIp zS;p%-gB~l>8=A2vte~-1a`KuV?$dsyJvf|S@NIa5ivtq{=$B-~4%~Wn0g}xc>~03t z*%k%;lbI~EJvs(3{oClE4yRu{KA=!gUt?-L&v{%u4-v-t0zzG0vBuo)_&V=`&f7g1 z!bd}ZUKG*sk<@-_!Z}VuysDFu1uLmGF=`28HnXsPAA&NQu7WML5HlKp#lOyd?Wwm4 zg~mB$#MxncjTUaAaT78*nz5fzYo}tq->Rs@cxgVcAfKzv+MH`5Z#K@KvS+&3uftU? z_23R4elrfYvJxP-JtwQQuBk>sM#KT2;HOFielblP%j2uY z7>gCoO%#^dN(3UkC z*<&p?(o8P3y$R|ehZR0)DmXzF^UehP} z;?4`t@EK#-p2eJCsn*w)@g^#d-zzQ!C|#nR2p%}tSMSO-Dpt{+mxT)3UURQ5rJ%44 znODB#v1j!Z_|t)+E4U?HMX0%L0G~CsXUht%G;T}$(``eH4Zd;Z1zEKfJQNW~bK}$n z2Bl=jPH`r2_@w;+U8)fucA0|t7`i5sTh>8tkWLY%+v~qz5`P$IvKLm)!i!9;llw)f z-S_GHuhs~v3*CPB#}84mDOy|s^C&}b9=FiMHkkX7cB%7tnpdc_Pb2)dqx@(t)rV_h zutb17?q;&>{OVO^*0~Ti;phsFqcAj$Wf6V+>ESH}ntkVN3NanQu#lHBXvf@-IJe{P zJl!UaFm-uNIpn{OTps^voLE^V9oTxyiA=xwssD)_g5Qv%;%H{+K&ooyVCD$axBPqf zrW4){_clTDH~B!66Su|H$C z`#MiSRJ^C*_e(mpXU$q-Jx3ksFdK&c2?Sn$_7&6DGVmJTRtWn)OP=3#-M{*Ze|-2$ z*BMDk!F4mfHEsE->w}uEz$@T&1q5vOawxER2osX{m3Y!VAZ}*&7WBolv=Q9wKf86j zvs6GHUf$3m2`okL_Bd!qcLt&Yw8#zbisq&`%^nhjqWIO<@^-Rwya`TB(U`lhUyHI} z%%uHAdJyLm)an7(d_lE(@tWxxV!3ezMlHM`8&MHf;0sEzQTA&sFlmOTw_|Akk%71nvV_sOXplh6)cw zT{Qbe*4NBb#@+rAo-zTf;ik$~aX7X2#l1D*>kT|dcC*-$rN9UsD1&K(s@E%6j(al# zW^pSky$GQeOFw7!$!_GerFSz2grm_5Ihj+LpWOm~ibCn6FlItobraK4Yvg-nJ+_Fh zk%(U3H+UXUdHIgUUQiek?)yoX^q zJAOO0e8X2OdEhe;qJ^>T85gH67nK_u-tQN4FY0bq^PUlYfM987 z4G{_i8V-Nfp6aOPRWzp!dtrgn70`j5W9)W{vbF(7N^xCwNX#x}1Qmxb{j~&#vWBzXDN@EAkO%plVBvKDmjIs0mSHjW#Nn&}>cau~v}A!cPG-dmRJfW+>?FKYzU(o8 zn)A(ot5u77DLTcrPFjhkMfYWD#jNH{cs=1g3!t4yNhN~l(Poq(y5C!MIZ`kv$08Ge>z!u&W#f z)GpYj165$D>bl^3j4&xl0@%4=-T#uEbD^sypZQw&zDFqrCrpXcS4p8eZkJXGTe+to zYa9mHLUyDa+ZJVwNHFjo9J&}u3(kH!iA1_8&VlOEWP|NK+GZ(36(eAP9zFBOo>uY3 zwDq7!QMf?>?MX+{$8aTV&sBOojP(70(zW^~)X0#UObNn@J42}7j$6ySB~=txUlUib 
zg!6>wphrq*Y#-9_G-shjF}}^Hj9=g?uOal$AKm$;VEY0@gWvB@ht`N|GB;{6Q}H~Y zKBscrMYH{K?zYS5&p`5@;dl>GTFW8!pi?6^nVy$osZjiAQq(M?y(nzWz~&Sx)5l0Z z?a9j@IcXmELM%8Bwk$CuTeG(u=&WTM+mE2Rk!1tuOHeVOQ;Bgu|44qXNZgz+93$O= z$IXU4;qc(?CfT&Gy=2z=9EnVV(nXt@f z|IjCCaVDav3|=pO+rxkrH9!z> zv>w!K{2V8Hdbzp5{Kl2RnD=!j*(5mO3*d_JW)!6s5? zO>a@F?vr$_by9(Kl1{D!SaAlAV>90k49ilL&Wa7QWgLmTW+rgY&_fis6mb?Zvt&xh z9g7Igyh5^da~Y7QV&42{5h+LBtrVqzJRk{2t%_}b@Pt>%xr#Thi!dIrM>ELWJEHj| z!1s>Teo%?{wVcOj@#u*lvFPBlf&E>x2{i4#atQMA0u92kIFj#;&z4zpt@uEYW?fOR zdsUQhS3?Nh>i8ytvnhTX{-xG6W4{w@A zINMfX9lbHTUIlm43+LhT8&JylPSm-aM%3U7x3^8O1izw>sFY)|qMt}l8DD4(>m6DD zIf|Bp#B0uN(N?>y%v{3>+y!ZosQlw;w=4Q=&}Jw0jHK8k+9BmM{oeP>B(I-(&4<#{ zYA8a!tfHZge&1P0gP}Tn;-{QRAyybrYsMV z7zQy5mC8N3H17_fop*lJeM^Zha_MQ3Qbzx}nCKPn@#~hjmm%itU7*90&a`im@ZRTP zf}`$=sqz>=kxHIFEu%kYw@~uFdcQ*BRnQAZUFRQZ`hh}El<`f13H{av|7UjiJx%|e z9sUZsKFy!rXeQXN9oj+#RImXkS>Ni`BvF?4zlnBN6w{HCHql1Wb}3iWr}|i~p7yW7 zqoJGel!%lDJ@{a5YoOCQS>)bzOfI`CpLovQ$X!D7d2P5*)YdwjT1uAe-Yi|T{o=l0 z*?4g4xEcLitNl1Z5($3zoe%Oh`&60U>^6TI#P8|N8qct6GfsheEcrKa-uC zf%<4rT*n{#xsCM_4Sq_wKS)Lfm(~8NS74~#j@yoODnZr|q4A8D!8!#fnGyTb z5r3_cRLFVdS_^fWs--<-NyD0CA*JLff=c;7&;gA0M`3IA{hyk6CwA%IulG%dYHg@v zY4d855oU<;4DhV=h~Ny0xoNy9QI8K%)G_PoN^>-D0#Tde%0e_!28f@va>uXzceEPa zRDmI6rki3-3nY^KOvG2N9QYpwzV`5dIR(2$EhUWYl3cwo`rWDnA&xlhm7IT+4{Fki{bRpC?P` z3hf=QK+dy6>oR?`kTGr@-KOJeF~n@H)Vyz-VI3&#to1ZrvOt>$2rfKK2XSl zGr8#4b_6o}d9Y4eNb&P}x(@I1e%S-**E z#Z#Q7J`4E4j6DX_2jUA)Nl3Ce{PRgBU(Az5o4X*{2^n#+l|qvh0(zyphFWOK^>gf5 zO!Kx_O`_-Xy!d9=me_=zAwe>yFv);Ik||DkSCNbyA8>qD@)|L~7ced3PA!9_Q!>3& z9U*Gs zubvDwlYI!2-`-%4J7uRle7nB1KURp$cayKwV?TBml6R|oeRf;qS7~daSk=9fvo!}k z26n_WLF;gX<74q8S1Ih?7=hX(Hee@}M>{R)Z5+uPdJRC^*A~B){2J3VKc~x_q+umx z)vPMmCO05K)!TX?9F$1b6_3#@uoDzL^084y7AIMOg7_z|5KXC32- zxyLVebZ_MupA}a)-hU%|hjqPpCvKIJHp{=&GlMKm>c@j64f@$_BfKh&E#b%FNAtc2 zXX>9&YUDqng#u7a6@NzNPtyngI89DpT0xW7bTQRfLOkXmsf>9)=Zwx<&S{%{{Y=P? 
zY40XQ=N6CI(3guaCv2m2O8q4l6bZYk>^FL*@q>dMTxEz7=q)(F>7h8VgblO-@>NRS zT)pPzoGH==mq|)mB$(}lB@7#BpX4KGp;!Yj3feKgOxo#E+q=$(lqrOiw(X}})L{C< z1@|1Q{CvkS6f+R@L>o^ZCrxqf2LEL{L2v(4$QZ6QcF}+ccUf)_L+x4V@Bz7^mHkI0 zM)wOzoY?E|P>>ne8qR*R1oog|*eVhI_cR@h<A`!uW_Z?#+%vcX^wCwk{NZrn(CmF zP}g%eFLt>x?Yx|Wso@E9vM-kM{(A~6nx>r!ZZ^A9GG5UuhP%#@gR#8PpDm+5yVb#_ z{{M)3%b+^5ZEZAoa1Rc_-6goYyD!|`-DTmf!GgQHy9W0Jx8M#zg58zwK4qkk#ne2pUE zchYL zvKrkUXT10}3})NYB*BVXsk_vyCvDRj*FJwJ`h&>I(&Hu`2FXBjr=Ci&GRdE@~=!1M12zLjZ;q7Tv8`0Vm>QUC+bAnU;51H zwUI9mogX@XKcCackREu}NV31j=+p>_oI&qMucVFDtT$+Nfbso}VTbsJ*K|4TY7_!# z`SF29fIOilUCxd0ZJ+k~T0rSJYCiMQHKz@4lguybfndiSYjKb3?%=x;6H}c!zptckulc?2bZ&K*3wmpgxKMivRa-F-e0Z;g&l!;iQB5=pErb z#DwIpshrZ!1XsLQ_k+%Z0x0)rZ<02ouMr-=2lE}T9sTf<2nBR^v^VKf;@7IWr$1U2 zpxAp;JCM}o6Uo0L3;!84`U`LU%g^GkemO>MT?Hh>e*=Ki*eI6qGm{O89Qt8n{o#q??5VC2h-c%kY^!Cfwc8#V2v+u_ozq{(@fO(4U0V1HvNv59*3iTFAxDRE(1Fv=4qCH!e*{p@GX%=M^ z6$%a7MAR{7J19=OVIo7vHBk-FWDAQGZlWA7%Q6@%1&y6wYJg$P!mtBHa&l_PA<}uN zYgfpx&U7?hY-K4dnM;AH>|bw^kK+4t#m1@FJDZ>Qe7-Cmph`@vf?3TZdQqWY!3AJH zZZrD9U&D=7oxt>(QgJNm>C!#}n_CN)=pi+Xp=TD(CU&Qj^>4ILku#F!&97ni)!1L)@@;RUuu0nZk=F1nt2WJF?`R)TB`@*qS+< zkmVPe88?V2M%-eSAZ%h%eZNeckpl&j#{d(0LKl$S9F1@b~3lX9&!6tB~?B&3uWtY2_rBpkA9M;_p090Uz+-|@3_}4J)!Ou zG4!k#E~O-@Bwqb4r`-|K1_(U7PR|~jbhfpVJHCB@oBr}Wi8hOqEM>Fj#{%FU(VMEZ-t840n&|xWaDUE-iLgNQ_n%$F8lB6ZjC!1 z3Sd%?zAHG)treS|v}jB>({~Jy3##$;K2n2r4k6 zg{;U9^bHxl&_CW49GzY$hQFbn<74f=VOSIw83NhpymajvTLHw;m)H-m7b=c$>+h?Z zG2C+nj*Gp!LhVi%Tp{L&L}v7pz-k#^Sb+lw&r{UiVemDWzT?mwd;p!>s9C)}jt_^I z1rHcsh<_6O9)41~`lDVzqqoT4gnR$QDF4ps`~#!>ckP1QZ|#CFJZPWJkEz(FfV3sF zqx4ZXoPrDy*Q%4zRYY2v`ON59Nsrf(_W|NrY52R5Y@toPy}8@Loz zBfI`XbE})K&{X{R7H@6aI8Gv9#MvZ{c*A0Y)#Bw5BGg1~1btx0hVGe%U%hZGS>P&l zyi4lDj2E+cArCKHM8+6`8kqb;mX`sGf5^&pjM8uzOoWSWp&o)p9D`f1lK6)}fH^JP1R@hTwn_T!i zv1-T}3MSuOX0d3Em6`=sKS-}Yt^(34NVNXxxfRDN(`x}jF>OKf>c1f6e-{4#t3@00 zzZPvO<95FpLP>0{ySN$M@I^C0B4>)Y_$mOwI%6z1Yvf8gW0tY_>#{FL8E@VCGpeF@US-JNW%{1k1 z78THshWx1Tim@(dB?ov@nABjC42P@~JlM~w1Fc)D2YSIpB9O!SD1i_ad|K6?m%n*p 
z0zSTO#bLXxLZIGMYP%WE@3ACbpj(O?VLK>r#vUoYi^54QHZ?lg6%J#R$S;irGi88OOYJly$3`MZ(kba{ugnG_=z=xayePGp;93xdr~Z0zbcvu~2j zVh}yA>InUsSw6Reigm@?3_ou#66|!Y6RAixQt7>e3|&Q>;m)`|L8zUUb7&$sJ01=v zFg}%=YG_S0Cf9S*n@&v&B6`ythy)#)-Iz5NV$9Z&2%}sd`!`7ii=cBJm2WeHW$i0K z@Pa7}umXqhf8ZifiNUQQpH%1a!@Qa=njtdMAmOsDiSdpQXWX);^ezl0(al&QAd>9s9~=+)SwwG9(2E)RAAuCkzrlF_Bg`&g>hh;G90+Uw8@nS#O-lu|5`L`? z9Ui8F0z>U%H^ORM0>Fi<(pdBrVd@eIcg=Ag$x%uXTzz`XWsQtRj-7s`8gWhv$}SF& zr*!AL=Q-ff@qc-Jg%JEugT0Wr?m*LNpq(X@$(T`npyxmfG-Yz>LG`+$8Mrcv@do;* z!;;;0tYrciYnDBQ_OSDlikZ4BfhrN}VoBo&0M94(qhsofAVzkpiZyDy3gxGw8I^UE zW{-Ikuf_Zqj+sOa09&WD+B8kLWq2+vG$su8j(v=~_%O?$_I}^7j@}gpzDYP);y7;m zY{3r74gy4AtgwhG@B9a?E+d@kirl#H!^H5ZnJ-k$Qa{Kf1)x>&hd(|Jmo8Vi0_#NU z0G13m{6^Rab(zzvK?_K%(e4hJ{;ROk11RFEB)rnyx8=p-4&={13d;#n3HM&9c{NE* zzG9g(sfZ)Vb(YYjzj61hEaec~`nF}ITXIT!}X2-XzJjb#2yTrndg;bDnLg2K6dO;I2qg|q3Zrck2 zMz!Yeu#|!WeXzLLS#Q#pZQ`c8YA)Ml;h~nxfXyu!@Ixpy<7y1pw@6$V+=i{VAD+V8 zaVeOnogE|dS|G}m>#RW)8`liHZ582(uVas^STpHyJwvO{a_g}6I6e!=yH92oc$|xI z3ui0z{(P@UrEQgY^+(|rp_=7Wb`S~JqczuouK?0pir3EP)Cw$7x1qf9-PoY81%Z`w zOx|M2XG|l`nMC`AjJwMB{WhdE2BDpJ(nwI#qru%evOFp~s${8;LQCO6SXXo=Um~`s zb9Tu!I~6#ntXkm0@$=be8u14Be}RQ5AJxn#Z_Z?{3?@vw2s`F0eFgjZK&nMP`AKE+ zQ=RHB%2?a22g;RGYs?WGTo?%AtX`>J{p^0r_KTF5;B&GCS#)U`-@w$3-ZsQScaiz# zS+?k1-gR@IBlyO4$u}~vW2&>gal0L-Gh-gCwZginrB8>Mg4JVI`IgdbDI0^gcH*Kf zG2v)MI^Ff3r#KtlqLN0kDdcG*I=%>#k-XBWO+KN(-xg8T-q(c}Swwqnk`SUSN>Js= zuG33BQL>e@F^<^(d3H0taPX^x`T(E5LGb?sK|$ljzX|yNI=lZyPyZ7SjZw4xJyiHH zfTf1~D6$kHS6o43x3`V9^(AK0rb;*zj=v;%2$FGvU4`vv{eC9;7RdSv*8Mnige9Ol zB3v?6`jXH3iue{N(0jGEZVUds_g3WYobPGPIp<)_S^o9);>i9T3l(dKG|)EHQ?J4i z+EZxuSp7x3t8z&dSPjVD4%#`L^VDI~$;dOWN2qgKohVOKJFh@qaFJWAINF)=)oZTj z3WAo=R@0zgCA;rp&8XJm0?P2ie1G5;xzAD}{uZBq$B<6n^xJ?$&jZn{%&1p6x#sX+Hje!N=LE2f5Z^(iKOO#2a?AX5)bUlfj zSSTE{$liP%UkF$*xad2{+;T2UL%5(V>pbk`K9|sBDNgkz=6LP}w}v5?es9*W_w4+l4-DyfKb-@=IYl); z|2piLdOBC8AVcGuE5T8cwZ3XDc8$%&brWqHRN5we-kAj7J~he348!Run zCbY2G>VR=uB_q8%9*w^wk_+VX#erEWqRBc(F5i$>>`N*h42h{x<^(MB@99nZ2DToA z?a+8P_w8I5S@>2yIC>-4ME>=T|KdW<|e 
z_=fHNg>qjEE!om)G#;oGHaEehY?DwA(^*(u1bw>?&KKgahqvET26MwmDiWwG z0RG1*>8$cNpy zYhhjkwkyzHFM2){8-2&l&CGni%QNZmm){$tLo7OUfxzc!8_%=WL}H$F1;BT_*dhXG z^FVstK9q|Z-ku}Cy8Egc&}lg-Pto**v0%kSve{<&D*G}>sA7=_+bAtb!V2&l)}O~p zrX6Wygtry|mPEYh>f_0C4|L){8L()*)#Qa8$9x^tPT*38o@0_5L|KkVC2z8#5|?fl z$)M$9e~VOZ0-!?TLji@WI+W~|cqp!^aH#Sp6qO@cr|ZrOKf#(leNUaA`TT$n)7ZAj zWH4%TyTh4}C;`EOKTj*6GPlHjC>fwaA*B@SmSeKm}Cr_Yg#ShQ=%9DKi#%c`*!-neUVZ9Fp++-TE2q zIOH!SJf?4l@k%~*{Rt^~7qEgFw);~CSQFA+F}M4yo2+}^oP*AbvCX&62ShE=EMc~|kVBA`;U>%3$U4ujj&B-Ww}bLP~Hc5SW( z;-sA>MvcGTCl{Up2g4;S^7g5j4h!}79Qnu4`j)4ut!fL>L8E&NE$r@LGSp6StME@l zFMQo5lGv&!g7zqIKx~SYZ|piXi=;B*Rs(Z)cdg`xqJbe`I0c*|4jW=s+Ga&c&7?mv z)N5oo6?jR5%tMG>CgulFWT1?((;UHSQ1!hfYFKrZr}HHULsXMKEVIjCImNO+@>;Y} zbRJfy+leW!S9wXUnHO|>Zud9JGU%FN`xg3EkXvUc$?eO;d3;ZGNC~;J$JLIZikmFY zeg{|aG3b-Y#FiXTP{o93!#UL4+9wUK`CEF_uNIbx#sequ&`&prJUv|o9MRHzt8_Pf zeTu?au_LCo>Hi|Q7KTqa=X0_4#@nPWM$Y+CBN^XY$XIT&* z#-_nQr(FrtEE}?;;ZIPhgP^_3)v*|DBQ2qq9y*m2RQGvO?Jklq8yQ@Co zKr3Wjy<>qZc92ScOW?d0XJgfEHBo@aR( z7Ls;`Odk^39ORlEV-RmUx5N5Ullw@yI?+^x(9_6BB-v5GXH^Geyk@t8ky*3~|74K{ zJYjZ=!|rvpjvTHlEIT6q`Z)h-UE`LZ@UxiL{fsx+hd0cDx(8I@x&x0FN*KENhA&&a z#vw;s=faMt=m9sbEVp|Wk_)skVq4&okcWH(aRfVGulMz_*Cd}K@+h(8@pq~qB)b%5 z@I1W4^N-Z?Eho4mu=B$`VXrat$liz9D?t1821wqVlA~pAAGG>~-^X6|U6?~6w*-Yz zKV+fv#&|M5gs>K`cLjaq!3^H!u=8rI4TBHi_gR9H)c69kmqSqYKs$l|kud5lyyEJ| zAT1RPb`<$*fbesi!@;-oH)F-}B4RPz5BLn9wAUEW#ZR#xaPG(OoACO6r0yWymaMx7 zSmdp?!>TVfM{`woZ9ZE7IQa4Jc+GF{1s}!8!5+IIG}L}#S;Ds z|GV_lMw#Rwf=a*Me=Pmfe`_qI>`Y8OOig|pEjW|?{`uQAT+GJQ_Rs$E|J~S*c)Bxr zA#x!({%g>}f3Ua9&HB^cF0N7rO7xe>hqTuH!u@*IS{9aYLygg?UL-bjEq2miL3Jao zd_Q?3z4mT@WpyhjSRM#0JW>ceh))oZOsq`cQWPIepl`GS(xbo%gMtH-V$lDfGpe5> zX=Q@`pYNd0zc8AALTdk47|p+&En`&A6y^o-{p^e}tU|%?w&3BkP4Gs)rNi0nz_Wao zMQNeCN}~RmMGok2ArN|_5ZJ~x@fI3v$O8o47-Z);WSR_b~Q7k@fDoNX8TKy;DFe`&g$d}q5IdB97XsG zYZ&dhg-Rc zq=J@@AX5VN>zL0s2Z9Q_FC^MgR{3ZxPCVGK+Jo31!|GB|jyd}7Yf$Xm&L{;d<|MNy zvUbe-PMHYlJ4wAhHZv1Z2btVCI92^D%$@omF8yd`8$f9{*0GRS{72%`DORR5!$Y4? 
zi<!f&WGnZdhxgN6u!F@@L zm{W}Kys$V#?s_Ha%Pq9aLvrm9)0~|GKs20OF4T%vObRnbd*gRpyppF(;P1(C(!-xm&axD1HvT*@gU&e2fr`xDnYMAJ%; zL9JZ1O5HTZ^{eSZboQL2Ur(8?<@UV$Ug#}TQL9Y?2#k9b+&g<_wwrGiA?LzS2loLtf(>TOBaf-l!pkEm2ktIy3UmTIPy9>(V6vGBDGa5ScrhE`B4 za00e|2!!rp7u4T?{}rl%+>TGRW$N+s56^bwZZ`8RP!IBVF7!VS&fjOyKWShpu0}3S zhQ|L+hn|oF(V-Ebwy>;u?qF`;F#vcxph0Y4qswG?cjFHzV#m2h@z~lLGvK<+QVaA$TP=l zb<2-OOZ68P8e;5;uC;ocT5)s-vF0)HFe0Pl^9EI6srKpIST=4rp}LI%dB+8QSHqTp zz{Dg_T^RgZZTM%a^%3;jfBEG7HD(KeSZ;PE|1AOcRsJh1Ax~xLw*fLoy}CBEu58?QsV!P3e3Qu=>O4^J3`xy*}pPqnobOgG^_Ut>MU@jiI{gX)I% z6+wFP#MMU0F(XLG*OnX5qjR{TK5rDh2;}i^csdSlD_}O6Ouek?1>hSsrJXf9+|yQz zn>g#Z%{vmjaScS6NvCW%4E-7?XhajE_%?l7{7xS zwv?b|qkS&q*y2cK^IQE5{#*U+=+!KlSmoC7_&F~(bHF1d$NDMuz%DgrY(>h;EI)~K z$l+1I$F%!Zf3l0epNd3(WQhy)K0@jZ?Su&Z(LF(@=Air2P>h)r0YP|m4 z^p2*v`|A}VDcrAC6pweUuQjmWYOQ8s?69J2r z+x)Q`*Ev(6ytV5TfTNt|nu>-hK$spTh?~CTLB5e!{}g{UpFYLeMmflwLQS=*z538b z#!bk~NR@O%#Hv#zPxZO(8@>V>`KY>~hcqkcQ|K=NAg5X1ky&gd*_qQNSTa9D4kUCE z0S@?*Wm>#PbzJD$Co%=_UUF@7h!LfrUyw4y&p2i%3724V-jDq1-HE7N$Rn8ADk3QA z5K`C4L%7&VfT5=NV=>7hq}sZ)w^%&i!})IutQ?roPJ!pVc1a5LRiU1PP6CR*STZ2a z=W&Z}_~5<?bdnJd!d6XKDc7JtCYUT>fvENOF1a?UBUAoXppQ~N ztNUSw`vbSD_*vA{L%x5)Zd=D{p$WvmAArxg^VI$NAArwy`*-yqz$a3Co$ki8Y8Ypa zClaKaVXycvz$bs_lx5tt%f=eWpJJSD*D{%!Y*Sq*`y<)cuHsx`Sz|)W$=Rj6rc}*! 
zu{sl@%v^n*-8**MY3)Fut`9|*jRq1=l`VU->f*woq3}E$W5gG1=>ft{+vjyF;3UNq z&<}+J2Kn@{;EQ2lsD4XR2BywZ35Q%tZ*hna4w{$9`(2U&a;zpR+!U|xu_@N&rc$k< zZNs9~%)rqoT|NX;8sql^#Ja<@$qij4S|C+0;)*gEJxQcc0OH~TBz@ef+?NjrGis%a zfGXLM`s|@jiHRN9F8~X+4#;D>cf= z)qp&>pgYw2<#*c+Tnl1A;LloXh`Wg7E+2Gi2|n-AO|hB3jPfA5$?1J`1;yqfuIh*_ z=1de9PJstNw>K;wm%9{$PpX^Bjqou&L=4)p$iW++mEVuAHr0z(LQZ$T&B32fxMl~- zbVPd~@3nX37(V4EymD5yl(U-VyISS$ud!#kCMJ}nT~rOYSNpHEWjcORWq#DNto^G2pqq*vE z!R^ctWbA&kgr(e{rm95QFa50;+|5pgl|@Y6xUkrlvN-%V?o-~su&KXoeHvgVIU%C$ zZE@(HhezyH(k5}?Yob*M!hQNoj$xq_%Zx=66|RnrVes?Kk^2In^KbFp@;nep18(l> zPiVKpORZMmN5twOEeK@D;~ZQ;u!8P4*|&(-FlD>L3#+`3ZI8z032TD%-yPVU8OV0c|r$;V| zWTWyA2Cbndgbc^q;0eftuJ7jJZv)nnNHe`+$3zMf%ZQj0w50^Xe8Tfgej$%0zcKIY zzjr1Ktd~R5+Ns^zV_BcD&0^XSy_jwl*`dlt)3s!=6Y*b(xYtHs-vOLL7W!k=bc^#D zxL;!-Jg_>@mh9;=WGTDQpL}!hu$^NReqw5Nh-1(n#!462@&A3>i7pOZ{f2!8|JM8e zb6bh|w+i#eh5lzj@m1H+K>vK{f zzf>43HG-NApOOMYWd7Rs2zFq*zI$%%a6=(nI%;^rRAg}_=lL{olTQC?%EaWP5s z_OjdqW&nQ9#N_l-yS$>7Z+pLb=?|v0R^TZ0rR0%(p={w5%wV)|uTUD`((Gi{zVNea! 
z)+w>ap*^swpOfA)ZJTimY?hmRTKsafck5QkqxxsR{kNjhaUHg6JNT;mSzkp%V0YGy z1Kwa9WUx_aph$EhS#59mD>Pq^$j-7LmrxJB7p&w778rdZcsZ)Wt-GWfDZTLA27cys zl0WoUl$Wo2oMKH{D1+N8s)M!yW_(b9F5(4%QHhIyWzh+3ya zbd{B3{F3g|VLyEG$WI%~HxJG>s80`=dfL&2r8KzS6AaRVmZ{T>S397j-$1vq4WBUz zwt|soHl6yZ>%U>dEXG&$+W7{bwY@E0T-b0hC&h0vx^jtv`iPu$z$-92!l=viIOKwF za;9w2BYS5<)WafeD&dk-*V?WJYKB@wfEC<5W{#G#t15dZjRc+67s_)nN2^kn6Nk%< zr<&exL{HabyQ?|Wl=kwIhtH`)KT}dQ(Dc_KF|#RXj18sc?8aBJC3kI-aB)w1vxWq+ z*?v6Mm3@X(Y5E5K{exC?x!OG)KWovbyISi=#}XGgqku)RXlI zxBi(?d{G8+c;6gxa2Q)q+zRwn1RF`iLg2`U%|g{1ndHdQcyfEGllx_iwo(S(?R9-A zX)kThoMd?Mw=}kf8kbjg#4Du&J-b;Y%W1eXfjgc{diaYcjMaRnY12st{E43Q{Sr*&IyjhfjWr^nCi z+3V#J_!%&MRYsp^cMNA6)h)aS*m`nq@LaY&JWcJ{Z{A<;Uj&K zB_90PQT&DhBw-ptU0rG9ttLH`%KhqNZu|2_M#U-EFg&v%GjA z7`G@6eJb36kB)Hr173Acnvts$1&KN1NlezbTFJgg_mIR1sWOS(tRLn zooOAGoIXR9&x~7IiF0DSlqU>1kxSvXH}!Tzhoa9$$;6WjsaSL@MeQNRoYG|syJ{yk8Z}%bW2KT##L_g8fPv`RxCo;JfQn-l16XI@VOwz_I%jAS1dG`fXlEUX zvD~4&)?Znx5P^{ZV0I8c3%7=_zd85jrQU8N#^a2d53!kV$H|m$Q2PlbK6^F4Ro&3Q6>CsWa=&%ZilW%Zct|@6 z*IVwePPq(`1rM;!!VEpg+N@nOblh15a)jFoUAS`mJ- z=Oc`z-6orrv$Nd)Lhkqrj);=e5CLm|5#h@)hEFlL-?;J7x>bb~>4q@gcyZ8lT>yFC zd~pWjWhT?dk;k2o+SIXQe9e|IT()KrwVYk!oeS}2{p&Yx__Bydi~Pux+6Y{YG5?tm z%8XpRckXqGsf2FdmRQ}UH#0=oJwj9#<*qqG z1i5*ydh6|jF22Ay4F!0XT>3&{jWE&8r}a-b`S z_127GRv_=-kf0~`FrLCQVh=NsluSZOFBBm!tTmS z*_C@NbN$})qCmgL_ZcBKoIBX=X)<*1T- z=!f2{-DwY$>xLL;tSC%F=3Du;{+ME%JF_>$>B*$)`W{INnDXi^jZ9j2-cm03xd>8b z$9dFlk2#F}dWnO2#(n{e>s~smL)Lg;69C zt%2YObRiB$OQmW?K4Duo)o0gfSP?`Kj9BUcW*iRF3R@}}n(rZ#+#E*fc_Wr=omdEz z^^0x=>dzEdJx18EV!Wim3rvZlNumR8KpEIF^+fglE;5sDh=E@9(-~?{o-&r%dOQ~V zDC4&R6b$*}HZ2ou>BwYF^08~FlTP&3`W>}OFvhg zC|ikEtI(QPuhKr&6$J9H0?_0a5KLGmN41m@1;%TOpjv)(qy!Pg$nSDL7xefW>$GV> zsBjkSvP~t<(**ZL>ZI?|`VBJj*4zN`p%6{AKMKSxu=4#hLz!W+DQth{;;3hy2+@k{ zLr`TRHIL%e$9i0vCVA#iaph9s++)FlNjY@-G89usC9dvWDq$jR(krZ&5OH~nc!}>}K2a^P zz$=pYD=H$R-YsntYous1_oT{Sd zF7M1F=d-*lr_6sk98<9rH!N6Mm8f2Am73bj?_3j$<%o;%7In?PpHJHSWA=xJd|Cf( z*zx^ubdP^RZU3+xu(Jed)0JEe?OZHfJpWqB`u=7A`(cHFPhLlmiZMS(Z6zPWXPKmw 
zoI>zA`UDJcytV|K4Qz$yiQWWBSh}cXxXsKskV*Qu8QM!)6m=?52|}CIeKgiV+K$ zM3CO1)+z2tw;{T)SHSRB#a z)f{mjZ3H&ps5+=`E;>Of7D&5A^@k@HQ`Z`mpVE4{c(-MpBSh6PZ?6o0T!kSnT zT8Ax9y}cxai9zySDaU&3y^i80-Br}z!qx5M2scbNxNhubk2OvWkbH&q}M zE~D*4PAu8ao?~siy4Q}`_VVn#8AjkUrawz*7^DI;nj<3tiJx0PCKzWoUw)AX*2xrc zaU@wQMh#p7b8wpYZmC64B@nWBNloNtbY47=#j%#<8~YagfB{fmw358bLz?G26!L6a z+}X?o=`t*9y(aJd<|JkV!s4%BcFKYWw13OFJW zX~)bJj1%g{Xle;O)&D(8;$?q(H35NDs(%Bk{`taE{_QIIr^5^=hne-iNw5ENR9GH~ zFc>2!hq<6&^70tepD@|ZNh!mC)cafTEc~4OPX+kX6rp8OFs|GknGN1k?o;jzFOMgn zs99V+mQnVSvkadRD+f?iSPSl&&aG^LAyeTJ{p$569JKFi44|m61_kK>wekI$l}Wzy zY#D##5{v^V#`MDm^g%+?^X){F039B7GgcYGO}&B0hgpF@#U(La*(~##cH$m zfuEELJM1n3t9f2Xan}rHw1KcVbW-w%x87@PvkHH)l3X;2@?oYG+Kt(Ak>L)Ybws6` z^6I@jdfO3BKmA%NGtd%xpB3Q+$0d5Wz1rFopr5ZsW1s3QY=T{><+^&g%J6us;afn6 z(b3O@y0SD#=*3&19ZT#Cz5Y{oN)3Z9_akW>4?>bquiZRN_Yb*|gH;I%`-paF&BAIP zENbUO!;2G`H8VwTHlb6XAy@SsvKt(`j}+UyFED|yH1UW{Cw2t0GQVU^b+E=4Kpn7@H0K6+s<{ZkPn?^8@C#wZy8Z`_9+~j!V6uo zf(*XOWE+wdtB-8jFflz;`r;<=4&Rsl<7!~$F^Uu#h?-YG<`C!B*Cr`5pd2=(9VSns z&pC80CR8{TEQy*16WEZl!CCPZ)c|yYDajn%cZ?P#;S8;i02?&cW?47AOg?V>h4=eN zpe0JZ?*<)-v%h&=|FiS_A4oOMWFXpx%YRAcW7Gh@vpCn&ASrO5CFh0d!7ODKe7<9s z>4%ELhvFh4a)_YxI3()!yM*FoO|<5}g?!T2^}Wh#$7o+^TIfF1eueiBXPD0ZOc7s5 zN0j*4yZ(a9`@(x_QXiDWi7AN04&7$MtsYZzBsY~?Iwh3UQr%>wtzwM5!j3(Kxvu6m z$!yhZhubCwJlgn{Sk6$6>B_7i037!kn6Vkj4Z)a z@@lhD>&?;u!Y@-M?x-&^$my`{7O-~O`jC8uJJH9wTOADKDPR~_MbEAEt-lwWvDFtH z`|352>yj%y(@I#Qt&fS;VF_Ad4qhcjdv;EsSVSG~!sHw-QfwXo={Y_mzUbiWchm(b z9h@Z^r@5ucjg+>iud@4n#FmzvpY^DuAF2K6n*-oj?R1@HR@IjDNb#z&YF2A7_}1A+ z4=kJi`OGObxdW2_2;z%0Yca#Buyyc=4j z*Ew`}A!L-*>pl$IuH|{I93AIeoXiSwU|a^udJB+^-!(4fd@|n7KKJ&oz`|G)lR90= zRgYWMCS~Hjp58=YeR*+r`c_5$Q@jT~MX1l`>TzI2?g-2I?DEr{Ls0^?_kEHw)9?6A zrTvAIgpIM}OXhWYJ~i?g)v@(Cmw}%t$(LL|`^xIk+)B@MmD)ekJ7-)KzgA&8!s1pd zs)$<9fM|K8K?on*G5917GYI;bOuR}u!Z!vmwDJy6_$ZZ+^qF#`iAXcNLa_W?`dED9 zQ1CT{qq|FZsTd4#9X92Qb0S2=ROYnF7*xmvr`ueP-)k(jybT_z?;}U^^?K~TufZ3F z;}>MQ9KZ?P=MLSEcYtRmOSpzlrs|?XiCWKs^$Cloy19g9S*?@sczJJ6xSPE>B}sTf 
z7N>NLXLl_o>@ild&VGgj7j&wM1k7wb-!>utz^TJnM zyv(YQhi}A^@<;%8B`{oNKKx7+>_mXbHCKD;q;=@fA&Xr{dW*h6=^PpSzS6kAg8Je68}WZY4>qlyF8%!L{7&) z|1rBJL1~QMgStSD|A^22@0Tj;e+_g0-!Ij`GPyr4RlfgUF4c}k(25!=jhRfOk4a?C zq6rt2zuB;8=l(-iSJXtfGeB9Y9}{>)aUX+J_HUPJS)mo$Fpk-jFsz(+9_?>_0s;j! z{cGS4<_Yz;NO>XhSk)NAMvcw_wXkS+#AP(C7|^9!P3APu8luxm4>GAmm@vow#d?Sx z;|f^*+TJZ%p}IkLFxbbW*>K#v-2%8?tobF*N97i3NYYpp!FTJS-h?l*kI=r`mQ_Ke zDJUdP=#EVh9I#%j<};odJKYi|yMgAKQY#5?_bZx^0(R99Ib;f*%|bH64mjza5uIXK zJtSHbKOvCvDU#1yHsi<9YQp{`WI=e4pLD11*Z_nmN~=3}3CA8$YYw#cuH}kXht3zN z+GKXgx06FOvv+D22LzXOU=l`N--DZ$A0QmMmQ2zLea=p-uGEmx5HX5`dRGB zq*?N4nIp_-5nCA5git7PmSbe&Thz^S6+M4UMqPdOKYRXt>@wf_$o~qfRrj2gRgQA3kRJXiK8y9~^ycgZlyiMH z?eG5!Q}CU>vxB$_4m-oouTl{rVr2EC1XxKVFbrdcwid#3qJWt>ti*ZkmHFjtgwZQzU$i(HFr;L4VL|q=cE_v6Z;2QwqB15Px{=nrR$QE&a>3vpl@9qx z3`n$yS)^80Wa!JhSB2>mxoUfbVO&SsTh~A*mmq{(LhCHen5MlV4vG@z0%O<6(hgy2 zG~~$giM3B$HuU0rc&s0S>IWrz(KVSADLE_eIWY0ez${&q1v$`3k5t8F6Kg8E@}9q; z#0FGd)|Re`C}kL_TT>to^GR6>E3(5y-l2?RO1935nMT3R zRdjioPy&py5})0JEi0mBCoh6wk&fo~U~@(Rs;Jbd_T)>!N)?G7tg!wNwSv*k`WCBB zqfC#vsH!F9p!uVT22Xj;c0hjkxSMxlzf}nlnR9z2SoPp#FL=!B3yt->S6_VX3I?H4 zk&DGwbkJn@b*D70WX@_ z7(FLLi=1CeLAXs(qC~&4+3r7z*~-es;fz-}S#;}b9>0M1b559vl`imQq&qzOZaoX^ z)tPIm^372rNp!Up5v4ZPEl8%Nrv`!S@{cVm!D*4y&NEy7D5 zAhv`a>E>-6znL~T(vJlMKn|BP!F|5V-IoDD!4CCuWkfR;TU;lRqpGh9djEWW9xPh;fp4(K^$CS zK|r?3X`>B_;PA&B7dc71ae$%CoeAyG6ySiRCg*FHIBTKZ;gPEe;1t(%%GtdfU)MMx z@Vb2)F0!3PYx;b9=9D~KuTy zrc!VA2zfi&kIREAD1g_`*0HkGQqha3LLVR>qlZt#WFyzY?Ae>M?yyE~2Z1-Pl8(H6 z7h3WxHeg3l=tVlwMM+24HfH*b^aU^?y@^=)q_^xk=g(yc!EclM<|TrDvJHKkgXw9U z9R&zG%G?&0+R1J1yT}|$y}S}obgg`JSJfAiyP<2I3}EaEQXX-K$F>^9wd(C^Cq~j^ z`wUG2IgQY|?#;o;P!ROF?HB9w$oeOg^r4sKx|0^D-gPa9Pk|RpeXFQ!eak1CY#qs5 zqdyEB4twQQrfrY8bh9;VF^&MHG}30A;IFHtnSq-JZ_ql=+xx31X0m-`8n!X{dRf#4sOI;@Z zinj~!MI6xrJ|{nq)a*N2EpoBrMrz3~dVq`k@?e)2F2oVfMofyH5}fZ9YCwSOn#!;@ zS_|}f{s7+{KZN!EP0CkSFfspAn1B!q6g9el4Mp{mL{5o4@orBp`O_xD-+m)7i%es3E zUp7DX8JUh3d{@@1uEb1>kw;7UnPC46>37!2HR6%yr9R zQi?YTPJ8nbkkS8@y|~j3x`j;U39++Gq(`jBZRN+*HbJLMg-);7X&S 
z34bn!$XVEepR5-v+bea`m%O7tNN02XIGquo6$LK(zr*orEtFR|A<+i`y)~SWV6P#`Q7=~$^AXiJ2;QYkISd>t6QsP z!K_r;C5u~ALB@xO*~E2b+Ql7xC^F`5IKMuf`8wg2XMET^V!QXynZF-w+RJ-Yn*QWc zk>lUsK8mOjRQM6-hXa9af_-y&y-G0dAh$|WDlX!V534~HLL*#?P;g{5T9Nf#%>i@S zgbLdz{G>{K_jjZKE|SL7ZFzWf+<_8}RO=Me;$HPek*lg&i_MA1i1NuAsQ3qkFawVo zRG%x94&i{z45Xp3G*R9L+z70TJ^3oib)-$bOlT<=l8eRT9bT&#YKYRS_uW)h=%4!m zYe~$J)X@7Zx*h3kB6lP)>~gfp?6wK&2}@{I%AN&Ts3Zdhla)H{tClJgev6AI8wDEJ z!@kiO-(E(|YtjrQvqKMA(pWK}ZbrvOclZZF(mXmW_cyPK&#tRTgIfllq&0KTd9ccY z679~cVb+WkNi*^!1{9B7pTfN1*!`5)7SRSAAj?g@O{ShXK#g~A8Uw+WtJ`en%fpBU zmglOzmnM5ZV^kW+8C0u?EI(7L(4-EnX6Bb>mB%BrB!rk3BHp7-8dBwiV2PF<@R8Z5 z%;`}SXV*Ad)c_y0T+%``#=w7GzKc2jpqDTF1`BHUbXRuC z)!=mxIvO|FIbd>}5F~%fI^dJhzmbG-NCHdXOz^lQmjy?*#LBv9gwsSgtb}#};cL{= zY2*o<3M4|hgr0q=1&->kSmu~N-hm*LVMpYMboWR_1>W$IMw7XsOZ?2Sls`^}$Nu5; z5UAPv`5z&a|Mg_B{2f(PM0rIjLL7V#mA3_j@qQ*sSw)NvN))wGUgD3Fv8~zPC`Bnp zaQo>ofN}I;>)|xcX)Ot|xR7AsXixX<)URVx@6Ut5)@c-9A-~H07^`-G5V>Kuu|OltRiVB0bSy z2rSW09*^10wqB{m*5z8CwCQ0rMnh?IOA(BZq(v)|e;eg$t7UdNG4|#7HROh;F4JJU)g}b}EySux) zySoQ>cL?qp+}%lVcXziC2-Yh*=iYPn?yqlu-Btaot5{V~{95n($e42uG5liqKm##x;O5|-_-1A5mg#mJ#ShQvGWSI#&9zY zGYs_dgyu&HGRx=})s>uSNiM+@$0&*6V1iewO|W$~OZ)v&oho|@p<|W-b5pGY-<+gM z1@4GGH=Z;m(@#;qcLd!)5q{NL?d_Gvz|G@B!dO;82~uj=-sIP0*0I{3KS1r6%XqQ! 
zr2A{vtN-dMK7-Ckz+;MeW;g|x7 zJ_+l`#fO+9HoZKVcTnJtP}ACT$;!{3;Y)u7=k$);S8t!ro6Wq&XG^j#e-;=1;rhD( zI`T6*2V44609k4&2O!yPlWphP8Afi34-(8|eWse#wHR_HPwxVd`N1KN;HY@%UpF0{ zG={rCvZEqJAV+xX7mzsQs`YWo=Y|SNn6nlojULL?`Dh|U+z}uR9wc1Z6tm0S_&cg8 zLOXIv+Tge+ggi#g&k+ z;1X`v;x->;d65Q@<(x&X7VIGoWZsOhm|J>D)#kl)txlS3a!3I>N&tQ3j7ORG*Xqw4 zSs@{_7*Vf=BY|n7I-^$Nj-xCac<{ooNdOpC@;ck4OmKkAS$aQUz(|MVSo%!m-j@{a zO2@IV=*lv^cu~x9)WLE4{EFK*|53qZxg3+;ZR2Sp;1zat7p%yEPPqJ~_iX@8xS5xS zx$wl3$U;OHl1lrRx|nc;;q6g#DAUSYKSMHdtoEr@QLT{qJ9eHS?JPz`2+I3_^-SU0Y>!q&JhopQ3W#tB0*2F9YVp3QVhhy zU5pv!TqYO_A}_QP+Cjfm#j%624KB9YpMMp!YwYZlfk1U%74W70dx-zf+y5UVyx*g~ zKb7o)hORoYBvwsD&1DVTN9;rj0!SkPfsk;>vT(4#@XvXI?rv;w!_xvJ`X$I(IR)#)tIA^_So4&9_c3eR^rmIi1bibvOe2wvIeE{rD>?JMQ-Q z-*44}r)62966ka2(UlhmT!#D0c0B2H>%-1fn34v=i`W=WkMT9WX%!dkVY^*l&p9!v zOQk(r8H$+SVH2>(E4dUI7SVql%%PtMVYGUZjP~{$j5j2}eR`1)oQDwii+*z)J|R1c z578N0t&eo`;B~rUcBCKqUOsQhDc46YCdSj1BAxn8)FLBs!u(~_h1&k@z3hcQ*@Nmc z;^s|Zp0@QdbLEdCR9CfXNq#F=Yuo20t5?w|u1dAFIIeXq?)S5%H~L7fLbdva1Mv;J zx-|bbBmbd_Ni3VixPH6pF8;#Bsjp?sRtpVkvu1p^ z`nkGnuxW|f4#cPU8rOx`77}V~(TZnT`*u#P>9X+r4fRintSxEv(8Vr%rG;77oeHeB z(Hc77p;=gBZy`F9RFh_j&9)zrcvi5U)WwZ4vbh6}EkKkK!Z1SQDi(DxNh)jFk{9Q^ zY6>YO%8dGmBb3F+GS;y4JYR>U1k)*aZr25wCEhT~Wcl0(0##~3Ai4Ekr^p7)=LqpA&sqL7c&QxD`(uz=0FWzm zUP-R^iHOE64kK6%@p0<&@=xA&wW&@x84aFyHj{nrk=9T6(7lkseX-$riosZKayM5A zHAc4)Wg4!-Cj-!I6~t=sk}O?_$T5sgE*_5&7=s6&P&Sr~62Th~pM8t!pp-c+@%h5a zzjO#@zKS>>*(LTqiQ(MuAgE=Rfwv)$z)sG(V#?Ug#VK$vlFcD_Q-!O>xE_o9bG#6r zl1$D5;-mP!wbKc~fw&@fuFKGc5jw;P)__I4k?60=9W~3$u5F1l@&)OTrmNDi2LS*` z?SK;amt@_qB7{+Q$*`)%!<11fF%+i6s>TWRGm-f;N6g7Fs>b8PEL{0X+=E>LLg#iY zZ0{bPtau8;zO)J}kBFIo-RBZU);EKPo&MQx4PsSUCS0fD$rI7SU+_)7{4l*9qR6WB zNd7V8=^z*xXH~@Efv4Q8aVOH(AhVK`X^zNKnao@5am-Z&;48DQM)P4Ssi(8VD}bQt zxWt>Nuo>m`Dyt9oZp^Dk{AnR+T6j4V`E7W;&>YLRT;5-_0^KjX`eq)$w z(Zr66FfpHV!sWJ_dne>3+pIUn&-5hv+G>CT!(r2G22m5(LV1IzynHA>QQ+@}&nY=e48#(+M z+K&yTlV9vI%tpaKBDbM zUUci-*05BMQW5%(tE$3op`C-IgCp9~gSZQZH~N~xCzao2(uI1&UiiCMzUijQzURmnw~cGE%WG&KAtLuLYPiCYFF7J)*>t$VQa(~d 
z@L%9?h21<=h?wQvV+p-^$Pze9nMdM5`REhG<>aZtt4`T+w|_pO#NA@#nq%oFr-7&? zg$SADK0Vg!ep-m2ud zw8+|sz!!G11O4kQR>Al!p?LZr4Q*O2>_svbrapb73DaLBV1kQX1Yc{-x@gIn*#IPC z>?Uwu1#<$e{mF8tG{(NTXl}0tHx_VaJU5rh?hOZa@b_l0&3O~{Z>39tZhds?haapG zA2;j~k48Ii104dEJs(gGcKH}M#amk6hbh+r~{g&;61lt@TW zXz!`&VBcS`F(Rr!*(KupZe;Wp^iz~`(_NGB+YdL0eSLfVz?ZWNyV0J_%K+dn4Ey?Z z8FatyiD_zfze$6zG;i6S3%q_UzmF7@cwHq}2O%CuV86%TCL<1Im<$caa6?H*ipAOi@T(uOt5?dZQ2R#&z3Y)4WoSgR z$w9N!+P1VuH`Mam&H)&YzGn_)YjM9WBF|SI`RLjer#g!RrRQzm3s3x{P*!T~+Q>FgOr0aJ+ zAG}_3EZCR71j}Ehqfjq^12KEF|HMoG1i*i<3IFMY^A8w2CO04e@SUS&32r`rR-#b) z9uP?k<|s%&B`R7_#tt{fKn0{{NR^=C_Z~UyfbI4|Tq{D73(~-fMB+|$n)8qNdHZ{Q zstvOR?>d0>(f;zS+kjz`2D8k7z1B0Ac^|HrdI1p#gRPlOV!NzCP-#yh4|`7_-xJ-1 zw8}@qIyaB(TKqOoptLQg0YwQjjG;Wj-J~6K>Y%x`8Iiji6vjb@d{ip-A#7%9=X~%W zu%ANPDpFe6tYJaHrMM!~O*F!xpJS`^AxD&V_b1U>q`YcEEv+u70T?xK)j^=&5hF(= z_vEM(nB%Hu;KuQ%$;b&ZngK<(K6mh%fEBfusp!x%=y#5DA%&G8|2Yc{6N;Da#afyA z9!9`H@NVr~zC}gWR~~DY0LAoBST5WlCK<9u>*WGjjZ(?t+Fy1#YQqC!k$&mwLD~qyVlHerXVrKfmmG~5pvWiS)O0W*oiXx9dpG1q zlI5q$l`j)gG!WANl;03xSm8Y!#us93>WrkL-Y~*NadGHvr_R&BYUX_4dO_-GD^$0j ze=%HGIZQJ6LL+pP*fY(plB+yz+w8bRdnq@D^T~Cvw6A4Y561RB3wJ_(~xE_DcRaZNvK8D*;a1WKCTx>`fT|R4A@0Yj&#w2)r;KA?hU|i%Ging`~6r zmJ+j-R)fNYA_WCaDhO9m4WwcQ#Ksj55WiS`EsE8;cwCJ0Fgp zSG~Wy{=n`f%wa0hei&rjFXxO0M74MGvkeL`3~p)xtY!5pbL40SSZ&WFjF&T87FNAl zzBXZNX)m4b$6ICaO1C11eMn-RW7^;nu0aV@;s(HsJYcp}v&l+#=KwlpEu(5iNZ6BW zAcBX6>%Xe)k(o;9Z;*72BodSg!{CxFij{!yzOE! 
zZzm{33bJfhP0k@l?wRE0(1l#Ss+~B>htF7C2kdB;bIOarXwTP1oaRUuY?F#(?Q{It}2I( z)m@j~qq}6bd9=*V*YqV0Yo?ap#CF=BQOTQK*QuCak7E_`z&l(CuRTF*_7Z_8tT_w8 z_`Xeass-lXo}|Xm)oG|X%lYsvWiW0kkg|-4LRp0B&OZVLhmmFiMD&agGAl{!ZY!y7lu;@ z;$?&8lV-~`v$^&n*O_arB>Y20oe$ChK80@qY==;b&}^aF1L+G0XgV*?_1yVf8fo_4XFo0~*3%DhiRJ#k*Cg>` zF}MtHJ%;^{RKvedfdA96;?Jt|RgzW)Cf#42_>NziflBKMfPjBWaRDSrG+g}tzi2+E^ zLJfYa^>RDh86Weo>fM=?A^0p}Xcnrc=6&*2Ix`0V2#7Us$?x{uG+{SDkV^FzeO zI4a;izQ(59x!A@G={+Fq_JQE-a5w;Dl6cr(%$R`lwW|x2!o#9wk?PBcbw^Dc54nwl zX?;|4TbC9|5g8Vi$3w+2cIq#h#kzH%o>=I~eUNkS=6q;SDoXz+J!9;!4> zK>TOssJ>(WYtPb?gNlM-LP05dFAR-rr^;q+N!!jfbkn#2)>A1!;x>Id|GV7>K|*So z=AK7%NS~1HqgcZkYURAN&wdyuCxy0RX7;W9*C?oFdh;d>5hq7Zq_qkMf}s^9h8+uB zqPX;Z14!&bXy8=@%&>q5h81<|T0OGoRo8Q7#S5zz1{|3B@V)|>llZe`{62wFS>50g zz%AQgM*DVaYzu~AWCiTa!^k2z#;~+H@^CPr^%$5WR{$blAMUQd+QWN3o3E2ktlaLH z14~e^%-cIwyYBkX1HPigr87oGyJo4f-z|$vPNv{o8~n-Cl**&-+5sVi!Cw5qDO zgklE1HE7mMSRt@6#5sasufi=&xhUl4G=6mL7sgq_3Zl8tXT*ok3YenOWU zbPB^tdlD44F5qZ|t&K8ow*cj2_flF@SI}Q1SK`t+2?vxxERv{2Bho<^R8B zU-ECZZH$UFFv}6cZ}NARBX)Guzp@-BV4OMVA(^rfHtGlEOXl}zPF9Q0Ab(}OeAe%^ zFnT$UW9lx&HL-yul%4wFc{Szz_V)aQEdag-a2m4X=GtPBnaopJlTB(#)owSl0t8uy znzo9DIuL0ZV`#MSBUv=sX!vw&EFZ|JuAv&u@Y?E2XJx9&U!7k%Zln-ULK+JUBZOj- z@%WxhG#7V-+M23v%mcgPX`{hZGG%}pkv0fbiA5Bh;c&pGLI7g*3QG?|71_70L+DrX zu{YvBQI1IrlMGwiHH)sm4C;ycYI$2}SC6@Qmz?iqm{7<7YI3+=y8&!dA5aDvNR5%^ zfKiSOGe#N|`oE(b`4pG|#)68|s%knbMI|Mip=E-l2>I}u31gqPQOAws#Bf@`s7eKX zwA=xnU`SaA)-%g)0;dxV4Cn?%SHPxs_=In4HWw=vWb(BpW^<`gV2sju;CmT#7*A6V zn{lBX1YDciM@EnalIhhD3$Ny&>L^^3$jv%xV8#*yS)7>aTSpi!LiSmr@3}W?b-Nw& z?n?4r&hyJDQn7kB_|gM;paZ{K1B7dI9f6)DKh?hTlcwAW zQaw9BM|x1X@DZa^`^{02e&3yJmjEN5D(s3*suffVvJkJ{FjnqI6Rx6U)7rGe$hoBMN zT|A6=*4rY%BGd3i>m}%M7O@6rmTo>~L>9~37o}zwYrDF}c(^0Y5g3L!D|sQ&Fx@U( z*gN7!_~vJM1KD!Y;~c&3`OUcWo3X?UEHE)Q=hjsIk}kWrZkCa0E`mi!ar1ofNW><;XCN>n*yp4mUa`rjG2e+)|mO-%kHx-k+MTquAbQW-Or4u3oe)9Q8&Nd#G=0|q}Wi6x04 zJ}Lr#)DP-G;rOMsKC+_V8@PB``lMy$Mcp&U*2Y&wDHU8fOX8(#WbXB&s2WBz*(P6tZy3&U;WUsA< 
zCI!Ya>hoy?Mnp+VRLBNjzs~AvLIJIh3*X1!GcA0cw&AbEG3B*MQg*{~yeCiErtZGX z&FS~}zkn%>AcMDE$ni?+EDQJ5nq`tBN5OLnpT-Q-ng%Og7@cZ#UvRl*8=5mN9jA8% zY%=~J>zZi3=pJ}27q`d&65;)l2=(Q*Ep$T{oPQ>MV*{)qel01?x#sFbA03Sgo+`vH z(e$Exq5gTR5?EmLsZz77b-0#u+uqZ5G}|xcIOITj*+7HRS{D_m*~ErO9OW&A*N_6^ zJA$O$qyy#3K@Kj%)bURwCu3Uh{y85Wyl!(2I_xMa7ut*hH1undb)~wPENC}}ER~Kf z!93G3*)|*{>p|I#WS>p=EE`4?;&C^F!rMgF{iKLsVyj1cn7XC*MJ3129ZXUjX3KCZ zoe8?$v|=^%dYuh}917(5GM#tv!$+);JC;g0Z6r%FC8==loVno^RkFSjYFJL(S|Uvc z9R1wzTF6DBF+q|XhVf?M^Rw^aB=eZb8i8uueUv+hG2+ETk~@GnL4A^Cb}r`ohNAeA z@a0-Dzf>k!E@hW*DW~XF>fO7)N^9lFvht<2!J$oH6j5(ci0^mHL~kvS&C;jL(xncV z7b~~wvie6pO&bAJGJ>d<2i|b*;XENYz64wYLrc&YNp zf+%mHz(Ye-iuMzI*)d2LK`|9ka+Fq|8g_K^jxU9AhY9yuqju068UFRB2>4-h|Kyu- zJHnzhB-sXLZ@2EJ$*0NUp7)nWD1xXKz)t;!rRq?uB@}p<0@IqDcb)D5nmJ^3)~pK~ zCP>6S-hjgOel!neC)71mK=)z3i38Vo>G-5Q9Osh{u zc5D?_32lSe!tO$rux~r*`7n89;I?qpz5j->pcoWmK(Kx3t^DgjI%thD7Rh?3d>vW9%5iZhAp7 zr)$GR{DVIJejErq4yIRtv!RgBYiZ68AkdL1FHn& zz&Sy}S&C4N(wOaG;(b4Z7MpbFNdo%5ES+SCb|xj+lrX1U-(Lzc<%Ag% zi_!4&DYpba+8D zFh8-yi=f+ZQJ07p;dxo529a*0n!xm`_n1#=VuyWT%>%$ zEgnjcHKWQj4##W6Li9*ON;yyH473~{4c|@`C`K*o=Aa&o{(ojMr~l$l5Zafh*u>Iu zqComfMZMB%kd{H_v4WKGxh|WDVSvH!wEKbso1Q{quRGe?qnF{N_;5aYA6u^h2-nK) z+PBMsRoey4iCj5~*m1)&xX_~9uQgp%F9iR@ZG)ZQGXq-P##{lFt1D{$X)==!On>6G z`wyL=>-M+oiVZZ^HPJ)1Q9N6de#dQF)BDY0PwxQn+JJ;QE@>wbvK~ulHjN`urKJy= zPD(AZ+{^|BK`2UH%+fvHkDv(gphE;n=0+wh2xRhjoSOm@@ypKWHlid8je6KZpzZW(boW7ui9x`7XHYmp!Hv&x?6eN z%`)RX(RfyP%isMYx3Pr!99xhq3I5r zb3_}B8JBJ~ej2BSJhV?~C&G5+lTcx*9zp8VRuS~LsIe73qG=9-q3~UPqd+>H{SG^A zwGzzNW9PLJt!gLb{#pPO=*Q@7py?GvlO2}#k3>{D)p;%1e$rhmG@SuJ-;s_kfH8$J zvhD-j-lts~TnpAv%!3{ZCAvVM=N0Y3d@zmrS~5AhIWooUtomjLrkv~4DsLh}gx5I{ zrt(LrVU~eW-yhPfnuC|A%*^!>IN>n30EJd!5`)Xns^bibD>KHz-h`wc~ z$5Seggh0b{2J9brrs=jT^}q1!_fOUA{Kv}9yqjbS2P*^fgE=-52Ue@jQr$+Aq{V`k z*iO9y*pT!Jp=q#U0uvT!^$y7fG`xA1wusnp`C^L`e^Oll>f+tziw z(5*+YLeE9#R`=29X9{Y}>}V)ADFY!-ajq?*3_|6<+~&1v-cSQE<8@u5#n``KSpG7Qs`c>7_;jQXQ11*`=7Z_hQ#d^8iZP-;X=;|Y z{%v^9#e6lfS#W#x>dRK}eRwj>yHOaxs=k?F@1zEFzoPk-tE>)i7m2?{oETBVu&R{9 
z$QFTfy*;w$T>OI-l2ni+QW{?TT&un#fNi&>7+oq`)W&fp3m>I8@8vtt2S$a~@BOYN zkzV~pJVc~=ce`vFDNp{&2yx{xiiI49kDw1Vy(i4ZCbljKud|I`#Fu+?XVU&s5o4(K zER0FHR_EKz{N}HuPON%a)il^oL41Wxm_DBueCw)wA6*f(yaR6Vs-m0x1-nwx>v?8p zC~TRu;94zW+|6-AMCI*mOivh`Z6kk%bU8UkWN!W-k+7MV+JDFn%8f3`MOv-GWor9e zwL>Vn9~5A4F&=e*)z1S+~d%SQOr?+`Oa^(QK6lV6;3rtvC+Sk>0ZThCGpX8b^^~!Us+W1|;Dm3>nx}N0X z-|gh^Rijw9_x0{T%ISs86umEVgxzEbi%sI~Imk1-TVfyX__WR0v~yEAaa@aX%8s2l zPCBN}_FWj?Yi?fELhPp#Zfq}n^X=ipq2+i=d5s+#DeZZ~t~4s`4Af27^oLO)aU>*9 zHfNrK>Of|Ai?T@&dn9=mM&uRZS5LGQDfH?C9l@`{{6 z~=*GGlqQwSw>uBAmMX%HvZIJgN+izN-sK^Fm4Jj2! zQAf+O?c1m8q6gc8KF}iWy9{=m&(p_kM5ydcDYg%tLB5mbDdWdPJ~cxudR5fXWzxaD zoasZ-ATjx}^A3k+1srK!OvU`Z{qU#~z|i&TC!B=$7}Z5@#}1;EjMX514Q)*KhXM1q z8^GFI91!>J@$B6VgG6N@A+oNDpfdaMzIIN1SC7>&v_2b*YTg5tvEYaXhF95VaQhn3 zCCDY^#WWpK4%?W4uas6(43t)yU}%(j7HRnUJ3{u2%HdKN$n%r?hv(;?#tN=KzR|z$ zqW*hAGq<*Iat4_Lg*<+xDuRO74UW-@NR0a)guN6T91JDo^Q3>8n3h&(qGWUw%A2A-vIS zGgFXu?Jd6Ww{U>8sW5NJ@R}>fOz56`5j$LB0&W>acme&AU`TJER#TT~?dClMcQJEN z3K-j|fKDOpqOmjpT@SBx=O5^@-55CaY~SM|*or}aT5Jog-DUyZZaL+PVLnw#zYGUd zFFw|UV0w?_+@H0LDEwT0IXn}J*^AS_KZL$T6K9lIXF7JViZvuvv{p}%27-uS!q59x4TSZ= zfJAQ2@y;{w{4GwoVv&1RkC{>s;!NZIfiF#5OmzK36Zk53dy8Syqs@e?JWz<0M?b+x z2h0|!3FvktK5^DjaX=;k6yA5HAi`mnR z35y?lZ|MGB9J|zO07&ra0W|KOiNOVn?Bi7#L~vr=s(jnWjIIl;l0P0)Yv;!A*w@+Xs-jxkRcd$G>7z1x>?$~XsX;ht(9h-(GYn> zEL6-n{K0O(Je7_rCzUoP6PE|7C!B?%a}-D3+ImaQOJ097xt67ke+`M%#(Wh?1uNff z64)^y3}oSiyDxR(l)v+-@t8~xOtBTRC}xEYg>mU-E&#D5C=v;uZRRe~{I`+P zd_7iI6BM`R5B1+I`JV;+C(-2Z0+#qsy5+x?RgkF1h)IlksQO=A+hhe48cg-k0n*qK zt2KRJP1na_D%fl^pA~&4<*{X--@a1rIITe1pG@DRJzcFHWL&3ty*|G@;B?_rzw;bz zy=uKGKa=>pT5JJjScT(EH;9~6vQ&XL4^bmlOu{&I!H3BDak+x0^;ln`(?++*_&RaE zi<&!Og(g#9XT-KhaFi$v25%+_DG9z^?Qq3b-e8I{nd8vCUg~&;WhHr_?9=|}=Nqt^ z3gqR)Y%~bvND>-q+{*`}1t*@n2DC*ErHc{Onc$rF>$tvkV4wG@)nUX9OrIaxjvShe z9~!2KkxX*U)TnpHXCoRk=`JynES7i|U@N`zUw#f-7R9 z!X}M76<8*btjsB~d?5zlF(??d=S)6#==xGT=~2q_x$PBB-s-&9_P$zBM)$2j>-?cJ z-|ng`cv{Z!&TU`cUR$} zTalcjl&c9nrnXxN4{w9qG486MMRLh1s{L}Sx3!EFnAFp;sOnzfPR<*X#&!(Hd+6j| 
zQj`Z#c-r^92NaQTU=AYK`KumGMN;uV&LsA|@aerqoah7+q?)*wH?^oQQLpBfz(6SX zh#kLQgF{U9*t#xAV#xp3#nhkH(tq21RUBRZPAa!i7zGUvsJxAp)fSH`D_Z3UhdC~W z_Ee#8VM5uIi`%}~Ut(gukQU!7+Uiiz|C%1!Vsi+YqFC<6(?CR7Z{K#mH=F}+yOY32 zC)Ly=OQGY#2<_up`7R$_8@#&e#`& zYXyouDjJ0Gnh*S}Z9!Z#jI4nE5eM;^cf9qRg~?<;Ut#9kqfeOEt1b_ueAizp8TFbr zPqE;JO_Qs^W5r*UHWd8i6ge>KkGV>fYbXL_4=acL0$SO*76)M9T&lDweScdjNBVM0 zf4P0(|4{ya?#&SXq5OX#BE)Q&)Ex~$salSInJZmlHk_sf5C=QO;gFEw`lV>lo9##APKmbdjYxdgnR^-ea=D#LiRU)SArj%pD#BFHVO8c5(!295tdHR&V24_-JVYF zzGl9mIZtmPn_2Y<|GTAB@a0wx9EzG;>9?Zh%T+q+ zNIDD*u+T+kFJfj>fw`wVt^HO-Ky{lJQA5)(`jG`<5D?V)Ota#WTb>=hf~8{LV~>b_ zJt$XBGY~7l3B*blw!_0C9#WM>GeUXyu zSutuF@@T);l=;9cYRVDwH&J*1!r|zsV76aq0qz^ENPL)Z(kR>z7$es<2NpR#LEZgs|cZ{$vIY1>3_ef z6S_`Z8J-(t9-?+-GW`tXCKxjO$(Uk7_7zDh5IkB2%}w~yi-=8^$+8Jy=l5k3!2ok% zbkOtl<`3KMKc6w8|KS#~;LH5CA}IVpwtCLOGw$*Y~qP`_o{k-hmb>rq=l^ zuzKHupe_`)7j63j+Ys-EPF~#KX%LZ+ih7+%Et!)9-fY%YOKaB{3TXhiSJ)$*t$OD} z$;KwOn^kjo zAD&ldg6lI0;`aa6-xTp{?l{mJ=mGuw$29-XxAs7~2 z(jX;50c1p^uh6p?DN@k@3E9J^0Zk&36~SudDZgX5V=!n^zE59;#%$jZ0EutRFGb~^ zbW8K!Wc7yhYtTpT+gz7B6nv=}w+xUg!e|}YX9}9NsNxR_oCB=jkCR`a;$jF!NRg5# z#uD4JF9h?qjL^g2(G>`3Q$pJ*i(V8IIaGhp=<5~;a-iS$#dNjfzO}m9TkjyjzX@j$Uk_ZI!;w+_X|%umhOeeh zhOpNW-3gQBVh_W;!ZU0;ljx9nly*fL2uU7MoxkvLU9_Ro;Qh(sAmoYcs93T~e@*7^ z?LlaW@7U#__Idh8`~2rp_|FiZf8F-}zaSN+k-s1AApP0#MXmZ4G`9*mP%gykVNWIq zQlT#bnMri@_ zpCriHPY=CFIT2hX8l~2F^6EHpWg^N=j3R{4JSC^}S-UMdhoPwr%D+(+)_Cu?HhzB+ z@qK7)&7h|r1N3S9rvm=@Nud5wX?I>POh;&hTTD=lv!E9u=hgl zl0DkF!C5LIBVih0c{CMF?4(jc8`_#*;22h(#_I^OFw*thr^57ZYFJ*-(~rI!$`t-| zat%{NE?_yUt2Z%l8vp5*aqu(kG{b4=Yp2g0FVYWXV05#_>avbop_2zj#D_v1E_E%p zD|nLJy84a|l{Bh2tO&OW+a^0mgx%J&N?3Mx&9)DTQZ+9;&ea&D>I#v1l#4f(M|vT^Wotr?Q3}coy&&~R(|%U zE{}zyQSKmXf@8b2C2NKX z#1susaz~$Py=YCh>)R-kUpaFF82VPq-+eHirFFzNxv_F7@O-GWq#&o(-RlqjC`N1f zgCaL#YT{?@aFJGHk>_HLYv(b%7=L?N= zzsAdguA3)0=A5cR`02ryPbY5pO(z$gIE8H=v`8*z1n9O=Mbm`L zMIB6=jyIHa&sj=-iZMf7jFN@yNC=3Lisyay7K@f*ghl)qT(X%hRRvKwkG;7ibphRV 
zi@iA|b%EJ6j%}b$=KgMT7~8;&Y!R%g5z@T?nFsDd=RKny%q~a;tflo9S~wJ>P8UrNcS%C5kjap=bjzu*=ZK4NQ$9tT}Sa&(oTz^0L}_$B7f8aq66@0jvRmHkOCfV&5G3eozin(&c)M?=N65 z(p^LP1+csrRVTTnzcVv)2r11m%^|4PyF0t1ERMI<=q!feewK}_=gFx^3L*DKrQyC& ziz_Fdkdn~V&IYI(un%mwZVm~CulR;Xc^U>W(z%^4t@6G-!>{A+Sg7*U-}zASh;o;O zsaWfKaSFeeYu&3Bv-A##s`_Hco}|eB)md{n{Bqh$h+##_@I*JDGUBZ_LN&qJ{7x<9|p^M|?M)qtZ+*y^aJ4GEqgWC}I@Ws@?Vo8O^+Oqn} zF|{)DW$vVx&kO6#E2bCVrEHQw_#DGCvvM#Sp|5GsWeo@M)1V*#+GvOZnGjOYAm~CV z208!}wJ%sc+#mCup+9CKIC}C}6yY?V1HQ;R5wKtwN!Cvi&_>e~^q(dC&EGN6g^(jk z3hG>jU?EZ=ej}`q%E1zmLR|1Rg70GtRztjwF^6EH4Ti|n0B3ragV$F>x-1wDZPZU; z8~P9eb_DJyO`I+xnrNT~(c89i`jr8$t9)q-o}u>m18}gNL*t!HbT4CYwq@_ z$3*r*_2He7Us697aYXw^`~{94NV@Hoz}xpa>XW{)>p7~&4|u;;wGp*2aZRAK7Y*e9 zR=$5;nEdk+>%XJ!nf?Rdicr**eg}${Tv)|h_(~gp2z6rsP2V9JiU9FBYpnMc%67$I zq;6y!`)0ut^t;@g6Lg|8K2}Ei@#cedFW|$yw-*$n&{u`8?5(WqZZuZVY~^3*%|^k$ zHfldfH62eZJojYpnP44%sSl}jP;91^3KaaQcj31tSf@pvcBVAUlp5@<>ntl>!;>0* zJ(fL>0T;__gP+ejtS)a_f#Jhn^g$Xqc3sPLoU=#MBU4DX>9zu~6r;utAL)klqK`)gDMO4ehOaWVQcQ88Kr_W>NTeAoq zzwH6{&cg^dpt)e|kICVmdsj?X0Ot|g%hLwh`)X4Tc?1b+g$n=xw zC5u9)m~+RvMN+={-J&`%*#=QWxda5-1XyZyo_DVC_cDBZ{fz6|9r)jokyXJ+3#e7kKrIe+{TW~m8^elAdxhfhqB#np1LuFNrjMqwUclHB`&APe#X zM4qL?L`$U?nF6;+JL>946L4W0DnZfC<%dJaQbfoxjIvmys!dSp4q{ZRaZCZ)fH9+_ zQwTEP>NXvu>`JNnZG)$pX~aBqwH*EuaH-Cv^Gs>PVs0VzGrAQ3C0kp#&M@>KCxyiE zK^HX&*0lVnT1g^fPKeT?;3TU!Ba*f%sc#L<+Gs(W zaUK<|9mpAooQH7Q@9$-QBw*;?R>+WAGXeTbXtuDXU>sS~l>%btX|H3WQUbLqWZn@F zCR-arO&o0rrOTlA`sdgYoMkr=RBUg86re z{IRNbQCg7;04+PUEvkbXe)uj>lOG54iHBYGn_CN68bzFIug=H)BLji2J0t*`6ry*)Jw=r)IZnM4_^p}`zOQ5U0L4&hnf0|faTNxO70 z)-}6yXW>C6f>Sq7V?_8?lJz*Phg$5xcka0+KG{6g%^_t>XQyKNn;Bq=0*#a#d@3G^ z4B4bFU6rcG8Dn)!IE+)7=_*HaB?d8jZwgXBxHcS<%f!8pQ!q@DrHA*L=0j;02+8G^ z*P+3>=OUL_R>x#oJ#DQ|SwRUc6jjbw+fKVq;il~VFjU|>iIX;DLT^HI`;CD5h;tLTHkbnJFck$M~5@ zIGGp+g&YNE#wpXZfGOZUIp2-v=p&MJ5V4DTKlUx(?{)}JMrRcQYK3F}>yi@fk9PQ* zdHDBAO2bMS#}w69&vqqmCBczSRD@IlnI6WGDk`aA6*lReP*%R4fuwp1D=F&|;g4m{ z;vcu&=UlIE`Rk0S!qYFI4na<-<9iC+kGK=9tudyx3b=WYvB_%(!(JKJ8MRAe9yR>0 z4=)IBmMlCl)VBhSt3 
z{KCzuJn@lYR99}T6Sb~fvjo$ouuE-NSNoorzTe7?5OUPlEjW7Rn^liHQnu;R4c()( zgT9x0>Um-GE_~1>XQ+wYlCz!dEj==j;Mt0JXgC8TB3MFY{PX}`Kgso@)HAu?Rcygx z!{xJwu`c#V>EKHtTUfI7WFC# zeySj?5iW{c_!b#<=l7}~u#h|~tE4g?kiVaAI6_cWVmSeI_ ztn^1|Z=cr=ga15P@d8eQavN#3z<1_H6-Qkc%Uj~Y|Gh_dk$0Owqk6<9d2Id!;VB*$ zM<2?x!}lrEr8D6A@!_Ju!m1S?Kh)QnXMv+x%^h@T`b3~+?x4+)Aw#uNHXrZjBOg0) zydi%@7dzKBNDJ(xo`7FCRsAd-k6!-~#zc(z#~hXK?8kB%>{unLx23C^fwc+sie*OM zLOASWx8m4cV@5;ml=S3^t~Rw)^c3Sobh@-z7aA!etQ<)36JuX^T4iqTZyfj1JQ@M2OM)y?Qi>>&ZwZd%H2@uFSRw495UZw~me~0wIJH`+7g)<(FJ!$3 zoMHo!GUNj2kvhmet}*ubp&_xwE8C`hBeHG&?)f2^#CoRNg(2#x_sV@%GS3#<_2jRB zOD+_@1o!rk)?kjLd#AqPTV#45m?)z89yAkvw|)-^Cx`uMhZv@xo(cG~(AiZKLpzQ+ z=*GR;Im!J;j>(2dSO0YI*7>RQ&+S;4V_gc68KYet0NkOjYyj?fm!kjH(zC3Lt$`J#jU`Q1V^q2%F`E6YUYAFxDyfYFo4VLXX0xN^ z3-&L?-*~tZFJ)m<@VDn=C9WUM?&@fU+^DDNMWHIDtA)M}`nz)2qq40bFlX<$uMO!5 zhtt5dBiJ9Ut8ICPrV=Rz$FE2M%aJ>YG>O$LXs~Leyn(x1)zsR< zpH!zUC5-w?Y{-1nB*Ta_$<>T$K75R4#fjmp*3y<7qPAcrg)wqzh{;mjW+SEPJGLQf zR|^@8S|yw}pgC%gS_P6ymLXB?sDDhT(R*4=T=?DQ`I||(2!Q;BpzgP)OZcxT2 zdi7L4sgC*{t{qncbw2YDKn(Trqdf>AWFz}}u&?(XJhSKex1^pARnjqoPNn+)$V2+) z{w?7jP4{0mKT~^aL*xHSMEW}*{;xzN1+9{XiWWFnrOtfp*ky83^H`MxC^irkEFJOqB~d@Sc$ZBTClfRf!Ah|G!=m^>^C&ld`UJqVohh_&#;v^Tb~Pe}Ntau4Lf*yE1gqlpOC{M4da5BJ`D93fGkoFGfKf*N z&!!sY?#>I$KH6p|`F&5ec?**a3LizhWh+T`6JKgM=P{XEn2;1i-*V)$KGuo0GwHQR zJljB=Gzn+Al&yW#Xs!})pkqYn^($xDC5CatW3o&zzV*bt!>lQ+=Q^msazP_gL5&m4 zP7o6G9jrJFiNpv#GX`aFJ4U{jx){p~Epb^udrdIGFew;hbL1%);6{_`&kX){Nvj)V zjRrhmy{^$O`iUYz1S7{$Yhr4i7=Vxsuzw(BO*$Qdt`chv%nJn@!{{V(LVi~rd3OVs3s(NrP4)>si->`5qzvFk`G_)tMC}9 z4R(l8UHs9S8h|RiMz%_pX`&r*C4uUf^=Qfi#goS-cK|wI2of}Mj@dzy4f^ir?qA6G zmltPj$Z?k!r2S%loR<7^x+DJMQT*q0CuaNCFc6_?qYQGK`qIBm2;9rJ5F}GrMviZb z!_+E`L8W6|JQS*eA-wmOc7$1hX>DqC$k{o!2im=C-+&Cx#@WgrXMLxzeg^pUU#_m% zyjx1T6u$k=`?Ttmaj@zn_xgCSRQYC_k|i)yY>oCBt$YA$*DKU)X+BL=&5BxhqGQ}% z(80AhzD2KrO*nIck#yN+DJF($k9h>_2MwF5@(;_Bs=H5#XJf?K0ixw4rLJpv&@Y;@ z3#Xz$=~}+dYFXgAX|7NelIYBX){FhuR5BNNg;bv1hT{x-NwMpeep#T_^fDz~{ z4rnZ401DkHuzarUFqE&;v_xEdHI9*3xJLjwx2em1O0+yxEMKo7cQ3_K6PX@)qG%J# 
z6uNWs$S^oRoGJW^Jq!bq{*yp0COysCp(1=ws2~A{sxpcrkE~M%uJds2keq(KBTdTp zv^G2?7M$xPQN04;xI`L2?wKu_4oUIzXp0sJl5y>8Q#BrzVjFf3K+hPORpe%igafS^ zn}4^34u0SGH5b4_r>$3cA(m~6ahn+4qF3!0!OV{gAR|5O(U~{XZLAm0&|=13ZP8zl zu8X!$KX#ljj|*?Jy{6{C8kiWoJ|$%f;t)y#bHJ^N$g#2XR3$y3U93mGg|vlLwQD%K z?FiFOxKM+ZZ2Q)A)z=O3ea)U$QlZr{F~fTdxWt}Yy$5!)H1d5C#Zqpfjro~T^a&m- zGz~)!m0R9jrH477POGUkd$$O&FpgehI<{8|(e0A&$BM1=b9kjU_a5{J;gOZk zmUvCPn5#DJPpyOu#_u5E@I-h- z%yCO`yul|OWiIEfi|j*@XT+6HBgXX$P4si_BJ#XN!M%q^@{DX%X!D2Z^58!4bY!Kq zOu6lGh^Bq3RfuwlqNBU^iV~nb%+GiO-8+oZ<_jpkL-GXUcZLBOWmzLMDG-GL4h#aH zSHs>!51kEn2?K%|zxoL}a$lidQ#84Ry7``L^GrMA6CJ=sUIS_}u3LC!ocV`ZAV2Woa*lq|Z7wKmUAH4#zB+iZ^VfzI>3L2IGN`*0fC2#jb5!%s z4)f<><=-6!Wb6AkbHYdI?^_s)O=W6WGVBFe{%9+*$=AAHfZ&=sN<_UZvu+U#Z-RFiMxS|aYSWkPMHy<=lc#Mz!h9wtejbyPt>n)(O{W|fNo`LBx z;H{!tFp^*ca{JXkZI?HfGog+(K$j2c;v3bcUu5lD^s&|&wniQVPGA){=MG1$UTq** z`1YVgvi9DH-H7ymu!n$E0cy@%_Q_SQA+K?$Nso-+Y7vyxSb03{t<7gw2}f!|U=>lv zNaPO}ulT1X4WqWRj;(h$+P{wHDVhlRvB-2T3DUyb3 z`Jh#;@sQ%Lx|#=xXW@p4PO(5%95nAcJH+L}7|F0ely26lhG@T90Q*`#P31OMR7#m$ zW`M^mmshajI#MaRg3x~(O~<9gKBznLy)RFz#AO0yjSfVb7z>Bpb(biQR8jgi_|A!E zScULoz!udD#0NswW2!u+_i`L8CZ-mN1Ox&{JI@X#j34}&41!!?Br%?nXnaL)Ip67I zxo|mpk4(B75CQpnQIV1P1bi!J2SLeyq;kCuLG!DKfuqr+9$G!jAerm0PqJhwP=eA} zqw(pD;Jf)*T}<&y=14#?M+}m=$ZH9-&fF^xR;blNHpv4o)HqQ=WapR-ut&6((H$iG z-uvIi6KBV?2`-S#G5xPH$MQeP+@F?HVLMx=|C=>o0fM_>`X;=qW|9yTtN}XiZ=DyT5o(L~)t@K?W$(0^~@!|UbiL!8mW|ImN z&xRt0+{v@jgl1Y~ycnGK2T=-0;!1(0%kaI`BAC!2a=naVfj5-GexnV_QRwDRsyc6i zxUtn5)sAA%oD`dy`X=|I9;ei>(4J*D^_Pv@*Gs%n`ZaSkSji)d_-1NMgo@(NBx~^L z7HL**tTH63zo*jF{5-&noJE7()*ft~yJM~N(LU=4vmrfTmDwx^rIJA*PNuh8ljE4(5oBZm+)BCQgKYY$cnzvwBrSjD#+VZ#7=#FHTB3;crIP;iJtfYSWy1 zvDlbilChIafqw=$s(cZMu7jfAO-yDb9=vt0HzNv}@`=iQcoKy&!G9qR4`#`A1|m%C zTQz^Eq_V-l0HK;bc;yA8v_Z9AHH*cGo4>$trBGun2>5GH$oBkR>zLb9iQj$~t{rg~ z$jXT3MLAHET`Q14EYhmfn;QJp+QNzchwmkh3BhNc5v(W(C}q zb@lzE?2t3!u6AY$kuk92%(+AMC+>IFu5-1NpSTJ}ap~T{hc8*?B1+M(P|Sy4ZsZj$ zAQc3vS){TtE|UB^1?mDxGq$lZP*jgziCxcrn_vp~MyVV@Ce;2vOsM~~0Q_k}{mV-x 
z>fmDdH)7p1LKfa%01%}A^0s>;g2SF}1UUjHHzXtk?&3|n475q#(5Ej>TW@*d9B^G1 z3v90RuG{Yp9^O^;1#bcyOM?6P;{7N26N^=x&}vP3jPU3JD6iC`32<3_t0+Fmx6P2D zI5aOmOM_F$#HNjwOjg-y7(;&H!RS}3Q4&HbP*<#xypUP8cB|A_u7cSCn)Xg%?x2u= zTpr8M5N&V9Y^0%b^ z+cN)eVf`OqPNHi;T77o;DFa$@CiNjIF|RuF)wW=8h8W5j{puJ0wT9HBc^i=W_ZsqG z{pH~VWR@3=!;G^mU8xpXvW15H5oFk!(YmK))@u^1@`c8)`R8dwb)u;TOG*!mkDFoe zH$IH@nRUTRx)HgAd$Y!J;Mw9j3J^=Sk}1$l+b)EQCTH3G1dV;tmp^U^#Ee#Sq-&|X#j43S#7)_c_G&0s4 zL&<)_oIF7=r(ydof-<&WFeg|L%;|336-($G7Z?OX+nY0$Q=76>Z7L^d)OG@1!cl*O z{o_;3vL&o$cPIKFm$1+b2<9ZQ6Z#A0G#!4)0l3juuKIkU(iD3t`y1xOd6F#dn@j^J z76?lFlBZc}*g`+}9=Mx{f#Z;P6B^0`3)>(tmEa?rUZCqvmqI6hny!&U*YC8NgF7MP zsb0U$nNv%%1d)8{kPKRX?u*qap^^7E@q}X#aevuDia=Z9yHOTF6h=*xGaHtgq~_=Q zG(Fg?ykSry2-d|_j*L9|qdMxzzpog9Q{HLT%{yGV*MRC1{{TP;q_M}08-x85CV^sG z8u=eSfN#YxNPeFl`h+xxO%J8Q{WhMfYryc8rs6hR)>wIBQRQ&xq2+p!G}B}h+(QJ! z5qf6L0Q)|PTkv87JMv6Ov^;!H|2}A3sp%`!q~C9eJpzw>6#+@?pERd`)+5m$68l#@ z{_@?a*r^#>|I2b4p`z=wB7(|`!yR2-6q3&Vb{leez~&`-zG<)a;S;$__vWlcIi?mL zb7;HEtRg9iG+rWqx6``2q5D?0Y1VKTU{W3b;R6rjYCLLT!<@{Zary1{C8r5qFLl44 zcOV9L=%P`SRe}}Az3kd1U?(-w)|uW7J0+VboNpG8n4=sIwJcn2U~X2bu;$x^Gfph5 zQeMj1XXQCMi@QL+x)@`7JPzCIZ00WBAm=q-M!rQ((*m;Vw63hG)!RgK)Q{qQ8`Zi% z&6^XqsnBwX*6(TZht6@JT#sz z{0hfB-6T=q64VgVU?(uV*}Cc+xab&64SdMeShBvdopULR<7c9mL#wBv3umC`U! 
zXeh6+jF6|#(*cGd7bVtki;m^`!TP(m`#LU1}XEUEP zY5cUPxlDe7ym*JW3?lz4Xm7OuUco!G@$Paxb1~1E-T{hss)-Mmi?djh{dJeL@ zNe6i5i`7aDCFnCU(LUNs&+-#KJ|Ll`W}vw|j@A`yI?<9m@&8)(p~X=ib%Er0@eg_a z=N{*uWSZXxOQ4s(wv%)tbm6-d5Q8Qt?=0gJFmMiQ2arXGFMEWI-_EokIf8Qd&=0!b z(MpBKK(i!gvK&b7zc#JEkzpbQUpmr3GplYC>kGBwaDdPzcdoeuB$`^3G@BlTlY~bW z>_bZGI}^+13S&O6v$1x^Mlil|rFz=wGOyTvXcQwM6!9w%lb9s%MVJjUWLglfB4Uv9 z2GHMjk`oiPcGF@tbkelOh6rO)wPK*vauC!i-VrZhX<%X9Pnj%YEn;D52*`**elF{) z5qU}LYjgJbnO8Pzk_AuCE(XaCX$uK$U}gXwBO@>b*sp!VgaIZ20R={ZL-G4~u)B$C zG7R(&*ZuL9{`oEbdHDN3gefpH|H~s0|2LN}VH(mX&|l%=9W3~vXmL5Q(3^f(t-{>g zvy0=vngm9RX-Q?M7g@edIA%UhzNcK=3G&ZW7LYE?OW!hh54aDw6Q3VnUZ6g`(_ru% zv)2W-Y*%)-@E&XNgjB)V`U7tuu?BeCp5+EhT7fobvn|DqE65d5zp-2c)A3OW2}`Q> zXu`~>0G7TVYNOA6RZ$gzYZl3RNYR&CjTy2)UK70`qvQ(^hdM#4e_fb{d&OL~pMdsdB zwfb>=oI1f!BE8#H*DATjUv$^R#39je8gAYK`@yZvqSgDUXk0s|06C#(XZ&j%n`Gmy z71XMc0tt4Ocu&=8=`DgQEZY;RXJx;{4$_OyrsC5`2 zh6kl-S|hA{VUaNx%$ivpcv+XL18{Li{DhxF4D;FK3bB^4&H{c>ehWdAU$FPojP{taXA0 zoufbQ75#HV>;J()^zT{x-wq;Bb;zR#qP`AH)6c3((LpGxIDgjrq=be9TckoiCzfmD zPB5}azckK%Hvg$}7|L64JK%DOz?7#}8~Uh0Wae}X#QN=U>To&&k>Y*d4Qi1 zqE*{y`3r7<=FP(T$rAf8mcV%#3oabMXA#G*hJtF(^MXE=s6IW*( zz&Ls*nZ^)zBY-YrjIorit)pZw>VV4@`=&2PQ_QZ{4Wx~YnA3de-T8F<_4$gm66;9g z`wkCHMCcJ^Kt*7)NHvEJ!v>caBf1#wP8@5YSp(+&%fypc-X@s1SOU>*UqSN}YutwingmD%X~qlVyy4#Hf^%ySRFo>cVoZIBO$rji)LiTps|0 zu$+gBRw{AuP^f?zHn=&&+j!p_vvEqWe#A;4uY$m5ih1<8MD^6O*A@nEG-J&PC^$&} zK~`Cf@Rh+sxELc3^;?@%Nfxb0%Srwg8dj^Qanj+23G0w9 zlKCQWXa}stA`!N&bwN56%rW=qbI!7yFi}iCJ3*f2#sK~LWM(L20eR*;p`>yh%kmec z4J5gAqN@n-6i+STiX8k0fnMaZq{TB9nV6!gQ4XTp8CJGoCS(c*M4~@L78{Zp0a)8*wPw)c0 z(vZ!^&RT38R$6I-Q{1v)jTkFzz08=%Cj~Ywseo!{v#;5HD<25+w?Zwh)zb&~?2ZOH z7^U&C2*TS;7k@mTypZTQkx9TB9VRk;5(GLRFm$d|TU8t_rfAiMyK48KIcCFRQ1@!I zAWu`9-lFRCM*$#u0S>uKb$rZYg*1k4`7)KkL-+3Z`8k1BfSLE-v2~{#e&OOf&lfGz zTl@V>Br!iGeL(2vh!9d|^oPNsl2OuH6|VipsQp001@RjfKUL%|P6Lm1jzy*(XHovsiVV=14GIufbf|3UI|Dy3==# z?RY&P5sTdWsU*p#@_-=`BqP0Ux{1uD5O71oL)-Yhc^J;6hWi^=z@4k~9c;VM#`r3U zcsGZ=BSOqNLdY3WtUBVe0YrYt8A#a~cTrW+f?m-{ 
zavdYZbs$?^5iJZu4EEgloBAX8j9i(YU@$VQbl)7fWM@Xq@y2|E(s+fh;P8qJE#%xW z_I7cO{ig(cz(CvMP4X8AV@85@UV59YwFNQ( zo4M|8;4PF@K?YR*`-bByU_K5{rn(dI09Jw=igogr0Ujs?Mt?~xl%$+_4}x#xz1~Mc zmy~md*(t`D?@RqvJhB~!WL4c{nF4MJiO(H#Vu|``lDf;nNo|$TF-{Cr!xhljod9FT zm|Xe@!p2Kob`TGSdgP;-{yHY^4p$No6*Z(0T&TEPr(ms@g@ZJp;0Oo-DN4{AQ`rlQ=8zK|@%plNhfq*`Wo;}9j` zG|f09j|nV9WebJAaA)!{hzF z>pQ*Pp7*^jzvo`(zR$VOIrkk3&f@gCUP)fLf9)o}@sEUz#O}-Y1-&0`1~h26ce$5( z--&aT`^E7-U?I{X)`supx1+6ZIL}dcpX~{1#^X2BV$!KqqxtEC*Xym<&537dB!C{GWp6; zE!t@%BF_$7$rPl%s`}13fI6v|athADkQ#2@~9YVJjOELmEk%E?%p$pp`M;qQZUV zj&zI)p?1QzMR%KTEz8d=@-a7KsOirPylbEyjo|9gW0%rWVILe!N-|4!yKZ}XV{?&6 zh=iRYHsH#a!STFLxB+YRH8+0^-5RoI3otRR`tEYXBXF|1sld=#%)oz42D?9DO1Jxm zcHAWbtuRBIX6KWgIC|$BoYER2-z{GuylJR@N-}}AglIDR{t=BwLZ&VE!x=vuEVyNCB$l`@(%<}iuoC(BXUeLSxwq4a&DZ>*f-LlN_tl^4=6;-F zI{(?xfdy3bWkCoi+EEW7av$gDHD`MgoV-Ht%k0IM!{G}zBZ((^GTy4pBs02KM(ai< zK2U63^b1%HRt;#({(7SGxmwGjfvfNhxen~GTZ9(rHPhF_t4!ZdG)pQ5lsO;gzbbEt z`7T{oq$h)xR93&x^Ed`+OJ-e-NZf}LXOI6T8NVz26MHRfiGZg%lBg25`S9Q{x!_w8*M}ZZ1u}o(MgO#XTdqAf!g_S#! 
zCPa?Ul$A(HgSw?}M_4dB#y6`7_%0rS)+}SmHnZ%&{4OIM7ykcz^G+Q>kkwS^H6XtI zAx!DXNd!6d?MIKq2RJEmgGUKeF(0r}jubVVRrxx@I{iN52F`k7RSgyBN1eZ+)b!oi zH`JQ4Xm*3YbM3ag*PpdbA|d=rgncWXK#Z;SMKi@V;gQL(#sG_VheiJ+OkCUK7v;`S zBVWHbvWiK0P7B|QqHm>4KJVfkJnjdqN%T+#pNJ=zB>&X1O~#D!1%-S&pMk8Cy+M6)t|J$(KAY-$=yx+9qZcjlUpMOXP@Jnu)4wCseGRL( zx>5g7Y~SieEmlELZjx!_8uvvw_jr^`#S@c=&m?aQ)FlXcn1e&q3QlW1!A;^jJb50G z)ERLKa?SCkc{&{vuT3w%N+x5R(-#S-jaG9)T=~W{FWAqML)(svh>^P&eJ!!@;gV^T z;dq~JB5ytku}J|v-BFhJs`rC6o)_7=wlA*@JZ_!wQ6iit{Mh-dFXBFh>#3WTtQk%P z3O|JAgj>(;_~xHBt*fkA?+5)R&pvoKV_I}HizY&{?o628)4qq|8KbReJqG@aT1ECVKmZ&*(-C7oETPRG-=~ zFlw*}>$Z+_miT3`j%mM6MW{b5@@WJEBfH^7LSQ$kXP|0Y@#|@SeW~l=H`ngkL^%uC zbn*9!jp07S7@MZd6fF;(aSV=>OW=C%->-`+Ir`~Jq-UZgZ&M!8@Mg-`&j(|?lhSo{ zF{G4w*vmaX;%bxPe5cjpINKEe^gN)Q?`6TYkJx?BY9j1N`-B zN*d1Jzh9t8_RxSHFRCv=@rNDs_?U|{^lVbA$z~_8`id)PPdOPHCMMpiTYB^aN0Mf$ zlz{Fr>Q5gK#HOSWH$8Dq8`Ltz`TqQ?tI(3Iaru<|X&RGP5B%s|@o88$xn1O?8!%sK z_-WSjbX-W}zjA^~t=#T3HU=}c@sCe@SfRu2zN;gq9XNh>#lX#^pt~O+N1!;x`6mrT znQ5f_*WTAwL`ZQ6ytewA%ga-<)}Rt6z8!ElbMOq-I-$&slDzISinzs=)`Y1dwU4}M zth(0&5=~`EV@Ap}-Dka)eES__A}%*mkJ>KsxO!ga$!t9nb*y4)cDURj#clG#C70`) zKW9omqO;Z!bvX~ybruiIRxsFAD3_0Lg;mgUR1nFOE4h6us7fFuXXj!4i2dgKJsExl z*V?B;;d5kI@}@IJH|96hpWwbRZ5vCNi}HN& zcI*`I%U1%!EK!8KZ<|YVhhL=|;`WC2=MS=>cdJJIz~C#LwKsYD&5A_xQk9P3+4vBl z5Q3h*%<%bC zeAx(Xadjx<$+e=-p<$1`5pC;5-9<|qL@)BLk8tEoWUcpRmQ4QmNf6}v`<`jjnwLf6}Y&+$n}s5T6O(gWpfNG=K&K%qURs^-+_um zuGFg+hDF1hKjxh(BBjLW6@A{uOMMB0M_x~Pw6jh^A?OF!#sWiz?Zbz6>!j%qIY(4A zKE|2Nd3)_W55AAAqpksq-{DnTa-H{2so3Vf9~!8+60aQ3m!hj+KyiaoeUVeaIFc&g zu+vr>?`^5N7k6H!ZDly-U`cqJHeKO7Z(M2WJGPvdTgRoWupiEyjC=CUXD0t#P$xOY zB@D8|v%{%=9VC@(gq2^taoT;&x_u>2A{1K1xZ?@+zZY75^vOm?yoqW!wcr=tcKdBR z--Yxx`m5Jx?sYWa2ebGIlJfRm;2^IgFq*@6rfoPwcWkgEMn=i!h_wJ$%iK&ZBIu01 z+p-_-?;0+~IeD(?N-DH+qIQ>N~&2)(zWtTD7kmE`8#qj0PISFCmDhJuek&NSU@cK(}J!I&2d z8(l^PUog8gjsMti@DwJiPX8UebgH_xwAf2>%S6N@0>NWxr8ou;bG>!pi-Wy z#1F$UDoPJFECYAm%DtnX>AvU~1>U2ffg=8I>EX&?2m0r&xA;~CR{1*lrdFo%7p|bG 
zDw-;O70&;3_bn(r+_#}OL^RU=%-G|iXpW%HTH=VtLf*+R!+y)4uJGYWg50_Cj>SrO zyLS#EVLvlVPjoJie8Z}bwc#d@9;&5PB4gD1+`^Gwni|4#ed~tzb+)%;wjOzzUuK8o zm^nR+yb9zxS0dhwOa|GBfSYe%=^8C%`sKXK0-bNRc75c{hdG=|S6bPxE;)QA zpJcbYm_E*c<;JQ~F;+NaYGBR0sZTRfg28X`?xe;UBC`gM+pI34DXg7$ab|>NTLv~t zz|HPSM`PC1UYEy=>7?%^m0WHh)~@-21Nnqc zgCpmfPX-L9D2J8hO2~;*WctV{7H3GQWKAS^q>}g%#L4_lO-Umu67AQO(Xj z1QxLmPR6$`-Zi`~c2@1Ph--M(Ans>hLsto7&1%YZGUXN;JbA`m6SnOW`Hbf+21~k4 zcP?l#(5rkke)a?FsM+yRuJ-Acs17cjS|fy4I5R!H=bhie)rYqBmlFCPaxz@VbsYCKmTYVb zhaM6BD(_`Yt#dcDO;*l`wz*{91nadIta+9_?)rVRM1@nT@x`G&oBSdF3)b_^aeGY|_7suxh)JySd&Ve(_elEk>=8a@R|d|d7?vzAKIKm?ZrL=F zhTlw;YN?{XuG_wwPJM)XA*{1Vh=e$m!tyR{1=W&kh>If*v%ym4BTlJYiUf-GvClkW z(PVdTXJCyGat93iZhSdnncIEQljzV7Dc7%zMpI=@5+UC5yz@Q0^WGf(Lw9~T9qDxx zJbfv)gT9|KkAdYrf3t6{xlcT|uz;V$l*2jqxa!H{w!Lzt{+QAYb@4XGucazk%J(ML z+G}^XYs)h|sHkaEllMCZ3UI|`JuB@0AaLW)>*x3{+k2UfX5KKK9_{GqbBG_udZ|)O zRe!OFxa(wsi>M)2N!Z)&lC5Lx%1^xb6JQQG|7Z64|CFUWzjFv0rg{TZlqbL40kU+s zRG}ec#KkQaBMaM)I`Fb%b0ragouEQy-zR7%ID@_QRJ`WzQ}ODolO|Spv~r!xZVR0r zo14GZsDus=A?zD%F7bclzS{9fbx1CzG)}wXx+TK*Mu2IVc!1q$O=I$FrB?{ei>2NzsN3kBnaCC5d9DO$6 z^j=x5y(mw?TGq+6x6d*8bq)m%5#lT({`$E0#ZENh1`Y;e!roZ^)#u$t*O>HMPL)#% zDxOCe3oCW0sp|};XJ+yRJQ22M;W+jh7+DF z`gBDbfCJK&LCmf(LD5C44+A;OB1^ zmmA9(a%txD@rMEtD_rt*hc)q1PnI}G$l+)s3abTVT*X-gY!R1q#~Ortf3h$!JwDM+ zFcEl{f}yI3%KiG`+6HYQV?SpH?(+$OD$S89R|q!bsoan73mGu4$5EO+J@oN(H%Gg@ z<>Ik%S;;bcsY*qjQ#=+AmzBzbZ`>)Z2&Y)kYr@m=mtriTYZsZp55u7T<^S_nBi*oG z>R1P7^b~VM5=#WFzxaZn1omUk!I(-}GmEE=p>G^dhbhLczFL+DQlk`CKGl=boF-p2 zA`)@zm*115Z}e#ERn(L#lLWVBl6%>!bQFyZh}hHC{y@|vOlT?1$yUHw`5Vk&hny;6OVTxb$Tj7eB?7)SzIM!I|@4;+i zXQn6gc}n=1DN@ub(LbvTW9GNsubn}7$K4gISeP`<=vXZ?4Dnri@`tJZYI7Viv4}ON8Nca1Vy-+-*Q@h|4K!;r!#sgJ1NZy^dHB0jp(r)+pMS% z$>7aM_GaKb+k*E=Q1FrS&^%MhB8gqT*yqK^SIw@RUa8bFersJjHrG-4>ev$QsZ<*I zJSA!|t31*c9Zg=7vQPOn4tPNeq;| z7iop-@p5u#F091=#5L{03&N3_p=n;p)S2nNf%7nGvDg2R-LoY~+M$avo!4QgLW`y^HT?ZJyd}_#RCRdd0a! 
zoE9WnbrUW75A*t=E6K;rD^8kIK9{EnU^3FXgHcX~UN`3P7UQF_l%LzmB|`17DJkml z`Qslpz9Xi_UVP`^OL2K9SqthMDg5OCclImGpL(m9D^yrHYpRj5_T z@?!A2Evvv8%`*a+J9zlV; zq~y?8KaL@r;rsO7bo}*1&l)llPdm>QNxoz`43%EQS4U?akClg@EV$14_qU}BXxvpM zk-5%U*56J(uce69%IY8ONCgiWl4uKUv@$DO31rPy$k$6}4=GbVyHD!XZ!!PfS?(j* z^;WtK-(=e_>X-kx|G7)`3{zSdvB7niqlIIBY{G>loj+vDa9L=FiE|ez9C_ob5DE%B=>) zV~110wv_(Q3^0kfl<%h(mj!0l?>#xbNU<<&{rBYflHBJrTz;3z&Ke_n5pPV%*Uy`? zD~&`iYssDMx0Ef8vmlX4%m^jmM5wn{WKC>*NHD)mqy@yAS?kV$4M?XGiIeLoD?s8CiSK4#=cA# za8Iwqt3c(+biBAI2|2ic#87YZ!Yc6WwdJjoW9NSBMD@!$*!nW+4obAEUzo4eSJVtY zx(+I0WyZ3Fl$MGq^O@38*wErANK?+!CDcYJQaK%YjlVWA;pBwW1`C_Rck=i`tZ7m#tMLraJ;sy)eYqKJl&Oql-eYtha@OK$l^ zubTskxONrNR>?mi@498#Ex+@16_GuoiQ8Flk9Bh^#~?h59sR$b93x+?{8!lcZ)P${ zO;&|Ul3+eLY2<+}eIZSrH@0tqe^^2NYiB~4hN#*|8gJ=0Y0tj{6r-Z)X z6=lzpHZSm3lDvK3RWUtV-YBCf6f7}nr!5+pTa8(Agtq9~xl8@!(Ym>OpQrig(9!F$ zM{N1AbZn?UvtybtX?uweCiUZ>Yh?c@J#ps_r(cH(8afg7$!10D$!?3M042LTAk~vZ{-wgHKutzdiu`NF-i7wswV6Yf{Xf<50Ug8)6Tfl%6(is zQ@WLMd0jU1xv06$af#Aa{x?h#2+}axRc$i^M*CM${4>EEY*y*5<5Q{kIl4qH& zwvOYbexD|PZ9`JOWh$B%dSG9*%jwHUGjuU6KIYRV6 zDSFW7j2Y(O%&aJ>XY$SD3jvkc9ae_~Wemc5-sE_49DPojn8-6Yw}!2#S6Wx4DwyCO z!4~(+&+4dL;)&Pmp^;hF#`SU6;x%3M8{BA#nYaxYyxmT-*O-yM{>TulaLnRp!S%vr zJi=j`UtejRXdiWCd#0l4Uu%okAE9fxoHPHufIp|I{InElbz!XS!VqS;LU_J*1Un$7lG-f{1nVi^m>;W~-uh7&K zziaO=XSGbqx260p?m8skG^YJVRK@j;%J<|@=7g*-!OavowbrQ)3g0m=-A*B|9#S1o zmKXAEdM-_$LAE&agW!^aOnUl!qsW@Pz*>sT*XvJ`zteL%9zmRGl7FNei7)7iAOB_G ztG@+##wv+ipv!d9qYd5>GnL4-txC)9vZ$j)KzttBy5fzFL5xO>h7JBe$0J2v81vBE zE2u9EF;t2fX~>DI3bRSeOR&oc%S%g$tEirnlh|y79y9HtKyhC93x8@8*i$TnYXDH3 z-yMeFqqV7{p@XrZl_5KdE5{cu{JSp(2ZgpH&m(^sNC8k6Kov5mCrI4xf^Kh4l>b)T z_}ge?=hf6+Q017DHZ-GV(anrUw02jg2cMvbFFr-3;(IOwhhfMCo^MJ3~)4ypi*+cK?o5ujjoWPqx+1A|=2wv^) zYJ0ECvj6M7xp($}dx4Wil24WbG#nrkC^@w|YA?cp|2|dRDr9JAWVPKJWG|SCG*f99 zKzIp$kj=Z12M)GllhB9#!iCI8y$NSl4AArw{D5wzu#p5Bo9;5a^=3++V?PL_A^Sz@ zATgjJHK3t$FbL2T1+>@N1wpAHXc_~>9OOq=84V;jfJQ{Y#GygLZn_M|{wp{M8!n{& z9dOA(lm>`no7w<>&vSY&yJA9cv~5cFvkOg_doSxkHUPV#00>!^t5R@~Z6nz>!R?MB 
zaEc%WL%m;>Qk10Iq6i?AFVY)b(L`(ycTJP+tutUnMFvf+M{hr=9TxVvp=;q*Zg zOGvpWs1BI*1iA$l0UKdTaA1_W-FbQf6pYk2ZSS{mK5vf02c!o8kpk7ghO`=rZ`nC1 z{Flsy<3UorJrbX(0;Ea=M1nj+vNYj%aPN44=RO-^l8gsP)7wXyxHcSWTL3#OA4Hn^ zJicZkU~mxp08b4Ysrqm{q)r-87`+Nw(TQ7`TJJYXh~s0`+-9SI+x?y^)&LHBfB~Rh za6!PO_1@7jfXW8&gRGi%V>s~N38>;^;(A385DJw7qXC$^qre1? zw=K_qN&5q|Bi7%z2pLWT&<+TsupwZE0!`V}+0@>_^dIPcQ9|>Z>}Q$7V-NtOj-sJ~ zwHj#@rZzBH$pHgHXT1 z-J$0IM?*OWA^(cRHmK^yx&M10RZ}Is+cW{HQ3V1;c5J*Z@b7`%kvh;TIT_kGS~$9) z&AmWO7b930jfXGAe}&O3K@5;2z;mK$WB5pxf37LIF~pX1E9;8XtCrkjw$~+JdT;$g__QxS?Gd zw~TPV>>!G8Jip2K1B%K3MaXQbJcL7RQ?wIJ{*x|it~oOq0E!Jjp_Ie!U`vLB?yP80 zt(G7gp&Q6Jd=-EMf>{Q=(%v04DR7{jE!_dLHQnq*MG3&+0654y0M%4D4m^}Dy<^{Zh?E`Fd&)IprTRz`66jFvkfh^z2TG9gE&7-}*0^v|sC69)@-* z^*!J}3>Y;7Ob%#h4`0AB;r6rj?&aWBs1L`~1Ztn!S_+5U%Rl5(QEnw6%`b#C$~FP| zL;;=1c0K~^YM1vN&4bW>DInS=CJScmWF3K@`bs}bG1 z>3Ia?x+}nK(EXG_T6i>`@s$V68zEqYv@qVlEb4=U>}WyJ-WF5}bVRW_fM}|(5lOQF zXGgrMdxcdp0Ehca6KaxJf5?4%4|0;(phpS#2R6(=%50Y}1%>^;578iDNu>n6{(_!H zB8Lg@HTdtN5H>eTYlUo#OIpO99n4Zuz#Ou*Cj5qjZrj)aD*(tA#$96{x&oV@2P7f~ z93m`q7<+&1Nfr7FHJ?i&SAB+n+B)#}=KcB(?+5y~tV4}eh*2h$oF+82B{#!e&!9}W!7Teg`^jRS=l;A8fLnXzMVz#ZE> zI7ezQs^kkiDjHyB&l0$b01maa9S8q9kS<7rYr|B@eLz_I<7(w`IM`kT`xjRaT*vvX zU9#>26t)6zU}v!Lg%}BVJNt@S57;wC4jKT36`;_-FwoF2K*PJ;`GXVyxUmkpY0nFs zIr0Gxi69(ho4mifUlaitVio6|>!1{v`~(0G8S9}C9BU_sEMaKnfFd6aO~_GatO!Mb z6Ab7=UgmR%z_GxD+}ZV_APX`+&0Y%v2Qm;SdH2Sb*C0*3>y!Rk^nNixteq37yI=;$ z!P^(teRSX$d$kCr8tObKBzsG-ERP1j-!OP_a~jEr1@%#YLVOlff zPzoeGWfLU4e+Ov$dEN3}A7XShf!Fl`UY8#UhW6pR$2N{W}^!ZtcAn4c_gT z775MAZa*57ulvu~4{xH2%%>?h7uz*tq~N+=$B@Y2U1*WlOS%roP>!Z%AicS-2QQpe z2oT=fj~IznGjKkg^g+2Nhv4C@r;u0qJzgl_cK{F6E+j#CYZc@qRe?7gbZeQoUxbkL!dqg{?D7mg zF$BUPw!PH$3d#NezX>mYkL;cbK;Nxhm2L}hhf7qa6p+;5mCTXNmoOX!9$2y+82;fE zl+PnE8Xt)Q3OW2k3DW{Hg`;edZBgM127fm%3ykjcO+KO(Opa2~?p{%ub78y-Xyyx1Zol0JST zq5R!fqTW_Q)W8eWAy2H%BseuF!t8D(AvlYW-NDOvAv=RlkKmvvoxzSmP&xyMcX%}< zWSR-n;qWMEMxC64K;cznkkh9uS^ovPy=p`)3 z!=E)EALgyUgadEW{ohBT@Vns1^X6(D9BeO@U`c)Oof!DVPGomx(g+9rw-o%Ra4fJ> T2Ui8apF`k+(rhbOMWg)>*U46h 
From c864dd90cc3faed8baa828c3c2202eb17af8971e Mon Sep 17 00:00:00 2001
From: Mark Harrah
Date: Tue, 5 Jan 2010 19:50:43 -0500
Subject: [PATCH 029/823] * Basic API serialization
* Fixes to API extraction and equality checking
* Reworked tracking
* New compile infrastructure based on API changes
* Example application for testing

---
 cache/tracking/ChangeReport.scala       |   3 +
 cache/tracking/DependencyTracking.scala |  50 ++++++++--
 cache/tracking/Tracked.scala            | 125 ++++++++++++++++--------
 cache/tracking/TrackingFormat.scala     |   3 +
 interface/definition                    |   5 +-
 5 files changed, 135 insertions(+), 51 deletions(-)

diff --git a/cache/tracking/ChangeReport.scala b/cache/tracking/ChangeReport.scala
index 41f99ca1a..c8f3a52eb 100644
--- a/cache/tracking/ChangeReport.scala
+++ b/cache/tracking/ChangeReport.scala
@@ -1,3 +1,6 @@
+/* sbt -- Simple Build Tool
+ * Copyright 2009, 2010 Mark Harrah
+ */
 package xsbt
 
 object ChangeReport
diff --git a/cache/tracking/DependencyTracking.scala b/cache/tracking/DependencyTracking.scala
index 5d61f020e..e34930f5a 100644
--- a/cache/tracking/DependencyTracking.scala
+++ b/cache/tracking/DependencyTracking.scala
@@ -1,3 +1,6 @@
+/* sbt -- Simple Build Tool
+ * Copyright 2009, 2010 Mark Harrah
+ */
 package xsbt
 
 private object DependencyTracking
@@ -16,14 +19,22 @@ trait UpdateTracking[T] extends NotNull
 	def product(source: T, output: T): Unit
 	def tag(source: T, t: Array[Byte]): Unit
 	def read: ReadTracking[T]
+	// removes files from all maps, both keys and values
+	def removeAll(files: Iterable[T]): Unit
+	// removes sources as keys/values in source, product maps and as values in reverseDependencies map
+	def pending(sources: Iterable[T]): Unit
 }
 
 import scala.collection.Set
 trait ReadTracking[T] extends NotNull
 {
+	def isProduct(file: T): Boolean
+	def isSource(file: T): Boolean
+	def isUsed(file: T): Boolean
 	def dependsOn(file: T): Set[T]
 	def products(file: T): Set[T]
 	def sources(file: T): Set[T]
 	def usedBy(file: T): Set[T]
+	def tag(file: T): Array[Byte]
 	def allProducts: Set[T]
 	def allSources: Set[T]
 	def allUsed: Set[T]
@@ -42,6 +53,7 @@ private final class DefaultTracking[T](translateProducts: Boolean)
 	val productMap: DMap[T] = forward(sourceMap) // map from a source to its products.  Keep in sync with sourceMap
 }
 // if translateProducts is true, dependencies on a product are translated to dependencies on a source
+// if there is a source recorded as generating that product
 private abstract class DependencyTracking[T](translateProducts: Boolean) extends ReadTracking[T] with UpdateTracking[T]
 {
 	val reverseDependencies: DMap[T] // map from a file to the files that depend on it
@@ -58,11 +70,17 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends
 	final def usedBy(file: T): Set[T] = get(reverseUses, file)
 	final def tag(file: T): Array[Byte] = tagMap.getOrElse(file, new Array[Byte](0))
 
+	def isProduct(file: T): Boolean = exists(sourceMap, file)
+	def isSource(file: T): Boolean = exists(productMap, file)
+	def isUsed(file: T): Boolean = exists(reverseUses, file)
+
 	final def allProducts = Set() ++ sourceMap.keys
 	final def allSources = Set() ++ productMap.keys
 	final def allUsed = Set() ++ reverseUses.keys
 	final def allTags = tagMap.toSeq
 
+	private def exists(map: DMap[T], value: T): Boolean = map.contains(value)
 	private def get(map: DMap[T], value: T): Set[T] = map.getOrElse(value, Set.empty[T])
 
 	final def dependency(sourceFile: T, dependsOn: T)
@@ -82,22 +100,38 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends
 	final def use(sourceFile: T, usesFile: T) { reverseUses.add(usesFile, sourceFile) }
 	final def tag(sourceFile: T, t: Array[Byte]) { tagMap(sourceFile) = t }
 
+	private def removeOneWay(a: DMap[T], files: Iterable[T]): Unit =
+		a.values.foreach { _ --= files }
+	private def remove(a: DMap[T], b: DMap[T], file: T): Unit =
+		for(x <- a.removeKey(file)) b --= x
+	private def removeAll(files: Iterable[T], a: DMap[T], b: DMap[T]): Unit =
+		files.foreach { file => remove(a, b, file); remove(b, a, file) }
 	final def removeAll(files: Iterable[T])
 	{
-		def remove(a: DMap[T], b: DMap[T], file: T): Unit =
-			for(x <- a.removeKey(file)) b --= x
-		def removeAll(a: DMap[T], b: DMap[T]): Unit =
-			files.foreach { file => remove(a, b, file); remove(b, a, file) }
-
-		removeAll(forward(reverseDependencies), reverseDependencies)
-		removeAll(productMap, sourceMap)
-		removeAll(forward(reverseUses), reverseUses)
+		removeAll(files, forward(reverseDependencies), reverseDependencies)
+		removeAll(files, productMap, sourceMap)
+		removeAll(files, forward(reverseUses), reverseUses)
 		tagMap --= files
 	}
+	def pending(sources: Iterable[T])
+	{
+		removeOneWay(reverseDependencies, sources)
+		removeOneWay(reverseUses, sources)
+		removeAll(sources, productMap, sourceMap)
+		tagMap --= sources
+	}
 
 	protected final def forward(map: DMap[T]): DMap[T] =
 	{
 		val f = newMap[T]
 		for( (key, values) <- map; value <- values) f.add(value, key)
 		f
 	}
+
+	override def toString =
+		(graph("Reverse source dependencies", reverseDependencies) ::
+		graph("Sources and products", productMap) ::
+		graph("Reverse uses", reverseUses) ::
+		Nil) mkString "\n"
+	def graph(title: String, map: DMap[T]) =
+		"\"" + title + "\" {\n\t" + graphEntries(map) + "\n}"
+	def graphEntries(map: DMap[T]) =
+		map.map{ case (key, values) => values.map(key + " -> " + _).mkString("\n\t") }.mkString("\n\t")
 }
diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala
index 72be00ac6..dc8c6095e 100644
--- a/cache/tracking/Tracked.scala
+++ b/cache/tracking/Tracked.scala
@@ -1,3 +1,6 @@
+/* sbt -- Simple Build Tool
+ * Copyright 2009, 2010 Mark Harrah
+ */
 package xsbt
 
 import java.io.{File,IOException}
@@ -79,60 +82,98 @@ class Difference(val filesTask: Task[Set[File]], val style: FilesInfo.Style, val
 		}
 	}
 }
-object InvalidateFiles
+class DependencyTracked[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit)(implicit format: Format[T], mf: Manifest[T]) extends Tracked
 {
-	def apply(cacheDirectory: File): Invalidate[File] = apply(cacheDirectory, true)
-	def apply(cacheDirectory: File, translateProducts: Boolean): Invalidate[File] =
-	{
-		import sbinary.DefaultProtocol.FileFormat
-		new Invalidate[File](cacheDirectory, translateProducts, FileUtilities.delete)
-	}
-}
-class Invalidate[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit)
-	(implicit format: Format[T], mf: Manifest[T]) extends Tracked
-{
-	def this(cacheDirectory: File, translateProducts: Boolean)(implicit format: Format[T], mf: Manifest[T]) =
-		this(cacheDirectory, translateProducts, x => ())
-
 	private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts)
 	private def cleanAll(fs: Set[T]) = fs.foreach(cleanT)
 
 	val clean = Task(cleanAll(trackFormat.read.allProducts))
 	val clear = Clean(cacheDirectory)
 
+	def apply[R](f: UpdateTracking[T] => Task[R]): Task[R] =
+	{
+		val tracker = trackFormat.read
+		f(tracker) map { result =>
+			trackFormat.write(tracker)
+			result
+		}
+	}
+}
+object InvalidateFiles
+{
+	def apply(cacheDirectory: File): InvalidateTransitive[File] = apply(cacheDirectory, true)
+	def apply(cacheDirectory: File, translateProducts: Boolean): InvalidateTransitive[File] =
+	{
+		import sbinary.DefaultProtocol.FileFormat
+		new InvalidateTransitive[File](cacheDirectory, translateProducts, FileUtilities.delete)
+	}
+}
+
+object InvalidateTransitive
+{
+	import scala.collection.Set
+	def apply[T](tracker: UpdateTracking[T], files: Set[T]): InvalidationReport[T] =
+	{
+		val readTracker = tracker.read
+		val invalidated = Set() ++ invalidate(readTracker, files)
+		val invalidatedProducts = Set() ++ invalidated.filter(readTracker.isProduct)
+
+		new InvalidationReport[T]
+		{
+			val invalid = invalidated
+			val invalidProducts = invalidatedProducts
+			val valid = Set() ++ files -- invalid
+		}
+	}
+	def andClean[T](tracker: UpdateTracking[T], cleanImpl: Set[T] => Unit, files: Set[T]): InvalidationReport[T] =
+	{
+		val report = apply(tracker, files)
+		clean(tracker, cleanImpl, report)
+		report
+	}
+	def clear[T](tracker: UpdateTracking[T], report: InvalidationReport[T]): Unit =
+		tracker.removeAll(report.invalid)
+	def clean[T](tracker: UpdateTracking[T], cleanImpl: Set[T] => Unit, report: InvalidationReport[T])
+	{
+		clear(tracker, report)
+		cleanImpl(report.invalidProducts)
+	}
+
+	private def invalidate[T](tracker: ReadTracking[T], files: Iterable[T]): Set[T] =
+	{
+		import scala.collection.mutable.HashSet
+		val invalidated = new HashSet[T]
+		def invalidate0(files: Iterable[T]): Unit =
+			for(file <- files if !invalidated(file))
+			{
+				invalidated += file
+				invalidate0(invalidatedBy(tracker, file))
+			}
+		invalidate0(files)
+		invalidated
+	}
+	private def invalidatedBy[T](tracker: ReadTracking[T], file: T) =
+		tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file)
+
+}
+class InvalidateTransitive[T](cacheDirectory: File, translateProducts: Boolean, cleanT: T => Unit)
+	(implicit format: Format[T], mf: Manifest[T]) extends Tracked
+{
+	def this(cacheDirectory: File, translateProducts: Boolean)(implicit format: Format[T], mf: Manifest[T]) =
+		this(cacheDirectory, translateProducts, (_: T) => ())
+
+	private val tracked = new DependencyTracked(cacheDirectory, translateProducts, cleanT)
+	def clean = tracked.clean
+	def clear = tracked.clear
+
 	def apply[R](changes: ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = apply(Task(changes))(f)
 	def apply[R](changesTask: Task[ChangeReport[T]])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] =
 	{
 		changesTask bind { changes =>
-			val tracker = trackFormat.read
-			def invalidatedBy(file: T) = tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file)
-
-			import scala.collection.mutable.HashSet
-			val invalidated = new HashSet[T]
-			val invalidatedProducts = new HashSet[T]
-			def invalidate(files: Iterable[T]):
Unit = - for(file <- files if !invalidated(file)) - { - invalidated += file - if(!tracker.sources(file).isEmpty) invalidatedProducts += file - invalidate(invalidatedBy(file)) - } - - invalidate(changes.modified) - tracker.removeAll(invalidated) - - val report = new InvalidationReport[T] - { - val invalid = Set(invalidated.toSeq : _*) - val invalidProducts = Set(invalidatedProducts.toSeq : _*) - val valid = changes.unmodified -- invalid - } - cleanAll(report.invalidProducts) - - f(report, tracker) map { result => - trackFormat.write(tracker) - result + tracked { tracker => + val report = InvalidateTransitive.andClean[T](tracker, _.foreach(cleanT), changes.modified) + f(report, tracker) } } } diff --git a/cache/tracking/TrackingFormat.scala b/cache/tracking/TrackingFormat.scala index c1bb3da56..c0106d3a0 100644 --- a/cache/tracking/TrackingFormat.scala +++ b/cache/tracking/TrackingFormat.scala @@ -1,3 +1,6 @@ +/* sbt -- Simple Build Tool + * Copyright 2009, 2010 Mark Harrah + */ package xsbt import java.io.File diff --git a/interface/definition b/interface/definition index f2696c863..2365e9ee5 100644 --- a/interface/definition +++ b/interface/definition @@ -96,7 +96,10 @@ TypeParameter Annotation base: SimpleType - arguments: String* + arguments: AnnotationArgument* +AnnotationArgument + name: String + value: String enum Variance : Contravariant, Covariant, Invariant enum ParameterModifier : Repeated, Plain, ByName From 2c083af736e1be70a98d9bad8851d16487a4fea7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 7 Jan 2010 21:38:39 -0500 Subject: [PATCH 030/823] Polymorphic types and fix parameterized type arguments to be Type and not just SimpleType --- interface/definition | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/interface/definition b/interface/definition index 2365e9ee5..a89b238b9 100644 --- a/interface/definition +++ b/interface/definition @@ -42,7 +42,7 @@ Type EmptyType Parameterized baseType : SimpleType - typeArguments: 
SimpleType* + typeArguments: Type* Annotated baseType : SimpleType annotations : Annotation* @@ -53,6 +53,9 @@ Type Existential baseType : Type clause: TypeParameter* + Polymorphic + baseType: Type + parameters: TypeParameter* Access Public From 45f29074d2f1be065b0ae28dd8a5e0dec6eed635 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 14 Jan 2010 00:15:21 -0500 Subject: [PATCH 031/823] * limit stack trace length: trace 'on' | 'off' | 'nosbt' | 1 | 2 | ... * updating license/copyright --- LICENSE | 2 +- NOTICE | 29 ++++++----------------------- 2 files changed, 7 insertions(+), 24 deletions(-) diff --git a/LICENSE b/LICENSE index be586c877..6d1f89b26 100644 --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (c) 2008, 2009 Steven Blundy, Mark Harrah, David MacIver, Mikko Peltonen +Copyright (c) 2008, 2009, 2010 Steven Blundy, Josh Cough, Nathan Hamblen, Mark Harrah, David MacIver, Mikko Peltonen, Tony Sloane, Vesa Vilhonen All rights reserved. Redistribution and use in source and binary forms, with or without diff --git a/NOTICE b/NOTICE index 07766ee52..1c90f88f5 100644 --- a/NOTICE +++ b/NOTICE @@ -1,5 +1,6 @@ Simple Build Tool (sbt) -Copyright 2008, 2009 Steven Blundy, Mark Harrah, David MacIver, Mikko Peltonen +Copyright 2008, 2009, 2010 Steven Blundy, Josh Cough, Nathan Hamblen, Mark Harrah, David MacIver, Mikko Peltonen, Tony Sloane, Vesa Vilhonen +Licensed under BSD-style license (see LICENSE) Portions based on code by Mike Clark in JDepend @@ -14,30 +15,12 @@ Portions based on code from the Scala compiler Copyright 2002-2008 EPFL, Lausanne Licensed under BSD-style license (see licenses/LICENSE_Scala) -Portions based on code from specs -Copyright 2007-2008 Eric Torreborre -Licensed under MIT license (see licenses/LICENSE_specs) - -Portions based on code from ScalaTest -Copyright 2001-2008 Artima, Inc. 
-Licensed under the Apache License, Version 2.0(see licenses/LICENSE_Apache) - -Portions based on code from ScalaCheck -Copyright 2007, Rickard Nilsson -Licensed under BSD-style license (see licenses/LICENSE_ScalaCheck) - -Jetty is licensed under the Apache License, Version 2.0 (see licenses/LICENSE_Apache). - -ScalaTest is distributed with sbt (in the subversion repository) -and requires the following notice: - - This product includes software developed by - Artima, Inc. (http://www.artima.com/). - +JLine is distributed with the sbt launcher. +It is licensed under a BSD-style license (see licenses/LICENSE_JLine) Apache Ivy, licensed under the Apache License, Version 2.0 -(see licenses/LICENSE_Apache) is distributed with sbt and -requires the following notice: +(see licenses/LICENSE_Apache) is distributed with the sbt launcher. +It requires the following notice: This product includes software developed by The Apache Software Foundation (http://www.apache.org/). From 7fab90da814e97e645d00b7357b51117cc29ea61 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 14 Jan 2010 00:16:34 -0500 Subject: [PATCH 032/823] copyright/license updates --- LICENSE | 2 +- NOTICE | 35 +++++++++++++++++++++++++++++++++++ 2 files changed, 36 insertions(+), 1 deletion(-) create mode 100644 NOTICE diff --git a/LICENSE b/LICENSE index 49fe1ee66..27b5d6df2 100644 --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (c) 2008, 2009 Mark Harrah +Copyright (c) 2008, 2009, 2010 Mark Harrah All rights reserved. 
Redistribution and use in source and binary forms, with or without diff --git a/NOTICE b/NOTICE new file mode 100644 index 000000000..6f5297e2e --- /dev/null +++ b/NOTICE @@ -0,0 +1,35 @@ +Simple Build Tool (xsbt) +Copyright 2008, 2009, 2010 Mark Harrah +Licensed under BSD-style license (see LICENSE) + +Portions based on code from the Scala compiler +Copyright 2002-2008 EPFL, Lausanne +Licensed under BSD-style license (see licenses/LICENSE_Scala) + +JLine is distributed with the launcher. +It is licensed under a BSD-style license (see licenses/LICENSE_JLine) + +Apache Ivy, licensed under the Apache License, Version 2.0 +(see licenses/LICENSE_Apache) is distributed with the launcher. +It requires the following notice: + +This product includes software developed by +The Apache Software Foundation (http://www.apache.org/). + +Portions of Ivy were originally developed by +Jayasoft SARL (http://www.jayasoft.fr/) +and are licensed to the Apache Software Foundation under the +"Software Grant License Agreement" + + +THIS SOFTWARE IS PROVIDED BY THE AUTHORS ``AS IS'' AND ANY EXPRESS OR +IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES +OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. +IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, +INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT +NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, +DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY +THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF +THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
+ \ No newline at end of file From dc7b7d536304ccac331f6067b68d0231a71ee7e0 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 22 Jan 2010 20:17:49 -0500 Subject: [PATCH 033/823] work on source api parts --- interface/definition | 95 -------------------------------------------- interface/other | 68 +++++++++++++++++++++++++++++++ interface/type | 27 +++++++++++++ 3 files changed, 95 insertions(+), 95 deletions(-) create mode 100644 interface/other create mode 100644 interface/type diff --git a/interface/definition b/interface/definition index a89b238b9..9220a9a0e 100644 --- a/interface/definition +++ b/interface/definition @@ -1,10 +1,3 @@ -Source - packages : Package* - definitions: Definition* - -Package - name: String - Definition name: String access: Access @@ -29,91 +22,3 @@ Definition TypeDeclaration lowerBound: Type upperBound: Type - -Type - SimpleType - Projection - prefix : SimpleType - id : String - ParameterRef - id: Int - Singleton - path: Path - EmptyType - Parameterized - baseType : SimpleType - typeArguments: Type* - Annotated - baseType : SimpleType - annotations : Annotation* - Structure - parents : Type* - declarations: Definition* - inherited: Definition* - Existential - baseType : Type - clause: TypeParameter* - Polymorphic - baseType: Type - parameters: TypeParameter* - -Access - Public - Qualified - qualifier: Qualifier - Protected - Private - Pkg - -Qualifier - Unqualified - ThisQualifier - IdQualifier - value: String - -Modifiers - isAbstract: Boolean - isDeferred: Boolean - isOverride: Boolean - isFinal: Boolean - isSealed: Boolean - isImplicit: Boolean - isLazy: Boolean - isSynthetic: Boolean - -ParameterList - parameters: MethodParameter* - isImplicit: Boolean -MethodParameter - name: String - tpe: Type - hasDefault: Boolean - modifier: ParameterModifier - -TypeParameter - id: Int - typeParameters : TypeParameter* - variance: Variance - lowerBound: Type - upperBound: Type - -Annotation - base: SimpleType - arguments: 
AnnotationArgument* -AnnotationArgument - name: String - value: String - -enum Variance : Contravariant, Covariant, Invariant -enum ParameterModifier : Repeated, Plain, ByName -enum DefinitionType : Trait, ClassDef, Module, PackageModule - -Path - components: PathComponent* - -PathComponent - Super - qualifier: Path - This - Id - id: String \ No newline at end of file diff --git a/interface/other b/interface/other new file mode 100644 index 000000000..78fe98691 --- /dev/null +++ b/interface/other @@ -0,0 +1,68 @@ +Source + packages : Package* + definitions: Definition* + +Package + name: String + +Access + Public + Qualified + qualifier: Qualifier + Protected + Private + Pkg + +Qualifier + Unqualified + ThisQualifier + IdQualifier + value: String + +Modifiers + isAbstract: Boolean + isDeferred: Boolean + isOverride: Boolean + isFinal: Boolean + isSealed: Boolean + isImplicit: Boolean + isLazy: Boolean + isSynthetic: Boolean + +ParameterList + parameters: MethodParameter* + isImplicit: Boolean +MethodParameter + name: String + tpe: Type + hasDefault: Boolean + modifier: ParameterModifier + +TypeParameter + id: Int + annotations: Annotation* + typeParameters : TypeParameter* + variance: Variance + lowerBound: Type + upperBound: Type + +Annotation + base: SimpleType + arguments: AnnotationArgument* +AnnotationArgument + name: String + value: String + +enum Variance : Contravariant, Covariant, Invariant +enum ParameterModifier : Repeated, Plain, ByName +enum DefinitionType : Trait, ClassDef, Module, PackageModule + +Path + components: PathComponent* + +PathComponent + Super + qualifier: Path + This + Id + id: String \ No newline at end of file diff --git a/interface/type b/interface/type new file mode 100644 index 000000000..c516b62b9 --- /dev/null +++ b/interface/type @@ -0,0 +1,27 @@ + +Type + SimpleType + Projection + prefix : SimpleType + id : String + ParameterRef + id: Int + Singleton + path: Path + EmptyType + Parameterized + baseType : SimpleType + 
typeArguments: Type* + Annotated + baseType : SimpleType + annotations : Annotation* + Structure + parents : Type* + declared: Definition* + inherited: Definition* + Existential + baseType : Type + clause: TypeParameter* + Polymorphic + baseType: Type + parameters: TypeParameter* From 058e28e9b134f67fe4bb8ebb3c349be52dd442df Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 23 Jan 2010 09:33:42 -0500 Subject: [PATCH 034/823] API: base types with applied type parameters Compile task: fix detection of classpath changes Aggressive compiler seems to work on scalaz now --- cache/tracking/Tracked.scala | 16 ++++++++++++++++ 1 file changed, 16 insertions(+) diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index dc8c6095e..f79a2a7ee 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -16,6 +16,22 @@ trait Tracked extends NotNull /** Clears the cache. If also cleaning, 'clean' should be called first as it might require information from the cache.*/ def clear: Task[Unit] } +class Timestamp(val cacheFile: File) extends Tracked +{ + val clean = Clean(cacheFile) + def clear = Task.empty + def apply[T](f: Long => Task[T]): Task[T] = + { + val getTimestamp = Task { readTimestamp } + getTimestamp bind f map { result => + FileUtilities.write(cacheFile, System.currentTimeMillis.toString) + result + } + } + def readTimestamp: Long = + try { FileUtilities.read(cacheFile).toLong } + catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } +} object Clean { def apply(src: Task[Set[File]]): Task[Unit] = src map FileUtilities.delete From 1306625dc6c006c6821f21b557e73649882dfafe Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Jan 2010 23:06:23 -0500 Subject: [PATCH 035/823] Use published version of SBinary --- cache/lib/sbinary-0.3-alpha.jar | Bin 143178 -> 0 bytes 1 file changed, 0 insertions(+), 0 deletions(-) delete mode 100644 cache/lib/sbinary-0.3-alpha.jar diff --git 
a/cache/lib/sbinary-0.3-alpha.jar b/cache/lib/sbinary-0.3-alpha.jar deleted file mode 100644 index 131ec72ce31c157fa6f6b6dd3c94d9a92b54b647..0000000000000000000000000000000000000000 GIT binary patch literal 0 HcmV?d00001 literal 143178 zcma&N1CV9g(k)!-vTfV8ZQHhO*Dl+(?doEe?dmSuHoExtx%l7tZoKct+k34YF(c-R zJy*=kkzH+|;D33?2PEoD3cH%+zeNGUF1<&Y?5Cv>dIJ z+`L;A7*xg?))0<1wIx{k5!H)1ImRJ|26}qQ5ju_M5!t!bo%5|T@IMyfoe0z>@#k;; zy8l`T^gkB$pKtNkdR>gI?2Vkg82-<4VgKV@asD<-!pz>x*~)~}#Ma2gh5jGEn_5sg zC>mH_eLdzZNlcy;E!}h^+t}PnnowFEdxA<7#5%C@Mh(q$x5dPlaQ4>xF!`8;>cHd(gF( z)!O2lnrmV1%1xWbmNj2`40~W#TQ9L)49XJujW>=vc+q4Ao7YcUh(M#QoKJ%;DQ)jU zo;l4(ONYQVcKQHYxlDWA1`g4rtv!yCH;A3OLp(#p#*(((KezI;=U15flyec>m3ZL1 zK{k_Dahbh_gb4DH;)xrMx*fhQ-okwbd*?>j_ZzOAVP3$BB2g}kxkN~!CvnEl;M1Bi z@SiDJ>MXx?9ZL$V4ob*_Gi~`ZXky|mTDv=Y(C}1Y)E1kt2lwjwb0_FaDN0R8ya9LG z%q(T)gCy2rvizCqKEEs~?K7G99=J|Ys2;w<`s?rs6OkoPhYmnN-BWGDxW%e7P6_z! z0`xc$6jG3ek|iH?iE`DM*If!3Lz+#oLuAHfZ71Nd_F-wL3*<{1jn?JeNDYdjq?BqX zy1mo@q^>L-mx{>fOf7Mhzf-BmB;Z)ftGh^XzY?sQ!DP&6#Sx^;MbZl>N)7-OQB-X8 z9I6%PuAyQ$-x0;8A|dx+Zx_pK9UJO7Ce~?8h+ZpZ@4U0q#mG8~eZ8gpIMkHwxZ74q z&Mc!8y?;?<9#mxU`2F-tU6w2mPgIo~M`QgN7TvfMS-yjOMjV1krQF+(y->Uq`csp! 
zcjH`AtxOUA{KQ)XW8y*2{Va?!Ey+|^6=2y~JIi|D+<7xOzUkHN1)$PT?%#$js3T=D z(x!XF3-*V!;Ppi|<*D z!c<1MzO4{}#l1JYUT*i@wNZu>mR4^YnN6gz$HeJ6LmUhn5b_R3R#GM$wASt#ZnJT7b=l-4ET7p$p7+@CKK z65mK4ARi)skcB1ElK`{V11mGsBcg_0%Hqm8!3lkkEE&*6Wn;(?2q6_O`g1zL<_v2@p6TKp_F*4xHM zS&o0cPFi_?B;CgG*S6=IM#7Ympvy1?Bj*^>{${bY4+$cgLL4+&s^v`ZjEzKDg9v00 z9CsSzbAvX0S7q(2Fj4+xQrAjj-AaHd;j|VnQ#T+t-YeqLH~vAxUMImi4j>v)U8jwf zGkh_=z!>`~+qJsipuP#ISu3S-8WKiCI6xG(llGFD89 z7c&0{Z2M{=E;z!6Sk6=u$4nO!q>*j(>%-Ku^Gj^iKSUje-PnMXu%-<`LbF|-;R^M| z8^)G=92t&uP<(%{OiBIijb(ez9@7!osqW+zT2*9$`HRzVwDQ(Z7hXdW--RCuI&xw_WHpzbhbr|cq=hjugd`@Ig zxT0BZdCPmE`j=l>gkUxLFGHPC667Qyl94C?q*Ukeq4%UcZ{?gdHr*09gY9eZSjr4< zMIdp@%ni$zii@PMbc-nHJzqb#g|-MwXh)*(gos_ z=|Hgn4b!VU@_%HUO%<$Y1KIl-I^~vVM<3f~%+P!UAM_5o-2x{1qN3*Lci6pQm_X9n zReWpBdcYMC11q@=A7+eE8uZorK+z+)ADOOQ+rrn+eIn#IU6&Ijc4`)CNvTQ=D#F1M2BXy zM+I-I-R;S1I{>9<1DMa3s9dDjacdd}!~F#H0esr*;dX>c2dt#paXV%Iqx*8^Im7*u zwc+?QbK?uFw;KUCBgU1T76vU+N(4!fK+ecwPD-jMYDze^d>F)TO<@-+YA6y*Thzyy zL~KnO(*iSs6gsJt`%st3+oY_j;%uTUsuJ8#!Ce`kZfvCNM~POR6LmTrRugqP85R=- zZY<;}0?asoyS{`MEvyI>(J+>)J<7p|kq!Q^^sL?6qUmL5gRLCb%EKe>tjK{JE!*2sP?k(V5u{OEpjGC9Vk}Ut zvWHS3FI+AsO(79vAumt$t>aWC6BLe@yQ1I{a4sW5u8krb9jJBKQvTBjetS@iby9?1$(Bgk{tE}B)zFCD~>hAHIr$^hNQbOTQESC~xlHpPIQhV%kY(c4ZI zMYkKMuRfEA8^L}&1Vu%;9K|l0ON%U3QplLCqJ-ar`h~REu=7hYs$n&-9UN!p1(lgK zxE)bH{_N#|x-z%b_GiOfTidw?zGHoWWVrR!Tn@yaDosiiBOzL}^T!wDHen%Fv^f7vDr9&r(VDU@p()QcH> zHl<7512uc?wcm`8fPkde1_dO%>FTml)$*Gjf|8{P2DfJUX8mY*`)5oEIEwmrOi4Ie z5ku`_6>cbrbKz{2Za9IXxwk4Ys>0s#J*X7Pf27UVjXuRjopZ#mX4Nd4)1+EF*T@Ox z(4J`BU0_ZP2Gw@dyuHxy2hsi>KQdipj0#lNps0hNW>^rESq`HLbf_5QE3QaTJ6&X~ z2-MajtDmkkJ3s+-sOalQ*C=bKmFo7FL^9nGQWfi>1p|%H33X&#%k!;9QTtV252=uC zH$U=kUI%SS@@!5Nvrg%c24h7XjK>@d$V9Ck1l3Pqm@--;5?#z?X84Y*Vm}|ol&iQI zkKWI!z1*G@JsaS4=ik~At79s&3x<>W79S&$W@Mbu73+#jVdL7-iKtQD@ego>lMCby z5e2~@#v=SC;Sng4^vbGJ-&YK(M$9FF7W0asBfAqGh#OJ`=OpQoS*5;{9?%K5MS7O> zinZh0Hy>a|~!xkIGN^bP#NbS_f}vLJ4;naZBQRqAqP?bEru3fBR6hM>Q;D`WXC_k 
zBqu*Osok|)qbTgU!N?Y$ln$ruQ&0)>IOg|5$&W2ZHyU2;m#9lkHi&SK*E4ntN=RcBU#NmSQDe3qGWZxu*`-*x#ftu^d85&yU9*R^?IAPXTU;$~)eMeyd4hH&qHy~7 z0h`)Avhx>7s`y~~9etxn*l)7h|J=d>UT*^>ht?O^8Lqr8RSsvgtF(B8l7YapXl&hHO$twokgd}k(aSH?HS-4 z^FoG9@lO!MV90u#l@ZkO=a}om?qPCQ`Ft%uQXnRTJS#s20EtdA21AV!M#|eeIlCd@ zZi5=MWTLmg(XSORH{m?vo-Y1y8Dy!6HkB=P#>;}95}^zBOFN2UO=HSr+*m>aduVEU zu5g#|woncd16&UBx#Rp&;sRU_5`??i1;lDx4(hqfe8$OvmaiTjmHFza`r>A5!>#cB zD!zx=+lU|`iWmUsd0o?qD)2{^Sw_3qx5%P#<z)zeEJ<1_^Xm;o}pv90pw0V>pv}U#}2-QQ3VNN4`0xHef*jQ+RoH|j+MNk zAMEEd)?e5UCvCiVij2Jlif#cj2S%A@`y6pgA{VE#(@S&3B%0@^%L&!a#LY%UF}Tzi z)rG2(ElYEyB!iUb)6xnL!j zeh8G&46CbuMlzItHqY4E8#%#2t!q17?iuImkX%8YRTga#Xp>5ZI0ZNTENT_=6!ge! zqAhY2dlw(XpJqtyn;GieW%4QZ>z&)$S=0N{GkrxO&4vN00s&aAI|ow>jA7Q;u(Pgt z!1)8^lb72FoAj$vm(W%6Z5O2DXs`y}C6o?hAP4?Aq>eTS+T;Rv8wRPWz4_S>keB!m zX6Y>q=pFUQ9S+hRH|j0W#zAb`6t_M?6{vl6K~Y3stVyUm6dH5C!^(t>1c>c~XC2a+ z?{|&?11D@!&n5Y8#Uy>NIsD(|9QOa>(50v=V*kad-aw3W=6Y&~S5#p}32`AmxcllP zCPTsr@bv|xqDDYP7~&sysni#R8WX&iE(tsh9)+u!%I7JXBUtgiQ7RCx|53ZQ=dnZ{ zBA(X*#dX#n+uxJzCpn(i);^yPp}9ZMUq$?DSyLk5gg6QY5kQ8+i_Bp%?)ne6jf%IYc*t%lnm^(bVzglbh_#njTf)A4+%UV>Xe3o zi$EaR@|5$*>uY%7p;JEFD2}D5bntZ5X()6V-Lpzb_>^dlbgJLnxd&xrh^L5ezLJje z9-4)J1YkaD#V<6A`UtN4lrp*C3MG2QvSJD^fu*uvDC2V9yt!=1S~-|+q{>|J3$;Rh z7&^a9^OJK@+hnTF9A?SsZI zUyTAIoRTcPW*8aSdF1rRQ~DiQZK3AmrP~m8(JqvcZcVd@LK8^#%=cQv@e&XA&<4^h zOpF-x=xU1XJ`FtTYiW&<9KZe-+k0BFnXNlB1_4K!q`3}7@h`jctS+Q`(y6^+O~~=m zFDD1Wt^n=TKE<=oci}w`&NUEs=@NBg`w@3yMApQ)v}AC%I~HyDi%wPROD!BZx=P1y zZ~jlzuEfZx)e1lB!r@dUDMb;K8~sc;Znxa=n?(WsL0&T95yM19UeoYT^xHYFK(C&> z?Ce-yi9Q3jmnjPA%q;}#>7yxlQihEr-y8~Fo- zo|$`ze(YuItVP^9Pwx;Udma@Qt9s8aP6Q{~0~3ZxfC6lp0 z-?c>k9r-E=jGuX>5o_69Gi7gGW9eO!eWtP0>|QhFYVB|NSwrn)9d8+_tvNtFCF4q9 zvT!0P#M*z#4!=}sHMx+wPPWucg=O54VqK`U!o>pEaok)lw&Gx=bgEGUqrKwAQsHD> z5^bTaX#E$eM(5QWZIrDv17ZT@O8Z6}?aUcL(=L>TD*V`pg`ICa zc^t7Vd%tI~G z6;zDKOH7q$hL9rp5k`kkFWx4$Z!-WHj)!!P_=2D!{*hIuu}?6d5iSZ5h!~5sEXgH| zp5T`9&5#2bKn>SLY(W^;aFHVl&=vvc3{Y@Ekz8YLs=rVhwb6s>3@5`H#C3rxykw!N 
z=XC44-iPQ6g>b>^tp|I}<{rYkjNgQ!=z!y5VAOFu?v~JfRmIKt ztL@<#N=`p2mpBPeaK}}BDkD+3iI{@q=z>5TK{(Es>Pz5|Rq>%U@=L=~`>5=GVHZ@x zOKjZ62c>hU*Yy2ghK%v%PY|0&qa)Pk+!Q_gWFha<4`9;-a;GkyLA4(v|7uE<)RNHW z`~{f*kq<}r|7cbILlL2;|Npy`uK#l1CuaHYS)sY>f(UYV;t;9}usveJ596}}Ia&bK z@a&dVX-xqE>1>e8p!w+W7CU6Cv0sS9Tbf@oymGLbta-T}GQLW(J7!j+nrv5h80L@1 z)1EI|?hhX?yuCaidHXbibf)bt^13V(9E1H0Hs>w!T$;2NDAE0`52%CXT1opIFf!y$zi>JdFLU!fdD@W0FnIbIB zqZp1*sg-0KSKfc4gbpSDBlB%t>;*9z$QFW>RM%*{h+*=aT%JmK#+lOm>!QR2ODj7> zh^Jbm@{%D7*0fW+l(3sLL@;3edoR{zL1GeL&}e2*rb_HhB^wSgw~lEVVr7dZhlp}) zZmY}VJm?VxfxwtmmB~G{Xydz|0)s&s1bzgglLlwRIOCb1RhKdb^VV{km9sElesOX$ z6xJHaK70ICAi*C;F<`n#;=w-Wky*%GaCODdg=;^f=^N^O&8?@ zp6z~tALd<>1D-zzPdAo zW&gg&N)l(!RF-@Bq>Nxk14kQK^moqqXUVfLk2Z_v5Dd?aZ&Ob-os*5`3S=BVkuylK z5H(Sk5@KLrKH^KX@t+)%TdGFCqI6rZduaY#Ex2C(UDQcpZbIH$oCz(FyRmABaI$Lr z0rzgSDU`fp0m-yJTB+1j!CYB0(MDpoCuou)Jwsil;ygnmZh1feVcVbY9rl*2Rl;uMl{iv!>pBQgL-ki0&K40_A$GsI^W zdR>4?)QSbH<0sYu;5ib*fq~o z>zd{${a_OMrm^rrp8m3!MjVkfYtL2e@&cWDn?*B{Yh8ZXqF{fE!BeW#bL*m8`mH4- zDn@!77QT&NJ<2s3;7D34If0*o!DN*;7xRo^d4T^iyM(AhJ3F(*iO+qka__f!dQtq- z>x10}!+Rxuxp^UOFzfjs*O?PZYaTVHim7{mRcp5J%2M?8FY~E7N%k(uZ&Zr;A5kgN zza=V#zeAIMa*?&%l*|Ahd``N{hE0BvnEV~uFpv(h&TyI=Jv~8&n66&AG_|>gLIgxB zZj`AvIxQ}j+hcyzMgW>U?cY@(Gd{XIf46}^q6V-_nUtUC2FJ(BWfqC2xQRP4JP8rQ ztDQ%_lM9q`ro{}X6+LV_bEO-99`vhmMd%Rk=ecrXzUwTD+Nrc?sm8p~K^w$RDU{E0 zyUwWOqy4GJ7#s*3B2?>bat+48VGH;~%NG<*-0`PC22%zGQR165674ezJ&I02!hu%b zQR13+DoXu8L9b8P5dwKCE&!CC&kE6M9jl`{Buz(Ag3?B+EJLMXTVgYZLIZ8w!#HmP zpU!V?k%lRSte6Wb<@yukEYK>0)yZOQqq)jp52cR5Qkso=#wz{{KjQDTbA#-1FpsFm6h-VCW z%W3SF_{b;nkuw#~mWb20*^6nVD(F~>A}z`j$5JIf>H-LE0W$Wnn!L(Gn;H*8G)&zZ z7jJdHWI5t_{`vi$y7Pm|P?c}T9;nvZMy-Y1&SAz$YQ{qMs->8^oi$r;=dsAPUB*UT zuLV^{X*o7V6-0rZ^;TL|uVdL-J?1e-`TEIz@kn~2{^S3n>>ao(TeNQNs@S&e6=PLc zv2EM7ZQHggsMxlhidC^I_RZesobR4{-*$Iv-(Q$(%`y8JqmQR+f~fuxY$lz7k@o+c z(JslP$TrTz5%`vJ%w{Ax+I}U(n~bi>>6tv))6}drPJH>yzjj7t3ZDo<`6syKD?Ft$ z7LdFaiF9(ms9&$zas(^wJ599t1QyW0=$Q);_l=~yIVa6px4$C>JUIEbyRi+#bk`rx 
zfO$PVV>~Tmu0H9v*h7%%m^LRpL2=%R6Nqo4Tx2400nmyqsZ>jG)xXtZo16Y<;SC3r zPYyjJ;;HV-mV1hA!rLEY8=kAhea3c)k3KAR_*C$*exkjBa!$J4h=f#AjXj_IUG(Ii z$lucNv0O=JY-IXzK~~?MHy|7DvU;?vgWy>ddNa+nz#ZdjUHW~>aNJ;RPu2R^{Oi42 zb+#Xuy3||u2+}srd(7<$ciQ}B9T0JBlA{|-H9UBAd31`7vRE&7LSkI`t4a}_f3)Vb z?zeWWxmd$r!WNl#?RdP*nWjDUeYg9FA|l-G7yk?bXcAK+_qY5g6a$S6)i>@Js@bWY zDqF^8BqY7gslW;98xIaHP(>JFzMIC1lvR)Pfnn4a-8 zNCR6to(2}4Tp`!2)U7Gk7}4A}v3U(vM-l)#Gu}FKo?QWK4b0@goyu{vcQ5 z$o=wMy^ioTcG8Zjq5%&*t#i3)ujoUCm}$?QY928w22bA(A{gS{{FkARv=p5{2=?%n z-E(H)P3n8+V%4aVJ>3q;Z7_WM&Zb*PXZl65 z_QnnIR=$d2JdniDIjbCBiyu8OO#_iht2tZyyyEp_uJK7Lx(8$sV|S0A45TvnRFA{O z2o$5BK2RD&sZi2(Z2l^*_q{0XvJ?z&e>1-=LzKtUKetn+|7gJY?|1zF;(q??9dA^T zR9#a+>Tlg9QWVw5kYD{|Aa4f7DFbi!qmk)L<2b#kOb?5;fq{Usf8E4wMA;Nou7H z6>*NhBo~d~tYJ3up!Hf>o30oFL2BH0O_|Z_^$r0Fs^gm6n%C74CTc}y0g>?UL*fZV zS5JB2Xn=S*l0tAH0Zn2Hvzrg+In>ooGf^>JGtpUgjop`70<)inX2aqENU!XguNbR= zfO{k#vn$vU#)1zt>UNS!G#MRkL!kh(ObJf(G6h>rBBt;WZ%}B^q)jMg=NTs&hij)* zL1Fr02xRo%8E_Cn(NN&<&tGp#5{_cbks#$xXmIopgv*NO`Dws4EEui$Wh3({SOsM4 z7#+p{tv)m|qD&@If%dQiFGuplWZ&`=8;mPa;xribPXK~pGSg;uU%yws=|dfyWkFyv zCDlZIy@Bl%Qo^K^$>Dj)F7iKk-kF0r(4Es;0ogCC`xpV7=kL`)sL~;^OAK=#XVG*9 z7G2IQt&x6Jt;wuy>-}mua=Xa}zXXU2;DRZyCY@blBSB~_5zGADYM0owRS40M8@@fV zpb0iRy#s0{IxrCB#nqc=WuJriv&$O(LQfU1OoA&N)gg7vQmgh3*jX;|Fq2q}(glu2 zWB+Sx2k*l2ipkHZ-APRto{zt$REhVG)>ipI@a%LG(x2~v)AH@v6@+?S11a4F_@e3{ zT|QO;+Q0A4a0K3GeS$EY>AOo za_AVn0cwQje=OqT)RV~5jK?PQ047$<~LHgX8pyf?TPOWb_r|IN1 z;j9U;5vwT_a`uu)@kts`a!nJh#*|j5GEG>l*rD+E7WU{y6~uUuE94pUn5pzy)O*kv z^7J8B$yS0hFh+@V&olZ04&3-}Zi^pwgGMlZMNroXyL|gn;(kWn6!lvYW2DhlU8Vtr zQP;}c`E%9!sc@>hDU|C!)n)uY!v6lD4QBiwZScR3Cz6#d9Y5P(p3}eD;Gd`oU%;Og z$_T&i{W=CMH76!pSaA2tmXn}P-XD~&C09)$y1mwvI9~So5~8}6-n7LzPq=ujJ?ipm zw0iB<*yee^={UXF`TUZr>kApz_W+k4aIA)-yI7gEqZf?3KG(wqO>P)a^f~Czq zE@Z}CgX^_Xp4_6WBpI+fs(n|T@uNb7y!Aq<@{@Uq6rIcyy==X~N*2WwC1kk9Ooj)? 
zKSUhzSUa=%yUCF1fo|gf-k4uNErigYh$JoU)0D=S0T>D8LY1NIJ79^6X@qn9-DD>F%3QQjz@$tuj z$YfL~kGBQE-#kJZ`O>EZ27*M)%++pTl>{V(MMh{m0~qCM4^vChl`pl}CZ4c*Kh>m+ zBM04|OOJC$>03G2)MjWyEVwYpc`JTd40igqx&)&Q9FcFdVR2xWXl`<))c?$i!fiqC zOMQ%x8CaaCI3BI+Fj?)`i%BiBZPYzp604A>zpxAB$F`e(rrU(R!}{pw#l}Oo*LB4r z0qu~;P_N8{r{9YStz^cloeZ!(7AbCq;IS;7$_b3OK<7I1*v7TJgsqCzL} zACAlt5(~6{gA2@UTY?LPoRa>H1wV#jc;|A01#;*&#%?k&fsLNN_*Kg|^?%5_BSHg)8&>pDi&*JiTrWv)>x8tlab6SOb z!{;SFhi3Iq$YGQ^0=~&+{`cRU0Yer5OiSbQwUeyu+)Wr04G*F_<{PzJ^=Y!X!`0rzzdhFo2mHR-s z1y%RTT}Wwy6Ju~}ctbJcyj?)Z8Ou8<2hRE$`%=X;lw|K3oMG^8Nr*k3sy&=~h@$t- zaU+CnGx~2TE^U)imMl6lviuz~nR1{%cQGoj^BNon0;+z}QE^rqVK9jv(6cN9?%R0j z=b_@ZL|V1gUdN)v8M79R+M9ajAU|qdmvGR0` zRj^h<0cNP*H-98B;Z^SmMxH2_Ef$rFM7Zi5&i*Gbea?;z+Lx#dS4&1Ja5UAfIJ>!H ze_Wb_?;Gt644nWUv7_Z#<7cg6KUPVju3+{1ue0_yiyW-^Fysk~W7jpru<#Ga{2a0L@Eukhkx9A^TaR5PxzP zDQI&+UJgKTYLB#x-_O|5A!?pO;%jgyD-ZfAVCD(yjgVH`2fFGO2DNw@_(rQ9vrWZQ zOsMd1MPGGeL@Jxrt4^fT8R!`F8dq`ag*i=uaLlHh@q~-%BX9vuKlvIA+uBD!{}F{& zRdbgA>IK!S^qIV;dX1bdJ7Ugnny_t?O7sDe`|jU53!RV5|JV=bM85U_|5C|eIN zAXe5Rf9spU<0{Nj?ZXz94GpE7E~9P_&7az(>E8L}5ES%%B4fd6Fbf`$%VlZ@!zVqd z;`8G92iNax%5j*_c*%RaN*(Ftvo|7SpH)YoESn^Mnd@gPqfpJ28+ zE$ee@dUuPW(iao_R1qgkQ~ZtDs@LkI7^8RylreU$~PcU0U~5M_jnQrM7ca0 zZG~o2LFodHU97mnE0XM=>%`v{mE6$P>in<*8CkvT=-9s-*V3yDhWHIIt@{y2 zU8T8-W8DQv%5ACCSw(YtBExs;+hC)_0Z+s%PgI@ zpX|$c^UAF*b~Z|R3XTd*zwd&Udo46r%ux`oz(WJ7XoE5yZ%3OFTp;b8+!@Ip=lj*L zLTniu&}0}C+V@?dZNuGAZ572^DKpdo+!ZG9ngEi0ZHf0kw9*p(b`+(`+5FYanzSw* z1F6=$1Fk65+k(7>dI3G$hGdyc#YM}aiP~1Zk?X&{Nf!uWgH+;$11+9UUd~Q?iTLhL4fB;xE)88B4%|k6#hO``n$S(Q zo*4n4cgo!g%4M<;@arUVcxao10G3|=)YhmIr1JQfl28wVQ_Lox=TCt3FN78pzG(26 znGLjwT(~Z`K(8JY$9Wd8V zY%QN%i2hfP&Ag*}M=cjIXIxAQGfUF1x&!OkFJ4ypY2O|v=|UN~LV3t83!Q_~f#Y-& z$zU!_glb$diLW%^f7(uu*~k5$-Q2QT;9f^Z&2HFTSB#~E)G{e?i}%R_utSUulFY-h)%S~`A%q;2bXUx3paSZ z0&UBOWoDnVHTO%eyovt&J%Jz7JbkwRT+_z=V}JVpFO^5p_&;Y;|4HRZ5?4j#M;iVB z#EB14^@6iTv_z!|WA9amQxQ?&@QV+eFF*sL7>>j19|N`qWVb(&8gjXixjJuhyp@#1 z5$ak9Ec23gHSfyv%3EE&&u~P6!*CwG_t|x6^)vzcrKk3-ec9-?ofPZP%@ed)#7Xr< 
z`_jy3vcESH$Xn}erhX+yP2S#Bzxbm{d^*Uy0$z zUqd2=-^|!&&bp*nU~FmFN7cnqX*}_i(B*e?Y*O6Mi65m=ZMuXG)wf+%!*=;?t?+XK zU-D~)Ji?VmRoxV0&d3LWE&EE>Ky4{<#Gd4I1&rfrb!0O2@XjgDJ@eUGho)RTxh{se zf!VL&ybYZW7Y05_b?nK9+DQ?n2vFp+;m_klqh(M`I8=ONmqei?LEsP6eW?DG4~goK zLbz5TS4`Z^aX`h&ZW zGg``OVVXQ-C*?JFZRNKY8fM&l{Gf4L6hBFJg)x12#d-LOhx$Si0=#OOWJ8IP7S3JR zZblrONT`FAX8y~Z%{}H}LD~?T@=`sVAby#~O7)VJs*ru5zNk8sr%SIRU2BELV2$vy zTCt)im$*%R1u8ktW}w z9bRyhvU;mFhy`1kC0OD4E5?+32`EKS?IBR^0-tJsdeG-%p6~VJw;KXmVke+YCAN%F^bd#{u3}%7yThWO-6l9BVD0$#As3`}hqs=?*A;ur+ zsTzK|E_2udhF?n67hqxSw?_@^F(45?C;kqu6_n8-F!~f{JO4Op|A%1{>;DjE|MuG@ zE1Q3A*sy%qLzTHKcFCmYQkuTYD-eoDJ2oSH&7dG3{HDQ0y-yQS$O+SAWK=Hg=YT#Y z{q0kzKFV?O=u=!pWBZ`>w`+|5G%N+(Q@6Z3tG~SNXs>zyJP`wYv40&v5=)Oso;A76 z8mi@}#Lg%-jlpEGkTEu6$JhMUizJ1i>Ll36ZU`HrZFdG1D4WkVX#iBy;UZ<7N*Gt2So!5M{vg&8|5B?s4$thF4K~!^$crFS+cpYz(GJjPK3qFr=d_# zgf!#Picq}v+*RLLd;Awl8|cx_LAzF?0ZUB$4)H*;lnela9B-TE`g7#uo#e|#+iU0r z7YYIl>rblfEWyNnc1sk3SO@MLh!AWt* z!rg?(aB_|8NxY{;V~h%NAvo69G9ChN z&Zusi-c#~3$k}B=981IKorHP_97~JB460F|!b-R<&?5sk6vIwtV!(pUVxrs2(#yh( z(l-_o(O^sg2I??C4M_$t70i^Rg72YE5hd0@51j@uRmAOafC=uy!O)UoRR%DX$X%s^ zla6EWF+26DVMf`L)00wSxmTF*XECX6o(NJ7{OF^YJ9Z<dLO*gSKmSDq4WWVSJDv`CrzlVB%l2*JoMS1niY7CXf(w+j; z7mTzpJ7Tf#f0>LmIgz&C*gz$@+%3VjhEVQ7@b0nmmEPo{~bTt_7y5{oy9)xj;tQ>K8xkq)HgkrC!`_vk7%4;WbWl@+=rZzwXl1@~|^X%|n(tPkkD;*UVR9?L+U#@HU@ z{dpa-wQFn-6F)2uA$S2R z(Iv`1hg=)~=;-+GjpKh%VE^t>|Fv;+me-ct`-JY!XOs+?8CETxocN>}uX)!lG#8Wn z0M{>W5wPU<%2?tT-4&t;ulKCfZQfQfd+jRqsqpdNV*dI-#bVmUi73PV?h5) zM|7I*Su$*&v;hH@3E48D9crgoOY<_(*Xmh-^f$4fcwNhGn&*vfq`)+h)OSI1L;v2w z^o0-)j#iifmH9z#4E!WHiq~dcD>~EBdBG6dMe&T$F}n|CNOUhfs@Q9*+y)sJSFsHz z(Hv?5$_$KJ&VIt(cLj+fm7YSeIP>72;8tGxwW6AC@N*fV?sY_4Ov1AT&4f;-M@?N@ zh6gDLa1LmWN)R#I@^px}+Xn35wTz)4X%C6qR55x>M<&!0_$fA2`j>iXKO^;|EZO!U zIz*LXB57qYvR`?YhDtpi)5g?g(KG1+Wy}4ho$xA>Yrk$;zWzO&5MkrFtNyf-41bpOJsakS~E-Y@t<;sCf|(?otGa3~sV zh=&bzv`J*4)9U#y;aY5OzU&`Lodh*xAF8|JB?_jz`@q7*3&vSVRL>ny0iQX^yYvh&)O$s#8nC`{PFjp6L z7hkGNE)k%PUF{`__)%9-SF+x{si36y71xZd517TS^`p*IaW;kGCIp+hX$Rfq>cI;A 
z$MsNk0nerwH1++roiC~!u9F`1I)}+21%lEcgtCqDzzS*$bCgtZl!??B!_*ULY8f}~ z!R||^Nt&0`jzfjUC-}csV>+$S<<_6S7Q#RNTD1Q)N>X&vw=vRpFp_gHw>Eb&cQJM( z6Sw(Kuko*!)>}#27F86DXD^m+v!&NcT&x@wWW|rqY&|HY&7Kj7XrY(P91r9+ys;A| z@NC13M>1OJF8?F1yMV#lthlL2I6#!l2kkujddQ{LvWnYt*zs`P@v_x*tMlWK&5zt` zI8Rk;t8(SEL!Sj}`p4mt{~=8C0Xv2!Tctje5N-BD+VOMWJyycLN-|f<1Bohfiehue zbJK99*43d)<#(UT^__wdJ9@S;rp{rc*8L^wvr%T(GwE*i#!l@@n(zq;lGc#@vqSbb z5yQ28gy63E>J`X<_85q4;w*{FL8g$F7<;dQ+N+P=i3HLm?HY4v#r{zsN!U6k;9cwRZaWx`$#p#c~DT zr0ov@JV#*79v;0M^4dWH+P0e#%C%=KY^EE-Etd6@00F|E6or#6N3<+`X10NFW^rP` z-X|~{u_vV{1Jho;30`!znxzCyYqq^bxHbG~NA&VB~EmLtY9TUVlB}wo8^SduG+NNrF-w*_y zy2IJk2xEZ`jEn) zjmzc!k~jpjm~{O0IDLZL5RzC9)#77JavuVPcVOr2N8Cb}^9&#<-Sn5f-&d$zg80--h@!ygT~QX)|xRkL<4qcz$RiW0})>ZVaY@8d0V6gbeU z3@D{iRld3!jgD&cwiTck8f=i$O1Ua3y_1lq6_KQlkYtyeUc{%8@d$Fcvt)c5Xo8!L zDObp66Szm$r=x_K`xZkM9@Qrsz3VUTiqtI6W~OceD9VKuZ#_amvJn!K-S9nUvQRl)dP!QbFg0&U^lJE zD%EqL;omU1-%J*QvpMs@BTnus9+i`U_wom9+h&9WxnCCVBWs}6wrZc#NfDdf8nL#? zaXQ!^qvXpB;qBU;*LS$P5!Uyz_B~F2eWg)w!W*nt|C13w_?4iAdxu!_=IYlX&+a31 z&Xl;WqfznOJ!ViY%s{)Z+f7ZlcgpJaYn08rvz04BlNPVb`;(rRjkGc9AKsIjkw6X;WgBvllq;74;r>I+quc zIiC%bR0-xu<7RBL&$Lhv29t1+C=VyI)JcuPJz&zGV_qRnd&YPys9>4)MoAAkN?M zvxPU8^R`qA+|?9ig?nC-#Ci6%jN0X4=4Gyf@82A$ULZ(!Rvl#Wf1h~F2m#6r*yB-( zPkNg*g2mj$9S~9|X^V5LRUbpaJtWBKm8~0HF2M~aID-$ubC&#^`ejPGk$N`CPELJU zJcLBq-r3~dF(=_L@Z{jMkf@N(A zgRbsBzoC0kK)jeKAcHb`i8s@;vX_z?a7)F6&1stvv!fy1o~%L~dUr6^$hCiJ?jP@f zNVwz!ood(u1##@zGZ)Ij!@6)qbFh~-WVx`ZD3Eu>5Cp|@;c?nR7`eXkw*Ey1w)j0u32@~|(lBMx&6+`o5>K1EIR)$_n*>HL=cr?XGZekkMaevc|| zgdwPdy=AP7PY&l;_YK<-HF3>N`vW|**&g6$i$p=%A_YY0%?v6;_KIXg?^W(A3__k{ zLILkGLG12@_L9We(npYWx5z!u(}6Enm4_}{7;2~`QP7? 
z^OO}=G~rLyS3#c?Gn)8F2z&t+1YCAp5hbkuzz7P77+7#v0G&$$6GDi1R(fq%8PUbd z7zNkJ?$DC{)?KjZ^tE=bcTe~C&c(%fW3d>wn8T|9!O1tDjpj<*#m;6An^*S>lrON$ z*aEQao~Y-u2WqNG#C+*>d-yH|q>a^-8FaEuYufCBwE&D*wH+uy8_9%*Y&F%{^=vic z(v!~2S)E`+2Y!fV>wxV9^z31b3zAR>i**zxg?m*63iD_Q{h3ltB*qEX%DeeY601^<&o zgN~KIK<2MiOatbSR0md<^v~oQa@nlU~cwSC)z<8Aq9tj3F0ly>PFcVXp6HAUg1gDVy z4k1qq(YMiAZWfTQSF2z$VtHH=qGLp9;LzPEO_&k3iqHB4)c$bX<>^eSNYOMbq#i=J zbwCV{SlY~~HXA57R|}&QYye8I>f$yT<-G9m+gFPZD9C#py78>NZYFhnyUMOJ2zl@n zMUuphfaG1}Luw6_tknvs+s%bcJv5SAAR=uq@vDRUQdsTVD?8nPM)G zI%`C`7)aveP$|h3pdVr zeE@=O=XtAZX}~v?UvtJ6LTK36&Q(R=GHn&j z;j5}tye%z*M9`xxaM)Na%%w`?MGax#D;3_wWF^})=9UwtQf`|*FP`22Pk!XNVI!1O zK_bnNbr?gOdoC3m{QM>(ImTh@t5Ll4aXdU~kg!<;)ar}~hPVV}noONJk=xAPV^Z>EQD7-UY^DLSB^djeAH4w&<0JX?lv zN5vLeaa7a;d}5B5+xblQ0Q&I;?1~km{f4fWWs|#}+CR%`gbvUiVC?PYa+Nkj!{A$u zG-*8kQnWfFBg^=tO(9udhud)cdm2ZENJV2QRQZ59>byUapad$Ct(q9l(kt99UhDs;t+j_q+;Qf%UEzt+Gi1+i?U?#8c-{7H(kLl1YZoE56HuHV%os*#(J;TW+?e9tHCSWXDFgu zWNv@zD>uR)Sl}STE@yX8?vwHBP|Y@?54!KQC0(Yjso`VnkEoBLn=za@ck;W=G1_uM zyC@Ri&X9dp^{no9hVIkybR+8x?pWhU<8_*D?;N@%<{$fD0Lcc{M!NYrt#JtesfLQB zdZlr8iejVHOr89=xyIB?o%py3fSS;|)Lx}|#lhX@3NW&*Nex(IjIbSc>jF)FrCqqm zHalp)0^5o%_Cn-*(jj}~wc38UX6N&pe&j#f)!l>Q6Ef$F8h?edxhi}N_6^j1YJSW< zSUmTfw{6V?Qm!paj<`$UIW8h;*7Wya4Qdts|ORCWJtw2M8M1LcDF9Bv@}lVtr& z3H#z`U~Z%D;Lh-Wb~OL<`jz=#3O4!k*6A->UFI(gb`pD(KY2+>7+-yguu93TnSuqx zdbY{HxOQeSEsZS)e$mF#&0gg``!jchOeMumTzy*V+L~|X3fGxUg5ZXy-2Q;x$F@aKsKMh&5mIxzRD>26l(}%0#_OMCTew)I))}K>{!HpKrem2f6VVx`UPl8gV{*~=DYvb5 zQk#!&J5;*pib2o{M1#`mzk5d#?wIa!sw7pYVh@qX!LM02Zp1MDBv{=jO1$2vxcpX^ zZ!;znA5!_-nLFZ-Uv@YSJNf~DbI{^3Al+D8K}|^*T`72$HyG4hU5j#$__0MB3DOZ*(IDxTI`^ec}7v6@GA_4!X5Kgo?R zq^+fCYNPwPdjqHwJnvsN#V{FU-gWoT&ST3>{q6-i(uOi;-DIx-cZu$7?h*CV_PiP8 z<-;u6zjGTV!8O{;o~YMJSryssB_76fS@qh#SIgrT+NZytmN_MINlryLkfi#CHzJZJ zAE8t8i*L;IE*08W`8L_diMX_5Q@PGVfa`YhI|_t+lqe*C36 zGaWj^V?R}A+CNS&{#kYY8}k0i-*Z|}!W#Z)7B^kpY09QY`^-eBYB0APZ75#QN2Zaq zM)nVgfGs)?3pCa+EdjaUifnITXero5NGS00+Z`)Bd_4N*5yM5_LW_9f>d4ZXa$kl> 
z;C)Se^X8M7nR!XCo8aATvjyJMw+5I0#N$0W^()HFH7XyN+=Qy_Qb*fa!I8pAzi95d z2G<+oMuI|JkJV>QZNSx9;UXgI(3&0NK?$&*E4Ih3%kawBRxFg=^R~Du-z??KCE({%fm<-mqX&(LYte?J7(%_iH5yV;dMwN`Dc#LJH=r*Fjd}I` zFzs|84%}Ry7jCc31LTBduQzv)rdTy|4RmpPT?Hg1uY~6asTc8vk$!_aY6Zd+eiM=4 z3^5EUVvbP|A9jFCRqB-|8U0O_qBJQ_b_|RW5Z?p#C6M5OC!BoyLmquNlE~^GD9M>) zm{t^yM=YNB&4E0-C!W+CM^tVu9Pg_*3$TuSa(9L-TS&NhqN}I+ya`p(^3aJc(zzti z%y5Q=2GapABvCLe`Wvh3vEv?fh`ivLwr@q57N{|g_pw?+zEPRp=V0v=D}_8;ArmynkS`g4^M#Fgx$opBqRli$rar%lH?dZ-t~?p8y93I zO2MeQU$*PGlB>sVvLwnF0}38OlR1vN0ZqroaQ$l`W`c#WhU)ZFgYIzZk>*$A=z>zD z!wzHuY5@cpKCbjwIPXJBj{q6x!)g58C$W$tA8LEP)nV_tJ=mZn`yWYd5>)5skwrRm z4gPxy<*CrcZ$7~bw888ST7Wy8d?nDz;%em8fuAxb&aAPK0GeXlqOGI0LHqh!O|8;_ zCCTnd=dm+{W7jsB@(sWn!7_F8OVQRPC;5ZlEu?no89~Z>*lE(H$Z>3|%o7i>cj6V) z{zKAjsdnjWPV#%!Y0{R-vF1p5n3?vR0by&qYfqDb+6yL^t$}Ady0vZWjJ&8Q4?et8 zRB2&+473yLtbQ~qUqLZ>hDJdZIgVyQBYB2qK_@wV?EpHmz3{*}7fSXShRDV1<6Kbo zg``4B8DYyKQU^3gV2L71+I&`-N@tBdFYP*9wI&^otPz*7* z1;ixAqh`qnrCX#UE<}-2pp}!Kt$rpO!cbbsMc(?S&WvJ<_6wAlhvF<_j>dyz`%7>x zu?is2&0`(pO|iBFLd=gi&Y0mmA!JAu_V6#T_90NDi{uAVm&pItt1w67GGmWMft;5% zYza&+kw3{B{7bB|HSZ)%CP`q3%-Z^)1l@ndKuVqSO(HokAlRPsY6uK2ojZL|vLzZ; z#GR$lQ@Jyoz44us?YX4%W3ek)n^p&|otk0*N=2BARFp4!6MUQyvggfK#x7}vyq&{u zgqX4fRT-~EO2sqE-FrZZZkCPo%rT*s)_ImZw2NYic{lals&$g7td9S zRVL|3Y7{yF%&X>3#R0_}@k`X5$qoF+=S9&Hmn7B-9t`G6#}ISk5`-k#1-J5ZqFlL^FS<+`W{b{rbM?r}plaoi zKR;Js__u&|c|`q6UN}$Z7}r~Fue`%OKOMiVTgLHa*cGbIFy+%hx{B1YVw0-$Vv1Fo z)Qk~+KI7|x%yIVT*e=RWa$Qknil((*d^Lx*5pDV>8N^{KI#r;GesLd7FdK{|x(3WC z8atJh(yt=M3cG|pYM4?QD;1#jj#MxJ#@yY-rr$<62!9v5&>Kqo1qyt17%24uw!d16 zRq7p5cK^-9?1f471;o@s*P#3!Jz;gZMTpCfZIwf<#yb*z)%91AcWl)v;;*W=Fz;1a zH`N#HOUrJ(ZWkmzg{_Jn#Aji5Bu^oK&(j;2DpE&k*qXM(hqnE(BlCC8O(@OozgMPQ z<%=NJpE@<>A9X6(f9cf!EaCopy2|`-G)uJdi{d8=@SVPfT+$2_ZKeV+))Z323U0lq zK)Y{TF4+?zaP6I_rDtECd12%B=%aa;D=*Sd#j?7Ubsg)#?aJ}Hl@mcQq0@89wd*DO zGUsJ^tNRT>4+If=Axhi9!JwBByx;HCo_0FB!&MIL!kA-NdTOS4gN%mFukWy72m{UJH=3rBGHM|LSqsdGQdu>vt|mU2$A^Yrf9SyKM7gcz=766VMa{u^C11(w%`yzlL$24GQ6<7DKUB^o#76ZTB(4s&lVdnv{-mrp{Ih 
zezPk1uCemBfOLsOZtQD4nuuXL74lu#nWl$EXlivs4C<~!tem_yOpCB_a@j7cD%!zI z%ksi)^N`4@Y5WEFJ z1F>K-df~VpV1GGIskaUO;~e1Dc@O+y4L8h8Z0Mr{{+M!_W?4!;C-$jk0g8^dv=_e>eI zbMgTsS`2gkPRY2}y~KCmnB-%280pRHGGCda=$C8xEmTJy-jPzO-BbV`vKIAS zudZ&oOd%hkj9EmSid*L6Tl8P_T3ts^+|dD z;3*NoLLXugSC&fOgQ|SMDwQ*Sja{l>Nt1`OH)(Cy|Lh5Jzc^8O)r%Z-j^3A~e8e==SWcMWH`)q}+-_ z!FDpF{7UGOlAKsFd54XU8fvZRo&NCsgT#QhD}p^goJNdq zGQT&Rwsqut2jw~@RufCObH{u^?jy2QABIlhj4YNnsxgZ)@bDIGoK`+3tCJJ%p$m{S=-SnW;;BjzpJ?|kJ>LT;dt_KJJ}bn!iT`*iw# zY3a?%qkZ^&TGC)GIMQTdOVEj{b-sGq#a=~RW-&Kr1LfXbmwi1wc%DX^y_KGyiWKBT zgHd(NKDc4Awh$;J*76*w8hE@kRCC$6vC?~y`Lh@*a=taxm^y!W2@WL+imAkKX4(|2 zkN`|OB;xVv`A-P;FeP3FNWyUMm)7rYDaCY$(A1Oh6il<`5#mxmkSQbjb(fMb%?0qT zzd7C$Iyc;eF{JGy-~1h6{J697=kW7c@O^I5{@+_Ptp9pivUbJ}`cCGyHvfKD(JGRf zpA3dZQH4+qMEU7+qNpUt%_~?X8rft=d62f+w0zTpYk?e(!Ty1qmPd9b!z&h z7m>g~CwXVONmuImn&ZRxr4516IY~Rqz z&$SX1T`iodn)mjQa^=!(Hd1+j^cm3kCc#8QlZly?82hu(ra2JgMFF6tu*~LM!i1$) z8IhOfrY6tzE2k;<DjmCX~uv|O)mwK0;xJ&%-T3{VMq)nTo6kLZ@|p7v3RcAYh+ z7YP8jg&g?$<9APHtMaHuq`*p@By`N!R4#&1H2Rtw68%%V?Xs!UXrx;Dhe??`VW|=j z#+HC0O644VG}p}DhR8XK-W2=XA;_Xp5eT&weBLXBL;&F-oC4<8Cr7@C8g-mC#l+(X zezoa61@VsJ%p4z9g&=cP2pyG*h|lZ4OF&fKqfQ^oUZ3>4p4aPY+7_$>1E_08Nzm(x zar=wz(2SNKp1r>5cRe=H(egw-KnyJgd5Q!Jp2UtN7F9)z+#An#Y={KwWBh0H=q4qD z^nyij71dXth;1zm!RPNNmJxUsSOP7XtDBo_4tM+vu4sLvEPbsU$Zn9DESaf+NI3- z6;at^D9VS3Mln*ul+oOToL7^PxGrn5=+Gf^o=?eQBKwf~rI$x;U&UiY6e=Rf1Vi1l zm0qZZL0NNmK3j=uMr};1+un%%Xt`Bja#69qKK1iweR5{LK}x*igwd7N|271r?^{sU z&w%IzfTJC8lfQobc=2Jf*hvuJr!6?#iwghQ|7eHn)1^!r_RYrd?ah2)FRj!-voqk?f0~Udt|`90k^5h`zowX$e2HuXD-a> z51k9@$?_D-r$9AvWU88`6n}1o*zoSJs(G4~-d1%`PIxam6=;s;!y0I@Ah-xus8BD2 z6>JtSvT>LMN0oFMUpbzn)RCge!+fz;Ev|-v;jENJz(A)`ZZSI7V2E@Eimy9F{ds2F z$BF7_G7*L%4$e{1|0tX@I;3(ysj)_c%1(s_QypXCfj(DII8Pjpwr#3Hi zc%}6EmyTv0lh%j@)5GbWiF8vWwJ~$Y2TIC3@1%EbU#H{nyTzO*1|Wq7Aou(4wEo1MDfD=;>eE@Oq#tp;3Wu( z(Tv9N}E`mFU1nt%nwP>I*Veq#;BbOJGntjW9A= z857#}u>XWya#-+9?r+le0BzMjohlHv=wy_!H`8rzctNGEZ8@!Z&U3=~ukvP>Y9#(x;hD6Vtq>S?WYDic z<~nx}8_bw!W?1HnPW#S+>B7vDzRbi&{^u0IrBX{H6zniX`c<^P`6J3I6KF 
zJ{&EUse(~Tr0q(037H%mC*0}PuiS3|=N``*tkyDv4(lq7@=*nJrJlbzx+;%Lr>p33 z*tO*haS>jq_7v%W{^8(&*2*cFU~dRN9`AqJ#t$xhO4Q}t814pW(d{dsfOeePW~AZJtxTPDR{K@AT$_OhD0S%2e2&QJ(+)6 zta7KU8TqEl(9on>AcFnD^qTwF(N1lxwZ#Jfg{|@}GCL@5_owb~Ed2*5`LJklN_bz9 z>A@wA#^|z)bc=RTNm4&u&%lt&an1o*3bp0*)JJ4&-gG5oJMANn-QDe>~8C> ztwe66-%W;~TiSEez-Z#J}HlH)(mm6tPH zwbfAWsZ2Vs3@F4Rl3_1-^3CltZVwv_-Juu;byV#VdHosQ3etXoCg5Jed?J5)dA3v# zf^Aq&Y%S*V@|UA(s^Qx?h2e9nZcXQ1j(!B&d~G%1_qjM7xhSTOZ^6fShx&`ou20&d z4g0hN62Sl4oJjtUIq|vV5oO---`AYQVeB_{#uygEC;I0li04A90J zXe#qRPv=UVe^~H^{cs;MruO*EhE^1ZhfFyv5QN02E0fk0`Vs~k+{*Uy!gmD&I~J0O zys_N_O8^BZi#K9ohur5__5+b$jfd>pL%u%+f$ZTq(M8Zsv!jN*sTrhFsPYjpv-VzLQ!KEeK{VrNUXp3?NH5vd<8~k5rxI zdh7bKdiMisU*4=ZRlv==l6)Q4j9OdLJDIN+1c!0-H-@5kK=rNFtf^^NUnOnE@=|mT zV|KkGaS+csJes_8ASRm4w2pa~VNju#r^ma<&_LibH8oN4BK`f4-KY4s#%^r$um2An zn9Zy7=M4YtUv<;}vsm))Lt=#TXZiAXwD-n(cpI>IDS=R8NnhRQ73z&pwd~-;#G2q8 zBTYkiRQAv6rn8-*{eBw#D_NS&=B z8y81NPiH5!U=bsgZC)PRV}ptl*puryhK8qG=Mraj0}$wUwjOuPP{}A;@G-4Oph=hF zT09d2u!JDkL8&^X^>kUEBfWiBIM5tkF0))n)I^7nIWHxc3(vXsDBQ&wC@gU@)Ok~$ z#Ua)b3=EYfZ88x*gcB0w?@bO;R8Y2cB%>grCh^sW4#0|?2J2diU=*GdADP#wFqOiB zH8!F2RoJN?apnG5!LhqPRS*p9WxBX1LU z7GV{dz+L0(L@{4?G~xNKE3T0%de7Pd{;4!vF~@oV)P>N~O~GHlWQty5O=5X*Va__# zkX{YxO5|q+Q&$`%EON4))aJ0K<)TCJ@X{J&teoV>U_UeP)dm`}O5HF)0^uKXg$(t- zJQpG7Mu{OVlikkXry-d0~GBeB1z$I&0spdAQiHSOLPOs z&GE@_3k>iay4NAJ`Fl;YiL$;_A=P||^6q!`ct>(7vBTmUfCma0cT_vkpDNSjz~c%b zG->|?G}VY>yREl#8@C0$PzaL|qwMvCbdzz5-Ckmi@_F4?k7uGH+tu~g_^VtualoGI zgk&IEReo^3{Wiw!gKG~r{rJ!M-AKMheEIWzM1A7e|Geu8`~$=OzvkThKg`VuDq4TV zR=gv~uo&YJ(a964i=-IRPcc#IiVGZ)X~I-E3wHb&m2jl(m7M5fVX3?=dW7@30@2Xa z;A_53An13@lEBGHG42maNw1sa+uYbp>UiM!d3B)oW#lok$N5d1O=R8*4yYtBWD-`U z+PsP0<2bczGr@RLmgpT}P>F$?Zb_P!b1>@oi>PB;f}y?No=p!9Cv(?wQ{!gU2HA6t zP130nIVg2X||}AW8Oks{#@EC z)Mia+WB^$LkZvD{_8$AWE1NdTYu0v^Tn%WSVEk;D)J*P|K&6N_;cz15ZClsLL76%WspR6$7NF`nFaq+V1D*h>xZhtr1{AeNR@wu-tm)y6?HE_wzHVc>(LXV)w8jXSH648vMSQcYJ5R z?&C~{+Ezipgnr^fKS={?rxTpjZ81C|(0+vxJu*v{b7*n)-QH|=_brnn*FOL*anknY 
z+;FZ>Iv3X1Ay!uuJ!UJv!Y_dEr3RY9Pt$sZ%N@3(cy4CT!`dRqV_JxPW0LAq`}vtr z17RERnNU+TtJH4>miq0m79oZ{F>@y0O9DXofr zPAyi|zii0ZNppr|lv1HB`}&#OBvVo7$@39x4F%&iOi03u={${F@HhfyfV`XKa*Ld6b zQg#i;X6Zp;8~rVUgQZ-ivm>*-GX-TVgK=dctDo{bOYh@N+PQAG&Y zkWHy#@uAAo_2F-_d4@|+9=|BeXm_(pk5y+CZ=_Ichs25S2PrpV93a#3F2klCG=JlMxmbJC#vTtIajiqeJkUd0M1^ z2Zeup5|?b5&=nwUPlOd+vMrDeSgfWBh+F0h@+dMc-_b?+0cjC?o^no*`sWu(g1n90=$%-3gJoPsRi8lX^Kc>sFC8m8TvPwd&?_wdkvMp6Aez* zW*VutsGhEy>)*kTs!bk=ZtjmR8lwH~;~5Yp-0#{W(Y ztiqajiSPcf*^iBVB@d~(q7&+?cOPxW!cic`yb&7CzfBPFOnRU#?Wkzou)TSRhfT;2 zGy>-QJdQmsXiCPe)U`H1k)8c3D=G(>@}m`{?Ilmyoz!1~8Ch={++Vga;f|P>DLMFq@!`TBb(L1I76jg#b z#S>LSVHu~1*U%X+v6C+$+`WM=EYriD5@lBNdgw>V=Xnbt;p&LtPAL!-1$wG5l%*4M zTra>sM@*Tcv8BuF*r22wz5ZuZ4!%y&?D+(P?tcXXl7E1~|2YQyV~?p&wNd({seI@| zXsrchvQvxYYbtTB1)&*iENBz6i^+U#h}l)O7>09CNZXw29DrFfdVd!0rSM0>3070y zDF>atwMh-pNT+42J2_o&PC9MelYaO-p78ntbeV=AK7DrcE)}KB(sOi`YRwuo2>0{P z)*WMI=?3c4&f|($gdL{JWtPh`fDTIK&j@+IJ6oP0;lWL6r7fFfTdBrw z8|ZzyVaLodAju>3v-}p`aCNJY+#0V}f1O!HSL@ptP7_vb$5Tf@p{;A7cYr}_P(`l& z`c%Rb8H@nzeIvB%FQOliwHbtbJwLFGpcU$!i)k4Hxpaap8#H@SYS00*iZp3nn-OOk z{9Fn_w;n=G-&pWpi=g>dzdd#UD{mcThQ*tQqPYToj(Z842-}BHpRDY;>(H!{qg2W} zR1Em2WVoR+5^y`I3rKI(eIa*9Xr16~t2!<%Ng&fU!upYSr82t<1%RvZvpDtti7)vj zYX?$go{>fpNVEnb6d`B&tB_K$U6)*Iq|M>kmP>>!Nmg$h)kQj__0eREczrn>V&1W1 z{3FQpl|OaTNra<=&`9AnM#t}I;kD7-3mBgMO#{l_ zE%MN!PK0?9W1%j(Tr8Y7Eq`0`bN)l7Km!kiF!&D>MKJ$yA#to4853~4UUYvfVk&}95t3ZH(BlJp z-~s#Z{=&onLe-t*>>CBIP&kAIIFDxu(H*hiCH12q3S&rv(H)_)BVx}mr1*`4x3@9jNUPRV?+8v;!nNO)!P2Z-Ew*DstYJ;7w%L~{ z-~3YDC>!xZ85L+(KKf zeRYM_p!jHNBd&z*<6cL;0DOO45L%-UR>)}SGiCM(b^%t*@I&)RyhYSBC5yj>yxKv} z3?l;Uka+~BCuVkx1{@47)EM`={LFET!6_BtzOj5W&}T-qJ^%h4o|5hyuGuhQT{aAjNDw6)wjM*1 zroy$Rp`zhr_Q2w$%=@&8Z|1XGF)l?~(9djwchqgej`QLnBhHQAbI0Zj%at0)WT>zr zq7_^&lT3G$!g{XjAq)w!Y@h8Bpo~eSyXN;ZmlTSG&@4oy)v&H?bguQEIM(jYMsS$ZXm?Mu}snvIe%&j!lA8Y|T=wFsm#X zypBOdZ!q~bEDvboOs$G8$JSjoQXY8d4QqTkm6)# z;;^eEd9lCco@tsGt*KWwq)tQ;Scvo(Yi9S?yMjy~iZB&6D+JaaWMQtkMZKmFQBvnE=0T+4zcD{j3jG?z?9 
z6oV8*1K?4<_ZUS-^WfUJJp~McN7_y*w~VuAc7DZpLngY@cs~CiWXTue>l$!%Sc1f$ zBrZgqoyT97)i7qx2Z{tgLad(N$3-TB5x&rrrAeVxONo80$VwdEQp4ZmYvE=FsY4PE zkm*MnR%?aF+*GlvCWgJmOmrAYpyNE-E;G16ih9Vr;+qyN=>!~N_Jt{Y>v$ULN^N9l^DD+tZPF!+R8&@! z5Pe!?yE1aETyWNu*`wd&AU}PEVFu?WWFgL$k5ywes4eEhFM+MbR%L}O`rqKIssr%s zWL^8}HF$i%xp;z&SR~@xa(%x*oS5#-lHfxy#4c_7`Bi|T2F0-`mYp24L#!sN38{_K z5X3|mV-Y}b|G*lzN@N>KBX3h>0PBlD)K9{F*g0u$ zRkoO*E>oXiuxU;0Ve6F_b@u-D^>!*Z2rIk61@>#q;7RN_Is<4jY1Au>X>z${ox6m@ zz?_sJ&oAEE+tY;HpV1L&>_*0QO6;=BV@cybmaDs?zK%#iFd#3E1+A!uyEHi4=;TQZ zym80}Cpkpp?XHia->$JWP3N~XY&=%La+)6tQqiD}qLosGnw&4!Zn5V2y7+*N8XIGm=K4QeDIzYUF4{sCX+S8QR8yU2&C z+KS&jlxEOv8gkdve}=}bgq)^%xUFeENcY$;RIp!&_(u8zc42|WU1`Irj(dOpPjr8-LtwX=peb+kK0?uUdn&SYwi}H& z*}V9%rYwRwA?*8*GeM(7Pu7z({^)@w$a`Sd_wc*nBK+AQuxor+EljbnX5H?Kz-uW= zyzg77eBpeU)cjz2#fa}p+El(;`^0;G(BzUDR!|>$h{?PmV{@^~nFAeQ>=U3n)TZg8 z_Sc#F*wmqzeM4Je+|NA>IdH+kAdurVXnn#9;Ja^3aj103&z9M z%jXD5B%wE3=S@CJTMuJMA482&N(ab0T$@iL%-Ho4!tnd2J?iOLnSOl*X z*B@Z%@tAutVMy%^E?mp%r2?yD#a6MYFZ!>&($@~Tpd8ZG{Y+8K$W#j)FlMB&!R3nE zw|NTh49ABk(ChOdYk2xvN3|dH={hY3fQy~KG%u;-6ntzNe}GYH8lCjET6%U*Fsj`W zdto*7=Y;xpAf~<{4HUF>E|AEdQiV^3uSN44fi?C+)G*gc%K9L%ne zEM9t8>gXJ71B>*X38LV{JB2!B*NoYb7FtL;#?!Nnxki(eJF`Hb++T7wEEY0zoT4|K zEfWs}8|^ z9wE0pl8PPv_~KjKhdmBHa(`A2%{)UEXD35mkPkk>7Jnr*%?hU@lN}<@PBWP9l&c4R z6NZBnPJbA+Z6~|GQ9K&_am7a;+_i;&WjFE`B#t_cCQquuGG}^GEn;&YIfX#3AOK>M zLd=pHU^k7y?Ha68e}Ihg7HD1=K~;xzJ#ACqYr{Hg7QaTtslE|)u|dr(lAk*kD22u5 zySOkUexHu(+oQ`cF;rNF_$Fm)udnC=T0#v)t^p-oOTF+|VF|^p)1s)u8LwJ?l{Y_e z?K`XLjImkQt6H#Bs@+K4R54;yMtw^o62!#)=a!~!_?)_FGebh4&4z#+0?#LDv%{@m z(?m~i@0!LsVJ>0RF)2B9-6Vu+wXEIef(clKywElMO4l_|#dP83_UuNwOHsC=K_{ze z`F4(`dqsn=tP&botA4jdtd@c@gX9^w_qa|SwYlS2v-gHgvJ}7tTSA4n9 zpZ0HA?vk%?T756~sN@H5jAGh_WOlJfZXswJMGdlsw*64r zDwoZSmn`?%ed}ZG>4F!TrW738p4Qn{fc!z^L2V)VoH!zoW$g69nR7Ij!tz&*#Ki0K+YW^lR?&}VVBBJ^CjeVxnA zuGwUL4%u zL9?=(vGk3EibSYK89XkTyvt4aS3mc3GLJ}zxVSE504{Nh z)=^cb@jrAf)pPBY5Hzf6pe<-0JX~jdxLOvS^+qC~PXv&l= zwCX8CaT-XoWF6~%>gs^GXxSkVksE)bjYhUZ@W8`SQjcNWz2Hr`y`?EZROAkxtpft; 
z1ai6kj_45xkH}f7uPy%j&X|_D7YWb(7(sN%5#nncuo0R@t80E=st(_N^Te%qC$7cm zz^TT~ya)fY6GCUhKwKQzf`fZpVWyBax*>G&aldTO*eG8>bsUzUhGnrPvgU@(qBs%$ zrSA`W@|p-c2lq45{J2P>yN$U=mD)HW!C}XTvqC<6RY47B^0EkUE}$=sjk!o)L?W>b zcsS?7X`$^v)h*8{=L4Z)tf~MUj~&7{QFy0v;%GuA-;IiICx`Q0*bQ%QeYj)lL}MWU zfxXx!XumDMQ^I;R&FkY;o-u1fO3OOHo&$JvbZWjN^!}L zQer?!mo=*LxmSY|YPmrUm4j*qm4l{tc)jADTj(Vmd#EXhzwSxLfFgxqM6!h0E3#I6 z!)97Hs0~vL;wHw@`p2+*zi}J{6FXeNNyXNI2Y2|ar%T`37>8r&m|>FblG8F|mJLV@ z1EhNFVx14SIlx~v!UUQdzJ#1UcXVrb`oLbdGcoTOc1yDd>`pxk1xN?s8MJY$u1Dwg z-m_E1N4=M9a2u=_SyU%E%C*wtYBj2lM<#V93Ccau;cBv#Ss2K=UXBX-7s+`VkV$7Vmu*^0-#j<;!Le@oa?C-+Bb zy`R6GQf|BDQoT{z4P(f9=6*I8eC6cn7n(8T-y8Ee2iBq-Xk$kDNBTO;e?-dvlO^%r zA`zOdwttf){yRw42eBn^1Sd;_aY{EqnRJK?$S6(J(l6<9ElSeTa8obsjP#+l+_@AU z{azn&$ggCvGdx0X#RPqd@ai(|H>BA;IXU-+Ge@h}&cdH8ZjwZqc@f*l?mRVF5PXHw zH#WDPkk1CQpXs_MaLp!Ln}2JD-%8BWZzOd=YgIySs`NX!#uU;YJD^J%&!+)3i?y(> zpqS|wUWkHbYw6PYSI+P9Dqiq7!e=C0Cq<}zJsw1?agc^<{Kd8WHxhB1)DH8?2S@_c zPqPukOM>uHH7U~P%J<2D7$JT3eh6;!rdd?!@&2m`M`iZMgl(e%?8%z%N<-L8Lnp=y z<~THb6x}0+`iDz$HQB4@%b&y7C2k#}*DbuyH7MnJ?I#Hl({a^~k+p^}!z(>IsAd~- zFnU`$jDFzZjA#;~w@Za>bFTJ|ScG}g1`!pVnC;3*XI&yi2 zS+I~Bp!_t|ff<^)%@u6}es^6I4F3b8RSR4?i>F3l-~u4y_<~fu=nm&B+P@&MtCB{z zK&92!@M!%R@8gbvbOUIsY{r@6Z0%A}LIjEN?R4JO8C*Fr#`@2H9a^?KlSX?%J#*Oq$RHy5 zU%rk1s|)e3gi&0osx!VMsI&9v>gkDqzxRcU(Kgzqzktft9vlu%PG4e^N&=&FDfA19 znDv{CK~9&8P+Jv>qSBo>@h<^^8?YCZ$aRm+oUq7OVD3id$NL|@F26r-`NALIUpU*t zRqQB-n$aDfUs1&_ZC@A!WRpPw+i(65r8Q*K3FIWxE@DUOQ$PmkW@pE{&D_<9>1)ya z#9%vq+18ltb$j|_o@9t`Lwj!&OVgI}paYpvkAXqO^3R203ZWbsZKiai6uZ(k{5~rt zV=Il-noI!HgjG#{4Q};rQ_`nt<^h7)6^E%h_$ZFoeFo?VyAMNgSVs5R zX+4xR5-WS5jzhQ!G#|?5#|b%}kXEB|X-ru&hmG>6@%v!N69ClS(#%(oOSN{94S*TVWJ15{fcUv${EXR z--kT17-U?0@10`g$egVgi;RvLMQ=;S{fF(h=0-hHlsl~0nHi&b$E+FXr>b`K`&v8m z>0;!lVF@h%gZZ)O&3n7kn!5O1bPTVS!#B9}(xw6S{Gwt-B#heS(x!2CFI53$KV1%$ z6<`l1m(Nb|Z2Rpt3p-qwN(xHU5J=V| ze-Yh7rvf|_7kyR)-cT#@x#T9uI(EN^u#}Tqcuhr}kfE*--Zm9705)b|;xSq~7^S(} zDUth5(&WFpO);7?_}8{ulO2q=AJ_f$pUBh}-TPPAzlFcYyJw+}!{pH)^la>&gcSvQ 
z&g?nkvAg8%;NGLf5NjOeM5`g0tHf(4K+RPCs65(OmU8-~m9UYQ%Ku)b5QlMIPX5C% zjJb4euCTmjvr4|9lGrMx$D6s$fV+-`Gk>Hq2y91hyc*$YcEF7z6$!%n{wEHhO|#iU zHyP{qq$^(|OPHM|f2~gZBh%npQNzycBis3#kRY(XVi8CXhcoIg{?I>u*8jk$|J%XU z{+oL{7oPU1lx!h760aGe9;;AD1Va_0R*Mn%hH5v=LwQ*eg%s59{0CbSt|)&0FAb#a z&7X2=Lbldfg3m2}PZ!%D>DTn@PYd2NDC6k_)djQolDUQElH4AXGz4`R%|)XqdR$Yo zl+!5FS}=AidUm}QwRTS_I@v0tkUW(+7$ydpC?wOOYpgA<9qpq4uN9B1EU&4rjww?d zzY;tRa}Q9<5EtvMakeSX_v&E2s`p+BT^o&l@DUlA+2qpqkYsi?hF9*mYo&YGtLh&O zKKi`SMq8yRQ0Au1F%}cL;;>#MqDqAp_ptq(o0u2EO%eg%k_JCb!oazUNX3XiS3=db zaO+i7WYSk|-r4t<$ny-(B@q}U%?FdC5`>FkTxv^&k?v{M%tXRhrx57387vESvJE#< zs=3NGA>`>X_RZ-})_dIXFVRTOG)y9$9+V!^Eg#ZGgtCSg=sSfMk5fgwtFx)`%*XE+ zK&6EB8D#n;i)@u$r4@W)CR9f)NsSGa4QDk*=cKqjqM#7O6oo3Uls8UeFr@D`JCS|; z8S5+OgNEZA4d-z^=gPC2cxhmM#;ax}{o`tn; zlTMB7j!kthofXD7^jaFUBidB_4tcMp^h)I?tLzKX@j5foSo(IU++lIHSU>WW2rr(} zqp(S!RA~3+1Lt5#s6K*KGSrJPdl$z-uIHvy9;r#jJNUJNtcGX#cMXUnL1rQ`pMjFJ zOnSRixIRe3Jf)lrXbRy(-jIzk3L1EIDm?WGsk8sr31w3T0{2#3eiWT@Lgj=KD zpM+U%F*BjheqhZw0CQHKxmTYHcr>4L`w?ck&S4Gu8FDk+W(`X^ALiAS(<$pusxNmzXpsly(Gka_eGzSz{t z_zaLKEi$9Pd$#tIG+0u|2hp6cR51?G9_uoX_=zmXzhr9-C5FFPfxcLee`NfT|A!;~ zU&}dv`(kZdFLX`Nd;>i`cQg{mLdhq~eh0J=vKGc=HN99Uy=;|=H;T+M_2f*JOcBu$ z$mak_zz$$9#FavYckD3v_@!8&*UKu4Dw*IULdohXApKbA*=NV^?K-IM1B@g*2Ej~? 
z86VeDW11d;-Zi=m$IeQdmi*W@kQ z-px3sYhMWPZXpJ7eSUwPfqGjlAyVo4#ZV+wFt<C*B(z~PwQpjOigDp4ly z*LQxt)eGj`PdOpELS==E-iipEnF>It#%4i04`*{aDa)I`2S-Xw8lTtVp4VtnD}M>V z8EVIl00_fwB@qneQPO<311u#~wy4UMNSN)yh(C0C(8)HI*)LiuHgYnJKoD52ZK>pc zzw<0^#2-(IH23~uTB_)r^OUmM^(8`Pb5SB9CzJFmcznCmWEN#@#^E0rP=s$PAQeN+ zc_a6bLHWHLit-K-@L7iP@@O1if-tZvvN|8VJD9^BTS$R?jF&vrOjjS}Q&f$v?tJ_# z5YYSV!zb+!x*wjTStgsAwJc#b#-NC`QMk_|S+3;|8h|XxRO?}&_h^@+!p@!o$~3Y_ z-qE5*mQjH#q3Qd_s=Q#6E23zGy({lENAe1Y_LxS|)xXkX52zzFP{#D1eouqS!va9f zvN%+Vjh}@TDoD|$CQSgM;LJP=sfDD&bQ5A)C$^$xHh<)=JPR1z02z#G?)Dq8ZrD6z zDC;mVrWe#qNNXq+)sH#f>5T4ynB7VWWs`P7lU=5pjw=rb-ssM^SwYzr<#q=33{KTu zd00tQoFCyCGqBdvt#V~K<|hq|HBQd-iW3KiYO?(itw&pnX(+RM-I3pJScmlFE2Kmz zGqK&(D|xSkDYrC*qoUV?EsTwaiX0Rwd!vv)Cy-6MP&aW+f3(e4E zf*U_DPX?iUof@`W&{x1#@Rf&Z+8!-V2dE~Ui-RJa$n|Z)%g>8Zk-uTUY1jWwFH7ul zjIlf5l34?j0(b(gsAn(^8%-m~gUG#!#16oRnSfhOrC*Cz?fU} zS=pwERZ)}eC=)^t6#p#@V|WhL84!(S;*RXz&oxy7w_67gr(yZZP zCJw1=G=vL3)O7DX9^%tIFue_MmheJM+onr1>Yn8Oy8X1!vfo3O_Nhr! zIJq-I1~+4Wp#M7PC4P7$AAxckX#beo!2b`o@jr4K{|?>s-8M8(#%8}e9skg(8FpRd zumu<%FJQT-g>M|FEr?_ybfk0!iv<@^*j15CR@OAbB8U&akbd(I4MZaHJ~0=@7&-&$ z*vjomE?ljDI5Q{B`1}GD{8>k>)+kMUn|g>Hc-stT!L+VITl*ZLH}>}O7Jb^9Ed97x z0DXsBPSA|}5L_G{9;HpEubwVzG67BfRI@zCT4;{@1&5p~KR1*Xeb~B05{pi+UOd~0 zRoEI(+XWp!+L?xLm0LHt^$h?oUQB0*?GKC^3UI-zGhs}pF4Qr~XtVqJmi0jMwB`m5 zrg@Asfv&|KzK9 zsF<*B+%^$saGk>zxVYTn*Q7n~*1koZolTEcvAA4lu1$1?7Qt^hP8+in*k2H>nxg6i zr!(ZpxCRsA8e<;j0k|2%7ElV5@R?$*X-)eciA^4j@Vs3zt%*(0KbTmPSv!#%3lEH} zv>KttF?GYb-`FHpHIdeS@6(vU;g$2Ph=McPE1FqmR(=+zvXlJW9xn#Xn-_uvnXq38 z!@*N!8$p$(@L6S-#OyS+FhfVk`b()eWQqj`cmyNE!O>7E>PqQ+rN^q*P$y(@2`;+# zyqHgjK-L`=e_EBxnPIkfhn335JKQ-DA)z^;GGqE-7N3xXTxk?-aB`%bvSu&cO;(s4 z-c)Lvz2Os5g&W`u628j7Ho+P%rN5z#A4_Rk#f=-HH==^!&F_uECm&qs)1&S#XgYk7 zlyFW_!Z(OO1>2Qp@#y&!-70US8UJmK$L#mk5)cHTf`Z8bPFz#bH@&VmOOrssWk{X@?$ zV`Ne~LUhYL+X7WL0otjeKI!=T1jbX+4T!f(SQ3G+JJ8 zng=jVJsOJxvIj30aC}(rFa4!9k^NM`(5KQRggHMBe@H}exh`n4)q}l82^}F>Z{E)d z%SG}hxiC{~s*=T(mRPD8OvsS4j$bBM;hwTR%<_>fqb*>PEf>pHx#ZC0)Qc3U$F-}^ 
zd%q7R55?iEvUwstpi4ax|Fw-Ml3{550i8bW{_*trpDEIRpFaO9IZDCG_W$$|{awmR z)m(SQm&6dfltdp-)@e28XO&$W6-~>DY+%GEGbM>9#5kZ#yg+GXNus{Pv6aG|9(G;x zqA|K76$sVfDuDAYj%MBs{w+@9q%7aq!UI`)mUp#Le(n0YU7i2&>&fpEU9SmD3FGR_ zDZG@Z^;-U_;ue=H^(1d?0jsPp99ycVcNNC8i?9?H~A1 zXe>qD`Eo|0T5s;f+f{i?04olL;Px0oq(2Y2It9Sipwyn5u!M~OO|J|L+anp-iHvq5 zCLT=uB;}F4vr!WdXsFg%cW!X)GLK~c+QD(saJIp3M_tY(7 zfD%a?#$*X$mP#&jC>iW6OJW9x8>mZSv=?cMTL(LQ4$fLz!M2S8V~46Hdk`V5Xg(NO z<(xpthRc~Zt}vHT!YWBFWzHvbuGu5(c%V29a3pJx2Q+tUPmcSxmi}UZslm(~V9{u5 zJm(Qg`#+7H1z1!~`|y!aKw3JayGt5Gx*MdsW08;s1p(>q5JW&q5Re8%k?t<(6p%(G z^gF!Dvn=8NJ6`)-=ULs~+%@;anKKg`)sk|<=H>n3wQUei*a}>|1<&CdZ$o9&LQo#? z+)by)W`#n1e9$IDwC-`tme6F(BP>(Bqb41Ff+FQ^y7!XJGblQ7D;AdO7#m69ICSkw)w(xC_Nv1 zh|i7M(p1c*Vhcs_4DRVeUtjjZ&R}-dJ2Z~Izimw1Q&~AGi*E@Ep3Y z|Kr|G>`zg|tnN6O+T3otH|UP?>1tP*se-9j>mpnvc<1SSGF_|M3DVoO6LU5V76?*b z3Xw-oHaogJGL!JEpb{J*(JKtnz;9Yjb}bV2%%k#tC+t~G)gA2`_zFE@b=n^Gy-M*h z%<>jB$FV=+uPLm>F!xw}vOwqHo#0&IT(!FdLlzPJ;kyHJn~KN5*o9BaF}ifiD=c*U zbxlt&b(g3-XIs6hP+yK~_y`ayd3CW@4C(b8{2p>70;%n9$i6`12=xH|x^fomiD>skEuhj4LUN@ssJQ#oB0IU$NEkkX`H z^g1k!L+ZAi8|~)^MRIC?$M+p1rMQEt4*R#2F4X*m`86A(U!7-FS!@TiO&?09f8~Tv z8wZv|I(`*3&o>$O>;Mf{X%AWZ&Mb+Hc3)y-&sAGD9!&#Hm?Ho-2MaiXtFT+;*Gb9ZI4KuDf+O1! 
z+@TGL+T%}^$c^_^kYzxXlscFgLF;M(yRj!y#dW}*VN7kVaJOTNT?{Jl*m8q7WaFCB zn0R1o3TktDbI27%LGKC!ku;*Sg}Z+amITe}wAxHKe{Gmuq(GZaVw!%#nQ=t*rj?>) z0%>>l&Duoz*ut20H+u1=Vm^j)Y)7~HTm16R7P)o4NbyFJ^F;YWQTai#;%F(W`(h!r zHHt+=(j*|uGEa}(a(RAp8FEi}A3S>21^X~V8~r;w5v@9nM^%DBVxA;Bc#O5MeV@v6 z5#mo|N@lC%bnkmUAHqeflB&7oZ&(#ixDFh^B@l5W%nI{JsF^W|rWm1}2X2kDGK|Gj zehAYOqnR!ms;GqFMs|#|(L>xq-At{aogpix_xL(R4?o+C)-2bGK7O#jCvI;TT>nzN z2^V9dVn;Win(6D?4bS}78KJ?U9vQy+@H6;h>j9mFO^!ZFa=J3=|hD@n&k+`UtCeE7t zU0lB~?B=@2+d7dxKVB`%()b)p?em193S}2-nqAF@1{l@YzgSnCC!T9dnJi45@-ppD;Ht_~6J|+G%Y~z8%{K{{f+?!c`j{HHlA_AguQO(fv(eD%X z`Xo0XayPjMM_?Ize)nf0J@`;xV;$%iDM2i$L@`C<1BO)pUFRp#MNQ2Noo$@XwmtcK z=_DSzPMWN= zpYM7GG+k-Xb9i{V=Zq{)4`E?7MbU*}g~d$npsU{3#B6k&c3|sBqtvyy-S*w|Tm9yj zvg6-8->1GWb=5|Ay0(0Q>V4fQs>ny~*Y|CvUr`^;&`7~v&Y*<3f(buJtcKjH>}Cy& z*Ntay)5I;cy<6voh-4<=If-C&Gk@K8vN$jjG4ZFJOOH@2&8TG_Bo1a2#hp*{#(mKD zQW7<kV1`e)X%;@ z{iEHZ6y25$xCvJ_4TB7y9Njwd>rudtW6@rS0OdNea+0m5}DK6g~(@Y!jHDWI!O7QQY~g&q&~#&OQwAW zvs~jI10ox&WCtv7RiRMYmgN{Hk_&c!2)UoG*J|ex+ba!Gky%BC@~5h=BqI@9`Wqxk zzIaplQXscMlN}I6{hH!%12O%4^^3x;NA$f06L2hEc=x-gmFKAn*l69%S_IF-n@;~$ zZ|rW+9vsYxCBE50ep8~8%KcY|sDVGGg`@F_u}3=W|A=iK$Y z_@(kHX|?%AQ*ZILmKCe$KQ7zo`dJUdX2HaMA)<~P8}oYlYig`uSlsVvdN{nC=XD9{dJ4yjceidn_|p8E1?yeR$fuo8%W}veY7b#$TE!13 zjDt57X7_S>?&qwtVWB*ML8mfLv63Ysf`2zL60FJumRMtiSvXur2-qaves?8*EY$i{WDcf8uc!F{{o)yL{B(`4;E zHwJu^E=7=~zvvuaV|)IQ{$uB9R6%AM^V#ofJJP?*oxhtP|9A9*-4q$Z<`^Z ziM<}4!)2Wl$GZqIs(wlw_+l|i90Xz^O6B-sDN3EJ!PVl17M5(~9C%vEk|84R;rSTQ zkb~%%cXv7no5F@kzTp!!@p<>!;S!benMY0`2%-D!*>xY?q~FvCUphJH+2zb~`?bBW zUvK}Sb?vCA&Mwt_(fPCH{x9_max+!rk;R;?;QahdO$>#oxLor*i ztp&L}#acym%7V^rlxVDpmcu9%f`#JT$l5}dXWXK2 ze3t4gDt6}l6?F6@(}f#$^IBn~CiR~sh)wFxR%xU9%mjTA{1(mqekUljjN(o4M{BY- zKJ!&tJuov3UzpxRa|`YVJM4@dTx1eDC26Pa2Dp z)|W$vY)JCIM=j%eUE7$1jj3#(a|c*_A`7{Vdo3}WsKh5u)A0&AL`NCqj{Clpie{!$d+ys z%-yOaT78eG&>w%`eJhAL_6I989R6W1^Kmk*H$@!ja~AI~6q2@2V$mFZ+oFEfO+6Ln z%&2>T!?3p;j_ReGrz`S@N^YdUCYwF`YBLAFcv!3*L&7LYrR$?hDKTzhq?yislYA)G 
zkf0ljWGTo+w11!E_0y+VeUd!}E;i|pCNe(h@QxP$Hu9c2E{6%@J1M2RQ$x{{-p!l- z7-WLa^ueDj`2~8PtVFC^AuHua$1uL=*t?D{0(W1bd(i;XvZxgJOdl!9d}Y|PIJqc+ zef9`afWLn%C~oOgu@Kh%X0&sas*HDACQP~PD@>fQQ4#4`8?Mv@ro>*vqf9|1~k%06fCMz)o zOYE)^@+Vw$yCRW95+G>`uwrUY;N5&r)4?%z0CrEEG<5S4|Ix#pxkep_ZIpSW)oA|HBl7*C(x$O(9Ngf;X8O@LDW3ko*4{prM=}QgbSKhM% zz&)RtZ!)F*(+XGyJJ%t{AuHB->S8b%ooK0tif}eACyZ53Z z?!iq{mfM>uhLlqZoQaIF{2xH^61R?wGwtmkdzcuPNYqu>ehRe6*Z0~gRfF#iXGo(# z5^9l<-ZngjtA;tU>NQCGfi;A5=RG#{Iz9(JwYLc8d=rS%f-PJ?j&pvL9iM}s#ajWm zBc_#kEswqZ;0!-s%(-OCfeSJenuqQRaTaeW4azqi2-W8z7 z(XRj_)QMUDHTu9O>YJjG%2%=S2&K5Z;1CQaO`)Q;!!3khBIi6W`-(ez2&6$rj#&W%mqr#U>jFz`?Q~-5ji^{B4cWv_wz&-dD=(cs=EpE0l1$%(TAlQiD<6fr--lrHu>( zU8QXdDf`OkH!>Y4V;``N(u~Nw=6ou>ZC-n4kdei%Z0i17)1)y&Vz2O9iAxjEyPlOF zJoTXuvVLdp`JkN?XZ#-8`>lB>WLL&CDgNSfjbhY^BKKYBoP63Gx|Gew5B*A{`l~g@ zs}-hr7&hwUr`+nx9c8E76vy4N>UJZ`cX1Zd6yhHXs!jdmuv})ROk3S$;7&1O;8nQC z7m*$5x!4@~1}71ES*6AstHDgmqBr7(>XDuQZDQZ;2EN&s^+J41A%;6_YL0sus-V% zl_JSvuNPLv`{Ssu)eA?wB{@^{YM)wBx^IZ*$9;WAtRy#N&81W%TYQpem8>@*`tFgt zE&EXW{&<%vLGg%PX0mSa7wLD1_1L+|BE?_C-yyi$-X5ac*YdTxvp>F}T0kc2@==${ zPtQuDkH;qyiE8F9k53Ptd!mrg?y~u>lCY`)3#Hm@3 zU12b&foqNMBPq3v8kw;(DS~I0)m(6?vnBOsB0I6;l5w}^|M8`{LjbTbvjv-JZiPXg zHoOYLkLuK12_&MSHU+(f}1=IApe|{N5O~LV=WsLS^g+X`Bi!|6{ z>F%!jfSXIrpC(-s2~Zv94ej_l!vd?fS-9?A|D~)Q+Fd{Wl(j~WL2R4qUDoI))2%}+VA3hUSCBoob^<_fajgRbGg+a$|R=3r^I+7}UAYQC0zGs5xA~65P zd~!duHvQ2E`#1My-JYto%U^=##I56w`x}E;?BqW z`z001v&tFUfg|fY#z-|&_ZLrG78;7#DMr839n=SV6&p<#T)b`HBbwFK8bOL+hEbDH zN&5B(`WO9om$1GUCq*Vc^kdV~2bIwfw;U&8KcIe@ySv?e2L~;YpMWrblWm|SD{N{# z%g@WHBm(82uThY96lmO542X$xA|!o+SI6@9Xr(<$%@GLczguVbC7hI&=|4W`==L5P zW9BfAi=V%ISzJopri+d^a;<^mak#cuKY}x5;_3CXSJgb~r zfO>TU45%2~dUigUeGwbhHu}a#_!A?e4QAsK<;MP&`xaKW8S@P}pT;?)Fp4u*O=NQTZUtTW@cEXL-k*Sy5s`zJX8} zLto5QQ{q5CUzya7o@-E_=K!+0?fAI?4`EW}&D(b1rMnQhu;+HER0H#j4^ibu_J7z1 z$o3R|+{gXJVUQJ@A77M#sO_9TZXd3?I0aYG^wV;k`GXgtR=54MZvy*dk5+eyR|}JQ z6+8}3N$Tt`IA(L#C+|{|8bHg7NC~q~-4fV!^tK!VZfSQUt!_Tn?A}7ti30JgF}U)4 
zQX6(KJFH^t7_;JM`0nu7q_G3fdswSm_PuyRDv`g$vQ~FEJ@C)wixxZgvU?X9p6HGx zlodo7D_5ts2i-kBNX4>6?8UWu5V2#Hznj5YsbTzNtei4>+Xv6=6^=GN8QDX<0 znLaibkl{O+F~2Rg)gs2Le#3xV{e9}vQYTjPQZ~lsYqWTAAQu;39$iSO5 z9%AN5CUbpePabOpz3z1^S5j{nw)F6s&xZr(b$gQ2*vDps%-Dex3KClmv3kLt9)%qc@TnMhr%5PdG>SHOaO=6@|j))d_bmMu@jB$~g$6i07M^>GUv? z4>=a-^w^RQs65qXA#EGv3ze;xG5PxCQJn~3v_MPBX9e1b?1%6KxAWHO?`-AfhE^k* z9fq%0G3N=iEz8XG%>Bsd;diwn+^om*ar+T^wEDp>#ca|C)v_!m>?ejc@|cZ4ve}wa z#r{J=VlG>OWTUlh{tEoEF?X`A;X5{KZQO#UY?; z*j{vNj>g;%Uv3JwoKT-e4a)vvy#C_bt6XzGlDR2je)0VhAv?(-J*kNNr9n828EDc| zrrqF!(MM)pb7Stxvq&>7R4JMNGuVv_sCADaS z>1j~fUfnYE6RjDIWJ2%Hew6)?Fj}fO@U>`7XXIV#R~_@xNkC~RBJV}V?Iii0?LDh9 z!U;6#uebLVb$>}JASJ2M5*IWVY4S}eRa$Nd30K343x^-3A~*?CE4D2seU?~ZGJmk0 zQ1Grh1&RVIVLBb*6#E-l^gy=kmX&OKRR*n$Dq$V{mvc&WuB5UjLZ?rne_|gMQDQ+s zwc4M5a&vLQl?s@8bu_ZHGjwoczIeF#pC{66oPXE0sA&UBAu)f+Cz-8q#FkGos3Dp> z;gCYOA)NiZA{&dtHWR_18a`$<7KMs^LnlM6Zh9YQhIj@`GdfsTW=m{%r%UM7H-V&_ z%2$VxsqD+^yBna*lQgei$44X6P$t`cPogWmB)9^)qDf1-_BpJT9jWSn;(ZcW|EeW* z$AT$2-d&a?-qA9LXE(`kg4SB^#eLCtUOo1(yw)CmkMm4c(k->&ZW?B(IE*=Vo9fEu~M46EUjv7iuRdy;tQ0rC>+lCJ%(+iO2TQ zmC*J%(jww%V~X}sl~g`8R~urOP>#5pfFAM^FABRni#%8^(zPP(qu#+i^*T8T({W!g?KOKdW!a?(Y~w|`C^#v z+?|a*dg+Spfob$bZo4N>v-`_e7}e0)@KwBpoeylIiJRL??g={nE7CY@DDdmrJ`PyO z@}7i&iIPUyRuAT%Oeo1qrlFxeYqmO(&6jaslist*sLJ}S7V*`jy>ifscd#%t^uu-l zwC4mxgVMCtTW+FOU^9}Ywl;|!+}?Mt7oc5uI$A(7uQ)rL-HB@AQKQEM)E z#>)11$;eI;Yc#6BSj1E?r~$h!Kn1$CK#*{Hr?WH)uVfj?WwH`+kJRstj!a)V{V2@t z-RW;lAEY(6$qeg*yx~9wPw3t)tk)TPRM;h$q-Nf1d_1HrRcRqMo=`Yw3bS-Cv{H-p zSsBGQlFEY8aYakckPcx1U)B0B7tCNb2}xDwc*iDWOsd`rtw+$exP}+<9yDw^ku1m8 zxpnmOPm>y14T%=V@v0mMG$bZ@xAZODI?kj06cweRI5RQzcERwzqntpbKay|gnDE4- zqcpAb1L|SU@^`nD>Oz9lb(*ohl26{0tgzm6*>Fsq5UtNa`^>4c%uugUn`dlzr%vkK zZAXJL{9tIKaA!>%GmHReA{FSAp`_@|{APw#W~E|kQC8#_M+fbUcV7($E08ciQU)@1 zpQTCE3gApfRr!sFN2Kjw2I-YLKh8F^W2=Qeat~d1c>jjrO$wKp>5r!FvfWI0qaf## zO{OI+;2;-z7r0f{-dc+*yOzpZv>(iG}a`VUEVS{kLsfRxGVSevG?02 z4{yCU+^dqX&}}w51Z5Vpa`XxI#<_3p~?)c5#K>=w?+>W#D#Lq6+lWmWS)zNCLy($nz9YJU)q*Fo`3KF}m7Ux2~Bfyf0EWwcS3w 
zu*I(o$096!t#@BvwazT3uQ*JuegU#pf@qm>XdmeeU)0{ zQ9q26&_~%#Da$#YmkE5?LcIr2e0Bhv}hXIooC zJCi^8*I&(Ag#cKJF3+SZDFQ!B^{SjCFZwp{_KUwFCe7fW{aY#Fd@Qau4D1-)nlWeS z-_M9MI=N32iJpe1y*-dg+YbEENId%^{R<`O*yu-_?lH%yj$Qs`fz4n0o1phFiN4J< z9r*$3k@1VAO}%Q-m{Kvfx$@!OGO+0czYbuEQd7uDPQX&)lbm|Tpj=*^Wrd|=4nNG5 z(1kU=i`_mqOZt7{&U>5~QzcqToDQRR?I&iYV#CY0c5>V=c{{a5B)D+-L#(1cJT^BG z)8eFalj;a^L-8r0)S<9erF*H2*}E9{BwQZ0cOwud;YVTMtQGl}d?lvhg3{t3MZWG8*8gRnw2Q% zWio?j^zCgDxDP~sy_5?!uMSWmeq$8M{k7I1&hWKbrla&LW11}GC&)@MJRaJ<6QE(I zd-q1@)@+y(lG)#FbhcB94I+mbxygLaYsC~Xnd@rViR4xgC3>uLkQlKLe8+k9Eopp^ z>hI64@eY_DBBUl~b||a4`-YK@M{`_Xr8rS=qJ(%`dUsWXl*3~NrH|`z~gn|EU+$BHd?X97}27ffMalI`&Rzk zYE#?4c3lL~1@B=p%V^;;r=}2#bqaRv`e$dk@hEE~Uq-`Q>c>Bzh#iJTjuY?4_B71I zjfm&kv!5{dl&wm@F3_yEkEh*AqRc;u>a?~A8oGIpZyQQqna7Tc0@r0S@o_MJlZN2V zN%G(XRzaWJUJT0~BolmdpuW&@XBJ9`)5e(PE? zOa0o(lVF2JwZD|@sN-~_Or78$7mH4tJw)_-Zf$ll+`3Y(bJHEgoHE;B+WVQxY;Rkw z*krnmGzAyZL_P>MuQ+`goTgLU#{Fi1y6L=^2EQF{89pdtx#x%JkIJNzQ^^)KNrxem zN`jlhFHG}`nn_4vdKmAfe$yN^ZgASH^g7_nvY%mApfLDLZ|DnQiWwMdPBWO$;P`t*n6tBQ|3EIz^6?4Ad{ySxD(H2G~qoIYS$ z)&OtGMdrxyf8VN3a~f@4-VwBfVe%83V6vI07Ae?xsF+ZJj9e?YG|!AK_<#=k_YJOj z`k(}O!ddq@R9N}9@`)opO(Z_ri!D(Z=7&h%tG^a&OwV!XFLNlQmQq#t^A0PO&OYSx zZp1cqA4J*Wa)#3?+}&`Z87v~;#hk$FUv6!-0cSg}}<8UWk=L7yF-t=lY-jJ?S?-Jx%XV)-w$g zmVseK^Pe8vc%!}%oZ(`Kik%r^9s#G2jYU`A)0{!OnnR)8_+oL$Pbfo3nm5S3@BUOI z*QUpqmzO7Wm6#MR!{ji%7pRa5iEGMPo`(-5nDNd|dwEU)JkH2KZ({t3_Q_UEH31{S zSd#$4LwUIZ^9kpy6LPM%`TEYxRA}A59s8X6-0Ixv_C9;yp*_@iYk+w_-WyYP(ZT;TxW&*0LwJaUCqG# z*)ZX}$VkBY4GxB6+7`gcPspI4PS?kSuXW(|>{n;YT+jLTBItm+SFr<7Ir%me)amXn z;OiKW@eBk$YUIz-G746XKcR9F#}zjPpADEe6R_v$Svueg8x4$8aCWkHcB1%)SQxnT z&kul=E8D;;py*%-a0i0{=7=eXUoA$!t9&jA;OC!!pEE-IoEtcH?o6w*mkGcxgHxzj zfE=74&Xl~UA-CcUA|n9Z2P`(`fk2PrfzekE>&%?(C~Tdb42^6|RZP!H`TMTw#bSJ* z>AfEVen|wp0O^-3px1Jy!DYYvJxux{&M4&_iWK0NG2jCkrxrj2o#8YbES*gMCV`6t zkDsYpJHW450o@@9yX0VkwBchzDTS+@0y3a~bOVTsDS!Zsk^2f+9Y=KxSp)-+(m6@HZk*_r>-tsX-B@h5;e$7YoXJrj-8zkQ?t( ze)$2Ed#3tjx&N+My@;F3B1lR?g@U33>;@S_OZUOJzj^O+7@n%`5kaZz4X7Rr#1Ld| 
z_^1d*oOxQ+&EC|J;(?{PsHw4~t)b1eem+IJSu-;ApKhXsa?wdCs$ldbW!YG+mh{ES z9Y;Z+$~9f8v~^lw+#iXqCLDIC$xr~gP6I+gHWU0hV8Y+PyDlG}YNliR+I9|zU=09t zI%)!6dHP_`S$p$m66>1Kvn%f{P6SZ*FQJBwu0vhX+MG&7A-rj>4g}XK;7S~@NWfjS zDH!#?jA92gI9J=Ks>B`23y_BkkcSKcqG|<({MS!Xv$H%)-B(hQkALBKt~O_h3{Ra5ARRs+9SelAoln8IGjslZ zp@;R-3i!V}gI>%hJyzzPd_eVT1U`2l=*|IPy7H+vt`^TaK&>MQ@U}S+f{+zsI}i+j z6z}iLMHgjx^PWnr4A3+RNR^PNp%5_YLQcL~Dt>xGTvh-z2E-tw(FP*HC`hSTSuPpv zni|YOH2$R@(76ks3qu&qC=N`&ASf9AU*n+y^tDV){?8_)U3=~D5fD(3fL&N2ipxp@ z7k5FKd*Je1{@1IaV7(^%MdbDFngH6&0v?2H3LRg7@&6y%{HsfMT5Vleu%z<=f~*wx z^T8;o>r*Pm(fx_j-i0+_QhErFeJcP1&YIYB-ES$of9F>9_a5qlfMj+698$@j#b7X` z39so%EMhcOPXL$%Ac#pIWW+56qb?;;?yF4}vD6R42Xq}#&@Sds;fjCI*sc*S#i2eZ z0uU|{Fd)#=Id^r`fpL&}Ue0+}j0H~PkS}it+7^&C{F1DdAHeWSnzLQg2vfS&@LB_5 zEewP;#W_gt z2oP}hE~RwoNigdFBMJU5rT=qudn)yivAC?E}4}SVgkc%0U-B50&7;lz_U!M;^biYkFn*cw!>wJ zdG`QqX@S~&4?;rXT`=&Xgnv^sIR6_tTiTdh*C#}J^0N0BD8m}CC@n-8)d%1*&f@F8 zQO*CAa1#UC%EAHTre12ZkdDCE3yxw3%lx+V0(4A(P68qHkKbVWrA%~9 z3+S!Yw;~6afCMlzBuyIj#`QzZYr<;LF+A{WZ$Zc|)Z#5L4bq_hbJ^)EkD9~TK=GPR z$EU=jI8H#;P9P6MR^1!7!E{KwUZZwBdTo5cU(Z9`f;3l_bmYZ6?T!SW0cGwo82kpqU2*G^|$LpvCZ{?!VE z4jB+SkTxX-`aWm03zBhO%C6Uq#BV~8JWBvvCVQ$pMAVCmg7Me1J6BijjAeLNMF8mt zNDYZ(k@&xm*K|FMRqkgp0zIQb-~*X027npsGb3DhG4*QcV4FDjidxPvs3mEL96pRU6-LB{|S_?Z(BO9Bv!6e222jld|#iu?~dQOn3@lK_%s z0eJzk)m=3K{c-w|Fya*Znj3C|D)mcdMC7f zniBhf0wA9tJh8r9z_~mlVqxg;&vt<3*}7~$Ao_d2u4E9RGlIb6|K<@NxHZ9$}t8|O_zX%g!|mDp`aX`z`wt`_Oo4)WUj7q z$u?-{$UrI4Kq-(T)vvDLQqF?>kMDUdWw2|ap~YNyI3G|SAAplUsG#ZvhF^-UYv$K} zgyiij0-{y|G{{bX>{Br9qCrmmB<$c|==M+Nt1ZPNKL#8h4vZ2ZE7cEwF#D3Etk-mm z{Xc6)q5}~cerblcKI9)bwrf&spNfLz>FBQIQa^ww8jOQ<66-ZA3^Wc(9DwBxhzFU8 z_TsK%U87|(Pnwz^fXWBNqJeO)dJ-4~S=mKSmnrHLR)G@Aes@}YQB7MLaiaxne3ROb)P zgY2D~1HJ zz1J&R>{HhHMXOX04x;@9W?gKc{y6>Yje(2uonH|KnXXG=p#Q5oXQqW%{CW{}ehJm- k5;E|W2M Date: Tue, 26 Jan 2010 18:41:03 -0500 Subject: [PATCH 036/823] Ignore 'unknown resolver' errors, work with published version of SBinary, work towards fixing OOME:PermGen issues on reload --- 
cache/tracking/TrackingFormat.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/cache/tracking/TrackingFormat.scala b/cache/tracking/TrackingFormat.scala index c0106d3a0..1318c6096 100644 --- a/cache/tracking/TrackingFormat.scala +++ b/cache/tracking/TrackingFormat.scala @@ -46,7 +46,7 @@ private object TrackingFormat } def trackingFormat[T](translateProducts: Boolean)(implicit tFormat: Format[T]): Format[DependencyTracking[T]] = asProduct4((a: DMap[T],b: DMap[T],c: DMap[T], d:TagMap[T]) => new DefaultTracking(translateProducts)(a,b,c,d) : DependencyTracking[T] - )(dt => Some(dt.reverseDependencies, dt.reverseUses, dt.sourceMap, dt.tagMap)) + )(dt => (dt.reverseDependencies, dt.reverseUses, dt.sourceMap, dt.tagMap)) } private final class IndexMap[T] extends NotNull From a293916e46849d666f8c3c40dc3c03bc25268ac1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 7 Feb 2010 23:45:19 -0500 Subject: [PATCH 037/823] legal cleanup --- NOTICE | 3 ++- cache/NOTICE | 3 +++ cache/tracking/NOTICE | 3 +++ util/collection/HLists.scala | 6 +++--- util/collection/NOTICE | 7 +++++++ util/collection/TreeHashSet.scala | 3 +++ util/control/ErrorHandling.scala | 3 +++ util/control/NOTICE | 3 +++ util/log/NOTICE | 3 +++ 9 files changed, 30 insertions(+), 4 deletions(-) create mode 100644 cache/NOTICE create mode 100644 cache/tracking/NOTICE create mode 100644 util/collection/NOTICE create mode 100644 util/control/NOTICE create mode 100644 util/log/NOTICE diff --git a/NOTICE b/NOTICE index 6f5297e2e..27a575311 100644 --- a/NOTICE +++ b/NOTICE @@ -2,7 +2,8 @@ Simple Build Tool (xsbt) Copyright 2008, 2009, 2010 Mark Harrah Licensed under BSD-style license (see LICENSE) -Portions based on code from the Scala compiler +Portions based on code from the Scala compiler. Portions of the Scala +library are distributed with the launcher. 
Copyright 2002-2008 EPFL, Lausanne Licensed under BSD-style license (see licenses/LICENSE_Scala) diff --git a/cache/NOTICE b/cache/NOTICE new file mode 100644 index 000000000..8f0040336 --- /dev/null +++ b/cache/NOTICE @@ -0,0 +1,3 @@ +Simple Build Tool: Cache Component +Copyright 2009 Mark Harrah +Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/cache/tracking/NOTICE b/cache/tracking/NOTICE new file mode 100644 index 000000000..c7c0531d9 --- /dev/null +++ b/cache/tracking/NOTICE @@ -0,0 +1,3 @@ +Simple Build Tool: Tracking Component +Copyright 2009, 2010 Mark Harrah +Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/util/collection/HLists.scala b/util/collection/HLists.scala index f376ee4fd..69ac9d6da 100644 --- a/util/collection/HLists.scala +++ b/util/collection/HLists.scala @@ -1,8 +1,8 @@ -package xsbt - // stripped down version of http://trac.assembla.com/metascala/browser/src/metascala/HLists.scala // Copyright (c) 2009, Jesper Nordenberg -// new BSD license, see licenses/MetaScala +// new BSD license, see licenses/LICENSE_MetaScala + +package xsbt object HLists extends HLists trait HLists diff --git a/util/collection/NOTICE b/util/collection/NOTICE new file mode 100644 index 000000000..ddfa4614d --- /dev/null +++ b/util/collection/NOTICE @@ -0,0 +1,7 @@ +Simple Build Tool: Collection Component +Copyright 2009 Mark Harrah +Licensed under BSD-style license (see LICENSE) + +Portions based on MetaScala +Copyright (c) 2009, Jesper Nordenberg +Licensed under BSD-style license (see licenses/LICENSE_MetaScala) \ No newline at end of file diff --git a/util/collection/TreeHashSet.scala b/util/collection/TreeHashSet.scala index f84981174..d45a0acfb 100644 --- a/util/collection/TreeHashSet.scala +++ b/util/collection/TreeHashSet.scala @@ -1,3 +1,6 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ package xsbt import scala.collection.{mutable,immutable} diff --git 
a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala index fd8b6e69c..98c1b1a73 100644 --- a/util/control/ErrorHandling.scala +++ b/util/control/ErrorHandling.scala @@ -1,3 +1,6 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ package xsbt object ErrorHandling diff --git a/util/control/NOTICE b/util/control/NOTICE new file mode 100644 index 000000000..76a30965a --- /dev/null +++ b/util/control/NOTICE @@ -0,0 +1,3 @@ +Simple Build Tool: Control Component +Copyright 2009 Mark Harrah +Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/util/log/NOTICE b/util/log/NOTICE new file mode 100644 index 000000000..1a42b7a31 --- /dev/null +++ b/util/log/NOTICE @@ -0,0 +1,3 @@ +Simple Build Tool: Logging Component +Copyright 2008, 2009 Mark Harrah +Licensed under BSD-style license (see LICENSE) \ No newline at end of file From 7d06e7a57b583884bb30b57772c559b5d27cece6 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 7 Feb 2010 23:45:55 -0500 Subject: [PATCH 038/823] legal cleanup --- NOTICE | 32 -------------------------------- 1 file changed, 32 deletions(-) diff --git a/NOTICE b/NOTICE index 1c90f88f5..7d09e8161 100644 --- a/NOTICE +++ b/NOTICE @@ -10,35 +10,3 @@ Licensed under BSD-style license (see licenses/LICENSE_jdepend) Portions based on code by Pete Kirkham in Nailgun Copyright 2004, Martian Software, Inc Licensed under the Apache License, Version 2.0 (see licenses/LICENSE_Apache) - -Portions based on code from the Scala compiler -Copyright 2002-2008 EPFL, Lausanne -Licensed under BSD-style license (see licenses/LICENSE_Scala) - -JLine is distributed with the sbt launcher. -It is licensed under a BSD-style license (see licenses/LICENSE_JLine) - -Apache Ivy, licensed under the Apache License, Version 2.0 -(see licenses/LICENSE_Apache) is distributed with the sbt launcher. 
-It requires the following notice: - -This product includes software developed by -The Apache Software Foundation (http://www.apache.org/). - -Portions of Ivy were originally developed by -Jayasoft SARL (http://www.jayasoft.fr/) -and are licensed to the Apache Software Foundation under the -"Software Grant License Agreement" - - -THIS SOFTWARE IS PROVIDED BY THE AUTHORS ``AS IS'' AND ANY EXPRESS OR -IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES -OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. -IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, -INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT -NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, -DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY -THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF -THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - \ No newline at end of file From 1b8fb9a3e59ed608bda2a78c8de42e9d40202aa4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 26 Mar 2010 07:55:02 -0400 Subject: [PATCH 039/823] Jason's patch to work with latest changes to CompilerCommand --- LICENSE | 2 +- NOTICE | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/LICENSE b/LICENSE index 27b5d6df2..7b09b8ec6 100644 --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (c) 2008, 2009, 2010 Mark Harrah +Copyright (c) 2008, 2009, 2010 Mark Harrah, Jason Zaugg All rights reserved. Redistribution and use in source and binary forms, with or without diff --git a/NOTICE b/NOTICE index 27a575311..4dd573949 100644 --- a/NOTICE +++ b/NOTICE @@ -1,5 +1,5 @@ Simple Build Tool (xsbt) -Copyright 2008, 2009, 2010 Mark Harrah +Copyright 2008, 2009, 2010 Mark Harrah, Jason Zaugg Licensed under BSD-style license (see LICENSE) Portions based on code from the Scala compiler. 
Portions of the Scala From 74c0f2a4f53a48e892ad53200a97b358bf5a786f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 26 Mar 2010 08:19:39 -0400 Subject: [PATCH 040/823] clarification in NOTICE --- NOTICE | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/NOTICE b/NOTICE index 4dd573949..63326efeb 100644 --- a/NOTICE +++ b/NOTICE @@ -1,4 +1,4 @@ -Simple Build Tool (xsbt) +Simple Build Tool (xsbt components other than sbt/) Copyright 2008, 2009, 2010 Mark Harrah, Jason Zaugg Licensed under BSD-style license (see LICENSE) From 4604682a1d5202d4fdcba3701be5c3aba0e31f40 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 28 Mar 2010 00:05:40 -0400 Subject: [PATCH 041/823] Support for tests written in Java and annotation-based test frameworks --- interface/src/main/java/xsbti/AnalysisCallback.java | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 2372a408b..4db5f28c3 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -9,6 +9,8 @@ public interface AnalysisCallback { /** The names of classes that the analyzer should find subclasses of.*/ public String[] superclassNames(); + /** The names of annotations that the analyzer should look for on methods and classes.*/ + public String[] annotationNames(); /** Called when the the given superclass could not be found on the classpath by the compiler.*/ public void superclassNotFound(String superclassName); /** Called before the source at the given location is processed. 
*/ @@ -16,6 +18,8 @@ public interface AnalysisCallback /** Called when the a subclass of one of the classes given in superclassNames is * discovered.*/ public void foundSubclass(File source, String subclassName, String superclassName, boolean isModule); + /** Called when an annotation with name annotationName is found on a class or one of its methods.*/ + public void foundAnnotated(File source, String className, String annotationName, boolean isModule); /** Called to indicate that the source file source depends on the source file * dependsOn. Note that only source files included in the current compilation will * passed to this method. Dependencies on classes generated by sources not in the current compilation will From 83fa0480262de03f42db239f9a69dfa3993a59a8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 28 Mar 2010 20:20:17 -0400 Subject: [PATCH 042/823] annotation detection test --- interface/src/test/scala/TestCallback.scala | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 23978f920..8e7014af7 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -3,12 +3,13 @@ package xsbti import java.io.File import scala.collection.mutable.ArrayBuffer -class TestCallback(val superclassNames: Array[String]) extends AnalysisCallback +class TestCallback(val superclassNames: Array[String], val annotationNames: Array[String]) extends AnalysisCallback { val invalidSuperclasses = new ArrayBuffer[String] val beganSources = new ArrayBuffer[File] val endedSources = new ArrayBuffer[File] val foundSubclasses = new ArrayBuffer[(File, String, String, Boolean)] + val foundAnnotated = new ArrayBuffer[(File, String, String, Boolean)] val sourceDependencies = new ArrayBuffer[(File, File)] val jarDependencies = new ArrayBuffer[(File, File)] val classDependencies = new ArrayBuffer[(File, File)] @@ -19,6 +20,8 @@ class 
TestCallback(val superclassNames: Array[String]) extends AnalysisCallback def beginSource(source: File) { beganSources += source } def foundSubclass(source: File, subclassName: String, superclassName: String, isModule: Boolean): Unit = foundSubclasses += ((source, subclassName, superclassName, isModule)) + def foundAnnotated(source: File, className: String, annotationName: String, isModule: Boolean): Unit = + foundAnnotated += ((source, className, annotationName, isModule)) def sourceDependency(dependsOn: File, source: File) { sourceDependencies += ((dependsOn, source)) } def jarDependency(jar: File, source: File) { jarDependencies += ((jar, source)) } def classDependency(clazz: File, source: File) { classDependencies += ((clazz, source)) } From 7927d8bdad2d762e4b53c6a5b91384f4ccd24f76 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 May 2010 18:42:58 -0400 Subject: [PATCH 043/823] higher-kinded heterogeneous lists: MList[M[_]] natural transformations: ~>[A[_], B[_]] Scala 2.8 --- util/collection/HList.scala | 22 +++++++++++++++ util/collection/HLists.scala | 29 -------------------- util/collection/MList.scala | 42 +++++++++++++++++++++++++++++ util/collection/NOTICE | 8 ++---- util/collection/TreeHashSet.scala | 25 ----------------- util/collection/TypeFunctions.scala | 16 +++++++++++ util/collection/Types.scala | 8 ++++++ 7 files changed, 90 insertions(+), 60 deletions(-) create mode 100644 util/collection/HList.scala delete mode 100644 util/collection/HLists.scala create mode 100644 util/collection/MList.scala delete mode 100644 util/collection/TreeHashSet.scala create mode 100644 util/collection/TypeFunctions.scala create mode 100644 util/collection/Types.scala diff --git a/util/collection/HList.scala b/util/collection/HList.scala new file mode 100644 index 000000000..b36f68955 --- /dev/null +++ b/util/collection/HList.scala @@ -0,0 +1,22 @@ +package sbt + +import Types._ + +sealed trait HList +{ + type Up <: MList[Id] + def up: Up +} +sealed trait HNil 
extends HList +{ + type Up = MNil[Id] + def up = MNil + def :+: [G](g: G): G :+: HNil = HCons(g, this) +} +object HNil extends HNil +final case class HCons[H, T <: HList](head : H, tail : T) extends HList +{ + type Up = MCons[H, T#Up, Id] + def up = MCons[H,T#Up, Id](head, tail.up) + def :+: [G](g: G): G :+: H :+: T = HCons(g, this) +} \ No newline at end of file diff --git a/util/collection/HLists.scala b/util/collection/HLists.scala deleted file mode 100644 index 69ac9d6da..000000000 --- a/util/collection/HLists.scala +++ /dev/null @@ -1,29 +0,0 @@ -// stripped down version of http://trac.assembla.com/metascala/browser/src/metascala/HLists.scala -// Copyright (c) 2009, Jesper Nordenberg -// new BSD license, see licenses/LICENSE_MetaScala - -package xsbt - -object HLists extends HLists -trait HLists -{ - object :: { def unapply[H,T<:HList](list: HCons[H,T]) = Some((list.head,list.tail)) } - type ::[H, T <: HList] = HCons[H, T] -} - -object HNil extends HNil -sealed trait HList { - type Head - type Tail <: HList -} -sealed class HNil extends HList { - type Head = Nothing - type Tail = HNil - def ::[T](v : T) = HCons(v, this) -} - -final case class HCons[H, T <: HList](head : H, tail : T) extends HList { - type Head = H - type Tail = T - def ::[T](v : T) = HCons(v, this) -} \ No newline at end of file diff --git a/util/collection/MList.scala b/util/collection/MList.scala new file mode 100644 index 000000000..981056823 --- /dev/null +++ b/util/collection/MList.scala @@ -0,0 +1,42 @@ +package sbt + +import Types._ + +sealed trait MList[M[_]] +{ + type Map[N[_]] <: MList[N] + def map[N[_]](f: M ~> N): Map[N] + + type Down <: HList + def down: Down + + def toList: List[M[_]] +} +final case class MCons[H, T <: MList[M], M[_]](head: M[H], tail: T) extends MList[M] +{ + type Down = M[H] :+: T#Down + def down = HCons(head, tail.down) + + type Map[N[_]] = MCons[H, T#Map[N], N] + def map[N[_]](f: M ~> N) = MCons( f(head), tail.map(f) ) + + def :^: [G](g: M[G]): MCons[G, 
MCons[H, T, M], M] = MCons(g, this) + + def toList = head :: tail.toList +} +sealed class MNil[M[_]] extends MList[M] +{ + type Down = HNil + def down = HNil + + type Map[N[_]] = MNil[N] + def map[N[_]](f: M ~> N): MNil[N] = new MNil[N] + + def :^: [H](h: M[H]): MCons[H, MNil[M], M] = MCons(h, this) + + def toList = Nil +} +object MNil extends MNil[Id] +{ + implicit def apply[N[_]]: MNil[N] = new MNil[N] +} \ No newline at end of file diff --git a/util/collection/NOTICE b/util/collection/NOTICE index ddfa4614d..428020987 100644 --- a/util/collection/NOTICE +++ b/util/collection/NOTICE @@ -1,7 +1,3 @@ Simple Build Tool: Collection Component -Copyright 2009 Mark Harrah -Licensed under BSD-style license (see LICENSE) - -Portions based on MetaScala -Copyright (c) 2009, Jesper Nordenberg -Licensed under BSD-style license (see licenses/LICENSE_MetaScala) \ No newline at end of file +Copyright 2010 Mark Harrah +Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/util/collection/TreeHashSet.scala b/util/collection/TreeHashSet.scala deleted file mode 100644 index d45a0acfb..000000000 --- a/util/collection/TreeHashSet.scala +++ /dev/null @@ -1,25 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package xsbt - -import scala.collection.{mutable,immutable} - - // immutable.HashSet is not suitable for multi-threaded access, so this -// implementation uses an underlying immutable.TreeHashMap, which is suitable -object TreeHashSet -{ - def apply[T](contents: T*) = new TreeHashSet(immutable.TreeHashMap( andUnit(contents) : _*)) - def andUnit[T](contents: Iterable[T]) = contents.map(c => (c,()) ).toSeq -} -final class TreeHashSet[T](backing: immutable.TreeHashMap[T,Unit]) extends immutable.Set[T] -{ - import TreeHashSet.andUnit - override def contains(t: T) = backing.contains(t) - override def ++(s: Iterable[T]) = new TreeHashSet(backing ++ andUnit(s)) - override def +(s: T) = ++( Seq(s) ) - override def -(s: T) = new 
TreeHashSet(backing - s) - override def elements = backing.keys - override def empty[A] = TreeHashSet[A]() - override def size = backing.size -} \ No newline at end of file diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala new file mode 100644 index 000000000..c07ecce77 --- /dev/null +++ b/util/collection/TypeFunctions.scala @@ -0,0 +1,16 @@ +package sbt + +trait TypeFunctions +{ + type Id[X] = X + trait Const[A] { type Apply[B] = A } + trait Down[M[_]] { type Apply[B] = Id[M[B]] } + + trait ~>[A[_], B[_]] + { + def apply[T](a: A[T]): B[T] + } + def Id: Id ~> Id = + new ~>[Id, Id] { def apply[T](a: T): T = a } +} +object TypeFunctions extends TypeFunctions \ No newline at end of file diff --git a/util/collection/Types.scala b/util/collection/Types.scala new file mode 100644 index 000000000..40987e257 --- /dev/null +++ b/util/collection/Types.scala @@ -0,0 +1,8 @@ +package sbt + +object Types extends TypeFunctions +{ + val :^: = MCons + val :+: = HCons + type :+:[H, T <: HList] = HCons[H,T] +} \ No newline at end of file From 86c938d198c7e01ab47d2c928ece95179ec7fad5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 May 2010 21:14:18 -0400 Subject: [PATCH 044/823] MList covariant, initial Node --- util/collection/TypeFunctions.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index c07ecce77..9a6d96b14 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -4,6 +4,7 @@ trait TypeFunctions { type Id[X] = X trait Const[A] { type Apply[B] = A } + trait P1of2[M[_,_], A] { type Apply[B] = M[A,B] } trait Down[M[_]] { type Apply[B] = Id[M[B]] } trait ~>[A[_], B[_]] From b1bb6ce5ecbda89699f3482fda930b2eb40f68c5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 1 Jun 2010 08:38:56 -0400 Subject: [PATCH 045/823] variance fixes, inference fixes with Result hierarchy --- util/collection/HList.scala | 12 ++++++++--- 
util/collection/MList.scala | 33 +++++++++++------------------ util/collection/TypeFunctions.scala | 7 ++++-- util/collection/Types.scala | 3 +++ 4 files changed, 29 insertions(+), 26 deletions(-) diff --git a/util/collection/HList.scala b/util/collection/HList.scala index b36f68955..a475c1194 100644 --- a/util/collection/HList.scala +++ b/util/collection/HList.scala @@ -1,22 +1,28 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ package sbt import Types._ sealed trait HList { + type ToM[M[_]] <: MList[M] type Up <: MList[Id] def up: Up } sealed trait HNil extends HList { - type Up = MNil[Id] + type ToM[M[_]] = MNil + type Up = MNil def up = MNil def :+: [G](g: G): G :+: HNil = HCons(g, this) } object HNil extends HNil final case class HCons[H, T <: HList](head : H, tail : T) extends HList { - type Up = MCons[H, T#Up, Id] - def up = MCons[H,T#Up, Id](head, tail.up) + type ToM[M[_]] = MCons[H, tail.ToM[M], M] + type Up = MCons[H, tail.Up, Id] + def up = MCons[H,tail.Up, Id](head, tail.up) def :+: [G](g: G): G :+: H :+: T = HCons(g, this) } \ No newline at end of file diff --git a/util/collection/MList.scala b/util/collection/MList.scala index 981056823..58d2724c3 100644 --- a/util/collection/MList.scala +++ b/util/collection/MList.scala @@ -1,42 +1,33 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ package sbt import Types._ -sealed trait MList[M[_]] +sealed trait MList[+M[_]] { type Map[N[_]] <: MList[N] def map[N[_]](f: M ~> N): Map[N] - type Down <: HList - def down: Down - def toList: List[M[_]] } -final case class MCons[H, T <: MList[M], M[_]](head: M[H], tail: T) extends MList[M] +final case class MCons[H, +T <: MList[M], +M[_]](head: M[H], tail: T) extends MList[M] { - type Down = M[H] :+: T#Down - def down = HCons(head, tail.down) - - type Map[N[_]] = MCons[H, T#Map[N], N] + type Map[N[_]] = MCons[H, tail.Map[N], N] def map[N[_]](f: M ~> N) = MCons( f(head), tail.map(f) ) - def :^: [G](g: M[G]): MCons[G, MCons[H, T, M], 
M] = MCons(g, this) + def :^: [N[X] >: M[X], G](g: N[G]): MCons[G, MCons[H, T, N], N] = MCons(g, this) def toList = head :: tail.toList } -sealed class MNil[M[_]] extends MList[M] +sealed class MNil extends MList[Nothing] { - type Down = HNil - def down = HNil - - type Map[N[_]] = MNil[N] - def map[N[_]](f: M ~> N): MNil[N] = new MNil[N] + type Map[N[_]] = MNil + def map[N[_]](f: Nothing ~> N) = MNil - def :^: [H](h: M[H]): MCons[H, MNil[M], M] = MCons(h, this) + def :^: [M[_], H](h: M[H]): MCons[H, MNil, M] = MCons(h, this) def toList = Nil } -object MNil extends MNil[Id] -{ - implicit def apply[N[_]]: MNil[N] = new MNil[N] -} \ No newline at end of file +object MNil extends MNil diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 9a6d96b14..95fb8ae9e 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -1,3 +1,6 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ package sbt trait TypeFunctions @@ -7,11 +10,11 @@ trait TypeFunctions trait P1of2[M[_,_], A] { type Apply[B] = M[A,B] } trait Down[M[_]] { type Apply[B] = Id[M[B]] } - trait ~>[A[_], B[_]] + trait ~>[-A[_], +B[_]] { def apply[T](a: A[T]): B[T] } def Id: Id ~> Id = - new ~>[Id, Id] { def apply[T](a: T): T = a } + new (Id ~> Id) { def apply[T](a: T): T = a } } object TypeFunctions extends TypeFunctions \ No newline at end of file diff --git a/util/collection/Types.scala b/util/collection/Types.scala index 40987e257..de468216d 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -1,3 +1,6 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ package sbt object Types extends TypeFunctions From 1144fb5a2724779707546a966adb78ca63532a3d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 7 Jun 2010 08:53:21 -0400 Subject: [PATCH 046/823] graph evaluator, rewrite, general updates --- util/collection/IDSet.scala | 45 +++++++++++++++++ util/collection/MList.scala | 13 ++++- 
util/collection/PMap.scala | 43 ++++++++++++++++ util/collection/Param.scala | 31 ++++++++++++ util/collection/Rewrite.scala | 42 ++++++++++++++++ util/collection/TypeFunctions.scala | 30 +++++++---- util/collection/Types.scala | 2 +- .../src/test/scala/LiteralTest.scala | 17 +++++++ .../collection/src/test/scala/MListTest.scala | 19 +++++++ util/collection/src/test/scala/PMapTest.scala | 18 +++++++ .../src/test/scala/RewriteTest.scala | 50 +++++++++++++++++++ 11 files changed, 299 insertions(+), 11 deletions(-) create mode 100644 util/collection/IDSet.scala create mode 100644 util/collection/PMap.scala create mode 100644 util/collection/Param.scala create mode 100644 util/collection/Rewrite.scala create mode 100644 util/collection/src/test/scala/LiteralTest.scala create mode 100644 util/collection/src/test/scala/MListTest.scala create mode 100644 util/collection/src/test/scala/PMapTest.scala create mode 100644 util/collection/src/test/scala/RewriteTest.scala diff --git a/util/collection/IDSet.scala b/util/collection/IDSet.scala new file mode 100644 index 000000000..29ecf469d --- /dev/null +++ b/util/collection/IDSet.scala @@ -0,0 +1,45 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +/** A mutable set interface that uses object identity to test for set membership.*/ +trait IDSet[T] +{ + def apply(t: T): Boolean + def contains(t: T): Boolean + def += (t: T): Unit + def ++=(t: Iterable[T]): Unit + def -= (t: T): Boolean + def all: collection.Iterable[T] + def isEmpty: Boolean + def foreach(f: T => Unit): Unit + def process[S](t: T)(ifSeen: S)(ifNew: => S): S +} + +object IDSet +{ + implicit def toTraversable[T]: IDSet[T] => Traversable[T] = _.all + def apply[T](values: T*): IDSet[T] = apply(values) + def apply[T](values: Iterable[T]): IDSet[T] = + { + val s = create[T] + s ++= values + s + } + def create[T]: IDSet[T] = new IDSet[T] { + private[this] val backing = new java.util.IdentityHashMap[T, AnyRef] + private[this] val Dummy: 
AnyRef = "" + + def apply(t: T) = contains(t) + def contains(t: T) = backing.containsKey(t) + def foreach(f: T => Unit) = all foreach f + def += (t: T) = backing.put(t, Dummy) + def ++=(t: Iterable[T]) = t foreach += + def -= (t:T) = if(backing.remove(t) eq null) false else true + def all = collection.JavaConversions.asIterable(backing.keySet) + def isEmpty = backing.isEmpty + def process[S](t: T)(ifSeen: S)(ifNew: => S) = if(contains(t)) ifSeen else { this += t ; ifNew } + override def toString = backing.toString + } +} \ No newline at end of file diff --git a/util/collection/MList.scala b/util/collection/MList.scala index 58d2724c3..b350858c3 100644 --- a/util/collection/MList.scala +++ b/util/collection/MList.scala @@ -7,6 +7,11 @@ import Types._ sealed trait MList[+M[_]] { + // For converting MList[Id] to an HList + // This is useful because type inference doesn't work well with Id + type Raw <: HList + def down(implicit ev: M ~> Id): Raw + type Map[N[_]] <: MList[N] def map[N[_]](f: M ~> N): Map[N] @@ -14,6 +19,9 @@ sealed trait MList[+M[_]] } final case class MCons[H, +T <: MList[M], +M[_]](head: M[H], tail: T) extends MList[M] { + type Raw = H :+: tail.Raw + def down(implicit f: M ~> Id): Raw = HCons(f(head), tail.down(f)) + type Map[N[_]] = MCons[H, tail.Map[N], N] def map[N[_]](f: M ~> N) = MCons( f(head), tail.map(f) ) @@ -23,6 +31,9 @@ final case class MCons[H, +T <: MList[M], +M[_]](head: M[H], tail: T) extends ML } sealed class MNil extends MList[Nothing] { + type Raw = HNil + def down(implicit f: Nothing ~> Id) = HNil + type Map[N[_]] = MNil def map[N[_]](f: Nothing ~> N) = MNil @@ -30,4 +41,4 @@ sealed class MNil extends MList[Nothing] def toList = Nil } -object MNil extends MNil +object MNil extends MNil \ No newline at end of file diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala new file mode 100644 index 000000000..bc5e092af --- /dev/null +++ b/util/collection/PMap.scala @@ -0,0 +1,43 @@ +/* sbt -- Simple Build Tool + * 
Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +trait PMap[K[_], V[_]] extends (K ~> V) +{ + def apply[T](k: K[T]): V[T] + def get[T](k: K[T]): Option[V[T]] + def update[T](k: K[T], v: V[T]): Unit + def contains[T](k: K[T]): Boolean + def remove[T](k: K[T]): Option[V[T]] + def getOrUpdate[T](k: K[T], make: => V[T]): V[T] +} +object PMap +{ + implicit def toFunction[K[_], V[_]](map: PMap[K,V]): K[_] => V[_] = k => map(k) +} + +abstract class AbstractPMap[K[_], V[_]] extends PMap[K,V] +{ + def apply[T](k: K[T]): V[T] = get(k).get + def contains[T](k: K[T]): Boolean = get(k).isDefined +} + +import collection.mutable.Map + +/** Only suitable for K that is invariant in its parameter. +* Option and List keys are not, for example, because None <:< Option[String] and None <: Option[Int].*/ +class DelegatingPMap[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractPMap[K,V] +{ + def get[T](k: K[T]): Option[V[T]] = cast[T]( backing.get(k) ) + def update[T](k: K[T], v: V[T]) { backing(k) = v } + def remove[T](k: K[T]) = cast( backing.remove(k) ) + def getOrUpdate[T](k: K[T], make: => V[T]) = cast[T]( backing.getOrElseUpdate(k, make) ) + + private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] + private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] + + override def toString = backing.toString +} \ No newline at end of file diff --git a/util/collection/Param.scala b/util/collection/Param.scala new file mode 100644 index 000000000..3271465d9 --- /dev/null +++ b/util/collection/Param.scala @@ -0,0 +1,31 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +// Used to emulate ~> literals +trait Param[A[_], B[_]] +{ + type T + def in: A[T] + def ret(out: B[T]) + def ret: B[T] +} + +object Param +{ + implicit def pToT[A[_], B[_]](p: Param[A,B] => Unit): A~>B = new (A ~> B) { + def apply[s](a: A[s]): B[s] = { + val v: Param[A,B] { type T = s} = new Param[A,B] { type T = s + def in = a + 
private var r: B[T] = _ + def ret(b: B[T]) {r = b} + def ret: B[T] = r + } + p(v) + v.ret + } + } +} \ No newline at end of file diff --git a/util/collection/Rewrite.scala b/util/collection/Rewrite.scala new file mode 100644 index 000000000..40b40add4 --- /dev/null +++ b/util/collection/Rewrite.scala @@ -0,0 +1,42 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +trait Rewrite[A[_]] +{ + def apply[T](node: A[T], rewrite: Rewrite[A]): A[T] +} +object Rewrite +{ + def Id[A[_]]: Rewrite[A] = new Rewrite[A] { def apply[T](node: A[T], rewrite: Rewrite[A]) = node } + + implicit def specificF[T](f: T => T): Rewrite[Const[T]#Apply] = new Rewrite[Const[T]#Apply] { + def apply[S](node:T, rewrite: Rewrite[Const[T]#Apply]): T = f(node) + } + implicit def pToRewrite[A[_]](p: Param[A,A] => Unit): Rewrite[A] = toRewrite(Param.pToT(p)) + implicit def toRewrite[A[_]](f: A ~> A): Rewrite[A] = new Rewrite[A] { + def apply[T](node: A[T], rewrite:Rewrite[A]) = f(node) + } + def compose[A[_]](a: Rewrite[A], b: Rewrite[A]): Rewrite[A] = + new Rewrite[A] { + def apply[T](node: A[T], rewrite: Rewrite[A]) = + a(b(node, rewrite), rewrite) + } + implicit def rewriteOps[A[_]](outer: Rewrite[A]): RewriteOps[A] = + new RewriteOps[A] { + def ∙(g: A ~> A): Rewrite[A] = compose(outer, g) + def andThen(g: A ~> A): Rewrite[A] = compose(g, outer) + def ∙(g: Rewrite[A]): Rewrite[A] = compose(outer, g) + def andThen(g: Rewrite[A]): Rewrite[A] = compose(g, outer) + } + def apply[A[_], T](value: A[T])(implicit rewrite: Rewrite[A]): A[T] = rewrite(value, rewrite) +} +trait RewriteOps[A[_]] +{ + def andThen(g: A ~> A): Rewrite[A] + def ∙(g: A ~> A): Rewrite[A] + def ∙(g: Rewrite[A]): Rewrite[A] +} \ No newline at end of file diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 95fb8ae9e..00a6fc772 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -7,14 +7,26 @@ trait 
TypeFunctions { type Id[X] = X trait Const[A] { type Apply[B] = A } - trait P1of2[M[_,_], A] { type Apply[B] = M[A,B] } - trait Down[M[_]] { type Apply[B] = Id[M[B]] } + trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } + trait P1of2[M[_,_], A] { type Apply[B] = M[A,B]; type Flip[B] = M[B, A] } - trait ~>[-A[_], +B[_]] - { - def apply[T](a: A[T]): B[T] - } - def Id: Id ~> Id = - new (Id ~> Id) { def apply[T](a: T): T = a } + final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } + final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } + final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } } -object TypeFunctions extends TypeFunctions \ No newline at end of file +object TypeFunctions extends TypeFunctions + +trait ~>[-A[_], +B[_]] +{ outer => + def apply[T](a: A[T]): B[T] + // directly on ~> because of type inference limitations + final def ∙[C[_]](g: C ~> A): C ~> B = new (C ~> B) { def apply[T](c: C[T]) = outer.apply(g(c)) } + final def ∙[C,D](g: C => D)(implicit ev: D <:< A[D]): C => B[D] = i => apply(ev(g(i)) ) + final def fn[T] = (t: A[T]) => apply[T](t) +} +object ~> +{ + import TypeFunctions._ + val Id: Id ~> Id = new (Id ~> Id) { def apply[T](a: T): T = a } + implicit def tcIdEquals: (Id ~> Id) = Id +} \ No newline at end of file diff --git a/util/collection/Types.scala b/util/collection/Types.scala index de468216d..de6cf5aec 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -8,4 +8,4 @@ object Types extends TypeFunctions val :^: = MCons val :+: = HCons type :+:[H, T <: HList] = HCons[H,T] -} \ No newline at end of file +} diff --git a/util/collection/src/test/scala/LiteralTest.scala b/util/collection/src/test/scala/LiteralTest.scala new file mode 100644 index 000000000..76fffe80a --- /dev/null +++ b/util/collection/src/test/scala/LiteralTest.scala @@ -0,0 +1,17 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + 
+import Types._ + +// compilation test +object LiteralTest { + def x[A[_],B[_]](f: A ~> B) = f + + import Param._ + val f = x { (p: Param[Option,List]) => p.ret( p.in.toList ) } + + val a: List[Int] = f( Some(3) ) + val b: List[String] = f( Some("aa") ) +} \ No newline at end of file diff --git a/util/collection/src/test/scala/MListTest.scala b/util/collection/src/test/scala/MListTest.scala new file mode 100644 index 000000000..ffb86b18e --- /dev/null +++ b/util/collection/src/test/scala/MListTest.scala @@ -0,0 +1,19 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +object MTest { + val f = new (Option ~> List) { def apply[T](o: Option[T]): List[T] = o.toList } + + val x = Some(3) :^: Some("asdf") :^: MNil + val y = x map f + val m1a = y match { case List(3) :^: List("asdf") :^: MNil => println("true") } + val m1b = (List(3) :^: MNil) match { case yy :^: MNil => println("true") } + + val head = new (List ~> Id) { def apply[T](xs: List[T]): T = xs.head } + val z = y.map[Id](head).down + val m2 = z match { case 3 :+: "asdf" :+: HNil => println("true") } +} diff --git a/util/collection/src/test/scala/PMapTest.scala b/util/collection/src/test/scala/PMapTest.scala new file mode 100644 index 000000000..1ea66daaa --- /dev/null +++ b/util/collection/src/test/scala/PMapTest.scala @@ -0,0 +1,18 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +// compilation test +object PMapTest +{ + val mp = new DelegatingPMap[Some, Id](new collection.mutable.HashMap) + mp(Some("asdf")) = "a" + mp(Some(3)) = 9 + val x = Some(3) :^: Some("asdf") :^: MNil + val y = x.map[Id](mp) + val z = y.down + z match { case 9 :+: "a" :+: HNil => println("true") } +} \ No newline at end of file diff --git a/util/collection/src/test/scala/RewriteTest.scala b/util/collection/src/test/scala/RewriteTest.scala new file mode 100644 index 000000000..c2ca1b237 --- /dev/null +++ 
b/util/collection/src/test/scala/RewriteTest.scala @@ -0,0 +1,50 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +object RewriteTest +{ + // dist and add0 show the awkwardness when not just manipulating superstructure: + // would have to constrain the parameters to Term to be instances of Zero/Eq somehow + val dist: Rewrite[Term] = (p: Param[Term, Term]) => p.ret( p.in match { + case Add(Mult(a,b),Mult(c,d)) if a == c=> Mult(a, Add(b,d)) + case x => x + }) + val add0: Rewrite[Term] = (p: Param[Term, Term]) => p.ret( p.in match { + case Add(V(0), y) => y + case Add(x, V(0)) => x + case x => x + }) + val rewriteBU= new Rewrite[Term] { + def apply[T](node: Term[T], rewrite: Rewrite[Term]) = { + def r[T](node: Term[T]) = rewrite(node, rewrite) + node match { + case Add(x, y) => Add(r(x), r(y)) + case Mult(x, y) => Mult(r(x), r(y)) + case x => x + } + } + } + + val d2 = dist ∙ add0 ∙ rewriteBU + + implicit def toV(t: Int): V[Int] = V(t) + implicit def toVar(s: String): Var[Int] = Var[Int](s) + + val t1: Term[Int] = Add(Mult(3,4), Mult(4, 5)) + val t2: Term[Int] = Add(Mult(4,4), Mult(4, 5)) + val t3: Term[Int] = Add(Mult(Add("x", 0),4), Mult("x", 5)) + + println( Rewrite(t1)(d2) ) + println( Rewrite(t2)(d2) ) + println( Rewrite(t3)(d2) ) +} + +sealed trait Term[T] +final case class Add[T](a: Term[T], b: Term[T]) extends Term[T] +final case class Mult[T](a: Term[T], b: Term[T]) extends Term[T] +final case class V[T](v: T) extends Term[T] +final case class Var[T](name: String) extends Term[T] From 9e9f587be2d921c31adf991035bd99ebb757dea3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 7 Jun 2010 10:50:51 -0400 Subject: [PATCH 047/823] cache updates --- cache/Cache.scala | 10 +- cache/FileInfo.scala | 15 +- cache/src/test/scala/CacheTest.scala | 21 +-- cache/tracking/Tracked.scala | 209 ++++++++++++++++----------- 4 files changed, 146 insertions(+), 109 deletions(-) diff --git a/cache/Cache.scala b/cache/Cache.scala 
index 2f35d2c96..e7ba310dc 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -2,7 +2,6 @@ package xsbt import sbinary.{CollectionTypes, Format, JavaFormats} import java.io.File -import scala.reflect.Manifest trait Cache[I,O] { @@ -23,13 +22,6 @@ object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImpl def wrapOutputCache[O,DO](implicit convert: O => DO, reverse: DO => O, base: OutputCache[DO]): OutputCache[O] = new WrappedOutputCache[O,DO](convert, reverse, base) - def apply[I,O](file: File)(f: I => Task[O])(implicit cache: Cache[I,O]): I => Task[O] = - in => - cache(file)(in) match - { - case Left(value) => Task(value) - case Right(store) => f(in) map { out => store(out); out } - } def cached[I,O](file: File)(f: I => O)(implicit cache: Cache[I,O]): I => O = in => cache(file)(in) match @@ -61,4 +53,4 @@ trait HListCacheImplicits extends HLists implicit def hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[HCons[H,T]] = new HConsOutputCache(headCache, tailCache) implicit lazy val hNilOutputCache: OutputCache[HNil] = new HNilOutputCache -} \ No newline at end of file +} diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index 66c8c496b..d1b350fa8 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -18,8 +18,10 @@ sealed trait ModifiedFileInfo extends FileInfo { val lastModified: Long } +sealed trait PlainFileInfo extends FileInfo sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo +private final case class PlainFile(file: File) extends PlainFileInfo private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo @@ -32,8 +34,6 @@ object FileInfo implicit def apply(file: File): F implicit def unapply(info: 
F): File = info.file implicit val format: Format[F] - /*val manifest: Manifest[F] - def formatManifest: Manifest[Format[F]] = CacheIO.manifest[Format[F]]*/ import Cache._ implicit def infoInputCache: InputCache[File] = wrapInputCache[File,F] implicit def infoOutputCache: OutputCache[File] = wrapOutputCache[File,F] @@ -41,7 +41,6 @@ object FileInfo object full extends Style { type F = HashModifiedFileInfo - //val manifest: Manifest[F] = CacheIO.manifest[HashModifiedFileInfo] implicit def apply(file: File): HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), tupled(make _)) @@ -49,7 +48,6 @@ object FileInfo object hash extends Style { type F = HashFileInfo - //val manifest: Manifest[F] = CacheIO.manifest[HashFileInfo] implicit def apply(file: File): HashFileInfo = make(file, computeHash(file).toList) def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), tupled(make _)) @@ -58,11 +56,17 @@ object FileInfo object lastModified extends Style { type F = ModifiedFileInfo - //val manifest: Manifest[F] = CacheIO.manifest[ModifiedFileInfo] implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) def make(file: File, lastModified: Long): ModifiedFileInfo = FileModified(file.getAbsoluteFile, lastModified) implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), tupled(make _)) } + object exists extends Style + { + type F = PlainFileInfo + implicit def apply(file: File): PlainFileInfo = make(file) + def make(file: File): PlainFileInfo = PlainFile(file.getAbsoluteFile) + implicit val format: Format[PlainFileInfo] = wrap(_.file, make) + } } final case class FilesInfo[F 
<: FileInfo] private(files: Set[F]) extends NotNull @@ -92,4 +96,5 @@ object FilesInfo lazy val full: Style = new BasicStyle(FileInfo.full) lazy val hash: Style = new BasicStyle(FileInfo.hash) lazy val lastModified: Style = new BasicStyle(FileInfo.lastModified) + lazy val exists: Style = new BasicStyle(FileInfo.exists) } \ No newline at end of file diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index 7bba6ec79..65703ecaa 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -7,22 +7,23 @@ object CacheTest// extends Properties("Cache test") val lengthCache = new File("/tmp/length-cache") val cCache = new File("/tmp/c-cache") - import Task._ import Cache._ import FileInfo.hash._ def test { - val createTask = Task { new File("test") } + lazy val create = new File("test") - val length = (f: File) => { println("File length: " + f.length); f.length } - val cachedLength = cached(lengthCache) ( length ) + val length = cached(lengthCache) { + (f: File) => { println("File length: " + f.length); f.length } + } - val lengthTask = createTask map cachedLength + lazy val fileLength = length(create) - val c = (file: File, len: Long) => { println("File: " + file + ", length: " + len); len :: file :: HNil } - val cTask = (createTask :: lengthTask :: TNil) map cached(cCache) { case (file :: len :: HNil) => c(file, len) } - - try { TaskRunner(cTask) } - catch { case TasksFailed(failures) => failures.foreach(_.exception.printStackTrace) } + val c = cached(cCache) { (in: (File :: Long :: HNil)) => + val file :: len :: HNil = in + println("File: " + file + " (" + file.exists + "), length: " + len) + (len+1) :: file :: HNil + } + c(create :: fileLength :: HNil) } } \ No newline at end of file diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index f79a2a7ee..49d33c622 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -7,112 +7,152 @@ import 
java.io.{File,IOException} import CacheIO.{fromFile, toFile} import sbinary.Format import scala.reflect.Manifest -import Task.{iterableToBuilder, iterableToForkBuilder} +import xsbt.FileUtilities.{delete, read, write} + +/* A proper implementation of fileTask that tracks inputs and outputs properly + +def fileTask(cacheBaseDirectory: Path)(inputs: PathFinder, outputs: PathFinder)(action: => Unit): Task = + fileTask(cacheBaseDirectory, FilesInfo.hash, FilesInfo.lastModified) +def fileTask(cacheBaseDirectory: Path, inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(inputs: PathFinder, outputs: PathFinder)(action: => Unit): Task = +{ + lazy val inCache = diffInputs(base / "in-cache", inStyle)(inputs) + lazy val outCache = diffOutputs(base / "out-cache", outStyle)(outputs) + task + { + inCache { inReport => + outCache { outReport => + if(inReport.modified.isEmpty && outReport.modified.isEmpty) () else action + } + } + } +} +*/ + +object Tracked +{ + /** Creates a tracker that provides the last time it was evaluated. + * If 'useStartTime' is true, the recorded time is the start of the evaluated function. + * If 'useStartTime' is false, the recorded time is when the evaluated function completes. 
+ * In both cases, the timestamp is not updated if the function throws an exception.*/ + def tstamp(cacheFile: File, useStartTime: Boolean): Timestamp = new Timestamp(cacheFile) + /** Creates a tracker that only evaluates a function when the input has changed.*/ + def changed[O](cacheFile: File)(getValue: => O)(implicit input: InputCache[O]): Changed[O] = + new Changed[O](getValue, cacheFile) + + /** Creates a tracker that provides the difference between the set of input files provided for successive invocations.*/ + def diffInputs(cache: File, style: FilesInfo.Style)(files: => Set[File]): Difference = + Difference.inputs(files, style, cache) + /** Creates a tracker that provides the difference between the set of output files provided for successive invocations.*/ + def diffOutputs(cache: File, style: FilesInfo.Style)(files: => Set[File]): Difference = + Difference.outputs(files, style, cache) +} trait Tracked extends NotNull { - /** Cleans outputs. This operation might require information from the cache, so it should be called first if clear is also called.*/ - def clean: Task[Unit] - /** Clears the cache. 
If also cleaning, 'clean' should be called first as it might require information from the cache.*/ - def clear: Task[Unit] + /** Cleans outputs and clears the cache.*/ + def clean: Unit } -class Timestamp(val cacheFile: File) extends Tracked +class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked { - val clean = Clean(cacheFile) - def clear = Task.empty - def apply[T](f: Long => Task[T]): Task[T] = + def clean = delete(cacheFile) + /** Reads the previous timestamp, evaluates the provided function, and then updates the timestamp.*/ + def apply[T](f: Long => T): T = { - val getTimestamp = Task { readTimestamp } - getTimestamp bind f map { result => - FileUtilities.write(cacheFile, System.currentTimeMillis.toString) - result - } + val start = now() + val result = f(readTimestamp) + write(cacheFile, (if(useStartTime) start else now()).toString) + result } + private def now() = System.currentTimeMillis def readTimestamp: Long = - try { FileUtilities.read(cacheFile).toLong } + try { read(cacheFile).toLong } catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } } -object Clean -{ - def apply(src: Task[Set[File]]): Task[Unit] = src map FileUtilities.delete - def apply(srcs: File*): Task[Unit] = Task(FileUtilities.delete(srcs)) - def apply(srcs: Set[File]): Task[Unit] = Task(FileUtilities.delete(srcs)) -} -class Changed[O](val task: Task[O], val cacheFile: File)(implicit input: InputCache[O]) extends Tracked +class Changed[O](getValue: => O, val cacheFile: File)(implicit input: InputCache[O]) extends Tracked { - val clean = Clean(cacheFile) - def clear = Task.empty - def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] = - task map { value => - val cache = - try { OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) } - catch { case _: IOException => new ForceResult(input)(value) } - if(cache.uptodate) - ifUnchanged(value) - else - { - OpenResource.fileOutputStream(false)(cacheFile)(cache.update) - 
ifChanged(value) - } + def clean = delete(cacheFile) + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O2 = + { + val value = getValue + val cache = + try { OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) } + catch { case _: IOException => new ForceResult(input)(value) } + if(cache.uptodate) + ifUnchanged(value) + else + { + OpenResource.fileOutputStream(false)(cacheFile)(cache.update) + ifChanged(value) } + } } object Difference { sealed class Constructor private[Difference](defineClean: Boolean, filesAreOutputs: Boolean) extends NotNull { - def apply(filesTask: Task[Set[File]], style: FilesInfo.Style, cache: File): Difference = new Difference(filesTask, style, cache, defineClean, filesAreOutputs) - def apply(files: Set[File], style: FilesInfo.Style, cache: File): Difference = apply(Task(files), style, cache) + def apply(files: => Set[File], style: FilesInfo.Style, cache: File): Difference = new Difference(files, style, cache, defineClean, filesAreOutputs) } + /** Provides a constructor for a Difference that removes the files from the previous run on a call to 'clean' and saves the + * hash/last modified time of the files as they are after running the function. 
This means that this information must be evaluated twice: + * before and after running the function.*/ object outputs extends Constructor(true, true) + /** Provides a constructor for a Difference that does nothing on a call to 'clean' and saves the + * hash/last modified time of the files as they were prior to running the function.*/ object inputs extends Constructor(false, false) } -class Difference(val filesTask: Task[Set[File]], val style: FilesInfo.Style, val cache: File, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked +class Difference(getFiles: => Set[File], val style: FilesInfo.Style, val cache: File, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { - val clean = if(defineClean) Clean(Task(raw(cachedFilesInfo))) else Task.empty - val clear = Clean(cache) + def clean = + { + if(defineClean) delete(raw(cachedFilesInfo)) else () + clearCache() + } + private def clearCache = delete(cache) private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) - def apply[T](f: ChangeReport[File] => Task[T]): Task[T] = - filesTask bind { files => - val lastFilesInfo = cachedFilesInfo - val lastFiles = raw(lastFilesInfo) - val currentFiles = files.map(_.getAbsoluteFile) - val currentFilesInfo = style(currentFiles) + def apply[T](f: ChangeReport[File] => T): T = + { + val files = getFiles + val lastFilesInfo = cachedFilesInfo + val lastFiles = raw(lastFilesInfo) + val currentFiles = files.map(_.getAbsoluteFile) + val currentFilesInfo = style(currentFiles) - val report = new ChangeReport[File] - { - lazy val checked = currentFiles - lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. - lazy val added = checked -- lastFiles // all files included now but not previously. This is independent of whether the files exist. 
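The `removed` and `added` fields in the report above are plain set differences. A minimal standalone illustration (the file names are made up; this is not sbt's `ChangeReport` class):

```scala
// Mirrors the set algebra in ChangeReport: lastFiles stands in for the cached
// snapshot from the previous run, currentFiles for this invocation's files.
val lastFiles    = Set("A.scala", "B.scala", "C.scala")
val currentFiles = Set("B.scala", "C.scala", "D.scala")

val removed = lastFiles -- currentFiles // included previously but not now: Set(A.scala)
val added   = currentFiles -- lastFiles // included now but not previously: Set(D.scala)
// 'modified' additionally includes files whose hash/last-modified info changed,
// and 'unmodified' is currentFiles minus 'modified'.
```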
- lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added - lazy val unmodified = checked -- modified - } - - f(report) map { result => - val info = if(filesAreOutputs) style(currentFiles) else currentFilesInfo - toFile(style.formats)(info)(cache)(style.manifest) - result - } + val report = new ChangeReport[File] + { + lazy val checked = currentFiles + lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. + lazy val added = checked -- lastFiles // all files included now but not previously. This is independent of whether the files exist. + lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added + lazy val unmodified = checked -- modified } + + val result = f(report) + val info = if(filesAreOutputs) style(currentFiles) else currentFilesInfo + toFile(style.formats)(info)(cache)(style.manifest) + result + } } class DependencyTracked[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit)(implicit format: Format[T], mf: Manifest[T]) extends Tracked { private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) private def cleanAll(fs: Set[T]) = fs.foreach(cleanT) - val clean = Task(cleanAll(trackFormat.read.allProducts)) - val clear = Clean(cacheDirectory) + def clean = + { + cleanAll(trackFormat.read.allProducts) + delete(cacheDirectory) + } - def apply[R](f: UpdateTracking[T] => Task[R]): Task[R] = + def apply[R](f: UpdateTracking[T] => R): R = { val tracker = trackFormat.read - f(tracker) map { result => - trackFormat.write(tracker) - result - } + val result = f(tracker) + trackFormat.write(tracker) + result } } object InvalidateFiles @@ -179,30 +219,29 @@ class InvalidateTransitive[T](cacheDirectory: File, translateProducts: Boolean, this(cacheDirectory, translateProducts, (_: T) => ()) private val tracked = new DependencyTracked(cacheDirectory, translateProducts, cleanT) - def clean = 
tracked.clean - def clear = tracked.clear - - def apply[R](changes: ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = - apply(Task(changes))(f) - def apply[R](changesTask: Task[ChangeReport[T]])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = + def clean { - changesTask bind { changes => - tracked { tracker => - val report = InvalidateTransitive.andClean[T](tracker, _.foreach(cleanT), changes.modified) - f(report, tracker) - } + tracked.clean + tracked.clear + } + + def apply[R](getChanges: => ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => R): R = + { + val changes = getChanges + tracked { tracker => + val report = InvalidateTransitive.andClean[T](tracker, _.foreach(cleanT), changes.modified) + f(report, tracker) } } } -class BasicTracked(filesTask: Task[Set[File]], style: FilesInfo.Style, cacheDirectory: File) extends Tracked +class BasicTracked(files: => Set[File], style: FilesInfo.Style, cacheDirectory: File) extends Tracked { - private val changed = Difference.inputs(filesTask, style, new File(cacheDirectory, "files")) + private val changed = Difference.inputs(files, style, new File(cacheDirectory, "files")) private val invalidation = InvalidateFiles(new File(cacheDirectory, "invalidation")) - private def onTracked(f: Tracked => Task[Unit]) = Seq(invalidation, changed).forkTasks(f).joinIgnore - val clear = onTracked(_.clear) - val clean = onTracked(_.clean) + private def onTracked(f: Tracked => Unit) = { f(invalidation); f(changed) } + def clean = onTracked(_.clean) - def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = + def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => R): R = changed { sourceChanges => invalidation(sourceChanges) { (report, tracking) => f(sourceChanges, report, tracking) From 93492a011ce051248e2f42d91bfe851d11ef504f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 10 Jun 
2010 08:14:50 -0400 Subject: [PATCH 048/823] conversions --- util/collection/HList.scala | 6 ++++++ util/collection/MList.scala | 9 ++++++++- util/collection/TypeFunctions.scala | 7 +++++++ 3 files changed, 21 insertions(+), 1 deletion(-) diff --git a/util/collection/HList.scala b/util/collection/HList.scala index a475c1194..a1e595eeb 100644 --- a/util/collection/HList.scala +++ b/util/collection/HList.scala @@ -25,4 +25,10 @@ final case class HCons[H, T <: HList](head : H, tail : T) extends HList type Up = MCons[H, tail.Up, Id] def up = MCons[H,tail.Up, Id](head, tail.up) def :+: [G](g: G): G :+: H :+: T = HCons(g, this) +} + +object HList +{ + // contains no type information: not even A + implicit def fromList[A](list: Traversable[A]): HList = ((HNil: HList) /: list) ( (hl,v) => HCons(v, hl) ) } \ No newline at end of file diff --git a/util/collection/MList.scala b/util/collection/MList.scala index b350858c3..7adfc1568 100644 --- a/util/collection/MList.scala +++ b/util/collection/MList.scala @@ -41,4 +41,11 @@ sealed class MNil extends MList[Nothing] def toList = Nil } -object MNil extends MNil \ No newline at end of file +object MNil extends MNil + + +object MList +{ + implicit def fromTCList[A[_]](list: Traversable[A[_]]): MList[A] = ((MNil: MList[A]) /: list) ( (hl,v) => MCons(v, hl) ) + implicit def fromList[A](list: Traversable[A]): MList[Const[A]#Apply] = ((MNil: MList[Const[A]#Apply]) /: list) ( (hl,v) => MCons[A, hl.type, Const[A]#Apply](v, hl) ) +} \ No newline at end of file diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 00a6fc772..942a00d72 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -13,6 +13,10 @@ trait TypeFunctions final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } final val some = new (Id ~> Some) { def apply[T](t: T) = 
Some(t) } + + implicit def toFn1[A,B](f: A => B): Fn1[A,B] = new Fn1[A,B] { + def ∙[C](g: C => A) = f compose g + } } object TypeFunctions extends TypeFunctions @@ -29,4 +33,7 @@ object ~> import TypeFunctions._ val Id: Id ~> Id = new (Id ~> Id) { def apply[T](a: T): T = a } implicit def tcIdEquals: (Id ~> Id) = Id +} +trait Fn1[A, B] { + def ∙[C](g: C => A): C => B } \ No newline at end of file From 3033bfec4405024c39425ca9ad5eb9eb458ac24b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 10 Jun 2010 21:08:01 -0400 Subject: [PATCH 049/823] move StackTrace to util/log --- LICENSE | 2 +- util/log/NOTICE | 2 +- util/log/StackTrace.scala | 63 +++++++++++++++++++++++++++++++++++++++ 3 files changed, 65 insertions(+), 2 deletions(-) create mode 100644 util/log/StackTrace.scala diff --git a/LICENSE b/LICENSE index 7b09b8ec6..15f983d64 100644 --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (c) 2008, 2009, 2010 Mark Harrah, Jason Zaugg +Copyright (c) 2008, 2009, 2010 Mark Harrah, Tony Sloane, Jason Zaugg All rights reserved. Redistribution and use in source and binary forms, with or without diff --git a/util/log/NOTICE b/util/log/NOTICE index 1a42b7a31..2455dad65 100644 --- a/util/log/NOTICE +++ b/util/log/NOTICE @@ -1,3 +1,3 @@ Simple Build Tool: Logging Component -Copyright 2008, 2009 Mark Harrah +Copyright 2008, 2009, 2010 Mark Harrah, Tony Sloane Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/util/log/StackTrace.scala b/util/log/StackTrace.scala new file mode 100644 index 000000000..1ecd6e8bf --- /dev/null +++ b/util/log/StackTrace.scala @@ -0,0 +1,63 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Tony Sloane + */ +package sbt + +object StackTrace +{ + def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt") + /** + * Return a printable representation of the stack trace associated + * with t. Information about t and its Throwable causes is included. 
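The heart of the `trimmed` function in this patch is an `include` predicate that either cuts at the first sbt frame (when `d == 0`) or counts down a line budget (when `d > 0`). A standalone sketch of just that predicate, with the illustrative name `includeFilter`:

```scala
// For d == 0, keep frames until the first sbt/xsbt class; for d > 0, keep at
// most d - 1 frames, since one line of the budget is spent on the Throwable
// header itself. The d > 0 branch is a stateful closure that counts down.
def includeFilter(d: Int, isSbtClass: String => Boolean): StackTraceElement => Boolean =
  if (d == 0) element => !isSbtClass(element.getClassName)
  else {
    var count = d - 1
    _ => { count -= 1; count >= 0 }
  }
```

Because the `d > 0` branch captures a mutable counter, each call to `includeFilter` produces a fresh predicate; reusing one across Throwables would share the budget.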
+ * The number of lines to be included for each Throwable is configured + * via d which should be greater than or equal to zero. If d is zero, + * then all elements are included up to (but not including) the first + * element that comes from sbt. If d is greater than zero, then up to + * that many lines are included, where the line for the Throwable is + * counted plus one line for each stack element. Fewer lines will be + * included if there are not enough stack elements. + */ + def trimmed(t : Throwable, d : Int) : String = { + require(d >= 0) + val b = new StringBuilder () + + def appendStackTrace (t : Throwable, first : Boolean) { + + val include : StackTraceElement => Boolean = + if (d == 0) + element => !isSbtClass(element.getClassName) + else { + var count = d - 1 + (_ => { count -= 1; count >= 0 }) + } + + def appendElement (e : StackTraceElement) { + b.append ("\tat ") + b.append (e) + b.append ('\n') + } + + if (!first) + b.append ("Caused by: ") + b.append (t) + b.append ('\n') + + val els = t.getStackTrace () + var i = 0 + while ((i < els.size) && include (els (i))) { + appendElement (els (i)) + i += 1 + } + + } + + appendStackTrace (t, true) + var c = t + while (c.getCause () != null) { + c = c.getCause () + appendStackTrace (c, false) + } + b.toString () + + } +} \ No newline at end of file From 1584f01de8088f529eeee0eff68572c8bcebab31 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 10 Jun 2010 21:25:37 -0400 Subject: [PATCH 050/823] wideConvert lets the serious errors pass through, use it in Execute --- util/control/ErrorHandling.scala | 11 +++++++++-- 1 file changed, 9 insertions(+), 2 deletions(-) diff --git a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala index 98c1b1a73..f4d42993b 100644 --- a/util/control/ErrorHandling.scala +++ b/util/control/ErrorHandling.scala @@ -7,10 +7,17 @@ object ErrorHandling { def translate[T](msg: => String)(f: => T) = try { f } - catch { case e => throw new TranslatedException(msg + 
e.toString, e) } + catch { case e: Exception => throw new TranslatedException(msg + e.toString, e) } + def wideConvert[T](f: => T): Either[Throwable, T] = try { Right(f) } - catch { case e => Left(e) } // TODO: restrict type of e + catch + { + case ex @ (_: Exception | _: StackOverflowError) => Left(ex) + case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err + case x => Left(x) + } + def convert[T](f: => T): Either[Exception, T] = try { Right(f) } catch { case e: Exception => Left(e) } From e02adb06943678a51b5cc98967ac0642801f951b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 10 Jun 2010 21:26:27 -0400 Subject: [PATCH 051/823] first round of logger cleanup/migration --- util/log/BasicLogger.scala | 14 ++++++------ util/log/BufferedLogger.scala | 15 +++++++------ util/log/ConsoleLogger.scala | 17 +++++++++------ util/log/FilterLogger.scala | 35 ++++++++++++++++++++++++++++++ util/log/Level.scala | 10 ++++----- util/log/LogEvent.scala | 4 ++-- util/log/Logger.scala | 14 +++++++----- util/log/LoggerWriter.scala | 40 +++++++++++++++++++++++++++++++++++ util/log/MultiLogger.scala | 27 +++++++++++++++++++++++ 9 files changed, 142 insertions(+), 34 deletions(-) create mode 100644 util/log/FilterLogger.scala create mode 100644 util/log/LoggerWriter.scala create mode 100644 util/log/MultiLogger.scala diff --git a/util/log/BasicLogger.scala b/util/log/BasicLogger.scala index 23607c8ed..a52d3b433 100644 --- a/util/log/BasicLogger.scala +++ b/util/log/BasicLogger.scala @@ -1,15 +1,15 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah + * Copyright 2008, 2009, 2010 Mark Harrah */ - package xsbt +package sbt /** Implements the level-setting methods of Logger.*/ -abstract class BasicLogger extends Logger +abstract class BasicLogger extends AbstractLogger { - private var traceEnabledVar = true + private var traceEnabledVar = java.lang.Integer.MAX_VALUE private var level: Level.Value = Level.Info def getLevel = level def setLevel(newLevel: 
Level.Value) { level = newLevel } - def enableTrace(flag: Boolean) { traceEnabledVar = flag } - def traceEnabled = traceEnabledVar -} + def setTrace(level: Int) { traceEnabledVar = level } + def getTrace = traceEnabledVar +} \ No newline at end of file diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 7e8348f2d..1a72e022d 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -1,8 +1,9 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah + * Copyright 2008, 2009, 2010 Mark Harrah */ package xsbt + import sbt.{AbstractLogger, ControlEvent, Level, Log, LogEvent, SetLevel, SetTrace, Success, Trace} import scala.collection.mutable.ListBuffer /** A logger that can buffer the logging done on it and then can flush the buffer @@ -13,7 +14,7 @@ * * This class assumes that it is the only client of the delegate logger. * */ -class BufferedLogger(delegate: Logger) extends Logger +class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger { private[this] val buffer = new ListBuffer[LogEvent] private[this] var recording = false @@ -54,10 +55,10 @@ class BufferedLogger(delegate: Logger) extends Logger } def getLevel = delegate.getLevel def traceEnabled = delegate.traceEnabled - def enableTrace(flag: Boolean) + def setTrace(level: Int) { - buffer += new SetTrace(flag) - delegate.enableTrace(flag) + buffer += new SetTrace(level) + delegate.setTrace(level) } def trace(t: => Throwable): Unit = @@ -73,9 +74,9 @@ class BufferedLogger(delegate: Logger) extends Logger delegate.logAll(events) def control(event: ControlEvent.Value, message: => String): Unit = doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message)) - private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, doUnbuffered: Logger => Unit): Unit = + private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = 
doBufferableIf(atLevel(level), appendIfBuffered, doUnbuffered) - private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: Logger => Unit): Unit = + private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = if(condition) { if(recording) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index a7217f026..49db31b66 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah + * Copyright 2008, 2009, 2010 Mark Harrah */ - package xsbt +package sbt object ConsoleLogger { @@ -17,10 +17,12 @@ object ConsoleLogger } /** A logger that logs to the console. On supported systems, the level labels are -* colored. */ +* colored. +* +* This logger is not thread-safe.*/ class ConsoleLogger extends BasicLogger { - import ConsoleLogger.formatEnabled + override def ansiCodesSupported = ConsoleLogger.formatEnabled def messageColor(level: Level.Value) = Console.RESET def labelColor(level: Level.Value) = level match @@ -39,8 +41,9 @@ class ConsoleLogger extends BasicLogger def trace(t: => Throwable): Unit = System.out.synchronized { - if(traceEnabled) - t.printStackTrace + val traceLevel = getTrace + if(traceLevel >= 0) + System.out.synchronized { System.out.print(StackTrace.trimmed(t, traceLevel)) } } def log(level: Level.Value, message: => String) { @@ -49,7 +52,7 @@ class ConsoleLogger extends BasicLogger } private def setColor(color: String) { - if(formatEnabled) + if(ansiCodesSupported) System.out.synchronized { System.out.print(color) } } private def log(labelColor: String, label: String, messageColor: String, message: String): Unit = diff --git a/util/log/FilterLogger.scala b/util/log/FilterLogger.scala new file mode 100644 index 000000000..152f6bdd5 --- /dev/null +++ b/util/log/FilterLogger.scala @@ -0,0 +1,35 @@ +/* sbt -- Simple Build Tool + * Copyright 
2008, 2009, 2010 Mark Harrah + */ +package sbt + +/** A filter logger is used to delegate messages but not the logging level to another logger. This means +* that messages are logged at the higher of the two levels set by this logger and its delegate. +* */ +class FilterLogger(delegate: AbstractLogger) extends BasicLogger +{ + override lazy val ansiCodesSupported = delegate.ansiCodesSupported + def trace(t: => Throwable) + { + if(traceEnabled) + delegate.trace(t) + } + override def setTrace(level: Int) { delegate.setTrace(level) } + override def getTrace = delegate.getTrace + def log(level: Level.Value, message: => String) + { + if(atLevel(level)) + delegate.log(level, message) + } + def success(message: => String) + { + if(atLevel(Level.Info)) + delegate.success(message) + } + def control(event: ControlEvent.Value, message: => String) + { + if(atLevel(Level.Info)) + delegate.control(event, message) + } + def logAll(events: Seq[LogEvent]): Unit = delegate.logAll(events) +} diff --git a/util/log/Level.scala b/util/log/Level.scala index 86abc257d..ad4e51759 100644 --- a/util/log/Level.scala +++ b/util/log/Level.scala @@ -1,11 +1,11 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ - package xsbt + package sbt /** An enumeration defining the levels available for logging. A level includes all of the levels * with id larger than its own id. For example, Warn (id=3) includes Error (id=4).*/ -object Level extends Enumeration with NotNull +object Level extends Enumeration { val Debug = Value(1, "debug") val Info = Value(2, "info") @@ -16,10 +16,8 @@ object Level extends Enumeration with NotNull * label is also defined here. */ val SuccessLabel = "success" - // added because elements was renamed to iterator in 2.8.0 nightly - def levels = Debug :: Info :: Warn :: Error :: Nil /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. 
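The `apply` change in this hunk replaces the hand-maintained `levels` list with Scala 2.8's `Enumeration.values`. The lookup pattern can be sketched in isolation; `Severity` is an illustrative stand-in for sbt's `Level`:

```scala
// Minimal Enumeration mirroring Level: apply returns Some(level) for a known
// name and None otherwise — the values.find pattern from the diff.
object Severity extends Enumeration {
  val Debug = Value(1, "debug")
  val Info  = Value(2, "info")
  val Warn  = Value(3, "warn")
  val Error = Value(4, "error")
  def apply(s: String): Option[Value] = values.find(s == _.toString)
}
// Severity("warn") -> Some(Warn); Severity("fatal") -> None
```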
*/ - def apply(s: String) = levels.find(s == _.toString) + def apply(s: String) = values.find(s == _.toString) /** Same as apply, defined for use in pattern matching. */ - private[xsbt] def unapply(s: String) = apply(s) + private[sbt] def unapply(s: String) = apply(s) } \ No newline at end of file diff --git a/util/log/LogEvent.scala b/util/log/LogEvent.scala index 19a5b24db..ffe6049d7 100644 --- a/util/log/LogEvent.scala +++ b/util/log/LogEvent.scala @@ -1,14 +1,14 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ - package xsbt + package sbt sealed trait LogEvent extends NotNull final class Success(val msg: String) extends LogEvent final class Log(val level: Level.Value, val msg: String) extends LogEvent final class Trace(val exception: Throwable) extends LogEvent final class SetLevel(val newLevel: Level.Value) extends LogEvent -final class SetTrace(val enabled: Boolean) extends LogEvent +final class SetTrace(val level: Int) extends LogEvent final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends LogEvent object ControlEvent extends Enumeration diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 153596f6f..1be353c4b 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -1,18 +1,22 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ - package xsbt + package sbt import xsbti.{Logger => xLogger, F0} -abstract class Logger extends xLogger with NotNull + +abstract class AbstractLogger extends xLogger with NotNull { def getLevel: Level.Value def setLevel(newLevel: Level.Value) - def enableTrace(flag: Boolean) - def traceEnabled: Boolean + def setTrace(flag: Int) + def getTrace: Int + final def traceEnabled = getTrace >= 0 + def ansiCodesSupported = false def atLevel(level: Level.Value) = level.id >= getLevel.id def trace(t: => Throwable): Unit + final def verbose(message: => String): Unit = debug(message) final def debug(message: => String): Unit = log(Level.Debug, message) final def 
info(message: => String): Unit = log(Level.Info, message) final def warn(message: => String): Unit = log(Level.Warn, message) @@ -31,7 +35,7 @@ abstract class Logger extends xLogger with NotNull case l: Log => log(l.level, l.msg) case t: Trace => trace(t.exception) case setL: SetLevel => setLevel(setL.newLevel) - case setT: SetTrace => enableTrace(setT.enabled) + case setT: SetTrace => setTrace(setT.level) case c: ControlEvent => control(c.event, c.msg) } } diff --git a/util/log/LoggerWriter.scala b/util/log/LoggerWriter.scala new file mode 100644 index 000000000..b3b2d54d3 --- /dev/null +++ b/util/log/LoggerWriter.scala @@ -0,0 +1,40 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 Mark Harrah + */ +package sbt + +/** Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. +* A line is delimited by `nl`, which is by default the platform line separator.*/ +class LoggerWriter(delegate: AbstractLogger, level: Level.Value, nl: String) extends java.io.Writer +{ + def this(delegate: AbstractLogger, level: Level.Value) = this(delegate, level, System.getProperty("line.separator")) + + private[this] val buffer = new StringBuilder + + override def close() = flush() + override def flush(): Unit = + synchronized { + if(buffer.length > 0) + { + log(buffer.toString) + buffer.clear() + } + } + override def write(content: Array[Char], offset: Int, length: Int): Unit = + synchronized { + buffer.append(content, offset, length) + process() + } + + private[this] def process() + { + val i = buffer.indexOf(nl) + if(i >= 0) + { + log(buffer.substring(0, i)) + buffer.delete(0, i + nl.length) + process() + } + } + private[this] def log(s: String): Unit = delegate.log(level, s) +} \ No newline at end of file diff --git a/util/log/MultiLogger.scala b/util/log/MultiLogger.scala new file mode 100644 index 000000000..800d170ed --- /dev/null +++ b/util/log/MultiLogger.scala @@ -0,0 +1,27 @@ + +/* sbt -- Simple Build Tool + * Copyright 
2008, 2009, 2010 Mark Harrah + */ +package sbt + + +class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger +{ + override lazy val ansiCodesSupported = delegates.forall(_.ansiCodesSupported) + override def setLevel(newLevel: Level.Value) + { + super.setLevel(newLevel) + dispatch(new SetLevel(newLevel)) + } + override def setTrace(level: Int) + { + super.setTrace(level) + dispatch(new SetTrace(level)) + } + def trace(t: => Throwable) { dispatch(new Trace(t)) } + def log(level: Level.Value, message: => String) { dispatch(new Log(level, message)) } + def success(message: => String) { dispatch(new Success(message)) } + def logAll(events: Seq[LogEvent]) { delegates.foreach(_.logAll(events)) } + def control(event: ControlEvent.Value, message: => String) { delegates.foreach(_.control(event, message)) } + private def dispatch(event: LogEvent) { delegates.foreach(_.log(event)) } +} \ No newline at end of file From b54b8fb348d8bdd641c2b2efdcec1eb1f5986ee3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 10 Jun 2010 22:47:04 -0400 Subject: [PATCH 052/823] more fixes --- util/log/BufferedLogger.scala | 2 +- util/log/LoggerWriter.scala | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 1a72e022d..689d21b14 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -54,7 +54,7 @@ class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger delegate.setLevel(newLevel) } def getLevel = delegate.getLevel - def traceEnabled = delegate.traceEnabled + def getTrace = delegate.getTrace def setTrace(level: Int) { buffer += new SetTrace(level) diff --git a/util/log/LoggerWriter.scala b/util/log/LoggerWriter.scala index b3b2d54d3..885646973 100644 --- a/util/log/LoggerWriter.scala +++ b/util/log/LoggerWriter.scala @@ -22,7 +22,7 @@ class LoggerWriter(delegate: AbstractLogger, level: Level.Value, nl: String) ext } override def write(content: Array[Char], 
offset: Int, length: Int): Unit = synchronized { - buffer.append(content, offset, length) + buffer.appendAll(content, offset, length) process() } From 64b19286ee2953ce7e6ffec882e240b514af6440 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 13 Jun 2010 22:59:29 -0400 Subject: [PATCH 053/823] more reorganization, mostly IO. Also, move class file analyzer and history code to separate projects --- cache/Cache.scala | 12 +- cache/CacheIO.scala | 7 +- cache/FileInfo.scala | 16 +- cache/HListCache.scala | 17 +- cache/NoCache.scala | 5 +- cache/SeparatedCache.scala | 11 +- cache/src/test/scala/CacheTest.scala | 11 +- cache/tracking/ChangeReport.scala | 2 +- cache/tracking/DependencyTracking.scala | 13 +- cache/tracking/Tracked.scala | 14 +- cache/tracking/TrackingFormat.scala | 26 +-- .../src/main/java/xsbti/AnalysisCallback.java | 6 + interface/src/test/scala/TestCallback.scala | 2 + util/complete/History.scala | 49 ++++++ util/complete/HistoryCommands.scala | 84 +++++++++ util/control/ErrorHandling.scala | 4 +- util/log/src/test/scala/LogWriterTest.scala | 160 ++++++++++++++++++ 17 files changed, 382 insertions(+), 57 deletions(-) create mode 100644 util/complete/History.scala create mode 100644 util/complete/HistoryCommands.scala create mode 100644 util/log/src/test/scala/LogWriterTest.scala diff --git a/cache/Cache.scala b/cache/Cache.scala index e7ba310dc..c638e94f0 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -1,7 +1,11 @@ -package xsbt +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt import sbinary.{CollectionTypes, Format, JavaFormats} import java.io.File +import Types.:+: trait Cache[I,O] { @@ -44,13 +48,13 @@ trait BasicCacheImplicits extends NotNull new SeparatedCache(input, output) implicit def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } } -trait HListCacheImplicits extends HLists +trait HListCacheImplicits { - implicit def hConsInputCache[H,T<:HList](implicit headCache: 
InputCache[H], tailCache: InputCache[T]): InputCache[HCons[H,T]] = + implicit def hConsInputCache[H,T<:HList](implicit headCache: InputCache[H], tailCache: InputCache[T]): InputCache[H :+: T] = new HConsInputCache(headCache, tailCache) implicit lazy val hNilInputCache: InputCache[HNil] = new HNilInputCache - implicit def hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[HCons[H,T]] = + implicit def hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[H :+: T] = new HConsOutputCache(headCache, tailCache) implicit lazy val hNilOutputCache: OutputCache[HNil] = new HNilOutputCache } diff --git a/cache/CacheIO.scala b/cache/CacheIO.scala index e5c643c6a..7ff1eb519 100644 --- a/cache/CacheIO.scala +++ b/cache/CacheIO.scala @@ -1,4 +1,7 @@ -package xsbt +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt import java.io.{File, FileNotFoundException} import sbinary.{DefaultProtocol, Format, Operations} @@ -24,7 +27,7 @@ object CacheIO toFile(value)(file)(format, mf) def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = { - FileUtilities.createDirectory(file.getParentFile) + IO.createDirectory(file.getParentFile) Operations.toFile(value)(file)(stampedFormat(format)) } def stampedFormat[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Format[T] = diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index d1b350fa8..425a8598d 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -1,9 +1,11 @@ -package xsbt +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt import java.io.{File, IOException} import sbinary.{DefaultProtocol, Format} import DefaultProtocol._ -import Function.tupled import scala.reflect.Manifest sealed trait FileInfo extends NotNull @@ -43,22 +45,22 @@ object FileInfo type F = HashModifiedFileInfo implicit def apply(file: File): 
HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) - implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), tupled(make _)) + implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), (make _).tupled) } object hash extends Style { type F = HashFileInfo - implicit def apply(file: File): HashFileInfo = make(file, computeHash(file).toList) + implicit def apply(file: File): HashFileInfo = make(file, computeHash(file)) def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) - implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), tupled(make _)) - private def computeHash(file: File) = try { Hash(file) } catch { case e: Exception => Nil } + implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), (make _).tupled) + private def computeHash(file: File): List[Byte] = try { Hash(file).toList } catch { case e: Exception => Nil } } object lastModified extends Style { type F = ModifiedFileInfo implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) def make(file: File, lastModified: Long): ModifiedFileInfo = FileModified(file.getAbsoluteFile, lastModified) - implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), tupled(make _)) + implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), (make _).tupled) } object exists extends Style { diff --git a/cache/HListCache.scala b/cache/HListCache.scala index 90f00f47d..2bb3def3b 100644 --- a/cache/HListCache.scala +++ b/cache/HListCache.scala @@ -1,12 +1,15 @@ -package xsbt +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt import java.io.{InputStream,OutputStream} -import HLists._ +import Types._ class HNilInputCache extends 
NoInputCache[HNil] -class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: InputCache[T]) extends InputCache[HCons[H,T]] +class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: InputCache[T]) extends InputCache[H :+: T] { - def uptodate(in: HCons[H,T])(cacheStream: InputStream) = + def uptodate(in: H :+: T)(cacheStream: InputStream) = { val headResult = headCache.uptodate(in.head)(cacheStream) val tailResult = tailCache.uptodate(in.tail)(cacheStream) @@ -20,7 +23,7 @@ class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: } } } - def force(in: HCons[H,T])(cacheStream: OutputStream) = + def force(in: H :+: T)(cacheStream: OutputStream) = { headCache.force(in.head)(cacheStream) tailCache.force(in.tail)(cacheStream) @@ -28,7 +31,7 @@ class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: } class HNilOutputCache extends NoOutputCache[HNil](HNil) -class HConsOutputCache[H,T <: HList](val headCache: OutputCache[H], val tailCache: OutputCache[T]) extends OutputCache[HCons[H,T]] +class HConsOutputCache[H,T <: HList](val headCache: OutputCache[H], val tailCache: OutputCache[T]) extends OutputCache[H :+: T] { def loadCached(cacheStream: InputStream) = { @@ -36,7 +39,7 @@ class HConsOutputCache[H,T <: HList](val headCache: OutputCache[H], val tailCach val tail = tailCache.loadCached(cacheStream) HCons(head, tail) } - def update(out: HCons[H,T])(cacheStream: OutputStream) + def update(out: H :+: T)(cacheStream: OutputStream) { headCache.update(out.head)(cacheStream) tailCache.update(out.tail)(cacheStream) diff --git a/cache/NoCache.scala b/cache/NoCache.scala index a9cce3e99..bdb9c4f1b 100644 --- a/cache/NoCache.scala +++ b/cache/NoCache.scala @@ -1,4 +1,7 @@ -package xsbt +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt import java.io.{InputStream,OutputStream} diff --git a/cache/SeparatedCache.scala b/cache/SeparatedCache.scala index 
91ecda713..6509e05bf 100644 --- a/cache/SeparatedCache.scala +++ b/cache/SeparatedCache.scala @@ -1,4 +1,7 @@ -package xsbt +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt import sbinary.Format import sbinary.JavaIO._ @@ -31,11 +34,11 @@ class SeparatedCache[I,O](input: InputCache[I], output: OutputCache[O]) extends catch { case _: Exception => Right(update(file)(in)) } protected def applyImpl(file: File, in: I) = { - OpenResource.fileInputStream(file) { stream => + Using.fileInputStream(file) { stream => val cache = input.uptodate(in)(stream) lazy val doUpdate = (result: O) => { - OpenResource.fileOutputStream(false)(file) { stream => + Using.fileOutputStream(false)(file) { stream => cache.update(stream) output.update(result)(stream) } @@ -49,7 +52,7 @@ class SeparatedCache[I,O](input: InputCache[I], output: OutputCache[O]) extends } protected def update(file: File)(in: I)(out: O) { - OpenResource.fileOutputStream(false)(file) { stream => + Using.fileOutputStream(false)(file) { stream => input.force(in)(stream) output.update(out)(stream) } diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index 65703ecaa..ad6085fc1 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -1,6 +1,7 @@ -package xsbt +package sbt import java.io.File +import Types.:+: object CacheTest// extends Properties("Cache test") { @@ -19,11 +20,11 @@ object CacheTest// extends Properties("Cache test") lazy val fileLength = length(create) - val c = cached(cCache) { (in: (File :: Long :: HNil)) => - val file :: len :: HNil = in + val c = cached(cCache) { (in: (File :+: Long :+: HNil)) => + val file :+: len :+: HNil = in println("File: " + file + " (" + file.exists + "), length: " + len) - (len+1) :: file :: HNil + (len+1) :+: file :+: HNil } - c(create :: fileLength :: HNil) + c(create :+: fileLength :+: HNil) } } \ No newline at end of file diff --git a/cache/tracking/ChangeReport.scala 
b/cache/tracking/ChangeReport.scala index c8f3a52eb..d25b1bbaa 100644 --- a/cache/tracking/ChangeReport.scala +++ b/cache/tracking/ChangeReport.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package xsbt +package sbt object ChangeReport { diff --git a/cache/tracking/DependencyTracking.scala b/cache/tracking/DependencyTracking.scala index e34930f5a..30e060af3 100644 --- a/cache/tracking/DependencyTracking.scala +++ b/cache/tracking/DependencyTracking.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package xsbt +package sbt private object DependencyTracking { @@ -24,7 +24,6 @@ trait UpdateTracking[T] extends NotNull // removes sources as keys/values in source, product maps and as values in reverseDependencies map def pending(sources: Iterable[T]): Unit } -import scala.collection.Set trait ReadTracking[T] extends NotNull { def isProduct(file: T): Boolean @@ -75,13 +74,13 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends def isUsed(file: T): Boolean = exists(reverseUses, file) - final def allProducts = Set() ++ sourceMap.keys - final def allSources = Set() ++ productMap.keys - final def allUsed = Set() ++ reverseUses.keys + final def allProducts = sourceMap.keysIterator.toSet + final def allSources = productMap.keysIterator.toSet + final def allUsed = reverseUses.keysIterator.toSet final def allTags = tagMap.toSeq private def exists(map: DMap[T], value: T): Boolean = map.contains(value) - private def get(map: DMap[T], value: T): Set[T] = map.getOrElse(value, Set.empty[T]) + private def get(map: DMap[T], value: T): Set[T] = map.getOrElse[collection.Set[T]](value, Set.empty[T]).toSet final def dependency(sourceFile: T, dependsOn: T) { @@ -89,7 +88,7 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends if(!translateProducts) Seq(dependsOn) else - sourceMap.getOrElse(dependsOn, Seq(dependsOn)) + 
sourceMap.getOrElse[Iterable[T]](dependsOn, Seq(dependsOn)) actualDependencies.foreach { actualDependency => reverseDependencies.add(actualDependency, sourceFile) } } final def product(sourceFile: T, product: T) diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 49d33c622..77c7447f5 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -1,13 +1,13 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package xsbt +package sbt import java.io.{File,IOException} import CacheIO.{fromFile, toFile} import sbinary.Format import scala.reflect.Manifest -import xsbt.FileUtilities.{delete, read, write} +import IO.{delete, read, write} /* A proper implementation of fileTask that tracks inputs and outputs properly @@ -34,7 +34,7 @@ object Tracked * If 'useStartTime' is true, the recorded time is the start of the evaluated function. * If 'useStartTime' is false, the recorded time is when the evaluated function completes. * In both cases, the timestamp is not updated if the function throws an exception.*/ - def tstamp(cacheFile: File, useStartTime: Boolean): Timestamp = new Timestamp(cacheFile) + def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) /** Creates a tracker that only evaluates a function when the input has changed.*/ def changed[O](cacheFile: File)(getValue: => O)(implicit input: InputCache[O]): Changed[O] = new Changed[O](getValue, cacheFile) @@ -76,13 +76,13 @@ class Changed[O](getValue: => O, val cacheFile: File)(implicit input: InputCache { val value = getValue val cache = - try { OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) } + try { Using.fileInputStream(cacheFile)(input.uptodate(value)) } catch { case _: IOException => new ForceResult(input)(value) } if(cache.uptodate) ifUnchanged(value) else { - OpenResource.fileOutputStream(false)(cacheFile)(cache.update) + Using.fileOutputStream(false)(cacheFile)(cache.update) 
ifChanged(value) } } @@ -108,7 +108,7 @@ class Difference(getFiles: => Set[File], val style: FilesInfo.Style, val cache: if(defineClean) delete(raw(cachedFilesInfo)) else () clearCache() } - private def clearCache = delete(cache) + private def clearCache() = delete(cache) private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) @@ -161,7 +161,7 @@ object InvalidateFiles def apply(cacheDirectory: File, translateProducts: Boolean): InvalidateTransitive[File] = { import sbinary.DefaultProtocol.FileFormat - new InvalidateTransitive[File](cacheDirectory, translateProducts, FileUtilities.delete) + new InvalidateTransitive[File](cacheDirectory, translateProducts, IO.delete) } } diff --git a/cache/tracking/TrackingFormat.scala b/cache/tracking/TrackingFormat.scala index 1318c6096..d8a5e0f2c 100644 --- a/cache/tracking/TrackingFormat.scala +++ b/cache/tracking/TrackingFormat.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package xsbt +package sbt import java.io.File import scala.collection.mutable.{HashMap, Map, MultiMap, Set} @@ -34,16 +34,22 @@ private class TrackingFormat[T](directory: File, translateProducts: Boolean)(imp } private object TrackingFormat { - implicit def mutableMapFormat[S, T](implicit binS : Format[S], binT : Format[T]) : Format[Map[S, T]] = - viaArray( (x : Array[(S, T)]) => Map(x :_*)); - implicit def depMapFormat[T](implicit bin: Format[T]) : Format[DMap[T]] = - { - viaArray { (x : Array[(T, Set[T])]) => - val map = newMap[T] - map ++= x - map + implicit def mutableMapFormat[S, T](implicit binS : Format[S], binT : Format[T]) : Format[HashMap[S, T]] = + new LengthEncoded[HashMap[S, T], (S, T)] { + def build(size : Int, ts : Iterator[(S, T)]) : HashMap[S, T] = { + val b = new HashMap[S, T] + b ++= ts + b + } + } + implicit def depMapFormat[T](implicit bin: Format[T]) : Format[DMap[T]] = + new 
LengthEncoded[DMap[T], (T, Set[T])] { + def build(size : Int, ts : Iterator[(T, Set[T])]) : DMap[T] = { + val b = newMap[T] + b ++= ts + b + } } - } def trackingFormat[T](translateProducts: Boolean)(implicit tFormat: Format[T]): Format[DependencyTracking[T]] = asProduct4((a: DMap[T],b: DMap[T],c: DMap[T], d:TagMap[T]) => new DefaultTracking(translateProducts)(a,b,c,d) : DependencyTracking[T] )(dt => (dt.reverseDependencies, dt.reverseUses, dt.sourceMap, dt.tagMap)) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 4db5f28c3..03c4798c9 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -31,6 +31,12 @@ public interface AnalysisCallback /** Called to indicate that the source file source depends on the class file * clazz.*/ public void classDependency(File clazz, File source); + /** Called to indicate that the source file sourcePath depends on the class file + * classFile that is a product of some source. This differs from classDependency + * because it is really a sourceDependency. The source corresponding to classFile + * was not included in the compilation, so the plugin doesn't know what the source is, though.
It + * only knows that the class file came from the output directory.*/ + public void productDependency(File classFile, File sourcePath); /** Called to indicate that the source file source produces a class file at * module.*/ public void generatedClass(File source, File module); diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 8e7014af7..75e8d77af 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -13,6 +13,7 @@ class TestCallback(val superclassNames: Array[String], val annotationNames: Arra val sourceDependencies = new ArrayBuffer[(File, File)] val jarDependencies = new ArrayBuffer[(File, File)] val classDependencies = new ArrayBuffer[(File, File)] + val productDependencies = new ArrayBuffer[(File, File)] val products = new ArrayBuffer[(File, File)] val applications = new ArrayBuffer[(File, String)] @@ -25,6 +26,7 @@ class TestCallback(val superclassNames: Array[String], val annotationNames: Arra def sourceDependency(dependsOn: File, source: File) { sourceDependencies += ((dependsOn, source)) } def jarDependency(jar: File, source: File) { jarDependencies += ((jar, source)) } def classDependency(clazz: File, source: File) { classDependencies += ((clazz, source)) } + def productDependency(clazz: File, source: File) { productDependencies += ((clazz, source)) } def generatedClass(source: File, module: File) { products += ((source, module)) } def endSource(source: File) { endedSources += source } def foundApplication(source: File, className: String) { applications += ((source, className)) } diff --git a/util/complete/History.scala b/util/complete/History.scala new file mode 100644 index 000000000..bf009f626 --- /dev/null +++ b/util/complete/History.scala @@ -0,0 +1,49 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt +package complete + +import History.number + +final class History private(lines: IndexedSeq[String], 
error: (=> String) => Unit) extends NotNull +{ + private def reversed = lines.reverse + + def all: Seq[String] = lines + def size = lines.length + def !! : Option[String] = !- (1) + def apply(i: Int): Option[String] = if(0 <= i && i < size) Some( lines(i) ) else { error("Invalid history index: " + i); None } + def !(i: Int): Option[String] = apply(i) + + def !(s: String): Option[String] = + number(s) match + { + case Some(n) => if(n < 0) !- (-n) else apply(n) + case None => nonEmpty(s) { reversed.find(_.startsWith(s)) } + } + def !- (n: Int): Option[String] = apply(size - n - 1) + + def !?(s: String): Option[String] = nonEmpty(s) { reversed.drop(1).find(_.contains(s)) } + + private def nonEmpty[T](s: String)(act: => Option[T]): Option[T] = + if(s.isEmpty) + { + error("No action specified to history command") + None + } + else + act + + def list(historySize: Int, show: Int): Seq[String] = + lines.toList.drop((lines.size - historySize) max 0).zipWithIndex.map { case (line, number) => " " + number + " " + line }.takeRight(show max 1) +} + +object History +{ + def apply(lines: Seq[String], error: (=> String) => Unit): History = new History(lines.toIndexedSeq, error) + + def number(s: String): Option[Int] = + try { Some(s.toInt) } + catch { case e: NumberFormatException => None } +} \ No newline at end of file diff --git a/util/complete/HistoryCommands.scala b/util/complete/HistoryCommands.scala new file mode 100644 index 000000000..a03c9cfca --- /dev/null +++ b/util/complete/HistoryCommands.scala @@ -0,0 +1,84 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt +package complete + +object HistoryCommands +{ + val Start = "!" + // second characters + val Contains = "?" + val Last = "!" 
+ val ListCommands = ":" + + def ContainsFull = h(Contains) + def LastFull = h(Last) + def ListFull = h(ListCommands) + + def ListN = ListFull + "n" + def ContainsString = ContainsFull + "string" + def StartsWithString = Start + "string" + def Previous = Start + "-n" + def Nth = Start + "n" + + private def h(s: String) = Start + s + def plainCommands = Seq(ListFull, Start, LastFull, ContainsFull) + + def descriptions = Seq( + LastFull -> "Execute the last command again", + ListFull -> "Show all previous commands", + ListN -> "Show the last n commands", + Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), + Previous -> "Execute the nth command before this one", + StartsWithString -> "Execute the most recent command starting with 'string'", + ContainsString -> "Execute the most recent command containing 'string'" + ) + def helpString = "History commands:\n " + (descriptions.map{ case (c,d) => c + " " + d}).mkString("\n ") + def printHelp(): Unit = + println(helpString) + + def apply(s: String, historyPath: Option[Path], maxLines: Int, error: (=> String) => Unit): Option[List[String]] = + if(s.isEmpty) + { + printHelp() + Some(Nil) + } + else + { + val lines = historyPath.toList.flatMap(h => IO.readLines(h.asFile) ).toArray + if(lines.isEmpty) + { + error("No history") + None + } + else + { + val history = complete.History(lines, error) + if(s.startsWith(ListCommands)) + { + val rest = s.substring(ListCommands.length) + val show = complete.History.number(rest).getOrElse(lines.length) + printHistory(history, maxLines, show) + Some(Nil) + } + else + { + val command = historyCommand(history, s) + command.foreach(lines(lines.length - 1) = _) + historyPath foreach { h => IO.writeLines(h.asFile, lines) } + Some(command.toList) + } + } + } + def printHistory(history: complete.History, historySize: Int, show: Int): Unit = history.list(historySize, show).foreach(println) + def historyCommand(history: complete.History, s: String): 
Option[String] = + { + if(s == Last) + history !! + else if(s.startsWith(Contains)) + history !? s.substring(Contains.length) + else + history ! s + } +} \ No newline at end of file diff --git a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala index f4d42993b..4072edff3 100644 --- a/util/control/ErrorHandling.scala +++ b/util/control/ErrorHandling.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package xsbt +package sbt object ErrorHandling { @@ -22,7 +22,7 @@ object ErrorHandling try { Right(f) } catch { case e: Exception => Left(e) } } -final class TranslatedException private[xsbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) +final class TranslatedException private[sbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) { override def toString = msg } \ No newline at end of file diff --git a/util/log/src/test/scala/LogWriterTest.scala b/util/log/src/test/scala/LogWriterTest.scala new file mode 100644 index 000000000..6d01341a0 --- /dev/null +++ b/util/log/src/test/scala/LogWriterTest.scala @@ -0,0 +1,160 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah */ + +package sbt + +import org.scalacheck._ +import Arbitrary.{arbitrary => arb, _} +import Gen.{listOfN, oneOf} +import Prop._ + +import java.io.Writer + +object LogWriterTest extends Properties("Log Writer") +{ + final val MaxLines = 100 + final val MaxSegments = 10 + + /* Tests that content written through a LoggerWriter is properly passed to the underlying Logger. + * Each line, determined by the specified newline separator, must be logged at the correct logging level. 
*/ + property("properly logged") = forAll { (output: Output, newLine: NewLine) => + import output.{lines, level} + val log = new RecordingLogger + val writer = new LoggerWriter(log, level, newLine.str) + logLines(writer, lines, newLine.str) + val events = log.getEvents + ("Recorded:\n" + events.map(show).mkString("\n")) |: + check( toLines(lines), events, level) + } + + /** Displays a LogEvent in a useful format for debugging. In particular, we are only interested in `Log` types + * and non-printable characters should be escaped*/ + def show(event: LogEvent): String = + event match + { + case l: Log => "Log('" + Escape(l.msg) + "', " + l.level + ")" + case _ => "Not Log" + } + /** Writes the given lines to the Writer. `lines` is taken to be a list of lines, which are + * represented as separately written segments (ToLog instances). ToLog.`byCharacter` + * indicates whether to write the segment by character (true) or all at once (false)*/ + def logLines(writer: Writer, lines: List[List[ToLog]], newLine: String) + { + for(line <- lines; section <- line) + { + val content = section.content + val normalized = Escape.newline(content, newLine) + if(section.byCharacter) + normalized.foreach { c => writer.write(c.toInt) } + else + writer.write(normalized) + } + writer.flush() + } + + /** Converts the given lines in segments to lines as Strings for checking the results of the test.*/ + def toLines(lines: List[List[ToLog]]): List[String] = + lines.map(_.map(_.contentOnly).mkString) + /** Checks that the expected `lines` were recorded as `events` at level `Lvl`.*/ + def check(lines: List[String], events: List[LogEvent], Lvl: Level.Value): Boolean = + (lines zip events) forall { + case (line, log : Log) => log.level == Lvl && line == log.msg + case _ => false + } + + /* The following are implicit generators to build up a write sequence. + * ToLog represents a written segment. NewLine represents one of the possible + * newline separators. 
A List[ToLog] represents a full line and always includes a + * final ToLog with a trailing '\n'. Newline characters are otherwise not present in + * the `content` of a ToLog instance.*/ + + implicit lazy val arbOut: Arbitrary[Output] = Arbitrary(genOutput) + implicit lazy val arbLog: Arbitrary[ToLog] = Arbitrary(genLog) + implicit lazy val arbLine: Arbitrary[List[ToLog]] = Arbitrary(genLine) + implicit lazy val arbNewLine: Arbitrary[NewLine] = Arbitrary(genNewLine) + implicit lazy val arbLevel : Arbitrary[Level.Value] = Arbitrary(genLevel) + + implicit def genLine(implicit logG: Gen[ToLog]): Gen[List[ToLog]] = + for(l <- listOf[ToLog](MaxSegments); last <- logG) yield + (addNewline(last) :: l.filter(!_.content.isEmpty)).reverse + + implicit def genLog(implicit content: Arbitrary[String], byChar: Arbitrary[Boolean]): Gen[ToLog] = + for(c <- content.arbitrary; by <- byChar.arbitrary) yield + { + assert(c != null) + new ToLog(removeNewlines(c), by) + } + + implicit lazy val genNewLine: Gen[NewLine] = + for(str <- oneOf("\n", "\r", "\r\n")) yield + new NewLine(str) + + implicit lazy val genLevel: Gen[Level.Value] = + oneOf(Level.values.toSeq) + + implicit lazy val genOutput: Gen[Output] = + for(ls <- listOf[List[ToLog]](MaxLines); lv <- genLevel) yield + new Output(ls, lv) + + def removeNewlines(s: String) = s.replaceAll("""[\n\r]+""", "") + def addNewline(l: ToLog): ToLog = + new ToLog(l.content + "\n", l.byCharacter) // \n will be replaced by a random line terminator for all lines + + def listOf[T](max: Int)(implicit content: Arbitrary[T]): Gen[List[T]] = + Gen.choose(0, max) flatMap { sz => listOfN(sz, content.arbitrary) } +} + +/* Helper classes*/ + +final class Output(val lines: List[List[ToLog]], val level: Level.Value) extends NotNull +{ + override def toString = + "Level: " + level + "\n" + lines.map(_.mkString).mkString("\n") +} +final class NewLine(val str: String) extends NotNull +{ + override def toString = Escape(str) +} +final class ToLog(val content: 
String, val byCharacter: Boolean) extends NotNull +{ + def contentOnly = Escape.newline(content, "") + override def toString = if(content.isEmpty) "" else "ToLog('" + Escape(contentOnly) + "', " + byCharacter + ")" +} +/** Defines some utility methods for escaping unprintable characters.*/ +object Escape +{ + /** Escapes characters with code less than 20 by printing them as unicode escapes.*/ + def apply(s: String): String = + { + val builder = new StringBuilder(s.length) + for(c <- s) + { + def escaped = pad(c.toInt.toHexString.toUpperCase, 4, '0') + if(c < 20) builder.append("\\u").append(escaped) else builder.append(c) + } + builder.toString + } + def pad(s: String, minLength: Int, extra: Char) = + { + val diff = minLength - s.length + if(diff <= 0) s else List.make(diff, extra).mkString("", "", s) + } + /** Replaces a \n character at the end of a string `s` with `nl`.*/ + def newline(s: String, nl: String): String = + if(s.endsWith("\n")) s.substring(0, s.length - 1) + nl else s +} +/** Records logging events for later retrieval.*/ +final class RecordingLogger extends BasicLogger +{ + private var events: List[LogEvent] = Nil + + def getEvents = events.reverse + + override def ansiCodesSupported = true + def trace(t: => Throwable) { events ::= new Trace(t) } + def log(level: Level.Value, message: => String) { events ::= new Log(level, message) } + def success(message: => String) { events ::= new Success(message) } + def logAll(es: Seq[LogEvent]) { events :::= es.toList } + def control(event: ControlEvent.Value, message: => String) { events ::= new ControlEvent(event, message) } + +} \ No newline at end of file From ff1657879cdb92269034787ed788aa9191c0f724 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 21 Jun 2010 21:22:11 -0400 Subject: [PATCH 054/823] relation data structure --- util/collection/Relation.scala | 99 +++++++++++++++++++ .../src/test/scala/RelationTest.scala | 67 +++++++++++++ 2 files changed, 166 insertions(+) create mode 100644 
util/collection/Relation.scala create mode 100644 util/collection/src/test/scala/RelationTest.scala diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala new file mode 100644 index 000000000..c60925aba --- /dev/null +++ b/util/collection/Relation.scala @@ -0,0 +1,99 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +object Relation +{ + /** Constructs a new immutable, finite relation that is initially empty. */ + def empty[T]: Relation[T] = new MRelation[T](Map.empty, Map.empty) +} +/** Binary relation on T. It is a set of pairs (_1, _2) for _1, _2 in T. */ +trait Relation[T] +{ + /** Returns the set of all _2s such that (_1, _2) is in this relation. */ + def forward(_1: T): Set[T] + /** Returns the set of all _1s such that (_1, _2) is in this relation. */ + def reverse(_2: T): Set[T] + /** Includes the relation given by `pair`. */ + def +(pair: (T, T)): Relation[T] + /** Includes the relation (a, b). */ + def +(a: T, b: T): Relation[T] + /** Includes the relations (a, b) for all b in bs. */ + def +(a: T, bs: Iterable[T]): Relation[T] + /** Returns the union of the relation r with this relation. */ + def ++(r: Relation[T]): Relation[T] + /** Includes the given relations. */ + def ++(rs: Iterable[(T,T)]): Relation[T] + /** Removes all relations (_1, _2) for all _1 in _1s. */ + def --(_1s: Iterable[T]): Relation[T] + /** Removes all `pairs` from this relation. */ + def --(pairs: Traversable[(T,T)]): Relation[T] + /** Removes all pairs (_1, _2) from this relation. */ + def -(_1: T): Relation[T] + /** Removes `pair` from this relation. */ + def -(pair: (T,T)): Relation[T] + /** Returns the set of all _1s such that (_1, _2) is in this relation. */ + def _1s: collection.Set[T] + /** Returns the set of all _2s such that (_1, _2) is in this relation. 
*/ + def _2s: collection.Set[T] + + /** Returns all pairs in this relation.*/ + def all: Traversable[(T,T)] + + def forwardMap: Map[T, Set[T]] + def reverseMap: Map[T, Set[T]] +} +private final class MRelation[T](fwd: Map[T, Set[T]], rev: Map[T, Set[T]]) extends Relation[T] +{ + type M = Map[T, Set[T]] + + def forwardMap = fwd + def reverseMap = rev + + def forward(t: T) = get(fwd, t) + def reverse(t: T) = get(rev, t) + + def _1s = fwd.keySet + def _2s = rev.keySet + + def all: Traversable[(T,T)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable + + def +(pair: (T, T)): Relation[T] = this + (pair._1, Set(pair._2)) + def +(from: T, to: T): Relation[T] = this + (from, Set(to)) + def +(from: T, to: Iterable[T]): Relation[T] = + new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, Seq(from)) }) + + def ++(rs: Iterable[(T,T)]): Relation[T] = ((this: Relation[T]) /: rs) { _ + _ } + def ++(other: Relation[T]): Relation[T] = new MRelation[T]( combine(fwd, other.forwardMap), combine(rev, other.reverseMap) ) + + def --(ts: Iterable[T]): Relation[T] = ((this: Relation[T]) /: ts) { _ - _ } + def --(pairs: Traversable[(T,T)]): Relation[T] = ((this: Relation[T]) /: pairs) { _ - _ } + def -(pair: (T,T)): Relation[T] = + new MRelation( remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1) ) + def -(t: T): Relation[T] = + fwd.get(t) match { + case Some(rs) => + val upRev = (rev /: rs) { (map, r) => remove(map, r, t) } + new MRelation(fwd - t, upRev) + case None => this + } + + private def remove(map: M, from: T, to: T): M = + map.get(from) match { + case Some(tos) => + val newSet = tos - to + if(newSet.isEmpty) map - from else map.updated(from, newSet) + case None => map + } + + private def combine(a: M, b: M): M = + (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } + + private[this] def add(map: M, from: T, to: Iterable[T]): M = + map.updated(from, get(map, from) ++ to) + + private[this] def get(map: M, 
t: T): Set[T] = map.getOrElse(t, Set.empty[T]) + + override def toString = all.mkString("Relation [", ", ", "]") +} \ No newline at end of file diff --git a/util/collection/src/test/scala/RelationTest.scala b/util/collection/src/test/scala/RelationTest.scala new file mode 100644 index 000000000..d8a76bc17 --- /dev/null +++ b/util/collection/src/test/scala/RelationTest.scala @@ -0,0 +1,67 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import org.scalacheck._ +import Prop._ + +object RelationTest extends Properties("Relation") +{ + property("Added entry check") = forAll { (pairs: List[(Int, Int)]) => + val r = Relation.empty[Int] ++ pairs + check(r, pairs) + } + def check(r: Relation[Int], pairs: Seq[(Int, Int)]) = + { + val _1s = pairs.map(_._1).toSet + val _2s = pairs.map(_._2).toSet + + r._1s == _1s && r.forwardMap.keySet == _1s && + r._2s == _2s && r.reverseMap.keySet == _2s && + pairs.forall { case (a, b) => + (r.forward(a) contains b) && + (r.reverse(b) contains a) && + (r.forwardMap(a) contains b) && + (r.reverseMap(b) contains a) + } + } + + property("Does not contain removed entries") = forAll { (pairs: List[(Int, Int, Boolean)]) => + val add = pairs.map { case (a,b,c) => (a,b) } + val added = Relation.empty[Int] ++ add + + val removeFine = pairs.collect { case (a,b,true) => (a,b) } + val removeCoarse = removeFine.map(_._1) + val r = added -- removeCoarse + + def notIn[T](map: Map[T, Set[T]], a: T, b: T) = map.get(a).forall(set => ! 
(set contains b) ) + + all(removeCoarse) { rem => + ("_1s does not contain removed" |: (!r._1s.contains(rem)) ) && + ("Forward does not contain removed" |: r.forward(rem).isEmpty ) && + ("Forward map does not contain removed" |: !r.forwardMap.contains(rem) ) && + ("Removed is not a value in reverse map" |: !r.reverseMap.values.toSet.contains(rem) ) + } && + all(removeFine) { case (a, b) => + ("Forward does not contain removed" |: ( !r.forward(a).contains(b) ) ) && + ("Reverse does not contain removed" |: ( !r.reverse(b).contains(a) ) ) && + ("Forward map does not contain removed" |: ( notIn(r.forwardMap, a, b) ) ) && + ("Reverse map does not contain removed" |: ( notIn(r.reverseMap, b, a) ) ) + } + } + def all[T](s: Seq[T])(p: T => Prop): Prop = + if(s.isEmpty) true else s.map(p).reduceLeft(_ && _) +} + +object EmptyRelationTest extends Properties("Empty relation") +{ + lazy val e = Relation.empty[Int] + + property("Forward empty") = forAll { (i: Int) => e.forward(i).isEmpty } + property("Reverse empty") = forAll { (i: Int) => e.reverse(i).isEmpty } + property("Forward map empty") = forAll { (i: Int) => e.forwardMap.isEmpty } + property("Reverse map empty") = forAll { (i: Int) => e.reverseMap.isEmpty } + property("_1 empty") = forAll { (i: Int) => e._1s.isEmpty } + property("_2 empty") = forAll { (i: Int) => e._2s.isEmpty } +} \ No newline at end of file From f1b5e0cf50808488bdcd448dcf544895458cf485 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 24 Jun 2010 18:09:07 -0400 Subject: [PATCH 055/823] MList -> KList, Relation[T] -> Relation[A,B] --- util/collection/HList.scala | 12 +-- util/collection/KList.scala | 47 ++++++++++++ util/collection/MList.scala | 51 ------------- util/collection/Relation.scala | 74 +++++++++---------- util/collection/Types.scala | 2 +- .../{MListTest.scala => KListTest.scala} | 10 +-- util/collection/src/test/scala/PMapTest.scala | 2 +- .../src/test/scala/RelationTest.scala | 24 +++--- 8 files changed, 106 insertions(+), 116 
deletions(-) create mode 100644 util/collection/KList.scala delete mode 100644 util/collection/MList.scala rename util/collection/src/test/scala/{MListTest.scala => KListTest.scala} (62%) diff --git a/util/collection/HList.scala b/util/collection/HList.scala index a1e595eeb..db2c9db85 100644 --- a/util/collection/HList.scala +++ b/util/collection/HList.scala @@ -7,23 +7,17 @@ import Types._ sealed trait HList { - type ToM[M[_]] <: MList[M] - type Up <: MList[Id] - def up: Up + type Wrap[M[_]] <: HList } sealed trait HNil extends HList { - type ToM[M[_]] = MNil - type Up = MNil - def up = MNil + type Wrap[M[_]] = HNil def :+: [G](g: G): G :+: HNil = HCons(g, this) } object HNil extends HNil final case class HCons[H, T <: HList](head : H, tail : T) extends HList { - type ToM[M[_]] = MCons[H, tail.ToM[M], M] - type Up = MCons[H, tail.Up, Id] - def up = MCons[H,tail.Up, Id](head, tail.up) + type Wrap[M[_]] = M[H] :+: T#Wrap[M] def :+: [G](g: G): G :+: H :+: T = HCons(g, this) } diff --git a/util/collection/KList.scala b/util/collection/KList.scala new file mode 100644 index 000000000..1a6e72554 --- /dev/null +++ b/util/collection/KList.scala @@ -0,0 +1,47 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +/** A higher-order heterogeneous list. It has a type constructor M[_] and +* type parameters HL. The underlying data is M applied to each type parameter. +* Explicitly tracking M[_] allows performing natural transformations or ensuring +* all data conforms to some common type. */ +sealed trait KList[+M[_], HL <: HList] { + type Raw = HL + /** Transform to the underlying HList type.*/ + def down(implicit ev: M ~> Id): HL + /** Apply a natural transformation. */ + def map[N[_]](f: M ~> N): KList[N, HL] + /** Convert to a List. */ + def toList: List[M[_]] + /** Convert to an HList. 
*/ + def combine[N[X] >: M[X]]: HL#Wrap[N] +} + +final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] { + def down(implicit f: M ~> Id) = HCons(f(head), tail.down(f)) + def map[N[_]](f: M ~> N) = KCons( f(head), tail.map(f) ) + // prepend + def :^: [N[X] >: M[X], G](g: N[G]) = KCons(g, this) + def toList = head :: tail.toList + + def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) +} + +sealed class KNil extends KList[Nothing, HNil] { + def down(implicit f: Nothing ~> Id) = HNil + def map[N[_]](f: Nothing ~> N) = KNil + def :^: [M[_], H](h: M[H]) = KCons(h, this) + def toList = Nil + def combine[N[X]] = HNil +} +object KNil extends KNil + +object KList +{ + // nicer alias for pattern matching + val :^: = KCons +} diff --git a/util/collection/MList.scala b/util/collection/MList.scala deleted file mode 100644 index 7adfc1568..000000000 --- a/util/collection/MList.scala +++ /dev/null @@ -1,51 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt - -import Types._ - -sealed trait MList[+M[_]] -{ - // For converting MList[Id] to an HList - // This is useful because type inference doesn't work well with Id - type Raw <: HList - def down(implicit ev: M ~> Id): Raw - - type Map[N[_]] <: MList[N] - def map[N[_]](f: M ~> N): Map[N] - - def toList: List[M[_]] -} -final case class MCons[H, +T <: MList[M], +M[_]](head: M[H], tail: T) extends MList[M] -{ - type Raw = H :+: tail.Raw - def down(implicit f: M ~> Id): Raw = HCons(f(head), tail.down(f)) - - type Map[N[_]] = MCons[H, tail.Map[N], N] - def map[N[_]](f: M ~> N) = MCons( f(head), tail.map(f) ) - - def :^: [N[X] >: M[X], G](g: N[G]): MCons[G, MCons[H, T, N], N] = MCons(g, this) - - def toList = head :: tail.toList -} -sealed class MNil extends MList[Nothing] -{ - type Raw = HNil - def down(implicit f: Nothing ~> Id) = HNil - - type Map[N[_]] = MNil - def map[N[_]](f: Nothing ~> N) = MNil - - def :^: [M[_], H](h: M[H]): 
MCons[H, MNil, M] = MCons(h, this) - - def toList = Nil -} -object MNil extends MNil - - -object MList -{ - implicit def fromTCList[A[_]](list: Traversable[A[_]]): MList[A] = ((MNil: MList[A]) /: list) ( (hl,v) => MCons(v, hl) ) - implicit def fromList[A](list: Traversable[A]): MList[Const[A]#Apply] = ((MNil: MList[Const[A]#Apply]) /: list) ( (hl,v) => MCons[A, hl.type, Const[A]#Apply](v, hl) ) -} \ No newline at end of file diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index c60925aba..170c82a74 100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -6,72 +6,72 @@ package sbt object Relation { /** Constructs a new immutable, finite relation that is initially empty. */ - def empty[T]: Relation[T] = new MRelation[T](Map.empty, Map.empty) + def empty[A,B]: Relation[A,B] = new MRelation[A,B](Map.empty, Map.empty) } -/** Binary relation on T. It is a set of pairs (_1, _2) for _1, _2 in T. */ -trait Relation[T] +/** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. */ +trait Relation[A,B] { /** Returns the set of all _2s such that (_1, _2) is in this relation. */ - def forward(_1: T): Set[T] + def forward(_1: A): Set[B] /** Returns the set of all _1s such that (_1, _2) is in this relation. */ - def reverse(_2: T): Set[T] + def reverse(_2: B): Set[A] /** Includes the relation given by `pair`. */ - def +(pair: (T, T)): Relation[T] + def +(pair: (A, B)): Relation[A,B] /** Includes the relation (a, b). */ - def +(a: T, b: T): Relation[T] + def +(a: A, b: B): Relation[A,B] /** Includes the relations (a, b) for all b in bs. */ - def +(a: T, bs: Iterable[T]): Relation[T] + def +(a: A, bs: Iterable[B]): Relation[A,B] /** Returns the union of the relation r with this relation. */ - def ++(r: Relation[T]): Relation[T] + def ++(r: Relation[A,B]): Relation[A,B] /** Includes the given relations. 
*/ - def ++(rs: Iterable[(T,T)]): Relation[T] + def ++(rs: Iterable[(A,B)]): Relation[A,B] /** Removes all relations (_1, _2) for all _1 in _1s. */ - def --(_1s: Iterable[T]): Relation[T] + def --(_1s: Iterable[A]): Relation[A,B] /** Removes all `pairs` from this relation. */ - def --(pairs: Traversable[(T,T)]): Relation[T] + def --(pairs: Traversable[(A,B)]): Relation[A,B] /** Removes all pairs (_1, _2) from this relation. */ - def -(_1: T): Relation[T] + def -(_1: A): Relation[A,B] /** Removes `pair` from this relation. */ - def -(pair: (T,T)): Relation[T] + def -(pair: (A,B)): Relation[A,B] /** Returns the set of all _1s such that (_1, _2) is in this relation. */ - def _1s: collection.Set[T] + def _1s: collection.Set[A] /** Returns the set of all _2s such that (_1, _2) is in this relation. */ - def _2s: collection.Set[T] + def _2s: collection.Set[B] /** Returns all pairs in this relation.*/ - def all: Traversable[(T,T)] + def all: Traversable[(A,B)] - def forwardMap: Map[T, Set[T]] - def reverseMap: Map[T, Set[T]] + def forwardMap: Map[A, Set[B]] + def reverseMap: Map[B, Set[A]] } -private final class MRelation[T](fwd: Map[T, Set[T]], rev: Map[T, Set[T]]) extends Relation[T] +private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A,B] { - type M = Map[T, Set[T]] + type M[X,Y] = Map[X, Set[Y]] def forwardMap = fwd def reverseMap = rev - def forward(t: T) = get(fwd, t) - def reverse(t: T) = get(rev, t) + def forward(t: A) = get(fwd, t) + def reverse(t: B) = get(rev, t) def _1s = fwd.keySet def _2s = rev.keySet - def all: Traversable[(T,T)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable + def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable - def +(pair: (T, T)): Relation[T] = this + (pair._1, Set(pair._2)) - def +(from: T, to: T): Relation[T] = this + (from, Set(to)) - def +(from: T, to: Iterable[T]): Relation[T] = + def 
+(pair: (A,B)) = this + (pair._1, Set(pair._2)) + def +(from: A, to: B) = this + (from, Set(to)) + def +(from: A, to: Iterable[B]) = new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, Seq(from)) }) - def ++(rs: Iterable[(T,T)]): Relation[T] = ((this: Relation[T]) /: rs) { _ + _ } - def ++(other: Relation[T]): Relation[T] = new MRelation[T]( combine(fwd, other.forwardMap), combine(rev, other.reverseMap) ) + def ++(rs: Iterable[(A,B)]) = ((this: Relation[A,B]) /: rs) { _ + _ } + def ++(other: Relation[A,B]) = new MRelation[A,B]( combine(fwd, other.forwardMap), combine(rev, other.reverseMap) ) - def --(ts: Iterable[T]): Relation[T] = ((this: Relation[T]) /: ts) { _ - _ } - def --(pairs: Traversable[(T,T)]): Relation[T] = ((this: Relation[T]) /: pairs) { _ - _ } - def -(pair: (T,T)): Relation[T] = + def --(ts: Iterable[A]): Relation[A,B] = ((this: Relation[A,B]) /: ts) { _ - _ } + def --(pairs: Traversable[(A,B)]): Relation[A,B] = ((this: Relation[A,B]) /: pairs) { _ - _ } + def -(pair: (A,B)): Relation[A,B] = new MRelation( remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1) ) - def -(t: T): Relation[T] = + def -(t: A): Relation[A,B] = fwd.get(t) match { case Some(rs) => val upRev = (rev /: rs) { (map, r) => remove(map, r, t) } @@ -79,7 +79,7 @@ private final class MRelation[T](fwd: Map[T, Set[T]], rev: Map[T, Set[T]]) exten case None => this } - private def remove(map: M, from: T, to: T): M = + private def remove[X,Y](map: M[X,Y], from: X, to: Y): M[X,Y] = map.get(from) match { case Some(tos) => val newSet = tos - to @@ -87,13 +87,13 @@ private final class MRelation[T](fwd: Map[T, Set[T]], rev: Map[T, Set[T]]) exten case None => map } - private def combine(a: M, b: M): M = + private def combine[X,Y](a: M[X,Y], b: M[X,Y]): M[X,Y] = (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } - private[this] def add(map: M, from: T, to: Iterable[T]): M = + private[this] def add[X,Y](map: M[X,Y], from: X, to: Iterable[Y]): M[X,Y] = 
map.updated(from, get(map, from) ++ to) - private[this] def get(map: M, t: T): Set[T] = map.getOrElse(t, Set.empty[T]) + private[this] def get[X,Y](map: M[X,Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) override def toString = all.mkString("Relation [", ", ", "]") } \ No newline at end of file diff --git a/util/collection/Types.scala b/util/collection/Types.scala index de6cf5aec..c5d484c51 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -5,7 +5,7 @@ package sbt object Types extends TypeFunctions { - val :^: = MCons + val :^: = KCons val :+: = HCons type :+:[H, T <: HList] = HCons[H,T] } diff --git a/util/collection/src/test/scala/MListTest.scala b/util/collection/src/test/scala/KListTest.scala similarity index 62% rename from util/collection/src/test/scala/MListTest.scala rename to util/collection/src/test/scala/KListTest.scala index ffb86b18e..210084fb1 100644 --- a/util/collection/src/test/scala/MListTest.scala +++ b/util/collection/src/test/scala/KListTest.scala @@ -5,15 +5,15 @@ package sbt import Types._ -object MTest { +object KTest { val f = new (Option ~> List) { def apply[T](o: Option[T]): List[T] = o.toList } - val x = Some(3) :^: Some("asdf") :^: MNil + val x = Some(3) :^: Some("asdf") :^: KNil val y = x map f - val m1a = y match { case List(3) :^: List("asdf") :^: MNil => println("true") } - val m1b = (List(3) :^: MNil) match { case yy :^: MNil => println("true") } + val m1a = y match { case List(3) :^: List("asdf") :^: KNil => println("true") } + val m1b = (List(3) :^: KNil) match { case yy :^: KNil => println("true") } val head = new (List ~> Id) { def apply[T](xs: List[T]): T = xs.head } - val z = y.map[Id](head).down + val z = y down head val m2 = z match { case 3 :+: "asdf" :+: HNil => println("true") } } diff --git a/util/collection/src/test/scala/PMapTest.scala b/util/collection/src/test/scala/PMapTest.scala index 1ea66daaa..091012f6e 100644 --- a/util/collection/src/test/scala/PMapTest.scala +++ 
b/util/collection/src/test/scala/PMapTest.scala @@ -11,7 +11,7 @@ object PMapTest val mp = new DelegatingPMap[Some, Id](new collection.mutable.HashMap) mp(Some("asdf")) = "a" mp(Some(3)) = 9 - val x = Some(3) :^: Some("asdf") :^: MNil + val x = Some(3) :^: Some("asdf") :^: KNil val y = x.map[Id](mp) val z = y.down z match { case 9 :+: "a" :+: HNil => println("true") } diff --git a/util/collection/src/test/scala/RelationTest.scala b/util/collection/src/test/scala/RelationTest.scala index d8a76bc17..e82bd861d 100644 --- a/util/collection/src/test/scala/RelationTest.scala +++ b/util/collection/src/test/scala/RelationTest.scala @@ -8,11 +8,11 @@ import Prop._ object RelationTest extends Properties("Relation") { - property("Added entry check") = forAll { (pairs: List[(Int, Int)]) => - val r = Relation.empty[Int] ++ pairs + property("Added entry check") = forAll { (pairs: List[(Int, Double)]) => + val r = Relation.empty[Int, Double] ++ pairs check(r, pairs) } - def check(r: Relation[Int], pairs: Seq[(Int, Int)]) = + def check(r: Relation[Int, Double], pairs: Seq[(Int, Double)]) = { val _1s = pairs.map(_._1).toSet val _2s = pairs.map(_._2).toSet @@ -27,15 +27,15 @@ object RelationTest extends Properties("Relation") } } - property("Does not contain removed entries") = forAll { (pairs: List[(Int, Int, Boolean)]) => + property("Does not contain removed entries") = forAll { (pairs: List[(Int, Double, Boolean)]) => val add = pairs.map { case (a,b,c) => (a,b) } - val added = Relation.empty[Int] ++ add + val added = Relation.empty[Int, Double] ++ add val removeFine = pairs.collect { case (a,b,true) => (a,b) } val removeCoarse = removeFine.map(_._1) val r = added -- removeCoarse - def notIn[T](map: Map[T, Set[T]], a: T, b: T) = map.get(a).forall(set => ! (set contains b) ) + def notIn[X,Y](map: Map[X, Set[Y]], a: X, b: Y) = map.get(a).forall(set => ! 
(set contains b) ) all(removeCoarse) { rem => ("_1s does not contain removed" |: (!r._1s.contains(rem)) ) && @@ -56,12 +56,12 @@ object RelationTest extends Properties("Relation") object EmptyRelationTest extends Properties("Empty relation") { - lazy val e = Relation.empty[Int] + lazy val e = Relation.empty[Int, Double] property("Forward empty") = forAll { (i: Int) => e.forward(i).isEmpty } - property("Reverse empty") = forAll { (i: Int) => e.reverse(i).isEmpty } - property("Forward map empty") = forAll { (i: Int) => e.forwardMap.isEmpty } - property("Reverse map empty") = forAll { (i: Int) => e.reverseMap.isEmpty } - property("_1 empty") = forAll { (i: Int) => e._1s.isEmpty } - property("_2 empty") = forAll { (i: Int) => e._2s.isEmpty } + property("Reverse empty") = forAll { (i: Double) => e.reverse(i).isEmpty } + property("Forward map empty") = e.forwardMap.isEmpty + property("Reverse map empty") = e.reverseMap.isEmpty + property("_1 empty") = e._1s.isEmpty + property("_2 empty") = e._2s.isEmpty } \ No newline at end of file From ba725d50462fd6da32e5bb77e928502d4b5600f8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 27 Jun 2010 09:16:16 -0400 Subject: [PATCH 056/823] Relation.make --- util/collection/Relation.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index 170c82a74..ab31cdd06 100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -6,7 +6,8 @@ package sbt object Relation { /** Constructs a new immutable, finite relation that is initially empty. */ - def empty[A,B]: Relation[A,B] = new MRelation[A,B](Map.empty, Map.empty) + def empty[A,B]: Relation[A,B] = make(Map.empty, Map.empty) + def make[A,B](forward: Map[A,Set[B]], reverse: Map[B, Set[A]]): Relation[A,B] = new MRelation(forward, reverse) } /** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. 
*/ trait Relation[A,B] From 7ecfc0b8f8a3b7fd4dbf14b34453ae62e5fc5482 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 2 Jul 2010 06:57:03 -0400 Subject: [PATCH 057/823] discovery, persistence, frontend, and various fixes to incremental --- util/collection/Relation.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index ab31cdd06..ed6046f6e 100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -96,5 +96,5 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext private[this] def get[X,Y](map: M[X,Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) - override def toString = all.mkString("Relation [", ", ", "]") + override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") } \ No newline at end of file From 5cd6ef268c20a9e0aef2215957f585b4083e57d5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 5 Jul 2010 12:53:37 -0400 Subject: [PATCH 058/823] - Stuart's improvements to triggered execution - continue splitting original sbt module * separated process, testing modules * various IO, logging, classpath migration * split out javac interface --- util/complete/NOTICE | 3 + util/log/BufferedLogger.scala | 6 +- util/log/FullLogger.scala | 24 + util/log/Logger.scala | 30 +- util/log/LoggerWriter.scala | 4 +- util/process/Process.scala | 167 +++++++ util/process/ProcessImpl.scala | 473 ++++++++++++++++++ .../src/test/scala/ProcessSpecification.scala | 93 ++++ .../src/test/scala/TestedProcess.scala | 56 +++ 9 files changed, 840 insertions(+), 16 deletions(-) create mode 100644 util/complete/NOTICE create mode 100644 util/log/FullLogger.scala create mode 100644 util/process/Process.scala create mode 100644 util/process/ProcessImpl.scala create mode 100644 util/process/src/test/scala/ProcessSpecification.scala create mode 100644 util/process/src/test/scala/TestedProcess.scala diff --git 
a/util/complete/NOTICE b/util/complete/NOTICE new file mode 100644 index 000000000..a6f2c1de4 --- /dev/null +++ b/util/complete/NOTICE @@ -0,0 +1,3 @@ +Simple Build Tool: Completion Component +Copyright 2010 Mark Harrah +Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 689d21b14..054a4b55d 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ - package xsbt +package sbt import sbt.{AbstractLogger, ControlEvent, Level, Log, LogEvent, SetLevel, SetTrace, Success, Trace} import scala.collection.mutable.ListBuffer @@ -50,14 +50,14 @@ class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger def setLevel(newLevel: Level.Value) { - buffer += new SetLevel(newLevel) + buffer += new SetLevel(newLevel) delegate.setLevel(newLevel) } def getLevel = delegate.getLevel def getTrace = delegate.getTrace def setTrace(level: Int) { - buffer += new SetTrace(level) + buffer += new SetTrace(level) delegate.setTrace(level) } diff --git a/util/log/FullLogger.scala b/util/log/FullLogger.scala new file mode 100644 index 000000000..091664244 --- /dev/null +++ b/util/log/FullLogger.scala @@ -0,0 +1,24 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +/** Promotes the simple Logger interface to the full AbstractLogger interface. 
*/ +class FullLogger(delegate: Logger, override val ansiCodesSupported: Boolean = false) extends BasicLogger +{ + def trace(t: => Throwable) + { + if(traceEnabled) + delegate.trace(t) + } + def log(level: Level.Value, message: => String) + { + if(atLevel(level)) + delegate.log(level, message) + } + def success(message: => String): Unit = + info(message) + def control(event: ControlEvent.Value, message: => String): Unit = + info(message) + def logAll(events: Seq[LogEvent]): Unit = events.foreach(log) +} diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 1be353c4b..68e1f26d8 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -1,28 +1,20 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah + * Copyright 2008, 2009, 2010 Mark Harrah */ package sbt import xsbti.{Logger => xLogger, F0} -abstract class AbstractLogger extends xLogger with NotNull +abstract class AbstractLogger extends Logger { def getLevel: Level.Value def setLevel(newLevel: Level.Value) def setTrace(flag: Int) def getTrace: Int final def traceEnabled = getTrace >= 0 - def ansiCodesSupported = false def atLevel(level: Level.Value) = level.id >= getLevel.id - def trace(t: => Throwable): Unit - final def verbose(message: => String): Unit = debug(message) - final def debug(message: => String): Unit = log(Level.Debug, message) - final def info(message: => String): Unit = log(Level.Info, message) - final def warn(message: => String): Unit = log(Level.Warn, message) - final def error(message: => String): Unit = log(Level.Error, message) def success(message: => String): Unit - def log(level: Level.Value, message: => String): Unit def control(event: ControlEvent.Value, message: => String): Unit def logAll(events: Seq[LogEvent]): Unit @@ -39,11 +31,27 @@ abstract class AbstractLogger extends xLogger with NotNull case c: ControlEvent => control(c.event, c.msg) } } +} +/** This is intended to be the simplest logging interface for use by code that wants to log. 
+* It does not include configuring the logger. */ +trait Logger extends xLogger +{ + final def verbose(message: => String): Unit = debug(message) + final def debug(message: => String): Unit = log(Level.Debug, message) + final def info(message: => String): Unit = log(Level.Info, message) + final def warn(message: => String): Unit = log(Level.Warn, message) + final def error(message: => String): Unit = log(Level.Error, message) + + def ansiCodesSupported = false + + def trace(t: => Throwable): Unit + def log(level: Level.Value, message: => String): Unit + def debug(msg: F0[String]): Unit = log(Level.Debug, msg) def warn(msg: F0[String]): Unit = log(Level.Warn, msg) def info(msg: F0[String]): Unit = log(Level.Info, msg) def error(msg: F0[String]): Unit = log(Level.Error, msg) def trace(msg: F0[Throwable]) = trace(msg.apply) def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) -} +} \ No newline at end of file diff --git a/util/log/LoggerWriter.scala b/util/log/LoggerWriter.scala index 885646973..81c0d89d0 100644 --- a/util/log/LoggerWriter.scala +++ b/util/log/LoggerWriter.scala @@ -5,9 +5,9 @@ package sbt /** Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. 
* A line is delimited by `nl`, which is by default the platform line separator.*/ -class LoggerWriter(delegate: AbstractLogger, level: Level.Value, nl: String) extends java.io.Writer +class LoggerWriter(delegate: Logger, level: Level.Value, nl: String) extends java.io.Writer { - def this(delegate: AbstractLogger, level: Level.Value) = this(delegate, level, System.getProperty("line.separator")) + def this(delegate: Logger, level: Level.Value) = this(delegate, level, System.getProperty("line.separator")) private[this] val buffer = new StringBuilder diff --git a/util/process/Process.scala b/util/process/Process.scala new file mode 100644 index 000000000..183469516 --- /dev/null +++ b/util/process/Process.scala @@ -0,0 +1,167 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt + +import java.lang.{Process => JProcess, ProcessBuilder => JProcessBuilder} +import java.io.{Closeable, File, IOException} +import java.io.{BufferedReader, InputStream, InputStreamReader, OutputStream, PipedInputStream, PipedOutputStream} +import java.net.URL + +/** Methods for constructing simple commands that can then be combined. 
*/ +object Process +{ + implicit def apply(command: String): ProcessBuilder = apply(command, None) + implicit def apply(command: Seq[String]): ProcessBuilder = apply (command.toArray, None) + def apply(command: String, arguments: Seq[String]): ProcessBuilder = apply(command :: arguments.toList, None) + /** create ProcessBuilder with working dir set to File and extra environment variables */ + def apply(command: String, cwd: File, extraEnv: (String,String)*): ProcessBuilder = + apply(command, Some(cwd), extraEnv : _*) + /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ + def apply(command: String, cwd: Option[File], extraEnv: (String,String)*): ProcessBuilder = { + apply(command.split("""\s+"""), cwd, extraEnv : _*) + // not smart to use this on windows, because CommandParser uses \ to escape ". + /*CommandParser.parse(command) match { + case Left(errorMsg) => error(errorMsg) + case Right((cmd, args)) => apply(cmd :: args, cwd, extraEnv : _*) + }*/ + } + /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ + def apply(command: Seq[String], cwd: Option[File], extraEnv: (String,String)*): ProcessBuilder = { + val jpb = new JProcessBuilder(command.toArray : _*) + cwd.foreach(jpb directory _) + extraEnv.foreach { case (k, v) => jpb.environment.put(k, v) } + apply(jpb) + } + implicit def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) + implicit def apply(file: File): FilePartialBuilder = new FileBuilder(file) + implicit def apply(url: URL): URLPartialBuilder = new URLBuilder(url) + implicit def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) + implicit def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) + def apply(value: Boolean): ProcessBuilder = apply(value.toString, if(value) 0 else 1) + def apply(name: String, exitValue: => Int): 
ProcessBuilder = new DummyProcessBuilder(name, exitValue) + + def cat(file: SourcePartialBuilder, files: SourcePartialBuilder*): ProcessBuilder = cat(file :: files.toList) + def cat(files: Seq[SourcePartialBuilder]): ProcessBuilder = + { + require(!files.isEmpty) + files.map(_.cat).reduceLeft(_ #&& _) + } +} + +trait SourcePartialBuilder extends NotNull +{ + /** Writes the output stream of this process to the given file. */ + def #> (f: File): ProcessBuilder = toFile(f, false) + /** Appends the output stream of this process to the given file. */ + def #>> (f: File): ProcessBuilder = toFile(f, true) + /** Writes the output stream of this process to the given OutputStream. The + * argument is call-by-name, so the stream is recreated, written, and closed each + * time this process is executed. */ + def #>(out: => OutputStream): ProcessBuilder = #> (new OutputStreamBuilder(out)) + def #>(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(toSource, b, false) + private def toFile(f: File, append: Boolean) = #> (new FileOutput(f, append)) + def cat = toSource + protected def toSource: ProcessBuilder +} +trait SinkPartialBuilder extends NotNull +{ + /** Reads the given file into the input stream of this process. */ + def #< (f: File): ProcessBuilder = #< (new FileInput(f)) + /** Reads the given URL into the input stream of this process. */ + def #< (f: URL): ProcessBuilder = #< (new URLInput(f)) + /** Reads the given InputStream into the input stream of this process. The + * argument is call-by-name, so the stream is recreated, read, and closed each + * time this process is executed. 
*/ + def #<(in: => InputStream): ProcessBuilder = #< (new InputStreamBuilder(in)) + def #<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, toSink, false) + protected def toSink: ProcessBuilder +} + +trait URLPartialBuilder extends SourcePartialBuilder +trait FilePartialBuilder extends SinkPartialBuilder with SourcePartialBuilder +{ + def #<<(f: File): ProcessBuilder + def #<<(u: URL): ProcessBuilder + def #<<(i: => InputStream): ProcessBuilder + def #<<(p: ProcessBuilder): ProcessBuilder +} + +/** Represents a process that is running or has finished running. +* It may be a compound process with several underlying native processes (such as 'a #&& b`).*/ +trait Process extends NotNull +{ + /** Blocks until this process exits and returns the exit code.*/ + def exitValue(): Int + /** Destroys this process. */ + def destroy(): Unit +} +/** Represents a runnable process. */ +trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder +{ + /** Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is + * sent to the console. If the exit code is non-zero, an exception is thrown.*/ + def !! : String + /** Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is + * sent to the provided Logger. If the exit code is non-zero, an exception is thrown.*/ + def !!(log: Logger) : String + /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination and then throw an exception. */ + def lines: Stream[String] + /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. 
Standard error is sent to the provided Logger. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination but will not throw an exception. */ + def lines(log: Logger): Stream[String] + /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination but will not throw an exception. */ + def lines_! : Stream[String] + /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the provided Logger. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination but will not throw an exception. */ + def lines_!(log: Logger): Stream[String] + /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the console.*/ + def ! : Int + /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the given Logger.*/ + def !(log: Logger): Int + /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the console. The newly started process reads from standard input of the current process.*/ + def !< : Int + /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the given Logger. The newly started process reads from standard input of the current process.*/ + def !<(log: Logger) : Int + /** Starts the process represented by this builder. 
Standard output and error are sent to the console.*/ + def run(): Process + /** Starts the process represented by this builder. Standard output and error are sent to the given Logger.*/ + def run(log: Logger): Process + /** Starts the process represented by this builder. I/O is handled by the given ProcessIO instance.*/ + def run(io: ProcessIO): Process + /** Starts the process represented by this builder. Standard output and error are sent to the console. + * The newly started process reads from standard input of the current process if `connectInput` is true.*/ + def run(connectInput: Boolean): Process + /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the given Logger. + * The newly started process reads from standard input of the current process if `connectInput` is true.*/ + def run(log: Logger, connectInput: Boolean): Process + + /** Constructs a command that runs this command first and then `other` if this command succeeds.*/ + def #&& (other: ProcessBuilder): ProcessBuilder + /** Constructs a command that runs this command first and then `other` if this command does not succeed.*/ + def #|| (other: ProcessBuilder): ProcessBuilder + /** Constructs a command that will run this command and pipes the output to `other`. `other` must be a simple command.*/ + def #| (other: ProcessBuilder): ProcessBuilder + /** Constructs a command that will run this command and then `other`. 
The exit code will be the exit code of `other`.*/ + def ### (other: ProcessBuilder): ProcessBuilder + + def canPipeTo: Boolean +} +/** Each method will be called in a separate thread.*/ +final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit) extends NotNull +{ + def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError) + def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process) + def withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError) +} \ No newline at end of file diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala new file mode 100644 index 000000000..5f89a4749 --- /dev/null +++ b/util/process/ProcessImpl.scala @@ -0,0 +1,473 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah, Vesa Vilhonen + */ +package sbt + +import java.lang.{Process => JProcess, ProcessBuilder => JProcessBuilder} +import java.io.{BufferedReader, Closeable, InputStream, InputStreamReader, IOException, OutputStream, PrintStream} +import java.io.{FilterInputStream, FilterOutputStream, PipedInputStream, PipedOutputStream} +import java.io.{File, FileInputStream, FileOutputStream} +import java.net.URL + +import scala.concurrent.SyncVar + +/** Runs provided code in a new Thread and returns the Thread instance. 
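The `Spawn` helper described by the comment above is a small but load-bearing pattern in this module: run a by-name thunk on a fresh (optionally daemon) thread and return the `Thread` so callers can join or interrupt it. A self-contained sketch of the same idea (`spawn` and `results` are hypothetical names, not part of the sbt API):

```scala
// Sketch of the Spawn pattern: evaluate a thunk on a new thread and hand the
// Thread back to the caller so it can be joined or interrupted later.
def spawn(daemon: Boolean = false)(f: => Unit): Thread = {
  val t = new Thread() { override def run() = f }
  t.setDaemon(daemon)  // daemon threads do not keep the JVM alive
  t.start()
  t
}

// Communicate a result back through a blocking queue (Spawn itself returns Unit).
val results = new java.util.concurrent.LinkedBlockingQueue[Int]
val worker = spawn() { results.put(21 * 2) }
worker.join()
```

The daemon flag matters for the process I/O threads created later in this file: marking the input-writing thread as a daemon lets the JVM exit even if the child process never reads its input.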
*/ +private object Spawn +{ + def apply(f: => Unit): Thread = apply(f, false) + def apply(f: => Unit, daemon: Boolean): Thread = + { + val thread = new Thread() { override def run() = { f } } + thread.setDaemon(daemon) + thread.start() + thread + } +} +private object Future +{ + def apply[T](f: => T): () => T = + { + val result = new SyncVar[Either[Throwable, T]] + def run: Unit = + try { result.set(Right(f)) } + catch { case e: Exception => result.set(Left(e)) } + Spawn(run) + () => + result.get match + { + case Right(value) => value + case Left(exception) => throw exception + } + } +} + +object BasicIO +{ + def apply(buffer: StringBuffer, log: Option[Logger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log)) + def apply(log: Logger, withIn: Boolean) = new ProcessIO(input(withIn), processFully(log, Level.Info), processFully(log, Level.Error)) + + def getErr(log: Option[Logger]) = log match { case Some(lg) => processFully(lg, Level.Error); case None => toStdErr } + + def ignoreOut = (i: OutputStream) => () + final val BufferSize = 8192 + final val Newline = System.getProperty("line.separator") + + def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } + def processFully(log: Logger, level: Level.Value): InputStream => Unit = processFully(line => log.log(level, line)) + def processFully(buffer: Appendable): InputStream => Unit = processFully(appendLine(buffer)) + def processFully(processLine: String => Unit): InputStream => Unit = + in => + { + val reader = new BufferedReader(new InputStreamReader(in)) + processLinesFully(processLine)(reader.readLine) + } + def processLinesFully(processLine: String => Unit)(readLine: () => String) + { + def readFully() + { + val line = readLine() + if(line != null) + { + processLine(line) + readFully() + } + } + readFully() + } + def connectToIn(o: OutputStream) { transferFully(System.in, o) } + def input(connect: Boolean): OutputStream => Unit = if(connect) 
connectToIn else ignoreOut + def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput)) + def standard(in: OutputStream => Unit): ProcessIO = new ProcessIO(in, toStdOut, toStdErr) + + def toStdErr = (in: InputStream) => transferFully(in, System.err) + def toStdOut = (in: InputStream) => transferFully(in, System.out) + + def transferFully(in: InputStream, out: OutputStream): Unit = + try { transferFullyImpl(in, out) } + catch { case _: InterruptedException => () } + + private[this] def appendLine(buffer: Appendable): String => Unit = + line => + { + buffer.append(line) + buffer.append(Newline) + } + + private[this] def transferFullyImpl(in: InputStream, out: OutputStream) + { + val continueCount = 1//if(in.isInstanceOf[PipedInputStream]) 1 else 0 + val buffer = new Array[Byte](BufferSize) + def read + { + val byteCount = in.read(buffer) + if(byteCount >= continueCount) + { + out.write(buffer, 0, byteCount) + out.flush() + read + } + } + read + } +} + + +private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPartialBuilder with SourcePartialBuilder +{ + def #&&(other: ProcessBuilder): ProcessBuilder = new AndProcessBuilder(this, other) + def #||(other: ProcessBuilder): ProcessBuilder = new OrProcessBuilder(this, other) + def #|(other: ProcessBuilder): ProcessBuilder = + { + require(other.canPipeTo, "Piping to multiple processes is not supported.") + new PipedProcessBuilder(this, other, false) + } + def ###(other: ProcessBuilder): ProcessBuilder = new SequenceProcessBuilder(this, other) + + protected def toSource = this + protected def toSink = this + + def run(): Process = run(false) + def run(connectInput: Boolean): Process = run(BasicIO.standard(connectInput)) + def run(log: Logger): Process = run(log, false) + def run(log: Logger, connectInput: Boolean): Process = run(BasicIO(log, connectInput)) + + private[this] def getString(log: Option[Logger], withIn: Boolean): String = + { + val buffer = new StringBuffer + val code = 
this ! BasicIO(buffer, log, withIn) + if(code == 0) buffer.toString else error("Nonzero exit value: " + code) + } + def !! = getString(None, false) + def !!(log: Logger) = getString(Some(log), false) + def !!< = getString(None, true) + def !!<(log: Logger) = getString(Some(log), true) + + def lines: Stream[String] = lines(false, true, None) + def lines(log: Logger): Stream[String] = lines(false, true, Some(log)) + def lines_! : Stream[String] = lines(false, false, None) + def lines_!(log: Logger): Stream[String] = lines(false, false, Some(log)) + + private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[Logger]): Stream[String] = + { + val streamed = Streamed[String](nonZeroException) + val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log))) + Spawn { streamed.done(process.exitValue()) } + streamed.stream() + } + + def ! = run(false).exitValue() + def !< = run(true).exitValue() + def !(log: Logger) = runBuffered(log, false) + def !<(log: Logger) = runBuffered(log, true) + private[this] def runBuffered(log: Logger, connectInput: Boolean) = + { + val log2 = new BufferedLogger(new FullLogger(log)) + log2.buffer { run(log2, connectInput).exitValue() } + } + def !(io: ProcessIO) = run(io).exitValue() + + def canPipeTo = false +} + +private[sbt] class URLBuilder(url: URL) extends URLPartialBuilder with SourcePartialBuilder +{ + protected def toSource = new URLInput(url) +} +private[sbt] class FileBuilder(base: File) extends FilePartialBuilder with SinkPartialBuilder with SourcePartialBuilder +{ + protected def toSource = new FileInput(base) + protected def toSink = new FileOutput(base, false) + def #<<(f: File): ProcessBuilder = #<<(new FileInput(f)) + def #<<(u: URL): ProcessBuilder = #<<(new URLInput(u)) + def #<<(s: => InputStream): ProcessBuilder = #<<(new InputStreamBuilder(s)) + def #<<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, new FileOutput(base, 
true), false) +} + +private abstract class BasicBuilder extends AbstractProcessBuilder +{ + protected[this] def checkNotThis(a: ProcessBuilder) = require(a != this, "Compound process '" + a + "' cannot contain itself.") + final def run(io: ProcessIO): Process = + { + val p = createProcess(io) + p.start() + p + } + protected[this] def createProcess(io: ProcessIO): BasicProcess +} +private abstract class BasicProcess extends Process +{ + def start(): Unit +} + +private abstract class CompoundProcess extends BasicProcess +{ + def destroy() { destroyer() } + def exitValue() = getExitValue().getOrElse(error("No exit code: process destroyed.")) + + def start() = getExitValue + + protected lazy val (getExitValue, destroyer) = + { + val code = new SyncVar[Option[Int]]() + code.set(None) + val thread = Spawn(code.set(runAndExitValue())) + + ( + Future { thread.join(); code.get }, + () => thread.interrupt() + ) + } + + /** Start and block until the exit value is available and then return it in Some. Return None if destroyed (use 'run')*/ + protected[this] def runAndExitValue(): Option[Int] + + protected[this] def runInterruptible[T](action: => T)(destroyImpl: => Unit): Option[T] = + { + try { Some(action) } + catch { case _: InterruptedException => destroyImpl; None } + } +} + +private abstract class SequentialProcessBuilder(a: ProcessBuilder, b: ProcessBuilder, operatorString: String) extends BasicBuilder +{ + checkNotThis(a) + checkNotThis(b) + override def toString = " ( " + a + " " + operatorString + " " + b + " ) " +} +private class PipedProcessBuilder(first: ProcessBuilder, second: ProcessBuilder, toError: Boolean) extends SequentialProcessBuilder(first, second, if(toError) "#|!" 
else "#|") +{ + override def createProcess(io: ProcessIO) = new PipedProcesses(first, second, io, toError) +} +private class AndProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#&&") +{ + override def createProcess(io: ProcessIO) = new AndProcess(first, second, io) +} +private class OrProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#||") +{ + override def createProcess(io: ProcessIO) = new OrProcess(first, second, io) +} +private class SequenceProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "###") +{ + override def createProcess(io: ProcessIO) = new ProcessSequence(first, second, io) +} + +private class SequentialProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO, evaluateSecondProcess: Int => Boolean) extends CompoundProcess +{ + protected[this] override def runAndExitValue() = + { + val first = a.run(io) + runInterruptible(first.exitValue)(first.destroy()) flatMap + { codeA => + if(evaluateSecondProcess(codeA)) + { + val second = b.run(io) + runInterruptible(second.exitValue)(second.destroy()) + } + else + Some(codeA) + } + } +} +private class AndProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, _ == 0) +private class OrProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, _ != 0) +private class ProcessSequence(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, ignore => true) + + +private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: ProcessIO, toError: Boolean) extends CompoundProcess +{ + protected[this] override def runAndExitValue() = + { + val currentSource = new SyncVar[Option[InputStream]] + val pipeOut = new PipedOutputStream + val source = new PipeSource(currentSource, pipeOut, a.toString) + source.start() + + 
val pipeIn = new PipedInputStream(pipeOut) + val currentSink = new SyncVar[Option[OutputStream]] + val sink = new PipeSink(pipeIn, currentSink, b.toString) + sink.start() + + def handleOutOrError(fromOutput: InputStream) = currentSource.put(Some(fromOutput)) + + val firstIO = + if(toError) + defaultIO.withError(handleOutOrError) + else + defaultIO.withOutput(handleOutOrError) + val secondIO = defaultIO.withInput(toInput => currentSink.put(Some(toInput)) ) + + val second = b.run(secondIO) + val first = a.run(firstIO) + try + { + runInterruptible { + first.exitValue + currentSource.put(None) + currentSink.put(None) + val result = second.exitValue + result + } { + first.destroy() + second.destroy() + } + } + finally + { + BasicIO.close(pipeIn) + BasicIO.close(pipeOut) + } + } +} +private class PipeSource(currentSource: SyncVar[Option[InputStream]], pipe: PipedOutputStream, label: => String) extends Thread +{ + final override def run() + { + currentSource.get match + { + case Some(source) => + try { BasicIO.transferFully(source, pipe) } + catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } + finally + { + BasicIO.close(source) + currentSource.unset() + } + run() + case None => + currentSource.unset() + BasicIO.close(pipe) + } + } +} +private class PipeSink(pipe: PipedInputStream, currentSink: SyncVar[Option[OutputStream]], label: => String) extends Thread +{ + final override def run() + { + currentSink.get match + { + case Some(sink) => + try { BasicIO.transferFully(pipe, sink) } + catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } + finally + { + BasicIO.close(sink) + currentSink.unset() + } + run() + case None => + currentSink.unset() + } + } +} + +private[sbt] class DummyProcessBuilder(override val toString: String, exitValue : => Int) extends AbstractProcessBuilder +{ + override def run(io: ProcessIO): Process = new 
DummyProcess(exitValue) + override def canPipeTo = true +} +/** A thin wrapper around a java.lang.Process. `ioThreads` are the Threads created to do I/O. +* The implementation of `exitValue` waits until these threads die before returning. */ +private class DummyProcess(action: => Int) extends Process +{ + private[this] val exitCode = Future(action) + override def exitValue() = exitCode() + override def destroy() {} +} +/** Represents a simple command without any redirection or combination. */ +private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProcessBuilder +{ + override def run(io: ProcessIO): Process = + { + val process = p.start() // start the external process + import io.{writeInput, processOutput, processError} + // spawn threads that process the input, output, and error streams using the functions defined in `io` + val inThread = Spawn(writeInput(process.getOutputStream), true) + val outThread = Spawn(processOutput(process.getInputStream)) + val errorThread = + if(!p.redirectErrorStream) + Spawn(processError(process.getErrorStream)) :: Nil + else + Nil + new SimpleProcess(process, inThread, outThread :: errorThread) + } + override def toString = p.command.toString + override def canPipeTo = true +} +/** A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the +* output and error streams of the process. `inputThread` is the Thread created to write to the input stream of +* the process. +* The implementation of `exitValue` interrupts `inputThread` and then waits until all I/O threads die before +* returning. 
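`SimpleProcessBuilder.run` above is a thin layer over the JDK: it starts a `java.lang.ProcessBuilder`, then spawns one thread per stream, skipping the error thread entirely when `redirectErrorStream` is set. A minimal standalone sketch of that wiring (it assumes a Unix-like environment with `echo` on the PATH):

```scala
// Sketch of the wiring SimpleProcessBuilder performs, directly against the JDK API.
import java.lang.{ProcessBuilder => JProcessBuilder}

val jpb = new JProcessBuilder("echo", "hello")
jpb.redirectErrorStream(true) // merge stderr into stdout: only one reader thread is needed
val p = jpb.start()

// Read the merged output; sbt does this on a spawned thread via ProcessIO.
val out = scala.io.Source.fromInputStream(p.getInputStream).mkString.trim
p.waitFor() // like SimpleProcess.exitValue, wait for termination before asking for the code
```

When the streams are not merged, a dedicated error-reading thread is required, because a child process can block on a full stderr pipe if nobody drains it.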
*/ +private class SimpleProcess(p: JProcess, inputThread: Thread, outputThreads: List[Thread]) extends Process +{ + override def exitValue() = + { + try { p.waitFor() }// wait for the process to terminate + finally { inputThread.interrupt() } // we interrupt the input thread to notify it that it can terminate + outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) + p.exitValue() + } + override def destroy() = + { + try { p.destroy() } + finally { inputThread.interrupt() } + } +} + +private class FileOutput(file: File, append: Boolean) extends OutputStreamBuilder(new FileOutputStream(file, append), file.getAbsolutePath) +private class URLInput(url: URL) extends InputStreamBuilder(url.openStream, url.toString) +private class FileInput(file: File) extends InputStreamBuilder(new FileInputStream(file), file.getAbsolutePath) + +import Uncloseable.protect +private class OutputStreamBuilder(stream: => OutputStream, label: String) extends ThreadProcessBuilder(label, _.writeInput(protect(stream))) +{ + def this(stream: => OutputStream) = this(stream, "") +} +private class InputStreamBuilder(stream: => InputStream, label: String) extends ThreadProcessBuilder(label, _.processOutput(protect(stream))) +{ + def this(stream: => InputStream) = this(stream, "") +} + +private abstract class ThreadProcessBuilder(override val toString: String, runImpl: ProcessIO => Unit) extends AbstractProcessBuilder +{ + override def run(io: ProcessIO): Process = + { + val success = new SyncVar[Boolean] + success.put(false) + new ThreadProcess(Spawn {runImpl(io); success.set(true) }, success) + } +} +private final class ThreadProcess(thread: Thread, success: SyncVar[Boolean]) extends Process +{ + override def exitValue() = + { + thread.join() + if(success.get) 0 else 1 + } + override def destroy() { thread.interrupt() } +} + +object Uncloseable +{ + def apply(in: InputStream): InputStream = new FilterInputStream(in) { override def 
close() {} } + def apply(out: OutputStream): OutputStream = new FilterOutputStream(out) { override def close() {} } + def protect(in: InputStream): InputStream = if(in eq System.in) Uncloseable(in) else in + def protect(out: OutputStream): OutputStream = if( (out eq System.out) || (out eq System.err)) Uncloseable(out) else out +} +private object Streamed +{ + def apply[T](nonzeroException: Boolean): Streamed[T] = + { + val q = new java.util.concurrent.LinkedBlockingQueue[Either[Int, T]] + def next(): Stream[T] = + q.take match + { + case Left(0) => Stream.empty + case Left(code) => if(nonzeroException) error("Nonzero exit code: " + code) else Stream.empty + case Right(s) => Stream.cons(s, next) + } + new Streamed((s: T) => q.put(Right(s)), code => q.put(Left(code)), () => next()) + } +} + +private final class Streamed[T](val process: T => Unit, val done: Int => Unit, val stream: () => Stream[T]) extends NotNull \ No newline at end of file diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala new file mode 100644 index 000000000..0d7141635 --- /dev/null +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -0,0 +1,93 @@ +package sbt + +import java.io.File +import org.scalacheck.{Arbitrary, Gen, Prop, Properties} +import Prop._ + +import Process._ + +object ProcessSpecification extends Properties("Process I/O") +{ + private val log = new ConsoleLogger + + implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary(Gen.choose(0, 10) flatMap { size => Gen.resize(size, Arbitrary.arbArray[Byte].arbitrary) }) + + /*property("Correct exit code") = forAll( (exitCode: Byte) => checkExit(exitCode)) + property("#&& correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #&& _)(_ && _)) + property("#|| correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #|| _)(_ || _)) + property("### correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ ### 
_)( (x,latest) => latest))*/ + property("Pipe to output file") = forAll( (data: Array[Byte]) => checkFileOut(data)) + property("Pipe to input file") = forAll( (data: Array[Byte]) => checkFileIn(data)) + property("Pipe to process") = forAll( (data: Array[Byte]) => checkPipe(data)) + + private def checkBinary(codes: Array[Byte])(reduceProcesses: (ProcessBuilder, ProcessBuilder) => ProcessBuilder)(reduceExit: (Boolean, Boolean) => Boolean) = + { + (codes.length > 1) ==> + { + val unsignedCodes = codes.map(unsigned) + val exitCode = unsignedCodes.map(code => Process(process("sbt.exit " + code))).reduceLeft(reduceProcesses) ! + val expectedExitCode = unsignedCodes.map(toBoolean).reduceLeft(reduceExit) + toBoolean(exitCode) == expectedExitCode + } + } + private def toBoolean(exitCode: Int) = exitCode == 0 + private def checkExit(code: Byte) = + { + val exitCode = unsigned(code) + (process("sbt.exit " + exitCode) !) == exitCode + } + private def checkFileOut(data: Array[Byte]) = + { + withData(data) { (temporaryFile, temporaryFile2) => + val catCommand = process("sbt.cat " + temporaryFile.getAbsolutePath) + catCommand #> temporaryFile2 + } + } + private def checkFileIn(data: Array[Byte]) = + { + withData(data) { (temporaryFile, temporaryFile2) => + val catCommand = process("sbt.cat") + temporaryFile #> catCommand #> temporaryFile2 + } + } + private def checkPipe(data: Array[Byte]) = + { + withData(data) { (temporaryFile, temporaryFile2) => + val catCommand = process("sbt.cat") + temporaryFile #> catCommand #| catCommand #> temporaryFile2 + } + } + private def temp() = File.createTempFile("sbt", "") + private def withData(data: Array[Byte])(f: (File, File) => ProcessBuilder) = + { + val temporaryFile1 = temp() + val temporaryFile2 = temp() + try + { + IO.write(temporaryFile1, data) + val process = f(temporaryFile1, temporaryFile2) + ( process ! 
) == 0 && + { + val b1 = IO.readBytes(temporaryFile1) + val b2 = IO.readBytes(temporaryFile2) + b1 sameElements b2 + } + } + finally + { + temporaryFile1.delete() + temporaryFile2.delete() + } + } + private def unsigned(b: Byte): Int = ((b: Int) +256) % 256 + private def process(command: String) = + { + val ignore = echo // just for the compile dependency so that this test is rerun when TestedProcess.scala changes, not used otherwise + + val thisClasspath = List(getSource[ScalaObject], getSource[IO.type], getSource[SourceTag]).mkString(File.pathSeparator) + "java -cp " + thisClasspath + " " + command + } + private def getSource[T : Manifest]: String = + IO.classLocationFile[T].getAbsolutePath +} +private trait SourceTag \ No newline at end of file diff --git a/util/process/src/test/scala/TestedProcess.scala b/util/process/src/test/scala/TestedProcess.scala new file mode 100644 index 000000000..c013de531 --- /dev/null +++ b/util/process/src/test/scala/TestedProcess.scala @@ -0,0 +1,56 @@ +package sbt + +import java.io.{File, FileNotFoundException, IOException} + +object exit +{ + def main(args: Array[String]) + { + System.exit(java.lang.Integer.parseInt(args(0))) + } +} +object cat +{ + def main(args: Array[String]) + { + try { + if(args.length == 0) + IO.transfer(System.in, System.out) + else + catFiles(args.toList) + System.exit(0) + } catch { + case e => + e.printStackTrace() + System.err.println("Error: " + e.toString) + System.exit(1) + } + } + private def catFiles(filenames: List[String]): Option[String] = + { + filenames match + { + case head :: tail => + val file = new File(head) + if(file.isDirectory) + throw new IOException("Is directory: " + file) + else if(file.exists) + { + Using.fileInputStream(file) { stream => + IO.transfer(stream, System.out) + } + catFiles(tail) + } + else + throw new FileNotFoundException("No such file or directory: " + file) + case Nil => None + } + } +} +object echo +{ + def main(args: Array[String]) + { + 
System.out.println(args.mkString(" ")) + } +} \ No newline at end of file From 96c50975f205c621a097137b09f8fac2bef68706 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 14 Jul 2010 19:24:50 -0400 Subject: [PATCH 059/823] * move Environment classes to util/env module * move TrapExit, SelectMainClass to run module * rearrange some compilation-related code * Jetty-related code moved to web module --- LICENSE | 2 +- NOTICE | 2 +- util/collection/Dag.scala | 34 +++++++++++ .../src/test/scala/DagSpecification.scala | 56 +++++++++++++++++++ util/control/ExitHook.scala | 32 +++++++++++ util/process/NOTICE | 3 + 6 files changed, 127 insertions(+), 2 deletions(-) create mode 100644 util/collection/Dag.scala create mode 100644 util/collection/src/test/scala/DagSpecification.scala create mode 100644 util/control/ExitHook.scala create mode 100644 util/process/NOTICE diff --git a/LICENSE b/LICENSE index 15f983d64..46c73ae23 100644 --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (c) 2008, 2009, 2010 Mark Harrah, Tony Sloane, Jason Zaugg +Copyright (c) 2008, 2009, 2010 Steven Blundy, Josh Cough, Mark Harrah, Stuart Roebuck, Tony Sloane, Vesa Vilhonen, Jason Zaugg All rights reserved. 
Redistribution and use in source and binary forms, with or without diff --git a/NOTICE b/NOTICE index 63326efeb..88899abdc 100644 --- a/NOTICE +++ b/NOTICE @@ -1,4 +1,4 @@ -Simple Build Tool (xsbt components other than sbt/) +Simple Build Tool Copyright 2008, 2009, 2010 Mark Harrah, Jason Zaugg Licensed under BSD-style license (see LICENSE) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala new file mode 100644 index 000000000..3ecc1f95b --- /dev/null +++ b/util/collection/Dag.scala @@ -0,0 +1,34 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 David MacIver, Mark Harrah + */ +package sbt; + +trait Dag[Node <: Dag[Node]]{ + self : Node => + + def dependencies : Iterable[Node] + def topologicalSort = Dag.topologicalSort(self)(_.dependencies) +} +object Dag +{ + import scala.collection.{mutable, JavaConversions}; + import JavaConversions.{asIterable, asSet} + + def topologicalSort[T](root: T)(dependencies: T => Iterable[T]) = { + val discovered = new mutable.HashSet[T] + val finished = asSet(new java.util.LinkedHashSet[T]) + + def visit(dag : T){ + if (!discovered(dag)) { + discovered(dag) = true; + dependencies(dag).foreach(visit); + finished += dag; + } + } + + visit(root); + + finished.toList; + } +} + diff --git a/util/collection/src/test/scala/DagSpecification.scala b/util/collection/src/test/scala/DagSpecification.scala new file mode 100644 index 000000000..7cf19f2df --- /dev/null +++ b/util/collection/src/test/scala/DagSpecification.scala @@ -0,0 +1,56 @@ +/* sbt -- Simple Build Tool + * Copyright 2008 Mark Harrah */ + +package sbt + +import org.scalacheck._ +import Prop._ + +import scala.collection.mutable.HashSet + +object DagSpecification extends Properties("Dag") +{ + property("No repeated nodes") = forAll{ (dag: TestDag) => isSet(dag.topologicalSort) } + property("Sort contains node") = forAll{ (dag: TestDag) => dag.topologicalSort.contains(dag) } + property("Dependencies precede node") = forAll{ (dag: TestDag) => 
dependenciesPrecedeNodes(dag.topologicalSort) } + + implicit lazy val arbTestDag: Arbitrary[TestDag] = Arbitrary(Gen.sized(dagGen)) + private def dagGen(nodeCount: Int): Gen[TestDag] = + { + val nodes = new HashSet[TestDag] + def nonterminalGen(p: Gen.Params): Gen[TestDag] = + { + for(i <- 0 until nodeCount; nextDeps <- Gen.someOf(nodes).apply(p)) + nodes += new TestDag(i, nextDeps) + for(nextDeps <- Gen.someOf(nodes)) yield + new TestDag(nodeCount, nextDeps) + } + Gen.parameterized(nonterminalGen) + } + + private def isSet[T](c: Seq[T]) = Set(c: _*).size == c.size + private def dependenciesPrecedeNodes(sort: List[TestDag]) = + { + val seen = new HashSet[TestDag] + def iterate(remaining: List[TestDag]): Boolean = + { + remaining match + { + case Nil => true + case node :: tail => + if(node.dependencies.forall(seen.contains) && !seen.contains(node)) + { + seen += node + iterate(tail) + } + else + false + } + } + iterate(sort) + } +} +class TestDag(id: Int, val dependencies: Iterable[TestDag]) extends Dag[TestDag] +{ + override def toString = id + "->" + dependencies.mkString("[", ",", "]") +} \ No newline at end of file diff --git a/util/control/ExitHook.scala b/util/control/ExitHook.scala new file mode 100644 index 000000000..00f7b0d66 --- /dev/null +++ b/util/control/ExitHook.scala @@ -0,0 +1,32 @@ +/* sbt -- Simple Build Tool + * Copyright 2009, 2010 Mark Harrah + */ +package sbt + +/** Defines a function to call as sbt exits.*/ +trait ExitHook extends NotNull +{ + /** Provides a name for this hook to be used to provide feedback to the user. */ + def name: String + /** Subclasses should implement this method, which is called when this hook is executed. 
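The depth-first sort in `Dag.topologicalSort` above can be restated as a self-contained sketch, using `scala.collection.mutable.LinkedHashSet` in place of the `JavaConversions`-wrapped `java.util.LinkedHashSet` (the behavior is the same: insertion order is completion order, so every node follows its dependencies):

```scala
// Self-contained sketch of the depth-first topological sort used by Dag.
import scala.collection.mutable

def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = {
  val discovered = mutable.HashSet[T]()
  val finished = mutable.LinkedHashSet[T]() // preserves completion order
  def visit(node: T): Unit =
    if (discovered.add(node)) {             // add returns false if already seen
      dependencies(node).foreach(visit)
      finished += node                      // appended only after all dependencies
    }
  visit(root)
  finished.toList
}

// "c" depends on "a" and "b"; "b" depends on "a".
val deps = Map("a" -> Nil, "b" -> List("a"), "c" -> List("a", "b"))
```

Note that, as with `Dag`, the root itself is always the last element of the result, which is exactly what the "Sort contains node" and "Dependencies precede node" properties in `DagSpecification` check.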
*/ + def runBeforeExiting(): Unit +} + +trait ExitHookRegistry +{ + def register(hook: ExitHook): Unit + def unregister(hook: ExitHook): Unit +} + + +class ExitHooks extends ExitHookRegistry +{ + private val exitHooks = new scala.collection.mutable.HashSet[ExitHook] + def register(hook: ExitHook) { exitHooks += hook } + def unregister(hook: ExitHook) { exitHooks -= hook } + /** Calls each registered exit hook, trapping any exceptions so that each hook is given a chance to run. */ + def runExitHooks(debug: String => Unit): List[Throwable] = + exitHooks.toList.flatMap( hook => + ErrorHandling.wideConvert( hook.runBeforeExiting() ).left.toOption + ) +} \ No newline at end of file diff --git a/util/process/NOTICE b/util/process/NOTICE new file mode 100644 index 000000000..789c9ff1f --- /dev/null +++ b/util/process/NOTICE @@ -0,0 +1,3 @@ +Simple Build Tool: Process Component +Copyright 2008, 2009, 2010 Mark Harrah, Vesa Vilhonen +Licensed under BSD-style license (see LICENSE) \ No newline at end of file From b1b53e115e18674ad147dabfcee57f6bdd813e4c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 17 Jul 2010 12:07:41 -0400 Subject: [PATCH 060/823] first shot at general command/definition model --- util/control/ExitHook.scala | 16 +++------------- 1 file changed, 3 insertions(+), 13 deletions(-) diff --git a/util/control/ExitHook.scala b/util/control/ExitHook.scala index 00f7b0d66..1e491b095 100644 --- a/util/control/ExitHook.scala +++ b/util/control/ExitHook.scala @@ -12,21 +12,11 @@ trait ExitHook extends NotNull def runBeforeExiting(): Unit } -trait ExitHookRegistry +object ExitHooks { - def register(hook: ExitHook): Unit - def unregister(hook: ExitHook): Unit -} - - -class ExitHooks extends ExitHookRegistry -{ - private val exitHooks = new scala.collection.mutable.HashSet[ExitHook] - def register(hook: ExitHook) { exitHooks += hook } - def unregister(hook: ExitHook) { exitHooks -= hook } /** Calls each registered exit hook, trapping any exceptions so that each 
hook is given a chance to run. */ - def runExitHooks(debug: String => Unit): List[Throwable] = - exitHooks.toList.flatMap( hook => + def runExitHooks(exitHooks: Seq[ExitHook]): Seq[Throwable] = + exitHooks.flatMap( hook => ErrorHandling.wideConvert( hook.runBeforeExiting() ).left.toOption ) } \ No newline at end of file From 384924691bbdf91354988ab86ac862f806846511 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 19 Jul 2010 12:32:13 -0400 Subject: [PATCH 061/823] unnecessary import in BufferedLogger --- util/log/BufferedLogger.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 054a4b55d..38f845e11 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -3,7 +3,6 @@ */ package sbt - import sbt.{AbstractLogger, ControlEvent, Level, Log, LogEvent, SetLevel, SetTrace, Success, Trace} import scala.collection.mutable.ListBuffer /** A logger that can buffer the logging done on it and then can flush the buffer From fbb8db813208b899ad498ed338b6469f43826253 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 27 Jul 2010 23:01:45 -0400 Subject: [PATCH 062/823] adding more commands --- util/complete/LineReader.scala | 86 ++++++++++++++++++++++++++++++++++ 1 file changed, 86 insertions(+) create mode 100644 util/complete/LineReader.scala diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala new file mode 100644 index 000000000..e7c9ec67a --- /dev/null +++ b/util/complete/LineReader.scala @@ -0,0 +1,86 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ +package sbt +import jline.{Completor, ConsoleReader} +abstract class JLine extends LineReader +{ + protected[this] val reader: ConsoleReader + def readLine(prompt: String) = JLine.withJLine { unsynchronizedReadLine(prompt) } + private[this] def unsynchronizedReadLine(prompt: String) = + reader.readLine(prompt) match + { + case null => None + case x => Some(x.trim) + } +} +private 
object JLine +{ + def terminal = jline.Terminal.getTerminal + def resetTerminal() = withTerminal { _ => jline.Terminal.resetTerminal } + private def withTerminal[T](f: jline.Terminal => T): T = + synchronized + { + val t = terminal + t.synchronized { f(t) } + } + def createReader() = + withTerminal { t => + val cr = new ConsoleReader + t.enableEcho() + cr.setBellEnabled(false) + cr + } + def withJLine[T](action: => T): T = + withTerminal { t => + t.disableEcho() + try { action } + finally { t.enableEcho() } + } + private[sbt] def initializeHistory(cr: ConsoleReader, historyPath: Option[Path]): Unit = + for(historyLocation <- historyPath) + { + val historyFile = historyLocation.asFile + ErrorHandling.wideConvert + { + historyFile.getParentFile.mkdirs() + val history = cr.getHistory + history.setMaxSize(MaxHistorySize) + history.setHistoryFile(historyFile) + } + } + def simple(historyPath: Option[Path]): SimpleReader = new SimpleReader(historyPath) + val MaxHistorySize = 500 +} + +trait LineReader extends NotNull +{ + def readLine(prompt: String): Option[String] +} +private[sbt] final class LazyJLineReader(historyPath: Option[Path], completor: => Completor) extends JLine +{ + protected[this] val reader = + { + val cr = new ConsoleReader + cr.setBellEnabled(false) + JLine.initializeHistory(cr, historyPath) + cr.addCompletor(new LazyCompletor(completor)) + cr + } +} +private class LazyCompletor(delegate0: => Completor) extends Completor +{ + private lazy val delegate = delegate0 + def complete(buffer: String, cursor: Int, candidates: java.util.List[_]): Int = + delegate.complete(buffer, cursor, candidates) +} + +class SimpleReader private[sbt] (historyPath: Option[Path]) extends JLine +{ + protected[this] val reader = JLine.createReader() + JLine.initializeHistory(reader, historyPath) +} +object SimpleReader extends JLine +{ + protected[this] val reader = JLine.createReader() +} \ No newline at end of file From 9c8cf4451d7dcba5417f2f3dd24ea9f6fe392dfd Mon Sep 17 
00:00:00 2001 From: Mark Harrah Date: Wed, 4 Aug 2010 19:48:48 -0400 Subject: [PATCH 063/823] remove call-by-name modifier for error function, doesn't work well --- util/complete/History.scala | 4 ++-- util/complete/HistoryCommands.scala | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/util/complete/History.scala b/util/complete/History.scala index bf009f626..e792454f7 100644 --- a/util/complete/History.scala +++ b/util/complete/History.scala @@ -6,7 +6,7 @@ package complete import History.number -final class History private(lines: IndexedSeq[String], error: (=> String) => Unit) extends NotNull +final class History private(lines: IndexedSeq[String], error: String => Unit) extends NotNull { private def reversed = lines.reverse @@ -41,7 +41,7 @@ final class History private(lines: IndexedSeq[String], error: (=> String) => Uni object History { - def apply(lines: Seq[String], error: (=> String) => Unit): History = new History(lines.toIndexedSeq, error) + def apply(lines: Seq[String], error: String => Unit): History = new History(lines.toIndexedSeq, error) def number(s: String): Option[Int] = try { Some(s.toInt) } diff --git a/util/complete/HistoryCommands.scala b/util/complete/HistoryCommands.scala index a03c9cfca..a5b321d8c 100644 --- a/util/complete/HistoryCommands.scala +++ b/util/complete/HistoryCommands.scala @@ -38,7 +38,7 @@ object HistoryCommands def printHelp(): Unit = println(helpString) - def apply(s: String, historyPath: Option[Path], maxLines: Int, error: (=> String) => Unit): Option[List[String]] = + def apply(s: String, historyPath: Option[Path], maxLines: Int, error: String => Unit): Option[List[String]] = if(s.isEmpty) { printHelp() From 6cada88fb3890e69e73f8a8782ab8ab6bcc40a4a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 10 Aug 2010 08:39:30 -0400 Subject: [PATCH 064/823] split out read-only RMap from PMap --- util/collection/PMap.scala | 14 ++++++++++---- 1 file changed, 10 insertions(+), 4 deletions(-) diff --git 
a/util/collection/PMap.scala b/util/collection/PMap.scala index bc5e092af..e6b002995 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -5,12 +5,15 @@ package sbt import Types._ -trait PMap[K[_], V[_]] extends (K ~> V) +trait RMap[K[_], V[_]] { def apply[T](k: K[T]): V[T] def get[T](k: K[T]): Option[V[T]] - def update[T](k: K[T], v: V[T]): Unit def contains[T](k: K[T]): Boolean +} +trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] +{ + def update[T](k: K[T], v: V[T]): Unit def remove[T](k: K[T]): Option[V[T]] def getOrUpdate[T](k: K[T], make: => V[T]): V[T] } @@ -27,8 +30,11 @@ abstract class AbstractPMap[K[_], V[_]] extends PMap[K,V] import collection.mutable.Map -/** Only suitable for K that is invariant in its parameter. -* Option and List keys are not, for example, because None <:< Option[String] and None <: Option[Int].*/ +/** +* Only suitable for K that is invariant in its type parameter. +* Option and List keys are not suitable, for example, +* because None <:< Option[String] and None <:< Option[Int]. 
+*/ class DelegatingPMap[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractPMap[K,V] { def get[T](k: K[T]): Option[V[T]] = cast[T]( backing.get(k) ) From 3bc345ffe091c812465d5d981e50211b9cb6b19d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 10 Aug 2010 08:40:14 -0400 Subject: [PATCH 065/823] type alias A ~>| B for [T]A[T] => Option[B[T]] --- util/collection/TypeFunctions.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 942a00d72..92a5fb4b7 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -17,6 +17,8 @@ trait TypeFunctions implicit def toFn1[A,B](f: A => B): Fn1[A,B] = new Fn1[A,B] { def ∙[C](g: C => A) = f compose g } + + type ~>|[A[_],B[_]] = A ~> Compose[Option, B]#Apply } object TypeFunctions extends TypeFunctions From 9520c6eae30b556a8b03e4e65b9ae8938ff904c1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 14 Aug 2010 09:46:49 -0400 Subject: [PATCH 066/823] KList updates add conversion from List[M[_]] to KList[M, HList] required KList to be covariant in its HList parameter --- util/collection/KList.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 1a6e72554..b2ad40859 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -9,7 +9,7 @@ import Types._ * type parameters HL. The underlying data is M applied to each type parameter. * Explicitly tracking M[_] allows performing natural transformations or ensuring * all data conforms to some common type. 
*/ -sealed trait KList[+M[_], HL <: HList] { +sealed trait KList[+M[_], +HL <: HList] { type Raw = HL /** Transform to the underlying HList type.*/ def down(implicit ev: M ~> Id): HL @@ -44,4 +44,6 @@ object KList { // nicer alias for pattern matching val :^: = KCons + + def fromList[M[_]](s: Seq[M[_]]): KList[M, HList] = if(s.isEmpty) KNil else KCons(s.head, fromList(s.tail)) } From 6d0d3a1e4d3fd145a21b6bd64af4a8f84e0a8c52 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 14 Aug 2010 09:49:28 -0400 Subject: [PATCH 067/823] remove Rewrite --- util/collection/Rewrite.scala | 42 ---------------- .../src/test/scala/RewriteTest.scala | 50 ------------------- 2 files changed, 92 deletions(-) delete mode 100644 util/collection/Rewrite.scala delete mode 100644 util/collection/src/test/scala/RewriteTest.scala diff --git a/util/collection/Rewrite.scala b/util/collection/Rewrite.scala deleted file mode 100644 index 40b40add4..000000000 --- a/util/collection/Rewrite.scala +++ /dev/null @@ -1,42 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt - -import Types._ - -trait Rewrite[A[_]] -{ - def apply[T](node: A[T], rewrite: Rewrite[A]): A[T] -} -object Rewrite -{ - def Id[A[_]]: Rewrite[A] = new Rewrite[A] { def apply[T](node: A[T], rewrite: Rewrite[A]) = node } - - implicit def specificF[T](f: T => T): Rewrite[Const[T]#Apply] = new Rewrite[Const[T]#Apply] { - def apply[S](node:T, rewrite: Rewrite[Const[T]#Apply]): T = f(node) - } - implicit def pToRewrite[A[_]](p: Param[A,A] => Unit): Rewrite[A] = toRewrite(Param.pToT(p)) - implicit def toRewrite[A[_]](f: A ~> A): Rewrite[A] = new Rewrite[A] { - def apply[T](node: A[T], rewrite:Rewrite[A]) = f(node) - } - def compose[A[_]](a: Rewrite[A], b: Rewrite[A]): Rewrite[A] = - new Rewrite[A] { - def apply[T](node: A[T], rewrite: Rewrite[A]) = - a(b(node, rewrite), rewrite) - } - implicit def rewriteOps[A[_]](outer: Rewrite[A]): RewriteOps[A] = - new RewriteOps[A] { - def ∙(g: A ~> A): Rewrite[A] 
= compose(outer, g) - def andThen(g: A ~> A): Rewrite[A] = compose(g, outer) - def ∙(g: Rewrite[A]): Rewrite[A] = compose(outer, g) - def andThen(g: Rewrite[A]): Rewrite[A] = compose(g, outer) - } - def apply[A[_], T](value: A[T])(implicit rewrite: Rewrite[A]): A[T] = rewrite(value, rewrite) -} -trait RewriteOps[A[_]] -{ - def andThen(g: A ~> A): Rewrite[A] - def ∙(g: A ~> A): Rewrite[A] - def ∙(g: Rewrite[A]): Rewrite[A] -} \ No newline at end of file diff --git a/util/collection/src/test/scala/RewriteTest.scala b/util/collection/src/test/scala/RewriteTest.scala deleted file mode 100644 index c2ca1b237..000000000 --- a/util/collection/src/test/scala/RewriteTest.scala +++ /dev/null @@ -1,50 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt - -import Types._ - -object RewriteTest -{ - // dist and add0 show the awkwardness when not just manipulating superstructure: - // would have to constrain the parameters to Term to be instances of Zero/Eq somehow - val dist: Rewrite[Term] = (p: Param[Term, Term]) => p.ret( p.in match { - case Add(Mult(a,b),Mult(c,d)) if a == c=> Mult(a, Add(b,d)) - case x => x - }) - val add0: Rewrite[Term] = (p: Param[Term, Term]) => p.ret( p.in match { - case Add(V(0), y) => y - case Add(x, V(0)) => x - case x => x - }) - val rewriteBU= new Rewrite[Term] { - def apply[T](node: Term[T], rewrite: Rewrite[Term]) = { - def r[T](node: Term[T]) = rewrite(node, rewrite) - node match { - case Add(x, y) => Add(r(x), r(y)) - case Mult(x, y) => Mult(r(x), r(y)) - case x => x - } - } - } - - val d2 = dist ∙ add0 ∙ rewriteBU - - implicit def toV(t: Int): V[Int] = V(t) - implicit def toVar(s: String): Var[Int] = Var[Int](s) - - val t1: Term[Int] = Add(Mult(3,4), Mult(4, 5)) - val t2: Term[Int] = Add(Mult(4,4), Mult(4, 5)) - val t3: Term[Int] = Add(Mult(Add("x", 0),4), Mult("x", 5)) - - println( Rewrite(t1)(d2) ) - println( Rewrite(t2)(d2) ) - println( Rewrite(t3)(d2) ) -} - -sealed trait Term[T] -final case class Add[T](a: 
Term[T], b: Term[T]) extends Term[T] -final case class Mult[T](a: Term[T], b: Term[T]) extends Term[T] -final case class V[T](v: T) extends Term[T] -final case class Var[T](name: String) extends Term[T] From 48d5ec5da495326ccaf1b1215b0fed87bf4912a4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 21 Aug 2010 22:49:11 -0400 Subject: [PATCH 068/823] clean up Process subproject no longer has any dependencies small ProcessLogger interface to send buffered out/err to commented out (but working) implicit conversions from Logger -> ProcessLogger for use in an integrating project to get original functionality --- util/log/Logger.scala | 8 ++++ util/process/Process.scala | 34 ++++++++++------- util/process/ProcessImpl.scala | 37 +++++++++---------- .../src/test/scala/ProcessSpecification.scala | 2 - 4 files changed, 46 insertions(+), 35 deletions(-) diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 68e1f26d8..0751b72c8 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -33,6 +33,14 @@ abstract class AbstractLogger extends Logger } } +/* +These need to go in a module that integrates Logger and Process. +object Logger +{ + implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger + implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) +}*/ + /** This is intended to be the simplest logging interface for use by code that wants to log. * It does not include configuring the logger. */ trait Logger extends xLogger diff --git a/util/process/Process.scala b/util/process/Process.scala index 183469516..536d40eea 100644 --- a/util/process/Process.scala +++ b/util/process/Process.scala @@ -103,49 +103,49 @@ trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder * sent to the console. If the exit code is non-zero, an exception is thrown.*/ def !! 
: String /** Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is - * sent to the provided Logger. If the exit code is non-zero, an exception is thrown.*/ - def !!(log: Logger) : String + * sent to the provided ProcessLogger. If the exit code is non-zero, an exception is thrown.*/ + def !!(log: ProcessLogger) : String /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, * the Stream will provide all lines up to termination and then throw an exception. */ def lines: Stream[String] /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the provided Logger. If the process exits with a non-zero value, + * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, * the Stream will provide all lines up to termination but will not throw an exception. */ - def lines(log: Logger): Stream[String] + def lines(log: ProcessLogger): Stream[String] /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, * the Stream will provide all lines up to termination but will not throw an exception. */ def lines_! : Stream[String] /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the provided Logger. If the process exits with a non-zero value, + * but the process has not completed. 
Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, * the Stream will provide all lines up to termination but will not throw an exception. */ - def lines_!(log: Logger): Stream[String] + def lines_!(log: ProcessLogger): Stream[String] /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are * sent to the console.*/ def ! : Int /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given Logger.*/ - def !(log: Logger): Int + * sent to the given ProcessLogger.*/ + def !(log: ProcessLogger): Int /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are * sent to the console. The newly started process reads from standard input of the current process.*/ def !< : Int /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given Logger. The newly started process reads from standard input of the current process.*/ - def !<(log: Logger) : Int + * sent to the given ProcessLogger. The newly started process reads from standard input of the current process.*/ + def !<(log: ProcessLogger) : Int /** Starts the process represented by this builder. Standard output and error are sent to the console.*/ def run(): Process - /** Starts the process represented by this builder. Standard output and error are sent to the given Logger.*/ - def run(log: Logger): Process + /** Starts the process represented by this builder. Standard output and error are sent to the given ProcessLogger.*/ + def run(log: ProcessLogger): Process /** Starts the process represented by this builder. I/O is handled by the given ProcessIO instance.*/ def run(io: ProcessIO): Process /** Starts the process represented by this builder. 
Standard output and error are sent to the console. * The newly started process reads from standard input of the current process if `connectInput` is true.*/ def run(connectInput: Boolean): Process /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given Logger. + * sent to the given ProcessLogger. * The newly started process reads from standard input of the current process if `connectInput` is true.*/ - def run(log: Logger, connectInput: Boolean): Process + def run(log: ProcessLogger, connectInput: Boolean): Process /** Constructs a command that runs this command first and then `other` if this command succeeds.*/ def #&& (other: ProcessBuilder): ProcessBuilder @@ -164,4 +164,10 @@ final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: I def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError) def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process) def withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError) +} +trait ProcessLogger +{ + def info(s: => String): Unit + def error(s: => String): Unit + def buffer[T](f: => T): T } \ No newline at end of file diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index 5f89a4749..c20b23f20 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -43,17 +43,19 @@ private object Future object BasicIO { - def apply(buffer: StringBuffer, log: Option[Logger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log)) - def apply(log: Logger, withIn: Boolean) = new ProcessIO(input(withIn), processFully(log, Level.Info), processFully(log, Level.Error)) + def apply(buffer: StringBuffer, log: Option[ProcessLogger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log)) 
+ def apply(log: ProcessLogger, withIn: Boolean) = new ProcessIO(input(withIn), processInfoFully(log), processErrFully(log)) - def getErr(log: Option[Logger]) = log match { case Some(lg) => processFully(lg, Level.Error); case None => toStdErr } + def getErr(log: Option[ProcessLogger]) = log match { case Some(lg) => processErrFully(lg); case None => toStdErr } + + private def processErrFully(log: ProcessLogger) = processFully(s => log.error(s)) + private def processInfoFully(log: ProcessLogger) = processFully(s => log.info(s)) def ignoreOut = (i: OutputStream) => () final val BufferSize = 8192 final val Newline = System.getProperty("line.separator") def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } - def processFully(log: Logger, level: Level.Value): InputStream => Unit = processFully(line => log.log(level, line)) def processFully(buffer: Appendable): InputStream => Unit = processFully(appendLine(buffer)) def processFully(processLine: String => Unit): InputStream => Unit = in => @@ -128,26 +130,26 @@ private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPa def run(): Process = run(false) def run(connectInput: Boolean): Process = run(BasicIO.standard(connectInput)) - def run(log: Logger): Process = run(log, false) - def run(log: Logger, connectInput: Boolean): Process = run(BasicIO(log, connectInput)) + def run(log: ProcessLogger): Process = run(log, false) + def run(log: ProcessLogger, connectInput: Boolean): Process = run(BasicIO(log, connectInput)) - private[this] def getString(log: Option[Logger], withIn: Boolean): String = + private[this] def getString(log: Option[ProcessLogger], withIn: Boolean): String = { val buffer = new StringBuffer val code = this ! BasicIO(buffer, log, withIn) if(code == 0) buffer.toString else error("Nonzero exit value: " + code) } def !! 
= getString(None, false) - def !!(log: Logger) = getString(Some(log), false) + def !!(log: ProcessLogger) = getString(Some(log), false) def !!< = getString(None, true) - def !!<(log: Logger) = getString(Some(log), true) + def !!<(log: ProcessLogger) = getString(Some(log), true) def lines: Stream[String] = lines(false, true, None) - def lines(log: Logger): Stream[String] = lines(false, true, Some(log)) + def lines(log: ProcessLogger): Stream[String] = lines(false, true, Some(log)) def lines_! : Stream[String] = lines(false, false, None) - def lines_!(log: Logger): Stream[String] = lines(false, false, Some(log)) + def lines_!(log: ProcessLogger): Stream[String] = lines(false, false, Some(log)) - private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[Logger]): Stream[String] = + private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[ProcessLogger]): Stream[String] = { val streamed = Streamed[String](nonZeroException) val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log))) @@ -157,13 +159,10 @@ private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPa def ! 
= run(false).exitValue() def !< = run(true).exitValue() - def !(log: Logger) = runBuffered(log, false) - def !<(log: Logger) = runBuffered(log, true) - private[this] def runBuffered(log: Logger, connectInput: Boolean) = - { - val log2 = new BufferedLogger(new FullLogger(log)) - log2.buffer { run(log2, connectInput).exitValue() } - } + def !(log: ProcessLogger) = runBuffered(log, false) + def !<(log: ProcessLogger) = runBuffered(log, true) + private[this] def runBuffered(log: ProcessLogger, connectInput: Boolean) = + log.buffer { run(log, connectInput).exitValue() } def !(io: ProcessIO) = run(io).exitValue() def canPipeTo = false diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala index 0d7141635..f2f42d3ca 100644 --- a/util/process/src/test/scala/ProcessSpecification.scala +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -8,8 +8,6 @@ import Process._ object ProcessSpecification extends Properties("Process I/O") { - private val log = new ConsoleLogger - implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary(Gen.choose(0, 10) flatMap { size => Gen.resize(size, Arbitrary.arbArray[Byte].arbitrary) }) /*property("Correct exit code") = forAll( (exitCode: Byte) => checkExit(exitCode)) From 5b21bae2449174609c6ec279cb786b03578cd366 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 21 Aug 2010 22:55:42 -0400 Subject: [PATCH 069/823] task system cleanup KList.map -> transform can now drop trailing 'H' from multi-Task 'mapH' compressed Action hierarchy by merging (Flat)Map{ped,All,Failure} into (Flat)Mapped moved most information in Info into attributes: AttributeMap to allow future changes --- util/collection/Attributes.scala | 34 +++++++++++++++++++ util/collection/KList.scala | 17 ++++++---- util/collection/Types.scala | 4 +++ .../collection/src/test/scala/KListTest.scala | 2 +- 4 files changed, 49 insertions(+), 8 deletions(-) create mode 100644 
util/collection/Attributes.scala diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala new file mode 100644 index 000000000..231cda300 --- /dev/null +++ b/util/collection/Attributes.scala @@ -0,0 +1,34 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + +import Types._ + +// T must be invariant to work properly. +// Because it is sealed and the only instances go through make, +// a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts +sealed trait AttributeKey[T] +object AttributeKey +{ + def make[T]: AttributeKey[T] = new AttributeKey[T] {} +} + +trait AttributeMap +{ + def apply[T](k: AttributeKey[T]): T + def get[T](k: AttributeKey[T]): Option[T] + def contains[T](k: AttributeKey[T]): Boolean + def put[T](k: AttributeKey[T], value: T): AttributeMap +} +object AttributeMap +{ + def empty: AttributeMap = new BasicAttributeMap(Map.empty) +} +private class BasicAttributeMap(backing: Map[AttributeKey[_], Any]) extends AttributeMap +{ + def apply[T](k: AttributeKey[T]) = backing(k).asInstanceOf[T] + def get[T](k: AttributeKey[T]) = backing.get(k).asInstanceOf[Option[T]] + def contains[T](k: AttributeKey[T]) = backing.contains(k) + def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap( backing.updated(k, value) ) +} \ No newline at end of file diff --git a/util/collection/KList.scala b/util/collection/KList.scala index b2ad40859..81ef6afcd 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -9,21 +9,23 @@ import Types._ * type parameters HL. The underlying data is M applied to each type parameter. * Explicitly tracking M[_] allows performing natural transformations or ensuring * all data conforms to some common type. */ -sealed trait KList[+M[_], +HL <: HList] { +sealed trait KList[+M[_], +HL <: HList] +{ type Raw = HL /** Transform to the underlying HList type.*/ def down(implicit ev: M ~> Id): HL /** Apply a natural transformation. 
*/ - def map[N[_]](f: M ~> N): KList[N, HL] + def transform[N[_]](f: M ~> N): KList[N, HL] /** Convert to a List. */ def toList: List[M[_]] /** Convert to an HList. */ def combine[N[X] >: M[X]]: HL#Wrap[N] } -final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] { - def down(implicit f: M ~> Id) = HCons(f(head), tail.down(f)) - def map[N[_]](f: M ~> N) = KCons( f(head), tail.map(f) ) +final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] +{ + def down(implicit f: M ~> Id) = HCons(f(head), tail down f) + def transform[N[_]](f: M ~> N) = KCons( f(head), tail transform f ) // prepend def :^: [N[X] >: M[X], G](g: N[G]) = KCons(g, this) def toList = head :: tail.toList @@ -31,9 +33,10 @@ final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) exten def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) } -sealed class KNil extends KList[Nothing, HNil] { +sealed class KNil extends KList[Nothing, HNil] +{ def down(implicit f: Nothing ~> Id) = HNil - def map[N[_]](f: Nothing ~> N) = KNil + def transform[N[_]](f: Nothing ~> N) = KNil def :^: [M[_], H](h: M[H]) = KCons(h, this) def toList = Nil def combine[N[X]] = HNil diff --git a/util/collection/Types.scala b/util/collection/Types.scala index c5d484c51..abd9ee06b 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -8,4 +8,8 @@ object Types extends TypeFunctions val :^: = KCons val :+: = HCons type :+:[H, T <: HList] = HCons[H,T] + + implicit def hconsToK[M[_], H, T <: HList](h: M[H] :+: T)(implicit mt: T => KList[M, T]): KList[M, H :+: T] = + KCons[H, T, M](h.head, mt(h.tail) ) + implicit def hnilToK(hnil: HNil): KNil = KNil } diff --git a/util/collection/src/test/scala/KListTest.scala b/util/collection/src/test/scala/KListTest.scala index 210084fb1..2ca25a31a 100644 --- a/util/collection/src/test/scala/KListTest.scala +++ 
b/util/collection/src/test/scala/KListTest.scala @@ -9,7 +9,7 @@ object KTest { val f = new (Option ~> List) { def apply[T](o: Option[T]): List[T] = o.toList } val x = Some(3) :^: Some("asdf") :^: KNil - val y = x map f + val y = x transform f val m1a = y match { case List(3) :^: List("asdf") :^: KNil => println("true") } val m1b = (List(3) :^: KNil) match { case yy :^: KNil => println("true") } From d12adcd7aede251c5025426bc47bff0d100fdb9d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 22 Aug 2010 19:07:46 -0400 Subject: [PATCH 070/823] fix Logger/Process --- util/log/Logger.scala | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 0751b72c8..4a0f9c365 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -33,13 +33,11 @@ abstract class AbstractLogger extends Logger } } -/* -These need to go in a module that integrates Logger and Process. object Logger { implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) -}*/ +} /** This is intended to be the simplest logging interface for use by code that wants to log. * It does not include configuring the logger. */ From dd8d58a9c06a44272ca83d0e38a5157f1f1a6120 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 27 Aug 2010 19:17:03 -0400 Subject: [PATCH 071/823] cross-configurations --- util/collection/Attributes.scala | 30 ++++++++++++++++++++++++++---- 1 file changed, 26 insertions(+), 4 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 231cda300..488142ad7 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -8,10 +8,12 @@ import Types._ // T must be invariant to work properly. 
// Because it is sealed and the only instances go through make, // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts -sealed trait AttributeKey[T] +sealed trait AttributeKey[T] { + def label: String +} object AttributeKey { - def make[T]: AttributeKey[T] = new AttributeKey[T] {} + def apply[T](name: String): AttributeKey[T] = new AttributeKey[T] { def label = name } } trait AttributeMap @@ -20,15 +22,35 @@ trait AttributeMap def get[T](k: AttributeKey[T]): Option[T] def contains[T](k: AttributeKey[T]): Boolean def put[T](k: AttributeKey[T], value: T): AttributeMap + def keys: Iterable[AttributeKey[_]] + def ++(o: AttributeMap): AttributeMap + def entries: Iterable[AttributeEntry[_]] + def isEmpty: Boolean } object AttributeMap { - def empty: AttributeMap = new BasicAttributeMap(Map.empty) + val empty: AttributeMap = new BasicAttributeMap(Map.empty) } -private class BasicAttributeMap(backing: Map[AttributeKey[_], Any]) extends AttributeMap +private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) extends AttributeMap { + def isEmpty: Boolean = backing.isEmpty def apply[T](k: AttributeKey[T]) = backing(k).asInstanceOf[T] def get[T](k: AttributeKey[T]) = backing.get(k).asInstanceOf[Option[T]] def contains[T](k: AttributeKey[T]) = backing.contains(k) def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap( backing.updated(k, value) ) + def keys: Iterable[AttributeKey[_]] = backing.keys + def ++(o: AttributeMap): AttributeMap = + o match { + case bam: BasicAttributeMap => new BasicAttributeMap(backing ++ bam.backing) + case _ => o ++ this + } + def entries: Iterable[AttributeEntry[_]] = + for( (k: AttributeKey[kt], v) <- backing) yield AttributeEntry(k, v.asInstanceOf[kt]) + override def toString = entries.mkString("(", ", ", ")") +} + +// type inference required less generality +final case class AttributeEntry[T](a: AttributeKey[T], b: T) +{ + override def toString = a.label + ": " + b } \ No 
newline at end of file From 12336b79f67cc7c7a4588ce493c86de107bd9513 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 30 Aug 2010 09:10:25 -0400 Subject: [PATCH 072/823] minor updates for p2 --- util/collection/Types.scala | 13 ++++++++----- 1 file changed, 8 insertions(+), 5 deletions(-) diff --git a/util/collection/Types.scala b/util/collection/Types.scala index abd9ee06b..42b81f990 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -3,13 +3,16 @@ */ package sbt -object Types extends TypeFunctions +object Types extends Types { - val :^: = KCons - val :+: = HCons - type :+:[H, T <: HList] = HCons[H,T] - implicit def hconsToK[M[_], H, T <: HList](h: M[H] :+: T)(implicit mt: T => KList[M, T]): KList[M, H :+: T] = KCons[H, T, M](h.head, mt(h.tail) ) implicit def hnilToK(hnil: HNil): KNil = KNil } + +trait Types extends TypeFunctions +{ + val :^: = KCons + val :+: = HCons + type :+:[H, T <: HList] = HCons[H,T] +} From 34df04c37806d1969c5bad994fcf0bbd137dbf74 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 4 Sep 2010 08:07:51 -0400 Subject: [PATCH 073/823] cleanup --- cache/tracking/DependencyTracking.scala | 136 ------------------------ cache/tracking/TrackingFormat.scala | 65 ----------- 2 files changed, 201 deletions(-) delete mode 100644 cache/tracking/DependencyTracking.scala delete mode 100644 cache/tracking/TrackingFormat.scala diff --git a/cache/tracking/DependencyTracking.scala b/cache/tracking/DependencyTracking.scala deleted file mode 100644 index 30e060af3..000000000 --- a/cache/tracking/DependencyTracking.scala +++ /dev/null @@ -1,136 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009, 2010 Mark Harrah - */ -package sbt - -private object DependencyTracking -{ - import scala.collection.mutable.{Set, HashMap, Map, MultiMap} - type DependencyMap[T] = HashMap[T, Set[T]] with MultiMap[T, T] - def newMap[T]: DependencyMap[T] = new HashMap[T, Set[T]] with MultiMap[T, T] - type TagMap[T] = Map[T, Array[Byte]] - def 
newTagMap[T] = new HashMap[T, Array[Byte]] -} - -trait UpdateTracking[T] extends NotNull -{ - def dependency(source: T, dependsOn: T): Unit - def use(source: T, uses: T): Unit - def product(source: T, output: T): Unit - def tag(source: T, t: Array[Byte]): Unit - def read: ReadTracking[T] - // removes files from all maps, both keys and values - def removeAll(files: Iterable[T]): Unit - // removes sources as keys/values in source, product maps and as values in reverseDependencies map - def pending(sources: Iterable[T]): Unit -} -trait ReadTracking[T] extends NotNull -{ - def isProduct(file: T): Boolean - def isSource(file: T): Boolean - def isUsed(file: T): Boolean - def dependsOn(file: T): Set[T] - def products(file: T): Set[T] - def sources(file: T): Set[T] - def usedBy(file: T): Set[T] - def tag(file: T): Array[Byte] - def allProducts: Set[T] - def allSources: Set[T] - def allUsed: Set[T] - def allTags: Seq[(T,Array[Byte])] -} -import DependencyTracking.{DependencyMap => DMap, newMap, newTagMap, TagMap} -private object DefaultTracking -{ - def apply[T](translateProducts: Boolean): DependencyTracking[T] = - new DefaultTracking(translateProducts)(newMap, newMap, newMap, newTagMap) -} -private final class DefaultTracking[T](translateProducts: Boolean) - (val reverseDependencies: DMap[T], val reverseUses: DMap[T], val sourceMap: DMap[T], val tagMap: TagMap[T]) - extends DependencyTracking[T](translateProducts) -{ - val productMap: DMap[T] = forward(sourceMap) // map from a source to its products. 
Keep in sync with sourceMap -} -// if translateProducts is true, dependencies on a product are translated to dependencies on a source -// if there is a source recorded as generating that product -private abstract class DependencyTracking[T](translateProducts: Boolean) extends ReadTracking[T] with UpdateTracking[T] -{ - val reverseDependencies: DMap[T] // map from a file to the files that depend on it - val reverseUses: DMap[T] // map from a file to the files that use it - val sourceMap: DMap[T] // map from a product to its sources. Keep in sync with productMap - val productMap: DMap[T] // map from a source to its products. Keep in sync with sourceMap - val tagMap: TagMap[T] - - def read = this - - final def dependsOn(file: T): Set[T] = get(reverseDependencies, file) - final def products(file: T): Set[T] = get(productMap, file) - final def sources(file: T): Set[T] = get(sourceMap, file) - final def usedBy(file: T): Set[T] = get(reverseUses, file) - final def tag(file: T): Array[Byte] = tagMap.getOrElse(file, new Array[Byte](0)) - - def isProduct(file: T): Boolean = exists(sourceMap, file) - def isSource(file: T): Boolean = exists(productMap, file) - def isUsed(file: T): Boolean = exists(reverseUses, file) - - - final def allProducts = sourceMap.keysIterator.toSet - final def allSources = productMap.keysIterator.toSet - final def allUsed = reverseUses.keysIterator.toSet - final def allTags = tagMap.toSeq - - private def exists(map: DMap[T], value: T): Boolean = map.contains(value) - private def get(map: DMap[T], value: T): Set[T] = map.getOrElse[collection.Set[T]](value, Set.empty[T]).toSet - - final def dependency(sourceFile: T, dependsOn: T) - { - val actualDependencies = - if(!translateProducts) - Seq(dependsOn) - else - sourceMap.getOrElse[Iterable[T]](dependsOn, Seq(dependsOn)) - actualDependencies.foreach { actualDependency => reverseDependencies.add(actualDependency, sourceFile) } - } - final def product(sourceFile: T, product: T) - { - 
productMap.add(sourceFile, product) - sourceMap.add(product, sourceFile) - } - final def use(sourceFile: T, usesFile: T) { reverseUses.add(usesFile, sourceFile) } - final def tag(sourceFile: T, t: Array[Byte]) { tagMap(sourceFile) = t } - - private def removeOneWay(a: DMap[T], files: Iterable[T]): Unit = - a.values.foreach { _ --= files } - private def remove(a: DMap[T], b: DMap[T], file: T): Unit = - for(x <- a.removeKey(file)) b --= x - private def removeAll(files: Iterable[T], a: DMap[T], b: DMap[T]): Unit = - files.foreach { file => remove(a, b, file); remove(b, a, file) } - final def removeAll(files: Iterable[T]) - { - removeAll(files, forward(reverseDependencies), reverseDependencies) - removeAll(files, productMap, sourceMap) - removeAll(files, forward(reverseUses), reverseUses) - tagMap --= files - } - def pending(sources: Iterable[T]) - { - removeOneWay(reverseDependencies, sources) - removeOneWay(reverseUses, sources) - removeAll(sources, productMap, sourceMap) - tagMap --= sources - } - protected final def forward(map: DMap[T]): DMap[T] = - { - val f = newMap[T] - for( (key, values) <- map; value <- values) f.add(value, key) - f - } - override def toString = - (graph("Reverse source dependencies", reverseDependencies) :: - graph("Sources and products", productMap) :: - graph("Reverse uses", reverseUses) :: - Nil) mkString "\n" - def graph(title: String, map: DMap[T]) = - "\"" + title + "\" {\n\t" + graphEntries(map) + "\n}" - def graphEntries(map: DMap[T]) = map.map{ case (key, values) => values.map(key + " -> " + _).mkString("\n\t") }.mkString("\n\t") -} diff --git a/cache/tracking/TrackingFormat.scala b/cache/tracking/TrackingFormat.scala deleted file mode 100644 index d8a5e0f2c..000000000 --- a/cache/tracking/TrackingFormat.scala +++ /dev/null @@ -1,65 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009, 2010 Mark Harrah - */ -package sbt - -import java.io.File -import scala.collection.mutable.{HashMap, Map, MultiMap, Set} -import 
scala.reflect.Manifest -import sbinary.{DefaultProtocol, Format} -import DefaultProtocol._ -import TrackingFormat._ -import CacheIO.{fromFile, toFile} -import DependencyTracking.{DependencyMap => DMap, newMap, TagMap} - -private class TrackingFormat[T](directory: File, translateProducts: Boolean)(implicit tFormat: Format[T], mf: Manifest[T]) extends NotNull -{ - val indexFile = new File(directory, "index") - val dependencyFile = new File(directory, "dependencies") - def read(): DependencyTracking[T] = - { - val indexMap = CacheIO.fromFile[Map[Int,T]](indexFile, new HashMap[Int,T]) - val indexedFormat = wrap[T,Int](ignore => error("Read-only"), i => indexMap.getOrElse(i, error("Index " + i + " not found"))) - val trackFormat = trackingFormat(translateProducts)(indexedFormat) - fromFile(trackFormat, DefaultTracking[T](translateProducts))(dependencyFile) - } - def write(tracking: DependencyTracking[T]) - { - val index = new IndexMap[T] - val indexedFormat = wrap[T,Int](t => index(t), ignore => error("Write-only")) - val trackFormat = trackingFormat(translateProducts)(indexedFormat) - toFile(trackFormat)(tracking)(dependencyFile) - toFile(index.indices)(indexFile) - } -} -private object TrackingFormat -{ - implicit def mutableMapFormat[S, T](implicit binS : Format[S], binT : Format[T]) : Format[HashMap[S, T]] = - new LengthEncoded[HashMap[S, T], (S, T)] { - def build(size : Int, ts : Iterator[(S, T)]) : HashMap[S, T] = { - val b = new HashMap[S, T] - b ++= ts - b - } - } - implicit def depMapFormat[T](implicit bin: Format[T]) : Format[DMap[T]] = - new LengthEncoded[DMap[T], (T, Set[T])] { - def build(size : Int, ts : Iterator[(T, Set[T])]) : DMap[T] = { - val b = newMap[T] - b ++= ts - b - } - } - def trackingFormat[T](translateProducts: Boolean)(implicit tFormat: Format[T]): Format[DependencyTracking[T]] = - asProduct4((a: DMap[T],b: DMap[T],c: DMap[T], d:TagMap[T]) => new DefaultTracking(translateProducts)(a,b,c,d) : DependencyTracking[T] - )(dt => 
(dt.reverseDependencies, dt.reverseUses, dt.sourceMap, dt.tagMap)) -} - -private final class IndexMap[T] extends NotNull -{ - private[this] var lastIndex = 0 - private[this] val map = new HashMap[T, Int] - private[this] def nextIndex = { lastIndex += 1; lastIndex } - def indices = HashMap(map.map( (_: (T,Int)).swap ).toSeq : _*) - def apply(t: T) = map.getOrElseUpdate(t, nextIndex) -} \ No newline at end of file From 30e47ace17cb7ce44f8802ec597ada233c383af0 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 4 Sep 2010 08:08:17 -0400 Subject: [PATCH 074/823] reworked tracking added memoization for Set[File] => Set[File] --- cache/CacheIO.scala | 7 +- cache/tracking/ChangeReport.scala | 6 - cache/tracking/Tracked.scala | 192 ++++++++---------------------- 3 files changed, 54 insertions(+), 151 deletions(-) diff --git a/cache/CacheIO.scala b/cache/CacheIO.scala index 7ff1eb519..dad9bd467 100644 --- a/cache/CacheIO.scala +++ b/cache/CacheIO.scala @@ -21,8 +21,11 @@ object CacheIO def fromFile[T](format: Format[T], default: => T)(file: File)(implicit mf: Manifest[Format[T]]): T = fromFile(file, default)(format, mf) def fromFile[T](file: File, default: => T)(implicit format: Format[T], mf: Manifest[Format[T]]): T = - try { Operations.fromFile(file)(stampedFormat(format)) } - catch { case e: FileNotFoundException => default } + fromFile[T](file) getOrElse default + def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Option[T] = + try { Some( Operations.fromFile(file)(stampedFormat(format)) ) } + catch { case e: FileNotFoundException => None } + def toFile[T](format: Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = toFile(value)(file)(format, mf) def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = diff --git a/cache/tracking/ChangeReport.scala b/cache/tracking/ChangeReport.scala index d25b1bbaa..634650f20 100644 --- a/cache/tracking/ChangeReport.scala +++ 
b/cache/tracking/ChangeReport.scala @@ -63,12 +63,6 @@ class EmptyChangeReport[T] extends ChangeReport[T] def removed = Set.empty[T] override def toString = "No changes" } -trait InvalidationReport[T] extends NotNull -{ - def valid: Set[T] - def invalid: Set[T] - def invalidProducts: Set[T] -} private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) extends ChangeReport[T] { lazy val checked = a.checked ++ b.checked diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 77c7447f5..798adbb11 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -9,24 +9,6 @@ import sbinary.Format import scala.reflect.Manifest import IO.{delete, read, write} -/* A proper implementation of fileTask that tracks inputs and outputs properly - -def fileTask(cacheBaseDirectory: Path)(inputs: PathFinder, outputs: PathFinder)(action: => Unit): Task = - fileTask(cacheBaseDirectory, FilesInfo.hash, FilesInfo.lastModified) -def fileTask(cacheBaseDirectory: Path, inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(inputs: PathFinder, outputs: PathFinder)(action: => Unit): Task = -{ - lazy val inCache = diffInputs(base / "in-cache", inStyle)(inputs) - lazy val outCache = diffOutputs(base / "out-cache", outStyle)(outputs) - task - { - inCache { inReport => - outCache { outReport => - if(inReport.modified.isEmpty && outReport.modified.isEmpty) () else action - } - } - } -} -*/ object Tracked { @@ -36,15 +18,15 @@ object Tracked * In both cases, the timestamp is not updated if the function throws an exception.*/ def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) /** Creates a tracker that only evaluates a function when the input has changed.*/ - def changed[O](cacheFile: File)(getValue: => O)(implicit input: InputCache[O]): Changed[O] = - new Changed[O](getValue, cacheFile) + def changed[O](cacheFile: File)(implicit input: InputCache[O]): Changed[O] = + new 
Changed[O](cacheFile)(input) - /** Creates a tracker that provides the difference between the set of input files provided for successive invocations.*/ - def diffInputs(cache: File, style: FilesInfo.Style)(files: => Set[File]): Difference = - Difference.inputs(files, style, cache) - /** Creates a tracker that provides the difference between the set of output files provided for successive invocations.*/ - def diffOutputs(cache: File, style: FilesInfo.Style)(files: => Set[File]): Difference = - Difference.outputs(files, style, cache) + /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ + def diffInputs(cache: File, style: FilesInfo.Style): Difference = + Difference.inputs(cache, style) + /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ + def diffOutputs(cache: File, style: FilesInfo.Style): Difference = + Difference.outputs(cache, style) } trait Tracked extends NotNull @@ -55,7 +37,8 @@ trait Tracked extends NotNull class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked { def clean = delete(cacheFile) - /** Reads the previous timestamp, evaluates the provided function, and then updates the timestamp.*/ + /** Reads the previous timestamp, evaluates the provided function, + * and then updates the timestamp if the function completes normally.*/ def apply[T](f: Long => T): T = { val start = now() @@ -69,12 +52,11 @@ class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } } -class Changed[O](getValue: => O, val cacheFile: File)(implicit input: InputCache[O]) extends Tracked +class Changed[O](val cacheFile: File)(implicit input: InputCache[O]) extends Tracked { def clean = delete(cacheFile) - def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O2 = + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => { - 
val value = getValue val cache = try { Using.fileInputStream(cacheFile)(input.uptodate(value)) } catch { case _: IOException => new ForceResult(input)(value) } @@ -91,7 +73,7 @@ object Difference { sealed class Constructor private[Difference](defineClean: Boolean, filesAreOutputs: Boolean) extends NotNull { - def apply(files: => Set[File], style: FilesInfo.Style, cache: File): Difference = new Difference(files, style, cache, defineClean, filesAreOutputs) + def apply(cache: File, style: FilesInfo.Style): Difference = new Difference(cache, style, defineClean, filesAreOutputs) } /** Provides a constructor for a Difference that removes the files from the previous run on a call to 'clean' and saves the * hash/last modified time of the files as they are after running the function. This means that this information must be evaluated twice: @@ -101,7 +83,7 @@ object Difference * hash/last modified time of the files as they were prior to running the function.*/ object inputs extends Constructor(false, false) } -class Difference(getFiles: => Set[File], val style: FilesInfo.Style, val cache: File, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked +class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { def clean = { @@ -111,14 +93,25 @@ class Difference(getFiles: => Set[File], val style: FilesInfo.Style, val cache: private def clearCache() = delete(cache) private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files - private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) + private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) - def apply[T](f: ChangeReport[File] => T): T = + def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = { - val files = getFiles val lastFilesInfo = cachedFilesInfo + apply(files, lastFilesInfo)(f)(_ => files) + } + + def apply[T](f: ChangeReport[File] => T)(implicit toFiles: T => Set[File]): T = 
+ { + val lastFilesInfo = cachedFilesInfo + apply(raw(lastFilesInfo), lastFilesInfo)(f)(toFiles) + } + + private def abs(files: Set[File]) = files.map(_.getAbsoluteFile) + private[this] def apply[T](files: Set[File], lastFilesInfo: Set[style.F])(f: ChangeReport[File] => T)(extractFiles: T => Set[File]): T = + { val lastFiles = raw(lastFilesInfo) - val currentFiles = files.map(_.getAbsoluteFile) + val currentFiles = abs(files) val currentFilesInfo = style(currentFiles) val report = new ChangeReport[File] @@ -131,120 +124,33 @@ class Difference(getFiles: => Set[File], val style: FilesInfo.Style, val cache: } val result = f(report) - val info = if(filesAreOutputs) style(currentFiles) else currentFilesInfo + val info = if(filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo toFile(style.formats)(info)(cache)(style.manifest) result } } -class DependencyTracked[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit)(implicit format: Format[T], mf: Manifest[T]) extends Tracked -{ - private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) - private def cleanAll(fs: Set[T]) = fs.foreach(cleanT) - - def clean = - { - cleanAll(trackFormat.read.allProducts) - delete(cacheDirectory) - } - - def apply[R](f: UpdateTracking[T] => R): R = - { - val tracker = trackFormat.read - val result = f(tracker) - trackFormat.write(tracker) - result - } -} -object InvalidateFiles -{ - def apply(cacheDirectory: File): InvalidateTransitive[File] = apply(cacheDirectory, true) - def apply(cacheDirectory: File, translateProducts: Boolean): InvalidateTransitive[File] = - { - import sbinary.DefaultProtocol.FileFormat - new InvalidateTransitive[File](cacheDirectory, translateProducts, IO.delete) - } -} -object InvalidateTransitive -{ - import scala.collection.Set - def apply[T](tracker: UpdateTracking[T], files: Set[T]): InvalidationReport[T] = +object FileFunction { + type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => 
Set[File] + + def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style = FilesInfo.lastModified, outStyle: FilesInfo.Style = FilesInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = + cached(cacheBaseDirectory)(inStyle, outStyle)( (in, out) => action(in.checked) ) + + def cached(cacheBaseDirectory: File)(inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = { - val readTracker = tracker.read - val invalidated = Set() ++ invalidate(readTracker, files) - val invalidatedProducts = Set() ++ invalidated.filter(readTracker.isProduct) - - new InvalidationReport[T] + import Path._ + lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle) + lazy val outCache = Difference.outputs(cacheBaseDirectory / "out-cache", outStyle) + inputs => { - val invalid = invalidated - val invalidProducts = invalidatedProducts - val valid = Set() ++ files -- invalid - } - } - def andClean[T](tracker: UpdateTracking[T], cleanImpl: Set[T] => Unit, files: Set[T]): InvalidationReport[T] = - { - val report = apply(tracker, files) - clean(tracker, cleanImpl, report) - report - } - def clear[T](tracker: UpdateTracking[T], report: InvalidationReport[T]): Unit = - tracker.removeAll(report.invalid) - def clean[T](tracker: UpdateTracking[T], cleanImpl: Set[T] => Unit, report: InvalidationReport[T]) - { - clear(tracker, report) - cleanImpl(report.invalidProducts) - } - - private def invalidate[T](tracker: ReadTracking[T], files: Iterable[T]): Set[T] = - { - import scala.collection.mutable.HashSet - val invalidated = new HashSet[T] - def invalidate0(files: Iterable[T]): Unit = - for(file <- files if !invalidated(file)) - { - invalidated += file - invalidate0(invalidatedBy(tracker, file)) - } - invalidate0(files) - invalidated - } - private def invalidatedBy[T](tracker: ReadTracking[T], file: T) = - tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file) - -} -class 
InvalidateTransitive[T](cacheDirectory: File, translateProducts: Boolean, cleanT: T => Unit) - (implicit format: Format[T], mf: Manifest[T]) extends Tracked -{ - def this(cacheDirectory: File, translateProducts: Boolean)(implicit format: Format[T], mf: Manifest[T]) = - this(cacheDirectory, translateProducts, (_: T) => ()) - - private val tracked = new DependencyTracked(cacheDirectory, translateProducts, cleanT) - def clean - { - tracked.clean - tracked.clear - } - - def apply[R](getChanges: => ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => R): R = - { - val changes = getChanges - tracked { tracker => - val report = InvalidateTransitive.andClean[T](tracker, _.foreach(cleanT), changes.modified) - f(report, tracker) - } - } -} -class BasicTracked(files: => Set[File], style: FilesInfo.Style, cacheDirectory: File) extends Tracked -{ - private val changed = Difference.inputs(files, style, new File(cacheDirectory, "files")) - private val invalidation = InvalidateFiles(new File(cacheDirectory, "invalidation")) - private def onTracked(f: Tracked => Unit) = { f(invalidation); f(changed) } - def clean = onTracked(_.clean) - - def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => R): R = - changed { sourceChanges => - invalidation(sourceChanges) { (report, tracking) => - f(sourceChanges, report, tracking) + inCache(inputs) { inReport => + outCache { outReport => + if(inReport.modified.isEmpty && outReport.modified.isEmpty) + outReport.checked + else + action(inReport, outReport) + } } } + } } \ No newline at end of file From f14e7883ed4efff69fa820885312ef883f131f96 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 4 Sep 2010 08:12:17 -0400 Subject: [PATCH 075/823] fix PMap test --- util/collection/src/test/scala/PMapTest.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/src/test/scala/PMapTest.scala b/util/collection/src/test/scala/PMapTest.scala index 091012f6e..bac4b7364 100644 --- 
a/util/collection/src/test/scala/PMapTest.scala +++ b/util/collection/src/test/scala/PMapTest.scala @@ -12,7 +12,7 @@ object PMapTest mp(Some("asdf")) = "a" mp(Some(3)) = 9 val x = Some(3) :^: Some("asdf") :^: KNil - val y = x.map[Id](mp) + val y = x.transform[Id](mp) val z = y.down z match { case 9 :+: "a" :+: HNil => println("true") } } \ No newline at end of file From 58d7de72378994d5fa07aa05a0a3ae4b895f42fa Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 4 Sep 2010 08:19:58 -0400 Subject: [PATCH 076/823] rework ConsoleLogger can send output to a PrintWriter control over color, still need custom formatter replace IvyLogger with normal Logger --- util/log/ConsoleLogger.scala | 62 ++++++++++++++++++++++++++---------- util/log/Level.scala | 2 ++ util/log/MultiLogger.scala | 3 +- 3 files changed, 50 insertions(+), 17 deletions(-) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index 49db31b66..59bc680c3 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -3,8 +3,24 @@ */ package sbt + import java.io.{PrintStream, PrintWriter} + object ConsoleLogger { + def systemOut: ConsoleOut = printStreamOut(System.out) + def printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.print(s) + def println(s: String) = out.println(s) + def println() = out.println() + } + def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.print(s) + def println(s: String) = out.println(s) + def println() = out.println() + } + private val formatEnabled = ansiSupported && !formatExplicitlyDisabled private[this] def formatExplicitlyDisabled = java.lang.Boolean.getBoolean("sbt.log.noformat") @@ -14,15 +30,20 @@ object ConsoleLogger private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase.indexOf("windows") >= 0 + + def apply(): ConsoleLogger = apply(systemOut) + def apply(out: 
PrintStream): ConsoleLogger = apply(printStreamOut(out)) + def apply(out: PrintWriter): ConsoleLogger = apply(printWriterOut(out)) + def apply(out: ConsoleOut, ansiCodesSupported: Boolean = formatEnabled, useColor: Boolean = true): ConsoleLogger = + new ConsoleLogger(out, ansiCodesSupported, useColor) } /** A logger that logs to the console. On supported systems, the level labels are * colored. * * This logger is not thread-safe.*/ -class ConsoleLogger extends BasicLogger +class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean) extends BasicLogger { - override def ansiCodesSupported = ConsoleLogger.formatEnabled def messageColor(level: Level.Value) = Console.RESET def labelColor(level: Level.Value) = level match @@ -39,41 +60,50 @@ class ConsoleLogger extends BasicLogger log(successLabelColor, Level.SuccessLabel, successMessageColor, message) } def trace(t: => Throwable): Unit = - System.out.synchronized + out.lockObject.synchronized { val traceLevel = getTrace if(traceLevel >= 0) - System.out.synchronized { System.out.print(StackTrace.trimmed(t, traceLevel)) } + out.print(StackTrace.trimmed(t, traceLevel)) } def log(level: Level.Value, message: => String) { if(atLevel(level)) log(labelColor(level), level.toString, messageColor(level), message) } + private def reset(): Unit = setColor(Console.RESET) + private def setColor(color: String) { - if(ansiCodesSupported) - System.out.synchronized { System.out.print(color) } + if(ansiCodesSupported && useColor) + out.lockObject.synchronized { out.print(color) } } private def log(labelColor: String, label: String, messageColor: String, message: String): Unit = - System.out.synchronized + out.lockObject.synchronized { for(line <- message.split("""\n""")) { - setColor(Console.RESET) - System.out.print('[') + reset() + out.print("[") setColor(labelColor) - System.out.print(label) - setColor(Console.RESET) - System.out.print("] ") + out.print(label) + 
reset() + out.print("] ") setColor(messageColor) - System.out.print(line) - setColor(Console.RESET) - System.out.println() + out.print(line) + reset() + out.println() } } - def logAll(events: Seq[LogEvent]) = System.out.synchronized { events.foreach(log) } + def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } def control(event: ControlEvent.Value, message: => String) { log(labelColor(Level.Info), Level.Info.toString, Console.BLUE, message) } +} +sealed trait ConsoleOut +{ + val lockObject: AnyRef + def print(s: String): Unit + def println(s: String): Unit + def println(): Unit } \ No newline at end of file diff --git a/util/log/Level.scala b/util/log/Level.scala index ad4e51759..bc1156729 100644 --- a/util/log/Level.scala +++ b/util/log/Level.scala @@ -16,6 +16,8 @@ object Level extends Enumeration * label is also defined here. */ val SuccessLabel = "success" + def union(a: Value, b: Value) = if(a.id < b.id) a else b + /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. */ def apply(s: String) = values.find(s == _.toString) /** Same as apply, defined for use in pattern matching. */ diff --git a/util/log/MultiLogger.scala b/util/log/MultiLogger.scala index 800d170ed..525e3ef9d 100644 --- a/util/log/MultiLogger.scala +++ b/util/log/MultiLogger.scala @@ -4,7 +4,8 @@ */ package sbt - +// note that setting the logging level on this logger has no effect on its behavior, only +// on the behavior of the delegates. 
class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { override lazy val ansiCodesSupported = delegates.forall(_.ansiCodesSupported) From f884fa9cdd41b1836544b4f31101d0b911ef024b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 Sep 2010 14:29:00 -0400 Subject: [PATCH 077/823] hierarchical in-memory settings --- util/collection/Settings.scala | 29 +++++++++++++++++++++++++++++ 1 file changed, 29 insertions(+) create mode 100644 util/collection/Settings.scala diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala new file mode 100644 index 000000000..ab85fd87d --- /dev/null +++ b/util/collection/Settings.scala @@ -0,0 +1,29 @@ +package sbt + +sealed trait Settings +{ + def get[T](key: AttributeKey[T], path: List[String]): Option[T] + def set[T](key: AttributeKey[T], path: List[String], value: T): Settings +} +object Settings +{ + def empty: Settings = new Basic(Map.empty) + def x = 3 + + private[this] class Basic(val roots: Map[ List[String], AttributeMap ]) extends Settings + { + def get[T](key: AttributeKey[T], path: List[String]): Option[T] = + { + def notFound = path match { + case Nil => None + case x :: xs => get(key, xs) + } + (roots get path) flatMap ( _ get key ) orElse notFound + } + def set[T](key: AttributeKey[T], path: List[String], value: T): Settings = + { + val amap = (roots get path) getOrElse AttributeMap.empty + new Basic( roots updated(path, amap put(key, value)) ) + } + } +} \ No newline at end of file From b033bc889d298ae7e1d084c1ab67ed15d5733a41 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 12 Sep 2010 22:27:11 -0400 Subject: [PATCH 078/823] toString for HList and KList --- util/collection/HList.scala | 4 ++++ util/collection/KList.scala | 3 +++ 2 files changed, 7 insertions(+) diff --git a/util/collection/HList.scala b/util/collection/HList.scala index db2c9db85..df0391de8 100644 --- a/util/collection/HList.scala +++ b/util/collection/HList.scala @@ -13,12 +13,16 @@ sealed trait HNil 
extends HList { type Wrap[M[_]] = HNil def :+: [G](g: G): G :+: HNil = HCons(g, this) + + override def toString = "HNil" } object HNil extends HNil final case class HCons[H, T <: HList](head : H, tail : T) extends HList { type Wrap[M[_]] = M[H] :+: T#Wrap[M] def :+: [G](g: G): G :+: H :+: T = HCons(g, this) + + override def toString = head + " :+: " + tail.toString } object HList diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 81ef6afcd..aa9662917 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -31,6 +31,8 @@ final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) exten def toList = head :: tail.toList def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) + + override def toString = head + " :^: " + tail.toString } sealed class KNil extends KList[Nothing, HNil] @@ -40,6 +42,7 @@ sealed class KNil extends KList[Nothing, HNil] def :^: [M[_], H](h: M[H]) = KCons(h, this) def toList = Nil def combine[N[X]] = HNil + override def toString = "KNil" } object KNil extends KNil From ccb3a840c68baebe89620f3fa895ab30ff240061 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 17 Sep 2010 20:46:31 -0400 Subject: [PATCH 079/823] Attributed, attaches attributes to arbitrary data --- util/collection/Attributes.scala | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 488142ad7..4037884dd 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -53,4 +53,13 @@ private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) final case class AttributeEntry[T](a: AttributeKey[T], b: T) { override def toString = a.label + ": " + b +} + +final case class Attributed[D](data: D)(val metadata: AttributeMap) +{ + def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) +} +object Attributed +{ + implicit def blank[T](data: 
T): Attributed[T] = Attributed(data)(AttributeMap.empty) } \ No newline at end of file From 0b77a070dde7c8ffa8949f9b5dea3b178b172ede Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 17 Sep 2010 21:29:29 -0400 Subject: [PATCH 080/823] merge Pkg into Private this better represents the original source --- interface/other | 1 - 1 file changed, 1 deletion(-) diff --git a/interface/other b/interface/other index 78fe98691..3ef1d4461 100644 --- a/interface/other +++ b/interface/other @@ -11,7 +11,6 @@ Access qualifier: Qualifier Protected Private - Pkg Qualifier Unqualified From 1f9c13e7214ceab0b2aacbc52edb5284987c6dde Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 17 Sep 2010 21:38:40 -0400 Subject: [PATCH 081/823] Rework external dependency tracking and multi-projects Reduce AnalysisCallback interface: remove discovery simplify dependency notification methods Use map of classpath entry to Analysis for locating source API for external dependencies Handle classpath changes by locating class on classpath and either locating Analysis/Source as above or comparing Stamp. This requires storing the class name of a binary dependency now. 
Make this process aware of full classpath, including boot classpath --- .../src/main/java/xsbti/AnalysisCallback.java | 28 ++----------------- 1 file changed, 3 insertions(+), 25 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 03c4798c9..d3eb2ab54 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -7,43 +7,21 @@ import java.io.File; public interface AnalysisCallback { - /** The names of classes that the analyzer should find subclasses of.*/ - public String[] superclassNames(); - /** The names of annotations that the analyzer should look for on methods and classes.*/ - public String[] annotationNames(); - /** Called when the the given superclass could not be found on the classpath by the compiler.*/ - public void superclassNotFound(String superclassName); /** Called before the source at the given location is processed. */ public void beginSource(File source); - /** Called when the a subclass of one of the classes given in superclassNames is - * discovered.*/ - public void foundSubclass(File source, String subclassName, String superclassName, boolean isModule); - /** Called when an annotation with name annotationName is found on a class or one of its methods.*/ - public void foundAnnotated(File source, String className, String annotationName, boolean isModule); /** Called to indicate that the source file source depends on the source file * dependsOn. Note that only source files included in the current compilation will * passed to this method. 
Dependencies on classes generated by sources not in the current compilation will * be passed as class dependencies to the classDependency method.*/ public void sourceDependency(File dependsOn, File source); - /** Called to indicate that the source file source depends on the jar - * jar.*/ - public void jarDependency(File jar, File source); - /** Called to indicate that the source file source depends on the class file - * clazz.*/ - public void classDependency(File clazz, File source); - /** Called to indicate that the source file sourcePath depends on the class file - * classFile that is a product of some source. This differs from classDependency - * because it is really a sourceDependency. The source corresponding to classFile - * was not incuded in the compilation so the plugin doesn't know what the source is though. It - * only knows that the class file came from the output directory.*/ - public void productDependency(File classFile, File sourcePath); + /** Called to indicate that the source file source depends on the top-level + * class named name from class or jar file binary. */ + public void binaryDependency(File binary, String name, File source); /** Called to indicate that the source file source produces a class file at * module.*/ public void generatedClass(File source, File module); /** Called after the source at the given location has been processed. */ public void endSource(File sourcePath); - /** Called when a module with a public 'main' method with the right signature is found.*/ - public void foundApplication(File source, String className); /** Called when the public API of a source file is extracted. 
*/ public void api(File sourceFile, xsbti.api.Source source); } \ No newline at end of file From 3dd98e872389c64113ed5bef38b89ca80640afe1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 21 Sep 2010 21:57:15 -0400 Subject: [PATCH 082/823] reorganize Process implicits split out Process implicits to ProcessExtra trait give them unique names to avoid shadowing when used --- util/process/Process.scala | 34 +++++++++++++++++++++++++--------- 1 file changed, 25 insertions(+), 9 deletions(-) diff --git a/util/process/Process.scala b/util/process/Process.scala index 536d40eea..79bffe075 100644 --- a/util/process/Process.scala +++ b/util/process/Process.scala @@ -8,11 +8,26 @@ import java.io.{Closeable, File, IOException} import java.io.{BufferedReader, InputStream, InputStreamReader, OutputStream, PipedInputStream, PipedOutputStream} import java.net.URL -/** Methods for constructing simple commands that can then be combined. */ -object Process +trait ProcessExtra { - implicit def apply(command: String): ProcessBuilder = apply(command, None) - implicit def apply(command: Seq[String]): ProcessBuilder = apply (command.toArray, None) + import Process._ + implicit def builderToProcess(builder: JProcessBuilder): ProcessBuilder = apply(builder) + implicit def fileToProcess(file: File): FilePartialBuilder = apply(file) + implicit def urlToProcess(url: URL): URLPartialBuilder = apply(url) + implicit def xmlToProcess(command: scala.xml.Elem): ProcessBuilder = apply(command) + implicit def buildersToProcess[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = applySeq(builders) + + implicit def stringToProcess(command: String): ProcessBuilder = apply(command) + implicit def stringSeqToProcess(command: Seq[String]): ProcessBuilder = apply(command) +} + +/** Methods for constructing simple commands that can then be combined. 
*/ +object Process extends ProcessExtra +{ + def apply(command: String): ProcessBuilder = apply(command, None) + + def apply(command: Seq[String]): ProcessBuilder = apply (command.toArray, None) + def apply(command: String, arguments: Seq[String]): ProcessBuilder = apply(command :: arguments.toList, None) /** create ProcessBuilder with working dir set to File and extra environment variables */ def apply(command: String, cwd: File, extraEnv: (String,String)*): ProcessBuilder = @@ -33,11 +48,12 @@ object Process extraEnv.foreach { case (k, v) => jpb.environment.put(k, v) } apply(jpb) } - implicit def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) - implicit def apply(file: File): FilePartialBuilder = new FileBuilder(file) - implicit def apply(url: URL): URLPartialBuilder = new URLBuilder(url) - implicit def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) - implicit def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) + def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) + def apply(file: File): FilePartialBuilder = new FileBuilder(file) + def apply(url: URL): URLPartialBuilder = new URLBuilder(url) + def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) + def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) + def apply(value: Boolean): ProcessBuilder = apply(value.toString, if(value) 0 else 1) def apply(name: String, exitValue: => Int): ProcessBuilder = new DummyProcessBuilder(name, exitValue) From d8ed444f56d48e6369bd98ae7669719fa41e756a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 27 Sep 2010 18:50:17 -0400 Subject: [PATCH 083/823] add another Process constructor --- util/process/Process.scala | 3 +++ 1 file changed, 3 insertions(+) diff --git a/util/process/Process.scala 
b/util/process/Process.scala index 79bffe075..2dd70484c 100644 --- a/util/process/Process.scala +++ b/util/process/Process.scala @@ -32,6 +32,9 @@ object Process extends ProcessExtra /** create ProcessBuilder with working dir set to File and extra environment variables */ def apply(command: String, cwd: File, extraEnv: (String,String)*): ProcessBuilder = apply(command, Some(cwd), extraEnv : _*) + /** create ProcessBuilder with working dir set to File and extra environment variables */ + def apply(command: Seq[String], cwd: File, extraEnv: (String,String)*): ProcessBuilder = + apply(command, Some(cwd), extraEnv : _*) /** create ProcessBuilder with working dir optionaly set to File and extra environment variables */ def apply(command: String, cwd: Option[File], extraEnv: (String,String)*): ProcessBuilder = { apply(command.split("""\s+"""), cwd, extraEnv : _*) From 5a71431031c1679589dce8233a1fcbe45631b047 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 27 Sep 2010 18:51:35 -0400 Subject: [PATCH 084/823] add conversion from xsbti.Logger to sbt.Logger --- util/log/Logger.scala | 21 +++++++++++++++++++++ 1 file changed, 21 insertions(+) diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 4a0f9c365..3be05b539 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -37,6 +37,27 @@ object Logger { implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) + implicit def xlog2Log(lg: xLogger): Logger = new Logger { + override def debug(msg: F0[String]): Unit = lg.debug(msg) + override def warn(msg: F0[String]): Unit = lg.warn(msg) + override def info(msg: F0[String]): Unit = lg.info(msg) + override def error(msg: F0[String]): Unit = lg.error(msg) + override def trace(msg: F0[Throwable]) = lg.trace(msg) + override def log(level: Level.Value, msg: F0[String]) = lg.log(level, msg) + def trace(t: => Throwable) = 
trace(f0(t)) + def log(level: Level.Value, msg: => String) = + { + val fmsg = f0(msg) + level match + { + case Level.Debug => lg.debug(fmsg) + case Level.Info => lg.info(fmsg) + case Level.Warn => lg.warn(fmsg) + case Level.Error => lg.error(fmsg) + } + } + } + def f0[T](t: =>T): F0[T] = new F0[T] { def apply = t } } /** This is intended to be the simplest logging interface for use by code that wants to log. From 0425532275ceb7fe5f8d956901ad980f80f938a4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 6 Oct 2010 08:24:13 -0400 Subject: [PATCH 085/823] fix tests, discovery updated compile tests for new minimal AnalysisCallback moved discovery to discovery/ subproject and updated for new approach fixed discovery to only find public methods when searching for annotated definitions extracting inherited definitions unimplemented in api/, so some discovery tests fail moved discovery classes from sbt.inc package to sbt.compile --- interface/src/test/scala/TestCallback.scala | 29 +++++++-------------- util/log/src/test/scala/TestLogger.scala | 11 ++++++++ 2 files changed, 20 insertions(+), 20 deletions(-) create mode 100644 util/log/src/test/scala/TestLogger.scala diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 75e8d77af..d554a9a08 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -1,34 +1,23 @@ package xsbti -import java.io.File -import scala.collection.mutable.ArrayBuffer + import java.io.File + import scala.collection.mutable.ArrayBuffer -class TestCallback(val superclassNames: Array[String], val annotationNames: Array[String]) extends AnalysisCallback +class TestCallback extends AnalysisCallback { - val invalidSuperclasses = new ArrayBuffer[String] val beganSources = new ArrayBuffer[File] val endedSources = new ArrayBuffer[File] - val foundSubclasses = new ArrayBuffer[(File, String, String, Boolean)] - val foundAnnotated = new ArrayBuffer[(File, 
String, String, Boolean)] val sourceDependencies = new ArrayBuffer[(File, File)] - val jarDependencies = new ArrayBuffer[(File, File)] - val classDependencies = new ArrayBuffer[(File, File)] - val productDependencies = new ArrayBuffer[(File, File)] + val binaryDependencies = new ArrayBuffer[(File, String, File)] val products = new ArrayBuffer[(File, File)] - val applications = new ArrayBuffer[(File, String)] + val apis = new ArrayBuffer[(File, xsbti.api.Source)] - def superclassNotFound(superclassName: String) { invalidSuperclasses += superclassName } def beginSource(source: File) { beganSources += source } - def foundSubclass(source: File, subclassName: String, superclassName: String, isModule: Boolean): Unit = - foundSubclasses += ((source, subclassName, superclassName, isModule)) - def foundAnnotated(source: File, className: String, annotationName: String, isModule: Boolean): Unit = - foundAnnotated += ((source, className, annotationName, isModule)) + def sourceDependency(dependsOn: File, source: File) { sourceDependencies += ((dependsOn, source)) } - def jarDependency(jar: File, source: File) { jarDependencies += ((jar, source)) } - def classDependency(clazz: File, source: File) { classDependencies += ((clazz, source)) } - def productDependency(clazz: File, source: File) { productDependencies += ((clazz, source)) } + def binaryDependency(binary: File, name: String, source: File) { binaryDependencies += ((binary, name, source)) } def generatedClass(source: File, module: File) { products += ((source, module)) } def endSource(source: File) { endedSources += source } - def foundApplication(source: File, className: String) { applications += ((source, className)) } - def api(source: File, sourceAPI: xsbti.api.Source) = () + + def api(source: File, sourceAPI: xsbti.api.Source) { apis += ((source, sourceAPI)) } } \ No newline at end of file diff --git a/util/log/src/test/scala/TestLogger.scala b/util/log/src/test/scala/TestLogger.scala new file mode 100644 index 
000000000..edf2b00dd --- /dev/null +++ b/util/log/src/test/scala/TestLogger.scala @@ -0,0 +1,11 @@ +package sbt + +object TestLogger +{ + def apply[T](f: Logger => T): T = + { + val log = new BufferedLogger(ConsoleLogger()) + log.setLevel(Level.Debug) + log.bufferQuietly(f(log)) + } +} \ No newline at end of file From 7dca038bded2ff331f7bb6083518014fa330f356 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 22 Oct 2010 21:55:16 -0400 Subject: [PATCH 086/823] improving incremental compilation support lazy arguments in data type generator SafeLazy implementation that explicitly clears the reference to the thunk in API representation, drop synthetic modifier and merge deferred into abstract handle cyclic structures in API generation, display, comparison, persistence gzip compile cache file bump to 2.8.1.RC3, project definition cleanup fix main method detection to check for the right name properly view inherited definitions exclude constructors of ancestors --- interface/definition | 4 ++-- interface/other | 2 -- interface/src/main/java/xsbti/api/Lazy.java | 9 +++++++++ interface/type | 8 ++++---- util/collection/Relation.scala | 6 +++++- 5 files changed, 20 insertions(+), 9 deletions(-) create mode 100644 interface/src/main/java/xsbti/api/Lazy.java diff --git a/interface/definition b/interface/definition index 9220a9a0e..2dcd4025b 100644 --- a/interface/definition +++ b/interface/definition @@ -14,8 +14,8 @@ Definition returnType: Type ClassLike definitionType: DefinitionType - selfType: Type - structure: Structure + selfType: ~Type + structure: ~Structure TypeMember TypeAlias tpe: Type diff --git a/interface/other b/interface/other index 3ef1d4461..bba27dc7f 100644 --- a/interface/other +++ b/interface/other @@ -20,13 +20,11 @@ Qualifier Modifiers isAbstract: Boolean - isDeferred: Boolean isOverride: Boolean isFinal: Boolean isSealed: Boolean isImplicit: Boolean isLazy: Boolean - isSynthetic: Boolean ParameterList parameters: MethodParameter* diff --git 
a/interface/src/main/java/xsbti/api/Lazy.java b/interface/src/main/java/xsbti/api/Lazy.java new file mode 100644 index 000000000..4a5642a01 --- /dev/null +++ b/interface/src/main/java/xsbti/api/Lazy.java @@ -0,0 +1,9 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ +package xsbti.api; + +public interface Lazy<T> +{ + T get(); +} \ No newline at end of file diff --git a/interface/type b/interface/type index c516b62b9..a605f4cd4 100644 --- a/interface/type +++ b/interface/type @@ -3,7 +3,7 @@ Type SimpleType Projection prefix : SimpleType - id : String + id: String ParameterRef id: Int Singleton @@ -16,9 +16,9 @@ Type baseType : SimpleType annotations : Annotation* Structure - parents : Type* - declared: Definition* - inherited: Definition* + parents : ~Type* + declared: ~Definition* + inherited: ~Definition* Existential baseType : Type clause: TypeParameter* diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index ed6046f6e..a282bf7e2 100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -38,7 +38,9 @@ trait Relation[A,B] def _1s: collection.Set[A] /** Returns the set of all _2s such that (_1, _2) is in this relation.
*/ def _2s: collection.Set[B] - + /** Returns the number of pairs in this relation */ + def size: Int + /** Returns all pairs in this relation.*/ def all: Traversable[(A,B)] @@ -57,6 +59,8 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def _1s = fwd.keySet def _2s = rev.keySet + + def size = fwd.size def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable From 6402a766b5e1570474cbcfca24dcdceff3c6e58b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 23 Oct 2010 16:34:22 -0400 Subject: [PATCH 087/823] more flexible scalac logging the custom scalac Reporter now delegates to an instance of an sbt interface called xsbti.Reporter handling compilation logging is now mainly done on the sbt-side of the compiler interface the xsbti.Reporter interface provides access to richer information about errors and warnings, including source file, line, and offset xsbti.Reporter can be implemented by users to get access to detailed information without needing to parse the logging output the CompileFailed exception that is thrown when compilation fails now includes an array of the problems, providing detailed error and warning information that can, for example, be consumed by doing a mapFailure on 'compile' and using 'Compile.allProblems' --- .../src/main/java/xsbti/CompileFailed.java | 1 + interface/src/main/java/xsbti/Maybe.java | 30 +++++++++++++++++++ interface/src/main/java/xsbti/Position.java | 18 +++++++++++ interface/src/main/java/xsbti/Problem.java | 11 +++++++ interface/src/main/java/xsbti/Reporter.java | 20 +++++++++++++ interface/src/main/java/xsbti/Severity.java | 9 ++++++ 6 files changed, 89 insertions(+) create mode 100644 interface/src/main/java/xsbti/Maybe.java create mode 100644 interface/src/main/java/xsbti/Position.java create mode 100644 interface/src/main/java/xsbti/Problem.java create mode 100644 interface/src/main/java/xsbti/Reporter.java create mode 100644 
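The reporting overhaul described in the commit message above exposes compiler diagnostics as structured problems (severity, message, position) that accumulate until a reset. A minimal sketch of a collecting reporter in that spirit — the class and method names here are illustrative assumptions, not sbt's actual xsbti.Reporter implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical collecting reporter: each logged problem carries a severity and
// message; hasErrors() and problems() expose what accumulated since reset().
public class CollectingReporter {
    public enum Severity { Info, Warn, Error }

    public static final class Problem {
        public final Severity severity;
        public final String message;
        public Problem(Severity severity, String message) {
            this.severity = severity;
            this.message = message;
        }
    }

    private final List<Problem> problems = new ArrayList<>();

    public void log(String msg, Severity sev) { problems.add(new Problem(sev, msg)); }

    public boolean hasErrors() {
        for (Problem p : problems)
            if (p.severity == Severity.Error) return true;
        return false;
    }

    public Problem[] problems() { return problems.toArray(new Problem[0]); }

    public void reset() { problems.clear(); }

    public static void main(String[] args) {
        CollectingReporter r = new CollectingReporter();
        r.log("value x is never used", Severity.Warn);
        r.log("not found: value y", Severity.Error);
        System.out.println(r.hasErrors());       // prints true
        System.out.println(r.problems().length); // prints 2
        r.reset();
        System.out.println(r.hasErrors());       // prints false
    }
}
```

Keeping the problems as values rather than formatted log lines is what allows a failed `compile` to hand the full diagnostics to downstream code instead of forcing it to parse output.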
interface/src/main/java/xsbti/Severity.java diff --git a/interface/src/main/java/xsbti/CompileFailed.java b/interface/src/main/java/xsbti/CompileFailed.java index bb5b2a93a..f1cbbc61b 100644 --- a/interface/src/main/java/xsbti/CompileFailed.java +++ b/interface/src/main/java/xsbti/CompileFailed.java @@ -3,4 +3,5 @@ package xsbti; public abstract class CompileFailed extends RuntimeException { public abstract String[] arguments(); + public abstract Problem[] problems(); } \ No newline at end of file diff --git a/interface/src/main/java/xsbti/Maybe.java b/interface/src/main/java/xsbti/Maybe.java new file mode 100644 index 000000000..f730ef918 --- /dev/null +++ b/interface/src/main/java/xsbti/Maybe.java @@ -0,0 +1,30 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 Mark Harrah + */ +package xsbti; + +/** Intended as a lightweight carrier for scala.Option. */ +public abstract class Maybe<t> +{ + // private pending Scala bug #3642 + protected Maybe() {} + + public static <s> Maybe<s> just(final s v) + { + return new Maybe<s>() { + public boolean isDefined() { return true; } + public s get() { return v; } + }; + } + public static <s> Maybe<s> nothing() + { + return new Maybe<s>() { + public boolean isDefined() { return false; } + public s get() { throw new UnsupportedOperationException("nothing.get"); } + }; + } + + public final boolean isEmpty() { return !isDefined(); } + public abstract boolean isDefined(); + public abstract t get(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/Position.java b/interface/src/main/java/xsbti/Position.java new file mode 100644 index 000000000..96c60ebb2 --- /dev/null +++ b/interface/src/main/java/xsbti/Position.java @@ -0,0 +1,18 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 Mark Harrah + */ +package xsbti; + +public interface Position +{ + Maybe<Integer> line(); + String lineContent(); + Maybe<Integer> offset(); + + // pointer to the column position of the error/warning + Maybe<Integer> pointer(); + Maybe<String> pointerSpace(); + 
Maybe<String> sourcePath(); + Maybe<java.io.File> sourceFile(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/Problem.java b/interface/src/main/java/xsbti/Problem.java new file mode 100644 index 000000000..cf2641900 --- /dev/null +++ b/interface/src/main/java/xsbti/Problem.java @@ -0,0 +1,11 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 Mark Harrah + */ +package xsbti; + +public interface Problem +{ + Severity severity(); + String message(); + Position position(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/Reporter.java b/interface/src/main/java/xsbti/Reporter.java new file mode 100644 index 000000000..8556cbe8a --- /dev/null +++ b/interface/src/main/java/xsbti/Reporter.java @@ -0,0 +1,20 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 Mark Harrah + */ +package xsbti; + +public interface Reporter +{ + /** Resets logging, including any accumulated errors, warnings, messages, and counts.*/ + public void reset(); + /** Returns true if this logger has seen any errors since the last call to reset.*/ + public boolean hasErrors(); + /** Returns true if this logger has seen any warnings since the last call to reset.*/ + public boolean hasWarnings(); + /** Logs a summary of logging since the last reset.*/ + public void printSummary(); + /** Returns a list of warnings and errors since the last reset.*/ + public Problem[] problems(); + /** Logs a message.*/ + public void log(Position pos, String msg, Severity sev); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/Severity.java b/interface/src/main/java/xsbti/Severity.java new file mode 100644 index 000000000..09aed574b --- /dev/null +++ b/interface/src/main/java/xsbti/Severity.java @@ -0,0 +1,9 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009, 2010 Mark Harrah + */ +package xsbti; + +public enum Severity +{ + Info, Warn, Error +} \ No newline at end of file From e30368b3142c69237b6f57c8241043da551e45e1 Mon Sep 17 00:00:00
2001 From: Mark Harrah Date: Tue, 26 Oct 2010 18:02:27 -0400 Subject: [PATCH 088/823] overhaul caching, mainly InputCache better underlying model supports arbitrary length unions and products (unions actually limited to 256 elements to encode length as byte) --- cache/Cache.scala | 247 +++++++++++++++++++++++++++++++---- cache/FileInfo.scala | 29 ++-- cache/HListCache.scala | 47 ------- cache/NoCache.scala | 22 ---- cache/SeparatedCache.scala | 100 +++++--------- cache/tracking/Tracked.scala | 25 ++-- 6 files changed, 288 insertions(+), 182 deletions(-) delete mode 100644 cache/HListCache.scala delete mode 100644 cache/NoCache.scala diff --git a/cache/Cache.scala b/cache/Cache.scala index c638e94f0..0cf5ccd12 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -3,28 +3,25 @@ */ package sbt -import sbinary.{CollectionTypes, Format, JavaFormats} +import sbinary.{CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output} import java.io.File +import java.net.{URI, URL} import Types.:+: +import DefaultProtocol.{asProduct2, asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap} +import scala.xml.NodeSeq trait Cache[I,O] { def apply(file: File)(i: I): Either[O, O => Unit] } -trait SBinaryFormats extends CollectionTypes with JavaFormats with NotNull +trait SBinaryFormats extends CollectionTypes with JavaFormats { - //TODO: add basic types from SBinary minus FileFormat + implicit def urlFormat: Format[URL] = DefaultProtocol.UrlFormat + implicit def uriFormat: Format[URI] = DefaultProtocol.UriFormat } -object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits +object Cache extends CacheImplicits { def cache[I,O](implicit c: Cache[I,O]): Cache[I,O] = c - def outputCache[O](implicit c: OutputCache[O]): OutputCache[O] = c - def inputCache[O](implicit c: InputCache[O]): InputCache[O] = c - - def wrapInputCache[I,DI](implicit convert: I => DI, base: InputCache[DI]): InputCache[I] = - new WrappedInputCache(convert, base) - def 
wrapOutputCache[O,DO](implicit convert: O => DO, reverse: DO => O, base: OutputCache[DO]): OutputCache[O] = - new WrappedOutputCache[O,DO](convert, reverse, base) def cached[I,O](file: File)(f: I => O)(implicit cache: Cache[I,O]): I => O = in => @@ -36,25 +33,219 @@ object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImpl store(out) out } -} -trait BasicCacheImplicits extends NotNull -{ - implicit def basicInputCache[I](implicit format: Format[I], equiv: Equiv[I]): InputCache[I] = - new BasicInputCache(format, equiv) - implicit def basicOutputCache[O](implicit format: Format[O]): OutputCache[O] = - new BasicOutputCache(format) - implicit def ioCache[I,O](implicit input: InputCache[I], output: OutputCache[O]): Cache[I,O] = - new SeparatedCache(input, output) - implicit def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } + def debug[I](label: String, c: InputCache[I]): InputCache[I] = + new InputCache[I] + { + type Internal = c.Internal + def convert(i: I) = c.convert(i) + def read(from: Input) = + { + val v = c.read(from) + println(label + ".read: " + v) + v + } + def write(to: Output, v: Internal) + { + println(label + ".write: " + v) + c.write(to, v) + } + def equiv: Equiv[Internal] = new Equiv[Internal] { + def equiv(a: Internal, b: Internal)= + { + val equ = c.equiv.equiv(a,b) + println(label + ".equiv(" + a + ", " + b +"): " + equ) + equ + } + } + } } +trait CacheImplicits extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits with UnionImplicits +trait BasicCacheImplicits +{ + implicit def basicCache[I, O](implicit in: InputCache[I], outFormat: Format[O]): Cache[I,O] = + new BasicCache()(in, outFormat) + def basicInput[I](implicit eq: Equiv[I], fmt: Format[I]): InputCache[I] = InputCache.basicInputCache(fmt, eq) + + def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } + + implicit def optInputCache[T](implicit t: InputCache[T]): InputCache[Option[T]] = + new 
InputCache[Option[T]] + { + type Internal = Option[t.Internal] + def convert(v: Option[T]): Internal = v.map(x => t.convert(x)) + def read(from: Input) = + { + val isDefined = BooleanFormat.reads(from) + if(isDefined) Some(t.read(from)) else None + } + def write(to: Output, j: Internal): Unit = + { + BooleanFormat.writes(to, j.isDefined) + j foreach { x => t.write(to, x) } + } + def equiv = optEquiv(t.equiv) + } + + def wrapEquiv[S,T](f: S => T)(implicit eqT: Equiv[T]): Equiv[S] = + new Equiv[S] { + def equiv(a: S, b: S) = + eqT.equiv( f(a), f(b) ) + } + + implicit def optEquiv[T](implicit t: Equiv[T]): Equiv[Option[T]] = + new Equiv[Option[T]] { + def equiv(a: Option[T], b: Option[T]) = + (a,b) match + { + case (None, None) => true + case (Some(va), Some(vb)) => t.equiv(va, vb) + case _ => false + } + } + implicit def urlEquiv(implicit uriEq: Equiv[URI]): Equiv[URL] = wrapEquiv[URL, URI](_.toURI)(uriEq) + implicit def uriEquiv: Equiv[URI] = defaultEquiv + implicit def stringSetEquiv: Equiv[Set[String]] = defaultEquiv + implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv + + + implicit def xmlInputCache(implicit strEq: InputCache[String]): InputCache[NodeSeq] = wrapIn[NodeSeq, String](_.toString, strEq) + + implicit def seqCache[T](implicit t: InputCache[T]): InputCache[Seq[T]] = + new InputCache[Seq[T]] + { + type Internal = Seq[t.Internal] + def convert(v: Seq[T]) = v.map(x => t.convert(x)) + def read(from: Input) = + { + val size = IntFormat.reads(from) + def next(left: Int, acc: List[t.Internal]): Internal = + if(left <= 0) acc.reverse else next(left - 1, t.read(from) :: acc) + next(size, Nil) + } + def write(to: Output, vs: Internal) + { + val size = vs.length + IntFormat.writes(to, size) + for(v <- vs) t.write(to, v) + } + def equiv: Equiv[Internal] = seqEquiv(t.equiv) + } + + implicit def arrEquiv[T](implicit t: Equiv[T]): Equiv[Array[T]] = + wrapEquiv( (x: Array[T]) => x :Seq[T] )(seqEquiv[T](t)) + + implicit def seqEquiv[T](implicit t: 
Equiv[T]): Equiv[Seq[T]] = + new Equiv[Seq[T]] + { + def equiv(a: Seq[T], b: Seq[T]) = + a.length == b.length && + ((a,b).zipped forall t.equiv) + } + implicit def seqFormat[T](implicit t: Format[T]): Format[Seq[T]] = + wrap[Seq[T], List[T]](_.toList, _.toSeq)(DefaultProtocol.listFormat) + + def wrapIn[I,J](implicit f: I => J, jCache: InputCache[J]): InputCache[I] = + new InputCache[I] + { + type Internal = jCache.Internal + def convert(i: I) = jCache.convert(f(i)) + def read(from: Input) = jCache.read(from) + def write(to: Output, j: Internal) = jCache.write(to, j) + def equiv = jCache.equiv + } + + def singleton[T](t: T): InputCache[T] = + basicInput(trueEquiv, asSingleton(t)) + + def trueEquiv[T] = new Equiv[T] { def equiv(a: T, b: T) = true } +} + trait HListCacheImplicits { - implicit def hConsInputCache[H,T<:HList](implicit headCache: InputCache[H], tailCache: InputCache[T]): InputCache[H :+: T] = - new HConsInputCache(headCache, tailCache) - implicit lazy val hNilInputCache: InputCache[HNil] = new HNilInputCache - - implicit def hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[H :+: T] = - new HConsOutputCache(headCache, tailCache) - implicit lazy val hNilOutputCache: OutputCache[HNil] = new HNilOutputCache + implicit def hConsCache[H, T <: HList](implicit head: InputCache[H], tail: InputCache[T]): InputCache[H :+: T] = + new InputCache[H :+: T] + { + type Internal = (head.Internal, tail.Internal) + def convert(in: H :+: T) = (head.convert(in.head), tail.convert(in.tail)) + def read(from: Input) = + { + val h = head.read(from) + val t = tail.read(from) + (h, t) + } + def write(to: Output, j: Internal) + { + head.write(to, j._1) + tail.write(to, j._2) + } + def equiv = new Equiv[Internal] + { + def equiv(a: Internal, b: Internal) = + head.equiv.equiv(a._1, b._1) && + tail.equiv.equiv(a._2, b._2) + } + } + + implicit def hNilCache: InputCache[HNil] = Cache.singleton(HNil : HNil) } +trait UnionImplicits +{ 
+ def unionInputCache[UB, HL <: HList](implicit uc: UnionCache[HL, UB]): InputCache[UB] = + new InputCache[UB] + { + type Internal = Found[_] + def convert(in: UB) = uc.find(in) + def read(in: Input) = + { + val index = ByteFormat.reads(in) + val (cache, clazz) = uc.at(index) + val value = cache.read(in) + new Found[cache.Internal](cache, clazz, value, index) + } + def write(to: Output, i: Internal) + { + def write0[I](f: Found[I]) + { + ByteFormat.writes(to, f.index.toByte) + f.cache.write(to, f.value) + } + write0(i) + } + def equiv: Equiv[Internal] = new Equiv[Internal] + { + def equiv(a: Internal, b: Internal) = + { + if(a.clazz == b.clazz) + force(a.cache.equiv, a.value, b.value) + else + false + } + def force[T <: UB, UB](e: Equiv[T], a: UB, b: UB) = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) + } + } + + implicit def unionCons[H <: UB, UB, T <: HList](implicit head: InputCache[H], mf: Manifest[H], t: UnionCache[T, UB]): UnionCache[H :+: T, UB] = + new UnionCache[H :+: T, UB] + { + val size = 1 + t.size + def c = mf.erasure + def find(value: UB): Found[_] = + if(c.isInstance(value)) new Found[head.Internal](head, c, head.convert(value.asInstanceOf[H]), size - 1) else t.find(value) + def at(i: Int): (InputCache[_ <: UB], Class[_]) = if(size == i + 1) (head, c) else t.at(i) + } + + implicit def unionNil[UB]: UnionCache[HNil, UB] = new UnionCache[HNil, UB] { + def size = 0 + def find(value: UB) = error("No valid sum type for " + value) + def at(i: Int) = error("Invalid union index " + i) + } + + final class Found[I](val cache: InputCache[_] { type Internal = I }, val clazz: Class[_], val value: I, val index: Int) + sealed trait UnionCache[HL <: HList, UB] + { + def size: Int + def at(i: Int): (InputCache[_ <: UB], Class[_]) + def find(forValue: UB): Found[_] + } +} \ No newline at end of file diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index 425a8598d..5731a535e 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -30,15 +30,21 @@ 
private final case class FileHashModified(file: File, hash: List[Byte], lastModi object FileInfo { - sealed trait Style extends NotNull + implicit def existsInputCache: InputCache[PlainFileInfo] = exists.infoInputCache + implicit def modifiedInputCache: InputCache[ModifiedFileInfo] = lastModified.infoInputCache + implicit def hashInputCache: InputCache[HashFileInfo] = hash.infoInputCache + implicit def fullInputCache: InputCache[HashModifiedFileInfo] = full.infoInputCache + + sealed trait Style { type F <: FileInfo implicit def apply(file: File): F implicit def unapply(info: F): File = info.file implicit val format: Format[F] import Cache._ - implicit def infoInputCache: InputCache[File] = wrapInputCache[File,F] - implicit def infoOutputCache: OutputCache[File] = wrapOutputCache[File,F] + implicit def fileInfoEquiv: Equiv[F] = defaultEquiv + implicit def infoInputCache: InputCache[F] = basicInput + implicit def fileInputCache: InputCache[File] = wrapIn[File,F] } object full extends Style { @@ -71,10 +77,10 @@ object FileInfo } } -final case class FilesInfo[F <: FileInfo] private(files: Set[F]) extends NotNull +final case class FilesInfo[F <: FileInfo] private(files: Set[F]) object FilesInfo { - sealed abstract class Style extends NotNull + sealed abstract class Style { val fileStyle: FileInfo.Style type F = fileStyle.F @@ -85,8 +91,9 @@ object FilesInfo val manifest: Manifest[Format[FilesInfo[F]]] def empty: FilesInfo[F] = new FilesInfo(Set.empty) import Cache._ - implicit def infosInputCache: InputCache[Set[File]] = wrapInputCache[Set[File],FilesInfo[F]] - implicit def infosOutputCache: OutputCache[Set[File]] = wrapOutputCache[Set[File],FilesInfo[F]] + implicit def infosInputCache: InputCache[FilesInfo[F]] = basicInput + implicit def filesInputCache: InputCache[Set[File]] = wrapIn[Set[File],FilesInfo[F]] + implicit def filesInfoEquiv: Equiv[FilesInfo[F]] = defaultEquiv } private final class BasicStyle[FI <: FileInfo](val fileStyle: FileInfo.Style { type F = FI }) 
(implicit val manifest: Manifest[Format[FilesInfo[FI]]]) extends Style @@ -95,8 +102,8 @@ object FilesInfo implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo( files.map(_.getAbsoluteFile).map(fileStyle.apply) ) implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) } - lazy val full: Style = new BasicStyle(FileInfo.full) - lazy val hash: Style = new BasicStyle(FileInfo.hash) - lazy val lastModified: Style = new BasicStyle(FileInfo.lastModified) - lazy val exists: Style = new BasicStyle(FileInfo.exists) + lazy val full: Style { type F = HashModifiedFileInfo } = new BasicStyle(FileInfo.full) + lazy val hash: Style { type F = HashFileInfo } = new BasicStyle(FileInfo.hash) + lazy val lastModified: Style { type F = ModifiedFileInfo } = new BasicStyle(FileInfo.lastModified) + lazy val exists: Style { type F = PlainFileInfo } = new BasicStyle(FileInfo.exists) } \ No newline at end of file diff --git a/cache/HListCache.scala b/cache/HListCache.scala deleted file mode 100644 index 2bb3def3b..000000000 --- a/cache/HListCache.scala +++ /dev/null @@ -1,47 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package sbt - -import java.io.{InputStream,OutputStream} - -import Types._ -class HNilInputCache extends NoInputCache[HNil] -class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: InputCache[T]) extends InputCache[H :+: T] -{ - def uptodate(in: H :+: T)(cacheStream: InputStream) = - { - val headResult = headCache.uptodate(in.head)(cacheStream) - val tailResult = tailCache.uptodate(in.tail)(cacheStream) - new CacheResult - { - val uptodate = headResult.uptodate && tailResult.uptodate - def update(outputStream: OutputStream) = - { - headResult.update(outputStream) - tailResult.update(outputStream) - } - } - } - def force(in: H :+: T)(cacheStream: OutputStream) = - { - headCache.force(in.head)(cacheStream) - tailCache.force(in.tail)(cacheStream) - } -} - -class HNilOutputCache 
extends NoOutputCache[HNil](HNil) -class HConsOutputCache[H,T <: HList](val headCache: OutputCache[H], val tailCache: OutputCache[T]) extends OutputCache[H :+: T] -{ - def loadCached(cacheStream: InputStream) = - { - val head = headCache.loadCached(cacheStream) - val tail = tailCache.loadCached(cacheStream) - HCons(head, tail) - } - def update(out: H :+: T)(cacheStream: OutputStream) - { - headCache.update(out.head)(cacheStream) - tailCache.update(out.tail)(cacheStream) - } -} \ No newline at end of file diff --git a/cache/NoCache.scala b/cache/NoCache.scala deleted file mode 100644 index bdb9c4f1b..000000000 --- a/cache/NoCache.scala +++ /dev/null @@ -1,22 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package sbt - -import java.io.{InputStream,OutputStream} - -class NoInputCache[T] extends InputCache[T] -{ - def uptodate(in: T)(cacheStream: InputStream) = - new CacheResult - { - def uptodate = true - def update(outputStream: OutputStream) {} - } - def force(in: T)(outputStream: OutputStream) {} -} -class NoOutputCache[O](create: => O) extends OutputCache[O] -{ - def loadCached(cacheStream: InputStream) = create - def update(out: O)(cacheStream: OutputStream) {} -} \ No newline at end of file diff --git a/cache/SeparatedCache.scala b/cache/SeparatedCache.scala index 6509e05bf..59bbbe379 100644 --- a/cache/SeparatedCache.scala +++ b/cache/SeparatedCache.scala @@ -3,86 +3,56 @@ */ package sbt -import sbinary.Format -import sbinary.JavaIO._ +import Types.:+: +import sbinary.{DefaultProtocol, Format, Input, JavaIO, Output} +import DefaultProtocol.ByteFormat +import JavaIO._ import java.io.{File, InputStream, OutputStream} -trait CacheResult +trait InputCache[I] { - def uptodate: Boolean - def update(stream: OutputStream): Unit + type Internal + def convert(i: I): Internal + def read(from: Input): Internal + def write(to: Output, j: Internal): Unit + def equiv: Equiv[Internal] } -class ForceResult[I](inCache: InputCache[I])(in: I) extends 
CacheResult +object InputCache { - def uptodate = false - def update(stream: OutputStream) = inCache.force(in)(stream) + implicit def basicInputCache[I](implicit fmt: Format[I], eqv: Equiv[I]): InputCache[I] = + new InputCache[I] + { + type Internal = I + def convert(i: I) = i + def read(from: Input): I = fmt.reads(from) + def write(to: Output, i: I) = fmt.writes(to, i) + def equiv = eqv + } } -trait InputCache[I] extends NotNull -{ - def uptodate(in: I)(cacheStream: InputStream): CacheResult - def force(in: I)(cacheStream: OutputStream): Unit -} -trait OutputCache[O] extends NotNull -{ - def loadCached(cacheStream: InputStream): O - def update(out: O)(cacheStream: OutputStream): Unit -} -class SeparatedCache[I,O](input: InputCache[I], output: OutputCache[O]) extends Cache[I,O] + +class BasicCache[I,O](implicit input: InputCache[I], outFormat: Format[O]) extends Cache[I,O] { def apply(file: File)(in: I) = - try { applyImpl(file, in) } - catch { case _: Exception => Right(update(file)(in)) } - protected def applyImpl(file: File, in: I) = + { + val j = input.convert(in) + try { applyImpl(file, j) } + catch { case e: Exception => Right(update(file)(j)) } + } + protected def applyImpl(file: File, in: input.Internal) = { Using.fileInputStream(file) { stream => - val cache = input.uptodate(in)(stream) - lazy val doUpdate = (result: O) => - { - Using.fileOutputStream(false)(file) { stream => - cache.update(stream) - output.update(result)(stream) - } - } - if(cache.uptodate) - try { Left(output.loadCached(stream)) } - catch { case _: Exception => Right(doUpdate) } + val previousIn = input.read(stream) + if(input.equiv.equiv(in, previousIn)) + Left(outFormat.reads(stream)) else - Right(doUpdate) + Right(update(file)(in)) } } - protected def update(file: File)(in: I)(out: O) + protected def update(file: File)(in: input.Internal) = (out: O) => { Using.fileOutputStream(false)(file) { stream => - input.force(in)(stream) - output.update(out)(stream) + input.write(stream, in) + 
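The BasicCache introduced here persists the converted input next to the output and replays the stored output only when the recorded input is equivalent to the new one; otherwise it reruns the function and rewrites the cache file. A minimal standalone Java analogue of that check-then-recompute loop (class name and file layout are illustrative, not sbt's; for simplicity it assumes single-line string inputs and outputs):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.function.Function;

/** Recomputes f(input) only when the input recorded on disk differs from the current one. */
final class FileBackedCache {
    private final Path cacheFile;

    FileBackedCache(Path cacheFile) { this.cacheFile = cacheFile; }

    String apply(String input, Function<String, String> f) {
        try {
            if (Files.exists(cacheFile)) {
                // Cache layout: line 1 records the input, line 2 the output it produced.
                List<String> lines = Files.readAllLines(cacheFile);
                if (lines.size() == 2 && lines.get(0).equals(input))
                    return lines.get(1); // up to date: reuse the cached output
            }
            String out = f.apply(input); // missing or stale: recompute and rewrite the cache
            Files.write(cacheFile, List.of(input, out));
            return out;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Like BasicCache, the comparison happens against the deserialized previous input, so the function body never runs when nothing changed.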
outFormat.writes(stream, out) } } -} -class BasicOutputCache[O](val format: Format[O]) extends OutputCache[O] -{ - def loadCached(cacheStream: InputStream): O = format.reads(cacheStream) - def update(out: O)(cacheStream: OutputStream): Unit = format.writes(cacheStream, out) -} -class BasicInputCache[I](val format: Format[I], val equiv: Equiv[I]) extends InputCache[I] -{ - def uptodate(in: I)(cacheStream: InputStream) = - { - val loaded = format.reads(cacheStream) - new CacheResult - { - val uptodate = equiv.equiv(in, loaded) - def update(outputStream: OutputStream) = force(in)(outputStream) - } - } - def force(in: I)(outputStream: OutputStream) = format.writes(outputStream, in) -} -class WrappedInputCache[I,DI](val convert: I => DI, val base: InputCache[DI]) extends InputCache[I] -{ - def uptodate(in: I)(cacheStream: InputStream) = base.uptodate(convert(in))(cacheStream) - def force(in: I)(outputStream: OutputStream) = base.force(convert(in))(outputStream) -} -class WrappedOutputCache[O,DO](val convert: O => DO, val reverse: DO => O, val base: OutputCache[DO]) extends OutputCache[O] -{ - def loadCached(cacheStream: InputStream): O = reverse(base.loadCached(cacheStream)) - def update(out: O)(cacheStream: OutputStream): Unit = base.update(convert(out))(cacheStream) } \ No newline at end of file diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 798adbb11..6fe399821 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -5,7 +5,7 @@ package sbt import java.io.{File,IOException} import CacheIO.{fromFile, toFile} -import sbinary.Format +import sbinary.{Format, JavaIO} import scala.reflect.Manifest import IO.{delete, read, write} @@ -18,8 +18,8 @@ object Tracked * In both cases, the timestamp is not updated if the function throws an exception.*/ def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) /** Creates a tracker that only evaluates a function when the 
input has changed.*/ - def changed[O](cacheFile: File)(implicit input: InputCache[O]): Changed[O] = - new Changed[O](cacheFile)(input) + def changed[O](cacheFile: File)(implicit format: Format[O], equiv: Equiv[O]): Changed[O] = + new Changed[O](cacheFile) /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ def diffInputs(cache: File, style: FilesInfo.Style): Difference = @@ -52,22 +52,29 @@ class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } } -class Changed[O](val cacheFile: File)(implicit input: InputCache[O]) extends Tracked +class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O]) extends Tracked { def clean = delete(cacheFile) def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => { - val cache = - try { Using.fileInputStream(cacheFile)(input.uptodate(value)) } - catch { case _: IOException => new ForceResult(input)(value) } - if(cache.uptodate) + if(uptodate(value)) ifUnchanged(value) else { - Using.fileOutputStream(false)(cacheFile)(cache.update) + update(value) ifChanged(value) } } + import JavaIO._ + def update(value: O): Unit = Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) + def uptodate(value: O): Boolean = + try { + Using.fileInputStream(cacheFile) { + stream => equiv.equiv(value, format.reads(stream)) + } + } catch { + case _: IOException => false + } } object Difference { From 4a0461c34f1bbde528294fda32ffbe1f58c99107 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 30 Oct 2010 11:54:43 -0400 Subject: [PATCH 089/823] minor updates to utilities --- util/collection/Relation.scala | 50 +++++++++++++++++------------ util/collection/TypeFunctions.scala | 6 ++-- 2 files changed, 33 insertions(+), 23 deletions(-) diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index a282bf7e2..b305bb47c 
100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -3,12 +3,40 @@ */ package sbt + import Relation._ + object Relation { /** Constructs a new immutable, finite relation that is initially empty. */ def empty[A,B]: Relation[A,B] = make(Map.empty, Map.empty) def make[A,B](forward: Map[A,Set[B]], reverse: Map[B, Set[A]]): Relation[A,B] = new MRelation(forward, reverse) + def reconstruct[A,B](forward: Map[A, Set[B]]): Relation[A,B] = + { + val reversePairs = for( (a,bs) <- forward.view; b <- bs.view) yield (b, a) + val reverse = (Map.empty[B,Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } + make(forward, reverse) + } + + + private[sbt] def remove[X,Y](map: M[X,Y], from: X, to: Y): M[X,Y] = + map.get(from) match { + case Some(tos) => + val newSet = tos - to + if(newSet.isEmpty) map - from else map.updated(from, newSet) + case None => map + } + + private[sbt] def combine[X,Y](a: M[X,Y], b: M[X,Y]): M[X,Y] = + (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } + + private[sbt] def add[X,Y](map: M[X,Y], from: X, to: Iterable[Y]): M[X,Y] = + map.updated(from, get(map, from) ++ to) + + private[sbt] def get[X,Y](map: M[X,Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) + + private[sbt] type M[X,Y] = Map[X, Set[Y]] } + /** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. 
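MRelation stores the relation as a pair of maps, forward: Map[A, Set[B]] and reverse: Map[B, Set[A]], updated together so lookups in both directions are cheap; the remove helper above also drops a key once its set becomes empty. A standalone Java sketch of the same invariant (names are illustrative, not sbt API):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/** A finite binary relation kept as two synchronized maps for cheap lookups both ways. */
final class Relation<A, B> {
    private final Map<A, Set<B>> fwd = new HashMap<>();
    private final Map<B, Set<A>> rev = new HashMap<>();

    void add(A a, B b) {
        fwd.computeIfAbsent(a, k -> new HashSet<>()).add(b);
        rev.computeIfAbsent(b, k -> new HashSet<>()).add(a);
    }
    Set<B> forward(A a) { return fwd.getOrDefault(a, Set.of()); }
    Set<A> reverse(B b) { return rev.getOrDefault(b, Set.of()); }

    void remove(A a, B b) {
        removeFrom(fwd, a, b);
        removeFrom(rev, b, a);
    }
    private static <X, Y> void removeFrom(Map<X, Set<Y>> m, X from, Y to) {
        Set<Y> s = m.get(from);
        if (s != null) {
            s.remove(to);
            if (s.isEmpty()) m.remove(from); // drop empty keys, as sbt's remove helper does
        }
    }
}
```

Keeping both maps in sync on every mutation is the price paid for O(1) `reverse` queries, which sbt needs to answer "which sources produced this class file" without scanning the forward map.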
*/ trait Relation[A,B] { @@ -49,8 +77,6 @@ trait Relation[A,B] } private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A,B] { - type M[X,Y] = Map[X, Set[Y]] - def forwardMap = fwd def reverseMap = rev @@ -65,9 +91,9 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable def +(pair: (A,B)) = this + (pair._1, Set(pair._2)) - def +(from: A, to: B) = this + (from, Set(to)) + def +(from: A, to: B) = this + (from, to :: Nil) def +(from: A, to: Iterable[B]) = - new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, Seq(from)) }) + new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) def ++(rs: Iterable[(A,B)]) = ((this: Relation[A,B]) /: rs) { _ + _ } def ++(other: Relation[A,B]) = new MRelation[A,B]( combine(fwd, other.forwardMap), combine(rev, other.reverseMap) ) @@ -84,21 +110,5 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext case None => this } - private def remove[X,Y](map: M[X,Y], from: X, to: Y): M[X,Y] = - map.get(from) match { - case Some(tos) => - val newSet = tos - to - if(newSet.isEmpty) map - from else map.updated(from, newSet) - case None => map - } - - private def combine[X,Y](a: M[X,Y], b: M[X,Y]): M[X,Y] = - (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } - - private[this] def add[X,Y](map: M[X,Y], from: X, to: Iterable[Y]): M[X,Y] = - map.updated(from, get(map, from) ++ to) - - private[this] def get[X,Y](map: M[X,Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) - override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") } \ No newline at end of file diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 92a5fb4b7..ca173f9bc 100644 --- a/util/collection/TypeFunctions.scala +++ 
b/util/collection/TypeFunctions.scala @@ -6,9 +6,9 @@ package sbt trait TypeFunctions { type Id[X] = X - trait Const[A] { type Apply[B] = A } - trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } - trait P1of2[M[_,_], A] { type Apply[B] = M[A,B]; type Flip[B] = M[B, A] } + sealed trait Const[A] { type Apply[B] = A } + sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } + sealed trait P1of2[M[_,_], A] { type Apply[B] = M[A,B]; type Flip[B] = M[B, A] } final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } From d85f438035fbc8c2e4e0d36aee65ebaee9a3f7e7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 30 Oct 2010 11:55:47 -0400 Subject: [PATCH 090/823] make serializable abstract Lazy template --- .../src/main/java/xsbti/api/AbstractLazy.java | 26 +++++++++++++++++++ interface/src/main/java/xsbti/api/Lazy.java | 2 +- 2 files changed, 27 insertions(+), 1 deletion(-) create mode 100644 interface/src/main/java/xsbti/api/AbstractLazy.java diff --git a/interface/src/main/java/xsbti/api/AbstractLazy.java b/interface/src/main/java/xsbti/api/AbstractLazy.java new file mode 100644 index 000000000..bd21f166f --- /dev/null +++ b/interface/src/main/java/xsbti/api/AbstractLazy.java @@ -0,0 +1,26 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package xsbti.api; + +import java.io.ObjectStreamException; +public abstract class AbstractLazy<T> implements Lazy<T>, java.io.Serializable +{ + private Object writeReplace() throws ObjectStreamException + { + return new StrictLazy<T>(get()); + } + private static final class StrictLazy<T> implements Lazy<T>, java.io.Serializable + { + private final T value; + StrictLazy(T t) + { + value = t; + } + public T get() + { + return value; + } + } +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/api/Lazy.java b/interface/src/main/java/xsbti/api/Lazy.java index 4a5642a01..1ee29b013
100644 --- a/interface/src/main/java/xsbti/api/Lazy.java +++ b/interface/src/main/java/xsbti/api/Lazy.java @@ -1,5 +1,5 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah + * Copyright 2010 Mark Harrah */ package xsbti.api; From 9fcd42db32f5f5538f6bc87bbb74bc17d5401843 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 30 Oct 2010 17:46:56 -0400 Subject: [PATCH 091/823] Type cache in API extraction for smaller cache size and faster I/O manually implement Modifiers, use byte-size bit field --- interface/other | 8 --- .../src/main/java/xsbti/api/Modifiers.java | 69 +++++++++++++++++++ 2 files changed, 69 insertions(+), 8 deletions(-) create mode 100644 interface/src/main/java/xsbti/api/Modifiers.java diff --git a/interface/other b/interface/other index bba27dc7f..2ee086bcb 100644 --- a/interface/other +++ b/interface/other @@ -18,14 +18,6 @@ Qualifier IdQualifier value: String -Modifiers - isAbstract: Boolean - isOverride: Boolean - isFinal: Boolean - isSealed: Boolean - isImplicit: Boolean - isLazy: Boolean - ParameterList parameters: MethodParameter* isImplicit: Boolean diff --git a/interface/src/main/java/xsbti/api/Modifiers.java b/interface/src/main/java/xsbti/api/Modifiers.java new file mode 100644 index 000000000..14737be57 --- /dev/null +++ b/interface/src/main/java/xsbti/api/Modifiers.java @@ -0,0 +1,69 @@ +package xsbti.api; + +public final class Modifiers implements java.io.Serializable +{ + private static final int AbstractBit = 0; + private static final int OverrideBit = 1; + private static final int FinalBit = 2; + private static final int SealedBit = 3; + private static final int ImplicitBit = 4; + private static final int LazyBit = 5; + + private static final int flag(boolean set, int bit) + { + return set ? 
(1 << bit) : 0; + } + + public Modifiers(boolean isAbstract, boolean isOverride, boolean isFinal, boolean isSealed, boolean isImplicit, boolean isLazy) + { + this.flags = (byte)( + flag(isAbstract, AbstractBit) | + flag(isOverride, OverrideBit) | + flag(isFinal, FinalBit) | + flag(isSealed, SealedBit) | + flag(isImplicit, ImplicitBit) | + flag(isLazy, LazyBit) + ); + } + + private final byte flags; + + private final boolean flag(int bit) + { + return (flags & (1 << bit)) != 0; + } + + public final byte raw() + { + return flags; + } + + public final boolean isAbstract() + { + return flag(AbstractBit); + } + public final boolean isOverride() + { + return flag(OverrideBit); + } + public final boolean isFinal() + { + return flag(FinalBit); + } + public final boolean isSealed() + { + return flag(SealedBit); + } + public final boolean isImplicit() + { + return flag(ImplicitBit); + } + public final boolean isLazy() + { + return flag(LazyBit); + } + public String toString() + { + return "Modifiers(" + "isAbstract: " + isAbstract() + ", " + "isOverride: " + isOverride() + ", " + "isFinal: " + isFinal() + ", " + "isSealed: " + isSealed() + ", " + "isImplicit: " + isImplicit() + ", " + "isLazy: " + isLazy()+ ")"; + } +} From 53ab627df96f8ab269ed91ba5b70673c03859a93 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 9 Nov 2010 20:43:58 -0500 Subject: [PATCH 092/823] fix cache test --- cache/Cache.scala | 16 ++++++++++++++++ cache/src/test/scala/CacheTest.scala | 2 ++ 2 files changed, 18 insertions(+) diff --git a/cache/Cache.scala b/cache/Cache.scala index 0cf5ccd12..238773b18 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -188,6 +188,22 @@ trait HListCacheImplicits } implicit def hNilCache: InputCache[HNil] = Cache.singleton(HNil : HNil) + + implicit def hConsFormat[H, T <: HList](implicit head: Format[H], tail: Format[T]): Format[H :+: T] = new Format[H :+: T] { + def reads(from: Input) = + { + val h = head.reads(from) + val t = tail.reads(from) + HCons(h, t) + 
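hConsFormat above serializes a heterogeneous list by writing the head with its own Format and then recursing into the tail, so reads must consume components in exactly the write order. The same sequential composition in plain Java, reduced to pairs (Fmt and the helper names are inventions for this sketch, not sbt or sbinary API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

interface Fmt<T> {
    void write(DataOutputStream out, T t) throws IOException;
    T read(DataInputStream in) throws IOException;
}

final class Formats {
    static final Fmt<Integer> INT = new Fmt<Integer>() {
        public void write(DataOutputStream out, Integer t) throws IOException { out.writeInt(t); }
        public Integer read(DataInputStream in) throws IOException { return in.readInt(); }
    };
    static final Fmt<String> STR = new Fmt<String>() {
        public void write(DataOutputStream out, String t) throws IOException { out.writeUTF(t); }
        public String read(DataInputStream in) throws IOException { return in.readUTF(); }
    };

    /** Compose two formats: the first component is written, then the second; reads mirror that order. */
    static <A, B> Fmt<Map.Entry<A, B>> pair(Fmt<A> fa, Fmt<B> fb) {
        return new Fmt<Map.Entry<A, B>>() {
            public void write(DataOutputStream out, Map.Entry<A, B> p) throws IOException {
                fa.write(out, p.getKey());   // head first...
                fb.write(out, p.getValue()); // ...then tail, exactly like hConsFormat
            }
            public Map.Entry<A, B> read(DataInputStream in) throws IOException {
                A a = fa.read(in);           // must consume fields in the write order
                B b = fb.read(in);
                return new SimpleEntry<>(a, b);
            }
        };
    }

    /** Round-trip helper: serialize then deserialize through a byte array. */
    static <T> T roundTrip(Fmt<T> fmt, T value) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            fmt.write(new DataOutputStream(bos), value);
            return fmt.read(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Because no tags or lengths are written between components, the composed format stays compact, which is the point of the "smaller cache size and faster I/O" commit this diff belongs to.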
} + def writes(to: Output, hc: H :+: T) + { + head.writes(to, hc.head) + tail.writes(to, hc.tail) + } + } + + implicit def hNilFormat: Format[HNil] = asSingleton(HNil) } trait UnionImplicits { diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index ad6085fc1..481bfb9b6 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -10,6 +10,8 @@ object CacheTest// extends Properties("Cache test") import Cache._ import FileInfo.hash._ + import Ordering._ + import sbinary.DefaultProtocol.FileFormat def test { lazy val create = new File("test") From 23471ae3cb04c090ea4959b25afe5278d82be77a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 24 Nov 2010 14:04:20 -0500 Subject: [PATCH 093/823] cleanup --- cache/tracking/Tracked.scala | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 6fe399821..e32841a77 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -7,6 +7,7 @@ import java.io.{File,IOException} import CacheIO.{fromFile, toFile} import sbinary.{Format, JavaIO} import scala.reflect.Manifest +import scala.collection.mutable import IO.{delete, read, write} @@ -78,17 +79,16 @@ class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O } object Difference { - sealed class Constructor private[Difference](defineClean: Boolean, filesAreOutputs: Boolean) extends NotNull - { - def apply(cache: File, style: FilesInfo.Style): Difference = new Difference(cache, style, defineClean, filesAreOutputs) - } + def constructor(defineClean: Boolean, filesAreOutputs: Boolean): (File, FilesInfo.Style) => Difference = + (cache, style) => new Difference(cache, style, defineClean, filesAreOutputs) + /** Provides a constructor for a Difference that removes the files from the previous run on a call to 'clean' and saves the * hash/last modified time of the files as they are after 
running the function. This means that this information must be evaluated twice: * before and after running the function.*/ - object outputs extends Constructor(true, true) + val outputs = constructor(true, true) /** Provides a constructor for a Difference that does nothing on a call to 'clean' and saves the * hash/last modified time of the files as they were prior to running the function.*/ - object inputs extends Constructor(false, false) + val inputs = constructor(false, false) } class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { From 8ed0f36dbea37f8537c3721af691290ea4dc6893 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 24 Nov 2010 14:18:59 -0500 Subject: [PATCH 094/823] TaskGroups, Context in tasks, new tasks add syncTask task constructor and copy-resources/copy-test-resources instances add console-quick, test-console, console, test-run add IntegrationTest trait make Context available through 'context' task update 'last' and 'show' to use Context to retrieve task by name drop SingleProject (superseded by Project) add TaskGroup to be able to inject groups of named tasks fix watchPaths missing flat sources proper logging in a few more places, such as compile --- util/collection/Relation.scala | 27 ++++++++++++++++++--------- 1 file changed, 18 insertions(+), 9 deletions(-) diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index b305bb47c..c5195ffb7 100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -29,7 +29,7 @@ object Relation private[sbt] def combine[X,Y](a: M[X,Y], b: M[X,Y]): M[X,Y] = (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } - private[sbt] def add[X,Y](map: M[X,Y], from: X, to: Iterable[Y]): M[X,Y] = + private[sbt] def add[X,Y](map: M[X,Y], from: X, to: Traversable[Y]): M[X,Y] = map.updated(from, get(map, from) ++ to) private[sbt] def get[X,Y](map: M[X,Y], t: X): Set[Y] = 
map.getOrElse(t, Set.empty[Y]) @@ -49,15 +49,15 @@ trait Relation[A,B] /** Includes the relation (a, b). */ def +(a: A, b: B): Relation[A,B] /** Includes the relations (a, b) for all b in bs. */ - def +(a: A, bs: Iterable[B]): Relation[A,B] + def +(a: A, bs: Traversable[B]): Relation[A,B] /** Returns the union of the relation r with this relation. */ def ++(r: Relation[A,B]): Relation[A,B] /** Includes the given relations. */ - def ++(rs: Iterable[(A,B)]): Relation[A,B] + def ++(rs: Traversable[(A,B)]): Relation[A,B] /** Removes all relations (_1, _2) for all _1 in _1s. */ - def --(_1s: Iterable[A]): Relation[A,B] + def --(_1s: Traversable[A]): Relation[A,B] /** Removes all `pairs` from this relation. */ - def --(pairs: Traversable[(A,B)]): Relation[A,B] + def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] /** Removes all pairs (_1, _2) from this relation. */ def -(_1: A): Relation[A,B] /** Removes `pair` from this relation. */ @@ -69,6 +69,11 @@ trait Relation[A,B] /** Returns the number of pairs in this relation */ def size: Int + /** Returns true iff (a,b) is in this relation*/ + def contains(a: A, b: B): Boolean + /** Returns a relation with only pairs (a,b) for which f(a,b) is true.*/ + def filter(f: (A,B) => Boolean): Relation[A,B] + /** Returns all pairs in this relation.*/ def all: Traversable[(A,B)] @@ -92,14 +97,14 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def +(pair: (A,B)) = this + (pair._1, Set(pair._2)) def +(from: A, to: B) = this + (from, to :: Nil) - def +(from: A, to: Iterable[B]) = + def +(from: A, to: Traversable[B]) = new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) - def ++(rs: Iterable[(A,B)]) = ((this: Relation[A,B]) /: rs) { _ + _ } + def ++(rs: Traversable[(A,B)]) = ((this: Relation[A,B]) /: rs) { _ + _ } def ++(other: Relation[A,B]) = new MRelation[A,B]( combine(fwd, other.forwardMap), combine(rev, other.reverseMap) ) - def --(ts: Iterable[A]): 
Relation[A,B] = ((this: Relation[A,B]) /: ts) { _ - _ } - def --(pairs: Traversable[(A,B)]): Relation[A,B] = ((this: Relation[A,B]) /: pairs) { _ - _ } + def --(ts: Traversable[A]): Relation[A,B] = ((this: Relation[A,B]) /: ts) { _ - _ } + def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] = ((this: Relation[A,B]) /: pairs) { _ - _ } def -(pair: (A,B)): Relation[A,B] = new MRelation( remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1) ) def -(t: A): Relation[A,B] = @@ -110,5 +115,9 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext case None => this } + def filter(f: (A,B) => Boolean): Relation[A,B] = Relation.empty[A,B] ++ all.filter(f.tupled) + + def contains(a: A, b: B): Boolean = forward(a)(b) + override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") } \ No newline at end of file From 1cd848cd9b4c25dffd352c1e51cf852eb1281f01 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 2 Dec 2010 19:14:30 -0500 Subject: [PATCH 095/823] introduce sbt.log.format for explicit formatting control implements issue #134 if true, formatting enabled if false, formatting disabled if unset, formatting configured as before sbt.log.noformat is no longer recommended, but is supported: a. setting it to 'true' explicitly disables formatting b. if 'false' or unspecified, autodetection is used c. 
sbt.log.format takes precedence if defined --- util/log/ConsoleLogger.scala | 8 ++++++-- 1 file changed, 6 insertions(+), 2 deletions(-) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index 59bc680c3..0f9e78b22 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -21,9 +21,13 @@ object ConsoleLogger def println() = out.println() } - private val formatEnabled = ansiSupported && !formatExplicitlyDisabled + val formatEnabled = + { + import java.lang.Boolean.{getBoolean, parseBoolean} + val value = System.getProperty("sbt.log.format") + if(value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) + } - private[this] def formatExplicitlyDisabled = java.lang.Boolean.getBoolean("sbt.log.noformat") private[this] def ansiSupported = try { jline.Terminal.getTerminal.isANSISupported } catch { case e: Exception => !isWindows } From f1f8c0eb0b14bde223b2978cf4afcfd8cabf7a1d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 2 Dec 2010 19:45:58 -0500 Subject: [PATCH 096/823] Format for types that can be read/written to/from InputStream/OutputStream use case: java.util.jar.Manifest --- cache/Cache.scala | 10 ++++++++-- 1 file changed, 8 insertions(+), 2 deletions(-) diff --git a/cache/Cache.scala b/cache/Cache.scala index 238773b18..21e89abc4 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -4,7 +4,7 @@ package sbt import sbinary.{CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output} -import java.io.File +import java.io.{ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream} import java.net.{URI, URL} import Types.:+: import DefaultProtocol.{asProduct2, asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap} @@ -108,7 +108,13 @@ trait BasicCacheImplicits implicit def stringSetEquiv: Equiv[Set[String]] = defaultEquiv implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv - + def streamFormat[T](write: (T, OutputStream) => Unit, f: 
InputStream => T): Format[T] = + { + val toBytes = (t: T) => { val bos = new ByteArrayOutputStream; write(t, bos); bos.toByteArray } + val fromBytes = (bs: Array[Byte]) => f(new ByteArrayInputStream(bs)) + wrap(toBytes, fromBytes)(DefaultProtocol.ByteArrayFormat) + } + implicit def xmlInputCache(implicit strEq: InputCache[String]): InputCache[NodeSeq] = wrapIn[NodeSeq, String](_.toString, strEq) implicit def seqCache[T](implicit t: InputCache[T]): InputCache[Seq[T]] = From f075adb25b1e8bceb4192be71d75191266877568 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 2 Dec 2010 19:51:56 -0500 Subject: [PATCH 097/823] fix FilesInfo style type member, fix PlainFileInfo to track existence --- cache/FileInfo.scala | 20 +++++++++++++------- 1 file changed, 13 insertions(+), 7 deletions(-) diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index 5731a535e..ae626b827 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -21,9 +21,12 @@ sealed trait ModifiedFileInfo extends FileInfo val lastModified: Long } sealed trait PlainFileInfo extends FileInfo +{ + def exists: Boolean +} sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo -private final case class PlainFile(file: File) extends PlainFileInfo +private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo @@ -72,8 +75,8 @@ object FileInfo { type F = PlainFileInfo implicit def apply(file: File): PlainFileInfo = make(file) - def make(file: File): PlainFileInfo = PlainFile(file.getAbsoluteFile) - implicit val format: Format[PlainFileInfo] = wrap(_.file, make) + def make(file: File): PlainFileInfo = { val abs = file.getAbsoluteFile; PlainFile(abs, abs.exists) } + 
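The streamFormat helper in the previous patch adapts stream-oriented read/write functions into a Format by buffering through a byte array, with java.util.jar.Manifest named in the commit message as the motivating case. That round trip looks like this in plain Java (class name is illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

final class ManifestBytes {
    /** Serialize via Manifest's stream-based write, mirroring streamFormat's toBytes. */
    static byte[] toBytes(Manifest m) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            m.write(bos);
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    /** Rebuild via Manifest's stream-based constructor, mirroring streamFormat's fromBytes. */
    static Manifest fromBytes(byte[] bytes) {
        try {
            return new Manifest(new ByteArrayInputStream(bytes));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Note that Manifest.write only emits attributes when the Manifest-Version main attribute is set, so callers must populate it before round-tripping.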
implicit val format: Format[PlainFileInfo] = asProduct2[PlainFileInfo, File, Boolean](PlainFile.apply)(x => (x.file, x.exists)) } } @@ -82,22 +85,25 @@ object FilesInfo { sealed abstract class Style { - val fileStyle: FileInfo.Style - type F = fileStyle.F + type F <: FileInfo + val fileStyle: FileInfo.Style { type F = Style.this.F } + //def manifest: Manifest[F] = fileStyle.manifest implicit def apply(files: Set[File]): FilesInfo[F] implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) implicit val formats: Format[FilesInfo[F]] val manifest: Manifest[Format[FilesInfo[F]]] - def empty: FilesInfo[F] = new FilesInfo(Set.empty) + def empty: FilesInfo[F] = new FilesInfo[F](Set.empty) import Cache._ implicit def infosInputCache: InputCache[FilesInfo[F]] = basicInput implicit def filesInputCache: InputCache[Set[File]] = wrapIn[Set[File],FilesInfo[F]] implicit def filesInfoEquiv: Equiv[FilesInfo[F]] = defaultEquiv } - private final class BasicStyle[FI <: FileInfo](val fileStyle: FileInfo.Style { type F = FI }) + private final class BasicStyle[FI <: FileInfo](style: FileInfo.Style { type F = FI }) (implicit val manifest: Manifest[Format[FilesInfo[FI]]]) extends Style { + type F = FI + val fileStyle: FileInfo.Style { type F = FI } = style private implicit val infoFormat: Format[FI] = fileStyle.format implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo( files.map(_.getAbsoluteFile).map(fileStyle.apply) ) implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) From 93fb4f3dcaf4faefda4de81b49261e8424aba656 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 2 Dec 2010 19:53:14 -0500 Subject: [PATCH 098/823] more useful, stackable version of Tracked.changed --- cache/tracking/Tracked.scala | 44 +++++++++++++++++++++++++++++++++--- 1 file changed, 41 insertions(+), 3 deletions(-) diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index e32841a77..d88518ce7 100644 --- 
a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -19,8 +19,8 @@ object Tracked * In both cases, the timestamp is not updated if the function throws an exception.*/ def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) /** Creates a tracker that only evaluates a function when the input has changed.*/ - def changed[O](cacheFile: File)(implicit format: Format[O], equiv: Equiv[O]): Changed[O] = - new Changed[O](cacheFile) + //def changed[O](cacheFile: File)(implicit format: Format[O], equiv: Equiv[O]): Changed[O] = + // new Changed[O](cacheFile) /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ def diffInputs(cache: File, style: FilesInfo.Style): Difference = @@ -28,9 +28,47 @@ object Tracked /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ def diffOutputs(cache: File, style: FilesInfo.Style): Difference = Difference.outputs(cache, style) + + import sbinary.JavaIO._ + + def inputChanged[I,O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in => + { + val help = new CacheHelp(ic) + val conv = help.convert(in) + val changed = help.changed(cacheFile, conv) + val result = f(changed, in) + + if(changed) + help.save(cacheFile, conv) + + result + } + def outputChanged[I,O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): (() => I) => O = in => + { + val initial = in() + val help = new CacheHelp(ic) + val changed = help.changed(cacheFile, help.convert(initial)) + val result = f(changed, initial) + + if(changed) + help.save(cacheFile, help.convert(in())) + + result + } + final class CacheHelp[I](val ic: InputCache[I]) + { + def convert(i: I): ic.Internal = ic.convert(i) + def save(cacheFile: File, value: ic.Internal): Unit = + Using.fileOutputStream()(cacheFile)(out => ic.write(out, value) ) + def changed(cacheFile: File, converted: 
ic.Internal): Boolean = + try { + val prev = Using.fileInputStream(cacheFile)(x => ic.read(x)) + !ic.equiv.equiv(converted, prev) + } catch { case e: Exception => true } + } } -trait Tracked extends NotNull +trait Tracked { /** Cleans outputs and clears the cache.*/ def clean: Unit From 29efa529cdd99c5af5182b4a4f22aecb1b683b4d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 6 Dec 2010 19:48:49 -0500 Subject: [PATCH 099/823] parser combinators with builtin tab completion support lacks memoization lacks error messages for normal parsing --- util/complete/Completions.scala | 103 ++++++ util/complete/Parser.scala | 328 ++++++++++++++++++ util/complete/UpperBound.scala | 43 +++ util/complete/src/test/scala/ParserTest.scala | 26 ++ 4 files changed, 500 insertions(+) create mode 100644 util/complete/Completions.scala create mode 100644 util/complete/Parser.scala create mode 100644 util/complete/UpperBound.scala create mode 100644 util/complete/src/test/scala/ParserTest.scala diff --git a/util/complete/Completions.scala b/util/complete/Completions.scala new file mode 100644 index 000000000..db7b628e2 --- /dev/null +++ b/util/complete/Completions.scala @@ -0,0 +1,103 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt.parse + +/** +* Represents a set of completions. +* It exists instead of implicitly defined operations on top of Set[Completion] +* for laziness. +*/ +sealed trait Completions +{ + def get: Set[Completion] + final def x(o: Completions): Completions = Completions( for(cs <- get; os <- o.get) yield cs ++ os ) + final def ++(o: Completions): Completions = Completions( get ++ o.get ) + final def +:(o: Completion): Completions = Completions(get + o) + override def toString = get.mkString("Completions(",",",")") +} +object Completions +{ + /** Returns a lazy Completions instance using the provided Completion Set. 
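Editor's note: patch 098's `inputChanged` combinator wraps a function with a persistent flag telling it whether its input differs from the previous run, rewriting the cache only when the input actually changed. A hedged Python sketch of the same pattern (`pickle` standing in for sbinary, names hypothetical):

```python
import pickle

# Hypothetical sketch of the inputChanged pattern from patch 098:
# wrap a function so it receives a boolean saying whether its input
# differs from the value recorded in cache_file on the previous run.
# A missing or unreadable cache counts as changed, mirroring the
# Scala version's catch-all in CacheHelp.changed.
def input_changed(cache_file, f):
    def run(value):
        try:
            with open(cache_file, "rb") as fh:
                previous = pickle.load(fh)
            changed = previous != value
        except (OSError, EOFError, pickle.PickleError):
            changed = True
        result = f(changed, value)
        if changed:
            with open(cache_file, "wb") as fh:
                pickle.dump(value, fh)
        return result
    return run
```

As in the Scala version, the wrapped function always runs; only the `changed` flag and the cache write depend on the comparison.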
*/ + def apply(cs: => Set[Completion]): Completions = new Completions { + lazy val get = cs + } + + /** Returns a strict Completions instance using the provided Completion Set. */ + def strict(cs: Set[Completion]): Completions = new Completions { + def get = cs + } + + /** A Completions with no suggested completions, not even the empty Completion.*/ + val empty: Completions = strict(Set.empty) + + /** A Completions with only the marked empty Completion as a suggestion. */ + val mark: Completions = strict(Set.empty + Completion.mark) + + /** Returns a strict Completions instance with a single Completion with `s` for `append`.*/ + def single(s: String): Completions = strict(Set.empty + Completion.strict("", s)) +} + +/** +* Represents a completion. +* The abstract members `prepend` and `append` are best explained with an example. +* +* Assuming space-delimited tokens, processing this: +* am is are w +* could produce these Completions: +* Completion { prepend = "w"; append = "as" } +* Completion { prepend = "w"; append = "ere" } +* to suggest the tokens "was" and "were". +* +* In this way, two pieces of information are preserved: +* 1) what needs to be appended to the current input if a completion is selected +* 2) the full token being completed, which is useful for presenting a user with choices to select +*/ +sealed trait Completion +{ + /** The part of the token that was in the input.*/ + def prepend: String + + /** The proposed suffix to append to the existing input to complete the last token in the input.*/ + def append: String + + /** True if this completion has been identified with a token. + * A marked Completion will not be appended to another Completion unless that Completion is empty. 
+ * In this way, only a single token is completed at a time.*/ + def mark: Boolean + + final def isEmpty = prepend.isEmpty && append.isEmpty + + /** Appends the completions in `o` with the completions in this unless `o` is marked and this is nonempty.*/ + final def ++(o: Completion): Completion = if(o.mark && !isEmpty) this else Completion(prepend + o.prepend, append + o.append, mark) + + override final def toString = triple.toString + override final lazy val hashCode = triple.hashCode + override final def equals(o: Any) = o match { + case c: Completion => triple == c.triple + case _ => false + } + final def triple = (prepend, append, mark) +} +object Completion +{ + /** Constructs a lazy Completion with the given prepend, append, and mark values. */ + def apply(d: => String, a: => String, m: Boolean = false): Completion = new Completion { + lazy val prepend = d + lazy val append = a + def mark = m + } + + /** Constructs a strict Completion with the given prepend, append, and mark values. */ + def strict(d: String, a: String, m: Boolean = false): Completion = new Completion { + def prepend = d + def append = a + def mark = m + } + + /** An unmarked completion with the empty string for prepend and append. */ + val empty: Completion = strict("", "", false) + + /** A marked completion with the empty string for prepend and append. 
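Editor's note: the `Completion.++` rule above (a marked completion never extends a nonempty one) is the mechanism that keeps suggestions to one token at a time. A simplified Python sketch, append-only with `prepend` omitted (hypothetical names):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Completion:
    append: str         # suffix proposed for the current input
    mark: bool = False  # True once a token boundary was identified

    def concat(self, other):
        # Mirrors Completion.++ above: a marked completion does not
        # extend a completion that already has content, so only one
        # token is completed at a time.
        if other.mark and self.append:
            return self
        return Completion(self.append + other.append, self.mark)
```

So `Completion("w").concat(Completion("as"))` suggests `"was"`, while concatenating a marked completion onto a nonempty one leaves it untouched.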
*/ + val mark: Completion = Completion.strict("", "", true) +} \ No newline at end of file diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala new file mode 100644 index 000000000..7bd1b455f --- /dev/null +++ b/util/complete/Parser.scala @@ -0,0 +1,328 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2010 Mark Harrah + */ +package sbt.parse + + import Parser._ + +sealed trait Parser[+T] +{ + def derive(i: Char): Parser[T] + def resultEmpty: Option[T] + def result: Option[T] = None + def completions: Completions + def valid: Boolean = true + def isTokenStart = false +} +sealed trait RichParser[A] +{ + /** Produces a Parser that applies the original Parser and then applies `next` (in order).*/ + def ~[B](next: Parser[B]): Parser[(A,B)] + /** Produces a Parser that applies the original Parser one or more times.*/ + def + : Parser[Seq[A]] + /** Produces a Parser that applies the original Parser zero or more times.*/ + def * : Parser[Seq[A]] + /** Produces a Parser that applies the original Parser zero or one times.*/ + def ? : Parser[Option[A]] + /** Produces a Parser that applies either the original Parser or `next`.*/ + def ||[B >: A](b: Parser[B]): Parser[B] + /** Produces a Parser that applies either the original Parser or `next`.*/ + def |[B](b: Parser[B]): Parser[Either[A,B]] + /** Produces a Parser that applies the original Parser to the input and then applies `f` to the result.*/ + def map[B](f: A => B): Parser[B] + /** Returns the original parser. This is useful for converting literals to Parsers. 
+ * For example, `'c'.id` or `"asdf".id`*/ + def id: Parser[A] +} +object Parser +{ + def apply[T](p: Parser[T])(s: String): Parser[T] = + (p /: s)(derive1) + + def derive1[T](p: Parser[T], c: Char): Parser[T] = + p.derive(c) + + def completions(p: Parser[_], s: String): Completions = completions( apply(p)(s) ) + def completions(p: Parser[_]): Completions = Completions.mark x p.completions + + implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] + { + def ~[B](b: Parser[B]) = seqParser(a, b) + def |[B](b: Parser[B]) = choiceParser(a,b) + def ||[B >: A](b: Parser[B]) = homParser(a,b) + def ? = opt(a) + def * = zeroOrMore(a) + def + = oneOrMore(a) + def map[B](f: A => B) = mapParser(a, f) + def id = a + } + implicit def literalRichParser(c: Char): RichParser[Char] = richParser(c) + implicit def literalRichParser(s: String): RichParser[String] = richParser(s) + def examples[A](a: Parser[A], completions: Set[String]): Parser[A] = + if(a.valid) { + a.result match + { + case Some(av) => success( av ) + case None => new Examples(a, completions) + } + } + else Invalid + + def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] = + if(a.valid) { + a.result match + { + case Some(av) => success( f(av) ) + case None => new MapParser(a, f) + } + } + else Invalid + + def seqParser[A,B](a: Parser[A], b: Parser[B]): Parser[(A,B)] = + if(a.valid && b.valid) { + (a.result, b.result) match { + case (Some(av), Some(bv)) => success( (av, bv) ) + case (Some(av), None) => b map { bv => (av, bv) } + case (None, Some(bv)) => a map { av => (av, bv) } + case (None, None) => new SeqParser(a,b) + } + } + else Invalid + + def token[T](t: Parser[T]): Parser[T] = tokenStart(t, "") + def tokenStart[T](t: Parser[T], seen: String): Parser[T] = + if(t.valid && !t.isTokenStart) + { + t.result match + { + case None => new TokenStart(t, seen) + case Some(tv) => success(tv) + } + } + else + t + + def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = + if(a.valid) { + if(b.valid) 
{ + (a.result orElse b.result) match + { + case Some(v) => success( v ) + case None => new HomParser(a, b) + } + } + else a + } + else b + + def choiceParser[A,B](a: Parser[A], b: Parser[B]): Parser[Either[A,B]] = + if(a.valid) { + if(b.valid) { + a.result match + { + case Some(av) => success( Left(av) ) + case None => + b.result match + { + case Some(bv) => success( Right(bv) ) + case None => new HetParser(a, b) + } + } + } + else + a.map( Left(_) ) + } + else + b.map( Right(_) ) + + def opt[T](a: Parser[T]): Parser[Option[T]] = + if(a.valid) { + a.result match + { + case None => new Optional(a) + case x => success(x) + } + } + else success(None) + + def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) + def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) + + def repeat[T](p: Parser[T], min: Int = 0, max: UpperBound = Infinite): Parser[Seq[T]] = + repeat(None, p, min, max, Nil) + private[parse] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] = + { + assume(min >= 0, "Minimum must be greater than or equal to zero") + + def checkRepeated(invalidButOptional: => Parser[Seq[T]]): Parser[Seq[T]] = + if(repeated.valid) + repeated.result match + { + case Some(value) => success(value :: Nil) + case None => new Repeat(partial, repeated, min, max, revAcc) + } + else if(min == 0) + invalidButOptional + else + Invalid + + partial match + { + case Some(part) => + if(part.valid) + part.result match + { + case Some(value) => repeat(None, repeated, min, max, value :: revAcc) + case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse)) + } + else Invalid + case None => checkRepeated(success(Nil)) + } + } + + def success[T](value: T): Parser[T] = new Parser[T] { + override def result = Some(value) + def resultEmpty = result + def derive(c: Char) = Invalid + def completions = Completions.empty + } + + def charClass(f: Char => Boolean): Parser[Char] = new 
CharacterClass(f) + implicit def literal(ch: Char): Parser[Char] = new Parser[Char] { + def resultEmpty = None + def derive(c: Char) = if(c == ch) success(ch) else Invalid + def completions = Completions.single(ch.toString) + } + implicit def literal(s: String): Parser[String] = stringLiteral(s, s.toList) + def stringLiteral(s: String, remaining: List[Char]): Parser[String] = + if(remaining.isEmpty) success(s) else if(s.isEmpty) error("String literal cannot be empty") else new StringLiteral(s, remaining) +} +private final object Invalid extends Parser[Nothing] +{ + def resultEmpty = None + def derive(c: Char) = error("Invalid.") + override def valid = false + def completions = Completions.empty +} +private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A,B)] +{ + def cross(ao: Option[A], bo: Option[B]): Option[(A,B)] = for(av <- ao; bv <- bo) yield (av,bv) + lazy val resultEmpty = cross(a.resultEmpty, b.resultEmpty) + def derive(c: Char) = + { + val common = a.derive(c) ~ b + a.resultEmpty match + { + case Some(av) => common || b.derive(c).map(br => (av,br)) + case None => common + } + } + lazy val completions = a.completions x b.completions +} + +private final class HomParser[A](a: Parser[A], b: Parser[A]) extends Parser[A] +{ + def derive(c: Char) = (a derive c) || (b derive c) + lazy val resultEmpty = a.resultEmpty orElse b.resultEmpty + lazy val completions = a.completions ++ b.completions +} +private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[Either[A,B]] +{ + def derive(c: Char) = (a derive c) | (b derive c) + lazy val resultEmpty = a.resultEmpty.map(Left(_)) orElse b.resultEmpty.map(Right(_)) + lazy val completions = a.completions ++ b.completions +} +private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B] +{ + lazy val resultEmpty = a.resultEmpty map f + def derive(c: Char) = (a derive c) map f + def completions = a.completions + override def isTokenStart = a.isTokenStart +} +private 
final class TokenStart[T](delegate: Parser[T], seen: String) extends Parser[T] +{ + def derive(c: Char) = tokenStart( delegate derive c, seen + c ) + lazy val completions = + { + val dcs = delegate.completions + Completions( for(c <- dcs.get) yield Completion(seen, c.append, true) ) + } + def resultEmpty = delegate.resultEmpty + override def isTokenStart = true +} +private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Parser[T] +{ + def derive(c: Char) = examples(delegate.derive(c), fixed.collect { case x if x.length > 0 && x(0) == c => x.tail }) + def resultEmpty = delegate.resultEmpty + lazy val completions = Completions(fixed map { ex => Completion.strict("",ex,false) } ) +} +private final class StringLiteral(str: String, remaining: List[Char]) extends Parser[String] +{ + assert(str.length > 0 && !remaining.isEmpty) + def resultEmpty = None + def derive(c: Char) = if(remaining.head == c) stringLiteral(str, remaining.tail) else Invalid + lazy val completions = Completions.single(remaining.mkString) +} +private final class CharacterClass(f: Char => Boolean) extends Parser[Char] +{ + def resultEmpty = None + def derive(c: Char) = if( f(c) ) success(c) else Invalid + def completions = Completions.empty +} +private final class Optional[T](delegate: Parser[T]) extends Parser[Option[T]] +{ + def resultEmpty = Some(None) + def derive(c: Char) = (delegate derive c).map(Some(_)) + lazy val completions = Completion.empty +: delegate.completions +} +private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends Parser[Seq[T]] +{ + assume(0 <= min, "Minimum occurences must be non-negative") + assume(max >= min, "Minimum occurences must be less than the maximum occurences") + + def derive(c: Char) = + partial match + { + case Some(part) => + val partD = repeat(Some(part derive c), repeated, min, max, accumulatedReverse) + part.resultEmpty match + { + case Some(pv) => 
partD || repeatDerive(c, pv :: accumulatedReverse) + case None => partD + } + case None => repeatDerive(c, accumulatedReverse) + } + + def repeatDerive(c: Char, accRev: List[T]): Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, (min - 1) max 0, max.decrement, accRev) + + lazy val completions = + { + val repC = repeated.completions + val fin = if(min == 0) Completion.empty +: repC else repC + partial match + { + case Some(p) => p.completions x fin + case None => fin + } + } + lazy val resultEmpty: Option[Seq[T]] = + { + val partialAccumulatedOption = + partial match + { + case None => Some(accumulatedReverse) + case Some(partialPattern) => partialPattern.resultEmpty.map(_ :: accumulatedReverse) + } + for(partialAccumulated <- partialAccumulatedOption; repeatEmpty <- repeatedParseEmpty) yield + partialAccumulated reverse_::: repeatEmpty + } + private def repeatedParseEmpty: Option[List[T]] = + { + if(min == 0) + Some(Nil) + else + // forced determinism + for(value <- repeated.resultEmpty) yield + List.make(min, value) + } +} \ No newline at end of file diff --git a/util/complete/UpperBound.scala b/util/complete/UpperBound.scala new file mode 100644 index 000000000..1bdc1592c --- /dev/null +++ b/util/complete/UpperBound.scala @@ -0,0 +1,43 @@ +/* sbt -- Simple Build Tool + * Copyright 2008,2010 Mark Harrah + */ +package sbt.parse + +sealed trait UpperBound +{ + /** True if and only if the given value meets this bound.*/ + def >=(min: Int): Boolean + /** True if and only if this bound is one.*/ + def isOne: Boolean + /** True if and only if this bound is zero.*/ + def isZero: Boolean + /** If this bound is zero or Infinite, `decrement` returns this bound. + * Otherwise, this bound is finite and nonzero, and `decrement` returns the bound that is one less than this bound.*/ + def decrement: UpperBound + /** True if and only if this is unbounded.*/ + def isInfinite: Boolean +} +/** Represents unbounded. 
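Editor's note: `UpperBound` lets the repetition combinator count down its remaining obligation; `Infinite` and zero are fixed points of `decrement`. A small Python sketch of that bookkeeping (hypothetical, with `float("inf")` standing in for `Infinite`):

```python
INFINITE = float("inf")  # stands in for the Infinite case object

def decrement(bound):
    # Mirrors UpperBound.decrement: Infinite and zero return
    # themselves; a positive finite bound counts down by one.
    if bound == INFINITE or bound == 0:
        return bound
    return bound - 1

def after_one_occurrence(min_count, max_count):
    # After the repeated parser matches once, the minimum floors at
    # zero (like `(min - 1) max 0` in repeatDerive above) and the
    # maximum decrements.
    return max(min_count - 1, 0), decrement(max_count)
```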
*/ +case object Infinite extends UpperBound +{ + /** All finite numbers meet this bound. */ + def >=(min: Int) = true + def isOne = false + def isZero = false + def decrement = this + def isInfinite = true + override def toString = "Infinity" +} +/** Represents a finite upper bound. The maximum allowed value is 'value', inclusive. +* It must be positive. */ +final case class Finite(value: Int) extends UpperBound +{ + assume(value > 0, "Maximum occurrences must be positive.") + + def >=(min: Int) = value >= min + def isOne = value == 1 + def isZero = value == 0 + def decrement = Finite( (value - 1) max 0 ) + def isInfinite = false + override def toString = value.toString +} \ No newline at end of file diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala new file mode 100644 index 000000000..5331cd27b --- /dev/null +++ b/util/complete/src/test/scala/ParserTest.scala @@ -0,0 +1,26 @@ +package sbt.parse + + import Parser._ + +object ParserExample +{ + val ws = charClass(_.isWhitespace)+ + val notws = charClass(!_.isWhitespace)+ + + val name = token("test") + val options = (ws ~ token("quick" || "failed" || "new") )* + val include = (ws ~ token(examples(notws, Set("am", "is", "are", "was", "were") )) )* + + val t = name ~ options ~ include + + // Get completions for some different inputs + println(completions(t, "te")) + println(completions(t, "test ")) + println(completions(t, "test w")) + + // Get the parsed result for different inputs + println(apply(t)("te").resultEmpty) + println(apply(t)("test").resultEmpty) + println(apply(t)("test w").resultEmpty) + println(apply(t)("test was were").resultEmpty) +} From 99230f02a2e98e93e214cf73b2f22b28f290211f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 Dec 2010 22:16:12 -0500 Subject: [PATCH 100/823] fixes and additions to completion combinators filter,map,flatMap remove incorrect reductions --- util/complete/Completions.scala | 15 +- util/complete/Parser.scala | 194 
++++++++++++------ util/complete/UpperBound.scala | 10 +- util/complete/src/test/scala/ParserTest.scala | 20 +- 4 files changed, 170 insertions(+), 69 deletions(-) diff --git a/util/complete/Completions.scala b/util/complete/Completions.scala index db7b628e2..7a5f22444 100644 --- a/util/complete/Completions.scala +++ b/util/complete/Completions.scala @@ -14,7 +14,11 @@ sealed trait Completions final def x(o: Completions): Completions = Completions( for(cs <- get; os <- o.get) yield cs ++ os ) final def ++(o: Completions): Completions = Completions( get ++ o.get ) final def +:(o: Completion): Completions = Completions(get + o) + final def filter(f: Completion => Boolean): Completions = Completions(get filter f) + final def filterS(f: String => Boolean): Completions = filter(c => f(c.append)) override def toString = get.mkString("Completions(",",",")") + final def flatMap(f: Completion => Completions): Completions = Completions(get.flatMap(c => f(c).get)) + final def map(f: Completion => Completion): Completions = Completions(get map f) } object Completions { @@ -28,10 +32,13 @@ object Completions def get = cs } - /** A Completions with no suggested completions, not even the empty Completion.*/ - val empty: Completions = strict(Set.empty) + /** No suggested completions, not even the empty Completion.*/ + val nil: Completions = strict(Set.empty) - /** A Completions with only the marked empty Completion as a suggestion. */ + /** Only includes the unmarked empty Completion as a suggestion. */ + val empty: Completions = strict(Set.empty + Completion.empty) + + /** Includes only the marked empty Completion as a suggestion. 
*/ val mark: Completions = strict(Set.empty + Completion.mark) /** Returns a strict Completions instance with a single Completion with `s` for `append`.*/ @@ -71,6 +78,8 @@ sealed trait Completion /** Appends the completions in `o` with the completions in this unless `o` is marked and this is nonempty.*/ final def ++(o: Completion): Completion = if(o.mark && !isEmpty) this else Completion(prepend + o.prepend, append + o.append, mark) + final def x(o: Completions): Completions = o.map(this ++ _) + override final def toString = triple.toString override final lazy val hashCode = triple.hashCode override final def equals(o: Any) = o match { diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 7bd1b455f..459f7680a 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -25,14 +25,29 @@ sealed trait RichParser[A] /** Produces a Parser that applies the original Parser zero or one times.*/ def ? : Parser[Option[A]] /** Produces a Parser that applies either the original Parser or `next`.*/ - def ||[B >: A](b: Parser[B]): Parser[B] + def |[B >: A](b: Parser[B]): Parser[B] /** Produces a Parser that applies either the original Parser or `next`.*/ - def |[B](b: Parser[B]): Parser[Either[A,B]] + def ||[B](b: Parser[B]): Parser[Either[A,B]] /** Produces a Parser that applies the original Parser to the input and then applies `f` to the result.*/ def map[B](f: A => B): Parser[B] /** Returns the original parser. This is useful for converting literals to Parsers. 
* For example, `'c'.id` or `"asdf".id`*/ def id: Parser[A] + + def unary_- : Parser[Unit] + def & (o: Parser[_]): Parser[A] + def - (o: Parser[_]): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ + def examples(s: String*): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ + def examples(s: Set[String]): Parser[A] + /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ + def string(implicit ev: A <:< Seq[Char]): Parser[String] + /** Produces a Parser that filters the original parser. + * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails.*/ + def filter(f: A => Boolean): Parser[A] + + def flatMap[B](f: A => Parser[B]): Parser[B] } object Parser { @@ -40,7 +55,7 @@ object Parser (p /: s)(derive1) def derive1[T](p: Parser[T], c: Char): Parser[T] = - p.derive(c) + if(p.valid) p.derive(c) else p def completions(p: Parser[_], s: String): Completions = completions( apply(p)(s) ) def completions(p: Parser[_]): Completions = Completions.mark x p.completions @@ -48,26 +63,44 @@ object Parser implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] { def ~[B](b: Parser[B]) = seqParser(a, b) - def |[B](b: Parser[B]) = choiceParser(a,b) - def ||[B >: A](b: Parser[B]) = homParser(a,b) + def ||[B](b: Parser[B]) = choiceParser(a,b) + def |[B >: A](b: Parser[B]) = homParser(a,b) def ? 
= opt(a) def * = zeroOrMore(a) def + = oneOrMore(a) def map[B](f: A => B) = mapParser(a, f) def id = a + + def unary_- = not(a) + def & (o: Parser[_]) = and(a, o) + def - (o: Parser[_]) = sub(a, o) + def examples(s: String*): Parser[A] = examples(s.toSet) + def examples(s: Set[String]): Parser[A] = Parser.examples(a, s, check = true) + def filter(f: A => Boolean): Parser[A] = filterParser(a, f) + def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) + def flatMap[B](f: A => Parser[B]) = bindParser(a, f) } implicit def literalRichParser(c: Char): RichParser[Char] = richParser(c) implicit def literalRichParser(s: String): RichParser[String] = richParser(s) - def examples[A](a: Parser[A], completions: Set[String]): Parser[A] = + + def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = if(a.valid) { a.result match { case Some(av) => success( av ) - case None => new Examples(a, completions) + case None => + if(check) checkMatches(a, completions.toSeq) + new Examples(a, completions) } } else Invalid + def checkMatches(a: Parser[_], completions: Seq[String]) + { + val bad = completions.filter( apply(a)(_).resultEmpty.isEmpty) + if(!bad.isEmpty) error("Invalid example completions: " + bad.mkString("'", "', '", "'")) + } + def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] = if(a.valid) { a.result match @@ -78,6 +111,26 @@ object Parser } else Invalid + def bindParser[A,B](a: Parser[A], f: A => Parser[B]): Parser[B] = + if(a.valid) { + a.result match + { + case Some(av) => f(av) + case None => new BindParser(a, f) + } + } + else Invalid + + def filterParser[T](a: Parser[T], f: T => Boolean): Parser[T] = + if(a.valid) { + a.result match + { + case Some(av) => if( f(av) ) success( av ) else Invalid + case None => new Filter(a, f) + } + } + else Invalid + def seqParser[A,B](a: Parser[A], b: Parser[B]): Parser[(A,B)] = if(a.valid && b.valid) { (a.result, b.result) match { @@ -92,58 +145,24 @@ object Parser def 
token[T](t: Parser[T]): Parser[T] = tokenStart(t, "") def tokenStart[T](t: Parser[T], seen: String): Parser[T] = if(t.valid && !t.isTokenStart) - { - t.result match - { - case None => new TokenStart(t, seen) - case Some(tv) => success(tv) - } - } + if(t.result.isEmpty) new TokenStart(t, seen) else t else t def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = - if(a.valid) { - if(b.valid) { - (a.result orElse b.result) match - { - case Some(v) => success( v ) - case None => new HomParser(a, b) - } - } - else a - } - else b + if(a.valid) + if(b.valid) new HomParser(a, b) else a + else + b def choiceParser[A,B](a: Parser[A], b: Parser[B]): Parser[Either[A,B]] = - if(a.valid) { - if(b.valid) { - a.result match - { - case Some(av) => success( Left(av) ) - case None => - b.result match - { - case Some(bv) => success( Right(bv) ) - case None => new HetParser(a, b) - } - } - } - else - a.map( Left(_) ) - } + if(a.valid) + if(b.valid) new HetParser(a,b) else a.map( Left(_) ) else b.map( Right(_) ) - + def opt[T](a: Parser[T]): Parser[Option[T]] = - if(a.valid) { - a.result match - { - case None => new Optional(a) - case x => success(x) - } - } - else success(None) + if(a.valid) new Optional(a) else success(None) def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) @@ -152,14 +171,15 @@ object Parser repeat(None, p, min, max, Nil) private[parse] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] = { - assume(min >= 0, "Minimum must be greater than or equal to zero") - + assume(min >= 0, "Minimum must be greater than or equal to zero (was " + min + ")") + assume(max >= min, "Minimum must be less than or equal to maximum (min: " + min + ", max: " + max + ")") + def checkRepeated(invalidButOptional: => Parser[Seq[T]]): Parser[Seq[T]] = if(repeated.valid) repeated.result match { - case Some(value) => 
success(value :: Nil) - case None => new Repeat(partial, repeated, min, max, revAcc) + case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here + case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) } else if(min == 0) invalidButOptional @@ -187,6 +207,22 @@ object Parser def completions = Completions.empty } + val any: Parser[Char] = charClass(_ => true) + + def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) + + def and[T](a: Parser[T], b: Parser[_]): Parser[T] = + if(a.valid && b.valid) new And(a, b) else Invalid + + def not(p: Parser[_]): Parser[Unit] = new Not(p) + + implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] = + new CharacterClass(r contains _).examples(r.map(_.toString) : _*) + def chars(legal: String): Parser[Char] = + { + val set = legal.toSet + new CharacterClass(set) examples(set.map(_.toString)) + } def charClass(f: Char => Boolean): Parser[Char] = new CharacterClass(f) implicit def literal(ch: Char): Parser[Char] = new Parser[Char] { def resultEmpty = None @@ -195,14 +231,14 @@ object Parser } implicit def literal(s: String): Parser[String] = stringLiteral(s, s.toList) def stringLiteral(s: String, remaining: List[Char]): Parser[String] = - if(remaining.isEmpty) success(s) else if(s.isEmpty) error("String literal cannot be empty") else new StringLiteral(s, remaining) + if(s.isEmpty) error("String literal cannot be empty") else if(remaining.isEmpty) success(s) else new StringLiteral(s, remaining) } private final object Invalid extends Parser[Nothing] { def resultEmpty = None def derive(c: Char) = error("Invalid.") override def valid = false - def completions = Completions.empty + def completions = Completions.nil } private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A,B)] { @@ -213,7 +249,7 @@ private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A val common = 
 			a.derive(c) ~ b
 		a.resultEmpty match {
-			case Some(av) => common || b.derive(c).map(br => (av,br))
+			case Some(av) => common | b.derive(c).map(br => (av,br))
 			case None => common
 		}
 	}
@@ -222,16 +258,28 @@ private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A
 
 private final class HomParser[A](a: Parser[A], b: Parser[A]) extends Parser[A]
 {
-	def derive(c: Char) = (a derive c) || (b derive c)
+	def derive(c: Char) = (a derive c) | (b derive c)
 	lazy val resultEmpty = a.resultEmpty orElse b.resultEmpty
 	lazy val completions = a.completions ++ b.completions
 }
 private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[Either[A,B]]
 {
-	def derive(c: Char) = (a derive c) | (b derive c)
+	def derive(c: Char) = (a derive c) || (b derive c)
 	lazy val resultEmpty = a.resultEmpty.map(Left(_)) orElse b.resultEmpty.map(Right(_))
 	lazy val completions = a.completions ++ b.completions
 }
+private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends Parser[B]
+{
+	lazy val resultEmpty = a.resultEmpty match { case None => None; case Some(av) => f(av).resultEmpty }
+	lazy val completions =
+		a.completions flatMap { c =>
+			apply(a)(c.append).resultEmpty match {
+				case None => Completions.empty
+				case Some(av) => c x f(av).completions
+			}
+		}
+	def derive(c: Char) = a derive c flatMap f
+}
 private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B]
 {
 	lazy val resultEmpty = a.resultEmpty map f
@@ -239,6 +287,12 @@ private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B]
 	def completions = a.completions
 	override def isTokenStart = a.isTokenStart
 }
+private final class Filter[T](p: Parser[T], f: T => Boolean) extends Parser[T]
+{
+	lazy val resultEmpty = p.resultEmpty filter f
+	def derive(c: Char) = (p derive c) filter f
+	lazy val completions = p.completions filterS { s => apply(p)(s).resultEmpty.filter(f).isDefined }
+}
 private final class TokenStart[T](delegate: Parser[T], seen: String) extends Parser[T]
 {
 	def derive(c: Char) = tokenStart( delegate derive c, seen + c )
@@ -250,9 +304,22 @@ private final class TokenStart[T](delegate: Parser[T], seen: String) extends Par
 	def resultEmpty = delegate.resultEmpty
 	override def isTokenStart = true
 }
+private final class And[T](a: Parser[T], b: Parser[_]) extends Parser[T]
+{
+	def derive(c: Char) = (a derive c) & (b derive c)
+	lazy val completions = a.completions.filterS(s => apply(b)(s).resultEmpty.isDefined )
+	lazy val resultEmpty = if(b.resultEmpty.isDefined) a.resultEmpty else None
+}
+
+private final class Not(delegate: Parser[_]) extends Parser[Unit]
+{
+	def derive(c: Char) = if(delegate.valid) not(delegate derive c) else this
+	def completions = Completions.empty
+	lazy val resultEmpty = if(delegate.resultEmpty.isDefined) None else Some(())
+}
 private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Parser[T]
 {
-	def derive(c: Char) = examples(delegate.derive(c), fixed.collect { case x if x.length > 0 && x(0) == c => x.tail })
+	def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x.tail })
 	def resultEmpty = delegate.resultEmpty
 	lazy val completions = Completions(fixed map { ex => Completion.strict("",ex,false) } )
 }
@@ -287,7 +354,7 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m
 				val partD = repeat(Some(part derive c), repeated, min, max, accumulatedReverse)
 				part.resultEmpty match {
-					case Some(pv) => partD || repeatDerive(c, pv :: accumulatedReverse)
+					case Some(pv) => partD | repeatDerive(c, pv :: accumulatedReverse)
 					case None => partD
 				}
 			case None => repeatDerive(c, accumulatedReverse)
@@ -297,8 +364,11 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m
 	lazy val completions =
 	{
+		def pow(comp: Completions, exp: Completions, n: Int): Completions =
+			if(n == 1) comp else pow(comp x exp, exp, n - 1)
+
 		val repC = repeated.completions
-		val fin = if(min == 0) Completion.empty +: repC else repC
+		val fin = if(min == 0) Completion.empty +: repC else pow(repC, repC, min)
 		partial match
 		{
 			case Some(p) => p.completions x fin
diff --git a/util/complete/UpperBound.scala b/util/complete/UpperBound.scala
index 1bdc1592c..9427070f7 100644
--- a/util/complete/UpperBound.scala
+++ b/util/complete/UpperBound.scala
@@ -12,7 +12,7 @@ sealed trait UpperBound
 	/** True if and only if this bound is zero.*/
 	def isZero: Boolean
 	/** If this bound is zero or Infinite, `decrement` returns this bound.
-	* Otherwise, this bound is finite and nonzero, and `decrement` returns the bound that is one less than this bound.*/
+	* Otherwise, this bound is finite and greater than zero and `decrement` returns the bound that is one less than this bound.*/
 	def decrement: UpperBound
 	/** True if and only if this is unbounded.*/
 	def isInfinite: Boolean
@@ -32,12 +32,16 @@ case object Infinite extends UpperBound
 * It must positive. */
 final case class Finite(value: Int) extends UpperBound
 {
-	assume(value > 0, "Maximum occurences must be positive.")
-
+	assume(value >= 0, "Maximum occurences must be nonnegative.")
+	def >=(min: Int) = value >= min
 	def isOne = value == 1
 	def isZero = value == 0
 	def decrement = Finite( (value - 1) max 0 )
 	def isInfinite = false
 	override def toString = value.toString
+}
+object UpperBound
+{
+	implicit def intToFinite(i: Int): Finite = Finite(i)
 }
\ No newline at end of file
diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala
index 5331cd27b..0fda3c6e3 100644
--- a/util/complete/src/test/scala/ParserTest.scala
+++ b/util/complete/src/test/scala/ParserTest.scala
@@ -23,4 +23,22 @@ object ParserExample
 	println(apply(t)("test").resultEmpty)
 	println(apply(t)("test w").resultEmpty)
 	println(apply(t)("test was were").resultEmpty)
-}
+
+	def run(n: Int)
+	{
+		val a = 'a'.id
+		val aq = a.?
+		val aqn = repeat(aq, min = n, max = n)
+		val an = repeat(a, min = n, max = n)
+		val ann = aqn ~ an
+
+		def r = apply(ann)("a"*(n*2)).resultEmpty
+		println(r.isDefined)
+	}
+	def run2(n: Int)
+	{
+		val ab = "ab".?.*
+		val r = apply(ab)("a"*n).resultEmpty
+		println(r)
+	}
+}
\ No newline at end of file

From c436a1d3eb047237b20c13b1f12fa1e0f15076ef Mon Sep 17 00:00:00 2001
From: Mark Harrah
Date: Sun, 12 Dec 2010 21:33:32 -0500
Subject: [PATCH 101/823] Settings

---
 util/collection/Attributes.scala |   4 +
 util/collection/Dag.scala        |  14 ++-
 util/collection/Settings.scala   | 147 ++++++++++++++++++++++++++-----
 3 files changed, 142 insertions(+), 23 deletions(-)

diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala
index 4037884dd..0d9e9ade1 100644
--- a/util/collection/Attributes.scala
+++ b/util/collection/Attributes.scala
@@ -10,6 +10,7 @@ import Types._
 // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts
 sealed trait AttributeKey[T] {
 	def label: String
+	override final def toString = label
 }
 object AttributeKey {
@@ -30,6 +31,9 @@ trait AttributeMap
 object AttributeMap
 {
 	val empty: AttributeMap = new BasicAttributeMap(Map.empty)
+	implicit def toNatTrans(map: AttributeMap): AttributeKey ~> Id = new (AttributeKey ~> Id) {
+		def apply[T](key: AttributeKey[T]): T = map(key)
+	}
 }
 private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) extends AttributeMap
 {
diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala
index 3ecc1f95b..55064eaef 100644
--- a/util/collection/Dag.scala
+++ b/util/collection/Dag.scala
@@ -14,21 +14,29 @@ object Dag
 	import scala.collection.{mutable, JavaConversions};
 	import JavaConversions.{asIterable, asSet}
 
-	def topologicalSort[T](root: T)(dependencies: T => Iterable[T]) = {
+	// TODO: replace implementation with call to new version
+	def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies)
+
+	def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] =
+	{
 		val discovered = new mutable.HashSet[T]
 		val finished = asSet(new java.util.LinkedHashSet[T])
+		def visitAll(nodes: Iterable[T]) = nodes foreach visit
 		def visit(dag : T){
 			if (!discovered(dag)) {
 				discovered(dag) = true;
-				dependencies(dag).foreach(visit);
+				visitAll(dependencies(dag));
 				finished += dag;
 			}
+			else if(!finished(dag))
+				throw new Cyclic(dag)
 		}
 
-		visit(root);
+		visitAll(nodes);
 		finished.toList;
 	}
+	final class Cyclic(val value: Any) extends Exception("Cyclic reference involving " + value)
 }
diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala
index ab85fd87d..a059e84da 100644
--- a/util/collection/Settings.scala
+++ b/util/collection/Settings.scala
@@ -1,29 +1,136 @@
 package sbt
 
-sealed trait Settings
+	import annotation.tailrec
+	import Settings._
+
+sealed trait Settings[Scope]
 {
-	def get[T](key: AttributeKey[T], path: List[String]): Option[T]
-	def set[T](key: AttributeKey[T], path: List[String], value: T): Settings
+	def data: Scope => AttributeMap
+	def definitions: Scope => Definitions
+	def linear: Scope => Seq[Scope]
+	def get[T](scope: Scope, key: AttributeKey[T]): Option[T]
+	def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope]
+}
+private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val definitions: Map[Scope, Definitions], val linear: Scope => Seq[Scope]) extends Settings[Scope]
+{
+	def get[T](scope: Scope, key: AttributeKey[T]): Option[T] =
+		linear(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption
+
+	private def scopeLocal[T](scope: Scope, key: AttributeKey[T]): Option[T] =
+		(data get scope).flatMap(_ get key)
+
+	def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] =
+	{
+		val map = (data get scope) getOrElse AttributeMap.empty
+		val newData = data.updated(scope, map.put(key, value))
+		new Settings0(newData, definitions, linear)
+	}
 }
 object Settings
 {
-	def empty: Settings = new Basic(Map.empty)
-	def x = 3
+	type Data[Scope] = Map[Scope, AttributeMap]
+	type Init = Seq[Setting[_]]
+	type Keys = Set[AttributeKey[_]]
 
-	private[this] class Basic(val roots: Map[ List[String], AttributeMap ]) extends Settings
+	def make[Scope](inits: Iterable[(Scope,Init)], lzA: Scope => Seq[Scope]): Settings[Scope] =
 	{
-		def get[T](key: AttributeKey[T], path: List[String]): Option[T] =
-		{
-			def notFound = path match {
-				case Nil => None
-				case x :: xs => get(key, xs)
-			}
-			(roots get path) flatMap ( _ get key ) orElse notFound
-		}
-		def set[T](key: AttributeKey[T], path: List[String], value: T): Settings =
-		{
-			val amap = (roots get path) getOrElse AttributeMap.empty
-			new Basic( roots updated(path, amap put(key, value)) )
-		}
+		val definitions = inits map { case (scope, init) => (scope, compile(init)) } toMap;
+		val resolved = for( (scope, definition) <- definitions) yield (scope, resolveScopes(definition, lzA(scope), definitions) )
+		val scopeDeps = resolved map { case (scope, requiredMap) => (scope, requiredMap.values) } toMap;
+		val ordered = Dag.topologicalSort(scopeDeps.keys)(scopeDeps)
+		val data = (Map.empty[Scope, AttributeMap] /: ordered) { (mp, scope) => add(mp, scope, definitions, resolved) }
+		new Settings0(data, definitions, lzA)
 	}
-}
\ No newline at end of file
+
+	private[this] def add[Scope](data: Data[Scope], scope: Scope, definitions: Map[Scope, Definitions], resolved: Map[Scope, Map[AttributeKey[_], Scope]]): Map[Scope, AttributeMap] =
+		data.updated(scope, mkScopeMap(data, definitions(scope), resolved(scope)) )
+
+	private[this] def mkScopeMap[Scope](data: Data[Scope], definitions: Definitions, definedIn: Map[AttributeKey[_], Scope]): AttributeMap =
+	{
+		val start = (AttributeMap.empty /: definitions.requires) ( (mp, key) => prepop(data, definedIn, mp, key))
+		definitions eval start
+	}
+
+	private[this] def prepop[T, Scope](data: Data[Scope], definedIn: Map[AttributeKey[_], Scope], mp: AttributeMap, key: AttributeKey[T]): AttributeMap =
+		mp.put(key, data(definedIn(key))(key))
+
+	private[this] def resolveScopes[Scope](definition: Definitions, search: Seq[Scope], definitions: Map[Scope, Definitions]): Map[AttributeKey[_], Scope] =
+		definition.requires.view.map(req => (req, resolveScope(req, search, definitions )) ).toMap
+
+	private[this] def resolveScope[Scope](key: AttributeKey[_], search: Seq[Scope], definitions: Map[Scope, Definitions]): Scope =
+		search find defines(key, definitions) getOrElse { throw new Uninitialized(key) }
+
+	private[this] def defines[Scope](key: AttributeKey[_], definitions: Map[Scope, Definitions])(scope: Scope): Boolean =
+		(definitions get scope).filter(_.provides contains key).isDefined
+
+	final class Definitions(val provides: Keys, val requires: Keys, val eval: AttributeMap => AttributeMap)
+
+	def value[T](key: AttributeKey[T])(value: => T): Setting[T] = new Value(key, value _)
+	def update[T](key: AttributeKey[T])(f: T => T): Setting[T] = new Update(key, f)
+	def app[HL <: HList, T](key: AttributeKey[T], inputs: KList[AttributeKey, HL])(f: HL => T): Setting[T] = new Apply(key, f, inputs)
+
+	def compile(settings: Seq[Setting[_]]): Definitions =
+	{
+		val grpd = grouped(settings)
+		val sorted = sort(grpd)
+		val eval = (map: AttributeMap) => (map /: sorted)( (m, c) => c eval m )
+		val provided = grpd.keySet.toSet
+		val requires = sorted.flatMap(_.dependencies).toSet -- provided ++ sorted.collect { case c if !c.selfContained => c.key }
+		new Definitions(provided, requires, eval)
+	}
+	private[this] def grouped(settings: Seq[Setting[_]]): Map[AttributeKey[_], Compiled] =
+		settings.groupBy(_.key) map { case (key: AttributeKey[t], actions) =>
+			(key: AttributeKey[_], compileSetting(key, actions.asInstanceOf[Seq[Setting[t]]]) )
+		} toMap;
+
+	private[this] def compileSetting[T](key: AttributeKey[T], actions: Seq[Setting[T]]): Compiled =
+	{
+		val (alive, selfContained) = live(key, actions)
+		val f = (map: AttributeMap) => (map /: alive)(eval)
+		new Compiled(key, f, dependencies(actions), selfContained)
+	}
+	private[this] final class Compiled(val key: AttributeKey[_], val eval: AttributeMap => AttributeMap, val dependencies: Iterable[AttributeKey[_]], val selfContained: Boolean) {
+		override def toString = key.label
+	}
+
+	private[this] def sort(actionMap: Map[AttributeKey[_], Compiled]): Seq[Compiled] =
+		Dag.topologicalSort(actionMap.values)( _.dependencies.flatMap(actionMap.get) )
+
+	private[this] def live[T](key: AttributeKey[T], actions: Seq[Setting[T]]): (Seq[Setting[T]], Boolean) =
+	{
+		val lastOverwrite = actions.lastIndexWhere(_ overwrite key)
+		val selfContained = lastOverwrite >= 0
+		val alive = if(selfContained) actions.drop(lastOverwrite) else actions
+		(alive, selfContained)
+	}
+	private[this] def dependencies(actions: Seq[Setting[_]]): Seq[AttributeKey[_]] = actions.flatMap(_.dependsOn)
+	private[this] def eval[T](map: AttributeMap, a: Setting[T]): AttributeMap =
+		a match
+		{
+			case s: Value[T] => map.put(s.key, s.value())
+			case u: Update[T] => map.put(u.key, u.f(map(u.key)))
+			case a: Apply[hl, T] => map.put(a.key, a.f(a.inputs.down(map)))
+		}
+
+	sealed trait Setting[T]
+	{
+		def key: AttributeKey[T]
+		def overwrite(key: AttributeKey[T]): Boolean
+		def dependsOn: Seq[AttributeKey[_]] = Nil
+	}
+	private[this] final class Value[T](val key: AttributeKey[T], val value: () => T) extends Setting[T]
+	{
+		def overwrite(key: AttributeKey[T]) = true
+	}
+	private[this] final class Update[T](val key: AttributeKey[T], val f: T => T) extends Setting[T]
+	{
+		def overwrite(key: AttributeKey[T]) = false
+	}
+	private[this] final class Apply[HL <: HList, T](val key: AttributeKey[T], val f: HL => T, val inputs: KList[AttributeKey, HL]) extends Setting[T]
+	{
+		def overwrite(key: AttributeKey[T]) = inputs.toList.forall(_ ne key)
+		override def dependsOn = inputs.toList - key
+	}
+
+	final class Uninitialized(key: AttributeKey[_]) extends Exception("Update on uninitialized setting " + key.label)
+}

From ddb4381454b80be70529c20aa29279382b333d5c Mon Sep 17 00:00:00 2001
From: Mark Harrah
Date: Mon, 13 Dec 2010 22:44:25 -0500
Subject: [PATCH 102/823] fixes and improvements to tab completions combinators

---
 util/complete/Completions.scala               | 120 +++++++++++-------
 util/complete/Parser.scala                    |  66 +++++++---
 util/complete/src/test/scala/ParserTest.scala |  37 +++++-
 3 files changed, 158 insertions(+), 65 deletions(-)

diff --git a/util/complete/Completions.scala b/util/complete/Completions.scala
index 7a5f22444..a27e5dc6f 100644
--- a/util/complete/Completions.scala
+++ b/util/complete/Completions.scala
@@ -19,6 +19,8 @@ sealed trait Completions
 	override def toString = get.mkString("Completions(",",",")")
 	final def flatMap(f: Completion => Completions): Completions = Completions(get.flatMap(c => f(c).get))
 	final def map(f: Completion => Completion): Completions = Completions(get map f)
+	override final def hashCode = get.hashCode
+	override final def equals(o: Any) = o match { case c: Completions => get == c.get; case _ => false }
 }
 object Completions
 {
@@ -28,32 +30,27 @@ object Completions
 	}
 
 	/** Returns a strict Completions instance using the provided Completion Set. */
-	def strict(cs: Set[Completion]): Completions = new Completions {
-		def get = cs
-	}
+	def strict(cs: Set[Completion]): Completions = apply(cs)
 
 	/** No suggested completions, not even the empty Completion.*/
 	val nil: Completions = strict(Set.empty)
 
-	/** Only includes the unmarked empty Completion as a suggestion. */
+	/** Only includes an empty Suggestion */
 	val empty: Completions = strict(Set.empty + Completion.empty)
 
-	/** Includes only the marked empty Completion as a suggestion. */
-	val mark: Completions = strict(Set.empty + Completion.mark)
-
-	/** Returns a strict Completions instance with a single Completion with `s` for `append`.*/
-	def single(s: String): Completions = strict(Set.empty + Completion.strict("", s))
+	/** Returns a strict Completions instance containing only the provided Completion.*/
+	def single(c: Completion): Completions = strict(Set.empty + c)
 }
 
 /**
 * Represents a completion.
-* The abstract members `prepend` and `append` are best explained with an example.
+* The abstract members `display` and `append` are best explained with an example.
 *
 * Assuming space-delimited tokens, processing this:
 *   am is are w
 * could produce these Completions:
-*   Completion { prepend = "w"; append = "as" }
-*   Completion { prepend = "w"; append = "ere" }
+*   Completion { display = "was"; append = "as" }
+*   Completion { display = "were"; append = "ere" }
 * to suggest the tokens "was" and "were".
 *
 * In this way, two pieces of information are preserved:
@@ -62,51 +59,76 @@ object Completions
 */
 sealed trait Completion
 {
-	/** The part of the token that was in the input.*/
-	def prepend: String
-
 	/** The proposed suffix to append to the existing input to complete the last token in the input.*/
 	def append: String
+	/** The string to present to the user to represent the full token being suggested.*/
+	def display: String
+	/** True if this Completion is suggesting the empty string.*/
+	def isEmpty: Boolean
 
-	/** True if this completion has been identified with a token.
-	* A marked Completion will not be appended to another Completion unless that Completion is empty.
-	* In this way, only a single token is completed at a time.*/
-	def mark: Boolean
-
-	final def isEmpty = prepend.isEmpty && append.isEmpty
-
-	/** Appends the completions in `o` with the completions in this unless `o` is marked and this is nonempty.*/
-	final def ++(o: Completion): Completion = if(o.mark && !isEmpty) this else Completion(prepend + o.prepend, append + o.append, mark)
-
+	/** Appends the completions in `o` with the completions in this Completion.*/
+	def ++(o: Completion): Completion = Completion.concat(this, o)
 	final def x(o: Completions): Completions = o.map(this ++ _)
-
-	override final def toString = triple.toString
-	override final lazy val hashCode = triple.hashCode
-	override final def equals(o: Any) = o match {
-		case c: Completion => triple == c.triple
-		case _ => false
-	}
-	final def triple = (prepend, append, mark)
+	override final lazy val hashCode = Completion.hashCode(this)
+	override final def equals(o: Any) = o match { case c: Completion => Completion.equal(this, c); case _ => false }
+}
+final class DisplayOnly(display0: String) extends Completion
+{
+	lazy val display = display0
+	def isEmpty = display.isEmpty
+	def append = ""
+	override def toString = "{" + display + "}"
+}
+final class Token(prepend0: String, append0: String) extends Completion
+{
+	lazy val prepend = prepend0
+	lazy val append = append0
+	def isEmpty = prepend.isEmpty && append.isEmpty
+	def display = prepend + append
+	override final def toString = "[" + prepend + "," + append +"]"
+}
+final class Suggestion(append0: String) extends Completion
+{
+	lazy val append = append0
+	def isEmpty = append.isEmpty
+	def display = append
+	override def toString = append
 }
 object Completion
 {
-	/** Constructs a lazy Completion with the given prepend, append, and mark values. */
-	def apply(d: => String, a: => String, m: Boolean = false): Completion = new Completion {
-		lazy val prepend = d
-		lazy val append = a
-		def mark = m
-	}
+	def concat(a: Completion, b: Completion): Completion =
+		(a,b) match
+		{
+			case (as: Suggestion, bs: Suggestion) => suggestion(as.append + bs.append)
+			case (at: Token, _) if at.append.isEmpty => b
+			case _ if a.isEmpty => b
+			case _ => a
+		}
 
-	/** Constructs a strict Completion with the given prepend, append, and mark values. */
-	def strict(d: String, a: String, m: Boolean = false): Completion = new Completion {
-		def prepend = d
-		def append = a
-		def mark = m
-	}
+	def equal(a: Completion, b: Completion): Boolean =
+		(a,b) match
+		{
+			case (as: Suggestion, bs: Suggestion) => as.append == bs.append
+			case (ad: DisplayOnly, bd: DisplayOnly) => ad.display == bd.display
+			case (at: Token, bt: Token) => at.prepend == bt.prepend && at.append == bt.append
+			case _ => false
+		}
 
-	/** An unmarked completion with the empty string for prepend and append. */
-	val empty: Completion = strict("", "", false)
+	def hashCode(a: Completion): Int =
+		a match
+		{
+			case as: Suggestion => (0, as.append).hashCode
+			case ad: DisplayOnly => (1, ad.display).hashCode
+			case at: Token => (2, at.prepend, at.append).hashCode
+		}
 
-	/** A marked completion with the empty string for prepend and append. */
-	val mark: Completion = Completion.strict("", "", true)
+	val empty: Completion = suggestStrict("")
+	def single(c: Char): Completion = suggestStrict(c.toString)
+
+	def displayOnly(value: => String): Completion = new DisplayOnly(value)
+	def displayStrict(value: String): Completion = displayOnly(value)
+	def token(prepend: => String, append: => String): Completion = new Token(prepend, append)
+	def tokenStrict(prepend: String, append: String): Completion = token(prepend, append)
+	def suggestion(value: => String): Completion = new Suggestion(value)
+	def suggestStrict(value: String): Completion = suggestion(value)
 }
\ No newline at end of file
diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala
index 459f7680a..f2fb6d919 100644
--- a/util/complete/Parser.scala
+++ b/util/complete/Parser.scala
@@ -34,6 +34,11 @@ sealed trait RichParser[A]
 	* For example, `'c'.id` or `"asdf".id`*/
 	def id: Parser[A]
 
+	def ^^^[B](value: B): Parser[B]
+	def ??[B >: A](alt: B): Parser[B]
+	def <~[B](b: Parser[B]): Parser[A]
+	def ~>[B](b: Parser[B]): Parser[B]
+
 	def unary_- : Parser[Unit]
 	def & (o: Parser[_]): Parser[A]
 	def - (o: Parser[_]): Parser[A]
@@ -58,7 +63,7 @@ object Parser
 		if(p.valid) p.derive(c) else p
 
 	def completions(p: Parser[_], s: String): Completions = completions( apply(p)(s) )
-	def completions(p: Parser[_]): Completions = Completions.mark x p.completions
+	def completions(p: Parser[_]): Completions = p.completions
 
 	implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A]
 	{
@@ -71,6 +76,11 @@ object Parser
 		def map[B](f: A => B) = mapParser(a, f)
 		def id = a
 
+		def ^^^[B](value: B): Parser[B] = a map { _ => value }
+		def ??[B >: A](alt: B): Parser[B] = a.? map { _ getOrElse alt }
+		def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av }
+		def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv }
+
 		def unary_- = not(a)
 		def & (o: Parser[_]) = and(a, o)
 		def - (o: Parser[_]) = sub(a, o)
@@ -142,10 +152,11 @@ object Parser
 		}
 		else Invalid
 
-	def token[T](t: Parser[T]): Parser[T] = tokenStart(t, "")
-	def tokenStart[T](t: Parser[T], seen: String): Parser[T] =
+	def token[T](t: Parser[T]): Parser[T] = token(t, "", true)
+	def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false)
+	def token[T](t: Parser[T], seen: String, track: Boolean): Parser[T] =
 		if(t.valid && !t.isTokenStart)
-			if(t.result.isEmpty) new TokenStart(t, seen) else t
+			if(t.result.isEmpty) new TokenStart(t, seen, track) else t
 		else t
@@ -205,6 +216,7 @@ object Parser
 		def resultEmpty = result
 		def derive(c: Char) = Invalid
 		def completions = Completions.empty
+		override def toString = "success(" + value + ")"
 	}
 
 	val any: Parser[Char] = charClass(_ => true)
@@ -224,14 +236,20 @@ object Parser
 		new CharacterClass(set) examples(set.map(_.toString))
 	}
 	def charClass(f: Char => Boolean): Parser[Char] = new CharacterClass(f)
+
 	implicit def literal(ch: Char): Parser[Char] = new Parser[Char] {
 		def resultEmpty = None
 		def derive(c: Char) = if(c == ch) success(ch) else Invalid
-		def completions = Completions.single(ch.toString)
+		def completions = Completions.single(Completion.suggestStrict(ch.toString))
+		override def toString = "'" + ch + "'"
 	}
 	implicit def literal(s: String): Parser[String] = stringLiteral(s, s.toList)
 	def stringLiteral(s: String, remaining: List[Char]): Parser[String] =
 		if(s.isEmpty) error("String literal cannot be empty") else if(remaining.isEmpty) success(s) else new StringLiteral(s, remaining)
+
+	object ~ {
+		def unapply[A,B](t: (A,B)): Some[(A,B)] = Some(t)
+	}
 }
 private final object Invalid extends Parser[Nothing]
 {
@@ -239,6 +257,7 @@ private final object Invalid extends Parser[Nothing]
 	def derive(c: Char) = error("Invalid.")
 	override def valid = false
 	def completions = Completions.nil
+	override def toString = "inv"
 }
 private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A,B)]
 {
@@ -254,6 +273,7 @@ private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A
 		}
 	}
 	lazy val completions = a.completions x b.completions
+	override def toString = "(" + a + " ~ " + b + ")"
 }
 
 private final class HomParser[A](a: Parser[A], b: Parser[A]) extends Parser[A]
@@ -261,24 +281,28 @@ private final class HomParser[A](a: Parser[A], b: Parser[A]) extends Parser[A]
 	def derive(c: Char) = (a derive c) | (b derive c)
 	lazy val resultEmpty = a.resultEmpty orElse b.resultEmpty
 	lazy val completions = a.completions ++ b.completions
+	override def toString = "(" + a + " | " + b + ")"
 }
 private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[Either[A,B]]
 {
 	def derive(c: Char) = (a derive c) || (b derive c)
 	lazy val resultEmpty = a.resultEmpty.map(Left(_)) orElse b.resultEmpty.map(Right(_))
 	lazy val completions = a.completions ++ b.completions
+	override def toString = "(" + a + " || " + b + ")"
 }
 private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends Parser[B]
 {
 	lazy val resultEmpty = a.resultEmpty match { case None => None; case Some(av) => f(av).resultEmpty }
-	lazy val completions =
+	lazy val completions = {
 		a.completions flatMap { c =>
 			apply(a)(c.append).resultEmpty match {
-				case None => Completions.empty
+				case None => Completions.strict(Set.empty + c)
 				case Some(av) => c x f(av).completions
 			}
 		}
+	}
 	def derive(c: Char) = a derive c flatMap f
+	override def toString = "bind(" + a + ")"
 }
 private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B]
 {
@@ -286,23 +310,30 @@ private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B]
 	def derive(c: Char) = (a derive c) map f
 	def completions = a.completions
 	override def isTokenStart = a.isTokenStart
+	override def toString = "map(" + a + ")"
 }
 private final class Filter[T](p: Parser[T], f: T => Boolean) extends Parser[T]
 {
 	lazy val resultEmpty = p.resultEmpty filter f
 	def derive(c: Char) = (p derive c) filter f
 	lazy val completions = p.completions filterS { s => apply(p)(s).resultEmpty.filter(f).isDefined }
+	override def toString = "filter(" + p + ")"
 }
-private final class TokenStart[T](delegate: Parser[T], seen: String) extends Parser[T]
+private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean) extends Parser[T]
 {
-	def derive(c: Char) = tokenStart( delegate derive c, seen + c )
+	def derive(c: Char) = token( delegate derive c, if(track) seen + c else seen, track)
 	lazy val completions =
-	{
-		val dcs = delegate.completions
-		Completions( for(c <- dcs.get) yield Completion(seen, c.append, true) )
-	}
+		if(track)
+		{
+			val dcs = delegate.completions
+			Completions( for(c <- dcs.get) yield Completion.token(seen, c.append) )
+		}
+		else
+			Completions.single(Completion.displayStrict(seen))
+
 	def resultEmpty = delegate.resultEmpty
 	override def isTokenStart = true
+	override def toString = "token('" + seen + "', " + track + ", " + delegate + ")"
 }
 private final class And[T](a: Parser[T], b: Parser[_]) extends Parser[T]
 {
@@ -321,26 +352,30 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends
 {
 	def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x.tail })
 	def resultEmpty = delegate.resultEmpty
-	lazy val completions = Completions(fixed map { ex => Completion.strict("",ex,false) } )
+	lazy val completions = if(fixed.isEmpty) Completions.empty else Completions(fixed map(f => Completion.suggestion(f)) )
+	override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")"
 }
 private final class StringLiteral(str: String, remaining: List[Char]) extends Parser[String]
 {
 	assert(str.length > 0 && !remaining.isEmpty)
 	def resultEmpty = None
 	def derive(c: Char) = if(remaining.head == c) stringLiteral(str, remaining.tail) else Invalid
-	lazy val completions = Completions.single(remaining.mkString)
+	lazy val completions = Completions.single(Completion.suggestion(remaining.mkString))
+	override def toString = '"' + str + '"'
 }
 private final class CharacterClass(f: Char => Boolean) extends Parser[Char]
 {
 	def resultEmpty = None
 	def derive(c: Char) = if( f(c) ) success(c) else Invalid
 	def completions = Completions.empty
+	override def toString = "class()"
 }
 private final class Optional[T](delegate: Parser[T]) extends Parser[Option[T]]
 {
 	def resultEmpty = Some(None)
 	def derive(c: Char) = (delegate derive c).map(Some(_))
 	lazy val completions = Completion.empty +: delegate.completions
+	override def toString = delegate.toString + "?"
 }
 private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends Parser[Seq[T]]
 {
@@ -395,4 +430,5 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m
 		for(value <- repeated.resultEmpty) yield List.make(min, value)
 	}
+	override def toString = "repeat(" + min + "," + max +"," + partial + "," + repeated + ")"
 }
\ No newline at end of file
diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala
index 0fda3c6e3..3a2bc494f 100644
--- a/util/complete/src/test/scala/ParserTest.scala
+++ b/util/complete/src/test/scala/ParserTest.scala
@@ -1,14 +1,49 @@
 package sbt.parse
 
 	import Parser._
+	import org.scalacheck._
 
+object ParserTest extends Properties("Completing Parser")
+{
+	val wsc = charClass(_.isWhitespace)
+	val ws = ( wsc + ) examples(" ")
+	val optWs = ( wsc * ) examples("")
+
+	val nested = (token("a1") ~ token("b2")) ~ "c3"
+	val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3"
+
+	def p[T](f: T): T = { /*println(f);*/ f }
+
+	def checkSingle(in: String, expect: Completion)(expectDisplay: Completion = expect) =
+		( ("token '" + in + "'") |: checkOne(in, nested, expect)) &&
+		( ("display '" + in + "'") |: checkOne(in, nestedDisplay, expectDisplay) )
+
+	def checkOne(in: String, parser: Parser[_], expect: Completion): Prop =
+		p(completions(parser, in)) == Completions.single(expect)
+
+	def checkInvalid(in: String) =
+		( ("token '" + in + "'") |: checkInv(in, nested) ) &&
+		( ("display '" + in + "'") |: checkInv(in, nestedDisplay) )
+	def checkInv(in: String, parser: Parser[_]): Prop =
+		p(completions(parser, in)) == Completions.nil
+
+	property("nested tokens a") = checkSingle("", Completion.tokenStrict("","a1") )( Completion.displayStrict(""))
+	property("nested tokens a1") = checkSingle("a", Completion.tokenStrict("a","1") )( Completion.displayStrict(""))
+	property("nested tokens a inv") = checkInvalid("b")
+	property("nested tokens b") = checkSingle("a1", Completion.tokenStrict("","b2") )( Completion.displayStrict(""))
+	property("nested tokens b2") = checkSingle("a1b", Completion.tokenStrict("b","2") )( Completion.displayStrict(""))
+	property("nested tokens b inv") = checkInvalid("a1a")
+	property("nested tokens c") = checkSingle("a1b2", Completion.suggestStrict("c3") )()
+	property("nested tokens c3") = checkSingle("a1b2c", Completion.suggestStrict("3"))()
+	property("nested tokens c inv") = checkInvalid("a1b2a")
+}
 
 object ParserExample
 {
 	val ws = charClass(_.isWhitespace)+
 	val notws = charClass(!_.isWhitespace)+
 
 	val name = token("test")
-	val options = (ws ~ token("quick" || "failed" || "new") )*
+	val options = (ws ~ token("quick" | "failed" | "new") )*
 	val include = (ws ~ token(examples(notws, Set("am", "is", "are", "was", "were") )) )*
 
 	val t = name ~ options ~ include

From 62958e2f1937d3b2d0e48adf92b698969d5161c1 Mon Sep 17 00:00:00 2001
From: Mark Harrah
Date: Tue, 14 Dec 2010 06:08:20 -0500
Subject: [PATCH 103/823] fix flatMap in completion

---
 util/complete/Parser.scala | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala
index f2fb6d919..4944c6d77 100644
--- a/util/complete/Parser.scala
+++ b/util/complete/Parser.scala
@@ -301,7 +301,16 @@ private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends Par
 			}
 		}
 	}
-	def derive(c: Char) = a derive c flatMap f
+
+	def derive(c: Char) =
+	{
+		val common = a derive c flatMap f
+		a.resultEmpty match
+		{
+			case Some(av) => common | f(av).derive(c)
+			case None => common
+		}
+	}
 	override def toString = "bind(" + a + ")"
 }
 private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B]

From 5cb2ba2a7d0d39e628162bac39901dc4ab888ad0 Mon Sep 17 00:00:00 2001
From: Mark Harrah
Date: Tue, 18 Jan 2011 18:07:48 -0500
Subject: [PATCH 104/823] JLine integration for tab completion combinators

---
 util/complete/JLineCompletion.scala           | 94 +++++++++++++++++++
 util/complete/src/test/scala/ParserTest.scala | 20 ++++
 2 files changed, 114 insertions(+)
 create mode 100644 util/complete/JLineCompletion.scala

diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala
new file mode 100644
index 000000000..f9c7183ca
--- /dev/null
+++ b/util/complete/JLineCompletion.scala
@@ -0,0 +1,94 @@
+/* sbt -- Simple Build Tool
+ * Copyright 2011 Mark Harrah
+ */
+package sbt.parse
+
+	import jline.{CandidateListCompletionHandler,Completor,CompletionHandler,ConsoleReader}
+	import scala.annotation.tailrec
+	import collection.JavaConversions
+
+object JLineCompletion
+{
+	def installCustomCompletor(reader: ConsoleReader, parser: Parser[_]): Unit =
+		installCustomCompletor(parserAsCompletor(parser), reader)
+	def installCustomCompletor(reader: ConsoleReader)(complete: String => (Seq[String], Seq[String])): Unit =
+		installCustomCompletor(customCompletor(complete), reader)
+	def installCustomCompletor(complete: ConsoleReader => Boolean, reader: ConsoleReader): Unit =
+	{
+		reader.removeCompletor(DummyCompletor)
+		reader.addCompletor(DummyCompletor)
+		reader.setCompletionHandler(new CustomHandler(complete))
+	}
+
+	private[this] final class CustomHandler(completeImpl: ConsoleReader => Boolean) extends CompletionHandler
+	{
+		override def complete(reader: ConsoleReader, candidates: java.util.List[_], position: Int) = completeImpl(reader)
+	}
+
+	// always provides dummy completions so that the custom completion handler gets called
+	//  (ConsoleReader doesn't call the handler if there aren't any completions)
+	//  the custom handler will then throw away the candidates and call the custom function
+	private[this] final object DummyCompletor extends Completor
+	{
+		override def complete(buffer: String, cursor: Int, candidates: java.util.List[_]): Int =
+		{
+			candidates.asInstanceOf[java.util.List[String]] add "dummy"
+			0
+		}
+	}
+
+	def parserAsCompletor(p: Parser[_]): ConsoleReader => Boolean =
+		customCompletor(str => convertCompletions(Parser.completions(p, str)))
+	def convertCompletions(c: Completions): (Seq[String], Seq[String]) =
+	{
+		( (Seq[String](), Seq[String]()) /: c.get) { case ( t @ (insert,display), comp) =>
+			if(comp.isEmpty) t else (insert :+ comp.append, insert :+ comp.display)
+		}
+	}
+
+	def customCompletor(f: String => (Seq[String], Seq[String])): ConsoleReader => Boolean =
+		reader => {
+			val success = complete(beforeCursor(reader), f, reader, false)
+			reader.flushConsole()
+			success
+		}
+
+	def beforeCursor(reader: ConsoleReader): String =
+	{
+		val b = reader.getCursorBuffer
+		b.getBuffer.substring(0, b.cursor)
+	}
+
+	def complete(beforeCursor: String, completions: String => (Seq[String],Seq[String]), reader: ConsoleReader, inserted: Boolean): Boolean =
+	{
+		val (insert,display) = completions(beforeCursor)
+		if(insert.isEmpty)
+			inserted
+		else
+		{
+			lazy val common = commonPrefix(insert)
+			if(inserted || common.isEmpty)
+			{
+				showCompletions(display, reader)
+				reader.drawLine()
+				true
+			}
+			else
+			{
+				reader.getCursorBuffer.write(common)
+				reader.redrawLine()
+				complete(beforeCursor + common, completions, reader, true)
} + } + } + def showCompletions(cs: Seq[String], reader: ConsoleReader): Unit = + if(cs.isEmpty) () else CandidateListCompletionHandler.printCandidates(reader, JavaConversions.asJavaList(cs), true) + + def commonPrefix(s: Seq[String]): String = if(s.isEmpty) "" else s reduceLeft commonPrefix + def commonPrefix(a: String, b: String): String = + { + val len = a.length min b.length + @tailrec def loop(i: Int): Int = if(i >= len) len else if(a(i) != b(i)) i else loop(i+1) + a.substring(0, loop(0)) + } +} \ No newline at end of file diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 3a2bc494f..2cb907bc1 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -3,6 +3,26 @@ package sbt.parse import Parser._ import org.scalacheck._ +object JLineTest +{ + def main(args: Array[String]) + { + import jline.{ConsoleReader,Terminal} + val reader = new ConsoleReader() + Terminal.getTerminal.disableEcho() + + val parser = ParserExample.t + JLineCompletion.installCustomCompletor(reader, parser) + def loop() { + val line = reader.readLine("> ") + if(line ne null) { + println("Entered '" + line + "'") + loop() + } + } + loop() + } +} object ParserTest extends Properties("Completing Parser") { val wsc = charClass(_.isWhitespace) From f0ef14289dcefba05e2035ac1219f6a648e264c5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 18 Dec 2010 12:40:23 -0500 Subject: [PATCH 105/823] update completion example with newer combinators --- util/complete/src/test/scala/ParserTest.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 2cb907bc1..257171016 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -63,8 +63,8 @@ object ParserExample val notws = charClass(!_.isWhitespace)+ val name = 
token("test") - val options = (ws ~ token("quick" | "failed" | "new") )* - val include = (ws ~ token(examples(notws, Set("am", "is", "are", "was", "were") )) )* + val options = (ws ~> token("quick" | "failed" | "new") )* + val include = (ws ~> token(examples(notws.string, Set("am", "is", "are", "was", "were") )) )* val t = name ~ options ~ include From 783d732868b3d91165735bdaf923a195348868aa Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 18 Dec 2010 12:39:38 -0500 Subject: [PATCH 106/823] Remove obsolete comments --- util/collection/Dag.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala index 55064eaef..9d6a295ea 100644 --- a/util/collection/Dag.scala +++ b/util/collection/Dag.scala @@ -14,7 +14,6 @@ object Dag import scala.collection.{mutable, JavaConversions}; import JavaConversions.{asIterable, asSet} - // TODO: replace implementation with call to new version def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies) def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = From 625ddd703c682c8fa987787ec9ec35cebce9d36b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 19 Dec 2010 12:03:10 -0500 Subject: [PATCH 107/823] part I of revised approach to commands/projects no privileged project member of State no separation of Command and Apply, so no pre-filtering on State use entries in State attributes map instead of mixing in traits to project object: HistoryPath, Logger, Analysis, Navigate, Watch, TaskedKey rework Navigation to be standalone instead of mixin --- util/complete/HistoryCommands.scala | 8 +++++--- util/complete/LineReader.scala | 17 ++++++++++------- 2 files changed, 15 insertions(+), 10 deletions(-) diff --git a/util/complete/HistoryCommands.scala b/util/complete/HistoryCommands.scala index a5b321d8c..bbf8c0bfd 100644 --- a/util/complete/HistoryCommands.scala +++ 
b/util/complete/HistoryCommands.scala @@ -4,6 +4,8 @@ package sbt package complete + import java.io.File + object HistoryCommands { val Start = "!" @@ -38,7 +40,7 @@ object HistoryCommands def printHelp(): Unit = println(helpString) - def apply(s: String, historyPath: Option[Path], maxLines: Int, error: String => Unit): Option[List[String]] = + def apply(s: String, historyPath: Option[File], maxLines: Int, error: String => Unit): Option[List[String]] = if(s.isEmpty) { printHelp() @@ -46,7 +48,7 @@ object HistoryCommands } else { - val lines = historyPath.toList.flatMap(h => IO.readLines(h.asFile) ).toArray + val lines = historyPath.toList.flatMap( p => IO.readLines(p) ).toArray if(lines.isEmpty) { error("No history") @@ -66,7 +68,7 @@ object HistoryCommands { val command = historyCommand(history, s) command.foreach(lines(lines.length - 1) = _) - historyPath foreach { h => IO.writeLines(h.asFile, lines) } + historyPath foreach { h => IO.writeLines(h, lines) } Some(command.toList) } } diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index e7c9ec67a..d586d0729 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -2,7 +2,10 @@ * Copyright 2008, 2009 Mark Harrah */ package sbt -import jline.{Completor, ConsoleReader} + + import jline.{Completor, ConsoleReader} + import java.io.File + abstract class JLine extends LineReader { protected[this] val reader: ConsoleReader @@ -37,10 +40,10 @@ private object JLine try { action } finally { t.enableEcho() } } - private[sbt] def initializeHistory(cr: ConsoleReader, historyPath: Option[Path]): Unit = + private[sbt] def initializeHistory(cr: ConsoleReader, historyPath: Option[File]): Unit = for(historyLocation <- historyPath) { - val historyFile = historyLocation.asFile + val historyFile = historyLocation.getAbsoluteFile ErrorHandling.wideConvert { historyFile.getParentFile.mkdirs() @@ -49,7 +52,7 @@ private object JLine history.setHistoryFile(historyFile) } } - def 
simple(historyPath: Option[Path]): SimpleReader = new SimpleReader(historyPath) + def simple(historyPath: Option[File]): SimpleReader = new SimpleReader(historyPath) val MaxHistorySize = 500 } @@ -57,14 +60,14 @@ trait LineReader extends NotNull { def readLine(prompt: String): Option[String] } -private[sbt] final class LazyJLineReader(historyPath: Option[Path], completor: => Completor) extends JLine +private[sbt] final class LazyJLineReader(historyPath: Option[File] /*, completor: => Completor*/) extends JLine { protected[this] val reader = { val cr = new ConsoleReader cr.setBellEnabled(false) JLine.initializeHistory(cr, historyPath) - cr.addCompletor(new LazyCompletor(completor)) +// cr.addCompletor(new LazyCompletor(completor)) cr } } @@ -75,7 +78,7 @@ private class LazyCompletor(delegate0: => Completor) extends Completor delegate.complete(buffer, cursor, candidates) } -class SimpleReader private[sbt] (historyPath: Option[Path]) extends JLine +class SimpleReader private[sbt] (historyPath: Option[File]) extends JLine { protected[this] val reader = JLine.createReader() JLine.initializeHistory(reader, historyPath) From c24c0b7a23b193d79dada6e0e9a5f2fce178255d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 29 Dec 2010 16:07:17 -0500 Subject: [PATCH 108/823] fully-scoped Settings --- util/collection/PMap.scala | 53 ++++++++-- util/collection/Settings.scala | 185 +++++++++++++++++---------------- 2 files changed, 143 insertions(+), 95 deletions(-) diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index e6b002995..bc054ae87 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -3,7 +3,8 @@ */ package sbt -import Types._ + import Types._ + import collection.mutable trait RMap[K[_], V[_]] { @@ -11,39 +12,79 @@ trait RMap[K[_], V[_]] def get[T](k: K[T]): Option[V[T]] def contains[T](k: K[T]): Boolean } + +trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] +{ + def put[T](k: K[T], v: V[T]): IMap[K,V] + def remove[T](k: 
K[T]): IMap[K,V] + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): IMap[K,V] + def mapValues[V2[_]](f: V ~> V2): IMap[K,V2] + def toSeq: Seq[(K[_], V[_])] +} trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] { def update[T](k: K[T], v: V[T]): Unit def remove[T](k: K[T]): Option[V[T]] def getOrUpdate[T](k: K[T], make: => V[T]): V[T] + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] } object PMap { implicit def toFunction[K[_], V[_]](map: PMap[K,V]): K[_] => V[_] = k => map(k) + def empty[K[_], V[_]]: PMap[K,V] = new DelegatingPMap[K,V](new mutable.HashMap) +} +object IMap +{ + /** + * Only suitable for K that is invariant in its type parameter. + * Option and List keys are not suitable, for example, + * because None <:< Option[String] and None <: Option[Int]. + */ + def empty[K[_], V[_]]: IMap[K,V] = new IMap0[K,V](Map.empty) + + private[this] class IMap0[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractRMap[K,V] with IMap[K,V] + { + def get[T](k: K[T]): Option[V[T]] = ( backing get k ).asInstanceOf[Option[V[T]]] + def put[T](k: K[T], v: V[T]) = new IMap0[K,V]( backing.updated(k, v) ) + def remove[T](k: K[T]) = new IMap0[K,V]( backing - k ) + + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]) = + put(k, f(this get k getOrElse init)) + + def mapValues[V2[_]](f: V ~> V2) = + new IMap0[K,V2](backing.mapValues(x => f(x))) + def toSeq = backing.toSeq + + override def toString = backing.toString + } } -abstract class AbstractPMap[K[_], V[_]] extends PMap[K,V] +abstract class AbstractRMap[K[_], V[_]] extends RMap[K,V] { def apply[T](k: K[T]): V[T] = get(k).get def contains[T](k: K[T]): Boolean = get(k).isDefined } -import collection.mutable.Map - /** * Only suitable for K that is invariant in its type parameter. * Option and List keys are not suitable, for example, * because None <:< Option[String] and None <: Option[Int]. 
*/ -class DelegatingPMap[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractPMap[K,V] +class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends AbstractRMap[K,V] with PMap[K,V] { def get[T](k: K[T]): Option[V[T]] = cast[T]( backing.get(k) ) def update[T](k: K[T], v: V[T]) { backing(k) = v } def remove[T](k: K[T]) = cast( backing.remove(k) ) def getOrUpdate[T](k: K[T], make: => V[T]) = cast[T]( backing.getOrElseUpdate(k, make) ) + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] = + { + val v = f(this get k getOrElse init) + update(k, v) + v + } private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] override def toString = backing.toString -} \ No newline at end of file +} diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index a059e84da..c1ea10cb7 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -1,20 +1,23 @@ package sbt + import Types._ import annotation.tailrec - import Settings._ + import collection.mutable sealed trait Settings[Scope] { def data: Scope => AttributeMap - def definitions: Scope => Definitions - def linear: Scope => Seq[Scope] + def scopes: Seq[Scope] def get[T](scope: Scope, key: AttributeKey[T]): Option[T] def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] } -private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val definitions: Map[Scope, Definitions], val linear: Scope => Seq[Scope]) extends Settings[Scope] + +private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] { + def scopes: Seq[Scope] = data.keys.toSeq + def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = - linear(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption + delegates(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption private def scopeLocal[T](scope: Scope, 
key: AttributeKey[T]): Option[T] = (data get scope).flatMap(_ get key) @@ -23,114 +26,118 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val def { val map = (data get scope) getOrElse AttributeMap.empty val newData = data.updated(scope, map.put(key, value)) - new Settings0(newData, definitions, linear) + new Settings0(newData, delegates) } } -object Settings +// delegates should contain the input Scope as the first entry +final class Init[Scope](val delegates: Scope => Seq[Scope]) { - type Data[Scope] = Map[Scope, AttributeMap] - type Init = Seq[Setting[_]] - type Keys = Set[AttributeKey[_]] + final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) - def make[Scope](inits: Iterable[(Scope,Init)], lzA: Scope => Seq[Scope]): Settings[Scope] = - { - val definitions = inits map { case (scope, init) => (scope, compile(init)) } toMap; - val resolved = for( (scope, definition) <- definitions) yield (scope, resolveScopes(definition, lzA(scope), definitions) ) - val scopeDeps = resolved map { case (scope, requiredMap) => (scope, requiredMap.values) } toMap; - val ordered = Dag.topologicalSort(scopeDeps.keys)(scopeDeps) - val data = (Map.empty[Scope, AttributeMap] /: ordered) { (mp, scope) => add(mp, scope, definitions, resolved) } - new Settings0(data, definitions, lzA) + type SettingSeq[T] = Seq[Setting[T]] + type ScopedMap = IMap[ScopedKey, SettingSeq] + type CompiledMap = Map[ScopedKey[_], Compiled] + type MapScoped = ScopedKey ~> ScopedKey + + def value[T](key: ScopedKey[T])(value: => T): Setting[T] = new Value(key, value _) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = app(key, key :^: KNil)(h => f(h.head)) + def app[HL <: HList, T](key: ScopedKey[T], inputs: KList[ScopedKey, HL])(f: HL => T): Setting[T] = new Apply(key, f, inputs) + + def empty: Settings[Scope] = new Settings0(Map.empty, delegates) + def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { + def apply[T](k: ScopedKey[T]): T = 
s.get(k.scope, k.key).get } - private[this] def add[Scope](data: Data[Scope], scope: Scope, definitions: Map[Scope, Definitions], resolved: Map[Scope, Map[AttributeKey[_], Scope]]): Map[Scope, AttributeMap] = - data.updated(scope, mkScopeMap(data, definitions(scope), resolved(scope)) ) - - private[this] def mkScopeMap[Scope](data: Data[Scope], definitions: Definitions, definedIn: Map[AttributeKey[_], Scope]): AttributeMap = + def make(init: Seq[Setting[_]]): Settings[Scope] = { - val start = (AttributeMap.empty /: definitions.requires) ( (mp, key) => prepop(data, definedIn, mp, key)) - definitions eval start + // group by Scope/Key, dropping dead initializations + val sMap: ScopedMap = grouped(init) + // delegate references to undefined values according to 'delegates' + val dMap: ScopedMap = delegate(sMap) + // merge Seq[Setting[_]] into Compiled + val cMap: CompiledMap = compile(dMap) + // order the initializations. cyclic references are detected here. + val ordered: Seq[Compiled] = sort(cMap) + // evaluation: apply the initializations. 
+ applyInits(ordered) } + def sort(cMap: CompiledMap): Seq[Compiled] = + Dag.topologicalSort(cMap.values)(_.dependencies.map(cMap)) - private[this] def prepop[T, Scope](data: Data[Scope], definedIn: Map[AttributeKey[_], Scope], mp: AttributeMap, key: AttributeKey[T]): AttributeMap = - mp.put(key, data(definedIn(key))(key)) - - private[this] def resolveScopes[Scope](definition: Definitions, search: Seq[Scope], definitions: Map[Scope, Definitions]): Map[AttributeKey[_], Scope] = - definition.requires.view.map(req => (req, resolveScope(req, search, definitions )) ).toMap - - private[this] def resolveScope[Scope](key: AttributeKey[_], search: Seq[Scope], definitions: Map[Scope, Definitions]): Scope = - search find defines(key, definitions) getOrElse { throw new Uninitialized(key) } - - private[this] def defines[Scope](key: AttributeKey[_], definitions: Map[Scope, Definitions])(scope: Scope): Boolean = - (definitions get scope).filter(_.provides contains key).isDefined - - final class Definitions(val provides: Keys, val requires: Keys, val eval: AttributeMap => AttributeMap) - - def value[T](key: AttributeKey[T])(value: => T): Setting[T] = new Value(key, value _) - def update[T](key: AttributeKey[T])(f: T => T): Setting[T] = new Update(key, f) - def app[HL <: HList, T](key: AttributeKey[T], inputs: KList[AttributeKey, HL])(f: HL => T): Setting[T] = new Apply(key, f, inputs) - - def compile(settings: Seq[Setting[_]]): Definitions = - { - val grpd = grouped(settings) - val sorted = sort(grpd) - val eval = (map: AttributeMap) => (map /: sorted)( (m, c) => c eval m ) - val provided = grpd.keySet.toSet - val requires = sorted.flatMap(_.dependencies).toSet -- provided ++ sorted.collect { case c if !c.selfContained => c.key } - new Definitions(provided, requires, eval) - } - private[this] def grouped(settings: Seq[Setting[_]]): Map[AttributeKey[_], Compiled] = - settings.groupBy(_.key) map { case (key: AttributeKey[t], actions) => - (key: AttributeKey[_], compileSetting(key, 
actions.asInstanceOf[Seq[Setting[t]]]) ) + def compile(sMap: ScopedMap): CompiledMap = + sMap.toSeq.map { case (k, ss) => + val deps = ss flatMap { _.dependsOn } + val eval = (settings: Settings[Scope]) => (settings /: ss)(applySetting) + (k, new Compiled(deps, eval)) } toMap; - private[this] def compileSetting[T](key: AttributeKey[T], actions: Seq[Setting[T]]): Compiled = + def grouped(init: Seq[Setting[_]]): ScopedMap = + ((IMap.empty : ScopedMap) /: init) ( (m,s) => add(m,s) ) + + def add[T](m: ScopedMap, s: Setting[T]): ScopedMap = + m.mapValue[T]( s.key, Nil, ss => append(ss, s)) + + def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] = + if(s.definitive) s :: Nil else ss :+ s + + def delegate(sMap: ScopedMap): ScopedMap = { - val (alive, selfContained) = live(key, actions) - val f = (map: AttributeMap) => (map /: alive)(eval) - new Compiled(key, f, dependencies(actions), selfContained) + val md = memoDelegates + def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = mapReferenced(sMap, k, md(k.scope), refKey) } + val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = ks.map{ s => s mapReferenced refMap(s.key) } } + sMap mapValues f } - private[this] final class Compiled(val key: AttributeKey[_], val eval: AttributeMap => AttributeMap, val dependencies: Iterable[AttributeKey[_]], val selfContained: Boolean) { - override def toString = key.label + private[this] def mapReferenced[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_]): ScopedKey[T] = + { + val scache = PMap.empty[ScopedKey, ScopedKey] + def resolve(search: Seq[Scope]): ScopedKey[T] = + search match { + case Seq() => throw new Uninitialized(k) + case Seq(x, xs @ _*) => + val sk = ScopedKey(x, k.key) + scache.getOrUpdate(sk, if(defines(sMap, sk, refKey)) sk else resolve(xs)) + } + resolve(scopes) + } + private[this] def defines(map: ScopedMap, key: ScopedKey[_], refKey: ScopedKey[_]): Boolean = + 
(map get key) match { case Some(Seq(x, _*)) => (refKey != key) || x.definitive; case _ => false } + + private[this] def applyInits(ordered: Seq[Compiled]): Settings[Scope] = + (empty /: ordered){ (m, comp) => comp.eval(m) } + + private[this] def memoDelegates: Scope => Seq[Scope] = + { + val dcache = new mutable.HashMap[Scope, Seq[Scope]] + (scope: Scope) => dcache.getOrElseUpdate(scope, delegates(scope)) } - private[this] def sort(actionMap: Map[AttributeKey[_], Compiled]): Seq[Compiled] = - Dag.topologicalSort(actionMap.values)( _.dependencies.flatMap(actionMap.get) ) - - private[this] def live[T](key: AttributeKey[T], actions: Seq[Setting[T]]): (Seq[Setting[T]], Boolean) = - { - val lastOverwrite = actions.lastIndexWhere(_ overwrite key) - val selfContained = lastOverwrite >= 0 - val alive = if(selfContained) actions.drop(lastOverwrite) else actions - (alive, selfContained) - } - private[this] def dependencies(actions: Seq[Setting[_]]): Seq[AttributeKey[_]] = actions.flatMap(_.dependsOn) - private[this] def eval[T](map: AttributeMap, a: Setting[T]): AttributeMap = + private[this] def applySetting[T](map: Settings[Scope], a: Setting[T]): Settings[Scope] = a match { - case s: Value[T] => map.put(s.key, s.value()) - case u: Update[T] => map.put(u.key, u.f(map(u.key))) - case a: Apply[hl, T] => map.put(a.key, a.f(a.inputs.down(map))) + case s: Value[T] => map.set(s.key.scope, s.key.key, s.value()) + case a: Apply[hl, T] => map.set(a.key.scope, a.key.key, a.f(a.inputs.down(asTransform(map)))) } + final class Uninitialized(key: ScopedKey[_]) extends Exception("Update on uninitialized setting " + key.key.label + " (in " + key.scope + ")") + final class Compiled(val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) + sealed trait Setting[T] { - def key: AttributeKey[T] - def overwrite(key: AttributeKey[T]): Boolean - def dependsOn: Seq[AttributeKey[_]] = Nil + def key: ScopedKey[T] + def definitive: Boolean + def dependsOn: 
Seq[ScopedKey[_]] + def mapReferenced(f: MapScoped): Setting[T] } - private[this] final class Value[T](val key: AttributeKey[T], val value: () => T) extends Setting[T] + private[this] final class Value[T](val key: ScopedKey[T], val value: () => T) extends Setting[T] { - def overwrite(key: AttributeKey[T]) = true + def definitive = true + def dependsOn = Nil + def mapReferenced(f: MapScoped) = this } - private[this] final class Update[T](val key: AttributeKey[T], val f: T => T) extends Setting[T] + private[this] final class Apply[HL <: HList, T](val key: ScopedKey[T], val f: HL => T, val inputs: KList[ScopedKey, HL]) extends Setting[T] { - def overwrite(key: AttributeKey[T]) = false + def definitive = !inputs.toList.contains(key) + def dependsOn = inputs.toList - key + def mapReferenced(g: MapScoped) = new Apply(key, f, inputs transform g) } - private[this] final class Apply[HL <: HList, T](val key: AttributeKey[T], val f: HL => T, val inputs: KList[AttributeKey, HL]) extends Setting[T] - { - def overwrite(key: AttributeKey[T]) = inputs.toList.forall(_ ne key) - override def dependsOn = inputs.toList - key - } - - final class Uninitialized(key: AttributeKey[_]) extends Exception("Update on uninitialized setting " + key.label) } From bcc8c37f4d774af762e8ce56e45e74c0f7da2d3f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Jan 2011 18:24:11 -0500 Subject: [PATCH 109/823] multi-project model based on Settings and ProjectRef --- util/collection/Settings.scala | 79 ++++++++++++++++++++++------- util/collection/TypeFunctions.scala | 8 +++ util/collection/Util.scala | 17 +++++++ 3 files changed, 85 insertions(+), 19 deletions(-) create mode 100644 util/collection/Util.scala diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index c1ea10cb7..cd94f012a 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -1,3 +1,6 @@ +/* sbt -- Simple Build Tool + * Copyright 2011 Mark Harrah + */ package sbt import Types._ @@ 
-6,15 +9,17 @@ package sbt sealed trait Settings[Scope] { - def data: Scope => AttributeMap - def scopes: Seq[Scope] + def data: Map[Scope, AttributeMap] + def keys(scope: Scope): Set[AttributeKey[_]] + def scopes: Set[Scope] def get[T](scope: Scope, key: AttributeKey[T]): Option[T] def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] } private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] { - def scopes: Seq[Scope] = data.keys.toSeq + def scopes: Set[Scope] = data.keySet.toSet + def keys(scope: Scope) = data(scope).keys.toSet def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = delegates(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption @@ -30,7 +35,8 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del } } // delegates should contain the input Scope as the first entry -final class Init[Scope](val delegates: Scope => Seq[Scope]) +// this trait is intended to be mixed into an object +trait Init[Scope] { final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) @@ -42,18 +48,28 @@ final class Init[Scope](val delegates: Scope => Seq[Scope]) def value[T](key: ScopedKey[T])(value: => T): Setting[T] = new Value(key, value _) def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = app(key, key :^: KNil)(h => f(h.head)) def app[HL <: HList, T](key: ScopedKey[T], inputs: KList[ScopedKey, HL])(f: HL => T): Setting[T] = new Apply(key, f, inputs) + def uniform[S,T](key: ScopedKey[T], inputs: Seq[ScopedKey[S]])(f: Seq[S] => T): Setting[T] = new Uniform(key, f, inputs) + def kapp[HL <: HList, M[_], T](key: ScopedKey[T], inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL])(f: KList[M, HL] => T): Setting[T] = new KApply[HL, M, T](key, f, inputs) - def empty: Settings[Scope] = new Settings0(Map.empty, delegates) - def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { - def apply[T](k: 
ScopedKey[T]): T = s.get(k.scope, k.key).get + // the following is a temporary workaround for the "... cannot be instantiated from ..." bug, which renders 'kapp' above unusable outside this source file + class KApp[HL <: HList, M[_], T] { + type Composed[S] = ScopedKey[M[S]] + def apply(key: ScopedKey[T], inputs: KList[Composed, HL])(f: KList[M, HL] => T): Setting[T] = new KApply[HL, M, T](key, f, inputs) } - def make(init: Seq[Setting[_]]): Settings[Scope] = + def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates) + def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { + def apply[T](k: ScopedKey[T]): T = getValue(s, k) + } + def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key).get + def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) + + def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = { // group by Scope/Key, dropping dead initializations val sMap: ScopedMap = grouped(init) // delegate references to undefined values according to 'delegates' - val dMap: ScopedMap = delegate(sMap) + val dMap: ScopedMap = delegate(sMap)(delegates) // merge Seq[Setting[_]] into Compiled val cMap: CompiledMap = compile(dMap) // order the initializations. cyclic references are detected here. 
@@ -80,9 +96,9 @@ final class Init[Scope](val delegates: Scope => Seq[Scope]) def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] = if(s.definitive) s :: Nil else ss :+ s - def delegate(sMap: ScopedMap): ScopedMap = + def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = { - val md = memoDelegates + val md = memoDelegates(delegates) def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = mapReferenced(sMap, k, md(k.scope), refKey) } val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = ks.map{ s => s mapReferenced refMap(s.key) } } sMap mapValues f @@ -102,21 +118,27 @@ final class Init[Scope](val delegates: Scope => Seq[Scope]) private[this] def defines(map: ScopedMap, key: ScopedKey[_], refKey: ScopedKey[_]): Boolean = (map get key) match { case Some(Seq(x, _*)) => (refKey != key) || x.definitive; case _ => false } - private[this] def applyInits(ordered: Seq[Compiled]): Settings[Scope] = + private[this] def applyInits(ordered: Seq[Compiled])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = (empty /: ordered){ (m, comp) => comp.eval(m) } - private[this] def memoDelegates: Scope => Seq[Scope] = + private[this] def memoDelegates(implicit delegates: Scope => Seq[Scope]): Scope => Seq[Scope] = { val dcache = new mutable.HashMap[Scope, Seq[Scope]] (scope: Scope) => dcache.getOrElseUpdate(scope, delegates(scope)) } - private[this] def applySetting[T](map: Settings[Scope], a: Setting[T]): Settings[Scope] = - a match + private[this] def applySetting[T](map: Settings[Scope], setting: Setting[T]): Settings[Scope] = + { + def execK[HL <: HList, M[_]](a: KApply[HL, M, T]) = + map.set(a.key.scope, a.key.key, a.f(a.inputs.transform[M]( nestCon[ScopedKey, Id, M](asTransform(map)) )) ) + setting match { case s: Value[T] => map.set(s.key.scope, s.key.key, s.value()) - case a: Apply[hl, T] => map.set(a.key.scope, a.key.key, a.f(a.inputs.down(asTransform(map)))) + 
case u: Uniform[s, T] => map.set(u.key.scope, u.key.key, u.f(u.inputs map asFunction(map)) ) + case a: Apply[hl, T] => map.set(a.key.scope, a.key.key, a.f(a.inputs down asTransform(map) ) ) + case ka: KApply[hl, m, T] => execK[hl, m](ka) // separate method needed to workaround bug where m is not recognized as higher-kinded in inline version } + } final class Uninitialized(key: ScopedKey[_]) extends Exception("Update on uninitialized setting " + key.key.label + " (in " + key.scope + ")") final class Compiled(val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) @@ -126,18 +148,37 @@ final class Init[Scope](val delegates: Scope => Seq[Scope]) def key: ScopedKey[T] def definitive: Boolean def dependsOn: Seq[ScopedKey[_]] - def mapReferenced(f: MapScoped): Setting[T] + def mapReferenced(g: MapScoped): Setting[T] + def mapKey(g: MapScoped): Setting[T] } private[this] final class Value[T](val key: ScopedKey[T], val value: () => T) extends Setting[T] { def definitive = true def dependsOn = Nil - def mapReferenced(f: MapScoped) = this + def mapReferenced(g: MapScoped) = this + def mapKey(g: MapScoped): Setting[T] = new Value(g(key), value) } private[this] final class Apply[HL <: HList, T](val key: ScopedKey[T], val f: HL => T, val inputs: KList[ScopedKey, HL]) extends Setting[T] { def definitive = !inputs.toList.contains(key) - def dependsOn = inputs.toList - key + def dependsOn = remove(inputs.toList, key) def mapReferenced(g: MapScoped) = new Apply(key, f, inputs transform g) + def mapKey(g: MapScoped): Setting[T] = new Apply(g(key), f, inputs) } + private[this] final class KApply[HL <: HList, M[_], T](val key: ScopedKey[T], val f: KList[M, HL] => T, val inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL]) extends Setting[T] + { + def definitive = !inputs.toList.contains(key) + def dependsOn = remove(unnest(inputs.toList), key) + def mapReferenced(g: MapScoped) = new KApply[HL, M, T](key, f, inputs.transform[({type l[t] = 
ScopedKey[M[t]]})#l]( nestCon(g) ) ) + def mapKey(g: MapScoped): Setting[T] = new KApply[HL, M, T](g(key), f, inputs) + private[this] def unnest(l: List[ScopedKey[M[T]] forSome { type T }]): List[ScopedKey[_]] = l.asInstanceOf[List[ScopedKey[_]]] + } + private[this] final class Uniform[S, T](val key: ScopedKey[T], val f: Seq[S] => T, val inputs: Seq[ScopedKey[S]]) extends Setting[T] + { + def definitive = !inputs.contains(key) + def dependsOn = remove(inputs, key) + def mapReferenced(g: MapScoped) = new Uniform(key, f, inputs map g.fn[S]) + def mapKey(g: MapScoped): Setting[T] = new Uniform(g(key), f, inputs) + } + private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) } diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index ca173f9bc..93e44154a 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -8,16 +8,24 @@ trait TypeFunctions type Id[X] = X sealed trait Const[A] { type Apply[B] = A } sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } + sealed trait ∙[A[_], B[_]] { type l[T] = A[B[T]] } sealed trait P1of2[M[_,_], A] { type Apply[B] = M[A,B]; type Flip[B] = M[B, A] } final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } + def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = + f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. 
castless version: + /* new ( (M ∙ G)#l ~> (N ∙ G)#l ) { + def apply[T](mg: M[G[T]]): N[G[T]] = f(mg) + }*/ + implicit def toFn1[A,B](f: A => B): Fn1[A,B] = new Fn1[A,B] { def ∙[C](g: C => A) = f compose g } + type Endo[T] = T=>T type ~>|[A[_],B[_]] = A ~> Compose[Option, B]#Apply } object TypeFunctions extends TypeFunctions diff --git a/util/collection/Util.scala b/util/collection/Util.scala new file mode 100644 index 000000000..a3bf330aa --- /dev/null +++ b/util/collection/Util.scala @@ -0,0 +1,17 @@ +/* sbt -- Simple Build Tool + * Copyright 2011 Mark Harrah + */ +package sbt + +object Collections +{ + def separate[T,A,B](ps: Seq[T])(f: T => Either[A,B]): (Seq[A], Seq[B]) = + ((Nil: Seq[A], Nil: Seq[B]) /: ps)( (xs, y) => prependEither(xs, f(y)) ) + + def prependEither[A,B](acc: (Seq[A], Seq[B]), next: Either[A,B]): (Seq[A], Seq[B]) = + next match + { + case Left(l) => (l +: acc._1, acc._2) + case Right(r) => (acc._1, r +: acc._2) + } +} \ No newline at end of file From d49706b2976aaddcb3767bb5a1b9d70add1f625e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 22 Jan 2011 14:01:59 -0500 Subject: [PATCH 110/823] redo Command to use Parser nested commands still need work --- util/complete/Completions.scala | 5 +- util/complete/HistoryCommands.scala | 15 + util/complete/JLineCompletion.scala | 58 ++-- util/complete/LineReader.scala | 13 +- util/complete/Parser.scala | 277 ++++++++++-------- util/complete/Parsers.scala | 53 ++++ util/complete/UpperBound.scala | 2 +- util/complete/src/test/scala/ParserTest.scala | 17 +- 8 files changed, 282 insertions(+), 158 deletions(-) create mode 100644 util/complete/Parsers.scala diff --git a/util/complete/Completions.scala b/util/complete/Completions.scala index a27e5dc6f..a2b910897 100644 --- a/util/complete/Completions.scala +++ b/util/complete/Completions.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.parse +package sbt.complete /** * Represents a set of completions. 
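
The `Collections.separate` helper added in `Util.scala` above folds a sequence into two, one per side of an `Either`. A minimal self-contained sketch of the same idea (illustrative only — this variant appends rather than prepends, so it preserves input order, whereas the version in the patch prepends and therefore yields the elements reversed):

```scala
object SeparateSketch {
  // Partition a sequence by a classifying function, as in Collections.separate.
  def separate[T, A, B](ps: Seq[T])(f: T => Either[A, B]): (Seq[A], Seq[B]) =
    ps.foldLeft((Seq.empty[A], Seq.empty[B])) { (acc, t) =>
      f(t) match {
        case Left(a)  => (acc._1 :+ a, acc._2)   // accumulate on the left
        case Right(b) => (acc._1, acc._2 :+ b)   // accumulate on the right
      }
    }

  def main(args: Array[String]): Unit = {
    val (evens, odds) = separate(1 to 6)(i => if (i % 2 == 0) Left(i) else Right(i))
    assert(evens == Seq(2, 4, 6))
    assert(odds == Seq(1, 3, 5))
  }
}
```
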
@@ -72,9 +72,8 @@ sealed trait Completion override final lazy val hashCode = Completion.hashCode(this) override final def equals(o: Any) = o match { case c: Completion => Completion.equal(this, c); case _ => false } } -final class DisplayOnly(display0: String) extends Completion +final class DisplayOnly(val display: String) extends Completion { - lazy val display = display0 def isEmpty = display.isEmpty def append = "" override def toString = "{" + display + "}" diff --git a/util/complete/HistoryCommands.scala b/util/complete/HistoryCommands.scala index bbf8c0bfd..16a359f9a 100644 --- a/util/complete/HistoryCommands.scala +++ b/util/complete/HistoryCommands.scala @@ -83,4 +83,19 @@ object HistoryCommands else history ! s } +/* + import parse.{Parser,Parsers} + import Parser._ + import Parsers._ + val historyParser: Parser[complete.History => Option[String]] = + { + Start ~> Specific) + } + !! Execute the last command again + !: Show all previous commands + !:n Show the last n commands + !n Execute the command with index n, as shown by the !: command + !-n Execute the nth command before this one + !string Execute the most recent command starting with 'string' + !?string*/ } \ No newline at end of file diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index f9c7183ca..9103c3e72 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt.parse +package sbt.complete import jline.{CandidateListCompletionHandler,Completor,CompletionHandler,ConsoleReader} import scala.annotation.tailrec @@ -41,14 +41,17 @@ object JLineCompletion customCompletor(str => convertCompletions(Parser.completions(p, str))) def convertCompletions(c: Completions): (Seq[String], Seq[String]) = { - ( (Seq[String](), Seq[String]()) /: c.get) { case ( t @ (insert,display), comp) => - if(comp.isEmpty) t else (insert :+ comp.append, insert :+ 
comp.display) - } + val (insert, display) = + ( (Set.empty[String], Set.empty[String]) /: c.get) { case ( t @ (insert,display), comp) => + if(comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display.trim)) + } + (insert.toSeq, display.toSeq.sorted) } - + def appendNonEmpty(set: Set[String], add: String) = if(add.isEmpty) set else set + add + def customCompletor(f: String => (Seq[String], Seq[String])): ConsoleReader => Boolean = reader => { - val success = complete(beforeCursor(reader), f, reader, false) + val success = complete(beforeCursor(reader), f, reader) reader.flushConsole() success } @@ -59,29 +62,34 @@ object JLineCompletion b.getBuffer.substring(0, b.cursor) } - def complete(beforeCursor: String, completions: String => (Seq[String],Seq[String]), reader: ConsoleReader, inserted: Boolean): Boolean = + // returns false if there was nothing to insert and nothing to display + def complete(beforeCursor: String, completions: String => (Seq[String],Seq[String]), reader: ConsoleReader): Boolean = { val (insert,display) = completions(beforeCursor) - if(insert.isEmpty) - inserted - else - { - lazy val common = commonPrefix(insert) - if(inserted || common.isEmpty) - { - showCompletions(display, reader) - reader.drawLine() - true - } + val common = commonPrefix(insert) + if(common.isEmpty) + if(display.isEmpty) + () else - { - reader.getCursorBuffer.write(common) - reader.redrawLine() - complete(beforeCursor + common, completions, reader, true) - } - } + showCompletions(display, reader) + else + appendCompletion(common, reader) + + !(common.isEmpty && display.isEmpty) } - def showCompletions(cs: Seq[String], reader: ConsoleReader): Unit = + + def appendCompletion(common: String, reader: ConsoleReader) + { + reader.getCursorBuffer.write(common) + reader.redrawLine() + } + + def showCompletions(display: Seq[String], reader: ConsoleReader) + { + printCompletions(display, reader) + reader.drawLine() + } + def printCompletions(cs: Seq[String], 
reader: ConsoleReader): Unit = if(cs.isEmpty) () else CandidateListCompletionHandler.printCandidates(reader, JavaConversions.asJavaList(cs), true) def commonPrefix(s: Seq[String]): String = if(s.isEmpty) "" else s reduceLeft commonPrefix diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index d586d0729..b8ab32f87 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -5,6 +5,7 @@ package sbt import jline.{Completor, ConsoleReader} import java.io.File + import complete.Parser abstract class JLine extends LineReader { @@ -56,27 +57,21 @@ private object JLine val MaxHistorySize = 500 } -trait LineReader extends NotNull +trait LineReader { def readLine(prompt: String): Option[String] } -private[sbt] final class LazyJLineReader(historyPath: Option[File] /*, completor: => Completor*/) extends JLine +final class FullReader(historyPath: Option[File], complete: Parser[_]) extends JLine { protected[this] val reader = { val cr = new ConsoleReader cr.setBellEnabled(false) JLine.initializeHistory(cr, historyPath) -// cr.addCompletor(new LazyCompletor(completor)) + sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } -private class LazyCompletor(delegate0: => Completor) extends Completor -{ - private lazy val delegate = delegate0 - def complete(buffer: String, cursor: Int, candidates: java.util.List[_]): Int = - delegate.complete(buffer, cursor, candidates) -} class SimpleReader private[sbt] (historyPath: Option[File]) extends JLine { diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 4944c6d77..cb94bee85 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2010 Mark Harrah + * Copyright 2008, 2010, 2011 Mark Harrah */ -package sbt.parse +package sbt.complete import Parser._ @@ -11,7 +11,7 @@ sealed trait Parser[+T] def resultEmpty: Option[T] def result: Option[T] = None def completions: 
Completions - def valid: Boolean = true + def valid: Boolean def isTokenStart = false } sealed trait RichParser[A] @@ -54,56 +54,8 @@ sealed trait RichParser[A] def flatMap[B](f: A => Parser[B]): Parser[B] } -object Parser +object Parser extends ParserMain { - def apply[T](p: Parser[T])(s: String): Parser[T] = - (p /: s)(derive1) - - def derive1[T](p: Parser[T], c: Char): Parser[T] = - if(p.valid) p.derive(c) else p - - def completions(p: Parser[_], s: String): Completions = completions( apply(p)(s) ) - def completions(p: Parser[_]): Completions = p.completions - - implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] - { - def ~[B](b: Parser[B]) = seqParser(a, b) - def ||[B](b: Parser[B]) = choiceParser(a,b) - def |[B >: A](b: Parser[B]) = homParser(a,b) - def ? = opt(a) - def * = zeroOrMore(a) - def + = oneOrMore(a) - def map[B](f: A => B) = mapParser(a, f) - def id = a - - def ^^^[B](value: B): Parser[B] = a map { _ => value } - def ??[B >: A](alt: B): Parser[B] = a.? 
map { _ getOrElse alt } - def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av } - def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv } - - def unary_- = not(a) - def & (o: Parser[_]) = and(a, o) - def - (o: Parser[_]) = sub(a, o) - def examples(s: String*): Parser[A] = examples(s.toSet) - def examples(s: Set[String]): Parser[A] = Parser.examples(a, s, check = true) - def filter(f: A => Boolean): Parser[A] = filterParser(a, f) - def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) - def flatMap[B](f: A => Parser[B]) = bindParser(a, f) - } - implicit def literalRichParser(c: Char): RichParser[Char] = richParser(c) - implicit def literalRichParser(s: String): RichParser[String] = richParser(s) - - def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = - if(a.valid) { - a.result match - { - case Some(av) => success( av ) - case None => - if(check) checkMatches(a, completions.toSeq) - new Examples(a, completions) - } - } - else Invalid def checkMatches(a: Parser[_], completions: Seq[String]) { @@ -135,37 +87,22 @@ object Parser if(a.valid) { a.result match { - case Some(av) => if( f(av) ) success( av ) else Invalid + case Some(av) => if( f(av) ) successStrict( av ) else Invalid case None => new Filter(a, f) } } else Invalid def seqParser[A,B](a: Parser[A], b: Parser[B]): Parser[(A,B)] = - if(a.valid && b.valid) { + if(a.valid && b.valid) (a.result, b.result) match { - case (Some(av), Some(bv)) => success( (av, bv) ) + case (Some(av), Some(bv)) => successStrict( (av, bv) ) case (Some(av), None) => b map { bv => (av, bv) } case (None, Some(bv)) => a map { av => (av, bv) } case (None, None) => new SeqParser(a,b) } - } else Invalid - def token[T](t: Parser[T]): Parser[T] = token(t, "", true) - def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false) - def token[T](t: Parser[T], seen: String, track: Boolean): Parser[T] = - if(t.valid && 
!t.isTokenStart) - if(t.result.isEmpty) new TokenStart(t, seen, track) else t - else - t - - def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = - if(a.valid) - if(b.valid) new HomParser(a, b) else a - else - b - def choiceParser[A,B](a: Parser[A], b: Parser[B]): Parser[Either[A,B]] = if(a.valid) if(b.valid) new HetParser(a,b) else a.map( Left(_) ) @@ -173,14 +110,14 @@ object Parser b.map( Right(_) ) def opt[T](a: Parser[T]): Parser[Option[T]] = - if(a.valid) new Optional(a) else success(None) + if(a.valid) new Optional(a) else successStrict(None) def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) def repeat[T](p: Parser[T], min: Int = 0, max: UpperBound = Infinite): Parser[Seq[T]] = repeat(None, p, min, max, Nil) - private[parse] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] = + private[complete] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] = { assume(min >= 0, "Minimum must be greater than or equal to zero (was " + min + ")") assume(max >= min, "Minimum must be less than or equal to maximum (min: " + min + ", max: " + max + ")") @@ -189,8 +126,8 @@ object Parser if(repeated.valid) repeated.result match { - case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here - case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) + case Some(value) => successStrict(revAcc reverse_::: value :: Nil) // revAcc should be Nil here + case None => if(max.isZero) successStrict(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) } else if(min == 0) invalidButOptional @@ -207,27 +144,55 @@ object Parser case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse)) } else Invalid - case None => checkRepeated(success(Nil)) + case 
None => checkRepeated(successStrict(Nil)) } } - def success[T](value: T): Parser[T] = new Parser[T] { - override def result = Some(value) + def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) + + def and[T](a: Parser[T], b: Parser[_]): Parser[T] = if(a.valid && b.valid) new And(a, b) else Invalid +} +trait ParserMain +{ + implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] + { + def ~[B](b: Parser[B]) = seqParser(a, b) + def ||[B](b: Parser[B]) = choiceParser(a,b) + def |[B >: A](b: Parser[B]) = homParser(a,b) + def ? = opt(a) + def * = zeroOrMore(a) + def + = oneOrMore(a) + def map[B](f: A => B) = mapParser(a, f) + def id = a + + def ^^^[B](value: B): Parser[B] = a map { _ => value } + def ??[B >: A](alt: B): Parser[B] = a.? map { _ getOrElse alt } + def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av } + def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv } + + def unary_- = not(a) + def & (o: Parser[_]) = and(a, o) + def - (o: Parser[_]) = sub(a, o) + def examples(s: String*): Parser[A] = examples(s.toSet) + def examples(s: Set[String]): Parser[A] = Parser.examples(a, s, check = true) + def filter(f: A => Boolean): Parser[A] = filterParser(a, f) + def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) + def flatMap[B](f: A => Parser[B]) = bindParser(a, f) + } + implicit def literalRichParser(c: Char): RichParser[Char] = richParser(c) + implicit def literalRichParser(s: String): RichParser[String] = richParser(s) + + def failure[T](msg: String): Parser[T] = Invalid(msg) + def successStrict[T](value: T): Parser[T] = success(value) + def success[T](value: => T): Parser[T] = new ValidParser[T] { + private[this] lazy val v = value + override def result = Some(v) def resultEmpty = result def derive(c: Char) = Invalid def completions = Completions.empty - override def toString = "success(" + value + ")" + override def toString = "success(" + v + ")" } - val any: Parser[Char] = 
charClass(_ => true) - - def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) - - def and[T](a: Parser[T], b: Parser[_]): Parser[T] = - if(a.valid && b.valid) new And(a, b) else Invalid - - def not(p: Parser[_]): Parser[Unit] = new Not(p) - implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] = new CharacterClass(r contains _).examples(r.map(_.toString) : _*) def chars(legal: String): Parser[Char] = @@ -237,29 +202,97 @@ object Parser } def charClass(f: Char => Boolean): Parser[Char] = new CharacterClass(f) - implicit def literal(ch: Char): Parser[Char] = new Parser[Char] { + implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] { def resultEmpty = None - def derive(c: Char) = if(c == ch) success(ch) else Invalid + def derive(c: Char) = if(c == ch) successStrict(ch) else Invalid def completions = Completions.single(Completion.suggestStrict(ch.toString)) override def toString = "'" + ch + "'" } implicit def literal(s: String): Parser[String] = stringLiteral(s, s.toList) - def stringLiteral(s: String, remaining: List[Char]): Parser[String] = - if(s.isEmpty) error("String literal cannot be empty") else if(remaining.isEmpty) success(s) else new StringLiteral(s, remaining) - object ~ { def unapply[A,B](t: (A,B)): Some[(A,B)] = Some(t) } + + // intended to be temporary pending proper error feedback + def result[T](p: Parser[T], s: String): Either[(String,Int), T] = + { + /* def loop(i: Int, a: Parser[T]): Either[(String,Int), T] = + a.err match + { + case Some(msg) => Left((msg, i)) + case None => + val ci = i+1 + if(ci >= s.length) + a.resultEmpty.toRight(("", i)) + else + loop(ci, a derive s(ci)) + } + loop(-1, p)*/ + apply(p)(s).resultEmpty.toRight(("Parse error", 0)) + } + + def apply[T](p: Parser[T])(s: String): Parser[T] = + (p /: s)(derive1) + + def derive1[T](p: Parser[T], c: Char): Parser[T] = + if(p.valid) p.derive(c) else p + + // The x Completions.empty removes any trailing token completions where 
append.isEmpty + def completions(p: Parser[_], s: String): Completions = apply(p)(s).completions x Completions.empty + + def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = + if(a.valid) { + a.result match + { + case Some(av) => successStrict( av ) + case None => + if(check) checkMatches(a, completions.toSeq) + new Examples(a, completions) + } + } + else a + + def matched(t: Parser[_], seenReverse: List[Char] = Nil, partial: Boolean = false): Parser[String] = + if(!t.valid) + if(partial && !seenReverse.isEmpty) successStrict(seenReverse.reverse.mkString) else Invalid + else if(t.result.isEmpty) + new MatchedString(t, seenReverse, partial) + else + successStrict(seenReverse.reverse.mkString) + + def token[T](t: Parser[T]): Parser[T] = token(t, "", true) + def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false) + def token[T](t: Parser[T], seen: String, track: Boolean): Parser[T] = + if(t.valid && !t.isTokenStart) + if(t.result.isEmpty) new TokenStart(t, seen, track) else t + else + t + + def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = + if(a.valid) + if(b.valid) new HomParser(a, b) else a + else + b + + def not(p: Parser[_]): Parser[Unit] = new Not(p) + + def stringLiteral(s: String, remaining: List[Char]): Parser[String] = + if(s.isEmpty) error("String literal cannot be empty") else if(remaining.isEmpty) success(s) else new StringLiteral(s, remaining) } -private final object Invalid extends Parser[Nothing] +sealed trait ValidParser[T] extends Parser[T] +{ + final def valid = true +} +private object Invalid extends Invalid("inv") +private sealed case class Invalid(val message: String) extends Parser[Nothing] { def resultEmpty = None def derive(c: Char) = error("Invalid.") override def valid = false def completions = Completions.nil - override def toString = "inv" + override def toString = message } -private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A,B)] 
+private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[(A,B)] { def cross(ao: Option[A], bo: Option[B]): Option[(A,B)] = for(av <- ao; bv <- bo) yield (av,bv) lazy val resultEmpty = cross(a.resultEmpty, b.resultEmpty) @@ -276,44 +309,44 @@ private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[(A override def toString = "(" + a + " ~ " + b + ")" } -private final class HomParser[A](a: Parser[A], b: Parser[A]) extends Parser[A] +private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser[A] { def derive(c: Char) = (a derive c) | (b derive c) lazy val resultEmpty = a.resultEmpty orElse b.resultEmpty lazy val completions = a.completions ++ b.completions override def toString = "(" + a + " | " + b + ")" } -private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends Parser[Either[A,B]] +private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A,B]] { def derive(c: Char) = (a derive c) || (b derive c) lazy val resultEmpty = a.resultEmpty.map(Left(_)) orElse b.resultEmpty.map(Right(_)) lazy val completions = a.completions ++ b.completions override def toString = "(" + a + " || " + b + ")" } -private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends Parser[B] +private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends ValidParser[B] { lazy val resultEmpty = a.resultEmpty match { case None => None; case Some(av) => f(av).resultEmpty } - lazy val completions = { + lazy val completions = a.completions flatMap { c => apply(a)(c.append).resultEmpty match { case None => Completions.strict(Set.empty + c) case Some(av) => c x f(av).completions } } - } def derive(c: Char) = { val common = a derive c flatMap f a.resultEmpty match { - case Some(av) => common | f(av).derive(c) + case Some(av) => common | derive1(f(av), c) case None => common } } + override def isTokenStart = a.isTokenStart override def toString = "bind(" + a + ")" } 
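
The parsing model here is derivative-based: `Parser.apply` folds `derive1` over the input string, taking the derivative of the parser with respect to each character, and `resultEmpty` extracts a value once input is exhausted. A stripped-down sketch of that mechanism for a single string literal (names like `P`, `Lit`, and `run` are illustrative, not the real sbt API):

```scala
object DeriveSketch {
  sealed trait P[+T] {
    def valid: Boolean
    def derive(c: Char): P[T]      // the parser remaining after consuming c
    def resultEmpty: Option[T]     // the value if no more input arrives
  }
  case object Invalid extends P[Nothing] {
    def valid = false; def derive(c: Char) = this; def resultEmpty = None
  }
  final case class Done[T](value: T) extends P[T] {
    def valid = true; def derive(c: Char) = Invalid; def resultEmpty = Some(value)
  }
  // A string literal parser: consume the remaining characters one at a time.
  final case class Lit(full: String, rest: List[Char]) extends P[String] {
    def valid = true
    def resultEmpty = None
    def derive(c: Char) = rest match {
      case r :: tail if r == c => if (tail.isEmpty) Done(full) else Lit(full, tail)
      case _                   => Invalid
    }
  }
  // Parse by folding derive over the input, as Parser.apply does with derive1.
  def run[T](p: P[T], s: String): Option[T] =
    s.foldLeft(p)((cur, c) => if (cur.valid) cur.derive(c) else cur).resultEmpty

  def main(args: Array[String]): Unit = {
    val ab = Lit("ab", List('a', 'b'))
    assert(run(ab, "ab") == Some("ab"))
    assert(run(ab, "ax") == None)   // mismatch derives to Invalid
    assert(run(ab, "a")  == None)   // incomplete input has no empty result
  }
}
```
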
-private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B] +private final class MapParser[A,B](a: Parser[A], f: A => B) extends ValidParser[B] { lazy val resultEmpty = a.resultEmpty map f def derive(c: Char) = (a derive c) map f @@ -321,14 +354,24 @@ private final class MapParser[A,B](a: Parser[A], f: A => B) extends Parser[B] override def isTokenStart = a.isTokenStart override def toString = "map(" + a + ")" } -private final class Filter[T](p: Parser[T], f: T => Boolean) extends Parser[T] +private final class Filter[T](p: Parser[T], f: T => Boolean) extends ValidParser[T] { lazy val resultEmpty = p.resultEmpty filter f def derive(c: Char) = (p derive c) filter f lazy val completions = p.completions filterS { s => apply(p)(s).resultEmpty.filter(f).isDefined } override def toString = "filter(" + p + ")" + override def isTokenStart = p.isTokenStart } -private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean) extends Parser[T] +private final class MatchedString(delegate: Parser[_], seenReverse: List[Char], partial: Boolean) extends ValidParser[String] +{ + lazy val seen = seenReverse.reverse.mkString + def derive(c: Char) = matched(delegate derive c, c :: seenReverse, partial) + def completions = delegate.completions + def resultEmpty = if(delegate.resultEmpty.isDefined) Some(seen) else if(partial) Some(seen) else None + override def isTokenStart = delegate.isTokenStart + override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" +} +private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean) extends ValidParser[T] { def derive(c: Char) = token( delegate derive c, if(track) seen + c else seen, track) lazy val completions = @@ -344,27 +387,31 @@ private final class TokenStart[T](delegate: Parser[T], seen: String, track: Bool override def isTokenStart = true override def toString = "token('" + seen + "', " + track + ", " + delegate + ")" } -private final class And[T](a: 
Parser[T], b: Parser[_]) extends Parser[T] +private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] { def derive(c: Char) = (a derive c) & (b derive c) lazy val completions = a.completions.filterS(s => apply(b)(s).resultEmpty.isDefined ) lazy val resultEmpty = if(b.resultEmpty.isDefined) a.resultEmpty else None } -private final class Not(delegate: Parser[_]) extends Parser[Unit] +private final class Not(delegate: Parser[_]) extends ValidParser[Unit] { def derive(c: Char) = if(delegate.valid) not(delegate derive c) else this def completions = Completions.empty lazy val resultEmpty = if(delegate.resultEmpty.isDefined) None else Some(()) } -private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Parser[T] +private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends ValidParser[T] { - def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x.tail }) - def resultEmpty = delegate.resultEmpty - lazy val completions = if(fixed.isEmpty) Completions.empty else Completions(fixed map(f => Completion.suggestion(f)) ) + def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x substring 1 }) + lazy val resultEmpty = delegate.resultEmpty + lazy val completions = + if(fixed.isEmpty) + if(resultEmpty.isEmpty) Completions.nil else Completions.empty + else + Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } -private final class StringLiteral(str: String, remaining: List[Char]) extends Parser[String] +private final class StringLiteral(str: String, remaining: List[Char]) extends ValidParser[String] { assert(str.length > 0 && !remaining.isEmpty) def resultEmpty = None @@ -372,21 +419,21 @@ private final class StringLiteral(str: String, remaining: List[Char]) extends Pa lazy val completions = 
Completions.single(Completion.suggestion(remaining.mkString)) override def toString = '"' + str + '"' } -private final class CharacterClass(f: Char => Boolean) extends Parser[Char] +private final class CharacterClass(f: Char => Boolean) extends ValidParser[Char] { def resultEmpty = None - def derive(c: Char) = if( f(c) ) success(c) else Invalid + def derive(c: Char) = if( f(c) ) successStrict(c) else Invalid def completions = Completions.empty override def toString = "class()" } -private final class Optional[T](delegate: Parser[T]) extends Parser[Option[T]] +private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] { def resultEmpty = Some(None) def derive(c: Char) = (delegate derive c).map(Some(_)) lazy val completions = Completion.empty +: delegate.completions override def toString = delegate.toString + "?" } -private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends Parser[Seq[T]] +private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends ValidParser[Seq[T]] { assume(0 <= min, "Minimum occurences must be non-negative") assume(max >= min, "Minimum occurences must be less than the maximum occurences") diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala new file mode 100644 index 000000000..53d47e27d --- /dev/null +++ b/util/complete/Parsers.scala @@ -0,0 +1,53 @@ +/* sbt -- Simple Build Tool + * Copyright 2011 Mark Harrah + */ +package sbt.complete + + import Parser._ + import java.io.File + import java.net.URI + import java.lang.Character.{getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL} + +// Some predefined parsers +trait Parsers +{ + lazy val any: Parser[Char] = charClass(_ => true) + + lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") + lazy val Digit = charClass(_.isDigit) 
examples DigitSet + lazy val Letter = charClass(_.isLetter) + def IDStart = Letter + lazy val IDChar = charClass(isIDChar) + lazy val ID = IDStart ~ IDChar.* map { case x ~ xs => (x +: xs).mkString } + lazy val OpChar = charClass(isOpChar) + lazy val Op = OpChar.+.string + lazy val OpOrID = ID | Op + + def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) + def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } + def isIDChar(c: Char) = c.isLetterOrDigit || c == '-' || c == '_' + def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */',' | '.' => true ; case _ => false } + + lazy val NotSpaceClass = charClass(!_.isWhitespace) + lazy val SpaceClass = charClass(_.isWhitespace) + lazy val NotSpace = NotSpaceClass.+.string + lazy val Space = SpaceClass.+.examples(" ") + lazy val OptSpace = SpaceClass.*.examples(" ") + + // TODO: implement + def fileParser(base: File): Parser[File] = token(mapOrFail(NotSpace)(s => new File(s.mkString)), "") + + lazy val Port = token(IntBasic, "") + lazy val IntBasic = mapOrFail( '-'.? 
~ Digit.+ )( Function.tupled(toInt) ) + private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = + (neg.toSeq ++ digits).mkString.toInt + + def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = + p flatMap { s => try { successStrict(f(s)) } catch { case e: Exception => failure(e.toString) } } + + def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* + + def Uri(ex: Set[URI]) = NotSpace map { uri => new URI(uri) } examples(ex.map(_.toString)) +} +object Parsers extends Parsers +object DefaultParsers extends Parsers with ParserMain \ No newline at end of file diff --git a/util/complete/UpperBound.scala b/util/complete/UpperBound.scala index 9427070f7..ba1a69ef9 100644 --- a/util/complete/UpperBound.scala +++ b/util/complete/UpperBound.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008,2010 Mark Harrah */ -package sbt.parse +package sbt.complete sealed trait UpperBound { diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 257171016..244982f43 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -1,4 +1,4 @@ -package sbt.parse +package sbt.complete import Parser._ import org.scalacheck._ @@ -25,14 +25,14 @@ object JLineTest } object ParserTest extends Properties("Completing Parser") { - val wsc = charClass(_.isWhitespace) - val ws = ( wsc + ) examples(" ") - val optWs = ( wsc * ) examples("") + import Parsers._ val nested = (token("a1") ~ token("b2")) ~ "c3" val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3" - def p[T](f: T): T = { /*println(f);*/ f } + val spacePort = (token(Space) ~> Port) + + def p[T](f: T): T = { println(f); f } def checkSingle(in: String, expect: Completion)(expectDisplay: Completion = expect) = ( ("token '" + in + "'") |: checkOne(in, nested, expect)) && @@ -56,6 +56,13 @@ object ParserTest extends Properties("Completing Parser") 
property("nested tokens c") = checkSingle("a1b2", Completion.suggestStrict("c3") )() property("nested tokens c3") = checkSingle("a1b2c", Completion.suggestStrict("3"))() property("nested tokens c inv") = checkInvalid("a1b2a") + + property("suggest space") = checkOne("", spacePort, Completion.tokenStrict("", " ")) + property("suggest port") = checkOne(" ", spacePort, Completion.displayStrict("") ) + property("no suggest at end") = checkOne("asdf", "asdf", Completion.suggestStrict("")) + property("no suggest at token end") = checkOne("asdf", token("asdf"), Completion.suggestStrict("")) + property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestStrict("")) + property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestStrict("")) } object ParserExample { From 092c012b0bf6f4b39871ceeff52e0cbebf7ee3f3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 22 Jan 2011 15:01:10 -0500 Subject: [PATCH 111/823] make explicit the separation between parsing and execution Parser[() => State] instead of Parser[State] --- util/complete/Parser.scala | 30 ++++++++++++++---------------- util/complete/Parsers.scala | 2 +- 2 files changed, 15 insertions(+), 17 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index cb94bee85..5adfa222d 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -87,7 +87,7 @@ object Parser extends ParserMain if(a.valid) { a.result match { - case Some(av) => if( f(av) ) successStrict( av ) else Invalid + case Some(av) => if( f(av) ) success( av ) else Invalid case None => new Filter(a, f) } } @@ -96,7 +96,7 @@ object Parser extends ParserMain def seqParser[A,B](a: Parser[A], b: Parser[B]): Parser[(A,B)] = if(a.valid && b.valid) (a.result, b.result) match { - case (Some(av), Some(bv)) => successStrict( (av, bv) ) + case (Some(av), Some(bv)) => success( (av, bv) ) case (Some(av), None) => b map { bv 
=> (av, bv) } case (None, Some(bv)) => a map { av => (av, bv) } case (None, None) => new SeqParser(a,b) @@ -110,7 +110,7 @@ object Parser extends ParserMain b.map( Right(_) ) def opt[T](a: Parser[T]): Parser[Option[T]] = - if(a.valid) new Optional(a) else successStrict(None) + if(a.valid) new Optional(a) else success(None) def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) @@ -126,8 +126,8 @@ object Parser extends ParserMain if(repeated.valid) repeated.result match { - case Some(value) => successStrict(revAcc reverse_::: value :: Nil) // revAcc should be Nil here - case None => if(max.isZero) successStrict(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) + case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here + case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) } else if(min == 0) invalidButOptional @@ -144,7 +144,7 @@ object Parser extends ParserMain case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse)) } else Invalid - case None => checkRepeated(successStrict(Nil)) + case None => checkRepeated(success(Nil)) } } @@ -183,14 +183,12 @@ trait ParserMain implicit def literalRichParser(s: String): RichParser[String] = richParser(s) def failure[T](msg: String): Parser[T] = Invalid(msg) - def successStrict[T](value: T): Parser[T] = success(value) - def success[T](value: => T): Parser[T] = new ValidParser[T] { - private[this] lazy val v = value - override def result = Some(v) + def success[T](value: T): Parser[T] = new ValidParser[T] { + override def result = Some(value) def resultEmpty = result def derive(c: Char) = Invalid def completions = Completions.empty - override def toString = "success(" + v + ")" + override def toString = "success(" + value + ")" } implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] = @@ -204,7 +202,7 @@ 
trait ParserMain implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] { def resultEmpty = None - def derive(c: Char) = if(c == ch) successStrict(ch) else Invalid + def derive(c: Char) = if(c == ch) success(ch) else Invalid def completions = Completions.single(Completion.suggestStrict(ch.toString)) override def toString = "'" + ch + "'" } @@ -244,7 +242,7 @@ trait ParserMain if(a.valid) { a.result match { - case Some(av) => successStrict( av ) + case Some(av) => success( av ) case None => if(check) checkMatches(a, completions.toSeq) new Examples(a, completions) @@ -254,11 +252,11 @@ trait ParserMain def matched(t: Parser[_], seenReverse: List[Char] = Nil, partial: Boolean = false): Parser[String] = if(!t.valid) - if(partial && !seenReverse.isEmpty) successStrict(seenReverse.reverse.mkString) else Invalid + if(partial && !seenReverse.isEmpty) success(seenReverse.reverse.mkString) else Invalid else if(t.result.isEmpty) new MatchedString(t, seenReverse, partial) else - successStrict(seenReverse.reverse.mkString) + success(seenReverse.reverse.mkString) def token[T](t: Parser[T]): Parser[T] = token(t, "", true) def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false) @@ -422,7 +420,7 @@ private final class StringLiteral(str: String, remaining: List[Char]) extends Va private final class CharacterClass(f: Char => Boolean) extends ValidParser[Char] { def resultEmpty = None - def derive(c: Char) = if( f(c) ) successStrict(c) else Invalid + def derive(c: Char) = if( f(c) ) success(c) else Invalid def completions = Completions.empty override def toString = "class()" } diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 53d47e27d..a14425e11 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -43,7 +43,7 @@ trait Parsers (neg.toSeq ++ digits).mkString.toInt def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = - p flatMap { s => try { successStrict(f(s)) } catch { case e: 
Exception => failure(e.toString) } } + p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* From 5f9c6f7f26be9f0b29ab38264b38d8d742ea151e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 23 Jan 2011 22:34:17 -0500 Subject: [PATCH 112/823] improve commands, proper build/project base resolution finish alias support better project printing in 'projects' completion support for 'help' resolve URIs in ProjectRef against base URI of defining build in keys and project relations resolve base directories and record build URI in BuildUnit preserve relative paths in File to URI conversion for later resolution --- util/collection/Settings.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index cd94f012a..c779274e6 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -99,11 +99,11 @@ trait Init[Scope] def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = { val md = memoDelegates(delegates) - def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = mapReferenced(sMap, k, md(k.scope), refKey) } + def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = delegateForKey(sMap, k, md(k.scope), refKey) } val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = ks.map{ s => s mapReferenced refMap(s.key) } } sMap mapValues f } - private[this] def mapReferenced[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_]): ScopedKey[T] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_]): ScopedKey[T] = { val scache = PMap.empty[ScopedKey, ScopedKey] def resolve(search: Seq[Scope]): ScopedKey[T] = From 
ba9c2c0e148b61d2537b0026577852371f33af23 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 24 Jan 2011 18:08:43 -0500 Subject: [PATCH 113/823] cleanup and fixes --- util/complete/Parsers.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index a14425e11..9679ac48a 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -45,8 +45,9 @@ trait Parsers def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } - def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* + def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* <~ SpaceClass.* + def trimmed(p: Parser[String]) = p map { _.trim } def Uri(ex: Set[URI]) = NotSpace map { uri => new URI(uri) } examples(ex.map(_.toString)) } object Parsers extends Parsers From d906455aedfc0f6184c8d07a3f9a63f8766bf0ba Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 25 Jan 2011 22:14:02 -0500 Subject: [PATCH 114/823] split load-time project structure scope resolution into two phases first phase resolves referenced build URIs as each build is loaded second phase resolves references without an explicit project ID (this requires the whole structure to be known and this isn't available during the first phase) setting resolution is unchanged (done after both phases) --- util/collection/Settings.scala | 17 ++++++----------- 1 file changed, 6 insertions(+), 11 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index c779274e6..6c9532a0d 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -4,8 +4,6 @@ package sbt import Types._ - import annotation.tailrec - import collection.mutable sealed trait Settings[Scope] { @@ -98,9 +96,12 @@ trait Init[Scope] def delegate(sMap: 
ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = { - val md = memoDelegates(delegates) - def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = delegateForKey(sMap, k, md(k.scope), refKey) } - val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = ks.map{ s => s mapReferenced refMap(s.key) } } + def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = + delegateForKey(sMap, k, delegates(k.scope), refKey) + } + val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = + ks.map{ s => s mapReferenced refMap(s.key) } + } sMap mapValues f } private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_]): ScopedKey[T] = @@ -121,12 +122,6 @@ trait Init[Scope] private[this] def applyInits(ordered: Seq[Compiled])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = (empty /: ordered){ (m, comp) => comp.eval(m) } - private[this] def memoDelegates(implicit delegates: Scope => Seq[Scope]): Scope => Seq[Scope] = - { - val dcache = new mutable.HashMap[Scope, Seq[Scope]] - (scope: Scope) => dcache.getOrElseUpdate(scope, delegates(scope)) - } - private[this] def applySetting[T](map: Settings[Scope], setting: Setting[T]): Settings[Scope] = { def execK[HL <: HList, M[_]](a: KApply[HL, M, T]) = From 1be53be310ce74d9a01a4b00a8e2edd87bc05dcc Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 25 Jan 2011 22:18:18 -0500 Subject: [PATCH 115/823] make Uri parser fail (instead of error) on malformed inputs --- util/complete/Parsers.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 9679ac48a..2b8957a54 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -48,7 +48,7 @@ trait Parsers def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* <~ 
SpaceClass.* def trimmed(p: Parser[String]) = p map { _.trim } - def Uri(ex: Set[URI]) = NotSpace map { uri => new URI(uri) } examples(ex.map(_.toString)) + def Uri(ex: Set[URI]) = mapOrFail(NotSpace)( uri => new URI(uri)) examples(ex.map(_.toString)) } object Parsers extends Parsers object DefaultParsers extends Parsers with ParserMain \ No newline at end of file From e41d4cc8c8bf49558c5adf9d8598e3f138d82269 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 25 Jan 2011 22:19:03 -0500 Subject: [PATCH 116/823] convenience method on Settings to get all (Scope, AttributeKey[_]) pairs --- util/collection/Settings.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 6c9532a0d..6ef6964c8 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -10,6 +10,7 @@ sealed trait Settings[Scope] def data: Map[Scope, AttributeMap] def keys(scope: Scope): Set[AttributeKey[_]] def scopes: Set[Scope] + def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] def get[T](scope: Scope, key: AttributeKey[T]): Option[T] def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] } @@ -18,6 +19,7 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del { def scopes: Set[Scope] = data.keySet.toSet def keys(scope: Scope) = data(scope).keys.toSet + def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq; def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = delegates(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption From 2f4169026972ee82cbcf6055bf886c6e2e1bb00f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 28 Jan 2011 21:07:29 -0500 Subject: [PATCH 117/823] KList updates - exchange variance annotations on KList for a bounded existential in KList.toList - add foldr (reason for dropping variance annotations) - add functions stating equivalence 
between + KList[M,H :+: T] and KCons[H,T,M] + KList[M,HNil] and KNil --- util/collection/KList.scala | 21 +++++++++++++++++++-- 1 file changed, 19 insertions(+), 2 deletions(-) diff --git a/util/collection/KList.scala b/util/collection/KList.scala index aa9662917..8035a4f2f 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -9,7 +9,7 @@ import Types._ * type parameters HL. The underlying data is M applied to each type parameter. * Explicitly tracking M[_] allows performing natural transformations or ensuring * all data conforms to some common type. */ -sealed trait KList[+M[_], +HL <: HList] +sealed trait KList[+M[_], HL <: HList] { type Raw = HL /** Transform to the underlying HList type.*/ @@ -20,6 +20,13 @@ sealed trait KList[+M[_], +HL <: HList] def toList: List[M[_]] /** Convert to an HList. */ def combine[N[X] >: M[X]]: HL#Wrap[N] + + def foldr[P[_ <: HList],N[X] >: M[X]](f: KFold[N,P]): P[HL] +} +trait KFold[M[_],P[_ <: HList]] +{ + def kcons[H,T <: HList](h: M[H], acc: P[T]): P[H :+: T] + def knil: P[HNil] } final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] @@ -33,6 +40,8 @@ final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) exten def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) override def toString = head + " :^: " + tail.toString + + def foldr[P[_ <: HList],N[X] >: M[X]](f: KFold[N,P]) = f.kcons(head, tail foldr f) } sealed class KNil extends KList[Nothing, HNil] @@ -42,6 +51,7 @@ sealed class KNil extends KList[Nothing, HNil] def :^: [M[_], H](h: M[H]) = KCons(h, this) def toList = Nil def combine[N[X]] = HNil + override def foldr[P[_ <: HList],N[_]](f: KFold[N,P]) = f.knil override def toString = "KNil" } object KNil extends KNil @@ -51,5 +61,12 @@ object KList // nicer alias for pattern matching val :^: = KCons - def fromList[M[_]](s: Seq[M[_]]): KList[M, HList] = if(s.isEmpty) KNil else KCons(s.head, fromList(s.tail)) + def 
fromList[M[_]](s: Seq[M[_]]): KList[M, _ <: HList] = if(s.isEmpty) KNil else KCons(s.head, fromList(s.tail)) + + // haven't found a way to convince scalac that KList[M, H :+: T] implies KCons[H,T,M] + // Therefore, this method exists to put the cast in one location. + implicit def kcons[H, T <: HList, M[_]](kl: KList[M, H :+: T]): KCons[H,T,M] = + kl.asInstanceOf[KCons[H,T,M]] + // haven't needed this, but for symmetry with kcons: + implicit def knil[M[_]](kl: KList[M, HNil]): KNil = KNil } From 536e95cca511770abcf6e21da912b3a93ecdd6d2 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 31 Jan 2011 18:16:25 -0500 Subject: [PATCH 118/823] translate Uninitialized message to use 'display' --- util/collection/Settings.scala | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 6ef6964c8..a7a58526b 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -111,7 +111,7 @@ trait Init[Scope] val scache = PMap.empty[ScopedKey, ScopedKey] def resolve(search: Seq[Scope]): ScopedKey[T] = search match { - case Seq() => throw new Uninitialized(k) + case Seq() => throw Uninitialized(k, refKey) case Seq(x, xs @ _*) => val sk = ScopedKey(x, k.key) scache.getOrUpdate(sk, if(defines(sMap, sk, refKey)) sk else resolve(xs)) @@ -137,7 +137,9 @@ trait Init[Scope] } } - final class Uninitialized(key: ScopedKey[_]) extends Exception("Update on uninitialized setting " + key.key.label + " (in " + key.scope + ")") + final class Uninitialized(val key: ScopedKey[_], val refKey: ScopedKey[_], msg: String) extends Exception(msg) + def Uninitialized(key: ScopedKey[_], refKey: ScopedKey[_]): Uninitialized = + new Uninitialized(key, refKey, "Reference to uninitialized setting " + key.key.label + " (in " + key.scope + ") from " + refKey.key.label +" (in " + refKey.scope + ")") final class Compiled(val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope])
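PATCH 118 above improves the `Uninitialized` error by carrying the referring key (`refKey`) alongside the missing one, so the message names both ends of the broken reference. The delegation walk it guards can be sketched as a toy model; string scopes and a plain `Set` of definitions stand in for sbt's real `Scope`/`ScopedKey` types, so this is illustrative only:

```scala
// Toy model of the delegation search in PATCH 118: walk the delegate
// chain of a scope, return the first scope that defines the key, and
// report both the missing key and the referring key on failure.
object DelegateSketch {
  final class Uninitialized(key: String, refKey: String)
    extends Exception("Reference to uninitialized setting " + key + " from " + refKey)

  def resolve(defined: Set[(String, String)],      // (scope, key) pairs with a definition
              delegates: String => Seq[String],    // delegate chain per scope
              scope: String, key: String, refKey: String): String =
    delegates(scope).find(s => defined((s, key))) match {
      case Some(s) => s
      case None    => throw new Uninitialized(key, refKey)
    }

  def main(args: Array[String]): Unit = {
    // "project" delegates to itself first, then to the global scope
    val chain = Map("project" -> Seq("project", "global"), "global" -> Seq("global"))
    val defined = Set(("global", "version"))
    // "version" is not set in "project", so the lookup falls back to "global"
    assert(resolve(defined, chain, "project", "version", "publish") == "global")
  }
}
```

The real implementation additionally memoizes resolved keys (`scache`) because the same scoped key is resolved once per reference.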
sealed trait Setting[T] From 8183b717dd90ebeddb1797d01b66173baf88cc69 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 2 Feb 2011 22:56:11 -0500 Subject: [PATCH 119/823] session manipulation commands save, clear, list, and remove session settings --- util/complete/Parsers.scala | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 2b8957a54..f8dee3aaa 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -39,9 +39,13 @@ trait Parsers lazy val Port = token(IntBasic, "") lazy val IntBasic = mapOrFail( '-'.? ~ Digit.+ )( Function.tupled(toInt) ) + lazy val NatBasic = mapOrFail( Digit.+ )( _.mkString.toInt ) private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = (neg.toSeq ++ digits).mkString.toInt + def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = + (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } + def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } From 6688c4fdf13ae9352ae6d3d09723b3717020f1ac Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 4 Feb 2011 22:02:39 -0500 Subject: [PATCH 120/823] improve Setting construction - make all constructing methods end in = for lowest precedence - rename Scope constructing method 'apply' to 'in' to allow 'apply' to be used on single settings as well as tuples and 'in' reads better --- util/collection/Attributes.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 0d9e9ade1..ba3f3293b 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -65,5 +65,6 @@ final case class Attributed[D](data: D)(val metadata: AttributeMap) } object Attributed { + implicit def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank implicit def blank[T](data: T): Attributed[T] = 
Attributed(data)(AttributeMap.empty) } \ No newline at end of file From 80ae202965f67d2619e0b215f3c542de222ee83a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 5 Feb 2011 21:39:34 -0500 Subject: [PATCH 121/823] overhaul Streams injection --- util/collection/Settings.scala | 8 ++++++-- 1 file changed, 6 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index a7a58526b..8d5e37f12 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -64,10 +64,11 @@ trait Init[Scope] def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key).get def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) - def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = + def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopedKey[_] => Seq[Setting[_]]): Settings[Scope] = { + val withLocal = addLocal(init)(scopeLocal) // group by Scope/Key, dropping dead initializations - val sMap: ScopedMap = grouped(init) + val sMap: ScopedMap = grouped(withLocal) // delegate references to undefined values according to 'delegates' val dMap: ScopedMap = delegate(sMap)(delegates) // merge Seq[Setting[_]] into Compiled @@ -95,6 +96,9 @@ trait Init[Scope] def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] = if(s.definitive) s :: Nil else ss :+ s + + def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopedKey[_] => Seq[Setting[_]]): Seq[Setting[_]] = + init.flatMap( _.dependsOn flatMap scopeLocal ) ++ init def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = { From b503716e3822423049213e2b4cef79f5a9683ec1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 6 Feb 2011 11:33:56 -0500 Subject: [PATCH 122/823] 'get' now shows defining scope, related definitions, dependencies --- util/collection/Settings.scala | 12 ++++++++++-- 1 file changed, 10 insertions(+), 2 
deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 8d5e37f12..b1b2b6531 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -10,6 +10,7 @@ sealed trait Settings[Scope] def data: Map[Scope, AttributeMap] def keys(scope: Scope): Set[AttributeKey[_]] def scopes: Set[Scope] + def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] def get[T](scope: Scope, key: AttributeKey[T]): Option[T] def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] @@ -23,6 +24,8 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = delegates(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption + def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] = + delegates(scope).toStream.filter(sc => scopeLocal(sc, key).isDefined ).headOption private def scopeLocal[T](scope: Scope, key: AttributeKey[T]): Option[T] = (data get scope).flatMap(_ get key) @@ -64,15 +67,20 @@ trait Init[Scope] def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key).get def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) - def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopedKey[_] => Seq[Setting[_]]): Settings[Scope] = + def compiled(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopedKey[_] => Seq[Setting[_]]): CompiledMap = { + // prepend per-scope settings val withLocal = addLocal(init)(scopeLocal) // group by Scope/Key, dropping dead initializations val sMap: ScopedMap = grouped(withLocal) // delegate references to undefined values according to 'delegates' val dMap: ScopedMap = delegate(sMap)(delegates) // merge Seq[Setting[_]] into Compiled - val cMap: CompiledMap = compile(dMap) + compile(dMap) + } + def make(init: 
Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopedKey[_] => Seq[Setting[_]]): Settings[Scope] = + { + val cMap = compiled(init)(delegates, scopeLocal) // order the initializations. cyclic references are detected here. val ordered: Seq[Compiled] = sort(cMap) // evaluation: apply the initializations. From c54d412e66a24c3929030e1860ba9eb5d75acb5e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 8 Feb 2011 20:33:34 -0500 Subject: [PATCH 123/823] some more example tab completion combinators --- util/complete/src/test/scala/ParserTest.scala | 25 +++++++++++++++---- 1 file changed, 20 insertions(+), 5 deletions(-) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 244982f43..111f2cbb6 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -1,28 +1,43 @@ package sbt.complete - import Parser._ - import org.scalacheck._ - object JLineTest { + import DefaultParsers._ + + val one = "blue" | "green" | "black" + val two = token("color" ~> Space) ~> token(one) + val three = token("color" ~> Space) ~> token(ID.examples("blue", "green", "black")) + val four = token("color" ~> Space) ~> token(ID, "") + + val num = token(NatBasic) + val five = (num ~ token("+" | "-") ~ num) <~ token('=') flatMap { + case a ~ "+" ~ b => token((a+b).toString) + case a ~ "-" ~ b => token((a-b).toString) + } + + val parsers = Map("1" -> one, "2" -> two, "3" -> three, "4" -> four, "5" -> five) def main(args: Array[String]) { import jline.{ConsoleReader,Terminal} val reader = new ConsoleReader() Terminal.getTerminal.disableEcho() - val parser = ParserExample.t + val parser = parsers(args(0)) JLineCompletion.installCustomCompletor(reader, parser) def loop() { val line = reader.readLine("> ") if(line ne null) { - println("Entered '" + line + "'") + println("Result: " + apply(parser)(line).resultEmpty) loop() } } loop() } } + + import Parser._ + import 
org.scalacheck._ + object ParserTest extends Properties("Completing Parser") { import Parsers._ From 86d82141a30e65f2070e162f9392322017cec8b0 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 11 Feb 2011 20:22:17 -0500 Subject: [PATCH 124/823] cleanup/rework related to Settings/InputParser - drop fillThis: handle in injectStreams instead - simplify InputParser construction (at the expense of implementation simplicity) - split out ScopeKey/initialization parts of Setting with separate Initialize trait + makes Apply obsolete + makes the Initialize trait properly composable + this allowed splitting the InputParser definition into an Initialize for parsing and one for the action - implement test-only - inject resolved scope --- util/collection/Settings.scala | 94 +++++++++++++++++++--------------- 1 file changed, 53 insertions(+), 41 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index b1b2b6531..ef8867c60 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -47,17 +47,19 @@ trait Init[Scope] type ScopedMap = IMap[ScopedKey, SettingSeq] type CompiledMap = Map[ScopedKey[_], Compiled] type MapScoped = ScopedKey ~> ScopedKey + type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] - def value[T](key: ScopedKey[T])(value: => T): Setting[T] = new Value(key, value _) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = app(key, key :^: KNil)(h => f(h.head)) - def app[HL <: HList, T](key: ScopedKey[T], inputs: KList[ScopedKey, HL])(f: HL => T): Setting[T] = new Apply(key, f, inputs) - def uniform[S,T](key: ScopedKey[T], inputs: Seq[ScopedKey[S]])(f: Seq[S] => T): Setting[T] = new Uniform(key, f, inputs) - def kapp[HL <: HList, M[_], T](key: ScopedKey[T], inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL])(f: KList[M, HL] => T): Setting[T] = new KApply[HL, M, T](key, f, inputs) + def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) + def value[T](value: 
=> T): Initialize[T] = new Value(value _) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) + def app[HL <: HList, T](inputs: KList[ScopedKey, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) + def uniform[S,T](inputs: Seq[ScopedKey[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) + def kapp[HL <: HList, M[_], T](inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) // the following is a temporary workaround for the "... cannot be instantiated from ..." bug, which renders 'kapp' above unusable outside this source file class KApp[HL <: HList, M[_], T] { type Composed[S] = ScopedKey[M[S]] - def apply(key: ScopedKey[T], inputs: KList[Composed, HL])(f: KList[M, HL] => T): Setting[T] = new KApply[HL, M, T](key, f, inputs) + def apply(inputs: KList[Composed, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) } def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates) @@ -67,7 +69,7 @@ trait Init[Scope] def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key).get def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) - def compiled(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopedKey[_] => Seq[Setting[_]]): CompiledMap = + def compiled(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): CompiledMap = { // prepend per-scope settings val withLocal = addLocal(init)(scopeLocal) @@ -78,7 +80,7 @@ trait Init[Scope] // merge Seq[Setting[_]] into Compiled compile(dMap) } - def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopedKey[_] => Seq[Setting[_]]): Settings[Scope] = + def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Settings[Scope] = { val cMap = 
compiled(init)(delegates, scopeLocal) // order the initializations. cyclic references are detected here. @@ -105,7 +107,7 @@ trait Init[Scope] def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] = if(s.definitive) s :: Nil else ss :+ s - def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopedKey[_] => Seq[Setting[_]]): Seq[Setting[_]] = + def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): Seq[Setting[_]] = init.flatMap( _.dependsOn flatMap scopeLocal ) ++ init def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = @@ -138,15 +140,9 @@ trait Init[Scope] private[this] def applySetting[T](map: Settings[Scope], setting: Setting[T]): Settings[Scope] = { - def execK[HL <: HList, M[_]](a: KApply[HL, M, T]) = - map.set(a.key.scope, a.key.key, a.f(a.inputs.transform[M]( nestCon[ScopedKey, Id, M](asTransform(map)) )) ) - setting match - { - case s: Value[T] => map.set(s.key.scope, s.key.key, s.value()) - case u: Uniform[s, T] => map.set(u.key.scope, u.key.key, u.f(u.inputs map asFunction(map)) ) - case a: Apply[hl, T] => map.set(a.key.scope, a.key.key, a.f(a.inputs down asTransform(map) ) ) - case ka: KApply[hl, m, T] => execK[hl, m](ka) // separate method needed to workaround bug where m is not recognized as higher-kinded in inline version - } + val value = setting.init.get(map) + val key = setting.key + map.set(key.scope, key.key, value) } final class Uninitialized(val key: ScopedKey[_], val refKey: ScopedKey[_], msg: String) extends Exception(msg) @@ -154,42 +150,58 @@ trait Init[Scope] new Uninitialized(key, refKey, "Reference to uninitialized setting " + key.key.label + " (in " + key.scope + ") from " + refKey.key.label +" (in " + refKey.scope + ")") final class Compiled(val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) - sealed trait Setting[T] + sealed trait Initialize[T] { - def key: ScopedKey[T] - def definitive: Boolean def dependsOn: Seq[ScopedKey[_]] - def 
mapReferenced(g: MapScoped): Setting[T] - def mapKey(g: MapScoped): Setting[T] + def map[S](g: T => S): Initialize[S] + def mapReferenced(g: MapScoped): Initialize[T] + def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipWith(o)((x,y) => (x,y)) + def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = new Joined[T,S,U](this, o, f) + def get(map: Settings[Scope]): T } - private[this] final class Value[T](val key: ScopedKey[T], val value: () => T) extends Setting[T] + final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) + { + def definitive: Boolean = !init.dependsOn.contains(key) + def dependsOn: Seq[ScopedKey[_]] = remove(init.dependsOn, key) + def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) + def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) + } + + private[this] final class Joined[S,T,U](a: Initialize[S], b: Initialize[T], f: (S,T) => U) extends Initialize[U] + { + def dependsOn = a.dependsOn ++ b.dependsOn + def mapReferenced(g: MapScoped) = new Joined(a mapReferenced g, b mapReferenced g, f) + def map[Z](g: U => Z) = new Joined[S,T,Z](a, b, (s,t) => g(f(s,t))) + def get(map: Settings[Scope]): U = f(a get map, b get map) + } + private[this] final class Value[T](value: () => T) extends Initialize[T] { - def definitive = true def dependsOn = Nil def mapReferenced(g: MapScoped) = this - def mapKey(g: MapScoped): Setting[T] = new Value(g(key), value) + def map[S](g: T => S) = new Value[S](() => g(value())) + def get(map: Settings[Scope]): T = value() } - private[this] final class Apply[HL <: HList, T](val key: ScopedKey[T], val f: HL => T, val inputs: KList[ScopedKey, HL]) extends Setting[T] + private[this] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[ScopedKey, HL]) extends Initialize[T] { - def definitive = !inputs.toList.contains(key) - def dependsOn = remove(inputs.toList, key) - def mapReferenced(g: MapScoped) = new Apply(key, f, inputs transform g) - def 
mapKey(g: MapScoped): Setting[T] = new Apply(g(key), f, inputs) + def dependsOn = inputs.toList + def mapReferenced(g: MapScoped) = new Apply(f, inputs transform g) + def map[S](g: T => S) = new Apply(g compose f, inputs) + def get(map: Settings[Scope]) = f(inputs down asTransform(map) ) } - private[this] final class KApply[HL <: HList, M[_], T](val key: ScopedKey[T], val f: KList[M, HL] => T, val inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL]) extends Setting[T] + private[this] final class KApply[HL <: HList, M[_], T](val f: KList[M, HL] => T, val inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL]) extends Initialize[T] { - def definitive = !inputs.toList.contains(key) - def dependsOn = remove(unnest(inputs.toList), key) - def mapReferenced(g: MapScoped) = new KApply[HL, M, T](key, f, inputs.transform[({type l[t] = ScopedKey[M[t]]})#l]( nestCon(g) ) ) - def mapKey(g: MapScoped): Setting[T] = new KApply[HL, M, T](g(key), f, inputs) + def dependsOn = unnest(inputs.toList) + def mapReferenced(g: MapScoped) = new KApply[HL, M, T](f, inputs.transform[({type l[t] = ScopedKey[M[t]]})#l]( nestCon(g) ) ) + def map[S](g: T => S) = new KApply[HL, M, S](g compose f, inputs) + def get(map: Settings[Scope]) = f(inputs.transform[M]( nestCon[ScopedKey, Id, M](asTransform(map)) )) private[this] def unnest(l: List[ScopedKey[M[T]] forSome { type T }]): List[ScopedKey[_]] = l.asInstanceOf[List[ScopedKey[_]]] } - private[this] final class Uniform[S, T](val key: ScopedKey[T], val f: Seq[S] => T, val inputs: Seq[ScopedKey[S]]) extends Setting[T] + private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[ScopedKey[S]]) extends Initialize[T] { - def definitive = !inputs.contains(key) - def dependsOn = remove(inputs, key) - def mapReferenced(g: MapScoped) = new Uniform(key, f, inputs map g.fn[S]) - def mapKey(g: MapScoped): Setting[T] = new Uniform(g(key), f, inputs) + def dependsOn = inputs + def mapReferenced(g: MapScoped) = new Uniform(f, inputs map g.fn[S]) + 
def map[S](g: T => S) = new Uniform(g compose f, inputs) + def get(map: Settings[Scope]) = f(inputs map asFunction(map)) } private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) } From e6dcca1b42ea879beb43b4d0234d6d65a6369b9f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 14 Feb 2011 18:59:54 -0500 Subject: [PATCH 125/823] handle constant types --- interface/type | 3 +++ 1 file changed, 3 insertions(+) diff --git a/interface/type b/interface/type index a605f4cd4..ae24f5cc1 100644 --- a/interface/type +++ b/interface/type @@ -12,6 +12,9 @@ Type Parameterized baseType : SimpleType typeArguments: Type* + Constant + baseType: Type + value: String Annotated baseType : SimpleType annotations : Annotation* From a6df926d412181e20faf0111e0a649bb761b8c6f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 15 Feb 2011 18:43:44 -0500 Subject: [PATCH 126/823] fix issue with updating a non-definitive setting --- util/collection/Settings.scala | 21 +++++++++++---------- 1 file changed, 11 insertions(+), 10 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index ef8867c60..e6b574859 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -93,9 +93,9 @@ trait Init[Scope] def compile(sMap: ScopedMap): CompiledMap = sMap.toSeq.map { case (k, ss) => - val deps = ss flatMap { _.dependsOn } + val deps = ss flatMap { _.dependsOn } toSet; val eval = (settings: Settings[Scope]) => (settings /: ss)(applySetting) - (k, new Compiled(deps, eval)) + (k, new Compiled(k, deps, eval)) } toMap; def grouped(init: Seq[Setting[_]]): ScopedMap = @@ -112,15 +112,15 @@ trait Init[Scope] def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = { - def refMap(refKey: ScopedKey[_]) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = - delegateForKey(sMap, k, delegates(k.scope), refKey) + def refMap(refKey: ScopedKey[_], isFirst: Boolean) = new (ScopedKey ~> ScopedKey) { def apply[T](k: 
ScopedKey[T]) = + delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst) } val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = - ks.map{ s => s mapReferenced refMap(s.key) } + ks.zipWithIndex.map{ case (s,i) => s mapReferenced refMap(s.key, i == 0) } } sMap mapValues f } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_]): ScopedKey[T] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean): ScopedKey[T] = { val scache = PMap.empty[ScopedKey, ScopedKey] def resolve(search: Seq[Scope]): ScopedKey[T] = @@ -128,12 +128,12 @@ trait Init[Scope] case Seq() => throw Uninitialized(k, refKey) case Seq(x, xs @ _*) => val sk = ScopedKey(x, k.key) - scache.getOrUpdate(sk, if(defines(sMap, sk, refKey)) sk else resolve(xs)) + scache.getOrUpdate(sk, if(defines(sMap, sk, refKey, isFirst)) sk else resolve(xs)) } resolve(scopes) } - private[this] def defines(map: ScopedMap, key: ScopedKey[_], refKey: ScopedKey[_]): Boolean = - (map get key) match { case Some(Seq(x, _*)) => (refKey != key) || x.definitive; case _ => false } + private[this] def defines(map: ScopedMap, key: ScopedKey[_], refKey: ScopedKey[_], isFirst: Boolean): Boolean = + (map get key) match { case Some(Seq(x, _*)) => (refKey != key) || !isFirst; case _ => false } private[this] def applyInits(ordered: Seq[Compiled])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = (empty /: ordered){ (m, comp) => comp.eval(m) } @@ -148,7 +148,7 @@ trait Init[Scope] final class Uninitialized(val key: ScopedKey[_], val refKey: ScopedKey[_], msg: String) extends Exception(msg) def Uninitialized(key: ScopedKey[_], refKey: ScopedKey[_]): Uninitialized = new Uninitialized(key, refKey, "Reference to uninitialized setting " + key.key.label + " (in " + key.scope + ") from " + refKey.key.label +" (in " + refKey.scope + ")") - final class Compiled(val dependencies: 
Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) + final class Compiled(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) sealed trait Initialize[T] { @@ -165,6 +165,7 @@ trait Init[Scope] def dependsOn: Seq[ScopedKey[_]] = remove(init.dependsOn, key) def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) + override def toString = "setting(" + key + ")" } private[this] final class Joined[S,T,U](a: Initialize[S], b: Initialize[T], f: (S,T) => U) extends Initialize[U] From 5d74d2d9857ec5004c3c453ce7e62819d569c290 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 18 Feb 2011 20:57:39 -0500 Subject: [PATCH 127/823] return position at which parsing fails --- util/complete/Parser.scala | 21 ++++++++++----------- 1 file changed, 10 insertions(+), 11 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 5adfa222d..d3c8e766c 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -214,19 +214,18 @@ trait ParserMain // intended to be temporary pending proper error feedback def result[T](p: Parser[T], s: String): Either[(String,Int), T] = { - /* def loop(i: Int, a: Parser[T]): Either[(String,Int), T] = - a.err match + def loop(i: Int, a: Parser[T]): Either[(String,Int), T] = + if(a.valid) { - case Some(msg) => Left((msg, i)) - case None => - val ci = i+1 - if(ci >= s.length) - a.resultEmpty.toRight(("", i)) - else - loop(ci, a derive s(ci)) + val ci = i+1 + if(ci >= s.length) + a.resultEmpty.toRight(("Unexpected end of input", ci)) + else + loop(ci, a derive s(ci) ) } - loop(-1, p)*/ - apply(p)(s).resultEmpty.toRight(("Parse error", 0)) + else + Left(("Parse error",i)) + loop(-1, p) } def apply[T](p: Parser[T])(s: String): Parser[T] = From 333b2833fe3c10a3a4fa0ebdebd8a03c86b714f9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 18 Feb 2011 
20:58:13 -0500 Subject: [PATCH 128/823] make completion cross product lazier --- util/complete/Completions.scala | 11 +++++++++-- 1 file changed, 9 insertions(+), 2 deletions(-) diff --git a/util/complete/Completions.scala b/util/complete/Completions.scala index a2b910897..a14527d48 100644 --- a/util/complete/Completions.scala +++ b/util/complete/Completions.scala @@ -11,7 +11,7 @@ package sbt.complete sealed trait Completions { def get: Set[Completion] - final def x(o: Completions): Completions = Completions( for(cs <- get; os <- o.get) yield cs ++ os ) + final def x(o: Completions): Completions = flatMap(_ x o) final def ++(o: Completions): Completions = Completions( get ++ o.get ) final def +:(o: Completion): Completions = Completions(get + o) final def filter(f: Completion => Boolean): Completions = Completions(get filter f) @@ -68,7 +68,7 @@ sealed trait Completion /** Appends the completions in `o` with the completions in this Completion.*/ def ++(o: Completion): Completion = Completion.concat(this, o) - final def x(o: Completions): Completions = o.map(this ++ _) + final def x(o: Completions): Completions = if(Completion evaluatesRight this) o.map(this ++ _) else Completions.strict(Set.empty + this) override final lazy val hashCode = Completion.hashCode(this) override final def equals(o: Any) = o match { case c: Completion => Completion.equal(this, c); case _ => false } } @@ -103,6 +103,13 @@ object Completion case _ if a.isEmpty => b case _ => a } + def evaluatesRight(a: Completion): Boolean = + a match + { + case _: Suggestion => true + case at: Token if at.append.isEmpty => true + case _ => a.isEmpty + } def equal(a: Completion, b: Completion): Boolean = (a,b) match From fb29d8e11e0624808a380546b341b0bf971aa9fd Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 19 Feb 2011 15:29:51 -0500 Subject: [PATCH 129/823] tweak URI character class for improved completion --- util/complete/Parsers.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff 
--git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index f8dee3aaa..78cce0271 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -33,6 +33,7 @@ trait Parsers lazy val NotSpace = NotSpaceClass.+.string lazy val Space = SpaceClass.+.examples(" ") lazy val OptSpace = SpaceClass.*.examples(" ") + lazy val URIClass = charClass(x => !x.isWhitespace && ')' != x).+.string // TODO: implement def fileParser(base: File): Parser[File] = token(mapOrFail(NotSpace)(s => new File(s.mkString)), "") @@ -52,7 +53,7 @@ trait Parsers def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* <~ SpaceClass.* def trimmed(p: Parser[String]) = p map { _.trim } - def Uri(ex: Set[URI]) = mapOrFail(NotSpace)( uri => new URI(uri)) examples(ex.map(_.toString)) + def Uri(ex: Set[URI]) = mapOrFail(URIClass)( uri => new URI(uri)) examples(ex.map(_.toString)) } object Parsers extends Parsers object DefaultParsers extends Parsers with ParserMain \ No newline at end of file From d264ab0ad2802eb268fbfb951c493c435aea222f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 21 Feb 2011 10:00:40 -0500 Subject: [PATCH 130/823] add 'seq' combinator that applies one or more parsers, collecting all valid results --- util/complete/Parser.scala | 20 +++++++++++++++++--- 1 file changed, 17 insertions(+), 3 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index d3c8e766c..b9f1fe577 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -24,9 +24,9 @@ sealed trait RichParser[A] def * : Parser[Seq[A]] /** Produces a Parser that applies the original Parser zero or one times.*/ def ? 
: Parser[Option[A]] - /** Produces a Parser that applies either the original Parser or `next`.*/ + /** Produces a Parser that applies either the original Parser or `b`.*/ def |[B >: A](b: Parser[B]): Parser[B] - /** Produces a Parser that applies either the original Parser or `next`.*/ + /** Produces a Parser that applies either the original Parser or `b`.*/ def ||[B](b: Parser[B]): Parser[Either[A,B]] /** Produces a Parser that applies the original Parser to the input and then applies `f` to the result.*/ def map[B](f: A => B): Parser[B] @@ -56,7 +56,6 @@ sealed trait RichParser[A] } object Parser extends ParserMain { - def checkMatches(a: Parser[_], completions: Seq[String]) { val bad = completions.filter( apply(a)(_).resultEmpty.isEmpty) @@ -273,6 +272,12 @@ trait ParserMain def not(p: Parser[_]): Parser[Unit] = new Not(p) + def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = + { + val valid = p.filter(_.valid) + if(valid.isEmpty) failure("") else new ParserSeq(valid) + } + def stringLiteral(s: String, remaining: List[Char]): Parser[String] = if(s.isEmpty) error("String literal cannot be empty") else if(remaining.isEmpty) success(s) else new StringLiteral(s, remaining) } @@ -320,6 +325,15 @@ private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidPars lazy val completions = a.completions ++ b.completions override def toString = "(" + a + " || " + b + ")" } +private final class ParserSeq[T](a: Seq[Parser[T]]) extends ValidParser[Seq[T]] +{ + assert(!a.isEmpty) + lazy val resultEmpty = { val rs = a.flatMap(_.resultEmpty); if(rs.isEmpty) None else Some(rs) } + lazy val completions = a.map(_.completions).reduceLeft(_ ++ _) + def derive(c: Char) = seq(a.map(_ derive c)) + override def toString = "seq(" + a + ")" +} + private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends ValidParser[B] { lazy val resultEmpty = a.resultEmpty match { case None => None; case Some(av) => f(av).resultEmpty } From 
9e080d7418d9fdf53ad9afedf4ce8b764dd6d604 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 21 Feb 2011 19:35:05 -0500 Subject: [PATCH 131/823] configurable shell prompt for example: Command.ShellPrompt := { s => Project.extract(s).cid + "> " } --- util/collection/Attributes.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index ba3f3293b..d176a5d71 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -21,6 +21,7 @@ trait AttributeMap { def apply[T](k: AttributeKey[T]): T def get[T](k: AttributeKey[T]): Option[T] + def remove[T](k: AttributeKey[T]): AttributeMap def contains[T](k: AttributeKey[T]): Boolean def put[T](k: AttributeKey[T], value: T): AttributeMap def keys: Iterable[AttributeKey[_]] @@ -40,6 +41,7 @@ private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) def isEmpty: Boolean = backing.isEmpty def apply[T](k: AttributeKey[T]) = backing(k).asInstanceOf[T] def get[T](k: AttributeKey[T]) = backing.get(k).asInstanceOf[Option[T]] + def remove[T](k: AttributeKey[T]): AttributeMap = new BasicAttributeMap( backing - k ) def contains[T](k: AttributeKey[T]) = backing.contains(k) def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap( backing.updated(k, value) ) def keys: Iterable[AttributeKey[_]] = backing.keys From a9ee49ee1e90c7d8750664202e31b75b4fb5d3ab Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 22 Feb 2011 22:36:48 -0500 Subject: [PATCH 132/823] starting to convert integration tests --- interface/src/test/scala/F0.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/src/test/scala/F0.scala b/interface/src/test/scala/F0.scala index d71458e68..94a1aa876 100644 --- a/interface/src/test/scala/F0.scala +++ b/interface/src/test/scala/F0.scala @@ -1,6 +1,6 @@ package xsbti -object f0 +object g0 { def apply[T](s: => T) = new F0[T] { def apply = s } } \ No newline at 
end of file From f9e8534a89828c52643cd378b18df3b497cc46c5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 1 Mar 2011 08:48:14 -0500 Subject: [PATCH 133/823] join for tasks and settings --- util/collection/Settings.scala | 15 +++++++++++++++ 1 file changed, 15 insertions(+) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index e6b574859..ef6261c51 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -159,6 +159,21 @@ trait Init[Scope] def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = new Joined[T,S,U](this, o, f) def get(map: Settings[Scope]): T } + object Initialize + { + implicit def joinInitialize[T](s: Seq[Initialize[T]]): JoinInitSeq[T] = new JoinInitSeq(s) + final class JoinInitSeq[T](s: Seq[Initialize[T]]) + { + def join[S](f: Seq[T] => S): Initialize[S] = this.join map f + def join: Initialize[Seq[T]] = Initialize.join(s) + } + def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = + inits match + { + case Seq() => value( Nil ) + case Seq(x, xs @ _*) => (join(xs) zipWith x)( (t,h) => h +: t) + } + } final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) { def definitive: Boolean = !init.dependsOn.contains(key) From 93b13e80b7097c4393505ea07e6b58af7cf9814a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 6 Mar 2011 21:57:31 -0500 Subject: [PATCH 134/823] success indication and timestamps for actions --- util/log/BasicLogger.scala | 3 +++ util/log/BufferedLogger.scala | 6 ++++++ util/log/ConsoleLogger.scala | 2 +- util/log/FilterLogger.scala | 4 +++- util/log/FullLogger.scala | 3 ++- util/log/Level.scala | 5 ++--- util/log/LogEvent.scala | 1 + util/log/Logger.scala | 6 +++++- util/log/MultiLogger.scala | 5 +++++ 9 files changed, 28 insertions(+), 7 deletions(-) diff --git a/util/log/BasicLogger.scala b/util/log/BasicLogger.scala index a52d3b433..f7adb11ce 100644 --- a/util/log/BasicLogger.scala +++ b/util/log/BasicLogger.scala @@ -8,6 +8,9 @@ abstract 
class BasicLogger extends AbstractLogger { private var traceEnabledVar = java.lang.Integer.MAX_VALUE private var level: Level.Value = Level.Info + private var successEnabledVar = true + def successEnabled = successEnabledVar + def setSuccessEnabled(flag: Boolean) { successEnabledVar = flag } def getLevel = level def setLevel(newLevel: Level.Value) { level = newLevel } def setTrace(level: Int) { traceEnabledVar = level } diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 38f845e11..73aa7f8e5 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -52,6 +52,12 @@ class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger buffer += new SetLevel(newLevel) delegate.setLevel(newLevel) } + def setSuccessEnabled(flag: Boolean) + { + buffer += new SetSuccess(flag) + delegate.setSuccessEnabled(flag) + } + def successEnabled = delegate.successEnabled def getLevel = delegate.getLevel def getTrace = delegate.getTrace def setTrace(level: Int) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index 0f9e78b22..4005a0889 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -60,7 +60,7 @@ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ans def successMessageColor = Console.RESET override def success(message: => String) { - if(atLevel(Level.Info)) + if(successEnabled) log(successLabelColor, Level.SuccessLabel, successMessageColor, message) } def trace(t: => Throwable): Unit = diff --git a/util/log/FilterLogger.scala b/util/log/FilterLogger.scala index 152f6bdd5..59048c381 100644 --- a/util/log/FilterLogger.scala +++ b/util/log/FilterLogger.scala @@ -14,6 +14,8 @@ class FilterLogger(delegate: AbstractLogger) extends BasicLogger if(traceEnabled) delegate.trace(t) } + override def setSuccessEnabled(flag: Boolean) { delegate.setSuccessEnabled(flag) } + override def successEnabled = delegate.successEnabled override def setTrace(level: Int) { 
delegate.setTrace(level) } override def getTrace = delegate.getTrace def log(level: Level.Value, message: => String) @@ -23,7 +25,7 @@ class FilterLogger(delegate: AbstractLogger) extends BasicLogger } def success(message: => String) { - if(atLevel(Level.Info)) + if(successEnabled) delegate.success(message) } def control(event: ControlEvent.Value, message: => String) diff --git a/util/log/FullLogger.scala b/util/log/FullLogger.scala index 091664244..e562fdb20 100644 --- a/util/log/FullLogger.scala +++ b/util/log/FullLogger.scala @@ -17,7 +17,8 @@ class FullLogger(delegate: Logger, override val ansiCodesSupported: Boolean = fa delegate.log(level, message) } def success(message: => String): Unit = - info(message) + if(successEnabled) + delegate.success(message) def control(event: ControlEvent.Value, message: => String): Unit = info(message) def logAll(events: Seq[LogEvent]): Unit = events.foreach(log) diff --git a/util/log/Level.scala b/util/log/Level.scala index bc1156729..62fb5f2c2 100644 --- a/util/log/Level.scala +++ b/util/log/Level.scala @@ -11,9 +11,8 @@ object Level extends Enumeration val Info = Value(2, "info") val Warn = Value(3, "warn") val Error = Value(4, "error") - /** Defines the label to use for success messages. A success message is logged at the info level but - * uses this label. Because the label for levels is defined in this module, the success - * label is also defined here. */ + /** Defines the label to use for success messages. + * Because the label for levels is defined in this module, the success label is also defined here. 
*/ val SuccessLabel = "success" def union(a: Value, b: Value) = if(a.id < b.id) a else b diff --git a/util/log/LogEvent.scala b/util/log/LogEvent.scala index ffe6049d7..7bd91c2a4 100644 --- a/util/log/LogEvent.scala +++ b/util/log/LogEvent.scala @@ -9,6 +9,7 @@ final class Log(val level: Level.Value, val msg: String) extends LogEvent final class Trace(val exception: Throwable) extends LogEvent final class SetLevel(val newLevel: Level.Value) extends LogEvent final class SetTrace(val level: Int) extends LogEvent +final class SetSuccess(val enabled: Boolean) extends LogEvent final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends LogEvent object ControlEvent extends Enumeration diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 3be05b539..04babcc4e 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -12,9 +12,10 @@ abstract class AbstractLogger extends Logger def setTrace(flag: Int) def getTrace: Int final def traceEnabled = getTrace >= 0 + def successEnabled: Boolean + def setSuccessEnabled(flag: Boolean): Unit def atLevel(level: Level.Value) = level.id >= getLevel.id - def success(message: => String): Unit def control(event: ControlEvent.Value, message: => String): Unit def logAll(events: Seq[LogEvent]): Unit @@ -28,6 +29,7 @@ abstract class AbstractLogger extends Logger case t: Trace => trace(t.exception) case setL: SetLevel => setLevel(setL.newLevel) case setT: SetTrace => setTrace(setT.level) + case setS: SetSuccess => setSuccessEnabled(setS.enabled) case c: ControlEvent => control(c.event, c.msg) } } @@ -45,6 +47,7 @@ object Logger override def trace(msg: F0[Throwable]) = lg.trace(msg) override def log(level: Level.Value, msg: F0[String]) = lg.log(level, msg) def trace(t: => Throwable) = trace(f0(t)) + def success(s: => String) = info(f0(s)) def log(level: Level.Value, msg: => String) = { val fmsg = f0(msg) @@ -73,6 +76,7 @@ trait Logger extends xLogger def ansiCodesSupported = false def trace(t: => 
Throwable): Unit + def success(message: => String): Unit def log(level: Level.Value, message: => String): Unit def debug(msg: F0[String]): Unit = log(Level.Debug, msg) diff --git a/util/log/MultiLogger.scala b/util/log/MultiLogger.scala index 525e3ef9d..9cdb65386 100644 --- a/util/log/MultiLogger.scala +++ b/util/log/MultiLogger.scala @@ -19,6 +19,11 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger super.setTrace(level) dispatch(new SetTrace(level)) } + override def setSuccessEnabled(flag: Boolean) + { + super.setSuccessEnabled(flag) + dispatch(new SetSuccess(flag)) + } def trace(t: => Throwable) { dispatch(new Trace(t)) } def log(level: Level.Value, message: => String) { dispatch(new Log(level, message)) } def success(message: => String) { dispatch(new Success(message)) } From 309bc5caeb7c6a189e58082e5d3686e4e7f12ca9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 11 Mar 2011 16:52:44 -0500 Subject: [PATCH 135/823] reintegrate history commands, add proper parsing for recursive commands --- util/complete/History.scala | 7 +-- util/complete/HistoryCommands.scala | 83 ++++++++++------------------- util/complete/Parsers.scala | 2 + 3 files changed, 33 insertions(+), 59 deletions(-) diff --git a/util/complete/History.scala b/util/complete/History.scala index e792454f7..9c36f2605 100644 --- a/util/complete/History.scala +++ b/util/complete/History.scala @@ -4,9 +4,10 @@ package sbt package complete -import History.number + import History.number + import java.io.File -final class History private(lines: IndexedSeq[String], error: String => Unit) extends NotNull +final class History private(val lines: IndexedSeq[String], val path: Option[File], error: String => Unit) extends NotNull { private def reversed = lines.reverse @@ -41,7 +42,7 @@ final class History private(lines: IndexedSeq[String], error: String => Unit) ex object History { - def apply(lines: Seq[String], error: String => Unit): History = new History(lines.toIndexedSeq, error) + 
def apply(lines: Seq[String], path: Option[File], error: String => Unit): History = new History(lines.toIndexedSeq, path, error) def number(s: String): Option[Int] = try { Some(s.toInt) } diff --git a/util/complete/HistoryCommands.scala b/util/complete/HistoryCommands.scala index 16a359f9a..906aa328a 100644 --- a/util/complete/HistoryCommands.scala +++ b/util/complete/HistoryCommands.scala @@ -39,63 +39,34 @@ object HistoryCommands def helpString = "History commands:\n " + (descriptions.map{ case (c,d) => c + " " + d}).mkString("\n ") def printHelp(): Unit = println(helpString) + def printHistory(history: complete.History, historySize: Int, show: Int): Unit = + history.list(historySize, show).foreach(println) - def apply(s: String, historyPath: Option[File], maxLines: Int, error: String => Unit): Option[List[String]] = - if(s.isEmpty) - { - printHelp() - Some(Nil) - } - else - { - val lines = historyPath.toList.flatMap( p => IO.readLines(p) ).toArray - if(lines.isEmpty) - { - error("No history") - None - } - else - { - val history = complete.History(lines, error) - if(s.startsWith(ListCommands)) - { - val rest = s.substring(ListCommands.length) - val show = complete.History.number(rest).getOrElse(lines.length) - printHistory(history, maxLines, show) - Some(Nil) - } - else - { - val command = historyCommand(history, s) - command.foreach(lines(lines.length - 1) = _) - historyPath foreach { h => IO.writeLines(h, lines) } - Some(command.toList) - } - } - } - def printHistory(history: complete.History, historySize: Int, show: Int): Unit = history.list(historySize, show).foreach(println) - def historyCommand(history: complete.History, s: String): Option[String] = - { - if(s == Last) - history !! - else if(s.startsWith(Contains)) - history !? s.substring(Contains.length) - else - history ! s + import DefaultParsers._ + + val MaxLines = 500 + lazy val num = token(NatBasic, "") + lazy val last = Last ^^^ { execute(_ !!) } + lazy val list = ListCommands ~> (num ?? 
Int.MaxValue) map { show => + (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } } -/* - import parse.{Parser,Parsers} - import Parser._ - import Parsers._ - val historyParser: Parser[complete.History => Option[String]] = - { - Start ~> Specific) + lazy val execStr = flag('?') ~ token(any.+.string, "") map { case (contains, str) => + execute(h => if(contains) h !? str else h ! str) } - !! Execute the last command again - !: Show all previous commands - !:n Show the last n commands - !n Execute the command with index n, as shown by the !: command - !-n Execute the nth command before this one - !string Execute the most recent command starting with 'string' - !?string*/ + lazy val execInt = flag('-') ~ num map { case (neg, value) => + execute(h => if(neg) h !- value else h ! value) + } + lazy val help = success( (h: History) => { printHelp(); Some(Nil) } ) + + def execute(f: History => Option[String]): History => Option[List[String]] = (h: History) => + { + val command = f(h) + val lines = h.lines.toArray + command.foreach(lines(lines.length - 1) = _) + h.path foreach { h => IO.writeLines(h, lines) } + Some(command.toList) + } + + val actionParser: Parser[complete.History => Option[List[String]]] = + Start ~> (help | last | execInt | list | execStr ) // execStr must come last } \ No newline at end of file diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 78cce0271..096614ad9 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -52,6 +52,8 @@ trait Parsers def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* <~ SpaceClass.* + def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? 
false + def trimmed(p: Parser[String]) = p map { _.trim } def Uri(ex: Set[URI]) = mapOrFail(URIClass)( uri => new URI(uri)) examples(ex.map(_.toString)) } From cacd1a5be8cc55103e866c9770e09442a1d70f76 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 17 Mar 2011 21:29:35 -0400 Subject: [PATCH 136/823] 'update' caching now takes into account whether jars still exist --- cache/tracking/Tracked.scala | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index d88518ce7..1add819f4 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -31,6 +31,14 @@ object Tracked import sbinary.JavaIO._ + def lastOutput[I,O](cacheFile: File)(f: (I,Option[O]) => O)(implicit o: Format[O], mf: Manifest[Format[O]]): I => O = in => + { + val previous: Option[O] = fromFile[O](cacheFile) + val next = f(in, previous) + toFile(next)(cacheFile) + next + } + def inputChanged[I,O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in => { val help = new CacheHelp(ic) From 95e5206c3f04f3d054d66e9a1f86c24c6336c969 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 20 Mar 2011 22:54:01 -0400 Subject: [PATCH 137/823] work on displaying task errors --- util/collection/Settings.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index ef6261c51..9450ee10e 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -180,6 +180,7 @@ trait Init[Scope] def dependsOn: Seq[ScopedKey[_]] = remove(init.dependsOn, key) def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init.map(t => f(key,t))) override def toString = "setting(" + key + ")" } From c9f8d70ee5465c72460bb45a335375a4708539a8 Mon Sep 17 00:00:00 2001 From: Mark Harrah 
Date: Mon, 21 Mar 2011 20:26:04 -0400 Subject: [PATCH 138/823] command logging through Streams, 'last' without a key to redisplay it --- util/control/ExitHook.scala | 8 +++++--- util/log/ConsoleLogger.scala | 14 ++++++++++---- 2 files changed, 15 insertions(+), 7 deletions(-) diff --git a/util/control/ExitHook.scala b/util/control/ExitHook.scala index 1e491b095..de85bff42 100644 --- a/util/control/ExitHook.scala +++ b/util/control/ExitHook.scala @@ -4,13 +4,15 @@ package sbt /** Defines a function to call as sbt exits.*/ -trait ExitHook extends NotNull +trait ExitHook { - /** Provides a name for this hook to be used to provide feedback to the user. */ - def name: String /** Subclasses should implement this method, which is called when this hook is executed. */ def runBeforeExiting(): Unit } +object ExitHook +{ + def apply(f: => Unit): ExitHook = new ExitHook { def runBeforeExiting() = f } +} object ExitHooks { diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index 4005a0889..f7229573e 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -1,9 +1,9 @@ /* sbt -- Simple Build Tool - * Copyright 2008, 2009, 2010 Mark Harrah + * Copyright 2008, 2009, 2010, 2011 Mark Harrah */ package sbt - import java.io.{PrintStream, PrintWriter} + import java.io.{BufferedWriter, PrintStream, PrintWriter} object ConsoleLogger { @@ -17,8 +17,14 @@ object ConsoleLogger def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { val lockObject = out def print(s: String) = out.print(s) - def println(s: String) = out.println(s) - def println() = out.println() + def println(s: String) = { out.println(s); out.flush() } + def println() = { out.println(); out.flush() } + } + def bufferedWriterOut(out: BufferedWriter): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.write(s) + def println(s: String) = { out.write(s); println() } + def println() = { out.newLine(); out.flush() } } val formatEnabled = From 
7feebe2f85f7fff5a1c3dc62417922fb06362c58 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 24 Mar 2011 21:22:09 -0400 Subject: [PATCH 139/823] tab completion: example-checking off by default, 'matches' convenience method --- util/complete/Parser.scala | 4 ++-- util/complete/Parsers.scala | 9 ++++++++- 2 files changed, 10 insertions(+), 3 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index b9f1fe577..53d6ca01a 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -45,7 +45,7 @@ sealed trait RichParser[A] /** Explicitly defines the completions for the original Parser.*/ def examples(s: String*): Parser[A] /** Explicitly defines the completions for the original Parser.*/ - def examples(s: Set[String]): Parser[A] + def examples(s: Set[String], check: Boolean = false): Parser[A] /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ def string(implicit ev: A <:< Seq[Char]): Parser[String] /** Produces a Parser that filters the original parser. @@ -173,7 +173,7 @@ trait ParserMain def & (o: Parser[_]) = and(a, o) def - (o: Parser[_]) = sub(a, o) def examples(s: String*): Parser[A] = examples(s.toSet) - def examples(s: Set[String]): Parser[A] = Parser.examples(a, s, check = true) + def examples(s: Set[String], check: Boolean = false): Parser[A] = Parser.examples(a, s, check) def filter(f: A => Boolean): Parser[A] = filterParser(a, f) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 096614ad9..85c5a73fb 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -44,6 +44,8 @@ trait Parsers private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = (neg.toSeq ++ digits).mkString.toInt + def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = + rep1sep(rep, sep) ?? 
Nil def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } @@ -58,4 +60,9 @@ trait Parsers def Uri(ex: Set[URI]) = mapOrFail(URIClass)( uri => new URI(uri)) examples(ex.map(_.toString)) } object Parsers extends Parsers -object DefaultParsers extends Parsers with ParserMain \ No newline at end of file +object DefaultParsers extends Parsers with ParserMain +{ + def matches(p: Parser[_], s: String): Boolean = + apply(p)(s).resultEmpty.isDefined + def validID(s: String): Boolean = matches(ID, s) +} \ No newline at end of file From 339c59bad400ba6c3b8f1cad01caaad985e1fafd Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 24 Mar 2011 21:23:11 -0400 Subject: [PATCH 140/823] tab completion: print message when input is invalid --- util/complete/JLineCompletion.scala | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index 9103c3e72..c6fd26433 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -40,9 +40,17 @@ object JLineCompletion def parserAsCompletor(p: Parser[_]): ConsoleReader => Boolean = customCompletor(str => convertCompletions(Parser.completions(p, str))) def convertCompletions(c: Completions): (Seq[String], Seq[String]) = + { + val cs = c.get + if(cs.isEmpty) + (Nil, "{invalid input}" :: Nil) + else + convertCompletions(cs) + } + def convertCompletions(cs: Set[Completion]): (Seq[String], Seq[String]) = { val (insert, display) = - ( (Set.empty[String], Set.empty[String]) /: c.get) { case ( t @ (insert,display), comp) => + ( (Set.empty[String], Set.empty[String]) /: cs) { case ( t @ (insert,display), comp) => if(comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display.trim)) } (insert.toSeq, display.toSeq.sorted) From c803a4a16d80043f3710b61546e9e87b5cbfbf9c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 24 Mar 2011 21:25:57 
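The `matches` convenience method added above (`apply(p)(s).resultEmpty.isDefined`) relies on sbt's derivative-based parsing: feed the input one character at a time through `derive`, then ask whether the residual parser accepts the empty string. A toy model of that idea, with invented names (`P`, `Lit`, `Or`) and only non-empty literals, looks like this:

```scala
// Minimal derivative parser, for illustration only — not sbt's classes.
sealed trait P { def derive(c: Char): P; def acceptsEmpty: Boolean }
case object Fail  extends P { def derive(c: Char) = Fail; def acceptsEmpty = false }
case object Empty extends P { def derive(c: Char) = Fail; def acceptsEmpty = true }
// Non-empty string literal: consume one expected character per derivation.
final case class Lit(s: String) extends P {
  def derive(c: Char) =
    if (s.nonEmpty && s.head == c) (if (s.length == 1) Empty else Lit(s.tail)) else Fail
  def acceptsEmpty = false
}
// Alternation: derive both branches; accept empty if either does.
final case class Or(a: P, b: P) extends P {
  def derive(c: Char) = Or(a.derive(c), b.derive(c))
  def acceptsEmpty = a.acceptsEmpty || b.acceptsEmpty
}
object P {
  // Analogue of DefaultParsers.matches: a string matches iff, after deriving
  // by every character, the remaining parser can succeed on empty input.
  def matches(p: P, s: String): Boolean = s.foldLeft(p)(_ derive _).acceptsEmpty
}
```

The same residual-parser structure is what drives tab completion: at any prefix, the completions are read off the derived parser rather than by re-running a grammar.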
-0400 Subject: [PATCH 141/823] tab completion fixes and cleanup --- util/collection/Attributes.scala | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index d176a5d71..4697c63b9 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -25,6 +25,7 @@ trait AttributeMap def contains[T](k: AttributeKey[T]): Boolean def put[T](k: AttributeKey[T], value: T): AttributeMap def keys: Iterable[AttributeKey[_]] + def ++(o: Iterable[AttributeEntry[_]]): AttributeMap def ++(o: AttributeMap): AttributeMap def entries: Iterable[AttributeEntry[_]] def isEmpty: Boolean @@ -32,6 +33,8 @@ trait AttributeMap object AttributeMap { val empty: AttributeMap = new BasicAttributeMap(Map.empty) + def apply(entries: Iterable[AttributeEntry[_]]): AttributeMap = empty ++ entries + def apply(entries: AttributeEntry[_]*): AttributeMap = empty ++ entries implicit def toNatTrans(map: AttributeMap): AttributeKey ~> Id = new (AttributeKey ~> Id) { def apply[T](key: AttributeKey[T]): T = map(key) } @@ -45,6 +48,11 @@ private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) def contains[T](k: AttributeKey[T]) = backing.contains(k) def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap( backing.updated(k, value) ) def keys: Iterable[AttributeKey[_]] = backing.keys + def ++(o: Iterable[AttributeEntry[_]]): AttributeMap = + { + val newBacking = (backing /: o) { case (b, AttributeEntry(key, value)) => b.updated(key, value) } + new BasicAttributeMap(newBacking) + } def ++(o: AttributeMap): AttributeMap = o match { case bam: BasicAttributeMap => new BasicAttributeMap(backing ++ bam.backing) From e016e644ae4afdc49ef9753f1dff63c96dcef223 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 25 Mar 2011 18:22:30 -0400 Subject: [PATCH 142/823] newline before JLine's above threshold prompt --- util/complete/JLineCompletion.scala | 7 ++++++- 1 file changed, 6 
insertions(+), 1 deletion(-) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index c6fd26433..18fc11fa4 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -98,7 +98,12 @@ object JLineCompletion reader.drawLine() } def printCompletions(cs: Seq[String], reader: ConsoleReader): Unit = - if(cs.isEmpty) () else CandidateListCompletionHandler.printCandidates(reader, JavaConversions.asJavaList(cs), true) + { + // CandidateListCompletionHandler doesn't print a new line before the prompt + if(cs.size > reader.getAutoprintThreshhold) + reader.printNewline() + CandidateListCompletionHandler.printCandidates(reader, JavaConversions.asJavaList(cs), true) + } def commonPrefix(s: Seq[String]): String = if(s.isEmpty) "" else s reduceLeft commonPrefix def commonPrefix(a: String, b: String): String = From 1ddf5c8c315cdfacd54f07af19e4c6a064e1e741 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 29 Mar 2011 20:25:12 -0400 Subject: [PATCH 143/823] 'inspect actual ' for actual dependencies, 'inspect ' for declared --- util/collection/Settings.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 9450ee10e..cd0363dab 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -69,14 +69,14 @@ trait Init[Scope] def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key).get def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) - def compiled(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): CompiledMap = + def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): CompiledMap = { // prepend per-scope settings val withLocal = addLocal(init)(scopeLocal) // group by Scope/Key, dropping dead initializations val sMap: ScopedMap = grouped(withLocal) 
// delegate references to undefined values according to 'delegates' - val dMap: ScopedMap = delegate(sMap)(delegates) + val dMap: ScopedMap = if(actual) delegate(sMap)(delegates) else sMap // merge Seq[Setting[_]] into Compiled compile(dMap) } From 33206fc537674feb39c012e406278db65e194cc8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 7 Apr 2011 22:50:26 -0400 Subject: [PATCH 144/823] move toSeq up from IMap to PMap --- util/collection/PMap.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index bc054ae87..32274c67a 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -11,6 +11,7 @@ trait RMap[K[_], V[_]] def apply[T](k: K[T]): V[T] def get[T](k: K[T]): Option[V[T]] def contains[T](k: K[T]): Boolean + def toSeq: Seq[(K[_], V[_])] } trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] @@ -19,7 +20,6 @@ trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] def remove[T](k: K[T]): IMap[K,V] def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): IMap[K,V] def mapValues[V2[_]](f: V ~> V2): IMap[K,V2] - def toSeq: Seq[(K[_], V[_])] } trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] { @@ -82,6 +82,7 @@ class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends Abstr update(k, v) v } + def toSeq = backing.toSeq private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] From 5409dd91d83940309aef3aa0f7af541458ad2685 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 8 Apr 2011 19:15:13 -0400 Subject: [PATCH 145/823] reorganize main --- util/control/MessageOnlyException.scala | 7 +++++++ 1 file changed, 7 insertions(+) create mode 100644 util/control/MessageOnlyException.scala diff --git a/util/control/MessageOnlyException.scala b/util/control/MessageOnlyException.scala new file mode 100644 index 000000000..6791a9339 --- /dev/null +++ 
b/util/control/MessageOnlyException.scala @@ -0,0 +1,7 @@ +/* sbt -- Simple Build Tool + * Copyright 2011 Mark Harrah + */ +package sbt + +final class MessageOnlyException(override val toString: String) extends RuntimeException(toString) +final class NoMessageException extends RuntimeException \ No newline at end of file From 944bd82306e211d73332cb2737ca936ed74d32a7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 8 Apr 2011 19:17:58 -0400 Subject: [PATCH 146/823] work on tests --- util/process/src/test/scala/ProcessSpecification.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala index f2f42d3ca..15f9c1f48 100644 --- a/util/process/src/test/scala/ProcessSpecification.scala +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -6,7 +6,7 @@ import Prop._ import Process._ -object ProcessSpecification extends Properties("Process I/O") +private[this] object ProcessSpecification extends Properties("Process I/O") { implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary(Gen.choose(0, 10) flatMap { size => Gen.resize(size, Arbitrary.arbArray[Byte].arbitrary) }) From c368a34d3672bf6ea61b7bb6814ccb6afe4400bb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 12 Apr 2011 20:58:59 -0400 Subject: [PATCH 147/823] clean up scope delegation implementation --- util/collection/PMap.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index 32274c67a..95bb1b6b0 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -52,7 +52,7 @@ object IMap put(k, f(this get k getOrElse init)) def mapValues[V2[_]](f: V ~> V2) = - new IMap0[K,V2](backing.mapValues(x => f(x))) + new IMap0[K,V2](backing.mapValues(x => f(x)).toMap) def toSeq = backing.toSeq override def toString = backing.toString From 90aa53a19e6429d7befd2ec4387b27570d57d072 Mon Sep 17 
00:00:00 2001 From: Mark Harrah Date: Wed, 13 Apr 2011 19:04:53 -0400 Subject: [PATCH 148/823] String representation for Compiled --- util/collection/Settings.scala | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index cd0363dab..12e96d61b 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -41,6 +41,8 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del // this trait is intended to be mixed into an object trait Init[Scope] { + def display(skey: ScopedKey[_]): String + final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) type SettingSeq[T] = Seq[Setting[T]] @@ -149,6 +151,9 @@ trait Init[Scope] def Uninitialized(key: ScopedKey[_], refKey: ScopedKey[_]): Uninitialized = new Uninitialized(key, refKey, "Reference to uninitialized setting " + key.key.label + " (in " + key.scope + ") from " + refKey.key.label +" (in " + refKey.scope + ")") final class Compiled(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) + { + override def toString = display(key) + } sealed trait Initialize[T] { From 8b7e4218078cc9f5beb4b0313e173ffbb4bc8136 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 13 Apr 2011 19:06:36 -0400 Subject: [PATCH 149/823] improve error messages for cycles --- util/collection/Dag.scala | 27 +++++++++++++++++++-------- 1 file changed, 19 insertions(+), 8 deletions(-) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala index 9d6a295ea..29b43038a 100644 --- a/util/collection/Dag.scala +++ b/util/collection/Dag.scala @@ -22,20 +22,31 @@ object Dag val finished = asSet(new java.util.LinkedHashSet[T]) def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(dag : T){ - if (!discovered(dag)) { - discovered(dag) = true; - visitAll(dependencies(dag)); - finished += dag; + def visit(node : T){ + if (!discovered(node)) { + discovered(node) = 
true; + try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } + finished += node; } - else if(!finished(dag)) - throw new Cyclic(dag) + else if(!finished(node)) + throw new Cyclic(node) } visitAll(nodes); finished.toList; } - final class Cyclic(val value: Any) extends Exception("Cyclic reference involving " + value) + final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) + extends Exception( "Cyclic reference involving " + (if(complete) all.mkString(", ") else value) ) + { + def this(value: Any) = this(value, value :: Nil, false) + def ::(a: Any): Cyclic = + if(complete) + this + else if(a == value) + new Cyclic(value, all, true) + else + new Cyclic(value, a :: all, false) + } } From 19ac4b51b22aa7dec00f09ec87b1e56894a28374 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 13 Apr 2011 19:09:33 -0400 Subject: [PATCH 150/823] support explicitly defining sequences of settings in .sbt files --- util/collection/Settings.scala | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 12e96d61b..aabb4c955 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -179,8 +179,13 @@ trait Init[Scope] case Seq(x, xs @ _*) => (join(xs) zipWith x)( (t,h) => h +: t) } } - final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) + sealed trait SettingsDefinition { + def settings: Seq[Setting[_]] + } + final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition + final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) extends SettingsDefinition { + def settings = this :: Nil def definitive: Boolean = !init.dependsOn.contains(key) def dependsOn: Seq[ScopedKey[_]] = remove(init.dependsOn, key) def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) From ed7721bb7d5c87e91aaf79b2f1882dbdb4c89d92 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: 
Sat, 16 Apr 2011 11:24:58 -0400 Subject: [PATCH 151/823] add Types.idFun to replace Predef.identity, replace a :== overload idFun[T]: T => T instead of identity[T](t: T): T doesn't require a new class file when used as a function value replaced overloads of :== that assigned the Scoped reference on the right to the Scoped on the left with <<= scoped.identity --- util/collection/TypeFunctions.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 93e44154a..8f542fb99 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -14,6 +14,7 @@ trait TypeFunctions final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } + final def idFun[T] = (t: T) => t def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. 
castless version: From 23fed6d061be6f0ae4176a558bd191f8df1e7f88 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 19 Apr 2011 17:58:05 -0400 Subject: [PATCH 152/823] use left, some, right to avoid extra anonymous classes --- util/complete/Parser.scala | 40 +++++++++++++++++++++----------------- 1 file changed, 22 insertions(+), 18 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 53d6ca01a..f81357ea1 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -4,6 +4,7 @@ package sbt.complete import Parser._ + import sbt.Types.{left, right, some} sealed trait Parser[+T] { @@ -104,9 +105,9 @@ object Parser extends ParserMain def choiceParser[A,B](a: Parser[A], b: Parser[B]): Parser[Either[A,B]] = if(a.valid) - if(b.valid) new HetParser(a,b) else a.map( Left(_) ) + if(b.valid) new HetParser(a,b) else a.map( left.fn ) else - b.map( Right(_) ) + b.map( right.fn ) def opt[T](a: Parser[T]): Parser[Option[T]] = if(a.valid) new Optional(a) else success(None) @@ -205,7 +206,7 @@ trait ParserMain def completions = Completions.single(Completion.suggestStrict(ch.toString)) override def toString = "'" + ch + "'" } - implicit def literal(s: String): Parser[String] = stringLiteral(s, s.toList) + implicit def literal(s: String): Parser[String] = stringLiteral(s, 0) object ~ { def unapply[A,B](t: (A,B)): Some[(A,B)] = Some(t) } @@ -248,13 +249,13 @@ trait ParserMain } else a - def matched(t: Parser[_], seenReverse: List[Char] = Nil, partial: Boolean = false): Parser[String] = + def matched(t: Parser[_], seen: Seq[Char] = Vector.empty, partial: Boolean = false): Parser[String] = if(!t.valid) - if(partial && !seenReverse.isEmpty) success(seenReverse.reverse.mkString) else Invalid + if(partial && !seen.isEmpty) success(seen.mkString) else Invalid else if(t.result.isEmpty) - new MatchedString(t, seenReverse, partial) + new MatchedString(t, seen, partial) else - success(seenReverse.reverse.mkString) + success(seen.mkString) def 
token[T](t: Parser[T]): Parser[T] = token(t, "", true) def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false) @@ -278,8 +279,11 @@ trait ParserMain if(valid.isEmpty) failure("") else new ParserSeq(valid) } - def stringLiteral(s: String, remaining: List[Char]): Parser[String] = - if(s.isEmpty) error("String literal cannot be empty") else if(remaining.isEmpty) success(s) else new StringLiteral(s, remaining) + def stringLiteral(s: String, start: Int): Parser[String] = + { + val len = s.length + if(len == 0) error("String literal cannot be empty") else if(start >= len) success(s) else new StringLiteral(s, start) + } } sealed trait ValidParser[T] extends Parser[T] { @@ -321,7 +325,7 @@ private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A,B]] { def derive(c: Char) = (a derive c) || (b derive c) - lazy val resultEmpty = a.resultEmpty.map(Left(_)) orElse b.resultEmpty.map(Right(_)) + lazy val resultEmpty = a.resultEmpty.map(left.fn) orElse b.resultEmpty.map(right.fn) lazy val completions = a.completions ++ b.completions override def toString = "(" + a + " || " + b + ")" } @@ -373,10 +377,10 @@ private final class Filter[T](p: Parser[T], f: T => Boolean) extends ValidParser override def toString = "filter(" + p + ")" override def isTokenStart = p.isTokenStart } -private final class MatchedString(delegate: Parser[_], seenReverse: List[Char], partial: Boolean) extends ValidParser[String] +private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], partial: Boolean) extends ValidParser[String] { - lazy val seen = seenReverse.reverse.mkString - def derive(c: Char) = matched(delegate derive c, c :: seenReverse, partial) + lazy val seen = seenV.mkString + def derive(c: Char) = matched(delegate derive c, seenV :+ c, partial) def completions = delegate.completions def resultEmpty = 
if(delegate.resultEmpty.isDefined) Some(seen) else if(partial) Some(seen) else None override def isTokenStart = delegate.isTokenStart @@ -422,12 +426,12 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } -private final class StringLiteral(str: String, remaining: List[Char]) extends ValidParser[String] +private final class StringLiteral(str: String, start: Int) extends ValidParser[String] { - assert(str.length > 0 && !remaining.isEmpty) + assert(0 <= start && start < str.length) def resultEmpty = None - def derive(c: Char) = if(remaining.head == c) stringLiteral(str, remaining.tail) else Invalid - lazy val completions = Completions.single(Completion.suggestion(remaining.mkString)) + def derive(c: Char) = if(str.charAt(start) == c) stringLiteral(str, start+1) else Invalid + lazy val completions = Completions.single(Completion.suggestion(str.substring(start))) override def toString = '"' + str + '"' } private final class CharacterClass(f: Char => Boolean) extends ValidParser[Char] @@ -440,7 +444,7 @@ private final class CharacterClass(f: Char => Boolean) extends ValidParser[Char] private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] { def resultEmpty = Some(None) - def derive(c: Char) = (delegate derive c).map(Some(_)) + def derive(c: Char) = (delegate derive c).map(some.fn) lazy val completions = Completion.empty +: delegate.completions override def toString = delegate.toString + "?" 
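The `StringLiteral` change in patch 152 — from carrying `remaining: List[Char]` to carrying `start: Int` — can be shown in isolation. The three-state type below (`Pending`/`Matched`/`Mismatched`) is an invented simplification of sbt's `Parser`/`Invalid`/`success` machinery:

```scala
// Illustrative only: an index into the original string replaces a rebuilt
// character list, so each derivation is O(1) with no list allocation, and
// the pending completion is a cheap substring.
sealed trait LitState
case object Matched    extends LitState
case object Mismatched extends LitState
final case class Pending(str: String, start: Int) extends LitState {
  assert(0 <= start && start < str.length)
  def derive(c: Char): LitState =
    if (str.charAt(start) != c) Mismatched        // the real parser returns Invalid
    else if (start + 1 == str.length) Matched     // the real parser returns success(str)
    else Pending(str, start + 1)
  // What tab completion would suggest at this point; the old List-based
  // version computed this as remaining.mkString.
  def completion: String = str.substring(start)
}
```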
} From df1e0384815e635c3018779591b209f948393883 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 19 Apr 2011 22:20:16 -0400 Subject: [PATCH 153/823] fix matched signature --- util/complete/Parser.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index f81357ea1..7d1409bc1 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -249,7 +249,7 @@ trait ParserMain } else a - def matched(t: Parser[_], seen: Seq[Char] = Vector.empty, partial: Boolean = false): Parser[String] = + def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] = if(!t.valid) if(partial && !seen.isEmpty) success(seen.mkString) else Invalid else if(t.result.isEmpty) From 69ae123fc8c2cac0b62c6e0fc127f213cc032e85 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 20 Apr 2011 18:31:10 -0400 Subject: [PATCH 154/823] Cache doc task --- cache/CacheIO.scala | 2 +- cache/FileInfo.scala | 9 +++++++-- cache/tracking/Tracked.scala | 4 ++-- 3 files changed, 10 insertions(+), 5 deletions(-) diff --git a/cache/CacheIO.scala b/cache/CacheIO.scala index dad9bd467..ac698c24e 100644 --- a/cache/CacheIO.scala +++ b/cache/CacheIO.scala @@ -24,7 +24,7 @@ object CacheIO fromFile[T](file) getOrElse default def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Option[T] = try { Some( Operations.fromFile(file)(stampedFormat(format)) ) } - catch { case e: FileNotFoundException => None } + catch { case e: Exception => None } def toFile[T](format: Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = toFile(value)(file)(format, mf) diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index ae626b827..e4706c1fa 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -46,7 +46,7 @@ object FileInfo implicit val format: Format[F] import Cache._ implicit def fileInfoEquiv: Equiv[F] = defaultEquiv - implicit def 
infoInputCache: InputCache[F] = basicInput + def infoInputCache: InputCache[F] = basicInput implicit def fileInputCache: InputCache[File] = wrapIn[File,F] } object full extends Style @@ -95,7 +95,7 @@ object FilesInfo val manifest: Manifest[Format[FilesInfo[F]]] def empty: FilesInfo[F] = new FilesInfo[F](Set.empty) import Cache._ - implicit def infosInputCache: InputCache[FilesInfo[F]] = basicInput + def infosInputCache: InputCache[FilesInfo[F]] = basicInput implicit def filesInputCache: InputCache[Set[File]] = wrapIn[Set[File],FilesInfo[F]] implicit def filesInfoEquiv: Equiv[FilesInfo[F]] = defaultEquiv } @@ -112,4 +112,9 @@ object FilesInfo lazy val hash: Style { type F = HashFileInfo } = new BasicStyle(FileInfo.hash) lazy val lastModified: Style { type F = ModifiedFileInfo } = new BasicStyle(FileInfo.lastModified) lazy val exists: Style { type F = PlainFileInfo } = new BasicStyle(FileInfo.exists) + + implicit def existsInputsCache: InputCache[FilesInfo[PlainFileInfo]] = exists.infosInputCache + implicit def hashInputsCache: InputCache[FilesInfo[HashFileInfo]] = hash.infosInputCache + implicit def modifiedInputsCache: InputCache[FilesInfo[ModifiedFileInfo]] = lastModified.infosInputCache + implicit def fullInputsCache: InputCache[FilesInfo[HashModifiedFileInfo]] = full.infosInputCache } \ No newline at end of file diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 1add819f4..10b37101f 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -3,7 +3,7 @@ */ package sbt -import java.io.{File,IOException} +import java.io.File import CacheIO.{fromFile, toFile} import sbinary.{Format, JavaIO} import scala.reflect.Manifest @@ -120,7 +120,7 @@ class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O stream => equiv.equiv(value, format.reads(stream)) } } catch { - case _: IOException => false + case _: Exception => false } } object Difference From b727cf94f2ea5d7f66b22e623288f78a35596949 Mon Sep 
17 00:00:00 2001 From: Mark Harrah Date: Wed, 20 Apr 2011 20:18:58 -0400 Subject: [PATCH 155/823] task/setting/attribute descriptions --- util/collection/Attributes.scala | 12 ++++++++++-- 1 file changed, 10 insertions(+), 2 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 4697c63b9..4c3a16873 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -6,15 +6,23 @@ package sbt import Types._ // T must be invariant to work properly. -// Because it is sealed and the only instances go through make, +// Because it is sealed and the only instances go through AttributeKey.apply, // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts sealed trait AttributeKey[T] { def label: String + def description: Option[String] override final def toString = label } object AttributeKey { - def apply[T](name: String): AttributeKey[T] = new AttributeKey[T] { def label = name } + def apply[T](name: String): AttributeKey[T] = new AttributeKey[T] { + def label = name + def description = None + } + def apply[T](name: String, description0: String): AttributeKey[T] = new AttributeKey[T] { + def label = name + def description = Some(description0) + } } trait AttributeMap From 21b95c1b72c728ff1c8e0c8b2bf791ce0b668180 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Apr 2011 20:20:05 -0400 Subject: [PATCH 156/823] work on parser error handling --- util/complete/EditDistance.scala | 37 ++++ util/complete/Parser.scala | 314 +++++++++++++++++++++---------- util/complete/Parsers.scala | 19 +- 3 files changed, 262 insertions(+), 108 deletions(-) create mode 100644 util/complete/EditDistance.scala diff --git a/util/complete/EditDistance.scala b/util/complete/EditDistance.scala new file mode 100644 index 000000000..60137d533 --- /dev/null +++ b/util/complete/EditDistance.scala @@ -0,0 +1,37 @@ +package sbt.complete + +/** @author Paul Phillips*/ +object EditDistance { + /** Translated 
from the java version at + * http://www.merriampark.com/ld.htm + * which is declared to be public domain. + */ + def levenshtein(s: String, t: String, insertCost: Int, deleteCost: Int, subCost: Int, transposeCost: Int, transpositions: Boolean = false): Int = { + val n = s.length + val m = t.length + if (n == 0) return m + if (m == 0) return n + + val d = Array.ofDim[Int](n + 1, m + 1) + 0 to n foreach (x => d(x)(0) = x) + 0 to m foreach (x => d(0)(x) = x) + + for (i <- 1 to n ; val s_i = s(i - 1) ; j <- 1 to m) { + val t_j = t(j - 1) + val cost = if (s_i == t_j) 0 else 1 + + val c1 = d(i - 1)(j) + deleteCost + val c2 = d(i)(j - 1) + insertCost + val c3 = d(i - 1)(j - 1) + cost*subCost + + d(i)(j) = c1 min c2 min c3 + + if (transpositions) { + if (i > 1 && j > 1 && s(i - 1) == t(j - 2) && s(i - 2) == t(j - 1)) + d(i)(j) = d(i)(j) min (d(i - 2)(j - 2) + cost*transposeCost) + } + } + + d(n)(m) + } +} \ No newline at end of file diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 7d1409bc1..821603e1f 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -5,15 +5,18 @@ package sbt.complete import Parser._ import sbt.Types.{left, right, some} + import sbt.Collections.separate sealed trait Parser[+T] { def derive(i: Char): Parser[T] - def resultEmpty: Option[T] - def result: Option[T] = None + def resultEmpty: Result[T] + def result: Option[T] def completions: Completions - def valid: Boolean + def failure: Option[Failure] def isTokenStart = false + def ifValid[S](p: => Parser[S]): Parser[S] + def valid: Boolean } sealed trait RichParser[A] { @@ -39,6 +42,9 @@ sealed trait RichParser[A] def ??[B >: A](alt: B): Parser[B] def <~[B](b: Parser[B]): Parser[A] def ~>[B](b: Parser[B]): Parser[B] + + /** Uses the specified message if the original Parser fails.*/ + def !!!(msg: String): Parser[A] def unary_- : Parser[Unit] def & (o: Parser[_]): Parser[A] @@ -51,57 +57,110 @@ sealed trait RichParser[A] def string(implicit ev: A <:< 
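The `EditDistance.levenshtein` method added above (credited to Paul Phillips' Scala compiler version) parameterizes the edit costs; its core is the standard dynamic-programming recurrence, presumably used here to rank near-miss suggestions when parser input is invalid. A compilable reduction with all costs fixed at 1 and transpositions disabled — the wrapper object name is invented:

```scala
object EditDistanceSketch {
  // d(i)(j) = edit distance between the first i chars of s and first j of t.
  def levenshtein(s: String, t: String): Int = {
    val n = s.length
    val m = t.length
    if (n == 0) return m
    if (m == 0) return n
    val d = Array.ofDim[Int](n + 1, m + 1)
    for (i <- 0 to n) d(i)(0) = i // delete all i leading chars of s
    for (j <- 0 to m) d(0)(j) = j // insert all j leading chars of t
    for (i <- 1 to n; j <- 1 to m) {
      val cost = if (s(i - 1) == t(j - 1)) 0 else 1
      // min of: delete from s, insert from t, substitute (free on a match)
      d(i)(j) = (d(i - 1)(j) + 1) min (d(i)(j - 1) + 1) min (d(i - 1)(j - 1) + cost)
    }
    d(n)(m)
  }
}
```

Without the transposition extension, a swap of adjacent characters counts as two edits, which is why the patch keeps `transpositions` as an opt-in flag.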
Seq[Char]): Parser[String] /** Produces a Parser that filters the original parser. * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails.*/ - def filter(f: A => Boolean): Parser[A] + def filter(f: A => Boolean, msg: String => String): Parser[A] def flatMap[B](f: A => Parser[B]): Parser[B] } object Parser extends ParserMain { + sealed abstract class Result[+T] { + def isFailure: Boolean + def isValid: Boolean + def errors: Seq[String] + def or[B >: T](b: => Result[B]): Result[B] + def either[B](b: => Result[B]): Result[Either[T,B]] + def map[B](f: T => B): Result[B] + def flatMap[B](f: T => Result[B]): Result[B] + def &&(b: => Result[_]): Result[T] + def filter(f: T => Boolean, msg: => String): Result[T] + def seq[B](b: => Result[B]): Result[(T,B)] = app(b)( (m,n) => (m,n) ) + def app[B,C](b: => Result[B])(f: (T, B) => C): Result[C] + def toEither: Either[Seq[String], T] + } + final case class Value[+T](value: T) extends Result[T] { + def isFailure = false + def isValid: Boolean = true + def errors = Nil + def app[B,C](b: => Result[B])(f: (T, B) => C): Result[C] = b match { + case fail: Failure => fail + case Value(bv) => Value(f(value, bv)) + } + def &&(b: => Result[_]): Result[T] = b match { case f: Failure => f; case _ => this } + def or[B >: T](b: => Result[B]): Result[B] = this + def either[B](b: => Result[B]): Result[Either[T,B]] = Value(Left(value)) + def map[B](f: T => B): Result[B] = Value(f(value)) + def flatMap[B](f: T => Result[B]): Result[B] = f(value) + def filter(f: T => Boolean, msg: => String): Result[T] = if(f(value)) this else mkFailure(msg) + def toEither = Right(value) + } + final class Failure(mkErrors: => Seq[String]) extends Result[Nothing] { + lazy val errors: Seq[String] = mkErrors + def isFailure = true + def isValid = false + def map[B](f: Nothing => B) = this + def flatMap[B](f: Nothing => Result[B]) = this + def or[B](b: => Result[B]): Result[B] = b match { + case v: Value[B] => 
v + case f: Failure => concatErrors(f) + } + def either[B](b: => Result[B]): Result[Either[Nothing,B]] = b match { + case Value(v) => Value(Right(v)) + case f: Failure => concatErrors(f) + } + def filter(f: Nothing => Boolean, msg: => String) = this + def app[B,C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this + def &&(b: => Result[_]) = this + def toEither = Left(errors) + + private[this] def concatErrors(f: Failure) = mkFailures(errors ++ f.errors) + } + def mkFailures(errors: => Seq[String]): Failure = new Failure(errors.distinct) + def mkFailure(error: => String): Failure = new Failure(error :: Nil) + def checkMatches(a: Parser[_], completions: Seq[String]) { - val bad = completions.filter( apply(a)(_).resultEmpty.isEmpty) + val bad = completions.filter( apply(a)(_).resultEmpty.isFailure) if(!bad.isEmpty) error("Invalid example completions: " + bad.mkString("'", "', '", "'")) } + def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] = + (a,b) match { case (Some(av), Some(bv)) => Some(av, bv); case _ => None } def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] = - if(a.valid) { + a.ifValid { a.result match { case Some(av) => success( f(av) ) case None => new MapParser(a, f) } } - else Invalid def bindParser[A,B](a: Parser[A], f: A => Parser[B]): Parser[B] = - if(a.valid) { + a.ifValid { a.result match { case Some(av) => f(av) case None => new BindParser(a, f) } } - else Invalid - def filterParser[T](a: Parser[T], f: T => Boolean): Parser[T] = - if(a.valid) { + def filterParser[T](a: Parser[T], f: T => Boolean, seen: String, msg: String => String): Parser[T] = + a.ifValid { a.result match { - case Some(av) => if( f(av) ) success( av ) else Invalid - case None => new Filter(a, f) + case Some(av) => if( f(av) ) success( av ) else Parser.failure(msg(seen)) + case None => new Filter(a, f, seen, msg) } } - else Invalid def seqParser[A,B](a: Parser[A], b: Parser[B]): Parser[(A,B)] = - if(a.valid && b.valid) + a.ifValid { b.ifValid { (a.result, 
b.result) match { case (Some(av), Some(bv)) => success( (av, bv) ) case (Some(av), None) => b map { bv => (av, bv) } case (None, Some(bv)) => a map { av => (av, bv) } case (None, None) => new SeqParser(a,b) } - else Invalid + }} def choiceParser[A,B](a: Parser[A], b: Parser[B]): Parser[Either[A,B]] = if(a.valid) @@ -112,6 +171,9 @@ object Parser extends ParserMain def opt[T](a: Parser[T]): Parser[Option[T]] = if(a.valid) new Optional(a) else success(None) + def onFailure[T](delegate: Parser[T], msg: String): Parser[T] = + if(delegate.valid) new OnFailure(delegate, msg) else failure(msg) + def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) @@ -123,34 +185,35 @@ object Parser extends ParserMain assume(max >= min, "Minimum must be less than or equal to maximum (min: " + min + ", max: " + max + ")") def checkRepeated(invalidButOptional: => Parser[Seq[T]]): Parser[Seq[T]] = - if(repeated.valid) - repeated.result match - { - case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here - case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) - } - else if(min == 0) - invalidButOptional - else - Invalid + repeated match + { + case i: Invalid if min == 0 => invalidButOptional + case i: Invalid => i + case _ => + repeated.result match + { + case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here + case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) + } + } partial match { case Some(part) => - if(part.valid) + part.ifValid { part.result match { case Some(value) => repeat(None, repeated, min, max, value :: revAcc) case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse)) } - else Invalid + } case None => checkRepeated(success(Nil)) } } def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) - def 
and[T](a: Parser[T], b: Parser[_]): Parser[T] = if(a.valid && b.valid) new And(a, b) else Invalid + def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid( b.ifValid( new And(a, b) )) } trait ParserMain { @@ -169,40 +232,43 @@ trait ParserMain def ??[B >: A](alt: B): Parser[B] = a.? map { _ getOrElse alt } def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av } def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv } + def !!!(msg: String): Parser[A] = onFailure(a, msg) def unary_- = not(a) def & (o: Parser[_]) = and(a, o) def - (o: Parser[_]) = sub(a, o) def examples(s: String*): Parser[A] = examples(s.toSet) def examples(s: Set[String], check: Boolean = false): Parser[A] = Parser.examples(a, s, check) - def filter(f: A => Boolean): Parser[A] = filterParser(a, f) + def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) } implicit def literalRichParser(c: Char): RichParser[Char] = richParser(c) implicit def literalRichParser(s: String): RichParser[String] = richParser(s) - def failure[T](msg: String): Parser[T] = Invalid(msg) + def invalid(msgs: => Seq[String]): Parser[Nothing] = Invalid(mkFailures(msgs)) + def failure(msg: => String): Parser[Nothing] = invalid(msg :: Nil) def success[T](value: T): Parser[T] = new ValidParser[T] { override def result = Some(value) - def resultEmpty = result - def derive(c: Char) = Invalid + def resultEmpty = Value(value) + def derive(c: Char) = Parser.failure("Expected end of input.") def completions = Completions.empty override def toString = "success(" + value + ")" } implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] = - new CharacterClass(r contains _).examples(r.map(_.toString) : _*) + charClass(r contains _).examples(r.map(_.toString) : _*) def chars(legal: String): Parser[Char] = { val set = 
legal.toSet - new CharacterClass(set) examples(set.map(_.toString)) + charClass(set, "character in '" + legal + "'") examples(set.map(_.toString)) } - def charClass(f: Char => Boolean): Parser[Char] = new CharacterClass(f) + def charClass(f: Char => Boolean, label: String = ""): Parser[Char] = new CharacterClass(f, label) implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] { - def resultEmpty = None - def derive(c: Char) = if(c == ch) success(ch) else Invalid + def result = None + def resultEmpty = mkFailure( "Expected '" + ch + "'" ) + def derive(c: Char) = if(c == ch) success(ch) else new Invalid(resultEmpty) def completions = Completions.single(Completion.suggestStrict(ch.toString)) override def toString = "'" + ch + "'" } @@ -212,19 +278,22 @@ trait ParserMain } // intended to be temporary pending proper error feedback - def result[T](p: Parser[T], s: String): Either[(String,Int), T] = + def result[T](p: Parser[T], s: String): Either[(Seq[String],Int), T] = { - def loop(i: Int, a: Parser[T]): Either[(String,Int), T] = - if(a.valid) + def loop(i: Int, a: Parser[T]): Either[(Seq[String],Int), T] = + a match { - val ci = i+1 - if(ci >= s.length) - a.resultEmpty.toRight(("Unexpected end of input", ci)) - else - loop(ci, a derive s(ci) ) + case Invalid(f) => Left( (f.errors, i) ) + case _ => + val ci = i+1 + if(ci >= s.length) + a.resultEmpty.toEither.left.map { msgs => + val nonEmpty = if(msgs.isEmpty) "Unexpected end of input" :: Nil else msgs + (nonEmpty, ci) + } + else + loop(ci, a derive s(ci) ) } - else - Left(("Parse error",i)) loop(-1, p) } @@ -250,12 +319,15 @@ trait ParserMain else a def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] = - if(!t.valid) - if(partial && !seen.isEmpty) success(seen.mkString) else Invalid - else if(t.result.isEmpty) - new MatchedString(t, seen, partial) - else - success(seen.mkString) + t match + { + case i: Invalid => if(partial && !seen.isEmpty) 
success(seen.mkString) else i + case _ => + if(t.result.isEmpty) + new MatchedString(t, seen, partial) + else + success(seen.mkString) + } def token[T](t: Parser[T]): Parser[T] = token(t, "", true) def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false) @@ -273,10 +345,12 @@ trait ParserMain def not(p: Parser[_]): Parser[Unit] = new Not(p) - def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = + def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil) + def seq0[T](p: Seq[Parser[T]], errors: => Seq[String]): Parser[Seq[T]] = { - val valid = p.filter(_.valid) - if(valid.isEmpty) failure("") else new ParserSeq(valid) + val (newErrors, valid) = separate(p) { case Invalid(f) => Left(f.errors); case ok => Right(ok) } + def combinedErrors = errors ++ newErrors.flatten + if(valid.isEmpty) invalid(combinedErrors) else new ParserSeq(valid, combinedErrors) } def stringLiteral(s: String, start: Int): Parser[String] = @@ -288,27 +362,40 @@ trait ParserMain sealed trait ValidParser[T] extends Parser[T] { final def valid = true + final def failure = None + final def ifValid[S](p: => Parser[S]): Parser[S] = p } -private object Invalid extends Invalid("inv") -private sealed case class Invalid(val message: String) extends Parser[Nothing] +private final case class Invalid(fail: Failure) extends Parser[Nothing] { - def resultEmpty = None + def failure = Some(fail) + def result = None + def resultEmpty = fail def derive(c: Char) = error("Invalid.") - override def valid = false def completions = Completions.nil - override def toString = message + override def toString = fail.errors.mkString("; ") + def valid = false + def ifValid[S](p: => Parser[S]): Parser[S] = this +} +private final class OnFailure[A](a: Parser[A], message: String) extends ValidParser[A] +{ + def result = a.result + def resultEmpty = a.resultEmpty match { case f: Failure => mkFailure(message); case v: Value[A] => v } + def derive(c: Char) = onFailure(a derive c, message) + def 
completions = a.completions + override def toString = "(" + a + " !!! \"" + message + "\" )" + override def isTokenStart = a.isTokenStart } private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[(A,B)] { - def cross(ao: Option[A], bo: Option[B]): Option[(A,B)] = for(av <- ao; bv <- bo) yield (av,bv) - lazy val resultEmpty = cross(a.resultEmpty, b.resultEmpty) + lazy val result = tuple(a.result,b.result) + lazy val resultEmpty = a.resultEmpty seq b.resultEmpty def derive(c: Char) = { val common = a.derive(c) ~ b a.resultEmpty match { - case Some(av) => common | b.derive(c).map(br => (av,br)) - case None => common + case Value(av) => common | b.derive(c).map(br => (av,br)) + case _: Failure => common } } lazy val completions = a.completions x b.completions @@ -317,35 +404,47 @@ private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends ValidPars private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser[A] { + lazy val result = tuple(a.result, b.result) map (_._1) def derive(c: Char) = (a derive c) | (b derive c) - lazy val resultEmpty = a.resultEmpty orElse b.resultEmpty + lazy val resultEmpty = a.resultEmpty or b.resultEmpty lazy val completions = a.completions ++ b.completions override def toString = "(" + a + " | " + b + ")" } private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A,B]] { + lazy val result = tuple(a.result, b.result) map { case (a,b) => Left(a) } def derive(c: Char) = (a derive c) || (b derive c) - lazy val resultEmpty = a.resultEmpty.map(left.fn) orElse b.resultEmpty.map(right.fn) + lazy val resultEmpty = a.resultEmpty either b.resultEmpty lazy val completions = a.completions ++ b.completions override def toString = "(" + a + " || " + b + ")" } -private final class ParserSeq[T](a: Seq[Parser[T]]) extends ValidParser[Seq[T]] +private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] { assert(!a.isEmpty) - 
lazy val resultEmpty = { val rs = a.flatMap(_.resultEmpty); if(rs.isEmpty) None else Some(rs) } + lazy val resultEmpty: Result[Seq[T]] = + { + val res = a.map(_.resultEmpty) + val (failures, values) = separate(res)(_.toEither) + if(failures.isEmpty) Value(values) else mkFailures(failures.flatten ++ errors) + } + def result = { + val success = a.flatMap(_.result) + if(success.length == a.length) Some(success) else None + } lazy val completions = a.map(_.completions).reduceLeft(_ ++ _) - def derive(c: Char) = seq(a.map(_ derive c)) + def derive(c: Char) = seq0(a.map(_ derive c), errors) override def toString = "seq(" + a + ")" } private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends ValidParser[B] { - lazy val resultEmpty = a.resultEmpty match { case None => None; case Some(av) => f(av).resultEmpty } + lazy val result = a.result flatMap { av => f(av).result } + lazy val resultEmpty = a.resultEmpty flatMap { av => f(av).resultEmpty } lazy val completions = a.completions flatMap { c => apply(a)(c.append).resultEmpty match { - case None => Completions.strict(Set.empty + c) - case Some(av) => c x f(av).completions + case _: Failure => Completions.strict(Set.empty + c) + case Value(av) => c x f(av).completions } } @@ -354,8 +453,8 @@ private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends Val val common = a derive c flatMap f a.resultEmpty match { - case Some(av) => common | derive1(f(av), c) - case None => common + case Value(av) => common | derive1(f(av), c) + case _: Failure => common } } override def isTokenStart = a.isTokenStart @@ -363,17 +462,20 @@ private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends Val } private final class MapParser[A,B](a: Parser[A], f: A => B) extends ValidParser[B] { + lazy val result = a.result map f lazy val resultEmpty = a.resultEmpty map f def derive(c: Char) = (a derive c) map f def completions = a.completions override def isTokenStart = a.isTokenStart override def 
toString = "map(" + a + ")" } -private final class Filter[T](p: Parser[T], f: T => Boolean) extends ValidParser[T] +private final class Filter[T](p: Parser[T], f: T => Boolean, seen: String, msg: String => String) extends ValidParser[T] { - lazy val resultEmpty = p.resultEmpty filter f - def derive(c: Char) = (p derive c) filter f - lazy val completions = p.completions filterS { s => apply(p)(s).resultEmpty.filter(f).isDefined } + def filterResult(r: Result[T]) = p.resultEmpty.filter(f, msg(seen)) + lazy val result = p.result filter f + lazy val resultEmpty = filterResult(p.resultEmpty) + def derive(c: Char) = filterParser(p derive c, f, seen + c, msg) + lazy val completions = p.completions filterS { s => filterResult(apply(p)(s).resultEmpty).isValid } override def toString = "filter(" + p + ")" override def isTokenStart = p.isTokenStart } @@ -382,7 +484,8 @@ private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], part lazy val seen = seenV.mkString def derive(c: Char) = matched(delegate derive c, seenV :+ c, partial) def completions = delegate.completions - def resultEmpty = if(delegate.resultEmpty.isDefined) Some(seen) else if(partial) Some(seen) else None + def result = if(delegate.result.isDefined) Some(seen) else None + def resultEmpty = delegate.resultEmpty match { case f: Failure if !partial => f; case _ => Value(seen) } override def isTokenStart = delegate.isTokenStart override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" } @@ -398,30 +501,37 @@ private final class TokenStart[T](delegate: Parser[T], seen: String, track: Bool else Completions.single(Completion.displayStrict(seen)) + def result = delegate.result def resultEmpty = delegate.resultEmpty override def isTokenStart = true override def toString = "token('" + seen + "', " + track + ", " + delegate + ")" } private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] { + lazy val result = tuple(a.result,b.result) map { _._1 } def 
derive(c: Char) = (a derive c) & (b derive c) - lazy val completions = a.completions.filterS(s => apply(b)(s).resultEmpty.isDefined ) - lazy val resultEmpty = if(b.resultEmpty.isDefined) a.resultEmpty else None + lazy val completions = a.completions.filterS(s => apply(b)(s).resultEmpty.isValid ) + lazy val resultEmpty = a.resultEmpty && b.resultEmpty } private final class Not(delegate: Parser[_]) extends ValidParser[Unit] { def derive(c: Char) = if(delegate.valid) not(delegate derive c) else this def completions = Completions.empty - lazy val resultEmpty = if(delegate.resultEmpty.isDefined) None else Some(()) + def result = None + lazy val resultEmpty = delegate.resultEmpty match { + case f: Failure => Value(()) + case v: Value[_] => mkFailure("Excluded.") + } } private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends ValidParser[T] { def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x substring 1 }) + def result = delegate.result lazy val resultEmpty = delegate.resultEmpty lazy val completions = if(fixed.isEmpty) - if(resultEmpty.isEmpty) Completions.nil else Completions.empty + if(resultEmpty.isValid) Completions.nil else Completions.empty else Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" @@ -429,21 +539,25 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends private final class StringLiteral(str: String, start: Int) extends ValidParser[String] { assert(0 <= start && start < str.length) - def resultEmpty = None - def derive(c: Char) = if(str.charAt(start) == c) stringLiteral(str, start+1) else Invalid + def failMsg = "Expected '" + str + "'" + def resultEmpty = mkFailure(failMsg) + def result = None + def derive(c: Char) = if(str.charAt(start) == c) stringLiteral(str, start+1) else new Invalid(resultEmpty) lazy val completions = 
Completions.single(Completion.suggestion(str.substring(start))) override def toString = '"' + str + '"' } -private final class CharacterClass(f: Char => Boolean) extends ValidParser[Char] +private final class CharacterClass(f: Char => Boolean, label: String) extends ValidParser[Char] { - def resultEmpty = None - def derive(c: Char) = if( f(c) ) success(c) else Invalid + def result = None + def resultEmpty = mkFailure("Expected " + label) + def derive(c: Char) = if( f(c) ) success(c) else Invalid(resultEmpty) def completions = Completions.empty - override def toString = "class()" + override def toString = "class(" + label + ")" } private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] { - def resultEmpty = Some(None) + def result = delegate.result map some.fn + def resultEmpty = Value(None) def derive(c: Char) = (delegate derive c).map(some.fn) lazy val completions = Completion.empty +: delegate.completions override def toString = delegate.toString + "?" @@ -460,8 +574,8 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m val partD = repeat(Some(part derive c), repeated, min, max, accumulatedReverse) part.resultEmpty match { - case Some(pv) => partD | repeatDerive(c, pv :: accumulatedReverse) - case None => partD + case Value(pv) => partD | repeatDerive(c, pv :: accumulatedReverse) + case _: Failure => partD } case None => repeatDerive(c, accumulatedReverse) } @@ -481,21 +595,21 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m case None => fin } } - lazy val resultEmpty: Option[Seq[T]] = + def result = None + lazy val resultEmpty: Result[Seq[T]] = { val partialAccumulatedOption = partial match { - case None => Some(accumulatedReverse) + case None => Value(accumulatedReverse) case Some(partialPattern) => partialPattern.resultEmpty.map(_ :: accumulatedReverse) } - for(partialAccumulated <- partialAccumulatedOption; repeatEmpty <- repeatedParseEmpty) yield - 
partialAccumulated reverse_::: repeatEmpty + (partialAccumulatedOption app repeatedParseEmpty)(_ reverse_::: _) } - private def repeatedParseEmpty: Option[List[T]] = + private def repeatedParseEmpty: Result[List[T]] = { if(min == 0) - Some(Nil) + Value(Nil) else // forced determinism for(value <- repeated.resultEmpty) yield diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 85c5a73fb..ceb441d09 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -14,12 +14,12 @@ trait Parsers lazy val any: Parser[Char] = charClass(_ => true) lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") - lazy val Digit = charClass(_.isDigit) examples DigitSet - lazy val Letter = charClass(_.isLetter) + lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet + lazy val Letter = charClass(_.isLetter, "letter") def IDStart = Letter - lazy val IDChar = charClass(isIDChar) + lazy val IDChar = charClass(isIDChar, "ID character") lazy val ID = IDStart ~ IDChar.* map { case x ~ xs => (x +: xs).mkString } - lazy val OpChar = charClass(isOpChar) + lazy val OpChar = charClass(isOpChar, "symbol") lazy val Op = OpChar.+.string lazy val OpOrID = ID | Op @@ -28,12 +28,15 @@ trait Parsers def isIDChar(c: Char) = c.isLetterOrDigit || c == '-' || c == '_' def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */',' | '.' => true ; case _ => false } - lazy val NotSpaceClass = charClass(!_.isWhitespace) - lazy val SpaceClass = charClass(_.isWhitespace) + lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") + lazy val SpaceClass = charClass(_.isWhitespace, "whitespace character") lazy val NotSpace = NotSpaceClass.+.string lazy val Space = SpaceClass.+.examples(" ") lazy val OptSpace = SpaceClass.*.examples(" ") - lazy val URIClass = charClass(x => !x.isWhitespace && ')' != x).+.string + lazy val URIClass = URIChar.+.string !!! 
"Invalid URI" + + lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%") + def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') // TODO: implement def fileParser(base: File): Parser[File] = token(mapOrFail(NotSpace)(s => new File(s.mkString)), "") @@ -63,6 +66,6 @@ object Parsers extends Parsers object DefaultParsers extends Parsers with ParserMain { def matches(p: Parser[_], s: String): Boolean = - apply(p)(s).resultEmpty.isDefined + apply(p)(s).resultEmpty.isValid def validID(s: String): Boolean = matches(ID, s) } \ No newline at end of file From f4998e1d4a7da2c00bf3ca7edbb7e91a6b59c0ac Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 26 Apr 2011 20:49:43 -0400 Subject: [PATCH 157/823] fix tab completion for filtered parsers --- util/complete/Parser.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 821603e1f..45067043b 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -147,8 +147,8 @@ object Parser extends ParserMain a.ifValid { a.result match { - case Some(av) => if( f(av) ) success( av ) else Parser.failure(msg(seen)) - case None => new Filter(a, f, seen, msg) + case Some(av) if f(av) => success( av ) + case _ => new Filter(a, f, seen, msg) } } @@ -471,7 +471,7 @@ private final class MapParser[A,B](a: Parser[A], f: A => B) extends ValidParser[ } private final class Filter[T](p: Parser[T], f: T => Boolean, seen: String, msg: String => String) extends ValidParser[T] { - def filterResult(r: Result[T]) = p.resultEmpty.filter(f, msg(seen)) + def filterResult(r: Result[T]) = r.filter(f, msg(seen)) lazy val result = p.result filter f lazy val resultEmpty = filterResult(p.resultEmpty) def derive(c: Char) = filterParser(p derive c, f, seen + c, msg) From 58d2e3415c45afe6502074a3b2a81419a9feae54 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 26 Apr 2011 22:29:30 -0400 Subject: 
[PATCH 158/823] trying out different costs for edit distance --- util/complete/EditDistance.scala | 10 ++++++---- 1 file changed, 6 insertions(+), 4 deletions(-) diff --git a/util/complete/EditDistance.scala b/util/complete/EditDistance.scala index 60137d533..218c48c0a 100644 --- a/util/complete/EditDistance.scala +++ b/util/complete/EditDistance.scala @@ -6,7 +6,7 @@ object EditDistance { * http://www.merriampark.com/ld.htm * which is declared to be public domain. */ - def levenshtein(s: String, t: String, insertCost: Int, deleteCost: Int, subCost: Int, transposeCost: Int, transpositions: Boolean = false): Int = { + def levenshtein(s: String, t: String, insertCost: Int = 1, deleteCost: Int = 1, subCost: Int = 1, transposeCost: Int = 1, matchCost: Int = 0, transpositions: Boolean = false): Int = { val n = s.length val m = t.length if (n == 0) return m @@ -18,17 +18,19 @@ object EditDistance { for (i <- 1 to n ; val s_i = s(i - 1) ; j <- 1 to m) { val t_j = t(j - 1) - val cost = if (s_i == t_j) 0 else 1 + val cost = if (s_i == t_j) matchCost else subCost + val tcost = if (s_i == t_j) matchCost else transposeCost + val c1 = d(i - 1)(j) + deleteCost val c2 = d(i)(j - 1) + insertCost - val c3 = d(i - 1)(j - 1) + cost*subCost + val c3 = d(i - 1)(j - 1) + cost d(i)(j) = c1 min c2 min c3 if (transpositions) { if (i > 1 && j > 1 && s(i - 1) == t(j - 2) && s(i - 2) == t(j - 1)) - d(i)(j) = d(i)(j) min (d(i - 2)(j - 2) + cost*transposeCost) + d(i)(j) = d(i)(j) min (d(i - 2)(j - 2) + cost) } } From 375f09cd2685fe7b99b9b5334eb008241fa4185d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 27 Apr 2011 20:40:52 -0400 Subject: [PATCH 159/823] speed up startup --- util/collection/Dag.scala | 18 ++++++++++++++++++ util/collection/PMap.scala | 7 +++++++ util/collection/Settings.scala | 10 ++++------ 3 files changed, 29 insertions(+), 6 deletions(-) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala index 29b43038a..8a6d0ba46 100644 --- a/util/collection/Dag.scala 
+++ b/util/collection/Dag.scala @@ -36,6 +36,24 @@ object Dag finished.toList; } + // doesn't check for cycles + def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = + { + val discovered = new mutable.HashSet[T] + var finished: List[T] = Nil + + def visitAll(nodes: Iterable[T]) = nodes foreach visit + def visit(node : T){ + if (!discovered(node)) { + discovered(node) = true; + visitAll(dependencies(node)) + finished ::= node; + } + } + + visit(node); + finished; + } final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) extends Exception( "Cyclic reference involving " + (if(complete) all.mkString(", ") else value) ) { diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index 95bb1b6b0..febab0286 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -12,6 +12,8 @@ trait RMap[K[_], V[_]] def get[T](k: K[T]): Option[V[T]] def contains[T](k: K[T]): Boolean def toSeq: Seq[(K[_], V[_])] + def keys: Iterable[K[_]] + def values: Iterable[V[_]] } trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] @@ -54,6 +56,8 @@ object IMap def mapValues[V2[_]](f: V ~> V2) = new IMap0[K,V2](backing.mapValues(x => f(x)).toMap) def toSeq = backing.toSeq + def keys = backing.keys + def values = backing.values override def toString = backing.toString } @@ -83,6 +87,9 @@ class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends Abstr v } def toSeq = backing.toSeq + def keys = backing.keys + def values = backing.values + private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index aabb4c955..40d54de0d 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -68,7 +68,7 @@ trait Init[Scope] def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { def apply[T](k: ScopedKey[T]): T 
= getValue(s, k) } - def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key).get + def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse error("Internal settings error: invalid reference to " + display(k)) def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): CompiledMap = @@ -124,18 +124,16 @@ trait Init[Scope] } private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean): ScopedKey[T] = { - val scache = PMap.empty[ScopedKey, ScopedKey] def resolve(search: Seq[Scope]): ScopedKey[T] = search match { case Seq() => throw Uninitialized(k, refKey) case Seq(x, xs @ _*) => val sk = ScopedKey(x, k.key) - scache.getOrUpdate(sk, if(defines(sMap, sk, refKey, isFirst)) sk else resolve(xs)) + val definesKey = (refKey != sk || !isFirst) && (sMap contains sk) + if(definesKey) sk else resolve(xs) } resolve(scopes) } - private[this] def defines(map: ScopedMap, key: ScopedKey[_], refKey: ScopedKey[_], isFirst: Boolean): Boolean = - (map get key) match { case Some(Seq(x, _*)) => (refKey != key) || !isFirst; case _ => false } private[this] def applyInits(ordered: Seq[Compiled])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = (empty /: ordered){ (m, comp) => comp.eval(m) } @@ -149,7 +147,7 @@ trait Init[Scope] final class Uninitialized(val key: ScopedKey[_], val refKey: ScopedKey[_], msg: String) extends Exception(msg) def Uninitialized(key: ScopedKey[_], refKey: ScopedKey[_]): Uninitialized = - new Uninitialized(key, refKey, "Reference to uninitialized setting " + key.key.label + " (in " + key.scope + ") from " + refKey.key.label +" (in " + refKey.scope + ")") + new Uninitialized(key, refKey, "Reference to uninitialized setting " + display(key) + " from " + display(refKey)) final class Compiled(val 
key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) { override def toString = display(key) From 767beb7993916e933e54a52c700c5d4d407de6f5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 7 May 2011 22:02:05 -0400 Subject: [PATCH 160/823] test fixes --- util/complete/src/test/scala/ParserTest.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 111f2cbb6..931c01f01 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -110,7 +110,7 @@ object ParserExample val ann = aqn ~ an def r = apply(ann)("a"*(n*2)).resultEmpty - println(r.isDefined) + println(r.isValid) } def run2(n: Int) { From 99110c1dd32460835e23289af271a72cc6d47c68 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 7 May 2011 22:02:06 -0400 Subject: [PATCH 161/823] basic optional input support --- util/collection/Settings.scala | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 40d54de0d..60694120b 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -53,6 +53,7 @@ trait Init[Scope] def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) def value[T](value: => T): Initialize[T] = new Value(value _) + def optional[T,U](key: ScopedKey[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(key), f) def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) def app[HL <: HList, T](inputs: KList[ScopedKey, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) def uniform[S,T](inputs: Seq[ScopedKey[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) @@ -192,6 +193,14 @@ trait Init[Scope] override def toString = "setting(" + key + ")" } + private[this] final class 
Optional[S,T](a: Option[ScopedKey[S]], f: Option[S] => T) extends Initialize[T] + { + def dependsOn = a.toList + def map[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) + def get(map: Settings[Scope]): T = f(a map asFunction(map)) + def mapReferenced(g: MapScoped) = new Optional(mapKey(g), f) + private[this] def mapKey(g: MapScoped) = try { a map g.fn } catch { case _: Uninitialized => None } + } private[this] final class Joined[S,T,U](a: Initialize[S], b: Initialize[T], f: (S,T) => U) extends Initialize[U] { def dependsOn = a.dependsOn ++ b.dependsOn From 7c2880915d15a52d4fbad422adfae13fd624d0e6 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 7 May 2011 22:02:06 -0400 Subject: [PATCH 162/823] Use standard {build}/id syntax for 'project' command --- util/complete/Parsers.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index ceb441d09..c6f157827 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -35,7 +35,7 @@ trait Parsers lazy val OptSpace = SpaceClass.*.examples(" ") lazy val URIClass = URIChar.+.string !!! 
"Invalid URI" - lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%") + lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') // TODO: implement @@ -52,6 +52,7 @@ trait Parsers def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } + def some[T](p: Parser[T]): Parser[Option[T]] = p map { v => Some(v) } def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } From 13a0c155dfc41a9d5bcb9fa398450b076a9771c0 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 7 May 2011 22:02:06 -0400 Subject: [PATCH 163/823] support extra axis for streams --- util/collection/Attributes.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 4c3a16873..f275cdbe0 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -72,9 +72,9 @@ private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) } // type inference required less generality -final case class AttributeEntry[T](a: AttributeKey[T], b: T) +final case class AttributeEntry[T](key: AttributeKey[T], value: T) { - override def toString = a.label + ": " + b + override def toString = key.label + ": " + value } final case class Attributed[D](data: D)(val metadata: AttributeMap) From c53c94c72ab2ef38525e847c785d2fc4ec213817 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 7 May 2011 22:02:06 -0400 Subject: [PATCH 164/823] logging cleanup --- util/log/BasicLogger.scala | 12 ++++---- util/log/BufferedLogger.scala | 57 ++++++++++++++++++----------------- util/log/FullLogger.scala | 12 +++++++- 3 files changed, 47 insertions(+), 34 deletions(-) diff --git a/util/log/BasicLogger.scala 
b/util/log/BasicLogger.scala index f7adb11ce..c58dc57c6 100644 --- a/util/log/BasicLogger.scala +++ b/util/log/BasicLogger.scala @@ -9,10 +9,10 @@ abstract class BasicLogger extends AbstractLogger private var traceEnabledVar = java.lang.Integer.MAX_VALUE private var level: Level.Value = Level.Info private var successEnabledVar = true - def successEnabled = successEnabledVar - def setSuccessEnabled(flag: Boolean) { successEnabledVar = flag } - def getLevel = level - def setLevel(newLevel: Level.Value) { level = newLevel } - def setTrace(level: Int) { traceEnabledVar = level } - def getTrace = traceEnabledVar + def successEnabled = synchronized { successEnabledVar } + def setSuccessEnabled(flag: Boolean): Unit = synchronized { successEnabledVar = flag } + def getLevel = synchronized { level } + def setLevel(newLevel: Level.Value): Unit = synchronized { level = newLevel } + def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level } + def getTrace = synchronized { traceEnabledVar } } \ No newline at end of file diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 73aa7f8e5..37dbbb1df 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -13,21 +13,19 @@ package sbt * * This class assumes that it is the only client of the delegate logger. * */ -class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger +class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { private[this] val buffer = new ListBuffer[LogEvent] private[this] var recording = false /** Enables buffering. 
*/ - def record() = { recording = true } - def buffer[T](f: => T): T = - { + def record() = synchronized { recording = true } + def buffer[T](f: => T): T = synchronized { record() try { f } finally { stopQuietly() } } - def bufferQuietly[T](f: => T): T = - { + def bufferQuietly[T](f: => T): T = synchronized { record() try { @@ -37,33 +35,36 @@ class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger } catch { case e => stopQuietly(); throw e } } - private def stopQuietly() = try { stop() } catch { case e: Exception => () } + def stopQuietly() = synchronized { try { stop() } catch { case e: Exception => () } } /** Flushes the buffer to the delegate logger. This method calls logAll on the delegate * so that the messages are written consecutively. The buffer is cleared in the process. */ - def play() { delegate.logAll(buffer.readOnly); buffer.clear() } + def play(): Unit = synchronized { delegate.logAll(buffer.readOnly); buffer.clear() } /** Clears buffered events and disables buffering. */ - def clear(): Unit = { buffer.clear(); recording = false } + def clear(): Unit = synchronized { buffer.clear(); recording = false } /** Plays buffered events and disables buffering. 
*/ - def stop() { play(); clear() } + def stop(): Unit = synchronized { play(); clear() } - def setLevel(newLevel: Level.Value) - { - buffer += new SetLevel(newLevel) - delegate.setLevel(newLevel) + override def setLevel(newLevel: Level.Value): Unit = synchronized { + super.setLevel(newLevel) + if(recording) + buffer += new SetLevel(newLevel) + else + delegate.setLevel(newLevel) } - def setSuccessEnabled(flag: Boolean) - { - buffer += new SetSuccess(flag) - delegate.setSuccessEnabled(flag) + override def setSuccessEnabled(flag: Boolean): Unit = synchronized { + super.setSuccessEnabled(flag) + if(recording) + buffer += new SetSuccess(flag) + else + delegate.setSuccessEnabled(flag) } - def successEnabled = delegate.successEnabled - def getLevel = delegate.getLevel - def getTrace = delegate.getTrace - def setTrace(level: Int) - { - buffer += new SetTrace(level) - delegate.setTrace(level) + override def setTrace(level: Int): Unit = synchronized { + super.setTrace(level) + if(recording) + buffer += new SetTrace(level) + else + delegate.setTrace(level) } def trace(t: => Throwable): Unit = @@ -72,16 +73,17 @@ class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger doBufferable(Level.Info, new Success(message), _.success(message)) def log(level: Level.Value, message: => String): Unit = doBufferable(level, new Log(level, message), _.log(level, message)) - def logAll(events: Seq[LogEvent]): Unit = + def logAll(events: Seq[LogEvent]): Unit = synchronized { if(recording) buffer ++= events else delegate.logAll(events) + } def control(event: ControlEvent.Value, message: => String): Unit = doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message)) private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = doBufferableIf(atLevel(level), appendIfBuffered, doUnbuffered) - private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger 
=> Unit): Unit = + private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = synchronized { if(condition) { if(recording) @@ -89,4 +91,5 @@ class BufferedLogger(delegate: AbstractLogger) extends AbstractLogger else doUnbuffered(delegate) } + } } \ No newline at end of file diff --git a/util/log/FullLogger.scala b/util/log/FullLogger.scala index e562fdb20..ca88f0b4d 100644 --- a/util/log/FullLogger.scala +++ b/util/log/FullLogger.scala @@ -4,8 +4,9 @@ package sbt /** Promotes the simple Logger interface to the full AbstractLogger interface. */ -class FullLogger(delegate: Logger, override val ansiCodesSupported: Boolean = false) extends BasicLogger +class FullLogger(delegate: Logger) extends BasicLogger { + override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported def trace(t: => Throwable) { if(traceEnabled) @@ -23,3 +24,12 @@ class FullLogger(delegate: Logger, override val ansiCodesSupported: Boolean = fa info(message) def logAll(events: Seq[LogEvent]): Unit = events.foreach(log) } +object FullLogger +{ + def apply(delegate: Logger): AbstractLogger = + delegate match + { + case d: AbstractLogger => d + case _ => new FullLogger(delegate) + } +} \ No newline at end of file From 3cc8c52dea67c7943ced1db7076044e1533931a2 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 17 May 2011 20:09:20 -0400 Subject: [PATCH 165/823] build sxr, api docs and use sbinary 0.4.0 --- cache/Cache.scala | 16 ++++++++-------- cache/SeparatedCache.scala | 7 +++---- cache/tracking/Tracked.scala | 6 ++---- util/log/ConsoleLogger.scala | 17 +++++++++-------- 4 files changed, 22 insertions(+), 24 deletions(-) diff --git a/cache/Cache.scala b/cache/Cache.scala index 21e89abc4..e394a8903 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -3,7 +3,7 @@ */ package sbt -import sbinary.{CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output} +import sbinary.{CollectionTypes, DefaultProtocol, 
Format, Input, JavaFormats, Output => Out} import java.io.{ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream} import java.net.{URI, URL} import Types.:+: @@ -45,7 +45,7 @@ object Cache extends CacheImplicits println(label + ".read: " + v) v } - def write(to: Output, v: Internal) + def write(to: Out, v: Internal) { println(label + ".write: " + v) c.write(to, v) @@ -79,7 +79,7 @@ trait BasicCacheImplicits val isDefined = BooleanFormat.reads(from) if(isDefined) Some(t.read(from)) else None } - def write(to: Output, j: Internal): Unit = + def write(to: Out, j: Internal): Unit = { BooleanFormat.writes(to, j.isDefined) j foreach { x => t.write(to, x) } @@ -129,7 +129,7 @@ trait BasicCacheImplicits if(left <= 0) acc.reverse else next(left - 1, t.read(from) :: acc) next(size, Nil) } - def write(to: Output, vs: Internal) + def write(to: Out, vs: Internal) { val size = vs.length IntFormat.writes(to, size) @@ -157,7 +157,7 @@ trait BasicCacheImplicits type Internal = jCache.Internal def convert(i: I) = jCache.convert(f(i)) def read(from: Input) = jCache.read(from) - def write(to: Output, j: Internal) = jCache.write(to, j) + def write(to: Out, j: Internal) = jCache.write(to, j) def equiv = jCache.equiv } @@ -180,7 +180,7 @@ trait HListCacheImplicits val t = tail.read(from) (h, t) } - def write(to: Output, j: Internal) + def write(to: Out, j: Internal) { head.write(to, j._1) tail.write(to, j._2) @@ -202,7 +202,7 @@ trait HListCacheImplicits val t = tail.reads(from) HCons(h, t) } - def writes(to: Output, hc: H :+: T) + def writes(to: Out, hc: H :+: T) { head.writes(to, hc.head) tail.writes(to, hc.tail) @@ -225,7 +225,7 @@ trait UnionImplicits val value = cache.read(in) new Found[cache.Internal](cache, clazz, value, index) } - def write(to: Output, i: Internal) + def write(to: Out, i: Internal) { def write0[I](f: Found[I]) { diff --git a/cache/SeparatedCache.scala b/cache/SeparatedCache.scala index 59bbbe379..523716ac3 100644 --- 
a/cache/SeparatedCache.scala +++ b/cache/SeparatedCache.scala @@ -4,9 +4,8 @@ package sbt import Types.:+: -import sbinary.{DefaultProtocol, Format, Input, JavaIO, Output} +import sbinary.{DefaultProtocol, Format, Input, Output => Out} import DefaultProtocol.ByteFormat -import JavaIO._ import java.io.{File, InputStream, OutputStream} trait InputCache[I] @@ -14,7 +13,7 @@ trait InputCache[I] type Internal def convert(i: I): Internal def read(from: Input): Internal - def write(to: Output, j: Internal): Unit + def write(to: Out, j: Internal): Unit def equiv: Equiv[Internal] } object InputCache @@ -25,7 +24,7 @@ object InputCache type Internal = I def convert(i: I) = i def read(from: Input): I = fmt.reads(from) - def write(to: Output, i: I) = fmt.writes(to, i) + def write(to: Out, i: I) = fmt.writes(to, i) def equiv = eqv } } diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 10b37101f..9d2848b73 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -5,7 +5,7 @@ package sbt import java.io.File import CacheIO.{fromFile, toFile} -import sbinary.{Format, JavaIO} +import sbinary.Format import scala.reflect.Manifest import scala.collection.mutable import IO.{delete, read, write} @@ -29,8 +29,6 @@ object Tracked def diffOutputs(cache: File, style: FilesInfo.Style): Difference = Difference.outputs(cache, style) - import sbinary.JavaIO._ - def lastOutput[I,O](cacheFile: File)(f: (I,Option[O]) => O)(implicit o: Format[O], mf: Manifest[Format[O]]): I => O = in => { val previous: Option[O] = fromFile[O](cacheFile) @@ -112,7 +110,7 @@ class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O ifChanged(value) } } - import JavaIO._ + def update(value: O): Unit = Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) def uptodate(value: O): Boolean = try { diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index f7229573e..c1a77e745 100644 --- 
a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -54,16 +54,17 @@ object ConsoleLogger * This logger is not thread-safe.*/ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean) extends BasicLogger { - def messageColor(level: Level.Value) = Console.RESET + import scala.Console.{BLUE, GREEN, RED, RESET, YELLOW} + def messageColor(level: Level.Value) = RESET def labelColor(level: Level.Value) = level match { - case Level.Error => Console.RED - case Level.Warn => Console.YELLOW - case _ => Console.RESET + case Level.Error => RED + case Level.Warn => YELLOW + case _ => RESET } - def successLabelColor = Console.GREEN - def successMessageColor = Console.RESET + def successLabelColor = GREEN + def successMessageColor = RESET override def success(message: => String) { if(successEnabled) @@ -81,7 +82,7 @@ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ans if(atLevel(level)) log(labelColor(level), level.toString, messageColor(level), message) } - private def reset(): Unit = setColor(Console.RESET) + private def reset(): Unit = setColor(RESET) private def setColor(color: String) { @@ -108,7 +109,7 @@ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ans def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } def control(event: ControlEvent.Value, message: => String) - { log(labelColor(Level.Info), Level.Info.toString, Console.BLUE, message) } + { log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } } sealed trait ConsoleOut { From dd5177bc2b17dbc15ea9c77060dab5a688a3ff8e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 23 May 2011 08:13:13 -0400 Subject: [PATCH 166/823] task axis delegation --- util/collection/Attributes.scala | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index f275cdbe0..1025c03df 
100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -11,6 +11,7 @@ import Types._ sealed trait AttributeKey[T] { def label: String def description: Option[String] + def extend: Seq[AttributeKey[_]] override final def toString = label } object AttributeKey @@ -18,10 +19,17 @@ object AttributeKey def apply[T](name: String): AttributeKey[T] = new AttributeKey[T] { def label = name def description = None + def extend = Nil } def apply[T](name: String, description0: String): AttributeKey[T] = new AttributeKey[T] { def label = name def description = Some(description0) + def extend = Nil + } + def apply[T](name: String, description0: String, extend0: Seq[AttributeKey[_]]): AttributeKey[T] = new AttributeKey[T] { + def label = name + def description = Some(description0) + def extend = extend0 } } From e702de0fe3ba3e78ff4a4a438581f356df021f83 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 23 May 2011 18:40:03 -0400 Subject: [PATCH 167/823] fixes #23 --- util/log/BufferedLogger.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 37dbbb1df..50598c936 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -20,12 +20,12 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger /** Enables buffering. 
*/ def record() = synchronized { recording = true } - def buffer[T](f: => T): T = synchronized { + def buffer[T](f: => T): T = { record() try { f } finally { stopQuietly() } } - def bufferQuietly[T](f: => T): T = synchronized { + def bufferQuietly[T](f: => T): T = { record() try { From 9904c165becabd135d08925e8d6cf5655f871486 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 23 May 2011 18:40:03 -0400 Subject: [PATCH 168/823] an annotation can reference a non-simple type, fixes #24 --- interface/other | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/other b/interface/other index 2ee086bcb..78ebbc3c8 100644 --- a/interface/other +++ b/interface/other @@ -36,7 +36,7 @@ TypeParameter upperBound: Type Annotation - base: SimpleType + base: Type arguments: AnnotationArgument* AnnotationArgument name: String From 59ffcac74ae0954d999df59d8769344b5acee1c9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 26 May 2011 08:21:26 -0400 Subject: [PATCH 169/823] back A.Key with Manifest, dropping object equality. fixes #27 type inference restoration pending switch to 2.9.0 --- util/collection/Attributes.scala | 16 +++++++++++++--- 1 file changed, 13 insertions(+), 3 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 1025c03df..2302d326d 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -4,29 +4,39 @@ package sbt import Types._ +import scala.reflect.Manifest // T must be invariant to work properly. 
// Because it is sealed and the only instances go through AttributeKey.apply, // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts sealed trait AttributeKey[T] { + def manifest: Manifest[T] def label: String def description: Option[String] def extend: Seq[AttributeKey[_]] override final def toString = label + override final def hashCode = label.hashCode + override final def equals(o: Any) = (this eq o.asInstanceOf[AnyRef]) || (o match { + case a: AttributeKey[t] => a.label == this.label && a.manifest == this.manifest + case _ => false + }) } object AttributeKey { - def apply[T](name: String): AttributeKey[T] = new AttributeKey[T] { + def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def manifest = mf def label = name def description = None def extend = Nil } - def apply[T](name: String, description0: String): AttributeKey[T] = new AttributeKey[T] { + def apply[T](name: String, description0: String)(implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def manifest = mf def label = name def description = Some(description0) def extend = Nil } - def apply[T](name: String, description0: String, extend0: Seq[AttributeKey[_]]): AttributeKey[T] = new AttributeKey[T] { + def apply[T](name: String, description0: String, extend0: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def manifest = mf def label = name def description = Some(description0) def extend = extend0 From fa90cc7de6379304139b4fe8f30021c3d731f8d9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 26 May 2011 22:24:26 -0400 Subject: [PATCH 170/823] forgot a test --- util/collection/src/test/scala/KeyTest.scala | 35 ++++++++++++++++++++ 1 file changed, 35 insertions(+) create mode 100644 util/collection/src/test/scala/KeyTest.scala diff --git a/util/collection/src/test/scala/KeyTest.scala b/util/collection/src/test/scala/KeyTest.scala new file mode 100644 index 
000000000..9ac4f86bb --- /dev/null +++ b/util/collection/src/test/scala/KeyTest.scala @@ -0,0 +1,35 @@ +package sbt + + import org.scalacheck._ + import Prop._ + +object KeyTest extends Properties("AttributeKey") +{ + property("equality") = { + compare(AttributeKey[Int]("test"), AttributeKey[Int]("test"), true) && + compare(AttributeKey[Int]("test"), AttributeKey[Int]("test", "description"), true) && + compare(AttributeKey[Int]("test", "a"), AttributeKey[Int]("test", "b"), true) && + compare(AttributeKey[Int]("test"), AttributeKey[Int]("tests"), false) && + compare(AttributeKey[Int]("test"), AttributeKey[Double]("test"), false) && + compare(AttributeKey[java.lang.Integer]("test"), AttributeKey[Int]("test"), false) && + compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, String]]("test"), true) && + compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, _]]("test"), false) + } + + def compare(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = + ("a.label: " + a.label) |: + ("a.manifest: " + a.manifest) |: + ("b.label: " + b.label) |: + ("b.manifest: " + b.manifest) |: + ("expected equal? 
" + same) |: + compare0(a, b, same) + + def compare0(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = + if(same) + { + ("equality" |: (a == b)) && + ("hash" |: (a.hashCode == b.hashCode)) + } + else + ("equality" |: (a != b)) +} \ No newline at end of file From f0608da0a82fc5eef9883714a7b455638dfcf11c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 29 May 2011 19:17:31 -0400 Subject: [PATCH 171/823] more release-worthy compile message and analysis toString --- util/collection/Util.scala | 9 ++++++++- util/complete/Parser.scala | 2 +- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/util/collection/Util.scala b/util/collection/Util.scala index a3bf330aa..cceb1ec99 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -3,7 +3,7 @@ */ package sbt -object Collections +object Util { def separate[T,A,B](ps: Seq[T])(f: T => Either[A,B]): (Seq[A], Seq[B]) = ((Nil: Seq[A], Nil: Seq[B]) /: ps)( (xs, y) => prependEither(xs, f(y)) ) @@ -14,4 +14,11 @@ object Collections case Left(l) => (l +: acc._1, acc._2) case Right(r) => (acc._1, r +: acc._2) } + def counted(prefix: String, single: String, plural: String, count: Int): Option[String] = + count match + { + case 0 => None + case 1 => Some("1 " + prefix + single) + case x => Some(x.toString + " " + prefix + plural) + } } \ No newline at end of file diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 45067043b..6eb9c135b 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -5,7 +5,7 @@ package sbt.complete import Parser._ import sbt.Types.{left, right, some} - import sbt.Collections.separate + import sbt.Util.separate sealed trait Parser[+T] { From d1ad850a1227971aa03066bb40c67fc4c6d1b812 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 30 May 2011 22:10:01 -0400 Subject: [PATCH 172/823] error handling adjustments, including showing failing task in red (for #29) --- util/control/ErrorHandling.scala | 9 +++++++++ 1 file changed, 9 
insertions(+) diff --git a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala index 4072edff3..a346e0d14 100644 --- a/util/control/ErrorHandling.scala +++ b/util/control/ErrorHandling.scala @@ -21,6 +21,15 @@ object ErrorHandling def convert[T](f: => T): Either[Exception, T] = try { Right(f) } catch { case e: Exception => Left(e) } + + def reducedToString(e: Throwable): String = + if(e.getClass == classOf[RuntimeException]) + { + val msg = e.getMessage + if(msg == null || msg.isEmpty) e.toString else msg + } + else + e.toString } final class TranslatedException private[sbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) { From cb2c37afa076957235b6cc36bc8ad26b2b30bcac Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 31 May 2011 18:37:07 -0400 Subject: [PATCH 173/823] rearrange products settings 1. enables exporting jar to classpath instead of class directory 2. starts to make post-processing class files easier --- util/collection/Attributes.scala | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 2302d326d..980cbc9ec 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -98,9 +98,10 @@ final case class AttributeEntry[T](key: AttributeKey[T], value: T) final case class Attributed[D](data: D)(val metadata: AttributeMap) { def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) + def map[T](f: D => T): Attributed[T] = Attributed(f(data))(metadata) } object Attributed { - implicit def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank - implicit def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) + def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank + def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) } \ No newline at end of file From a152f157f5ba38772d09a2c8c8f709442cac666b Mon Sep 17 
00:00:00 2001 From: Mark Harrah Date: Wed, 1 Jun 2011 02:19:46 -0400 Subject: [PATCH 174/823] implement shortcut for API equality checking, fixes #18 --- interface/other | 9 +++++++++ interface/src/main/java/xsbti/AnalysisCallback.java | 2 +- interface/src/test/scala/TestCallback.scala | 4 ++-- 3 files changed, 12 insertions(+), 3 deletions(-) diff --git a/interface/other b/interface/other index 78ebbc3c8..64b09422d 100644 --- a/interface/other +++ b/interface/other @@ -1,7 +1,16 @@ Source + compilation: Compilation + hash: Byte* + api: SourceAPI + +SourceAPI packages : Package* definitions: Definition* +Compilation + startTime: Long + target: String + Package name: String diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index d3eb2ab54..efca555c4 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -23,5 +23,5 @@ public interface AnalysisCallback /** Called after the source at the given location has been processed. */ public void endSource(File sourcePath); /** Called when the public API of a source file is extracted. 
*/ - public void api(File sourceFile, xsbti.api.Source source); + public void api(File sourceFile, xsbti.api.SourceAPI source); } \ No newline at end of file diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index d554a9a08..c49e11317 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -10,7 +10,7 @@ class TestCallback extends AnalysisCallback val sourceDependencies = new ArrayBuffer[(File, File)] val binaryDependencies = new ArrayBuffer[(File, String, File)] val products = new ArrayBuffer[(File, File)] - val apis = new ArrayBuffer[(File, xsbti.api.Source)] + val apis = new ArrayBuffer[(File, xsbti.api.SourceAPI)] def beginSource(source: File) { beganSources += source } @@ -19,5 +19,5 @@ class TestCallback extends AnalysisCallback def generatedClass(source: File, module: File) { products += ((source, module)) } def endSource(source: File) { endedSources += source } - def api(source: File, sourceAPI: xsbti.api.Source) { apis += ((source, sourceAPI)) } + def api(source: File, sourceAPI: xsbti.api.SourceAPI) { apis += ((source, sourceAPI)) } } \ No newline at end of file From 8c89a8b13765b6855e864a171166d598f3d53c02 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 10 Jun 2011 07:48:53 -0400 Subject: [PATCH 175/823] honor formatEnabled setting, fixes #48 --- util/log/ConsoleLogger.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index c1a77e745..34c8a9575 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -44,7 +44,7 @@ object ConsoleLogger def apply(): ConsoleLogger = apply(systemOut) def apply(out: PrintStream): ConsoleLogger = apply(printStreamOut(out)) def apply(out: PrintWriter): ConsoleLogger = apply(printWriterOut(out)) - def apply(out: ConsoleOut, ansiCodesSupported: Boolean = formatEnabled, useColor: Boolean = true): ConsoleLogger = + 
def apply(out: ConsoleOut, ansiCodesSupported: Boolean = formatEnabled, useColor: Boolean = formatEnabled): ConsoleLogger = new ConsoleLogger(out, ansiCodesSupported, useColor) } From eec68ee8dae398cf6b2b1db99f00c5fe726b1ffe Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 17 Jun 2011 18:03:59 -0400 Subject: [PATCH 176/823] minor changes to parsers --- util/complete/Parser.scala | 4 ++-- util/complete/Parsers.scala | 1 + 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 6eb9c135b..82b00db66 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -243,8 +243,8 @@ trait ParserMain def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) } - implicit def literalRichParser(c: Char): RichParser[Char] = richParser(c) - implicit def literalRichParser(s: String): RichParser[String] = richParser(s) + implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) + implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s) def invalid(msgs: => Seq[String]): Parser[Nothing] = Invalid(mkFailures(msgs)) def failure(msg: => String): Parser[Nothing] = invalid(msg :: Nil) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index c6f157827..93ab9b9ac 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -46,6 +46,7 @@ trait Parsers lazy val NatBasic = mapOrFail( Digit.+ )( _.mkString.toInt ) private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = (neg.toSeq ++ digits).mkString.toInt + lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = rep1sep(rep, sep) ?? 
Nil From 77d8bf8a57503572c960c80a5acaf517c896616b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 20 Jun 2011 15:25:23 -0400 Subject: [PATCH 177/823] lazy InputCache for recursive caches --- cache/SeparatedCache.scala | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/cache/SeparatedCache.scala b/cache/SeparatedCache.scala index 523716ac3..a126229bd 100644 --- a/cache/SeparatedCache.scala +++ b/cache/SeparatedCache.scala @@ -27,6 +27,16 @@ object InputCache def write(to: Out, i: I) = fmt.writes(to, i) def equiv = eqv } + def lzy[I](mkIn: => InputCache[I]): InputCache[I] = + new InputCache[I] + { + lazy val ic = mkIn + type Internal = ic.Internal + def convert(i: I) = ic convert i + def read(from: Input): ic.Internal = ic.read(from) + def write(to: Out, i: ic.Internal) = ic.write(to, i) + def equiv = ic.equiv + } } class BasicCache[I,O](implicit input: InputCache[I], outFormat: Format[O]) extends Cache[I,O] From 2b6d5c1316e13bd47170cfe726889bc4926bc4a4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 22 Jun 2011 19:17:10 -0400 Subject: [PATCH 178/823] add extraLoggers to make it easier to add loggers --- util/log/Level.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/log/Level.scala b/util/log/Level.scala index 62fb5f2c2..f501cd40c 100644 --- a/util/log/Level.scala +++ b/util/log/Level.scala @@ -16,6 +16,7 @@ object Level extends Enumeration val SuccessLabel = "success" def union(a: Value, b: Value) = if(a.id < b.id) a else b + def unionAll(vs: Seq[Value]) = vs reduceLeft union /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. 
*/ def apply(s: String) = values.find(s == _.toString) From 5c8d619880a037bc75b40296b940214135722534 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 26 Jun 2011 12:27:06 -0400 Subject: [PATCH 179/823] apply javac log level approach to directJavac --- util/log/LoggerWriter.scala | 17 ++++++++++++++--- 1 file changed, 14 insertions(+), 3 deletions(-) diff --git a/util/log/LoggerWriter.scala b/util/log/LoggerWriter.scala index 81c0d89d0..aeb67ce72 100644 --- a/util/log/LoggerWriter.scala +++ b/util/log/LoggerWriter.scala @@ -5,11 +5,13 @@ package sbt /** Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. * A line is delimited by `nl`, which is by default the platform line separator.*/ -class LoggerWriter(delegate: Logger, level: Level.Value, nl: String) extends java.io.Writer +class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: String = System.getProperty("line.separator")) extends java.io.Writer { - def this(delegate: Logger, level: Level.Value) = this(delegate, level, System.getProperty("line.separator")) + def this(delegate: Logger, level: Level.Value) = this(delegate, Some(level)) + def this(delegate: Logger) = this(delegate, None) private[this] val buffer = new StringBuilder + private[this] val lines = new collection.mutable.ListBuffer[String] override def close() = flush() override def flush(): Unit = @@ -20,6 +22,12 @@ class LoggerWriter(delegate: Logger, level: Level.Value, nl: String) extends jav buffer.clear() } } + def flushLines(level: Level.Value): Unit = + synchronized { + for(line <- lines) + delegate.log(level, line) + lines.clear() + } override def write(content: Array[Char], offset: Int, length: Int): Unit = synchronized { buffer.appendAll(content, offset, length) @@ -36,5 +44,8 @@ class LoggerWriter(delegate: Logger, level: Level.Value, nl: String) extends jav process() } } - private[this] def log(s: String): Unit = delegate.log(level, s) + private[this] def log(s: 
String): Unit = unbufferedLevel match { + case None => lines += s + case Some(level) => delegate.log(level, s) + } } \ No newline at end of file From 9578ed3db0f8e05a46cb23f3015d77bf59ed1999 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 26 Jun 2011 12:27:06 -0400 Subject: [PATCH 180/823] move locks test to scripted tests --- util/log/src/test/scala/LogWriterTest.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/log/src/test/scala/LogWriterTest.scala b/util/log/src/test/scala/LogWriterTest.scala index 6d01341a0..95736d524 100644 --- a/util/log/src/test/scala/LogWriterTest.scala +++ b/util/log/src/test/scala/LogWriterTest.scala @@ -20,7 +20,7 @@ object LogWriterTest extends Properties("Log Writer") property("properly logged") = forAll { (output: Output, newLine: NewLine) => import output.{lines, level} val log = new RecordingLogger - val writer = new LoggerWriter(log, level, newLine.str) + val writer = new LoggerWriter(log, Some(level), newLine.str) logLines(writer, lines, newLine.str) val events = log.getEvents ("Recorded:\n" + events.map(show).mkString("\n")) |: From c25c92da401e7beec06d33423b9350b20eb62c5e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 29 Jun 2011 21:44:37 -0400 Subject: [PATCH 181/823] add missing Attributed.get method --- util/collection/Attributes.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 980cbc9ec..6474fefb3 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -97,6 +97,7 @@ final case class AttributeEntry[T](key: AttributeKey[T], value: T) final case class Attributed[D](data: D)(val metadata: AttributeMap) { + def get[T](key: AttributeKey[T]): Option[T] = metadata.get(key) def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) def map[T](f: D => T): Attributed[T] = Attributed(f(data))(metadata) } From 
a612cc0ba132d24925c03dca391bfbefa1925c39 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 29 Jun 2011 21:44:55 -0400 Subject: [PATCH 182/823] settings example and simple test --- .../src/test/scala/SettingsExample.scala | 86 +++++++++++++++++++ .../src/test/scala/SettingsTest.scala | 23 +++++ 2 files changed, 109 insertions(+) create mode 100644 util/collection/src/test/scala/SettingsExample.scala create mode 100644 util/collection/src/test/scala/SettingsTest.scala diff --git a/util/collection/src/test/scala/SettingsExample.scala b/util/collection/src/test/scala/SettingsExample.scala new file mode 100644 index 000000000..38c09d634 --- /dev/null +++ b/util/collection/src/test/scala/SettingsExample.scala @@ -0,0 +1,86 @@ +package sbt + +/** Define our settings system */ + +// A basic scope indexed by an integer. +final case class Scope(index: Int) + +// Extend the Init trait. +// (It is done this way because the Scope type parameter is used everywhere in Init. +// Lots of type constructors would become binary, which as you may know requires lots of type lambdas +// when you want a type function with only one parameter. +// That would be a general pain.) +object SettingsExample extends Init[Scope] +{ + // This is the only abstract method, providing a way of showing a Scope+AttributeKey[_] + override def display(key: ScopedKey[_]): String = + key.scope.index + "/" + key.key.label + + // A sample delegation function that delegates to a Scope with a lower index. + val delegates: Scope => Seq[Scope] = { case s @ Scope(index) => + s +: (if(index <= 0) Nil else delegates(Scope(index-1)) ) + } + + // Not using this feature in this example. + val scopeLocal: ScopeLocal = _ => Nil + + // These three functions + a scope (here, Scope) are sufficient for defining our settings system. 
+} + +/** Usage Example **/ + +object SettingsUsage +{ + import SettingsExample._ + import Types._ + + // Define some keys + val a = AttributeKey[Int]("a") + val b = AttributeKey[Int]("b") + + // Scope these keys + val a3 = ScopedKey(Scope(3), a) + val a4 = ScopedKey(Scope(4), a) + val a5 = ScopedKey(Scope(5), a) + + val b4 = ScopedKey(Scope(4), b) + + // Define some settings + val mySettings: Seq[Setting[_]] = Seq( + setting( a3, value( 3 ) ), + setting( b4, app(a4 :^: KNil) { case av :+: HNil => av * 3 } ), + update(a5)(_ + 1) + ) + + // "compiles" and applies the settings. + // This can be split into multiple steps to access intermediate results if desired. + // The 'inspect' command operates on the output of 'compile', for example. + val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal) + + // Show results. + for(i <- 0 to 5; k <- Seq(a, b)) { + println( k.label + i + " = " + applied.get( Scope(i), k) ) + } + +/** Output: +* For the None results, we never defined the value and there was no value to delegate to. +* For a3, we explicitly defined it to be 3. +* a4 wasn't defined, so it delegates to a3 according to our delegates function. +* b4 gets the value for a4 (which delegates to a3, so it is 3) and multiplies by 3 +* a5 is defined as the previous value of a5 + 1 and +* since no previous value of a5 was defined, it delegates to a4, resulting in 3+1=4. 
+* b5 isn't defined explicitly, so it delegates to b4 and is therefore equal to 9 as well +a0 = None +b0 = None +a1 = None +b1 = None +a2 = None +b2 = None +a3 = Some(3) +b3 = None +a4 = Some(3) +b4 = Some(9) +a5 = Some(4) +b5 = Some(9) +**/ +} diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala new file mode 100644 index 000000000..8d88bd30d --- /dev/null +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -0,0 +1,23 @@ +package sbt + +import org.scalacheck._ +import Prop._ +import SettingsUsage._ + +object SettingsTest extends Properties("settings") +{ + def tests = + for(i <- 0 to 5; k <- Seq(a, b)) yield { + val value = applied.get( Scope(i), k) + val expected = expectedValues(2*i + (if(k == a) 0 else 1)) + ("Index: " + i) |: + ("Key: " + k.label) |: + ("Value: " + value) |: + ("Expected: " + expected) |: + (value == expected) + } + + property("Basic settings test") = secure( all( tests: _*) ) + + lazy val expectedValues = None :: None :: None :: None :: None :: None :: Some(3) :: None :: Some(3) :: Some(9) :: Some(4) :: Some(9) :: Nil +} \ No newline at end of file From 8d778b72ed3bf4b20e40ea3755b501da0f1e8fcd Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 9 Jul 2011 16:54:41 -0400 Subject: [PATCH 183/823] part II of fix for #90 --- util/log/BufferedLogger.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala index 50598c936..2e04b81f2 100644 --- a/util/log/BufferedLogger.scala +++ b/util/log/BufferedLogger.scala @@ -45,6 +45,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger /** Plays buffered events and disables buffering. 
*/ def stop(): Unit = synchronized { play(); clear() } + override def ansiCodesSupported = delegate.ansiCodesSupported override def setLevel(newLevel: Level.Value): Unit = synchronized { super.setLevel(newLevel) if(recording) From a6f7e9840c1227199d51dbf6ac00314f1cf051fc Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 9 Jul 2011 16:54:41 -0400 Subject: [PATCH 184/823] global settings preparation: separate compilation/loading stages of Eval --- util/collection/TypeFunctions.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 8f542fb99..185c72226 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -15,6 +15,7 @@ trait TypeFunctions final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } final def idFun[T] = (t: T) => t + final def const[A,B](b: B): A=> B = _ => b def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. castless version: From b272920ef3c46f4284179a6a00320a56dad5ae73 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 12 Jul 2011 07:47:31 -0400 Subject: [PATCH 185/823] clean up whitespace handling in commands. 
fixes #97 --- util/complete/Parsers.scala | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 93ab9b9ac..4a82ab482 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -11,7 +11,7 @@ package sbt.complete // Some predefined parsers trait Parsers { - lazy val any: Parser[Char] = charClass(_ => true) + lazy val any: Parser[Char] = charClass(_ => true, "any character") lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet @@ -23,6 +23,14 @@ trait Parsers lazy val Op = OpChar.+.string lazy val OpOrID = ID | Op + def opOrIDSpaced(s: String): Parser[Char] = + if(DefaultParsers.matches(ID, s)) + OpChar | SpaceClass + else if(DefaultParsers.matches(Op, s)) + IDStart | SpaceClass + else + any + def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } def isIDChar(c: Char) = c.isLetterOrDigit || c == '-' || c == '_' From 2c3fc0abd6a09cb388742fc21fff47a6a663a93e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 18 Jul 2011 17:14:22 -0400 Subject: [PATCH 186/823] support incremental recompilation when using exportJars. fixes #108 --- interface/src/main/java/xsbti/AnalysisCallback.java | 4 ++-- interface/src/test/scala/TestCallback.scala | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index efca555c4..53a33253b 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -18,8 +18,8 @@ public interface AnalysisCallback * class named name from class or jar file binary. 
*/ public void binaryDependency(File binary, String name, File source); /** Called to indicate that the source file source produces a class file at - * module.*/ - public void generatedClass(File source, File module); + * module contain class name.*/ + public void generatedClass(File source, File module, String name); /** Called after the source at the given location has been processed. */ public void endSource(File sourcePath); /** Called when the public API of a source file is extracted. */ diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index c49e11317..6621a40ef 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -9,14 +9,14 @@ class TestCallback extends AnalysisCallback val endedSources = new ArrayBuffer[File] val sourceDependencies = new ArrayBuffer[(File, File)] val binaryDependencies = new ArrayBuffer[(File, String, File)] - val products = new ArrayBuffer[(File, File)] + val products = new ArrayBuffer[(File, File, String)] val apis = new ArrayBuffer[(File, xsbti.api.SourceAPI)] def beginSource(source: File) { beganSources += source } def sourceDependency(dependsOn: File, source: File) { sourceDependencies += ((dependsOn, source)) } def binaryDependency(binary: File, name: String, source: File) { binaryDependencies += ((binary, name, source)) } - def generatedClass(source: File, module: File) { products += ((source, module)) } + def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } def endSource(source: File) { endedSources += source } def api(source: File, sourceAPI: xsbti.api.SourceAPI) { apis += ((source, sourceAPI)) } From a6dd6b07b5ef9cd7ca3ea7c4b944befa55865e82 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 19 Jul 2011 21:29:05 -0400 Subject: [PATCH 187/823] proper resolvedScoped implementation --- util/collection/Reduced.scala | 75 ++++++++++++++++++++++++++++++++++ 
util/collection/Settings.scala | 21 ++++++++++ 2 files changed, 96 insertions(+) create mode 100644 util/collection/Reduced.scala diff --git a/util/collection/Reduced.scala b/util/collection/Reduced.scala new file mode 100644 index 000000000..59c9a8193 --- /dev/null +++ b/util/collection/Reduced.scala @@ -0,0 +1,75 @@ +package sbt + + import Types._ + +final case class ReducedH[K[_], HL <: HList, HLk <: HList](keys: KList[K, HLk], expand: HLk => HL) +{ + def prepend[T](key: K[T])(g: K ~> Option) = + g(key) match + { + case None => ReducedH[K, T :+: HL, T :+: HLk]( KCons(key, keys), { case v :+: hli => HCons(v, expand(hli)) }) + case Some(v) => ReducedH[K, T :+: HL, HLk](keys, hli => HCons(v, expand(hli))) + } + def combine[T](f: (KList[K, HLk], HLk => HL) => T): T = f(keys, expand) +} +final case class ReducedK[K[_], M[_], HL <: HList, HLk <: HList](keys: KList[(K ∙ M)#l, HLk], expand: KList[M, HLk] => KList[M, HL]) +{ + def prepend[T](key: K[M[T]])(g: (K ∙ M)#l ~> (Option ∙ M)#l): ReducedK[K, M, T :+: HL, _ <: HList] = + g(key) match + { + case None => ReducedK[K, M, T :+: HL, T :+: HLk]( KCons[T, HLk, (K ∙ M)#l](key, keys), { case KCons(v, hli) => KCons(v, expand(hli)) }) + case Some(v) => ReducedK[K, M, T :+: HL, HLk](keys, hli => KCons(v, expand(hli)) ) + } + def combine[T](f: (KList[(K ∙ M)#l, HLk], KList[M, HLk] => KList[M, HL]) => T): T = f(keys, expand) +} +final case class ReducedSeq[K[_], T](keys: Seq[K[T]], expand: Seq[T] => Seq[T]) +{ + def prepend(key: K[T])(g: K ~> Option) = + g(key) match + { + case None => ReducedSeq[K, T](key +: keys, { case Seq(x, xs @ _*) => x +: expand(xs) }) + case Some(v) => ReducedSeq[K, T](keys, xs => v +: expand(xs)) + } +} +object Reduced +{ + def reduceK[HL <: HList, K[_], M[_]](keys: KList[(K ∙ M)#l, HL], g: (K ∙ M)#l ~> (Option ∙ M)#l): ReducedK[K, M, HL, _ <: HList] = + { + type RedK[HL <: HList] = ReducedK[K, M, HL, _ <: HList] + keys.foldr[RedK,(K ∙ M)#l] { new KFold[(K ∙ M)#l, RedK] { + def knil = emptyK + def 
kcons[H,T<:HList](h: K[M[H]], acc: RedK[T]): RedK[H :+: T] = + acc.prepend(h)(g) + }} + } + def reduceH[HL <: HList, K[_]](keys: KList[K, HL], g: K ~> Option): ReducedH[K, HL, _ <: HList] = + { + type RedH[HL <: HList] = ReducedH[K, HL, _ <: HList] + keys.foldr { new KFold[K, RedH] { + def knil = emptyH + def kcons[H,T<:HList](h: K[H], acc: RedH[T]): RedH[H :+: T] = + acc.prepend(h)(g) + }} + } + def reduceSeq[K[_], T](keys: Seq[K[T]], g: K ~> Option): ReducedSeq[K, T] = (ReducedSeq[K, T](Nil, idFun) /: keys) { (red, key) => red.prepend(key)(g) } + def emptyH[K[_]] = ReducedH[K, HNil, HNil](KNil, idFun) + def emptyK[K[_], M[_]] = ReducedK[K, M, HNil, HNil](KNil, idFun) +} +/* + +def mapConstant(inputs: Seq[ScopedKey[S]], g: ScopedKey ~> Option) = + split(Nil, inputs.head, inputs.tails, g) match + { + None => new Uniform(f, inputs) + Some((left, x, Nil)) => new Uniform(in => f(in :+ x), left) + Some((Nil, x, right)) => new Uniform(in => f(x +: in), right) + Some((left, x, right)) => new Joined(uniformID(left), mapConstant(right), (l,r) => (l ++ (x +: r)) ) + } +def split[S, M[_]](acc: List[M[S]], head: M[S], tail: List[M[S]], f: M ~> Option): Option[(Seq[M[S]], S, Seq[M[S]])] = + (f(head), tail) match + { + case (None, Nil) => None + case (None, x :: xs) => split( head :: acc, x, xs, f) + case (Some(v), ts) => Some( (acc.reverse, v, ts) ) + } +*/ \ No newline at end of file diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 60694120b..c9042d9dc 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -50,6 +50,7 @@ trait Init[Scope] type CompiledMap = Map[ScopedKey[_], Compiled] type MapScoped = ScopedKey ~> ScopedKey type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] + type MapConstant = ScopedKey ~> Option def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) def value[T](value: => T): Initialize[T] = new Value(value _) @@ -161,6 +162,7 @@ trait Init[Scope] def 
mapReferenced(g: MapScoped): Initialize[T] def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipWith(o)((x,y) => (x,y)) def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = new Joined[T,S,U](this, o, f) + def mapConstant(g: MapConstant): Initialize[T] def get(map: Settings[Scope]): T } object Initialize @@ -190,6 +192,7 @@ trait Init[Scope] def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init.map(t => f(key,t))) + def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g) override def toString = "setting(" + key + ")" } @@ -199,6 +202,11 @@ trait Init[Scope] def map[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) def get(map: Settings[Scope]): T = f(a map asFunction(map)) def mapReferenced(g: MapScoped) = new Optional(mapKey(g), f) + def mapConstant(g: MapConstant): Initialize[T] = + (a flatMap g.fn) match { + case None => this + case s => new Value(() => f(s)) + } private[this] def mapKey(g: MapScoped) = try { a map g.fn } catch { case _: Uninitialized => None } } private[this] final class Joined[S,T,U](a: Initialize[S], b: Initialize[T], f: (S,T) => U) extends Initialize[U] @@ -206,6 +214,7 @@ trait Init[Scope] def dependsOn = a.dependsOn ++ b.dependsOn def mapReferenced(g: MapScoped) = new Joined(a mapReferenced g, b mapReferenced g, f) def map[Z](g: U => Z) = new Joined[S,T,Z](a, b, (s,t) => g(f(s,t))) + def mapConstant(g: MapConstant) = new Joined[S,T,U](a mapConstant g, b mapConstant g, f) def get(map: Settings[Scope]): U = f(a get map, b get map) } private[this] final class Value[T](value: () => T) extends Initialize[T] @@ -213,6 +222,7 @@ trait Init[Scope] def dependsOn = Nil def mapReferenced(g: MapScoped) = this def map[S](g: T => S) = new Value[S](() => g(value())) + def mapConstant(g: MapConstant) = this def get(map: 
Settings[Scope]): T = value() } private[this] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[ScopedKey, HL]) extends Initialize[T] @@ -220,6 +230,7 @@ trait Init[Scope] def dependsOn = inputs.toList def mapReferenced(g: MapScoped) = new Apply(f, inputs transform g) def map[S](g: T => S) = new Apply(g compose f, inputs) + def mapConstant(g: MapConstant) = Reduced.reduceH(inputs, g).combine( (keys, expand) => new Apply(f compose expand, keys) ) def get(map: Settings[Scope]) = f(inputs down asTransform(map) ) } private[this] final class KApply[HL <: HList, M[_], T](val f: KList[M, HL] => T, val inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL]) extends Initialize[T] @@ -228,6 +239,11 @@ trait Init[Scope] def mapReferenced(g: MapScoped) = new KApply[HL, M, T](f, inputs.transform[({type l[t] = ScopedKey[M[t]]})#l]( nestCon(g) ) ) def map[S](g: T => S) = new KApply[HL, M, S](g compose f, inputs) def get(map: Settings[Scope]) = f(inputs.transform[M]( nestCon[ScopedKey, Id, M](asTransform(map)) )) + def mapConstant(g: MapConstant) = + { + def mk[HLk <: HList](keys: KList[(ScopedKey ∙ M)#l, HLk], expand: KList[M, HLk] => KList[M, HL]) = new KApply[HLk, M, T](f compose expand, keys) + Reduced.reduceK[HL, ScopedKey, M](inputs, nestCon(g)) combine mk + } private[this] def unnest(l: List[ScopedKey[M[T]] forSome { type T }]): List[ScopedKey[_]] = l.asInstanceOf[List[ScopedKey[_]]] } private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[ScopedKey[S]]) extends Initialize[T] @@ -235,6 +251,11 @@ trait Init[Scope] def dependsOn = inputs def mapReferenced(g: MapScoped) = new Uniform(f, inputs map g.fn[S]) def map[S](g: T => S) = new Uniform(g compose f, inputs) + def mapConstant(g: MapConstant) = + { + val red = Reduced.reduceSeq(inputs, g) + new Uniform(f compose red.expand, red.keys) + } def get(map: Settings[Scope]) = f(inputs map asFunction(map)) } private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) From 
f2328e164ed7fd869d0c63e0df0df516d1abc852 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 24 Jul 2011 22:35:27 -0400 Subject: [PATCH 188/823] add ability to hide a token until it is explicitly started --- util/complete/Parser.scala | 16 +++++++++------- 1 file changed, 9 insertions(+), 7 deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 82b00db66..f2545ebe4 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -329,11 +329,12 @@ trait ParserMain success(seen.mkString) } - def token[T](t: Parser[T]): Parser[T] = token(t, "", true) - def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false) - def token[T](t: Parser[T], seen: String, track: Boolean): Parser[T] = + def token[T](t: Parser[T]): Parser[T] = token(t, "", true, false) + def token[T](t: Parser[T], hide: Boolean): Parser[T] = token(t, "", true, hide) + def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false, false) + def token[T](t: Parser[T], seen: String, track: Boolean, hide: Boolean): Parser[T] = if(t.valid && !t.isTokenStart) - if(t.result.isEmpty) new TokenStart(t, seen, track) else t + if(t.result.isEmpty) new TokenStart(t, seen, track, hide) else t else t @@ -489,11 +490,12 @@ private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], part override def isTokenStart = delegate.isTokenStart override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" } -private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean) extends ValidParser[T] +private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean, hide: Boolean) extends ValidParser[T] { - def derive(c: Char) = token( delegate derive c, if(track) seen + c else seen, track) + def derive(c: Char) = token( delegate derive c, if(track) seen + c else seen, track, hide) lazy val completions = - if(track) + if(hide) Completions.nil + else if(track) 
{ val dcs = delegate.completions Completions( for(c <- dcs.get) yield Completion.token(seen, c.append) ) From 9c70e479d80877b3165ed815c825ae4b285a9abb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Jul 2011 21:59:22 -0400 Subject: [PATCH 189/823] display all undefined settings at once --- util/collection/Settings.scala | 76 ++++++++++++++++++++++++++++------ 1 file changed, 63 insertions(+), 13 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index c9042d9dc..88735ed81 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -49,6 +49,8 @@ trait Init[Scope] type ScopedMap = IMap[ScopedKey, SettingSeq] type CompiledMap = Map[ScopedKey[_], Compiled] type MapScoped = ScopedKey ~> ScopedKey + type ValidatedRef[T] = Either[Undefined, ScopedKey[T]] + type ValidateRef = ScopedKey ~> ValidatedRef type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey ~> Option @@ -116,23 +118,30 @@ trait Init[Scope] def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = { - def refMap(refKey: ScopedKey[_], isFirst: Boolean) = new (ScopedKey ~> ScopedKey) { def apply[T](k: ScopedKey[T]) = + def refMap(refKey: ScopedKey[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst) } + val undefineds = new collection.mutable.ListBuffer[Undefined] val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = - ks.zipWithIndex.map{ case (s,i) => s mapReferenced refMap(s.key, i == 0) } + ks.zipWithIndex.map{ case (s,i) => + (s validateReferenced refMap(s.key, i == 0) ) match { + case Right(v) => v + case Left(l) => undefineds ++= l; s + } + } } - sMap mapValues f + val result = sMap mapValues f + if(undefineds.isEmpty) result else throw Uninitialized(undefineds.toList) } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: 
ScopedKey[_], isFirst: Boolean): ScopedKey[T] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = { - def resolve(search: Seq[Scope]): ScopedKey[T] = + def resolve(search: Seq[Scope]): Either[Undefined, ScopedKey[T]] = search match { - case Seq() => throw Uninitialized(k, refKey) + case Seq() => Left(Undefined(refKey, k)) case Seq(x, xs @ _*) => val sk = ScopedKey(x, k.key) val definesKey = (refKey != sk || !isFirst) && (sMap contains sk) - if(definesKey) sk else resolve(xs) + if(definesKey) Right(sk) else resolve(xs) } resolve(scopes) } @@ -147,9 +156,17 @@ trait Init[Scope] map.set(key.scope, key.key, value) } - final class Uninitialized(val key: ScopedKey[_], val refKey: ScopedKey[_], msg: String) extends Exception(msg) - def Uninitialized(key: ScopedKey[_], refKey: ScopedKey[_]): Uninitialized = - new Uninitialized(key, refKey, "Reference to uninitialized setting " + display(key) + " from " + display(refKey)) + final class Uninitialized(val undefined: Seq[Undefined], msg: String) extends Exception(msg) + final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) + def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(definingKey, referencedKey) + def Uninitialized(keys: Seq[Undefined]): Uninitialized = + { + assert(!keys.isEmpty) + val keyStrings = keys map { u => display(u.referencedKey) + " from " + display(u.definingKey) } + val suffix = if(keyStrings.length > 1) "s" else "" + val keysString = keyStrings.mkString("\n\t", "\n\t", "") + new Uninitialized(keys, "Reference" + suffix + " to undefined setting" + suffix + ": " + keysString) + } final class Compiled(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) { override def toString = display(key) @@ -160,6 +177,7 @@ trait Init[Scope] def dependsOn: Seq[ScopedKey[_]] 
def map[S](g: T => S): Initialize[S] def mapReferenced(g: MapScoped): Initialize[T] + def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Initialize[T]] def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipWith(o)((x,y) => (x,y)) def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = new Joined[T,S,U](this, o, f) def mapConstant(g: MapConstant): Initialize[T] @@ -190,6 +208,7 @@ trait Init[Scope] def definitive: Boolean = !init.dependsOn.contains(key) def dependsOn: Seq[ScopedKey[_]] = remove(init.dependsOn, key) def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) + def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI)) def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init.map(t => f(key,t))) def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g) @@ -201,18 +220,23 @@ trait Init[Scope] def dependsOn = a.toList def map[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) def get(map: Settings[Scope]): T = f(a map asFunction(map)) - def mapReferenced(g: MapScoped) = new Optional(mapKey(g), f) + def mapReferenced(g: MapScoped) = new Optional(a map g.fn, f) + def validateReferenced(g: ValidateRef) = Right( new Optional(a flatMap { sk => g(sk).right.toOption }, f) ) def mapConstant(g: MapConstant): Initialize[T] = (a flatMap g.fn) match { case None => this case s => new Value(() => f(s)) } - private[this] def mapKey(g: MapScoped) = try { a map g.fn } catch { case _: Uninitialized => None } } private[this] final class Joined[S,T,U](a: Initialize[S], b: Initialize[T], f: (S,T) => U) extends Initialize[U] { def dependsOn = a.dependsOn ++ b.dependsOn def mapReferenced(g: MapScoped) = new Joined(a mapReferenced g, b mapReferenced g, f) + def validateReferenced(g: ValidateRef) = + (a validateReferenced g, b 
validateReferenced g) match { + case (Right(ak), Right(bk)) => Right( new Joined(ak, bk, f) ) + case (au, bu) => Left( (au.left.toSeq ++ bu.left.toSeq).flatten ) + } def map[Z](g: U => Z) = new Joined[S,T,Z](a, b, (s,t) => g(f(s,t))) def mapConstant(g: MapConstant) = new Joined[S,T,U](a mapConstant g, b mapConstant g, f) def get(map: Settings[Scope]): U = f(a get map, b get map) @@ -221,6 +245,7 @@ trait Init[Scope] { def dependsOn = Nil def mapReferenced(g: MapScoped) = this + def validateReferenced(g: ValidateRef) = Right(this) def map[S](g: T => S) = new Value[S](() => g(value())) def mapConstant(g: MapConstant) = this def get(map: Settings[Scope]): T = value() @@ -232,24 +257,49 @@ trait Init[Scope] def map[S](g: T => S) = new Apply(g compose f, inputs) def mapConstant(g: MapConstant) = Reduced.reduceH(inputs, g).combine( (keys, expand) => new Apply(f compose expand, keys) ) def get(map: Settings[Scope]) = f(inputs down asTransform(map) ) + def validateReferenced(g: ValidateRef) = + { + val tx = inputs.transform(g) + val undefs = tx.toList.flatMap(_.left.toSeq) + val get = new (ValidatedRef ~> ScopedKey) { def apply[T](vr: ValidatedRef[T]) = vr.right.get } + if(undefs.isEmpty) Right(new Apply(f, tx transform get)) else Left(undefs) + } } + private[this] final class KApply[HL <: HList, M[_], T](val f: KList[M, HL] => T, val inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL]) extends Initialize[T] { + type ScopedKeyM[T] = ScopedKey[M[T]] + type VRefM[T] = ValidatedRef[M[T]] def dependsOn = unnest(inputs.toList) def mapReferenced(g: MapScoped) = new KApply[HL, M, T](f, inputs.transform[({type l[t] = ScopedKey[M[t]]})#l]( nestCon(g) ) ) def map[S](g: T => S) = new KApply[HL, M, S](g compose f, inputs) def get(map: Settings[Scope]) = f(inputs.transform[M]( nestCon[ScopedKey, Id, M](asTransform(map)) )) def mapConstant(g: MapConstant) = { - def mk[HLk <: HList](keys: KList[(ScopedKey ∙ M)#l, HLk], expand: KList[M, HLk] => KList[M, HL]) = new KApply[HLk, M, T](f 
compose expand, keys) + def mk[HLk <: HList](keys: KList[ScopedKeyM, HLk], expand: KList[M, HLk] => KList[M, HL]) = new KApply[HLk, M, T](f compose expand, keys) Reduced.reduceK[HL, ScopedKey, M](inputs, nestCon(g)) combine mk } + def validateReferenced(g: ValidateRef) = + { + val tx = inputs.transform[VRefM](nestCon(g)) + val undefs = tx.toList.flatMap(_.left.toSeq) + val get = new (VRefM ~> ScopedKeyM) { def apply[T](vr: ValidatedRef[M[T]]) = vr.right.get } + if(undefs.isEmpty) + Right(new KApply[HL, M, T](f, tx.transform( get ) )) + else + Left(undefs) + } private[this] def unnest(l: List[ScopedKey[M[T]] forSome { type T }]): List[ScopedKey[_]] = l.asInstanceOf[List[ScopedKey[_]]] } private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[ScopedKey[S]]) extends Initialize[T] { def dependsOn = inputs def mapReferenced(g: MapScoped) = new Uniform(f, inputs map g.fn[S]) + def validateReferenced(g: ValidateRef) = + { + val (undefs, ok) = List.separate(inputs map g.fn[S]) + if(undefs.isEmpty) Right( new Uniform(f, ok) ) else Left(undefs) + } def map[S](g: T => S) = new Uniform(g compose f, inputs) def mapConstant(g: MapConstant) = { From 65c1320c6060cb0c41688cdc4234025ef112121b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 27 Jul 2011 21:20:08 -0400 Subject: [PATCH 190/823] for undefined references, suggest the nearest defined scope that is more specific if it exists. 
fixes #135 --- util/collection/Settings.scala | 30 ++++++++++++++++++++++++------ 1 file changed, 24 insertions(+), 6 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 88735ed81..6135173fa 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -131,7 +131,7 @@ trait Init[Scope] } } val result = sMap mapValues f - if(undefineds.isEmpty) result else throw Uninitialized(undefineds.toList) + if(undefineds.isEmpty) result else throw Uninitialized(sMap, delegates, undefineds.toList) } private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = { @@ -156,16 +156,34 @@ trait Init[Scope] map.set(key.scope, key.key, value) } + def showUndefined(u: Undefined, sMap: ScopedMap, delegates: Scope => Seq[Scope]): String = + { + val guessed = guessIntendedScope(sMap, delegates, u.referencedKey) + display(u.referencedKey) + " from " + display(u.definingKey) + guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString + } + + def guessIntendedScope(sMap: ScopedMap, delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] = + { + val distances = sMap.keys.toSeq.flatMap { validKey => refinedDistance(delegates, validKey, key).map( dist => (dist, validKey) ) } + distances.sortBy(_._1).map(_._2).headOption + } + def refinedDistance(delegates: Scope => Seq[Scope], a: ScopedKey[_], b: ScopedKey[_]): Option[Int] = + if(a.key == b.key) + { + val dist = delegates(a.scope).indexOf(b.scope) + if(dist < 0) None else Some(dist) + } + else None + final class Uninitialized(val undefined: Seq[Undefined], msg: String) extends Exception(msg) final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(definingKey, referencedKey) - def Uninitialized(keys: Seq[Undefined]): 
Uninitialized = + def Uninitialized(sMap: ScopedMap, delegates: Scope => Seq[Scope], keys: Seq[Undefined]): Uninitialized = { assert(!keys.isEmpty) - val keyStrings = keys map { u => display(u.referencedKey) + " from " + display(u.definingKey) } - val suffix = if(keyStrings.length > 1) "s" else "" - val keysString = keyStrings.mkString("\n\t", "\n\t", "") - new Uninitialized(keys, "Reference" + suffix + " to undefined setting" + suffix + ": " + keysString) + val suffix = if(keys.length > 1) "s" else "" + val keysString = keys.map(u => showUndefined(u, sMap, delegates)).mkString("\n\n ", "\n\n ", "") + new Uninitialized(keys, "Reference" + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ") } final class Compiled(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) { From 96d46b2c7af10f40ff17f8ed6c93db55188cabee Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 30 Jul 2011 18:11:20 -0400 Subject: [PATCH 191/823] Seq[Setting[_]] <=> SettingsDefinition --- util/collection/Settings.scala | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 6135173fa..418b5b737 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -216,6 +216,10 @@ trait Init[Scope] case Seq(x, xs @ _*) => (join(xs) zipWith x)( (t,h) => h +: t) } } + object SettingsDefinition { + implicit def unwrapSettingsDefinition(d: SettingsDefinition): Seq[Setting[_]] = d.settings + implicit def wrapSettingsDefinition(ss: Seq[Setting[_]]): SettingsDefinition = new SettingList(ss) + } sealed trait SettingsDefinition { def settings: Seq[Setting[_]] } From baea865ecfb5b84812ea058d78e785e33b8dc64d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 4 Aug 2011 07:20:25 -0400 Subject: [PATCH 192/823] try out simplified display of scoped keys --- util/collection/Settings.scala | 21 +++++++++++---------- util/collection/Show.scala | 5 +++++ 2 
files changed, 16 insertions(+), 10 deletions(-) create mode 100644 util/collection/Show.scala diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 418b5b737..f37302f7f 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -41,7 +41,8 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del // this trait is intended to be mixed into an object trait Init[Scope] { - def display(skey: ScopedKey[_]): String + /** The Show instance used when a detailed String needs to be generated. It is typically used when no context is available.*/ + def showFullKey: Show[ScopedKey[_]] final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) @@ -72,23 +73,23 @@ trait Init[Scope] def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { def apply[T](k: ScopedKey[T]): T = getValue(s, k) } - def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse error("Internal settings error: invalid reference to " + display(k)) + def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse error("Internal settings error: invalid reference to " + showFullKey(k)) def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) - def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): CompiledMap = + def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = { // prepend per-scope settings val withLocal = addLocal(init)(scopeLocal) // group by Scope/Key, dropping dead initializations val sMap: ScopedMap = grouped(withLocal) // delegate references to undefined values according to 'delegates' - val dMap: ScopedMap = if(actual) delegate(sMap)(delegates) else sMap + val dMap: ScopedMap = if(actual) delegate(sMap)(delegates, display) else sMap // 
merge Seq[Setting[_]] into Compiled compile(dMap) } - def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Settings[Scope] = + def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): Settings[Scope] = { - val cMap = compiled(init)(delegates, scopeLocal) + val cMap = compiled(init)(delegates, scopeLocal, display) // order the initializations. cyclic references are detected here. val ordered: Seq[Compiled] = sort(cMap) // evaluation: apply the initializations. @@ -116,7 +117,7 @@ trait Init[Scope] def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): Seq[Setting[_]] = init.flatMap( _.dependsOn flatMap scopeLocal ) ++ init - def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope]): ScopedMap = + def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = { def refMap(refKey: ScopedKey[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst) @@ -156,7 +157,7 @@ trait Init[Scope] map.set(key.scope, key.key, value) } - def showUndefined(u: Undefined, sMap: ScopedMap, delegates: Scope => Seq[Scope]): String = + def showUndefined(u: Undefined, sMap: ScopedMap, delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = { val guessed = guessIntendedScope(sMap, delegates, u.referencedKey) display(u.referencedKey) + " from " + display(u.definingKey) + guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString @@ -178,7 +179,7 @@ trait Init[Scope] final class Uninitialized(val undefined: Seq[Undefined], msg: String) extends Exception(msg) final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(definingKey, referencedKey) - def Uninitialized(sMap: 
ScopedMap, delegates: Scope => Seq[Scope], keys: Seq[Undefined]): Uninitialized = + def Uninitialized(sMap: ScopedMap, delegates: Scope => Seq[Scope], keys: Seq[Undefined])(implicit display: Show[ScopedKey[_]]): Uninitialized = { assert(!keys.isEmpty) val suffix = if(keys.length > 1) "s" else "" @@ -187,7 +188,7 @@ trait Init[Scope] } final class Compiled(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) { - override def toString = display(key) + override def toString = showFullKey(key) } sealed trait Initialize[T] diff --git a/util/collection/Show.scala b/util/collection/Show.scala new file mode 100644 index 000000000..b19a6ca2d --- /dev/null +++ b/util/collection/Show.scala @@ -0,0 +1,5 @@ +package sbt + +trait Show[T] { + def apply(t: T): String +} \ No newline at end of file From b35d9bfcfb5e799ecf1823d0ea01cc4c24aa2d26 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 5 Aug 2011 21:59:49 -0400 Subject: [PATCH 193/823] preserve key+configuration ambiguity through task+extra parsing. 
fixes #135 --- util/complete/Parser.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index f2545ebe4..a532af46a 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -346,6 +346,7 @@ trait ParserMain def not(p: Parser[_]): Parser[Unit] = new Not(p) + def oneOf[T](p: Seq[Parser[T]]): Parser[T] = p.reduceLeft(_ | _) def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil) def seq0[T](p: Seq[Parser[T]], errors: => Seq[String]): Parser[Seq[T]] = { From ebddc4009f39d29ffad2e6531b0c6f452c92fda4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:37 -0400 Subject: [PATCH 194/823] fix ScalaProviderTest --- util/collection/Reduced.scala | 75 ----------------------------------- 1 file changed, 75 deletions(-) delete mode 100644 util/collection/Reduced.scala diff --git a/util/collection/Reduced.scala b/util/collection/Reduced.scala deleted file mode 100644 index 59c9a8193..000000000 --- a/util/collection/Reduced.scala +++ /dev/null @@ -1,75 +0,0 @@ -package sbt - - import Types._ - -final case class ReducedH[K[_], HL <: HList, HLk <: HList](keys: KList[K, HLk], expand: HLk => HL) -{ - def prepend[T](key: K[T])(g: K ~> Option) = - g(key) match - { - case None => ReducedH[K, T :+: HL, T :+: HLk]( KCons(key, keys), { case v :+: hli => HCons(v, expand(hli)) }) - case Some(v) => ReducedH[K, T :+: HL, HLk](keys, hli => HCons(v, expand(hli))) - } - def combine[T](f: (KList[K, HLk], HLk => HL) => T): T = f(keys, expand) -} -final case class ReducedK[K[_], M[_], HL <: HList, HLk <: HList](keys: KList[(K ∙ M)#l, HLk], expand: KList[M, HLk] => KList[M, HL]) -{ - def prepend[T](key: K[M[T]])(g: (K ∙ M)#l ~> (Option ∙ M)#l): ReducedK[K, M, T :+: HL, _ <: HList] = - g(key) match - { - case None => ReducedK[K, M, T :+: HL, T :+: HLk]( KCons[T, HLk, (K ∙ M)#l](key, keys), { case KCons(v, hli) => KCons(v, expand(hli)) }) - case Some(v) => ReducedK[K, M, T :+: HL, HLk](keys, hli => 
KCons(v, expand(hli)) ) - } - def combine[T](f: (KList[(K ∙ M)#l, HLk], KList[M, HLk] => KList[M, HL]) => T): T = f(keys, expand) -} -final case class ReducedSeq[K[_], T](keys: Seq[K[T]], expand: Seq[T] => Seq[T]) -{ - def prepend(key: K[T])(g: K ~> Option) = - g(key) match - { - case None => ReducedSeq[K, T](key +: keys, { case Seq(x, xs @ _*) => x +: expand(xs) }) - case Some(v) => ReducedSeq[K, T](keys, xs => v +: expand(xs)) - } -} -object Reduced -{ - def reduceK[HL <: HList, K[_], M[_]](keys: KList[(K ∙ M)#l, HL], g: (K ∙ M)#l ~> (Option ∙ M)#l): ReducedK[K, M, HL, _ <: HList] = - { - type RedK[HL <: HList] = ReducedK[K, M, HL, _ <: HList] - keys.foldr[RedK,(K ∙ M)#l] { new KFold[(K ∙ M)#l, RedK] { - def knil = emptyK - def kcons[H,T<:HList](h: K[M[H]], acc: RedK[T]): RedK[H :+: T] = - acc.prepend(h)(g) - }} - } - def reduceH[HL <: HList, K[_]](keys: KList[K, HL], g: K ~> Option): ReducedH[K, HL, _ <: HList] = - { - type RedH[HL <: HList] = ReducedH[K, HL, _ <: HList] - keys.foldr { new KFold[K, RedH] { - def knil = emptyH - def kcons[H,T<:HList](h: K[H], acc: RedH[T]): RedH[H :+: T] = - acc.prepend(h)(g) - }} - } - def reduceSeq[K[_], T](keys: Seq[K[T]], g: K ~> Option): ReducedSeq[K, T] = (ReducedSeq[K, T](Nil, idFun) /: keys) { (red, key) => red.prepend(key)(g) } - def emptyH[K[_]] = ReducedH[K, HNil, HNil](KNil, idFun) - def emptyK[K[_], M[_]] = ReducedK[K, M, HNil, HNil](KNil, idFun) -} -/* - -def mapConstant(inputs: Seq[ScopedKey[S]], g: ScopedKey ~> Option) = - split(Nil, inputs.head, inputs.tails, g) match - { - None => new Uniform(f, inputs) - Some((left, x, Nil)) => new Uniform(in => f(in :+ x), left) - Some((Nil, x, right)) => new Uniform(in => f(x +: in), right) - Some((left, x, right)) => new Joined(uniformID(left), mapConstant(right), (l,r) => (l ++ (x +: r)) ) - } -def split[S, M[_]](acc: List[M[S]], head: M[S], tail: List[M[S]], f: M ~> Option): Option[(Seq[M[S]], S, Seq[M[S]])] = - (f(head), tail) match - { - case (None, Nil) => None - case 
(None, x :: xs) => split( head :: acc, x, xs, f) - case (Some(v), ts) => Some( (acc.reverse, v, ts) ) - } -*/ \ No newline at end of file From 01b27f58755b91dabb7c4bd89c77a95d22816980 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:37 -0400 Subject: [PATCH 195/823] Settings overhaul, intended to be source compatible where it matters. Moves many methods previously provided by implicit conversions directly onto the classes for better discoverability, especially with scaladoc. 1. Initialize now allowed in more places. Minor renamings in Initialize to avoid conflicts a. map -> apply b. get -> evaluate 2. Identity on Scoped* is deprecated- it is now redundant 3. Can now use += and <+= for String, Int, Long, Double settings. There may be some problematic corner cases in inference, especially with +=, ++, <+=, <++= 4. Some classes with a scoped: ScopedKey[T] method now have scopedKey: ScopedKey[T] instead. 5. The implicit conversion to ScopedKey[T] is now deprecated. Use the scopedKey method. 6. :== and ::= are now private[sbt] to better reflect that they were internal use only. --- util/collection/Settings.scala | 182 ++++++++++-------- .../src/test/scala/SettingsExample.scala | 9 +- 2 files changed, 105 insertions(+), 86 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index f37302f7f..41fbd4bc3 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -44,28 +44,31 @@ trait Init[Scope] /** The Show instance used when a detailed String needs to be generated. 
It is typically used when no context is available.*/ def showFullKey: Show[ScopedKey[_]] - final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) + final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) extends KeyedInitialize[T] { + def scopedKey = this + } type SettingSeq[T] = Seq[Setting[T]] type ScopedMap = IMap[ScopedKey, SettingSeq] type CompiledMap = Map[ScopedKey[_], Compiled] type MapScoped = ScopedKey ~> ScopedKey type ValidatedRef[T] = Either[Undefined, ScopedKey[T]] + type ValidatedInit[T] = Either[Seq[Undefined], Initialize[T]] type ValidateRef = ScopedKey ~> ValidatedRef type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey ~> Option def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) def value[T](value: => T): Initialize[T] = new Value(value _) - def optional[T,U](key: ScopedKey[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(key), f) + def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) - def app[HL <: HList, T](inputs: KList[ScopedKey, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) - def uniform[S,T](inputs: Seq[ScopedKey[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) - def kapp[HL <: HList, M[_], T](inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) + def app[HL <: HList, T](inputs: KList[Initialize, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) + def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) + def kapp[HL <: HList, M[_], T](inputs: KList[({type l[t] = Initialize[M[t]]})#l, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) // the following is a temporary workaround for the "... cannot be instantiated from ..." 
bug, which renders 'kapp' above unusable outside this source file class KApp[HL <: HList, M[_], T] { - type Composed[S] = ScopedKey[M[S]] + type Composed[S] = Initialize[M[S]] def apply(inputs: KList[Composed, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) } @@ -152,7 +155,7 @@ trait Init[Scope] private[this] def applySetting[T](map: Settings[Scope], setting: Setting[T]): Settings[Scope] = { - val value = setting.init.get(map) + val value = setting.init.evaluate(map) val key = setting.key map.set(key.scope, key.key, value) } @@ -194,28 +197,26 @@ trait Init[Scope] sealed trait Initialize[T] { def dependsOn: Seq[ScopedKey[_]] - def map[S](g: T => S): Initialize[S] + def apply[S](g: T => S): Initialize[S] def mapReferenced(g: MapScoped): Initialize[T] - def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Initialize[T]] + def validateReferenced(g: ValidateRef): ValidatedInit[T] def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipWith(o)((x,y) => (x,y)) - def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = new Joined[T,S,U](this, o, f) + def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = + new Apply[T :+: S :+: HNil, U]( { case t :+: s :+: HNil => f(t,s)}, this :^: o :^: KNil) def mapConstant(g: MapConstant): Initialize[T] - def get(map: Settings[Scope]): T + def evaluate(map: Settings[Scope]): T } object Initialize { implicit def joinInitialize[T](s: Seq[Initialize[T]]): JoinInitSeq[T] = new JoinInitSeq(s) final class JoinInitSeq[T](s: Seq[Initialize[T]]) { - def join[S](f: Seq[T] => S): Initialize[S] = this.join map f - def join: Initialize[Seq[T]] = Initialize.join(s) + def joinWith[S](f: Seq[T] => S): Initialize[S] = uniform(s)(f) + def join: Initialize[Seq[T]] = uniform(s)(idFun) } - def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = - inits match - { - case Seq() => value( Nil ) - case Seq(x, xs @ _*) => (join(xs) zipWith x)( (t,h) => h +: t) - } + def join[T](inits: 
Seq[Initialize[T]]): Initialize[Seq[T]] = uniform(inits)(idFun) + def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]] = + join(inits.asInstanceOf[Seq[Initialize[M[Any]]]]).asInstanceOf[Initialize[Seq[M[T] forSome { type T }]]] } object SettingsDefinition { implicit def unwrapSettingsDefinition(d: SettingsDefinition): Seq[Setting[_]] = d.settings @@ -233,103 +234,120 @@ trait Init[Scope] def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI)) def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init.map(t => f(key,t))) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t))) def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g) override def toString = "setting(" + key + ")" } - private[this] final class Optional[S,T](a: Option[ScopedKey[S]], f: Option[S] => T) extends Initialize[T] + // mainly for reducing generated class count + private[this] def validateReferencedT(g: ValidateRef) = + new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } + + private[this] def mapReferencedT(g: MapScoped) = + new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapReferenced g } + + private[this] def mapConstantT(g: MapConstant) = + new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapConstant g } + + private[this] def evaluateT(g: Settings[Scope]) = + new (Initialize ~> Id) { def apply[T](i: Initialize[T]) = i evaluate g } + + private[this] def dependencies(ls: Seq[Initialize[_]]): Seq[ScopedKey[_]] = ls.flatMap(_.dependsOn) + + sealed trait Keyed[S, T] extends Initialize[T] { - def dependsOn = a.toList - def map[Z](g: T => Z): Initialize[Z] = new 
Optional[S,Z](a, g compose f) - def get(map: Settings[Scope]): T = f(a map asFunction(map)) - def mapReferenced(g: MapScoped) = new Optional(a map g.fn, f) - def validateReferenced(g: ValidateRef) = Right( new Optional(a flatMap { sk => g(sk).right.toOption }, f) ) - def mapConstant(g: MapConstant): Initialize[T] = - (a flatMap g.fn) match { - case None => this - case s => new Value(() => f(s)) - } + def scopedKey: ScopedKey[S] + protected def transform: S => T + final def dependsOn = scopedKey :: Nil + final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) + final def evaluate(ss: Settings[Scope]): T = transform(getValue(ss, scopedKey)) + final def mapReferenced(g: MapScoped): Initialize[T] = new GetValue( g(scopedKey), transform) + final def validateReferenced(g: ValidateRef): ValidatedInit[T] = g(scopedKey) match { + case Left(un) => Left(un :: Nil) + case Right(nk) => Right(new GetValue(nk, transform)) + } + final def mapConstant(g: MapConstant): Initialize[T] = g(scopedKey) match { + case None => this + case Some(const) => new Value(() => transform(const)) + } + @deprecated("Use scopedKey.") + def scoped = scopedKey } - private[this] final class Joined[S,T,U](a: Initialize[S], b: Initialize[T], f: (S,T) => U) extends Initialize[U] + private[this] final class GetValue[S,T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] + trait KeyedInitialize[T] extends Keyed[T, T] { + protected final val transform = idFun[T] + } + + private[this] final class Optional[S,T](a: Option[Initialize[S]], f: Option[S] => T) extends Initialize[T] { - def dependsOn = a.dependsOn ++ b.dependsOn - def mapReferenced(g: MapScoped) = new Joined(a mapReferenced g, b mapReferenced g, f) - def validateReferenced(g: ValidateRef) = - (a validateReferenced g, b validateReferenced g) match { - case (Right(ak), Right(bk)) => Right( new Joined(ak, bk, f) ) - case (au, bu) => Left( (au.left.toSeq ++ bu.left.toSeq).flatten ) - } - def 
map[Z](g: U => Z) = new Joined[S,T,Z](a, b, (s,t) => g(f(s,t))) - def mapConstant(g: MapConstant) = new Joined[S,T,U](a mapConstant g, b mapConstant g, f) - def get(map: Settings[Scope]): U = f(a get map, b get map) + def dependsOn = dependencies(a.toList) + def apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) + def evaluate(ss: Settings[Scope]): T = f(a map evaluateT(ss).fn) + def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) + def validateReferenced(g: ValidateRef) = Right( new Optional(a flatMap { _.validateReferenced(g).right.toOption }, f) ) + def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) } private[this] final class Value[T](value: () => T) extends Initialize[T] { def dependsOn = Nil def mapReferenced(g: MapScoped) = this def validateReferenced(g: ValidateRef) = Right(this) - def map[S](g: T => S) = new Value[S](() => g(value())) + def apply[S](g: T => S) = new Value[S](() => g(value())) def mapConstant(g: MapConstant) = this - def get(map: Settings[Scope]): T = value() + def evaluate(map: Settings[Scope]): T = value() } - private[this] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[ScopedKey, HL]) extends Initialize[T] + private[this] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[Initialize, HL]) extends Initialize[T] { - def dependsOn = inputs.toList - def mapReferenced(g: MapScoped) = new Apply(f, inputs transform g) - def map[S](g: T => S) = new Apply(g compose f, inputs) - def mapConstant(g: MapConstant) = Reduced.reduceH(inputs, g).combine( (keys, expand) => new Apply(f compose expand, keys) ) - def get(map: Settings[Scope]) = f(inputs down asTransform(map) ) + def dependsOn = dependencies(inputs.toList) + def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) + def apply[S](g: T => S) = new Apply(g compose f, inputs) + def mapConstant(g: MapConstant) = mapInputs( mapConstantT(g) ) + def mapInputs(g: 
Initialize ~> Initialize): Initialize[T] = new Apply(f, inputs transform g) + def evaluate(ss: Settings[Scope]) = f(inputs down evaluateT(ss)) def validateReferenced(g: ValidateRef) = { - val tx = inputs.transform(g) - val undefs = tx.toList.flatMap(_.left.toSeq) - val get = new (ValidatedRef ~> ScopedKey) { def apply[T](vr: ValidatedRef[T]) = vr.right.get } + val tx = inputs transform validateReferencedT(g) + val undefs = tx.toList.flatMap(_.left.toSeq.flatten) + val get = new (ValidatedInit ~> Initialize) { def apply[T](vr: ValidatedInit[T]) = vr.right.get } if(undefs.isEmpty) Right(new Apply(f, tx transform get)) else Left(undefs) } } - private[this] final class KApply[HL <: HList, M[_], T](val f: KList[M, HL] => T, val inputs: KList[({type l[t] = ScopedKey[M[t]]})#l, HL]) extends Initialize[T] + private[this] final class KApply[HL <: HList, M[_], T](val f: KList[M, HL] => T, val inputs: KList[({type l[t] = Initialize[M[t]]})#l, HL]) extends Initialize[T] { - type ScopedKeyM[T] = ScopedKey[M[T]] - type VRefM[T] = ValidatedRef[M[T]] - def dependsOn = unnest(inputs.toList) - def mapReferenced(g: MapScoped) = new KApply[HL, M, T](f, inputs.transform[({type l[t] = ScopedKey[M[t]]})#l]( nestCon(g) ) ) - def map[S](g: T => S) = new KApply[HL, M, S](g compose f, inputs) - def get(map: Settings[Scope]) = f(inputs.transform[M]( nestCon[ScopedKey, Id, M](asTransform(map)) )) - def mapConstant(g: MapConstant) = - { - def mk[HLk <: HList](keys: KList[ScopedKeyM, HLk], expand: KList[M, HLk] => KList[M, HL]) = new KApply[HLk, M, T](f compose expand, keys) - Reduced.reduceK[HL, ScopedKey, M](inputs, nestCon(g)) combine mk - } + type InitializeM[T] = Initialize[M[T]] + type VInitM[T] = ValidatedInit[M[T]] + def dependsOn = dependencies(unnest(inputs.toList)) + def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) + def apply[S](g: T => S) = new KApply[HL, M, S](g compose f, inputs) + def evaluate(ss: Settings[Scope]) = f(inputs.transform[M]( nestCon(evaluateT(ss)) )) 
+ def mapConstant(g: MapConstant) = mapInputs(mapConstantT(g)) + def mapInputs(g: Initialize ~> Initialize): Initialize[T] = + new KApply[HL, M, T](f, inputs.transform[({type l[t] = Initialize[M[t]]})#l]( nestCon(g) )) def validateReferenced(g: ValidateRef) = { - val tx = inputs.transform[VRefM](nestCon(g)) - val undefs = tx.toList.flatMap(_.left.toSeq) - val get = new (VRefM ~> ScopedKeyM) { def apply[T](vr: ValidatedRef[M[T]]) = vr.right.get } + val tx = inputs.transform[VInitM](nestCon(validateReferencedT(g))) + val undefs = tx.toList.flatMap(_.left.toSeq.flatten) + val get = new (VInitM ~> InitializeM) { def apply[T](vr: VInitM[T]) = vr.right.get } if(undefs.isEmpty) - Right(new KApply[HL, M, T](f, tx.transform( get ) )) + Right(new KApply[HL, M, T](f, tx transform get)) else Left(undefs) } - private[this] def unnest(l: List[ScopedKey[M[T]] forSome { type T }]): List[ScopedKey[_]] = l.asInstanceOf[List[ScopedKey[_]]] + private[this] def unnest(l: List[Initialize[M[T]] forSome { type T }]): List[Initialize[_]] = l.asInstanceOf[List[Initialize[_]]] } - private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[ScopedKey[S]]) extends Initialize[T] + private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[Initialize[S]]) extends Initialize[T] { - def dependsOn = inputs - def mapReferenced(g: MapScoped) = new Uniform(f, inputs map g.fn[S]) + def dependsOn = dependencies(inputs) + def mapReferenced(g: MapScoped) = new Uniform(f, inputs map mapReferencedT(g).fn) def validateReferenced(g: ValidateRef) = { - val (undefs, ok) = List.separate(inputs map g.fn[S]) - if(undefs.isEmpty) Right( new Uniform(f, ok) ) else Left(undefs) + val (undefs, ok) = List.separate(inputs map validateReferencedT(g).fn ) + if(undefs.isEmpty) Right( new Uniform(f, ok) ) else Left(undefs.flatten) } - def map[S](g: T => S) = new Uniform(g compose f, inputs) - def mapConstant(g: MapConstant) = - { - val red = Reduced.reduceSeq(inputs, g) - new Uniform(f 
compose red.expand, red.keys) - } - def get(map: Settings[Scope]) = f(inputs map asFunction(map)) + def apply[S](g: T => S) = new Uniform(g compose f, inputs) + def mapConstant(g: MapConstant) = new Uniform(f, inputs map mapConstantT(g).fn) + def evaluate(ss: Settings[Scope]) = f(inputs map evaluateT(ss).fn ) } private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) } diff --git a/util/collection/src/test/scala/SettingsExample.scala b/util/collection/src/test/scala/SettingsExample.scala index 38c09d634..8d7136f0f 100644 --- a/util/collection/src/test/scala/SettingsExample.scala +++ b/util/collection/src/test/scala/SettingsExample.scala @@ -12,9 +12,10 @@ final case class Scope(index: Int) // That would be a general pain.) object SettingsExample extends Init[Scope] { - // This is the only abstract method, providing a way of showing a Scope+AttributeKey[_] - override def display(key: ScopedKey[_]): String = - key.scope.index + "/" + key.key.label + // Provides a way of showing a Scope+AttributeKey[_] + val showFullKey: Show[ScopedKey[_]] = new Show[ScopedKey[_]] { + def apply(key: ScopedKey[_]) = key.scope.index + "/" + key.key.label + } // A sample delegation function that delegates to a Scope with a lower index. val delegates: Scope => Seq[Scope] = { case s @ Scope(index) => @@ -55,7 +56,7 @@ object SettingsUsage // "compiles" and applies the settings. // This can be split into multiple steps to access intermediate results if desired. // The 'inspect' command operates on the output of 'compile', for example. - val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal) + val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal, showFullKey) // Show results. 
for(i <- 0 to 5; k <- Seq(a, b)) { From 4a5a64a8f0b61084d0e823b0026dd6bbf10b9cff Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:37 -0400 Subject: [PATCH 196/823] fix ++ command to not require a space after it --- util/complete/Parsers.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 4a82ab482..fd0568494 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -27,7 +27,7 @@ trait Parsers if(DefaultParsers.matches(ID, s)) OpChar | SpaceClass else if(DefaultParsers.matches(Op, s)) - IDStart | SpaceClass + IDChar | SpaceClass else any From 93b64e0fd3e56c20833e188191239d0f8228435e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:37 -0400 Subject: [PATCH 197/823] clean up undefined reference checking to be a proper function --- util/collection/PMap.scala | 16 +++++++++++++- util/collection/Settings.scala | 39 ++++++++++++++++++---------------- 2 files changed, 36 insertions(+), 19 deletions(-) diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index febab0286..6eb37689d 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -14,6 +14,7 @@ trait RMap[K[_], V[_]] def toSeq: Seq[(K[_], V[_])] def keys: Iterable[K[_]] def values: Iterable[V[_]] + def isEmpty: Boolean } trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] @@ -22,6 +23,7 @@ trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] def remove[T](k: K[T]): IMap[K,V] def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): IMap[K,V] def mapValues[V2[_]](f: V ~> V2): IMap[K,V2] + def mapSeparate[VL[_], VR[_]](f: V ~> ({type l[T] = Either[VL[T], VR[T]]})#l ): (IMap[K,VL], IMap[K,VR]) } trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] { @@ -55,9 +57,21 @@ object IMap def mapValues[V2[_]](f: V ~> V2) = new IMap0[K,V2](backing.mapValues(x => f(x)).toMap) + + def mapSeparate[VL[_], VR[_]](f: V ~> ({type l[T] = Either[VL[T], 
VR[T]]})#l ) = + { + val mapped = backing.view.map { case (k,v) => f(v) match { + case Left(l) => Left((k, l)) + case Right(r) => Right((k, r)) + }} + val (l, r) = List.separate[(K[_],VL[_]), (K[_],VR[_])]( mapped.toList ) + (new IMap0[K,VL](l.toMap), new IMap0[K,VR](r.toMap)) + } + def toSeq = backing.toSeq def keys = backing.keys def values = backing.values + def isEmpty = backing.isEmpty override def toString = backing.toString } @@ -89,7 +103,7 @@ class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends Abstr def toSeq = backing.toSeq def keys = backing.keys def values = backing.values - + def isEmpty = backing.isEmpty private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 41fbd4bc3..ddfb41b97 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -125,17 +125,18 @@ trait Init[Scope] def refMap(refKey: ScopedKey[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst) } - val undefineds = new collection.mutable.ListBuffer[Undefined] - val f = new (SettingSeq ~> SettingSeq) { def apply[T](ks: Seq[Setting[T]]) = - ks.zipWithIndex.map{ case (s,i) => - (s validateReferenced refMap(s.key, i == 0) ) match { - case Right(v) => v - case Left(l) => undefineds ++= l; s - } - } - } - val result = sMap mapValues f - if(undefineds.isEmpty) result else throw Uninitialized(sMap, delegates, undefineds.toList) + type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] + val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = { + val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s.key, i == 0) } + val (undefs, valid) = List separate validated + if(undefs.isEmpty) Right(valid) else Left(undefs.flatten) + }} + type Undefs[_] = 
Seq[Undefined] + val (undefineds, result) = sMap.mapSeparate[Undefs, SettingSeq]( f ) + if(undefineds.isEmpty) + result + else + throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false) } private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = { @@ -160,15 +161,15 @@ trait Init[Scope] map.set(key.scope, key.key, value) } - def showUndefined(u: Undefined, sMap: ScopedMap, delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = + def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = { - val guessed = guessIntendedScope(sMap, delegates, u.referencedKey) + val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey) display(u.referencedKey) + " from " + display(u.definingKey) + guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString } - def guessIntendedScope(sMap: ScopedMap, delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] = + def guessIntendedScope(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] = { - val distances = sMap.keys.toSeq.flatMap { validKey => refinedDistance(delegates, validKey, key).map( dist => (dist, validKey) ) } + val distances = validKeys.flatMap { validKey => refinedDistance(delegates, validKey, key).map( dist => (dist, validKey) ) } distances.sortBy(_._1).map(_._2).headOption } def refinedDistance(delegates: Scope => Seq[Scope], a: ScopedKey[_], b: ScopedKey[_]): Option[Int] = @@ -181,13 +182,15 @@ trait Init[Scope] final class Uninitialized(val undefined: Seq[Undefined], msg: String) extends Exception(msg) final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) + final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to 
undefined settings at runtime.") def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(definingKey, referencedKey) - def Uninitialized(sMap: ScopedMap, delegates: Scope => Seq[Scope], keys: Seq[Undefined])(implicit display: Show[ScopedKey[_]]): Uninitialized = + def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized = { assert(!keys.isEmpty) val suffix = if(keys.length > 1) "s" else "" - val keysString = keys.map(u => showUndefined(u, sMap, delegates)).mkString("\n\n ", "\n\n ", "") - new Uninitialized(keys, "Reference" + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ") + val prefix = if(runtime) "Runtime reference" else "Reference" + val keysString = keys.map(u => showUndefined(u, validKeys, delegates)).mkString("\n\n ", "\n\n ", "") + new Uninitialized(keys, prefix + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ") } final class Compiled(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) { From 0b5e6484ba6299b0621b6b8d9e944108dc7b1acb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:37 -0400 Subject: [PATCH 198/823] drop unused KApply from settings --- util/collection/Settings.scala | 31 ------------------------------- 1 file changed, 31 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index ddfb41b97..0b83eb86b 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -64,13 +64,6 @@ trait Init[Scope] def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) def app[HL <: HList, T](inputs: KList[Initialize, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, 
inputs) - def kapp[HL <: HList, M[_], T](inputs: KList[({type l[t] = Initialize[M[t]]})#l, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) - - // the following is a temporary workaround for the "... cannot be instantiated from ..." bug, which renders 'kapp' above unusable outside this source file - class KApp[HL <: HList, M[_], T] { - type Composed[S] = Initialize[M[S]] - def apply(inputs: KList[Composed, HL])(f: KList[M, HL] => T): Initialize[T] = new KApply[HL, M, T](f, inputs) - } def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates) def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { @@ -315,30 +308,6 @@ trait Init[Scope] if(undefs.isEmpty) Right(new Apply(f, tx transform get)) else Left(undefs) } } - - private[this] final class KApply[HL <: HList, M[_], T](val f: KList[M, HL] => T, val inputs: KList[({type l[t] = Initialize[M[t]]})#l, HL]) extends Initialize[T] - { - type InitializeM[T] = Initialize[M[T]] - type VInitM[T] = ValidatedInit[M[T]] - def dependsOn = dependencies(unnest(inputs.toList)) - def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) - def apply[S](g: T => S) = new KApply[HL, M, S](g compose f, inputs) - def evaluate(ss: Settings[Scope]) = f(inputs.transform[M]( nestCon(evaluateT(ss)) )) - def mapConstant(g: MapConstant) = mapInputs(mapConstantT(g)) - def mapInputs(g: Initialize ~> Initialize): Initialize[T] = - new KApply[HL, M, T](f, inputs.transform[({type l[t] = Initialize[M[t]]})#l]( nestCon(g) )) - def validateReferenced(g: ValidateRef) = - { - val tx = inputs.transform[VInitM](nestCon(validateReferencedT(g))) - val undefs = tx.toList.flatMap(_.left.toSeq.flatten) - val get = new (VInitM ~> InitializeM) { def apply[T](vr: VInitM[T]) = vr.right.get } - if(undefs.isEmpty) - Right(new KApply[HL, M, T](f, tx transform get)) - else - Left(undefs) - } - private[this] def unnest(l: List[Initialize[M[T]] forSome { type T }]): 
List[Initialize[_]] = l.asInstanceOf[List[Initialize[_]]] - } private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[Initialize[S]]) extends Initialize[T] { def dependsOn = dependencies(inputs) From 8ce99503271dbd68266ad3ad61d0fe98003b5768 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:37 -0400 Subject: [PATCH 199/823] allow setting initialization to be partially dynamic and run in parallel --- util/collection/INode.scala | 172 ++++++++++++++++++ util/collection/PMap.scala | 3 + util/collection/Settings.scala | 62 ++++--- .../src/test/scala/SettingsExample.scala | 4 +- .../src/test/scala/SettingsTest.scala | 95 +++++++++- 5 files changed, 302 insertions(+), 34 deletions(-) create mode 100644 util/collection/INode.scala diff --git a/util/collection/INode.scala b/util/collection/INode.scala new file mode 100644 index 000000000..e21c0b6b7 --- /dev/null +++ b/util/collection/INode.scala @@ -0,0 +1,172 @@ +package sbt + + import java.lang.Runnable + import java.util.concurrent.{atomic, Executor, LinkedBlockingQueue} + import atomic.{AtomicBoolean, AtomicInteger} + import Types.{:+:, Id} + +object EvaluationState extends Enumeration { + val New, Blocked, Ready, Calling, Evaluated = Value +} + +abstract class EvaluateSettings[Scope] +{ + protected val init: Init[Scope] + import init._ + protected def executor: Executor + protected def compiledSettings: Seq[Compiled[_]] + + import EvaluationState.{Value => EvaluationState, _} + + private[this] val complete = new LinkedBlockingQueue[Option[Throwable]] + private[this] val static = PMap.empty[ScopedKey, INode] + private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse error("Illegal reference to key " + key) + + private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { def apply[T](i: Initialize[T]): INode[T] = i match { + case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) + case a: Apply[hl,T] => new 
MixedNode(a.inputs transform transform, a.f) + case u: Uniform[s, T] => new UniformNode(u.inputs map transform.fn[s], u.f) + case b: Bind[s,T] => new BindNode[s,T]( transform(b.in), x => transform(b.f(x))) + case v: Value[T] => constant(v.value) + case o: Optional[s,T] => o.a match { + case None => constant( () => o.f(None) ) + case Some(i) => single[s,T](transform(i), x => o.f(Some(x))) + } + }} + private[this] val roots: Seq[INode[_]] = compiledSettings flatMap { cs => + (cs.settings map { s => + val t = transform(s.init) + static(s.key) = t + t + }): Seq[INode[_]] + } + private[this] var running = new AtomicInteger + private[this] var cancel = new AtomicBoolean(false) + + def run(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = + { + assert(running.get() == 0, "Already running") + startWork() + roots.foreach( _.registerIfNew() ) + workComplete() + complete.take() foreach { ex => + cancel.set(true) + throw ex + } + getResults(delegates) + } + private[this] def getResults(implicit delegates: Scope => Seq[Scope]) = (empty /: static.toTypedSeq) { case (ss, static.TPair(key, node)) => ss.set(key.scope, key.key, node.get) } + private[this] val getValue = new (INode ~> Id) { def apply[T](node: INode[T]) = node.get } + + private[this] def submitEvaluate(node: INode[_]) = submit(node.evaluate()) + private[this] def submitCallComplete[T](node: BindNode[_, T], value: T) = submit(node.callComplete(value)) + private[this] def submit(work: => Unit): Unit = + { + startWork() + executor.execute(new Runnable { def run = if(!cancel.get()) run0(work) }) + } + private[this] def run0(work: => Unit): Unit = + { + try { work } catch { case e => complete.put( Some(e) ) } + workComplete() + } + + private[this] def startWork(): Unit = running.incrementAndGet() + private[this] def workComplete(): Unit = + if(running.decrementAndGet() == 0) + complete.put( None ) + + private[this] sealed abstract class INode[T] + { + private[this] var state: EvaluationState = New + private[this] 
var value: T = _ + private[this] val blocking = new collection.mutable.ListBuffer[INode[_]] + private[this] var blockedOn: Int = 0 + private[this] val calledBy = new collection.mutable.ListBuffer[BindNode[_, T]] + + override def toString = getClass.getName + " (state=" + state + ",blockedOn=" + blockedOn + ",calledBy=" + calledBy.size + ",blocking=" + blocking.size + "): " + + ( (static.toSeq.flatMap { case (key, value) => if(value eq this) key.toString :: Nil else Nil }).headOption getOrElse "non-static") + + final def get: T = synchronized { + assert(value != null, toString + " not evaluated") + value + } + final def doneOrBlock(from: INode[_]): Boolean = synchronized { + val ready = state == Evaluated + if(!ready) blocking += from + registerIfNew() + ready + } + final def isDone: Boolean = synchronized { state == Evaluated } + final def isNew: Boolean = synchronized { state == New } + final def isCalling: Boolean = synchronized { state == Calling } + final def registerIfNew(): Unit = synchronized { if(state == New) register() } + private[this] def register() + { + assert(state == New, "Already registered and: " + toString) + val deps = dependsOn + blockedOn = deps.size - deps.count(_.doneOrBlock(this)) + if(blockedOn == 0) + schedule() + else + state = Blocked + } + + final def schedule(): Unit = synchronized { + assert(state == New || state == Blocked, "Invalid state for schedule() call: " + toString) + state = Ready + submitEvaluate(this) + } + final def unblocked(): Unit = synchronized { + assert(state == Blocked, "Invalid state for unblocked() call: " + toString) + blockedOn -= 1 + assert(blockedOn >= 0, "Negative blockedOn: " + blockedOn + " for " + toString) + if(blockedOn == 0) schedule() + } + final def evaluate(): Unit = synchronized { evaluate0() } + protected final def makeCall(source: BindNode[_, T], target: INode[T]) { + assert(state == Ready, "Invalid state for call to makeCall: " + toString) + state = Calling + target.call(source) + } + protected 
final def setValue(v: T) { + assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) + value = v + state = Evaluated + blocking foreach { _.unblocked() } + blocking.clear() + calledBy foreach { node => submitCallComplete(node, value) } + calledBy.clear() + } + final def call(by: BindNode[_, T]): Unit = synchronized { + registerIfNew() + state match { + case Evaluated => submitCallComplete(by, value) + case _ => calledBy += by + } + } + protected def dependsOn: Seq[INode[_]] + protected def evaluate0(): Unit + } + private[this] def constant[T](f: () => T): INode[T] = new MixedNode[HNil, T](KNil, _ => f()) + private[this] def single[S,T](in: INode[S], f: S => T): INode[T] = new MixedNode[S :+: HNil, T](in :^: KNil, hl => f(hl.head)) + private[this] final class BindNode[S,T](in: INode[S], f: S => INode[T]) extends INode[T] + { + protected def dependsOn = in :: Nil + protected def evaluate0(): Unit = makeCall(this, f(in.get) ) + def callComplete(value: T): Unit = synchronized { + assert(isCalling, "Invalid state for callComplete(" + value + "): " + toString) + setValue(value) + } + } + private[this] final class UniformNode[S,T](in: Seq[INode[S]], f: Seq[S] => T) extends INode[T] + { + protected def dependsOn = in + protected def evaluate0(): Unit = setValue( f(in.map(_.get)) ) + } + private[this] final class MixedNode[HL <: HList, T](in: KList[INode, HL], f: HL => T) extends INode[T] + { + protected def dependsOn = in.toList + protected def evaluate0(): Unit = setValue( f( in down getValue ) ) + } +} diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index 6eb37689d..1a2afb6d5 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -12,9 +12,12 @@ trait RMap[K[_], V[_]] def get[T](k: K[T]): Option[V[T]] def contains[T](k: K[T]): Boolean def toSeq: Seq[(K[_], V[_])] + def toTypedSeq = toSeq.map{ case (k: K[t],v) => TPair[t](k,v.asInstanceOf[V[t]]) } def keys: Iterable[K[_]] def values: 
Iterable[V[_]] def isEmpty: Boolean + + final case class TPair[T](key: K[T], value: V[T]) } trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 0b83eb86b..48aff00e8 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -50,7 +50,7 @@ trait Init[Scope] type SettingSeq[T] = Seq[Setting[T]] type ScopedMap = IMap[ScopedKey, SettingSeq] - type CompiledMap = Map[ScopedKey[_], Compiled] + type CompiledMap = Map[ScopedKey[_], Compiled[_]] type MapScoped = ScopedKey ~> ScopedKey type ValidatedRef[T] = Either[Undefined, ScopedKey[T]] type ValidatedInit[T] = Either[Seq[Undefined], Initialize[T]] @@ -62,6 +62,7 @@ trait Init[Scope] def value[T](value: => T): Initialize[T] = new Value(value _) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) + def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) def app[HL <: HList, T](inputs: KList[Initialize, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) @@ -87,18 +88,18 @@ trait Init[Scope] { val cMap = compiled(init)(delegates, scopeLocal, display) // order the initializations. cyclic references are detected here. - val ordered: Seq[Compiled] = sort(cMap) + val ordered: Seq[Compiled[_]] = sort(cMap) // evaluation: apply the initializations. 
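[Annotation] The `applyInits` rewrite in this patch evaluates settings on a fixed thread pool and signals overall completion (or the first error) through a `LinkedBlockingQueue`, counting outstanding work with an `AtomicInteger` (`startWork`/`workComplete` in `EvaluateSettings`). A minimal standalone sketch of that pattern — not sbt's actual API, names are illustrative:

```scala
import java.util.concurrent.{Executors, LinkedBlockingQueue}
import java.util.concurrent.atomic.AtomicInteger

object WorkCounter {
  // Runs all tasks on a pool; returns the first thrown exception, if any.
  // Every submitted unit of work bumps `running`; whichever thread
  // decrements it to zero puts the final None on the completion queue,
  // mirroring startWork()/workComplete() in EvaluateSettings.
  def runAll(tasks: Seq[() => Unit], threads: Int): Option[Throwable] = {
    val pool = Executors.newFixedThreadPool(threads)
    val done = new LinkedBlockingQueue[Option[Throwable]]
    val running = new AtomicInteger(1) // 1 = the submitting thread itself
    def finish(): Unit = if (running.decrementAndGet == 0) done.put(None)
    try {
      tasks.foreach { t =>
        running.incrementAndGet
        pool.execute(new Runnable {
          def run() = {
            // an error is reported immediately; finish() still runs so
            // the counter reaches zero and take() cannot block forever
            try t() catch { case e: Throwable => done.put(Some(e)) }
            finish()
          }
        })
      }
      finish() // the submitting thread's own unit of work is complete
      done.take()
    } finally pool.shutdown()
  }
}
```

Because an error is enqueued before the terminal `None`, the single `take()` observes the failure first — the same reason `run` in `EvaluateSettings` can rethrow the first exception it takes off the queue.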
- applyInits(ordered) + try { applyInits(ordered) } + catch { case rru: RuntimeUndefined => throw Uninitialized(cMap.keys.toSeq, delegates, rru.undefined, true) } } - def sort(cMap: CompiledMap): Seq[Compiled] = + def sort(cMap: CompiledMap): Seq[Compiled[_]] = Dag.topologicalSort(cMap.values)(_.dependencies.map(cMap)) def compile(sMap: ScopedMap): CompiledMap = - sMap.toSeq.map { case (k, ss) => + sMap.toTypedSeq.map { case sMap.TPair(k, ss) => val deps = ss flatMap { _.dependsOn } toSet; - val eval = (settings: Settings[Scope]) => (settings /: ss)(applySetting) - (k, new Compiled(k, deps, eval)) + (k, new Compiled(k, deps, ss)) } toMap; def grouped(init: Seq[Setting[_]]): ScopedMap = @@ -144,14 +145,17 @@ trait Init[Scope] resolve(scopes) } - private[this] def applyInits(ordered: Seq[Compiled])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = - (empty /: ordered){ (m, comp) => comp.eval(m) } - - private[this] def applySetting[T](map: Settings[Scope], setting: Setting[T]): Settings[Scope] = + private[this] def applyInits(ordered: Seq[Compiled[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = { - val value = setting.init.evaluate(map) - val key = setting.key - map.set(key.scope, key.key, value) + val x = java.util.concurrent.Executors.newFixedThreadPool(Runtime.getRuntime.availableProcessors) + try { + val eval: EvaluateSettings[Scope] = new EvaluateSettings[Scope] { + override val init: Init.this.type = Init.this + def compiledSettings = ordered + def executor = x + } + eval.run + } finally { x.shutdown() } } def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = @@ -185,7 +189,7 @@ trait Init[Scope] val keysString = keys.map(u => showUndefined(u, validKeys, delegates)).mkString("\n\n ", "\n\n ", "") new Uninitialized(keys, prefix + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ") } - final class Compiled(val key: ScopedKey[_], val 
dependencies: Iterable[ScopedKey[_]], val eval: Settings[Scope] => Settings[Scope]) + final class Compiled[T](val key: ScopedKey[T], val dependencies: Iterable[ScopedKey[_]], val settings: Seq[Setting[T]]) { override def toString = showFullKey(key) } @@ -253,7 +257,7 @@ trait Init[Scope] sealed trait Keyed[S, T] extends Initialize[T] { def scopedKey: ScopedKey[S] - protected def transform: S => T + def transform: S => T final def dependsOn = scopedKey :: Nil final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) final def evaluate(ss: Settings[Scope]): T = transform(getValue(ss, scopedKey)) @@ -271,10 +275,24 @@ trait Init[Scope] } private[this] final class GetValue[S,T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] trait KeyedInitialize[T] extends Keyed[T, T] { - protected final val transform = idFun[T] + final val transform = idFun[T] } - - private[this] final class Optional[S,T](a: Option[Initialize[S]], f: Option[S] => T) extends Initialize[T] + private[sbt] final class Bind[S,T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] + { + def dependsOn = in.dependsOn + def apply[Z](g: T => Z): Initialize[Z] = new Bind[S,Z](s => f(s)(g), in) + def evaluate(ss: Settings[Scope]): T = f(in evaluate ss) evaluate ss + def mapReferenced(g: MapScoped) = new Bind[S,T](s => f(s) mapReferenced g, in mapReferenced g) + def validateReferenced(g: ValidateRef) = (in validateReferenced g).right.map { validIn => + new Bind[S,T](s => handleUndefined( f(s) validateReferenced g), validIn) + } + def handleUndefined(vr: ValidatedInit[T]): Initialize[T] = vr match { + case Left(undefs) => throw new RuntimeUndefined(undefs) + case Right(x) => x + } + def mapConstant(g: MapConstant) = new Bind[S,T](s => f(s) mapConstant g, in mapConstant g) + } + private[sbt] final class Optional[S,T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] { def dependsOn = dependencies(a.toList) def 
apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) @@ -283,7 +301,7 @@ trait Init[Scope] def validateReferenced(g: ValidateRef) = Right( new Optional(a flatMap { _.validateReferenced(g).right.toOption }, f) ) def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) } - private[this] final class Value[T](value: () => T) extends Initialize[T] + private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { def dependsOn = Nil def mapReferenced(g: MapScoped) = this @@ -292,7 +310,7 @@ trait Init[Scope] def mapConstant(g: MapConstant) = this def evaluate(map: Settings[Scope]): T = value() } - private[this] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[Initialize, HL]) extends Initialize[T] + private[sbt] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[Initialize, HL]) extends Initialize[T] { def dependsOn = dependencies(inputs.toList) def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) @@ -308,7 +326,7 @@ trait Init[Scope] if(undefs.isEmpty) Right(new Apply(f, tx transform get)) else Left(undefs) } } - private[this] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[Initialize[S]]) extends Initialize[T] + private[sbt] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[Initialize[S]]) extends Initialize[T] { def dependsOn = dependencies(inputs) def mapReferenced(g: MapScoped) = new Uniform(f, inputs map mapReferencedT(g).fn) diff --git a/util/collection/src/test/scala/SettingsExample.scala b/util/collection/src/test/scala/SettingsExample.scala index 8d7136f0f..558de7f4a 100644 --- a/util/collection/src/test/scala/SettingsExample.scala +++ b/util/collection/src/test/scala/SettingsExample.scala @@ -59,9 +59,9 @@ object SettingsUsage val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal, showFullKey) // Show results. 
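[Annotation] The new `Bind` node above is what makes `Initialize` monadic: the shape of the settings graph can now depend on a previously computed value, which a static `Apply` over a fixed `KList` of inputs cannot express. A heavily simplified, hypothetical model of that idea (types and names are illustrative, not sbt's):

```scala
object BindModel {
  type Settings = Map[String, Int]
  sealed trait Init[T] { def evaluate(ss: Settings): T }
  final case class Value[T](v: () => T) extends Init[T] {
    def evaluate(ss: Settings) = v()
  }
  final case class Bind[S, T](in: Init[S], f: S => Init[T]) extends Init[T] {
    // evaluate the input first, then evaluate whichever node it selects
    def evaluate(ss: Settings) = f(in.evaluate(ss)).evaluate(ss)
  }
  def value[T](t: => T): Init[T] = Value(() => t)
  // counts down to zero through dynamically created nodes, echoing the
  // chainBind property added to SettingsTest in this patch
  def chainBind(prev: Init[Int]): Init[Int] =
    Bind(prev, (v: Int) => if (v <= 0) prev else chainBind(value(v - 1)))
}
```

Evaluating `chainBind(value(n))` descends `n` fresh `Bind` nodes and ends at `0`, matching the expectation of the "Basic bind chain" property; an unbounded descent is exactly the dynamic cycle the test comments note is not (cheaply) detectable.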
- for(i <- 0 to 5; k <- Seq(a, b)) { +/* for(i <- 0 to 5; k <- Seq(a, b)) { println( k.label + i + " = " + applied.get( Scope(i), k) ) - } + }*/ /** Output: * For the None results, we never defined the value and there was no value to delegate to. diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index 8d88bd30d..2e57685ea 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -3,21 +3,96 @@ package sbt import org.scalacheck._ import Prop._ import SettingsUsage._ +import SettingsExample._ object SettingsTest extends Properties("settings") { - def tests = - for(i <- 0 to 5; k <- Seq(a, b)) yield { - val value = applied.get( Scope(i), k) - val expected = expectedValues(2*i + (if(k == a) 0 else 1)) - ("Index: " + i) |: - ("Key: " + k.label) |: - ("Value: " + value) |: - ("Expected: " + expected) |: - (value == expected) - } + final val ChainMax = 5000 + lazy val chainLengthGen = Gen.choose(1, ChainMax) property("Basic settings test") = secure( all( tests: _*) ) + property("Basic chain") = forAll(chainLengthGen) { (i: Int) => + val abs = math.abs(i) + singleIntTest( chain( abs, value(0)), abs ) + } + property("Basic bind chain") = forAll(chainLengthGen) { (i: Int) => + val abs = math.abs(i) + singleIntTest( chainBind(value(abs)), 0 ) + } + + property("Allows references to completed settings") = forAllNoShrink(30) { allowedReference _ } + final def allowedReference(intermediate: Int): Prop = + { + val top = value(intermediate) + def iterate(init: Initialize[Int]): Initialize[Int] = + bind(init) { t => + if(t <= 0) + top + else + iterate(value(t-1) ) + } + try { evaluate( setting(chk, iterate(top)) :: Nil); true } + catch { case e: Exception => ("Unexpected exception: " + e) |: false } + } + +// Circular (dynamic) references currently loop infinitely. 
+// This is the expected behavior (detecting dynamic cycles is expensive), +// but it may be necessary to provide an option to detect them (with a performance hit) +// This would test that cycle detection. +// property("Catches circular references") = forAll(chainLengthGen) { checkCircularReferences _ } + final def checkCircularReferences(intermediate: Int): Prop = + { + val ccr = new CCR(intermediate) + try { evaluate( setting(chk, ccr.top) :: Nil); false } + catch { case e: Exception => true } + } + + def tests = + for(i <- 0 to 5; k <- Seq(a, b)) yield { + val expected = expectedValues(2*i + (if(k == a) 0 else 1)) + checkKey[Int]( ScopedKey( Scope(i), k ), expected, applied) + } + lazy val expectedValues = None :: None :: None :: None :: None :: None :: Some(3) :: None :: Some(3) :: Some(9) :: Some(4) :: Some(9) :: Nil + + lazy val ch = AttributeKey[Int]("ch") + lazy val chk = ScopedKey( Scope(0), ch) + def chain(i: Int, prev: Initialize[Int]): Initialize[Int] = + if(i <= 0) prev else chain(i - 1, prev(_ + 1)) + + def chainBind(prev: Initialize[Int]): Initialize[Int] = + bind(prev) { v => + if(v <= 0) prev else chainBind(value(v - 1) ) + } + def singleIntTest(i: Initialize[Int], expected: Int) = + { + val eval = evaluate( setting( chk, i ) :: Nil ) + checkKey( chk, Some(expected), eval ) + } + + def checkKey[T](key: ScopedKey[T], expected: Option[T], settings: Settings[Scope]) = + { + val value = settings.get( key.scope, key.key) + ("Key: " + key) |: + ("Value: " + value) |: + ("Expected: " + expected) |: + (value == expected) + } + + def evaluate(settings: Seq[Setting[_]]): Settings[Scope] = + try { make(settings)(delegates, scopeLocal, showFullKey) } + catch { case e => e.printStackTrace; throw e } +} +// This setup is a workaround for module synchronization issues +final class CCR(intermediate: Int) +{ + lazy val top = iterate(value(intermediate), intermediate) + def iterate(init: Initialize[Int], i: Int): Initialize[Int] = + bind(init) { t => + if(t <= 0) + 
top + else + iterate(value(t - 1), t-1) + } } \ No newline at end of file From 8e4906f410465248aea23cd29c55eb0564dbffb7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 14 Aug 2011 10:53:38 -0400 Subject: [PATCH 200/823] fix undefined key suggestion for updating settings --- util/collection/Settings.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 48aff00e8..c0ab96e18 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -170,12 +170,12 @@ trait Init[Scope] distances.sortBy(_._1).map(_._2).headOption } def refinedDistance(delegates: Scope => Seq[Scope], a: ScopedKey[_], b: ScopedKey[_]): Option[Int] = - if(a.key == b.key) + if(a.key != b.key || a == b) None + else { val dist = delegates(a.scope).indexOf(b.scope) if(dist < 0) None else Some(dist) } - else None final class Uninitialized(val undefined: Seq[Undefined], msg: String) extends Exception(msg) final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) From fdb47eca8df2741c3fa91aa6e6fbb46a8ea74dd3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 26 Aug 2011 23:27:03 -0400 Subject: [PATCH 201/823] fix dependsOn breakage from Initialize rework --- util/collection/Settings.scala | 24 ++++++++++++------------ 1 file changed, 12 insertions(+), 12 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index c0ab96e18..0f4c5c83a 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -98,7 +98,7 @@ trait Init[Scope] def compile(sMap: ScopedMap): CompiledMap = sMap.toTypedSeq.map { case sMap.TPair(k, ss) => - val deps = ss flatMap { _.dependsOn } toSet; + val deps = ss flatMap { _.dependencies } toSet; (k, new Compiled(k, deps, ss)) } toMap; @@ -112,7 +112,7 @@ trait Init[Scope] if(s.definitive) s :: Nil else ss :+ s def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): 
Seq[Setting[_]] = - init.flatMap( _.dependsOn flatMap scopeLocal ) ++ init + init.flatMap( _.dependencies flatMap scopeLocal ) ++ init def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = { @@ -196,7 +196,7 @@ trait Init[Scope] sealed trait Initialize[T] { - def dependsOn: Seq[ScopedKey[_]] + def dependencies: Seq[ScopedKey[_]] def apply[S](g: T => S): Initialize[S] def mapReferenced(g: MapScoped): Initialize[T] def validateReferenced(g: ValidateRef): ValidatedInit[T] @@ -229,8 +229,8 @@ trait Init[Scope] final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) extends SettingsDefinition { def settings = this :: Nil - def definitive: Boolean = !init.dependsOn.contains(key) - def dependsOn: Seq[ScopedKey[_]] = remove(init.dependsOn, key) + def definitive: Boolean = !init.dependencies.contains(key) + def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI)) def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) @@ -252,13 +252,13 @@ trait Init[Scope] private[this] def evaluateT(g: Settings[Scope]) = new (Initialize ~> Id) { def apply[T](i: Initialize[T]) = i evaluate g } - private[this] def dependencies(ls: Seq[Initialize[_]]): Seq[ScopedKey[_]] = ls.flatMap(_.dependsOn) + private[this] def deps(ls: Seq[Initialize[_]]): Seq[ScopedKey[_]] = ls.flatMap(_.dependencies) sealed trait Keyed[S, T] extends Initialize[T] { def scopedKey: ScopedKey[S] def transform: S => T - final def dependsOn = scopedKey :: Nil + final def dependencies = scopedKey :: Nil final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) final def evaluate(ss: Settings[Scope]): T = transform(getValue(ss, scopedKey)) final def mapReferenced(g: 
MapScoped): Initialize[T] = new GetValue( g(scopedKey), transform) @@ -279,7 +279,7 @@ trait Init[Scope] } private[sbt] final class Bind[S,T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] { - def dependsOn = in.dependsOn + def dependencies = in.dependencies def apply[Z](g: T => Z): Initialize[Z] = new Bind[S,Z](s => f(s)(g), in) def evaluate(ss: Settings[Scope]): T = f(in evaluate ss) evaluate ss def mapReferenced(g: MapScoped) = new Bind[S,T](s => f(s) mapReferenced g, in mapReferenced g) @@ -294,7 +294,7 @@ trait Init[Scope] } private[sbt] final class Optional[S,T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] { - def dependsOn = dependencies(a.toList) + def dependencies = deps(a.toList) def apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) def evaluate(ss: Settings[Scope]): T = f(a map evaluateT(ss).fn) def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) @@ -303,7 +303,7 @@ trait Init[Scope] } private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { - def dependsOn = Nil + def dependencies = Nil def mapReferenced(g: MapScoped) = this def validateReferenced(g: ValidateRef) = Right(this) def apply[S](g: T => S) = new Value[S](() => g(value())) @@ -312,7 +312,7 @@ trait Init[Scope] } private[sbt] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[Initialize, HL]) extends Initialize[T] { - def dependsOn = dependencies(inputs.toList) + def dependencies = deps(inputs.toList) def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) def apply[S](g: T => S) = new Apply(g compose f, inputs) def mapConstant(g: MapConstant) = mapInputs( mapConstantT(g) ) @@ -328,7 +328,7 @@ trait Init[Scope] } private[sbt] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[Initialize[S]]) extends Initialize[T] { - def dependsOn = dependencies(inputs) + def dependencies = deps(inputs) def mapReferenced(g: MapScoped) = new Uniform(f, 
inputs map mapReferencedT(g).fn) def validateReferenced(g: ValidateRef) = { From 9756e99e16b85483e682361bb1057c743a309bdb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 20 Sep 2011 20:51:47 -0400 Subject: [PATCH 202/823] provide consecutive tab press count for completion combinators --- util/complete/JLineCompletion.scala | 33 +++++++++----- util/complete/Parser.scala | 68 ++++++++++++++--------------- 2 files changed, 57 insertions(+), 44 deletions(-) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index 18fc11fa4..2557a1b08 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -10,19 +10,26 @@ package sbt.complete object JLineCompletion { def installCustomCompletor(reader: ConsoleReader, parser: Parser[_]): Unit = - installCustomCompletor(parserAsCompletor(parser), reader) - def installCustomCompletor(reader: ConsoleReader)(complete: String => (Seq[String], Seq[String])): Unit = + installCustomCompletor(reader)(parserAsCompletor(parser)) + def installCustomCompletor(reader: ConsoleReader)(complete: (String, Int) => (Seq[String], Seq[String])): Unit = installCustomCompletor(customCompletor(complete), reader) - def installCustomCompletor(complete: ConsoleReader => Boolean, reader: ConsoleReader): Unit = + def installCustomCompletor(complete: (ConsoleReader, Int) => Boolean, reader: ConsoleReader): Unit = { reader.removeCompletor(DummyCompletor) reader.addCompletor(DummyCompletor) reader.setCompletionHandler(new CustomHandler(complete)) } - private[this] final class CustomHandler(completeImpl: ConsoleReader => Boolean) extends CompletionHandler + private[this] final class CustomHandler(completeImpl: (ConsoleReader, Int) => Boolean) extends CompletionHandler { - override def complete(reader: ConsoleReader, candidates: java.util.List[_], position: Int) = completeImpl(reader) + private[this] var previous: Option[(String,Int)] = None + private[this] var level: Int = 1 + override def 
complete(reader: ConsoleReader, candidates: java.util.List[_], position: Int) = { + val current = Some(bufferSnapshot(reader)) + level = if(current == previous) level + 1 else 1 + previous = current + completeImpl(reader, level) + } } // always provides dummy completions so that the custom completion handler gets called @@ -37,8 +44,9 @@ object JLineCompletion } } - def parserAsCompletor(p: Parser[_]): ConsoleReader => Boolean = - customCompletor(str => convertCompletions(Parser.completions(p, str))) + def parserAsCompletor(p: Parser[_]): (String, Int) => (Seq[String], Seq[String]) = + (str, level) => convertCompletions(Parser.completions(p, str, level)) + def convertCompletions(c: Completions): (Seq[String], Seq[String]) = { val cs = c.get @@ -57,13 +65,18 @@ object JLineCompletion } def appendNonEmpty(set: Set[String], add: String) = if(add.isEmpty) set else set + add - def customCompletor(f: String => (Seq[String], Seq[String])): ConsoleReader => Boolean = - reader => { - val success = complete(beforeCursor(reader), f, reader) + def customCompletor(f: (String, Int) => (Seq[String], Seq[String])): (ConsoleReader, Int) => Boolean = + (reader, level) => { + val success = complete(beforeCursor(reader), reader => f(reader, level), reader) reader.flushConsole() success } + def bufferSnapshot(reader: ConsoleReader): (String, Int) = + { + val b = reader.getCursorBuffer + (b.getBuffer.toString, b.cursor) + } def beforeCursor(reader: ConsoleReader): String = { val b = reader.getCursorBuffer diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index a532af46a..1df3e0195 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -4,7 +4,7 @@ package sbt.complete import Parser._ - import sbt.Types.{left, right, some} + import sbt.Types.{const, left, right, some} import sbt.Util.separate sealed trait Parser[+T] @@ -12,7 +12,7 @@ sealed trait Parser[+T] def derive(i: Char): Parser[T] def resultEmpty: Result[T] def result: Option[T] - def 
completions: Completions + def completions(level: Int): Completions def failure: Option[Failure] def isTokenStart = false def ifValid[S](p: => Parser[S]): Parser[S] @@ -252,7 +252,7 @@ trait ParserMain override def result = Some(value) def resultEmpty = Value(value) def derive(c: Char) = Parser.failure("Expected end of input.") - def completions = Completions.empty + def completions(level: Int) = Completions.empty override def toString = "success(" + value + ")" } @@ -269,7 +269,7 @@ trait ParserMain def result = None def resultEmpty = mkFailure( "Expected '" + ch + "'" ) def derive(c: Char) = if(c == ch) success(ch) else new Invalid(resultEmpty) - def completions = Completions.single(Completion.suggestStrict(ch.toString)) + def completions(level: Int) = Completions.single(Completion.suggestStrict(ch.toString)) override def toString = "'" + ch + "'" } implicit def literal(s: String): Parser[String] = stringLiteral(s, 0) @@ -304,7 +304,7 @@ trait ParserMain if(p.valid) p.derive(c) else p // The x Completions.empty removes any trailing token completions where append.isEmpty - def completions(p: Parser[_], s: String): Completions = apply(p)(s).completions x Completions.empty + def completions(p: Parser[_], s: String, level: Int): Completions = apply(p)(s).completions(level) x Completions.empty def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = if(a.valid) { @@ -329,10 +329,10 @@ trait ParserMain success(seen.mkString) } - def token[T](t: Parser[T]): Parser[T] = token(t, "", true, false) - def token[T](t: Parser[T], hide: Boolean): Parser[T] = token(t, "", true, hide) - def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false, false) - def token[T](t: Parser[T], seen: String, track: Boolean, hide: Boolean): Parser[T] = + def token[T](t: Parser[T]): Parser[T] = token(t, "", true, const(false)) + def token[T](t: Parser[T], hide: Int => Boolean): Parser[T] = token(t, "", true, hide) + def 
token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false, const(false)) + def token[T](t: Parser[T], seen: String, track: Boolean, hide: Int => Boolean): Parser[T] = if(t.valid && !t.isTokenStart) if(t.result.isEmpty) new TokenStart(t, seen, track, hide) else t else @@ -373,7 +373,7 @@ private final case class Invalid(fail: Failure) extends Parser[Nothing] def result = None def resultEmpty = fail def derive(c: Char) = error("Invalid.") - def completions = Completions.nil + def completions(level: Int) = Completions.nil override def toString = fail.errors.mkString("; ") def valid = false def ifValid[S](p: => Parser[S]): Parser[S] = this @@ -383,7 +383,7 @@ private final class OnFailure[A](a: Parser[A], message: String) extends ValidPar def result = a.result def resultEmpty = a.resultEmpty match { case f: Failure => mkFailure(message); case v: Value[A] => v } def derive(c: Char) = onFailure(a derive c, message) - def completions = a.completions + def completions(level: Int) = a.completions(level) override def toString = "(" + a + " !!! 
\"" + message + "\" )" override def isTokenStart = a.isTokenStart } @@ -400,7 +400,7 @@ private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends ValidPars case _: Failure => common } } - lazy val completions = a.completions x b.completions + def completions(level: Int) = a.completions(level) x b.completions(level) override def toString = "(" + a + " ~ " + b + ")" } @@ -409,7 +409,7 @@ private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser lazy val result = tuple(a.result, b.result) map (_._1) def derive(c: Char) = (a derive c) | (b derive c) lazy val resultEmpty = a.resultEmpty or b.resultEmpty - lazy val completions = a.completions ++ b.completions + def completions(level: Int) = a.completions(level) ++ b.completions(level) override def toString = "(" + a + " | " + b + ")" } private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A,B]] @@ -417,7 +417,7 @@ private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidPars lazy val result = tuple(a.result, b.result) map { case (a,b) => Left(a) } def derive(c: Char) = (a derive c) || (b derive c) lazy val resultEmpty = a.resultEmpty either b.resultEmpty - lazy val completions = a.completions ++ b.completions + def completions(level: Int) = a.completions(level) ++ b.completions(level) override def toString = "(" + a + " || " + b + ")" } private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] @@ -433,7 +433,7 @@ private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) exte val success = a.flatMap(_.result) if(success.length == a.length) Some(success) else None } - lazy val completions = a.map(_.completions).reduceLeft(_ ++ _) + def completions(level: Int) = a.map(_.completions(level)).reduceLeft(_ ++ _) def derive(c: Char) = seq0(a.map(_ derive c), errors) override def toString = "seq(" + a + ")" } @@ -442,11 +442,11 @@ private final class BindParser[A,B](a: 
Parser[A], f: A => Parser[B]) extends Val { lazy val result = a.result flatMap { av => f(av).result } lazy val resultEmpty = a.resultEmpty flatMap { av => f(av).resultEmpty } - lazy val completions = - a.completions flatMap { c => + def completions(level: Int) = + a.completions(level) flatMap { c => apply(a)(c.append).resultEmpty match { case _: Failure => Completions.strict(Set.empty + c) - case Value(av) => c x f(av).completions + case Value(av) => c x f(av).completions(level) } } @@ -467,7 +467,7 @@ private final class MapParser[A,B](a: Parser[A], f: A => B) extends ValidParser[ lazy val result = a.result map f lazy val resultEmpty = a.resultEmpty map f def derive(c: Char) = (a derive c) map f - def completions = a.completions + def completions(level: Int) = a.completions(level) override def isTokenStart = a.isTokenStart override def toString = "map(" + a + ")" } @@ -477,7 +477,7 @@ private final class Filter[T](p: Parser[T], f: T => Boolean, seen: String, msg: lazy val result = p.result filter f lazy val resultEmpty = filterResult(p.resultEmpty) def derive(c: Char) = filterParser(p derive c, f, seen + c, msg) - lazy val completions = p.completions filterS { s => filterResult(apply(p)(s).resultEmpty).isValid } + def completions(level: Int) = p.completions(level) filterS { s => filterResult(apply(p)(s).resultEmpty).isValid } override def toString = "filter(" + p + ")" override def isTokenStart = p.isTokenStart } @@ -485,20 +485,20 @@ private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], part { lazy val seen = seenV.mkString def derive(c: Char) = matched(delegate derive c, seenV :+ c, partial) - def completions = delegate.completions + def completions(level: Int) = delegate.completions(level) def result = if(delegate.result.isDefined) Some(seen) else None def resultEmpty = delegate.resultEmpty match { case f: Failure if !partial => f; case _ => Value(seen) } override def isTokenStart = delegate.isTokenStart override def toString = "matched(" 
+ partial + ", " + seen + ", " + delegate + ")" } -private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean, hide: Boolean) extends ValidParser[T] +private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean, hide: Int => Boolean) extends ValidParser[T] { def derive(c: Char) = token( delegate derive c, if(track) seen + c else seen, track, hide) - lazy val completions = - if(hide) Completions.nil + def completions(level: Int) = + if(hide(level)) Completions.nil else if(track) { - val dcs = delegate.completions + val dcs = delegate.completions(level) Completions( for(c <- dcs.get) yield Completion.token(seen, c.append) ) } else @@ -513,14 +513,14 @@ private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] { lazy val result = tuple(a.result,b.result) map { _._1 } def derive(c: Char) = (a derive c) & (b derive c) - lazy val completions = a.completions.filterS(s => apply(b)(s).resultEmpty.isValid ) + def completions(level: Int) = a.completions(level).filterS(s => apply(b)(s).resultEmpty.isValid ) lazy val resultEmpty = a.resultEmpty && b.resultEmpty } private final class Not(delegate: Parser[_]) extends ValidParser[Unit] { def derive(c: Char) = if(delegate.valid) not(delegate derive c) else this - def completions = Completions.empty + def completions(level: Int) = Completions.empty def result = None lazy val resultEmpty = delegate.resultEmpty match { case f: Failure => Value(()) @@ -532,7 +532,7 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x substring 1 }) def result = delegate.result lazy val resultEmpty = delegate.resultEmpty - lazy val completions = + def completions(level: Int) = if(fixed.isEmpty) if(resultEmpty.isValid) Completions.nil else Completions.empty else @@ -546,7 +546,7 @@ private final class StringLiteral(str: String, start: Int) extends 
ValidParser[S def resultEmpty = mkFailure(failMsg) def result = None def derive(c: Char) = if(str.charAt(start) == c) stringLiteral(str, start+1) else new Invalid(resultEmpty) - lazy val completions = Completions.single(Completion.suggestion(str.substring(start))) + def completions(level: Int) = Completions.single(Completion.suggestion(str.substring(start))) override def toString = '"' + str + '"' } private final class CharacterClass(f: Char => Boolean, label: String) extends ValidParser[Char] @@ -554,7 +554,7 @@ private final class CharacterClass(f: Char => Boolean, label: String) extends Va def result = None def resultEmpty = mkFailure("Expected " + label) def derive(c: Char) = if( f(c) ) success(c) else Invalid(resultEmpty) - def completions = Completions.empty + def completions(level: Int) = Completions.empty override def toString = "class(" + label + ")" } private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] @@ -562,7 +562,7 @@ private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[ def result = delegate.result map some.fn def resultEmpty = Value(None) def derive(c: Char) = (delegate derive c).map(some.fn) - lazy val completions = Completion.empty +: delegate.completions + def completions(level: Int) = Completion.empty +: delegate.completions(level) override def toString = delegate.toString + "?" 
} private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends ValidParser[Seq[T]] @@ -585,16 +585,16 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m def repeatDerive(c: Char, accRev: List[T]): Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, (min - 1) max 0, max.decrement, accRev) - lazy val completions = + def completions(level: Int) = { def pow(comp: Completions, exp: Completions, n: Int): Completions = if(n == 1) comp else pow(comp x exp, exp, n - 1) - val repC = repeated.completions + val repC = repeated.completions(level) val fin = if(min == 0) Completion.empty +: repC else pow(repC, repC, min) partial match { - case Some(p) => p.completions x fin + case Some(p) => p.completions(level) x fin case None => fin } } From 7d85f3c0474c5a75261a9963246c0537ac2db0a4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 26 Sep 2011 08:20:07 -0400 Subject: [PATCH 203/823] fix order of returned lists in Util.separate --- util/collection/Util.scala | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/util/collection/Util.scala b/util/collection/Util.scala index cceb1ec99..0b6b25118 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -6,7 +6,10 @@ package sbt object Util { def separate[T,A,B](ps: Seq[T])(f: T => Either[A,B]): (Seq[A], Seq[B]) = - ((Nil: Seq[A], Nil: Seq[B]) /: ps)( (xs, y) => prependEither(xs, f(y)) ) + { + val (a,b) = ((Nil: Seq[A], Nil: Seq[B]) /: ps)( (xs, y) => prependEither(xs, f(y)) ) + (a.reverse, b.reverse) + } def prependEither[A,B](acc: (Seq[A], Seq[B]), next: Either[A,B]): (Seq[A], Seq[B]) = next match From 3d4ad0b076453ddc63fbe144223bdaa556c040d8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 26 Sep 2011 08:20:07 -0400 Subject: [PATCH 204/823] fix laziness of parser failure messages --- util/complete/Parser.scala | 15 ++++++++------- 1 file changed, 8 insertions(+), 7 
deletions(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 1df3e0195..f788d0f4c 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -75,7 +75,7 @@ object Parser extends ParserMain def filter(f: T => Boolean, msg: => String): Result[T] def seq[B](b: => Result[B]): Result[(T,B)] = app(b)( (m,n) => (m,n) ) def app[B,C](b: => Result[B])(f: (T, B) => C): Result[C] - def toEither: Either[Seq[String], T] + def toEither: Either[() => Seq[String], T] } final case class Value[+T](value: T) extends Result[T] { def isFailure = false @@ -110,7 +110,7 @@ object Parser extends ParserMain def filter(f: Nothing => Boolean, msg: => String) = this def app[B,C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this def &&(b: => Result[_]) = this - def toEither = Left(errors) + def toEither = Left(() => errors) private[this] def concatErrors(f: Failure) = mkFailures(errors ++ f.errors) } @@ -278,16 +278,17 @@ trait ParserMain } // intended to be temporary pending proper error feedback - def result[T](p: Parser[T], s: String): Either[(Seq[String],Int), T] = + def result[T](p: Parser[T], s: String): Either[() => (Seq[String],Int), T] = { - def loop(i: Int, a: Parser[T]): Either[(Seq[String],Int), T] = + def loop(i: Int, a: Parser[T]): Either[() => (Seq[String],Int), T] = a match { - case Invalid(f) => Left( (f.errors, i) ) + case Invalid(f) => Left( () => (f.errors, i) ) case _ => val ci = i+1 if(ci >= s.length) - a.resultEmpty.toEither.left.map { msgs => + a.resultEmpty.toEither.left.map { msgs0 => () => + val msgs = msgs0() val nonEmpty = if(msgs.isEmpty) "Unexpected end of input" :: Nil else msgs (nonEmpty, ci) } @@ -427,7 +428,7 @@ private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) exte { val res = a.map(_.resultEmpty) val (failures, values) = separate(res)(_.toEither) - if(failures.isEmpty) Value(values) else mkFailures(failures.flatten ++ errors) + if(failures.isEmpty) Value(values) else 
mkFailures(failures.flatMap(_()) ++ errors) } def result = { val success = a.flatMap(_.result) From f8e3084e8fdc34801d85fc8149269c2607fb1aed Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 1 Oct 2011 14:39:40 -0400 Subject: [PATCH 205/823] fix parser test --- util/complete/src/test/scala/ParserTest.scala | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 931c01f01..9c6ec091d 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -54,13 +54,13 @@ object ParserTest extends Properties("Completing Parser") ( ("display '" + in + "'") |: checkOne(in, nestedDisplay, expectDisplay) ) def checkOne(in: String, parser: Parser[_], expect: Completion): Prop = - p(completions(parser, in)) == Completions.single(expect) + p(completions(parser, in, 1)) == Completions.single(expect) def checkInvalid(in: String) = ( ("token '" + in + "'") |: checkInv(in, nested) ) && ( ("display '" + in + "'") |: checkInv(in, nestedDisplay) ) def checkInv(in: String, parser: Parser[_]): Prop = - p(completions(parser, in)) == Completions.nil + p(completions(parser, in, 1)) == Completions.nil property("nested tokens a") = checkSingle("", Completion.tokenStrict("","a1") )( Completion.displayStrict("")) property("nested tokens a1") = checkSingle("a", Completion.tokenStrict("a","1") )( Completion.displayStrict("")) @@ -91,9 +91,9 @@ object ParserExample val t = name ~ options ~ include // Get completions for some different inputs - println(completions(t, "te")) - println(completions(t, "test ")) - println(completions(t, "test w")) + println(completions(t, "te", 1)) + println(completions(t, "test ",1)) + println(completions(t, "test w", 1)) // Get the parsed result for different inputs println(apply(t)("te").resultEmpty) From 5874d45525f88a0e6e16ced70bc1f5018c1ebbd1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 
1 Oct 2011 14:39:40 -0400 Subject: [PATCH 206/823] local settings, sbt-package-private for now --- util/collection/Attributes.scala | 21 +++++++++++++++++---- util/collection/INode.scala | 5 ++++- util/collection/Settings.scala | 19 +++++++++++++++++++ 3 files changed, 40 insertions(+), 5 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 6474fefb3..2e2d647bd 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -14,33 +14,46 @@ sealed trait AttributeKey[T] { def label: String def description: Option[String] def extend: Seq[AttributeKey[_]] + def isLocal: Boolean +} +private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { override final def toString = label override final def hashCode = label.hashCode override final def equals(o: Any) = (this eq o.asInstanceOf[AnyRef]) || (o match { - case a: AttributeKey[t] => a.label == this.label && a.manifest == this.manifest + case a: SharedAttributeKey[t] => a.label == this.label && a.manifest == this.manifest case _ => false }) + final def isLocal: Boolean = false } object AttributeKey { - def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf def label = name def description = None def extend = Nil } - def apply[T](name: String, description0: String)(implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def apply[T](name: String, description0: String)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf def label = name def description = Some(description0) def extend = Nil } - def apply[T](name: String, description0: String, extend0: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def apply[T](name: String, description0: String, extend0: Seq[AttributeKey[_]])(implicit mf: 
Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf def label = name def description = Some(description0) def extend = extend0 } + private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def manifest = mf + def label = LocalLabel + def description = None + def extend = Nil + override def toString = label + def isLocal: Boolean = true + } + private[sbt] final val LocalLabel = "$local" } trait AttributeMap diff --git a/util/collection/INode.scala b/util/collection/INode.scala index e21c0b6b7..b47031a43 100644 --- a/util/collection/INode.scala +++ b/util/collection/INode.scala @@ -55,7 +55,10 @@ abstract class EvaluateSettings[Scope] } getResults(delegates) } - private[this] def getResults(implicit delegates: Scope => Seq[Scope]) = (empty /: static.toTypedSeq) { case (ss, static.TPair(key, node)) => ss.set(key.scope, key.key, node.get) } + private[this] def getResults(implicit delegates: Scope => Seq[Scope]) = + (empty /: static.toTypedSeq) { case (ss, static.TPair(key, node)) => + if(key.key.isLocal) ss else ss.set(key.scope, key.key, node.get) + } private[this] val getValue = new (INode ~> Id) { def apply[T](node: INode[T]) = node.get } private[this] def submitEvaluate(node: INode[_]) = submit(node.evaluate()) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 0f4c5c83a..ef30f854f 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -193,6 +193,25 @@ trait Init[Scope] { override def toString = showFullKey(key) } + final class Flattened(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]]) + + def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_],Flattened] = + { + import collection.breakOut + val locals = compiled.flatMap { case (key, comp) => if(key.key.isLocal) Seq[Compiled[_]](comp) else Nil }(breakOut) + val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if(dep.key.isLocal) 
Seq[Compiled[_]](compiled(dep)) else Nil)) + def flatten(cmap: Map[ScopedKey[_],Flattened], key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened = + new Flattened(key, deps.flatMap(dep => if(dep.key.isLocal) cmap(dep).dependencies else dep :: Nil)) + + val empty = Map.empty[ScopedKey[_],Flattened] + val flattenedLocals = (empty /: ordered) { (cmap, c) => cmap.updated(c.key, flatten(cmap, c.key, c.dependencies)) } + compiled.flatMap{ case (key, comp) => + if(key.key.isLocal) + Nil + else + Seq[ (ScopedKey[_], Flattened)]( (key, flatten(flattenedLocals, key, comp.dependencies)) ) + }(breakOut) + } sealed trait Initialize[T] { From ba4c6de918e2bf9108fcb9ac8a7ca986d9ddd551 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 1 Oct 2011 14:39:40 -0400 Subject: [PATCH 207/823] generalize addArtifact arguments to Initialize[...]. fixes #207 --- util/collection/Settings.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index ef30f854f..c4743eb83 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -198,19 +198,19 @@ trait Init[Scope] def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_],Flattened] = { import collection.breakOut - val locals = compiled.flatMap { case (key, comp) => if(key.key.isLocal) Seq[Compiled[_]](comp) else Nil }(breakOut) + val locals = compiled flatMap { case (key, comp) => if(key.key.isLocal) Seq[Compiled[_]](comp) else Nil } val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if(dep.key.isLocal) Seq[Compiled[_]](compiled(dep)) else Nil)) def flatten(cmap: Map[ScopedKey[_],Flattened], key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened = new Flattened(key, deps.flatMap(dep => if(dep.key.isLocal) cmap(dep).dependencies else dep :: Nil)) val empty = Map.empty[ScopedKey[_],Flattened] val flattenedLocals = (empty /: ordered) { (cmap, c) => cmap.updated(c.key, flatten(cmap, c.key, 
c.dependencies)) } - compiled.flatMap{ case (key, comp) => + compiled flatMap{ case (key, comp) => if(key.key.isLocal) Nil else Seq[ (ScopedKey[_], Flattened)]( (key, flatten(flattenedLocals, key, comp.dependencies)) ) - }(breakOut) + } } sealed trait Initialize[T] From 6b136f1c17bfac73eed3f70412c034c16e4576b3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 5 Oct 2011 18:09:27 -0400 Subject: [PATCH 208/823] store hashes of API instead of full API. fixes #21 --- interface/other | 1 + 1 file changed, 1 insertion(+) diff --git a/interface/other b/interface/other index 64b09422d..993c9d4f6 100644 --- a/interface/other +++ b/interface/other @@ -2,6 +2,7 @@ Source compilation: Compilation hash: Byte* api: SourceAPI + apiHash: Int SourceAPI packages : Package* From f18c44d00da367858e9d8d99c4881841138155a4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 9 Oct 2011 21:48:15 -0400 Subject: [PATCH 209/823] fix stackoverflow caused by using List.separate, as tracked down by pvlugter --- util/collection/PMap.scala | 2 +- util/collection/Settings.scala | 4 ++-- util/collection/Util.scala | 3 +++ 3 files changed, 6 insertions(+), 3 deletions(-) diff --git a/util/collection/PMap.scala b/util/collection/PMap.scala index 1a2afb6d5..8b1772220 100644 --- a/util/collection/PMap.scala +++ b/util/collection/PMap.scala @@ -67,7 +67,7 @@ object IMap case Left(l) => Left((k, l)) case Right(r) => Right((k, r)) }} - val (l, r) = List.separate[(K[_],VL[_]), (K[_],VR[_])]( mapped.toList ) + val (l, r) = Util.separateE[(K[_],VL[_]), (K[_],VR[_])]( mapped.toList ) (new IMap0[K,VL](l.toMap), new IMap0[K,VR](r.toMap)) } diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index c4743eb83..4111c9082 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -122,7 +122,7 @@ trait Init[Scope] type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = 
{ val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s.key, i == 0) } - val (undefs, valid) = List separate validated + val (undefs, valid) = Util separateE validated if(undefs.isEmpty) Right(valid) else Left(undefs.flatten) }} type Undefs[_] = Seq[Undefined] @@ -351,7 +351,7 @@ trait Init[Scope] def mapReferenced(g: MapScoped) = new Uniform(f, inputs map mapReferencedT(g).fn) def validateReferenced(g: ValidateRef) = { - val (undefs, ok) = List.separate(inputs map validateReferencedT(g).fn ) + val (undefs, ok) = Util.separateE(inputs map validateReferencedT(g).fn ) if(undefs.isEmpty) Right( new Uniform(f, ok) ) else Left(undefs.flatten) } def apply[S](g: T => S) = new Uniform(g compose f, inputs) diff --git a/util/collection/Util.scala b/util/collection/Util.scala index 0b6b25118..608284c98 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -5,6 +5,9 @@ package sbt object Util { + def separateE[A,B](ps: Seq[Either[A,B]]): (Seq[A], Seq[B]) = + separate(ps)(Types.idFun) + def separate[T,A,B](ps: Seq[T])(f: T => Either[A,B]): (Seq[A], Seq[B]) = { val (a,b) = ((Nil: Seq[A], Nil: Seq[B]) /: ps)( (xs, y) => prependEither(xs, f(y)) ) From 591f90ce710ff1448247aeff487fbf79c0d2a9b8 Mon Sep 17 00:00:00 2001 From: softprops Date: Thu, 13 Oct 2011 02:12:30 -0400 Subject: [PATCH 210/823] add support for a masked readline --- util/complete/LineReader.scala | 13 ++++++++----- 1 file changed, 8 insertions(+), 5 deletions(-) diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index b8ab32f87..66479e736 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -10,9 +10,12 @@ package sbt abstract class JLine extends LineReader { protected[this] val reader: ConsoleReader - def readLine(prompt: String) = JLine.withJLine { unsynchronizedReadLine(prompt) } - private[this] def unsynchronizedReadLine(prompt: String) = - reader.readLine(prompt) match + def readLine(prompt: String, mask: 
Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } + private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]) = + (mask match { + case Some(m) => reader.readLine(prompt, m) + case None => reader.readLine(prompt) + }) match { case null => None case x => Some(x.trim) @@ -59,7 +62,7 @@ private object JLine trait LineReader { - def readLine(prompt: String): Option[String] + def readLine(prompt: String, mask: Option[Char] = None): Option[String] } final class FullReader(historyPath: Option[File], complete: Parser[_]) extends JLine { @@ -81,4 +84,4 @@ class SimpleReader private[sbt] (historyPath: Option[File]) extends JLine object SimpleReader extends JLine { protected[this] val reader = JLine.createReader() -} \ No newline at end of file +} From 5898cba4a8ea48804c569365153eb98b7a707923 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 16 Oct 2011 20:20:45 -0400 Subject: [PATCH 211/823] brief API documentation on some core settings types --- util/collection/HList.scala | 2 ++ util/collection/KList.scala | 6 +++++- 2 files changed, 7 insertions(+), 1 deletion(-) diff --git a/util/collection/HList.scala b/util/collection/HList.scala index df0391de8..cb76594d0 100644 --- a/util/collection/HList.scala +++ b/util/collection/HList.scala @@ -5,6 +5,8 @@ package sbt import Types._ +/** A minimal heterogeneous list type. For background, see +* http://apocalisp.wordpress.com/2010/07/06/type-level-programming-in-scala-part-6a-heterogeneous-list basics/ */ sealed trait HList { type Wrap[M[_]] <: HList diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 8035a4f2f..7b58aca32 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -8,7 +8,11 @@ import Types._ /** A higher-order heterogeneous list. It has a type constructor M[_] and * type parameters HL. The underlying data is M applied to each type parameter. 
* Explicitly tracking M[_] allows performing natural transformations or ensuring -* all data conforms to some common type. */ +* all data conforms to some common type. +* +* For background, see +* http://apocalisp.wordpress.com/2010/11/01/type-level-programming-in-scala-part-8a-klist%C2%A0motivation/ + */ sealed trait KList[+M[_], HL <: HList] { type Raw = HL From 64bf50cd08384ab4050e6c6300804029c93fed08 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Oct 2011 22:43:25 -0400 Subject: [PATCH 212/823] task execution interruptible using ctrl+c. fixes #228,#229 - interrupts task execution only - no further tasks scheduled - existing tasks interrupted - a task must terminate any other started threads when interrupted - set cancelable to true to enable - currently, 'run' properly terminates if the application properly terminates when interrupted - 'console' does not, 'test' depends on the test framework - also bundled: set connectInput to true to connect standard input to forked run --- util/collection/Signal.scala | 43 ++++++++++++++++++++++++++++++++++ util/process/Process.scala | 2 ++ util/process/ProcessImpl.scala | 8 +++---- 3 files changed, 49 insertions(+), 4 deletions(-) create mode 100644 util/collection/Signal.scala diff --git a/util/collection/Signal.scala b/util/collection/Signal.scala new file mode 100644 index 000000000..09756249d --- /dev/null +++ b/util/collection/Signal.scala @@ -0,0 +1,43 @@ +package sbt + +object Signals +{ + def withHandler[T](handler: () => Unit)(action: () => T): T = + { + val result = + try + { + val signals = new Signals0 + signals.withHandler(handler)(action) + } + catch { case e: LinkageError => Right(action()) } + + result match { + case Left(e) => throw e + case Right(v) => v + } + } +} + +// Must only be referenced using a +// try { } catch { case e: LinkageError => ... 
} +// block to +private final class Signals0 +{ + // returns a LinkageError in `action` as Left(t) in order to avoid it being + // incorrectly swallowed as missing Signal/SignalHandler + def withHandler[T](handler: () => Unit)(action: () => T): Either[Throwable, T] = + { + import sun.misc.{Signal,SignalHandler} + val intSignal = new Signal("INT") + val newHandler = new SignalHandler { + def handle(sig: Signal) { handler() } + } + + val oldHandler = Signal.handle(intSignal, newHandler) + + try Right(action()) + catch { case e: LinkageError => Left(e) } + finally Signal.handle(intSignal, oldHandler) + } +} \ No newline at end of file diff --git a/util/process/Process.scala b/util/process/Process.scala index 2dd70484c..5a1f46b4a 100644 --- a/util/process/Process.scala +++ b/util/process/Process.scala @@ -166,6 +166,8 @@ trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder * The newly started process reads from standard input of the current process if `connectInput` is true.*/ def run(log: ProcessLogger, connectInput: Boolean): Process + def runBuffered(log: ProcessLogger, connectInput: Boolean): Process + /** Constructs a command that runs this command first and then `other` if this command succeeds.*/ def #&& (other: ProcessBuilder): ProcessBuilder /** Constructs a command that runs this command first and then `other` if this command does not succeed.*/ diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index c20b23f20..69191d054 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -159,10 +159,10 @@ private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPa def ! 
= run(false).exitValue() def !< = run(true).exitValue() - def !(log: ProcessLogger) = runBuffered(log, false) - def !<(log: ProcessLogger) = runBuffered(log, true) - private[this] def runBuffered(log: ProcessLogger, connectInput: Boolean) = - log.buffer { run(log, connectInput).exitValue() } + def !(log: ProcessLogger) = runBuffered(log, false).exitValue() + def !<(log: ProcessLogger) = runBuffered(log, true).exitValue() + def runBuffered(log: ProcessLogger, connectInput: Boolean) = + log.buffer { run(log, connectInput) } def !(io: ProcessIO) = run(io).exitValue() def canPipeTo = false From f0fe396b3a53ff171c10006688f77a6dc4f6273f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 19 Oct 2011 22:23:47 -0400 Subject: [PATCH 213/823] preserve API information needed for detecting annotations on defs. fixes #232 --- interface/definition | 1 + 1 file changed, 1 insertion(+) diff --git a/interface/definition b/interface/definition index 2dcd4025b..dadd41adb 100644 --- a/interface/definition +++ b/interface/definition @@ -16,6 +16,7 @@ Definition definitionType: DefinitionType selfType: ~Type structure: ~Structure + savedAnnotations: String* TypeMember TypeAlias tpe: Type From 8beb823a9b40bade79897c049a4fc80c8f6e819b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 Oct 2011 18:39:18 -0400 Subject: [PATCH 214/823] cleanup, add regex for escape sequences to be used later --- util/log/ConsoleLogger.scala | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index 34c8a9575..a453b3b8a 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -46,6 +46,10 @@ object ConsoleLogger def apply(out: PrintWriter): ConsoleLogger = apply(printWriterOut(out)) def apply(out: ConsoleOut, ansiCodesSupported: Boolean = formatEnabled, useColor: Boolean = formatEnabled): ConsoleLogger = new ConsoleLogger(out, ansiCodesSupported, useColor) + + private[this] val EscapeSequence = (27.toChar + 
"[^@-~]*[@-~]").r + def stripEscapeSequences(s: String): String = + EscapeSequence.pattern.matcher(s).replaceAll("") } /** A logger that logs to the console. On supported systems, the level labels are From 30bdcf68d43fc05dcc8c2e975a4ba1880a82bf0f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 4 Nov 2011 13:11:10 -0400 Subject: [PATCH 215/823] preserve IOException type when translating exceptions. fixes #253 --- util/control/ErrorHandling.scala | 10 +++++++--- 1 file changed, 7 insertions(+), 3 deletions(-) diff --git a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala index a346e0d14..caf653da1 100644 --- a/util/control/ErrorHandling.scala +++ b/util/control/ErrorHandling.scala @@ -7,7 +7,10 @@ object ErrorHandling { def translate[T](msg: => String)(f: => T) = try { f } - catch { case e: Exception => throw new TranslatedException(msg + e.toString, e) } + catch { + case e: IOException => throw new TranslatedIOException(msg + e.toString, e) + case e: Exception => throw new TranslatedException(msg + e.toString, e) + } def wideConvert[T](f: => T): Either[Throwable, T] = try { Right(f) } @@ -31,7 +34,8 @@ object ErrorHandling else e.toString } -final class TranslatedException private[sbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) +sealed class TranslatedException private[sbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) { override def toString = msg -} \ No newline at end of file +} +final class TranslatedIOException private[sbt](msg: String, cause: IOException) extends TranslatedException(msg, cause) From 1578dcc46f3e77b6c77c01769d572c7dbd5b5686 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 4 Nov 2011 13:44:09 -0400 Subject: [PATCH 216/823] missing import --- util/control/ErrorHandling.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala index caf653da1..a1ba760f3 100644 --- a/util/control/ErrorHandling.scala +++ 
b/util/control/ErrorHandling.scala @@ -3,6 +3,8 @@ */ package sbt + import java.io.IOException + object ErrorHandling { def translate[T](msg: => String)(f: => T) = From b94c6e8949eb30f2237f785e03402ce1ffeb6077 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 5 Nov 2011 08:40:16 -0400 Subject: [PATCH 217/823] in cyclic error message, put each node string on different line --- util/collection/Dag.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala index 8a6d0ba46..b617070dd 100644 --- a/util/collection/Dag.scala +++ b/util/collection/Dag.scala @@ -55,7 +55,9 @@ object Dag finished; } final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) - extends Exception( "Cyclic reference involving " + (if(complete) all.mkString(", ") else value) ) + extends Exception( "Cyclic reference involving " + + (if(complete) all.mkString("\n ", "\n ", "") else value) + ) { def this(value: Any) = this(value, value :: Nil, false) def ::(a: Any): Cyclic = From 9e708b17e24271cdd130979c7ac01467bbf4b43c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 27 Nov 2011 17:48:01 -0500 Subject: [PATCH 218/823] fixes #280. 
sort aggregate and classpath dependencies separately to keep cycle detection for them separate --- util/collection/Dag.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala index b617070dd..14a418f26 100644 --- a/util/collection/Dag.scala +++ b/util/collection/Dag.scala @@ -60,6 +60,7 @@ object Dag ) { def this(value: Any) = this(value, value :: Nil, false) + override def toString = getMessage def ::(a: Any): Cyclic = if(complete) this From 0ff6b65376a93612c184cd4e42aa38cdaa490350 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 12 Dec 2011 19:24:58 -0500 Subject: [PATCH 219/823] treat case differences differently --- util/complete/EditDistance.scala | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/util/complete/EditDistance.scala b/util/complete/EditDistance.scala index 218c48c0a..e7e295f2a 100644 --- a/util/complete/EditDistance.scala +++ b/util/complete/EditDistance.scala @@ -1,12 +1,14 @@ package sbt.complete + import java.lang.Character.{toLowerCase => lower} + /** @author Paul Phillips*/ object EditDistance { /** Translated from the java version at * http://www.merriampark.com/ld.htm * which is declared to be public domain. 
*/ - def levenshtein(s: String, t: String, insertCost: Int = 1, deleteCost: Int = 1, subCost: Int = 1, transposeCost: Int = 1, matchCost: Int = 0, transpositions: Boolean = false): Int = { + def levenshtein(s: String, t: String, insertCost: Int = 1, deleteCost: Int = 1, subCost: Int = 1, transposeCost: Int = 1, matchCost: Int = 0, caseCost: Int = 1, transpositions: Boolean = false): Int = { val n = s.length val m = t.length if (n == 0) return m @@ -18,7 +20,7 @@ object EditDistance { for (i <- 1 to n ; val s_i = s(i - 1) ; j <- 1 to m) { val t_j = t(j - 1) - val cost = if (s_i == t_j) matchCost else subCost + val cost = if (s_i == t_j) matchCost else if(lower(s_i) == lower(t_j)) caseCost else subCost val tcost = if (s_i == t_j) matchCost else transposeCost From a9ccd74eb8865f09ddc4955b96c0a0f802a07e10 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 13 Dec 2011 17:29:08 -0500 Subject: [PATCH 220/823] fix 'not' parser combinator, add EOF --- util/complete/Parser.scala | 5 ++++- util/complete/Parsers.scala | 2 ++ 2 files changed, 6 insertions(+), 1 deletion(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index f788d0f4c..5c7bcddc3 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -345,7 +345,10 @@ trait ParserMain else b - def not(p: Parser[_]): Parser[Unit] = new Not(p) + def not(p: Parser[_]): Parser[Unit] = p.result match { + case None => new Not(p) + case Some(_) => failure("Excluded.") + } def oneOf[T](p: Seq[Parser[T]]): Parser[T] = p.reduceLeft(_ | _) def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index fd0568494..367f2fadb 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -11,6 +11,8 @@ package sbt.complete // Some predefined parsers trait Parsers { + lazy val EOF = not(any) + lazy val any: Parser[Char] = charClass(_ => true, "any character") lazy val DigitSet = 
Set("0","1","2","3","4","5","6","7","8","9") From 392ec5150af2038fb02d7b58dc8319f8dbec4406 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 9 Jan 2012 08:00:29 -0500 Subject: [PATCH 221/823] moved task axis before the key --- util/collection/Show.scala | 4 ++++ util/complete/Parser.scala | 3 ++- 2 files changed, 6 insertions(+), 1 deletion(-) diff --git a/util/collection/Show.scala b/util/collection/Show.scala index b19a6ca2d..fe4e85950 100644 --- a/util/collection/Show.scala +++ b/util/collection/Show.scala @@ -2,4 +2,8 @@ package sbt trait Show[T] { def apply(t: T): String +} +object Show +{ + def apply[T](f: T => String): Show[T] = new Show[T] { def apply(t: T): String = f(t) } } \ No newline at end of file diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 5c7bcddc3..a23ae3f48 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -431,7 +431,8 @@ private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) exte { val res = a.map(_.resultEmpty) val (failures, values) = separate(res)(_.toEither) - if(failures.isEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) +// if(failures.isEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) + if(values.nonEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) } def result = { val success = a.flatMap(_.result) From d4c7544f9cde5e8a756436937254838a6334b0be Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 9 Jan 2012 08:00:35 -0500 Subject: [PATCH 222/823] API for embedding incremental compilation --- .../java/xsbti/compile/ClasspathOptions.java | 29 +++++++++ .../main/java/xsbti/compile/CompileOrder.java | 34 +++++++++++ .../main/java/xsbti/compile/Compilers.java | 8 +++ .../main/java/xsbti/compile/DefinesClass.java | 14 +++++ .../xsbti/compile/IncrementalCompiler.java | 60 +++++++++++++++++++ .../src/main/java/xsbti/compile/Inputs.java | 14 +++++ .../main/java/xsbti/compile/JavaCompiler.java | 15 +++++ 
.../src/main/java/xsbti/compile/Options.java | 33 ++++++++++ .../java/xsbti/compile/ScalaInstance.java | 34 +++++++++++ .../src/main/java/xsbti/compile/Setup.java | 24 ++++++++ 10 files changed, 265 insertions(+) create mode 100644 interface/src/main/java/xsbti/compile/ClasspathOptions.java create mode 100644 interface/src/main/java/xsbti/compile/CompileOrder.java create mode 100644 interface/src/main/java/xsbti/compile/Compilers.java create mode 100644 interface/src/main/java/xsbti/compile/DefinesClass.java create mode 100644 interface/src/main/java/xsbti/compile/IncrementalCompiler.java create mode 100644 interface/src/main/java/xsbti/compile/Inputs.java create mode 100644 interface/src/main/java/xsbti/compile/JavaCompiler.java create mode 100644 interface/src/main/java/xsbti/compile/Options.java create mode 100644 interface/src/main/java/xsbti/compile/ScalaInstance.java create mode 100644 interface/src/main/java/xsbti/compile/Setup.java diff --git a/interface/src/main/java/xsbti/compile/ClasspathOptions.java b/interface/src/main/java/xsbti/compile/ClasspathOptions.java new file mode 100644 index 000000000..e4aba32b7 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/ClasspathOptions.java @@ -0,0 +1,29 @@ +package xsbti.compile; + +/** +* Configures modifications to the classpath based on the Scala instance used for compilation. +* This is typically used for the Scala compiler only and all values set to false for the Java compiler. +*/ +public interface ClasspathOptions +{ + /** If true, includes the Scala library on the boot classpath. This should usually be true.*/ + boolean bootLibrary(); + + /** If true, includes the Scala compiler on the standard classpath. + * This is typically false and is instead managed by the build tool or environment. + */ + boolean compiler(); + + /** If true, includes extra jars from the Scala instance on the standard classpath. + * This is typically false and is instead managed by the build tool or environment. 
+ */ + boolean extra(); + + /** If true, automatically configures the boot classpath. This should usually be true.*/ + boolean autoBoot(); + + /** If true, the Scala library jar is filtered from the standard classpath. + * This should usually be true because the library should be included on the boot classpath of the Scala compiler and not the standard classpath. + */ + boolean filterLibrary(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/CompileOrder.java b/interface/src/main/java/xsbti/compile/CompileOrder.java new file mode 100644 index 000000000..62b15bf1f --- /dev/null +++ b/interface/src/main/java/xsbti/compile/CompileOrder.java @@ -0,0 +1,34 @@ +package xsbti.compile; + +/** +* Defines the order in which Scala and Java sources are compiled when compiling a set of sources with both Java and Scala sources. +* This setting has no effect if only Java sources or only Scala sources are being compiled. +* It is generally more efficient to use JavaThenScala or ScalaThenJava when mixed compilation is not required. +*/ +public enum CompileOrder +{ + /** + * Allows Scala sources to depend on Java sources and allows Java sources to depend on Scala sources. + * + * In this mode, both Java and Scala sources are passed to the Scala compiler, which generates class files for the Scala sources. + * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. + * The Scala classes compiled in the first step are included on the classpath to the Java compiler. + */ + Mixed, + /** + * Allows Scala sources to depend on the Java sources in the compilation, but does not allow Java sources to depend on Scala sources. + * + * In this mode, both Java and Scala sources are passed to the Scala compiler, which generates class files for the Scala sources. + * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. 
+ * The Scala classes compiled in the first step are included on the classpath to the Java compiler. + */ + JavaThenScala, + /** + * Allows Java sources to depend on the Scala sources in the compilation, but does not allow Scala sources to depend on Java sources. + * + * In this mode, both Java and Scala sources are passed to the Scala compiler, which generates class files for the Scala sources. + * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. + * The Scala classes compiled in the first step are included on the classpath to the Java compiler. + */ + ScalaThenJava; +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/Compilers.java b/interface/src/main/java/xsbti/compile/Compilers.java new file mode 100644 index 000000000..0bb194534 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/Compilers.java @@ -0,0 +1,8 @@ +package xsbti.compile; + +public interface Compilers +{ + JavaCompiler javac(); + // should be cached by client if desired + ScalaCompiler scalac(); +} diff --git a/interface/src/main/java/xsbti/compile/DefinesClass.java b/interface/src/main/java/xsbti/compile/DefinesClass.java new file mode 100644 index 000000000..6369c2661 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/DefinesClass.java @@ -0,0 +1,14 @@ +package xsbti.compile; + +import java.io.File; + +/** +* Determines if an entry on a classpath contains a class. +*/ +public interface DefinesClass +{ + /** + * Returns true if the classpath entry contains the requested class. 
+ */ + boolean apply(String className); +} diff --git a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java new file mode 100644 index 000000000..9fe301029 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java @@ -0,0 +1,60 @@ +package xsbti.compile; + +import xsbti.Logger; +import java.io.File; + +/* +* This API is subject to change. +* +* It is the client's responsibility to: +* 1. Manage class loaders. Usually the client will want to: +* i. Keep the class loader used by the ScalaInstance warm. +* ii. Keep the class loader of the incremental recompilation classes (xsbti.compile) warm. +* iii. Share the class loader for Scala classes between the incremental compiler implementation and the ScalaInstance where possible (must be binary compatible) +* 2. Manage the compiler interface jar. The interface must be compiled against the exact Scala version used for compilation and a compatible Java version. +* 3. Manage compilation order between different compilations. +* i. Execute a compilation for each dependency, obtaining an Analysis for each. +* ii. Provide the Analysis from previous compilations to dependent compilations in the analysis map. +* 4. Provide an implementation of JavaCompiler for compiling Java sources. +* 5. Define a function that determines if a classpath entry contains a class (Setup.definesClass). +* i. This is provided by the client so that the client can cache this information across compilations when compiling multiple sets of sources. +* ii. The cache should be cleared for each new compilation run or else recompilation will not properly account for changes to the classpath. +* 6. Provide a cache directory. +* i. This directory is used by IncrementalCompiler to persist data between compilations. +* ii. It should be a different directory for each set of sources being compiled. +* 7. Manage parallel execution. +* i. 
Each compilation may be performed in a different thread as long as the dependencies have been compiled already. +* ii. Implementations of all types should be immutable and arrays treated as immutable. +* 8. Ensure general invariants: +* i. The implementations of all types are immutable, except for the already discussed Setup.definesClass. +* ii. Arrays are treated as immutable. +* iii. No value is ever null. +*/ +public interface IncrementalCompiler +{ + /** + * Performs an incremental compilation as configured by `in`. + * The returned Analysis should be provided to compilations depending on the classes from this compilation. + */ + Analysis compile(Inputs in, Logger log); + + /** + * Creates a compiler instance that can be used by the `compile` method. + * + * @param instance The Scala version to use + * @param interfaceJar The compiler interface jar compiled for the Scala version being used + * @param options Configures how arguments to the underlying Scala compiler will be built. + */ + ScalaCompiler newScalaCompiler(ScalaInstance instance, File interfaceJar, ClasspathOptions options, Logger log); + + /** + * Compiles the source interface for a Scala version. The resulting jar can then be used by the `newScalaCompiler` method + * to create a ScalaCompiler for incremental compilation. It is the client's responsibility to manage compiled jars for + * different Scala versions. + * + * @param sourceJar The jar file containing the compiler interface sources. These are published as sbt's compiler-interface-src module. + * @param targetJar Where to create the output jar file containing the compiled classes. + * @param instance The ScalaInstance to compile the compiler interface for. + * @param log The logger to use during compilation. 
*/ + void compileInterfaceJar(File sourceJar, File targetJar, ScalaInstance instance, Logger log); +} diff --git a/interface/src/main/java/xsbti/compile/Inputs.java b/interface/src/main/java/xsbti/compile/Inputs.java new file mode 100644 index 000000000..5c9ded425 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/Inputs.java @@ -0,0 +1,14 @@ +package xsbti.compile; + +/** Configures a single compilation of a single set of sources.*/ +public interface Inputs +{ + /** The Scala and Java compilers to use for compilation.*/ + Compilers compilers(); + + /** Standard compilation options, such as the sources and classpath to use. */ + Options options(); + + /** Configures incremental compilation.*/ + Setup setup(); +} diff --git a/interface/src/main/java/xsbti/compile/JavaCompiler.java b/interface/src/main/java/xsbti/compile/JavaCompiler.java new file mode 100644 index 000000000..05bafdfe1 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/JavaCompiler.java @@ -0,0 +1,15 @@ +package xsbti.compile; + +import java.io.File; +import xsbti.Logger; + +/** +* Interface to a Java compiler. +*/ +public interface JavaCompiler +{ + /** Compiles Java sources using the provided classpath, output directory, and additional options. + * If supported, the number of reported errors should be limited to `maximumErrors`. + * Output should be sent to the provided logger.*/ + void compile(File[] sources, File[] classpath, File outputDirectory, String[] options, int maximumErrors, Logger log); +} diff --git a/interface/src/main/java/xsbti/compile/Options.java b/interface/src/main/java/xsbti/compile/Options.java new file mode 100644 index 000000000..9e0d03e98 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/Options.java @@ -0,0 +1,33 @@ +package xsbti.compile; + +import java.io.File; + +/** Standard compilation options.*/ +public interface Options +{ + /** The classpath to use for compilation. 
+ * This will be modified according to the ClasspathOptions used to configure the ScalaCompiler.*/ + File[] classpath(); + + /** All sources that should be recompiled. + * This should include Scala and Java sources, which are identified by their extension. */ + File[] sources(); + + /** The directory where class files should be generated. + * Incremental compilation will manage the class files in this directory. + * In particular, outdated class files will be deleted before compilation. + * It is important that this directory is exclusively used for one set of sources. */ + File classesDirectory(); + + /** The options to pass to the Scala compiler other than the sources and classpath to use. */ + String[] options(); + + /** The options to pass to the Java compiler other than the sources and classpath to use. */ + String[] javacOptions(); + + /** The maximum number of errors that the Scala compiler should report.*/ + int maxErrors(); + + /** Controls the order in which Java and Scala sources are compiled.*/ + CompileOrder order(); +} diff --git a/interface/src/main/java/xsbti/compile/ScalaInstance.java b/interface/src/main/java/xsbti/compile/ScalaInstance.java new file mode 100644 index 000000000..fd66d9a9c --- /dev/null +++ b/interface/src/main/java/xsbti/compile/ScalaInstance.java @@ -0,0 +1,34 @@ +package xsbti.compile; + +import java.io.File; + +/** +* Defines Scala instance, which is a reference version String, a unique version String, a set of jars, and a class loader for a Scala version. +* +* Note that in this API a 'jar' can actually be any valid classpath entry. +*/ +public interface ScalaInstance +{ + /** The version used to refer to this Scala version. + * It need not be unique and can be a dynamic version like 2.10.0-SNAPSHOT. + */ + String version(); + + /** A class loader providing access to the classes and resources in the library and compiler jars. 
*/ + ClassLoader loader(); + + /** The library jar file.*/ + File libraryJar(); + + /** The compiler jar file.*/ + File compilerJar(); + + /** Jars provided by this Scala instance other than the compiler and library jars. */ + File[] extraJars(); + + /** All jar files provided by this Scala instance.*/ + File[] jars(); + + /** The unique identifier for this Scala instance. An implementation should usually obtain this from the compiler.properties file in the compiler jar. */ + String actualVersion(); +} diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java new file mode 100644 index 000000000..ca9999978 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -0,0 +1,24 @@ +package xsbti.compile; + +import java.io.File; +import xsbti.Maybe; + +/** Configures incremental recompilation. */ +public interface Setup +{ + /** Provides the Analysis for the given classpath entry.*/ + Maybe analysisMap(File file); + + /** Provides a function to determine if classpath entry `file` contains a given class. + * The returned function should generally cache information about `file`, such as the list of entries in a jar. + */ + DefinesClass definesClass(File file); + + /** If true, no sources are actually compiled and the Analysis from the previous compilation is returned.*/ + boolean skip(); + + /** The directory used to cache information across compilations. + * This directory can be removed to force a full recompilation. + * The directory should be unique and not shared between compilations. 
*/ + File cacheDirectory(); +} From 74eaee5a5e835957aae33c8f934d706753593c1f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 9 Jan 2012 08:00:29 -0500 Subject: [PATCH 223/823] new aggregation approach, still need exclusion mechanism --- util/collection/Relation.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/util/collection/Relation.scala b/util/collection/Relation.scala index c5195ffb7..dce3d9048 100644 --- a/util/collection/Relation.scala +++ b/util/collection/Relation.scala @@ -17,7 +17,6 @@ object Relation make(forward, reverse) } - private[sbt] def remove[X,Y](map: M[X,Y], from: X, to: Y): M[X,Y] = map.get(from) match { case Some(tos) => From e23abdfce3ca07b84316498e3963088fcb4641fb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 14 Jan 2012 21:09:11 -0500 Subject: [PATCH 224/823] explicitly close streams on java.lang.Process to avoid descriptor leaks --- util/process/ProcessImpl.scala | 28 ++++++++++++++++++---------- 1 file changed, 18 insertions(+), 10 deletions(-) diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index 69191d054..cffe9f44b 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -55,6 +55,11 @@ object BasicIO final val BufferSize = 8192 final val Newline = System.getProperty("line.separator") + def closeProcessStreams(p: JProcess): Unit = + Seq(p.getOutputStream, p.getInputStream, p.getErrorStream) foreach { s => + if(s ne null) close(s) + } + def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } def processFully(buffer: Appendable): InputStream => Unit = processFully(appendLine(buffer)) def processFully(processLine: String => Unit): InputStream => Unit = @@ -364,8 +369,6 @@ private[sbt] class DummyProcessBuilder(override val toString: String, exitValue override def run(io: ProcessIO): Process = new DummyProcess(exitValue) override def canPipeTo = true } -/** A thin wrapper around a java.lang.Process. 
`ioThreads` are the Threads created to do I/O. -* The implementation of `exitValue` waits until these threads die before returning. */ private class DummyProcess(action: => Int) extends Process { private[this] val exitCode = Future(action) @@ -399,18 +402,23 @@ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProc * returning. */ private class SimpleProcess(p: JProcess, inputThread: Thread, outputThreads: List[Thread]) extends Process { - override def exitValue() = + override def exitValue(): Int = { - try { p.waitFor() }// wait for the process to terminate - finally { inputThread.interrupt() } // we interrupt the input thread to notify it that it can terminate - outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) + andCleanup { p.waitFor() }// wait for the process to terminate p.exitValue() } override def destroy() = - { - try { p.destroy() } - finally { inputThread.interrupt() } - } + andCleanup { p.destroy() } + + private[this] def andCleanup[T](action: => Unit): Unit = + try + { + try { action } + finally { inputThread.interrupt() } // we interrupt the input thread to notify it that it can terminate + outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) + } + finally + BasicIO.closeProcessStreams(p) } private class FileOutput(file: File, append: Boolean) extends OutputStreamBuilder(new FileOutputStream(file, append), file.getAbsolutePath) From c3c7c92053e0b7a86b4ac5ea8bc7306206a3ce4e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 14 Jan 2012 21:09:11 -0500 Subject: [PATCH 225/823] work around unclosed jline history-related streams --- util/complete/LineReader.scala | 65 +++++++++++++++++++++------------- 1 file changed, 40 insertions(+), 25 deletions(-) diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index 66479e736..abbd0e8f5 100644 --- 
a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -3,23 +3,40 @@ */ package sbt - import jline.{Completor, ConsoleReader} - import java.io.File + import jline.{Completor, ConsoleReader, History} + import java.io.{File,PrintWriter} import complete.Parser abstract class JLine extends LineReader { protected[this] val reader: ConsoleReader + protected[this] val historyPath: Option[File] + def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } + private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]) = - (mask match { - case Some(m) => reader.readLine(prompt, m) - case None => reader.readLine(prompt) - }) match + readLineWithHistory(prompt, mask) match { case null => None case x => Some(x.trim) } + + private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): String = + historyPath match + { + case None => readLineDirect(prompt, mask) + case Some(file) => + val h = reader.getHistory + JLine.loadHistory(h, file) + try { readLineDirect(prompt, mask) } + finally { JLine.saveHistory(h, file) } + } + + private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = + mask match { + case Some(m) => reader.readLine(prompt, m) + case None => reader.readLine(prompt) + } } private object JLine { @@ -44,18 +61,20 @@ private object JLine try { action } finally { t.enableEcho() } } - private[sbt] def initializeHistory(cr: ConsoleReader, historyPath: Option[File]): Unit = - for(historyLocation <- historyPath) - { - val historyFile = historyLocation.getAbsoluteFile - ErrorHandling.wideConvert - { - historyFile.getParentFile.mkdirs() - val history = cr.getHistory - history.setMaxSize(MaxHistorySize) - history.setHistoryFile(historyFile) - } + private[sbt] def loadHistory(h: History, file: File) + { + h.setMaxSize(MaxHistorySize) + if(file.isFile) IO.reader(file)( h.load ) + } + private[sbt] def saveHistory(h: History, file: File): Unit = + 
Using.fileWriter()(file) { writer => + val out = new PrintWriter(writer, false) + h.setOutput(out) + h.flushBuffer() + out.close() + h.setOutput(null) } + def simple(historyPath: Option[File]): SimpleReader = new SimpleReader(historyPath) val MaxHistorySize = 500 } @@ -64,24 +83,20 @@ trait LineReader { def readLine(prompt: String, mask: Option[Char] = None): Option[String] } -final class FullReader(historyPath: Option[File], complete: Parser[_]) extends JLine +final class FullReader(val historyPath: Option[File], complete: Parser[_]) extends JLine { protected[this] val reader = { val cr = new ConsoleReader cr.setBellEnabled(false) - JLine.initializeHistory(cr, historyPath) sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } -class SimpleReader private[sbt] (historyPath: Option[File]) extends JLine -{ - protected[this] val reader = JLine.createReader() - JLine.initializeHistory(reader, historyPath) -} -object SimpleReader extends JLine +class SimpleReader private[sbt] (val historyPath: Option[File]) extends JLine { protected[this] val reader = JLine.createReader() } +object SimpleReader extends SimpleReader(None) + From acf8e1ba68f708ce68c08e950e4567ed6b4d215a Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Fri, 20 Jan 2012 17:31:36 +0400 Subject: [PATCH 226/823] Add SourcePosition to setting. 
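The history handling in PATCH 225 above follows a load/act/save discipline: history is read from its backing file before each line read and flushed back afterwards (even on failure), so no history stream stays open between reads. A minimal standalone sketch of that pattern; `InMemoryHistory` and `withHistory` are illustrative names, not sbt or jline API:

```scala
// Illustrative stand-in for jline's History: load entries, append, list them.
final class InMemoryHistory {
  private var entries = List.empty[String]
  def load(lines: List[String]): Unit = entries = lines
  def add(line: String): Unit = entries = entries :+ line
  def lines: List[String] = entries
}

// Load persisted history, run the action, and persist again in a finally
// block, mirroring the try/finally around readLineDirect in the patch.
def withHistory[T](persisted: List[String], save: List[String] => Unit)(action: InMemoryHistory => T): T = {
  val h = new InMemoryHistory
  h.load(persisted)
  try action(h) finally save(h.lines)
}
```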
--- util/collection/Settings.scala | 31 ++++++++++++++++++++++--------- 1 file changed, 22 insertions(+), 9 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 4111c9082..fe97313f2 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -58,10 +58,10 @@ trait Init[Scope] type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey ~> Option - def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) + def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) def value[T](value: => T): Initialize[T] = new Value(value _) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head)), NoPosition) def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) def app[HL <: HList, T](inputs: KList[Initialize, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) @@ -245,19 +245,32 @@ trait Init[Scope] def settings: Seq[Setting[_]] } final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition - final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) extends SettingsDefinition + final class Setting[T](val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition { def settings = this :: Nil def definitive: Boolean = !init.dependencies.contains(key) def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) - def 
validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI)) - def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t))) - def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g) - override def toString = "setting(" + key + ")" + def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g, pos) + def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI, pos)) + def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) + def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) + def setPos(pos: SourceCoord) = new Setting(key, init, pos) + override def toString = "setting(" + key + ") at " + pos } + trait SourcePosition { + def fileName: String + def line: Int + } + + case object NoPosition extends SourcePosition { + override def fileName = throw new UnsupportedOperationException("NoPosition") + override def line = throw new UnsupportedOperationException("NoPosition") + } + + case class SourceCoord(fileName: String, line: Int) extends SourcePosition + // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } From e1182031a0adc6b2023d74dfe93ba8100968306d Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Fri, 20 Jan 2012 17:31:36 +0400 Subject: [PATCH 227/823] Add SourcePosition to setting. 
--- util/collection/Settings.scala | 31 ++++++++++++++++++++++--------- 1 file changed, 22 insertions(+), 9 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 4111c9082..fe97313f2 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -58,10 +58,10 @@ trait Init[Scope] type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey ~> Option - def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) + def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) def value[T](value: => T): Initialize[T] = new Value(value _) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head)), NoPosition) def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) def app[HL <: HList, T](inputs: KList[Initialize, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) @@ -245,19 +245,32 @@ trait Init[Scope] def settings: Seq[Setting[_]] } final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition - final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) extends SettingsDefinition + final class Setting[T](val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition { def settings = this :: Nil def definitive: Boolean = !init.dependencies.contains(key) def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) - def 
validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI)) - def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t))) - def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g) - override def toString = "setting(" + key + ")" + def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g, pos) + def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI, pos)) + def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) + def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) + def setPos(pos: SourceCoord) = new Setting(key, init, pos) + override def toString = "setting(" + key + ") at " + pos } + trait SourcePosition { + def fileName: String + def line: Int + } + + case object NoPosition extends SourcePosition { + override def fileName = throw new UnsupportedOperationException("NoPosition") + override def line = throw new UnsupportedOperationException("NoPosition") + } + + case class SourceCoord(fileName: String, line: Int) extends SourcePosition + // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } From a3fa54be2ce509abbdcaae1e5b6768bf40c6e631 Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Tue, 24 Jan 2012 13:32:21 +0400 Subject: [PATCH 228/823] Change SourcePosition definition + minor cleanup. 
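The first version of `SourcePosition` above exposes `fileName` and `line` as abstract members that `NoPosition` can only implement by throwing; the cleanup that follows seals the hierarchy so callers pattern match instead. A self-contained sketch of the sealed form, using the same definitions as the diff:

```scala
// Sealed ADT form of SourcePosition: the compiler checks match exhaustiveness,
// and NoPosition needs no throwing accessors.
sealed trait SourcePosition
case object NoPosition extends SourcePosition
final case class SourceCoord(fileName: String, line: Int) extends SourcePosition

def describe(pos: SourcePosition): String = pos match {
  case NoPosition              => "<no position>"
  case SourceCoord(file, line) => file + ":" + line
}
```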
--- util/collection/Settings.scala | 12 ++---------- 1 file changed, 2 insertions(+), 10 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index fe97313f2..1c5634c1c 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -259,16 +259,8 @@ trait Init[Scope] override def toString = "setting(" + key + ") at " + pos } - trait SourcePosition { - def fileName: String - def line: Int - } - - case object NoPosition extends SourcePosition { - override def fileName = throw new UnsupportedOperationException("NoPosition") - override def line = throw new UnsupportedOperationException("NoPosition") - } - + sealed trait SourcePosition + case object NoPosition extends SourcePosition case class SourceCoord(fileName: String, line: Int) extends SourcePosition // mainly for reducing generated class count From 79ee34fa9706826859a2456479426ad0895ad6eb Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Fri, 27 Jan 2012 17:51:13 +0400 Subject: [PATCH 229/823] More cleanup. 
--- util/collection/Settings.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 1c5634c1c..01c3d5e2f 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -255,13 +255,13 @@ trait Init[Scope] def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) - def setPos(pos: SourceCoord) = new Setting(key, init, pos) + def withPos(pos: SourceCoord) = new Setting(key, init, pos) override def toString = "setting(" + key + ") at " + pos } sealed trait SourcePosition case object NoPosition extends SourcePosition - case class SourceCoord(fileName: String, line: Int) extends SourcePosition + final case class SourceCoord(fileName: String, line: Int) extends SourcePosition // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = From 494553461416a9e9ad89e88faab6c8c95e0cc53f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 29 Jan 2012 14:36:27 -0500 Subject: [PATCH 230/823] split command core to main/command/ --- util/collection/Settings.scala | 2 +- util/control/MessageOnlyException.scala | 9 ++++++- util/log/GlobalLogging.scala | 27 +++++++++++++++++++ util/log/MainLogging.scala | 36 +++++++++++++++++++++++++ 4 files changed, 72 insertions(+), 2 deletions(-) create mode 100644 util/log/GlobalLogging.scala create mode 100644 util/log/MainLogging.scala diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 4111c9082..d8945a787 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -177,7 +177,7 @@ trait Init[Scope] if(dist < 0) None else Some(dist) } - final class Uninitialized(val undefined: Seq[Undefined], msg: String) extends Exception(msg) + final class 
Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString) final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(definingKey, referencedKey) diff --git a/util/control/MessageOnlyException.scala b/util/control/MessageOnlyException.scala index 6791a9339..7fa43746d 100644 --- a/util/control/MessageOnlyException.scala +++ b/util/control/MessageOnlyException.scala @@ -4,4 +4,11 @@ package sbt final class MessageOnlyException(override val toString: String) extends RuntimeException(toString) -final class NoMessageException extends RuntimeException \ No newline at end of file + +/** A dummy exception for the top-level exception handler to know that an exception +* has been handled, but is being passed further up to indicate general failure. */ +final class AlreadyHandledException extends RuntimeException + +/** A marker trait for a top-level exception handler to know that this exception +* doesn't make sense to display. 
*/ +trait UnprintableException extends Throwable \ No newline at end of file diff --git a/util/log/GlobalLogging.scala b/util/log/GlobalLogging.scala new file mode 100644 index 000000000..e54b00a00 --- /dev/null +++ b/util/log/GlobalLogging.scala @@ -0,0 +1,27 @@ +/* sbt -- Simple Build Tool + * Copyright 2010 Mark Harrah + */ +package sbt + + import java.io.{File, PrintWriter} + +final case class GlobalLogging(full: Logger, backed: ConsoleLogger, backing: GlobalLogBacking) +final case class GlobalLogBacking(file: File, last: Option[File], newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: () => File) +{ + def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newLogger, newBackingFile) + def shiftNew() = shift(newBackingFile()) + def unshift = GlobalLogBacking(last getOrElse file, None, newLogger, newBackingFile) +} +object GlobalLogBacking +{ + def apply(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File): GlobalLogBacking = + GlobalLogBacking(newBackingFile, None, newLogger, newBackingFile _) +} +object GlobalLogging +{ + def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File): GlobalLogging = + { + val log = ConsoleLogger() + GlobalLogging(log, log, GlobalLogBacking(newLogger, newBackingFile)) + } +} \ No newline at end of file diff --git a/util/log/MainLogging.scala b/util/log/MainLogging.scala new file mode 100644 index 000000000..b07abf4e3 --- /dev/null +++ b/util/log/MainLogging.scala @@ -0,0 +1,36 @@ +package sbt + + import java.io.PrintWriter + +object MainLogging +{ + def multiLogger(config: MultiLoggerConfig): Logger = + { + import config._ + val multi = new MultiLogger(console :: backed :: extra) + // sets multi to the most verbose for clients that inspect the current level + multi setLevel Level.unionAll(backingLevel :: screenLevel :: extra.map(_.getLevel)) + // set the specific levels + console setLevel screenLevel + backed setLevel 
backingLevel + console setTrace screenTrace + backed setTrace backingTrace + multi: Logger + } + def globalDefault(writer: PrintWriter, backing: GlobalLogBacking): GlobalLogging = + { + val backed = defaultBacked()(writer) + val full = multiLogger(defaultMultiConfig( backed ) ) + GlobalLogging(full, backed, backing) + } + + def defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = + new MultiLoggerConfig(defaultScreen, backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) + + def defaultScreen: AbstractLogger = ConsoleLogger() + + def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = + to => ConsoleLogger(ConsoleLogger.printWriterOut(to), useColor = useColor) // TODO: should probably filter ANSI codes when useColor=false +} + +final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) \ No newline at end of file From c6cba20682aa6925030ee6874f6f0a199d8d378d Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Fri, 20 Jan 2012 17:31:36 +0400 Subject: [PATCH 231/823] Add SourcePosition to setting. 
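The `multiLogger` in PATCH 230 above sets the combined logger to the most verbose of its members' levels via `Level.unionAll`. A sketch of that rule in isolation, assuming (as sbt's `Level` does) a numeric ordering in which more verbose levels have lower ids; the definitions here are illustrative, not sbt's actual `Level` object:

```scala
// "Most verbose wins": union picks the level with the lower id, so a multi
// logger containing one Debug member reports Debug overall.
object Level extends Enumeration {
  val Debug = Value(1, "debug"); val Info = Value(2, "info")
  val Warn = Value(3, "warn");   val Error = Value(4, "error")
  def union(a: Value, b: Value): Value = if (a.id < b.id) a else b
  def unionAll(vs: Seq[Value]): Value = vs reduce union
}
```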
--- util/collection/Settings.scala | 31 ++++++++++++++++++++++--------- 1 file changed, 22 insertions(+), 9 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 4111c9082..fe97313f2 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -58,10 +58,10 @@ trait Init[Scope] type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey ~> Option - def setting[T](key: ScopedKey[T], init: Initialize[T]): Setting[T] = new Setting[T](key, init) + def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) def value[T](value: => T): Initialize[T] = new Value(value _) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head))) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head)), NoPosition) def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) def app[HL <: HList, T](inputs: KList[Initialize, HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) @@ -245,19 +245,32 @@ trait Init[Scope] def settings: Seq[Setting[_]] } final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition - final class Setting[T](val key: ScopedKey[T], val init: Initialize[T]) extends SettingsDefinition + final class Setting[T](val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition { def settings = this :: Nil def definitive: Boolean = !init.dependencies.contains(key) def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g) - def 
validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI)) - def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t))) - def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g) - override def toString = "setting(" + key + ")" + def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g, pos) + def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI, pos)) + def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) + def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) + def setPos(pos: SourceCoord) = new Setting(key, init, pos) + override def toString = "setting(" + key + ") at " + pos } + trait SourcePosition { + def fileName: String + def line: Int + } + + case object NoPosition extends SourcePosition { + override def fileName = throw new UnsupportedOperationException("NoPosition") + override def line = throw new UnsupportedOperationException("NoPosition") + } + + case class SourceCoord(fileName: String, line: Int) extends SourcePosition + // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } From 5f0774fe9adb705f2d8a692027b7d689ab3ddb13 Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Tue, 24 Jan 2012 13:32:21 +0400 Subject: [PATCH 232/823] Change SourcePosition definition + minor cleanup. 
--- util/collection/Settings.scala | 12 ++---------- 1 file changed, 2 insertions(+), 10 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index fe97313f2..1c5634c1c 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -259,16 +259,8 @@ trait Init[Scope] override def toString = "setting(" + key + ") at " + pos } - trait SourcePosition { - def fileName: String - def line: Int - } - - case object NoPosition extends SourcePosition { - override def fileName = throw new UnsupportedOperationException("NoPosition") - override def line = throw new UnsupportedOperationException("NoPosition") - } - + sealed trait SourcePosition + case object NoPosition extends SourcePosition case class SourceCoord(fileName: String, line: Int) extends SourcePosition // mainly for reducing generated class count From e2c1ef32fa13ad0527c30565829cae6c9040dcca Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Fri, 27 Jan 2012 17:51:13 +0400 Subject: [PATCH 233/823] More cleanup. 
--- util/collection/Settings.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 1c5634c1c..01c3d5e2f 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -255,13 +255,13 @@ trait Init[Scope] def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) - def setPos(pos: SourceCoord) = new Setting(key, init, pos) + def withPos(pos: SourceCoord) = new Setting(key, init, pos) override def toString = "setting(" + key + ") at " + pos } sealed trait SourcePosition case object NoPosition extends SourcePosition - case class SourceCoord(fileName: String, line: Int) extends SourcePosition + final case class SourceCoord(fileName: String, line: Int) extends SourcePosition // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = From 786fe5f4cec14abe3219c6fde018af56a9b919c3 Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Thu, 16 Feb 2012 16:58:51 +0400 Subject: [PATCH 234/823] Remember the range for settings read from .sbt files --- util/collection/Positions.scala | 20 ++++++++++++++++++++ util/collection/Settings.scala | 6 +----- 2 files changed, 21 insertions(+), 5 deletions(-) create mode 100755 util/collection/Positions.scala diff --git a/util/collection/Positions.scala b/util/collection/Positions.scala new file mode 100755 index 000000000..2cd3f77e1 --- /dev/null +++ b/util/collection/Positions.scala @@ -0,0 +1,20 @@ +package sbt + +sealed trait SourcePosition + +sealed trait FilePosition { + def path: String + def startLine: Int +} + +case object NoPosition extends SourcePosition + +final case class LinePosition(path: String, startLine: Int) extends SourcePosition with FilePosition + +final case class 
LineRange(start: Int, end: Int) { + def shift(n: Int) = new LineRange(start + n, end + n) +} + +final case class RangePosition(path: String, range: LineRange) extends SourcePosition with FilePosition { + def startLine = range.start +} diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 210c8e3c1..6df7291af 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -255,14 +255,10 @@ trait Init[Scope] def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) - def withPos(pos: SourceCoord) = new Setting(key, init, pos) + def withPos(pos: SourcePosition) = new Setting(key, init, pos) override def toString = "setting(" + key + ") at " + pos } - sealed trait SourcePosition - case object NoPosition extends SourcePosition - final case class SourceCoord(fileName: String, line: Int) extends SourcePosition - // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } From 0f3c75a2eaba48bd1b32f730c88f71850f97156a Mon Sep 17 00:00:00 2001 From: "Daniel C. Sobral" Date: Fri, 17 Feb 2012 17:42:32 -0200 Subject: [PATCH 235/823] Revert "explicitly close streams" Revert "explicitly close streams on java.lang.Process to avoid descriptor leaks" This reverts commit 3191eedf9e3e1eee6b7d2144d11fe248f191654a. 
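The position types added in PATCH 234 compose naturally: shifting a `LineRange` moves both endpoints by the same offset, which is how positions recorded for a `.sbt` fragment can be re-based after the fragment is embedded at some offset in a larger file. A small sketch, redefining the two case classes from `Positions.scala` so it stands alone:

```scala
// From Positions.scala: a range of lines, and a position that pairs a path
// with such a range.
final case class LineRange(start: Int, end: Int) {
  def shift(n: Int) = LineRange(start + n, end + n)
}
final case class RangePosition(path: String, range: LineRange) {
  def startLine = range.start
}

// A setting spanning lines 3-5 of a fragment, re-based 10 lines into a file.
val inFragment = RangePosition("build.sbt", LineRange(3, 5))
val rebased = inFragment.copy(range = inFragment.range.shift(10))
```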
--- util/process/ProcessImpl.scala | 28 ++++++++++------------------ 1 file changed, 10 insertions(+), 18 deletions(-) diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index cffe9f44b..69191d054 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -55,11 +55,6 @@ object BasicIO final val BufferSize = 8192 final val Newline = System.getProperty("line.separator") - def closeProcessStreams(p: JProcess): Unit = - Seq(p.getOutputStream, p.getInputStream, p.getErrorStream) foreach { s => - if(s ne null) close(s) - } - def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } def processFully(buffer: Appendable): InputStream => Unit = processFully(appendLine(buffer)) def processFully(processLine: String => Unit): InputStream => Unit = @@ -369,6 +364,8 @@ private[sbt] class DummyProcessBuilder(override val toString: String, exitValue override def run(io: ProcessIO): Process = new DummyProcess(exitValue) override def canPipeTo = true } +/** A thin wrapper around a java.lang.Process. `ioThreads` are the Threads created to do I/O. +* The implementation of `exitValue` waits until these threads die before returning. */ private class DummyProcess(action: => Int) extends Process { private[this] val exitCode = Future(action) @@ -402,23 +399,18 @@ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProc * returning. 
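The "Protect System.in" part of this fix appears in the diff as `Uncloseable protect System.in`: the process-input writer now closes its stream when done, so the real stdin must be wrapped in a stream whose `close()` is a no-op. `Uncloseable` is an sbt internal; a minimal sketch of the same idea:

```scala
import java.io.{FilterInputStream, InputStream}

// Wrap a stream so that close() is swallowed: code downstream may close its
// input defensively without closing the JVM's one and only stdin.
def uncloseable(in: InputStream): InputStream =
  new FilterInputStream(in) {
    override def close(): Unit = () // deliberately ignore close()
  }
```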
*/ private class SimpleProcess(p: JProcess, inputThread: Thread, outputThreads: List[Thread]) extends Process { - override def exitValue(): Int = + override def exitValue() = { - andCleanup { p.waitFor() }// wait for the process to terminate + try { p.waitFor() }// wait for the process to terminate + finally { inputThread.interrupt() } // we interrupt the input thread to notify it that it can terminate + outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) p.exitValue() } override def destroy() = - andCleanup { p.destroy() } - - private[this] def andCleanup[T](action: => Unit): Unit = - try - { - try { action } - finally { inputThread.interrupt() } // we interrupt the input thread to notify it that it can terminate - outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) - } - finally - BasicIO.closeProcessStreams(p) + { + try { p.destroy() } + finally { inputThread.interrupt() } + } } private class FileOutput(file: File, append: Boolean) extends OutputStreamBuilder(new FileOutputStream(file, append), file.getAbsolutePath) From be6cd00b81ae155a94f93e40a0f182e8fb6b51bf Mon Sep 17 00:00:00 2001 From: "Daniel C. Sobral" Date: Fri, 17 Feb 2012 17:51:04 -0200 Subject: [PATCH 236/823] Fix file descriptor leak. Close an InputStream when finished reading it. When given an OutputStream to connect to a process input, close it when the transfer is completed. Protect System.in in this latter case. 
--- util/process/ProcessImpl.scala | 12 ++++++++---- 1 file changed, 8 insertions(+), 4 deletions(-) diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index 69191d054..f29d8bfa3 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -51,7 +51,6 @@ object BasicIO private def processErrFully(log: ProcessLogger) = processFully(s => log.error(s)) private def processInfoFully(log: ProcessLogger) = processFully(s => log.info(s)) - def ignoreOut = (i: OutputStream) => () final val BufferSize = 8192 final val Newline = System.getProperty("line.separator") @@ -62,6 +61,7 @@ object BasicIO { val reader = new BufferedReader(new InputStreamReader(in)) processLinesFully(processLine)(reader.readLine) + reader.close() } def processLinesFully(processLine: String => Unit)(readLine: () => String) { @@ -76,8 +76,11 @@ object BasicIO } readFully() } - def connectToIn(o: OutputStream) { transferFully(System.in, o) } - def input(connect: Boolean): OutputStream => Unit = if(connect) connectToIn else ignoreOut + def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } + def input(connect: Boolean): OutputStream => Unit = { outputToProcess => + if(connect) connectToIn(outputToProcess) + else outputToProcess.close() + } def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput)) def standard(in: OutputStream => Unit): ProcessIO = new ProcessIO(in, toStdOut, toStdErr) @@ -110,6 +113,7 @@ object BasicIO } } read + in.close() } } @@ -469,4 +473,4 @@ private object Streamed } } -private final class Streamed[T](val process: T => Unit, val done: Int => Unit, val stream: () => Stream[T]) extends NotNull \ No newline at end of file +private final class Streamed[T](val process: T => Unit, val done: Int => Unit, val stream: () => Stream[T]) extends NotNull From 4b43a154cb812032033312b2c08aff6b01d39318 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 19 Feb 2012 22:41:26 -0500 Subject: [PATCH 
237/823] cleanup, fix compilation --- util/process/ProcessImpl.scala | 10 ++++------ 1 file changed, 4 insertions(+), 6 deletions(-) diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index f29d8bfa3..d14b64f78 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -51,6 +51,7 @@ object BasicIO private def processErrFully(log: ProcessLogger) = processFully(s => log.error(s)) private def processInfoFully(log: ProcessLogger) = processFully(s => log.info(s)) + def closeOut = (_: OutputStream).close() final val BufferSize = 8192 final val Newline = System.getProperty("line.separator") @@ -61,7 +62,7 @@ object BasicIO { val reader = new BufferedReader(new InputStreamReader(in)) processLinesFully(processLine)(reader.readLine) - reader.close() + reader.close() } def processLinesFully(processLine: String => Unit)(readLine: () => String) { @@ -77,10 +78,7 @@ object BasicIO readFully() } def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } - def input(connect: Boolean): OutputStream => Unit = { outputToProcess => - if(connect) connectToIn(outputToProcess) - else outputToProcess.close() - } + def input(connect: Boolean): OutputStream => Unit = if(connect) connectToIn else closeOut def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput)) def standard(in: OutputStream => Unit): ProcessIO = new ProcessIO(in, toStdOut, toStdErr) @@ -113,7 +111,7 @@ object BasicIO } } read - in.close() + in.close() } } From 5893aa0e55ddc4bd9c2841bc962c3ee990fcab78 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 25 Feb 2012 12:01:07 -0500 Subject: [PATCH 238/823] cleanup SourcePosition hierarchy --- util/collection/Positions.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/util/collection/Positions.scala b/util/collection/Positions.scala index 2cd3f77e1..b2aa22ee2 100755 --- a/util/collection/Positions.scala +++ b/util/collection/Positions.scala @@ -2,19 
+2,19 @@ package sbt sealed trait SourcePosition -sealed trait FilePosition { +sealed trait FilePosition extends SourcePosition { def path: String def startLine: Int } case object NoPosition extends SourcePosition -final case class LinePosition(path: String, startLine: Int) extends SourcePosition with FilePosition +final case class LinePosition(path: String, startLine: Int) extends FilePosition final case class LineRange(start: Int, end: Int) { def shift(n: Int) = new LineRange(start + n, end + n) } -final case class RangePosition(path: String, range: LineRange) extends SourcePosition with FilePosition { +final case class RangePosition(path: String, range: LineRange) extends FilePosition { def startLine = range.start } From 27970799c824866090cb7ccb44d4ccd17f0b89f8 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Sun, 4 Mar 2012 13:19:58 +0100 Subject: [PATCH 239/823] Macro def aware recompilation. - Read macro modifier from method definition. - Always recompile downstream files after a file containing macro defs is recompiled. - Source is extended with a hasMacro attribute. Mark suggests that this might be better tracked in Relations, but I'm not sure how to make that change. 
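The `Modifiers` change that follows claims bit 6 (`MacroBit`) of the packed flag byte, so the macro flag survives serialization alongside the other modifiers. The encoding scheme can be sketched as below; `Flags.pack` is a simplified stand-in for the real constructor, which takes all seven booleans:

```scala
// Packed-flags scheme as in xsbti.api.Modifiers: each boolean occupies one
// bit of a single byte; adding isMacro means claiming the next free bit.
object Flags {
  val AbstractBit = 0; val OverrideBit = 1; val FinalBit = 2
  val SealedBit = 3;   val ImplicitBit = 4; val LazyBit = 5; val MacroBit = 6

  private def flag(set: Boolean, bit: Int): Int = if (set) 1 << bit else 0

  def pack(isAbstract: Boolean, isLazy: Boolean, isMacro: Boolean): Byte =
    (flag(isAbstract, AbstractBit) | flag(isLazy, LazyBit) | flag(isMacro, MacroBit)).toByte

  def has(flags: Byte, bit: Int): Boolean = (flags & (1 << bit)) != 0
}
```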
--- interface/other | 1 + interface/src/main/java/xsbti/api/Modifiers.java | 12 +++++++++--- 2 files changed, 10 insertions(+), 3 deletions(-) diff --git a/interface/other b/interface/other index 993c9d4f6..aeec1ae5a 100644 --- a/interface/other +++ b/interface/other @@ -3,6 +3,7 @@ Source hash: Byte* api: SourceAPI apiHash: Int + hasMacro: Boolean SourceAPI packages : Package* diff --git a/interface/src/main/java/xsbti/api/Modifiers.java b/interface/src/main/java/xsbti/api/Modifiers.java index 14737be57..575879608 100644 --- a/interface/src/main/java/xsbti/api/Modifiers.java +++ b/interface/src/main/java/xsbti/api/Modifiers.java @@ -8,13 +8,14 @@ public final class Modifiers implements java.io.Serializable private static final int SealedBit = 3; private static final int ImplicitBit = 4; private static final int LazyBit = 5; + private static final int MacroBit = 6; private static final int flag(boolean set, int bit) { return set ? (1 << bit) : 0; } - public Modifiers(boolean isAbstract, boolean isOverride, boolean isFinal, boolean isSealed, boolean isImplicit, boolean isLazy) + public Modifiers(boolean isAbstract, boolean isOverride, boolean isFinal, boolean isSealed, boolean isImplicit, boolean isLazy, boolean isMacro) { this.flags = (byte)( flag(isAbstract, AbstractBit) | @@ -22,7 +23,8 @@ public final class Modifiers implements java.io.Serializable flag(isFinal, FinalBit) | flag(isSealed, SealedBit) | flag(isImplicit, ImplicitBit) | - flag(isLazy, LazyBit) + flag(isLazy, LazyBit) | + flag(isMacro, MacroBit) ); } @@ -62,8 +64,12 @@ public final class Modifiers implements java.io.Serializable { return flag(LazyBit); } + public final boolean isMacro() + { + return flag(MacroBit); + } public String toString() { - return "Modifiers(" + "isAbstract: " + isAbstract() + ", " + "isOverride: " + isOverride() + ", " + "isFinal: " + isFinal() + ", " + "isSealed: " + isSealed() + ", " + "isImplicit: " + isImplicit() + ", " + "isLazy: " + isLazy()+ ")"; + return "Modifiers(" 
+ "isAbstract: " + isAbstract() + ", " + "isOverride: " + isOverride() + ", " + "isFinal: " + isFinal() + ", " + "isSealed: " + isSealed() + ", " + "isImplicit: " + isImplicit() + ", " + "isLazy: " + isLazy() + ", " + "isMacro: " + isMacro()+ ")"; } } From 26be0c0be47fddaae52083ccddb153c73fc50855 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 9 Mar 2012 07:08:38 -0500 Subject: [PATCH 240/823] move error processing to complete/ --- util/complete/Parser.scala | 6 ++++++ util/complete/ProcessError.scala | 30 ++++++++++++++++++++++++++++++ 2 files changed, 36 insertions(+) create mode 100644 util/complete/ProcessError.scala diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index a23ae3f48..5f9086b58 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -277,6 +277,12 @@ trait ParserMain def unapply[A,B](t: (A,B)): Some[(A,B)] = Some(t) } + def parse[T](str: String, parser: Parser[T]): Either[String, T] = + Parser.result(parser, str).left.map { failures => + val (msgs,pos) = failures() + ProcessError(str, msgs, pos) + } + // intended to be temporary pending proper error feedback def result[T](p: Parser[T], s: String): Either[() => (Seq[String],Int), T] = { diff --git a/util/complete/ProcessError.scala b/util/complete/ProcessError.scala new file mode 100644 index 000000000..76ea2f71d --- /dev/null +++ b/util/complete/ProcessError.scala @@ -0,0 +1,30 @@ +package sbt.complete + +object ProcessError +{ + def apply(command: String, msgs: Seq[String], index: Int): String = + { + val (line, modIndex) = extractLine(command, index) + val point = pointerSpace(command, modIndex) + msgs.mkString("\n") + "\n" + line + "\n" + point + "^" + } + def extractLine(s: String, i: Int): (String, Int) = + { + val notNewline = (c: Char) => c != '\n' && c != '\r' + val left = takeRightWhile( s.substring(0, i) )( notNewline ) + val right = s substring i takeWhile notNewline + (left + right, left.length) + } + def takeRightWhile(s: String)(pred: 
Char => Boolean): String = + { + def loop(i: Int): String = + if(i < 0) + s + else if( pred(s(i)) ) + loop(i-1) + else + s.substring(i+1) + loop(s.length - 1) + } + def pointerSpace(s: String, i: Int): String = (s take i) map { case '\t' => '\t'; case _ => ' ' } mkString; +} \ No newline at end of file From 1848d14815d4538a57d59c2de4a87519cea4c54b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 9 Mar 2012 07:08:38 -0500 Subject: [PATCH 241/823] handle CONT signal to reset JLine after resuming from stop. fixes #394 --- util/collection/Signal.scala | 8 ++++---- util/complete/LineReader.scala | 22 ++++++++++++++++++---- 2 files changed, 22 insertions(+), 8 deletions(-) diff --git a/util/collection/Signal.scala b/util/collection/Signal.scala index 09756249d..3152d4d49 100644 --- a/util/collection/Signal.scala +++ b/util/collection/Signal.scala @@ -2,13 +2,13 @@ package sbt object Signals { - def withHandler[T](handler: () => Unit)(action: () => T): T = + def withHandler[T](handler: () => Unit, signal: String = "INT")(action: () => T): T = { val result = try { val signals = new Signals0 - signals.withHandler(handler)(action) + signals.withHandler(signal, handler, action) } catch { case e: LinkageError => Right(action()) } @@ -26,10 +26,10 @@ private final class Signals0 { // returns a LinkageError in `action` as Left(t) in order to avoid it being // incorrectly swallowed as missing Signal/SignalHandler - def withHandler[T](handler: () => Unit)(action: () => T): Either[Throwable, T] = + def withHandler[T](signal: String, handler: () => Unit, action: () => T): Either[Throwable, T] = { import sun.misc.{Signal,SignalHandler} - val intSignal = new Signal("INT") + val intSignal = new Signal(signal) val newHandler = new SignalHandler { def handle(sig: Signal) { handler() } } diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index abbd0e8f5..1cb90b55b 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -9,6 +9,7 
@@ package sbt abstract class JLine extends LineReader { + protected[this] val handleCONT: Boolean protected[this] val reader: ConsoleReader protected[this] val historyPath: Option[File] @@ -33,10 +34,22 @@ abstract class JLine extends LineReader } private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = + if(handleCONT) + Signals.withHandler(() => resume(), signal = "CONT")( () => readLineDirectRaw(prompt, mask) ) + else + readLineDirectRaw(prompt, mask) + private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = mask match { case Some(m) => reader.readLine(prompt, m) case None => reader.readLine(prompt) } + private[this] def resume() + { + jline.Terminal.resetTerminal + JLine.terminal.disableEcho() + reader.drawLine() + reader.flushConsole() + } } private object JLine { @@ -75,15 +88,16 @@ private object JLine h.setOutput(null) } - def simple(historyPath: Option[File]): SimpleReader = new SimpleReader(historyPath) + def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) val MaxHistorySize = 500 + val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") } trait LineReader { def readLine(prompt: String, mask: Option[Char] = None): Option[String] } -final class FullReader(val historyPath: Option[File], complete: Parser[_]) extends JLine +final class FullReader(val historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT) extends JLine { protected[this] val reader = { @@ -94,9 +108,9 @@ final class FullReader(val historyPath: Option[File], complete: Parser[_]) exten } } -class SimpleReader private[sbt] (val historyPath: Option[File]) extends JLine +class SimpleReader private[sbt] (val historyPath: Option[File], val handleCONT: Boolean) extends JLine { protected[this] val reader = JLine.createReader() } -object SimpleReader extends SimpleReader(None) +object SimpleReader extends SimpleReader(None, 
JLine.HandleCONT) From d6bc087271acf3ed8e683db68945284b675bd5c8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 9 Mar 2012 13:38:45 -0500 Subject: [PATCH 242/823] handle absence of CONT signal --- util/collection/Signal.scala | 18 +++++++++++++++++- util/complete/LineReader.scala | 4 ++-- 2 files changed, 19 insertions(+), 3 deletions(-) diff --git a/util/collection/Signal.scala b/util/collection/Signal.scala index 3152d4d49..8bad472cd 100644 --- a/util/collection/Signal.scala +++ b/util/collection/Signal.scala @@ -2,7 +2,9 @@ package sbt object Signals { - def withHandler[T](handler: () => Unit, signal: String = "INT")(action: () => T): T = + val CONT = "CONT" + val INT = "INT" + def withHandler[T](handler: () => Unit, signal: String = INT)(action: () => T): T = { val result = try @@ -17,6 +19,13 @@ object Signals case Right(v) => v } } + def supported(signal: String): Boolean = + try + { + val signals = new Signals0 + signals.supported(signal) + } + catch { case e: LinkageError => false } } // Must only be referenced using a @@ -24,6 +33,13 @@ object Signals // block to private final class Signals0 { + def supported(signal: String): Boolean = + { + import sun.misc.Signal + try { new Signal(signal); true } + catch { case e: IllegalArgumentException => false } + } + // returns a LinkageError in `action` as Left(t) in order to avoid it being // incorrectly swallowed as missing Signal/SignalHandler def withHandler[T](signal: String, handler: () => Unit, action: () => T): Either[Throwable, T] = diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index 1cb90b55b..a48e4c141 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -35,7 +35,7 @@ abstract class JLine extends LineReader private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = if(handleCONT) - Signals.withHandler(() => resume(), signal = "CONT")( () => readLineDirectRaw(prompt, mask) ) + Signals.withHandler(() => resume(), 
signal = Signals.CONT)( () => readLineDirectRaw(prompt, mask) ) else readLineDirectRaw(prompt, mask) private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = @@ -90,7 +90,7 @@ private object JLine def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) val MaxHistorySize = 500 - val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") + val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") && Signals.supported(Signals.CONT) } trait LineReader From 48170f649a0c0d7f3bb06e611d385fde2504bb91 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 10 Mar 2012 14:16:40 -0500 Subject: [PATCH 243/823] convenience functions for testing parsers --- util/complete/Parser.scala | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 5f9086b58..e299edfd0 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -283,6 +283,16 @@ trait ParserMain ProcessError(str, msgs, pos) } + def sample(str: String, parser: Parser[_], completions: Boolean = false): Unit = + if(completions) sampleCompletions(str, parser) else sampleParse(str, parser) + def sampleParse(str: String, parser: Parser[_]): Unit = + parse(str, parser) match { + case Left(msg) => println(msg) + case Right(v) => println(v) + } + def sampleCompletions(str: String, parser: Parser[_], level: Int = 1): Unit = + Parser.completions(parser, str, level).get foreach println + // intended to be temporary pending proper error feedback def result[T](p: Parser[T], s: String): Either[() => (Seq[String],Int), T] = { From 8a7a3228e86441d21a387a4b8da182f46e4faf36 Mon Sep 17 00:00:00 2001 From: "e.e d3si9n" Date: Sun, 11 Mar 2012 07:31:39 -0400 Subject: [PATCH 244/823] implemented parser for escaped string and verbatim string --- util/complete/Parsers.scala | 45 ++++++++++++++++++++++++++++++++++++- 1 file changed, 44 insertions(+), 1
deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 367f2fadb..43c4d5799 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -17,6 +17,10 @@ trait Parsers lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet + lazy val OctalDigitSet = Set("0","1","2","3","4","5","6","7") + lazy val OctalDigit = charClass(c => OctalDigitSet(c.toString), "octal") examples OctalDigitSet + lazy val HexDigitSet = Set("0","1","2","3","4","5","6","7","8","9", "A", "B", "C", "D", "E", "F") + lazy val HexDigit = charClass(c => HexDigitSet(c.toString.toUpperCase), "hex") examples HexDigitSet lazy val Letter = charClass(_.isLetter, "letter") def IDStart = Letter lazy val IDChar = charClass(isIDChar, "ID character") @@ -44,6 +48,12 @@ trait Parsers lazy val Space = SpaceClass.+.examples(" ") lazy val OptSpace = SpaceClass.*.examples(" ") lazy val URIClass = URIChar.+.string !!! 
"Invalid URI" + lazy val VerbatimDQuotes = "\"\"\"" + lazy val DQuoteChar = '\"' + lazy val DQuoteClass = charClass(_ == DQuoteChar, "double-quote character") + lazy val NotDQuoteClass = charClass(_ != DQuoteChar, "non-double-quote character") + lazy val NotDQuoteBackslashClass = charClass({ c: Char => + c != DQuoteChar && c != '\\' }, "non-double-quote character") lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') @@ -57,6 +67,39 @@ trait Parsers private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = (neg.toSeq ++ digits).mkString.toInt lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) + lazy val StringBasic = StringVerbatim | StringEscapable | NotQuoted + def StringVerbatim: Parser[String] = { + var dqcount = 0 + val p = VerbatimDQuotes ~ + charClass(_ match { + case DQuoteChar => + dqcount += 1 + dqcount < 3 + case _ => + dqcount = 0 + true + }).*.string ~ DQuoteChar + p map { case ((s, p), c) => s + p + c.toString } filter( + { _.endsWith(VerbatimDQuotes) }, _ => "Expected '%s'" format VerbatimDQuotes) map { s => + s.substring(3, s.length - 3) } + } + lazy val StringEscapable: Parser[String] = { + val p = DQuoteChar ~> + (EscapeSequence | NotDQuoteBackslashClass map {_.toString}).* <~ DQuoteChar + p map { _.mkString } + } + lazy val EscapeSequence: Parser[String] = + "\\" ~> ("b" ^^^ "\b" | "t" ^^^ "\t" | "n" ^^^ "\n" | "f" ^^^ "\f" | "r" ^^^ "\r" | + "\"" ^^^ "\"" | "'" ^^^ "\'" | "\\" ^^^ "\\" | OctalEscape | UnicodeEscape) + lazy val OctalEscape: Parser[String] = + repeat(OctalDigit, 1, 3) map { seq => + Integer.parseInt(seq.mkString, 8).asInstanceOf[Char].toString + } + lazy val UnicodeEscape: Parser[String] = + ("u" ~> repeat(HexDigit, 4, 4)) map { seq => + Integer.parseInt(seq.mkString, 16).asInstanceOf[Char].toString + } + lazy val NotQuoted = (NotDQuoteClass ~ NotSpace) map { case (c, s) => c.toString + s } def 
repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = rep1sep(rep, sep) ?? Nil @@ -67,7 +110,7 @@ trait Parsers def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } - def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(NotSpace, display)).* <~ SpaceClass.* + def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(StringBasic, display)).* <~ SpaceClass.* def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false From e6e778a1a31a17aefb6553d7562d36cd1c587bc6 Mon Sep 17 00:00:00 2001 From: "e.e d3si9n" Date: Sun, 11 Mar 2012 13:12:23 -0400 Subject: [PATCH 245/823] removed Octal --- util/complete/Parsers.scala | 12 +++--------- 1 file changed, 3 insertions(+), 9 deletions(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 43c4d5799..226bd9b0b 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -17,10 +17,8 @@ trait Parsers lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet - lazy val OctalDigitSet = Set("0","1","2","3","4","5","6","7") - lazy val OctalDigit = charClass(c => OctalDigitSet(c.toString), "octal") examples OctalDigitSet - lazy val HexDigitSet = Set("0","1","2","3","4","5","6","7","8","9", "A", "B", "C", "D", "E", "F") - lazy val HexDigit = charClass(c => HexDigitSet(c.toString.toUpperCase), "hex") examples HexDigitSet + lazy val HexDigitSet = Set('0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F') + lazy val HexDigit = charClass(HexDigitSet, "hex") examples HexDigitSet.map(_.toString) lazy val Letter = charClass(_.isLetter, "letter") def IDStart = Letter lazy val IDChar = charClass(isIDChar, "ID character") @@ -90,11 +88,7 @@ trait Parsers } lazy val EscapeSequence: Parser[String] = "\\" ~> ("b" ^^^ "\b" | "t" ^^^ "\t" | "n" ^^^ "\n" | "f" ^^^ "\f" | 
"r" ^^^ "\r" | - "\"" ^^^ "\"" | "'" ^^^ "\'" | "\\" ^^^ "\\" | OctalEscape | UnicodeEscape) - lazy val OctalEscape: Parser[String] = - repeat(OctalDigit, 1, 3) map { seq => - Integer.parseInt(seq.mkString, 8).asInstanceOf[Char].toString - } + "\"" ^^^ "\"" | "'" ^^^ "\'" | "\\" ^^^ "\\" | UnicodeEscape) lazy val UnicodeEscape: Parser[String] = ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).asInstanceOf[Char].toString From feb315b878ff45761d62678c3cb9ad5d2d980077 Mon Sep 17 00:00:00 2001 From: "e.e d3si9n" Date: Sun, 11 Mar 2012 13:19:13 -0400 Subject: [PATCH 246/823] StringVerbatim is now stateless --- util/complete/Parsers.scala | 18 +++--------------- 1 file changed, 3 insertions(+), 15 deletions(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 226bd9b0b..9413a728e 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -66,21 +66,9 @@ trait Parsers (neg.toSeq ++ digits).mkString.toInt lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) lazy val StringBasic = StringVerbatim | StringEscapable | NotQuoted - def StringVerbatim: Parser[String] = { - var dqcount = 0 - val p = VerbatimDQuotes ~ - charClass(_ match { - case DQuoteChar => - dqcount += 1 - dqcount < 3 - case _ => - dqcount = 0 - true - }).*.string ~ DQuoteChar - p map { case ((s, p), c) => s + p + c.toString } filter( - { _.endsWith(VerbatimDQuotes) }, _ => "Expected '%s'" format VerbatimDQuotes) map { s => - s.substring(3, s.length - 3) } - } + lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> + any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ + VerbatimDQuotes lazy val StringEscapable: Parser[String] = { val p = DQuoteChar ~> (EscapeSequence | NotDQuoteBackslashClass map {_.toString}).* <~ DQuoteChar From 9239e2fd464e72bc6454076b251871a4fdc91fce Mon Sep 17 00:00:00 2001 From: "e.e d3si9n" Date: Sun, 11 Mar 2012 15:02:50 -0400 Subject: [PATCH 247/823] fixes NotQuoted 
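The diff below reworks EscapeSequence and UnicodeEscape to yield Parser[Char]: a single-letter escape maps to its control character, and "\uXXXX" parses four hex digits base-16 into a char. A standalone sketch of the decoding those parsers perform (not sbt code; method names are illustrative):

```java
// Character decoding equivalent to the patch's EscapeSequence/UnicodeEscape:
// 'n' -> '\n', 't' -> '\t', etc., and "\\uXXXX" -> (char) parseInt(XXXX, 16).
public class EscapeSketch {
    static char simpleEscape(char c) {
        switch (c) {
            case 'b':  return '\b';
            case 't':  return '\t';
            case 'n':  return '\n';
            case 'f':  return '\f';
            case 'r':  return '\r';
            case '"':  return '"';
            case '\'': return '\'';
            case '\\': return '\\';
            default: throw new IllegalArgumentException("not a simple escape: " + c);
        }
    }

    static char unicodeEscape(String fourHexDigits) {
        // mirrors the patch: Integer.parseInt(seq.mkString, 16).toChar
        return (char) Integer.parseInt(fourHexDigits, 16);
    }

    public static void main(String[] args) {
        System.out.println(unicodeEscape("0041"));       // A
        System.out.println((int) simpleEscape('n'));     // 10
    }
}
```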
--- util/complete/Parsers.scala | 30 ++++++++++++++---------------- 1 file changed, 14 insertions(+), 16 deletions(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 9413a728e..7a84a13a1 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -48,10 +48,12 @@ trait Parsers lazy val URIClass = URIChar.+.string !!! "Invalid URI" lazy val VerbatimDQuotes = "\"\"\"" lazy val DQuoteChar = '\"' + lazy val BackslashChar = '\\' lazy val DQuoteClass = charClass(_ == DQuoteChar, "double-quote character") - lazy val NotDQuoteClass = charClass(_ != DQuoteChar, "non-double-quote character") - lazy val NotDQuoteBackslashClass = charClass({ c: Char => - c != DQuoteChar && c != '\\' }, "non-double-quote character") + lazy val NotDQuoteSpaceClass = + charClass({ c: Char => (c != DQuoteChar) && !c.isWhitespace }, "non-double-quote-space character") + lazy val NotDQuoteBackslashClass = + charClass({ c: Char => (c != DQuoteChar) && (c != BackslashChar) }, "non-double-quote-backslash character") lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') @@ -69,19 +71,15 @@ trait Parsers lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ VerbatimDQuotes - lazy val StringEscapable: Parser[String] = { - val p = DQuoteChar ~> - (EscapeSequence | NotDQuoteBackslashClass map {_.toString}).* <~ DQuoteChar - p map { _.mkString } - } - lazy val EscapeSequence: Parser[String] = - "\\" ~> ("b" ^^^ "\b" | "t" ^^^ "\t" | "n" ^^^ "\n" | "f" ^^^ "\f" | "r" ^^^ "\r" | - "\"" ^^^ "\"" | "'" ^^^ "\'" | "\\" ^^^ "\\" | UnicodeEscape) - lazy val UnicodeEscape: Parser[String] = - ("u" ~> repeat(HexDigit, 4, 4)) map { seq => - Integer.parseInt(seq.mkString, 16).asInstanceOf[Char].toString - } - lazy val NotQuoted = (NotDQuoteClass ~ NotSpace) map { 
case (c, s) => c.toString + s } + lazy val StringEscapable: Parser[String] = + (DQuoteChar ~> (NotDQuoteBackslashClass | EscapeSequence).+.string <~ DQuoteChar | + (DQuoteChar ~ DQuoteChar) ^^^ "") + lazy val EscapeSequence: Parser[Char] = + BackslashChar ~> ('b' ^^^ '\b' | 't' ^^^ '\t' | 'n' ^^^ '\n' | 'f' ^^^ '\f' | 'r' ^^^ '\r' | + '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) + lazy val UnicodeEscape: Parser[Char] = + ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } + lazy val NotQuoted = (NotDQuoteSpaceClass ~ NotSpace) map { case (c, s) => c.toString + s } def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = rep1sep(rep, sep) ?? Nil From 51db55d84760f301c6af0501d395df47b0930fdc Mon Sep 17 00:00:00 2001 From: Indrajit Raychaudhuri Date: Mon, 12 Mar 2012 04:44:27 +0530 Subject: [PATCH 248/823] Replace `Pair.apply` with `Util.pairID`, avoids extra class generation --- util/collection/Util.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/util/collection/Util.scala b/util/collection/Util.scala index 608284c98..429a1b61d 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -27,4 +27,6 @@ object Util case 1 => Some("1 " + prefix + single) case x => Some(x.toString + " " + prefix + plural) } -} \ No newline at end of file + + def pairID[A,B] = (a: A, b: B) => (a,b) +} From 0e130d29e9070b96007110d417c4529854e6c57f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 13 Mar 2012 08:01:58 -0400 Subject: [PATCH 249/823] fix argument parsing, which unintentionally required two characters. 
ref #396 --- util/complete/Parsers.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 7a84a13a1..ef34ea5fc 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -43,6 +43,7 @@ trait Parsers lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") lazy val SpaceClass = charClass(_.isWhitespace, "whitespace character") lazy val NotSpace = NotSpaceClass.+.string + lazy val OptNotSpace = NotSpaceClass.*.string lazy val Space = SpaceClass.+.examples(" ") lazy val OptSpace = SpaceClass.*.examples(" ") lazy val URIClass = URIChar.+.string !!! "Invalid URI" @@ -79,7 +80,7 @@ trait Parsers '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) lazy val UnicodeEscape: Parser[Char] = ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } - lazy val NotQuoted = (NotDQuoteSpaceClass ~ NotSpace) map { case (c, s) => c.toString + s } + lazy val NotQuoted = (NotDQuoteSpaceClass ~ OptNotSpace) map { case (c, s) => c.toString + s } def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = rep1sep(rep, sep) ?? Nil From 8fc5db4a8aca49f0df4ac52a86df5c5b43d5eecc Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 17 Mar 2012 19:31:03 -0400 Subject: [PATCH 250/823] work around for 2.10. 
pattern matching on KNil now requires KNil() --- util/collection/KList.scala | 24 +++++++++++++----------- util/collection/Types.scala | 2 +- 2 files changed, 14 insertions(+), 12 deletions(-) diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 7b58aca32..5c0952cf1 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -13,7 +13,7 @@ import Types._ * For background, see * http://apocalisp.wordpress.com/2010/11/01/type-level-programming-in-scala-part-8a-klist%C2%A0motivation/ */ -sealed trait KList[+M[_], HL <: HList] +sealed trait KList[M[_], HL <: HList] { type Raw = HL /** Transform to the underlying HList type.*/ @@ -33,12 +33,12 @@ trait KFold[M[_],P[_ <: HList]] def knil: P[HNil] } -final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] +final case class KCons[H, T <: HList, M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] { def down(implicit f: M ~> Id) = HCons(f(head), tail down f) def transform[N[_]](f: M ~> N) = KCons( f(head), tail transform f ) // prepend - def :^: [N[X] >: M[X], G](g: N[G]) = KCons(g, this) + def :^: [G](g: M[G]) = KCons(g, this) def toList = head :: tail.toList def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) @@ -48,29 +48,31 @@ final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) exten def foldr[P[_ <: HList],N[X] >: M[X]](f: KFold[N,P]) = f.kcons(head, tail foldr f) } -sealed class KNil extends KList[Nothing, HNil] +sealed case class KNil[M[_]]() extends KList[M, HNil] { - def down(implicit f: Nothing ~> Id) = HNil - def transform[N[_]](f: Nothing ~> N) = KNil - def :^: [M[_], H](h: M[H]) = KCons(h, this) + def down(implicit f: M ~> Id) = HNil + def transform[N[_]](f: M ~> N) = new KNil[N] def toList = Nil def combine[N[X]] = HNil override def foldr[P[_ <: HList],N[_]](f: KFold[N,P]) = f.knil override def toString = "KNil" } -object KNil extends KNil - +object KNil +{ + def :^: 
[M[_], H](h: M[H]) = KCons(h, new KNil[M]) +} object KList { + implicit def convert[M[_]](k: KNil.type): KNil[M] = KNil() // nicer alias for pattern matching val :^: = KCons - def fromList[M[_]](s: Seq[M[_]]): KList[M, _ <: HList] = if(s.isEmpty) KNil else KCons(s.head, fromList(s.tail)) + def fromList[M[_]](s: Seq[M[_]]): KList[M, _ <: HList] = if(s.isEmpty) KNil() else KCons(s.head, fromList(s.tail)) // haven't found a way to convince scalac that KList[M, H :+: T] implies KCons[H,T,M] // Therefore, this method exists to put the cast in one location. implicit def kcons[H, T <: HList, M[_]](kl: KList[M, H :+: T]): KCons[H,T,M] = kl.asInstanceOf[KCons[H,T,M]] // haven't need this, but for symmetry with kcons: - implicit def knil[M[_]](kl: KList[M, HNil]): KNil = KNil + implicit def knil[M[_]](kl: KList[M, HNil]): KNil[M] = KNil() } diff --git a/util/collection/Types.scala b/util/collection/Types.scala index 42b81f990..3a4be2c11 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -7,7 +7,7 @@ object Types extends Types { implicit def hconsToK[M[_], H, T <: HList](h: M[H] :+: T)(implicit mt: T => KList[M, T]): KList[M, H :+: T] = KCons[H, T, M](h.head, mt(h.tail) ) - implicit def hnilToK(hnil: HNil): KNil = KNil + implicit def hnilToK[M[_]](hnil: HNil): KNil[M] = KNil() } trait Types extends TypeFunctions From 474cd75d060a5813f2cc618ecaade5a51656176e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 17 Mar 2012 19:31:55 -0400 Subject: [PATCH 251/823] print-warnings task for Scala 2.10+ to avoid needing to rerun 'compile' to see deprecation/unchecked warnings --- .../src/main/java/xsbti/AnalysisCallback.java | 3 +++ interface/src/test/scala/TestCallback.scala | 1 + util/log/Logger.scala | 25 +++++++++++++++++++ 3 files changed, 29 insertions(+) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 53a33253b..2ba6f6c83 100644 --- 
a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -24,4 +24,7 @@ public interface AnalysisCallback public void endSource(File sourcePath); /** Called when the public API of a source file is extracted. */ public void api(File sourceFile, xsbti.api.SourceAPI source); + /** Provides problems discovered during compilation. These may be reported (logged) or unreported. + * Unreported problems are usually unreported because reporting was not enabled via a command line switch. */ + public void problem(Position pos, String msg, Severity severity, boolean reported); } \ No newline at end of file diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 6621a40ef..95fcbf96c 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -20,4 +20,5 @@ class TestCallback extends AnalysisCallback def endSource(source: File) { endedSources += source } def api(source: File, sourceAPI: xsbti.api.SourceAPI) { apis += ((source, sourceAPI)) } + def problem(pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean) {} } \ No newline at end of file diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 04babcc4e..1f6d4d3a5 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -4,6 +4,9 @@ package sbt import xsbti.{Logger => xLogger, F0} + import xsbti.{Maybe,Position,Problem,Severity} + + import java.io.File abstract class AbstractLogger extends Logger { @@ -61,6 +64,28 @@ object Logger } } def f0[T](t: =>T): F0[T] = new F0[T] { def apply = t } + + def m2o[S](m: Maybe[S]): Option[S] = if(m.isDefined) Some(m.get) else None + def o2m[S](o: Option[S]): Maybe[S] = o match { case Some(v) => Maybe.just(v); case None => Maybe.nothing() } + + def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], pointerSpace0: Option[String], 
sourcePath0: Option[String], sourceFile0: Option[File]): Position = + new Position { + val line = o2m(line0) + val lineContent = content + val offset = o2m(offset0) + val pointer = o2m(pointer0) + val pointerSpace = o2m(pointerSpace0) + val sourcePath = o2m(sourcePath0) + val sourceFile = o2m(sourceFile0) + } + + def problem(pos: Position, msg: String, sev: Severity): Problem = + new Problem + { + val position = pos + val message = msg + val severity = sev + } } /** This is intended to be the simplest logging interface for use by code that wants to log. From c7c4969eb720f38f23999a7f4978903af0bc0eaa Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 17 Mar 2012 22:58:13 -0400 Subject: [PATCH 252/823] Revert "work around for 2.10. pattern matching on KNil now requires KNil()" This reverts commit 2f726b34c3a9028bf78a62b026bd0b3c64e8203c. This commit caused "java.lang.Error: typeConstructor inapplicable for " when running 'sxr' --- util/collection/KList.scala | 24 +++++++++++------------- util/collection/Types.scala | 2 +- 2 files changed, 12 insertions(+), 14 deletions(-) diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 5c0952cf1..7b58aca32 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -13,7 +13,7 @@ import Types._ * For background, see * http://apocalisp.wordpress.com/2010/11/01/type-level-programming-in-scala-part-8a-klist%C2%A0motivation/ */ -sealed trait KList[M[_], HL <: HList] +sealed trait KList[+M[_], HL <: HList] { type Raw = HL /** Transform to the underlying HList type.*/ @@ -33,12 +33,12 @@ trait KFold[M[_],P[_ <: HList]] def knil: P[HNil] } -final case class KCons[H, T <: HList, M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] +final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] { def down(implicit f: M ~> Id) = HCons(f(head), tail down f) def transform[N[_]](f: M ~> N) = KCons( f(head), tail transform f ) // prepend - def :^: 
[G](g: M[G]) = KCons(g, this) + def :^: [N[X] >: M[X], G](g: N[G]) = KCons(g, this) def toList = head :: tail.toList def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) @@ -48,31 +48,29 @@ final case class KCons[H, T <: HList, M[_]](head: M[H], tail: KList[M,T]) extend def foldr[P[_ <: HList],N[X] >: M[X]](f: KFold[N,P]) = f.kcons(head, tail foldr f) } -sealed case class KNil[M[_]]() extends KList[M, HNil] +sealed class KNil extends KList[Nothing, HNil] { - def down(implicit f: M ~> Id) = HNil - def transform[N[_]](f: M ~> N) = new KNil[N] + def down(implicit f: Nothing ~> Id) = HNil + def transform[N[_]](f: Nothing ~> N) = KNil + def :^: [M[_], H](h: M[H]) = KCons(h, this) def toList = Nil def combine[N[X]] = HNil override def foldr[P[_ <: HList],N[_]](f: KFold[N,P]) = f.knil override def toString = "KNil" } -object KNil -{ - def :^: [M[_], H](h: M[H]) = KCons(h, new KNil[M]) -} +object KNil extends KNil + object KList { - implicit def convert[M[_]](k: KNil.type): KNil[M] = KNil() // nicer alias for pattern matching val :^: = KCons - def fromList[M[_]](s: Seq[M[_]]): KList[M, _ <: HList] = if(s.isEmpty) KNil() else KCons(s.head, fromList(s.tail)) + def fromList[M[_]](s: Seq[M[_]]): KList[M, _ <: HList] = if(s.isEmpty) KNil else KCons(s.head, fromList(s.tail)) // haven't found a way to convince scalac that KList[M, H :+: T] implies KCons[H,T,M] // Therefore, this method exists to put the cast in one location. 
implicit def kcons[H, T <: HList, M[_]](kl: KList[M, H :+: T]): KCons[H,T,M] = kl.asInstanceOf[KCons[H,T,M]] // haven't need this, but for symmetry with kcons: - implicit def knil[M[_]](kl: KList[M, HNil]): KNil[M] = KNil() + implicit def knil[M[_]](kl: KList[M, HNil]): KNil = KNil } diff --git a/util/collection/Types.scala b/util/collection/Types.scala index 3a4be2c11..42b81f990 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -7,7 +7,7 @@ object Types extends Types { implicit def hconsToK[M[_], H, T <: HList](h: M[H] :+: T)(implicit mt: T => KList[M, T]): KList[M, H :+: T] = KCons[H, T, M](h.head, mt(h.tail) ) - implicit def hnilToK[M[_]](hnil: HNil): KNil[M] = KNil() + implicit def hnilToK(hnil: HNil): KNil = KNil } trait Types extends TypeFunctions From 94b4a3784a1a443e372c27375ec79a471d83eeb7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 25 Mar 2012 20:35:09 -0400 Subject: [PATCH 253/823] rank settings, tasks and use this to restrict help/settings/tasks output. 
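The ranking scheme this commit adds to `AttributeKey` can be sketched roughly as follows. This is a hypothetical standalone model, not sbt's actual `AttributeKey`: keys carry an integer rank defaulting to `Integer.MAX_VALUE` (unranked), and a listing such as `tasks` or `help` keeps only keys whose rank falls at or below some cutoff.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch of rank-based filtering, not sbt's actual AttributeKey:
// an unranked key defaults to Integer.MAX_VALUE, so it is hidden by any
// reasonable cutoff, while explicitly ranked keys remain visible.
final class RankedKey {
    final String label;
    final int rank;

    RankedKey(String label, int rank) { this.label = label; this.rank = rank; }
    RankedKey(String label) { this(label, Integer.MAX_VALUE); } // unranked

    // Keep only the labels of keys ranked at or below the cutoff.
    static List<String> visible(List<RankedKey> keys, int cutoff) {
        return keys.stream()
            .filter(k -> k.rank <= cutoff)
            .map(k -> k.label)
            .collect(Collectors.toList());
    }
}
```

With a cutoff of, say, 10, only explicitly ranked keys survive, which matches the commit's stated goal of restricting help/settings/tasks output.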
fixes #315 --- util/collection/Attributes.scala | 37 ++++++++++++++++++++------------ 1 file changed, 23 insertions(+), 14 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 2e2d647bd..f5ea9dcd4 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -15,6 +15,7 @@ sealed trait AttributeKey[T] { def description: Option[String] def extend: Seq[AttributeKey[_]] def isLocal: Boolean + def rank: Int } private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { override final def toString = label @@ -27,23 +28,30 @@ private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { } object AttributeKey { - def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { + def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = + make(name, None, Nil, Int.MaxValue) + + def apply[T](name: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + make(name, None, Nil, rank) + + def apply[T](name: String, description: String)(implicit mf: Manifest[T]): AttributeKey[T] = + apply(name, description, Nil) + + def apply[T](name: String, description: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + apply(name, description, Nil, rank) + + def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = + apply(name, description, extend, Int.MaxValue) + + def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]], rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + make(name, Some(description), extend, rank) + + private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf def label = name - def description = None - def extend = Nil - } - def apply[T](name: String, description0: 
String)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { - def manifest = mf - def label = name - def description = Some(description0) - def extend = Nil - } - def apply[T](name: String, description0: String, extend0: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { - def manifest = mf - def label = name - def description = Some(description0) + def description = description0 def extend = extend0 + def rank = rank0 } private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { def manifest = mf @@ -52,6 +60,7 @@ object AttributeKey def extend = Nil override def toString = label def isLocal: Boolean = true + def rank = Int.MaxValue } private[sbt] final val LocalLabel = "$local" } From ec8f9884e0c8a8179fc17b4c6a7d2ca47a392350 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 25 Mar 2012 20:36:05 -0400 Subject: [PATCH 254/823] fix sample method interpretation of completions argument --- util/complete/Parser.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index e299edfd0..d16969b55 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -284,7 +284,7 @@ trait ParserMain } def sample(str: String, parser: Parser[_], completions: Boolean = false): Unit = - if(completions) sampleParse(str, parser) else sampleCompletions(str, parser) + if(completions) sampleCompletions(str, parser) else sampleParse(str, parser) def sampleParse(str: String, parser: Parser[_]): Unit = parse(str, parser) match { case Left(msg) => println(msg) From 740094c4d246ad0cb4d83885cd4bacdc3506254d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 6 Apr 2012 20:28:31 -0400 Subject: [PATCH 255/823] enhance 'projects' to allow temporarily adding/removing builds to the session --- util/complete/Parsers.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/complete/Parsers.scala 
b/util/complete/Parsers.scala index ef34ea5fc..772ec20d5 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -96,7 +96,8 @@ trait Parsers def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false def trimmed(p: Parser[String]) = p map { _.trim } - def Uri(ex: Set[URI]) = mapOrFail(URIClass)( uri => new URI(uri)) examples(ex.map(_.toString)) + lazy val basicUri = mapOrFail(URIClass)( uri => new URI(uri)) + def Uri(ex: Set[URI]) = basicUri examples(ex.map(_.toString)) } object Parsers extends Parsers object DefaultParsers extends Parsers with ParserMain From d4f8a615dd894e6e1424b8fd282ea6b20fa66c43 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 18 Apr 2012 08:07:53 -0400 Subject: [PATCH 256/823] reorganize compilation modules --- interface/src/main/java/xsbti/ArtifactInfo.java | 9 +++++++++ 1 file changed, 9 insertions(+) create mode 100644 interface/src/main/java/xsbti/ArtifactInfo.java diff --git a/interface/src/main/java/xsbti/ArtifactInfo.java b/interface/src/main/java/xsbti/ArtifactInfo.java new file mode 100644 index 000000000..6f2eedae5 --- /dev/null +++ b/interface/src/main/java/xsbti/ArtifactInfo.java @@ -0,0 +1,9 @@ +package xsbti; + +public final class ArtifactInfo +{ + public static final String ScalaOrganization = "org.scala-lang"; + public static final String ScalaLibraryID = "scala-library"; + public static final String ScalaCompilerID = "scala-compiler"; + public static final String SbtOrganization = "org.scala-sbt"; +} \ No newline at end of file From 5d4b89c9652297a0a75ce83442497ac86fbf0a60 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 9 Jan 2012 08:00:35 -0500 Subject: [PATCH 257/823] using some of the embedding interfaces --- interface/src/main/java/xsbti/compile/ScalaInstance.java | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/ScalaInstance.java b/interface/src/main/java/xsbti/compile/ScalaInstance.java index fd66d9a9c..4e41e1ca2 100644 
--- a/interface/src/main/java/xsbti/compile/ScalaInstance.java +++ b/interface/src/main/java/xsbti/compile/ScalaInstance.java @@ -24,10 +24,10 @@ public interface ScalaInstance File compilerJar(); /** Jars provided by this Scala instance other than the compiler and library jars. */ - File[] extraJars(); + File[] otherJars(); /** All jar files provided by this Scala instance.*/ - File[] jars(); + File[] allJars(); /** The unique identifier for this Scala instance. An implementation should usually obtain this from the compiler.properties file in the compiler jar. */ String actualVersion(); From acc03cb293ed2000f163fc9e231828598c00df6a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 18 Apr 2012 16:01:45 -0400 Subject: [PATCH 258/823] implement embedded interface --- .../src/main/java/xsbti/compile/IncrementalCompiler.java | 3 ++- interface/src/main/java/xsbti/compile/Setup.java | 8 ++++---- 2 files changed, 6 insertions(+), 5 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java index 9fe301029..f2323111d 100644 --- a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java +++ b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java @@ -52,9 +52,10 @@ public interface IncrementalCompiler * to create a ScalaCompiler for incremental compilation. It is the client's responsibility to manage compiled jars for * different Scala versions. * + * @param label A brief name describing the source component for use in error messages * @param sourceJar The jar file containing the compiler interface sources. These are published as sbt's compiler-interface-src module. * @param targetJar Where to create the output jar file containing the compiled classes. * @param instance The ScalaInstance to compile the compiler interface for. * @param log The logger to use during compilation. 
*/ - void compileInterfaceJar(File sourceJar, File targetJar, ScalaInstance instance, Logger log); + void compileInterfaceJar(String label, File sourceJar, File targetJar, File interfaceJar, ScalaInstance instance, Logger log); } diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java index ca9999978..a1a9a1ad8 100644 --- a/interface/src/main/java/xsbti/compile/Setup.java +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -17,8 +17,8 @@ public interface Setup /** If true, no sources are actually compiled and the Analysis from the previous compilation is returned.*/ boolean skip(); - /** The directory used to cache information across compilations. - * This directory can be removed to force a full recompilation. - * The directory should be unique and not shared between compilations. */ - File cacheDirectory(); + /** The file used to cache information across compilations. + * This file can be removed to force a full recompilation. + * The file should be unique and not shared between compilations. 
*/ + File cacheFile(); } From c6c6061639636fc84355d6a63cbb026ca7224f50 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 28 Apr 2012 18:58:38 -0400 Subject: [PATCH 259/823] basis for a resident compiler unstable, but can be tested with -Dsbt.resident.limit=n n is the maximum Globals kept around --- .../src/main/java/xsbti/compile/CachedCompiler.java | 11 +++++++++++ .../java/xsbti/compile/CachedCompilerProvider.java | 10 ++++++++++ .../main/java/xsbti/compile/DependencyChanges.java | 13 +++++++++++++ .../src/main/java/xsbti/compile/GlobalsCache.java | 10 ++++++++++ interface/src/main/java/xsbti/compile/Setup.java | 2 ++ 5 files changed, 46 insertions(+) create mode 100644 interface/src/main/java/xsbti/compile/CachedCompiler.java create mode 100644 interface/src/main/java/xsbti/compile/CachedCompilerProvider.java create mode 100644 interface/src/main/java/xsbti/compile/DependencyChanges.java create mode 100644 interface/src/main/java/xsbti/compile/GlobalsCache.java diff --git a/interface/src/main/java/xsbti/compile/CachedCompiler.java b/interface/src/main/java/xsbti/compile/CachedCompiler.java new file mode 100644 index 000000000..2f97e395b --- /dev/null +++ b/interface/src/main/java/xsbti/compile/CachedCompiler.java @@ -0,0 +1,11 @@ +package xsbti.compile; + +import xsbti.AnalysisCallback; +import xsbti.Logger; +import xsbti.Reporter; +import java.io.File; + +public interface CachedCompiler +{ + public void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate); +} diff --git a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java new file mode 100644 index 000000000..43d3aaf7e --- /dev/null +++ b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java @@ -0,0 +1,10 @@ +package xsbti.compile; + +import xsbti.Logger; +import xsbti.Reporter; + +public interface CachedCompilerProvider +{ + ScalaInstance scalaInstance(); + 
CachedCompiler newCachedCompiler(String[] arguments, Logger log, Reporter reporter); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/DependencyChanges.java b/interface/src/main/java/xsbti/compile/DependencyChanges.java new file mode 100644 index 000000000..4f6bda55a --- /dev/null +++ b/interface/src/main/java/xsbti/compile/DependencyChanges.java @@ -0,0 +1,13 @@ +package xsbti.compile; + + import java.io.File; + +// only includes changes to dependencies outside of the project +public interface DependencyChanges +{ + boolean isEmpty(); + // class files or jar files + File[] modifiedBinaries(); + // class names + String[] modifiedClasses(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/GlobalsCache.java b/interface/src/main/java/xsbti/compile/GlobalsCache.java new file mode 100644 index 000000000..1e070a1f1 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/GlobalsCache.java @@ -0,0 +1,10 @@ +package xsbti.compile; + +import xsbti.Logger; +import xsbti.Reporter; + +public interface GlobalsCache +{ + public CachedCompiler apply(String[] args, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); + public void clear(); +} diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java index a1a9a1ad8..cf261aa7a 100644 --- a/interface/src/main/java/xsbti/compile/Setup.java +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -21,4 +21,6 @@ public interface Setup * This file can be removed to force a full recompilation. * The file should be unique and not shared between compilations. 
*/ File cacheFile(); + + GlobalsCache cache(); } From 05fb991488454558b1bd755b045fbcad85c4811f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 6 May 2012 14:15:03 -0400 Subject: [PATCH 260/823] move to revised warning interface in the compiler --- interface/src/main/java/xsbti/AnalysisCallback.java | 2 +- interface/src/main/java/xsbti/Problem.java | 1 + util/log/Logger.scala | 3 ++- 3 files changed, 4 insertions(+), 2 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 2ba6f6c83..c23b43ecd 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -26,5 +26,5 @@ public interface AnalysisCallback public void api(File sourceFile, xsbti.api.SourceAPI source); /** Provides problems discovered during compilation. These may be reported (logged) or unreported. * Unreported problems are usually unreported because reporting was not enabled via a command line switch. 
*/ - public void problem(Position pos, String msg, Severity severity, boolean reported); + public void problem(String what, Position pos, String msg, Severity severity, boolean reported); } \ No newline at end of file diff --git a/interface/src/main/java/xsbti/Problem.java b/interface/src/main/java/xsbti/Problem.java index cf2641900..db7f67b22 100644 --- a/interface/src/main/java/xsbti/Problem.java +++ b/interface/src/main/java/xsbti/Problem.java @@ -5,6 +5,7 @@ package xsbti; public interface Problem { + String category(); Severity severity(); String message(); Position position(); diff --git a/util/log/Logger.scala b/util/log/Logger.scala index 1f6d4d3a5..7715b80db 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -79,9 +79,10 @@ object Logger val sourceFile = o2m(sourceFile0) } - def problem(pos: Position, msg: String, sev: Severity): Problem = + def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = new Problem { + val category = cat val position = pos val message = msg val severity = sev From 25159137d70338719449a0b6bba066dbc715c2a8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 12 May 2012 23:12:29 -0400 Subject: [PATCH 261/823] approximate type parameters and references by name not as accurate, but simpler. 
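The diff below swaps the numeric `id: Int` on `TypeParameter` and `ParameterRef` for a `String`. A minimal illustration of why name-based references are simpler (hypothetical classes, not the generated `xsbti.api` types): a reference can be checked against its parameter directly by name, with no counter or renumbering pass to keep ids stable across compilations.

```java
// Hypothetical model, not the generated xsbti.api classes: a type parameter
// and a reference to it, keyed by name (String) instead of a synthetic int id.
final class TypeParameter {
    final String id; // previously an int assigned by a counter
    TypeParameter(String id) { this.id = id; }
}

final class ParameterRef {
    final String id; // matches TypeParameter.id directly; no lookup table needed
    ParameterRef(String id) { this.id = id; }

    boolean refersTo(TypeParameter tp) { return id.equals(tp.id); }
}
```

The trade-off is the one the commit message names: names are less accurate (shadowed parameters with the same name can collide) but simpler to produce and compare.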
--- interface/other | 2 +- interface/type | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/interface/other b/interface/other index aeec1ae5a..a8f5ba49c 100644 --- a/interface/other +++ b/interface/other @@ -39,7 +39,7 @@ MethodParameter modifier: ParameterModifier TypeParameter - id: Int + id: String annotations: Annotation* typeParameters : TypeParameter* variance: Variance diff --git a/interface/type b/interface/type index ae24f5cc1..ac0926e92 100644 --- a/interface/type +++ b/interface/type @@ -5,7 +5,7 @@ Type prefix : SimpleType id: String ParameterRef - id: Int + id: String Singleton path: Path EmptyType From f3e49dcbfe13a9124b297b79038a82bc75a359d3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 12 May 2012 23:12:29 -0400 Subject: [PATCH 262/823] cleanup compilation tests --- interface/src/test/scala/TestCallback.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 95fcbf96c..096d73a83 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -20,5 +20,5 @@ class TestCallback extends AnalysisCallback def endSource(source: File) { endedSources += source } def api(source: File, sourceAPI: xsbti.api.SourceAPI) { apis += ((source, sourceAPI)) } - def problem(pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean) {} + def problem(category: String, pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean) {} } \ No newline at end of file From f3253e496d0d175b85badf9e7240be0f9ccc670b Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Wed, 16 May 2012 11:59:42 +0400 Subject: [PATCH 263/823] Use java 7 Redirect.INHERIT to inherit subprocess' input stream. 
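The reflective trick that the `InheritInput` object below implements can be sketched in isolation like this. The JDK class and member names (`ProcessBuilder.Redirect`, `redirectInput`, `INHERIT`) are real Java 7 API; the surrounding helper is a hypothetical standalone version, written reflectively so it would still load on a Java 6 runtime where those members do not exist.

```java
import java.lang.reflect.Method;

// Standalone sketch of the InheritInput idea: reflectively apply
// ProcessBuilder.redirectInput(Redirect.INHERIT) so that, on Java 7+, the
// child process reads System.in directly. On Java 6 the lookup fails and the
// caller falls back to pumping stdin through a thread.
final class InheritStdin {
    static boolean tryInherit(ProcessBuilder pb) {
        try {
            Class<?> redirect = Class.forName("java.lang.ProcessBuilder$Redirect");
            Method m = ProcessBuilder.class.getMethod("redirectInput", redirect);
            Object inherit = redirect.getField("INHERIT").get(null);
            m.invoke(pb, inherit); // returns the builder; ignored here
            return true;
        } catch (Exception e) {
            return false; // pre-Java 7 runtime: caller must write stdin itself
        }
    }
}
```

On any modern JVM the lookup succeeds and the method returns `true`; the boolean mirrors the `(Boolean, JProcessBuilder)` result the patch threads back to `SimpleProcessBuilder`.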
--- util/process/InheritInput.scala | 20 ++++++++++++++++++++ util/process/ProcessImpl.scala | 33 +++++++++++++++++---------------- 2 files changed, 37 insertions(+), 16 deletions(-) create mode 100755 util/process/InheritInput.scala diff --git a/util/process/InheritInput.scala b/util/process/InheritInput.scala new file mode 100755 index 000000000..bd45ebe62 --- /dev/null +++ b/util/process/InheritInput.scala @@ -0,0 +1,20 @@ +/* sbt -- Simple Build Tool + * Copyright 2012 Eugene Vigdorchik + */ +package sbt + +import java.lang.{ProcessBuilder => JProcessBuilder} + +/** On java 7, inherit System.in for a ProcessBuilder. */ +private[sbt] object InheritInput { + def apply(p: JProcessBuilder): (Boolean, JProcessBuilder) = (redirectInput, inherit) match { + case (Some(m), Some(f)) => (true, m.invoke(p, f).asInstanceOf[JProcessBuilder]) + case _ => (false, p) + } + + private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") + private[this] val redirectClass = pbClass.getClasses find (_.getSimpleName == "Redirect") + + private[this] val redirectInput = redirectClass map (pbClass.getMethod("redirectInput", _)) + private[this] val inherit = redirectClass map (_ getField "INHERIT" get null) +} diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index d14b64f78..cea74272f 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -379,40 +379,41 @@ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProc { override def run(io: ProcessIO): Process = { - val process = p.start() // start the external process + val (inherited, pp) = InheritInput(p) + val process = pp.start() // start the external process import io.{writeInput, processOutput, processError} // spawn threads that process the input, output, and error streams using the functions defined in `io` - val inThread = Spawn(writeInput(process.getOutputStream), true) + if(!inherited) + Spawn(writeInput(process.getOutputStream), true) + val 
outThread = Spawn(processOutput(process.getInputStream)) val errorThread = if(!p.redirectErrorStream) Spawn(processError(process.getErrorStream)) :: Nil else Nil - new SimpleProcess(process, inThread, outThread :: errorThread) + new SimpleProcess(process, outThread :: errorThread) } override def toString = p.command.toString override def canPipeTo = true } -/** A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the -* output and error streams of the process. `inputThread` is the Thread created to write to the input stream of -* the process. -* The implementation of `exitValue` interrupts `inputThread` and then waits until all I/O threads die before -* returning. */ -private class SimpleProcess(p: JProcess, inputThread: Thread, outputThreads: List[Thread]) extends Process + +/** A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the +* output and error streams of the process. +* The implementation of `exitValue` wait for the process to finish and then waits until the threads reading output and error streams die before +* returning. 
Note that the thread that reads the input stream cannot be interrupted, see https://github.com/harrah/xsbt/issues/327 and +* http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4514257 */ +private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Process { override def exitValue() = { - try { p.waitFor() }// wait for the process to terminate - finally { inputThread.interrupt() } // we interrupt the input thread to notify it that it can terminate + def waitDone(): Unit = + try { p.waitFor() } catch { case _: InterruptedException => waitDone() } + waitDone() outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) p.exitValue() } - override def destroy() = - { - try { p.destroy() } - finally { inputThread.interrupt() } - } + override def destroy() = p.destroy() } private class FileOutput(file: File, append: Boolean) extends OutputStreamBuilder(new FileOutputStream(file, append), file.getAbsolutePath) From 29e5143deda4bbe352044b2592715418073acdbd Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Wed, 16 May 2012 19:56:33 +0400 Subject: [PATCH 264/823] Additional method in ProcessIO to process inheriting input. --- util/process/InheritInput.scala | 6 +++--- util/process/Process.scala | 10 +++++----- util/process/ProcessImpl.scala | 35 +++++++++++++++++++++------------ 3 files changed, 30 insertions(+), 21 deletions(-) diff --git a/util/process/InheritInput.scala b/util/process/InheritInput.scala index bd45ebe62..5cfe30b79 100755 --- a/util/process/InheritInput.scala +++ b/util/process/InheritInput.scala @@ -7,9 +7,9 @@ import java.lang.{ProcessBuilder => JProcessBuilder} /** On java 7, inherit System.in for a ProcessBuilder. 
*/ private[sbt] object InheritInput { - def apply(p: JProcessBuilder): (Boolean, JProcessBuilder) = (redirectInput, inherit) match { - case (Some(m), Some(f)) => (true, m.invoke(p, f).asInstanceOf[JProcessBuilder]) - case _ => (false, p) + def apply(p: JProcessBuilder): Option[JProcessBuilder] = (redirectInput, inherit) match { + case (Some(m), Some(f)) => Some(m.invoke(p, f).asInstanceOf[JProcessBuilder]) + case _ => None } private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") diff --git a/util/process/Process.scala b/util/process/Process.scala index 5a1f46b4a..bd7e5cbc6 100644 --- a/util/process/Process.scala +++ b/util/process/Process.scala @@ -180,15 +180,15 @@ trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder def canPipeTo: Boolean } /** Each method will be called in a separate thread.*/ -final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit) extends NotNull +final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit, val inheritInput: JProcessBuilder => Option[JProcessBuilder]) extends NotNull { - def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError) - def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process) - def withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError) + def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError, inheritInput) + def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process, inheritInput) + def withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError, inheritInput) } trait ProcessLogger { def info(s: => String): Unit def error(s: => String): Unit 
def buffer[T](f: => T): T -} \ No newline at end of file +} diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index cea74272f..e58122efa 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -43,8 +43,8 @@ private object Future object BasicIO { - def apply(buffer: StringBuffer, log: Option[ProcessLogger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log)) - def apply(log: ProcessLogger, withIn: Boolean) = new ProcessIO(input(withIn), processInfoFully(log), processErrFully(log)) + def apply(buffer: StringBuffer, log: Option[ProcessLogger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log), inheritInput(withIn)) + def apply(log: ProcessLogger, withIn: Boolean) = new ProcessIO(input(withIn), processInfoFully(log), processErrFully(log), inheritInput(withIn)) def getErr(log: Option[ProcessLogger]) = log match { case Some(lg) => processErrFully(lg); case None => toStdErr } @@ -78,9 +78,9 @@ object BasicIO readFully() } def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } - def input(connect: Boolean): OutputStream => Unit = if(connect) connectToIn else closeOut - def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput)) - def standard(in: OutputStream => Unit): ProcessIO = new ProcessIO(in, toStdOut, toStdErr) + def input(connect: Boolean): OutputStream => Unit = if(connect) connectToIn else closeOut + def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput), inheritInput(connectInput)) + def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Option[JProcessBuilder]): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) def toStdErr = (in: InputStream) => transferFully(in, System.err) def toStdOut = (in: InputStream) => transferFully(in, System.out) @@ -113,6 +113,8 @@ object BasicIO read in.close() } + + def inheritInput(connect: Boolean) = { p: 
JProcessBuilder => if (connect) InheritInput(p) else None } } @@ -154,7 +156,7 @@ private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPa private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[ProcessLogger]): Stream[String] = { val streamed = Streamed[String](nonZeroException) - val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log))) + val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log), BasicIO.inheritInput(withInput))) Spawn { streamed.done(process.exitValue()) } streamed.stream() } @@ -379,13 +381,14 @@ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProc { override def run(io: ProcessIO): Process = { - val (inherited, pp) = InheritInput(p) - val process = pp.start() // start the external process - import io.{writeInput, processOutput, processError} - // spawn threads that process the input, output, and error streams using the functions defined in `io` - if(!inherited) - Spawn(writeInput(process.getOutputStream), true) + import io._ + val process = inheritInput(p) map (_.start()) getOrElse { + val proc = p.start() + Spawn(writeInput(proc.getOutputStream)) + proc + } + // spawn threads that process the output and error streams. val outThread = Spawn(processOutput(process.getInputStream)) val errorThread = if(!p.redirectErrorStream) @@ -408,7 +411,13 @@ private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Pr override def exitValue() = { def waitDone(): Unit = - try { p.waitFor() } catch { case _: InterruptedException => waitDone() } + try { + p.waitFor() + } catch { + case _: InterruptedException => + // Guard against possible spurious wakeups, check thread interrupted status. 
+ if(Thread.interrupted()) p.destroy() else waitDone() + } waitDone() outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) p.exitValue() From e5bedb3e1418fc2acdcffe5b04c81641f839823e Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Thu, 17 May 2012 10:24:22 +0400 Subject: [PATCH 265/823] Refactor according to the comments. --- util/process/InheritInput.scala | 6 +++--- util/process/Process.scala | 2 +- util/process/ProcessImpl.scala | 29 ++++++++++++----------------- 3 files changed, 16 insertions(+), 21 deletions(-) diff --git a/util/process/InheritInput.scala b/util/process/InheritInput.scala index 5cfe30b79..1c9ef0ee8 100755 --- a/util/process/InheritInput.scala +++ b/util/process/InheritInput.scala @@ -7,9 +7,9 @@ import java.lang.{ProcessBuilder => JProcessBuilder} /** On java 7, inherit System.in for a ProcessBuilder. */ private[sbt] object InheritInput { - def apply(p: JProcessBuilder): Option[JProcessBuilder] = (redirectInput, inherit) match { - case (Some(m), Some(f)) => Some(m.invoke(p, f).asInstanceOf[JProcessBuilder]) - case _ => None + def apply(p: JProcessBuilder): Boolean = (redirectInput, inherit) match { + case (Some(m), Some(f)) => m.invoke(p, f); true + case _ => false } private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") diff --git a/util/process/Process.scala b/util/process/Process.scala index bd7e5cbc6..b2a127977 100644 --- a/util/process/Process.scala +++ b/util/process/Process.scala @@ -180,7 +180,7 @@ trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder def canPipeTo: Boolean } /** Each method will be called in a separate thread.*/ -final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit, val inheritInput: JProcessBuilder => Option[JProcessBuilder]) extends NotNull +final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: 
InputStream => Unit, val processError: InputStream => Unit, val inheritInput: JProcessBuilder => Boolean) extends NotNull { def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError, inheritInput) def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process, inheritInput) diff --git a/util/process/ProcessImpl.scala b/util/process/ProcessImpl.scala index e58122efa..44dcaed2d 100644 --- a/util/process/ProcessImpl.scala +++ b/util/process/ProcessImpl.scala @@ -80,7 +80,7 @@ object BasicIO def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } def input(connect: Boolean): OutputStream => Unit = if(connect) connectToIn else closeOut def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput), inheritInput(connectInput)) - def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Option[JProcessBuilder]): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) + def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Boolean): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) def toStdErr = (in: InputStream) => transferFully(in, System.err) def toStdOut = (in: InputStream) => transferFully(in, System.out) @@ -114,7 +114,7 @@ object BasicIO in.close() } - def inheritInput(connect: Boolean) = { p: JProcessBuilder => if (connect) InheritInput(p) else None } + def inheritInput(connect: Boolean) = { p: JProcessBuilder => if (connect) InheritInput(p) else false } } @@ -382,13 +382,12 @@ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProc override def run(io: ProcessIO): Process = { import io._ - val process = inheritInput(p) map (_.start()) getOrElse { - val proc = p.start() - Spawn(writeInput(proc.getOutputStream)) - proc - } + val inherited = inheritInput(p) + val process = p.start() - // spawn threads that process the output and error streams. 
+ // spawn threads that process the output and error streams, and also write input if not inherited. + if (!inherited) + Spawn(writeInput(process.getOutputStream)) val outThread = Spawn(processOutput(process.getInputStream)) val errorThread = if(!p.redirectErrorStream) @@ -410,15 +409,11 @@ private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Pr { override def exitValue() = { - def waitDone(): Unit = - try { - p.waitFor() - } catch { - case _: InterruptedException => - // Guard against possible spurious wakeups, check thread interrupted status. - if(Thread.interrupted()) p.destroy() else waitDone() - } - waitDone() + try { + p.waitFor() + } catch { + case _: InterruptedException => p.destroy() + } outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) p.exitValue() } From eec347e2dd7e037c30f1578d4dbb9943ea3548f2 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 19 May 2012 18:20:19 -0400 Subject: [PATCH 266/823] ensure enableEcho called after jline.Terminal.getTerminal. fixes #460 --- util/complete/LineReader.scala | 5 +++-- util/log/ConsoleLogger.scala | 9 +++++++-- 2 files changed, 10 insertions(+), 4 deletions(-) diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index a48e4c141..1318061af 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -53,8 +53,9 @@ abstract class JLine extends LineReader } private object JLine { - def terminal = jline.Terminal.getTerminal - def resetTerminal() = withTerminal { _ => jline.Terminal.resetTerminal } + // When calling this, ensure that enableEcho has been or will be called. + // getTerminal will initialize the terminal to disable echo. 
+ private def terminal = jline.Terminal.getTerminal private def withTerminal[T](f: jline.Terminal => T): T = synchronized { diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index a453b3b8a..ad626336f 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -35,8 +35,13 @@ object ConsoleLogger } private[this] def ansiSupported = - try { jline.Terminal.getTerminal.isANSISupported } - catch { case e: Exception => !isWindows } + try { + val terminal = jline.Terminal.getTerminal + terminal.enableEcho() // #460 + terminal.isANSISupported + } catch { + case e: Exception => !isWindows + } private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase.indexOf("windows") >= 0 From 1f9433f175eb8829d99f3698371b0f67d8104007 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 19 May 2012 18:20:19 -0400 Subject: [PATCH 267/823] Second try at printing message when stack trace suppressed. Problems: 1. Without a message, users don't find 'last' 2. Showing a message for every error clutters output. This tries to address these issues by: 1. Only showing the message when other feedback has not been provided and 'last' would not usually be helpful. This will require ongoing tweaking. For now, all commands except 'compile' display the message. 'update' could omit the message as well, but perhaps knowing about 'last' might be useful there. 2. Including the exact command to show the output: last test:compile and not just last 3. Highlighting the command in blue for visibility as an experiment. Review by @ijuma and @retronym, please. 
--- util/control/MessageOnlyException.scala | 8 +++-- util/log/ConsoleLogger.scala | 39 +++++++++++++++---------- util/log/MainLogging.scala | 8 +++-- 3 files changed, 34 insertions(+), 21 deletions(-) diff --git a/util/control/MessageOnlyException.scala b/util/control/MessageOnlyException.scala index 7fa43746d..75b7737d8 100644 --- a/util/control/MessageOnlyException.scala +++ b/util/control/MessageOnlyException.scala @@ -7,8 +7,12 @@ final class MessageOnlyException(override val toString: String) extends RuntimeE /** A dummy exception for the top-level exception handler to know that an exception * has been handled, but is being passed further up to indicate general failure. */ -final class AlreadyHandledException extends RuntimeException +final class AlreadyHandledException(val underlying: Throwable) extends RuntimeException /** A marker trait for a top-level exception handler to know that this exception * doesn't make sense to display. */ -trait UnprintableException extends Throwable \ No newline at end of file +trait UnprintableException extends Throwable + +/** A marker trait that refines UnprintableException to indicate to a top-level exception handler +* that the code throwing this exception has already provided feedback to the user about the error condition. 
*/ +trait FeedbackProvidedException extends UnprintableException diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index ad626336f..d8ce50dd0 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -42,15 +42,16 @@ object ConsoleLogger } catch { case e: Exception => !isWindows } + val noSuppressedMessage = (_: SuppressedTraceContext) => None private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase.indexOf("windows") >= 0 - def apply(): ConsoleLogger = apply(systemOut) def apply(out: PrintStream): ConsoleLogger = apply(printStreamOut(out)) def apply(out: PrintWriter): ConsoleLogger = apply(printWriterOut(out)) - def apply(out: ConsoleOut, ansiCodesSupported: Boolean = formatEnabled, useColor: Boolean = formatEnabled): ConsoleLogger = - new ConsoleLogger(out, ansiCodesSupported, useColor) + def apply(out: ConsoleOut = systemOut, ansiCodesSupported: Boolean = formatEnabled, + useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = noSuppressedMessage): ConsoleLogger = + new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) private[this] val EscapeSequence = (27.toChar + "[^@-~]*[@-~]").r def stripEscapeSequences(s: String): String = @@ -61,7 +62,7 @@ object ConsoleLogger * colored. 
* * This logger is not thread-safe.*/ -class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean) extends BasicLogger +class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger { import scala.Console.{BLUE, GREEN, RED, RESET, YELLOW} def messageColor(level: Level.Value) = RESET @@ -85,6 +86,9 @@ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ans val traceLevel = getTrace if(traceLevel >= 0) out.print(StackTrace.trimmed(t, traceLevel)) + if(traceLevel <= 2) + for(msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useColor))) + printLabeledLine(labelColor(Level.Error), "trace", messageColor(Level.Error), msg) } def log(level: Level.Value, message: => String) { @@ -102,24 +106,27 @@ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ans out.lockObject.synchronized { for(line <- message.split("""\n""")) - { - reset() - out.print("[") - setColor(labelColor) - out.print(label) - reset() - out.print("] ") - setColor(messageColor) - out.print(line) - reset() - out.println() - } + printLabeledLine(labelColor, label, messageColor, line) } + private def printLabeledLine(labelColor: String, label: String, messageColor: String, line: String): Unit = + { + reset() + out.print("[") + setColor(labelColor) + out.print(label) + reset() + out.print("] ") + setColor(messageColor) + out.print(line) + reset() + out.println() + } def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } def control(event: ControlEvent.Value, message: => String) { log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } } +final class SuppressedTraceContext(val traceLevel: Int, val useColor: Boolean) sealed trait ConsoleOut { val lockObject: AnyRef diff 
--git a/util/log/MainLogging.scala b/util/log/MainLogging.scala index b07abf4e3..25f42a6af 100644 --- a/util/log/MainLogging.scala +++ b/util/log/MainLogging.scala @@ -25,12 +25,14 @@ object MainLogging } def defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = - new MultiLoggerConfig(defaultScreen, backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) + new MultiLoggerConfig(defaultScreen(ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) - def defaultScreen: AbstractLogger = ConsoleLogger() + def defaultScreen(): AbstractLogger = ConsoleLogger() + def defaultScreen(suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(suppressedMessage = suppressedMessage) def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = to => ConsoleLogger(ConsoleLogger.printWriterOut(to), useColor = useColor) // TODO: should probably filter ANSI codes when useColor=false } -final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) \ No newline at end of file +final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], + screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) \ No newline at end of file From fbee96939da26942d7633283afbe40722172f5d0 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 21 May 2012 22:23:44 -0400 Subject: [PATCH 268/823] print message and stack trace when exception occurs in completion --- util/complete/JLineCompletion.scala | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index 2557a1b08..0af53e262 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -28,7 
+28,12 @@ object JLineCompletion val current = Some(bufferSnapshot(reader)) level = if(current == previous) level + 1 else 1 previous = current - completeImpl(reader, level) + try completeImpl(reader, level) + catch { case e: Exception => + reader.printString("\nException occurred while determining completions.") + e.printStackTrace() + false + } } } From 935eed087f8f30e0a2c594bdf4bacd9c7628e444 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 23 May 2012 20:13:52 -0400 Subject: [PATCH 269/823] another fix related to #460 --- util/complete/LineReader.scala | 10 ++++++++-- 1 file changed, 8 insertions(+), 2 deletions(-) diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index 1318061af..ecd1eafd9 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -62,10 +62,16 @@ private object JLine val t = terminal t.synchronized { f(t) } } - def createReader() = + /** For accessing the JLine Terminal object. + * This ensures synchronized access as well as re-enabling echo after getting the Terminal. */ + def usingTerminal[T](f: jline.Terminal => T): T = withTerminal { t => - val cr = new ConsoleReader t.enableEcho() + f(t) + } + def createReader() = + usingTerminal { t => + val cr = new ConsoleReader cr.setBellEnabled(false) cr } From 4c1a979d8af76e79b825553dfa9ab41c2e955f86 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 16 Jun 2012 23:40:52 -0400 Subject: [PATCH 270/823] disable resident-compiler related code paths when it isn't being used. fixes #486. The underlying issue with the resident compiler needs fixing, however. 
--- .../src/main/java/xsbti/compile/CachedCompilerProvider.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java index 43d3aaf7e..f32c54e3d 100644 --- a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java +++ b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java @@ -6,5 +6,5 @@ import xsbti.Reporter; public interface CachedCompilerProvider { ScalaInstance scalaInstance(); - CachedCompiler newCachedCompiler(String[] arguments, Logger log, Reporter reporter); + CachedCompiler newCachedCompiler(String[] arguments, Logger log, Reporter reporter, boolean resident); } \ No newline at end of file From 4e574d0df3f74b8bb7b86a57f7bb443ceb45306e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 1 Jul 2012 15:16:41 -0400 Subject: [PATCH 271/823] better handling of multi-loggers with mixed escape sequence support * multi-logger supports ansi escapes if at least one logger support them * escape sequences removed from strings for loggers without escape support --- util/log/ConsoleLogger.scala | 47 ++++++++++++++ util/log/MainLogging.scala | 2 +- util/log/MultiLogger.scala | 26 +++++++- util/log/src/test/scala/Escapes.scala | 91 +++++++++++++++++++++++++++ 4 files changed, 163 insertions(+), 3 deletions(-) create mode 100644 util/log/src/test/scala/Escapes.scala diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index d8ce50dd0..bc48d7ad2 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -27,6 +27,53 @@ object ConsoleLogger def println() = { out.newLine(); out.flush() } } + /** Escape character, used to introduce an escape sequence. */ + final val ESC = '\u001B' + + /** An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). + * It is the final character in an escape sequence. 
*/ + def isEscapeTerminator(c: Char): Boolean = + c >= '@' && c <= '~' + + /** Returns true if the string contains the ESC character. */ + def hasEscapeSequence(s: String): Boolean = + s.indexOf(ESC) >= 0 + + /** Returns the string `s` with escape sequences removed. + * An escape sequence starts with the ESC character (decimal value 27) and ends with an escape terminator. + * @see isEscapeTerminator + */ + def removeEscapeSequences(s: String): String = + if(s.isEmpty || !hasEscapeSequence(s)) + s + else + { + val sb = new java.lang.StringBuilder + nextESC(s, 0, sb) + sb.toString + } + private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder) + { + val escIndex = s.indexOf(ESC, start) + if(escIndex < 0) + sb.append(s, start, s.length) + else { + sb.append(s, start, escIndex) + val next = skipESC(s, escIndex+1) + nextESC(s, next, sb) + } + } + + + /** Skips the escape sequence starting at `i-1`. `i` should be positioned at the character after the ESC that starts the sequence. 
*/ + private[this] def skipESC(s: String, i: Int): Int = + if(i >= s.length) + i + else if( isEscapeTerminator(s.charAt(i)) ) + i+1 + else + skipESC(s, i+1) + val formatEnabled = { import java.lang.Boolean.{getBoolean, parseBoolean} diff --git a/util/log/MainLogging.scala b/util/log/MainLogging.scala index 25f42a6af..d7b45e043 100644 --- a/util/log/MainLogging.scala +++ b/util/log/MainLogging.scala @@ -31,7 +31,7 @@ object MainLogging def defaultScreen(suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(suppressedMessage = suppressedMessage) def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = - to => ConsoleLogger(ConsoleLogger.printWriterOut(to), useColor = useColor) // TODO: should probably filter ANSI codes when useColor=false + to => ConsoleLogger(ConsoleLogger.printWriterOut(to), useColor = useColor) } final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], diff --git a/util/log/MultiLogger.scala b/util/log/MultiLogger.scala index 9cdb65386..cd73bf2c3 100644 --- a/util/log/MultiLogger.scala +++ b/util/log/MultiLogger.scala @@ -8,7 +8,10 @@ package sbt // on the behavior of the delegates. 
class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { - override lazy val ansiCodesSupported = delegates.forall(_.ansiCodesSupported) + override lazy val ansiCodesSupported = delegates exists supported + private[this] lazy val allSupportCodes = delegates forall supported + private[this] def supported = (_: AbstractLogger).ansiCodesSupported + override def setLevel(newLevel: Level.Value) { super.setLevel(newLevel) @@ -29,5 +32,24 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger def success(message: => String) { dispatch(new Success(message)) } def logAll(events: Seq[LogEvent]) { delegates.foreach(_.logAll(events)) } def control(event: ControlEvent.Value, message: => String) { delegates.foreach(_.control(event, message)) } - private def dispatch(event: LogEvent) { delegates.foreach(_.log(event)) } + private[this] def dispatch(event: LogEvent) + { + val plainEvent = if(allSupportCodes) event else removeEscapes(event) + for( d <- delegates) + if(d.ansiCodesSupported) + d.log(event) + else + d.log(plainEvent) + } + + private[this] def removeEscapes(event: LogEvent): LogEvent = + { + import ConsoleLogger.{removeEscapeSequences => rm} + event match { + case s: Success => new Success(rm(s.msg)) + case l: Log => new Log(l.level, rm(l.msg)) + case ce: ControlEvent => new ControlEvent(ce.event, rm(ce.msg)) + case _: Trace | _: SetLevel | _: SetTrace | _: SetSuccess => event + } + } } \ No newline at end of file diff --git a/util/log/src/test/scala/Escapes.scala b/util/log/src/test/scala/Escapes.scala new file mode 100644 index 000000000..f90499574 --- /dev/null +++ b/util/log/src/test/scala/Escapes.scala @@ -0,0 +1,91 @@ +package sbt + +import org.scalacheck._ +import Prop._ +import Gen.{listOf, oneOf} + +import ConsoleLogger.{ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences} + +object Escapes extends Properties("Escapes") +{ + property("genTerminator only generates terminators") = + forAllNoShrink(genTerminator) { 
(c: Char) => isEscapeTerminator(c) } + + property("genWithoutTerminator never generates terminators") = + forAllNoShrink(genWithoutTerminator) { (s: String) => + s.forall { c => !isEscapeTerminator(c) } + } + + property("hasEscapeSequence is false when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => + !hasEscapeSequence(s) + } + + property("hasEscapeSequence is true when escape character is present") = forAllNoShrink(genWithRandomEscapes) { (s: String) => + hasEscapeSequence(s) + } + + property("removeEscapeSequences is the identity when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => + val removed: String = removeEscapeSequences(s) + ("Escape sequence removed: '" + removed + "'") |: + (removed == s) + } + + property("No escape characters remain after removeEscapeSequences") = forAll { (s: String) => + val removed: String = removeEscapeSequences(s) + ("Escape sequence removed: '" + removed + "'") |: + !hasEscapeSequence(removed) + } + + property("removeEscapeSequences returns string without escape sequences") = + forAllNoShrink( genWithoutEscape, genEscapePairs ) { (start: String, escapes: List[EscapeAndNot]) => + val withEscapes: String = start + escapes.map { ean => ean.escape.makeString + ean.notEscape } + val removed: String = removeEscapeSequences(withEscapes) + val original = start + escapes.map(_.notEscape) + ("Input string with escapes: '" + withEscapes + "'") |: + ("Escapes removed '" + removed + "'") |: + (original == removed) + } + + final case class EscapeAndNot(escape: EscapeSequence, notEscape: String) + final case class EscapeSequence(content: String, terminator: Char) + { + assert( content.forall(c => !isEscapeTerminator(c) ), "Escape sequence content contains an escape terminator: '" + content + "'" ) + assert( isEscapeTerminator(terminator) ) + def makeString: String = ESC + content + terminator + } + private[this] def noEscape(s: String): String = s.replace(ESC, ' ') + + 

lazy val genEscapeSequence: Gen[EscapeSequence] = oneOf(genKnownSequence, genArbitraryEscapeSequence) + lazy val genEscapePair: Gen[EscapeAndNot] = for(esc <- genEscapeSequence; not <- genWithoutEscape) yield EscapeAndNot(esc, not) + lazy val genEscapePairs: Gen[List[EscapeAndNot]] = listOf(genEscapePair) + + lazy val genArbitraryEscapeSequence: Gen[EscapeSequence] = + for(content <- genWithoutTerminator; term <- genTerminator) yield + new EscapeSequence(content, term) + + lazy val genKnownSequence: Gen[EscapeSequence] = + oneOf((misc ++ setGraphicsMode ++ setMode ++ resetMode).map(toEscapeSequence)) + + def toEscapeSequence(s: String): EscapeSequence = EscapeSequence(s.init, s.last) + + lazy val misc = Seq("14;23H", "5;3f", "2A", "94B", "19C", "85D", "s", "u", "2J", "K") + + lazy val setGraphicsMode: Seq[String] = + for(txt <- 0 to 8; fg <- 30 to 37; bg <- 40 to 47) yield + txt.toString + ";" + fg.toString + ";" + bg.toString + "m" + + lazy val resetMode = setModeLike('I') + lazy val setMode = setModeLike('h') + def setModeLike(term: Char): Seq[String] = (0 to 19).map(i => "=" + i.toString + term) + + lazy val genWithoutTerminator = genRawString.map( _.filter { c => !isEscapeTerminator(c) } ) + + lazy val genTerminator: Gen[Char] = Gen.choose('@', '~') + lazy val genWithoutEscape: Gen[String] = genRawString.map(noEscape) + + def genWithRandomEscapes: Gen[String] = + for(ls <- listOf(genRawString); end <- genRawString) yield + ls.mkString("", ESC.toString, ESC.toString + end) + + private def genRawString = Arbitrary.arbString.arbitrary +} From fbb09b1433da9df69a53a2d15c10f5fdccba3072 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 1 Jul 2012 15:16:41 -0400 Subject: [PATCH 272/823] fix task tests --- util/collection/TypeFunctions.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index 185c72226..bbfad9b8e 100644 --- a/util/collection/TypeFunctions.scala +++ 
b/util/collection/TypeFunctions.scala @@ -26,6 +26,7 @@ trait TypeFunctions implicit def toFn1[A,B](f: A => B): Fn1[A,B] = new Fn1[A,B] { def ∙[C](g: C => A) = f compose g } + def idK[M[_]]: M ~> M = new (M ~> M) { def apply[T](m: M[T]): M[T] = m } type Endo[T] = T=>T type ~>|[A[_],B[_]] = A ~> Compose[Option, B]#Apply From 2196cbedaf36e03d8ac5f0c9972aad0286c43612 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 1 Jul 2012 15:16:41 -0400 Subject: [PATCH 273/823] remove most occurrences of ScalaObject --- util/process/src/test/scala/ProcessSpecification.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala index 15f9c1f48..6810025bf 100644 --- a/util/process/src/test/scala/ProcessSpecification.scala +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -82,10 +82,10 @@ private[this] object ProcessSpecification extends Properties("Process I/O") { val ignore = echo // just for the compile dependency so that this test is rerun when TestedProcess.scala changes, not used otherwise - val thisClasspath = List(getSource[ScalaObject], getSource[IO.type], getSource[SourceTag]).mkString(File.pathSeparator) + val thisClasspath = List(getSource[Product], getSource[IO.type], getSource[SourceTag]).mkString(File.pathSeparator) "java -cp " + thisClasspath + " " + command } private def getSource[T : Manifest]: String = IO.classLocationFile[T].getAbsolutePath } -private trait SourceTag \ No newline at end of file +private trait SourceTag From e9ed0feb878d90b7911122930529e99a2fee99f8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 6 Jul 2012 10:28:51 -0400 Subject: [PATCH 274/823] repeatDep parser combinator --- util/complete/Parsers.scala | 9 +++++ util/complete/src/test/scala/ParserTest.scala | 35 +++++++++++++++++-- 2 files changed, 42 insertions(+), 2 deletions(-) diff --git a/util/complete/Parsers.scala 
b/util/complete/Parsers.scala index 772ec20d5..5e4d46bc1 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -95,6 +95,15 @@ trait Parsers def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false + def repeatDep[A](p: Seq[A] => Parser[A], sep: Parser[Any]): Parser[Seq[A]] = + { + def loop(acc: Seq[A]): Parser[Seq[A]] = { + val next = (sep ~> p(acc)) flatMap { result => loop(acc :+ result) } + next ?? acc + } + p(Vector()) flatMap { first => loop(Seq(first)) } + } + def trimmed(p: Parser[String]) = p map { _.trim } lazy val basicUri = mapOrFail(URIClass)( uri => new URI(uri)) def Uri(ex: Set[URI]) = basicUri examples(ex.map(_.toString)) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 9c6ec091d..a7d276a38 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -41,6 +41,7 @@ object JLineTest object ParserTest extends Properties("Completing Parser") { import Parsers._ + import DefaultParsers.matches val nested = (token("a1") ~ token("b2")) ~ "c3" val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3" @@ -54,13 +55,23 @@ object ParserTest extends Properties("Completing Parser") ( ("display '" + in + "'") |: checkOne(in, nestedDisplay, expectDisplay) ) def checkOne(in: String, parser: Parser[_], expect: Completion): Prop = - p(completions(parser, in, 1)) == Completions.single(expect) + completions(parser, in, 1) == Completions.single(expect) + def checkAll(in: String, parser: Parser[_], expect: Completions): Prop = + { + val cs = completions(parser, in, 1) + ("completions: " + cs) |: ("Expected: " + expect) |: ( (cs == expect): Prop) + } + def checkInvalid(in: String) = ( ("token '" + in + "'") |: checkInv(in, nested) ) && ( ("display '" + in + "'") |: checkInv(in, nestedDisplay) ) + def checkInv(in: String, parser: Parser[_]): Prop = - p(completions(parser, in, 1)) == Completions.nil + { + val cs = 
completions(parser, in, 1) + ("completions: " + cs) |: (( cs == Completions.nil): Prop) + } property("nested tokens a") = checkSingle("", Completion.tokenStrict("","a1") )( Completion.displayStrict("")) property("nested tokens a1") = checkSingle("a", Completion.tokenStrict("a","1") )( Completion.displayStrict("")) @@ -78,6 +89,26 @@ object ParserTest extends Properties("Completing Parser") property("no suggest at token end") = checkOne("asdf", token("asdf"), Completion.suggestStrict("")) property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestStrict("")) property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestStrict("")) + + val colors = Set("blue", "green", "red") + val base = (seen: Seq[String]) => token( ID examples (colors -- seen) ) + val sep = token( Space ) + val repeat = repeatDep( base, sep) + def completionStrings(ss: Set[String]): Completions = Completions(ss.map { s => Completion.tokenStrict("", s) }) + + property("repeatDep no suggestions for bad input") = checkInv(".", repeat) + property("repeatDep suggest all") = checkAll("", repeat, completionStrings(colors)) + property("repeatDep suggest remaining two") = { + val first = colors.toSeq.head + checkAll(first + " ", repeat, completionStrings(colors - first)) + } + property("repeatDep suggest remaining one") = { + val take = colors.toSeq.take(2) + checkAll(take.mkString("", " ", " "), repeat, completionStrings(colors -- take)) + } + property("repeatDep requires at least one token") = !matches(repeat, "") + property("repeatDep accepts one token") = matches(repeat, colors.toSeq.head) + property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) } object ParserExample { From 35cfba21c046c8845d4fe1a6c94f6f8cdfbd654b Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Tue, 10 Jul 2012 20:51:34 +0400 Subject: [PATCH 275/823] Break compiler dependency from 
'collection' project. --- util/collection/Util.scala | 7 ------- util/{collection => relation}/Relation.scala | 2 +- .../src/test/scala/RelationTest.scala | 0 3 files changed, 1 insertion(+), 8 deletions(-) rename util/{collection => relation}/Relation.scala (99%) rename util/{collection => relation}/src/test/scala/RelationTest.scala (100%) diff --git a/util/collection/Util.scala b/util/collection/Util.scala index 429a1b61d..a886c233d 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -20,13 +20,6 @@ object Util case Left(l) => (l +: acc._1, acc._2) case Right(r) => (acc._1, r +: acc._2) } - def counted(prefix: String, single: String, plural: String, count: Int): Option[String] = - count match - { - case 0 => None - case 1 => Some("1 " + prefix + single) - case x => Some(x.toString + " " + prefix + plural) - } def pairID[A,B] = (a: A, b: B) => (a,b) } diff --git a/util/collection/Relation.scala b/util/relation/Relation.scala similarity index 99% rename from util/collection/Relation.scala rename to util/relation/Relation.scala index dce3d9048..0128333bd 100644 --- a/util/collection/Relation.scala +++ b/util/relation/Relation.scala @@ -119,4 +119,4 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def contains(a: A, b: B): Boolean = forward(a)(b) override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") -} \ No newline at end of file +} diff --git a/util/collection/src/test/scala/RelationTest.scala b/util/relation/src/test/scala/RelationTest.scala similarity index 100% rename from util/collection/src/test/scala/RelationTest.scala rename to util/relation/src/test/scala/RelationTest.scala From 550599bea098736a3421fb7b9de32e9bccd990fe Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Tue, 10 Jul 2012 21:12:39 +0400 Subject: [PATCH 276/823] Changes required to use sbt as-is from Scala-IDE. 
--- interface/other | 6 +++++- .../java/xsbti/compile/CachedCompiler.java | 2 +- .../xsbti/compile/CachedCompilerProvider.java | 2 +- .../java/xsbti/compile/CompileProgress.java | 7 +++++++ .../main/java/xsbti/compile/GlobalsCache.java | 2 +- .../main/java/xsbti/compile/JavaCompiler.java | 2 +- .../java/xsbti/compile/MultipleOutput.java | 20 +++++++++++++++++++ .../src/main/java/xsbti/compile/Options.java | 7 ++----- .../src/main/java/xsbti/compile/Output.java | 9 +++++++++ .../src/main/java/xsbti/compile/Setup.java | 3 +++ .../main/java/xsbti/compile/SingleOutput.java | 12 +++++++++++ 11 files changed, 62 insertions(+), 10 deletions(-) create mode 100755 interface/src/main/java/xsbti/compile/CompileProgress.java create mode 100755 interface/src/main/java/xsbti/compile/MultipleOutput.java create mode 100755 interface/src/main/java/xsbti/compile/Output.java create mode 100755 interface/src/main/java/xsbti/compile/SingleOutput.java diff --git a/interface/other b/interface/other index a8f5ba49c..111896f0b 100644 --- a/interface/other +++ b/interface/other @@ -9,9 +9,13 @@ SourceAPI packages : Package* definitions: Definition* +OutputSetting + sourceDirectory: String + outputDirectory: String + Compilation startTime: Long - target: String + outputs: OutputSetting* Package name: String diff --git a/interface/src/main/java/xsbti/compile/CachedCompiler.java b/interface/src/main/java/xsbti/compile/CachedCompiler.java index 2f97e395b..97a1a33b5 100644 --- a/interface/src/main/java/xsbti/compile/CachedCompiler.java +++ b/interface/src/main/java/xsbti/compile/CachedCompiler.java @@ -7,5 +7,5 @@ import java.io.File; public interface CachedCompiler { - public void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate); + public void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate, CompileProgress progress); } diff --git 
a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java index f32c54e3d..313f27505 100644 --- a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java +++ b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java @@ -6,5 +6,5 @@ import xsbti.Reporter; public interface CachedCompilerProvider { ScalaInstance scalaInstance(); - CachedCompiler newCachedCompiler(String[] arguments, Logger log, Reporter reporter, boolean resident); + CachedCompiler newCachedCompiler(String[] arguments, Output output, Logger log, Reporter reporter, boolean resident); } \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/CompileProgress.java b/interface/src/main/java/xsbti/compile/CompileProgress.java new file mode 100755 index 000000000..902a50018 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/CompileProgress.java @@ -0,0 +1,7 @@ +package xsbti.compile; + +public interface CompileProgress { + void startUnit(String phase, String unitPath); + + boolean advance(int current, int total); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/GlobalsCache.java b/interface/src/main/java/xsbti/compile/GlobalsCache.java index 1e070a1f1..a3a412836 100644 --- a/interface/src/main/java/xsbti/compile/GlobalsCache.java +++ b/interface/src/main/java/xsbti/compile/GlobalsCache.java @@ -5,6 +5,6 @@ import xsbti.Reporter; public interface GlobalsCache { - public CachedCompiler apply(String[] args, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); + public CachedCompiler apply(String[] args, Output output, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); public void clear(); } diff --git a/interface/src/main/java/xsbti/compile/JavaCompiler.java b/interface/src/main/java/xsbti/compile/JavaCompiler.java index 05bafdfe1..c9947c700 100644 --- 
a/interface/src/main/java/xsbti/compile/JavaCompiler.java +++ b/interface/src/main/java/xsbti/compile/JavaCompiler.java @@ -11,5 +11,5 @@ public interface JavaCompiler /** Compiles Java sources using the provided classpath, output directory, and additional options. * If supported, the number of reported errors should be limited to `maximumErrors`. * Output should be sent to the provided logger.*/ - void compile(File[] sources, File[] classpath, File outputDirectory, String[] options, int maximumErrors, Logger log); + void compile(File[] sources, File[] classpath, Output output, String[] options, int maximumErrors, Logger log); } diff --git a/interface/src/main/java/xsbti/compile/MultipleOutput.java b/interface/src/main/java/xsbti/compile/MultipleOutput.java new file mode 100755 index 000000000..6ba3479e6 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/MultipleOutput.java @@ -0,0 +1,20 @@ +package xsbti.compile; + +import java.io.File; + +public interface MultipleOutput extends Output { + + interface OutputGroup { + /** The directory where source files are stored for this group. + * Source directories should uniquely identify the group for a source file. */ + File sourceDirectory(); + + /** The directory where class files should be generated. + * Incremental compilation will manage the class files in this directory. + * In particular, outdated class files will be deleted before compilation. + * It is important that this directory is exclusively used for one set of sources. 
*/ + File outputDirectory(); + } + + OutputGroup[] outputGroups(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/Options.java b/interface/src/main/java/xsbti/compile/Options.java index 9e0d03e98..877ff5ebf 100644 --- a/interface/src/main/java/xsbti/compile/Options.java +++ b/interface/src/main/java/xsbti/compile/Options.java @@ -13,11 +13,8 @@ public interface Options * This should include Scala and Java sources, which are identified by their extension. */ File[] sources(); - /** The directory where class files should be generated. - * Incremental compilation will manage the class files in this directory. - * In particular, outdated class files will be deleted before compilation. - * It is important that this directory is exclusively used for one set of sources. */ - File classesDirectory(); + /** Output for the compilation. */ + Output output(); /** The options to pass to the Scala compiler other than the sources and classpath to use. */ String[] options(); diff --git a/interface/src/main/java/xsbti/compile/Output.java b/interface/src/main/java/xsbti/compile/Output.java new file mode 100755 index 000000000..c7f28a2f1 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/Output.java @@ -0,0 +1,9 @@ +package xsbti.compile; + +import java.io.File; +/** Abstract interface denoting the output of the compilation. Inheritors are SingleOutput with a global output directory and + * MultipleOutput that specifies the output directory per source file. + */ +public interface Output +{ +} diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java index cf261aa7a..1efc08782 100644 --- a/interface/src/main/java/xsbti/compile/Setup.java +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -23,4 +23,7 @@ public interface Setup File cacheFile(); GlobalsCache cache(); + + /** If returned, the progress that should be used to report scala compilation to. 
*/ + Maybe progress(); } diff --git a/interface/src/main/java/xsbti/compile/SingleOutput.java b/interface/src/main/java/xsbti/compile/SingleOutput.java new file mode 100755 index 000000000..cb200c9b7 --- /dev/null +++ b/interface/src/main/java/xsbti/compile/SingleOutput.java @@ -0,0 +1,12 @@ +package xsbti.compile; + +import java.io.File; + +public interface SingleOutput extends Output { + + /** The directory where class files should be generated. + * Incremental compilation will manage the class files in this directory. + * In particular, outdated class files will be deleted before compilation. + * It is important that this directory is exclusively used for one set of sources. */ + File outputDirectory(); +} \ No newline at end of file From d0c1b536e8fb630e1c98d353288fa304c3eeb7f0 Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Wed, 11 Jul 2012 12:15:04 +0400 Subject: [PATCH 277/823] Add xsbti.Reporter to required inputs instead of maxErrors. --- interface/src/main/java/xsbti/compile/JavaCompiler.java | 3 +-- interface/src/main/java/xsbti/compile/Options.java | 3 --- interface/src/main/java/xsbti/compile/Setup.java | 4 ++++ 3 files changed, 5 insertions(+), 5 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/JavaCompiler.java b/interface/src/main/java/xsbti/compile/JavaCompiler.java index c9947c700..ff6b83cc3 100644 --- a/interface/src/main/java/xsbti/compile/JavaCompiler.java +++ b/interface/src/main/java/xsbti/compile/JavaCompiler.java @@ -9,7 +9,6 @@ import xsbti.Logger; public interface JavaCompiler { /** Compiles Java sources using the provided classpath, output directory, and additional options. - * If supported, the number of reported errors should be limited to `maximumErrors`. 
* Output should be sent to the provided logger.*/ - void compile(File[] sources, File[] classpath, Output output, String[] options, int maximumErrors, Logger log); + void compile(File[] sources, File[] classpath, Output output, String[] options, Logger log); } diff --git a/interface/src/main/java/xsbti/compile/Options.java b/interface/src/main/java/xsbti/compile/Options.java index 877ff5ebf..78643202d 100644 --- a/interface/src/main/java/xsbti/compile/Options.java +++ b/interface/src/main/java/xsbti/compile/Options.java @@ -22,9 +22,6 @@ public interface Options /** The options to pass to the Java compiler other than the sources and classpath to use. */ String[] javacOptions(); - /** The maximum number of errors that the Scala compiler should report.*/ - int maxErrors(); - /** Controls the order in which Java and Scala sources are compiled.*/ CompileOrder order(); } diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java index 1efc08782..050e20c2f 100644 --- a/interface/src/main/java/xsbti/compile/Setup.java +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -2,6 +2,7 @@ package xsbti.compile; import java.io.File; import xsbti.Maybe; +import xsbti.Reporter; /** Configures incremental recompilation. */ public interface Setup @@ -26,4 +27,7 @@ public interface Setup /** If returned, the progress that should be used to report scala compilation to. */ Maybe progress(); + + /** The reporter that should be used to report scala compilation to. 
*/ + Reporter reporter(); } From a509c472073bb5a72a5f5cfaef78b14c2f8a6c65 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 13 Jul 2012 13:41:00 -0400 Subject: [PATCH 278/823] print completions containing a newline first and on separate lines --- util/complete/JLineCompletion.scala | 32 +++++++++++++++++++++++++---- 1 file changed, 28 insertions(+), 4 deletions(-) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index 0af53e262..d772940c7 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -110,17 +110,41 @@ object JLineCompletion reader.redrawLine() } + /** `display` is assumed to be the exact strings requested to be displayed. + * In particular, duplicates should have been removed already. */ def showCompletions(display: Seq[String], reader: ConsoleReader) { printCompletions(display, reader) reader.drawLine() } - def printCompletions(cs: Seq[String], reader: ConsoleReader): Unit = + def printCompletions(cs: Seq[String], reader: ConsoleReader) { - // CandidateListCompletionHandler doesn't print a new line before the prompt - if(cs.size > reader.getAutoprintThreshhold) + val print = shouldPrint(cs, reader) + reader.printNewline() + if(print) printLinesAndColumns(cs, reader) + } + def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader) + { + val (lines, columns) = cs partition hasNewline + for(line <- lines) { + reader.printString(line) reader.printNewline() - CandidateListCompletionHandler.printCandidates(reader, JavaConversions.asJavaList(cs), true) + } + reader.printColumns(JavaConversions.asJavaList(columns)) + } + def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0 + def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = + { + val size = cs.size + (size <= reader.getAutoprintThreshhold) || + confirm("Display all %d possibilities? 
(y or n) ".format(size), 'y', 'n', reader) + } + def confirm(prompt: String, trueC: Char, falseC: Char, reader: ConsoleReader): Boolean = + { + reader.printNewline() + reader.printString(prompt) + reader.flushConsole() + reader.readCharacter( Array(trueC, falseC) ) == trueC } def commonPrefix(s: Seq[String]): String = if(s.isEmpty) "" else s reduceLeft commonPrefix From 9fea4d170323160df0f9e27f6405390c00af9ea7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 13 Jul 2012 13:41:00 -0400 Subject: [PATCH 279/823] methods for working with Scala identifiers --- util/collection/Util.scala | 10 ++++++++++ util/complete/Parsers.scala | 11 +++++++++-- 2 files changed, 19 insertions(+), 2 deletions(-) diff --git a/util/collection/Util.scala b/util/collection/Util.scala index a886c233d..c36bb6443 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -22,4 +22,14 @@ object Util } def pairID[A,B] = (a: A, b: B) => (a,b) + + private[this] lazy val Hypen = """-(\p{javaLowerCase})""".r + def hypenToCamel(s: String): String = + Hypen.replaceAllIn(s, _.group(1).toUpperCase) + + private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r + def camelToHypen(s: String): String = + Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase) + + def quoteIfKeyword(s: String): String = if(ScalaKeywords.values(s)) '`' + s + '`' else s } diff --git a/util/complete/Parsers.scala b/util/complete/Parsers.scala index 5e4d46bc1..f289cc6ac 100644 --- a/util/complete/Parsers.scala +++ b/util/complete/Parsers.scala @@ -22,11 +22,17 @@ trait Parsers lazy val Letter = charClass(_.isLetter, "letter") def IDStart = Letter lazy val IDChar = charClass(isIDChar, "ID character") - lazy val ID = IDStart ~ IDChar.* map { case x ~ xs => (x +: xs).mkString } + lazy val ID = identifier(IDStart, IDChar) lazy val OpChar = charClass(isOpChar, "symbol") lazy val Op = OpChar.+.string lazy val OpOrID = ID | Op + lazy val ScalaIDChar = charClass(isScalaIDChar, 
"Scala identifier character") + lazy val ScalaID = identifier(IDStart, ScalaIDChar) + + def identifier(start: Parser[Char], rep: Parser[Char]): Parser[String] = + start ~ rep.* map { case x ~ xs => (x +: xs).mkString } + def opOrIDSpaced(s: String): Parser[Char] = if(DefaultParsers.matches(ID, s)) OpChar | SpaceClass @@ -37,7 +43,8 @@ trait Parsers def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } - def isIDChar(c: Char) = c.isLetterOrDigit || c == '-' || c == '_' + def isIDChar(c: Char) = isScalaIDChar(c) || c == '-' + def isScalaIDChar(c: Char) = c.isLetterOrDigit || c == '_' def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */',' | '.' => true ; case _ => false } lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") From af85595da02a9de793dd6770f08a43c2e4ac59ab Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 13 Jul 2012 13:41:00 -0400 Subject: [PATCH 280/823] JLine completion integration now considers a suggestion with a newline to be preformatted --- util/complete/JLineCompletion.scala | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index d772940c7..8eabd0ea5 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -64,11 +64,11 @@ object JLineCompletion { val (insert, display) = ( (Set.empty[String], Set.empty[String]) /: cs) { case ( t @ (insert,display), comp) => - if(comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display.trim)) + if(comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display)) } (insert.toSeq, display.toSeq.sorted) } - def appendNonEmpty(set: Set[String], add: String) = if(add.isEmpty) set else set + add + def 
appendNonEmpty(set: Set[String], add: String) = if(add.trim.isEmpty) set else set + add def customCompletor(f: (String, Int) => (Seq[String], Seq[String])): (ConsoleReader, Int) => Boolean = (reader, level) => { @@ -128,9 +128,10 @@ object JLineCompletion val (lines, columns) = cs partition hasNewline for(line <- lines) { reader.printString(line) - reader.printNewline() + if(line.charAt(line.length - 1) != '\n') + reader.printNewline() } - reader.printColumns(JavaConversions.asJavaList(columns)) + reader.printColumns(JavaConversions.asJavaList(columns.map(_.trim))) } def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0 def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = From 73166e2e57735767b8c24ee607248eec17b83bf9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 13 Jul 2012 13:41:00 -0400 Subject: [PATCH 281/823] clean up Completions and allow arbitrary 'display' for Token --- util/complete/Completions.scala | 41 +++++++++++++++++++++------------ 1 file changed, 26 insertions(+), 15 deletions(-) diff --git a/util/complete/Completions.scala b/util/complete/Completions.scala index a14527d48..594a9b9da 100644 --- a/util/complete/Completions.scala +++ b/util/complete/Completions.scala @@ -32,10 +32,12 @@ object Completions /** Returns a strict Completions instance using the provided Completion Set. */ def strict(cs: Set[Completion]): Completions = apply(cs) - /** No suggested completions, not even the empty Completion.*/ + /** No suggested completions, not even the empty Completion. + * This typically represents invalid input. */ val nil: Completions = strict(Set.empty) - /** Only includes an empty Suggestion */ + /** Only includes an empty Suggestion. + * This typically represents valid input that either has no completions or accepts no further input. 
*/ val empty: Completions = strict(Set.empty + Completion.empty) /** Returns a strict Completions instance containing only the provided Completion.*/ @@ -78,17 +80,15 @@ final class DisplayOnly(val display: String) extends Completion def append = "" override def toString = "{" + display + "}" } -final class Token(prepend0: String, append0: String) extends Completion +final class Token(val display: String, val append: String) extends Completion { - lazy val prepend = prepend0 - lazy val append = append0 - def isEmpty = prepend.isEmpty && append.isEmpty - def display = prepend + append - override final def toString = "[" + prepend + "," + append +"]" + @deprecated("Retained only for compatibility. All information is now in `display` and `append`.", "0.12.1") + lazy val prepend = display.stripSuffix(append) + def isEmpty = display.isEmpty && append.isEmpty + override final def toString = "[" + display + "]++" + append } -final class Suggestion(append0: String) extends Completion +final class Suggestion(val append: String) extends Completion { - lazy val append = append0 def isEmpty = append.isEmpty def display = append override def toString = append @@ -116,7 +116,7 @@ object Completion { case (as: Suggestion, bs: Suggestion) => as.append == bs.append case (ad: DisplayOnly, bd: DisplayOnly) => ad.display == bd.display - case (at: Token, bt: Token) => at.prepend == bt.prepend && at.append == bt.append + case (at: Token, bt: Token) => at.display == bt.display && at.append == bt.append case _ => false } @@ -125,16 +125,27 @@ object Completion { case as: Suggestion => (0, as.append).hashCode case ad: DisplayOnly => (1, ad.display).hashCode - case at: Token => (2, at.prepend, at.append).hashCode + case at: Token => (2, at.display, at.append).hashCode } - val empty: Completion = suggestStrict("") - def single(c: Char): Completion = suggestStrict(c.toString) + val empty: Completion = suggestion("") + def single(c: Char): Completion = suggestion(c.toString) + // TODO: make 
strict in 0.13.0 to match DisplayOnly def displayOnly(value: => String): Completion = new DisplayOnly(value) + @deprecated("Use displayOnly.", "0.12.1") def displayStrict(value: String): Completion = displayOnly(value) - def token(prepend: => String, append: => String): Completion = new Token(prepend, append) + + // TODO: make strict in 0.13.0 to match Token + def token(prepend: => String, append: => String): Completion = new Token(prepend+append, append) + @deprecated("Use token.", "0.12.1") def tokenStrict(prepend: String, append: String): Completion = token(prepend, append) + + /** @since 0.12.1 */ + def tokenDisplay(append: String, display: String): Completion = new Token(display, append) + + // TODO: make strict in 0.13.0 to match Suggestion def suggestion(value: => String): Completion = new Suggestion(value) + @deprecated("Use suggestion.", "0.12.1") def suggestStrict(value: String): Completion = suggestion(value) } \ No newline at end of file From fa97cc0d22632b99fb0e5e86295e76a24409bc3e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 13 Jul 2012 13:41:00 -0400 Subject: [PATCH 282/823] basic code for cleaning up Manifest.toString --- util/complete/TypeString.scala | 77 ++++++++++++++++++++++++++++++++++ 1 file changed, 77 insertions(+) create mode 100644 util/complete/TypeString.scala diff --git a/util/complete/TypeString.scala b/util/complete/TypeString.scala new file mode 100644 index 000000000..976b672e2 --- /dev/null +++ b/util/complete/TypeString.scala @@ -0,0 +1,77 @@ +package sbt.complete + + import DefaultParsers._ + import TypeString._ + +/** Basic representation of types parsed from Manifest.toString. +* This can only represent the structure of parameterized types. +* All other types are represented by a TypeString with an empty `args`. 
*/ +private[sbt] final class TypeString(val base: String, val args: List[TypeString]) +{ + override def toString = + if(base.startsWith(FunctionName)) + args.dropRight(1).mkString("(", ",", ")") + " => " + args.last + else if(base.startsWith(TupleName)) + args.mkString("(",",",")") + else + cleanupTypeName(base) + (if(args.isEmpty) "" else args.mkString("[", ",", "]")) +} + +private[sbt] object TypeString +{ + /** Makes the string representation of a type as returned by Manifest.toString more readable.*/ + def cleanup(typeString: String): String = + parse(typeString, typeStringParser) match { + case Right(ts) => ts.toString + case Left(err) => typeString + } + + /** Makes a fully qualified type name provided by Manifest.toString more readable. + * The argument should be just a name (like scala.Tuple2) and not a full type (like scala.Tuple2[Int,Boolean])*/ + def cleanupTypeName(base: String): String = + dropPrefix(base).replace('$', '.') + + /** Removes prefixes from a fully qualified type name that are unnecessary in the presence of standard imports for an sbt setting. + * This does not use the compiler and is therefore a conservative approximation.*/ + def dropPrefix(base: String): String = + if(base.startsWith(SbtPrefix)) + base.substring(SbtPrefix.length) + else if(base.startsWith(CollectionPrefix)) + { + val simple = base.substring(CollectionPrefix.length) + if(ShortenCollection(simple)) simple else base + } + else if(base.startsWith(ScalaPrefix)) + base.substring(ScalaPrefix.length) + else if(base.startsWith(JavaPrefix)) + base.substring(JavaPrefix.length) + else + TypeMap.getOrElse(base, base) + + final val CollectionPrefix = "scala.collection." + final val FunctionName = "scala.Function" + final val TupleName = "scala.Tuple" + final val SbtPrefix = "sbt." + final val ScalaPrefix = "scala." + final val JavaPrefix = "java.lang." 
+ /* scala.collection.X -> X */ + val ShortenCollection = Set("Seq", "List", "Set", "Map", "Iterable") + val TypeMap = Map( + "java.io.File" -> "File", + "java.net.URL" -> "URL", + "java.net.URI" -> "URI" + ) + + /** A Parser that extracts basic structure from the string representation of a type from Manifest.toString. + * This is rudimentary and essentially only decomposes the string into names and arguments for parameterized types. + * */ + lazy val typeStringParser: Parser[TypeString] = + { + def isFullScalaIDChar(c: Char) = isScalaIDChar(c) || c == '.' || c == '$' + lazy val fullScalaID = identifier(IDStart, charClass(isFullScalaIDChar, "Scala identifier character") ) + lazy val tpe: Parser[TypeString] = + for( id <- fullScalaID; args <- ('[' ~> rep1sep(tpe, ',') <~ ']').?) yield + new TypeString(id, args.toList.flatten) + tpe + } +} \ No newline at end of file From e5ffceaef8a4d270d38f806a9feb3a778024f0b1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 13 Jul 2012 13:41:00 -0400 Subject: [PATCH 283/823] clean up token completions and make providing a general completion function easier --- util/complete/Parser.scala | 55 +++++++++++++++++++--------- util/complete/TokenCompletions.scala | 38 +++++++++++++++++++ 2 files changed, 76 insertions(+), 17 deletions(-) create mode 100644 util/complete/TokenCompletions.scala diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index d16969b55..7f0fba5db 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -346,12 +346,39 @@ trait ParserMain success(seen.mkString) } - def token[T](t: Parser[T]): Parser[T] = token(t, "", true, const(false)) - def token[T](t: Parser[T], hide: Int => Boolean): Parser[T] = token(t, "", true, hide) - def token[T](t: Parser[T], description: String): Parser[T] = token(t, description, false, const(false)) + /** Establishes delegate parser `t` as a single token of tab completion. 
+ * When tab completion of part of this token is requested, the completions provided by the delegate `t` or a later derivative are appended to + * the prefix String already seen by this parser. */ + def token[T](t: Parser[T]): Parser[T] = token(t, TokenCompletions.default) + + /** Establishes delegate parser `t` as a single token of tab completion. + * When tab completion of part of this token is requested, no completions are returned if `hide` returns true for the current tab completion level. + * Otherwise, the completions provided by the delegate `t` or a later derivative are appended to the prefix String already seen by this parser.*/ + def token[T](t: Parser[T], hide: Int => Boolean): Parser[T] = token(t, TokenCompletions.default.hideWhen(hide)) + + /** Establishes delegate parser `t` as a single token of tab completion. + * When tab completion of part of this token is requested, `description` is displayed for suggestions and no completions are ever performed. */ + def token[T](t: Parser[T], description: String): Parser[T] = token(t, TokenCompletions.displayOnly(description)) + + /** Establishes delegate parser `t` as a single token of tab completion. + * When tab completion of part of this token is requested, `display` is used as the printed suggestion, but the completions from the delegate + * parser `t` are used to complete if unambiguous. 
*/ + def tokenDisplay[T](t: Parser[T], display: String): Parser[T] = + token(t, TokenCompletions.overrideDisplay(display)) + + def token[T](t: Parser[T], complete: TokenCompletions): Parser[T] = + mkToken(t, "", complete) + + @deprecated("Use a different `token` overload.", "0.12.1") def token[T](t: Parser[T], seen: String, track: Boolean, hide: Int => Boolean): Parser[T] = + { + val base = if(track) TokenCompletions.default else TokenCompletions.displayOnly(seen) + token(t, base.hideWhen(hide)) + } + + private[sbt] def mkToken[T](t: Parser[T], seen: String, complete: TokenCompletions): Parser[T] = if(t.valid && !t.isTokenStart) - if(t.result.isEmpty) new TokenStart(t, seen, track, hide) else t + if(t.result.isEmpty) new TokenStart(t, seen, complete) else t else t @@ -512,23 +539,17 @@ private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], part override def isTokenStart = delegate.isTokenStart override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" } -private final class TokenStart[T](delegate: Parser[T], seen: String, track: Boolean, hide: Int => Boolean) extends ValidParser[T] +private final class TokenStart[T](delegate: Parser[T], seen: String, complete: TokenCompletions) extends ValidParser[T] { - def derive(c: Char) = token( delegate derive c, if(track) seen + c else seen, track, hide) - def completions(level: Int) = - if(hide(level)) Completions.nil - else if(track) - { - val dcs = delegate.completions(level) - Completions( for(c <- dcs.get) yield Completion.token(seen, c.append) ) - } - else - Completions.single(Completion.displayStrict(seen)) - + def derive(c: Char) = mkToken( delegate derive c, seen + c, complete) + def completions(level: Int) = complete match { + case dc: TokenCompletions.Delegating => dc.completions(seen, level, delegate.completions(level)) + case fc: TokenCompletions.Fixed => fc.completions(seen, level) + } def result = delegate.result def resultEmpty = delegate.resultEmpty override def 
isTokenStart = true - override def toString = "token('" + seen + "', " + track + ", " + delegate + ")" + override def toString = "token('" + complete + ", " + delegate + ")" } private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] { diff --git a/util/complete/TokenCompletions.scala b/util/complete/TokenCompletions.scala new file mode 100644 index 000000000..aee6353db --- /dev/null +++ b/util/complete/TokenCompletions.scala @@ -0,0 +1,38 @@ +package sbt.complete + + import Completion.{displayStrict, token => ctoken, tokenDisplay} + +sealed trait TokenCompletions { + def hideWhen(f: Int => Boolean): TokenCompletions +} +object TokenCompletions +{ + private[sbt] abstract class Delegating extends TokenCompletions { outer => + def completions(seen: String, level: Int, delegate: Completions): Completions + final def hideWhen(hide: Int => Boolean): TokenCompletions = new Delegating { + def completions(seen: String, level: Int, delegate: Completions): Completions = + if(hide(level)) Completions.nil else outer.completions(seen, level, delegate) + } + } + private[sbt] abstract class Fixed extends TokenCompletions { outer => + def completions(seen: String, level: Int): Completions + final def hideWhen(hide: Int => Boolean): TokenCompletions = new Fixed { + def completions(seen: String, level: Int) = + if(hide(level)) Completions.nil else outer.completions(seen, level) + } + } + + val default: TokenCompletions = mapDelegateCompletions((seen,level,c) => ctoken(seen, c.append)) + + def displayOnly(msg: String): TokenCompletions = new Fixed { + def completions(seen: String, level: Int) = Completions.single(displayStrict(msg)) + } + def overrideDisplay(msg: String): TokenCompletions = mapDelegateCompletions((seen,level,c) => tokenDisplay(display = msg, append = c.append)) + + def fixed(f: (String, Int) => Completions): TokenCompletions = new Fixed { + def completions(seen: String, level: Int) = f(seen, level) + } + def mapDelegateCompletions(f: (String, Int, 
Completion) => Completion): TokenCompletions = new Delegating { + def completions(seen: String, level: Int, delegate: Completions) = Completions( delegate.get.map(c => f(seen, level, c)) ) + } +} \ No newline at end of file From b65a7078f106d8bf361275a863d58c40516686df Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Tue, 17 Jul 2012 21:09:42 +0400 Subject: [PATCH 284/823] Fix compilation error for 2.10.0-M5 for all but main project. --- util/collection/Dag.scala | 6 +++--- util/collection/IDSet.scala | 4 ++-- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/util/collection/Dag.scala b/util/collection/Dag.scala index 14a418f26..4250b0f10 100644 --- a/util/collection/Dag.scala +++ b/util/collection/Dag.scala @@ -11,15 +11,15 @@ trait Dag[Node <: Dag[Node]]{ } object Dag { - import scala.collection.{mutable, JavaConversions}; - import JavaConversions.{asIterable, asSet} + import scala.collection.{mutable, JavaConverters} + import JavaConverters.asScalaSetConverter def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies) def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = { val discovered = new mutable.HashSet[T] - val finished = asSet(new java.util.LinkedHashSet[T]) + val finished = (new java.util.LinkedHashSet[T]).asScala def visitAll(nodes: Iterable[T]) = nodes foreach visit def visit(node : T){ diff --git a/util/collection/IDSet.scala b/util/collection/IDSet.scala index 29ecf469d..683c7a76b 100644 --- a/util/collection/IDSet.scala +++ b/util/collection/IDSet.scala @@ -37,9 +37,9 @@ object IDSet def += (t: T) = backing.put(t, Dummy) def ++=(t: Iterable[T]) = t foreach += def -= (t:T) = if(backing.remove(t) eq null) false else true - def all = collection.JavaConversions.asIterable(backing.keySet) + def all = collection.JavaConversions.asScalaIterable(backing.keySet) def isEmpty = backing.isEmpty def process[S](t: T)(ifSeen: S)(ifNew: => S) = 
if(contains(t)) ifSeen else { this += t ; ifNew } override def toString = backing.toString } -} \ No newline at end of file +} From 2e8fdbdf05065715cdb0bd3c526b2b3f47ebaad8 Mon Sep 17 00:00:00 2001 From: Eugene Vigdorchik Date: Tue, 24 Jul 2012 10:43:56 +0400 Subject: [PATCH 285/823] Extend reporter to be used by the IDE. --- interface/src/main/java/xsbti/ExtendedReporter.java | 10 ++++++++++ 1 file changed, 10 insertions(+) create mode 100755 interface/src/main/java/xsbti/ExtendedReporter.java diff --git a/interface/src/main/java/xsbti/ExtendedReporter.java b/interface/src/main/java/xsbti/ExtendedReporter.java new file mode 100755 index 000000000..7bc4acc47 --- /dev/null +++ b/interface/src/main/java/xsbti/ExtendedReporter.java @@ -0,0 +1,10 @@ +/* sbt -- Simple Build Tool + * Copyright 2012 Eugene Vigdorchik + */ +package xsbti; + +/** An addition to standard reporter. Used by the IDE. */ +public interface ExtendedReporter extends Reporter +{ + public void comment(Position pos, String msg); +} From d17de8e83a999b4833d816d9b53eb299dff424bb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 1 Sep 2012 09:56:09 -0400 Subject: [PATCH 286/823] back all ConsoleLoggers by a common ConsoleOut The common ConsoleOut merges (overwrites) consecutive Resolving xxxx ... lines when ansi codes are enabled. --- util/log/ConsoleLogger.scala | 26 ++++++++++++++++++++++++++ util/log/GlobalLogging.scala | 5 ++++- util/log/MainLogging.scala | 16 ++++++++++++++-- 3 files changed, 44 insertions(+), 3 deletions(-) diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala index bc48d7ad2..711489c5c 100644 --- a/util/log/ConsoleLogger.scala +++ b/util/log/ConsoleLogger.scala @@ -8,6 +8,29 @@ package sbt object ConsoleLogger { def systemOut: ConsoleOut = printStreamOut(System.out) + def overwriteContaining(s: String): (String,String) => Boolean = (cur, prev) => + cur.contains(s) && prev.contains(s) + + /** ConsoleOut instance that is backed by System.out. 
It overwrites the previously printed line + * if the function `f(lineToWrite, previousLine)` returns true. + * + * The ConsoleOut returned by this method assumes that the only newlines are from println calls + * and not in the String arguments. */ + def systemOutOverwrite(f: (String,String) => Boolean): ConsoleOut = new ConsoleOut { + val lockObject = System.out + private[this] var last: Option[String] = None + private[this] var current = new java.lang.StringBuffer + def print(s: String): Unit = synchronized { current.append(s) } + def println(s: String): Unit = synchronized { current.append(s); println() } + def println(): Unit = synchronized { + val s = current.toString + if(formatEnabled && last.exists(lmsg => f(s, lmsg))) + System.out.print(OverwriteLine) + System.out.println(s) + last = Some(s) + current = new java.lang.StringBuffer + } + } def printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { val lockObject = out def print(s: String) = out.print(s) @@ -30,6 +53,9 @@ object ConsoleLogger /** Escape character, used to introduce an escape sequence. */ final val ESC = '\u001B' + /** Move to beginning of previous line and clear the line. */ + private[sbt] final val OverwriteLine = "\r\u001BM\u001B[2K" + /** An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). * It is the final character in an escape sequence. 
*/ def isEscapeTerminator(c: Char): Boolean = diff --git a/util/log/GlobalLogging.scala b/util/log/GlobalLogging.scala index e54b00a00..f712dd88a 100644 --- a/util/log/GlobalLogging.scala +++ b/util/log/GlobalLogging.scala @@ -19,9 +19,12 @@ object GlobalLogBacking } object GlobalLogging { + @deprecated("Explicitly specify standard out.", "0.13.0") def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File): GlobalLogging = + initial(newLogger, newBackingFile, ConsoleLogger.systemOut) + def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = { - val log = ConsoleLogger() + val log = ConsoleLogger(console) GlobalLogging(log, log, GlobalLogBacking(newLogger, newBackingFile)) } } \ No newline at end of file diff --git a/util/log/MainLogging.scala b/util/log/MainLogging.scala index d7b45e043..9e024ef28 100644 --- a/util/log/MainLogging.scala +++ b/util/log/MainLogging.scala @@ -18,17 +18,29 @@ object MainLogging multi: Logger } def globalDefault(writer: PrintWriter, backing: GlobalLogBacking): GlobalLogging = + globalDefault(writer, backing, ConsoleLogger.systemOut) + def globalDefault(writer: PrintWriter, backing: GlobalLogBacking, console: ConsoleOut): GlobalLogging = { val backed = defaultBacked()(writer) - val full = multiLogger(defaultMultiConfig( backed ) ) + val full = multiLogger(defaultMultiConfig(console, backed ) ) GlobalLogging(full, backed, backing) } + @deprecated("Explicitly specify the console output.", "0.13.0") def defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = - new MultiLoggerConfig(defaultScreen(ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) + defaultMultiConfig(ConsoleLogger.systemOut, backing) + def defaultMultiConfig(console: ConsoleOut, backing: AbstractLogger): MultiLoggerConfig = + new MultiLoggerConfig(defaultScreen(console, ConsoleLogger.noSuppressedMessage), 
backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) + @deprecated("Explicitly specify the console output.", "0.13.0") def defaultScreen(): AbstractLogger = ConsoleLogger() + + @deprecated("Explicitly specify the console output.", "0.13.0") def defaultScreen(suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(suppressedMessage = suppressedMessage) + + def defaultScreen(console: ConsoleOut): AbstractLogger = ConsoleLogger(console) + def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = + ConsoleLogger(console, suppressedMessage = suppressedMessage) def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = to => ConsoleLogger(ConsoleLogger.printWriterOut(to), useColor = useColor) From c8ffd6a54dc50e2065893f92f2b4b4b652627094 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Sep 2012 13:22:40 -0400 Subject: [PATCH 287/823] better error message for null setting values --- util/collection/INode.scala | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/util/collection/INode.scala b/util/collection/INode.scala index b47031a43..0e48324e3 100644 --- a/util/collection/INode.scala +++ b/util/collection/INode.scala @@ -88,7 +88,10 @@ abstract class EvaluateSettings[Scope] private[this] val calledBy = new collection.mutable.ListBuffer[BindNode[_, T]] override def toString = getClass.getName + " (state=" + state + ",blockedOn=" + blockedOn + ",calledBy=" + calledBy.size + ",blocking=" + blocking.size + "): " + - ( (static.toSeq.flatMap { case (key, value) => if(value eq this) key.toString :: Nil else Nil }).headOption getOrElse "non-static") + keyString + + private[this] def keyString = + (static.toSeq.flatMap { case (key, value) => if(value eq this) init.showFullKey(key) :: Nil else Nil }).headOption getOrElse "non-static" final def get: T = synchronized { assert(value != null, toString + " not 
evaluated") @@ -134,6 +137,7 @@ abstract class EvaluateSettings[Scope] } protected final def setValue(v: T) { assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) + if(v == null) error("Setting value cannot be null: " + keyString) value = v state = Evaluated blocking foreach { _.unblocked() } From efa362c583571650864efc37ec58aa0d7466ad50 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Sun, 23 Sep 2012 18:14:27 +0200 Subject: [PATCH 288/823] Fix #552 Compensate for JLine's absent EOF detection. In the unsupported terminal mode, JLine treats a broken stdin as an endless stream of empty lines. This is problematic for idea-sbt-plugin: if the IntelliJ process is forcibly killed and leaves the child SBT process running, it consumes considerable CPU processing these. Patching JLine itself would be the cleanest solution (the change has already been applied to JLine 2), but I've shied away from that and instead wrapped the InputStream that is read by JLine to intercept the result of -1 from read(). When this happens, the flat `inputEof` is set to true. --- util/complete/LineReader.scala | 25 +++++++++++++++++++++---- 1 file changed, 21 insertions(+), 4 deletions(-) diff --git a/util/complete/LineReader.scala b/util/complete/LineReader.scala index ecd1eafd9..9f3ca9036 100644 --- a/util/complete/LineReader.scala +++ b/util/complete/LineReader.scala @@ -3,14 +3,17 @@ */ package sbt - import jline.{Completor, ConsoleReader, History} - import java.io.{File,PrintWriter} + import jline.{ConsoleReader, History} + import java.io.{File, InputStream, PrintWriter} import complete.Parser - + import java.util.concurrent.atomic.AtomicBoolean + abstract class JLine extends LineReader { protected[this] val handleCONT: Boolean protected[this] val reader: ConsoleReader + /** Is the input stream at EOF? Compensates for absent EOF detection in JLine's UnsupportedTerminal. 
*/ + protected[this] val inputEof = new AtomicBoolean(false) protected[this] val historyPath: Option[File] def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } @@ -39,10 +42,14 @@ abstract class JLine extends LineReader else readLineDirectRaw(prompt, mask) private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = - mask match { + { + val line = mask match { case Some(m) => reader.readLine(prompt, m) case None => reader.readLine(prompt) } + if (inputEof.get) null else line + } + private[this] def resume() { jline.Terminal.resetTerminal @@ -109,6 +116,16 @@ final class FullReader(val historyPath: Option[File], complete: Parser[_], val h protected[this] val reader = { val cr = new ConsoleReader + if (!cr.getTerminal.isSupported) { + val input = cr.getInput + cr.setInput(new InputStream { + def read(): Int = { + val c = input.read() + if (c == -1) inputEof.set(true) + c + } + }) + } cr.setBellEnabled(false) sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) cr From aee2742932c51e9018d21674b8d580e5abec0f8d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 5 Oct 2012 09:06:26 -0400 Subject: [PATCH 289/823] API extraction: handle any type that is annotated, not just the spec'd simple type. Fixes #559. --- interface/type | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/type b/interface/type index ac0926e92..dbb393dd6 100644 --- a/interface/type +++ b/interface/type @@ -16,7 +16,7 @@ Type baseType: Type value: String Annotated - baseType : SimpleType + baseType : Type annotations : Annotation* Structure parents : ~Type* From 1f88fe9d7c635ff62c42d5891c46dc8465aa5fa7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 15 Oct 2012 12:42:27 -0400 Subject: [PATCH 290/823] Parser.failOnException method, don't let rhs of alias fail the parse. Fixes #572. alias only parses the right hand side for tab completion help. 
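The EOF-detection wrapper introduced in patch 288 above can be sketched as a standalone class. This is a simplified illustration, not sbt's actual API: the class and object names here are hypothetical, and the flag is exposed directly rather than threaded through a `JLine` subclass as in the patch.

```scala
import java.io.{ ByteArrayInputStream, InputStream }
import java.util.concurrent.atomic.AtomicBoolean

// Wrap the stream JLine reads so that a read() returning -1 flips a flag.
// A caller can then translate JLine's endless empty lines on a broken stdin
// into a proper end-of-input (returning null from readLine), as the patch does.
final class EofDetectingStream(in: InputStream) extends InputStream {
  val eofSeen = new AtomicBoolean(false)
  def read(): Int = {
    val c = in.read()
    if (c == -1) eofSeen.set(true)
    c
  }
}

object EofDemo {
  def main(args: Array[String]): Unit = {
    val s = new EofDetectingStream(new ByteArrayInputStream("a".getBytes))
    s.read()                 // consumes 'a'; not EOF yet
    println(s.eofSeen.get()) // false
    s.read()                 // hits end of stream
    println(s.eofSeen.get()) // true
  }
}
```

The patch installs the wrapper only when `cr.getTerminal.isSupported` is false, since supported terminals already detect EOF correctly.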
The assignment should happen whether or not the parse is successful because the context may change by the time the alias is actually evaluated. In particular, the 'set' command uses the loaded project for tab completion in 0.12.1. When a .sbtrc file is processed, the project has not been loaded yet, so aliases involving set fail. Wrapping the rhs in failOnException addresses this. --- util/complete/Parser.scala | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index 7f0fba5db..a10d1bdcb 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -45,6 +45,9 @@ sealed trait RichParser[A] /** Uses the specified message if the original Parser fails.*/ def !!!(msg: String): Parser[A] + /** If an exception is thrown by the original Parser, + * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing.*/ + def failOnException: Parser[A] def unary_- : Parser[Unit] def & (o: Parser[_]): Parser[A] @@ -173,6 +176,8 @@ object Parser extends ParserMain def onFailure[T](delegate: Parser[T], msg: String): Parser[T] = if(delegate.valid) new OnFailure(delegate, msg) else failure(msg) + def trapAndFail[T](delegate: Parser[T]): Parser[T] = + delegate.ifValid( new TrapAndFail(delegate) ) def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) @@ -233,6 +238,7 @@ trait ParserMain def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av } def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv } def !!!(msg: String): Parser[A] = onFailure(a, msg) + def failOnException: Parser[A] = trapAndFail(a) def unary_- = not(a) def & (o: Parser[_]) = and(a, o) @@ -425,6 +431,18 @@ private final case class Invalid(fail: Failure) extends Parser[Nothing] def valid = false def ifValid[S](p: => Parser[S]): Parser[S] = this } + +private final class 
TrapAndFail[A](a: Parser[A]) extends ValidParser[A] +{ + def result = try { a.result } catch { case e: Exception => None } + def resultEmpty = try { a.resultEmpty } catch { case e: Exception => fail(e) } + def derive(c: Char) = try { trapAndFail(a derive c) } catch { case e: Exception => Invalid(fail(e)) } + def completions(level: Int) = try { a.completions(level) } catch { case e: Exception => Completions.nil } + override def toString = "trap(" + a + ")" + override def isTokenStart = a.isTokenStart + private[this] def fail(e: Exception): Failure = mkFailure(e.toString) +} + private final class OnFailure[A](a: Parser[A], message: String) extends ValidParser[A] { def result = a.result From 0e472a99f9b35ac2fca2c362bfc43b550d7a6bc8 Mon Sep 17 00:00:00 2001 From: Benjy Date: Wed, 10 Oct 2012 18:21:44 -0700 Subject: [PATCH 291/823] Analysis.groupBy implementation. --- util/relation/Relation.scala | 11 ++++++++--- 1 file changed, 8 insertions(+), 3 deletions(-) diff --git a/util/relation/Relation.scala b/util/relation/Relation.scala index 0128333bd..04efe3e3e 100644 --- a/util/relation/Relation.scala +++ b/util/relation/Relation.scala @@ -72,10 +72,13 @@ trait Relation[A,B] def contains(a: A, b: B): Boolean /** Returns a relation with only pairs (a,b) for which f(a,b) is true.*/ def filter(f: (A,B) => Boolean): Relation[A,B] - + + /** Partitions this relation into a map of relations according to some discriminator function. 
*/ + def groupBy[K](f: ((A,B)) => K): Map[K, Relation[A,B]] + /** Returns all pairs in this relation.*/ def all: Traversable[(A,B)] - + def forwardMap: Map[A, Set[B]] def reverseMap: Map[B, Set[A]] } @@ -93,7 +96,7 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def size = fwd.size def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable - + def +(pair: (A,B)) = this + (pair._1, Set(pair._2)) def +(from: A, to: B) = this + (from, to :: Nil) def +(from: A, to: Traversable[B]) = @@ -116,6 +119,8 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def filter(f: (A,B) => Boolean): Relation[A,B] = Relation.empty[A,B] ++ all.filter(f.tupled) + def groupBy[K](f: ((A,B)) => K): Map[K, Relation[A,B]] = all.groupBy(f) mapValues { Relation.empty[A,B] ++ _ } + def contains(a: A, b: B): Boolean = forward(a)(b) override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") From 19315265c15f2543419f817f0a97900fe83d0463 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 14 Nov 2012 09:04:29 -0500 Subject: [PATCH 292/823] taking care of deprecations removed in Scala master --- util/collection/IDSet.scala | 2 +- util/collection/Util.scala | 2 ++ util/complete/JLineCompletion.scala | 2 +- util/complete/Parser.scala | 4 ++-- 4 files changed, 6 insertions(+), 4 deletions(-) diff --git a/util/collection/IDSet.scala b/util/collection/IDSet.scala index 683c7a76b..447082d8b 100644 --- a/util/collection/IDSet.scala +++ b/util/collection/IDSet.scala @@ -37,7 +37,7 @@ object IDSet def += (t: T) = backing.put(t, Dummy) def ++=(t: Iterable[T]) = t foreach += def -= (t:T) = if(backing.remove(t) eq null) false else true - def all = collection.JavaConversions.asScalaIterable(backing.keySet) + def all = collection.JavaConversions.collectionAsScalaIterable(backing.keySet) def isEmpty = backing.isEmpty def process[S](t: T)(ifSeen: 
S)(ifNew: => S) = if(contains(t)) ifSeen else { this += t ; ifNew } override def toString = backing.toString diff --git a/util/collection/Util.scala b/util/collection/Util.scala index c36bb6443..5aede5f0d 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -5,6 +5,8 @@ package sbt object Util { + def makeList[T](size: Int, value: T): List[T] = List.fill(size)(value) + def separateE[A,B](ps: Seq[Either[A,B]]): (Seq[A], Seq[B]) = separate(ps)(Types.idFun) diff --git a/util/complete/JLineCompletion.scala b/util/complete/JLineCompletion.scala index 8eabd0ea5..d41f47b85 100644 --- a/util/complete/JLineCompletion.scala +++ b/util/complete/JLineCompletion.scala @@ -131,7 +131,7 @@ object JLineCompletion if(line.charAt(line.length - 1) != '\n') reader.printNewline() } - reader.printColumns(JavaConversions.asJavaList(columns.map(_.trim))) + reader.printColumns(JavaConversions.seqAsJavaList(columns.map(_.trim))) } def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0 def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = diff --git a/util/complete/Parser.scala b/util/complete/Parser.scala index a10d1bdcb..a994e0658 100644 --- a/util/complete/Parser.scala +++ b/util/complete/Parser.scala @@ -5,7 +5,7 @@ package sbt.complete import Parser._ import sbt.Types.{const, left, right, some} - import sbt.Util.separate + import sbt.Util.{makeList,separate} sealed trait Parser[+T] { @@ -676,7 +676,7 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m else // forced determinism for(value <- repeated.resultEmpty) yield - List.make(min, value) + makeList(min, value) } override def toString = "repeat(" + min + "," + max +"," + partial + "," + repeated + ")" } \ No newline at end of file From dbe4b74c1073c0d70fc56dfeb4072411c295f0f4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 31 Jul 2012 11:52:10 -0400 Subject: [PATCH 293/823] reorganization of main/ * split several source files * move base settings sources 
(Scope, Structure, ...) into main/settings/ * breaks cycles. In particular, setting system moved from Project to Def --- util/collection/Attributes.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index f5ea9dcd4..0376ca76c 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -125,6 +125,7 @@ final case class Attributed[D](data: D)(val metadata: AttributeMap) } object Attributed { + def data[T](in: Seq[Attributed[T]]): Seq[T] = in.map(_.data) def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) } \ No newline at end of file From 15fec197c31b36cf183800fe751b5f04b10ef8ab Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 31 Jul 2012 11:52:10 -0400 Subject: [PATCH 294/823] 2.10.0-M5, different arity generalization 1. KList[M[_]] now instead of KList[HL <: HList, M[_]] a. head, tail work properly in this variant b. disadvantage is that full type not easily transformed to new type constructor 2. AList abstracts on K[L[x]], a higher order type constructor. A. Instances written for: a. KList b. Seq[M[T]] for a fixed T c. TupleN d. single values e. operate on one type constructor when nested B. Main disadvantage is type inference. It just doesn't happen for K[L[x]]. This is mitigated by AList being used internally and rarely needing to construct a K. 
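The core idea of patch 294, applying a natural transformation `M ~> N` uniformly across a container whose elements have type `M[T]` for varying `T`, can be shown in a self-contained sketch. This handles only the `Tuple2` shape (corresponding to `AList.tuple2` in the diff below); the real `AList` also covers `KList`, `Seq[M[T]]`, single values, and nested constructors, and `NatDemo` is an illustrative name, not part of sbt.

```scala
// A natural transformation: one polymorphic function M[T] => N[T] for all T.
trait ~>[M[_], N[_]] { def apply[T](m: M[T]): N[T] }

object NatDemo {
  // transform for the shape (M[A], M[B]), mirroring T2List.transform
  def transformT2[A, B, M[_], N[_]](t: (M[A], M[B]), f: M ~> N): (N[A], N[B]) =
    (f(t._1), f(t._2))

  // An example transformation: Option ~> List.
  val optToList: Option ~> List = new (Option ~> List) {
    def apply[T](m: Option[T]): List[T] = m.toList
  }

  def main(args: Array[String]): Unit =
    // elements keep their individual types (Int, String) while the
    // constructor changes from Option to List
    println(transformT2((Option(1), Option("x")), optToList))
}
```

The payoff of abstracting this into `AList` is that `INode` evaluation (the `MixedNode` change below) can be written once against `AList[K]` instead of once per arity.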
--- util/collection/AList.scala | 211 ++++++++++++++++++ util/collection/Classes.scala | 27 +++ util/collection/INode.scala | 21 +- util/collection/KList.scala | 103 ++++----- util/collection/Settings.scala | 44 ++-- util/collection/TypeFunctions.scala | 3 +- util/collection/Types.scala | 7 +- .../collection/src/test/scala/KListTest.scala | 19 -- util/collection/src/test/scala/PMapTest.scala | 5 +- .../src/test/scala/SettingsExample.scala | 2 +- 10 files changed, 308 insertions(+), 134 deletions(-) create mode 100644 util/collection/AList.scala create mode 100644 util/collection/Classes.scala delete mode 100644 util/collection/src/test/scala/KListTest.scala diff --git a/util/collection/AList.scala b/util/collection/AList.scala new file mode 100644 index 000000000..212a58411 --- /dev/null +++ b/util/collection/AList.scala @@ -0,0 +1,211 @@ +package sbt + + import Classes.Applicative + import Types._ + +/** An abstraction over (* -> *) -> * with the purpose of abstracting over arity abstractions like KList and TupleN +* as well as homogeneous sequences Seq[M[T]]. 
*/ +trait AList[K[L[x]] ] +{ + def transform[M[_], N[_]](value: K[M], f: M ~> N): K[N] + def traverse[M[_], N[_], P[_]](value: K[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[K[P]] + def foldr[M[_], A](value: K[M], f: (M[_], A) => A, init: A): A + + def toList[M[_]](value: K[M]): List[M[_]] = foldr[M, List[M[_]]](value, _ :: _, Nil) + def apply[M[_], C](value: K[M], f: K[Id] => C)(implicit a: Applicative[M]): M[C] = + a.map(f, traverse[M, M, Id](value, idK[M])(a)) +} +object AList +{ + type Empty = AList[({ type l[L[x]] = Unit})#l] + val empty: Empty = new Empty { + def transform[M[_], N[_]](in: Unit, f: M ~> N) = () + def foldr[M[_], T](in: Unit, f: (M[_], T) => T, init: T) = init + override def apply[M[_], C](in: Unit, f: Unit => C)(implicit app: Applicative[M]): M[C] = app.pure( f( () ) ) + def traverse[M[_], N[_], P[_]](in: Unit, f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Unit] = np.pure( () ) + } + + type SeqList[T] = AList[({ type l[L[x]] = List[L[T]] })#l] + def seq[T]: SeqList[T] = new SeqList[T] + { + def transform[M[_], N[_]](s: List[M[T]], f: M ~> N) = s.map(f.fn[T]) + def foldr[M[_], A](s: List[M[T]], f: (M[_], A) => A, init: A): A = (init /: s.reverse)( (t, m) => f(m,t)) + override def apply[M[_], C](s: List[M[T]], f: List[T] => C)(implicit ap: Applicative[M]): M[C] = + { + def loop[V](in: List[M[T]], g: List[T] => V): M[V] = + in match { + case Nil => ap.pure(g(Nil)) + case x :: xs => + val h = (ts: List[T]) => (t: T) => g(t :: ts) + ap.apply( loop(xs, h), x ) + } + loop(s, f) + } + def traverse[M[_], N[_], P[_]](s: List[M[T]], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[List[P[T]]] = ??? 
+ } + + def klist[KL[M[_]] <: KList[M] { type Transform[N[_]] = KL[N] }]: AList[KL] = new AList[KL] { + def transform[M[_], N[_]](k: KL[M], f: M ~> N) = k.transform(f) + def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) + override def apply[M[_], C](k: KL[M], f: KL[Id] => C)(implicit app: Applicative[M]): M[C] = k.apply(f)(app) + def traverse[M[_], N[_], P[_]](k: KL[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KL[P]] = k.traverse[N,P](f)(np) + } + + type Single[A] = AList[({ type l[L[x]] = L[A]})#l] + def single[A]: Single[A] = new Single[A] { + def transform[M[_], N[_]](a: M[A], f: M ~> N) = f(a) + def foldr[M[_], T](a: M[A], f: (M[_], T) => T, init: T): T = f(a, init) + def traverse[M[_], N[_], P[_]](a: M[A], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[P[A]] = f(a) + } + + + type ASplit[K[L[x]], B[x]] = AList[ ({ type l[L[x]] = K[ (L ∙ B)#l] })#l ] + def asplit[ K[L[x]], B[x] ](base: AList[K]): ASplit[K,B] = new ASplit[K, B] + { + type Split[ L[x] ] = K[ (L ∙ B)#l ] + def transform[M[_], N[_]](value: Split[M], f: M ~> N): Split[N] = + base.transform[(M ∙ B)#l, (N ∙ B)#l](value, nestCon[M,N,B](f)) + + def traverse[M[_], N[_], P[_]](value: Split[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Split[P]] = + { + val g = nestCon[M, (N ∙ P)#l, B](f) + base.traverse[(M ∙ B)#l, N, (P ∙ B)#l](value, g)(np) + } + + def foldr[M[_], A](value: Split[M], f: (M[_], A) => A, init: A): A = + base.foldr[(M ∙ B)#l, A](value, f, init) + } + + // TODO: auto-generate + sealed trait T2K[A,B] { type l[L[x]] = (L[A], L[B]) } + type T2List[A,B] = AList[T2K[A,B]#l] + def tuple2[A, B]: T2List[A,B] = new T2List[A,B] + { + type T2[M[_]] = (M[A], M[B]) + def transform[M[_], N[_]](t: T2[M], f: M ~> N): T2[N] = (f(t._1), f(t._2)) + def foldr[M[_], T](t: T2[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, init)) + def traverse[M[_], N[_], P[_]](t: T2[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T2[P]] = + { + val g = 
(Tuple2.apply[P[A], P[B]] _).curried + np.apply( np.map(g, f(t._1)), f(t._2) ) + } + } + + sealed trait T3K[A,B,C] { type l[L[x]] = (L[A], L[B], L[C]) } + type T3List[A,B,C] = AList[T3K[A,B,C]#l] + def tuple3[A, B, C]: T3List[A,B,C] = new T3List[A,B,C] + { + type T3[M[_]] = (M[A], M[B], M[C]) + def transform[M[_], N[_]](t: T3[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3)) + def foldr[M[_], T](t: T3[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, init))) + def traverse[M[_], N[_], P[_]](t: T3[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T3[P]] = + { + val g = (Tuple3.apply[P[A],P[B],P[C]] _).curried + np.apply( np.apply( np.map(g, f(t._1)), f(t._2) ), f(t._3) ) + } + } + + sealed trait T4K[A,B,C,D] { type l[L[x]] = (L[A], L[B], L[C], L[D]) } + type T4List[A,B,C,D] = AList[T4K[A,B,C,D]#l] + def tuple4[A, B, C, D]: T4List[A,B,C,D] = new T4List[A,B,C,D] + { + type T4[M[_]] = (M[A], M[B], M[C], M[D]) + def transform[M[_], N[_]](t: T4[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4)) + def foldr[M[_], T](t: T4[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, init)))) + def traverse[M[_], N[_], P[_]](t: T4[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T4[P]] = + { + val g = (Tuple4.apply[P[A], P[B], P[C], P[D]] _).curried + np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)) + } + } + + sealed trait T5K[A,B,C,D,E] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E]) } + type T5List[A,B,C,D,E] = AList[T5K[A,B,C,D,E]#l] + def tuple5[A, B, C, D, E]: T5List[A,B,C,D,E] = new T5List[A,B,C,D,E] { + type T5[M[_]] = (M[A], M[B], M[C], M[D], M[E]) + def transform[M[_], N[_]](t: T5[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5)) + def foldr[M[_], T](t: T5[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, init))))) + def traverse[M[_], N[_], P[_]](t: T5[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T5[P]] = + { + val g = 
(Tuple5.apply[P[A],P[B],P[C],P[D],P[E]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5) ) + } + } + + sealed trait T6K[A,B,C,D,E,F] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F]) } + type T6List[A,B,C,D,E,F] = AList[T6K[A,B,C,D,E,F]#l] + def tuple6[A, B, C, D, E, F]: T6List[A,B,C,D,E,F] = new T6List[A,B,C,D,E,F] { + type T6[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F]) + def transform[M[_], N[_]](t: T6[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6)) + def foldr[M[_], T](t: T6[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, init)))))) + def traverse[M[_], N[_], P[_]](t: T6[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T6[P]] = + { + val g = (Tuple6.apply[P[A],P[B],P[C],P[D],P[E],P[F]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)) + } + } + + sealed trait T7K[A,B,C,D,E,F,G] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G]) } + type T7List[A,B,C,D,E,F,G] = AList[T7K[A,B,C,D,E,F,G]#l] + def tuple7[A,B,C,D,E,F,G]: T7List[A,B,C,D,E,F,G] = new T7List[A,B,C,D,E,F,G] { + type T7[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G]) + def transform[M[_], N[_]](t: T7[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7)) + def foldr[M[_], T](t: T7[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, init))))))) + def traverse[M[_], N[_], P[_]](t: T7[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T7[P]] = + { + val g = (Tuple7.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)) + } + } + sealed trait T8K[A,B,C,D,E,F,G,H] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H]) } + type T8List[A,B,C,D,E,F,G,H] = 
AList[T8K[A,B,C,D,E,F,G,H]#l] + def tuple8[A,B,C,D,E,F,G,H]: T8List[A,B,C,D,E,F,G,H] = new T8List[A,B,C,D,E,F,G,H] { + type T8[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H]) + def transform[M[_], N[_]](t: T8[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8)) + def foldr[M[_], T](t: T8[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, init)))))))) + def traverse[M[_], N[_], P[_]](t: T8[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T8[P]] = + { + val g = (Tuple8.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)) + } + } + + sealed trait T9K[A,B,C,D,E,F,G,H,I] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I]) } + type T9List[A,B,C,D,E,F,G,H,I] = AList[T9K[A,B,C,D,E,F,G,H,I]#l] + def tuple9[A,B,C,D,E,F,G,H,I]: T9List[A,B,C,D,E,F,G,H,I] = new T9List[A,B,C,D,E,F,G,H,I] { + type T9[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I]) + def transform[M[_], N[_]](t: T9[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9)) + def foldr[M[_], T](t: T9[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, init))))))))) + def traverse[M[_], N[_], P[_]](t: T9[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T9[P]] = + { + val g = (Tuple9.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H],P[I]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)) + } + } + + sealed trait T10K[A,B,C,D,E,F,G,H,I,J] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J]) } + type T10List[A,B,C,D,E,F,G,H,I,J] = 
AList[T10K[A,B,C,D,E,F,G,H,I,J]#l] + def tuple10[A,B,C,D,E,F,G,H,I,J]: T10List[A,B,C,D,E,F,G,H,I,J] = new T10List[A,B,C,D,E,F,G,H,I,J] { + type T10[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J]) + def transform[M[_], N[_]](t: T10[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10)) + def foldr[M[_], T](t: T10[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, init)))))))))) + def traverse[M[_], N[_], P[_]](t: T10[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T10[P]] = + { + val g = (Tuple10.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H],P[I],P[J]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)) + } + } + + sealed trait T11K[A,B,C,D,E,F,G,H,I,J,K] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J], L[K]) } + type T11List[A,B,C,D,E,F,G,H,I,J,K] = AList[T11K[A,B,C,D,E,F,G,H,I,J,K]#l] + def tuple11[A,B,C,D,E,F,G,H,I,J,K]: T11List[A,B,C,D,E,F,G,H,I,J,K] = new T11List[A,B,C,D,E,F,G,H,I,J,K] { + type T11[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J], M[K]) + def transform[M[_], N[_]](t: T11[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10), f(t._11)) + def foldr[M[_], T](t: T11[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, f(t._11,init))))))))))) + def traverse[M[_], N[_], P[_]](t: T11[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T11[P]] = + { + val g = (Tuple11.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H],P[I],P[J],P[K]] _ ).curried + np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), 
f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)), f(t._11)) + } + } +} diff --git a/util/collection/Classes.scala b/util/collection/Classes.scala new file mode 100644 index 000000000..74796c829 --- /dev/null +++ b/util/collection/Classes.scala @@ -0,0 +1,27 @@ +package sbt + +object Classes +{ + trait Applicative[M[_]] + { + def apply[S,T](f: M[S => T], v: M[S]): M[T] + def pure[S](s: => S): M[S] + def map[S, T](f: S => T, v: M[S]): M[T] + } + trait Monad[M[_]] extends Applicative[M] + { + def flatten[T](m: M[M[T]]): M[T] + } + implicit val optionMonad: Monad[Option] = new Monad[Option] { + def apply[S,T](f: Option[S => T], v: Option[S]) = (f, v) match { case (Some(fv), Some(vv)) => Some(fv(vv)); case _ => None } + def pure[S](s: => S) = Some(s) + def map[S, T](f: S => T, v: Option[S]) = v map f + def flatten[T](m: Option[Option[T]]): Option[T] = m.flatten + } + implicit val listMonad: Monad[List] = new Monad[List] { + def apply[S,T](f: List[S => T], v: List[S]) = for(fv <- f; vv <- v) yield fv(vv) + def pure[S](s: => S) = s :: Nil + def map[S, T](f: S => T, v: List[S]) = v map f + def flatten[T](m: List[List[T]]): List[T] = m.flatten + } +} \ No newline at end of file diff --git a/util/collection/INode.scala b/util/collection/INode.scala index 0e48324e3..1ac9152a2 100644 --- a/util/collection/INode.scala +++ b/util/collection/INode.scala @@ -3,7 +3,7 @@ package sbt import java.lang.Runnable import java.util.concurrent.{atomic, Executor, LinkedBlockingQueue} import atomic.{AtomicBoolean, AtomicInteger} - import Types.{:+:, Id} + import Types.{:+:, ConstK, Id} object EvaluationState extends Enumeration { val New, Blocked, Ready, Calling, Evaluated = Value @@ -24,8 +24,7 @@ abstract class EvaluateSettings[Scope] private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { def apply[T](i: Initialize[T]): INode[T] = i match { case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) - case a: Apply[hl,T] => new 
MixedNode(a.inputs transform transform, a.f) - case u: Uniform[s, T] => new UniformNode(u.inputs map transform.fn[s], u.f) + case a: Apply[k,T] => new MixedNode[k,T]( a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) case b: Bind[s,T] => new BindNode[s,T]( transform(b.in), x => transform(b.f(x))) case v: Value[T] => constant(v.value) case o: Optional[s,T] => o.a match { @@ -155,8 +154,9 @@ abstract class EvaluateSettings[Scope] protected def dependsOn: Seq[INode[_]] protected def evaluate0(): Unit } - private[this] def constant[T](f: () => T): INode[T] = new MixedNode[HNil, T](KNil, _ => f()) - private[this] def single[S,T](in: INode[S], f: S => T): INode[T] = new MixedNode[S :+: HNil, T](in :^: KNil, hl => f(hl.head)) + + private[this] def constant[T](f: () => T): INode[T] = new MixedNode[ConstK[Unit]#l, T]((), _ => f(), AList.empty) + private[this] def single[S,T](in: INode[S], f: S => T): INode[T] = new MixedNode[ ({ type l[L[x]] = L[S] })#l, T](in, f, AList.single[S]) private[this] final class BindNode[S,T](in: INode[S], f: S => INode[T]) extends INode[T] { protected def dependsOn = in :: Nil @@ -166,14 +166,9 @@ abstract class EvaluateSettings[Scope] setValue(value) } } - private[this] final class UniformNode[S,T](in: Seq[INode[S]], f: Seq[S] => T) extends INode[T] + private[this] final class MixedNode[K[L[x]], T](in: K[INode], f: K[Id] => T, alist: AList[K]) extends INode[T] { - protected def dependsOn = in - protected def evaluate0(): Unit = setValue( f(in.map(_.get)) ) - } - private[this] final class MixedNode[HL <: HList, T](in: KList[INode, HL], f: HL => T) extends INode[T] - { - protected def dependsOn = in.toList - protected def evaluate0(): Unit = setValue( f( in down getValue ) ) + protected def dependsOn = alist.toList(in) + protected def evaluate0(): Unit = setValue( f( alist.transform(in, getValue) ) ) } } diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 7b58aca32..70e3852f9 100644 --- 
a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -1,76 +1,49 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ package sbt -import Types._ + import Types._ + import Classes.Applicative -/** A higher-order heterogeneous list. It has a type constructor M[_] and -* type parameters HL. The underlying data is M applied to each type parameter. -* Explicitly tracking M[_] allows performing natural transformations or ensuring -* all data conforms to some common type. -* -* For background, see -* http://apocalisp.wordpress.com/2010/11/01/type-level-programming-in-scala-part-8a-klist%C2%A0motivation/ - */ -sealed trait KList[+M[_], HL <: HList] +/** Heterogeneous list with each element having type M[T] for some type T.*/ +sealed trait KList[+M[_]] { - type Raw = HL - /** Transform to the underlying HList type.*/ - def down(implicit ev: M ~> Id): HL - /** Apply a natural transformation. */ - def transform[N[_]](f: M ~> N): KList[N, HL] - /** Convert to a List. */ + type Transform[N[_]] <: KList[N] + + /** Apply the natural transformation `f` to each element. */ + def transform[N[_]](f: M ~> N): Transform[N] + + def foldr[T](f: (M[_], T) => T, init: T): T = init // had trouble defining it in KNil + def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] + def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] def toList: List[M[_]] - /** Convert to an HList. 
*/ - def combine[N[X] >: M[X]]: HL#Wrap[N] - - def foldr[P[_ <: HList],N[X] >: M[X]](f: KFold[N,P]): P[HL] } -trait KFold[M[_],P[_ <: HList]] +final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KList[M] { - def kcons[H,T <: HList](h: M[H], acc: P[T]): P[H :+: T] - def knil: P[HNil] -} + final type Transform[N[_]] = KCons[H, tail.Transform[N], N] -final case class KCons[H, T <: HList, +M[_]](head: M[H], tail: KList[M,T]) extends KList[M, H :+: T] + def transform[N[_]](f: M ~> N) = KCons(f(head), tail.transform(f)) + def toList: List[M[_]] = head :: tail.toList + def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] = + { + val g = (t: tail.Transform[Id]) => (h: H) =>f( KCons[H, tail.Transform[Id], Id](h, t) ) + ap.apply( tail.apply[N, H => Z](g), head ) + } + def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] = + { + val tt: N[tail.Transform[P]] = tail.traverse[N,P](f) + val g = (t: tail.Transform[P]) => (h: P[H]) => KCons(h, t) + np.apply(np.map(g, tt), f(head)) + } + def :^:[A,N[x] >: M[x]](h: N[A]) = KCons(h, this) + override def foldr[T](f: (M[_], T) => T, init: T): T = f(head, tail.foldr(f, init)) +} +sealed abstract class KNil extends KList[Nothing] { - def down(implicit f: M ~> Id) = HCons(f(head), tail down f) - def transform[N[_]](f: M ~> N) = KCons( f(head), tail transform f ) - // prepend - def :^: [N[X] >: M[X], G](g: N[G]) = KCons(g, this) - def toList = head :: tail.toList - - def combine[N[X] >: M[X]]: (H :+: T)#Wrap[N] = HCons(head, tail.combine) - - override def toString = head + " :^: " + tail.toString - - def foldr[P[_ <: HList],N[X] >: M[X]](f: KFold[N,P]) = f.kcons(head, tail foldr f) + final type Transform[N[_]] = KNil + final def transform[N[_]](f: Nothing ~> N): Transform[N] = KNil + final def toList = Nil + final def apply[N[x], Z](f: KNil => Z)(implicit ap: Applicative[N]): N[Z] = ap.pure(f(KNil)) + final def traverse[N[_], P[_]](f: Nothing ~> 
(N ∙ P)#l)(implicit np: Applicative[N]): N[KNil] = np.pure(KNil) } - -sealed class KNil extends KList[Nothing, HNil] -{ - def down(implicit f: Nothing ~> Id) = HNil - def transform[N[_]](f: Nothing ~> N) = KNil - def :^: [M[_], H](h: M[H]) = KCons(h, this) - def toList = Nil - def combine[N[X]] = HNil - override def foldr[P[_ <: HList],N[_]](f: KFold[N,P]) = f.knil - override def toString = "KNil" -} -object KNil extends KNil - -object KList -{ - // nicer alias for pattern matching - val :^: = KCons - - def fromList[M[_]](s: Seq[M[_]]): KList[M, _ <: HList] = if(s.isEmpty) KNil else KCons(s.head, fromList(s.tail)) - - // haven't found a way to convince scalac that KList[M, H :+: T] implies KCons[H,T,M] - // Therefore, this method exists to put the cast in one location. - implicit def kcons[H, T <: HList, M[_]](kl: KList[M, H :+: T]): KCons[H,T,M] = - kl.asInstanceOf[KCons[H,T,M]] - // haven't need this, but for symmetry with kcons: - implicit def knil[M[_]](kl: KList[M, HNil]): KNil = KNil +case object KNil extends KNil { + def :^:[M[_], H](h: M[H]): KCons[H, KNil, M] = KCons(h, this) } diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 6df7291af..981096c0b 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -61,10 +61,12 @@ trait Init[Scope] def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) def value[T](value: => T): Initialize[T] = new Value(value _) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, app(key :^: KNil)(hl => f(hl.head)), NoPosition) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, map(key)(f), NoPosition) def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) - def app[HL <: HList, T](inputs: KList[Initialize, 
HL])(f: HL => T): Initialize[T] = new Apply(f, inputs) - def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Uniform(f, inputs) + def map[S,T](in: Initialize[S])(f: S => T): Initialize[T] = new Apply[ ({ type l[L[x]] = L[S] })#l, T](f, in, AList.single[S]) + def app[K[L[x]], T](inputs: K[Initialize])(f: K[Id] => T)(implicit alist: AList[K]): Initialize[T] = new Apply[K, T](f, inputs, alist) + def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = + new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates) def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { @@ -219,11 +221,12 @@ trait Init[Scope] def apply[S](g: T => S): Initialize[S] def mapReferenced(g: MapScoped): Initialize[T] def validateReferenced(g: ValidateRef): ValidatedInit[T] - def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipWith(o)((x,y) => (x,y)) - def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = - new Apply[T :+: S :+: HNil, U]( { case t :+: s :+: HNil => f(t,s)}, this :^: o :^: KNil) def mapConstant(g: MapConstant): Initialize[T] def evaluate(map: Settings[Scope]): T + def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipTupled(o)(idFun) + def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = zipTupled(o)(f.tupled) + private[this] def zipTupled[S,U](o: Initialize[S])(f: ((T,S)) => U): Initialize[U] = + new Apply[({ type l[L[x]] = (L[T], L[S]) })#l, U](f, (this, o), AList.tuple2[T,S]) } object Initialize { @@ -330,34 +333,21 @@ trait Init[Scope] def mapConstant(g: MapConstant) = this def evaluate(map: Settings[Scope]): T = value() } - private[sbt] final class Apply[HL <: HList, T](val f: HL => T, val inputs: KList[Initialize, HL]) extends Initialize[T] + private[sbt] final class Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) 
extends Initialize[T] { - def dependencies = deps(inputs.toList) + def dependencies = deps(alist.toList(inputs)) def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) - def apply[S](g: T => S) = new Apply(g compose f, inputs) + def apply[S](g: T => S) = new Apply(g compose f, inputs, alist) def mapConstant(g: MapConstant) = mapInputs( mapConstantT(g) ) - def mapInputs(g: Initialize ~> Initialize): Initialize[T] = new Apply(f, inputs transform g) - def evaluate(ss: Settings[Scope]) = f(inputs down evaluateT(ss)) + def mapInputs(g: Initialize ~> Initialize): Initialize[T] = new Apply(f, alist.transform(inputs, g), alist) + def evaluate(ss: Settings[Scope]) = f(alist.transform(inputs, evaluateT(ss))) def validateReferenced(g: ValidateRef) = { - val tx = inputs transform validateReferencedT(g) - val undefs = tx.toList.flatMap(_.left.toSeq.flatten) + val tx = alist.transform(inputs, validateReferencedT(g)) + val undefs = alist.toList(tx).flatMap(_.left.toSeq.flatten) val get = new (ValidatedInit ~> Initialize) { def apply[T](vr: ValidatedInit[T]) = vr.right.get } - if(undefs.isEmpty) Right(new Apply(f, tx transform get)) else Left(undefs) + if(undefs.isEmpty) Right(new Apply(f, alist.transform(tx, get), alist)) else Left(undefs) } } - private[sbt] final class Uniform[S, T](val f: Seq[S] => T, val inputs: Seq[Initialize[S]]) extends Initialize[T] - { - def dependencies = deps(inputs) - def mapReferenced(g: MapScoped) = new Uniform(f, inputs map mapReferencedT(g).fn) - def validateReferenced(g: ValidateRef) = - { - val (undefs, ok) = Util.separateE(inputs map validateReferencedT(g).fn ) - if(undefs.isEmpty) Right( new Uniform(f, ok) ) else Left(undefs.flatten) - } - def apply[S](g: T => S) = new Uniform(g compose f, inputs) - def mapConstant(g: MapConstant) = new Uniform(f, inputs map mapConstantT(g).fn) - def evaluate(ss: Settings[Scope]) = f(inputs map evaluateT(ss).fn ) - } private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) } diff --git 
a/util/collection/TypeFunctions.scala b/util/collection/TypeFunctions.scala index bbfad9b8e..6a4978750 100644 --- a/util/collection/TypeFunctions.scala +++ b/util/collection/TypeFunctions.scala @@ -7,6 +7,7 @@ trait TypeFunctions { type Id[X] = X sealed trait Const[A] { type Apply[B] = A } + sealed trait ConstK[A] { type l[L[x]] = A } sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } sealed trait ∙[A[_], B[_]] { type l[T] = A[B[T]] } sealed trait P1of2[M[_,_], A] { type Apply[B] = M[A,B]; type Flip[B] = M[B, A] } @@ -16,6 +17,7 @@ trait TypeFunctions final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } final def idFun[T] = (t: T) => t final def const[A,B](b: B): A=> B = _ => b + final def idK[M[_]]: M ~> M = new (M ~> M) { def apply[T](m: M[T]): M[T] = m } def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. castless version: @@ -26,7 +28,6 @@ trait TypeFunctions implicit def toFn1[A,B](f: A => B): Fn1[A,B] = new Fn1[A,B] { def ∙[C](g: C => A) = f compose g } - def idK[M[_]]: M ~> M = new (M ~> M) { def apply[T](m: M[T]): M[T] = m } type Endo[T] = T=>T type ~>|[A[_],B[_]] = A ~> Compose[Option, B]#Apply diff --git a/util/collection/Types.scala b/util/collection/Types.scala index 42b81f990..d3a3420b0 100644 --- a/util/collection/Types.scala +++ b/util/collection/Types.scala @@ -4,15 +4,10 @@ package sbt object Types extends Types -{ - implicit def hconsToK[M[_], H, T <: HList](h: M[H] :+: T)(implicit mt: T => KList[M, T]): KList[M, H :+: T] = - KCons[H, T, M](h.head, mt(h.tail) ) - implicit def hnilToK(hnil: HNil): KNil = KNil -} trait Types extends TypeFunctions { val :^: = KCons - val :+: = HCons type :+:[H, T <: HList] = HCons[H,T] + val :+: = HCons } diff --git a/util/collection/src/test/scala/KListTest.scala b/util/collection/src/test/scala/KListTest.scala deleted file mode 100644 index 2ca25a31a..000000000 --- 
a/util/collection/src/test/scala/KListTest.scala +++ /dev/null @@ -1,19 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt - -import Types._ - -object KTest { - val f = new (Option ~> List) { def apply[T](o: Option[T]): List[T] = o.toList } - - val x = Some(3) :^: Some("asdf") :^: KNil - val y = x transform f - val m1a = y match { case List(3) :^: List("asdf") :^: KNil => println("true") } - val m1b = (List(3) :^: KNil) match { case yy :^: KNil => println("true") } - - val head = new (List ~> Id) { def apply[T](xs: List[T]): T = xs.head } - val z = y down head - val m2 = z match { case 3 :+: "asdf" :+: HNil => println("true") } -} diff --git a/util/collection/src/test/scala/PMapTest.scala b/util/collection/src/test/scala/PMapTest.scala index bac4b7364..7970e175e 100644 --- a/util/collection/src/test/scala/PMapTest.scala +++ b/util/collection/src/test/scala/PMapTest.scala @@ -13,6 +13,7 @@ object PMapTest mp(Some(3)) = 9 val x = Some(3) :^: Some("asdf") :^: KNil val y = x.transform[Id](mp) - val z = y.down - z match { case 9 :+: "a" :+: HNil => println("true") } + assert(y.head == 9) + assert(y.tail.head == "a") + assert(y.tail.tail == KNil) } \ No newline at end of file diff --git a/util/collection/src/test/scala/SettingsExample.scala b/util/collection/src/test/scala/SettingsExample.scala index 558de7f4a..637f0ad51 100644 --- a/util/collection/src/test/scala/SettingsExample.scala +++ b/util/collection/src/test/scala/SettingsExample.scala @@ -49,7 +49,7 @@ object SettingsUsage // Define some settings val mySettings: Seq[Setting[_]] = Seq( setting( a3, value( 3 ) ), - setting( b4, app(a4 :^: KNil) { case av :+: HNil => av * 3 } ), + setting( b4, map(a4)(_ * 3)), update(a5)(_ + 1) ) From c95df4681bf1ae03348bf08640c62b9d25aeb47c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 31 Jul 2012 11:52:10 -0400 Subject: [PATCH 295/823] task setting macros for :=, +=, ++= also, bump to 2.10.0-M6 --- util/appmacro/ContextUtil.scala | 104 
++++++++++++ util/appmacro/Instance.scala | 260 ++++++++++++++++++++++++++++++ util/appmacro/KListBuilder.scala | 58 +++++++ util/appmacro/MixedBuilder.scala | 16 ++ util/appmacro/TupleBuilder.scala | 56 +++++++ util/appmacro/TupleNBuilder.scala | 51 ++++++ util/collection/AList.scala | 11 +- util/collection/KList.scala | 7 + util/collection/Settings.scala | 4 +- 9 files changed, 563 insertions(+), 4 deletions(-) create mode 100644 util/appmacro/ContextUtil.scala create mode 100644 util/appmacro/Instance.scala create mode 100644 util/appmacro/KListBuilder.scala create mode 100644 util/appmacro/MixedBuilder.scala create mode 100644 util/appmacro/TupleBuilder.scala create mode 100644 util/appmacro/TupleNBuilder.scala diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/ContextUtil.scala new file mode 100644 index 000000000..31f61c356 --- /dev/null +++ b/util/appmacro/ContextUtil.scala @@ -0,0 +1,104 @@ +package sbt +package appmacro + + import scala.reflect._ + import makro._ + import scala.tools.nsc.Global + +object ContextUtil { + /** Constructs an object with utility methods for operating in the provided macro context `c`. + * Callers should explicitly specify the type parameter as `c.type` in order to preserve the path dependent types. */ + def apply[C <: Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) +} + +/** Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). +* This is not thread safe due to the underlying Context and related data structures not being thread safe. +* Use `ContextUtil[c.type](c)` to construct. 
*/ +final class ContextUtil[C <: Context with Singleton](val ctx: C) +{ + import ctx.universe.{Apply=>ApplyTree,_} + + val alistType = ctx.typeOf[AList[KList]] + val alist: Symbol = alistType.typeSymbol.companionSymbol + val alistTC: Type = alistType.typeConstructor + + /** Modifiers for a local val.*/ + val localModifiers = Modifiers(NoFlags) + + def getPos(sym: Symbol) = if(sym eq null) NoPosition else sym.pos + + /** Constructs a unique term name with the given prefix within this Context. + * (The current implementation uses Context.fresh, which increments an internal counter.)*/ + def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) + + def typeTree(tpe: Type) = TypeTree().setType(tpe) + + /** Constructs a new, local ValDef with the given Type, a unique name, + * the same position as `sym`, and an empty implementation (no rhs). */ + def freshValDef(tpe: Type, sym: Symbol): ValDef = + { + val vd = localValDef(typeTree(tpe), EmptyTree) + vd setPos getPos(sym) + vd + } + + /** Constructs a ValDef with local modifiers and a unique name. */ + def localValDef(tpt: Tree, rhs: Tree): ValDef = + ValDef(localModifiers, freshTermName("q"), tpt, rhs) + + /** Constructs a tuple value of the right TupleN type from the provided inputs.*/ + def mkTuple(args: List[Tree]): Tree = + { + val global: Global = ctx.universe.asInstanceOf[Global] + global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] + } + + /** Creates a new, synthetic type variable with the specified `owner`. */ + def newTypeVariable(owner: Symbol): Symbol = + { + val global: Global = ctx.universe.asInstanceOf[Global] + owner.asInstanceOf[global.Symbol].newSyntheticTypeParam().asInstanceOf[ctx.universe.Symbol] + } + /** The type representing the type constructor `[X] X` */ + val idTC: Type = + { + val tvar = newTypeVariable(NoSymbol) + polyType(tvar :: Nil, refVar(tvar)) + } + /** Constructs a new, synthetic type variable that is a type constructor. 
For example, in type Y[L[x]], L is such a type variable. */ + def newTCVariable(owner: Symbol): Symbol = + { + val global: Global = ctx.universe.asInstanceOf[Global] + val tc = owner.asInstanceOf[global.Symbol].newSyntheticTypeParam() + val arg = tc.newSyntheticTypeParam("x", 0L) + tc.setInfo(global.PolyType(arg :: Nil, global.TypeBounds.empty)).asInstanceOf[ctx.universe.Symbol] + } + /** Returns the Symbol that references the statically accessible singleton `i`. */ + def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = + it.tpe match { + case SingleType(_, sym) if !sym.isFreeTerm && sym.isStatic => sym + case x => error("Instance must be static (was " + x + ").") + } + /** Constructs a Type that references the given type variable. */ + def refVar(variable: Symbol): Type = typeRef(NoPrefix, variable, Nil) + + /** Returns the symbol for the non-private method named `name` for the class/module `obj`. */ + def method(obj: Symbol, name: String): Symbol = { + val global: Global = ctx.universe.asInstanceOf[Global] + obj.asInstanceOf[global.Symbol].info.nonPrivateMember(global.newTermName(name)).asInstanceOf[ctx.universe.Symbol] + } + + /** Returns a Type representing the type constructor `tcp`. For example, given + * `object Demo { type M[x] = List[x] }`, the call `extractTC(Demo, "M")` will return a type representing + * the type constructor `[x] List[x]`. 
+ **/ + def extractTC(tcp: AnyRef with Singleton, name: String)(implicit it: ctx.TypeTag[tcp.type]): ctx.Type = + { + val global: Global = ctx.universe.asInstanceOf[Global] + val itTpe = it.tpe.asInstanceOf[global.Type] + val m = itTpe.nonPrivateMember(global.newTypeName(name)) + val tc = itTpe.memberInfo(m).asInstanceOf[ctx.universe.Type] + assert(tc != NoType && tc.isHigherKinded, "Invalid type constructor: " + tc) + tc + } +} \ No newline at end of file diff --git a/util/appmacro/Instance.scala b/util/appmacro/Instance.scala new file mode 100644 index 000000000..05a80b4e8 --- /dev/null +++ b/util/appmacro/Instance.scala @@ -0,0 +1,260 @@ +package sbt +package appmacro + + import Classes.Applicative + import Types.Id + +/** The separate hierarchy from Applicative/Monad is for two reasons. +* +* 1. The type constructor is represented as an abstract type because a TypeTag cannot represent a type constructor directly. +* 2. The applicative interface is uncurried. +*/ +trait Instance +{ + type M[x] + def app[K[L[x]], Z](in: K[M], f: K[Id] => Z)(implicit a: AList[K]): M[Z] + def map[S,T](in: M[S], f: S => T): M[T] + def pure[T](t: () => T): M[T] +} +trait Convert +{ + def apply[T: c.TypeTag](c: scala.reflect.makro.Context)(in: c.Tree): c.Tree +} +trait MonadInstance extends Instance +{ + def flatten[T](in: M[M[T]]): M[T] +} +object InputWrapper +{ + def wrap[T](in: Any): T = error("This method is an implementation detail and should not be referenced.") +} + + import scala.reflect._ + import makro._ + +object Instance +{ + final val DynamicDependencyError = "Illegal dynamic dependency." + final val DynamicReferenceError = "Illegal dynamic reference." 
+ final val ApplyName = "app" + final val FlattenName = "flatten" + final val PureName = "pure" + final val MapName = "map" + final val InstanceTCName = "M" + final val WrapName = "wrap" + + final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef) + + /** Implementation of a macro that provides a direct syntax for applicative functors and monads. + * It is intended to be used in conjunction with another macro that conditions the inputs. + * + * This method processes the Tree `t` to find inputs of the form `InputWrapper.wrap[T]( input )` + * This form is typically constructed by another macro that pretends to be able to get a value of type `T` + * from a value convertible to `M[T]`. This `wrap(input)` form has two main purposes. + * First, it identifies the inputs that should be transformed. + * Second, it allows the input trees to be wrapped for later conversion into the appropriate `M[T]` type by `convert`. + * This wrapping is necessary because applying the first macro must preserve the original type, + * but it is useful to delay conversion until the outer, second macro is called. The `wrap` method accomplishes this by + * allowing the original `Tree` and `Type` to be hidden behind the raw `T` type. This method will remove the call to `wrap` + * so that it is not actually called at runtime. + * + * Each `input` in each expression of the form `InputWrapper.wrap[T]( input )` is transformed by `convert`. + * This transformation converts the input Tree to a Tree of type `M[T]`. + * The original wrapped expression `wrap(input)` is replaced by a reference to a new local `val $x: T`, where `$x` is a fresh name. + * These converted inputs are passed to `builder` as well as the list of these synthetic `ValDef`s. + * The `TupleBuilder` instance constructs a tuple (Tree) from the inputs and defines the right hand side of the vals + * that unpacks the tuple containing the results of the inputs. 
+ * + * The constructed tuple of inputs and the code that unpacks the results of the inputs are then passed to `i`, + * which is an implementation of `Instance` that is statically accessible. + * An Instance defines an applicative functor associated with a specific type constructor and, if it implements MonadInstance as well, a monad. + * Typically, it will be either a top-level module or a stable member of a top-level module (such as a val or a nested module). + * The `with Singleton` part of the type verifies some cases at macro compilation time, + * while the full check for static accessibility is done at macro expansion time. + * Note: Ideally, the types would verify that `i: MonadInstance` when `t.isRight`. + * With the various dependent types involved, this is not worth it. + * + * The `t` argument is the argument of the macro that will be transformed as described above. + * If the macro that calls this method is for a multi-input map (app followed by map), + * `t` should be the argument wrapped in Left. + * If this is for multi-input flatMap (app followed by flatMap), + * this should be the argument wrapped in Right. 
+ */ + def contImpl[T: c.TypeTag](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]])( + implicit tt: c.TypeTag[T], mt: c.TypeTag[i.M[T]], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = + { + import c.universe.{Apply=>ApplyTree,_} + + import scala.tools.nsc.Global + // Used to access compiler methods not yet exposed via the reflection/macro APIs + val global: Global = c.universe.asInstanceOf[Global] + + val util = ContextUtil[c.type](c) + val mTC: Type = util.extractTC(i, InstanceTCName) + + // the tree for the macro argument + val (tree, treeType) = t match { + case Left(l) => (l.tree, tt.tpe.normalize) + case Right(r) => (r.tree, mt.tpe.normalize) + } + + val instanceSym = util.singleton(i) + // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... + val instance = Ident(instanceSym) + + val parameterModifiers = Modifiers(Flag.PARAM) + + val wrapperSym = util.singleton(InputWrapper) + val wrapMethodSymbol = util.method(wrapperSym, WrapName) + def isWrapper(fun: Tree) = fun.symbol == wrapMethodSymbol + + type In = Input[c.universe.type] + var inputs = List[In]() + + // constructs a ValDef with a parameter modifier, a unique name, with the provided Type and with an empty rhs + def freshMethodParameter(tpe: Type): ValDef = + ValDef(parameterModifiers, freshTermName("p"), typeTree(tpe), EmptyTree) + + def freshTermName(prefix: String) = newTermName(c.fresh("$" + prefix)) + def typeTree(tpe: Type) = TypeTree().setType(tpe) + + // constructs a function that applies f to each subtree of the input tree + def visitor(f: Tree => Unit): Tree => Unit = + { + val v: Transformer = new Transformer { + override def transform(tree: Tree): Tree = { f(tree); super.transform(tree) } + } + (tree: Tree) => v.transform(tree) + } + + /* Local definitions in the macro. 
This is used to ensure + * references are to M instances defined outside of the macro call.*/ + val defs = new collection.mutable.HashSet[Symbol] + + // a reference is illegal if it is to an M instance defined within the scope of the macro call + def illegalReference(sym: Symbol): Boolean = + sym != null && sym != NoSymbol && defs.contains(sym) + + // a function that checks the provided tree for illegal references to M instances defined in the + // expression passed to the macro and for illegal dereferencing of M instances. + val checkQual = visitor { + case s @ ApplyTree(fun, qual :: Nil) => if(isWrapper(fun)) c.error(s.pos, DynamicDependencyError) + case id @ Ident(name) if illegalReference(id.symbol) => c.error(id.pos, DynamicReferenceError) + case _ => () + } + // adds the symbols for all non-Ident subtrees to `defs`. + val defSearch = visitor { + case _: Ident => () + case tree => if(tree.symbol ne null) defs += tree.symbol; + } + + // transforms the original tree into calls to the Instance functions pure, map, ..., + // resulting in a value of type M[T] + def makeApp(body: Tree): Tree = + inputs match { + case Nil => pure(body) + case x :: Nil => single(body, x) + case xs => arbArity(body, xs) + } + + // no inputs, so construct M[T] via Instance.pure or pure+flatten + def pure(body: Tree): Tree = + { + val typeApplied = TypeApply(Select(instance, PureName), typeTree(treeType) :: Nil) + val p = ApplyTree(typeApplied, Function(Nil, body) :: Nil) + if(t.isLeft) p else flatten(p) + } + // m should have type M[M[T]] + // the returned Tree will have type M[T] + def flatten(m: Tree): Tree = + { + val typedFlatten = TypeApply(Select(instance, FlattenName), typeTree(tt.tpe) :: Nil) + ApplyTree(typedFlatten, m :: Nil) + } + + // calls Instance.map or flatmap directly, skipping the intermediate Instance.app that is unnecessary for a single input + def single(body: Tree, input: In): Tree = + { + val variable = input.local + val param = ValDef(parameterModifiers, 
variable.name, variable.tpt, EmptyTree) + val typeApplied = TypeApply(Select(instance, MapName), variable.tpt :: typeTree(treeType) :: Nil) + val mapped = ApplyTree(typeApplied, input.expr :: Function(param :: Nil, body) :: Nil) + if(t.isLeft) mapped else flatten(mapped) + } + + // calls Instance.app to get the values for all inputs and then calls Instance.map or flatMap to evaluate the body + def arbArity(body: Tree, inputs: List[In]): Tree = + { + val result = builder.make(c)(mTC, inputs) + val param = freshMethodParameter( appliedType(result.representationC, util.idTC :: Nil) ) + val bindings = result.extract(param) + val f = Function(param :: Nil, Block(bindings, body)) + val ttt = typeTree(treeType) + val typedApp = TypeApply(Select(instance, ApplyName), typeTree(result.representationC) :: ttt :: Nil) + val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) + if(t.isLeft) app else flatten(app) + } + + // called when transforming the tree to add an input + // for `qual` of type M[A], and a selection qual.value, + // the call is addType(Type A, Tree qual) + // the result is a Tree representing a reference to + // the bound value of the input + def addType(tpe: Type, qual: Tree): Tree = + { + checkQual(qual) + val vd = util.freshValDef(tpe, qual.symbol) + inputs ::= new Input(tpe, qual, vd) + Ident(vd.name) + } + + // the main tree transformer that replaces calls to InputWrapper.wrap(x) with + // plain Idents that reference the actual input value + object appTransformer extends Transformer + { + override def transform(tree: Tree): Tree = + tree match + { + case ApplyTree(TypeApply(fun, t :: Nil), qual :: Nil) if isWrapper(fun) => + val tag = c.TypeTag(t.tpe) + addType(t.tpe, convert(c)(qual)(tag) ) + case _ => super.transform(tree) + } + } + + // collects all definitions in the tree. 
used for finding illegal references + defSearch(tree) + + // applies the transformation + // resetting attributes: a) must be local b) must be done + // on the transformed tree and not the wrapped tree or else there are obscure errors + val tr = makeApp( c.resetLocalAttrs(appTransformer.transform(tree)) ) + c.Expr[i.M[T]](tr) + } + + import Types._ + + implicit def applicativeInstance[A[_]](implicit ap: Applicative[A]): Instance { type M[x] = A[x] } = new Instance + { + type M[x] = A[x] + def app[ K[L[x]], Z ](in: K[A], f: K[Id] => Z)(implicit a: AList[K]) = a.apply[A,Z](in, f) + def map[S,T](in: A[S], f: S => T) = ap.map(f, in) + def pure[S](s: () => S): M[S] = ap.pure(s()) + } + + type AI[A[_]] = Instance { type M[x] = A[x] } + def compose[A[_], B[_]](implicit a: AI[A], b: AI[B]): Instance { type M[x] = A[B[x]] } = new Composed[A,B](a,b) + // made a public, named, unsealed class because of trouble with macros and inference when the Instance is not an object + class Composed[A[_], B[_]](a: AI[A], b: AI[B]) extends Instance + { + type M[x] = A[B[x]] + def pure[S](s: () => S): A[B[S]] = a.pure(() => b.pure(s)) + def map[S,T](in: M[S], f: S => T): M[T] = a.map(in, (bv: B[S]) => b.map(bv, f)) + def app[ K[L[x]], Z ](in: K[M], f: K[Id] => Z)(implicit alist: AList[K]): A[B[Z]] = + { + val g: K[B] => B[Z] = in => b.app[K, Z](in, f) + type Split[ L[x] ] = K[ (L ∙ B)#l ] + a.app[Split, B[Z]](in, g)(AList.asplit(alist)) + } + } +} diff --git a/util/appmacro/KListBuilder.scala b/util/appmacro/KListBuilder.scala new file mode 100644 index 000000000..5b658ea69 --- /dev/null +++ b/util/appmacro/KListBuilder.scala @@ -0,0 +1,58 @@ +package sbt +package appmacro + + import Types.Id + import scala.tools.nsc.Global + import scala.reflect._ + import makro._ + +/** A `TupleBuilder` that uses a KList as the tuple representation.*/ +object KListBuilder extends TupleBuilder +{ + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new 
BuilderResult[c.type] + { + val ctx: c.type = c + val util = ContextUtil[c.type](c) + import c.universe.{Apply=>ApplyTree,_} + import util._ + + val knilType = c.typeOf[KNil] + val knil = Ident(knilType.typeSymbol.companionSymbol) + val kconsTpe = c.typeOf[KCons[Int,KNil,List]] + val kcons = kconsTpe.typeSymbol.companionSymbol + val mTC: Type = mt.asInstanceOf[c.universe.Type] + val kconsTC: Type = kconsTpe.typeConstructor + + /** This is the L in the type function [L[x]] ... */ + val tcVariable: Symbol = newTCVariable(NoSymbol) + + /** Instantiates KCons[h, t <: KList[L], L], where L is the type constructor variable */ + def kconsType(h: Type, t: Type): Type = + appliedType(kconsTC, h :: t :: refVar(tcVariable) :: Nil) + + def bindKList(prev: ValDef, revBindings: List[ValDef], params: List[ValDef]): List[ValDef] = + params match + { + case ValDef(mods, name, tpt, _) :: xs => + val head = ValDef(mods, name, tpt, Select(Ident(prev.name), "head")) + val tail = localValDef(TypeTree(), Select(Ident(prev.name), "tail")) + val base = head :: revBindings + bindKList(tail, if(xs.isEmpty) base else tail :: base, xs) + case Nil => revBindings.reverse + } + + /** The input trees combined in a KList */ + val klist = (inputs :\ (knil: Tree))( (in, klist) => ApplyTree(kcons, in.expr, klist) ) + + /** The input types combined in a KList type. The main concern is tracking the heterogeneous types. + * The type constructor is tcVariable, so that it can be applied to [X] X or M later. + * When applied to `M`, this type gives the type of the `input` KList. 
*/ + val klistType: Type = (inputs :\ knilType)( (in, klist) => kconsType(in.tpe, klist) ) + + val representationC = PolyType(tcVariable :: Nil, klistType) + val resultType = appliedType(representationC, idTC :: Nil) + val input = klist + val alistInstance = TypeApply(Select(Ident(alist), "klist"), typeTree(representationC) :: Nil) + def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) + } +} \ No newline at end of file diff --git a/util/appmacro/MixedBuilder.scala b/util/appmacro/MixedBuilder.scala new file mode 100644 index 000000000..593f60382 --- /dev/null +++ b/util/appmacro/MixedBuilder.scala @@ -0,0 +1,16 @@ +package sbt +package appmacro + + import scala.reflect._ + import makro._ + +/** A builder that uses `TupleN` as the representation for small numbers of inputs (up to `TupleNBuilder.MaxInputs`) +* and `KList` for larger numbers of inputs. This builder cannot handle fewer than 2 inputs.*/ +object MixedBuilder extends TupleBuilder +{ + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = + { + val delegate = if(inputs.size > TupleNBuilder.MaxInputs) KListBuilder else TupleNBuilder + delegate.make(c)(mt, inputs) + } +} \ No newline at end of file diff --git a/util/appmacro/TupleBuilder.scala b/util/appmacro/TupleBuilder.scala new file mode 100644 index 000000000..f91d3c91c --- /dev/null +++ b/util/appmacro/TupleBuilder.scala @@ -0,0 +1,56 @@ +package sbt +package appmacro + + import Types.Id + import scala.tools.nsc.Global + import scala.reflect._ + import makro._ + +/** +* A `TupleBuilder` abstracts the work of constructing a tuple data structure such as a `TupleN` or `KList` +* and extracting values from it. The `Instance` macro implementation will (roughly) traverse the tree of its argument +* and ultimately obtain a list of expressions with type `M[T]` for different types `T`. 
+* The macro constructs an `Input` value for each of these expressions that contains the `Type` for `T`, +* the `Tree` for the expression, and a `ValDef` that will hold the value for the input. +* +* `TupleBuilder.apply` is provided with the list of `Input`s and is expected to provide three values in the returned BuilderResult. +* First, it returns the constructed tuple data structure Tree in `input`. +* Next, it provides the type constructor `representationC` that, when applied to M, gives the type of tuple data structure. +* For example, a builder that constructs a `Tuple3` for inputs `M[Int]`, `M[Boolean]`, and `M[String]` +* would provide a Type representing `[L[x]] (L[Int], L[Boolean], L[String])`. The `input` method +* would return a value whose type is that type constructor applied to M, or `(M[Int], M[Boolean], M[String])`. +* +* Finally, the `extract` method provides a list of vals that extract information from the applied input. +* The type of the applied input is the type constructor applied to `Id` (`[X] X`). +* The returned list of ValDefs should be the ValDefs from `inputs`, but with non-empty right-hand sides. +*/ +trait TupleBuilder { + /** A convenience alias for a list of inputs (associated with a Universe of type U). */ + type Inputs[U <: Universe with Singleton] = List[Instance.Input[U]] + + /** Constructs a one-time use Builder for Context `c` and type constructor `tcType`. */ + def make(c: Context)(tcType: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] +} + +trait BuilderResult[C <: Context with Singleton] +{ + val ctx: C + import ctx.universe._ + + /** Represents the higher-order type constructor `[L[x]] ...` where `...` is the + * type of the data structure containing the added expressions, + * except that it is abstracted over the type constructor applied to each heterogeneous part of the type . */ + def representationC: PolyType + + /** The instance of AList for the input. 
For a `representationC` of `[L[x]]`, this `Tree` should have a `Type` of `AList[L]`*/ + def alistInstance: Tree + + /** Returns the completed value containing all expressions added to the builder. */ + def input: Tree + + /* The list of definitions that extract values from a value of type `$representationC[Id]`. + * The returned value should be identical to the `ValDef`s provided to the `TupleBuilder.make` method but with + * non-empty right hand sides. Each `ValDef` may refer to `param` and previous `ValDef`s in the list.*/ + def extract(param: ValDef): List[ValDef] +} + diff --git a/util/appmacro/TupleNBuilder.scala b/util/appmacro/TupleNBuilder.scala new file mode 100644 index 000000000..ddf312f1b --- /dev/null +++ b/util/appmacro/TupleNBuilder.scala @@ -0,0 +1,51 @@ +package sbt +package appmacro + + import Types.Id + import scala.tools.nsc.Global + import scala.reflect._ + import makro._ + +/** A builder that uses a TupleN as the tuple representation. +* It is limited to tuples of size 2 to `MaxInputs`. */ +object TupleNBuilder extends TupleBuilder +{ + /** The largest number of inputs that this builder can handle. 
*/ + final val MaxInputs = 11 + final val TupleMethodName = "tuple" + + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] + { + val util = ContextUtil[c.type](c) + import c.universe.{Apply=>ApplyTree,_} + import util._ + + val global: Global = c.universe.asInstanceOf[Global] + val mTC: Type = mt.asInstanceOf[c.universe.Type] + + val ctx: c.type = c + val representationC: PolyType = { + val tcVariable: Symbol = newTCVariable(NoSymbol) + val tupleTypeArgs = inputs.map(in => typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) + val tuple = global.definitions.tupleType(tupleTypeArgs) + PolyType(tcVariable :: Nil, tuple.asInstanceOf[Type] ) + } + val resultType = appliedType(representationC, idTC :: Nil) + + val input: Tree = mkTuple(inputs.map(_.expr)) + val alistInstance: Tree = { + val select = Select(Ident(alist), TupleMethodName + inputs.size.toString) + TypeApply(select, inputs.map(in => typeTree(in.tpe))) + } + def extract(param: ValDef): List[ValDef] = bindTuple(param, Nil, inputs.map(_.local), 1) + + def bindTuple(param: ValDef, revBindings: List[ValDef], params: List[ValDef], i: Int): List[ValDef] = + params match + { + case ValDef(mods, name, tpt, _) :: xs => + val x = ValDef(mods, name, tpt, Select(Ident(param.name), "_" + i.toString)) + bindTuple(param, x :: revBindings, xs, i+1) + case Nil => revBindings.reverse + } + } +} diff --git a/util/collection/AList.scala b/util/collection/AList.scala index 212a58411..6e5946318 100644 --- a/util/collection/AList.scala +++ b/util/collection/AList.scala @@ -3,8 +3,9 @@ package sbt import Classes.Applicative import Types._ -/** An abstraction over (* -> *) -> * with the purpose of abstracting over arity abstractions like KList and TupleN -* as well as homogeneous sequences Seq[M[T]]. 
*/ +/** An abstraction over a higher-order type constructor `K[x[y]]` with the purpose of abstracting +* over heterogeneous sequences like `KList` and `TupleN` with elements with a common type +* constructor as well as homogeneous sequences `Seq[M[T]]`. */ trait AList[K[L[x]] ] { def transform[M[_], N[_]](value: K[M], f: M ~> N): K[N] @@ -18,6 +19,7 @@ trait AList[K[L[x]] ] object AList { type Empty = AList[({ type l[L[x]] = Unit})#l] + /** AList for Unit, which represents a sequence that is always empty.*/ val empty: Empty = new Empty { def transform[M[_], N[_]](in: Unit, f: M ~> N) = () def foldr[M[_], T](in: Unit, f: (M[_], T) => T, init: T) = init @@ -26,6 +28,7 @@ object AList } type SeqList[T] = AList[({ type l[L[x]] = List[L[T]] })#l] + /** AList for a homogeneous sequence. */ def seq[T]: SeqList[T] = new SeqList[T] { def transform[M[_], N[_]](s: List[M[T]], f: M ~> N) = s.map(f.fn[T]) @@ -44,6 +47,7 @@ object AList def traverse[M[_], N[_], P[_]](s: List[M[T]], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[List[P[T]]] = ??? } + /** AList for the abitrary arity data structure KList. */ def klist[KL[M[_]] <: KList[M] { type Transform[N[_]] = KL[N] }]: AList[KL] = new AList[KL] { def transform[M[_], N[_]](k: KL[M], f: M ~> N) = k.transform(f) def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) @@ -51,6 +55,7 @@ object AList def traverse[M[_], N[_], P[_]](k: KL[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KL[P]] = k.traverse[N,P](f)(np) } + /** AList for a single value. 
*/ type Single[A] = AList[({ type l[L[x]] = L[A]})#l] def single[A]: Single[A] = new Single[A] { def transform[M[_], N[_]](a: M[A], f: M ~> N) = f(a) @@ -58,8 +63,8 @@ object AList def traverse[M[_], N[_], P[_]](a: M[A], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[P[A]] = f(a) } - type ASplit[K[L[x]], B[x]] = AList[ ({ type l[L[x]] = K[ (L ∙ B)#l] })#l ] + /** AList that operates on the outer type constructor `A` of a composition `[x] A[B[x]]` for type constructors `A` and `B`*/ def asplit[ K[L[x]], B[x] ](base: AList[K]): ASplit[K,B] = new ASplit[K, B] { type Split[ L[x] ] = K[ (L ∙ B)#l ] diff --git a/util/collection/KList.scala b/util/collection/KList.scala index 70e3852f9..7ecc6ba6a 100644 --- a/util/collection/KList.scala +++ b/util/collection/KList.scala @@ -11,9 +11,16 @@ sealed trait KList[+M[_]] /** Apply the natural transformation `f` to each element. */ def transform[N[_]](f: M ~> N): Transform[N] + /** Folds this list using a function that operates on the homogeneous type of the elements of this list. */ def foldr[T](f: (M[_], T) => T, init: T): T = init // had trouble defining it in KNil + + /** Applies `f` to the elements of this list in the applicative functor defined by `ap`. */ def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] + + /** Equivalent to `transform(f) . apply(x => x)`, this is the essence of the iterator at the level of natural transformations.*/ def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] + + /** Discards the heterogeneous type information and constructs a plain List from this KList's elements. 
*/ def toList: List[M[_]] } final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KList[M] diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 981096c0b..783956f2a 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -59,7 +59,9 @@ trait Init[Scope] type MapConstant = ScopedKey ~> Option def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) - def value[T](value: => T): Initialize[T] = new Value(value _) + def valueStrict[T](value: T): Initialize[T] = pure(() => value) + def value[T](value: => T): Initialize[T] = pure(value _) + def pure[T](value: () => T): Initialize[T] = new Value(value) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, map(key)(f), NoPosition) def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) From 72a580d7e8447bc1628ea7be396325a7908b2487 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 1 Aug 2012 16:20:03 -0400 Subject: [PATCH 296/823] Task macro cleanup * use normal TypeTree constructor * remove unnecessary 'with Singleton' in macro utility * integrate changes suggested by @xeno-by * add refVar back and call asTypeConstructor instead of asType to refer to a type variable --- util/appmacro/ContextUtil.scala | 31 +++++++++++++--------------- util/appmacro/Instance.scala | 34 +++++++++---------------------- util/appmacro/KListBuilder.scala | 4 ++-- util/appmacro/TupleNBuilder.scala | 2 +- 4 files changed, 27 insertions(+), 44 deletions(-) diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/ContextUtil.scala index 31f61c356..8be859494 100644 --- a/util/appmacro/ContextUtil.scala +++ b/util/appmacro/ContextUtil.scala @@ -14,7 +14,7 @@ object ContextUtil { /** Utility methods for macros. 
Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). * This is not thread safe due to the underlying Context and related data structures not being thread safe. * Use `ContextUtil[c.type](c)` to construct. */ -final class ContextUtil[C <: Context with Singleton](val ctx: C) +final class ContextUtil[C <: Context](val ctx: C) { import ctx.universe.{Apply=>ApplyTree,_} @@ -31,13 +31,11 @@ final class ContextUtil[C <: Context with Singleton](val ctx: C) * (The current implementation uses Context.fresh, which increments*/ def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) - def typeTree(tpe: Type) = TypeTree().setType(tpe) - /** Constructs a new, local ValDef with the given Type, a unique name, * the same position as `sym`, and an empty implementation (no rhs). */ def freshValDef(tpe: Type, sym: Symbol): ValDef = { - val vd = localValDef(typeTree(tpe), EmptyTree) + val vd = localValDef(TypeTree(tpe), EmptyTree) vd setPos getPos(sym) vd } @@ -54,10 +52,10 @@ final class ContextUtil[C <: Context with Singleton](val ctx: C) } /** Creates a new, synthetic type variable with the specified `owner`. */ - def newTypeVariable(owner: Symbol): Symbol = + def newTypeVariable(owner: Symbol, prefix: String = "T0"): TypeSymbol = { val global: Global = ctx.universe.asInstanceOf[Global] - owner.asInstanceOf[global.Symbol].newSyntheticTypeParam().asInstanceOf[ctx.universe.Symbol] + owner.asInstanceOf[global.Symbol].newSyntheticTypeParam(prefix, 0L).asInstanceOf[ctx.universe.TypeSymbol] } /** The type representing the type constructor `[X] X` */ val idTC: Type = @@ -65,28 +63,27 @@ final class ContextUtil[C <: Context with Singleton](val ctx: C) val tvar = newTypeVariable(NoSymbol) polyType(tvar :: Nil, refVar(tvar)) } + /** A Type that references the given type variable. */ + def refVar(variable: TypeSymbol): Type = variable.asTypeConstructor /** Constructs a new, synthetic type variable that is a type constructor. 
For example, in type Y[L[x]], L is such a type variable. */ - def newTCVariable(owner: Symbol): Symbol = + def newTCVariable(owner: Symbol): TypeSymbol = { - val global: Global = ctx.universe.asInstanceOf[Global] - val tc = owner.asInstanceOf[global.Symbol].newSyntheticTypeParam() - val arg = tc.newSyntheticTypeParam("x", 0L) - tc.setInfo(global.PolyType(arg :: Nil, global.TypeBounds.empty)).asInstanceOf[ctx.universe.Symbol] + val tc = newTypeVariable(owner) + val arg = newTypeVariable(tc, "x") + tc.setTypeSignature(PolyType(arg :: Nil, emptyTypeBounds)) + tc } + def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.asType, definitions.AnyClass.asType) + /** Returns the Symbol that references the statically accessible singleton `i`. */ def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = it.tpe match { case SingleType(_, sym) if !sym.isFreeTerm && sym.isStatic => sym case x => error("Instance must be static (was " + x + ").") } - /** Constructs a Type that references the given type variable. */ - def refVar(variable: Symbol): Type = typeRef(NoPrefix, variable, Nil) /** Returns the symbol for the non-private method named `name` for the class/module `obj`. */ - def method(obj: Symbol, name: String): Symbol = { - val global: Global = ctx.universe.asInstanceOf[Global] - obj.asInstanceOf[global.Symbol].info.nonPrivateMember(global.newTermName(name)).asInstanceOf[ctx.universe.Symbol] - } + def method(obj: Symbol, name: String): Symbol = obj.typeSignature.nonPrivateMember(newTermName(name)) /** Returns a Type representing the type constructor tcp.. 
For example, given * `object Demo { type M[x] = List[x] }`, the call `extractTC(Demo, "M")` will return a type representing diff --git a/util/appmacro/Instance.scala b/util/appmacro/Instance.scala index 05a80b4e8..6dec0eabc 100644 --- a/util/appmacro/Instance.scala +++ b/util/appmacro/Instance.scala @@ -84,10 +84,6 @@ object Instance implicit tt: c.TypeTag[T], mt: c.TypeTag[i.M[T]], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = { import c.universe.{Apply=>ApplyTree,_} - - import scala.tools.nsc.Global - // Used to access compiler methods not yet exposed via the reflection/macro APIs - val global: Global = c.universe.asInstanceOf[Global] val util = ContextUtil[c.type](c) val mTC: Type = util.extractTC(i, InstanceTCName) @@ -113,19 +109,9 @@ object Instance // constructs a ValDef with a parameter modifier, a unique name, with the provided Type and with an empty rhs def freshMethodParameter(tpe: Type): ValDef = - ValDef(parameterModifiers, freshTermName("p"), typeTree(tpe), EmptyTree) + ValDef(parameterModifiers, freshTermName("p"), TypeTree(tpe), EmptyTree) def freshTermName(prefix: String) = newTermName(c.fresh("$" + prefix)) - def typeTree(tpe: Type) = TypeTree().setType(tpe) - - // constructs a function that applies f to each subtree of the input tree - def visitor(f: Tree => Unit): Tree => Unit = - { - val v: Transformer = new Transformer { - override def transform(tree: Tree): Tree = { f(tree); super.transform(tree) } - } - (tree: Tree) => v.transform(tree) - } /* Local definitions in the macro. This is used to ensure * references are to M instances defined outside of the macro call.*/ @@ -137,13 +123,13 @@ object Instance // a function that checks the provided tree for illegal references to M instances defined in the // expression passed to the macro and for illegal dereferencing of M instances. 
- val checkQual = visitor { + val checkQual: Tree => Unit = { case s @ ApplyTree(fun, qual :: Nil) => if(isWrapper(fun)) c.error(s.pos, DynamicDependencyError) case id @ Ident(name) if illegalReference(id.symbol) => c.error(id.pos, DynamicReferenceError) case _ => () } // adds the symbols for all non-Ident subtrees to `defs`. - val defSearch = visitor { + val defSearch: Tree => Unit = { case _: Ident => () case tree => if(tree.symbol ne null) defs += tree.symbol; } @@ -160,7 +146,7 @@ object Instance // no inputs, so construct M[T] via Instance.pure or pure+flatten def pure(body: Tree): Tree = { - val typeApplied = TypeApply(Select(instance, PureName), typeTree(treeType) :: Nil) + val typeApplied = TypeApply(Select(instance, PureName), TypeTree(treeType) :: Nil) val p = ApplyTree(typeApplied, Function(Nil, body) :: Nil) if(t.isLeft) p else flatten(p) } @@ -168,7 +154,7 @@ object Instance // the returned Tree will have type M[T] def flatten(m: Tree): Tree = { - val typedFlatten = TypeApply(Select(instance, FlattenName), typeTree(tt.tpe) :: Nil) + val typedFlatten = TypeApply(Select(instance, FlattenName), TypeTree(tt.tpe) :: Nil) ApplyTree(typedFlatten, m :: Nil) } @@ -177,7 +163,7 @@ object Instance { val variable = input.local val param = ValDef(parameterModifiers, variable.name, variable.tpt, EmptyTree) - val typeApplied = TypeApply(Select(instance, MapName), variable.tpt :: typeTree(treeType) :: Nil) + val typeApplied = TypeApply(Select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) val mapped = ApplyTree(typeApplied, input.expr :: Function(param :: Nil, body) :: Nil) if(t.isLeft) mapped else flatten(mapped) } @@ -189,8 +175,8 @@ object Instance val param = freshMethodParameter( appliedType(result.representationC, util.idTC :: Nil) ) val bindings = result.extract(param) val f = Function(param :: Nil, Block(bindings, body)) - val ttt = typeTree(treeType) - val typedApp = TypeApply(Select(instance, ApplyName), typeTree(result.representationC) :: 
ttt :: Nil) + val ttt = TypeTree(treeType) + val typedApp = TypeApply(Select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) if(t.isLeft) app else flatten(app) } @@ -202,7 +188,7 @@ object Instance // the bound value of the input def addType(tpe: Type, qual: Tree): Tree = { - checkQual(qual) + qual.foreach(checkQual) val vd = util.freshValDef(tpe, qual.symbol) inputs ::= new Input(tpe, qual, vd) Ident(vd.name) @@ -223,7 +209,7 @@ object Instance } // collects all definitions in the tree. used for finding illegal references - defSearch(tree) + tree.foreach(defSearch) // applies the transformation // resetting attributes: a) must be local b) must be done diff --git a/util/appmacro/KListBuilder.scala b/util/appmacro/KListBuilder.scala index 5b658ea69..b57a39449 100644 --- a/util/appmacro/KListBuilder.scala +++ b/util/appmacro/KListBuilder.scala @@ -24,7 +24,7 @@ object KListBuilder extends TupleBuilder val kconsTC: Type = kconsTpe.typeConstructor /** This is the L in the type function [L[x]] ... 
*/ - val tcVariable: Symbol = newTCVariable(NoSymbol) + val tcVariable: TypeSymbol = newTCVariable(NoSymbol) /** Instantiates KCons[h, t <: KList[L], L], where L is the type constructor variable */ def kconsType(h: Type, t: Type): Type = @@ -52,7 +52,7 @@ object KListBuilder extends TupleBuilder val representationC = PolyType(tcVariable :: Nil, klistType) val resultType = appliedType(representationC, idTC :: Nil) val input = klist - val alistInstance = TypeApply(Select(Ident(alist), "klist"), typeTree(representationC) :: Nil) + val alistInstance = TypeApply(Select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) } } \ No newline at end of file diff --git a/util/appmacro/TupleNBuilder.scala b/util/appmacro/TupleNBuilder.scala index ddf312f1b..4f034adae 100644 --- a/util/appmacro/TupleNBuilder.scala +++ b/util/appmacro/TupleNBuilder.scala @@ -35,7 +35,7 @@ object TupleNBuilder extends TupleBuilder val input: Tree = mkTuple(inputs.map(_.expr)) val alistInstance: Tree = { val select = Select(Ident(alist), TupleMethodName + inputs.size.toString) - TypeApply(select, inputs.map(in => typeTree(in.tpe))) + TypeApply(select, inputs.map(in => TypeTree(in.tpe))) } def extract(param: ValDef): List[ValDef] = bindTuple(param, Nil, inputs.map(_.local), 1) From ec34ec580f4e2b0dd3652e041b02c1c2f20af3e9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 20 Aug 2012 15:55:50 -0400 Subject: [PATCH 297/823] move explicit task/setting macros to Def, move to AbsTypeTag --- util/appmacro/ContextUtil.scala | 12 +++++++----- util/appmacro/Instance.scala | 11 ++++++----- 2 files changed, 13 insertions(+), 10 deletions(-) diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/ContextUtil.scala index 8be859494..2451c7c98 100644 --- a/util/appmacro/ContextUtil.scala +++ b/util/appmacro/ContextUtil.scala @@ -18,15 +18,17 @@ final class ContextUtil[C <: Context](val ctx: C) { import 
ctx.universe.{Apply=>ApplyTree,_} - val alistType = ctx.typeOf[AList[KList]] - val alist: Symbol = alistType.typeSymbol.companionSymbol - val alistTC: Type = alistType.typeConstructor + lazy val alistType = ctx.typeOf[AList[KList]] + lazy val alist: Symbol = alistType.typeSymbol.companionSymbol + lazy val alistTC: Type = alistType.typeConstructor /** Modifiers for a local val.*/ - val localModifiers = Modifiers(NoFlags) + lazy val localModifiers = Modifiers(NoFlags) def getPos(sym: Symbol) = if(sym eq null) NoPosition else sym.pos + def atypeOf[T](implicit att: AbsTypeTag[T]): Type = att.tpe + /** Constructs a unique term name with the given prefix within this Context. * (The current implementation uses Context.fresh, which increments*/ def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) @@ -58,7 +60,7 @@ final class ContextUtil[C <: Context](val ctx: C) owner.asInstanceOf[global.Symbol].newSyntheticTypeParam(prefix, 0L).asInstanceOf[ctx.universe.TypeSymbol] } /** The type representing the type constructor `[X] X` */ - val idTC: Type = + lazy val idTC: Type = { val tvar = newTypeVariable(NoSymbol) polyType(tvar :: Nil, refVar(tvar)) diff --git a/util/appmacro/Instance.scala b/util/appmacro/Instance.scala index 6dec0eabc..71360c1c7 100644 --- a/util/appmacro/Instance.scala +++ b/util/appmacro/Instance.scala @@ -18,7 +18,7 @@ trait Instance } trait Convert { - def apply[T: c.TypeTag](c: scala.reflect.makro.Context)(in: c.Tree): c.Tree + def apply[T: c.AbsTypeTag](c: scala.reflect.makro.Context)(in: c.Tree): c.Tree } trait MonadInstance extends Instance { @@ -80,18 +80,19 @@ object Instance * If this is for multi-input flatMap (app followed by flatMap), * this should be the argument wrapped in Right. 
*/ - def contImpl[T: c.TypeTag](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]])( - implicit tt: c.TypeTag[T], mt: c.TypeTag[i.M[T]], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = + def contImpl[T](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]])( + implicit tt: c.AbsTypeTag[T], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = { import c.universe.{Apply=>ApplyTree,_} val util = ContextUtil[c.type](c) val mTC: Type = util.extractTC(i, InstanceTCName) + val mttpe: Type = appliedType(mTC, tt.tpe :: Nil).normalize // the tree for the macro argument val (tree, treeType) = t match { case Left(l) => (l.tree, tt.tpe.normalize) - case Right(r) => (r.tree, mt.tpe.normalize) + case Right(r) => (r.tree, mttpe) } val instanceSym = util.singleton(i) @@ -202,7 +203,7 @@ object Instance tree match { case ApplyTree(TypeApply(fun, t :: Nil), qual :: Nil) if isWrapper(fun) => - val tag = c.TypeTag(t.tpe) + val tag = c.AbsTypeTag(t.tpe) addType(t.tpe, convert(c)(qual)(tag) ) case _ => super.transform(tree) } From 64e000a37d44e434b1a41f9f1bf2c8240f487d46 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 20 Aug 2012 15:55:51 -0400 Subject: [PATCH 298/823] Properly apply transformations to dynamic tasks. That is, implement Initialize[Task[T]].flatten correctly. This requires preserving the transformations applied in a scope so that they can be applied to an Initialize value after static settings have been evaluated. 
--- util/collection/INode.scala | 1 + util/collection/Settings.scala | 24 ++++++++++++++++++++---- 2 files changed, 21 insertions(+), 4 deletions(-) diff --git a/util/collection/INode.scala b/util/collection/INode.scala index 1ac9152a2..86ddff060 100644 --- a/util/collection/INode.scala +++ b/util/collection/INode.scala @@ -27,6 +27,7 @@ abstract class EvaluateSettings[Scope] case a: Apply[k,T] => new MixedNode[k,T]( a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) case b: Bind[s,T] => new BindNode[s,T]( transform(b.in), x => transform(b.f(x))) case v: Value[T] => constant(v.value) + case t: TransformCapture => constant(() => t.f) case o: Optional[s,T] => o.a match { case None => constant( () => o.f(None) ) case Some(i) => single[s,T](transform(i), x => o.f(Some(x))) diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index 783956f2a..faa2577cf 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -58,6 +58,9 @@ trait Init[Scope] type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey ~> Option + /** The result of this initialization is the composition of applied transformations. + * This can be useful when dealing with dynamic Initialize values. 
*/ + lazy val capturedTransformations: Initialize[Initialize ~> Initialize] = new TransformCapture(idK[Initialize]) def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) def valueStrict[T](value: T): Initialize[T] = pure(() => value) def value[T](value: => T): Initialize[T] = pure(value _) @@ -264,6 +267,14 @@ trait Init[Scope] override def toString = "setting(" + key + ") at " + pos } + private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { + case Left(undefs) => throw new RuntimeUndefined(undefs) + case Right(x) => x + } + + private[this] lazy val getValidated = + new (ValidatedInit ~> Initialize) { def apply[T](v: ValidatedInit[T]) = handleUndefined[T](v) } + // mainly for reducing generated class count private[this] def validateReferencedT(g: ValidateRef) = new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } @@ -302,6 +313,15 @@ trait Init[Scope] trait KeyedInitialize[T] extends Keyed[T, T] { final val transform = idFun[T] } + private[sbt] final class TransformCapture(val f: Initialize ~> Initialize) extends Initialize[Initialize ~> Initialize] + { + def dependencies = Nil + def apply[Z](g2: (Initialize ~> Initialize) => Z): Initialize[Z] = map(this)(g2) + def evaluate(ss: Settings[Scope]): Initialize ~> Initialize = f + def mapReferenced(g: MapScoped) = new TransformCapture(mapReferencedT(g) ∙ f) + def mapConstant(g: MapConstant) = new TransformCapture(mapConstantT(g) ∙ f) + def validateReferenced(g: ValidateRef) = Right(new TransformCapture(getValidated ∙ validateReferencedT(g) ∙ f)) + } private[sbt] final class Bind[S,T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] { def dependencies = in.dependencies @@ -311,10 +331,6 @@ trait Init[Scope] def validateReferenced(g: ValidateRef) = (in validateReferenced g).right.map { validIn => new Bind[S,T](s => handleUndefined( f(s) 
validateReferenced g), validIn) } - def handleUndefined(vr: ValidatedInit[T]): Initialize[T] = vr match { - case Left(undefs) => throw new RuntimeUndefined(undefs) - case Right(x) => x - } def mapConstant(g: MapConstant) = new Bind[S,T](s => f(s) mapConstant g, in mapConstant g) } private[sbt] final class Optional[S,T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] From 530e125a9eb30e465dd39d1ab2dcad1315004e0b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 24 Aug 2012 13:27:34 -0400 Subject: [PATCH 299/823] Scala 2.10.0-M7 --- util/appmacro/ContextUtil.scala | 15 ++++++++++----- util/appmacro/Instance.scala | 4 ++-- util/appmacro/KListBuilder.scala | 2 +- util/appmacro/MixedBuilder.scala | 2 +- util/appmacro/TupleBuilder.scala | 2 +- util/appmacro/TupleNBuilder.scala | 2 +- 6 files changed, 16 insertions(+), 11 deletions(-) diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/ContextUtil.scala index 2451c7c98..ad66c84e2 100644 --- a/util/appmacro/ContextUtil.scala +++ b/util/appmacro/ContextUtil.scala @@ -2,7 +2,7 @@ package sbt package appmacro import scala.reflect._ - import makro._ + import macros._ import scala.tools.nsc.Global object ContextUtil { @@ -66,7 +66,7 @@ final class ContextUtil[C <: Context](val ctx: C) polyType(tvar :: Nil, refVar(tvar)) } /** A Type that references the given type variable. */ - def refVar(variable: TypeSymbol): Type = variable.asTypeConstructor + def refVar(variable: TypeSymbol): Type = variable.toTypeConstructor /** Constructs a new, synthetic type variable that is a type constructor. For example, in type Y[L[x]], L is such a type variable. 
*/ def newTCVariable(owner: Symbol): TypeSymbol = { @@ -75,7 +75,7 @@ final class ContextUtil[C <: Context](val ctx: C) tc.setTypeSignature(PolyType(arg :: Nil, emptyTypeBounds)) tc } - def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.asType, definitions.AnyClass.asType) + def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) /** Returns the Symbol that references the statically accessible singleton `i`. */ def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = @@ -85,7 +85,12 @@ final class ContextUtil[C <: Context](val ctx: C) } /** Returns the symbol for the non-private method named `name` for the class/module `obj`. */ - def method(obj: Symbol, name: String): Symbol = obj.typeSignature.nonPrivateMember(newTermName(name)) + def method(obj: Symbol, name: String): Symbol = { + val global: Global = ctx.universe.asInstanceOf[Global] + val ts: Type = obj.typeSignature + val m: global.Symbol = ts.asInstanceOf[global.Type].nonPrivateMember(global.newTermName(name)) + m.asInstanceOf[Symbol] + } /** Returns a Type representing the type constructor tcp.. 
For example, given * `object Demo { type M[x] = List[x] }`, the call `extractTC(Demo, "M")` will return a type representing @@ -97,7 +102,7 @@ final class ContextUtil[C <: Context](val ctx: C) val itTpe = it.tpe.asInstanceOf[global.Type] val m = itTpe.nonPrivateMember(global.newTypeName(name)) val tc = itTpe.memberInfo(m).asInstanceOf[ctx.universe.Type] - assert(tc != NoType && tc.isHigherKinded, "Invalid type constructor: " + tc) + assert(tc != NoType && tc.takesTypeArgs, "Invalid type constructor: " + tc) tc } } \ No newline at end of file diff --git a/util/appmacro/Instance.scala b/util/appmacro/Instance.scala index 71360c1c7..e03a29cb2 100644 --- a/util/appmacro/Instance.scala +++ b/util/appmacro/Instance.scala @@ -18,7 +18,7 @@ trait Instance } trait Convert { - def apply[T: c.AbsTypeTag](c: scala.reflect.makro.Context)(in: c.Tree): c.Tree + def apply[T: c.AbsTypeTag](c: scala.reflect.macros.Context)(in: c.Tree): c.Tree } trait MonadInstance extends Instance { @@ -30,7 +30,7 @@ object InputWrapper } import scala.reflect._ - import makro._ + import macros._ object Instance { diff --git a/util/appmacro/KListBuilder.scala b/util/appmacro/KListBuilder.scala index b57a39449..7ae0696d0 100644 --- a/util/appmacro/KListBuilder.scala +++ b/util/appmacro/KListBuilder.scala @@ -4,7 +4,7 @@ package appmacro import Types.Id import scala.tools.nsc.Global import scala.reflect._ - import makro._ + import macros._ /** A `TupleBuilder` that uses a KList as the tuple representation.*/ object KListBuilder extends TupleBuilder diff --git a/util/appmacro/MixedBuilder.scala b/util/appmacro/MixedBuilder.scala index 593f60382..e58adb2b0 100644 --- a/util/appmacro/MixedBuilder.scala +++ b/util/appmacro/MixedBuilder.scala @@ -2,7 +2,7 @@ package sbt package appmacro import scala.reflect._ - import makro._ + import macros._ /** A builder that uses `TupleN` as the representation for small numbers of inputs (up to `TupleNBuilder.MaxInputs`) * and `KList` for larger numbers of inputs. 
This builder cannot handle fewer than 2 inputs.*/ diff --git a/util/appmacro/TupleBuilder.scala b/util/appmacro/TupleBuilder.scala index f91d3c91c..f6442cb02 100644 --- a/util/appmacro/TupleBuilder.scala +++ b/util/appmacro/TupleBuilder.scala @@ -4,7 +4,7 @@ package appmacro import Types.Id import scala.tools.nsc.Global import scala.reflect._ - import makro._ + import macros._ /** * A `TupleBuilder` abstracts the work of constructing a tuple data structure such as a `TupleN` or `KList` diff --git a/util/appmacro/TupleNBuilder.scala b/util/appmacro/TupleNBuilder.scala index 4f034adae..c7b9929ab 100644 --- a/util/appmacro/TupleNBuilder.scala +++ b/util/appmacro/TupleNBuilder.scala @@ -4,7 +4,7 @@ package appmacro import Types.Id import scala.tools.nsc.Global import scala.reflect._ - import makro._ + import macros._ /** A builder that uses a TupleN as the tuple representation. * It is limited to tuples of size 2 to `MaxInputs`. */ From 981ada6f4b950f1359763612a00b20eb5a19c577 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 21 Sep 2012 16:42:07 -0400 Subject: [PATCH 300/823] AbsTypeTag -> WeakTypeTag and converted more settings --- util/appmacro/ContextUtil.scala | 2 +- util/appmacro/Instance.scala | 6 +++--- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/ContextUtil.scala index ad66c84e2..3fd45207a 100644 --- a/util/appmacro/ContextUtil.scala +++ b/util/appmacro/ContextUtil.scala @@ -27,7 +27,7 @@ final class ContextUtil[C <: Context](val ctx: C) def getPos(sym: Symbol) = if(sym eq null) NoPosition else sym.pos - def atypeOf[T](implicit att: AbsTypeTag[T]): Type = att.tpe + def atypeOf[T](implicit att: WeakTypeTag[T]): Type = att.tpe /** Constructs a unique term name with the given prefix within this Context. 
* (The current implementation uses Context.fresh, which increments*/ diff --git a/util/appmacro/Instance.scala b/util/appmacro/Instance.scala index e03a29cb2..1dd51e26b 100644 --- a/util/appmacro/Instance.scala +++ b/util/appmacro/Instance.scala @@ -18,7 +18,7 @@ trait Instance } trait Convert { - def apply[T: c.AbsTypeTag](c: scala.reflect.macros.Context)(in: c.Tree): c.Tree + def apply[T: c.WeakTypeTag](c: scala.reflect.macros.Context)(in: c.Tree): c.Tree } trait MonadInstance extends Instance { @@ -81,7 +81,7 @@ object Instance * this should be the argument wrapped in Right. */ def contImpl[T](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]])( - implicit tt: c.AbsTypeTag[T], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = + implicit tt: c.WeakTypeTag[T], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = { import c.universe.{Apply=>ApplyTree,_} @@ -203,7 +203,7 @@ object Instance tree match { case ApplyTree(TypeApply(fun, t :: Nil), qual :: Nil) if isWrapper(fun) => - val tag = c.AbsTypeTag(t.tpe) + val tag = c.WeakTypeTag(t.tpe) addType(t.tpe, convert(c)(qual)(tag) ) case _ => super.transform(tree) } From ed41547a47544f58de425cd68291c4998bbba843 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 2 Nov 2012 14:20:17 -0400 Subject: [PATCH 301/823] InputTask macro Similar to task macros, the parsed value is accessed by calling `parsed` on a Parser[T], Initialize[Parser[T]], or Initialize[State => Parser[T]]. Values of tasks and settings may be accessed as usual via `value`. 
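The access pattern described in the commit message above can be sketched as a build.sbt fragment. This is sbt 0.13-style surface syntax with a hypothetical key (`demo`); the exact user-facing syntax was still settling at this commit, so treat it as illustrative only:

```scala
import complete.DefaultParsers._

// Hypothetical input key, not part of this patch.
val demo = inputKey[Unit]("Echoes parsed arguments plus a setting value")

demo := {
  // `parsed` splices the result of running the parser on the task's input
  val args: Seq[String] = spaceDelimited("<arg>").parsed
  // settings and tasks are still accessed via `value`
  val n: String = name.value
  println(n + " got: " + args.mkString(", "))
}
```

Because `parsed` is a macro, the parser runs before the task body and its result is in scope like any other setting.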
--- util/appmacro/ContextUtil.scala | 98 ++++++++++++++++++++++++++++ util/appmacro/Instance.scala | 99 +++++++++++++-------------------- util/collection/Settings.scala | 3 + 3 files changed, 141 insertions(+), 59 deletions(-) diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/ContextUtil.scala index 3fd45207a..a62846e78 100644 --- a/util/appmacro/ContextUtil.scala +++ b/util/appmacro/ContextUtil.scala @@ -4,11 +4,32 @@ package appmacro import scala.reflect._ import macros._ import scala.tools.nsc.Global + import ContextUtil.{DynamicDependencyError, DynamicReferenceError} object ContextUtil { + final val DynamicDependencyError = "Illegal dynamic dependency" + final val DynamicReferenceError = "Illegal dynamic reference" + /** Constructs an object with utility methods for operating in the provided macro context `c`. * Callers should explicitly specify the type parameter as `c.type` in order to preserve the path dependent types. */ def apply[C <: Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) + + + /** Helper for implementing a no-argument macro that is introduced via an implicit. + * This method removes the implicit conversion and evaluates the function `f` on the target of the conversion. + * + * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this + * method, the result of this method is `f(<someValue>)`. */ + def selectMacroImpl[T: c.WeakTypeTag, S: c.WeakTypeTag](c: Context)(f: c.Expr[S] => c.Expr[T]): c.Expr[T] = + { + import c.universe._ + c.macroApplication match { + case Select(Apply(_, t :: Nil), _) => f( c.Expr[S](t) ) + case x => unexpectedTree(x) + } + } + + def unexpectedTree[C <: Context](tree: C#Tree): Nothing = error("Unexpected macro application tree (" + tree.getClass + "): " + tree) } /** Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). 
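The tree shape `selectMacroImpl` destructures can be illustrated with a toy ADT. `Tree`, `Ident`, `Apply`, `Select`, and `conversionTarget` below are hypothetical stand-ins for the `c.universe` types, not sbt code:

```scala
sealed trait Tree
final case class Ident(name: String) extends Tree
final case class Apply(fun: Tree, args: List[Tree]) extends Tree
final case class Select(qual: Tree, name: String) extends Tree

// `myImplicitConversion(someValue).extensionMethod` typechecks to this shape:
val application: Tree =
  Select(Apply(Ident("myImplicitConversion"), List(Ident("someValue"))), "extensionMethod")

// Like selectMacroImpl, peel off the conversion and return its argument.
def conversionTarget(t: Tree): Tree = t match {
  case Select(Apply(_, arg :: Nil), _) => arg
  case x => sys.error("Unexpected macro application tree: " + x)
}

println(conversionTarget(application)) // Ident(someValue)
```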
@@ -42,6 +63,50 @@ final class ContextUtil[C <: Context](val ctx: C) vd } + /* Tests whether a Tree is a Select on `methodName`. */ + def isWrapper(methodName: String): Tree => Boolean = { + case Select(_, nme) => nme.decoded == methodName + case _ => false + } + + lazy val parameterModifiers = Modifiers(Flag.PARAM) + + /** Collects all definitions in the tree for use in checkReferences. + * This excludes definitions in wrapped expressions because checkReferences won't allow nested dereferencing anyway. */ + def collectDefs(tree: Tree, isWrapper: Tree => Boolean): collection.Set[Symbol] = + { + val defs = new collection.mutable.HashSet[Symbol] + // adds the symbols for all non-Ident subtrees to `defs`. + val process = new Traverser { + override def traverse(t: Tree) = t match { + case _: Ident => () + case ApplyTree(TypeApply(fun, tpe :: Nil), qual :: Nil) if isWrapper(fun) => () + case tree => + if(tree.symbol ne null) defs += tree.symbol; + super.traverse(tree) + } + } + process.traverse(tree) + defs + } + + /** A reference is illegal if it is to an M instance defined within the scope of the macro call. + * As an approximation, disallow referenced to any local definitions `defs`. */ + def illegalReference(defs: collection.Set[Symbol], sym: Symbol): Boolean = + sym != null && sym != NoSymbol && defs.contains(sym) + + /** A function that checks the provided tree for illegal references to M instances defined in the + * expression passed to the macro and for illegal dereferencing of M instances. */ + def checkReferences(defs: collection.Set[Symbol], isWrapper: Tree => Boolean): Tree => Unit = { + case s @ ApplyTree(TypeApply(fun, tpe :: Nil), qual :: Nil) => if(isWrapper(fun)) ctx.error(s.pos, DynamicDependencyError) + case id @ Ident(name) if illegalReference(defs, id.symbol) => ctx.error(id.pos, DynamicReferenceError + ": " + name) + case _ => () + } + + /** Constructs a ValDef with a parameter modifier, a unique name, with the provided Type and with an empty rhs. 
*/ + def freshMethodParameter(tpe: Type): ValDef = + ValDef(parameterModifiers, freshTermName("p"), TypeTree(tpe), EmptyTree) + /** Constructs a ValDef with local modifiers and a unique name. */ def localValDef(tpt: Tree, rhs: Tree): ValDef = ValDef(localModifiers, freshTermName("q"), tpt, rhs) @@ -75,8 +140,18 @@ final class ContextUtil[C <: Context](val ctx: C) tc.setTypeSignature(PolyType(arg :: Nil, emptyTypeBounds)) tc } + /** >: Nothing <: Any */ def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) + /** Create a Tree that references the `val` represented by `vd`. */ + def refVal(vd: ValDef): Tree = + { + val t = Ident(vd.name) + assert(vd.tpt.tpe != null, "val type is null: " + vd + ", tpt: " + vd.tpt.tpe) + t.setType(vd.tpt.tpe) + t + } + /** Returns the Symbol that references the statically accessible singleton `i`. */ def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = it.tpe match { @@ -105,4 +180,27 @@ final class ContextUtil[C <: Context](val ctx: C) assert(tc != NoType && tc.takesTypeArgs, "Invalid type constructor: " + tc) tc } + + /** Substitutes wrappers in tree `t` with the result of `subWrapper`. + * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(<f>) returns true. + * Typically, `f` is a `Select` or `Ident`. 
+ * The wrapper is replaced with the result of `subWrapper(<T>, <v>)` */ + def transformWrappers(t: Tree, isWrapper: Tree => Boolean, subWrapper: (Type, Tree) => Tree): Tree = + { + // the main tree transformer that replaces calls to InputWrapper.wrap(x) with + // plain Idents that reference the actual input value + object appTransformer extends Transformer + { + override def transform(tree: Tree): Tree = + tree match + { + case ApplyTree(TypeApply(fun, targ :: Nil), qual :: Nil) if isWrapper(fun) => + assert(qual.tpe != null, "Internal error: null type for wrapped tree with " + qual.getClass + "\n\t" + qual + "\n in " + t) + subWrapper(targ.tpe, qual) + case _ => super.transform(tree) + } + } + + appTransformer.transform(t) + } } \ No newline at end of file diff --git a/util/appmacro/Instance.scala b/util/appmacro/Instance.scala index 1dd51e26b..f70941dd0 100644 --- a/util/appmacro/Instance.scala +++ b/util/appmacro/Instance.scala @@ -24,24 +24,45 @@ trait MonadInstance extends Instance { def flatten[T](in: M[M[T]]): M[T] } -object InputWrapper -{ - def wrap[T](in: Any): T = error("This method is an implementation detail and should not be referenced.") -} import scala.reflect._ import macros._ +object InputWrapper +{ + /** The name of the wrapper method should be obscure. + * Wrapper checking is based solely on this name, so it must not conflict with a user method name. + * The user should never see this method because it is compile-time only and only used internally by the task macro system.*/ + final val WrapName = "wrap_\u2603\u2603" + + // This method should be annotated as compile-time only when that feature is implemented + def wrap_\u2603\u2603[T](in: Any): T = error("This method is an implementation detail and should not be referenced.") + + /** Wraps an arbitrary Tree in a call to the `wrap` method of this module for later processing by an enclosing macro. 
+ * The resulting Tree is the manually constructed version of: + * + * `c.universe.reify { InputWrapper.<WrapName>[T](ts.splice) }` + */ + def wrapKey[T: c.WeakTypeTag](c: Context)(ts: c.Expr[Any]): c.Expr[T] = + { + import c.universe.{Apply=>ApplyTree,_} + val util = new ContextUtil[c.type](c) + val iw = util.singleton(InputWrapper) + val tpe = c.weakTypeOf[T] + val nme = newTermName(WrapName).encoded + val tree = ApplyTree(TypeApply(Select(Ident(iw), nme), TypeTree(tpe) :: Nil), ts.tree :: Nil) + tree.setPos(ts.tree.pos) + c.Expr[T](tree) + } +} + object Instance { - final val DynamicDependencyError = "Illegal dynamic dependency." - final val DynamicReferenceError = "Illegal dynamic reference." final val ApplyName = "app" final val FlattenName = "flatten" final val PureName = "pure" final val MapName = "map" final val InstanceTCName = "M" - final val WrapName = "wrap" final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef)
This is used to ensure - * references are to M instances defined outside of the macro call.*/ - val defs = new collection.mutable.HashSet[Symbol] - - // a reference is illegal if it is to an M instance defined within the scope of the macro call - def illegalReference(sym: Symbol): Boolean = - sym != null && sym != NoSymbol && defs.contains(sym) - - // a function that checks the provided tree for illegal references to M instances defined in the - // expression passed to the macro and for illegal dereferencing of M instances. - val checkQual: Tree => Unit = { - case s @ ApplyTree(fun, qual :: Nil) => if(isWrapper(fun)) c.error(s.pos, DynamicDependencyError) - case id @ Ident(name) if illegalReference(id.symbol) => c.error(id.pos, DynamicReferenceError) - case _ => () - } - // adds the symbols for all non-Ident subtrees to `defs`. - val defSearch: Tree => Unit = { - case _: Ident => () - case tree => if(tree.symbol ne null) defs += tree.symbol; - } + // Local definitions in the macro. This is used to ensure references are to M instances defined outside of the macro call. 
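The `collectDefs`/`checkReferences` pair used here can be sketched against a miniature tree type. The ADT below is hypothetical and far simpler than scalac's trees; sbt's version records `Symbol`s rather than strings:

```scala
sealed trait Tree { def children: List[Tree] = Nil }
final case class Ident(sym: String) extends Tree
final case class ValDef(sym: String, rhs: Tree) extends Tree { override def children = List(rhs) }
final case class Block(stats: List[Tree]) extends Tree { override def children = stats }

// Gather every symbol defined inside the expression passed to the macro.
def collectDefs(t: Tree): Set[String] = {
  val here = t match { case ValDef(s, _) => Set(s); case _ => Set.empty[String] }
  here ++ t.children.flatMap(collectDefs)
}

// Flag references to any of those local definitions.
def checkReferences(defs: Set[String])(t: Tree): List[String] = {
  val here = t match {
    case Ident(s) if defs(s) => List("Illegal dynamic reference: " + s)
    case _                   => Nil
  }
  here ++ t.children.flatMap(checkReferences(defs))
}

val body = Block(List(ValDef("x", Ident("y")), Ident("x")))
println(checkReferences(collectDefs(body))(body)) // List(Illegal dynamic reference: x)
```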
+ val defs = util.collectDefs(tree, isWrapper) + val checkQual: Tree => Unit = util.checkReferences(defs, isWrapper) // transforms the original tree into calls to the Instance functions pure, map, ..., // resulting in a value of type M[T] @@ -163,7 +157,7 @@ object Instance def single(body: Tree, input: In): Tree = { val variable = input.local - val param = ValDef(parameterModifiers, variable.name, variable.tpt, EmptyTree) + val param = ValDef(util.parameterModifiers, variable.name, variable.tpt, EmptyTree) val typeApplied = TypeApply(Select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) val mapped = ApplyTree(typeApplied, input.expr :: Function(param :: Nil, body) :: Nil) if(t.isLeft) mapped else flatten(mapped) @@ -173,7 +167,7 @@ object Instance def arbArity(body: Tree, inputs: List[In]): Tree = { val result = builder.make(c)(mTC, inputs) - val param = freshMethodParameter( appliedType(result.representationC, util.idTC :: Nil) ) + val param = util.freshMethodParameter( appliedType(result.representationC, util.idTC :: Nil) ) val bindings = result.extract(param) val f = Function(param :: Nil, Block(bindings, body)) val ttt = TypeTree(treeType) @@ -192,30 +186,17 @@ object Instance qual.foreach(checkQual) val vd = util.freshValDef(tpe, qual.symbol) inputs ::= new Input(tpe, qual, vd) - Ident(vd.name) + util.refVal(vd) } - - // the main tree transformer that replaces calls to InputWrapper.wrap(x) with - // plain Idents that reference the actual input value - object appTransformer extends Transformer + def sub(tpe: Type, qual: Tree): Tree = { - override def transform(tree: Tree): Tree = - tree match - { - case ApplyTree(TypeApply(fun, t :: Nil), qual :: Nil) if isWrapper(fun) => - val tag = c.WeakTypeTag(t.tpe) - addType(t.tpe, convert(c)(qual)(tag) ) - case _ => super.transform(tree) - } + val tag = c.WeakTypeTag(tpe) + addType(tpe, convert(c)(qual)(tag) ) } - // collects all definitions in the tree. 
used for finding illegal references - tree.foreach(defSearch) - // applies the transformation - // resetting attributes: a) must be local b) must be done - // on the transformed tree and not the wrapped tree or else there are obscure errors - val tr = makeApp( c.resetLocalAttrs(appTransformer.transform(tree)) ) + // resetting attributes must be: a) local b) done here and not wider or else there are obscure errors + val tr = makeApp( c.resetLocalAttrs( util.transformWrappers(tree, isWrapper, (tpe, tr) => sub(tpe, tr)) ) ) c.Expr[i.M[T]](tr) } diff --git a/util/collection/Settings.scala b/util/collection/Settings.scala index faa2577cf..d85876496 100644 --- a/util/collection/Settings.scala +++ b/util/collection/Settings.scala @@ -79,6 +79,9 @@ trait Init[Scope] } def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse error("Internal settings error: invalid reference to " + showFullKey(k)) def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) + def mapScope(f: Scope => Scope): MapScoped = new MapScoped { + def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) + } def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = { From 237b80eb227d9f7d768882f960a581dd4223ddf3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 13 Nov 2012 14:52:33 -0500 Subject: [PATCH 302/823] Transition to all camelCase key labels. 1. Hyphenated labels are still accepted when parsing scoped keys (so 'sbt test-only' still works) There is currently no timeline for removing this support for hyphenated keys. 2. Only camelCase is shown for tab completion. 3. AttributeKey.rawLabel provides the unnormalized label. This should only be used to implement support for accepting hyphenated keys as input for compatibility. 4. 
AttributeKey.normLabel provides the normalized label (hyphenated converted to camelCase) --- util/collection/Attributes.scala | 6 +++++- util/collection/Util.scala | 10 ++++++++-- 2 files changed, 13 insertions(+), 3 deletions(-) diff --git a/util/collection/Attributes.scala b/util/collection/Attributes.scala index 0376ca76c..227e2fdd7 100644 --- a/util/collection/Attributes.scala +++ b/util/collection/Attributes.scala @@ -11,6 +11,8 @@ import scala.reflect.Manifest // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts sealed trait AttributeKey[T] { def manifest: Manifest[T] + @deprecated("Should only be used for compatibility during the transition from hyphenated labels to camelCase labels.", "0.13.0") + def rawLabel: String def label: String def description: Option[String] def extend: Seq[AttributeKey[_]] @@ -48,13 +50,15 @@ object AttributeKey private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf - def label = name + def rawLabel = name + val label = Util.hyphenToCamel(name) def description = description0 def extend = extend0 def rank = rank0 } private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { def manifest = mf + def rawLabel = LocalLabel def label = LocalLabel def description = None def extend = Nil diff --git a/util/collection/Util.scala b/util/collection/Util.scala index 5aede5f0d..f4f6fbb50 100644 --- a/util/collection/Util.scala +++ b/util/collection/Util.scala @@ -26,8 +26,14 @@ object Util def pairID[A,B] = (a: A, b: B) => (a,b) private[this] lazy val Hypen = """-(\p{javaLowerCase})""".r - def hypenToCamel(s: String): String = - Hypen.replaceAllIn(s, _.group(1).toUpperCase) + def hasHyphen(s: String): Boolean = s.indexOf('-') >= 0 + @deprecated("Use the properly spelled version: hyphenToCamel", "0.13.0") + def hypenToCamel(s: String): 
String = hyphenToCamel(s) + def hyphenToCamel(s: String): String = + if(hasHyphen(s)) + Hypen.replaceAllIn(s, _.group(1).toUpperCase) + else + s private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r def camelToHypen(s: String): String = From 9890f711028f0bff4f719d014a304d6db71579e3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 18 Nov 2012 09:20:24 -0500 Subject: [PATCH 303/823] Explicitly specify type parameters in calls to KCons in KList builder. scalac couldn't infer the type constructor otherwise. --- util/appmacro/KListBuilder.scala | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/util/appmacro/KListBuilder.scala b/util/appmacro/KListBuilder.scala index 7ae0696d0..551566419 100644 --- a/util/appmacro/KListBuilder.scala +++ b/util/appmacro/KListBuilder.scala @@ -41,8 +41,16 @@ object KListBuilder extends TupleBuilder case Nil => revBindings.reverse } + private[this] def makeKList(revInputs: Inputs[c.universe.type], klist: Tree, klistType: Type): Tree = + revInputs match { + case in :: tail => + val next = ApplyTree(TypeApply(Ident(kcons), TypeTree(in.tpe) :: TypeTree(klistType) :: TypeTree(mTC) :: Nil), in.expr :: klist :: Nil) + makeKList(tail, next, appliedType(kconsTC, in.tpe :: klistType :: mTC :: Nil)) + case Nil => klist + } + /** The input trees combined in a KList */ - val klist = (inputs :\ (knil: Tree))( (in, klist) => ApplyTree(kcons, in.expr, klist) ) + val klist = makeKList(inputs.reverse, knil, knilType) /** The input types combined in a KList type. The main concern is tracking the heterogeneous types. * The type constructor is tcVariable, so that it can be applied to [X] X or M later. From 6c5e4ae21c5fc670510440e18065fea90f4cb985 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Fri, 7 Dec 2012 10:27:08 -0800 Subject: [PATCH 304/823] Follow source layout convention supported by Eclipse. Moved source files so directory structure follow package structure. 
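The `hyphenToCamel` conversion introduced in patch 302 above, and its inverse (spelled `camelToHypen` in the source), can be exercised standalone with the same regexes:

```scala
// Standalone sketch of the label-normalization regexes from Util.scala.
val Hyphen = """-(\p{javaLowerCase})""".r
val Camel  = """(\p{javaLowerCase})(\p{javaUpperCase})""".r

def hyphenToCamel(s: String): String =
  if (s.indexOf('-') >= 0) Hyphen.replaceAllIn(s, _.group(1).toUpperCase)
  else s // fast path: nothing to rewrite

def camelToHyphen(s: String): String = // spelled `camelToHypen` in the actual source
  Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase)

println(hyphenToCamel("test-only")) // testOnly
println(camelToHyphen("testOnly"))  // test-only
```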
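Patch 303's change (explicit type arguments in calls to `KCons`) can be reproduced with a minimal KList; the definitions below are a hypothetical simplification of sbt's `KList`/`KCons`:

```scala
sealed trait KList[M[_]]
final case class KNil[M[_]]() extends KList[M]
// T's bound may refer to M even though M is declared later in the clause.
final case class KCons[H, T <: KList[M], M[_]](head: M[H], tail: T) extends KList[M]

// Without explicit type arguments scalac infers M = Some from Some(1) and
// fails to unify it with the tail's Option (the problem the commit fixes):
// val bad = KCons(Some(1), KNil[Option]())   // does not compile

// Explicit type application pins the type constructor to Option:
val list = KCons[Int, KNil[Option], Option](Some(1), KNil[Option]())
println(list.head) // Some(1)
```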
That makes it possible to use Scala Eclipse plugin with sbt's source code. --- cache/{ => src/main/scala/sbt}/Cache.scala | 0 cache/{ => src/main/scala/sbt}/CacheIO.scala | 0 cache/{ => src/main/scala/sbt}/FileInfo.scala | 0 cache/{ => src/main/scala/sbt}/SeparatedCache.scala | 0 cache/tracking/{ => src/main/scala/sbt}/ChangeReport.scala | 0 cache/tracking/{ => src/main/scala/sbt}/Tracked.scala | 0 util/appmacro/{ => src/main/scala/sbt/appmacro}/ContextUtil.scala | 0 util/appmacro/{ => src/main/scala/sbt/appmacro}/Instance.scala | 0 .../appmacro/{ => src/main/scala/sbt/appmacro}/KListBuilder.scala | 0 .../appmacro/{ => src/main/scala/sbt/appmacro}/MixedBuilder.scala | 0 .../appmacro/{ => src/main/scala/sbt/appmacro}/TupleBuilder.scala | 0 .../{ => src/main/scala/sbt/appmacro}/TupleNBuilder.scala | 0 util/collection/{ => src/main/scala/sbt}/AList.scala | 0 util/collection/{ => src/main/scala/sbt}/Attributes.scala | 0 util/collection/{ => src/main/scala/sbt}/Classes.scala | 0 util/collection/{ => src/main/scala/sbt}/Dag.scala | 0 util/collection/{ => src/main/scala/sbt}/HList.scala | 0 util/collection/{ => src/main/scala/sbt}/IDSet.scala | 0 util/collection/{ => src/main/scala/sbt}/INode.scala | 0 util/collection/{ => src/main/scala/sbt}/KList.scala | 0 util/collection/{ => src/main/scala/sbt}/PMap.scala | 0 util/collection/{ => src/main/scala/sbt}/Param.scala | 0 util/collection/{ => src/main/scala/sbt}/Positions.scala | 0 util/collection/{ => src/main/scala/sbt}/Settings.scala | 0 util/collection/{ => src/main/scala/sbt}/Show.scala | 0 util/collection/{ => src/main/scala/sbt}/Signal.scala | 0 util/collection/{ => src/main/scala/sbt}/TypeFunctions.scala | 0 util/collection/{ => src/main/scala/sbt}/Types.scala | 0 util/collection/{ => src/main/scala/sbt}/Util.scala | 0 util/complete/{ => src/main/scala/sbt}/LineReader.scala | 0 util/complete/{ => src/main/scala/sbt/complete}/Completions.scala | 0 .../complete/{ => src/main/scala/sbt/complete}/EditDistance.scala | 0 
util/complete/{ => src/main/scala/sbt/complete}/History.scala | 0 .../{ => src/main/scala/sbt/complete}/HistoryCommands.scala | 0 .../{ => src/main/scala/sbt/complete}/JLineCompletion.scala | 0 util/complete/{ => src/main/scala/sbt/complete}/Parser.scala | 0 util/complete/{ => src/main/scala/sbt/complete}/Parsers.scala | 0 .../complete/{ => src/main/scala/sbt/complete}/ProcessError.scala | 0 .../{ => src/main/scala/sbt/complete}/TokenCompletions.scala | 0 util/complete/{ => src/main/scala/sbt/complete}/TypeString.scala | 0 util/complete/{ => src/main/scala/sbt/complete}/UpperBound.scala | 0 util/control/{ => src/main/scala/sbt}/ErrorHandling.scala | 0 util/control/{ => src/main/scala/sbt}/ExitHook.scala | 0 util/control/{ => src/main/scala/sbt}/MessageOnlyException.scala | 0 util/log/{ => src/main/scala/sbt}/BasicLogger.scala | 0 util/log/{ => src/main/scala/sbt}/BufferedLogger.scala | 0 util/log/{ => src/main/scala/sbt}/ConsoleLogger.scala | 0 util/log/{ => src/main/scala/sbt}/FilterLogger.scala | 0 util/log/{ => src/main/scala/sbt}/FullLogger.scala | 0 util/log/{ => src/main/scala/sbt}/GlobalLogging.scala | 0 util/log/{ => src/main/scala/sbt}/Level.scala | 0 util/log/{ => src/main/scala/sbt}/LogEvent.scala | 0 util/log/{ => src/main/scala/sbt}/Logger.scala | 0 util/log/{ => src/main/scala/sbt}/LoggerWriter.scala | 0 util/log/{ => src/main/scala/sbt}/MainLogging.scala | 0 util/log/{ => src/main/scala/sbt}/MultiLogger.scala | 0 util/log/{ => src/main/scala/sbt}/StackTrace.scala | 0 util/process/{ => src/main/scala/sbt}/InheritInput.scala | 0 util/process/{ => src/main/scala/sbt}/Process.scala | 0 util/process/{ => src/main/scala/sbt}/ProcessImpl.scala | 0 util/relation/{ => src/main/scala/sbt}/Relation.scala | 0 61 files changed, 0 insertions(+), 0 deletions(-) rename cache/{ => src/main/scala/sbt}/Cache.scala (100%) rename cache/{ => src/main/scala/sbt}/CacheIO.scala (100%) rename cache/{ => src/main/scala/sbt}/FileInfo.scala (100%) rename cache/{ => 
src/main/scala/sbt}/SeparatedCache.scala (100%) rename cache/tracking/{ => src/main/scala/sbt}/ChangeReport.scala (100%) rename cache/tracking/{ => src/main/scala/sbt}/Tracked.scala (100%) rename util/appmacro/{ => src/main/scala/sbt/appmacro}/ContextUtil.scala (100%) rename util/appmacro/{ => src/main/scala/sbt/appmacro}/Instance.scala (100%) rename util/appmacro/{ => src/main/scala/sbt/appmacro}/KListBuilder.scala (100%) rename util/appmacro/{ => src/main/scala/sbt/appmacro}/MixedBuilder.scala (100%) rename util/appmacro/{ => src/main/scala/sbt/appmacro}/TupleBuilder.scala (100%) rename util/appmacro/{ => src/main/scala/sbt/appmacro}/TupleNBuilder.scala (100%) rename util/collection/{ => src/main/scala/sbt}/AList.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Attributes.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Classes.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Dag.scala (100%) rename util/collection/{ => src/main/scala/sbt}/HList.scala (100%) rename util/collection/{ => src/main/scala/sbt}/IDSet.scala (100%) rename util/collection/{ => src/main/scala/sbt}/INode.scala (100%) rename util/collection/{ => src/main/scala/sbt}/KList.scala (100%) rename util/collection/{ => src/main/scala/sbt}/PMap.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Param.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Positions.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Settings.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Show.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Signal.scala (100%) rename util/collection/{ => src/main/scala/sbt}/TypeFunctions.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Types.scala (100%) rename util/collection/{ => src/main/scala/sbt}/Util.scala (100%) rename util/complete/{ => src/main/scala/sbt}/LineReader.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/Completions.scala (100%) rename 
util/complete/{ => src/main/scala/sbt/complete}/EditDistance.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/History.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/HistoryCommands.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/JLineCompletion.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/Parser.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/Parsers.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/ProcessError.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/TokenCompletions.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/TypeString.scala (100%) rename util/complete/{ => src/main/scala/sbt/complete}/UpperBound.scala (100%) rename util/control/{ => src/main/scala/sbt}/ErrorHandling.scala (100%) rename util/control/{ => src/main/scala/sbt}/ExitHook.scala (100%) rename util/control/{ => src/main/scala/sbt}/MessageOnlyException.scala (100%) rename util/log/{ => src/main/scala/sbt}/BasicLogger.scala (100%) rename util/log/{ => src/main/scala/sbt}/BufferedLogger.scala (100%) rename util/log/{ => src/main/scala/sbt}/ConsoleLogger.scala (100%) rename util/log/{ => src/main/scala/sbt}/FilterLogger.scala (100%) rename util/log/{ => src/main/scala/sbt}/FullLogger.scala (100%) rename util/log/{ => src/main/scala/sbt}/GlobalLogging.scala (100%) rename util/log/{ => src/main/scala/sbt}/Level.scala (100%) rename util/log/{ => src/main/scala/sbt}/LogEvent.scala (100%) rename util/log/{ => src/main/scala/sbt}/Logger.scala (100%) rename util/log/{ => src/main/scala/sbt}/LoggerWriter.scala (100%) rename util/log/{ => src/main/scala/sbt}/MainLogging.scala (100%) rename util/log/{ => src/main/scala/sbt}/MultiLogger.scala (100%) rename util/log/{ => src/main/scala/sbt}/StackTrace.scala (100%) rename util/process/{ => src/main/scala/sbt}/InheritInput.scala (100%) rename util/process/{ => src/main/scala/sbt}/Process.scala (100%) 
rename util/process/{ => src/main/scala/sbt}/ProcessImpl.scala (100%) rename util/relation/{ => src/main/scala/sbt}/Relation.scala (100%) diff --git a/cache/Cache.scala b/cache/src/main/scala/sbt/Cache.scala similarity index 100% rename from cache/Cache.scala rename to cache/src/main/scala/sbt/Cache.scala diff --git a/cache/CacheIO.scala b/cache/src/main/scala/sbt/CacheIO.scala similarity index 100% rename from cache/CacheIO.scala rename to cache/src/main/scala/sbt/CacheIO.scala diff --git a/cache/FileInfo.scala b/cache/src/main/scala/sbt/FileInfo.scala similarity index 100% rename from cache/FileInfo.scala rename to cache/src/main/scala/sbt/FileInfo.scala diff --git a/cache/SeparatedCache.scala b/cache/src/main/scala/sbt/SeparatedCache.scala similarity index 100% rename from cache/SeparatedCache.scala rename to cache/src/main/scala/sbt/SeparatedCache.scala diff --git a/cache/tracking/ChangeReport.scala b/cache/tracking/src/main/scala/sbt/ChangeReport.scala similarity index 100% rename from cache/tracking/ChangeReport.scala rename to cache/tracking/src/main/scala/sbt/ChangeReport.scala diff --git a/cache/tracking/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala similarity index 100% rename from cache/tracking/Tracked.scala rename to cache/tracking/src/main/scala/sbt/Tracked.scala diff --git a/util/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala similarity index 100% rename from util/appmacro/ContextUtil.scala rename to util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala diff --git a/util/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala similarity index 100% rename from util/appmacro/Instance.scala rename to util/appmacro/src/main/scala/sbt/appmacro/Instance.scala diff --git a/util/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala similarity index 100% rename from util/appmacro/KListBuilder.scala rename to 
util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala diff --git a/util/appmacro/MixedBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala similarity index 100% rename from util/appmacro/MixedBuilder.scala rename to util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala diff --git a/util/appmacro/TupleBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala similarity index 100% rename from util/appmacro/TupleBuilder.scala rename to util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala diff --git a/util/appmacro/TupleNBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala similarity index 100% rename from util/appmacro/TupleNBuilder.scala rename to util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala diff --git a/util/collection/AList.scala b/util/collection/src/main/scala/sbt/AList.scala similarity index 100% rename from util/collection/AList.scala rename to util/collection/src/main/scala/sbt/AList.scala diff --git a/util/collection/Attributes.scala b/util/collection/src/main/scala/sbt/Attributes.scala similarity index 100% rename from util/collection/Attributes.scala rename to util/collection/src/main/scala/sbt/Attributes.scala diff --git a/util/collection/Classes.scala b/util/collection/src/main/scala/sbt/Classes.scala similarity index 100% rename from util/collection/Classes.scala rename to util/collection/src/main/scala/sbt/Classes.scala diff --git a/util/collection/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala similarity index 100% rename from util/collection/Dag.scala rename to util/collection/src/main/scala/sbt/Dag.scala diff --git a/util/collection/HList.scala b/util/collection/src/main/scala/sbt/HList.scala similarity index 100% rename from util/collection/HList.scala rename to util/collection/src/main/scala/sbt/HList.scala diff --git a/util/collection/IDSet.scala b/util/collection/src/main/scala/sbt/IDSet.scala similarity index 100% rename from 
util/collection/IDSet.scala rename to util/collection/src/main/scala/sbt/IDSet.scala diff --git a/util/collection/INode.scala b/util/collection/src/main/scala/sbt/INode.scala similarity index 100% rename from util/collection/INode.scala rename to util/collection/src/main/scala/sbt/INode.scala diff --git a/util/collection/KList.scala b/util/collection/src/main/scala/sbt/KList.scala similarity index 100% rename from util/collection/KList.scala rename to util/collection/src/main/scala/sbt/KList.scala diff --git a/util/collection/PMap.scala b/util/collection/src/main/scala/sbt/PMap.scala similarity index 100% rename from util/collection/PMap.scala rename to util/collection/src/main/scala/sbt/PMap.scala diff --git a/util/collection/Param.scala b/util/collection/src/main/scala/sbt/Param.scala similarity index 100% rename from util/collection/Param.scala rename to util/collection/src/main/scala/sbt/Param.scala diff --git a/util/collection/Positions.scala b/util/collection/src/main/scala/sbt/Positions.scala similarity index 100% rename from util/collection/Positions.scala rename to util/collection/src/main/scala/sbt/Positions.scala diff --git a/util/collection/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala similarity index 100% rename from util/collection/Settings.scala rename to util/collection/src/main/scala/sbt/Settings.scala diff --git a/util/collection/Show.scala b/util/collection/src/main/scala/sbt/Show.scala similarity index 100% rename from util/collection/Show.scala rename to util/collection/src/main/scala/sbt/Show.scala diff --git a/util/collection/Signal.scala b/util/collection/src/main/scala/sbt/Signal.scala similarity index 100% rename from util/collection/Signal.scala rename to util/collection/src/main/scala/sbt/Signal.scala diff --git a/util/collection/TypeFunctions.scala b/util/collection/src/main/scala/sbt/TypeFunctions.scala similarity index 100% rename from util/collection/TypeFunctions.scala rename to 
util/collection/src/main/scala/sbt/TypeFunctions.scala diff --git a/util/collection/Types.scala b/util/collection/src/main/scala/sbt/Types.scala similarity index 100% rename from util/collection/Types.scala rename to util/collection/src/main/scala/sbt/Types.scala diff --git a/util/collection/Util.scala b/util/collection/src/main/scala/sbt/Util.scala similarity index 100% rename from util/collection/Util.scala rename to util/collection/src/main/scala/sbt/Util.scala diff --git a/util/complete/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala similarity index 100% rename from util/complete/LineReader.scala rename to util/complete/src/main/scala/sbt/LineReader.scala diff --git a/util/complete/Completions.scala b/util/complete/src/main/scala/sbt/complete/Completions.scala similarity index 100% rename from util/complete/Completions.scala rename to util/complete/src/main/scala/sbt/complete/Completions.scala diff --git a/util/complete/EditDistance.scala b/util/complete/src/main/scala/sbt/complete/EditDistance.scala similarity index 100% rename from util/complete/EditDistance.scala rename to util/complete/src/main/scala/sbt/complete/EditDistance.scala diff --git a/util/complete/History.scala b/util/complete/src/main/scala/sbt/complete/History.scala similarity index 100% rename from util/complete/History.scala rename to util/complete/src/main/scala/sbt/complete/History.scala diff --git a/util/complete/HistoryCommands.scala b/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala similarity index 100% rename from util/complete/HistoryCommands.scala rename to util/complete/src/main/scala/sbt/complete/HistoryCommands.scala diff --git a/util/complete/JLineCompletion.scala b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala similarity index 100% rename from util/complete/JLineCompletion.scala rename to util/complete/src/main/scala/sbt/complete/JLineCompletion.scala diff --git a/util/complete/Parser.scala 
b/util/complete/src/main/scala/sbt/complete/Parser.scala similarity index 100% rename from util/complete/Parser.scala rename to util/complete/src/main/scala/sbt/complete/Parser.scala diff --git a/util/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala similarity index 100% rename from util/complete/Parsers.scala rename to util/complete/src/main/scala/sbt/complete/Parsers.scala diff --git a/util/complete/ProcessError.scala b/util/complete/src/main/scala/sbt/complete/ProcessError.scala similarity index 100% rename from util/complete/ProcessError.scala rename to util/complete/src/main/scala/sbt/complete/ProcessError.scala diff --git a/util/complete/TokenCompletions.scala b/util/complete/src/main/scala/sbt/complete/TokenCompletions.scala similarity index 100% rename from util/complete/TokenCompletions.scala rename to util/complete/src/main/scala/sbt/complete/TokenCompletions.scala diff --git a/util/complete/TypeString.scala b/util/complete/src/main/scala/sbt/complete/TypeString.scala similarity index 100% rename from util/complete/TypeString.scala rename to util/complete/src/main/scala/sbt/complete/TypeString.scala diff --git a/util/complete/UpperBound.scala b/util/complete/src/main/scala/sbt/complete/UpperBound.scala similarity index 100% rename from util/complete/UpperBound.scala rename to util/complete/src/main/scala/sbt/complete/UpperBound.scala diff --git a/util/control/ErrorHandling.scala b/util/control/src/main/scala/sbt/ErrorHandling.scala similarity index 100% rename from util/control/ErrorHandling.scala rename to util/control/src/main/scala/sbt/ErrorHandling.scala diff --git a/util/control/ExitHook.scala b/util/control/src/main/scala/sbt/ExitHook.scala similarity index 100% rename from util/control/ExitHook.scala rename to util/control/src/main/scala/sbt/ExitHook.scala diff --git a/util/control/MessageOnlyException.scala b/util/control/src/main/scala/sbt/MessageOnlyException.scala similarity index 100% rename from 
util/control/MessageOnlyException.scala rename to util/control/src/main/scala/sbt/MessageOnlyException.scala diff --git a/util/log/BasicLogger.scala b/util/log/src/main/scala/sbt/BasicLogger.scala similarity index 100% rename from util/log/BasicLogger.scala rename to util/log/src/main/scala/sbt/BasicLogger.scala diff --git a/util/log/BufferedLogger.scala b/util/log/src/main/scala/sbt/BufferedLogger.scala similarity index 100% rename from util/log/BufferedLogger.scala rename to util/log/src/main/scala/sbt/BufferedLogger.scala diff --git a/util/log/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala similarity index 100% rename from util/log/ConsoleLogger.scala rename to util/log/src/main/scala/sbt/ConsoleLogger.scala diff --git a/util/log/FilterLogger.scala b/util/log/src/main/scala/sbt/FilterLogger.scala similarity index 100% rename from util/log/FilterLogger.scala rename to util/log/src/main/scala/sbt/FilterLogger.scala diff --git a/util/log/FullLogger.scala b/util/log/src/main/scala/sbt/FullLogger.scala similarity index 100% rename from util/log/FullLogger.scala rename to util/log/src/main/scala/sbt/FullLogger.scala diff --git a/util/log/GlobalLogging.scala b/util/log/src/main/scala/sbt/GlobalLogging.scala similarity index 100% rename from util/log/GlobalLogging.scala rename to util/log/src/main/scala/sbt/GlobalLogging.scala diff --git a/util/log/Level.scala b/util/log/src/main/scala/sbt/Level.scala similarity index 100% rename from util/log/Level.scala rename to util/log/src/main/scala/sbt/Level.scala diff --git a/util/log/LogEvent.scala b/util/log/src/main/scala/sbt/LogEvent.scala similarity index 100% rename from util/log/LogEvent.scala rename to util/log/src/main/scala/sbt/LogEvent.scala diff --git a/util/log/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala similarity index 100% rename from util/log/Logger.scala rename to util/log/src/main/scala/sbt/Logger.scala diff --git a/util/log/LoggerWriter.scala 
b/util/log/src/main/scala/sbt/LoggerWriter.scala similarity index 100% rename from util/log/LoggerWriter.scala rename to util/log/src/main/scala/sbt/LoggerWriter.scala diff --git a/util/log/MainLogging.scala b/util/log/src/main/scala/sbt/MainLogging.scala similarity index 100% rename from util/log/MainLogging.scala rename to util/log/src/main/scala/sbt/MainLogging.scala diff --git a/util/log/MultiLogger.scala b/util/log/src/main/scala/sbt/MultiLogger.scala similarity index 100% rename from util/log/MultiLogger.scala rename to util/log/src/main/scala/sbt/MultiLogger.scala diff --git a/util/log/StackTrace.scala b/util/log/src/main/scala/sbt/StackTrace.scala similarity index 100% rename from util/log/StackTrace.scala rename to util/log/src/main/scala/sbt/StackTrace.scala diff --git a/util/process/InheritInput.scala b/util/process/src/main/scala/sbt/InheritInput.scala similarity index 100% rename from util/process/InheritInput.scala rename to util/process/src/main/scala/sbt/InheritInput.scala diff --git a/util/process/Process.scala b/util/process/src/main/scala/sbt/Process.scala similarity index 100% rename from util/process/Process.scala rename to util/process/src/main/scala/sbt/Process.scala diff --git a/util/process/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala similarity index 100% rename from util/process/ProcessImpl.scala rename to util/process/src/main/scala/sbt/ProcessImpl.scala diff --git a/util/relation/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala similarity index 100% rename from util/relation/Relation.scala rename to util/relation/src/main/scala/sbt/Relation.scala From 4ad81e9d046cb545e6cdb13a5fbd7a16b8ff67c1 Mon Sep 17 00:00:00 2001 From: Anthony Date: Tue, 18 Dec 2012 18:57:42 -0500 Subject: [PATCH 305/823] Multi-line prompt text offset issue (ticket #625) --- util/complete/src/main/scala/sbt/LineReader.scala | 13 +++++++++++-- 1 file changed, 11 insertions(+), 2 deletions(-) diff --git 
a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 9f3ca9036..6bde880cb 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -43,13 +43,22 @@ abstract class JLine extends LineReader readLineDirectRaw(prompt, mask) private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = { + val newprompt = handleMultilinePrompt(prompt) val line = mask match { - case Some(m) => reader.readLine(prompt, m) - case None => reader.readLine(prompt) + case Some(m) => reader.readLine(newprompt, m) + case None => reader.readLine(newprompt) } if (inputEof.get) null else line } + private[this] def handleMultilinePrompt(prompt: String): String = { + var lines = """\r?\n""".r.split(prompt) + lines.size match { + case 0 | 1 => prompt + case _ => reader.printString(lines.init.mkString("\n") + "\n"); lines.last; + } + } + private[this] def resume() { jline.Terminal.resetTerminal From 89ad7d720acff594828221a5facdedea82a11c06 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 20 Dec 2012 09:25:35 -0500 Subject: [PATCH 306/823] minor cleanup of previous commit: var->val in sbt.JLine --- util/complete/src/main/scala/sbt/LineReader.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 6bde880cb..b30d657ec 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -52,7 +52,7 @@ abstract class JLine extends LineReader } private[this] def handleMultilinePrompt(prompt: String): String = { - var lines = """\r?\n""".r.split(prompt) + val lines = """\r?\n""".r.split(prompt) lines.size match { case 0 | 1 => prompt case _ => reader.printString(lines.init.mkString("\n") + "\n"); lines.last; From 4fa45f957d3797d47c9be2ddb16ccefc5288bd8f Mon Sep 17 00:00:00 2001 From: Mark 
Harrah Date: Thu, 3 Jan 2013 11:37:40 -0500 Subject: [PATCH 307/823] require a failure message for parser --- .../src/main/scala/sbt/complete/Parser.scala | 24 +++++++++++++------ 1 file changed, 17 insertions(+), 7 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index a994e0658..642d6f914 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -49,9 +49,13 @@ sealed trait RichParser[A] * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing.*/ def failOnException: Parser[A] + @deprecated("Use `not` and explicitly provide the failure message", "0.13.0") def unary_- : Parser[Unit] def & (o: Parser[_]): Parser[A] + + @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.13.0") def - (o: Parser[_]): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ def examples(s: String*): Parser[A] /** Explicitly defines the completions for the original Parser.*/ @@ -216,6 +220,7 @@ object Parser extends ParserMain } } + @deprecated("Explicitly call `and` and `not` to provide the failure message.", "0.13.0") def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid( b.ifValid( new And(a, b) )) @@ -239,7 +244,7 @@ trait ParserMain def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv } def !!!(msg: String): Parser[A] = onFailure(a, msg) def failOnException: Parser[A] = trapAndFail(a) - + def unary_- = not(a) def & (o: Parser[_]) = and(a, o) def - (o: Parser[_]) = sub(a, o) @@ -394,9 +399,12 @@ trait ParserMain else b - def not(p: Parser[_]): Parser[Unit] = p.result match { - case None => new Not(p) - case Some(_) => failure("Excluded.") + @deprecated("Explicitly specify the failure message.", "0.13.0") + def not(p: Parser[_]): 
Parser[Unit] = not(p, "Excluded.") + + def not(p: Parser[_], failMessage: String): Parser[Unit] = p.result match { + case None => new Not(p, failMessage) + case Some(_) => failure(failMessage) } def oneOf[T](p: Seq[Parser[T]]): Parser[T] = p.reduceLeft(_ | _) @@ -575,17 +583,19 @@ private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] def derive(c: Char) = (a derive c) & (b derive c) def completions(level: Int) = a.completions(level).filterS(s => apply(b)(s).resultEmpty.isValid ) lazy val resultEmpty = a.resultEmpty && b.resultEmpty + override def toString = s"($a) && ($b)" } -private final class Not(delegate: Parser[_]) extends ValidParser[Unit] +private final class Not(delegate: Parser[_], failMessage: String) extends ValidParser[Unit] { - def derive(c: Char) = if(delegate.valid) not(delegate derive c) else this + def derive(c: Char) = if(delegate.valid) not(delegate derive c, failMessage) else this def completions(level: Int) = Completions.empty def result = None lazy val resultEmpty = delegate.resultEmpty match { case f: Failure => Value(()) - case v: Value[_] => mkFailure("Excluded.") + case v: Value[_] => mkFailure(failMessage) } + override def toString = s" -($delegate)" } private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends ValidParser[T] { From 169a08df55607b4d27697b2ae214bdb5e2aab4b8 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 3 Jan 2013 17:40:07 -0500 Subject: [PATCH 308/823] update version for backported Parser deprecations --- .../src/main/scala/sbt/complete/Parser.scala | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 642d6f914..00e9fb79d 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -49,11 +49,11 @@ sealed trait RichParser[A] * capture it and fail locally 
instead of allowing the exception to propagate up and terminate parsing.*/ def failOnException: Parser[A] - @deprecated("Use `not` and explicitly provide the failure message", "0.13.0") + @deprecated("Use `not` and explicitly provide the failure message", "0.12.2") def unary_- : Parser[Unit] def & (o: Parser[_]): Parser[A] - @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.13.0") + @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.12.2") def - (o: Parser[_]): Parser[A] /** Explicitly defines the completions for the original Parser.*/ @@ -220,7 +220,7 @@ object Parser extends ParserMain } } - @deprecated("Explicitly call `and` and `not` to provide the failure message.", "0.13.0") + @deprecated("Explicitly call `and` and `not` to provide the failure message.", "0.12.2") def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid( b.ifValid( new And(a, b) )) @@ -399,7 +399,7 @@ trait ParserMain else b - @deprecated("Explicitly specify the failure message.", "0.13.0") + @deprecated("Explicitly specify the failure message.", "0.12.2") def not(p: Parser[_]): Parser[Unit] = not(p, "Excluded.") def not(p: Parser[_], failMessage: String): Parser[Unit] = p.result match { @@ -583,7 +583,7 @@ private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] def derive(c: Char) = (a derive c) & (b derive c) def completions(level: Int) = a.completions(level).filterS(s => apply(b)(s).resultEmpty.isValid ) lazy val resultEmpty = a.resultEmpty && b.resultEmpty - override def toString = s"($a) && ($b)" + override def toString = "(%s) && (%s)".format(a,b) } private final class Not(delegate: Parser[_], failMessage: String) extends ValidParser[Unit] @@ -595,7 +595,7 @@ private final class Not(delegate: Parser[_], failMessage: String) extends ValidP case f: Failure => Value(()) case v: Value[_] => mkFailure(failMessage) } - override def toString 
= s" -($delegate)" + override def toString = " -(%s)".format(delegate) } private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends ValidParser[T] { From c826078002a7bf93e3d68e62367834833d64c18e Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 4 Jan 2013 17:22:40 -0500 Subject: [PATCH 309/823] Convert references to harrah/xsbt to sbt/sbt --- util/process/src/main/scala/sbt/ProcessImpl.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index 44dcaed2d..617a6cef3 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -403,7 +403,7 @@ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProc /** A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the * output and error streams of the process. * The implementation of `exitValue` wait for the process to finish and then waits until the threads reading output and error streams die before -* returning. Note that the thread that reads the input stream cannot be interrupted, see https://github.com/harrah/xsbt/issues/327 and +* returning. Note that the thread that reads the input stream cannot be interrupted, see https://github.com/sbt/sbt/issues/327 and * http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4514257 */ private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Process { From cf08f2dd18c976a6e1ad881ae334a337cd0ba970 Mon Sep 17 00:00:00 2001 From: "Paolo G. 
Giarrusso" Date: Sun, 16 Dec 2012 19:23:26 +0100 Subject: [PATCH 310/823] Don't catch org.scalacheck.Prop.Exception --- util/collection/src/test/scala/SettingsTest.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index 2e57685ea..f4796b76c 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -33,7 +33,7 @@ object SettingsTest extends Properties("settings") iterate(value(t-1) ) } try { evaluate( setting(chk, iterate(top)) :: Nil); true } - catch { case e: Exception => ("Unexpected exception: " + e) |: false } + catch { case e: java.lang.Exception => ("Unexpected exception: " + e) |: false } } // Circular (dynamic) references currently loop infinitely. @@ -45,7 +45,7 @@ object SettingsTest extends Properties("settings") { val ccr = new CCR(intermediate) try { evaluate( setting(chk, ccr.top) :: Nil); false } - catch { case e: Exception => true } + catch { case e: java.lang.Exception => true } } def tests = From dd007f94420a83eba732cbf362ab99dbd00a1518 Mon Sep 17 00:00:00 2001 From: "Paolo G. Giarrusso" Date: Sun, 16 Dec 2012 20:29:24 +0100 Subject: [PATCH 311/823] Silence boring Eclipse warnings: unused imports --- interface/src/main/java/xsbti/compile/DefinesClass.java | 2 -- interface/src/main/java/xsbti/compile/Output.java | 2 -- 2 files changed, 4 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/DefinesClass.java b/interface/src/main/java/xsbti/compile/DefinesClass.java index 6369c2661..261c6ca22 100644 --- a/interface/src/main/java/xsbti/compile/DefinesClass.java +++ b/interface/src/main/java/xsbti/compile/DefinesClass.java @@ -1,7 +1,5 @@ package xsbti.compile; -import java.io.File; - /** * Determines if an entry on a classpath contains a class. 
*/ diff --git a/interface/src/main/java/xsbti/compile/Output.java b/interface/src/main/java/xsbti/compile/Output.java index c7f28a2f1..4f785884e 100755 --- a/interface/src/main/java/xsbti/compile/Output.java +++ b/interface/src/main/java/xsbti/compile/Output.java @@ -1,6 +1,4 @@ package xsbti.compile; - -import java.io.File; /** Abstract interface denoting the output of the compilation. Inheritors are SingleOutput with a global output directory and * MultipleOutput that specifies the output directory per source file. */ From e5673f742619ff095810ba7d9073aba10c922117 Mon Sep 17 00:00:00 2001 From: "Paolo G. Giarrusso" Date: Sun, 16 Dec 2012 20:30:30 +0100 Subject: [PATCH 312/823] Silence boring Eclipse warnings: catching all exceptions Here I make explicit where catching all exceptions is intended. Mark Harrah corrected one decision during review. --- util/collection/src/test/scala/SettingsTest.scala | 2 +- util/control/src/main/scala/sbt/ErrorHandling.scala | 2 +- util/log/src/main/scala/sbt/BufferedLogger.scala | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index f4796b76c..9ff703526 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -82,7 +82,7 @@ object SettingsTest extends Properties("settings") def evaluate(settings: Seq[Setting[_]]): Settings[Scope] = try { make(settings)(delegates, scopeLocal, showFullKey) } - catch { case e => e.printStackTrace; throw e } + catch { case e: Throwable => e.printStackTrace; throw e } } // This setup is a workaround for module synchronization issues final class CCR(intermediate: Int) diff --git a/util/control/src/main/scala/sbt/ErrorHandling.scala b/util/control/src/main/scala/sbt/ErrorHandling.scala index a1ba760f3..b6e616ae3 100644 --- a/util/control/src/main/scala/sbt/ErrorHandling.scala +++ 
b/util/control/src/main/scala/sbt/ErrorHandling.scala @@ -20,7 +20,7 @@ object ErrorHandling { case ex @ (_: Exception | _: StackOverflowError) => Left(ex) case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err - case x => Left(x) + case x: Throwable => Left(x) } def convert[T](f: => T): Either[Exception, T] = diff --git a/util/log/src/main/scala/sbt/BufferedLogger.scala b/util/log/src/main/scala/sbt/BufferedLogger.scala index 2e04b81f2..0b9d7a593 100644 --- a/util/log/src/main/scala/sbt/BufferedLogger.scala +++ b/util/log/src/main/scala/sbt/BufferedLogger.scala @@ -33,7 +33,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger clear() result } - catch { case e => stopQuietly(); throw e } + catch { case e: Throwable => stopQuietly(); throw e } } def stopQuietly() = synchronized { try { stop() } catch { case e: Exception => () } } From ef84332a516ff2094d6c004f1b3da0da630cb9a6 Mon Sep 17 00:00:00 2001 From: "Paolo G. Giarrusso" Date: Tue, 8 Jan 2013 00:39:40 +0100 Subject: [PATCH 313/823] Silence boring Eclipse warnings: catching all exceptions, part 2 These warning fixes are new since my last pull request, please verify. 
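[Editorial note] The exception-handling fixes in the two patches above all make the same change: a bare `case e =>`, which in Scala 2.9/2.10 silently matched `Throwable`, becomes an explicit `case e: Throwable =>`, while genuinely fatal JVM errors are rethrown. A standalone sketch of the `wideConvert` pattern from the `ErrorHandling.scala` hunk (names follow the diff; the demo values are illustrative, not part of the patch):

```scala
object ErrorDemo {
  // Capture any non-fatal failure as a value; rethrow fatal JVM errors.
  // Spelling out `x: Throwable` keeps the catch-all explicit, which is
  // exactly what the -Xlint/Eclipse warning fixes above are about.
  def wideConvert[T](f: => T): Either[Throwable, T] =
    try Right(f)
    catch {
      case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err
      case x: Throwable => Left(x)
    }
}
```

Usage: `ErrorDemo.wideConvert(21 * 2)` yields `Right(42)`, while `ErrorDemo.wideConvert(sys.error("boom"))` yields a `Left` holding the thrown exception instead of propagating it.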
--- util/collection/src/main/scala/sbt/INode.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index 86ddff060..e9f64ef6c 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -70,7 +70,7 @@ abstract class EvaluateSettings[Scope] } private[this] def run0(work: => Unit): Unit = { - try { work } catch { case e => complete.put( Some(e) ) } + try { work } catch { case e: Throwable => complete.put( Some(e) ) } workComplete() } From 89c645db4421a591f502e1c5ed052bc5e3ced09c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 28 Jan 2013 17:14:53 -0500 Subject: [PATCH 314/823] remove deprecated Initialize.scoped method --- util/collection/src/main/scala/sbt/Settings.scala | 2 -- 1 file changed, 2 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index d85876496..3c1433ab1 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -309,8 +309,6 @@ trait Init[Scope] case None => this case Some(const) => new Value(() => transform(const)) } - @deprecated("Use scopedKey.") - def scoped = scopedKey } private[this] final class GetValue[S,T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] trait KeyedInitialize[T] extends Keyed[T, T] { From 37063924ec4f909956acffb734af4ba0845f11d3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 28 Jan 2013 17:14:53 -0500 Subject: [PATCH 315/823] Reduce InputTask to the ideal wrapper around 'State => Parser[Initialize[Task[T]]]' Ref #407. 
--- .../appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala | 7 +++++++ 1 file changed, 7 insertions(+) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index a62846e78..3afc0d617 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -143,6 +143,13 @@ final class ContextUtil[C <: Context](val ctx: C) /** >: Nothing <: Any */ def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) + def functionType(args: List[Type], result: Type): Type = + { + val global: Global = ctx.universe.asInstanceOf[Global] + val tpe = global.definitions.functionType(args.asInstanceOf[List[global.Type]], result.asInstanceOf[global.Type]) + tpe.asInstanceOf[Type] + } + /** Create a Tree that references the `val` represented by `vd`. */ def refVal(vd: ValDef): Tree = { From 4a08ec9c60072895553ee2732229ce333c2772ef Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 28 Jan 2013 17:14:53 -0500 Subject: [PATCH 316/823] use standard Context.weakTypeOf --- util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala | 2 -- 1 file changed, 2 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 3afc0d617..85a2aed50 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -48,8 +48,6 @@ final class ContextUtil[C <: Context](val ctx: C) def getPos(sym: Symbol) = if(sym eq null) NoPosition else sym.pos - def atypeOf[T](implicit att: WeakTypeTag[T]): Type = att.tpe - /** Constructs a unique term name with the given prefix within this Context. 
* (The current implementation uses Context.fresh, which increments*/ def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) From 3e2aa82fde9408d73e0a96463d6fde737fc41734 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 4 Feb 2013 17:30:31 -0500 Subject: [PATCH 317/823] -Xlint --- cache/tracking/src/main/scala/sbt/Tracked.scala | 8 ++++---- util/complete/src/main/scala/sbt/complete/Parser.scala | 2 +- util/process/src/main/scala/sbt/ProcessImpl.scala | 4 ++-- 3 files changed, 7 insertions(+), 7 deletions(-) diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala index 9d2848b73..fb0747ed9 100644 --- a/cache/tracking/src/main/scala/sbt/Tracked.scala +++ b/cache/tracking/src/main/scala/sbt/Tracked.scala @@ -77,11 +77,11 @@ object Tracked trait Tracked { /** Cleans outputs and clears the cache.*/ - def clean: Unit + def clean(): Unit } class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked { - def clean = delete(cacheFile) + def clean() = delete(cacheFile) /** Reads the previous timestamp, evaluates the provided function, * and then updates the timestamp if the function completes normally.*/ def apply[T](f: Long => T): T = @@ -99,7 +99,7 @@ class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O]) extends Tracked { - def clean = delete(cacheFile) + def clean() = delete(cacheFile) def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => { if(uptodate(value)) @@ -136,7 +136,7 @@ object Difference } class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { - def clean = + def clean() = { if(defineClean) delete(raw(cachedFilesInfo)) else () clearCache() diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 
00e9fb79d..9cccacf6f 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -130,7 +130,7 @@ object Parser extends ParserMain if(!bad.isEmpty) error("Invalid example completions: " + bad.mkString("'", "', '", "'")) } def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] = - (a,b) match { case (Some(av), Some(bv)) => Some(av, bv); case _ => None } + (a,b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] = a.ifValid { diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index 617a6cef3..deec99be0 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -28,7 +28,7 @@ private object Future def apply[T](f: => T): () => T = { val result = new SyncVar[Either[Throwable, T]] - def run: Unit = + def run(): Unit = try { result.set(Right(f)) } catch { case e: Exception => result.set(Left(e)) } Spawn(run) @@ -100,7 +100,7 @@ object BasicIO { val continueCount = 1//if(in.isInstanceOf[PipedInputStream]) 1 else 0 val buffer = new Array[Byte](BufferSize) - def read + def read() { val byteCount = in.read(buffer) if(byteCount >= continueCount) From badee8bacdba9446e9b471476e905fcb71b72cc1 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 13 Feb 2013 03:25:13 -0500 Subject: [PATCH 318/823] Update to 2.10.1-RC1 Needed an explicit type in PMap to workaround an error. Need to drop tuple assignment of parser.parsed in input task macro as a workaround for macro/resetAllAttrs/pattern matching/annotation issue in RC1. 
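[Editorial note] The `Parser.tuple` change in the `-Xlint` patch above fixes an auto-tupling reliance: `Some(av, bv)` only compiled because the compiler silently wrapped the two arguments into a tuple, which `-Xlint` flags; `Some((av, bv))` passes the pair explicitly. A standalone sketch of the corrected helper (signature copied from the diff):

```scala
object TupleDemo {
  // Combine two Options into an Option of a pair, as in Parser.tuple.
  // Note the explicit tuple literal `Some((av, bv))` rather than the
  // auto-tupled `Some(av, bv)` that -Xlint warns about.
  def tuple[A, B](a: Option[A], b: Option[B]): Option[(A, B)] =
    (a, b) match {
      case (Some(av), Some(bv)) => Some((av, bv))
      case _                    => None
    }
}
```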
--- util/collection/src/main/scala/sbt/PMap.scala | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/util/collection/src/main/scala/sbt/PMap.scala b/util/collection/src/main/scala/sbt/PMap.scala index 8b1772220..9cf7f26f2 100644 --- a/util/collection/src/main/scala/sbt/PMap.scala +++ b/util/collection/src/main/scala/sbt/PMap.scala @@ -3,7 +3,6 @@ */ package sbt - import Types._ import collection.mutable trait RMap[K[_], V[_]] @@ -12,7 +11,7 @@ trait RMap[K[_], V[_]] def get[T](k: K[T]): Option[V[T]] def contains[T](k: K[T]): Boolean def toSeq: Seq[(K[_], V[_])] - def toTypedSeq = toSeq.map{ case (k: K[t],v) => TPair[t](k,v.asInstanceOf[V[t]]) } + def toTypedSeq: Seq[TPair[_]] = toSeq.map{ case (k: K[t],v) => TPair[t](k,v.asInstanceOf[V[t]]) } def keys: Iterable[K[_]] def values: Iterable[V[_]] def isEmpty: Boolean From 4e4455d03aa54e8b7b04f89f1f72d8e1834914e4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 19 Feb 2013 08:54:40 -0500 Subject: [PATCH 319/823] Use @compileTimeOnly for .value and .parsed methods. Needed to set position on wrapper method for correct error message position. --- .../main/scala/sbt/appmacro/ContextUtil.scala | 4 ++-- .../main/scala/sbt/appmacro/Instance.scala | 20 ++++++++++++------- 2 files changed, 15 insertions(+), 9 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 85a2aed50..41a52003f 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -20,11 +20,11 @@ object ContextUtil { * * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this * method, the result of this method is `f()`. 
*/ - def selectMacroImpl[T: c.WeakTypeTag, S: c.WeakTypeTag](c: Context)(f: c.Expr[S] => c.Expr[T]): c.Expr[T] = + def selectMacroImpl[T: c.WeakTypeTag, S: c.WeakTypeTag](c: Context)(f: (c.Expr[S], c.Position) => c.Expr[T]): c.Expr[T] = { import c.universe._ c.macroApplication match { - case Select(Apply(_, t :: Nil), _) => f( c.Expr[S](t) ) + case s @ Select(Apply(_, t :: Nil), tp) => f( c.Expr[S](t), s.pos ) case x => unexpectedTree(x) } } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index f70941dd0..49b8bb71a 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -27,7 +27,9 @@ trait MonadInstance extends Instance import scala.reflect._ import macros._ + import reflect.internal.annotations.compileTimeOnly +// This needs to be moved to main/settings object InputWrapper { /** The name of the wrapper method should be obscure. @@ -35,22 +37,26 @@ object InputWrapper * The user should never see this method because it is compile-time only and only used internally by the task macro system.*/ final val WrapName = "wrap_\u2603\u2603" - // This method should be annotated as compile-time only when that feature is implemented + @compileTimeOnly("`value` can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting.") def wrap_\u2603\u2603[T](in: Any): T = error("This method is an implementation detail and should not be referenced.") - /** Wraps an arbitrary Tree in a call to the `wrap` method of this module for later processing by an enclosing macro. + def wrapKey[T: c.WeakTypeTag](c: Context)(ts: c.Expr[Any], pos: c.Position): c.Expr[T] = wrapImpl[T,InputWrapper.type](c, InputWrapper, WrapName)(ts, pos) + + /** Wraps an arbitrary Tree in a call to the `.` method of this module for later processing by an enclosing macro. 
* The resulting Tree is the manually constructed version of: * - * `c.universe.reify { InputWrapper.[T](ts.splice) }` + * `c.universe.reify { .[T](ts.splice) }` */ - def wrapKey[T: c.WeakTypeTag](c: Context)(ts: c.Expr[Any]): c.Expr[T] = + def wrapImpl[T: c.WeakTypeTag, S <: AnyRef with Singleton](c: Context, s: S, wrapName: String)(ts: c.Expr[Any], pos: c.Position)(implicit it: c.TypeTag[s.type]): c.Expr[T] = { import c.universe.{Apply=>ApplyTree,_} val util = new ContextUtil[c.type](c) - val iw = util.singleton(InputWrapper) + val iw = util.singleton(s) val tpe = c.weakTypeOf[T] - val nme = newTermName(WrapName).encoded - val tree = ApplyTree(TypeApply(Select(Ident(iw), nme), TypeTree(tpe) :: Nil), ts.tree :: Nil) + val nme = newTermName(wrapName).encoded + val sel = Select(Ident(iw), nme) + sel.setPos(pos) // need to set the position on Select, because that is where the compileTimeOnly check looks + val tree = ApplyTree(TypeApply(sel, TypeTree(tpe) :: Nil), ts.tree :: Nil) tree.setPos(ts.tree.pos) c.Expr[T](tree) } From 7c5d4c1692b9f235c61ac6b94476ae7d817d5e85 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Wed, 16 Jan 2013 21:36:27 -0800 Subject: [PATCH 320/823] Strip trailing whitespace. I have Eclipse configured to do that automatically when saving a file. I decided to finally commit those changes to files I touch a lot. --- interface/src/main/java/xsbti/compile/Setup.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java index 050e20c2f..9a2a6bf4c 100644 --- a/interface/src/main/java/xsbti/compile/Setup.java +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -19,7 +19,7 @@ public interface Setup boolean skip(); /** The file used to cache information across compilations. - * This file can be removed to force a full recompilation. + * This file can be removed to force a full recompilation. 
* The file should be unique and not shared between compilations. */ File cacheFile(); From 39428a996dcad6dca53c42356a65dc43d522d973 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Tue, 19 Feb 2013 00:16:51 -0800 Subject: [PATCH 321/823] Introduce incremental compiler options. Introduce a way to configure the incremental compiler itself instead of the underlying Java/Scala compiler. Specific list of changes in this commit: * Add a method to `xsbti.compile.Setup` that returns incremental compiler options as a `java.util.Map`. We considered a static interface instead of a `Map`, but based on mailing list feedback we decided that it's not the best way to go because a static interface is hard to evolve by adding new options. * Since passing a `java.util.Map` is not very convenient, we convert it immediately to `sbt.inc.IncOptions` * Add options argument to various methods/classes that implement incremental compilation so that in the end options reach the `sbt.inc.IncOptions` object * Add an `incOptions` task that allows users to configure incremental compiler options in their build files. The default implementation of that task returns just `IncOptions.DEFAULT` * Both the system property `xsbt.inc.debug` and `IncOptions.relationsDebug` trigger debugging of relations now. In the near future, we should deprecate use of `xsbt.inc.debug`. --- interface/src/main/java/xsbti/compile/Setup.java | 14 ++++++++++++++ 1 file changed, 14 insertions(+) diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java index 9a2a6bf4c..edf250b8b 100644 --- a/interface/src/main/java/xsbti/compile/Setup.java +++ b/interface/src/main/java/xsbti/compile/Setup.java @@ -1,6 +1,8 @@ package xsbti.compile; import java.io.File; +import java.util.Map; + import xsbti.Maybe; import xsbti.Reporter; @@ -30,4 +32,16 @@ public interface Setup /** The reporter that should be used to report scala compilation to. 
*/ Reporter reporter(); + + /** + * Returns incremental compiler options. + * + * @see sbt.inc.IncOptions for details + * + * You can get default options by calling sbt.inc.IncOptions.toStringMap(sbt.inc.IncOptions.Default). + * + * In the future, we'll extend the API in xsbti to provide factory methods that would allow obtaining + * default values so one can depend on the xsbti package only. + **/ + Map incrementalCompilerOptions(); } From 0280216e02b715d22cad0ba511140cc6cfd21d80 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 21 Feb 2013 20:44:26 -0500 Subject: [PATCH 322/823] Replace Scala jars in UpdateReport with ScalaProvider jars in more situations. Fixes #661. Specifically, when the Scala version for sbt is the same as that for the project being built, the jars in UpdateReport should be the same as those in ScalaProvider. This is because the loader will come from the ScalaProvider, which uses jars in the boot directory instead of the cache. The first part of the fix for #661 checks that loaded classes come from the classpath and so they need to line up. --- interface/src/main/java/xsbti/compile/ScalaInstance.java | 9 ++++++--- 1 file changed, 6 insertions(+), 3 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/ScalaInstance.java b/interface/src/main/java/xsbti/compile/ScalaInstance.java index 4e41e1ca2..c7f3984e3 100644 --- a/interface/src/main/java/xsbti/compile/ScalaInstance.java +++ b/interface/src/main/java/xsbti/compile/ScalaInstance.java @@ -17,13 +17,16 @@ public interface ScalaInstance /** A class loader providing access to the classes and resources in the library and compiler jars. */ ClassLoader loader(); - /** The library jar file.*/ + /**@deprecated Only `jars` can be reliably provided for modularized Scala. (Since 0.13.0) */ + @Deprecated File libraryJar(); - /** The compiler jar file.*/ + /**@deprecated Only `jars` can be reliably provided for modularized Scala. 
(Since 0.13.0) */ + @Deprecated File compilerJar(); - /** Jars provided by this Scala instance other than the compiler and library jars. */ + /**@deprecated Only `jars` can be reliably provided for modularized Scala. (Since 0.13.0) */ + @Deprecated File[] otherJars(); /** All jar files provided by this Scala instance.*/ From 67010fa0b295de6a7054584b9d563e60297baf5d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Feb 2013 09:24:04 -0500 Subject: [PATCH 323/823] Split ConsoleOut into its own file, track the global ConsoleOut and use it instead of StandardMain.console --- .../src/main/scala/sbt/ConsoleLogger.scala | 73 +++++-------------- util/log/src/main/scala/sbt/ConsoleOut.scala | 62 ++++++++++++++++ .../src/main/scala/sbt/GlobalLogging.scala | 24 +++++- util/log/src/main/scala/sbt/MainLogging.scala | 11 ++- 4 files changed, 109 insertions(+), 61 deletions(-) create mode 100644 util/log/src/main/scala/sbt/ConsoleOut.scala diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index 711489c5c..124c61e90 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -7,55 +7,27 @@ package sbt object ConsoleLogger { - def systemOut: ConsoleOut = printStreamOut(System.out) - def overwriteContaining(s: String): (String,String) => Boolean = (cur, prev) => - cur.contains(s) && prev.contains(s) + @deprecated("Moved to ConsoleOut", "0.13.0") + def systemOut: ConsoleOut = ConsoleOut.systemOut - /** ConsoleOut instance that is backed by System.out. It overwrites the previously printed line - * if the function `f(lineToWrite, previousLine)` returns true. - * - * The ConsoleOut returned by this method assumes that the only newlines are from println calls - * and not in the String arguments. 
*/ - def systemOutOverwrite(f: (String,String) => Boolean): ConsoleOut = new ConsoleOut { - val lockObject = System.out - private[this] var last: Option[String] = None - private[this] var current = new java.lang.StringBuffer - def print(s: String): Unit = synchronized { current.append(s) } - def println(s: String): Unit = synchronized { current.append(s); println() } - def println(): Unit = synchronized { - val s = current.toString - if(formatEnabled && last.exists(lmsg => f(s, lmsg))) - System.out.print(OverwriteLine) - System.out.println(s) - last = Some(s) - current = new java.lang.StringBuffer - } - } - def printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { - val lockObject = out - def print(s: String) = out.print(s) - def println(s: String) = out.println(s) - def println() = out.println() - } - def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { - val lockObject = out - def print(s: String) = out.print(s) - def println(s: String) = { out.println(s); out.flush() } - def println() = { out.println(); out.flush() } - } - def bufferedWriterOut(out: BufferedWriter): ConsoleOut = new ConsoleOut { - val lockObject = out - def print(s: String) = out.write(s) - def println(s: String) = { out.write(s); println() } - def println() = { out.newLine(); out.flush() } - } + @deprecated("Moved to ConsoleOut", "0.13.0") + def overwriteContaining(s: String): (String,String) => Boolean = ConsoleOut.overwriteContaining(s) + + @deprecated("Moved to ConsoleOut", "0.13.0") + def systemOutOverwrite(f: (String,String) => Boolean): ConsoleOut = ConsoleOut.systemOutOverwrite(f) + + @deprecated("Moved to ConsoleOut", "0.13.0") + def printStreamOut(out: PrintStream): ConsoleOut = ConsoleOut.printStreamOut(out) + + @deprecated("Moved to ConsoleOut", "0.13.0") + def printWriterOut(out: PrintWriter): ConsoleOut = ConsoleOut.printWriterOut(out) + + @deprecated("Moved to ConsoleOut", "0.13.0") + def bufferedWriterOut(out: BufferedWriter): ConsoleOut = 
ConsoleOut.bufferedWriterOut(out) /** Escape character, used to introduce an escape sequence. */ final val ESC = '\u001B' - /** Move to beginning of previous line and clear the line. */ - private[sbt] final val OverwriteLine = "\r\u001BM\u001B[2K" - /** An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). * It is the final character in an escape sequence. */ def isEscapeTerminator(c: Char): Boolean = @@ -120,9 +92,9 @@ object ConsoleLogger private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase.indexOf("windows") >= 0 - def apply(out: PrintStream): ConsoleLogger = apply(printStreamOut(out)) - def apply(out: PrintWriter): ConsoleLogger = apply(printWriterOut(out)) - def apply(out: ConsoleOut = systemOut, ansiCodesSupported: Boolean = formatEnabled, + def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) + def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) + def apply(out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = formatEnabled, useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = noSuppressedMessage): ConsoleLogger = new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) @@ -200,10 +172,3 @@ class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ans { log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } } final class SuppressedTraceContext(val traceLevel: Int, val useColor: Boolean) -sealed trait ConsoleOut -{ - val lockObject: AnyRef - def print(s: String): Unit - def println(s: String): Unit - def println(): Unit -} \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/ConsoleOut.scala b/util/log/src/main/scala/sbt/ConsoleOut.scala new file mode 100644 index 000000000..916c3727e --- /dev/null +++ b/util/log/src/main/scala/sbt/ConsoleOut.scala @@ -0,0 +1,62 @@ +package sbt + + import 
java.io.{BufferedWriter, PrintStream, PrintWriter} + +sealed trait ConsoleOut +{ + val lockObject: AnyRef + def print(s: String): Unit + def println(s: String): Unit + def println(): Unit +} + +object ConsoleOut +{ + def systemOut: ConsoleOut = printStreamOut(System.out) + + def overwriteContaining(s: String): (String,String) => Boolean = (cur, prev) => + cur.contains(s) && prev.contains(s) + + /** Move to beginning of previous line and clear the line. */ + private[this] final val OverwriteLine = "\r\u001BM\u001B[2K" + + /** ConsoleOut instance that is backed by System.out. It overwrites the previously printed line + * if the function `f(lineToWrite, previousLine)` returns true. + * + * The ConsoleOut returned by this method assumes that the only newlines are from println calls + * and not in the String arguments. */ + def systemOutOverwrite(f: (String,String) => Boolean): ConsoleOut = new ConsoleOut { + val lockObject = System.out + private[this] var last: Option[String] = None + private[this] var current = new java.lang.StringBuffer + def print(s: String): Unit = synchronized { current.append(s) } + def println(s: String): Unit = synchronized { current.append(s); println() } + def println(): Unit = synchronized { + val s = current.toString + if(ConsoleLogger.formatEnabled && last.exists(lmsg => f(s, lmsg))) + System.out.print(OverwriteLine) + System.out.println(s) + last = Some(s) + current = new java.lang.StringBuffer + } + } + + def printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.print(s) + def println(s: String) = out.println(s) + def println() = out.println() + } + def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.print(s) + def println(s: String) = { out.println(s); out.flush() } + def println() = { out.println(); out.flush() } + } + def bufferedWriterOut(out: BufferedWriter): ConsoleOut = new ConsoleOut { + val lockObject 
= out + def print(s: String) = out.write(s) + def println(s: String) = { out.write(s); println() } + def println() = { out.newLine(); out.flush() } + } +} \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/GlobalLogging.scala b/util/log/src/main/scala/sbt/GlobalLogging.scala index f712dd88a..ed7aa9e09 100644 --- a/util/log/src/main/scala/sbt/GlobalLogging.scala +++ b/util/log/src/main/scala/sbt/GlobalLogging.scala @@ -5,11 +5,28 @@ package sbt import java.io.{File, PrintWriter} -final case class GlobalLogging(full: Logger, backed: ConsoleLogger, backing: GlobalLogBacking) +/** Provides the current global logging configuration. +* +* `full` is the current global logger. It should not be set directly because it is generated as needed from `backing.newLogger`. +* `console` is where all logging from all ConsoleLoggers should go. +* `backed` is the ConsoleLogger that other loggers should feed into. +* `backing` tracks the files that persist the global logging. */ +final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: ConsoleLogger, backing: GlobalLogBacking) + +/** Tracks the files that persist the global logging. +* `file` is the current backing file. `last` is the previous backing file, if there is one. +* `newLogger` creates a new global logging configuration from a sink and backing configuration. +* `newBackingFile` creates a new temporary location for the next backing file. */ final case class GlobalLogBacking(file: File, last: Option[File], newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: () => File) { + /** Shifts the current backing file to `last` and sets the current backing to `newFile`. */ def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newLogger, newBackingFile) + + /** Shifts the current backing file to `last` and sets the current backing to a new temporary file generated by `newBackingFile`. 
*/ def shiftNew() = shift(newBackingFile()) + + /** If there is a previous backing file in `last`, that becomes the current backing file and the previous backing is cleared. + * Otherwise, no changes are made. */ def unshift = GlobalLogBacking(last getOrElse file, None, newLogger, newBackingFile) } object GlobalLogBacking @@ -21,10 +38,11 @@ object GlobalLogging { @deprecated("Explicitly specify standard out.", "0.13.0") def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File): GlobalLogging = - initial(newLogger, newBackingFile, ConsoleLogger.systemOut) + initial(newLogger, newBackingFile, ConsoleOut.systemOut) + def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = { val log = ConsoleLogger(console) - GlobalLogging(log, log, GlobalLogBacking(newLogger, newBackingFile)) + GlobalLogging(log, console, log, GlobalLogBacking(newLogger, newBackingFile)) } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/MainLogging.scala b/util/log/src/main/scala/sbt/MainLogging.scala index 9e024ef28..abed543c7 100644 --- a/util/log/src/main/scala/sbt/MainLogging.scala +++ b/util/log/src/main/scala/sbt/MainLogging.scala @@ -17,18 +17,21 @@ object MainLogging backed setTrace backingTrace multi: Logger } + + @deprecated("Explicitly specify the console output.", "0.13.0") def globalDefault(writer: PrintWriter, backing: GlobalLogBacking): GlobalLogging = - globalDefault(writer, backing, ConsoleLogger.systemOut) + globalDefault(writer, backing, ConsoleOut.systemOut) + def globalDefault(writer: PrintWriter, backing: GlobalLogBacking, console: ConsoleOut): GlobalLogging = { val backed = defaultBacked()(writer) val full = multiLogger(defaultMultiConfig(console, backed ) ) - GlobalLogging(full, backed, backing) + GlobalLogging(full, console, backed, backing) } @deprecated("Explicitly specify the console output.", "0.13.0") def 
defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = - defaultMultiConfig(ConsoleLogger.systemOut, backing) + defaultMultiConfig(ConsoleOut.systemOut, backing) def defaultMultiConfig(console: ConsoleOut, backing: AbstractLogger): MultiLoggerConfig = new MultiLoggerConfig(defaultScreen(console, ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) @@ -43,7 +46,7 @@ object MainLogging ConsoleLogger(console, suppressedMessage = suppressedMessage) def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = - to => ConsoleLogger(ConsoleLogger.printWriterOut(to), useColor = useColor) + to => ConsoleLogger(ConsoleOut.printWriterOut(to), useColor = useColor) } final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], From d69db30af71e2520d299a73048b7982c06842a8a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Feb 2013 09:24:04 -0500 Subject: [PATCH 324/823] deprecations --- cache/src/main/scala/sbt/Cache.scala | 6 +++--- .../src/main/scala/sbt/appmacro/ContextUtil.scala | 6 ++++-- .../src/main/scala/sbt/appmacro/Instance.scala | 12 ++++++------ .../src/main/scala/sbt/appmacro/KListBuilder.scala | 6 +++--- .../src/main/scala/sbt/appmacro/TupleNBuilder.scala | 6 +++--- util/collection/src/main/scala/sbt/INode.scala | 4 ++-- util/collection/src/main/scala/sbt/Settings.scala | 2 +- .../src/main/scala/sbt/complete/EditDistance.scala | 2 +- .../src/main/scala/sbt/complete/Parser.scala | 6 +++--- 9 files changed, 26 insertions(+), 24 deletions(-) diff --git a/cache/src/main/scala/sbt/Cache.scala b/cache/src/main/scala/sbt/Cache.scala index e394a8903..725a103a8 100644 --- a/cache/src/main/scala/sbt/Cache.scala +++ b/cache/src/main/scala/sbt/Cache.scala @@ -251,7 +251,7 @@ trait UnionImplicits new UnionCache[H :+: T, UB] { val size = 1 + t.size - def c = mf.erasure + def c = mf.runtimeClass def find(value: UB): Found[_] = 
if(c.isInstance(value)) new Found[head.Internal](head, c, head.convert(value.asInstanceOf[H]), size - 1) else t.find(value) def at(i: Int): (InputCache[_ <: UB], Class[_]) = if(size == i + 1) (head, c) else t.at(i) @@ -259,8 +259,8 @@ trait UnionImplicits implicit def unionNil[UB]: UnionCache[HNil, UB] = new UnionCache[HNil, UB] { def size = 0 - def find(value: UB) = error("No valid sum type for " + value) - def at(i: Int) = error("Invalid union index " + i) + def find(value: UB) = sys.error("No valid sum type for " + value) + def at(i: Int) = sys.error("Invalid union index " + i) } final class Found[I](val cache: InputCache[_] { type Internal = I }, val clazz: Class[_], val value: I, val index: Int) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 41a52003f..b9fe29388 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -29,7 +29,7 @@ object ContextUtil { } } - def unexpectedTree[C <: Context](tree: C#Tree): Nothing = error("Unexpected macro application tree (" + tree.getClass + "): " + tree) + def unexpectedTree[C <: Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) } /** Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). 
@@ -161,9 +161,11 @@ final class ContextUtil[C <: Context](val ctx: C) def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = it.tpe match { case SingleType(_, sym) if !sym.isFreeTerm && sym.isStatic => sym - case x => error("Instance must be static (was " + x + ").") + case x => sys.error("Instance must be static (was " + x + ").") } + def select(t: Tree, name: String): Tree = Select(t, newTermName(name)) + /** Returns the symbol for the non-private method named `name` for the class/module `obj`. */ def method(obj: Symbol, name: String): Symbol = { val global: Global = ctx.universe.asInstanceOf[Global] diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index 49b8bb71a..d86fb2a4c 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -38,7 +38,7 @@ object InputWrapper final val WrapName = "wrap_\u2603\u2603" @compileTimeOnly("`value` can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting.") - def wrap_\u2603\u2603[T](in: Any): T = error("This method is an implementation detail and should not be referenced.") + def wrap_\u2603\u2603[T](in: Any): T = sys.error("This method is an implementation detail and should not be referenced.") def wrapKey[T: c.WeakTypeTag](c: Context)(ts: c.Expr[Any], pos: c.Position): c.Expr[T] = wrapImpl[T,InputWrapper.type](c, InputWrapper, WrapName)(ts, pos) @@ -54,7 +54,7 @@ object InputWrapper val iw = util.singleton(s) val tpe = c.weakTypeOf[T] val nme = newTermName(wrapName).encoded - val sel = Select(Ident(iw), nme) + val sel = util.select(Ident(iw), nme) sel.setPos(pos) // need to set the position on Select, because that is where the compileTimeOnly check looks val tree = ApplyTree(TypeApply(sel, TypeTree(tpe) :: Nil), ts.tree :: Nil) tree.setPos(ts.tree.pos) @@ -147,7 +147,7 @@ object Instance // no 
inputs, so construct M[T] via Instance.pure or pure+flatten def pure(body: Tree): Tree = { - val typeApplied = TypeApply(Select(instance, PureName), TypeTree(treeType) :: Nil) + val typeApplied = TypeApply(util.select(instance, PureName), TypeTree(treeType) :: Nil) val p = ApplyTree(typeApplied, Function(Nil, body) :: Nil) if(t.isLeft) p else flatten(p) } @@ -155,7 +155,7 @@ object Instance // the returned Tree will have type M[T] def flatten(m: Tree): Tree = { - val typedFlatten = TypeApply(Select(instance, FlattenName), TypeTree(tt.tpe) :: Nil) + val typedFlatten = TypeApply(util.select(instance, FlattenName), TypeTree(tt.tpe) :: Nil) ApplyTree(typedFlatten, m :: Nil) } @@ -164,7 +164,7 @@ object Instance { val variable = input.local val param = ValDef(util.parameterModifiers, variable.name, variable.tpt, EmptyTree) - val typeApplied = TypeApply(Select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) + val typeApplied = TypeApply(util.select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) val mapped = ApplyTree(typeApplied, input.expr :: Function(param :: Nil, body) :: Nil) if(t.isLeft) mapped else flatten(mapped) } @@ -177,7 +177,7 @@ object Instance val bindings = result.extract(param) val f = Function(param :: Nil, Block(bindings, body)) val ttt = TypeTree(treeType) - val typedApp = TypeApply(Select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) + val typedApp = TypeApply(util.select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) if(t.isLeft) app else flatten(app) } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index 551566419..195123c6c 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -34,8 +34,8 @@ object 
KListBuilder extends TupleBuilder params match { case ValDef(mods, name, tpt, _) :: xs => - val head = ValDef(mods, name, tpt, Select(Ident(prev.name), "head")) - val tail = localValDef(TypeTree(), Select(Ident(prev.name), "tail")) + val head = ValDef(mods, name, tpt, select(Ident(prev.name), "head")) + val tail = localValDef(TypeTree(), select(Ident(prev.name), "tail")) val base = head :: revBindings bindKList(tail, if(xs.isEmpty) base else tail :: base, xs) case Nil => revBindings.reverse @@ -60,7 +60,7 @@ object KListBuilder extends TupleBuilder val representationC = PolyType(tcVariable :: Nil, klistType) val resultType = appliedType(representationC, idTC :: Nil) val input = klist - val alistInstance = TypeApply(Select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) + val alistInstance = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) } } \ No newline at end of file diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala index c7b9929ab..805098353 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala @@ -34,8 +34,8 @@ object TupleNBuilder extends TupleBuilder val input: Tree = mkTuple(inputs.map(_.expr)) val alistInstance: Tree = { - val select = Select(Ident(alist), TupleMethodName + inputs.size.toString) - TypeApply(select, inputs.map(in => TypeTree(in.tpe))) + val selectTree = select(Ident(alist), TupleMethodName + inputs.size.toString) + TypeApply(selectTree, inputs.map(in => TypeTree(in.tpe))) } def extract(param: ValDef): List[ValDef] = bindTuple(param, Nil, inputs.map(_.local), 1) @@ -43,7 +43,7 @@ object TupleNBuilder extends TupleBuilder params match { case ValDef(mods, name, tpt, _) :: xs => - val x = ValDef(mods, name, tpt, Select(Ident(param.name), "_" + i.toString)) + val x 
= ValDef(mods, name, tpt, select(Ident(param.name), "_" + i.toString)) bindTuple(param, x :: revBindings, xs, i+1) case Nil => revBindings.reverse } diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index e9f64ef6c..6c2e845ba 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -20,7 +20,7 @@ abstract class EvaluateSettings[Scope] private[this] val complete = new LinkedBlockingQueue[Option[Throwable]] private[this] val static = PMap.empty[ScopedKey, INode] - private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse error("Illegal reference to key " + key) + private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse sys.error("Illegal reference to key " + key) private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { def apply[T](i: Initialize[T]): INode[T] = i match { case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) @@ -137,7 +137,7 @@ abstract class EvaluateSettings[Scope] } protected final def setValue(v: T) { assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) - if(v == null) error("Setting value cannot be null: " + keyString) + if(v == null) sys.error("Setting value cannot be null: " + keyString) value = v state = Evaluated blocking foreach { _.unblocked() } diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 3c1433ab1..b76ecfe5c 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -77,7 +77,7 @@ trait Init[Scope] def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { def apply[T](k: ScopedKey[T]): T = getValue(s, k) } - def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse error("Internal settings error: invalid 
reference to " + showFullKey(k)) + def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse sys.error("Internal settings error: invalid reference to " + showFullKey(k)) def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) def mapScope(f: Scope => Scope): MapScoped = new MapScoped { def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) diff --git a/util/complete/src/main/scala/sbt/complete/EditDistance.scala b/util/complete/src/main/scala/sbt/complete/EditDistance.scala index e7e295f2a..5e4cb277f 100644 --- a/util/complete/src/main/scala/sbt/complete/EditDistance.scala +++ b/util/complete/src/main/scala/sbt/complete/EditDistance.scala @@ -18,7 +18,7 @@ object EditDistance { 0 to n foreach (x => d(x)(0) = x) 0 to m foreach (x => d(0)(x) = x) - for (i <- 1 to n ; val s_i = s(i - 1) ; j <- 1 to m) { + for (i <- 1 to n ; s_i = s(i - 1) ; j <- 1 to m) { val t_j = t(j - 1) val cost = if (s_i == t_j) matchCost else if(lower(s_i) == lower(t_j)) caseCost else subCost val tcost = if (s_i == t_j) matchCost else transposeCost diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 9cccacf6f..8e5af37e3 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -127,7 +127,7 @@ object Parser extends ParserMain def checkMatches(a: Parser[_], completions: Seq[String]) { val bad = completions.filter( apply(a)(_).resultEmpty.isFailure) - if(!bad.isEmpty) error("Invalid example completions: " + bad.mkString("'", "', '", "'")) + if(!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) } def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] = (a,b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } @@ -419,7 +419,7 @@ trait ParserMain def stringLiteral(s: String, start: Int): Parser[String] = { val len = 
s.length - if(len == 0) error("String literal cannot be empty") else if(start >= len) success(s) else new StringLiteral(s, start) + if(len == 0) sys.error("String literal cannot be empty") else if(start >= len) success(s) else new StringLiteral(s, start) } } sealed trait ValidParser[T] extends Parser[T] @@ -433,7 +433,7 @@ private final case class Invalid(fail: Failure) extends Parser[Nothing] def failure = Some(fail) def result = None def resultEmpty = fail - def derive(c: Char) = error("Invalid.") + def derive(c: Char) = sys.error("Invalid.") def completions(level: Int) = Completions.nil override def toString = fail.errors.mkString("; ") def valid = false From 1aacd4b86d65ec55f9ce79b6ba7ccd1b487880ef Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Feb 2013 09:24:05 -0500 Subject: [PATCH 325/823] make GlobalLogging.backed less specific: AbstractLogger is fine --- util/log/src/main/scala/sbt/GlobalLogging.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/log/src/main/scala/sbt/GlobalLogging.scala b/util/log/src/main/scala/sbt/GlobalLogging.scala index ed7aa9e09..91b9a040f 100644 --- a/util/log/src/main/scala/sbt/GlobalLogging.scala +++ b/util/log/src/main/scala/sbt/GlobalLogging.scala @@ -9,9 +9,9 @@ package sbt * * `full` is the current global logger. It should not be set directly because it is generated as needed from `backing.newLogger`. * `console` is where all logging from all ConsoleLoggers should go. -* `backed` is the ConsoleLogger that other loggers should feed into. +* `backed` is the Logger that other loggers should feed into. * `backing` tracks the files that persist the global logging. */ -final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: ConsoleLogger, backing: GlobalLogBacking) +final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking) /** Tracks the files that persist the global logging. * `file` is the current backing file. 
`last` is the previous backing file, if there is one. From 3b93691476018738846aa234559e4b409ad693f4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Feb 2013 09:24:05 -0500 Subject: [PATCH 326/823] Move GlobalLogBacking.newLogger to GlobalLogging to make the role of GlobalLogBacking clearer. --- .../src/main/scala/sbt/GlobalLogging.scala | 25 ++++++++----------- util/log/src/main/scala/sbt/MainLogging.scala | 15 ++++++----- 2 files changed, 17 insertions(+), 23 deletions(-) diff --git a/util/log/src/main/scala/sbt/GlobalLogging.scala b/util/log/src/main/scala/sbt/GlobalLogging.scala index 91b9a040f..63eb9805a 100644 --- a/util/log/src/main/scala/sbt/GlobalLogging.scala +++ b/util/log/src/main/scala/sbt/GlobalLogging.scala @@ -10,39 +10,34 @@ package sbt * `full` is the current global logger. It should not be set directly because it is generated as needed from `backing.newLogger`. * `console` is where all logging from all ConsoleLoggers should go. * `backed` is the Logger that other loggers should feed into. -* `backing` tracks the files that persist the global logging. */ -final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking) +* `backing` tracks the files that persist the global logging. +* `newLogger` creates a new global logging configuration from a sink and backing configuration. +*/ +final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking, newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging) /** Tracks the files that persist the global logging. * `file` is the current backing file. `last` is the previous backing file, if there is one. -* `newLogger` creates a new global logging configuration from a sink and backing configuration. * `newBackingFile` creates a new temporary location for the next backing file. 
*/ -final case class GlobalLogBacking(file: File, last: Option[File], newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: () => File) +final case class GlobalLogBacking(file: File, last: Option[File], newBackingFile: () => File) { /** Shifts the current backing file to `last` and sets the current backing to `newFile`. */ - def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newLogger, newBackingFile) + def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newBackingFile) /** Shifts the current backing file to `last` and sets the current backing to a new temporary file generated by `newBackingFile`. */ def shiftNew() = shift(newBackingFile()) /** If there is a previous backing file in `last`, that becomes the current backing file and the previous backing is cleared. * Otherwise, no changes are made. */ - def unshift = GlobalLogBacking(last getOrElse file, None, newLogger, newBackingFile) + def unshift = GlobalLogBacking(last getOrElse file, None, newBackingFile) } -object GlobalLogBacking -{ - def apply(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File): GlobalLogBacking = - GlobalLogBacking(newBackingFile, None, newLogger, newBackingFile _) +object GlobalLogBacking { + def apply(newBackingFile: => File): GlobalLogBacking = GlobalLogBacking(newBackingFile, None, newBackingFile _) } object GlobalLogging { - @deprecated("Explicitly specify standard out.", "0.13.0") - def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File): GlobalLogging = - initial(newLogger, newBackingFile, ConsoleOut.systemOut) - def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = { val log = ConsoleLogger(console) - GlobalLogging(log, console, log, GlobalLogBacking(newLogger, newBackingFile)) + GlobalLogging(log, console, log, GlobalLogBacking(newBackingFile), newLogger) } } \ No newline at 
end of file diff --git a/util/log/src/main/scala/sbt/MainLogging.scala b/util/log/src/main/scala/sbt/MainLogging.scala index abed543c7..5611dbd48 100644 --- a/util/log/src/main/scala/sbt/MainLogging.scala +++ b/util/log/src/main/scala/sbt/MainLogging.scala @@ -18,15 +18,14 @@ object MainLogging multi: Logger } - @deprecated("Explicitly specify the console output.", "0.13.0") - def globalDefault(writer: PrintWriter, backing: GlobalLogBacking): GlobalLogging = - globalDefault(writer, backing, ConsoleOut.systemOut) - - def globalDefault(writer: PrintWriter, backing: GlobalLogBacking, console: ConsoleOut): GlobalLogging = + def globalDefault(console: ConsoleOut): (PrintWriter, GlobalLogBacking) => GlobalLogging = { - val backed = defaultBacked()(writer) - val full = multiLogger(defaultMultiConfig(console, backed ) ) - GlobalLogging(full, console, backed, backing) + lazy val f: (PrintWriter, GlobalLogBacking) => GlobalLogging = (writer, backing) => { + val backed = defaultBacked()(writer) + val full = multiLogger(defaultMultiConfig(console, backed ) ) + GlobalLogging(full, console, backed, backing, f) + } + f } @deprecated("Explicitly specify the console output.", "0.13.0") From ae3690676e47c34cbe8e859fbd0171c44a825f14 Mon Sep 17 00:00:00 2001 From: Alex Dupre Date: Mon, 25 Feb 2013 16:06:45 +0100 Subject: [PATCH 327/823] Switch from JLine 1.0 to 2.10. 
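For reference, the JLine 1.x calls that sbt used map onto the 2.x API roughly as in the following script-style sketch. This is not part of the patch: it assumes jline 2.10 on the classpath (so it is not runnable standalone), and the file name `.example_history` is made up for illustration.

```scala
import jline.console.ConsoleReader            // was jline.ConsoleReader
import jline.console.history.FileHistory      // history is now a first-class object

val terminal = jline.TerminalFactory.get      // was jline.Terminal.getTerminal
terminal.setEchoEnabled(false)                // was terminal.disableEcho()

val reader = new ConsoleReader
reader.setBellEnabled(false)
val history = new FileHistory(new java.io.File(".example_history"))
reader.setHistory(history)                    // was loaded manually via JLine.loadHistory
val line = reader.readLine("> ")
history.flush()                               // FileHistory persists on flush, replacing manual saveHistory
reader.flush()                                // was reader.flushConsole()
```

The notable design change visible in the diffs that follow is that history handling moves from sbt-managed load/save helpers into JLine's own `FileHistory`, so the reader only needs a `flush()` after each line.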
--- .../src/main/scala/sbt/LineReader.scala | 77 +++++++------------ .../scala/sbt/complete/JLineCompletion.scala | 37 ++++----- .../src/main/scala/sbt/ConsoleLogger.scala | 6 +- 3 files changed, 49 insertions(+), 71 deletions(-) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index b30d657ec..896ab7a8b 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -3,7 +3,8 @@ */ package sbt - import jline.{ConsoleReader, History} + import jline.console.ConsoleReader + import jline.console.history.{FileHistory, MemoryHistory} import java.io.{File, InputStream, PrintWriter} import complete.Parser import java.util.concurrent.atomic.AtomicBoolean @@ -12,9 +13,6 @@ abstract class JLine extends LineReader { protected[this] val handleCONT: Boolean protected[this] val reader: ConsoleReader - /** Is the input stream at EOF? Compensates for absent EOF detection in JLine's UnsupportedTerminal. 
*/ - protected[this] val inputEof = new AtomicBoolean(false) - protected[this] val historyPath: Option[File] def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } @@ -26,14 +24,12 @@ abstract class JLine extends LineReader } private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): String = - historyPath match + reader.getHistory match { - case None => readLineDirect(prompt, mask) - case Some(file) => - val h = reader.getHistory - JLine.loadHistory(h, file) + case fh: FileHistory => try { readLineDirect(prompt, mask) } - finally { JLine.saveHistory(h, file) } + finally { fh.flush() } + case _ => readLineDirect(prompt, mask) } private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = @@ -44,34 +40,33 @@ abstract class JLine extends LineReader private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = { val newprompt = handleMultilinePrompt(prompt) - val line = mask match { + mask match { case Some(m) => reader.readLine(newprompt, m) case None => reader.readLine(newprompt) } - if (inputEof.get) null else line } private[this] def handleMultilinePrompt(prompt: String): String = { val lines = """\r?\n""".r.split(prompt) lines.size match { case 0 | 1 => prompt - case _ => reader.printString(lines.init.mkString("\n") + "\n"); lines.last; + case _ => reader.print(lines.init.mkString("\n") + "\n"); lines.last; } } private[this] def resume() { - jline.Terminal.resetTerminal - JLine.terminal.disableEcho() + jline.TerminalFactory.reset + JLine.terminal.setEchoEnabled(false) reader.drawLine() - reader.flushConsole() + reader.flush() } } private object JLine { // When calling this, ensure that enableEcho has been or will be called. - // getTerminal will initialize the terminal to disable echo. - private def terminal = jline.Terminal.getTerminal + // TerminalFactory.get will initialize the terminal to disable echo. 
+ private def terminal = jline.TerminalFactory.get private def withTerminal[T](f: jline.Terminal => T): T = synchronized { @@ -82,33 +77,26 @@ private object JLine * This ensures synchronized access as well as re-enabling echo after getting the Terminal. */ def usingTerminal[T](f: jline.Terminal => T): T = withTerminal { t => - t.enableEcho() + t.setEchoEnabled(true) f(t) } - def createReader() = + def createReader(historyPath: Option[File]) = usingTerminal { t => val cr = new ConsoleReader cr.setBellEnabled(false) + val h = historyPath match { + case None => new MemoryHistory + case Some(file) => new FileHistory(file) + } + h.setMaxSize(MaxHistorySize) + cr.setHistory(h) cr } def withJLine[T](action: => T): T = withTerminal { t => - t.disableEcho() + t.setEchoEnabled(false) try { action } - finally { t.enableEcho() } - } - private[sbt] def loadHistory(h: History, file: File) - { - h.setMaxSize(MaxHistorySize) - if(file.isFile) IO.reader(file)( h.load ) - } - private[sbt] def saveHistory(h: History, file: File): Unit = - Using.fileWriter()(file) { writer => - val out = new PrintWriter(writer, false) - h.setOutput(out) - h.flushBuffer() - out.close() - h.setOutput(null) + finally { t.setEchoEnabled(true) } } def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) @@ -120,30 +108,19 @@ trait LineReader { def readLine(prompt: String, mask: Option[Char] = None): Option[String] } -final class FullReader(val historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT) extends JLine +final class FullReader(historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT) extends JLine { protected[this] val reader = { - val cr = new ConsoleReader - if (!cr.getTerminal.isSupported) { - val input = cr.getInput - cr.setInput(new InputStream { - def read(): Int = { - val c = input.read() - if (c == -1) inputEof.set(true) - c - } - }) - } - 
cr.setBellEnabled(false) + val cr = JLine.createReader(historyPath) sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } -class SimpleReader private[sbt] (val historyPath: Option[File], val handleCONT: Boolean) extends JLine +class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean) extends JLine { - protected[this] val reader = JLine.createReader() + protected[this] val reader = JLine.createReader(historyPath) } object SimpleReader extends SimpleReader(None, JLine.HandleCONT) diff --git a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala index d41f47b85..1aae8e826 100644 --- a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala +++ b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala @@ -3,7 +3,8 @@ */ package sbt.complete - import jline.{CandidateListCompletionHandler,Completor,CompletionHandler,ConsoleReader} + import jline.console.ConsoleReader + import jline.console.completer.{CandidateListCompletionHandler,Completer,CompletionHandler} import scala.annotation.tailrec import collection.JavaConversions @@ -15,8 +16,8 @@ object JLineCompletion installCustomCompletor(customCompletor(complete), reader) def installCustomCompletor(complete: (ConsoleReader, Int) => Boolean, reader: ConsoleReader): Unit = { - reader.removeCompletor(DummyCompletor) - reader.addCompletor(DummyCompletor) + reader.removeCompleter(DummyCompletor) + reader.addCompleter(DummyCompletor) reader.setCompletionHandler(new CustomHandler(complete)) } @@ -24,13 +25,13 @@ object JLineCompletion { private[this] var previous: Option[(String,Int)] = None private[this] var level: Int = 1 - override def complete(reader: ConsoleReader, candidates: java.util.List[_], position: Int) = { + override def complete(reader: ConsoleReader, candidates: java.util.List[CharSequence], position: Int) = { val current = Some(bufferSnapshot(reader)) level = if(current == 
previous) level + 1 else 1 previous = current try completeImpl(reader, level) catch { case e: Exception => - reader.printString("\nException occurred while determining completions.") + reader.print("\nException occurred while determining completions.") e.printStackTrace() false } @@ -40,9 +41,9 @@ object JLineCompletion // always provides dummy completions so that the custom completion handler gets called // (ConsoleReader doesn't call the handler if there aren't any completions) // the custom handler will then throw away the candidates and call the custom function - private[this] final object DummyCompletor extends Completor + private[this] final object DummyCompletor extends Completer { - override def complete(buffer: String, cursor: Int, candidates: java.util.List[_]): Int = + override def complete(buffer: String, cursor: Int, candidates: java.util.List[CharSequence]): Int = { candidates.asInstanceOf[java.util.List[String]] add "dummy" 0 @@ -73,19 +74,19 @@ object JLineCompletion def customCompletor(f: (String, Int) => (Seq[String], Seq[String])): (ConsoleReader, Int) => Boolean = (reader, level) => { val success = complete(beforeCursor(reader), reader => f(reader, level), reader) - reader.flushConsole() + reader.flush() success } def bufferSnapshot(reader: ConsoleReader): (String, Int) = { val b = reader.getCursorBuffer - (b.getBuffer.toString, b.cursor) + (b.buffer.toString, b.cursor) } def beforeCursor(reader: ConsoleReader): String = { val b = reader.getCursorBuffer - b.getBuffer.substring(0, b.cursor) + b.buffer.substring(0, b.cursor) } // returns false if there was nothing to insert and nothing to display @@ -120,16 +121,16 @@ object JLineCompletion def printCompletions(cs: Seq[String], reader: ConsoleReader) { val print = shouldPrint(cs, reader) - reader.printNewline() + reader.println() if(print) printLinesAndColumns(cs, reader) } def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader) { val (lines, columns) = cs partition hasNewline for(line <- 
lines) { - reader.printString(line) + reader.print(line) if(line.charAt(line.length - 1) != '\n') - reader.printNewline() + reader.println() } reader.printColumns(JavaConversions.seqAsJavaList(columns.map(_.trim))) } @@ -137,15 +138,15 @@ object JLineCompletion def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = { val size = cs.size - (size <= reader.getAutoprintThreshhold) || + (size <= reader.getAutoprintThreshold) || confirm("Display all %d possibilities? (y or n) ".format(size), 'y', 'n', reader) } def confirm(prompt: String, trueC: Char, falseC: Char, reader: ConsoleReader): Boolean = { - reader.printNewline() - reader.printString(prompt) - reader.flushConsole() - reader.readCharacter( Array(trueC, falseC) ) == trueC + reader.println() + reader.print(prompt) + reader.flush() + reader.readCharacter(trueC, falseC) == trueC } def commonPrefix(s: Seq[String]): String = if(s.isEmpty) "" else s reduceLeft commonPrefix diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index 124c61e90..0bc73c2fe 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -81,9 +81,9 @@ object ConsoleLogger private[this] def ansiSupported = try { - val terminal = jline.Terminal.getTerminal - terminal.enableEcho() // #460 - terminal.isANSISupported + val terminal = jline.TerminalFactory.get + terminal.setEchoEnabled(true) // #460 + terminal.isAnsiSupported } catch { case e: Exception => !isWindows } From cb9266d05aa77c331cc68a4e09bf4f6093a73dba Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 26 Feb 2013 09:04:21 -0500 Subject: [PATCH 328/823] add Jline.createReader() back for source compatibility --- util/complete/src/main/scala/sbt/LineReader.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 896ab7a8b..6d81a2391 100644 
--- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -80,7 +80,8 @@ private object JLine t.setEchoEnabled(true) f(t) } - def createReader(historyPath: Option[File]) = + def createReader(): ConsoleReader = createReader(None) + def createReader(historyPath: Option[File]): ConsoleReader = usingTerminal { t => val cr = new ConsoleReader cr.setBellEnabled(false) From 829d6b751307fb880e97a062fb4df4d034a5468f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 26 Feb 2013 09:09:16 -0500 Subject: [PATCH 329/823] changes needed for tests for jline 2.10 --- util/complete/src/test/scala/ParserTest.scala | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index a7d276a38..fd42ecf90 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -18,9 +18,10 @@ object JLineTest val parsers = Map("1" -> one, "2" -> two, "3" -> three, "4" -> four, "5" -> five) def main(args: Array[String]) { - import jline.{ConsoleReader,Terminal} + import jline.TerminalFactory + import jline.console.ConsoleReader val reader = new ConsoleReader() - Terminal.getTerminal.disableEcho() + TerminalFactory.get.setEchoEnabled(false) val parser = parsers(args(0)) JLineCompletion.installCustomCompletor(reader, parser) From 283ebc0dcbb6893a64886f26259a70622bd6ec01 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 28 Feb 2013 17:59:38 -0500 Subject: [PATCH 330/823] Export approximate command lines executed for 'doc', 'compile', and 'console' --- interface/src/main/java/xsbti/compile/CachedCompiler.java | 2 ++ 1 file changed, 2 insertions(+) diff --git a/interface/src/main/java/xsbti/compile/CachedCompiler.java b/interface/src/main/java/xsbti/compile/CachedCompiler.java index 97a1a33b5..0722a68b9 100644 --- a/interface/src/main/java/xsbti/compile/CachedCompiler.java +++ 
b/interface/src/main/java/xsbti/compile/CachedCompiler.java @@ -7,5 +7,7 @@ import java.io.File; public interface CachedCompiler { + /** Returns an array of arguments representing the nearest command line equivalent of a call to run but without the command name itself.*/ + public String[] commandArguments(File[] sources); public void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate, CompileProgress progress); } From 4abc8f3d7b8ccbeb8901af44ec86d5ae716b5bad Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 28 Feb 2013 17:59:38 -0500 Subject: [PATCH 331/823] make classpaths exported --- util/collection/src/main/scala/sbt/Settings.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index b76ecfe5c..08f347ccd 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -267,6 +267,7 @@ trait Init[Scope] def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) def withPos(pos: SourcePosition) = new Setting(key, init, pos) + private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = new Setting(key, f(init), pos) override def toString = "setting(" + key + ") at " + pos } From b951c2c2cc637532e725def87b4c92c6645e2024 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 8 Mar 2013 14:23:30 -0500 Subject: [PATCH 332/823] Construct input tasks in multiple steps to allow input task reuse. Fixes #407. 
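The patch below replaces the single `isWrapper: Tree => Boolean` predicate with a `Convert` that returns a small result ADT (`Converted`), so a converter can succeed, fail with a position and message, or decline and let the default traversal continue. The following is a simplified, dependency-free sketch of that pattern, not the macro code itself: `String` stands in for the compiler's `Tree` type, and the wrapper name `"taskValue"` is invented for illustration.

```scala
// Simplified, non-macro analogue of the Converted ADT introduced below.
sealed trait Converted {
  def isSuccess: Boolean
  def transform(f: String => String): Converted
}
final case class Success(tree: String, finalTransform: String => String = identity) extends Converted {
  def isSuccess = true
  def transform(f: String => String) = Success(f(tree), finalTransform)
}
final case class Failure(message: String) extends Converted {
  def isSuccess = false
  def transform(f: String => String) = this  // nothing to transform
}
case object NotApplicable extends Converted {
  def isSuccess = false
  def transform(f: String => String) = this  // decline: caller falls through to default handling
}

// A converter recognizes some wrapper names and declines the rest; NotApplicable
// lets the tree transformer call super.transform instead of aborting.
def convert(name: String, tree: String): Converted =
  if (name == "taskValue") Success(s"toTask($tree)") else NotApplicable
```

The key point is the three-way result: with a plain boolean predicate the transformer could only distinguish "wrapper" from "not a wrapper", whereas `Converted` also carries the converted tree (plus a final transformation) or a positioned error.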
--- .../main/scala/sbt/appmacro/ContextUtil.scala | 31 ++++---- .../src/main/scala/sbt/appmacro/Convert.scala | 38 ++++++++++ .../main/scala/sbt/appmacro/Instance.scala | 71 ++++++------------- 3 files changed, 74 insertions(+), 66 deletions(-) create mode 100644 util/appmacro/src/main/scala/sbt/appmacro/Convert.scala diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index b9fe29388..48dd32466 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -20,11 +20,11 @@ object ContextUtil { * * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this * method, the result of this method is `f()`. */ - def selectMacroImpl[T: c.WeakTypeTag, S: c.WeakTypeTag](c: Context)(f: (c.Expr[S], c.Position) => c.Expr[T]): c.Expr[T] = + def selectMacroImpl[T: c.WeakTypeTag](c: Context)(f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = { import c.universe._ c.macroApplication match { - case s @ Select(Apply(_, t :: Nil), tp) => f( c.Expr[S](t), s.pos ) + case s @ Select(Apply(_, t :: Nil), tp) => f( c.Expr[Any](t), s.pos ) case x => unexpectedTree(x) } } @@ -61,24 +61,18 @@ final class ContextUtil[C <: Context](val ctx: C) vd } - /* Tests whether a Tree is a Select on `methodName`. */ - def isWrapper(methodName: String): Tree => Boolean = { - case Select(_, nme) => nme.decoded == methodName - case _ => false - } - lazy val parameterModifiers = Modifiers(Flag.PARAM) /** Collects all definitions in the tree for use in checkReferences. * This excludes definitions in wrapped expressions because checkReferences won't allow nested dereferencing anyway. 
*/ - def collectDefs(tree: Tree, isWrapper: Tree => Boolean): collection.Set[Symbol] = + def collectDefs(tree: Tree, isWrapper: (String, Type, Tree) => Boolean): collection.Set[Symbol] = { val defs = new collection.mutable.HashSet[Symbol] // adds the symbols for all non-Ident subtrees to `defs`. val process = new Traverser { override def traverse(t: Tree) = t match { case _: Ident => () - case ApplyTree(TypeApply(fun, tpe :: Nil), qual :: Nil) if isWrapper(fun) => () + case ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) if isWrapper(nme.decoded, tpe.tpe, qual) => () case tree => if(tree.symbol ne null) defs += tree.symbol; super.traverse(tree) @@ -95,8 +89,9 @@ final class ContextUtil[C <: Context](val ctx: C) /** A function that checks the provided tree for illegal references to M instances defined in the * expression passed to the macro and for illegal dereferencing of M instances. */ - def checkReferences(defs: collection.Set[Symbol], isWrapper: Tree => Boolean): Tree => Unit = { - case s @ ApplyTree(TypeApply(fun, tpe :: Nil), qual :: Nil) => if(isWrapper(fun)) ctx.error(s.pos, DynamicDependencyError) + def checkReferences(defs: collection.Set[Symbol], isWrapper: (String, Type, Tree) => Boolean): Tree => Unit = { + case s @ ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) => + if(isWrapper(nme.decoded, tpe.tpe, qual)) ctx.error(s.pos, DynamicDependencyError) case id @ Ident(name) if illegalReference(defs, id.symbol) => ctx.error(id.pos, DynamicReferenceError + ": " + name) case _ => () } @@ -189,10 +184,10 @@ final class ContextUtil[C <: Context](val ctx: C) } /** Substitutes wrappers in tree `t` with the result of `subWrapper`. - * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(<f>) returns true. + * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(<name of f>, <type of T>, <tree of v>.target) returns true.
* The wrapper is replaced with the result of `subWrapper(<type of T>, <tree of v>)` */ - def transformWrappers(t: Tree, isWrapper: Tree => Boolean, subWrapper: (Type, Tree) => Tree): Tree = + def transformWrappers(t: Tree, subWrapper: (String, Type, Tree) => Converted[ctx.type]): Tree = { // the main tree transformer that replaces calls to InputWrapper.wrap(x) with // plain Idents that reference the actual input value @@ -201,9 +196,11 @@ override def transform(tree: Tree): Tree = tree match { - case ApplyTree(TypeApply(fun, targ :: Nil), qual :: Nil) if isWrapper(fun) => - assert(qual.tpe != null, "Internal error: null type for wrapped tree with " + qual.getClass + "\n\t" + qual + "\n in " + t) - subWrapper(targ.tpe, qual) + case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => subWrapper(nme.decoded, targ.tpe, qual) match { + case Converted.Success(t, finalTx) => finalTx(t) + case Converted.Failure(p,m) => ctx.abort(p, m) + case _: Converted.NotApplicable[_] => super.transform(tree) + } case _ => super.transform(tree) } } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala b/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala new file mode 100644 index 000000000..6dedf776b --- /dev/null +++ b/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala @@ -0,0 +1,38 @@ +package sbt +package appmacro + + import scala.reflect._ + import macros._ + import Types.idFun + +abstract class Convert +{ + def apply[T: c.WeakTypeTag](c: Context)(nme: String, in: c.Tree): Converted[c.type] + def asPredicate(c: Context): (String, c.Type, c.Tree) => Boolean = + (n,tpe,tree) => { + val tag = c.WeakTypeTag(tpe) + apply(c)(n,tree)(tag).isSuccess + } +} +sealed trait Converted[C <: Context with Singleton] { + def isSuccess: Boolean + def transform(f: C#Tree => C#Tree): Converted[C] +} +object Converted { + def NotApplicable[C <: Context with Singleton] = new NotApplicable[C] + final case class Failure[C <: Context with
Singleton](position: C#Position, message: String) extends Converted[C] { + def isSuccess = false + def transform(f: C#Tree => C#Tree): Converted[C] = new Failure(position, message) + } + final class NotApplicable[C <: Context with Singleton] extends Converted[C] { + def isSuccess = false + def transform(f: C#Tree => C#Tree): Converted[C] = this + } + final case class Success[C <: Context with Singleton](tree: C#Tree, finalTransform: C#Tree => C#Tree) extends Converted[C] { + def isSuccess = true + def transform(f: C#Tree => C#Tree): Converted[C] = Success(f(tree), finalTransform) + } + object Success { + def apply[C <: Context with Singleton](tree: C#Tree): Success[C] = Success(tree, idFun) + } +} \ No newline at end of file diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index d86fb2a4c..3e8b45cf0 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -16,10 +16,7 @@ trait Instance def map[S,T](in: M[S], f: S => T): M[T] def pure[T](t: () => T): M[T] } -trait Convert -{ - def apply[T: c.WeakTypeTag](c: scala.reflect.macros.Context)(in: c.Tree): c.Tree -} + trait MonadInstance extends Instance { def flatten[T](in: M[M[T]]): M[T] @@ -29,39 +26,6 @@ trait MonadInstance extends Instance import macros._ import reflect.internal.annotations.compileTimeOnly -// This needs to be moved to main/settings -object InputWrapper -{ - /** The name of the wrapper method should be obscure. - * Wrapper checking is based solely on this name, so it must not conflict with a user method name. 
- * The user should never see this method because it is compile-time only and only used internally by the task macro system.*/ - final val WrapName = "wrap_\u2603\u2603" - - @compileTimeOnly("`value` can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting.") - def wrap_\u2603\u2603[T](in: Any): T = sys.error("This method is an implementation detail and should not be referenced.") - - def wrapKey[T: c.WeakTypeTag](c: Context)(ts: c.Expr[Any], pos: c.Position): c.Expr[T] = wrapImpl[T,InputWrapper.type](c, InputWrapper, WrapName)(ts, pos) - - /** Wraps an arbitrary Tree in a call to the `.` method of this module for later processing by an enclosing macro. - * The resulting Tree is the manually constructed version of: - * - * `c.universe.reify { .[T](ts.splice) }` - */ - def wrapImpl[T: c.WeakTypeTag, S <: AnyRef with Singleton](c: Context, s: S, wrapName: String)(ts: c.Expr[Any], pos: c.Position)(implicit it: c.TypeTag[s.type]): c.Expr[T] = - { - import c.universe.{Apply=>ApplyTree,_} - val util = new ContextUtil[c.type](c) - val iw = util.singleton(s) - val tpe = c.weakTypeOf[T] - val nme = newTermName(wrapName).encoded - val sel = util.select(Ident(iw), nme) - sel.setPos(pos) // need to set the position on Select, because that is where the compileTimeOnly check looks - val tree = ApplyTree(TypeApply(sel, TypeTree(tpe) :: Nil), ts.tree :: Nil) - tree.setPos(ts.tree.pos) - c.Expr[T](tree) - } -} - object Instance { final val ApplyName = "app" @@ -71,11 +35,17 @@ object Instance final val InstanceTCName = "M" final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef) + trait Transform[C <: Context with Singleton, N[_]] { + def apply(in: C#Tree): C#Tree + } + def idTransform[C <: Context with Singleton]: Transform[C,Id] = new Transform[C,Id] { + def apply(in: C#Tree): C#Tree = in + } /** Implementation of a macro that provides a direct syntax for applicative functors and monads. 
* It is intended to be used in conjunction with another macro that conditions the inputs. * - * This method processes the Tree `t` to find inputs of the form `InputWrapper.wrap[T]( input )` + * This method processes the Tree `t` to find inputs of the form `wrap[T]( input )` * This form is typically constructed by another macro that pretends to be able to get a value of type `T` * from a value convertible to `M[T]`. This `wrap(input)` form has two main purposes. * First, it identifies the inputs that should be transformed. @@ -85,7 +55,7 @@ object Instance * allowing the original `Tree` and `Type` to be hidden behind the raw `T` type. This method will remove the call to `wrap` * so that it is not actually called at runtime. * - * Each `input` in each expression of the form `InputWrapper.wrap[T]( input )` is transformed by `convert`. + * Each `input` in each expression of the form `wrap[T]( input )` is transformed by `convert`. * This transformation converts the input Tree to a Tree of type `M[T]`. * The original wrapped expression `wrap(input)` is replaced by a reference to a new local `val $x: T`, where `$x` is a fresh name. * These converted inputs are passed to `builder` as well as the list of these synthetic `ValDef`s. @@ -107,18 +77,18 @@ object Instance * If this is for multi-input flatMap (app followed by flatMap), * this should be the argument wrapped in Right. 
*/ - def contImpl[T](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]])( - implicit tt: c.WeakTypeTag[T], it: c.TypeTag[i.type]): c.Expr[i.M[T]] = + def contImpl[T,N[_]](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type,N])( + implicit tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type]): c.Expr[i.M[N[T]]] = { import c.universe.{Apply=>ApplyTree,_} val util = ContextUtil[c.type](c) val mTC: Type = util.extractTC(i, InstanceTCName) - val mttpe: Type = appliedType(mTC, tt.tpe :: Nil).normalize + val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).normalize // the tree for the macro argument val (tree, treeType) = t match { - case Left(l) => (l.tree, tt.tpe.normalize) + case Left(l) => (l.tree, nt.tpe.normalize) case Right(r) => (r.tree, mttpe) } @@ -126,7 +96,7 @@ object Instance // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... 
val instance = Ident(instanceSym) - val isWrapper: Tree => Boolean = util.isWrapper(InputWrapper.WrapName) + val isWrapper: (String, Type, Tree) => Boolean = convert.asPredicate(c) type In = Input[c.universe.type] var inputs = List[In]() @@ -194,16 +164,19 @@ object Instance inputs ::= new Input(tpe, qual, vd) util.refVal(vd) } - def sub(tpe: Type, qual: Tree): Tree = + def sub(name: String, tpe: Type, qual: Tree): Converted[c.type] = { - val tag = c.WeakTypeTag(tpe) - addType(tpe, convert(c)(qual)(tag) ) + val tag = c.WeakTypeTag[T](tpe) + convert[T](c)(name, qual)(tag) transform { tree => + addType(tpe, tree) + } } // applies the transformation + val tx = util.transformWrappers(tree, (n,tpe,t) => sub(n,tpe,t)) // resetting attributes must be: a) local b) done here and not wider or else there are obscure errors - val tr = makeApp( c.resetLocalAttrs( util.transformWrappers(tree, isWrapper, (tpe, tr) => sub(tpe, tr)) ) ) - c.Expr[i.M[T]](tr) + val tr = makeApp( c.resetLocalAttrs( inner(tx) ) ) + c.Expr[i.M[N[T]]](tr) } import Types._ From 033fd23314435cc24a09502efd89759f621a43f7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 13 Mar 2013 12:40:03 -0400 Subject: [PATCH 333/823] Logger.Null that discards logged messages --- util/log/src/main/scala/sbt/Logger.scala | 15 +++++++++++++++ 1 file changed, 15 insertions(+) diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index 7715b80db..29d965e76 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -40,6 +40,21 @@ abstract class AbstractLogger extends Logger object Logger { + // make public in 0.13 + private[sbt] val Null: AbstractLogger = new AbstractLogger { + def getLevel: Level.Value = Level.Error + def setLevel(newLevel: Level.Value) {} + def getTrace = 0 + def setTrace(flag: Int) {} + def successEnabled = false + def setSuccessEnabled(flag: Boolean) {} + def control(event: ControlEvent.Value, message: => String) 
{} + def logAll(events: Seq[LogEvent]) {} + def trace(t: => Throwable) {} + def success(message: => String) {} + def log(level: Level.Value, message: => String) {} + } + implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) implicit def xlog2Log(lg: xLogger): Logger = new Logger { From 9d21724129bb267b598bd0ae91322df6bcddaf9a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 27 Mar 2013 09:17:53 -0400 Subject: [PATCH 334/823] API for evaluating a setting or task in multiple scopes --- util/collection/src/main/scala/sbt/INode.scala | 4 +++- util/collection/src/main/scala/sbt/Settings.scala | 1 + 2 files changed, 4 insertions(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index 6c2e845ba..4ce5ef8bb 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -20,12 +20,14 @@ abstract class EvaluateSettings[Scope] private[this] val complete = new LinkedBlockingQueue[Option[Throwable]] private[this] val static = PMap.empty[ScopedKey, INode] + private[this] val allScopes: Set[Scope] = compiledSettings.map(_.key.scope).toSet private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse sys.error("Illegal reference to key " + key) private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { def apply[T](i: Initialize[T]): INode[T] = i match { case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) case a: Apply[k,T] => new MixedNode[k,T]( a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) case b: Bind[s,T] => new BindNode[s,T]( transform(b.in), x => transform(b.f(x))) + case init.StaticScopes => constant(() => allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] case v: Value[T] => 
constant(v.value) case t: TransformCapture => constant(() => t.f) case o: Optional[s,T] => o.a match { @@ -33,7 +35,7 @@ abstract class EvaluateSettings[Scope] case Some(i) => single[s,T](transform(i), x => o.f(Some(x))) } }} - private[this] val roots: Seq[INode[_]] = compiledSettings flatMap { cs => + private[this] lazy val roots: Seq[INode[_]] = compiledSettings flatMap { cs => (cs.settings map { s => val t = transform(s.init) static(s.key) = t diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 08f347ccd..cf12a58fc 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -353,6 +353,7 @@ trait Init[Scope] def mapConstant(g: MapConstant) = this def evaluate(map: Settings[Scope]): T = value() } + private[sbt] final val StaticScopes: Initialize[Set[Scope]] = new Value(() => error("internal sbt error: StaticScopes not substituted")) private[sbt] final class Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) extends Initialize[T] { def dependencies = deps(alist.toList(inputs)) From b0bd2e838e27e652c4711dbf5652b1f1e839bbd3 Mon Sep 17 00:00:00 2001 From: cheeseng Date: Thu, 4 Apr 2013 10:39:29 +0800 Subject: [PATCH 335/823] Normalize line endings. 
--- .../src/main/scala/sbt/Positions.scala | 40 +++++++++---------- 1 file changed, 20 insertions(+), 20 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Positions.scala b/util/collection/src/main/scala/sbt/Positions.scala index b2aa22ee2..f52c583b0 100755 --- a/util/collection/src/main/scala/sbt/Positions.scala +++ b/util/collection/src/main/scala/sbt/Positions.scala @@ -1,20 +1,20 @@ -package sbt - -sealed trait SourcePosition - -sealed trait FilePosition extends SourcePosition { - def path: String - def startLine: Int -} - -case object NoPosition extends SourcePosition - -final case class LinePosition(path: String, startLine: Int) extends FilePosition - -final case class LineRange(start: Int, end: Int) { - def shift(n: Int) = new LineRange(start + n, end + n) -} - -final case class RangePosition(path: String, range: LineRange) extends FilePosition { - def startLine = range.start -} +package sbt + +sealed trait SourcePosition + +sealed trait FilePosition extends SourcePosition { + def path: String + def startLine: Int +} + +case object NoPosition extends SourcePosition + +final case class LinePosition(path: String, startLine: Int) extends FilePosition + +final case class LineRange(start: Int, end: Int) { + def shift(n: Int) = new LineRange(start + n, end + n) +} + +final case class RangePosition(path: String, range: LineRange) extends FilePosition { + def startLine = range.start +} From 5f53b895096680ec637d25ee1f10fe2fb82c1eb4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 26 Apr 2013 22:35:27 -0400 Subject: [PATCH 336/823] Record and persist public inheritance dependencies. Includes placeholders for adding public inherited dependencies for Java classes. 
--- interface/src/main/java/xsbti/AnalysisCallback.java | 12 ++++++++---- 1 file changed, 8 insertions(+), 4 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index c23b43ecd..d00f5b7ed 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -12,11 +12,15 @@ public interface AnalysisCallback /** Called to indicate that the source file source depends on the source file * dependsOn. Note that only source files included in the current compilation will * be passed to this method. Dependencies on classes generated by sources not in the current compilation will - * be passed as class dependencies to the classDependency method.*/ - public void sourceDependency(File dependsOn, File source); + * be passed as class dependencies to the classDependency method. + * If publicInherited is true, this dependency is a result of inheritance by a + * template accessible outside of the source file. 
*/ + public void binaryDependency(File binary, String name, File source, boolean publicInherited); /** Called to indicate that the source file source produces a class file at * module containing class name.*/ public void generatedClass(File source, File module, String name); From bbd01021b2b7f590a26778dff649ad7980c60e38 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 27 Apr 2013 16:27:29 -0400 Subject: [PATCH 337/823] fix compilation error in TestCallback --- interface/src/test/scala/TestCallback.scala | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 096d73a83..061457723 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -7,15 +7,15 @@ class TestCallback extends AnalysisCallback { val beganSources = new ArrayBuffer[File] val endedSources = new ArrayBuffer[File] - val sourceDependencies = new ArrayBuffer[(File, File)] - val binaryDependencies = new ArrayBuffer[(File, String, File)] + val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] + val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] + val products = new ArrayBuffer[(File, File, String)] val apis = new ArrayBuffer[(File, xsbti.api.SourceAPI)] def beginSource(source: File) { beganSources += source } - def sourceDependency(dependsOn: File, source: File) { sourceDependencies += ((dependsOn, source)) } - def binaryDependency(binary: File, name: String, source: File) { binaryDependencies += ((binary, name, source)) } + def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { sourceDependencies += ((dependsOn, source, inherited)) } + def binaryDependency(binary: File, name: String, source: File, inherited: Boolean) { binaryDependencies += ((binary, name, source, inherited)) } def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } def 
endSource(source: File) { endedSources += source } From 1c741a2e069444080183ca32cedef689df996567 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:58 -0400 Subject: [PATCH 338/823] Derived settings, which allows injecting settings wherever their dependencies are defined. This is an advanced feature initially intended for internal sbt use. --- .../src/main/scala/sbt/Settings.scala | 71 ++++++++++++++++--- 1 file changed, 60 insertions(+), 11 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index cf12a58fc..4f9bf5206 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -66,13 +66,22 @@ trait Init[Scope] def value[T](value: => T): Initialize[T] = pure(value _) def pure[T](value: () => T): Initialize[T] = new Value(value) def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = new Setting[T](key, map(key)(f), NoPosition) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = setting[T](key, map(key)(f), NoPosition) def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) def map[S,T](in: Initialize[S])(f: S => T): Initialize[T] = new Apply[ ({ type l[L[x]] = L[S] })#l, T](f, in, AList.single[S]) def app[K[L[x]], T](inputs: K[Initialize])(f: K[Id] => T)(implicit alist: AList[K]): Initialize[T] = new Apply[K, T](f, inputs, alist) def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) + def derive[T](s: Setting[T]): Setting[T] = { + deriveAllowed(s) foreach error + new DerivedSetting[T](s.key, s.init, s.pos) + } + def deriveAllowed[T](s: Setting[T]): Option[String] = s.init match { + case _: Bind[_,_] => Some("Cannot derive from dynamic dependencies.") + case _ => None 
+ } + def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates) def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { def apply[T](k: ScopedKey[T]): T = getValue(s, k) @@ -83,10 +92,37 @@ trait Init[Scope] def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) } + private[this] def plain[T](s: Setting[T]): Setting[T] = if(s.isDerived) new Setting(s.key, s.init, s.pos) else s + private[this] def derive(init: Seq[Setting[_]]): Seq[Setting[_]] = + { + import collection.mutable + val (derived, defs) = init.partition(_.isDerived) + val derivs = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Setting[_]]] + for(s <- derived; d <- s.dependencies) + derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += plain(s) + + val deriveFor = (sk: ScopedKey[_]) => + derivs.get(sk.key).toList.flatMap(_.toList).map(_.setScope(sk.scope)) + + val processed = new mutable.HashSet[ScopedKey[_]] + val out = new mutable.ListBuffer[Setting[_]] + def process(rem: List[Setting[_]]): Unit = rem match { + case s :: ss => + val sk = s.key + val ds = if(processed.add(sk)) deriveFor(sk) else Nil + out ++= ds + process(ds ::: ss) + case Nil => + } + process(defs.toList) + out.toList ++ defs + } + def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = { - // prepend per-scope settings - val withLocal = addLocal(init)(scopeLocal) + val derived = derive(init) + // prepend per-scope settings + val withLocal = addLocal(derived)(scopeLocal) // group by Scope/Key, dropping dead initializations val sMap: ScopedMap = grouped(withLocal) // delegate references to undefined values according to 'delegates' @@ -256,20 +292,33 @@ trait Init[Scope] def settings: Seq[Setting[_]] } final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition - final class Setting[T](val key: 
ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition + sealed class Setting[T] private[Init](val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition { def settings = this :: Nil def definitive: Boolean = !init.dependencies.contains(key) def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - def mapReferenced(g: MapScoped): Setting[T] = new Setting(key, init mapReferenced g, pos) - def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => new Setting(key, newI, pos)) - def mapKey(g: MapScoped): Setting[T] = new Setting(g(key), init, pos) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = new Setting(key, init(t => f(key,t)), pos) - def mapConstant(g: MapConstant): Setting[T] = new Setting(key, init mapConstant g, pos) - def withPos(pos: SourcePosition) = new Setting(key, init, pos) - private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = new Setting(key, f(init), pos) + def mapReferenced(g: MapScoped): Setting[T] = make(key, init mapReferenced g, pos) + def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => make(key, newI, pos)) + def mapKey(g: MapScoped): Setting[T] = make(g(key), init, pos) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = make(key, init(t => f(key,t)), pos) + def mapConstant(g: MapConstant): Setting[T] = make(key, init mapConstant g, pos) + def withPos(pos: SourcePosition) = make(key, init, pos) + def positionString: Option[String] = pos match { + case pos: FilePosition => Some(pos.path + ":" + pos.startLine) + case NoPosition => None + } + private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = make(key, f(init), pos) override def toString = "setting(" + key + ") at " + pos + + protected[this] def make[T](key: ScopedKey[T], init: Initialize[T], pos: 
SourcePosition): Setting[T] = new Setting[T](key, init, pos) + protected[sbt] def isDerived: Boolean = false + private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) } + private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition) extends Setting[T](sk, i, p) { + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos) + protected[sbt] override def isDerived: Boolean = true + } + private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { case Left(undefs) => throw new RuntimeUndefined(undefs) From 94f4d4e8c0f63ea1fb2748f881f388cbe55d5b45 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:58 -0400 Subject: [PATCH 339/823] display derived settings information in 'inspect' --- util/collection/src/main/scala/sbt/Settings.scala | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 4f9bf5206..2b67ad197 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -92,14 +92,13 @@ trait Init[Scope] def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) } - private[this] def plain[T](s: Setting[T]): Setting[T] = if(s.isDerived) new Setting(s.key, s.init, s.pos) else s private[this] def derive(init: Seq[Setting[_]]): Seq[Setting[_]] = { import collection.mutable val (derived, defs) = init.partition(_.isDerived) val derivs = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Setting[_]]] for(s <- derived; d <- s.dependencies) - derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += plain(s) + derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += s val deriveFor = (sk: ScopedKey[_]) => 
derivs.get(sk.key).toList.flatMap(_.toList).map(_.setScope(sk.scope)) @@ -117,7 +116,7 @@ trait Init[Scope] process(defs.toList) out.toList ++ defs } - + def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = { val derived = derive(init) From 962a163f334e77593d4a9fb621546ff1ed434f8f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:58 -0400 Subject: [PATCH 340/823] ensure a derived setting is only injected into a scope once --- util/collection/src/main/scala/sbt/Settings.scala | 12 ++++++++---- 1 file changed, 8 insertions(+), 4 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 2b67ad197..1b2d60f3b 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -96,12 +96,15 @@ trait Init[Scope] { import collection.mutable val (derived, defs) = init.partition(_.isDerived) - val derivs = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Setting[_]]] + final class Derived[T](val setting: Setting[T]) { val inScopes = new mutable.HashSet[Scope] } + val derivs = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived[_]]] for(s <- derived; d <- s.dependencies) - derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += s + derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += new Derived(s) - val deriveFor = (sk: ScopedKey[_]) => - derivs.get(sk.key).toList.flatMap(_.toList).map(_.setScope(sk.scope)) + val deriveFor = (sk: ScopedKey[_]) => { + val derivedForKey: List[Derived[_]] = derivs.get(sk.key).toList.flatten + derivedForKey.filter(_.inScopes add sk.scope).map(_.setting setScope sk.scope) + } val processed = new mutable.HashSet[ScopedKey[_]] val out = new mutable.ListBuffer[Setting[_]] @@ -119,6 +122,7 @@ trait Init[Scope] def compiled(init: Seq[Setting[_]], actual: Boolean = 
true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = { + // inject derived settings into scopes where their dependencies are directly defined val derived = derive(init) // prepend per-scope settings val withLocal = addLocal(derived)(scopeLocal) From 68ca419a7cb0714a86698c39ddbdab1feb847d0c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:58 -0400 Subject: [PATCH 341/823] require dynamic initialization to be explicitly enabled for derived settings --- util/collection/src/main/scala/sbt/Settings.scala | 11 +++++++---- 1 file changed, 7 insertions(+), 4 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 1b2d60f3b..647b2c89e 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -73,12 +73,15 @@ trait Init[Scope] def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) - def derive[T](s: Setting[T]): Setting[T] = { - deriveAllowed(s) foreach error + /** Constructs a derived setting that will be automatically defined in every scope where one of its dependencies is explicitly defined. + * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. + * Only the static dependencies are tracked, however. 
*/ + final def derive[T](s: Setting[T], allowDynamic: Boolean = false): Setting[T] = { + deriveAllowed(s, allowDynamic) foreach error new DerivedSetting[T](s.key, s.init, s.pos) } - def deriveAllowed[T](s: Setting[T]): Option[String] = s.init match { - case _: Bind[_,_] => Some("Cannot derive from dynamic dependencies.") + def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { + case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") case _ => None } From 08e4e3786f70675d5564539306a6a05efc74f3f2 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:58 -0400 Subject: [PATCH 342/823] more specific error when dependencies of a derived setting are undefined --- .../src/main/scala/sbt/Settings.scala | 18 ++++++++++-------- 1 file changed, 10 insertions(+), 8 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 647b2c89e..1c22804a8 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -168,12 +168,12 @@ trait Init[Scope] def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = { - def refMap(refKey: ScopedKey[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = - delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst) + def refMap(refKey: ScopedKey[_], isFirst: Boolean, derived: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = + delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst, derived) } type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = { - val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s.key, i == 0) } + val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s.key, i == 0, 
s.isDerived) } val (undefs, valid) = Util separateE validated if(undefs.isEmpty) Right(valid) else Left(undefs.flatten) }} @@ -184,11 +184,11 @@ trait Init[Scope] else throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false) } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean, derived: Boolean): Either[Undefined, ScopedKey[T]] = { def resolve(search: Seq[Scope]): Either[Undefined, ScopedKey[T]] = search match { - case Seq() => Left(Undefined(refKey, k)) + case Seq() => Left(Undefined(refKey, k, derived)) case Seq(x, xs @ _*) => val sk = ScopedKey(x, k.key) val definesKey = (refKey != sk || !isFirst) && (sMap contains sk) @@ -213,7 +213,9 @@ trait Init[Scope] def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = { val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey) - display(u.referencedKey) + " from " + display(u.definingKey) + guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString + val guessedString = if(u.derived) "" else guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString + val derivedString = if(u.derived) ", which is a derived setting that needs this key to be defined in this scope." 
else "" + display(u.referencedKey) + " from " + display(u.definingKey) + derivedString + guessedString } def guessIntendedScope(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] = @@ -230,9 +232,9 @@ trait Init[Scope] } final class Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString) - final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_]) + final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_], val derived: Boolean) final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") - def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(definingKey, referencedKey) + def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined = new Undefined(definingKey, referencedKey, derived) def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized = { assert(!keys.isEmpty) From 6ffff6fb35626397a258e20c79b9ffe8637322e6 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:59 -0400 Subject: [PATCH 343/823] support filtering the Scopes that a derived setting is applied in --- .../src/main/scala/sbt/Settings.scala | 19 ++++++++++--------- 1 file changed, 10 insertions(+), 9 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 1c22804a8..79cdaf751 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -73,12 +73,13 @@ trait Init[Scope] def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) - /** 
Constructs a derived setting that will be automatically defined in every scope where one of its dependencies is explicitly defined. + /** Constructs a derived setting that will be automatically defined in every scope where one of its dependencies + * is explicitly defined and the scope matches `filter`. * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. - * Only the static dependencies are tracked, however. */ - final def derive[T](s: Setting[T], allowDynamic: Boolean = false): Setting[T] = { + * Only the static dependencies are tracked, however. */ + final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true)): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error - new DerivedSetting[T](s.key, s.init, s.pos) + new DerivedSetting[T](s.key, s.init, s.pos, filter) } def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") case _ => None @@ -98,15 +99,15 @@ trait Init[Scope] private[this] def derive(init: Seq[Setting[_]]): Seq[Setting[_]] = { import collection.mutable - val (derived, defs) = init.partition(_.isDerived) - final class Derived[T](val setting: Setting[T]) { val inScopes = new mutable.HashSet[Scope] } + val (derived, defs) = Util.separate[Setting[_],DerivedSetting[_],Setting[_]](init) { case d: DerivedSetting[_] => Left(d); case s => Right(s) } + final class Derived[T](val setting: DerivedSetting[T]) { val inScopes = new mutable.HashSet[Scope] } val derivs = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived[_]]] for(s <- derived; d <- s.dependencies) derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += new Derived(s) val deriveFor = (sk: ScopedKey[_]) => { val derivedForKey: List[Derived[_]] = derivs.get(sk.key).toList.flatten - derivedForKey.filter(_.inScopes add sk.scope).map(_.setting setScope sk.scope) + derivedForKey.filter(d 
=> d.inScopes.add(sk.scope) && d.setting.filter(sk.scope)).map(_.setting setScope sk.scope) } val processed = new mutable.HashSet[ScopedKey[_]] @@ -322,8 +323,8 @@ trait Init[Scope] protected[sbt] def isDerived: Boolean = false private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) } - private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition) extends Setting[T](sk, i, p) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos) + private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean) extends Setting[T](sk, i, p) { + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter) protected[sbt] override def isDerived: Boolean = true } From 7a10aeca379c37dacf60968cfb457c42dc73f363 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:59 -0400 Subject: [PATCH 344/823] Default settings, which give internal sbt settings something like Plugin.globalSettings. --- .../src/main/scala/sbt/Settings.scala | 30 ++++++++++++++++--- 1 file changed, 26 insertions(+), 4 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 79cdaf751..6a16e1616 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -79,12 +79,21 @@ trait Init[Scope] * Only the static dependencies are tracked, however. 
*/ final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true)): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error - new DerivedSetting[T](s.key, s.init, s.pos, filter) + new DerivedSetting[T](s.key, s.init, s.pos, filter, nextDefaultID()) } def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") case _ => None } + // id is used for equality + private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s match { + case _: DefaultSetting[_] | _: DerivedSetting[_] => s + case _ => new DefaultSetting[T](s.key, s.init, s.pos, nextDefaultID()) + } + private[sbt] def defaultSettings(ss: Seq[Setting[_]]): Seq[Setting[_]] = ss.map(s => defaultSetting(s)) + private[this] final val nextID = new java.util.concurrent.atomic.AtomicLong + private[this] final def nextDefaultID(): Long = nextID.incrementAndGet() + def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates) def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { @@ -123,11 +132,17 @@ trait Init[Scope] process(defs.toList) out.toList ++ defs } + private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] = + { + val (defaults, others) = Util.separate[Setting[_], DefaultSetting[_], Setting[_]](ss) { case u: DefaultSetting[_] => Left(u); case s => Right(s) } + defaults.distinct ++ others + } def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = { + val initDefaults = applyDefaults(init) // inject derived settings into scopes where their dependencies are directly defined - val derived = derive(init) + val derived = derive(initDefaults) // prepend per-scope settings val withLocal = addLocal(derived)(scopeLocal) // group by Scope/Key, dropping 
dead initializations @@ -323,10 +338,17 @@ trait Init[Scope] protected[sbt] def isDerived: Boolean = false private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) } - private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean) extends Setting[T](sk, i, p) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter) + private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, id: Long) extends DefaultSetting[T](sk, i, p, id) { + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, id) protected[sbt] override def isDerived: Boolean = true } + // Only keep the first occurrence of this setting and move it to the front so that it has lower precedence than non-defaults. + // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. 
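The DefaultSetting comment above describes a dedup-and-reorder rule: keep only the first occurrence of each default (defaults are compared by their `id`) and move defaults ahead of ordinary settings so that later, non-default definitions win. A minimal Java sketch of just that rule — all names here are invented for illustration; sbt's real implementation is the Scala `applyDefaults` shown in this patch:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical model: a setting is a "default" iff it carries a non-negative id.
public class DefaultDedup {
    public static final class Setting {
        public final String key;
        public final long defaultId; // negative means "not a default setting"
        public Setting(String key, long defaultId) { this.key = key; this.defaultId = defaultId; }
        public boolean isDefault() { return defaultId >= 0; }
    }

    public static List<Setting> applyDefaults(List<Setting> settings) {
        Set<Long> seen = new HashSet<>();
        List<Setting> defaults = new ArrayList<>();
        List<Setting> others = new ArrayList<>();
        for (Setting s : settings) {
            if (s.isDefault()) {
                // keep only the first occurrence of each default id
                if (seen.add(s.defaultId)) defaults.add(s);
            } else {
                others.add(s);
            }
        }
        // defaults go first, so any later ordinary setting overrides them
        defaults.addAll(others);
        return defaults;
    }
}
```

Because "last definition wins" in settings evaluation, putting defaults at the front is exactly what gives them the lowest precedence.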
+ private[Init] sealed class DefaultSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val id: Long) extends Setting[T](sk, i, p) { + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DefaultSetting[T](key, init, pos, id) + override final def hashCode = id.hashCode + override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } + } private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { From ed11008126baefa9b4110ce48211ced065d61e09 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 8 May 2013 12:56:59 -0400 Subject: [PATCH 345/823] only derive settings when all dependencies are defined --- .../src/main/scala/sbt/Settings.scala | 102 +++++++++++++----- 1 file changed, 75 insertions(+), 27 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 6a16e1616..232272323 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -105,33 +105,6 @@ trait Init[Scope] def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) } - private[this] def derive(init: Seq[Setting[_]]): Seq[Setting[_]] = - { - import collection.mutable - val (derived, defs) = Util.separate[Setting[_],DerivedSetting[_],Setting[_]](init) { case d: DerivedSetting[_] => Left(d); case s => Right(s) } - final class Derived[T](val setting: DerivedSetting[T]) { val inScopes = new mutable.HashSet[Scope] } - val derivs = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived[_]]] - for(s <- derived; d <- s.dependencies) - derivs.getOrElseUpdate(d.key, new mutable.ListBuffer) += new Derived(s) - - val deriveFor = (sk: ScopedKey[_]) => { - val derivedForKey: List[Derived[_]] = derivs.get(sk.key).toList.flatten - derivedForKey.filter(d => d.inScopes.add(sk.scope) && 
d.setting.filter(sk.scope)).map(_.setting setScope sk.scope) - } - - val processed = new mutable.HashSet[ScopedKey[_]] - val out = new mutable.ListBuffer[Setting[_]] - def process(rem: List[Setting[_]]): Unit = rem match { - case s :: ss => - val sk = s.key - val ds = if(processed.add(sk)) deriveFor(sk) else Nil - out ++= ds - process(ds ::: ss) - case Nil => - } - process(defs.toList) - out.toList ++ defs - } private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] = { val (defaults, others) = Util.separate[Setting[_], DefaultSetting[_], Setting[_]](ss) { case u: DefaultSetting[_] => Left(u); case s => Right(s) } @@ -283,6 +256,81 @@ trait Init[Scope] } } + private[this] def derive(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope]): Seq[Setting[_]] = + { + import collection.mutable + + final class Derived(val setting: DerivedSetting[_]) { + val dependencies = setting.dependencies.map(_.key) + val inScopes = new mutable.HashSet[Scope] + } + final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) { + def dependencies = settings.flatMap(_.dependencies) + override def toString = "Derived settings for " + key.label + } + + // separate `derived` settings from normal settings (`defs`) + val (derived, defs) = Util.separate[Setting[_],Derived,Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) } + + // group derived settings by the key they define + val derivsByDef = new mutable.HashMap[AttributeKey[_], Deriveds] + for(s <- derived) { + val key = s.setting.key.key + derivsByDef.getOrElseUpdate(key, new Deriveds(key, new mutable.ListBuffer)).settings += s + } + + // sort derived settings so that dependencies come first + // this is necessary when verifying that a derived setting's dependencies exist + val ddeps = (d: Deriveds) => d.dependencies.flatMap(derivsByDef.get) + val sortedDerivs = Dag.topologicalSort(derivsByDef.values)(ddeps) + + // index derived settings by 
triggering key. This maps a key to the list of settings potentially derived from it. + val derivedBy = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived]] + for(s <- derived; d <- s.dependencies) + derivedBy.getOrElseUpdate(d, new mutable.ListBuffer) += s + + // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present + val defined = new mutable.HashSet[ScopedKey[_]] + def addDefs(ss: Seq[Setting[_]]) { for(s <- ss) defined += s.key } + addDefs(defs) + + // true iff the scoped key is in `defined`, taking delegation into account + def isDefined(key: AttributeKey[_], scope: Scope) = + delegates(scope).exists(s => defined.contains(ScopedKey(s, key))) + + // true iff all dependencies of derived setting `d` have a value (potentially via delegation) in `scope` + def allDepsDefined(d: Derived, scope: Scope): Boolean = d.dependencies.forall(dep => isDefined(dep, scope)) + + // list of injectable derived settings for `sk`. A derived setting is injectable if: + // 1. it has not been previously injected into this scope + // 2. it applies to this scope (as determined by its `filter`) + // 3. 
all of its dependencies are defined for that scope (allowing for delegation) + val deriveFor = (sk: ScopedKey[_]) => { + val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten + val scope = sk.scope + val filtered = derivedForKey.filter(d => d.inScopes.add(scope) && d.setting.filter(scope) && allDepsDefined(d, scope)) + val scoped = filtered.map(_.setting setScope scope) + addDefs(scoped) + scoped + } + + val processed = new mutable.HashSet[ScopedKey[_]] + // valid derived settings to be added before normal settings + val out = new mutable.ListBuffer[Setting[_]] + + // derives settings, transitively so that a derived setting can trigger another + def process(rem: List[Setting[_]]): Unit = rem match { + case s :: ss => + val sk = s.key + val ds = if(processed.add(sk)) deriveFor(sk) else Nil + out ++= ds + process(ds ::: ss) + case Nil => + } + process(defs.toList) + out.toList ++ defs + } + sealed trait Initialize[T] { def dependencies: Seq[ScopedKey[_]] From 19c78ac4131f2b2d5caccf91a8582e5593d041a4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 9 May 2013 17:13:22 -0400 Subject: [PATCH 346/823] Show defining locations when there are cycles between derived settings --- util/collection/src/main/scala/sbt/Settings.scala | 13 ++++++++++++- 1 file changed, 12 insertions(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 232272323..2d5ef2159 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -256,6 +256,16 @@ trait Init[Scope] } } + def definedAtString(settings: Seq[Setting[_]]): String = + { + val posDefined = settings.flatMap(_.positionString.toList) + if (posDefined.size > 0) { + val header = if (posDefined.size == settings.size) "defined at:" else + "some of the defining occurrences:" + header + (posDefined.distinct mkString ("\n\t", "\n\t", "\n")) + } else "" + } + private[this] def 
derive(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope]): Seq[Setting[_]] = { import collection.mutable @@ -266,7 +276,8 @@ trait Init[Scope] } final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) { def dependencies = settings.flatMap(_.dependencies) - override def toString = "Derived settings for " + key.label + // This is mainly for use in the cyclic reference error message + override def toString = s"Derived settings for ${key.label}, ${definedAtString(settings.map(_.setting))}" } // separate `derived` settings from normal settings (`defs`) From 61decef972b58e165692db44e7cc2ec55e21cd7d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 9 May 2013 17:13:22 -0400 Subject: [PATCH 347/823] Derived settings: handle scopeLocal in derive and allow triggering dependencies to be filtered --- .../src/main/scala/sbt/Settings.scala | 52 ++++++++++++------- 1 file changed, 32 insertions(+), 20 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 2d5ef2159..569a0275c 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -77,9 +77,9 @@ trait Init[Scope] * is explicitly defined and where the scope matches `filter`. * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. * Only the static dependencies are tracked, however. 
*/ - final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true)): Setting[T] = { + final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true)): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error - new DerivedSetting[T](s.key, s.init, s.pos, filter, nextDefaultID()) + new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger, nextDefaultID()) } def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") @@ -115,11 +115,10 @@ trait Init[Scope] { val initDefaults = applyDefaults(init) // inject derived settings into scopes where their dependencies are directly defined - val derived = derive(initDefaults) - // prepend per-scope settings - val withLocal = addLocal(derived)(scopeLocal) + // and prepend per-scope settings + val derived = deriveAndLocal(initDefaults) // group by Scope/Key, dropping dead initializations - val sMap: ScopedMap = grouped(withLocal) + val sMap: ScopedMap = grouped(derived) // delegate references to undefined values according to 'delegates' val dMap: ScopedMap = if(actual) delegate(sMap)(delegates, display) else sMap // merge Seq[Setting[_]] into Compiled @@ -266,12 +265,13 @@ trait Init[Scope] } else "" } - private[this] def derive(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope]): Seq[Setting[_]] = + private[this] def deriveAndLocal(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Seq[Setting[_]] = { import collection.mutable final class Derived(val setting: DerivedSetting[_]) { val dependencies = setting.dependencies.map(_.key) + def triggeredBy = dependencies.filter(setting.trigger) val inScopes = new mutable.HashSet[Scope] } final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) { @@ 
-281,7 +281,8 @@ trait Init[Scope] } // separate `derived` settings from normal settings (`defs`) - val (derived, defs) = Util.separate[Setting[_],Derived,Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) } + val (derived, rawDefs) = Util.separate[Setting[_],Derived,Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) } + val defs = addLocal(rawDefs)(scopeLocal) // group derived settings by the key they define val derivsByDef = new mutable.HashMap[AttributeKey[_], Deriveds] @@ -297,7 +298,7 @@ trait Init[Scope] // index derived settings by triggering key. This maps a key to the list of settings potentially derived from it. val derivedBy = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived]] - for(s <- derived; d <- s.dependencies) + for(s <- derived; d <- s.triggeredBy) derivedBy.getOrElseUpdate(d, new mutable.ListBuffer) += s // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present @@ -310,19 +311,29 @@ trait Init[Scope] delegates(scope).exists(s => defined.contains(ScopedKey(s, key))) // true iff all dependencies of derived setting `d` have a value (potentially via delegation) in `scope` - def allDepsDefined(d: Derived, scope: Scope): Boolean = d.dependencies.forall(dep => isDefined(dep, scope)) + def allDepsDefined(d: Derived, scope: Scope, local: Set[AttributeKey[_]]): Boolean = + d.dependencies.forall(dep => local(dep) || isDefined(dep, scope)) - // list of injectable derived settings for `sk`. A derived setting is injectable if: - // 1. it has not been previously injected into this scope - // 2. it applies to this scope (as determined by its `filter`) - // 3. all of its dependencies are defined for that scope (allowing for delegation) + // List of injectable derived settings and their local settings for `sk`. + // A derived setting is injectable if: + // 1. it has not been previously injected into this scope + // 2. 
it applies to this scope (as determined by its `filter`) + // 3. all of its dependencies that match `trigger` are defined for that scope (allowing for delegation) + // This needs to handle local settings because a derived setting wouldn't be injected if its local setting didn't exist yet. val deriveFor = (sk: ScopedKey[_]) => { val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten val scope = sk.scope - val filtered = derivedForKey.filter(d => d.inScopes.add(scope) && d.setting.filter(scope) && allDepsDefined(d, scope)) - val scoped = filtered.map(_.setting setScope scope) - addDefs(scoped) - scoped + def localAndDerived(d: Derived): Seq[Setting[_]] = + if(d.inScopes.add(scope) && d.setting.filter(scope)) + { + val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(scope, dep))) + if(allDepsDefined(d, scope, local.map(_.key.key).toSet)) + local :+ d.setting.setScope(scope) + else + Nil + } + else Nil + derivedForKey.flatMap(localAndDerived) } val processed = new mutable.HashSet[ScopedKey[_]] @@ -335,6 +346,7 @@ trait Init[Scope] val sk = s.key val ds = if(processed.add(sk)) deriveFor(sk) else Nil out ++= ds + addDefs(ds) process(ds ::: ss) case Nil => } @@ -397,8 +409,8 @@ trait Init[Scope] protected[sbt] def isDerived: Boolean = false private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) } - private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, id: Long) extends DefaultSetting[T](sk, i, p, id) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, id) + private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean, id: Long) extends DefaultSetting[T](sk, i, p, id) { + override def make[T](key: 
ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger, id) protected[sbt] override def isDerived: Boolean = true } // Only keep the first occurrence of this setting and move it to the front so that it has lower precedence than non-defaults. From a1b793dc1ea1b1788a1c01f612fa72c56fe0a9db Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 27 May 2013 19:12:39 -0400 Subject: [PATCH 348/823] Merge ExtendedReporter into Reporter. --- interface/src/main/java/xsbti/ExtendedReporter.java | 10 ---------- interface/src/main/java/xsbti/Reporter.java | 4 +++- 2 files changed, 3 insertions(+), 11 deletions(-) delete mode 100755 interface/src/main/java/xsbti/ExtendedReporter.java diff --git a/interface/src/main/java/xsbti/ExtendedReporter.java b/interface/src/main/java/xsbti/ExtendedReporter.java deleted file mode 100755 index 7bc4acc47..000000000 --- a/interface/src/main/java/xsbti/ExtendedReporter.java +++ /dev/null @@ -1,10 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2012 Eugene Vigdorchik - */ -package xsbti; - -/** An addition to standard reporter. Used by the IDE. */ -public interface ExtendedReporter extends Reporter -{ - public void comment(Position pos, String msg); -} diff --git a/interface/src/main/java/xsbti/Reporter.java b/interface/src/main/java/xsbti/Reporter.java index 8556cbe8a..439e2738f 100644 --- a/interface/src/main/java/xsbti/Reporter.java +++ b/interface/src/main/java/xsbti/Reporter.java @@ -17,4 +17,6 @@ public interface Reporter public Problem[] problems(); /** Logs a message.*/ public void log(Position pos, String msg, Severity sev); -} \ No newline at end of file + /** Reports a comment. 
*/ + public void comment(Position pos, String msg); +} From 6b0bc78fd90da28de71919d1fc900e3b46ae0e00 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 27 May 2013 19:12:39 -0400 Subject: [PATCH 349/823] local SyncVar implementation to deal with std lib deprecations --- .../src/main/scala/sbt/ProcessImpl.scala | 2 - util/process/src/main/scala/sbt/SyncVar.scala | 40 +++++++++++++++++++ 2 files changed, 40 insertions(+), 2 deletions(-) create mode 100644 util/process/src/main/scala/sbt/SyncVar.scala diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index deec99be0..9b3464703 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -9,8 +9,6 @@ import java.io.{FilterInputStream, FilterOutputStream, PipedInputStream, PipedOu import java.io.{File, FileInputStream, FileOutputStream} import java.net.URL -import scala.concurrent.SyncVar - /** Runs provided code in a new Thread and returns the Thread instance. */ private object Spawn { diff --git a/util/process/src/main/scala/sbt/SyncVar.scala b/util/process/src/main/scala/sbt/SyncVar.scala new file mode 100644 index 000000000..a04675851 --- /dev/null +++ b/util/process/src/main/scala/sbt/SyncVar.scala @@ -0,0 +1,40 @@ +package sbt + +// minimal copy of scala.concurrent.SyncVar since that version deprecated put and unset +private[sbt] final class SyncVar[A] +{ + private[this] var isDefined: Boolean = false + private[this] var value: Option[A] = None + + /** Waits until a value is set and then gets it. Does not clear the value */ + def get: A = synchronized { + while (!isDefined) wait() + value.get + } + + /** Waits until a value is set, gets it, and finally clears the value. */ + def take(): A = synchronized { + try get finally unset() + } + + /** Sets the value, whether or not it is currently defined. 
*/ + def set(x: A): Unit = synchronized { + isDefined = true + value = Some(x) + notifyAll() + } + + /** Sets the value, first waiting until it is undefined if it is currently defined. */ + def put(x: A): Unit = synchronized { + while (isDefined) wait() + set(x) + } + + /** Clears the value, whether or not it is currently defined. */ + def unset(): Unit = synchronized { + isDefined = false + value = None + notifyAll() + } +} + From 3dd714b1fac3b706cc2852e18870381db6a0c58d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 15 Jun 2013 23:55:05 -0400 Subject: [PATCH 350/823] Fully implement StaticScopes subclass of Initialize in order to support use in Task flatMap. Fixes #784. --- util/collection/src/main/scala/sbt/Settings.scala | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 569a0275c..2b040f18b 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -504,7 +504,15 @@ trait Init[Scope] def mapConstant(g: MapConstant) = this def evaluate(map: Settings[Scope]): T = value() } - private[sbt] final val StaticScopes: Initialize[Set[Scope]] = new Value(() => error("internal sbt error: GetScopes not substituted")) + private[sbt] final object StaticScopes extends Initialize[Set[Scope]] + { + def dependencies = Nil + def mapReferenced(g: MapScoped) = this + def validateReferenced(g: ValidateRef) = Right(this) + def apply[S](g: Set[Scope] => S) = map(this)(g) + def mapConstant(g: MapConstant) = this + def evaluate(map: Settings[Scope]) = map.scopes + } private[sbt] final class Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) extends Initialize[T] { def dependencies = deps(alist.toList(inputs)) From d4f6b9cf78b3b5c7f35b28d1af4e812edf88d584 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 17 Jun 2013 12:06:13 -0400 Subject: [PATCH 
351/823] 'definitive' Parser failures Support a definitive flag for Failure that ignores later failures instead of appending them. This is useful to override the default behavior of listing the failures of alternative parsers. --- util/complete/src/main/scala/sbt/complete/Parser.scala | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 8e5af37e3..6bcb549bd 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -100,7 +100,7 @@ object Parser extends ParserMain def filter(f: T => Boolean, msg: => String): Result[T] = if(f(value)) this else mkFailure(msg) def toEither = Right(value) } - final class Failure(mkErrors: => Seq[String]) extends Result[Nothing] { + final class Failure private[sbt](mkErrors: => Seq[String], val definitive: Boolean) extends Result[Nothing] { lazy val errors: Seq[String] = mkErrors def isFailure = true def isValid = false @@ -108,11 +108,11 @@ object Parser extends ParserMain def flatMap[B](f: Nothing => Result[B]) = this def or[B](b: => Result[B]): Result[B] = b match { case v: Value[B] => v - case f: Failure => concatErrors(f) + case f: Failure => if(definitive) this else concatErrors(f) } def either[B](b: => Result[B]): Result[Either[Nothing,B]] = b match { case Value(v) => Value(Right(v)) - case f: Failure => concatErrors(f) + case f: Failure => if(definitive) this else concatErrors(f) } def filter(f: Nothing => Boolean, msg: => String) = this def app[B,C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this @@ -121,8 +121,8 @@ object Parser extends ParserMain private[this] def concatErrors(f: Failure) = mkFailures(errors ++ f.errors) } - def mkFailures(errors: => Seq[String]): Failure = new Failure(errors.distinct) - def mkFailure(error: => String): Failure = new Failure(error :: Nil) + def mkFailures(errors: => 
Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive) + def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive) def checkMatches(a: Parser[_], completions: Seq[String]) { From ac3bfc16ae52d3208d83b2d2972935b9cfa04767 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 17 Jun 2013 12:06:13 -0400 Subject: [PATCH 352/823] Merge failures from a,b in a|b when a,b fail on the same input position. Previously, only the failures from b were used. --- .../src/main/scala/sbt/complete/Parser.scala | 17 +++++++++-------- 1 file changed, 9 insertions(+), 8 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 6bcb549bd..0af417278 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -108,18 +108,18 @@ object Parser extends ParserMain def flatMap[B](f: Nothing => Result[B]) = this def or[B](b: => Result[B]): Result[B] = b match { case v: Value[B] => v - case f: Failure => if(definitive) this else concatErrors(f) + case f: Failure => if(definitive) this else this ++ f } def either[B](b: => Result[B]): Result[Either[Nothing,B]] = b match { case Value(v) => Value(Right(v)) - case f: Failure => if(definitive) this else concatErrors(f) + case f: Failure => if(definitive) this else this ++ f } def filter(f: Nothing => Boolean, msg: => String) = this def app[B,C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this def &&(b: => Result[_]) = this def toEither = Left(() => errors) - private[this] def concatErrors(f: Failure) = mkFailures(errors ++ f.errors) + private[sbt] def ++(f: Failure) = mkFailures(errors ++ f.errors) } def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive) def mkFailure(error: => String, definitive: Boolean = false): Failure = new 
Failure(error :: Nil, definitive) @@ -393,11 +393,12 @@ trait ParserMain else t - def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = - if(a.valid) - if(b.valid) new HomParser(a, b) else a - else - b + def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = (a,b) match { + case (Invalid(af), Invalid(bf)) => Invalid(af ++ bf) + case (Invalid(_), bv) => bv + case (av, Invalid(_)) => av + case (av, bv) => new HomParser(a, b) + } @deprecated("Explicitly specify the failure message.", "0.12.2") def not(p: Parser[_]): Parser[Unit] = not(p, "Excluded.") From 0f088ab25a66459497b5dae07383bd648770fdf3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 17 Jun 2013 12:06:13 -0400 Subject: [PATCH 353/823] invalid/failure Parser construction methods now accept definitive flag --- util/complete/src/main/scala/sbt/complete/Parser.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 0af417278..4ad4ded03 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -257,8 +257,8 @@ trait ParserMain implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s) - def invalid(msgs: => Seq[String]): Parser[Nothing] = Invalid(mkFailures(msgs)) - def failure(msg: => String): Parser[Nothing] = invalid(msg :: Nil) + def invalid(msgs: => Seq[String], definitive: Boolean = false): Parser[Nothing] = Invalid(mkFailures(msgs, definitive)) + def failure(msg: => String, definitive: Boolean = false): Parser[Nothing] = invalid(msg :: Nil, definitive) def success[T](value: T): Parser[T] = new ValidParser[T] { override def result = Some(value) def resultEmpty = Value(value) From de63a2c4487bddcf9eeb5edef72813e2751feff5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 17 Jun 2013 
12:06:13 -0400 Subject: [PATCH 354/823] SoftInvalid parser, which defers being invalid in order to preserve a failure message on empty input. --- .../src/main/scala/sbt/complete/Parser.scala | 16 ++++++++++++++++ 1 file changed, 16 insertions(+) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 4ad4ded03..8a6081c14 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -257,6 +257,13 @@ trait ParserMain implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s) + /** Construct a parser that is valid, but has no valid result. This is used as a way + * to provide a definitive Failure when a parser doesn't match empty input. For example, + * in `softFailure(...) | p`, if `p` doesn't match the empty sequence, the failure will come + * from the Parser constructed by the `softFailure` method. 
*/ + private[sbt] def softFailure(msg: => String, definitive: Boolean = false): Parser[Nothing] = + SoftInvalid( mkFailures(msg :: Nil, definitive) ) + def invalid(msgs: => Seq[String], definitive: Boolean = false): Parser[Nothing] = Invalid(mkFailures(msgs, definitive)) def failure(msg: => String, definitive: Boolean = false): Parser[Nothing] = invalid(msg :: Nil, definitive) def success[T](value: T): Parser[T] = new ValidParser[T] { @@ -441,6 +448,15 @@ private final case class Invalid(fail: Failure) extends Parser[Nothing] def ifValid[S](p: => Parser[S]): Parser[S] = this } +private final case class SoftInvalid(fail: Failure) extends ValidParser[Nothing] +{ + def result = None + def resultEmpty = fail + def derive(c: Char) = Invalid(fail) + def completions(level: Int) = Completions.nil + override def toString = fail.errors.mkString("; ") +} + private final class TrapAndFail[A](a: Parser[A]) extends ValidParser[A] { def result = try { a.result } catch { case e: Exception => None } From aa2bd76e5eedf71d4b5b0ae72a60b009fd9fd563 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Jun 2013 18:29:01 -0400 Subject: [PATCH 355/823] Support dynamic evaluations of optional settings (Initialize.evaluate). 
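The `definitive` flag threaded through the parser patches above changes only how failures combine in the alternative `a | b`: a definitive failure on the left suppresses the right branch's errors, while ordinary failures from both branches are merged with duplicates dropped. A hedged Java sketch of just that combination rule (class and field names are invented for illustration; this is not sbt's parser code):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class AltFailure {
    public static final class Failure {
        public final List<String> errors;
        public final boolean definitive;
        public Failure(List<String> errors, boolean definitive) {
            this.errors = errors;
            this.definitive = definitive;
        }
    }

    // Combine the failures of `a | b` when both alternatives fail.
    public static Failure or(Failure a, Failure b) {
        if (a.definitive) return a; // definitive: ignore later failures entirely
        // otherwise merge both error lists, keeping order and dropping duplicates
        LinkedHashSet<String> merged = new LinkedHashSet<>(a.errors);
        merged.addAll(b.errors);
        return new Failure(new ArrayList<>(merged), false);
    }
}
```

This mirrors the intent stated in the commit message: by default, `|` lists the failures of all alternatives, and `definitive` exists to override that default when one branch's error message is the one worth reporting.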
--- util/collection/src/main/scala/sbt/Settings.scala | 7 +++++-- 1 file changed, 5 insertions(+), 2 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 2b040f18b..bd025cb5c 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -99,11 +99,12 @@ trait Init[Scope] def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { def apply[T](k: ScopedKey[T]): T = getValue(s, k) } - def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse sys.error("Internal settings error: invalid reference to " + showFullKey(k)) + def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse( throw new InvalidReference(k) ) def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) def mapScope(f: Scope => Scope): MapScoped = new MapScoped { def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) } + private final class InvalidReference(val key: ScopedKey[_]) extends RuntimeException("Internal settings error: invalid reference to " + showFullKey(key)) private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] = { @@ -490,10 +491,12 @@ trait Init[Scope] { def dependencies = deps(a.toList) def apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) - def evaluate(ss: Settings[Scope]): T = f(a map evaluateT(ss).fn) def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) def validateReferenced(g: ValidateRef) = Right( new Optional(a flatMap { _.validateReferenced(g).right.toOption }, f) ) def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) + def evaluate(ss: Settings[Scope]): T = f( a.flatMap( i => trapBadRef(evaluateT(ss)(i)) ) ) + // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either 
be used + private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReferenceException => None } } private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { From 169a88dd3083da2ba90d0f6b080a2e3679d8f2fc Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Jun 2013 18:29:01 -0400 Subject: [PATCH 356/823] Require projects to have unique target directories. Configuring projects so that target directories overlap is usually unintentional and the error message that results is usually unrelated to the cause. --- util/collection/src/main/scala/sbt/Settings.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index bd025cb5c..722430912 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -496,7 +496,7 @@ trait Init[Scope] def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) def evaluate(ss: Settings[Scope]): T = f( a.flatMap( i => trapBadRef(evaluateT(ss)(i)) ) ) // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either be used - private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReferenceException => None } + private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReference => None } } private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { From e7cdcc2deb86daa1d5f9c12af32be81b85d5ab31 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 19 Jun 2013 11:53:11 -0400 Subject: [PATCH 357/823] set position on parameter references in task/setting macros --- util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala | 3 ++- util/appmacro/src/main/scala/sbt/appmacro/Instance.scala | 2 +- 2 files changed, 3 insertions(+), 2 deletions(-) 
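The `trapBadRef` helper introduced above (and corrected in the preceding hunk to catch `InvalidReference` rather than the non-existent `InvalidReferenceException`) follows a small reusable pattern: evaluate a by-name expression and convert one specific exception type into `None`, letting everything else propagate. A minimal standalone sketch, with names mirroring the patch but purely illustrative:

```scala
// Evaluate `run`; a thrown InvalidReference becomes None, any other
// exception propagates unchanged.
final class InvalidReference(msg: String) extends RuntimeException(msg)

def trapBadRef[A](run: => A): Option[A] =
  try Some(run) catch { case _: InvalidReference => None }

assert(trapBadRef(42) == Some(42))
assert(trapBadRef[Int](throw new InvalidReference("bad key")) == None)
```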
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 48dd32466..dffc5e0c6 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -144,11 +144,12 @@ final class ContextUtil[C <: Context](val ctx: C) } /** Create a Tree that references the `val` represented by `vd`. */ - def refVal(vd: ValDef): Tree = + def refVal(vd: ValDef, pos: Position): Tree = { val t = Ident(vd.name) assert(vd.tpt.tpe != null, "val type is null: " + vd + ", tpt: " + vd.tpt.tpe) t.setType(vd.tpt.tpe) + t.setPos(pos) t } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index 3e8b45cf0..5928df8bc 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -162,7 +162,7 @@ object Instance qual.foreach(checkQual) val vd = util.freshValDef(tpe, qual.symbol) inputs ::= new Input(tpe, qual, vd) - util.refVal(vd) + util.refVal(vd, qual.pos) } def sub(name: String, tpe: Type, qual: Tree): Converted[c.type] = { From f47ad3fb723d861f98e83b98ba9a9207a91a3382 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 23 Jun 2013 19:57:30 -0400 Subject: [PATCH 358/823] Experimental task progress interface. Fixes #592. Set sbt.task.timings=true to print timings for tasks. This sample progress handler shows how to get names for tasks and deal with flatMapped tasks. There are still some tasks that make it through as anonymous, which needs to be investigated. A setting to provide a custom handler should come in a subsequent commit. 
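The commit message above mentions that `sbt.task.timings=true` prints per-task timings. As a rough, hypothetical illustration of the idea behind such a progress handler (not sbt's actual implementation), a wall-clock timing wrapper can be as simple as:

```scala
// Measure elapsed wall-clock time around a unit of work and print it,
// returning the work's result unchanged.
def timed[A](label: String)(work: => A): A = {
  val start = System.nanoTime
  try work
  finally println(label + ": " + (System.nanoTime - start) / 1000000L + " ms")
}

val total = timed("sum") { (1 to 1000).sum }
assert(total == 500500)
```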
--- util/collection/src/main/scala/sbt/IDSet.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/util/collection/src/main/scala/sbt/IDSet.scala b/util/collection/src/main/scala/sbt/IDSet.scala index 447082d8b..43a0d6f16 100644 --- a/util/collection/src/main/scala/sbt/IDSet.scala +++ b/util/collection/src/main/scala/sbt/IDSet.scala @@ -12,6 +12,7 @@ trait IDSet[T] def ++=(t: Iterable[T]): Unit def -= (t: T): Boolean def all: collection.Iterable[T] + def toList: List[T] def isEmpty: Boolean def foreach(f: T => Unit): Unit def process[S](t: T)(ifSeen: S)(ifNew: => S): S @@ -38,6 +39,7 @@ object IDSet def ++=(t: Iterable[T]) = t foreach += def -= (t:T) = if(backing.remove(t) eq null) false else true def all = collection.JavaConversions.collectionAsScalaIterable(backing.keySet) + def toList = all.toList def isEmpty = backing.isEmpty def process[S](t: T)(ifSeen: S)(ifNew: => S) = if(contains(t)) ifSeen else { this += t ; ifNew } override def toString = backing.toString From a17c7474156b07c35976c4df48c09926e55af993 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 24 Jun 2013 17:46:54 -0400 Subject: [PATCH 359/823] deprecate xml process constructors, which are replaced by proper string interpolation in Scala 2.10 --- util/process/src/main/scala/sbt/Process.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/util/process/src/main/scala/sbt/Process.scala b/util/process/src/main/scala/sbt/Process.scala index b2a127977..0fe40612d 100644 --- a/util/process/src/main/scala/sbt/Process.scala +++ b/util/process/src/main/scala/sbt/Process.scala @@ -14,6 +14,7 @@ trait ProcessExtra implicit def builderToProcess(builder: JProcessBuilder): ProcessBuilder = apply(builder) implicit def fileToProcess(file: File): FilePartialBuilder = apply(file) implicit def urlToProcess(url: URL): URLPartialBuilder = apply(url) + @deprecated("Use string interpolation", "0.13.0") implicit def xmlToProcess(command: scala.xml.Elem): ProcessBuilder = apply(command) implicit def 
buildersToProcess[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = applySeq(builders) @@ -54,6 +55,7 @@ object Process extends ProcessExtra def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) def apply(file: File): FilePartialBuilder = new FileBuilder(file) def apply(url: URL): URLPartialBuilder = new URLBuilder(url) + @deprecated("Use string interpolation", "0.13.0") def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) From 874a357f25870ddaba7be8fe71b97269aae3dd32 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 26 Jun 2013 10:07:32 -0400 Subject: [PATCH 360/823] jline/jansi fixes for windows. Fixes #763, fixes #562. The startup script should set sbt.cygwin=true if running from cygwin. This will set the terminal type properly for JLine if not already set. If sbt.cygwin=false or unset and os.name includes "windows", JAnsi is downloaded by the launcher and installed on standard out/err. The value for jline.terminal is transformed from explicit jline.X to the basic types "windows", "unix", or "none". Now that sbt uses JLine 2.0, these types are understood by both sbt's JLine and Scala's. Older Scala versions shaded the classes but not the terminal property so both couldn't be configured with a class name at the same time. 
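The `jline.terminal` translation described above can be sketched as a pure function; it mirrors the `fixTerminalProperty` logic in the diff that follows, with the system-property reads factored out as parameters for clarity:

```scala
// Map explicit JLine class names to the basic terminal type names that
// both sbt's JLine and Scala's understand. The `cygwin` flag stands in
// for the sbt.cygwin system property check.
def normalizeTerminal(current: String, cygwin: Boolean): String =
  current match {
    case "jline.UnixTerminal"        => "unix"
    case null if cygwin              => "unix"
    case "jline.WindowsTerminal"     => "windows"
    case "jline.AnsiWindowsTerminal" => "windows"
    case "jline.UnsupportedTerminal" => "none"
    case x                           => x
  }

assert(normalizeTerminal("jline.UnixTerminal", cygwin = false) == "unix")
assert(normalizeTerminal(null, cygwin = true) == "unix")
assert(normalizeTerminal("windows", cygwin = false) == "windows")
```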
--- .../src/main/scala/sbt/LineReader.scala | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 6d81a2391..e3d3df0f0 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -64,6 +64,24 @@ abstract class JLine extends LineReader } private object JLine { + private[this] val TerminalProperty = "jline.terminal" + + fixTerminalProperty() + + // translate explicit class names to type in order to support + // older Scala, since it shaded classes but not the system property + private[sbt] def fixTerminalProperty() { + val newValue = System.getProperty(TerminalProperty) match { + case "jline.UnixTerminal" => "unix" + case null if System.getProperty("sbt.cygwin") != null => "unix" + case "jline.WindowsTerminal" => "windows" + case "jline.AnsiWindowsTerminal" => "windows" + case "jline.UnsupportedTerminal" => "none" + case x => x + } + if(newValue != null) System.setProperty(TerminalProperty, newValue) + } + // When calling this, ensure that enableEcho has been or will be called. // TerminalFactory.get will initialize the terminal to disable echo. private def terminal = jline.TerminalFactory.get From e805eb919d1ad7c46493c7781f966427080bc663 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 8 Jul 2013 18:42:00 -0400 Subject: [PATCH 361/823] Provide a better error message when an older launcher is used with 0.13 and JLine classes are incompatible. 
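The improved error message works by recognizing one specific `IncompatibleClassChangeError` and rethrowing it with a launcher-focused explanation, as the diff below shows. A self-contained sketch of the pattern (the wrapper name is illustrative; the message strings match the patch):

```scala
// Replace a known linkage-error message with a more actionable one;
// any other error propagates unchanged.
val jline1to2CompatMsg = "Found class jline.Terminal, but interface was expected"

def withLauncherCheck[A](body: => A): A =
  try body
  catch {
    case e: IncompatibleClassChangeError if e.getMessage == jline1to2CompatMsg =>
      throw new IncompatibleClassChangeError(
        "JLine incompatibility detected. Check that the sbt launcher is version 0.13.x or later.")
  }

val caught =
  try { withLauncherCheck[Unit](throw new IncompatibleClassChangeError(jline1to2CompatMsg)); "none" }
  catch { case e: IncompatibleClassChangeError => e.getMessage }
assert(caught.startsWith("JLine incompatibility detected"))
```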
--- util/log/src/main/scala/sbt/ConsoleLogger.scala | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index 0bc73c2fe..e027876b3 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -78,6 +78,7 @@ object ConsoleLogger val value = System.getProperty("sbt.log.format") if(value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) } + private[this] def jline1to2CompatMsg = "Found class jline.Terminal, but interface was expected" private[this] def ansiSupported = try { @@ -86,7 +87,15 @@ object ConsoleLogger terminal.isAnsiSupported } catch { case e: Exception => !isWindows + + // sbt 0.13 drops JLine 1.0 from the launcher and uses 2.x as a normal dependency + // when 0.13 is used with a 0.12 launcher or earlier, the JLine classes from the launcher get loaded + // this results in a linkage error as detected below. The detection is likely jvm specific, but the priority + // is avoiding mistakenly identifying something as a launcher incompatibility when it is not + case e: IncompatibleClassChangeError if e.getMessage == jline1to2CompatMsg => + throw new IncompatibleClassChangeError("JLine incompatibility detected. 
Check that the sbt launcher is version 0.13.x or later.") } + val noSuppressedMessage = (_: SuppressedTraceContext) => None private[this] def os = System.getProperty("os.name") From 577424fe70d883076b68ad4c5dffff885e4096c7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 12 Jul 2013 09:42:16 -0400 Subject: [PATCH 362/823] disable JLine event expansion --- util/complete/src/main/scala/sbt/LineReader.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index e3d3df0f0..911f61da3 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -102,6 +102,7 @@ private object JLine def createReader(historyPath: Option[File]): ConsoleReader = usingTerminal { t => val cr = new ConsoleReader + cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650 cr.setBellEnabled(false) val h = historyPath match { case None => new MemoryHistory From 1d829e2512fb2d9deb57303987fc0fad94476173 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 18 Jul 2013 22:38:16 -0400 Subject: [PATCH 363/823] specify explicit type to work around 2.11 volatile override error --- util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index 195123c6c..c22825c1b 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -60,7 +60,7 @@ object KListBuilder extends TupleBuilder val representationC = PolyType(tcVariable :: Nil, klistType) val resultType = appliedType(representationC, idTC :: Nil) val input = klist - val alistInstance = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) + val alistInstance: ctx.universe.Tree = 
TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) } } \ No newline at end of file From 66a48b08c751e398eadf2b1f5287e8f74f18b405 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Fri, 19 Jul 2013 14:39:26 -0700 Subject: [PATCH 364/823] Handle compilation cancellation properly. The incremental compiler didn't have any explicit logic to handle cancelled compilation, so it would go into an inconsistent state. Specifically, it would treat a cancelled compilation as one that finished normally and try to produce a new Analysis object out of the partial information collected in AnalysisCallback. The most obvious outcome was that the new Analysis would contain the latest hashes for source files. The next time the incremental compiler was asked to recompile the files it had skipped due to the cancelled compilation, it would think they were already successfully compiled and would do nothing. We fix that problem by following the same logic that handles compilation errors: clean up partial results (produced class files) and make sure that no Analysis is created out of the broken state. We do that by introducing a new exception, `CompileCancelled`, and throwing it at the same spot where the exception signaling compilation errors is thrown. We also modify `IncrementalCompile` to catch that exception and return gracefully, as if no compilation had been invoked. NOTE: In case there were compilation errors reported _before_ compilation cancellation was requested, we still report them using the old mechanism, so partial errors are not lost when compilation is cancelled.
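The cancellation strategy described above can be illustrated in isolation: cancellation is signalled by throwing a dedicated exception at the same point where a compile error would be thrown, and the caller catches it and produces no result at all, so no partial state leaks out. All names here are illustrative, not sbt's actual API:

```scala
// Cancellation is reported like a compile error: by throwing.
final class CompileCancelled extends RuntimeException("compilation cancelled")

def compile(sources: Seq[String], cancelled: () => Boolean): Seq[String] =
  sources.map { src =>
    if (cancelled()) throw new CompileCancelled
    src + ".class"
  }

// Like IncrementalCompile: a cancelled run yields no new Analysis at all.
def safeCompile(sources: Seq[String], cancelled: () => Boolean): Option[Seq[String]] =
  try Some(compile(sources, cancelled))
  catch { case _: CompileCancelled => None }

assert(safeCompile(Seq("A.scala"), () => false) == Some(Seq("A.scala.class")))
assert(safeCompile(Seq("A.scala"), () => true) == None)
```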
--- interface/src/main/java/xsbti/CompileCancelled.java | 9 +++++++++ 1 file changed, 9 insertions(+) create mode 100644 interface/src/main/java/xsbti/CompileCancelled.java diff --git a/interface/src/main/java/xsbti/CompileCancelled.java b/interface/src/main/java/xsbti/CompileCancelled.java new file mode 100644 index 000000000..bcd3695dd --- /dev/null +++ b/interface/src/main/java/xsbti/CompileCancelled.java @@ -0,0 +1,9 @@ +package xsbti; + +/** + * An exception thrown when compilation cancellation has been requested during + * Scala compiler run. + */ +public abstract class CompileCancelled extends RuntimeException { + public abstract String[] arguments(); +} From 3781820ddaa3c1e48f0b19af6534d3bc54ae271a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 19 Jul 2013 20:02:46 -0400 Subject: [PATCH 365/823] init/restore instead of setEchoEnabled in order to handle full jline customizations. Fixes #822. --- util/complete/src/main/scala/sbt/LineReader.scala | 8 ++++---- util/complete/src/test/scala/ParserTest.scala | 2 +- util/log/src/main/scala/sbt/ConsoleLogger.scala | 2 +- 3 files changed, 6 insertions(+), 6 deletions(-) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 911f61da3..9fba225f4 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -57,7 +57,7 @@ abstract class JLine extends LineReader private[this] def resume() { jline.TerminalFactory.reset - JLine.terminal.setEchoEnabled(false) + JLine.terminal.init reader.drawLine() reader.flush() } @@ -95,7 +95,7 @@ private object JLine * This ensures synchronized access as well as re-enabling echo after getting the Terminal. 
*/ def usingTerminal[T](f: jline.Terminal => T): T = withTerminal { t => - t.setEchoEnabled(true) + t.restore f(t) } def createReader(): ConsoleReader = createReader(None) @@ -114,9 +114,9 @@ private object JLine } def withJLine[T](action: => T): T = withTerminal { t => - t.setEchoEnabled(false) + t.init try { action } - finally { t.setEchoEnabled(true) } + finally { t.restore } } def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index fd42ecf90..7a5d20b23 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -21,7 +21,7 @@ object JLineTest import jline.TerminalFactory import jline.console.ConsoleReader val reader = new ConsoleReader() - TerminalFactory.get.setEchoEnabled(false) + TerminalFactory.get.init val parser = parsers(args(0)) JLineCompletion.installCustomCompletor(reader, parser) diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index e027876b3..6fefc890a 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -83,7 +83,7 @@ object ConsoleLogger private[this] def ansiSupported = try { val terminal = jline.TerminalFactory.get - terminal.setEchoEnabled(true) // #460 + terminal.restore // #460 terminal.isAnsiSupported } catch { case e: Exception => !isWindows From dae220ecad8b9b4ad8fcedc7d81b56ee9bdbd01f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 16 Aug 2013 14:21:45 -0400 Subject: [PATCH 366/823] Restore lower case hex digits to HexDigit Parser, accidentally removed in 8545e912da9a7606831c9176709f747ba5913a10. 
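The fix restores lower-case support by normalizing the candidate character to upper case before testing membership in the upper-case-only set, rather than enlarging the set. The predicate in isolation:

```scala
// Upper-case the candidate before the membership test, so both
// 'a'-'f' and 'A'-'F' (and digits) are accepted.
val HexDigitSet = Set('0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F')
def isHexDigit(c: Char): Boolean = HexDigitSet(c.toUpper)

assert("deadBEEF42".forall(isHexDigit))
assert(!isHexDigit('g'))
```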
--- util/complete/src/main/scala/sbt/complete/Parsers.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index f289cc6ac..c9184cfe8 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -18,7 +18,8 @@ trait Parsers lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet lazy val HexDigitSet = Set('0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F') - lazy val HexDigit = charClass(HexDigitSet, "hex") examples HexDigitSet.map(_.toString) + lazy val HexDigit = charClass(c => HexDigitSet(c.toUpper), "hex digit") examples HexDigitSet.map(_.toString) + lazy val Letter = charClass(_.isLetter, "letter") def IDStart = Letter lazy val IDChar = charClass(isIDChar, "ID character") From 9614c4f95a37ad06bab688aad7c012378cd75429 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 16 Aug 2013 14:21:45 -0400 Subject: [PATCH 367/823] API docs for Parser(s). --- .../src/main/scala/sbt/complete/Parser.scala | 99 ++++++++++++-- .../src/main/scala/sbt/complete/Parsers.scala | 124 +++++++++++++++++- 2 files changed, 207 insertions(+), 16 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 8a6081c14..798ea6d49 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -7,6 +7,11 @@ package sbt.complete import sbt.Types.{const, left, right, some} import sbt.Util.{makeList,separate} +/** A String parser that provides semi-automatic tab completion. +* A successful parse results in a value of type `T`. 
+* The methods in this trait are what must be implemented to define a new Parser implementation, but are not typically useful for common usage. +* Instead, most useful methods for combining smaller parsers into larger parsers are implicitly added by the [[RichParser]] type. +*/ sealed trait Parser[+T] { def derive(i: Char): Parser[T] @@ -20,37 +25,57 @@ sealed trait Parser[+T] } sealed trait RichParser[A] { - /** Produces a Parser that applies the original Parser and then applies `next` (in order).*/ + /** Apply the original Parser and then apply `next` (in order). The result of both is provided as a pair. */ def ~[B](next: Parser[B]): Parser[(A,B)] - /** Produces a Parser that applies the original Parser one or more times.*/ + + /** Apply the original Parser one or more times and provide the non-empty sequence of results.*/ def + : Parser[Seq[A]] - /** Produces a Parser that applies the original Parser zero or more times.*/ + + /** Apply the original Parser zero or more times and provide the (potentially empty) sequence of results.*/ def * : Parser[Seq[A]] - /** Produces a Parser that applies the original Parser zero or one times.*/ + + /** Apply the original Parser zero or one times, returning None if it was applied zero times or the result wrapped in Some if it was applied once.*/ def ? : Parser[Option[A]] - /** Produces a Parser that applies either the original Parser or `b`.*/ + + /** Apply either the original Parser or `b`.*/ def |[B >: A](b: Parser[B]): Parser[B] - /** Produces a Parser that applies either the original Parser or `b`.*/ + + /** Apply either the original Parser or `b`.*/ def ||[B](b: Parser[B]): Parser[Either[A,B]] - /** Produces a Parser that applies the original Parser to the input and then applies `f` to the result.*/ + + /** Apply the original Parser to the input and then apply `f` to the result.*/ def map[B](f: A => B): Parser[B] + /** Returns the original parser. This is useful for converting literals to Parsers.
* For example, `'c'.id` or `"asdf".id`*/ def id: Parser[A] + /** Apply the original Parser, but provide `value` as the result if it succeeds. */ def ^^^[B](value: B): Parser[B] + + /** Apply the original Parser, but provide `alt` as the result if it fails.*/ def ??[B >: A](alt: B): Parser[B] + + /** Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of `next`. + * (The arrow points in the direction of the retained result.)*/ def <~[B](b: Parser[B]): Parser[A] + + /** Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of the original parser. + * (The arrow points in the direction of the retained result.)*/ def ~>[B](b: Parser[B]): Parser[B] /** Uses the specified message if the original Parser fails.*/ def !!!(msg: String): Parser[A] + /** If an exception is thrown by the original Parser, * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing.*/ def failOnException: Parser[A] @deprecated("Use `not` and explicitly provide the failure message", "0.12.2") def unary_- : Parser[Unit] + + /** Apply the original parser, but only succeed if `o` also succeeds. + * Note that `o` does not need to consume the same amount of input to satisfy this condition.*/ def & (o: Parser[_]): Parser[A] @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.12.2") @@ -58,16 +83,23 @@ sealed trait RichParser[A] /** Explicitly defines the completions for the original Parser.*/ def examples(s: String*): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ def examples(s: Set[String], check: Boolean = false): Parser[A] + /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ def string(implicit ev: A <:< Seq[Char]): Parser[String] + /** Produces a Parser that filters the original parser.
- * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails.*/ + * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails. + * The failure message is constructed by applying `msg` to the String that was successfully parsed by the original parser. */ def filter(f: A => Boolean, msg: String => String): Parser[A] + /** Applies the original parser, applies `f` to the result to get the next parser, and applies that parser and uses its result for the overall result. */ def flatMap[B](f: A => Parser[B]): Parser[B] } + +/** Contains Parser implementation helper methods not typically needed for using parsers. */ object Parser extends ParserMain { sealed abstract class Result[+T] { @@ -129,9 +161,11 @@ object Parser extends ParserMain val bad = completions.filter( apply(a)(_).resultEmpty.isFailure) if(!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) } + def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] = (a,b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } + def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] = a.ifValid { a.result match @@ -227,6 +261,7 @@ object Parser extends ParserMain } trait ParserMain { + /** Provides combinators for Parsers.*/ implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] { def ~[B](b: Parser[B]) = seqParser(a, b) @@ -254,6 +289,7 @@ trait ParserMain def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) } + implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s) @@ -263,9 +299,16 @@ trait ParserMain * from the Parser constructed by the `softFailure` method. 
*/ private[sbt] def softFailure(msg: => String, definitive: Boolean = false): Parser[Nothing] = SoftInvalid( mkFailures(msg :: Nil, definitive) ) - + + /** Defines a parser that always fails on any input with messages `msgs`. + * If `definitive` is `true`, any failures by later alternatives are discarded.*/ def invalid(msgs: => Seq[String], definitive: Boolean = false): Parser[Nothing] = Invalid(mkFailures(msgs, definitive)) + + /** Defines a parser that always fails on any input with message `msg`. + * If `definitive` is `true`, any failures by later alternatives are discarded.*/ def failure(msg: => String, definitive: Boolean = false): Parser[Nothing] = invalid(msg :: Nil, definitive) + + /** Defines a parser that always succeeds on empty input with the result `value`.*/ def success[T](value: T): Parser[T] = new ValidParser[T] { override def result = Some(value) def resultEmpty = Value(value) @@ -274,15 +317,22 @@ trait ParserMain override def toString = "success(" + value + ")" } + /** Presents a Char range as a Parser. A single Char is parsed only if it is in the given range.*/ implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] = charClass(r contains _).examples(r.map(_.toString) : _*) + + /** Defines a Parser that parses a single character only if it is contained in `legal`.*/ def chars(legal: String): Parser[Char] = { val set = legal.toSet charClass(set, "character in '" + legal + "'") examples(set.map(_.toString)) } + + /** Defines a Parser that parses a single character only if the predicate `f` returns true for that character. + * If this parser fails, `label` is used as the failure message. */ def charClass(f: Char => Boolean, label: String = ""): Parser[Char] = new CharacterClass(f, label) + /** Presents a single Char `ch` as a Parser that only parses that exact character. 
*/ implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] { def result = None def resultEmpty = mkFailure( "Expected '" + ch + "'" ) @@ -290,24 +340,44 @@ trait ParserMain def completions(level: Int) = Completions.single(Completion.suggestStrict(ch.toString)) override def toString = "'" + ch + "'" } + /** Presents a literal String `s` as a Parser that only parses that exact text and provides it as the result.*/ implicit def literal(s: String): Parser[String] = stringLiteral(s, 0) + + /** See [[unapply]]. */ object ~ { + /** Convenience for destructuring a tuple that mirrors the `~` combinator.*/ def unapply[A,B](t: (A,B)): Some[(A,B)] = Some(t) } + /** Parses input `str` using `parser`. If successful, the result is provided wrapped in `Right`. If unsuccessful, an error message is provided in `Left`.*/ def parse[T](str: String, parser: Parser[T]): Either[String, T] = Parser.result(parser, str).left.map { failures => val (msgs,pos) = failures() ProcessError(str, msgs, pos) } + /** Convenience method to use when developing a parser. + * `parser` is applied to the input `str`. + * If `completions` is true, the available completions for the input are displayed. + * Otherwise, the result of parsing is printed using the result's `toString` method. + * If parsing fails, the error message is displayed. + * + * See also [[sampleParse]] and [[sampleCompletions]]. */ def sample(str: String, parser: Parser[_], completions: Boolean = false): Unit = if(completions) sampleCompletions(str, parser) else sampleParse(str, parser) + + /** Convenience method to use when developing a parser. + * `parser` is applied to the input `str` and the result of parsing is printed using the result's `toString` method. + * If parsing fails, the error message is displayed. */ def sampleParse(str: String, parser: Parser[_]): Unit = parse(str, parser) match { case Left(msg) => println(msg) case Right(v) => println(v) } + + /** Convenience method to use when developing a parser.
+ * `parser` is applied to the input `str` and the available completions are displayed on separate lines. + * If parsing fails, the error message is displayed. */ def sampleCompletions(str: String, parser: Parser[_], level: Int = 1): Unit = Parser.completions(parser, str, level).get foreach println @@ -332,14 +402,21 @@ trait ParserMain loop(-1, p) } + /** Applies parser `p` to input `s`. */ def apply[T](p: Parser[T])(s: String): Parser[T] = (p /: s)(derive1) + /** Applies parser `p` to a single character of input. */ def derive1[T](p: Parser[T], c: Char): Parser[T] = if(p.valid) p.derive(c) else p - // The x Completions.empty removes any trailing token completions where append.isEmpty - def completions(p: Parser[_], s: String, level: Int): Completions = apply(p)(s).completions(level) x Completions.empty + /** Applies parser `p` to input `s` and returns the completions at verbosity `level`. + * The interpretation of `level` is up to parser definitions, but 0 is the default by convention, + * with increasing positive numbers corresponding to increasing verbosity. Typically no more than + * a few levels are defined. 
*/ + def completions(p: Parser[_], s: String, level: Int): Completions = + // The x Completions.empty removes any trailing token completions where append.isEmpty + apply(p)(s).completions(level) x Completions.empty def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = if(a.valid) { diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index c9184cfe8..6bc745285 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -8,29 +8,55 @@ package sbt.complete import java.net.URI import java.lang.Character.{getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL} -// Some predefined parsers +/** Provides standard implementations of commonly useful [[Parser]]s. */ trait Parsers { + /** Matches the end of input, providing no useful result on success. */ lazy val EOF = not(any) + /** Parses any single character and provides that character as the result. */ lazy val any: Parser[Char] = charClass(_ => true, "any character") + /** Set that contains each digit in a String representation.*/ lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") + + /** Parses any single digit and provides that digit as a Char as the result.*/ lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet + + /** Set containing Chars for hexadecimal digits 0-9 and A-F (but not a-f). */ lazy val HexDigitSet = Set('0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F') + + /** Parses a single hexadecimal digit (0-9, a-f, A-F). */ lazy val HexDigit = charClass(c => HexDigitSet(c.toUpper), "hex digit") examples HexDigitSet.map(_.toString) + /** Parses a single letter, according to Char.isLetter, into a Char. 
*/ lazy val Letter = charClass(_.isLetter, "letter") + + /** Parses the first Char in an sbt identifier, which must be a [[Letter]].*/ def IDStart = Letter + + /** Parses an identifier Char other than the first character. This includes letters, digits, dash `-`, and underscore `_`.*/ lazy val IDChar = charClass(isIDChar, "ID character") + + /** Parses an identifier String, which must start with [[IDStart]] and contain zero or more [[IDChar]]s after that. */ lazy val ID = identifier(IDStart, IDChar) + + /** Parses a single operator Char, as allowed by [[isOpChar]]. */ lazy val OpChar = charClass(isOpChar, "symbol") + + /** Parses a non-empty operator String, which consists only of characters allowed by [[OpChar]]. */ lazy val Op = OpChar.+.string + + /** Parses either an operator String defined by [[Op]] or a non-symbolic identifier defined by [[ID]]. */ lazy val OpOrID = ID | Op + /** Parses a single, non-symbolic Scala identifier Char. Valid characters are letters, digits, and the underscore character `_`. */ lazy val ScalaIDChar = charClass(isScalaIDChar, "Scala identifier character") + + /** Parses a non-symbolic Scala-like identifier. The identifier must start with [[IDStart]] and contain zero or more [[ScalaIDChar]]s after that.*/ lazy val ScalaID = identifier(IDStart, ScalaIDChar) + /** Parses a String that starts with `start` and is followed by zero or more characters parsed by `rep`.*/ def identifier(start: Parser[Char], rep: Parser[Char]): Parser[String] = start ~ rep.* map { case x ~ xs => (x +: xs).mkString } @@ -42,67 +68,143 @@ trait Parsers else any + /** Returns true if `c` an operator character. */ def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } + /** Returns true if `c` is a dash `-`, a letter, digit, or an underscore `_`. 
*/ def isIDChar(c: Char) = isScalaIDChar(c) || c == '-' + + /** Returns true if `c` is a letter, digit, or an underscore `_`. */ def isScalaIDChar(c: Char) = c.isLetterOrDigit || c == '_' + def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */',' | '.' => true ; case _ => false } + /** Matches a single character that is not a whitespace character. */ lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") + + /** Matches a single whitespace character, as determined by Char.isWhitespace.*/ lazy val SpaceClass = charClass(_.isWhitespace, "whitespace character") + + /** Matches a non-empty String consisting of non-whitespace characters. */ lazy val NotSpace = NotSpaceClass.+.string + + /** Matches a possibly empty String consisting of non-whitespace characters. */ lazy val OptNotSpace = NotSpaceClass.*.string + + /** Matches a non-empty String consisting of whitespace characters. + * The suggested tab completion is a single, constant space character.*/ lazy val Space = SpaceClass.+.examples(" ") + + /** Matches a possibly empty String consisting of whitespace characters. + * The suggested tab completion is a single, constant space character.*/ lazy val OptSpace = SpaceClass.*.examples(" ") + + /** Parses a non-empty String that contains only valid URI characters, as defined by [[URIChar]].*/ lazy val URIClass = URIChar.+.string !!! "Invalid URI" + + /** Triple-quotes, as used for verbatim quoting.*/ lazy val VerbatimDQuotes = "\"\"\"" + + /** Double quote character. */ lazy val DQuoteChar = '\"' + + /** Backslash character. */ lazy val BackslashChar = '\\' + + /** Matches a single double quote. */ lazy val DQuoteClass = charClass(_ == DQuoteChar, "double-quote character") + + /** Matches any character except a double quote or whitespace. 
*/ lazy val NotDQuoteSpaceClass = charClass({ c: Char => (c != DQuoteChar) && !c.isWhitespace }, "non-double-quote-space character") + + /** Matches any character except a double quote or backslash. */ lazy val NotDQuoteBackslashClass = charClass({ c: Char => (c != DQuoteChar) && (c != BackslashChar) }, "non-double-quote-backslash character") + /** Matches a single character that is valid somewhere in a URI. */ lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") + + /** Returns true if `c` is an ASCII letter or digit. */ def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') // TODO: implement def fileParser(base: File): Parser[File] = token(mapOrFail(NotSpace)(s => new File(s.mkString)), "") - + + /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ lazy val Port = token(IntBasic, "") + + /** Parses a signed integer. */ lazy val IntBasic = mapOrFail( '-'.? ~ Digit.+ )( Function.tupled(toInt) ) + + /** Parses an unsigned integer. */ lazy val NatBasic = mapOrFail( Digit.+ )( _.mkString.toInt ) + private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = (neg.toSeq ++ digits).mkString.toInt + + /** Parses the lower-case values `true` and `false` into their respective Boolean values. */ lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) + + /** Parses a potentially quoted String value. The value may be verbatim quoted ([[StringVerbatim]]), + * quoted with interpreted escapes ([[StringEscapable]]), or unquoted ([[NotQuoted]]). */ lazy val StringBasic = StringVerbatim | StringEscapable | NotQuoted - lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> - any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ - VerbatimDQuotes + + /** Parses a verbatim quoted String value, discarding the quotes in the result. 
This kind of quoted text starts with triple quotes `"""` + * and ends at the next triple quotes and may contain any character in between. */ + lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> + any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ + VerbatimDQuotes + + /** Parses a string value, interpreting escapes and discarding the surrounding quotes in the result. + * See [[EscapeSequence]] for supported escapes. */ lazy val StringEscapable: Parser[String] = (DQuoteChar ~> (NotDQuoteBackslashClass | EscapeSequence).+.string <~ DQuoteChar | (DQuoteChar ~ DQuoteChar) ^^^ "") + + /** Parses a single escape sequence into the represented Char. + * Escapes start with a backslash and are followed by `u` for a [[UnicodeEscape]] or by `b`, `t`, `n`, `f`, `r`, `"`, `'`, `\` for standard escapes. */ lazy val EscapeSequence: Parser[Char] = BackslashChar ~> ('b' ^^^ '\b' | 't' ^^^ '\t' | 'n' ^^^ '\n' | 'f' ^^^ '\f' | 'r' ^^^ '\r' | '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) + + /** Parses a single unicode escape sequence into the represented Char. + * A unicode escape begins with a backslash, followed by a `u` and 4 hexadecimal digits representing the unicode value. */ lazy val UnicodeEscape: Parser[Char] = ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } + + /** Parses an unquoted, non-empty String value that cannot start with a double quote and cannot contain whitespace.*/ lazy val NotQuoted = (NotDQuoteSpaceClass ~ OptNotSpace) map { case (c, s) => c.toString + s } + /** Applies `rep` zero or more times, separated by `sep`. + * The result is the (possibly empty) sequence of results from the multiple `rep` applications. The `sep` results are discarded. */ def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = rep1sep(rep, sep) ?? Nil + + /** Applies `rep` one or more times, separated by `sep`. 
+ * The result is the non-empty sequence of results from the multiple `rep` applications. The `sep` results are discarded. */ def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } + /** Wraps the result of `p` in `Some`.*/ def some[T](p: Parser[T]): Parser[Option[T]] = p map { v => Some(v) } + + /** Applies `f` to the result of `p`, transforming any exception when evaluating + * `f` into a parse failure with the exception `toString` as the message.*/ def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } + /** Parses a space-delimited, possibly empty sequence of arguments. + * The arguments may use quotes and escapes according to [[StringBasic]]. */ def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(StringBasic, display)).* <~ SpaceClass.* + /** Applies `p` and uses `true` as the result if it succeeds and turns failure into a result of `false`. */ def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false + /** Defines a sequence parser where the parser used for each part depends on the previously parsed values. + * `p` is applied to the (possibly empty) sequence of already parsed values to obtain the next parser to use. + * The parsers obtained in this way are separated by `sep`, whose result is discarded and only the sequence + * of values from the parsers returned by `p` is used for the result. */ def repeatDep[A](p: Seq[A] => Parser[A], sep: Parser[Any]): Parser[Seq[A]] = { def loop(acc: Seq[A]): Parser[Seq[A]] = { @@ -112,14 +214,26 @@ trait Parsers p(Vector()) flatMap { first => loop(Seq(first)) } } + /** Applies String.trim to the result of `p`. */ def trimmed(p: Parser[String]) = p map { _.trim } + + /** Parses a URI that is valid according to the single argument java.net.URI constructor. 
*/ lazy val basicUri = mapOrFail(URIClass)( uri => new URI(uri)) + + /** Parses a URI that is valid according to the single argument java.net.URI constructor, using `ex` as tab completion examples. */ def Uri(ex: Set[URI]) = basicUri examples(ex.map(_.toString)) } + +/** Provides standard [[Parser]] implementations. */ object Parsers extends Parsers + +/** Provides common [[Parser]] implementations and helper methods.*/ object DefaultParsers extends Parsers with ParserMain { + /** Applies parser `p` to input `s` and returns `true` if the parse was successful. */ def matches(p: Parser[_], s: String): Boolean = apply(p)(s).resultEmpty.isValid + + /** Returns `true` if `s` parses successfully according to [[ID]].*/ def validID(s: String): Boolean = matches(ID, s) } \ No newline at end of file From a2ab8a36302ad9d102ea2a5d29c738db31477fe2 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 16 Aug 2013 14:21:45 -0400 Subject: [PATCH 368/823] Cleaned up API docs for Relation. --- .../src/main/scala/sbt/Relation.scala | 42 ++++++++++++------- 1 file changed, 28 insertions(+), 14 deletions(-) diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index 04efe3e3e..acf19d6b7 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -9,7 +9,12 @@ object Relation { /** Constructs a new immutable, finite relation that is initially empty. */ def empty[A,B]: Relation[A,B] = make(Map.empty, Map.empty) + + /** Constructs a [[Relation]] from underlying `forward` and `reverse` representations, without checking that they are consistent. + * This is a low-level constructor and the alternatives [[empty]] and [[reconstruct]] should be preferred. 
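The identifier rules documented in Parsers.scala above (an [[ID]] starts with a [[Letter]] and continues with letters, digits, dash, or underscore) can be sketched without the `Parser` machinery. This is a standalone approximation of `DefaultParsers.validID`, not the actual sbt implementation; the object name is made up for illustration.

```scala
// Standalone sketch of the ID character rules from sbt.complete.Parsers.
object IDRules {
  // Mirrors isScalaIDChar: letters, digits, and underscore.
  def isScalaIDChar(c: Char): Boolean = c.isLetterOrDigit || c == '_'
  // Mirrors isIDChar: additionally allows dash.
  def isIDChar(c: Char): Boolean = isScalaIDChar(c) || c == '-'
  // An ID is a letter followed by zero or more ID characters.
  def validID(s: String): Boolean =
    s.nonEmpty && s.head.isLetter && s.tail.forall(isIDChar)
}
```

For example, `IDRules.validID("foo-bar_1")` holds, while a leading digit (`"1foo"`) or the empty string is rejected, matching the documented behavior of `identifier(IDStart, IDChar)`.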
*/ def make[A,B](forward: Map[A,Set[B]], reverse: Map[B, Set[A]]): Relation[A,B] = new MRelation(forward, reverse) + + /** Constructs a relation such that for every entry `_1 -> _2s` in `forward` and every `_2` in `_2s`, `(_1, _2)` is in the relation. */ def reconstruct[A,B](forward: Map[A, Set[B]]): Relation[A,B] = { val reversePairs = for( (a,bs) <- forward.view; b <- bs.view) yield (b, a) @@ -39,47 +44,56 @@ object Relation /** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. */ trait Relation[A,B] { - /** Returns the set of all _2s such that (_1, _2) is in this relation. */ + /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ def forward(_1: A): Set[B] - /** Returns the set of all _1s such that (_1, _2) is in this relation. */ + /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ def reverse(_2: B): Set[A] - /** Includes the relation given by `pair`. */ + /** Includes `pair` in the relation. */ def +(pair: (A, B)): Relation[A,B] - /** Includes the relation (a, b). */ + /** Includes `(a, b)` in the relation. */ def +(a: A, b: B): Relation[A,B] - /** Includes the relations (a, b) for all b in bs. */ + /** Includes in the relation `(a, b)` for all `b` in `bs`. */ def +(a: A, bs: Traversable[B]): Relation[A,B] - /** Returns the union of the relation r with this relation. */ + /** Returns the union of the relation `r` with this relation. */ def ++(r: Relation[A,B]): Relation[A,B] - /** Includes the given relations. */ + /** Includes the given pairs in this relation. */ def ++(rs: Traversable[(A,B)]): Relation[A,B] - /** Removes all relations (_1, _2) for all _1 in _1s. */ + /** Removes all elements `(_1, _2)` for all `_1` in `_1s` from this relation. */ def --(_1s: Traversable[A]): Relation[A,B] /** Removes all `pairs` from this relation. */ def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] - /** Removes all pairs (_1, _2) from this relation. 
*/ + /** Removes all pairs `(_1, _2)` from this relation. */ def -(_1: A): Relation[A,B] /** Removes `pair` from this relation. */ def -(pair: (A,B)): Relation[A,B] - /** Returns the set of all _1s such that (_1, _2) is in this relation. */ + /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ def _1s: collection.Set[A] - /** Returns the set of all _2s such that (_1, _2) is in this relation. */ + /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ def _2s: collection.Set[B] /** Returns the number of pairs in this relation */ def size: Int - /** Returns true iff (a,b) is in this relation*/ + /** Returns true iff `(a,b)` is in this relation*/ def contains(a: A, b: B): Boolean - /** Returns a relation with only pairs (a,b) for which f(a,b) is true.*/ + /** Returns a relation with only pairs `(a,b)` for which `f(a,b)` is true.*/ def filter(f: (A,B) => Boolean): Relation[A,B] - /** Partitions this relation into a map of relations according to some discriminator function. */ + /** Partitions this relation into a map of relations according to some discriminator function `f`. */ def groupBy[K](f: ((A,B)) => K): Map[K, Relation[A,B]] /** Returns all pairs in this relation.*/ def all: Traversable[(A,B)] + /** Represents this relation as a `Map` from a `_1` to the set of `_2`s such that `(_1, _2)` is in this relation. + * + * Specifically, there is one entry for each `_1` such that `(_1, _2)` is in this relation for some `_2`. + * The value associated with a given `_1` is the set of all `_2`s such that `(_1, _2)` is in this relation.*/ def forwardMap: Map[A, Set[B]] + + /** Represents this relation as a `Map` from a `_2` to the set of `_1`s such that `(_1, _2)` is in this relation. + * + * Specifically, there is one entry for each `_2` such that `(_1, _2)` is in this relation for some `_1`. 
+ * The value associated with a given `_2` is the set of all `_1`s such that `(_1, _2)` is in this relation.*/ def reverseMap: Map[B, Set[A]] } private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A,B] From b71af2150ebab2b7cbf238ee4a1fc8ea8fe22877 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 16 Aug 2013 14:21:45 -0400 Subject: [PATCH 369/823] API docs for Attributes.scala --- .../src/main/scala/sbt/Attributes.scala | 64 +++++++++++++++++++ 1 file changed, 64 insertions(+) diff --git a/util/collection/src/main/scala/sbt/Attributes.scala b/util/collection/src/main/scala/sbt/Attributes.scala index 227e2fdd7..456a74482 100644 --- a/util/collection/src/main/scala/sbt/Attributes.scala +++ b/util/collection/src/main/scala/sbt/Attributes.scala @@ -9,14 +9,32 @@ import scala.reflect.Manifest // T must be invariant to work properly. // Because it is sealed and the only instances go through AttributeKey.apply, // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts + +/** A key in an [[AttributeMap]] that constrains its associated value to be of type `T`. +* The key is uniquely defined by its [[label]] and type `T`, represented at runtime by [[manifest]]. */ sealed trait AttributeKey[T] { + + /** The runtime evidence for `T` */ def manifest: Manifest[T] + @deprecated("Should only be used for compatibility during the transition from hyphenated labels to camelCase labels.", "0.13.0") def rawLabel: String + + /** The label is the identifier for the key and is camelCase by convention. */ def label: String + + /** An optional, brief description of the key. */ def description: Option[String] + + /** In environments that support delegation, looking up this key when it has no associated value will delegate to the values associated with these keys. 
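The `reconstruct` method documented above derives the reverse index from the forward map alone. A minimal standalone sketch of that derivation, using plain collections rather than sbt's `Relation` types (the helper name is hypothetical):

```scala
// Sketch of deriving reverseMap from forwardMap, as Relation.reconstruct does.
object RelationSketch {
  def reverseOf[A, B](forward: Map[A, Set[B]]): Map[B, Set[A]] = {
    // Flatten forward entries into (b, a) pairs, then group by b.
    val pairs = for ((a, bs) <- forward.toSeq; b <- bs) yield (b, a)
    pairs.groupBy(_._1).map { case (b, ps) => (b, ps.map(_._2).toSet) }
  }
}
```

This makes the invariant concrete: `(a, b)` is in the relation exactly when `b` is in `forward(a)`, which is exactly when `a` is in `reverseOf(forward)(b)`.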
+ * The delegation proceeds in the order the keys are returned here.*/ def extend: Seq[AttributeKey[_]] + + /** Specifies whether this key is a local, anonymous key (`true`) or not (`false`). + * This is typically only used for programmatic, intermediate keys that should not be referenced outside of a specific scope. */ def isLocal: Boolean + + /** Identifies the relative importance of a key among other keys.*/ def rank: Int } private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { @@ -69,24 +87,58 @@ object AttributeKey private[sbt] final val LocalLabel = "$local" } +/** An immutable map where a key is the tuple `(String,T)` for a fixed type `T` and can only be associated with values of type `T`. +* It is therefore possible for this map to contain mappings for keys with the same label but different types. +* Excluding this possibility is the responsibility of the client if desired. */ trait AttributeMap { + /** Gets the value of type `T` associated with the key `k`. + * If a key with the same label but different type is defined, this method will fail. */ def apply[T](k: AttributeKey[T]): T + + /** Gets the value of type `T` associated with the key `k` or `None` if no value is associated. + * If a key with the same label but a different type is defined, this method will return `None`. */ def get[T](k: AttributeKey[T]): Option[T] + + /** Returns this map without the mapping for `k`. + * This method will not remove a mapping for a key with the same label but a different type. */ def remove[T](k: AttributeKey[T]): AttributeMap + + /** Returns true if this map contains a mapping for `k`. + * If a key with the same label but a different type is defined in this map, this method will return `false`. */ def contains[T](k: AttributeKey[T]): Boolean + + /** Adds the mapping `k -> value` to this map, replacing any existing mapping for `k`. + * Any mappings for keys with the same label but different types are unaffected. 
*/ def put[T](k: AttributeKey[T], value: T): AttributeMap + + /** All keys with defined mappings. There may be multiple keys with the same `label`, but different types. */ def keys: Iterable[AttributeKey[_]] + + /** Adds the mappings in `o` to this map, with mappings in `o` taking precedence over existing mappings.*/ def ++(o: Iterable[AttributeEntry[_]]): AttributeMap + + /** Combines the mappings in `o` with the mappings in this map, with mappings in `o` taking precedence over existing mappings.*/ def ++(o: AttributeMap): AttributeMap + + /** All mappings in this map. The [[AttributeEntry]] type preserves the typesafety of mappings, although the specific types are unknown.*/ def entries: Iterable[AttributeEntry[_]] + + /** `true` if there are no mappings in this map, `false` if there are. */ def isEmpty: Boolean } object AttributeMap { + /** An [[AttributeMap]] without any mappings. */ val empty: AttributeMap = new BasicAttributeMap(Map.empty) + + /** Constructs an [[AttributeMap]] containing the given `entries`. */ def apply(entries: Iterable[AttributeEntry[_]]): AttributeMap = empty ++ entries + + /** Constructs an [[AttributeMap]] containing the given `entries`.*/ def apply(entries: AttributeEntry[_]*): AttributeMap = empty ++ entries + + /** Presents an `AttributeMap` as a natural transformation. */ implicit def toNatTrans(map: AttributeMap): AttributeKey ~> Id = new (AttributeKey ~> Id) { def apply[T](key: AttributeKey[T]): T = map(key) } @@ -116,20 +168,32 @@ private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) } // type inference required less generality +/** A map entry where `key` is constrained to only be associated with a fixed value of type `T`. */ final case class AttributeEntry[T](key: AttributeKey[T], value: T) { override def toString = key.label + ": " + value } +/** Associates a `metadata` map with `data`. 
*/ final case class Attributed[D](data: D)(val metadata: AttributeMap) { + /** Retrieves the associated value of `key` from the metadata. */ def get[T](key: AttributeKey[T]): Option[T] = metadata.get(key) + + /** Defines a mapping `key -> value` in the metadata. */ def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) + + /** Transforms the data by applying `f`. */ def map[T](f: D => T): Attributed[T] = Attributed(f(data))(metadata) } object Attributed { + /** Extracts the underlying data from the sequence `in`. */ def data[T](in: Seq[Attributed[T]]): Seq[T] = in.map(_.data) + + /** Associates empty metadata maps with each entry of `in`.*/ def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank + + /** Associates an empty metadata map with `data`. */ def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) } \ No newline at end of file From 62137f708fe510b446f235e5332e69946aa755de Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 30 Aug 2013 18:34:54 -0400 Subject: [PATCH 370/823] Show source position of undefined setting. 
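The `AttributeMap` docs above stress that a mapping is keyed by the pair of label and type, so two keys with the same label but different types never collide. A small sketch of that keying scheme, using `ClassTag` in place of the `Manifest` the real `AttributeKey` carries; `Key` and `TypedMap` are hypothetical names, not sbt API:

```scala
import scala.reflect.ClassTag

// A key is identified by its label together with its runtime type.
final case class Key[T](label: String)(implicit val ct: ClassTag[T])

final class TypedMap private (backing: Map[(String, Class[_]), Any]) {
  def put[T](k: Key[T], v: T): TypedMap =
    new TypedMap(backing.updated((k.label, k.ct.runtimeClass), v))
  def get[T](k: Key[T]): Option[T] =
    backing.get((k.label, k.ct.runtimeClass)).map(_.asInstanceOf[T])
}
object TypedMap { val empty = new TypedMap(Map.empty) }
```

With this scheme, `put(Key[Int]("n"), 3)` and `put(Key[String]("n"), "x")` coexist, and `get` with the wrong type simply misses, which is the behavior the Scaladoc above describes for `get` and `contains`.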
--- .../src/main/scala/sbt/Settings.scala | 45 ++++++++++++++----- 1 file changed, 34 insertions(+), 11 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 722430912..65efd1656 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -157,12 +157,12 @@ trait Init[Scope] def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = { - def refMap(refKey: ScopedKey[_], isFirst: Boolean, derived: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = - delegateForKey(sMap, k, delegates(k.scope), refKey, isFirst, derived) + def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = + delegateForKey(sMap, k, delegates(k.scope), ref, isFirst) } type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = { - val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s.key, i == 0, s.isDerived) } + val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s, i == 0) } val (undefs, valid) = Util separateE validated if(undefs.isEmpty) Right(valid) else Left(undefs.flatten) }} @@ -173,14 +173,14 @@ trait Init[Scope] else throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false) } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], refKey: ScopedKey[_], isFirst: Boolean, derived: Boolean): Either[Undefined, ScopedKey[T]] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = { def resolve(search: Seq[Scope]): Either[Undefined, ScopedKey[T]] = search match { - case Seq() => Left(Undefined(refKey, k, derived)) + case Seq() => 
Left(Undefined(ref, k)) case Seq(x, xs @ _*) => val sk = ScopedKey(x, k.key) - val definesKey = (refKey != sk || !isFirst) && (sMap contains sk) + val definesKey = (ref.key != sk || !isFirst) && (sMap contains sk) if(definesKey) Right(sk) else resolve(xs) } resolve(scopes) @@ -202,10 +202,15 @@ trait Init[Scope] def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = { val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey) - val guessedString = if(u.derived) "" else guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString - val derivedString = if(u.derived) ", which is a derived setting that needs this key to be defined in this scope." else "" - display(u.referencedKey) + " from " + display(u.definingKey) + derivedString + guessedString + val derived = u.defining.isDerived + val refString = display(u.defining.key) + val sourceString = if(derived) "" else parenPosString(u.defining) + val guessedString = if(derived) "" else guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString + val derivedString = if(derived) ", which is a derived setting that needs this key to be defined in this scope." 
else "" + display(u.referencedKey) + " from " + refString + sourceString + derivedString + guessedString } + private[this] def parenPosString(s: Setting[_]): String = + s.positionString match { case None => ""; case Some(s) => " (" + s + ")" } def guessIntendedScope(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] = { @@ -221,9 +226,27 @@ trait Init[Scope] } final class Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString) - final class Undefined(val definingKey: ScopedKey[_], val referencedKey: ScopedKey[_], val derived: Boolean) + final class Undefined private[sbt](val defining: Setting[_], val referencedKey: ScopedKey[_]) + { + @deprecated("For compatibility only, use `defining` directly.", "0.13.1") + val definingKey = defining.key + @deprecated("For compatibility only, use `defining` directly.", "0.13.1") + val derived: Boolean = defining.isDerived + @deprecated("Use the non-deprecated Undefined factory method.", "0.13.1") + def this(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean) = this( fakeUndefinedSetting(definingKey, derived), referencedKey) + } final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") - def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined = new Undefined(definingKey, referencedKey, derived) + + @deprecated("Use the other overload.", "0.13.1") + def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined = + new Undefined(fakeUndefinedSetting(definingKey, derived), referencedKey) + private[this] def fakeUndefinedSetting[T](definingKey: ScopedKey[T], d: Boolean): Setting[T] = + { + val init: Initialize[T] = pure(() => error("Dummy setting for compatibility only.")) + new Setting(definingKey, init, NoPosition) { override def isDerived = d } + } + + def 
Undefined(defining: Setting[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(defining, referencedKey) def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized = { assert(!keys.isEmpty) From 8883ab324b0f7a976b21de7eab2cb6b894783e45 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 19 Sep 2013 12:38:10 -0400 Subject: [PATCH 371/823] The Process methods that are redirection-like should not discard the exit code of the input. Only piping should do that. This addresses an inconsistency with Fork, where using the CustomOutput OutputStrategy makes the exit code always zero. --- util/process/src/main/scala/sbt/Process.scala | 8 ++- .../src/main/scala/sbt/ProcessImpl.scala | 20 +++--- .../src/test/scala/ProcessSpecification.scala | 67 ++++++++++++++----- 3 files changed, 69 insertions(+), 26 deletions(-) diff --git a/util/process/src/main/scala/sbt/Process.scala b/util/process/src/main/scala/sbt/Process.scala index 0fe40612d..a370048e4 100644 --- a/util/process/src/main/scala/sbt/Process.scala +++ b/util/process/src/main/scala/sbt/Process.scala @@ -80,7 +80,7 @@ trait SourcePartialBuilder extends NotNull * argument is call-by-name, so the stream is recreated, written, and closed each * time this process is executed. */ def #>(out: => OutputStream): ProcessBuilder = #> (new OutputStreamBuilder(out)) - def #>(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(toSource, b, false) + def #>(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(toSource, b, false, ExitCodes.firstIfNonzero) private def toFile(f: File, append: Boolean) = #> (new FileOutput(f, append)) def cat = toSource protected def toSource: ProcessBuilder @@ -95,7 +95,7 @@ trait SinkPartialBuilder extends NotNull * argument is call-by-name, so the stream is recreated, read, and closed each * time this process is executed. 
*/ def #<(in: => InputStream): ProcessBuilder = #< (new InputStreamBuilder(in)) - def #<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, toSink, false) + def #<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, toSink, false, ExitCodes.firstIfNonzero) protected def toSink: ProcessBuilder } @@ -174,7 +174,9 @@ trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder def #&& (other: ProcessBuilder): ProcessBuilder /** Constructs a command that runs this command first and then `other` if this command does not succeed.*/ def #|| (other: ProcessBuilder): ProcessBuilder - /** Constructs a command that will run this command and pipes the output to `other`. `other` must be a simple command.*/ + /** Constructs a command that will run this command and pipes the output to `other`. + * `other` must be a simple command. + * The exit code will be that of `other` regardless of whether this command succeeds. */ def #| (other: ProcessBuilder): ProcessBuilder /** Constructs a command that will run this command and then `other`. 
The exit code will be the exit code of `other`.*/ def ### (other: ProcessBuilder): ProcessBuilder diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index 9b3464703..9a3aae606 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -114,6 +114,10 @@ object BasicIO def inheritInput(connect: Boolean) = { p: JProcessBuilder => if (connect) InheritInput(p) else false } } +private[sbt] object ExitCodes { + def ignoreFirst: (Int, Int) => Int = (a,b) => b + def firstIfNonzero: (Int, Int) => Int = (a,b) => if(a != 0) a else b +} private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPartialBuilder with SourcePartialBuilder @@ -123,7 +127,7 @@ private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPa def #|(other: ProcessBuilder): ProcessBuilder = { require(other.canPipeTo, "Piping to multiple processes is not supported.") - new PipedProcessBuilder(this, other, false) + new PipedProcessBuilder(this, other, false, exitCode = ExitCodes.ignoreFirst) } def ###(other: ProcessBuilder): ProcessBuilder = new SequenceProcessBuilder(this, other) @@ -181,7 +185,7 @@ private[sbt] class FileBuilder(base: File) extends FilePartialBuilder with SinkP def #<<(f: File): ProcessBuilder = #<<(new FileInput(f)) def #<<(u: URL): ProcessBuilder = #<<(new URLInput(u)) def #<<(s: => InputStream): ProcessBuilder = #<<(new InputStreamBuilder(s)) - def #<<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, new FileOutput(base, true), false) + def #<<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, new FileOutput(base, true), false, ExitCodes.firstIfNonzero) } private abstract class BasicBuilder extends AbstractProcessBuilder @@ -235,9 +239,9 @@ private abstract class SequentialProcessBuilder(a: ProcessBuilder, b: ProcessBui checkNotThis(b) override def toString = " ( " + a + " " + operatorString + " 
" + b + " ) " } -private class PipedProcessBuilder(first: ProcessBuilder, second: ProcessBuilder, toError: Boolean) extends SequentialProcessBuilder(first, second, if(toError) "#|!" else "#|") +private class PipedProcessBuilder(first: ProcessBuilder, second: ProcessBuilder, toError: Boolean, exitCode: (Int,Int) => Int) extends SequentialProcessBuilder(first, second, if(toError) "#|!" else "#|") { - override def createProcess(io: ProcessIO) = new PipedProcesses(first, second, io, toError) + override def createProcess(io: ProcessIO) = new PipedProcesses(first, second, io, toError, exitCode) } private class AndProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#&&") { @@ -274,7 +278,7 @@ private class OrProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) ext private class ProcessSequence(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, ignore => true) -private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: ProcessIO, toError: Boolean) extends CompoundProcess +private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: ProcessIO, toError: Boolean, exitCode: (Int, Int) => Int) extends CompoundProcess { protected[this] override def runAndExitValue() = { @@ -302,11 +306,11 @@ private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: Pr try { runInterruptible { - first.exitValue + val firstResult = first.exitValue currentSource.put(None) currentSink.put(None) - val result = second.exitValue - result + val secondResult = second.exitValue + exitCode(firstResult, secondResult) } { first.destroy() second.destroy() diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala index 6810025bf..f48a8282c 100644 --- a/util/process/src/test/scala/ProcessSpecification.scala +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -6,7 +6,7 @@ 
import Prop._ import Process._ -private[this] object ProcessSpecification extends Properties("Process I/O") +object ProcessSpecification extends Properties("Process I/O") { implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary(Gen.choose(0, 10) flatMap { size => Gen.resize(size, Arbitrary.arbArray[Byte].arbitrary) }) @@ -15,8 +15,11 @@ private[this] object ProcessSpecification extends Properties("Process I/O") property("#|| correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #|| _)(_ || _)) property("### correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ ### _)( (x,latest) => latest))*/ property("Pipe to output file") = forAll( (data: Array[Byte]) => checkFileOut(data)) - property("Pipe to input file") = forAll( (data: Array[Byte]) => checkFileIn(data)) + property("Pipe from input file") = forAll( (data: Array[Byte]) => checkFileIn(data)) property("Pipe to process") = forAll( (data: Array[Byte]) => checkPipe(data)) + property("Pipe to process ignores input exit code") = forAll( (data: Array[Byte], code: Byte) => checkPipeExit(data, code)) + property("Pipe from input file to bad process preserves correct exit code.") = forAll( (data: Array[Byte], code: Byte) => checkFileInExit(data, code)) + property("Pipe to output file from bad process preserves correct exit code.") = forAll( (data: Array[Byte], code: Byte) => checkFileOutExit(data, code)) private def checkBinary(codes: Array[Byte])(reduceProcesses: (ProcessBuilder, ProcessBuilder) => ProcessBuilder)(reduceExit: (Boolean, Boolean) => Boolean) = { @@ -55,29 +58,63 @@ private[this] object ProcessSpecification extends Properties("Process I/O") temporaryFile #> catCommand #| catCommand #> temporaryFile2 } } + private def checkPipeExit(data: Array[Byte], code: Byte) = + withTempFiles { (a,b) => + IO.write(a, data) + val catCommand = process("sbt.cat") + val exitCommand = process(s"sbt.exit $code") + val exit = (a #> exitCommand #| catCommand #> b).! 
+ (s"Exit code: $exit") |: + (s"Output file length: ${b.length}") |: + (exit == 0) && + (b.length == 0) + } + + private def checkFileOutExit(data: Array[Byte], exitCode: Byte) = + withTempFiles { (a,b) => + IO.write(a, data) + val code = unsigned(exitCode) + val command = process(s"sbt.exit $code") + val exit = (a #> command #> b).! + (s"Exit code: $exit, expected: $code") |: + (s"Output file length: ${b.length}") |: + (exit == code) && + (b.length == 0) + } + + private def checkFileInExit(data: Array[Byte], exitCode: Byte) = + withTempFiles { (a,b) => + IO.write(a, data) + val code = unsigned(exitCode) + val command = process(s"sbt.exit $code") + val exit = (a #> command).! + (s"Exit code: $exit, expected: $code") |: + (exit == code) + } + private def temp() = File.createTempFile("sbt", "") private def withData(data: Array[Byte])(f: (File, File) => ProcessBuilder) = + withTempFiles { (a, b) => + IO.write(a, data) + val process = f(a, b) + ( process ! ) == 0 && sameFiles(a, b) + } + private def sameFiles(a: File, b: File) = + IO.readBytes(a) sameElements IO.readBytes(b) + + private def withTempFiles[T](f: (File, File) => T): T = { val temporaryFile1 = temp() val temporaryFile2 = temp() - try - { - IO.write(temporaryFile1, data) - val process = f(temporaryFile1, temporaryFile2) - ( process ! 
) == 0 && - { - val b1 = IO.readBytes(temporaryFile1) - val b2 = IO.readBytes(temporaryFile2) - b1 sameElements b2 - } - } + try f(temporaryFile1, temporaryFile2) finally { temporaryFile1.delete() temporaryFile2.delete() } - } - private def unsigned(b: Byte): Int = ((b: Int) +256) % 256 + } + private def unsigned(b: Int): Int = ((b: Int) +256) % 256 + private def unsigned(b: Byte): Int = unsigned(b: Int) private def process(command: String) = { val ignore = echo // just for the compile dependency so that this test is rerun when TestedProcess.scala changes, not used otherwise From a3a3dc12264edc2db19c45f72474a5e6e6a18284 Mon Sep 17 00:00:00 2001 From: James Roper Date: Tue, 24 Sep 2013 12:17:46 +1000 Subject: [PATCH 372/823] String upper/lower case no longer locale dependent Fixed many instances of the Turkish i bug. Spare a thought for the poor Turks! --- util/collection/src/main/scala/sbt/Util.scala | 6 ++++-- util/log/src/main/scala/sbt/ConsoleLogger.scala | 5 +++-- 2 files changed, 7 insertions(+), 4 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Util.scala b/util/collection/src/main/scala/sbt/Util.scala index f4f6fbb50..27b32dd87 100644 --- a/util/collection/src/main/scala/sbt/Util.scala +++ b/util/collection/src/main/scala/sbt/Util.scala @@ -3,6 +3,8 @@ */ package sbt +import java.util.Locale + object Util { def makeList[T](size: Int, value: T): List[T] = List.fill(size)(value) @@ -31,13 +33,13 @@ object Util def hypenToCamel(s: String): String = hyphenToCamel(s) def hyphenToCamel(s: String): String = if(hasHyphen(s)) - Hypen.replaceAllIn(s, _.group(1).toUpperCase) + Hypen.replaceAllIn(s, _.group(1).toUpperCase(Locale.ENGLISH)) else s private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r def camelToHypen(s: String): String = - Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase) + Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase(Locale.ENGLISH)) def quoteIfKeyword(s: String): String = 
if(ScalaKeywords.values(s)) '`' + s + '`' else s } diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index 6fefc890a..e5c8f040f 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -3,7 +3,8 @@ */ package sbt - import java.io.{BufferedWriter, PrintStream, PrintWriter} +import java.io.{BufferedWriter, PrintStream, PrintWriter} +import java.util.Locale object ConsoleLogger { @@ -99,7 +100,7 @@ object ConsoleLogger val noSuppressedMessage = (_: SuppressedTraceContext) => None private[this] def os = System.getProperty("os.name") - private[this] def isWindows = os.toLowerCase.indexOf("windows") >= 0 + private[this] def isWindows = os.toLowerCase(Locale.ENGLISH).indexOf("windows") >= 0 def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) From 5ac9390be6b9c7df5b18ce1500ed1618a3f1bad6 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 2 Oct 2013 09:10:38 -0400 Subject: [PATCH 373/823] TrapExit support for multiple, concurrent managed applications. Fixes #831. 
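The Logger change bundled in this patch makes the `xLogger => Logger` conversion return its argument unchanged when it already is a `Logger`, instead of always allocating a forwarding wrapper. A minimal sketch of that pattern, with illustrative trait names rather than sbt's actual types:

```scala
// Identity-preserving conversion: only wrap when the value does not
// already implement the target interface.
trait Source { def msg: String }
trait Target { def msg: String }

// A forwarding wrapper, needed only for plain Sources.
final class Forwarder(s: Source) extends Target { def msg: String = s.msg }

def toTarget(s: Source): Target = s match {
  case t: Target => t            // already a Target: return it as-is
  case _         => new Forwarder(s)
}
```

Besides avoiding an allocation, preserving identity matters when the converted value is later compared by reference or used as a map key.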
--- util/log/src/main/scala/sbt/Logger.scala | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index 29d965e76..ce8201e9c 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -57,7 +57,11 @@ object Logger implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) - implicit def xlog2Log(lg: xLogger): Logger = new Logger { + implicit def xlog2Log(lg: xLogger): Logger = lg match { + case l: Logger => l + case _ => wrapXLogger(lg) + } + private[this] def wrapXLogger(lg: xLogger): Logger = new Logger { override def debug(msg: F0[String]): Unit = lg.debug(msg) override def warn(msg: F0[String]): Unit = lg.warn(msg) override def info(msg: F0[String]): Unit = lg.info(msg) From 00dba88c912611b411fac7598c06c6412a53a9f8 Mon Sep 17 00:00:00 2001 From: Benjy Date: Thu, 10 Oct 2013 20:42:00 -0700 Subject: [PATCH 374/823] equals/hashCode on Modifiers. 
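The idea of this change in miniature: `Modifiers` packs its boolean properties into a single flags field, so equality can compare just that field and `hashCode` can return it directly, which automatically satisfies the equals/hashCode contract. The `Flags` class below is an illustrative stand-in, not sbt's `Modifiers`:

```scala
// Value equality driven entirely by the packed flag bits.
final class Flags(val bits: Int) {
  override def equals(o: Any): Boolean = o match {
    case f: Flags => bits == f.bits
    case _        => false
  }
  // Equal objects must have equal hash codes; the bits themselves qualify.
  override def hashCode: Int = bits
}
```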
--- interface/src/main/java/xsbti/api/Modifiers.java | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/interface/src/main/java/xsbti/api/Modifiers.java b/interface/src/main/java/xsbti/api/Modifiers.java index 575879608..78fa13901 100644 --- a/interface/src/main/java/xsbti/api/Modifiers.java +++ b/interface/src/main/java/xsbti/api/Modifiers.java @@ -68,6 +68,14 @@ public final class Modifiers implements java.io.Serializable { return flag(MacroBit); } + public boolean equals(Object o) + { + return (o instanceof Modifiers) && flags == ((Modifiers)o).flags; + } + public int hashCode() + { + return flags; + } public String toString() { return "Modifiers(" + "isAbstract: " + isAbstract() + ", " + "isOverride: " + isOverride() + ", " + "isFinal: " + isFinal() + ", " + "isSealed: " + isSealed() + ", " + "isImplicit: " + isImplicit() + ", " + "isLazy: " + isLazy() + ", " + "isMacro: " + isMacro()+ ")"; From 3cbe4f942beb24999a66625482e4b357b52406c8 Mon Sep 17 00:00:00 2001 From: Benjy Date: Thu, 10 Oct 2013 21:04:23 -0700 Subject: [PATCH 375/823] Add merge, partition and groupBy methods to Relation. Also add equals/hashCode to Relation. Also add a basic test for groupBy. 
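The semantics of the three new operations can be sketched with a plain `Set` of pairs standing in for a `Relation` (the real implementation maintains forward and reverse multimaps, but the observable behavior is the same):

```scala
// A relation viewed as a set of (key, value) pairs.
val rel = Set(1 -> "a", 2 -> "b", 3 -> "c")

// partition: pairs satisfying the predicate, and the rest.
val (even, odd) = rel.partition { case (k, _) => k % 2 == 0 }

// groupBy: a map from discriminator value to sub-relation.
val grouped = rel.groupBy { case (k, _) => k % 2 }

// merge: fold ++ over a collection of relations, starting from empty.
val merged = List(Set(1 -> "a"), Set(2 -> "b"))
  .foldLeft(Set.empty[(Int, String)])(_ ++ _)
```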
--- .../src/main/scala/sbt/Relation.scala | 34 ++++++++++++++++--- .../src/test/scala/RelationTest.scala | 10 ++++++ 2 files changed, 40 insertions(+), 4 deletions(-) diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index acf19d6b7..d97ee2321 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -19,9 +19,11 @@ object Relation { val reversePairs = for( (a,bs) <- forward.view; b <- bs.view) yield (b, a) val reverse = (Map.empty[B,Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } - make(forward, reverse) + make(forward filter { case (a, bs) => bs.nonEmpty }, reverse) } + def merge[A,B](rels: Traversable[Relation[A,B]]): Relation[A,B] = (Relation.empty[A, B] /: rels)(_ ++ _) + private[sbt] def remove[X,Y](map: M[X,Y], from: X, to: Y): M[X,Y] = map.get(from) match { case Some(tos) => @@ -62,6 +64,8 @@ trait Relation[A,B] def --(_1s: Traversable[A]): Relation[A,B] /** Removes all `pairs` from this relation. */ def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] + /** Removes all `relations` from this relation. */ + def --(relations: Relation[A,B]): Relation[A,B] /** Removes all pairs `(_1, _2)` from this relation. */ def -(_1: A): Relation[A,B] /** Removes `pair` from this relation. */ @@ -75,11 +79,16 @@ trait Relation[A,B] /** Returns true iff `(a,b)` is in this relation*/ def contains(a: A, b: B): Boolean + /** Returns a relation with only pairs `(a,b)` for which `f(a,b)` is true.*/ def filter(f: (A,B) => Boolean): Relation[A,B] - /** Partitions this relation into a map of relations according to some discriminator function `f`. */ - def groupBy[K](f: ((A,B)) => K): Map[K, Relation[A,B]] + /** Returns a pair of relations: the first contains only pairs `(a,b)` for which `f(a,b)` is true and + * the other only pairs `(a,b)` for which `f(a,b)` is false. 
*/ + def partition(f: (A,B) => Boolean): (Relation[A,B], Relation[A,B]) + + /** Partitions this relation into a map of relations according to some discriminator function. */ + def groupBy[K](discriminator: ((A,B)) => K): Map[K, Relation[A,B]] /** Returns all pairs in this relation.*/ def all: Traversable[(A,B)] @@ -96,6 +105,8 @@ trait Relation[A,B] * The value associated with a given `_2` is the set of all `_1`s such that `(_1, _2)` is in this relation.*/ def reverseMap: Map[B, Set[A]] } + +// Note that we assume without checking that fwd and rev are consistent. private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A,B] { def forwardMap = fwd @@ -121,6 +132,7 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def --(ts: Traversable[A]): Relation[A,B] = ((this: Relation[A,B]) /: ts) { _ - _ } def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] = ((this: Relation[A,B]) /: pairs) { _ - _ } + def --(relations: Relation[A,B]): Relation[A,B] = --(relations.all) def -(pair: (A,B)): Relation[A,B] = new MRelation( remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1) ) def -(t: A): Relation[A,B] = @@ -133,9 +145,23 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def filter(f: (A,B) => Boolean): Relation[A,B] = Relation.empty[A,B] ++ all.filter(f.tupled) - def groupBy[K](f: ((A,B)) => K): Map[K, Relation[A,B]] = all.groupBy(f) mapValues { Relation.empty[A,B] ++ _ } + def partition(f: (A,B) => Boolean): (Relation[A,B], Relation[A,B]) = { + val (y, n) = all.partition(f.tupled) + (Relation.empty[A,B] ++ y, Relation.empty[A,B] ++ n) + } + + def groupBy[K](discriminator: ((A,B)) => K): Map[K, Relation[A,B]] = all.groupBy(discriminator) mapValues { Relation.empty[A,B] ++ _ } def contains(a: A, b: B): Boolean = forward(a)(b) + override def equals(other: Any) = other match { + // We assume that the forward and reverse maps are consistent, so we only use the forward 
map + // for equality. Note that key -> Empty is semantically the same as key not existing. + case o: MRelation[A,B] => forwardMap.filterNot(_._2.isEmpty) == o.forwardMap.filterNot(_._2.isEmpty) + case _ => false + } + + override def hashCode = fwd.filterNot(_._2.isEmpty).hashCode() + override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") } diff --git a/util/relation/src/test/scala/RelationTest.scala b/util/relation/src/test/scala/RelationTest.scala index e82bd861d..de63fe893 100644 --- a/util/relation/src/test/scala/RelationTest.scala +++ b/util/relation/src/test/scala/RelationTest.scala @@ -50,6 +50,16 @@ object RelationTest extends Properties("Relation") ("Reverse map does not contain removed" |: ( notIn(r.reverseMap, b, a) ) ) } } + + property("Groups correctly") = forAll { (entries: List[(Int, Double)], randomInt: Int) => + val splitInto = randomInt % 10 + 1 // Split into 1-10 groups. + val rel = Relation.empty[Int, Double] ++ entries + val grouped = rel groupBy (_._1 % splitInto) + all(grouped.toSeq) { + case (k, rel_k) => rel_k._1s forall { _ % splitInto == k } + } + } + def all[T](s: Seq[T])(p: T => Prop): Prop = if(s.isEmpty) true else s.map(p).reduceLeft(_ && _) } From 41e96dbc66f0952ecfd632083208e3bdda4810cb Mon Sep 17 00:00:00 2001 From: Benjy Date: Sun, 13 Oct 2013 22:27:10 -0700 Subject: [PATCH 376/823] Test for Analysis split/merge. Requires scalacheck generators for Analysis and its subobjects. These may be useful for other tests in the future. Also fixes a bug in RelationTest. 
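The RelationTest bug fixed here comes from JVM remainder semantics: `%` takes the sign of the dividend, so for a negative `randomInt` the original `randomInt % 10 + 1` could produce zero or a negative group count. A small demonstration:

```scala
// Scala/JVM `%` keeps the dividend's sign: -12 % 10 == -2.
def buggySplit(n: Int): Int = n % 10 + 1
def fixedSplit(n: Int): Int = math.abs(n) % 10 + 1
// (Caveat: math.abs(Int.MinValue) is itself negative, but that is
// acceptable for a test-data generator over small values.)
```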
--- util/relation/src/test/scala/RelationTest.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/relation/src/test/scala/RelationTest.scala b/util/relation/src/test/scala/RelationTest.scala index de63fe893..03b728915 100644 --- a/util/relation/src/test/scala/RelationTest.scala +++ b/util/relation/src/test/scala/RelationTest.scala @@ -52,7 +52,7 @@ object RelationTest extends Properties("Relation") } property("Groups correctly") = forAll { (entries: List[(Int, Double)], randomInt: Int) => - val splitInto = randomInt % 10 + 1 // Split into 1-10 groups. + val splitInto = math.abs(randomInt) % 10 + 1 // Split into 1-10 groups. val rel = Relation.empty[Int, Double] ++ entries val grouped = rel groupBy (_._1 % splitInto) all(grouped.toSeq) { From d30a3f3bfd1c98b6b6535de2b3f494064e67f7b5 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 23 Oct 2013 09:46:43 -0400 Subject: [PATCH 377/823] drop view for iterator in IMap --- util/collection/src/main/scala/sbt/PMap.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/PMap.scala b/util/collection/src/main/scala/sbt/PMap.scala index 9cf7f26f2..67a8899cd 100644 --- a/util/collection/src/main/scala/sbt/PMap.scala +++ b/util/collection/src/main/scala/sbt/PMap.scala @@ -62,7 +62,7 @@ object IMap def mapSeparate[VL[_], VR[_]](f: V ~> ({type l[T] = Either[VL[T], VR[T]]})#l ) = { - val mapped = backing.view.map { case (k,v) => f(v) match { + val mapped = backing.iterator.map { case (k,v) => f(v) match { case Left(l) => Left((k, l)) case Right(r) => Right((k, r)) }} From a0540f5ce91dfb4b100c24f0dd4da23d7b18b4a7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 23 Oct 2013 09:46:43 -0400 Subject: [PATCH 378/823] cleanups in Settings --- .../src/main/scala/sbt/Settings.scala | 22 ++++++++----------- 1 file changed, 9 insertions(+), 13 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala 
b/util/collection/src/main/scala/sbt/Settings.scala index 65efd1656..8bb458ec8 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -162,8 +162,7 @@ trait Init[Scope] } type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = { - val validated = ks.zipWithIndex map { case (s,i) => s validateReferenced refMap(s, i == 0) } - val (undefs, valid) = Util separateE validated + val (undefs, valid) = Util.separate(ks.zipWithIndex){ case (s,i) => s validateReferenced refMap(s, i == 0) } if(undefs.isEmpty) Right(valid) else Left(undefs.flatten) }} type Undefs[_] = Seq[Undefined] @@ -173,17 +172,11 @@ trait Init[Scope] else throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false) } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = { - def resolve(search: Seq[Scope]): Either[Undefined, ScopedKey[T]] = - search match { - case Seq() => Left(Undefined(ref, k)) - case Seq(x, xs @ _*) => - val sk = ScopedKey(x, k.key) - val definesKey = (ref.key != sk || !isFirst) && (sMap contains sk) - if(definesKey) Right(sk) else resolve(xs) - } - resolve(scopes) + val skeys = scopes.iterator.map(x => ScopedKey(x, k.key)) + val definedAt = skeys.find( sk => (!isFirst || ref.key != sk) && (sMap contains sk)) + definedAt.toRight(Undefined(ref, k)) } private[this] def applyInits(ordered: Seq[Compiled[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = @@ -515,7 +508,10 @@ trait Init[Scope] def dependencies = deps(a.toList) def apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) def mapReferenced(g: 
MapScoped) = new Optional(a map mapReferencedT(g).fn, f) - def validateReferenced(g: ValidateRef) = Right( new Optional(a flatMap { _.validateReferenced(g).right.toOption }, f) ) + def validateReferenced(g: ValidateRef) = a match { + case None => Right(this) + case Some(i) => Right( new Optional(i.validateReferenced(g).right.toOption, f) ) + } def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) def evaluate(ss: Settings[Scope]): T = f( a.flatMap( i => trapBadRef(evaluateT(ss)(i)) ) ) // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either be used From 8dc24826d3a3413a2d7d708b4309eb7e18ce152a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 23 Oct 2013 09:46:43 -0400 Subject: [PATCH 379/823] shortcut heterogeneous AList to KList.toList --- util/collection/src/main/scala/sbt/AList.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util/collection/src/main/scala/sbt/AList.scala b/util/collection/src/main/scala/sbt/AList.scala index 6e5946318..1bc361e0d 100644 --- a/util/collection/src/main/scala/sbt/AList.scala +++ b/util/collection/src/main/scala/sbt/AList.scala @@ -53,6 +53,7 @@ object AList def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) override def apply[M[_], C](k: KL[M], f: KL[Id] => C)(implicit app: Applicative[M]): M[C] = k.apply(f)(app) def traverse[M[_], N[_], P[_]](k: KL[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KL[P]] = k.traverse[N,P](f)(np) + override def toList[M[_]](k: KL[M]) = k.toList } /** AList for a single value. */ From 9c75143382ae76aa786f284d4f1f7b6f65f0ff2d Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Thu, 24 Oct 2013 12:21:53 +0200 Subject: [PATCH 380/823] Remove AnalysisCallback.{beginSource, endSource} methods. As pointed out by @harrah in #705, both beginSource and endSource are not used in sbt internally for anything meaningful. 
We've discussed an option of deprecating those methods but since they are not doing anything meaningful Mark prefers to have compile-time error in case somebody implements or calls those methods. I agree with that hence removal. --- interface/src/main/java/xsbti/AnalysisCallback.java | 4 ---- interface/src/test/scala/TestCallback.scala | 7 +------ 2 files changed, 1 insertion(+), 10 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index d00f5b7ed..55a90f011 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -7,8 +7,6 @@ import java.io.File; public interface AnalysisCallback { - /** Called before the source at the given location is processed. */ - public void beginSource(File source); /** Called to indicate that the source file source depends on the source file * dependsOn. Note that only source files included in the current compilation will * passed to this method. Dependencies on classes generated by sources not in the current compilation will @@ -24,8 +22,6 @@ public interface AnalysisCallback /** Called to indicate that the source file source produces a class file at * module contain class name.*/ public void generatedClass(File source, File module, String name); - /** Called after the source at the given location has been processed. */ - public void endSource(File sourcePath); /** Called when the public API of a source file is extracted. */ public void api(File sourceFile, xsbti.api.SourceAPI source); /** Provides problems discovered during compilation. These may be reported (logged) or unreported. 
diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 061457723..5c0de068e 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -5,20 +5,15 @@ package xsbti class TestCallback extends AnalysisCallback { - val beganSources = new ArrayBuffer[File] - val endedSources = new ArrayBuffer[File] val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] val products = new ArrayBuffer[(File, File, String)] val apis = new ArrayBuffer[(File, xsbti.api.SourceAPI)] - def beginSource(source: File) { beganSources += source } - def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { sourceDependencies += ((dependsOn, source, inherited)) } def binaryDependency(binary: File, name: String, source: File, inherited: Boolean) { binaryDependencies += ((binary, name, source, inherited)) } def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } - def endSource(source: File) { endedSources += source } def api(source: File, sourceAPI: xsbti.api.SourceAPI) { apis += ((source, sourceAPI)) } def problem(category: String, pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean) {} -} \ No newline at end of file +} From cb7c12a4ad54060ea3ce0a130e6b85c4adb55e29 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 24 Oct 2013 16:34:16 -0400 Subject: [PATCH 381/823] Transfer logging,trace levels from old to new global loggers. 
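The new helper copies the mutable logging state from a logger being replaced onto its replacement, so a user's configured verbosity survives the swap. A hypothetical mirror of the idea (the field names are illustrative, not sbt's):

```scala
// Mutable logger settings that should survive logger replacement.
final class DemoLogger(var level: String = "info", var traceLines: Int = -1)

def transferLevels(oldLog: DemoLogger, newLog: DemoLogger): Unit = {
  newLog.level = oldLog.level
  newLog.traceLines = oldLog.traceLines
}
```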
--- util/log/src/main/scala/sbt/Logger.scala | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index ce8201e9c..c556f620c 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -40,6 +40,11 @@ abstract class AbstractLogger extends Logger object Logger { + def transferLevels(oldLog: AbstractLogger, newLog: AbstractLogger) { + newLog.setLevel(oldLog.getLevel) + newLog.setTrace(oldLog.getTrace) + } + // make public in 0.13 private[sbt] val Null: AbstractLogger = new AbstractLogger { def getLevel: Level.Value = Level.Error From 785b750e474064a76b5bca6d43dc4b99c050e006 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Fri, 25 Oct 2013 16:30:43 +0200 Subject: [PATCH 382/823] Remove dead source file `F0.scala` from interface subproject. It doesn't seem to be used anywhere. --- interface/src/test/scala/F0.scala | 6 ------ 1 file changed, 6 deletions(-) delete mode 100644 interface/src/test/scala/F0.scala diff --git a/interface/src/test/scala/F0.scala b/interface/src/test/scala/F0.scala deleted file mode 100644 index 94a1aa876..000000000 --- a/interface/src/test/scala/F0.scala +++ /dev/null @@ -1,6 +0,0 @@ -package xsbti - -object g0 -{ - def apply[T](s: => T) = new F0[T] { def apply = s } -} \ No newline at end of file From 18128d6146f6e4acc96b7c1faf90f2be0125c3ff Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Fri, 25 Oct 2013 16:31:47 +0200 Subject: [PATCH 383/823] Move TestCallback.scala to a directory matching its package name. 
--- interface/src/test/scala/{ => xsbti}/TestCallback.scala | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename interface/src/test/scala/{ => xsbti}/TestCallback.scala (100%) diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala similarity index 100% rename from interface/src/test/scala/TestCallback.scala rename to interface/src/test/scala/xsbti/TestCallback.scala From effb255c97535117169a260ac762528466f5c737 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Fri, 2 Aug 2013 16:27:47 -0700 Subject: [PATCH 384/823] Fix unstable existential type names bug. Fix the problem with unstable names synthesized for existential types (declared with underscore syntax) by renaming type variables to a scheme that is guaranteed to be stable no matter where the given existential type appears. The scheme we use is De Bruijn-like indices that capture both the position of the type variable declaration within a single existential type and the nesting level of the nested existential type. This way we properly support nested existential types by avoiding name clashes. In general, we can perform renamings like that because type variables declared in existential types are scoped to those types so the renaming operation is local. There's a specs2 unit test covering instability of existential types. The test is included in compiler-interface project and the build definition has been modified to enable building and executing tests in compiler-interface project. Some dependencies have been modified: * compiler-interface project depends on api project for testing (test makes use of SameAPI) * dependency on junit has been introduced because it's needed for `@RunWith` annotation which declares that specs2 unit test should be run with JUnitRunner SameAPI has been modified to expose a method that allows us to compare two definitions. 
This commit also adds a `ScalaCompilerForUnitTesting` class that allows compiling a piece of Scala code and inspecting the information recorded by the callbacks defined in the `AnalysisCallback` interface. That class uses the existing ConsoleLogger for logging. I considered doing the same for ConsoleReporter. There's a LoggingReporter defined which would fit our use case, but it's defined in the compile subproject, which compiler-interface doesn't depend on, so we roll our own. ScalaCompilerForUnitTesting uses TestCallback from the compiler-interface subproject for recording information passed to callbacks. In order to be able to access TestCallback from the compiler-interface subproject I had to tweak dependencies between interface and compiler-interface so test classes from the former are visible in the latter. I also modified TestCallback itself to accumulate apis in a HashMap instead of a buffer of tuples for easier lookup. An integration test has been added which tests the scenario mentioned in #823. This commit fixes #823. 
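The renaming scheme can be illustrated with a standalone function: because existential type variables are scoped to the type that declares them, they may be rewritten to purely positional names, so the same type extracted at two different compilation points yields the same string. The naming pattern below is illustrative, not sbt's actual format:

```scala
// Positional (De Bruijn-like) renaming: the new name depends only on
// the variable's index within its existential clause and the nesting
// depth of that clause, never on compiler-synthesized fresh names.
def stableNames(freshNames: List[String], nestingLevel: Int): Map[String, String] =
  freshNames.zipWithIndex.map { case (fresh, i) =>
    fresh -> s"existential_${nestingLevel}_$i"
  }.toMap
```

Two compilations that synthesize different fresh names (say `_$12` vs. `_$345`) for the same underscore now map to the same stable name, so the extracted APIs compare equal.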
--- interface/src/test/scala/xsbti/TestCallback.scala | 12 ++++++++---- 1 file changed, 8 insertions(+), 4 deletions(-) diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala index 5c0de068e..f8454336a 100644 --- a/interface/src/test/scala/xsbti/TestCallback.scala +++ b/interface/src/test/scala/xsbti/TestCallback.scala @@ -1,19 +1,23 @@ package xsbti - import java.io.File - import scala.collection.mutable.ArrayBuffer +import java.io.File +import scala.collection.mutable.ArrayBuffer +import xsbti.api.SourceAPI class TestCallback extends AnalysisCallback { val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] val products = new ArrayBuffer[(File, File, String)] - val apis = new ArrayBuffer[(File, xsbti.api.SourceAPI)] + val apis: scala.collection.mutable.Map[File, SourceAPI] = scala.collection.mutable.Map.empty def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { sourceDependencies += ((dependsOn, source, inherited)) } def binaryDependency(binary: File, name: String, source: File, inherited: Boolean) { binaryDependencies += ((binary, name, source, inherited)) } def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } - def api(source: File, sourceAPI: xsbti.api.SourceAPI) { apis += ((source, sourceAPI)) } + def api(source: File, sourceAPI: SourceAPI): Unit = { + assert(!apis.contains(source), s"The `api` method should be called once per source file: $source") + apis(source) = sourceAPI + } def problem(category: String, pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean) {} } From 9fefc18e2db4540713315e33d7f506837735e55b Mon Sep 17 00:00:00 2001 From: Bruno Bieth Date: Fri, 1 Nov 2013 11:35:28 +0100 Subject: [PATCH 385/823] avoid deadlock in ConsoleOut.systemOutOverwrite System.out can be reset after being captured by `val 
lockObject`. Then locking `lockObject` and calling `println()` could lead to a deadlock. --- util/log/src/main/scala/sbt/ConsoleOut.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/util/log/src/main/scala/sbt/ConsoleOut.scala b/util/log/src/main/scala/sbt/ConsoleOut.scala index 916c3727e..07f17ff72 100644 --- a/util/log/src/main/scala/sbt/ConsoleOut.scala +++ b/util/log/src/main/scala/sbt/ConsoleOut.scala @@ -34,8 +34,8 @@ object ConsoleOut def println(): Unit = synchronized { val s = current.toString if(ConsoleLogger.formatEnabled && last.exists(lmsg => f(s, lmsg))) - System.out.print(OverwriteLine) - System.out.println(s) + lockObject.print(OverwriteLine) + lockObject.println(s) last = Some(s) current = new java.lang.StringBuffer } @@ -59,4 +59,4 @@ object ConsoleOut def println(s: String) = { out.write(s); println() } def println() = { out.newLine(); out.flush() } } -} \ No newline at end of file +} From 57f87fe6c12d11bfb9e2c8b219e2d4317ee92c85 Mon Sep 17 00:00:00 2001 From: Benjy Date: Sun, 3 Nov 2013 10:08:46 -0800 Subject: [PATCH 386/823] Fix implementation of Relation.size. Nothing was actually using it yet, fortunately. 
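The bug in miniature: the relation stores its pairs in a forward multimap `Map[A, Set[B]]`, and `Map.size` counts keys, not pairs. The fixed `size` sums the value-set sizes instead:

```scala
// One key can map to several values, so key count != pair count.
val fwd = Map(1 -> Set("a", "b"), 2 -> Set("c"))

val keyCount  = fwd.size                                           // number of keys
val pairCount = fwd.valuesIterator.map(_.size).foldLeft(0)(_ + _)  // number of pairs
```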
--- util/relation/src/main/scala/sbt/Relation.scala | 2 +- util/relation/src/test/scala/RelationTest.scala | 7 +++++++ 2 files changed, 8 insertions(+), 1 deletion(-) diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index d97ee2321..725512d0b 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -118,7 +118,7 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def _1s = fwd.keySet def _2s = rev.keySet - def size = fwd.size + def size = (fwd.valuesIterator map { _.size }).foldLeft(0)(_ + _) def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable diff --git a/util/relation/src/test/scala/RelationTest.scala b/util/relation/src/test/scala/RelationTest.scala index 03b728915..3dcc03f38 100644 --- a/util/relation/src/test/scala/RelationTest.scala +++ b/util/relation/src/test/scala/RelationTest.scala @@ -60,6 +60,13 @@ object RelationTest extends Properties("Relation") } } + property("Computes size correctly") = forAll { (entries: List[(Int, Double)]) => + val rel = Relation.empty[Int, Double] ++ entries + val expected = rel.all.size // Note: not entries.length, as entries may have duplicates. + val computed = rel.size + "Expected size: %d. 
Computed size: %d.".format(expected, computed) |: expected == computed + } + def all[T](s: Seq[T])(p: T => Prop): Prop = if(s.isEmpty) true else s.map(p).reduceLeft(_ && _) } From cfe5f3cebc2daa2aa2102a8c2b6b76745cb63807 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 4 Nov 2013 11:27:46 -0500 Subject: [PATCH 387/823] update to ScalaCheck 1.11.0 --- util/collection/src/test/scala/DagSpecification.scala | 4 ++-- .../process/src/test/scala/ProcessSpecification.scala | 11 ++++++++--- 2 files changed, 10 insertions(+), 5 deletions(-) diff --git a/util/collection/src/test/scala/DagSpecification.scala b/util/collection/src/test/scala/DagSpecification.scala index 7cf19f2df..77ff80120 100644 --- a/util/collection/src/test/scala/DagSpecification.scala +++ b/util/collection/src/test/scala/DagSpecification.scala @@ -18,7 +18,7 @@ object DagSpecification extends Properties("Dag") private def dagGen(nodeCount: Int): Gen[TestDag] = { val nodes = new HashSet[TestDag] - def nonterminalGen(p: Gen.Params): Gen[TestDag] = + def nonterminalGen(p: Gen.Parameters): Gen[TestDag] = { for(i <- 0 until nodeCount; nextDeps <- Gen.someOf(nodes).apply(p)) nodes += new TestDag(i, nextDeps) @@ -27,7 +27,7 @@ object DagSpecification extends Properties("Dag") } Gen.parameterized(nonterminalGen) } - + private def isSet[T](c: Seq[T]) = Set(c: _*).size == c.size private def dependenciesPrecedeNodes(sort: List[TestDag]) = { diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala index f48a8282c..6298ce544 100644 --- a/util/process/src/test/scala/ProcessSpecification.scala +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -8,7 +8,12 @@ import Process._ object ProcessSpecification extends Properties("Process I/O") { - implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary(Gen.choose(0, 10) flatMap { size => Gen.resize(size, Arbitrary.arbArray[Byte].arbitrary) }) + implicit val exitCodeArb: 
Arbitrary[Array[Byte]] = Arbitrary( + for(size <- Gen.choose(0, 10); + l <- Gen.listOfN[Byte](size, Arbitrary.arbByte.arbitrary)) + yield + l.toArray + ) /*property("Correct exit code") = forAll( (exitCode: Byte) => checkExit(exitCode)) property("#&& correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #&& _)(_ && _)) @@ -99,7 +104,7 @@ object ProcessSpecification extends Properties("Process I/O") val process = f(a, b) ( process ! ) == 0 && sameFiles(a, b) } - private def sameFiles(a: File, b: File) = + private def sameFiles(a: File, b: File) = IO.readBytes(a) sameElements IO.readBytes(b) private def withTempFiles[T](f: (File, File) => T): T = @@ -112,7 +117,7 @@ object ProcessSpecification extends Properties("Process I/O") temporaryFile1.delete() temporaryFile2.delete() } - } + } private def unsigned(b: Int): Int = ((b: Int) +256) % 256 private def unsigned(b: Byte): Int = unsigned(b: Int) private def process(command: String) = From 4ec88dba43a5876c203233f48e4cedbbb8964901 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 22 Nov 2013 13:08:10 -0500 Subject: [PATCH 388/823] Remove the need for resetLocalAttrs. Fixes #994, #952. The fix was made possible by the very helpful information provided by @retronym. This commit does two key things: 1. changes the owner when splicing original trees into new trees 2. ensures the synthetic trees that get spliced into original trees do not need typechecking Given this original source (from Defaults.scala): ... lazy val sourceConfigPaths = Seq( ... unmanagedSourceDirectories := Seq(scalaSource.value, javaSource.value), ... ) ... 
After expansion of .value, this looks something like: unmanagedSourceDirectories := Seq( InputWrapper.wrapInit[File](scalaSource), InputWrapper.wrapInit[File](javaSource) ) where wrapInit is something like: def wrapInit[T](a: Any): T After expansion of := we have (approximately): unmanagedSourceDirectories <<= Instance.app( (scalaSource, javaSource) ) { $p1: (File, File) => val $q4: File = $p1._1 val $q3: File = $p1._2 Seq($q3, $q4) } So, a) `scalaSource` and `javaSource` are user trees that are spliced into a tuple constructor after being temporarily held in `InputWrapper.wrapInit` b) the constructed tuple `(scalaSource, javaSource)` is passed as an argument to another method call (without going through a val or anything) and shouldn't need owner changing c) the synthetic vals $q3 and $q4 need their owner properly set to the anonymous function d) the references (Idents) $q3 and $q4 are spliced into the user tree `Seq(..., ...)` and their symbols need to be the Symbol for the referenced vals e) generally, treeCopy needs to be used when substituting Trees in order to preserve attributes, like Types and Positions changeOwner is called on the body `Seq($q3, $q4)` with the original owner sourceConfigPaths to be changed to the new anonymous function. In this example, no owners are actually changed, but when the body contains vals or anonymous functions, they will. 
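For intuition, the shape of this expansion can be mimicked with ordinary (non-macro) Scala. Here `app`, `p1`, `q3`, and `q4` are stand-ins for `Instance.app` and the generated names, not the real implementation:

```scala
// Hand-written analogue of the macro expansion: a tuple of inputs plus an
// anonymous function that unpacks the tuple and rebuilds the user expression.
def app[A, B, T](inputs: (A, B))(f: ((A, B)) => T): T = f(inputs)

val expanded = app(("scalaSource", "javaSource")) { p1 =>
  val q4 = p1._1 // synthetic vals; their owner must be the anonymous function
  val q3 = p1._2
  Seq(q3, q4)    // the user tree with the synthetic Idents spliced in
}
```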
An example of the compiler crash seen when the symbol of the references is not that of the vals: symbol value $q3 does not exist in sbt.Defaults.sourceConfigPaths$lzycompute at scala.reflect.internal.SymbolTable.abort(SymbolTable.scala:49) at scala.tools.nsc.Global.abort(Global.scala:254) at scala.tools.nsc.backend.icode.GenICode$ICodePhase.genLoadIdent$1(GenICode.scala:1038) at scala.tools.nsc.backend.icode.GenICode$ICodePhase.scala$tools$nsc$backend$icode$GenICode$ICodePhase$$genLoad(GenICode.scala:1044) at scala.tools.nsc.backend.icode.GenICode$ICodePhase$$anonfun$genLoadArguments$1.apply(GenICode.scala:1246) at scala.tools.nsc.backend.icode.GenICode$ICodePhase$$anonfun$genLoadArguments$1.apply(GenICode.scala:1244) ... Other problems with the synthetic tree when it is spliced under the original tree often result in type mismatches or some other compiler error that doesn't result in a crash. If the owner is not changed correctly on the original tree that gets spliced under a synthetic tree, one way it can crash the compiler is: java.lang.IllegalArgumentException: Could not find proxy for val $q23: java.io.File in List(value $q23, method apply, anonymous class $anonfun$globalCore$5, value globalCore, object Defaults, package sbt, package ) (currentOwner= value dir ) ... while compiling: /home/mark/code/sbt/main/src/main/scala/sbt/Defaults.scala during phase: global=lambdalift, atPhase=constructors ... last tree to typer: term $outer symbol: value $outer (flags: private[this]) symbol definition: private[this] val $outer: sbt.BuildCommon tpe: symbol owners: value $outer -> anonymous class $anonfun$87 -> value x$298 -> method derive -> class BuildCommon$class -> package sbt context owners: value dir -> value globalCore -> object Defaults -> package sbt ... The problem here is the difference between context owners and the proxy search chain. 
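The owner bookkeeping can be pictured with a toy model (hypothetical types; the real fix uses the compiler's `ChangeOwnerTraverser`): definitions whose owner is still the original enclosing symbol get re-owned by the new anonymous function.

```scala
// Each definition records its owner symbol (modeled here as a String).
case class Defn(name: String, owner: String)

// Re-own every definition previously owned by `prev` to `next`, mirroring
// what changeOwner does when a user tree is spliced under a synthetic function.
def changeOwner(defs: List[Defn], prev: String, next: String): List[Defn] =
  defs.map(d => if (d.owner == prev) d.copy(owner = next) else d)

val body  = List(Defn("$q3", "sourceConfigPaths"), Defn("$q4", "sourceConfigPaths"))
val fixed = changeOwner(body, "sourceConfigPaths", "$anonfun")
// every definition in `fixed` is now owned by "$anonfun"
```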
--- .../main/scala/sbt/appmacro/ContextUtil.scala | 79 +++++++++++++------ .../main/scala/sbt/appmacro/Instance.scala | 53 +++++++------ .../scala/sbt/appmacro/KListBuilder.scala | 8 +- .../scala/sbt/appmacro/TupleNBuilder.scala | 8 +- 4 files changed, 93 insertions(+), 55 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index dffc5e0c6..381674e47 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -35,10 +35,15 @@ object ContextUtil { /** Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). * This is not thread safe due to the underlying Context and related data structures not being thread safe. * Use `ContextUtil[c.type](c)` to construct. */ -final class ContextUtil[C <: Context](val ctx: C) +final class ContextUtil[C <: Context](val ctx: C) { import ctx.universe.{Apply=>ApplyTree,_} + val powerContext = ctx.asInstanceOf[reflect.macros.runtime.Context] + val global: powerContext.universe.type = powerContext.universe + def callsiteTyper: global.analyzer.Typer = powerContext.callsiteTyper + val initialOwner: Symbol = callsiteTyper.context.owner.asInstanceOf[ctx.universe.Symbol] + lazy val alistType = ctx.typeOf[AList[KList]] lazy val alist: Symbol = alistType.typeSymbol.companionSymbol lazy val alistTC: Type = alistType.typeConstructor @@ -52,12 +57,15 @@ final class ContextUtil[C <: Context](val ctx: C) * (The current implementation uses Context.fresh, which increments*/ def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) - /** Constructs a new, local ValDef with the given Type, a unique name, - * the same position as `sym`, and an empty implementation (no rhs). 
*/ - def freshValDef(tpe: Type, sym: Symbol): ValDef = + /** Constructs a new, synthetic, local ValDef Type `tpe`, a unique name, + * Position `pos`, an empty implementation (no rhs), and owned by `owner`. */ + def freshValDef(tpe: Type, pos: Position, owner: Symbol): ValDef = { - val vd = localValDef(TypeTree(tpe), EmptyTree) - vd setPos getPos(sym) + val SYNTHETIC = (1 << 21).toLong.asInstanceOf[FlagSet] + val sym = owner.newTermSymbol(freshTermName("q"), pos, SYNTHETIC) + setInfo(sym, tpe) + val vd = ValDef(sym, EmptyTree) + vd.setPos(pos) vd } @@ -65,7 +73,7 @@ final class ContextUtil[C <: Context](val ctx: C) /** Collects all definitions in the tree for use in checkReferences. * This excludes definitions in wrapped expressions because checkReferences won't allow nested dereferencing anyway. */ - def collectDefs(tree: Tree, isWrapper: (String, Type, Tree) => Boolean): collection.Set[Symbol] = + def collectDefs(tree: Tree, isWrapper: (String, Type, Tree) => Boolean): collection.Set[Symbol] = { val defs = new collection.mutable.HashSet[Symbol] // adds the symbols for all non-Ident subtrees to `defs`. @@ -106,17 +114,17 @@ final class ContextUtil[C <: Context](val ctx: C) /** Constructs a tuple value of the right TupleN type from the provided inputs.*/ def mkTuple(args: List[Tree]): Tree = - { - val global: Global = ctx.universe.asInstanceOf[Global] global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] - } + + def setSymbol[Tree](t: Tree, sym: Symbol): Unit = + t.asInstanceOf[global.Tree].setSymbol(sym.asInstanceOf[global.Symbol]) + def setInfo[Tree](sym: Symbol, tpe: Type): Unit = + sym.asInstanceOf[global.Symbol].setInfo(tpe.asInstanceOf[global.Type]) /** Creates a new, synthetic type variable with the specified `owner`. 
*/ def newTypeVariable(owner: Symbol, prefix: String = "T0"): TypeSymbol = - { - val global: Global = ctx.universe.asInstanceOf[Global] owner.asInstanceOf[global.Symbol].newSyntheticTypeParam(prefix, 0L).asInstanceOf[ctx.universe.TypeSymbol] - } + /** The type representing the type constructor `[X] X` */ lazy val idTC: Type = { @@ -136,21 +144,42 @@ final class ContextUtil[C <: Context](val ctx: C) /** >: Nothing <: Any */ def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) + /** Creates a new anonymous function symbol with Position `pos`. */ + def functionSymbol(pos: Position): Symbol = + callsiteTyper.context.owner.newAnonymousFunctionValue(pos.asInstanceOf[global.Position]).asInstanceOf[ctx.universe.Symbol] + def functionType(args: List[Type], result: Type): Type = { - val global: Global = ctx.universe.asInstanceOf[Global] val tpe = global.definitions.functionType(args.asInstanceOf[List[global.Type]], result.asInstanceOf[global.Type]) tpe.asInstanceOf[Type] } - /** Create a Tree that references the `val` represented by `vd`. */ - def refVal(vd: ValDef, pos: Position): Tree = + /** Create a Tree that references the `val` represented by `vd`, copying attributes from `replaced`. 
*/ + def refVal(replaced: Tree, vd: ValDef): Tree = + treeCopy.Ident(replaced, vd.name).setSymbol(vd.symbol) + + /** Creates a Function tree using `functionSym` as the Symbol and changing `initialOwner` to `functionSym` in `body`.*/ + def createFunction(params: List[ValDef], body: Tree, functionSym: Symbol): Tree = { - val t = Ident(vd.name) - assert(vd.tpt.tpe != null, "val type is null: " + vd + ", tpt: " + vd.tpt.tpe) - t.setType(vd.tpt.tpe) - t.setPos(pos) - t + changeOwner(body, initialOwner, functionSym) + val f = Function(params, body) + setSymbol(f, functionSym) + f + } + + def changeOwner(tree: Tree, prev: Symbol, next: Symbol): Unit = + new ChangeOwnerAndModuleClassTraverser(prev.asInstanceOf[global.Symbol], next.asInstanceOf[global.Symbol]).traverse(tree.asInstanceOf[global.Tree]) + + // Workaround copied from scala/async:can be removed once https://github.com/scala/scala/pull/3179 is merged. + private[this] class ChangeOwnerAndModuleClassTraverser(oldowner: global.Symbol, newowner: global.Symbol) extends global.ChangeOwnerTraverser(oldowner, newowner) + { + override def traverse(tree: global.Tree) { + tree match { + case _: global.DefTree => change(tree.symbol.moduleClass) + case _ => + } + super.traverse(tree) + } } /** Returns the Symbol that references the statically accessible singleton `i`. */ @@ -164,7 +193,6 @@ final class ContextUtil[C <: Context](val ctx: C) /** Returns the symbol for the non-private method named `name` for the class/module `obj`. 
*/ def method(obj: Symbol, name: String): Symbol = { - val global: Global = ctx.universe.asInstanceOf[Global] val ts: Type = obj.typeSignature val m: global.Symbol = ts.asInstanceOf[global.Type].nonPrivateMember(global.newTermName(name)) m.asInstanceOf[Symbol] @@ -176,7 +204,6 @@ final class ContextUtil[C <: Context](val ctx: C) **/ def extractTC(tcp: AnyRef with Singleton, name: String)(implicit it: ctx.TypeTag[tcp.type]): ctx.Type = { - val global: Global = ctx.universe.asInstanceOf[Global] val itTpe = it.tpe.asInstanceOf[global.Type] val m = itTpe.nonPrivateMember(global.newTypeName(name)) val tc = itTpe.memberInfo(m).asInstanceOf[ctx.universe.Type] @@ -187,8 +214,8 @@ final class ContextUtil[C <: Context](val ctx: C) /** Substitutes wrappers in tree `t` with the result of `subWrapper`. * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(, , .target) returns true. * Typically, `f` is a `Select` or `Ident`. - * The wrapper is replaced with the result of `subWrapper(, )` */ - def transformWrappers(t: Tree, subWrapper: (String, Type, Tree) => Converted[ctx.type]): Tree = + * The wrapper is replaced with the result of `subWrapper(, , )` */ + def transformWrappers(t: Tree, subWrapper: (String, Type, Tree, Tree) => Converted[ctx.type]): Tree = { // the main tree transformer that replaces calls to InputWrapper.wrap(x) with // plain Idents that reference the actual input value @@ -197,7 +224,7 @@ final class ContextUtil[C <: Context](val ctx: C) override def transform(tree: Tree): Tree = tree match { - case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => subWrapper(nme.decoded, targ.tpe, qual) match { + case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => subWrapper(nme.decoded, targ.tpe, qual, tree) match { case Converted.Success(t, finalTx) => finalTx(t) case Converted.Failure(p,m) => ctx.abort(p, m) case _: Converted.NotApplicable[_] => super.transform(tree) diff --git 
a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index 5928df8bc..0de166b67 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -61,7 +61,7 @@ object Instance * These converted inputs are passed to `builder` as well as the list of these synthetic `ValDef`s. * The `TupleBuilder` instance constructs a tuple (Tree) from the inputs and defines the right hand side of the vals * that unpacks the tuple containing the results of the inputs. - * + * * The constructed tuple of inputs and the code that unpacks the results of the inputs are then passed to the `i`, * which is an implementation of `Instance` that is statically accessible. * An Instance defines a applicative functor associated with a specific type constructor and, if it implements MonadInstance as well, a monad. @@ -70,18 +70,18 @@ object Instance * while the full check for static accessibility is done at macro expansion time. * Note: Ideally, the types would verify that `i: MonadInstance` when `t.isRight`. * With the various dependent types involved, this is not worth it. - * + * * The `t` argument is the argument of the macro that will be transformed as described above. * If the macro that calls this method is for a multi-input map (app followed by map), * `t` should be the argument wrapped in Left. - * If this is for multi-input flatMap (app followed by flatMap), + * If this is for multi-input flatMap (app followed by flatMap), * this should be the argument wrapped in Right. 
*/ def contImpl[T,N[_]](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type,N])( implicit tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type]): c.Expr[i.M[N[T]]] = { import c.universe.{Apply=>ApplyTree,_} - + val util = ContextUtil[c.type](c) val mTC: Type = util.extractTC(i, InstanceTCName) val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).normalize @@ -91,19 +91,24 @@ object Instance case Left(l) => (l.tree, nt.tpe.normalize) case Right(r) => (r.tree, mttpe) } + // the Symbol for the anonymous function passed to the appropriate Instance.map/flatMap/pure method + // this Symbol needs to be known up front so that it can be used as the owner of synthetic vals + val functionSym = util.functionSymbol(tree.pos) val instanceSym = util.singleton(i) - // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... + // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... val instance = Ident(instanceSym) val isWrapper: (String, Type, Tree) => Boolean = convert.asPredicate(c) + // Local definitions `defs` in the macro. This is used to ensure references are to M instances defined outside of the macro call. + // Also `refCount` is the number of references, which is used to create the private, synthetic method containing the body + val defs = util.collectDefs(tree, isWrapper) + val checkQual: Tree => Unit = util.checkReferences(defs, isWrapper) + type In = Input[c.universe.type] var inputs = List[In]() - // Local definitions in the macro. This is used to ensure references are to M instances defined outside of the macro call. 
- val defs = util.collectDefs(tree, isWrapper) - val checkQual: Tree => Unit = util.checkReferences(defs, isWrapper) // transforms the original tree into calls to the Instance functions pure, map, ..., // resulting in a value of type M[T] @@ -118,7 +123,8 @@ object Instance def pure(body: Tree): Tree = { val typeApplied = TypeApply(util.select(instance, PureName), TypeTree(treeType) :: Nil) - val p = ApplyTree(typeApplied, Function(Nil, body) :: Nil) + val f = util.createFunction(Nil, body, functionSym) + val p = ApplyTree(typeApplied, f :: Nil) if(t.isLeft) p else flatten(p) } // m should have type M[M[T]] @@ -133,9 +139,10 @@ object Instance def single(body: Tree, input: In): Tree = { val variable = input.local - val param = ValDef(util.parameterModifiers, variable.name, variable.tpt, EmptyTree) + val param = treeCopy.ValDef(variable, util.parameterModifiers, variable.name, variable.tpt, EmptyTree) val typeApplied = TypeApply(util.select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) - val mapped = ApplyTree(typeApplied, input.expr :: Function(param :: Nil, body) :: Nil) + val f = util.createFunction(param :: Nil, body, functionSym) + val mapped = ApplyTree(typeApplied, input.expr :: f :: Nil) if(t.isLeft) mapped else flatten(mapped) } @@ -145,37 +152,37 @@ object Instance val result = builder.make(c)(mTC, inputs) val param = util.freshMethodParameter( appliedType(result.representationC, util.idTC :: Nil) ) val bindings = result.extract(param) - val f = Function(param :: Nil, Block(bindings, body)) + val f = util.createFunction(param :: Nil, Block(bindings, body), functionSym) val ttt = TypeTree(treeType) val typedApp = TypeApply(util.select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) if(t.isLeft) app else flatten(app) } - // called when transforming the tree to add an input - // for `qual` of type M[A], and a selection 
qual.value, + // Called when transforming the tree to add an input. + // For `qual` of type M[A], and a `selection` qual.value, // the call is addType(Type A, Tree qual) - // the result is a Tree representing a reference to - // the bound value of the input - def addType(tpe: Type, qual: Tree): Tree = + // The result is a Tree representing a reference to + // the bound value of the input. + def addType(tpe: Type, qual: Tree, selection: Tree): Tree = { qual.foreach(checkQual) - val vd = util.freshValDef(tpe, qual.symbol) + val vd = util.freshValDef(tpe, qual.symbol.pos, functionSym) inputs ::= new Input(tpe, qual, vd) - util.refVal(vd, qual.pos) + util.refVal(selection, vd) } - def sub(name: String, tpe: Type, qual: Tree): Converted[c.type] = + def sub(name: String, tpe: Type, qual: Tree, replace: Tree): Converted[c.type] = { val tag = c.WeakTypeTag[T](tpe) convert[T](c)(name, qual)(tag) transform { tree => - addType(tpe, tree) + addType(tpe, tree, replace) } } // applies the transformation - val tx = util.transformWrappers(tree, (n,tpe,t) => sub(n,tpe,t)) + val tx = util.transformWrappers(tree, (n,tpe,t,replace) => sub(n,tpe,t,replace)) // resetting attributes must be: a) local b) done here and not wider or else there are obscure errors - val tr = makeApp( c.resetLocalAttrs( inner(tx) ) ) + val tr = makeApp( inner(tx) ) c.Expr[i.M[N[T]]](tr) } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index c22825c1b..e9fb207d8 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -33,8 +33,10 @@ object KListBuilder extends TupleBuilder def bindKList(prev: ValDef, revBindings: List[ValDef], params: List[ValDef]): List[ValDef] = params match { - case ValDef(mods, name, tpt, _) :: xs => - val head = ValDef(mods, name, tpt, select(Ident(prev.name), "head")) + case (x @ ValDef(mods, name, tpt, _)) :: xs 
=> + val rhs = select(Ident(prev.name), "head") + val head = treeCopy.ValDef(x, mods, name, tpt, rhs) + util.setSymbol(head, x.symbol) val tail = localValDef(TypeTree(), select(Ident(prev.name), "tail")) val base = head :: revBindings bindKList(tail, if(xs.isEmpty) base else tail :: base, xs) @@ -53,7 +55,7 @@ object KListBuilder extends TupleBuilder val klist = makeKList(inputs.reverse, knil, knilType) /** The input types combined in a KList type. The main concern is tracking the heterogeneous types. - * The type constructor is tcVariable, so that it can be applied to [X] X or M later. + * The type constructor is tcVariable, so that it can be applied to [X] X or M later. * When applied to `M`, this type gives the type of the `input` KList. */ val klistType: Type = (inputs :\ knilType)( (in, klist) => kconsType(in.tpe, klist) ) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala index 805098353..89fe31792 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala @@ -42,9 +42,11 @@ object TupleNBuilder extends TupleBuilder def bindTuple(param: ValDef, revBindings: List[ValDef], params: List[ValDef], i: Int): List[ValDef] = params match { - case ValDef(mods, name, tpt, _) :: xs => - val x = ValDef(mods, name, tpt, select(Ident(param.name), "_" + i.toString)) - bindTuple(param, x :: revBindings, xs, i+1) + case (x @ ValDef(mods, name, tpt, _)) :: xs => + val rhs = select(Ident(param.name), "_" + i.toString) + val newVal = treeCopy.ValDef(x, mods, name, tpt, rhs) + util.setSymbol(newVal, x.symbol) + bindTuple(param, newVal :: revBindings, xs, i+1) case Nil => revBindings.reverse } } From 8857a4fb9a45af5a8adb4dce453f36edf7caf377 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 25 Nov 2013 21:03:40 -0500 Subject: [PATCH 389/823] Add -Dsbt.cli.nodelegation option to experiment with no delegation 
for running/showing tasks/settings from the command line. With this set to true, the following is no longer allowed for example: > compile:update --- .../src/main/scala/sbt/Settings.scala | 23 ++++++++++--------- 1 file changed, 12 insertions(+), 11 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 8bb458ec8..350ae4026 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -13,6 +13,7 @@ sealed trait Settings[Scope] def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] def get[T](scope: Scope, key: AttributeKey[T]): Option[T] + def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] } @@ -23,11 +24,11 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq; def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = - delegates(scope).toStream.flatMap(sc => scopeLocal(sc, key) ).headOption + delegates(scope).toStream.flatMap(sc => getDirect(sc, key) ).headOption def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] = - delegates(scope).toStream.filter(sc => scopeLocal(sc, key).isDefined ).headOption + delegates(scope).toStream.filter(sc => getDirect(sc, key).isDefined ).headOption - private def scopeLocal[T](scope: Scope, key: AttributeKey[T]): Option[T] = + def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] = (data get scope).flatMap(_ get key) def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] = @@ -73,8 +74,8 @@ trait Init[Scope] def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Apply[({ type l[L[x]] = List[L[S]] })#l, 
T](f, inputs.toList, AList.seq[S]) - /** Constructs a derived setting that will be automatically defined in every scope where one of its dependencies - * is explicitly defined and the where the scope matches `filter`. + /** Constructs a derived setting that will be automatically defined in every scope where one of its dependencies + * is explicitly defined and the where the scope matches `filter`. * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. * Only the static dependencies are tracked, however. */ final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true)): Setting[T] = { @@ -154,7 +155,7 @@ trait Init[Scope] def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): Seq[Setting[_]] = init.flatMap( _.dependencies flatMap scopeLocal ) ++ init - + def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = { def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = @@ -178,7 +179,7 @@ trait Init[Scope] val definedAt = skeys.find( sk => (!isFirst || ref.key != sk) && (sMap contains sk)) definedAt.toRight(Undefined(ref, k)) } - + private[this] def applyInits(ordered: Seq[Compiled[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = { val x = java.util.concurrent.Executors.newFixedThreadPool(Runtime.getRuntime.availableProcessors) @@ -253,7 +254,7 @@ trait Init[Scope] override def toString = showFullKey(key) } final class Flattened(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]]) - + def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_],Flattened] = { import collection.breakOut @@ -261,7 +262,7 @@ trait Init[Scope] val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if(dep.key.isLocal) Seq[Compiled[_]](compiled(dep)) else Nil)) def flatten(cmap: Map[ScopedKey[_],Flattened], 
key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened = new Flattened(key, deps.flatMap(dep => if(dep.key.isLocal) cmap(dep).dependencies else dep :: Nil)) - + val empty = Map.empty[ScopedKey[_],Flattened] val flattenedLocals = (empty /: ordered) { (cmap, c) => cmap.updated(c.key, flatten(cmap, c.key, c.dependencies)) } compiled flatMap{ case (key, comp) => @@ -365,7 +366,7 @@ trait Init[Scope] out ++= ds addDefs(ds) process(ds ::: ss) - case Nil => + case Nil => } process(defs.toList) out.toList ++ defs @@ -437,7 +438,7 @@ trait Init[Scope] override final def hashCode = id.hashCode override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } } - + private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { case Left(undefs) => throw new RuntimeUndefined(undefs) From 4da31cd27d0f67c3c4bb47d78773d5aa09a68d32 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Tue, 19 Nov 2013 21:16:06 +0100 Subject: [PATCH 390/823] Extract source code dependencies by tree walking. Previously, the incremental compiler extracted source code dependencies by inspecting the `CompilationUnit.depends` set. This set is constructed by the Scala compiler and contains every symbol that the given compilation unit referred to or even merely saw (in the case of implicit search). There are a few problems with this approach: * The contract for `CompilationUnit.depends` is not clearly defined in the Scala compiler and there are no tests around it. Read: it is not an official, maintained API. * Improvements to the incremental compiler require more context about a given dependency. For example, we want to distinguish between depending on a class by merely selecting members from it and depending on it through inheritance. Another example: we might want to know the dependencies of a given class rather than of the whole compilation unit, to make the invalidation logic more precise.
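The tree-walking idea can be pictured with a toy AST (illustrative only; the real phase traverses the compiler's typed trees): each reference node carries a symbol, and a recursive traversal collects them all, instead of trusting a compiler-maintained side set.

```scala
// Toy model of dependency extraction by tree walking. Symbols are Strings;
// the real implementation walks scalac's Import/Select/Ident/TypeTree nodes.
sealed trait Tree
case class Ident(sym: String) extends Tree
case class Select(qual: Tree, sym: String) extends Tree
case class Apply(fun: Tree, args: List[Tree]) extends Tree

def collectDeps(t: Tree): Set[String] = t match {
  case Ident(s)     => Set(s)
  case Select(q, s) => collectDeps(q) + s
  case Apply(f, as) => collectDeps(f) ++ as.flatMap(collectDeps)
}

// For example, `scala.Predef(x)` depends on scala, Predef, and x.
val deps = collectDeps(Apply(Select(Ident("scala"), "Predef"), List(Ident("x"))))
```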
That led to the idea of pushing the dependency extraction logic to the incremental compiler side so it can evolve independently of Scala compiler releases and be refined as needed. We extract the dependencies of a compilation unit by walking its type-checked tree and gathering the symbols attached to its nodes. Specifically, the tree walk is implemented as a separate phase that runs after pickler and extracts symbols from the following tree nodes: * `Import` so we can track dependencies on unused imports * `Select` which is used for selecting all terms * `Ident` used for referring to local terms, package-local terms and top-level packages * `TypeTree` which is used for referring to all types Note that we do not extract just a single symbol assigned to a `TypeTree` node because it might represent a complex type that mentions several symbols. We collect all those symbols by traversing the type with CollectTypeTraverser. The implementation of the traverser is inspired by `CollectTypeCollector` from Scala 2.10. The `source-dependencies/typeref-only` test covers a scenario where the dependency is introduced through a TypeRef only. --- interface/src/main/java/xsbti/AnalysisCallback.java | 12 ++++++++++++ interface/src/test/scala/xsbti/TestCallback.scala | 2 +- 2 files changed, 13 insertions(+), 1 deletion(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 55a90f011..ff239ae74 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -27,4 +27,16 @@ public interface AnalysisCallback /** Provides problems discovered during compilation. These may be reported (logged) or unreported. * Unreported problems are usually unreported because reporting was not enabled via a command line switch.
*/ public void problem(String what, Position pos, String msg, Severity severity, boolean reported); + /** + * Determines whether member reference and inheritance dependencies should be extracted in given compiler + * run. + * + * As the signature suggests, this method's implementation is meant to be side-effect free. It's added + * to AnalysisCallback because it indicates how other callback calls should be interpreted by both + * implementation of AnalysisCallback and it's clients. + * + * NOTE: This method is an implementation detail and can be removed at any point without deprecation. + * Do not depend on it, please. + */ + public boolean memberRefAndInheritanceDeps(); } \ No newline at end of file diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala index f8454336a..28bee5466 100644 --- a/interface/src/test/scala/xsbti/TestCallback.scala +++ b/interface/src/test/scala/xsbti/TestCallback.scala @@ -4,7 +4,7 @@ import java.io.File import scala.collection.mutable.ArrayBuffer import xsbti.api.SourceAPI -class TestCallback extends AnalysisCallback +class TestCallback(override val memberRefAndInheritanceDeps: Boolean = false) extends AnalysisCallback { val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] From 7f04c14a123aa7f8d687d34fb51c9851950e90fa Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Thu, 28 Nov 2013 13:42:39 +0100 Subject: [PATCH 391/823] Rename Relations.{memberRefAndInheritanceDeps => nameHashing} The previous name of the flag was rather specific: it indicated whether the new source dependency tracking is supported by a given Relations object. However, there will be more functionality added to Relations that is specific to the name hashing algorithm. Therefore it makes sense to name the flag just `nameHashing`.
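The contract described in the Javadoc above — a side-effect-free boolean on the callback that tells the producer how its other calls will be interpreted — can be sketched in miniature. `MiniCallback` and `Recorder` are hypothetical names, not part of xsbti:

```java
import java.util.*;

// Hypothetical mini-version of a callback whose boolean capability flag
// changes how dependency reports are interpreted (not the real xsbti API).
interface MiniCallback {
    boolean nameHashing(); // side-effect free: a pure capability query
    void dependency(String from, String to, boolean byInheritance);
}

class Recorder implements MiniCallback {
    private final boolean nameHashing;
    final List<String> log = new ArrayList<>();

    Recorder(boolean nameHashing) { this.nameHashing = nameHashing; }

    public boolean nameHashing() { return nameHashing; }

    public void dependency(String from, String to, boolean byInheritance) {
        // With the flag off, the inheritance distinction is dropped,
        // mimicking the old single-relation behaviour.
        if (nameHashing() && byInheritance) log.add(from + " inherits " + to);
        else log.add(from + " uses " + to);
    }
}
```

Because the flag is a pure query, the producer may call it once per run and cache the answer; that is why the interface insists on a side-effect-free implementation.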
I decided to rename Relations implementation classes to be more consistent with the name of the flag and with the purpose they serve. The flag in AnalysisCallback (and classes implementing it) has been renamed as well. --- interface/src/main/java/xsbti/AnalysisCallback.java | 9 ++++++--- interface/src/test/scala/xsbti/TestCallback.scala | 2 +- 2 files changed, 7 insertions(+), 4 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index ff239ae74..790db124a 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -28,8 +28,11 @@ public interface AnalysisCallback * Unreported problems are usually unreported because reporting was not enabled via a command line switch. */ public void problem(String what, Position pos, String msg, Severity severity, boolean reported); /** - * Determines whether member reference and inheritance dependencies should be extracted in given compiler - * run. + * Determines whether method calls through this interface should be interpreted as serving + * name hashing algorithm needs in given compiler run. + * + * In particular, it indicates whether member reference and inheritance dependencies should be + * extracted. * * As the signature suggests, this method's implementation is meant to be side-effect free. It's added * to AnalysisCallback because it indicates how other callback calls should be interpreted by both @@ -38,5 +41,5 @@ public interface AnalysisCallback * NOTE: This method is an implementation detail and can be removed at any point without deprecation. * Do not depend on it, please. 
*/ - public boolean memberRefAndInheritanceDeps(); + public boolean nameHashing(); } \ No newline at end of file diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala index 28bee5466..e620f6be2 100644 --- a/interface/src/test/scala/xsbti/TestCallback.scala +++ b/interface/src/test/scala/xsbti/TestCallback.scala @@ -4,7 +4,7 @@ import java.io.File import scala.collection.mutable.ArrayBuffer import xsbti.api.SourceAPI -class TestCallback(override val memberRefAndInheritanceDeps: Boolean = false) extends AnalysisCallback +class TestCallback(override val nameHashing: Boolean = false) extends AnalysisCallback { val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] From d1678e7cac94e5d68d6722eb55be826d78d8be7c Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Fri, 22 Nov 2013 18:25:23 +0000 Subject: [PATCH 392/823] Fix deriveAndLocal bug --- util/collection/src/main/scala/sbt/Settings.scala | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 350ae4026..3780adad2 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -342,12 +342,13 @@ trait Init[Scope] val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten val scope = sk.scope def localAndDerived(d: Derived): Seq[Setting[_]] = - if(d.inScopes.add(scope) && d.setting.filter(scope)) + if(!d.inScopes.contains(scope) && d.setting.filter(scope)) { val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(scope, dep))) - if(allDepsDefined(d, scope, local.map(_.key.key).toSet)) + if(allDepsDefined(d, scope, local.map(_.key.key).toSet)) { + d.inScopes.add(scope) local :+ d.setting.setScope(scope) - else + } else Nil } else Nil From 9b1727889b7730ba05012a960bac8d0e97425165 
Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Mon, 25 Nov 2013 12:09:07 +0000 Subject: [PATCH 393/823] Added test case for derived settings --- .../src/test/scala/SettingsTest.scala | 30 +++++++++++++++++-- 1 file changed, 28 insertions(+), 2 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index 9ff703526..a00f77cf7 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -21,7 +21,7 @@ object SettingsTest extends Properties("settings") singleIntTest( chainBind(value(abs)), 0 ) } - property("Allows references to completed settings") = forAllNoShrink(30) { allowedReference _ } + property("Allows references to completed settings") = forAllNoShrink(30) { allowedReference } final def allowedReference(intermediate: Int): Prop = { val top = value(intermediate) @@ -36,6 +36,32 @@ object SettingsTest extends Properties("settings") catch { case e: java.lang.Exception => ("Unexpected exception: " + e) |: false } } + property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } + final def derivedSettings(nr: Int): Prop = + { + val alphaStr = Gen.alphaStr + val genScopedKeys = { + val attrKeys = for { + list <- Gen.listOfN(nr, alphaStr) suchThat (l => l.size == l.distinct.size) + item <- list + } yield AttributeKey[Int](item) + attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) + } + forAll(genScopedKeys) { scopedKeys => + val last = scopedKeys.last + val derivedSettings: Seq[Setting[Int]] = ( + for { + List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 + nextInit = if (scoped0 == chk) chk + else (scoped0 zipWith chk) { (p, _) => p + 1 } + } yield derive(setting(scoped1, nextInit)) + ).toSeq + + { checkKey(last, Some(nr-1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" 
} && { checkKey( last, None, evaluate(derivedSettings)) :| "Should not be derived" } } } + // Circular (dynamic) references currently loop infinitely. // This is the expected behavior (detecting dynamic cycles is expensive), // but it may be necessary to provide an option to detect them (with a performance hit) @@ -95,4 +121,4 @@ final class CCR(intermediate: Int) else iterate(value(t - 1), t-1) } -} \ No newline at end of file +} From ee8089f4a9efb4aa6764b3ea665a2980e3f94f56 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Mon, 25 Nov 2013 14:30:11 +0000 Subject: [PATCH 394/823] Removed unnecessary catch for exception --- util/collection/src/test/scala/SettingsTest.scala | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index a00f77cf7..0cfb5ea83 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -32,8 +32,7 @@ object SettingsTest extends Properties("settings") else iterate(value(t-1) ) } - try { evaluate( setting(chk, iterate(top)) :: Nil); true } - catch { case e: java.lang.Exception => ("Unexpected exception: " + e) |: false } + evaluate( setting(chk, iterate(top)) :: Nil); true } property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } From 06b862db5f7d4462a13b985a67a2d0abab450f7d Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Tue, 3 Dec 2013 12:27:29 +0100 Subject: [PATCH 395/823] Add support for tracking names used in Scala source files. Tracking of used names is a component needed by the name hashing algorithm. The extraction and storage of used names is active only when the `AnalysisCallback.nameHashing` flag is enabled; it is disabled by default. This change consists of two parts: 1.
Modification of Relations to include a new `names` relation that allows us to track used names in Scala source files 2. Implementation of logic that extracts used names from Scala compilation units (that correspond to Scala source files) The first part is straightforward: add a standard set of methods to Relations (along with their implementation) and update the logic which serializes and deserializes Relations. The second part is implemented as a tree walk that collects all symbols associated with trees. For each symbol we extract a simple, decoded name and add it to a set of extracted names. Check the documentation of `ExtractUsedNames` for a discussion of implementation details. `ExtractUsedNames` comes with unit tests grouped in `ExtractUsedNamesSpecification`. Check that class for details. Given the fact that we fork while running tests in the `compiler-interface` subproject and tests are run in parallel, which involves allocating multiple Scala compiler instances, we had to bump the default memory limit. This commit contains fixes for the gkossakowski/sbt#3, gkossakowski/sbt#5 and gkossakowski/sbt#6 issues. --- interface/src/main/java/xsbti/AnalysisCallback.java | 1 + interface/src/test/scala/xsbti/TestCallback.scala | 2 ++ 2 files changed, 3 insertions(+) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 790db124a..0e083d4eb 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -24,6 +24,7 @@ public interface AnalysisCallback public void generatedClass(File source, File module, String name); /** Called when the public API of a source file is extracted. */ public void api(File sourceFile, xsbti.api.SourceAPI source); + public void usedName(File sourceFile, String names); /** Provides problems discovered during compilation. These may be reported (logged) or unreported.
* Unreported problems are usually unreported because reporting was not enabled via a command line switch. */ public void problem(String what, Position pos, String msg, Severity severity, boolean reported); diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala index e620f6be2..3ea7e32e1 100644 --- a/interface/src/test/scala/xsbti/TestCallback.scala +++ b/interface/src/test/scala/xsbti/TestCallback.scala @@ -9,12 +9,14 @@ class TestCallback(override val nameHashing: Boolean = false) extends AnalysisCa { val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] val products = new ArrayBuffer[(File, File, String)] + val usedNames = scala.collection.mutable.Map.empty[File, Set[String]].withDefaultValue(Set.empty) val apis: scala.collection.mutable.Map[File, SourceAPI] = scala.collection.mutable.Map.empty def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { sourceDependencies += ((dependsOn, source, inherited)) } def binaryDependency(binary: File, name: String, source: File, inherited: Boolean) { binaryDependencies += ((binary, name, source, inherited)) } def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } + def usedName(source: File, name: String) { usedNames(source) += name } def api(source: File, sourceAPI: SourceAPI): Unit = { assert(!apis.contains(source), s"The `api` method should be called once per source file: $source") apis(source) = sourceAPI From f3c136df62f46b140b73960c9a01dd7d778efd1d Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Wed, 4 Dec 2013 01:34:18 +0100 Subject: [PATCH 396/823] Add hashing of public names defined in a source file. A hash for a given name in a source file is computed by combining the hashes of all definitions with that name.
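The combining step just described can be sketched as follows. The mixing scheme (a simple 31-multiplier fold over definition signatures, here rendered as plain strings) is a stand-in assumption; sbt's real HashAPI hashes structured API representations, not strings:

```java
import java.util.*;

// Toy illustration: the hash for a public name is the combination of the
// hashes of every definition carrying that name.
class NameHashesSketch {
    static Map<String, Integer> nameHashes(Map<String, List<String>> defsByName) {
        Map<String, Integer> result = new TreeMap<>();
        for (Map.Entry<String, List<String>> e : defsByName.entrySet()) {
            int hash = 1;
            for (String signature : e.getValue())
                hash = 31 * hash + signature.hashCode(); // fold in each definition
            result.put(e.getKey(), hash);
        }
        return result;
    }
}
```

Changing one overload of `bar` then changes only the hash stored for the name `bar`, which is what allows the name hashing algorithm to invalidate only the sources that actually use `bar`.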
When hashing a single definition we take into account all information about it except nested definitions. For example, if we have the following definition class Foo[T] { def bar(x: Int): Int = ??? } the hash sum for `Foo` will include the fact that we have a class with a single type parameter, but it won't include the hash sum of the `bar` method. Computed hash sums are location-sensitive. Each definition is hashed along with its location so we properly detect cases when a definition's signature stays the same but it is moved around in the same compilation unit. The location is defined as a sequence of selections. Each selection consists of a name and a name type. The name type is either a term name or a type name. The Scala specification (9.2) guarantees that each publicly visible definition is uniquely identified by a sequence of such selectors. For example, if we have: object Foo { class Bar { def abc: Int } } then the location of `abc` is Seq((TermName, Foo), (TypeName, Bar)) It's worth mentioning that we track name-hash pairs separately for regular (non-implicit) and implicit members. That's required for the name hashing algorithm because it does not apply its heuristic when implicit members are being modified. Another important characteristic is that we include all inherited members when computing name hashes. Here is the detailed list of changes made in this commit: * HashAPI has a new parameter `includeDefinitions` that allows shallow hashing of Structures (where we do not compute hashes recursively) * HashAPI exposes a `finalizeHash` method that allows one to capture the current hash at any time. This is useful if you want to hash a list of definitions and not just the whole `SourceAPI`. * NameHashing implements the actual extraction of public definitions, grouping them by simple name and computing hash sums for each group using HashAPI * The `Source` class (defined in the interface/other file) has been extended to include the `_internalOnly_nameHashes` field.
This field stores the NameHashes data structure for a given source file. NameHashes stores two separate collections of name-hash pairs, for regular and implicit members. The prefix `_internalOnly_` is used to indicate that this is not an official incremental compiler or sbt API and is for use by incremental compiler internals only. We had to use such a prefix because the `datatype` code generator doesn't support emitting access modifiers * The `AnalysisCallback` implementation has been modified to gather all name hashes and store them in the Source object * TestCaseGenerators has been modified to implement generation of NameHashes * The NameHashingSpecification contains a few unit tests that make sure that the basic functionality works properly --- interface/other | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/interface/other b/interface/other index 111896f0b..68e4c3a50 100644 --- a/interface/other +++ b/interface/other @@ -3,8 +3,17 @@ Source hash: Byte* api: SourceAPI apiHash: Int + _internalOnly_nameHashes: _internalOnly_NameHashes hasMacro: Boolean +_internalOnly_NameHashes + regularMembers: _internalOnly_NameHash* + implicitMembers: _internalOnly_NameHash* + +_internalOnly_NameHash + name: String + hash: Int + SourceAPI packages : Package* definitions: Definition* From 5dcd8bd913a780fe8e911e996d6927d5abb03e45 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 6 Dec 2013 20:43:48 -0500 Subject: [PATCH 397/823] TaskKey[T].previous: Option[T], which returns the value of the task the last time it executed. This requires a Format[T] to be implicitly available at the call site and requires the task to be referenced statically (not in a settingDyn call). References to previous task values in the form of a ScopedKey[Task[T]] + Format[T] are collected at setting load time in the 'references' setting. These are used to know which tasks should be persisted (the ScopedKey) and how to persist them (the Format).
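The persistence idea just described — a key identifying the task plus a Format saying how to serialize its value — can be sketched with hypothetical names (`PreviousStore`, `Format`); this is an assumption-laden toy, not sbt's implementation:

```java
import java.util.*;
import java.util.function.Function;

// Toy store for "previous" task values: each key is persisted with its own
// Format (a writer/reader pair). Names here are hypothetical, not sbt's API.
class PreviousStore {
    record Format<T>(Function<T, String> write, Function<String, T> read) {}

    private final Map<String, String> store = new HashMap<>(); // key -> serialized value

    <T> void persist(String key, T value, Format<T> fmt) {
        store.put(key, fmt.write().apply(value));
    }

    // Empty when the task has never run (or was never persisted).
    <T> Optional<T> previous(String key, Format<T> fmt) {
        return Optional.ofNullable(store.get(key)).map(fmt.read());
    }
}
```

Storing serialized text rather than live objects is the design point: backed by a file, such a store survives across JVM runs, which is why a Format must be available wherever `.previous` is called.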
When checking/delegating previous references, rules are slightly different. A normal reference from a task t in scope s cannot refer to t in s unless there is an earlier definition of t in s. However, a previous reference does not have this restriction. This commit modifies validateReferenced to allow this. TODO: user documentation TODO: stable selection of the Format when there are multiple .previous calls on the same task TODO: make it usable in InputTasks, specifically Parsers --- .../collection/src/main/scala/sbt/INode.scala | 16 ++-- .../src/main/scala/sbt/Settings.scala | 92 +++++++++++++++---- 2 files changed, 82 insertions(+), 26 deletions(-) diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index 4ce5ef8bb..67b1c5b36 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -27,9 +27,10 @@ abstract class EvaluateSettings[Scope] case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) case a: Apply[k,T] => new MixedNode[k,T]( a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) case b: Bind[s,T] => new BindNode[s,T]( transform(b.in), x => transform(b.f(x))) - case init.StaticScopes => constant(() => allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] + case init.StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] case v: Value[T] => constant(v.value) - case t: TransformCapture => constant(() => t.f) + case v: ValidationCapture[T] => strictConstant(v.key) + case t: TransformCapture => strictConstant(t.f) case o: Optional[s,T] => o.a match { case None => constant( () => o.f(None) ) case Some(i) => single[s,T](transform(i), x => o.f(Some(x))) @@ -80,7 +81,7 @@ abstract class EvaluateSettings[Scope] private[this] def workComplete(): Unit = if(running.decrementAndGet() == 0) complete.put( None ) - + private[this] 
sealed abstract class INode[T] { private[this] var state: EvaluationState = New @@ -92,9 +93,9 @@ abstract class EvaluateSettings[Scope] override def toString = getClass.getName + " (state=" + state + ",blockedOn=" + blockedOn + ",calledBy=" + calledBy.size + ",blocking=" + blocking.size + "): " + keyString - private[this] def keyString = + private[this] def keyString = (static.toSeq.flatMap { case (key, value) => if(value eq this) init.showFullKey(key) :: Nil else Nil }).headOption getOrElse "non-static" - + final def get: T = synchronized { assert(value != null, toString + " not evaluated") value @@ -103,7 +104,7 @@ abstract class EvaluateSettings[Scope] val ready = state == Evaluated if(!ready) blocking += from registerIfNew() - ready + ready } final def isDone: Boolean = synchronized { state == Evaluated } final def isNew: Boolean = synchronized { state == New } @@ -119,7 +120,7 @@ abstract class EvaluateSettings[Scope] else state = Blocked } - + final def schedule(): Unit = synchronized { assert(state == New || state == Blocked, "Invalid state for schedule() call: " + toString) state = Ready @@ -158,6 +159,7 @@ abstract class EvaluateSettings[Scope] protected def evaluate0(): Unit } + private[this] def strictConstant[T](v: T): INode[T] = constant(() => v) private[this] def constant[T](f: () => T): INode[T] = new MixedNode[ConstK[Unit]#l, T]((), _ => f(), AList.empty) private[this] def single[S,T](in: INode[S], f: S => T): INode[T] = new MixedNode[ ({ type l[L[x]] = L[S] })#l, T](in, f, AList.single[S]) private[this] final class BindNode[S,T](in: INode[S], f: S => INode[T]) extends INode[T] diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 3780adad2..32d4b9f85 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -59,9 +59,14 @@ trait Init[Scope] type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] type MapConstant = ScopedKey 
~> Option + private[sbt] abstract class ValidateKeyRef { + def apply[T](key: ScopedKey[T], selfRefOk: Boolean): ValidatedRef[T] + } + /** The result of this initialization is the composition of applied transformations. * This can be useful when dealing with dynamic Initialize values. */ lazy val capturedTransformations: Initialize[Initialize ~> Initialize] = new TransformCapture(idK[Initialize]) + def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) def valueStrict[T](value: T): Initialize[T] = pure(() => value) def value[T](value: => T): Initialize[T] = pure(value _) @@ -74,10 +79,15 @@ trait Init[Scope] def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) + /** The result of this initialization is the validated `key`. + * No dependency is introduced on `key`. If `selfRefOk` is true, validation will not fail if the key is referenced by a definition of `key`. + * That is, key := f(validated(key).value) is allowed only if `selfRefOk == true`. */ + private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) + /** Constructs a derived setting that will be automatically defined in every scope where one of its dependencies * is explicitly defined and the where the scope matches `filter`. * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. - * Only the static dependencies are tracked, however. */ + * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. 
*/ final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true)): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger, nextDefaultID()) @@ -158,12 +168,12 @@ trait Init[Scope] def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = { - def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateRef { def apply[T](k: ScopedKey[T]) = - delegateForKey(sMap, k, delegates(k.scope), ref, isFirst) + def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateKeyRef { def apply[T](k: ScopedKey[T], selfRefOk: Boolean) = + delegateForKey(sMap, k, delegates(k.scope), ref, selfRefOk || !isFirst) } type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = { - val (undefs, valid) = Util.separate(ks.zipWithIndex){ case (s,i) => s validateReferenced refMap(s, i == 0) } + val (undefs, valid) = Util.separate(ks.zipWithIndex){ case (s,i) => s validateKeyReferenced refMap(s, i == 0) } if(undefs.isEmpty) Right(valid) else Left(undefs.flatten) }} type Undefs[_] = Seq[Undefined] @@ -173,10 +183,10 @@ trait Init[Scope] else throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false) } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], isFirst: Boolean): Either[Undefined, ScopedKey[T]] = + private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], selfRefOk: Boolean): Either[Undefined, ScopedKey[T]] = { val skeys = scopes.iterator.map(x => ScopedKey(x, k.key)) - val definedAt = skeys.find( sk => (!isFirst || ref.key != sk) && (sMap contains sk)) + val definedAt = skeys.find( sk => (selfRefOk || ref.key != sk) && (sMap contains sk)) 
definedAt.toRight(Undefined(ref, k)) } @@ -377,14 +387,25 @@ trait Init[Scope] { def dependencies: Seq[ScopedKey[_]] def apply[S](g: T => S): Initialize[S] + + @deprecated("Will be made private.", "0.13.2") def mapReferenced(g: MapScoped): Initialize[T] - def validateReferenced(g: ValidateRef): ValidatedInit[T] + @deprecated("Will be made private.", "0.13.2") def mapConstant(g: MapConstant): Initialize[T] + + @deprecated("Will be made private.", "0.13.2") + def validateReferenced(g: ValidateRef): ValidatedInit[T] = + validateKeyReferenced( new ValidateKeyRef { def apply[T](key: ScopedKey[T], selfRefOk: Boolean) = g(key) }) + + private[sbt] def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] + def evaluate(map: Settings[Scope]): T def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipTupled(o)(idFun) def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = zipTupled(o)(f.tupled) private[this] def zipTupled[S,U](o: Initialize[S])(f: ((T,S)) => U): Initialize[U] = new Apply[({ type l[L[x]] = (L[T], L[S]) })#l, U](f, (this, o), AList.tuple2[T,S]) + /** A fold on the static attributes of this and nested Initializes. 
*/ + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S } object Initialize { @@ -411,10 +432,17 @@ trait Init[Scope] def settings = this :: Nil def definitive: Boolean = !init.dependencies.contains(key) def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) + @deprecated("Will be made private.", "0.13.2") def mapReferenced(g: MapScoped): Setting[T] = make(key, init mapReferenced g, pos) + @deprecated("Will be made private.", "0.13.2") def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => make(key, newI, pos)) + + private[sbt] def validateKeyReferenced(g: ValidateKeyRef): Either[Seq[Undefined], Setting[T]] = + (init validateKeyReferenced g).right.map(newI => make(key, newI, pos)) + def mapKey(g: MapScoped): Setting[T] = make(g(key), init, pos) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = make(key, init(t => f(key,t)), pos) + @deprecated("Will be made private.", "0.13.2") def mapConstant(g: MapConstant): Setting[T] = make(key, init mapConstant g, pos) def withPos(pos: SourcePosition) = make(key, init, pos) def positionString: Option[String] = pos match { @@ -450,8 +478,8 @@ trait Init[Scope] new (ValidatedInit ~> Initialize) { def apply[T](v: ValidatedInit[T]) = handleUndefined[T](v) } // mainly for reducing generated class count - private[this] def validateReferencedT(g: ValidateRef) = - new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateReferenced g } + private[this] def validateKeyReferencedT(g: ValidateKeyRef) = + new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateKeyReferenced g } private[this] def mapReferencedT(g: MapScoped) = new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapReferenced g } @@ -472,7 +500,7 @@ trait Init[Scope] final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) final def evaluate(ss: Settings[Scope]): T = 
transform(getValue(ss, scopedKey)) final def mapReferenced(g: MapScoped): Initialize[T] = new GetValue( g(scopedKey), transform) - final def validateReferenced(g: ValidateRef): ValidatedInit[T] = g(scopedKey) match { + private[sbt] final def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] = g(scopedKey, false) match { case Left(un) => Left(un :: Nil) case Right(nk) => Right(new GetValue(nk, transform)) } @@ -480,11 +508,13 @@ trait Init[Scope] case None => this case Some(const) => new Value(() => transform(const)) } + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init } private[this] final class GetValue[S,T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] trait KeyedInitialize[T] extends Keyed[T, T] { final val transform = idFun[T] } + private[sbt] final class TransformCapture(val f: Initialize ~> Initialize) extends Initialize[Initialize ~> Initialize] { def dependencies = Nil @@ -492,7 +522,21 @@ trait Init[Scope] def evaluate(ss: Settings[Scope]): Initialize ~> Initialize = f def mapReferenced(g: MapScoped) = new TransformCapture(mapReferencedT(g) ∙ f) def mapConstant(g: MapConstant) = new TransformCapture(mapConstantT(g) ∙ f) - def validateReferenced(g: ValidateRef) = Right(new TransformCapture(getValidated ∙ validateReferencedT(g) ∙ f)) + def validateKeyReferenced(g: ValidateKeyRef) = Right(new TransformCapture(getValidated ∙ validateKeyReferencedT(g) ∙ f)) + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + } + private[sbt] final class ValidationCapture[T](val key: ScopedKey[T], val selfRefOk: Boolean) extends Initialize[ScopedKey[T]] { + def dependencies = Nil + def apply[Z](g2: ScopedKey[T] => Z): Initialize[Z] = map(this)(g2) + def evaluate(ss: Settings[Scope]) = key + def mapReferenced(g: MapScoped) = new ValidationCapture(g(key), selfRefOk) + def mapConstant(g: MapConstant) = this + def validateKeyReferenced(g: ValidateKeyRef) = g(key, selfRefOk) 
match { + case Left(un) => Left(un :: Nil) + case Right(k) => Right(new ValidationCapture(k, selfRefOk)) + } + + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init } private[sbt] final class Bind[S,T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] { @@ -500,42 +544,49 @@ trait Init[Scope] def apply[Z](g: T => Z): Initialize[Z] = new Bind[S,Z](s => f(s)(g), in) def evaluate(ss: Settings[Scope]): T = f(in evaluate ss) evaluate ss def mapReferenced(g: MapScoped) = new Bind[S,T](s => f(s) mapReferenced g, in mapReferenced g) - def validateReferenced(g: ValidateRef) = (in validateReferenced g).right.map { validIn => - new Bind[S,T](s => handleUndefined( f(s) validateReferenced g), validIn) + def validateKeyReferenced(g: ValidateKeyRef) = (in validateKeyReferenced g).right.map { validIn => + new Bind[S,T](s => handleUndefined( f(s) validateKeyReferenced g), validIn) } def mapConstant(g: MapConstant) = new Bind[S,T](s => f(s) mapConstant g, in mapConstant g) + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = in.processAttributes(init)(f) } private[sbt] final class Optional[S,T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] { def dependencies = deps(a.toList) def apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) - def validateReferenced(g: ValidateRef) = a match { + def validateKeyReferenced(g: ValidateKeyRef) = a match { case None => Right(this) - case Some(i) => Right( new Optional(i.validateReferenced(g).right.toOption, f) ) + case Some(i) => Right( new Optional(i.validateKeyReferenced(g).right.toOption, f) ) } def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) def evaluate(ss: Settings[Scope]): T = f( a.flatMap( i => trapBadRef(evaluateT(ss)(i)) ) ) // proper solution is for evaluate to be deprecated or for external 
use only and a new internal method returning Either be used private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReference => None } + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = a match { + case None => init + case Some(i) => i.processAttributes(init)(f) + } } private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { def dependencies = Nil def mapReferenced(g: MapScoped) = this - def validateReferenced(g: ValidateRef) = Right(this) + def validateKeyReferenced(g: ValidateKeyRef) = Right(this) def apply[S](g: T => S) = new Value[S](() => g(value())) def mapConstant(g: MapConstant) = this def evaluate(map: Settings[Scope]): T = value() + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init } private[sbt] final object StaticScopes extends Initialize[Set[Scope]] { def dependencies = Nil def mapReferenced(g: MapScoped) = this - def validateReferenced(g: ValidateRef) = Right(this) + def validateKeyReferenced(g: ValidateKeyRef) = Right(this) def apply[S](g: Set[Scope] => S) = map(this)(g) def mapConstant(g: MapConstant) = this def evaluate(map: Settings[Scope]) = map.scopes + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init } private[sbt] final class Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) extends Initialize[T] { @@ -545,13 +596,16 @@ trait Init[Scope] def mapConstant(g: MapConstant) = mapInputs( mapConstantT(g) ) def mapInputs(g: Initialize ~> Initialize): Initialize[T] = new Apply(f, alist.transform(inputs, g), alist) def evaluate(ss: Settings[Scope]) = f(alist.transform(inputs, evaluateT(ss))) - def validateReferenced(g: ValidateRef) = + def validateKeyReferenced(g: ValidateKeyRef) = { - val tx = alist.transform(inputs, validateReferencedT(g)) + val tx = alist.transform(inputs, validateKeyReferencedT(g)) val undefs = alist.toList(tx).flatMap(_.left.toSeq.flatten) val 
get = new (ValidatedInit ~> Initialize) { def apply[T](vr: ValidatedInit[T]) = vr.right.get } if(undefs.isEmpty) Right(new Apply(f, alist.transform(tx, get), alist)) else Left(undefs) } + + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = + (init /: alist.toList(inputs)) { (v, i) => i.processAttributes(v)(f) } } private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) } From ca3877e138160ca75f0c476f74e50f4efdbeafe9 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 3 Jan 2014 19:32:18 -0500 Subject: [PATCH 398/823] Logic system supporting auto plugins and initial outline of AutoPlugin and Natures types. * Not integrated into project loading * Doesn't yet check that negation is acyclic before execution --- util/collection/src/main/scala/sbt/Dag.scala | 19 +- .../src/main/scala/sbt/logic/Logic.scala | 297 ++++++++++++++++++ .../logic/src/test/scala/sbt/logic/Test.scala | 84 +++++ .../src/main/scala/sbt/Relation.scala | 8 +- 4 files changed, 395 insertions(+), 13 deletions(-) create mode 100644 util/logic/src/main/scala/sbt/logic/Logic.scala create mode 100644 util/logic/src/test/scala/sbt/logic/Test.scala diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala index 4250b0f10..ef8f9cec1 100644 --- a/util/collection/src/main/scala/sbt/Dag.scala +++ b/util/collection/src/main/scala/sbt/Dag.scala @@ -15,7 +15,7 @@ object Dag import JavaConverters.asScalaSetConverter def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies) - + def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = { val discovered = new mutable.HashSet[T] @@ -24,7 +24,7 @@ object Dag def visitAll(nodes: Iterable[T]) = nodes foreach visit def visit(node : T){ if (!discovered(node)) { - discovered(node) = true; + discovered(node) = true; try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } finished += 
node; } @@ -33,11 +33,13 @@ object Dag } visitAll(nodes); - + finished.toList; } // doesn't check for cycles - def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = + def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = topologicalSortUnchecked(node :: Nil)(dependencies) + + def topologicalSortUnchecked[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = { val discovered = new mutable.HashSet[T] var finished: List[T] = Nil @@ -45,23 +47,23 @@ object Dag def visitAll(nodes: Iterable[T]) = nodes foreach visit def visit(node : T){ if (!discovered(node)) { - discovered(node) = true; + discovered(node) = true; visitAll(dependencies(node)) finished ::= node; } } - visit(node); + visitAll(nodes); finished; } final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) extends Exception( "Cyclic reference involving " + - (if(complete) all.mkString("\n ", "\n ", "") else value) + (if(complete) all.mkString("\n ", "\n ", "") else value) ) { def this(value: Any) = this(value, value :: Nil, false) override def toString = getMessage - def ::(a: Any): Cyclic = + def ::(a: Any): Cyclic = if(complete) this else if(a == value) @@ -70,4 +72,3 @@ object Dag new Cyclic(value, a :: all, false) } } - diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala new file mode 100644 index 000000000..8d02b2ab9 --- /dev/null +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -0,0 +1,297 @@ +package sbt +package logic + + import scala.annotation.tailrec + import Formula.{And, True} + +/* +Defines a propositional logic with negation as failure and only allows stratified rule sets (negation must be acyclic) in order to have a unique minimal model. 
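The uniqueness claim above can be checked by brute force for tiny programs. Below is an illustrative sketch (Python, not part of the patch; the `(head, positive body, negative body)` rule encoding and atom names are invented): a candidate interpretation is a stable model exactly when it equals the least model of its reduct. The non-stratified pair `p :- not q` / `q :- not p` yields two stable models, while the positive cycle `p :- q` / `q :- p` yields exactly one minimal model.

```python
from itertools import combinations

def least_model(rules):
    # rules: [(head, positive_body_set)]; iterate to the least fixpoint
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in rules:
            if pos <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(rules, atoms):
    # rules: [(head, pos_body, neg_body)]; enumerate candidate models and
    # keep those that equal the least model of their own reduct
    models = []
    for r in range(len(atoms) + 1):
        for cand in map(set, combinations(sorted(atoms), r)):
            reduct = [(h, p) for h, p, n in rules if not (n & cand)]
            if least_model(reduct) == cand:
                models.append(cand)
    return models

# "p :- not q" with "q :- not p": two stable models, so no unique minimum
print(stable_models([("p", set(), {"q"}), ("q", set(), {"p"})], {"p", "q"}))
# "p :- q" with "q :- p": a unique minimal model (the empty one)
print(stable_models([("p", {"q"}, set()), ("q", {"p"}, set())], {"p", "q"}))
```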
+ +For example, this is not allowed: + + p :- not q + + q :- not p +but this is: + + p :- q + + q :- p +as is this: + + p :- q + + q :- not r + + + Some useful links: + + https://en.wikipedia.org/wiki/Nonmonotonic_logic + + https://en.wikipedia.org/wiki/Negation_as_failure + + https://en.wikipedia.org/wiki/Propositional_logic + + https://en.wikipedia.org/wiki/Stable_model_semantics + + http://www.w3.org/2005/rules/wg/wiki/negation +*/ + + +/** Disjunction (or) of the list of clauses. */ +final case class Clauses(clauses: List[Clause]) { + assert(clauses.nonEmpty, "At least one clause is required.") +} + +/** When the `body` Formula succeeds, atoms in `head` are true. */ +final case class Clause(body: Formula, head: Set[Atom]) + +/** A literal is an [[Atom]] or its [[negation|Negated]]. */ +sealed abstract class Literal extends Formula { + /** The underlying (positive) atom. */ + def atom: Atom + /** Negates this literal.*/ + def unary_! : Literal +} +/** A variable with name `label`. */ +final case class Atom(label: String) extends Literal { + def atom = this + def unary_! : Negated = Negated(this) +} +/** A negated atom, in the sense of negation as failure, not logical negation. +* That is, it is true if `atom` is not known/defined. */ +final case class Negated(atom: Atom) extends Literal { + def unary_! : Atom = atom +} + +/** A formula consists of variables, negation, and conjunction (and). +* (Disjunction is not currently included- it is modeled at the level of a sequence of clauses. +* This is less convenient when defining clauses, but is not less powerful.) */ +sealed abstract class Formula { + /** Constructs a clause that proves `atoms` when this formula is true.
*/ + def proves(atom: Atom, atoms: Atom*): Clause = Clause(this, (atom +: atoms).toSet) + + /** Constructs a formula that is true iff this formula and `f` are both true.*/ + def && (f: Formula): Formula = (this, f) match { + case (True, x) => x + case (x, True) => x + case (And(as), And(bs)) => And(as ++ bs) + case (And(as), b: Literal) => And(as + b) + case (a: Literal, And(bs)) => And(bs + a) + case (a: Literal, b: Literal) => And( Set(a,b) ) + } +} + + +object Formula { + /** A conjunction of literals. */ + final case class And(literals: Set[Literal]) extends Formula { + assert(literals.nonEmpty, "'And' requires at least one literal.") + } + final case object True extends Formula +} + +object Logic +{ + def reduceAll(clauses: List[Clause], initialFacts: Set[Literal]): Matched = reduce(Clauses(clauses), initialFacts) + + /** Computes the variables in the unique stable model for the program represented by `clauses` and `initialFacts`. + * `clause` may not have any negative feedback (that is, negation is acyclic) + * and `initialFacts` cannot be in the head of any clauses in `clause`. + * These restrictions ensure that the logic program has a unique minimal model. */ + def reduce(clauses: Clauses, initialFacts: Set[Literal]): Matched = + { + val (posSeq, negSeq) = separate(initialFacts.toSeq) + val (pos, neg) = (posSeq.toSet, negSeq.toSet) + + checkContradictions(pos, neg) + checkOverlap(clauses, pos) + checkAcyclic(clauses) + + reduce0(clauses, initialFacts, Matched.empty) + } + + + /** Verifies `initialFacts` are not in the head of any `clauses`. + * This avoids the situation where an atom is proved but no clauses prove it. + * This isn't necessarily a problem, but the main sbt use cases expects + * a proven atom to have at least one clause satisfied. 
*/ + def checkOverlap(clauses: Clauses, initialFacts: Set[Atom]) { + val as = atoms(clauses) + val initialOverlap = initialFacts.filter(as.inHead) + if(initialOverlap.nonEmpty) throw new InitialOverlap(initialOverlap) + } + + private[this] def checkContradictions(pos: Set[Atom], neg: Set[Atom]) { + val contradictions = pos intersect neg + if(contradictions.nonEmpty) throw new InitialContradictions(contradictions) + } + + def checkAcyclic(clauses: Clauses) { + // TODO + } + + final class InitialContradictions(val literals: Set[Atom]) extends RuntimeException("Initial facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) + final class InitialOverlap(val literals: Set[Atom]) extends RuntimeException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) + final class CyclicNegation(val cycle: List[Atom]) extends RuntimeException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) + + /** Tracks proven atoms in the reverse order they were proved. */ + final class Matched private(val provenSet: Set[Atom], reverseOrdered: List[Atom]) { + def add(atoms: Set[Atom]): Matched = add(atoms.toList) + def add(atoms: List[Atom]): Matched = { + val newOnly = atoms.filterNot(provenSet) + new Matched(provenSet ++ newOnly, newOnly ::: reverseOrdered) + } + def ordered: List[Atom] = reverseOrdered.reverse + override def toString = ordered.map(_.label).mkString("Matched(", ",", ")") + } + object Matched { + val empty = new Matched(Set.empty, Nil) + } + + /** Separates a sequence of literals into `(pos, neg)` atom sequences. */ + private[this] def separate(lits: Seq[Literal]): (Seq[Atom], Seq[Atom]) = Util.separate(lits) { + case a: Atom => Left(a) + case Negated(n) => Right(n) + } + + /** Finds clauses that have no body and thus prove their head. + * Returns `(<proven atoms>, <unproven clauses>)`.
*/ + private[this] def findProven(c: Clauses): (Set[Atom], List[Clause]) = + { + val (proven, unproven) = c.clauses.partition(_.body == True) + (proven.flatMap(_.head).toSet, unproven) + } + private[this] def keepPositive(lits: Set[Literal]): Set[Atom] = + lits.collect{ case a: Atom => a}.toSet + + // precondition: factsToProcess contains no contradictions + @tailrec + private[this] def reduce0(clauses: Clauses, factsToProcess: Set[Literal], state: Matched): Matched = + applyAll(clauses, factsToProcess) match { + case None => // all of the remaining clauses failed on the new facts + state + case Some(applied) => + val (proven, unprovenClauses) = findProven(applied) + val processedFacts = state add keepPositive(factsToProcess) + val newlyProven = proven -- processedFacts.provenSet + val newState = processedFacts add newlyProven + if(unprovenClauses.isEmpty) + newState // no remaining clauses, done. + else { + val unproven = Clauses(unprovenClauses) + val nextFacts: Set[Literal] = if(newlyProven.nonEmpty) newlyProven.toSet else inferFailure(unproven) + reduce0(unproven, nextFacts, newState) + } + } + + /** Finds negated atoms under the negation as failure rule and returns them. + * This should be called only after there are no more known atoms to be substituted. */ + private[this] def inferFailure(clauses: Clauses): Set[Literal] = + { + /* At this point, there is at least one clause and one of the following is the case as the result of the acyclic negation rule: + i. there is at least one variable that occurs in a clause body but not in the head of a clause + ii. there is at least one variable that occurs in the head of a clause and does not transitively depend on a negated variable + In either case, each such variable x cannot be proven true and therefore proves 'not x' (negation as failure, !x in the code). 
+ */ + val allAtoms = atoms(clauses) + val newFacts: Set[Literal] = negated(allAtoms.triviallyFalse) + if(newFacts.nonEmpty) + newFacts + else { + val possiblyTrue = hasNegatedDependency(clauses.clauses, Relation.empty, Relation.empty) + val newlyFalse: Set[Literal] = negated(allAtoms.inHead -- possiblyTrue) + if(newlyFalse.nonEmpty) + newlyFalse + else // should never happen due to the acyclic negation rule + error(s"No progress:\n\tclauses: $clauses\n\tpossibly true: $possiblyTrue") + } + } + + private[this] def negated(atoms: Set[Atom]): Set[Literal] = atoms.map(a => Negated(a)) + + /** Computes the set of atoms in `clauses` that directly or transitively take a negated atom as input. + * For example, for the following clauses, this method would return `List(a, d)` : + * a :- b, not c + * d :- a + */ + @tailrec + def hasNegatedDependency(clauses: Seq[Clause], posDeps: Relation[Atom, Atom], negDeps: Relation[Atom, Atom]): List[Atom] = + clauses match { + case Seq() => + // because cycles between positive literals are allowed, this isn't strictly a topological sort + Dag.topologicalSortUnchecked(negDeps._1s)(posDeps.reverse) + case Clause(formula, head) +: tail => + // collect direct positive and negative literals and track them in separate graphs + val (pos, neg) = directDeps(formula) + val (newPos, newNeg) = ( (posDeps, negDeps) /: head) { case ( (pdeps, ndeps), d) => + (pdeps + (d, pos), ndeps + (d, neg) ) + } + hasNegatedDependency(tail, newPos, newNeg) + } + + /** Computes the `(positive, negative)` literals in `formula`. */ + private[this] def directDeps(formula: Formula): (Seq[Atom], Seq[Atom]) = formula match { + case And(lits) => separate(lits.toSeq) + case Negated(a) => (Nil, a :: Nil) + case a: Atom => (a :: Nil, Nil) + case True => (Nil, Nil) + } + + /** Computes the atoms in the heads and bodies of the clauses in `clause`. 
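The transitive computation documented above (`hasNegatedDependency`, expected to return `List(a, d)` for the example clauses) can also be expressed as a fixpoint over positive dependency edges, seeded by the heads that have a direct negative dependency. A Python sketch with an invented encoding, rather than the tail-recursive topological-sort version in the patch:

```python
def has_negated_dependency(clauses):
    # clauses: [(body, head)], body a set of (atom, negated?) literals,
    # head a set of atoms; returns atoms that directly or transitively
    # depend on a negated atom
    pos, neg = {}, {}
    for body, head in clauses:
        for h in head:
            for atom, negated in body:
                (neg if negated else pos).setdefault(h, set()).add(atom)
    # seed with heads having a direct negative dependency, then close the
    # set over the positive edges until a fixpoint is reached
    tainted = set(neg)
    changed = True
    while changed:
        changed = False
        for h, deps in pos.items():
            if h not in tainted and deps & tainted:
                tainted.add(h)
                changed = True
    return tainted

# the example from the comment above:  a :- b, not c   and   d :- a
clauses = [({("b", False), ("c", True)}, {"a"}),
           ({("a", False)}, {"d"})]
print(sorted(has_negated_dependency(clauses)))  # → ['a', 'd']
```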
*/ + def atoms(cs: Clauses): Atoms = cs.clauses.map(c => Atoms(c.head, atoms(c.body))).reduce(_ ++ _) + + /** Computes the set of all atoms in `formula`. */ + def atoms(formula: Formula): Set[Atom] = formula match { + case And(lits) => lits.map(_.atom) + case Negated(lit) => Set(lit) + case a: Atom => Set(a) + case True => Set() + } + + /** Represents the set of atoms in the heads of clauses and in the bodies (formulas) of clauses. */ + final case class Atoms(val inHead: Set[Atom], val inFormula: Set[Atom]) { + /** Concatenates this with `as`. */ + def ++ (as: Atoms): Atoms = Atoms(inHead ++ as.inHead, inFormula ++ as.inFormula) + /** Atoms that cannot be true because they do not occur in a head. */ + def triviallyFalse: Set[Atom] = inFormula -- inHead + } + + /** Applies known facts to `clause`s, deriving a new, possibly empty list of clauses. + * 1. If a fact is in the body of a clause, the derived clause has that fact removed from the body. + * 2. If the negation of a fact is in a body of a clause, that clause fails and is removed. + * 3. If a fact or its negation is in the head of a clause, the derived clause has that fact (or its negation) removed from the head. + * 4. If a head is empty, the clause proves nothing and is removed. + * + * NOTE: empty bodies do not cause a clause to succeed yet. + * All known facts must be applied before this can be done in order to avoid inconsistencies. 
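The four numbered substitution rules in the `applyAll` doc comment above translate almost line for line into a small sketch. Illustrative Python only, not the patch's Scala; the `(body, head)` clause encoding with `(atom, negated?)` literals is invented for the example:

```python
def apply_facts(clauses, facts):
    # clauses: [(body, head)], body a set of (atom, negated?) literals,
    # head a set of atoms; facts is a set of literals known to hold
    fact_atoms = {atom for atom, _ in facts}
    negated_facts = {(atom, not neg) for atom, neg in facts}
    out = []
    for body, head in clauses:
        new_head = head - fact_atoms          # rule 3: drop decided atoms
        if not new_head:                      # rule 4: proves nothing
            continue
        if body & negated_facts:              # rule 2: the clause fails
            continue
        out.append((body - facts, new_head))  # rule 1: substitute facts
    return out

# hypothetical clause "b :- a, not c" with the known fact "a":
# the clause survives with the reduced body "not c"
print(apply_facts([({("a", False), ("c", True)}, {"b"})], {("a", False)}))
```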
+ * Precondition: no contradictions in `facts` + * Postcondition: no atom in `facts` is present in the result + * Postcondition: No clauses have an empty head + * */ + def applyAll(cs: Clauses, facts: Set[Literal]): Option[Clauses] = + { + val newClauses = + if(facts.isEmpty) + cs.clauses.filter(_.head.nonEmpty) // still need to drop clauses with an empty head + else + cs.clauses.map(c => applyAll(c, facts)).flatMap(_.toList) + if(newClauses.isEmpty) None else Some(Clauses(newClauses)) + } + + def applyAll(c: Clause, facts: Set[Literal]): Option[Clause] = + { + val atoms = facts.map(_.atom) + val newHead = c.head -- atoms // 3. + if(newHead.isEmpty) // 4. empty head + None + else + substitute(c.body, facts).map( f => Clause(f, newHead) ) // 1, 2 + } + + /** Derives the formula that results from substituting `facts` into `formula`. */ + @tailrec + def substitute(formula: Formula, facts: Set[Literal]): Option[Formula] = formula match { + case And(lits) => + def negated(lits: Set[Literal]): Set[Literal] = lits.map(a => !a) + if( lits.exists( negated(facts) ) ) // 2. + None + else { + val newLits = lits -- facts + val newF = if(newLits.isEmpty) True else And(newLits) + Some(newF) // 1. 
+ } + case True => Some(True) + case lit: Literal => // define in terms of And + substitute(And(Set(lit)), facts) + } +} diff --git a/util/logic/src/test/scala/sbt/logic/Test.scala b/util/logic/src/test/scala/sbt/logic/Test.scala new file mode 100644 index 000000000..49836998a --- /dev/null +++ b/util/logic/src/test/scala/sbt/logic/Test.scala @@ -0,0 +1,84 @@ +package sbt +package logic + +object Test { + val A = Atom("A") + val B = Atom("B") + val C = Atom("C") + val D = Atom("D") + val E = Atom("E") + val F = Atom("F") + val G = Atom("G") + + val clauses = + A.proves(B) :: + A.proves(F) :: + B.proves(F) :: + F.proves(A) :: + (!C).proves(F) :: + D.proves(C) :: + C.proves(D) :: + Nil + + val cycles = Logic.reduceAll(clauses, Set()) + + val badClauses = + A.proves(D) :: + clauses + + val excludedNeg = { + val cs = + (!A).proves(B) :: + Nil + val init = + (!A) :: + (!B) :: + Nil + Logic.reduceAll(cs, init.toSet) + } + + val excludedPos = { + val cs = + A.proves(B) :: + Nil + val init = + A :: + (!B) :: + Nil + Logic.reduceAll(cs, init.toSet) + } + + val trivial = { + val cs = + Formula.True.proves(A) :: + Nil + Logic.reduceAll(cs, Set.empty) + } + + val lessTrivial = { + val cs = + Formula.True.proves(A) :: + Formula.True.proves(B) :: + (A && B && (!C)).proves(D) :: + Nil + Logic.reduceAll(cs, Set()) + } + + val ordering = { + val cs = + E.proves(F) :: + (C && !D).proves(E) :: + (A && B).proves(C) :: + Nil + Logic.reduceAll(cs, Set(A,B)) + } + + def all { + println(s"Cycles: $cycles") + println(s"xNeg: $excludedNeg") + println(s"xPos: $excludedPos") + println(s"trivial: $trivial") + println(s"lessTrivial: $lessTrivial") + println(s"ordering: $ordering") + } +} diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index 725512d0b..77c0b70c2 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -40,7 +40,7 @@ object Relation private[sbt] def 
get[X,Y](map: M[X,Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) - private[sbt] type M[X,Y] = Map[X, Set[Y]] + private[sbt] type M[X,Y] = Map[X, Set[Y]] } /** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. */ @@ -111,7 +111,7 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext { def forwardMap = fwd def reverseMap = rev - + def forward(t: A) = get(fwd, t) def reverse(t: B) = get(rev, t) @@ -119,12 +119,12 @@ private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ext def _2s = rev.keySet def size = (fwd.valuesIterator map { _.size }).foldLeft(0)(_ + _) - + def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable def +(pair: (A,B)) = this + (pair._1, Set(pair._2)) def +(from: A, to: B) = this + (from, to :: Nil) - def +(from: A, to: Traversable[B]) = + def +(from: A, to: Traversable[B]) = if(to.isEmpty) this else new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) def ++(rs: Traversable[(A,B)]) = ((this: Relation[A,B]) /: rs) { _ + _ } From 6a2e8947bb8703570e0044825b3b7cc5fd8b1f49 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 24 Jan 2014 14:19:18 -0500 Subject: [PATCH 399/823] Acyclic negation checking in logic system that backs auto-plugins. 
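The core of this patch, `Dag.findNegativeCycle`, is a depth-first search that keeps the current edge stack and, on reaching a node that is visited but not yet finished, inspects the edges in between for a negated one. A rough Python rendering of that algorithm (illustrative only, not the Scala source; the adjacency-map encoding with a `negative?` flag per edge is assumed):

```python
def find_negative_cycle(edges):
    # edges: {node: [(target, negative?), ...]}; returns the edges of a
    # cycle containing at least one negative edge, or [] if none exists
    finished, visited = set(), set()

    def visit(items, stack):
        for edge in items:
            node, _ = edge
            if node not in visited:
                visited.add(node)
                cycle = visit(edges.get(node, []), [edge] + stack)
                if cycle:
                    return cycle
                finished.add(node)
            elif node not in finished:
                # back edge: gather the cycle, then check its signs
                between = [edge]
                for e in stack:
                    if e[0] == node:
                        break
                    between.append(e)
                if any(negative for _, negative in between):
                    return between
        return []

    return visit([(n, False) for n in edges], [])

# a -> b (positive), b -> a (negative): a cycle involving negation
print(find_negative_cycle({"a": [("b", False)], "b": [("a", True)]}))
# a purely positive cycle is allowed, so no negative cycle is reported
print(find_negative_cycle({"a": [("b", False)], "b": [("a", False)]}))
```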
--- util/collection/src/main/scala/sbt/Dag.scala | 44 +++++++++++++++++++ .../src/main/scala/sbt/logic/Logic.scala | 37 +++++++++++++--- 2 files changed, 74 insertions(+), 7 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala index ef8f9cec1..58fb397ed 100644 --- a/util/collection/src/main/scala/sbt/Dag.scala +++ b/util/collection/src/main/scala/sbt/Dag.scala @@ -71,4 +71,48 @@ object Dag else new Cyclic(value, a :: all, false) } + + private[sbt] trait System[A] { + type B + def dependencies(t: A): List[B] + def isNegated(b: B): Boolean + def toA(b: B): A + } + private[sbt] def findNegativeCycle[T](system: System[T])(nodes: List[system.B]): List[system.B] = + { + import scala.annotation.tailrec + import system._ + val finished = new mutable.HashSet[T] + val visited = new mutable.HashSet[T] + + def visit(nodes: List[B], stack: List[B]): List[B] = nodes match { + case Nil => Nil + case node :: tail => + val atom = toA(node) + if(!visited(atom)) + { + visited += atom + visit(dependencies(atom), node :: stack) match { + case Nil => + finished += atom + visit(tail, stack) + case cycle => cycle + } + } + else if(!finished(atom)) + { + // cycle. If negation is involved, it is an error. 
+ val between = stack.takeWhile(f => toA(f) != atom) + if(between exists isNegated) + between + else + visit(tail, stack) + } + else + visit(tail, stack) + } + + visit(nodes, Nil) + } + } diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala index 8d02b2ab9..bb6731949 100644 --- a/util/logic/src/main/scala/sbt/logic/Logic.scala +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -117,12 +117,31 @@ object Logic } def checkAcyclic(clauses: Clauses) { - // TODO + val deps = dependencyMap(clauses) + val cycle = Dag.findNegativeCycle(system(deps))(deps.keys.toList) + if(cycle.nonEmpty) + throw new CyclicNegation(cycle) } + private[this] def system(deps: Map[Atom, Set[Literal]]) = new Dag.System[Atom] { + type B = Literal + def dependencies(a: Atom) = deps.getOrElse(a, Set.empty).toList + def isNegated(b: Literal) = b match { + case Negated(_) => true + case Atom(_) => false + } + def toA(b: Literal) = b.atom + } + + private[this] def dependencyMap(clauses: Clauses): Map[Atom, Set[Literal]] = + (Map.empty[Atom, Set[Literal]] /: clauses.clauses) { + case (m, Clause(formula, heads)) => + val deps = literals(formula) + (m /: heads) { (n, head) => n.updated(head, n.getOrElse(head, Set.empty) ++ deps) } + } final class InitialContradictions(val literals: Set[Atom]) extends RuntimeException("Initial facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) final class InitialOverlap(val literals: Set[Atom]) extends RuntimeException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) - final class CyclicNegation(val cycle: List[Atom]) extends RuntimeException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) + final class CyclicNegation(val cycle: List[Literal]) extends RuntimeException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) /** Tracks proven atoms in the reverse order they were proved. 
*/ final class Matched private(val provenSet: Set[Atom], reverseOrdered: List[Atom]) { @@ -220,11 +239,15 @@ object Logic } /** Computes the `(positive, negative)` literals in `formula`. */ - private[this] def directDeps(formula: Formula): (Seq[Atom], Seq[Atom]) = formula match { - case And(lits) => separate(lits.toSeq) - case Negated(a) => (Nil, a :: Nil) - case a: Atom => (a :: Nil, Nil) - case True => (Nil, Nil) + private[this] def directDeps(formula: Formula): (Seq[Atom], Seq[Atom]) = + Util.separate(literals(formula).toSeq) { + case Negated(a) => Right(a) + case a: Atom => Left(a) + } + private[this] def literals(formula: Formula): Set[Literal] = formula match { + case And(lits) => lits + case l: Literal => Set(l) + case True => Set.empty } /** Computes the atoms in the heads and bodies of the clauses in `clause`. */ From 3e1142843ef83da575b629b52bb873c1f2a4403f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 24 Jan 2014 14:19:18 -0500 Subject: [PATCH 400/823] Translate errors from logic system to Natures system. --- util/collection/src/main/scala/sbt/Dag.scala | 3 +- .../src/main/scala/sbt/logic/Logic.scala | 36 ++++++++++--------- 2 files changed, 22 insertions(+), 17 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala index 58fb397ed..0ce07baf2 100644 --- a/util/collection/src/main/scala/sbt/Dag.scala +++ b/util/collection/src/main/scala/sbt/Dag.scala @@ -88,6 +88,7 @@ object Dag def visit(nodes: List[B], stack: List[B]): List[B] = nodes match { case Nil => Nil case node :: tail => + def indent = "\t" * stack.size val atom = toA(node) if(!visited(atom)) { @@ -102,7 +103,7 @@ object Dag else if(!finished(atom)) { // cycle. If negation is involved, it is an error. 
- val between = stack.takeWhile(f => toA(f) != atom) + val between = node :: stack.takeWhile(f => toA(f) != atom) if(between exists isNegated) between else diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala index bb6731949..2181fbb7e 100644 --- a/util/logic/src/main/scala/sbt/logic/Logic.scala +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -82,22 +82,26 @@ object Formula { object Logic { - def reduceAll(clauses: List[Clause], initialFacts: Set[Literal]): Matched = reduce(Clauses(clauses), initialFacts) + def reduceAll(clauses: List[Clause], initialFacts: Set[Literal]): Either[LogicException, Matched] = + reduce(Clauses(clauses), initialFacts) /** Computes the variables in the unique stable model for the program represented by `clauses` and `initialFacts`. * `clause` may not have any negative feedback (that is, negation is acyclic) * and `initialFacts` cannot be in the head of any clauses in `clause`. * These restrictions ensure that the logic program has a unique minimal model. */ - def reduce(clauses: Clauses, initialFacts: Set[Literal]): Matched = + def reduce(clauses: Clauses, initialFacts: Set[Literal]): Either[LogicException, Matched] = { val (posSeq, negSeq) = separate(initialFacts.toSeq) val (pos, neg) = (posSeq.toSet, negSeq.toSet) - checkContradictions(pos, neg) - checkOverlap(clauses, pos) - checkAcyclic(clauses) + val problem = + checkContradictions(pos, neg) orElse + checkOverlap(clauses, pos) orElse + checkAcyclic(clauses) - reduce0(clauses, initialFacts, Matched.empty) + problem.toLeft( + reduce0(clauses, initialFacts, Matched.empty) + ) } @@ -105,22 +109,21 @@ object Logic * This avoids the situation where an atom is proved but no clauses prove it. * This isn't necessarily a problem, but the main sbt use cases expects * a proven atom to have at least one clause satisfied. 
*/ - def checkOverlap(clauses: Clauses, initialFacts: Set[Atom]) { + private[this] def checkOverlap(clauses: Clauses, initialFacts: Set[Atom]): Option[InitialOverlap] = { val as = atoms(clauses) val initialOverlap = initialFacts.filter(as.inHead) - if(initialOverlap.nonEmpty) throw new InitialOverlap(initialOverlap) + if(initialOverlap.nonEmpty) Some(new InitialOverlap(initialOverlap)) else None } - private[this] def checkContradictions(pos: Set[Atom], neg: Set[Atom]) { + private[this] def checkContradictions(pos: Set[Atom], neg: Set[Atom]): Option[InitialContradictions] = { val contradictions = pos intersect neg - if(contradictions.nonEmpty) throw new InitialContradictions(contradictions) + if(contradictions.nonEmpty) Some(new InitialContradictions(contradictions)) else None } - def checkAcyclic(clauses: Clauses) { + private[this] def checkAcyclic(clauses: Clauses): Option[CyclicNegation] = { val deps = dependencyMap(clauses) val cycle = Dag.findNegativeCycle(system(deps))(deps.keys.toList) - if(cycle.nonEmpty) - throw new CyclicNegation(cycle) + if(cycle.nonEmpty) Some(new CyclicNegation(cycle)) else None } private[this] def system(deps: Map[Atom, Set[Literal]]) = new Dag.System[Atom] { type B = Literal @@ -139,9 +142,10 @@ object Logic (m /: heads) { (n, head) => n.updated(head, n.getOrElse(head, Set.empty) ++ deps) } } - final class InitialContradictions(val literals: Set[Atom]) extends RuntimeException("Initial facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) - final class InitialOverlap(val literals: Set[Atom]) extends RuntimeException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) - final class CyclicNegation(val cycle: List[Literal]) extends RuntimeException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) + sealed abstract class LogicException(override val toString: String) + final class InitialContradictions(val literals: Set[Atom]) extends LogicException("Initial 
facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) + final class InitialOverlap(val literals: Set[Atom]) extends LogicException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) + final class CyclicNegation(val cycle: List[Literal]) extends LogicException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) /** Tracks proven atoms in the reverse order they were proved. */ final class Matched private(val provenSet: Set[Atom], reverseOrdered: List[Atom]) { From 7d03a9da99c0e385fd0f54acedaaf66019d9750c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 24 Jan 2014 14:19:18 -0500 Subject: [PATCH 401/823] API docs, better terminology for negative cycle checking in logic system. --- util/collection/src/main/scala/sbt/Dag.scala | 57 ++++++++++++------- .../src/main/scala/sbt/logic/Logic.scala | 11 ++-- 2 files changed, 41 insertions(+), 27 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala index 0ce07baf2..f0594ed50 100644 --- a/util/collection/src/main/scala/sbt/Dag.scala +++ b/util/collection/src/main/scala/sbt/Dag.scala @@ -72,39 +72,52 @@ object Dag new Cyclic(value, a :: all, false) } - private[sbt] trait System[A] { - type B - def dependencies(t: A): List[B] - def isNegated(b: B): Boolean - def toA(b: B): A + /** A directed graph with edges labeled positive or negative. */ + private[sbt] trait DirectedSignedGraph[Node] + { + /** Directed edge type that tracks the sign and target (head) vertex. + * The sign can be obtained via [[isNegative]] and the target vertex via [[head]]. */ + type Arrow + /** List of initial nodes. */ + def nodes: List[Arrow] + /** Outgoing edges for `n`. */ + def dependencies(n: Node): List[Arrow] + /** `true` if the edge `a` is "negative", false if it is "positive". */ + def isNegative(a: Arrow): Boolean + /** The target of the directed edge `a`. 
*/ + def head(a: Arrow): Node } - private[sbt] def findNegativeCycle[T](system: System[T])(nodes: List[system.B]): List[system.B] = + + /** Traverses a directed graph defined by `graph` looking for a cycle that includes a "negative" edge. + * The directed edges are weighted by the caller as "positive" or "negative". + * If a cycle containing a "negative" edge is detected, its member edges are returned in order. + * Otherwise, the empty list is returned. */ + private[sbt] def findNegativeCycle[Node](graph: DirectedSignedGraph[Node]): List[graph.Arrow] = { import scala.annotation.tailrec - import system._ - val finished = new mutable.HashSet[T] - val visited = new mutable.HashSet[T] + import graph._ + val finished = new mutable.HashSet[Node] + val visited = new mutable.HashSet[Node] - def visit(nodes: List[B], stack: List[B]): List[B] = nodes match { + def visit(edges: List[Arrow], stack: List[Arrow]): List[Arrow] = edges match { case Nil => Nil - case node :: tail => - def indent = "\t" * stack.size - val atom = toA(node) - if(!visited(atom)) + case edge :: tail => + val node = head(edge) + if(!visited(node)) { - visited += atom - visit(dependencies(atom), node :: stack) match { + visited += node + visit(dependencies(node), edge :: stack) match { case Nil => - finished += atom + finished += node visit(tail, stack) case cycle => cycle } } - else if(!finished(atom)) + else if(!finished(node)) { - // cycle. If negation is involved, it is an error. - val between = node :: stack.takeWhile(f => toA(f) != atom) - if(between exists isNegated) + // cycle. If a negative edge is involved, it is an error. 
+ val between = edge :: stack.takeWhile(f => head(f) != node) + if(between exists isNegative) between else visit(tail, stack) @@ -113,7 +126,7 @@ object Dag visit(tail, stack) } - visit(nodes, Nil) + visit(graph.nodes, Nil) } } diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala index 2181fbb7e..4eb8e64b1 100644 --- a/util/logic/src/main/scala/sbt/logic/Logic.scala +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -122,17 +122,18 @@ object Logic private[this] def checkAcyclic(clauses: Clauses): Option[CyclicNegation] = { val deps = dependencyMap(clauses) - val cycle = Dag.findNegativeCycle(system(deps))(deps.keys.toList) + val cycle = Dag.findNegativeCycle(graph(deps)) if(cycle.nonEmpty) Some(new CyclicNegation(cycle)) else None } - private[this] def system(deps: Map[Atom, Set[Literal]]) = new Dag.System[Atom] { - type B = Literal + private[this] def graph(deps: Map[Atom, Set[Literal]]) = new Dag.DirectedSignedGraph[Atom] { + type Arrow = Literal + def nodes = deps.keys.toList def dependencies(a: Atom) = deps.getOrElse(a, Set.empty).toList - def isNegated(b: Literal) = b match { + def isNegative(b: Literal) = b match { case Negated(_) => true case Atom(_) => false } - def toA(b: Literal) = b.atom + def head(b: Literal) = b.atom } private[this] def dependencyMap(clauses: Clauses): Map[Atom, Set[Literal]] = From 01708c572ede9400d3772bf7cbba66179eb734a3 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 24 Jan 2014 14:19:18 -0500 Subject: [PATCH 402/823] Convert logic system test cases into unit tests. Still TODO for auto-plugins/logic: * property-based tests for logic system * user documentation * (optional) 'about plugins' or similar to show more information about the auto-plugins for a project * (deferred) allow AutoPlugin to inject Commands directly? * (deferred) provide AutoPlugin functionality to arbitrary scopes instead of just at the Project level? 
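The negative-cycle check introduced in PATCH 401 above can be illustrated standalone. The sketch below is modeled on the `DirectedSignedGraph`/`findNegativeCycle` traversal shown in the diff, but it is a simplified re-implementation for illustration only: the flat `Map`-based graph, `String` nodes, and `(target, isNegative)` edge tuples are our assumptions, not the sbt code.

```scala
import scala.collection.mutable

// Sketch of signed-cycle detection, modeled on (not copied from) the
// DirectedSignedGraph / findNegativeCycle API in this patch.
object SignedCycle {
  // An edge is (target node, isNegative).
  type Edge = (String, Boolean)

  /** Depth-first search returning the edges of a cycle that contains at least
    * one negative edge, or Nil when every reachable cycle is purely positive. */
  def findNegativeCycle(roots: List[String], deps: Map[String, List[Edge]]): List[Edge] = {
    val finished = new mutable.HashSet[String]
    val visited = new mutable.HashSet[String]

    def visit(edges: List[Edge], stack: List[Edge]): List[Edge] = edges match {
      case Nil => Nil
      case edge :: tail =>
        val node = edge._1
        if (!visited(node)) {
          visited += node
          visit(deps.getOrElse(node, Nil), edge :: stack) match {
            case Nil    => finished += node; visit(tail, stack)
            case cycle  => cycle
          }
        } else if (!finished(node)) {
          // Back edge closing a cycle: only an error if some edge in it is negative.
          val between = edge :: stack.takeWhile(_._1 != node)
          if (between.exists(_._2)) between else visit(tail, stack)
        } else visit(tail, stack)
    }

    visit(roots.map(n => (n, false)), Nil)
  }
}
```

As in the patch, a purely positive cycle is tolerated (the search simply moves on), while any cycle containing a negative edge is returned to the caller as an error.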
--- .../logic/src/test/scala/sbt/logic/Test.scala | 35 ++++++++++++++++++- 1 file changed, 34 insertions(+), 1 deletion(-) diff --git a/util/logic/src/test/scala/sbt/logic/Test.scala b/util/logic/src/test/scala/sbt/logic/Test.scala index 49836998a..cf50ef9fd 100644 --- a/util/logic/src/test/scala/sbt/logic/Test.scala +++ b/util/logic/src/test/scala/sbt/logic/Test.scala @@ -1,7 +1,40 @@ package sbt package logic -object Test { + import org.scalacheck._ + import Prop.secure + import Logic.{LogicException, Matched} + +object LogicTest extends Properties("Logic") +{ + import TestClauses._ + + property("Handles trivial resolution.") = secure( expect(trivial, Set(A) ) ) + property("Handles less trivial resolution.") = secure( expect(lessTrivial, Set(B,A,D)) ) + property("Handles cycles without negation") = secure( expect(cycles, Set(F,A,B)) ) + property("Handles basic exclusion.") = secure( expect(excludedPos, Set()) ) + property("Handles exclusion of head proved by negation.") = secure( expect(excludedNeg, Set()) ) + // TODO: actually check ordering, probably as part of a check that dependencies are satisfied + property("Properly orders results.") = secure( expect(ordering, Set(B,A,C,E,F))) + property("Detects cyclic negation") = secure( + Logic.reduceAll(badClauses, Set()) match { + case Right(res) => false + case Left(err: Logic.CyclicNegation) => true + case Left(err) => error(s"Expected cyclic error, got: $err") + } + ) + + def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = result match { + case Left(err) => false + case Right(res) => + val actual = res.provenSet + (actual == expected) || error(s"Expected to prove $expected, but actually proved $actual") + } +} + +object TestClauses +{ + val A = Atom("A") val B = Atom("B") val C = Atom("C") From 489b48f736524b9d731b5f905b1bbcb79dd47580 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Wed, 12 Feb 2014 12:59:32 +0100 Subject: [PATCH 403/823] SI-8263 Avoid SOE in Symbol#logicallyEnclosingMember
under Scala 2.11 Since the fix for SI-2066, Scala 2.11 calls logicallyEnclosingMember on the `x` in the expansion of the task macro: InitializeInstance.app[[T0[x]](T0[java.io.File], T0[java.io.File]), Seq[java.io.File]] This exposed the fact that SBT has created `T0` with `NoSymbol` as the owner. This led to an SOE. I will also change the compiler to be more tolerant of this, but we can observe good discipline in the macro and pick a sensible owner. --- util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala | 4 ++-- util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index e9fb207d8..81d3be06f 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -24,7 +24,7 @@ object KListBuilder extends TupleBuilder val kconsTC: Type = kconsTpe.typeConstructor /** This is the L in the type function [L[x]] ... 
*/ - val tcVariable: TypeSymbol = newTCVariable(NoSymbol) + val tcVariable: TypeSymbol = newTCVariable(util.initialOwner) /** Instantiates KCons[h, t <: KList[L], L], where L is the type constructor variable */ def kconsType(h: Type, t: Type): Type = @@ -65,4 +65,4 @@ object KListBuilder extends TupleBuilder val alistInstance: ctx.universe.Tree = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) } -} \ No newline at end of file +} diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala index 89fe31792..871932b20 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala @@ -25,7 +25,7 @@ object TupleNBuilder extends TupleBuilder val ctx: c.type = c val representationC: PolyType = { - val tcVariable: Symbol = newTCVariable(NoSymbol) + val tcVariable: Symbol = newTCVariable(util.initialOwner) val tupleTypeArgs = inputs.map(in => typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) val tuple = global.definitions.tupleType(tupleTypeArgs) PolyType(tcVariable :: Nil, tuple.asInstanceOf[Type] ) From 3b24308396ded7f968f1cdd7e73fe02833e1b3c0 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Sat, 15 Feb 2014 12:59:03 +0100 Subject: [PATCH 404/823] using compat._ to plug source compatibility breakages This commit makes the code source compatible across Scala 2.10.3 and https://github.com/scala/scala/pull/3452, which is proposed for inclusion in Scala 2.11.0-RC1. We only strictly need the incremental compiler to build on Scala 2.11, as that is integrated into the IDE. But we gain valuable insight into compiler regressions by building *all* of SBT with 2.11. 
We only got there recently (the 0.13 branch of SBT now fully cross compiles with 2.10.3 and 2.11.0-SNAPSHOT), and this aims to keep things that way. Once 2.10 support is dropped, SBT macros will be able to exploit the new reflection APIs in 2.11 to avoid the need for casting to compiler internals, which aren't governed by binary compatibility. This has been prototyped by @xeno-by: https://github.com/sbt/sbt/pull/1121 --- util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala | 4 ++++ util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala | 4 ++++ util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala | 4 ++++ 3 files changed, 12 insertions(+) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 381674e47..c0c849fab 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -32,12 +32,16 @@ object ContextUtil { def unexpectedTree[C <: Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) } +// TODO 2.11 Remove this after dropping 2.10.x support. +private object HasCompat { val compat = ??? }; import HasCompat._ + /** Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). * This is not thread safe due to the underlying Context and related data structures not being thread safe. * Use `ContextUtil[c.type](c)` to construct. 
*/ final class ContextUtil[C <: Context](val ctx: C) { import ctx.universe.{Apply=>ApplyTree,_} + import compat._ val powerContext = ctx.asInstanceOf[reflect.macros.runtime.Context] val global: powerContext.universe.type = powerContext.universe diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index 81d3be06f..d9dbebe42 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -9,11 +9,15 @@ package appmacro /** A `TupleBuilder` that uses a KList as the tuple representation.*/ object KListBuilder extends TupleBuilder { + // TODO 2.11 Remove this after dropping 2.10.x support. + private object HasCompat { val compat = ??? }; import HasCompat._ + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val ctx: c.type = c val util = ContextUtil[c.type](c) import c.universe.{Apply=>ApplyTree,_} + import compat._ import util._ val knilType = c.typeOf[KNil] diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala index 871932b20..28fa581a4 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala @@ -14,10 +14,14 @@ object TupleNBuilder extends TupleBuilder final val MaxInputs = 11 final val TupleMethodName = "tuple" + // TODO 2.11 Remove this after dropping 2.10.x support. + private object HasCompat { val compat = ??? 
}; import HasCompat._ + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val util = ContextUtil[c.type](c) import c.universe.{Apply=>ApplyTree,_} + import compat._ import util._ val global: Global = c.universe.asInstanceOf[Global] From f2d6528c5c44a63ca8fbed2f3fc73967c149fe9b Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Fri, 7 Mar 2014 17:48:31 +0100 Subject: [PATCH 405/823] Fix task macro's handling of Symbol owners in .value The qualifier of the `.value` call may contain `DefTree`s (e.g. vals, defs) or `Function` trees. When we snip them out of the tree and graft them into a new context, we must also call `changeOwner`, so that the symbol owner structure and the tree structure are coherent. Failure to do so resulted in a crash in the compiler backend. Fixes #1150 --- .../main/scala/sbt/appmacro/ContextUtil.scala | 22 ++++++++++--------- 1 file changed, 12 insertions(+), 10 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index c0c849fab..389fd33f8 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -226,17 +226,19 @@ final class ContextUtil[C <: Context](val ctx: C) object appTransformer extends Transformer { override def transform(tree: Tree): Tree = - tree match - { - case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => subWrapper(nme.decoded, targ.tpe, qual, tree) match { - case Converted.Success(t, finalTx) => finalTx(t) - case Converted.Failure(p,m) => ctx.abort(p, m) - case _: Converted.NotApplicable[_] => super.transform(tree) - } + tree match { + case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => + changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 + subWrapper(nme.decoded, targ.tpe, qual, tree) match { + case 
Converted.Success(t, finalTx) => finalTx(t) + case Converted.Failure(p,m) => ctx.abort(p, m) + case _: Converted.NotApplicable[_] => super.transform(tree) + } case _ => super.transform(tree) } } - - appTransformer.transform(t) + appTransformer.atOwner(initialOwner) { + appTransformer.transform(t) + } } -} \ No newline at end of file +} From 44c3e27eb731326ac2328a7561d1e2d33b5c70cb Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Fri, 7 Mar 2014 22:25:29 -0500 Subject: [PATCH 406/823] Revert "Fix task macro's handling of Symbol owners in .value" This reverts commit 3017bfcd07466a2c3aa9bc5fe4760929db8ae0ed. This was causing sbt to be unable to compile. Reverting temporarily until we have a shot at a full fix. --- .../main/scala/sbt/appmacro/ContextUtil.scala | 22 +++++++++---------- 1 file changed, 10 insertions(+), 12 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 389fd33f8..c0c849fab 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -226,19 +226,17 @@ final class ContextUtil[C <: Context](val ctx: C) object appTransformer extends Transformer { override def transform(tree: Tree): Tree = - tree match { - case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => - changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 - subWrapper(nme.decoded, targ.tpe, qual, tree) match { - case Converted.Success(t, finalTx) => finalTx(t) - case Converted.Failure(p,m) => ctx.abort(p, m) - case _: Converted.NotApplicable[_] => super.transform(tree) - } + tree match + { + case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => subWrapper(nme.decoded, targ.tpe, qual, tree) match { + case Converted.Success(t, finalTx) => finalTx(t) + case Converted.Failure(p,m) => ctx.abort(p, m) + case _: Converted.NotApplicable[_] => 
super.transform(tree) + } case _ => super.transform(tree) } } - appTransformer.atOwner(initialOwner) { - appTransformer.transform(t) - } + + appTransformer.transform(t) } -} +} \ No newline at end of file From ba2d12a46dd8e472ae39faa3916df5ae5724063a Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Fri, 7 Mar 2014 17:48:31 +0100 Subject: [PATCH 407/823] Fix task macro's handling of Symbol owners in .value The qualifier of the `.value` call may contain `DefTree`s (e.g. vals, defs) or `Function` trees. When we snip them out of the tree and graft them into a new context, we must also call `changeOwner`, so that the symbol owner structure and the tree structure are coherent. Failure to do so resulted in a crash in the compiler backend. Fixes #1150 --- .../main/scala/sbt/appmacro/ContextUtil.scala | 23 +++++++++++-------- 1 file changed, 13 insertions(+), 10 deletions(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index c0c849fab..fe1baa696 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -226,17 +226,20 @@ final class ContextUtil[C <: Context](val ctx: C) object appTransformer extends Transformer { override def transform(tree: Tree): Tree = - tree match - { - case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => subWrapper(nme.decoded, targ.tpe, qual, tree) match { - case Converted.Success(t, finalTx) => finalTx(t) - case Converted.Failure(p,m) => ctx.abort(p, m) - case _: Converted.NotApplicable[_] => super.transform(tree) - } + tree match { + case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => + subWrapper(nme.decoded, targ.tpe, qual, tree) match { + case Converted.Success(t, finalTx) => + changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 + finalTx(t) + case Converted.Failure(p,m) => ctx.abort(p, m) + case 
_: Converted.NotApplicable[_] => super.transform(tree) + } case _ => super.transform(tree) } } - - appTransformer.transform(t) + appTransformer.atOwner(initialOwner) { + appTransformer.transform(t) + } } -} \ No newline at end of file +} From 13a40b14560494681d8de4abf3e57e551beeba17 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Fri, 7 Feb 2014 18:05:00 +0100 Subject: [PATCH 408/823] Fix NPE in task macro accessing q"{...}".symbol.pos We shouldn't assume that the qualifier of a `Select` is a `SymTree`; it may be a `Block`. One place that happens is after the transformation of named/defaults applications. That causes the reported `NullPointerException`. In any case, using `qual.symbol.pos` makes no sense here; it yields the position of the definitions *referred to* by `qual`, not the position of `qual` itself. Both problems are easily fixed: use `qual.pos` instead. Fixes #1107 --- util/appmacro/src/main/scala/sbt/appmacro/Instance.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index 0de166b67..043ad8731 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -167,7 +167,7 @@ object Instance def addType(tpe: Type, qual: Tree, selection: Tree): Tree = { qual.foreach(checkQual) - val vd = util.freshValDef(tpe, qual.symbol.pos, functionSym) + val vd = util.freshValDef(tpe, qual.pos, functionSym) inputs ::= new Input(tpe, qual, vd) util.refVal(selection, vd) } From fdfbaf99d4037dbc79f09ed576220d65be38cea8 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Sat, 5 Apr 2014 19:46:27 +0100 Subject: [PATCH 409/823] Implemented a file parser. Added SourceOfExamples for lazy example listing (especially useful when lazily searching for files that match a certain prefix). 
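The `SourceOfExamples` contract this commit describes — a lazily evaluated pool of completion candidates that is narrowed as the user types — can be sketched in isolation. `FixedExamples` below is a made-up in-memory implementation for demonstration; the patch's real implementation (`FileExamples`) walks the file system lazily instead.

```scala
// Illustrative sketch of the SourceOfExamples contract from this commit.
// FixedExamples is hypothetical and exists only to show the narrowing behavior.
abstract class SourceOfExamples {
  /** Remaining completion strings, i.e. continuations of the input typed so far. */
  def apply(): Iterable[String]
  /** Narrows the source after the user types `addedPrefix`. */
  def withAddedPrefix(addedPrefix: String): SourceOfExamples
}

final class FixedExamples(all: Iterable[String], prefix: String = "") extends SourceOfExamples {
  // Keep only candidates matching the accumulated prefix, and suggest just the
  // part the user has not typed yet.
  def apply(): Iterable[String] =
    all.filter(_.startsWith(prefix)).map(_.substring(prefix.length))
  def withAddedPrefix(addedPrefix: String): SourceOfExamples =
    new FixedExamples(all, prefix + addedPrefix)
}
```

For instance, narrowing `List("compile", "clean", "console")` with the typed prefix "co" leaves the continuations "mpile" and "nsole".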
--- .../src/main/scala/sbt/complete/Parser.scala | 43 +++++++++++++++++-- .../src/main/scala/sbt/complete/Parsers.scala | 35 ++++++++++++--- 2 files changed, 69 insertions(+), 9 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 798ea6d49..3ace05d26 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -70,7 +70,7 @@ sealed trait RichParser[A] /** If an exception is thrown by the original Parser, * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing.*/ def failOnException: Parser[A] - + @deprecated("Use `not` and explicitly provide the failure message", "0.12.2") def unary_- : Parser[Unit] @@ -87,6 +87,9 @@ sealed trait RichParser[A] /** Explicitly defines the completions for the original Parser.*/ def examples(s: Set[String], check: Boolean = false): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ + def examples(s: SourceOfExamples, maxNumberOfExamples: Int): Parser[A] + /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ def string(implicit ev: A <:< Seq[Char]): Parser[String] @@ -239,7 +242,7 @@ object Parser extends ParserMain case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) } } - + partial match { case Some(part) => @@ -285,6 +288,7 @@ trait ParserMain def - (o: Parser[_]) = sub(a, o) def examples(s: String*): Parser[A] = examples(s.toSet) def examples(s: Set[String], check: Boolean = false): Parser[A] = Parser.examples(a, s, check) + def examples(s: SourceOfExamples, maxNumberOfExamples: Int): Parser[A] = Parser.examples(a, s, maxNumberOfExamples) def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A 
=> Parser[B]) = bindParser(a, f) @@ -295,7 +299,7 @@ trait ParserMain /** Construct a parser that is valid, but has no valid result. This is used as a way * to provide a definitive Failure when a parser doesn't match empty input. For example, - * in `softFailure(...) | p`, if `p` doesn't match the empty sequence, the failure will come + * in `softFailure(...) | p`, if `p` doesn't match the empty sequence, the failure will come * from the Parser constructed by the `softFailure` method. */ private[sbt] def softFailure(msg: => String, definitive: Boolean = false): Parser[Nothing] = SoftInvalid( mkFailures(msg :: Nil, definitive) ) @@ -430,6 +434,17 @@ trait ParserMain } else a + def examples[A](a: Parser[A], completions: SourceOfExamples, maxNumberOfExamples: Int): Parser[A] = + if(a.valid) { + a.result match + { + case Some(av) => success( av ) + case None => + new DynamicExamples(a, completions, maxNumberOfExamples) + } + } + else a + def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] = t match { @@ -442,7 +457,7 @@ trait ParserMain } /** Establishes delegate parser `t` as a single token of tab completion. - * When tab completion of part of this token is requested, the completions provided by the delegate `t` or a later derivative are appended to + * When tab completion of part of this token is requested, the completions provided by the delegate `t` or a later derivative are appended to * the prefix String already seen by this parser. 
*/ def token[T](t: Parser[T]): Parser[T] = token(t, TokenCompletions.default) @@ -703,6 +718,26 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } +abstract class SourceOfExamples +{ + def apply(): Iterable[String] + def withAddedPrefix(addedPrefix: String): SourceOfExamples +} +private final class DynamicExamples[T](delegate: Parser[T], sourceOfExamples: SourceOfExamples, maxNumberOfExamples: Int = 10) extends ValidParser[T] +{ + def derive(c: Char) = examples(delegate derive c, sourceOfExamples.withAddedPrefix(c.toString), maxNumberOfExamples) + def result = delegate.result + lazy val resultEmpty = delegate.resultEmpty + def completions(level: Int) = { + if(sourceOfExamples().isEmpty) + if(resultEmpty.isValid) Completions.nil else Completions.empty + else { + val examplesBasedOnTheResult = sourceOfExamples().take(maxNumberOfExamples).toSet + Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) + } + } + override def toString = "examples(" + delegate + ", " + sourceOfExamples().take(2).toList + ")" +} private final class StringLiteral(str: String, start: Int) extends ValidParser[String] { assert(0 <= start && start < str.length) diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index 6bc745285..0ae64fb44 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -7,6 +7,7 @@ package sbt.complete import java.io.File import java.net.URI import java.lang.Character.{getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL} + import java.nio.file.{Files, Path} /** Provides standard implementations of commonly useful [[Parser]]s. 
*/ trait Parsers @@ -78,7 +79,7 @@ trait Parsers def isScalaIDChar(c: Char) = c.isLetterOrDigit || c == '_' def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */',' | '.' => true ; case _ => false } - + /** Matches a single character that is not a whitespace character. */ lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") @@ -128,8 +129,32 @@ trait Parsers /** Returns true if `c` is an ASCII letter or digit. */ def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - // TODO: implement - def fileParser(base: File): Parser[File] = token(mapOrFail(NotSpace)(s => new File(s.mkString)), "") + class FileExamples(base: Path, prefix: String = "") extends SourceOfExamples { + private val prefixPath: String = "." + File.separator + prefix + + override def apply(): Iterable[String] = files(base).map(base.relativize).map(_.toString.substring(prefix.length)) + + override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) + + protected def fileStartsWithPrefix(path: Path): Boolean = path.toString.startsWith(prefixPath) + + protected def directoryStartsWithPrefix(path: Path): Boolean = { + val pathString = path.toString + pathString.startsWith(prefixPath) || prefixPath.startsWith(pathString) + } + + protected def files(directory: Path): Iterable[Path] = { + import scala.collection.JavaConversions._ + val subPathStream = Files.newDirectoryStream(directory).toStream + val (subDirectories, filesOnly) = subPathStream.partition(path => Files.isDirectory(path)) + filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) + } + } + + def fileParser(base: File, maxNumberOfExamples: Int = 25): Parser[File] = + OptSpace ~> StringBasic + .examples(new FileExamples(base.toPath), maxNumberOfExamples) + .map(new File(_)) /** Parses a port number. 
Currently, this accepts any integer and presents a tab completion suggestion of ``. */ lazy val Port = token(IntBasic, "") @@ -153,7 +178,7 @@ trait Parsers /** Parses a verbatim quoted String value, discarding the quotes in the result. This kind of quoted text starts with triple quotes `"""` * and ends at the next triple quotes and may contain any character in between. */ lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> - any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ + any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ VerbatimDQuotes /** Parses a string value, interpreting escapes and discarding the surrounding quotes in the result. @@ -168,7 +193,7 @@ trait Parsers BackslashChar ~> ('b' ^^^ '\b' | 't' ^^^ '\t' | 'n' ^^^ '\n' | 'f' ^^^ '\f' | 'r' ^^^ '\r' | '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) - /** Parses a single unicode escape sequence into the represented Char. + /** Parses a single unicode escape sequence into the represented Char. * A unicode escape begins with a backslash, followed by a `u` and 4 hexadecimal digits representing the unicode value. */ lazy val UnicodeEscape: Parser[Char] = ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } From c1c52d4802f21f3a9b74b03bf294c27e3207095f Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Sun, 6 Apr 2014 00:01:30 +0100 Subject: [PATCH 410/823] Ported the file search with pre-Java 7 API. 
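The pre-Java 7 port replaces `java.nio.file.Files.newDirectoryStream` with `java.io.File.listFiles`. The sketch below shows that style of recursive, predicate-pruned listing; the names (`FileWalk`, `keepFile`, `descendInto`) are our own and the predicates stand in for the patch's prefix tests. Note that `listFiles` returns `null` for unreadable or nonexistent directories, which the sketch guards against (the NIO version surfaced such failures as exceptions instead).

```scala
import java.io.File

// Recursive listing in the pre-Java 7 style used by this commit: list a
// directory's children, keep matching files, and recurse only into
// directories the caller's pruning predicate accepts.
object FileWalk {
  def listMatching(directory: File,
                   keepFile: File => Boolean,
                   descendInto: File => Boolean): List[File] = {
    // listFiles() may return null; treat that as an empty directory.
    val children = Option(directory.listFiles()).map(_.toList).getOrElse(Nil)
    val (dirs, files) = children.partition(_.isDirectory)
    files.filter(keepFile) ++
      dirs.filter(descendInto).flatMap(d => listMatching(d, keepFile, descendInto))
  }
}
```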
--- .../src/main/scala/sbt/complete/Parsers.scala | 21 ++++++++----------- 1 file changed, 9 insertions(+), 12 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index 0ae64fb44..bac5db0c2 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -7,7 +7,6 @@ package sbt.complete import java.io.File import java.net.URI import java.lang.Character.{getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL} - import java.nio.file.{Files, Path} /** Provides standard implementations of commonly useful [[Parser]]s. */ trait Parsers @@ -129,31 +128,29 @@ trait Parsers /** Returns true if `c` is an ASCII letter or digit. */ def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - class FileExamples(base: Path, prefix: String = "") extends SourceOfExamples { - private val prefixPath: String = "." + File.separator + prefix + class FileExamples(base: File, prefix: String = "") extends SourceOfExamples { + private val relativizedPrefix: String = "." 
+ File.separator + prefix - override def apply(): Iterable[String] = files(base).map(base.relativize).map(_.toString.substring(prefix.length)) + override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) - protected def fileStartsWithPrefix(path: Path): Boolean = path.toString.startsWith(prefixPath) + protected def fileStartsWithPrefix(path: File): Boolean = path.toString.startsWith(relativizedPrefix) - protected def directoryStartsWithPrefix(path: Path): Boolean = { + protected def directoryStartsWithPrefix(path: File): Boolean = { val pathString = path.toString - pathString.startsWith(prefixPath) || prefixPath.startsWith(pathString) + pathString.startsWith(relativizedPrefix) || relativizedPrefix.startsWith(pathString) } - protected def files(directory: Path): Iterable[Path] = { - import scala.collection.JavaConversions._ - val subPathStream = Files.newDirectoryStream(directory).toStream - val (subDirectories, filesOnly) = subPathStream.partition(path => Files.isDirectory(path)) + protected def files(directory: File): Iterable[File] = { + val (subDirectories, filesOnly) = directory.listFiles().toStream.partition(_.isDirectory) filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) } } def fileParser(base: File, maxNumberOfExamples: Int = 25): Parser[File] = OptSpace ~> StringBasic - .examples(new FileExamples(base.toPath), maxNumberOfExamples) + .examples(new FileExamples(base), maxNumberOfExamples) .map(new File(_)) /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ From ab6a73016875fd65154ad8d4a5ddc9bce7e8fd5b Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Sun, 6 Apr 2014 22:39:10 +0100 Subject: [PATCH 411/823] Overloaded the fileParser method. Renamed SourceOfExamples to ExampleSource. 
Documented fileParser, FileExamples, and ExampleSource. --- .../src/main/scala/sbt/complete/Parser.scala | 36 +++++++++++++------ .../src/main/scala/sbt/complete/Parsers.scala | 19 ++++++++-- 2 files changed, 43 insertions(+), 12 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 3ace05d26..48d137e00 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -88,7 +88,7 @@ sealed trait RichParser[A] def examples(s: Set[String], check: Boolean = false): Parser[A] /** Explicitly defines the completions for the original Parser.*/ - def examples(s: SourceOfExamples, maxNumberOfExamples: Int): Parser[A] + def examples(s: ExampleSource, maxNumberOfExamples: Int): Parser[A] /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ def string(implicit ev: A <:< Seq[Char]): Parser[String] @@ -288,7 +288,7 @@ trait ParserMain def - (o: Parser[_]) = sub(a, o) def examples(s: String*): Parser[A] = examples(s.toSet) def examples(s: Set[String], check: Boolean = false): Parser[A] = Parser.examples(a, s, check) - def examples(s: SourceOfExamples, maxNumberOfExamples: Int): Parser[A] = Parser.examples(a, s, maxNumberOfExamples) + def examples(s: ExampleSource, maxNumberOfExamples: Int): Parser[A] = Parser.examples(a, s, maxNumberOfExamples) def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) @@ -434,7 +434,7 @@ trait ParserMain } else a - def examples[A](a: Parser[A], completions: SourceOfExamples, maxNumberOfExamples: Int): Parser[A] = + def examples[A](a: Parser[A], completions: ExampleSource, maxNumberOfExamples: Int): Parser[A] = if(a.valid) { a.result match { @@ -718,25 +718,41 @@ private final class Examples[T](delegate: 
Parser[T], fixed: Set[String]) extends Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } -abstract class SourceOfExamples + +/** + * These sources of examples are used in parsers for user input completion. An example of such a source is the + * [[sbt.complete.Parsers.FileExamples]] class, which provides a list of suggested files to the user as they press the + * TAB key in the console. + */ +abstract class ExampleSource { + /** + * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The + * user's input is incremented with calls to [[withAddedPrefix]]. + */ def apply(): Iterable[String] - def withAddedPrefix(addedPrefix: String): SourceOfExamples + + /** + * @param addedPrefix a string that just typed in by the user. + * @return a new source of only those examples that start with the string typed by the user so far (with addition of + * the just added prefix). 
+ */ + def withAddedPrefix(addedPrefix: String): ExampleSource } -private final class DynamicExamples[T](delegate: Parser[T], sourceOfExamples: SourceOfExamples, maxNumberOfExamples: Int = 10) extends ValidParser[T] +private final class DynamicExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int) extends ValidParser[T] { - def derive(c: Char) = examples(delegate derive c, sourceOfExamples.withAddedPrefix(c.toString), maxNumberOfExamples) + def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples) def result = delegate.result lazy val resultEmpty = delegate.resultEmpty def completions(level: Int) = { - if(sourceOfExamples().isEmpty) + if(exampleSource().isEmpty) if(resultEmpty.isValid) Completions.nil else Completions.empty else { - val examplesBasedOnTheResult = sourceOfExamples().take(maxNumberOfExamples).toSet + val examplesBasedOnTheResult = exampleSource().take(maxNumberOfExamples).toSet Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) } } - override def toString = "examples(" + delegate + ", " + sourceOfExamples().take(2).toList + ")" + override def toString = "examples(" + delegate + ", " + exampleSource().take(2).toList + ")" } private final class StringLiteral(str: String, start: Int) extends ValidParser[String] { diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index bac5db0c2..375a622f9 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -128,7 +128,12 @@ trait Parsers /** Returns true if `c` is an ASCII letter or digit. */ def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - class FileExamples(base: File, prefix: String = "") extends SourceOfExamples { + /** + * Provides path completion examples based on files in the base directory. 
+ * @param base the directory within which this class will search for completion examples. + * @param prefix the part of the path already written by the user. + */ + class FileExamples(base: File, prefix: String = "") extends ExampleSource { private val relativizedPrefix: String = "." + File.separator + prefix override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) @@ -148,11 +153,21 @@ trait Parsers } } - def fileParser(base: File, maxNumberOfExamples: Int = 25): Parser[File] = + /** + * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this + * directory will be proposed. + * @return the file that was parsed from the input string. The returned path may or may not exist. + */ + def fileParser(base: File, maxNumberOfExamples: Int): Parser[File] = OptSpace ~> StringBasic .examples(new FileExamples(base), maxNumberOfExamples) .map(new File(_)) + /** + * See the overloaded [[fileParser]] method. + */ + def fileParser(base: File): Parser[File] = fileParser(base, 25) + /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ lazy val Port = token(IntBasic, "") From b9e37107b2011c0ff1133e908816be7784fe6444 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Sun, 6 Apr 2014 22:48:22 +0100 Subject: [PATCH 412/823] Moved ExampleSource into a separate file. 
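Editor's aside (not part of the patch series): the ExampleSource contract that patch 412 moves into its own file can be illustrated with a standalone sketch in plain Scala, independent of sbt. The `InMemoryExamples` class below is a hypothetical, simplified stand-in for the sources in these patches; it shows how `withAddedPrefix` narrows the candidate set to matching suffixes as the user types.

```scala
// Illustrative sketch only: the ExampleSource contract from these patches,
// paired with a hypothetical in-memory source (not the sbt implementation).
trait ExampleSource {
  /** Remaining completion suffixes for the input typed so far. */
  def apply(): Iterable[String]
  /** Narrows the source as the user types more characters. */
  def withAddedPrefix(addedPrefix: String): ExampleSource
}

final class InMemoryExamples(examples: Iterable[String]) extends ExampleSource {
  def apply(): Iterable[String] = examples
  def withAddedPrefix(p: String): ExampleSource =
    new InMemoryExamples(examples.collect {
      // Keep only examples starting with p, minus the consumed prefix.
      case e if e.startsWith(p) => e.substring(p.length)
    })
}

val src = new InMemoryExamples(List("foo", "fool", "bar"))
println(src.withAddedPrefix("fo")().toList) // List(o, ol)
```

The same narrowing is what lets tab completion stay correct as input grows: each keystroke produces a smaller source rather than re-filtering the full set at display time.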
--- .../scala/sbt/complete/ExampleSource.scala | 49 +++++++++++++++++++ .../src/main/scala/sbt/complete/Parser.scala | 21 -------- .../src/main/scala/sbt/complete/Parsers.scala | 25 ---------- 3 files changed, 49 insertions(+), 46 deletions(-) create mode 100644 util/complete/src/main/scala/sbt/complete/ExampleSource.scala diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala new file mode 100644 index 000000000..be46bc587 --- /dev/null +++ b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -0,0 +1,49 @@ +package sbt.complete + +import java.io.File + +/** + * These sources of examples are used in parsers for user input completion. An example of such a source is the + * [[sbt.complete.FileExamples]] class, which provides a list of suggested files to the user as they press the + * TAB key in the console. + */ +trait ExampleSource +{ + /** + * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The + * user's input is incremented with calls to [[withAddedPrefix]]. + */ + def apply(): Iterable[String] + + /** + * @param addedPrefix a string that just typed in by the user. + * @return a new source of only those examples that start with the string typed by the user so far (with addition of + * the just added prefix). + */ + def withAddedPrefix(addedPrefix: String): ExampleSource +} + +/** + * Provides path completion examples based on files in the base directory. + * @param base the directory within which this class will search for completion examples. + * @param prefix the part of the path already written by the user. + */ +class FileExamples(base: File, prefix: String = "") extends ExampleSource { + private val relativizedPrefix: String = "." 
+ File.separator + prefix + + override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) + + override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) + + protected def fileStartsWithPrefix(path: File): Boolean = path.toString.startsWith(relativizedPrefix) + + protected def directoryStartsWithPrefix(path: File): Boolean = { + val pathString = path.toString + pathString.startsWith(relativizedPrefix) || relativizedPrefix.startsWith(pathString) + } + + protected def files(directory: File): Iterable[File] = { + val (subDirectories, filesOnly) = directory.listFiles().toStream.partition(_.isDirectory) + filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) + } +} \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 48d137e00..83ac69f2d 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -718,27 +718,6 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } - -/** - * These sources of examples are used in parsers for user input completion. An example of such a source is the - * [[sbt.complete.Parsers.FileExamples]] class, which provides a list of suggested files to the user as they press the - * TAB key in the console. - */ -abstract class ExampleSource -{ - /** - * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The - * user's input is incremented with calls to [[withAddedPrefix]]. - */ - def apply(): Iterable[String] - - /** - * @param addedPrefix a string that just typed in by the user. 
- * @return a new source of only those examples that start with the string typed by the user so far (with addition of - * the just added prefix). - */ - def withAddedPrefix(addedPrefix: String): ExampleSource -} private final class DynamicExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int) extends ValidParser[T] { def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples) diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index 375a622f9..911253332 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -128,31 +128,6 @@ trait Parsers /** Returns true if `c` is an ASCII letter or digit. */ def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - /** - * Provides path completion examples based on files in the base directory. - * @param base the directory within which this class will search for completion examples. - * @param prefix the part of the path already written by the user. - */ - class FileExamples(base: File, prefix: String = "") extends ExampleSource { - private val relativizedPrefix: String = "." 
+ File.separator + prefix - - override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) - - override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) - - protected def fileStartsWithPrefix(path: File): Boolean = path.toString.startsWith(relativizedPrefix) - - protected def directoryStartsWithPrefix(path: File): Boolean = { - val pathString = path.toString - pathString.startsWith(relativizedPrefix) || relativizedPrefix.startsWith(pathString) - } - - protected def files(directory: File): Iterable[File] = { - val (subDirectories, filesOnly) = directory.listFiles().toStream.partition(_.isDirectory) - filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) - } - } - /** * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this * directory will be proposed. From 6a4eb92ee500ae88d30021ab9a0f392c037ce2dc Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Sun, 6 Apr 2014 23:49:15 +0100 Subject: [PATCH 413/823] Documented the new Parsers API a bit. Prepared the new API so that we can port the old ones to the new. Added support for filtering erroneous examples. --- .../scala/sbt/complete/ExampleSource.scala | 57 +++++++++++-------- .../src/main/scala/sbt/complete/Parser.scala | 55 +++++++++++++++--- .../src/main/scala/sbt/complete/Parsers.scala | 19 +++---- 3 files changed, 87 insertions(+), 44 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala index be46bc587..f576f1bff 100644 --- a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -9,18 +9,29 @@ import java.io.File */ trait ExampleSource { - /** - * @return a (possibly lazy) list of completion example strings. 
These strings are continuations of user's input. The - * user's input is incremented with calls to [[withAddedPrefix]]. - */ - def apply(): Iterable[String] + /** + * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The + * user's input is incremented with calls to [[withAddedPrefix]]. + */ + def apply(): Iterable[String] - /** - * @param addedPrefix a string that just typed in by the user. - * @return a new source of only those examples that start with the string typed by the user so far (with addition of - * the just added prefix). - */ - def withAddedPrefix(addedPrefix: String): ExampleSource + /** + * @param addedPrefix a string that just typed in by the user. + * @return a new source of only those examples that start with the string typed by the user so far (with addition of + * the just added prefix). + */ + def withAddedPrefix(addedPrefix: String): ExampleSource +} + +/** + * @param examples the source of examples that will be displayed to the user when they press the TAB key. + */ +sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource { + override def withAddedPrefix(addedPrefix: String): ExampleSource = FixedSetExamples(examplesWithRemovedPrefix(addedPrefix)) + + override def apply(): Iterable[String] = examples + + private def examplesWithRemovedPrefix(prefix: String) = examples.collect { case example if example startsWith prefix => example substring prefix.length } } /** @@ -29,21 +40,21 @@ trait ExampleSource * @param prefix the part of the path already written by the user. */ class FileExamples(base: File, prefix: String = "") extends ExampleSource { - private val relativizedPrefix: String = "." + File.separator + prefix + private val relativizedPrefix: String = "." 
+ File.separator + prefix - override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) + override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) - override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) + override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) - protected def fileStartsWithPrefix(path: File): Boolean = path.toString.startsWith(relativizedPrefix) + protected def fileStartsWithPrefix(path: File): Boolean = path.toString.startsWith(relativizedPrefix) - protected def directoryStartsWithPrefix(path: File): Boolean = { - val pathString = path.toString - pathString.startsWith(relativizedPrefix) || relativizedPrefix.startsWith(pathString) - } + protected def directoryStartsWithPrefix(path: File): Boolean = { + val pathString = path.toString + pathString.startsWith(relativizedPrefix) || relativizedPrefix.startsWith(pathString) + } - protected def files(directory: File): Iterable[File] = { - val (subDirectories, filesOnly) = directory.listFiles().toStream.partition(_.isDirectory) - filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) - } + protected def files(directory: File): Iterable[File] = { + val (subDirectories, filesOnly) = directory.listFiles().toStream.partition(_.isDirectory) + filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) + } } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 83ac69f2d..eb1844db6 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -4,7 +4,7 @@ package sbt.complete import Parser._ - import sbt.Types.{const, left, right, some} + 
import sbt.Types.{left, right, some} import sbt.Util.{makeList,separate} /** A String parser that provides semi-automatic tab completion. @@ -87,8 +87,23 @@ sealed trait RichParser[A] /** Explicitly defines the completions for the original Parser.*/ def examples(s: Set[String], check: Boolean = false): Parser[A] - /** Explicitly defines the completions for the original Parser.*/ - def examples(s: ExampleSource, maxNumberOfExamples: Int): Parser[A] + /** + * @param exampleSource the source of examples when displaying completions to the user. + * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can + * prevent lengthy pauses and avoids bad interactive user experience. + * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the + * given parser). Invalid examples will be filtered out and only valid suggestions will + * be displayed. + * @return a new parser with a new source of completions. + */ + def examples(exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] + + /** + * @param exampleSource the source of examples when displaying completions to the user. + * @return a new parser with a new source of completions. It displays at most 25 completion examples and does not + * remove invalid examples. 
+ */ + def examples(exampleSource: ExampleSource): Parser[A] = examples(exampleSource, maxNumberOfExamples = 25, removeInvalidExamples = false) /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ def string(implicit ev: A <:< Seq[Char]): Parser[String] @@ -288,7 +303,7 @@ trait ParserMain def - (o: Parser[_]) = sub(a, o) def examples(s: String*): Parser[A] = examples(s.toSet) def examples(s: Set[String], check: Boolean = false): Parser[A] = Parser.examples(a, s, check) - def examples(s: ExampleSource, maxNumberOfExamples: Int): Parser[A] = Parser.examples(a, s, maxNumberOfExamples) + def examples(s: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = Parser.examples(a, s, maxNumberOfExamples, removeInvalidExamples) def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) @@ -434,13 +449,24 @@ trait ParserMain } else a - def examples[A](a: Parser[A], completions: ExampleSource, maxNumberOfExamples: Int): Parser[A] = + /** + * @param a the parser to decorate with a source of examples. All validation and parsing is delegated to this parser, + * only [[Parser.completions]] is modified. + * @param completions the source of examples when displaying completions to the user. + * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can + * prevent lengthy pauses and avoids bad interactive user experience. + * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the given parser). An + * exception is thrown if the example source contains no valid completion suggestions. + * @tparam A the type of values that are returned by the parser. 
+ * @return + */ + def examples[A](a: Parser[A], completions: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = if(a.valid) { a.result match { case Some(av) => success( av ) case None => - new DynamicExamples(a, completions, maxNumberOfExamples) + new DynamicExamples(a, completions, maxNumberOfExamples, removeInvalidExamples) } } else a @@ -718,20 +744,31 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } -private final class DynamicExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int) extends ValidParser[T] +private final class DynamicExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] { - def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples) + def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) def result = delegate.result lazy val resultEmpty = delegate.resultEmpty def completions(level: Int) = { if(exampleSource().isEmpty) if(resultEmpty.isValid) Completions.nil else Completions.empty else { - val examplesBasedOnTheResult = exampleSource().take(maxNumberOfExamples).toSet + val examplesBasedOnTheResult = filteredExamples.take(maxNumberOfExamples).toSet Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) } } override def toString = "examples(" + delegate + ", " + exampleSource().take(2).toList + ")" + + private def filteredExamples: Iterable[String] = { + if (removeInvalidExamples) + exampleSource().filter(isExampleValid) + else + exampleSource() + } + + private def isExampleValid(example: String): Boolean = { + apply(delegate)(example).resultEmpty.isFailure + } } private 
final class StringLiteral(str: String, start: Int) extends ValidParser[String] { diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index 911253332..cb1b15d1a 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -128,21 +128,16 @@ trait Parsers /** Returns true if `c` is an ASCII letter or digit. */ def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - /** - * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this - * directory will be proposed. - * @return the file that was parsed from the input string. The returned path may or may not exist. - */ - def fileParser(base: File, maxNumberOfExamples: Int): Parser[File] = + /** + * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this + * directory will be proposed. + * @return the file that was parsed from the input string. The returned path may or may not exist. + */ + def fileParser(base: File): Parser[File] = OptSpace ~> StringBasic - .examples(new FileExamples(base), maxNumberOfExamples) + .examples(new FileExamples(base)) .map(new File(_)) - /** - * See the overloaded [[fileParser]] method. - */ - def fileParser(base: File): Parser[File] = fileParser(base, 25) - /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ lazy val Port = token(IntBasic, "") From 35aad2b95ba606e2e36f1ee3445975267d0b1c1a Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Mon, 7 Apr 2014 16:42:08 -0400 Subject: [PATCH 414/823] Part #1 of cancel-task-hooks - Hooks EvaluateTask. * Create a new EvaluateTaskConfig which gives us a bit more freedom over changing config options to EvaluateTask in the future. 
* Create an adapter from the old EvaluateTask to the new EvaluateTask * Add hooks into signals class to register/remove a signal listener directly, rather than in an "arm" block. * Create TaskEvaluationCancelHandler to control the strategy of who can cancel (sbt-server vs. sbt-terminal). * Create a null-object for the "can't cancel" scenario so the code path is exactly the same. This commit does not wire settings into the build yet, nor does it fix the config extraction methods. --- .../src/main/scala/sbt/Signal.scala | 32 +++++++++++++++++++ 1 file changed, 32 insertions(+) diff --git a/util/collection/src/main/scala/sbt/Signal.scala b/util/collection/src/main/scala/sbt/Signal.scala index 8bad472cd..0069e4b53 100644 --- a/util/collection/src/main/scala/sbt/Signal.scala +++ b/util/collection/src/main/scala/sbt/Signal.scala @@ -19,6 +19,38 @@ object Signals case Right(v) => v } } + + /** Helper interface so we can expose internals of signal-isms to others. */ + sealed trait Registration { + def remove(): Unit + } + /** Register a signal handler that can be removed later. + * NOTE: Does not stack with other signal handlers!!!! + */ + def register(handler: () => Unit, signal: String = INT): Registration = + // TODO - Maybe we can just ignore things if not is-supported. + if(supported(signal)) { + import sun.misc.{Signal,SignalHandler} + val intSignal = new Signal(signal) + val newHandler = new SignalHandler { + def handle(sig: Signal) { handler() } + } + val oldHandler = Signal.handle(intSignal, newHandler) + object unregisterNewHandler extends Registration { + override def remove(): Unit = { + Signal.handle(intSignal, oldHandler) + } + } + unregisterNewHandler + } else { + // TODO - Maybe we should just throw an exception if we don't support signals... 
+ object NullUnregisterNewHandler extends Registration { + override def remove(): Unit = () + } + NullUnregisterNewHandler + } + + def supported(signal: String): Boolean = try { From 6f80efade20560d5b88e22d452f1feca29ab1c53 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Tue, 8 Apr 2014 20:40:51 +0100 Subject: [PATCH 415/823] Documented the DynamicExamples and FixedSetExamples classes. --- .../main/scala/sbt/complete/ExampleSource.scala | 3 ++- .../src/main/scala/sbt/complete/Parser.scala | 16 ++++++++++++++++ 2 files changed, 18 insertions(+), 1 deletion(-) diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala index f576f1bff..e13626083 100644 --- a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -24,7 +24,8 @@ trait ExampleSource } /** - * @param examples the source of examples that will be displayed to the user when they press the TAB key. + * A convenience example source that wraps any collection of strings into a source of examples. + * @param examples the examples that will be displayed to the user when they press the TAB key. 
*/ sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource { override def withAddedPrefix(addedPrefix: String): ExampleSource = FixedSetExamples(examplesWithRemovedPrefix(addedPrefix)) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index eb1844db6..c76c2f06a 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -744,6 +744,22 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends Completions(fixed map(f => Completion.suggestion(f)) ) override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" } + +/** + * This class wraps an existing parser (the delegate), and replaces the delegate's completions with examples from + * the given example source. + * + * This class asks the example source for a limited amount of examples (to prevent lengthy and expensive + * computations and large amounts of allocated data). It then passes these examples on to the UI. + * + * @param delegate the parser to decorate with completion examples (i.e., completion of user input). + * @param exampleSource the source from which this class will take examples (potentially filter them with the delegate + * parser), and pass them to the UI. + * @param maxNumberOfExamples the maximum number of completions to read from the example source and pass to the UI. This + * limit prevents lengthy example generation and allocation of large amounts of memory. + * @param removeInvalidExamples indicates whether to remove examples that are deemed invalid by the delegate parser. + * @tparam T the type of value produced by the parser. 
+ */ private final class DynamicExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] { def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) From d8ef5af53379e6667f010b3fffec634314391920 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Tue, 8 Apr 2014 21:27:27 +0100 Subject: [PATCH 416/823] Now using ExampleSource in collection-based completion parsers. Removed the Examples parser. Renamed DynamicExamples to ParserWithExamples. --- .../src/main/scala/sbt/complete/Parser.scala | 37 ++----------------- util/complete/src/test/scala/ParserTest.scala | 3 +- 2 files changed, 5 insertions(+), 35 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index c76c2f06a..8a54b09fd 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -174,16 +174,9 @@ object Parser extends ParserMain def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive) def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive) - def checkMatches(a: Parser[_], completions: Seq[String]) - { - val bad = completions.filter( apply(a)(_).resultEmpty.isFailure) - if(!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) - } - def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] = (a,b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } - def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] = a.ifValid { a.result match @@ -302,7 +295,7 @@ trait ParserMain def & (o: Parser[_]) = and(a, o) def - (o: Parser[_]) = sub(a, o) def examples(s: String*): Parser[A] = examples(s.toSet) - def examples(s: Set[String], check: 
Boolean = false): Parser[A] = Parser.examples(a, s, check) + def examples(s: Set[String], check: Boolean = false): Parser[A] = examples(new FixedSetExamples(s), s.size, check) def examples(s: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = Parser.examples(a, s, maxNumberOfExamples, removeInvalidExamples) def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) @@ -437,18 +430,6 @@ trait ParserMain // The x Completions.empty removes any trailing token completions where append.isEmpty apply(p)(s).completions(level) x Completions.empty - def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = - if(a.valid) { - a.result match - { - case Some(av) => success( av ) - case None => - if(check) checkMatches(a, completions.toSeq) - new Examples(a, completions) - } - } - else a - /** * @param a the parser to decorate with a source of examples. All validation and parsing is delegated to this parser, * only [[Parser.completions]] is modified. 
@@ -466,7 +447,7 @@ trait ParserMain { case Some(av) => success( av ) case None => - new DynamicExamples(a, completions, maxNumberOfExamples, removeInvalidExamples) + new ParserWithExamples(a, completions, maxNumberOfExamples, removeInvalidExamples) } } else a @@ -732,18 +713,6 @@ private final class Not(delegate: Parser[_], failMessage: String) extends ValidP } override def toString = " -(%s)".format(delegate) } -private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends ValidParser[T] -{ - def derive(c: Char) = examples(delegate derive c, fixed.collect { case x if x.length > 0 && x(0) == c => x substring 1 }) - def result = delegate.result - lazy val resultEmpty = delegate.resultEmpty - def completions(level: Int) = - if(fixed.isEmpty) - if(resultEmpty.isValid) Completions.nil else Completions.empty - else - Completions(fixed map(f => Completion.suggestion(f)) ) - override def toString = "examples(" + delegate + ", " + fixed.take(2) + ")" -} /** * This class wraps an existing parser (the delegate), and replaces the delegate's completions with examples from @@ -760,7 +729,7 @@ private final class Examples[T](delegate: Parser[T], fixed: Set[String]) extends * @param removeInvalidExamples indicates whether to remove examples that are deemed invalid by the delegate parser. * @tparam T the type of value produced by the parser. 
*/ -private final class DynamicExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] +private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] { def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) def result = delegate.result diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 7a5d20b23..78ee28dc0 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -118,7 +118,8 @@ object ParserExample val name = token("test") val options = (ws ~> token("quick" | "failed" | "new") )* - val include = (ws ~> token(examples(notws.string, Set("am", "is", "are", "was", "were") )) )* + val exampleSet = Set("am", "is", "are", "was", "were") + val include = (ws ~> token(examples(notws.string, new FixedSetExamples(exampleSet), exampleSet.size, false )) )* val t = name ~ options ~ include From f6aaf9ad6723fcd2d3531b1c6ee78565230c0990 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Sat, 12 Apr 2014 20:16:58 +0100 Subject: [PATCH 417/823] Created unit tests for ParserWithExamples and FixedSetExamples. 
--- .../src/main/scala/sbt/complete/Parser.scala | 2 +- .../sbt/complete/FixedSetExamplesTest.scala | 26 ++++++ .../sbt/complete/ParserWithExamplesTest.scala | 93 +++++++++++++++++++ 3 files changed, 120 insertions(+), 1 deletion(-) create mode 100644 util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala create mode 100644 util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 8a54b09fd..faea14d9a 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -752,7 +752,7 @@ private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: Ex } private def isExampleValid(example: String): Boolean = { - apply(delegate)(example).resultEmpty.isFailure + apply(delegate)(example).resultEmpty.isValid } } private final class StringLiteral(str: String, start: Int) extends ValidParser[String] diff --git a/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala new file mode 100644 index 000000000..b9a5b2de2 --- /dev/null +++ b/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala @@ -0,0 +1,26 @@ +package sbt.complete + +import org.specs2.mutable.Specification +import org.specs2.specification.Scope + +class FixedSetExamplesTest extends Specification { + + "adding a prefix" should { + "produce a smaller set of examples with the prefix removed" in new examples { + fixedSetExamples.withAddedPrefix("f")() must containTheSameElementsAs(List("oo", "ool", "u")) + fixedSetExamples.withAddedPrefix("fo")() must containTheSameElementsAs(List("o", "ol")) + fixedSetExamples.withAddedPrefix("b")() must containTheSameElementsAs(List("ar")) + } + } + + "without a prefix" should { + "produce the original set" in new examples { + fixedSetExamples() 
mustEqual exampleSet + } + } + + trait examples extends Scope { + val exampleSet = List("foo", "bar", "fool", "fu") + val fixedSetExamples = FixedSetExamples(exampleSet) + } +} diff --git a/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala new file mode 100644 index 000000000..664018f8f --- /dev/null +++ b/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala @@ -0,0 +1,93 @@ +package sbt.complete + +import org.specs2.mutable.Specification +import org.specs2.specification.Scope +import Completion._ + +class ParserWithExamplesTest extends Specification { + + "listing a limited number of completions" should { + "grab only the needed number of elements the iterable source of examples" in new parserWithLazyExamples { + parserWithExamples.completions(0) + examples.size shouldEqual maxNumberOfExamples + } + } + + "listing only valid completions" should { + "remove invalid examples" in new parserWithValidExamples { + val validCompletions = Completions(Set( + suggestion("blue"), + suggestion("red") + )) + parserWithExamples.completions(0) shouldEqual validCompletions + } + } + + "listing completions in a derived parser" should { + "produce only examples that match the derivation" in new parserWithValidExamples { + val derivedCompletions = Completions(Set( + suggestion("lue") + )) + parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions + } + } + + "listing unfiltered completions" should { + "produce all examples" in new parserWithAllExamples { + val completions = Completions(examples.map(suggestion(_)).toSet) + parserWithExamples.completions(0) shouldEqual completions + } + } + + "listing completions in a derived parser" should { + "produce only examples that match the derivation" in new parserWithAllExamples { + val derivedCompletions = Completions(Set( + suggestion("lue"), + suggestion("lock") + )) + 
parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions + } + } + + class parserWithLazyExamples extends parser(GrowableSourceOfExamples(), maxNumberOfExamples = 5, removeInvalidExamples = false) + + class parserWithValidExamples extends parser(removeInvalidExamples = true) + + class parserWithAllExamples extends parser(removeInvalidExamples = false) + + case class parser(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), + maxNumberOfExamples: Int = 25, + removeInvalidExamples: Boolean) extends Scope { + + import DefaultParsers._ + + val colorParser = "blue" | "green" | "black" | "red" + val parserWithExamples = new ParserWithExamples[String]( + colorParser, + FixedSetExamples(examples), + maxNumberOfExamples, + removeInvalidExamples + ) + } + + case class GrowableSourceOfExamples() extends Iterable[String] { + var numberOfIteratedElements: Int = 0 + + override def iterator: Iterator[String] = { + new Iterator[String] { + var currentElement = 0 + + override def next(): String = { + currentElement += 1 + numberOfIteratedElements = Math.max(currentElement, numberOfIteratedElements) + numberOfIteratedElements.toString + } + + override def hasNext: Boolean = true + } + } + + override def size: Int = numberOfIteratedElements + } + +} From 757fe4228d8991cc12e5c2b4063f60ff76051c21 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Mon, 14 Apr 2014 08:24:02 +0100 Subject: [PATCH 418/823] Improved the description of ParserWithExamples tests. 
--- .../sbt/complete/ParserWithExamplesTest.scala | 20 +++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala index 664018f8f..1151e1b0d 100644 --- a/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala +++ b/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala @@ -7,14 +7,14 @@ import Completion._ class ParserWithExamplesTest extends Specification { "listing a limited number of completions" should { - "grab only the needed number of elements the iterable source of examples" in new parserWithLazyExamples { + "grab only the needed number of elements from the iterable source of examples" in new parserWithLazyExamples { parserWithExamples.completions(0) examples.size shouldEqual maxNumberOfExamples } } "listing only valid completions" should { - "remove invalid examples" in new parserWithValidExamples { + "use the delegate parser to remove invalid examples" in new parserWithValidExamples { val validCompletions = Completions(Set( suggestion("blue"), suggestion("red") @@ -23,8 +23,8 @@ class ParserWithExamplesTest extends Specification { } } - "listing completions in a derived parser" should { - "produce only examples that match the derivation" in new parserWithValidExamples { + "listing valid completions in a derived parser" should { + "produce only valid examples that start with the character of the derivation" in new parserWithValidExamples { val derivedCompletions = Completions(Set( suggestion("lue") )) @@ -32,15 +32,15 @@ class ParserWithExamplesTest extends Specification { } } - "listing unfiltered completions" should { - "produce all examples" in new parserWithAllExamples { + "listing valid and invalid completions" should { + "produce the entire source of examples" in new parserWithAllExamples { val completions = Completions(examples.map(suggestion(_)).toSet) 
parserWithExamples.completions(0) shouldEqual completions } } - "listing completions in a derived parser" should { - "produce only examples that match the derivation" in new parserWithAllExamples { + "listing valid and invalid completions in a derived parser" should { + "produce only examples that start with the character of the derivation" in new parserWithAllExamples { val derivedCompletions = Completions(Set( suggestion("lue"), suggestion("lock") @@ -62,7 +62,7 @@ class ParserWithExamplesTest extends Specification { import DefaultParsers._ val colorParser = "blue" | "green" | "black" | "red" - val parserWithExamples = new ParserWithExamples[String]( + val parserWithExamples: Parser[String] = new ParserWithExamples[String]( colorParser, FixedSetExamples(examples), maxNumberOfExamples, @@ -71,7 +71,7 @@ class ParserWithExamplesTest extends Specification { } case class GrowableSourceOfExamples() extends Iterable[String] { - var numberOfIteratedElements: Int = 0 + private var numberOfIteratedElements: Int = 0 override def iterator: Iterator[String] = { new Iterator[String] { From a6da7640c43cf4c38aa900a37b1009f020ed1142 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 14 Apr 2014 12:13:39 -0400 Subject: [PATCH 419/823] Update CONTRIBUTING.md --- LICENSE | 2 +- NOTICE | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/LICENSE b/LICENSE index 46c73ae23..d70192438 100644 --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (c) 2008, 2009, 2010 Steven Blundy, Josh Cough, Mark Harrah, Stuart Roebuck, Tony Sloane, Vesa Vilhonen, Jason Zaugg +Copyright (c) 2008-2014 Typesafe Inc, Mark Harrah, Grzegorz Kossakowski, Josh Suereth, Indrajit Raychaudhuri, Eugene Yokota, and other contributors. All rights reserved. 
Redistribution and use in source and binary forms, with or without diff --git a/NOTICE b/NOTICE index 88899abdc..55efecac8 100644 --- a/NOTICE +++ b/NOTICE @@ -1,5 +1,5 @@ -Simple Build Tool -Copyright 2008, 2009, 2010 Mark Harrah, Jason Zaugg +sbt +Copyright (c) 2008-2014 Typesafe Inc, Mark Harrah, Grzegorz Kossakowski, Josh Suereth, Indrajit Raychaudhuri, Eugene Yokota, and other contributors. Licensed under BSD-style license (see LICENSE) Portions based on code from the Scala compiler. Portions of the Scala From cbfd3f1c08a271af5bf6960f9b5d3fd07965483f Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Wed, 16 Apr 2014 08:36:27 +0100 Subject: [PATCH 420/823] Added tests for FileExamples. Improved the file-searching in FileExamples. --- .../scala/sbt/complete/ExampleSource.scala | 31 ++++--- .../scala/sbt/complete/FileExamplesTest.scala | 92 +++++++++++++++++++ 2 files changed, 108 insertions(+), 15 deletions(-) create mode 100644 util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala index e13626083..565a8c3f1 100644 --- a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -1,6 +1,7 @@ package sbt.complete import java.io.File +import sbt.IO._ /** * These sources of examples are used in parsers for user input completion. An example of such a source is the @@ -27,12 +28,15 @@ trait ExampleSource * A convenience example source that wraps any collection of strings into a source of examples. * @param examples the examples that will be displayed to the user when they press the TAB key. 
*/ -sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource { +sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource +{ override def withAddedPrefix(addedPrefix: String): ExampleSource = FixedSetExamples(examplesWithRemovedPrefix(addedPrefix)) override def apply(): Iterable[String] = examples - private def examplesWithRemovedPrefix(prefix: String) = examples.collect { case example if example startsWith prefix => example substring prefix.length } + private def examplesWithRemovedPrefix(prefix: String) = examples.collect { + case example if example startsWith prefix => example substring prefix.length + } } /** @@ -40,22 +44,19 @@ sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSo * @param base the directory within which this class will search for completion examples. * @param prefix the part of the path already written by the user. */ -class FileExamples(base: File, prefix: String = "") extends ExampleSource { - private val relativizedPrefix: String = "." 
+ File.separator + prefix - - override def apply(): Iterable[String] = files(base).map(_.toString.substring(relativizedPrefix.length)) +class FileExamples(base: File, prefix: String = "") extends ExampleSource +{ + override def apply(): Stream[String] = files(base).map(_ substring prefix.length) override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) - protected def fileStartsWithPrefix(path: File): Boolean = path.toString.startsWith(relativizedPrefix) - - protected def directoryStartsWithPrefix(path: File): Boolean = { - val pathString = path.toString - pathString.startsWith(relativizedPrefix) || relativizedPrefix.startsWith(pathString) + protected def files(directory: File): Stream[String] = { + val childPaths = directory.listFiles().toStream + val prefixedDirectChildPaths = childPaths.map(relativize(base, _).get).filter(_ startsWith prefix) + val dirsToRecurseInto = childPaths.filter(_.isDirectory).map(relativize(base, _).get).filter(dirStartsWithPrefix) + prefixedDirectChildPaths append dirsToRecurseInto.flatMap(dir => files(new File(base, dir))) } - protected def files(directory: File): Iterable[File] = { - val (subDirectories, filesOnly) = directory.listFiles().toStream.partition(_.isDirectory) - filesOnly.filter(fileStartsWithPrefix) ++ subDirectories.filter(directoryStartsWithPrefix).flatMap(files) - } + private def dirStartsWithPrefix(relativizedPath: String): Boolean = + (relativizedPath startsWith prefix) || (prefix startsWith relativizedPath) } \ No newline at end of file diff --git a/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala new file mode 100644 index 000000000..08c9a5884 --- /dev/null +++ b/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala @@ -0,0 +1,92 @@ +package sbt.complete + +import org.specs2.mutable.Specification +import org.specs2.specification.Scope +import 
sbt.IO.withTemporaryDirectory +import java.io.File +import sbt.IO._ + +class FileExamplesTest extends Specification +{ + + "listing all files in an absolute base directory" should { + "produce the entire base directory's contents" in new directoryStructure { + fileExamples().toList should containTheSameElementsAs(allRelativizedPaths) + } + } + + "listing files with a prefix that matches none" should { + "produce an empty list" in new directoryStructure(withCompletionPrefix = "z") { + fileExamples().toList should beEmpty + } + } + + "listing single-character prefixed files" should { + "produce matching paths only" in new directoryStructure(withCompletionPrefix = "f") { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + } + + "listing directory-prefixed files" should { + "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far") { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + + "produce sub-dir contents only when appending a file separator to the directory" in new directoryStructure(withCompletionPrefix = "far" + File.separator) { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + } + + "listing files with a sub-path prefix" should { + "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far" + File.separator + "ba") { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + } + + "completing a full path" should { + "produce a list with an empty string" in new directoryStructure(withCompletionPrefix = "bazaar") { + fileExamples().toList shouldEqual List("") + } + } + + class directoryStructure(withCompletionPrefix: String = "") extends Scope with DelayedInit + { + var fileExamples: FileExamples = _ + var baseDir: File = _ + var childFiles: List[File] = _ + var childDirectories: List[File] = _ + var nestedFiles: List[File] = _ + var nestedDirectories: List[File] = _ + + def allRelativizedPaths: 
List[String] = + (childFiles ++ childDirectories ++ nestedFiles ++ nestedDirectories).map(relativize(baseDir, _).get) + + def prefixedPathsOnly: List[String] = + allRelativizedPaths.filter(_ startsWith withCompletionPrefix).map(_ substring withCompletionPrefix.length) + + override def delayedInit(testBody: => Unit): Unit = { + withTemporaryDirectory { + tempDir => + createSampleDirStructure(tempDir) + fileExamples = new FileExamples(baseDir, withCompletionPrefix) + testBody + } + } + + private def createSampleDirStructure(tempDir: File): Unit = { + childFiles = toChildFiles(tempDir, List("foo", "bar", "bazaar")) + childDirectories = toChildFiles(tempDir, List("moo", "far")) + nestedFiles = toChildFiles(childDirectories(1), List("farfile1", "barfile2")) + nestedDirectories = toChildFiles(childDirectories(1), List("fardir1", "bardir2")) + + (childDirectories ++ nestedDirectories).map(_.mkdirs()) + (childFiles ++ nestedFiles).map(_.createNewFile()) + + // NOTE: Creating a new file here because `tempDir.listFiles()` returned an empty list. + baseDir = new File(tempDir.getCanonicalPath) + } + + private def toChildFiles(baseDir: File, files: List[String]): List[File] = files.map(new File(baseDir, _)) + } + +} From 0ed9eb17aa717d65d607b62cf13d0d8ceb6725d9 Mon Sep 17 00:00:00 2001 From: Matej Urbas Date: Fri, 18 Apr 2014 16:41:05 +0100 Subject: [PATCH 421/823] Reintroduced the `examples` method. Reintroduced and deprecated the `checkMatches` method. 
--- .../src/main/scala/sbt/complete/Parser.scala | 19 +++++++++++++++++-- 1 file changed, 17 insertions(+), 2 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index faea14d9a..575cc5ec6 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -174,6 +174,13 @@ object Parser extends ParserMain def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive) def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive) + @deprecated("This method is deprecated and will be removed in the next major version. Use the parser directly to check for invalid completions.", since = "0.13.2") + def checkMatches(a: Parser[_], completions: Seq[String]) + { + val bad = completions.filter( apply(a)(_).resultEmpty.isFailure) + if(!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) + } + def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] = (a,b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } @@ -430,6 +437,9 @@ trait ParserMain // The x Completions.empty removes any trailing token completions where append.isEmpty apply(p)(s).completions(level) x Completions.empty + def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = + examples(a, new FixedSetExamples(completions), completions.size, check) + /** * @param a the parser to decorate with a source of examples. All validation and parsing is delegated to this parser, * only [[Parser.completions]] is modified. @@ -437,7 +447,7 @@ trait ParserMain * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can * prevent lengthy pauses and avoids bad interactive user experience. 
* @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the given parser). An - * exception is thrown if the example source contains no valid completion suggestions. + * exception is thrown if the example source contains no valid completion suggestions. * @tparam A the type of values that are returned by the parser. * @return */ @@ -731,9 +741,13 @@ private final class Not(delegate: Parser[_], failMessage: String) extends ValidP */ private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] { - def derive(c: Char) = examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) + def derive(c: Char) = + examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) + def result = delegate.result + lazy val resultEmpty = delegate.resultEmpty + def completions(level: Int) = { if(exampleSource().isEmpty) if(resultEmpty.isValid) Completions.nil else Completions.empty @@ -742,6 +756,7 @@ private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: Ex Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) } } + override def toString = "examples(" + delegate + ", " + exampleSource().take(2).toList + ")" private def filteredExamples: Iterable[String] = { From 89d2aa291cba2bab1c68e098b79150436bd11da4 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Wed, 11 Dec 2013 15:24:31 +0000 Subject: [PATCH 422/823] DerivedSetting not a DefaultSetting anymore --- util/collection/src/main/scala/sbt/Settings.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 32d4b9f85..9c551d829 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ 
b/util/collection/src/main/scala/sbt/Settings.scala @@ -90,7 +90,7 @@ trait Init[Scope] * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. */ final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true)): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error - new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger, nextDefaultID()) + new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) } def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") @@ -456,8 +456,8 @@ trait Init[Scope] protected[sbt] def isDerived: Boolean = false private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) } - private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean, id: Long) extends DefaultSetting[T](sk, i, p, id) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger, id) + private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends Setting[T](sk, i, p) { + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger) protected[sbt] override def isDerived: Boolean = true } // Only keep the first occurence of this setting and move it to the front so that it has lower precedence than non-defaults. 
From 431c61775c2effbbcdef1c27a9c23c656f4a6435 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Wed, 11 Dec 2013 15:24:46 +0000 Subject: [PATCH 423/823] Derived settings to replace their DerivedSetting, not go at the beginning --- .../src/main/scala/sbt/Settings.scala | 19 ++++++++++++++----- 1 file changed, 14 insertions(+), 5 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 9c551d829..557113d01 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -301,6 +301,7 @@ trait Init[Scope] val dependencies = setting.dependencies.map(_.key) def triggeredBy = dependencies.filter(setting.trigger) val inScopes = new mutable.HashSet[Scope] + val outputs = new mutable.ListBuffer[Setting[_]] } final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) { def dependencies = settings.flatMap(_.dependencies) @@ -312,6 +313,7 @@ trait Init[Scope] val (derived, rawDefs) = Util.separate[Setting[_],Derived,Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) } val defs = addLocal(rawDefs)(scopeLocal) + // group derived settings by the key they define val derivsByDef = new mutable.HashMap[AttributeKey[_], Deriveds] for(s <- derived) { @@ -329,6 +331,10 @@ trait Init[Scope] for(s <- derived; d <- s.triggeredBy) derivedBy.getOrElseUpdate(d, new mutable.ListBuffer) += s + // Map a DerivedSetting[_] to the `Derived` struct wrapping it. 
Used to ultimately replace a DerivedSetting with + // the `Setting`s that were actually derived from it: `Derived.outputs` + val derivedToStruct: Map[DerivedSetting[_], Derived] = (derived map { s => s.setting -> s }).toMap + // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present val defined = new mutable.HashSet[ScopedKey[_]] def addDefs(ss: Seq[Setting[_]]) { for(s <- ss) defined += s.key } @@ -357,7 +363,9 @@ trait Init[Scope] val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(scope, dep))) if(allDepsDefined(d, scope, local.map(_.key.key).toSet)) { d.inScopes.add(scope) - local :+ d.setting.setScope(scope) + val out = local :+ d.setting.setScope(scope) + d.outputs ++= out + out } else Nil } @@ -366,21 +374,22 @@ trait Init[Scope] } val processed = new mutable.HashSet[ScopedKey[_]] - // valid derived settings to be added before normal settings - val out = new mutable.ListBuffer[Setting[_]] // derives settings, transitively so that a derived setting can trigger another def process(rem: List[Setting[_]]): Unit = rem match { case s :: ss => val sk = s.key val ds = if(processed.add(sk)) deriveFor(sk) else Nil - out ++= ds addDefs(ds) process(ds ::: ss) case Nil => } process(defs.toList) - out.toList ++ defs + + // Take all the original defs and DerivedSettings along with locals, replace each DerivedSetting with the actual + // settings that were derived. 
+ val allDefs = addLocal(init)(scopeLocal) + allDefs flatMap { case d: DerivedSetting[_] => (derivedToStruct get d map (_.outputs)).toStream.flatten; case s => Stream(s) } } sealed trait Initialize[T] From 6f5242d812acc1d99fdfae512fffbbaa9f715578 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Tue, 17 Dec 2013 12:30:49 +0000 Subject: [PATCH 424/823] Derive settings only under the scope of the DerivedSetting --- .../src/main/scala/sbt/Settings.scala | 35 ++++++++++++------- 1 file changed, 22 insertions(+), 13 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 557113d01..89e8a9d7e 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -348,28 +348,37 @@ trait Init[Scope] def allDepsDefined(d: Derived, scope: Scope, local: Set[AttributeKey[_]]): Boolean = d.dependencies.forall(dep => local(dep) || isDefined(dep, scope)) - // List of injectable derived settings and their local settings for `sk`. + // Returns the list of injectable derived settings and their local settings for `sk`. + // The settings are to be injected under `outputScope` = whichever scope is more specific of: + // * the dependency's (`sk`) scope + // * the DerivedSetting's scope in which it has been declared, `definingScope` + // provided that these two scopes intersect. // A derived setting is injectable if: - // 1. it has not been previously injected into this scope - // 2. it applies to this scope (as determined by its `filter`) - // 3. all of its dependencies that match `trigger` are defined for that scope (allowing for delegation) + // 1. it has not been previously injected into outputScope + // 2. it applies to outputScope (as determined by its `filter`) + // 3. 
all of its dependencies are defined for outputScope (allowing for delegation) // This needs to handle local settings because a derived setting wouldn't be injected if it's local setting didn't exist yet. val deriveFor = (sk: ScopedKey[_]) => { val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten val scope = sk.scope - def localAndDerived(d: Derived): Seq[Setting[_]] = - if(!d.inScopes.contains(scope) && d.setting.filter(scope)) - { - val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(scope, dep))) - if(allDepsDefined(d, scope, local.map(_.key.key).toSet)) { - d.inScopes.add(scope) - val out = local :+ d.setting.setScope(scope) + def localAndDerived(d: Derived): Seq[Setting[_]] = { + def definingScope = d.setting.key.scope + def intersect(s1: Scope, s2: Scope): Option[Scope] = + if (delegates(s1).contains(s2)) Some(s1) // s1 is more specific + else if (delegates(s2).contains(s1)) Some(s2) // s2 is more specific + else None + val outputScope = intersect(scope, definingScope) + outputScope collect { case s if !d.inScopes.contains(s) && d.setting.filter(s) => + val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(s, dep))) + if(allDepsDefined(d, s, local.map(_.key.key).toSet)) { + d.inScopes.add(s) + val out = local :+ d.setting.setScope(s) d.outputs ++= out out } else Nil - } - else Nil + } getOrElse Nil + } derivedForKey.flatMap(localAndDerived) } From decd323b645f442c29b6f8129879de1aa2040b3b Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Tue, 17 Dec 2013 18:19:43 +0000 Subject: [PATCH 425/823] Decouple DefaultSetting from Setting/DerivedSetting; BuildCommon.derive() produces default settings by default --- .../src/main/scala/sbt/Settings.scala | 27 ++++++++++++------- 1 file changed, 18 insertions(+), 9 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 89e8a9d7e..af56cb4f8 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ 
b/util/collection/src/main/scala/sbt/Settings.scala @@ -88,19 +88,17 @@ trait Init[Scope] * is explicitly defined and the where the scope matches `filter`. * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. */ - final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true)): Setting[T] = { + final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true), default: Boolean = false): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error - new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) + def d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) + if (default) d.default() else d } def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") case _ => None } // id is used for equality - private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s match { - case _: DefaultSetting[_] | _: DerivedSetting[_] => s - case _ => new DefaultSetting[T](s.key, s.init, s.pos, nextDefaultID()) - } + private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s.default() private[sbt] def defaultSettings(ss: Seq[Setting[_]]): Seq[Setting[_]] = ss.map(s => defaultSetting(s)) private[this] final val nextID = new java.util.concurrent.atomic.AtomicLong private[this] final def nextDefaultID(): Long = nextID.incrementAndGet() @@ -473,17 +471,28 @@ trait Init[Scope] protected[this] def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new Setting[T](key, init, pos) protected[sbt] def isDerived: Boolean = false private[sbt] def setScope(s: 
Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) + /** Turn this setting into a `DefaultSetting` if it's not already, otherwise returns `this` */ + private[sbt] def default(id: => Long = nextDefaultID()): DefaultSetting[T] = DefaultSetting(key, init, pos, id) } - private[Init] final class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends Setting[T](sk, i, p) { + private[Init] sealed class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends Setting[T](sk, i, p) { override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger) protected[sbt] override def isDerived: Boolean = true + override def default(_id: => Long): DefaultSetting[T] = new DerivedSetting[T](sk, i, p, filter, trigger) with DefaultSetting[T] { val id = _id } + override def toString = "derived " + super.toString } // Only keep the first occurence of this setting and move it to the front so that it has lower precedence than non-defaults. // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. 
- private[Init] sealed class DefaultSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val id: Long) extends Setting[T](sk, i, p) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DefaultSetting[T](key, init, pos, id) + private[Init] sealed trait DefaultSetting[T] extends Setting[T] { + val id: Long + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = super.make(key, init, pos) default id override final def hashCode = id.hashCode override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } + override def toString = s"default($id) " + super.toString + override def default(id: => Long) = this + } + + object DefaultSetting { + def apply[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, _id: Long) = new Setting[T](sk, i, p) with DefaultSetting[T] { val id = _id } } From 962f0bad763ea57aa0bf39a51d329e3f355bafdc Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Wed, 18 Dec 2013 10:52:45 +0000 Subject: [PATCH 426/823] Optimise scope intersection for GlobalScope --- util/collection/src/main/scala/sbt/Settings.scala | 11 ++++++----- 1 file changed, 6 insertions(+), 5 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index af56cb4f8..bc88520f1 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -3,7 +3,7 @@ */ package sbt - import Types._ +import Types._ sealed trait Settings[Scope] { @@ -291,6 +291,11 @@ trait Init[Scope] } else "" } + /** + * Intersects two scopes, returning the more specific one if they intersect, or None otherwise. + * Not implemented here because we want to optimise for Scope.GlobalScope which is inaccessible here. */ + private[sbt] def intersect(s1: Scope, s2: Scope)(implicit delegates: Scope => Seq[Scope]): Option[Scope] = ??? 
+ private[this] def deriveAndLocal(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Seq[Setting[_]] = { import collection.mutable @@ -361,10 +366,6 @@ trait Init[Scope] val scope = sk.scope def localAndDerived(d: Derived): Seq[Setting[_]] = { def definingScope = d.setting.key.scope - def intersect(s1: Scope, s2: Scope): Option[Scope] = - if (delegates(s1).contains(s2)) Some(s1) // s1 is more specific - else if (delegates(s2).contains(s1)) Some(s2) // s2 is more specific - else None val outputScope = intersect(scope, definingScope) outputScope collect { case s if !d.inScopes.contains(s) && d.setting.filter(s) => val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(s, dep))) From 767e2487d030b96a1021480558d39a8fffdd40d5 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Mon, 23 Dec 2013 15:27:29 +0000 Subject: [PATCH 427/823] Couple of fixes --- util/collection/src/main/scala/sbt/Settings.scala | 9 ++++++--- 1 file changed, 6 insertions(+), 3 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index bc88520f1..7a6a7b7ee 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -90,7 +90,7 @@ trait Init[Scope] * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. 
*/ final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true), default: Boolean = false): Setting[T] = { deriveAllowed(s, allowDynamic) foreach error - def d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) + val d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) if (default) d.default() else d } def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { @@ -293,8 +293,11 @@ trait Init[Scope] /** * Intersects two scopes, returning the more specific one if they intersect, or None otherwise. - * Not implemented here because we want to optimise for Scope.GlobalScope which is inaccessible here. */ - private[sbt] def intersect(s1: Scope, s2: Scope)(implicit delegates: Scope => Seq[Scope]): Option[Scope] = ??? + */ + private[sbt] def intersect(s1: Scope, s2: Scope)(implicit delegates: Scope => Seq[Scope]): Option[Scope] = + if (delegates(s1).contains(s2)) Some(s1) // s1 is more specific + else if (delegates(s2).contains(s1)) Some(s2) // s2 is more specific + else None private[this] def deriveAndLocal(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Seq[Setting[_]] = { From eb93fdd7a6d6d5b9f3d00742a3f9022e1563df85 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Thu, 1 May 2014 03:09:50 +0100 Subject: [PATCH 428/823] Improve SettingsExample to allow orthogonal scopes (like projects/tasks) at a certain nestIndex --- util/collection/src/test/scala/SettingsExample.scala | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/util/collection/src/test/scala/SettingsExample.scala b/util/collection/src/test/scala/SettingsExample.scala index 637f0ad51..9d863be31 100644 --- a/util/collection/src/test/scala/SettingsExample.scala +++ b/util/collection/src/test/scala/SettingsExample.scala @@ -3,7 +3,7 @@ package sbt /** Define our settings system */ // A basic scope 
indexed by an integer. -final case class Scope(index: Int) +final case class Scope(nestIndex: Int, idAtIndex: Int = 0) // Extend the Init trait. // (It is done this way because the Scope type parameter is used everywhere in Init. @@ -14,12 +14,12 @@ object SettingsExample extends Init[Scope] { // Provides a way of showing a Scope+AttributeKey[_] val showFullKey: Show[ScopedKey[_]] = new Show[ScopedKey[_]] { - def apply(key: ScopedKey[_]) = key.scope.index + "/" + key.key.label + def apply(key: ScopedKey[_]) = s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" } // A sample delegation function that delegates to a Scope with a lower index. - val delegates: Scope => Seq[Scope] = { case s @ Scope(index) => - s +: (if(index <= 0) Nil else delegates(Scope(index-1)) ) + val delegates: Scope => Seq[Scope] = { case s @ Scope(index, proj) => + s +: (if(index <= 0) Nil else { (if (proj > 0) List(Scope(index)) else Nil) ++: delegates(Scope(index-1)) }) } // Not using this feature in this example. From 1e9dee900a4acd2a9e7f1b2a75e93db6568ab17e Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Thu, 1 May 2014 03:11:20 +0100 Subject: [PATCH 429/823] Add 2 derived settings tests: 1) non-default derived settings, if they produce anything, the settings they produce must supersede previous assignents (in the settings seq) to the same key. 2) even if a derived setting is scoped at a higher scope (e.g. ThisBuild) the settings it produces are scoped at the intersection of that (the defining) scope and the scope of the triggering dependency. 
2 is particularly nice as it enables this behaviour: derive(b in ThisBuild := a.value + 1) a in project1 := 0 // a could be defined in all projects ==> Now (b in project1).value == (a in project1).value + 1 == 1 and similarly in all other projects all with a single derived setting --- .../src/test/scala/SettingsTest.scala | 104 ++++++++++++++---- 1 file changed, 80 insertions(+), 24 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index 0cfb5ea83..1bdea8f38 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -7,6 +7,9 @@ import SettingsExample._ object SettingsTest extends Properties("settings") { + + import scala.reflect.Manifest + final val ChainMax = 5000 lazy val chainLengthGen = Gen.choose(1, ChainMax) @@ -35,31 +38,84 @@ object SettingsTest extends Properties("settings") evaluate( setting(chk, iterate(top)) :: Nil); true } - property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } - final def derivedSettings(nr: Int): Prop = - { - val alphaStr = Gen.alphaStr - val genScopedKeys = { - val attrKeys = for { - list <- Gen.listOfN(nr, alphaStr) suchThat (l => l.size == l.distinct.size) - item <- list - } yield AttributeKey[Int](item) - attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) - } - forAll(genScopedKeys) { scopedKeys => - val last = scopedKeys.last - val derivedSettings: Seq[Setting[Int]] = ( - for { - List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 - nextInit = if (scoped0 == chk) chk - else (scoped0 zipWith chk) { (p, _) => p + 1 } - } yield derive(setting(scoped1, nextInit)) - ).toSeq + property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } + final def derivedSettings(nr: Int): Prop = + { + val genScopedKeys = { + val attrKeys = 
mkAttrKeys[Int](nr) + attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) + } + forAll(genScopedKeys) { scopedKeys => + val last = scopedKeys.last + val derivedSettings: Seq[Setting[Int]] = ( + for { + List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 + nextInit = if (scoped0 == chk) chk + else (scoped0 zipWith chk) { (p, _) => p + 1 } + } yield derive(setting(scoped1, nextInit)) + ).toSeq - { checkKey(last, Some(nr-1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" } && - { checkKey( last, None, evaluate(derivedSettings)) :| "Should not be derived" } - } - } + { checkKey(last, Some(nr-1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" } && + { checkKey( last, None, evaluate(derivedSettings)) :| "Should not be derived" } + } + } + + private def mkAttrKeys[T](nr: Int)(implicit mf: Manifest[T]): Gen[List[AttributeKey[T]]] = + { + val alphaStr = Gen.alphaStr + for { + list <- Gen.listOfN(nr, alphaStr) suchThat (l => l.size == l.distinct.size) + item <- list + } yield AttributeKey[T](item) + } + + property("Derived setting(s) replace DerivedSetting in the Seq[Setting[_]]") = derivedKeepsPosition + final def derivedKeepsPosition: Prop = + { + val a: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("a")) + val b: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("b")) + val prop1 = { + val settings: Seq[Setting[_]] = Seq( + setting(a, value(3)), + setting(b, value(6)), + derive(setting(b, a)), + setting(a, value(5)), + setting(b, value(8)) + ) + val ev = evaluate(settings) + checkKey(a, Some(5), ev) && checkKey(b, Some(8), ev) + } + val prop2 = { + val settings: Seq[Setting[Int]] = Seq( + setting(a, value(3)), + setting(b, value(6)), + derive(setting(b, a)), + setting(a, value(5)) + ) + val ev = evaluate(settings) + checkKey(a, Some(5), ev) && checkKey(b, Some(5), ev) + } + prop1 && prop2 + } + + property("DerivedSetting in ThisBuild scopes derived settings under projects thus allowing safe +=") = 
forAllNoShrink(Gen.choose(1, 100)) { derivedSettingsScope } + final def derivedSettingsScope(nrProjects: Int): Prop = + { + forAll(mkAttrKeys[Int](2)) { case List(key, derivedKey) => + val projectKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), key) + val projectDerivedKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), derivedKey) + val globalKey = ScopedKey(Scope(0), key) + val globalDerivedKey = ScopedKey(Scope(0), derivedKey) + // Each project defines an initial value, but the update is defined in globalKey. + // However, the derived Settings that come from this should be scoped in each project. + val settings: Seq[Setting[_]] = + derive(setting(globalDerivedKey, SettingsExample.map(globalKey)(_ + 1))) +: projectKeys.map(pk => setting(pk, value(0))) + val ev = evaluate(settings) + // Also check that the key has no value at the "global" scope + val props = for { pk <- projectDerivedKeys } yield checkKey(pk, Some(1), ev) + checkKey(globalDerivedKey, None, ev) && Prop.all(props: _*) + } + } // Circular (dynamic) references currently loop infinitely. 
// This is the expected behavior (detecting dynamic cycles is expensive), From 4258189951a4fc42d08b51ee012c700290b6278f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 1 May 2014 12:50:07 -0400 Subject: [PATCH 430/823] added scalariform --- cache/src/main/scala/sbt/Cache.scala | 459 +++--- cache/src/main/scala/sbt/CacheIO.scala | 73 +- cache/src/main/scala/sbt/FileInfo.scala | 176 +- cache/src/main/scala/sbt/SeparatedCache.scala | 105 +- .../src/main/scala/sbt/ChangeReport.scala | 123 +- .../tracking/src/main/scala/sbt/Tracked.scala | 352 ++-- .../main/scala/sbt/appmacro/ContextUtil.scala | 423 ++--- .../src/main/scala/sbt/appmacro/Convert.scala | 57 +- .../main/scala/sbt/appmacro/Instance.scala | 370 +++-- .../scala/sbt/appmacro/KListBuilder.scala | 113 +- .../scala/sbt/appmacro/MixedBuilder.scala | 23 +- .../scala/sbt/appmacro/TupleBuilder.scala | 81 +- .../scala/sbt/appmacro/TupleNBuilder.scala | 91 +- .../collection/src/main/scala/sbt/AList.scala | 391 +++-- .../src/main/scala/sbt/Attributes.scala | 293 ++-- .../src/main/scala/sbt/Classes.scala | 45 +- util/collection/src/main/scala/sbt/Dag.scala | 218 ++- .../collection/src/main/scala/sbt/HList.scala | 36 +- .../collection/src/main/scala/sbt/IDSet.scala | 72 +- .../collection/src/main/scala/sbt/INode.scala | 318 ++-- .../collection/src/main/scala/sbt/KList.scala | 79 +- util/collection/src/main/scala/sbt/PMap.scala | 170 +- .../collection/src/main/scala/sbt/Param.scala | 39 +- .../src/main/scala/sbt/Positions.scala | 8 +- .../src/main/scala/sbt/Settings.scala | 1124 +++++++------ util/collection/src/main/scala/sbt/Show.scala | 7 +- .../src/main/scala/sbt/Signal.scala | 148 +- .../src/main/scala/sbt/TypeFunctions.scala | 71 +- .../collection/src/main/scala/sbt/Types.scala | 9 +- util/collection/src/main/scala/sbt/Util.scala | 58 +- .../src/main/scala/sbt/LineReader.scala | 235 ++- .../main/scala/sbt/complete/Completions.scala | 243 ++- .../scala/sbt/complete/EditDistance.scala | 58 +- 
.../scala/sbt/complete/ExampleSource.scala | 61 +- .../src/main/scala/sbt/complete/History.scala | 63 +- .../scala/sbt/complete/HistoryCommands.scala | 115 +- .../scala/sbt/complete/JLineCompletion.scala | 281 ++-- .../src/main/scala/sbt/complete/Parser.scala | 1421 ++++++++--------- .../src/main/scala/sbt/complete/Parsers.scala | 372 +++-- .../scala/sbt/complete/ProcessError.scala | 53 +- .../scala/sbt/complete/TokenCompletions.scala | 57 +- .../main/scala/sbt/complete/TypeString.scala | 136 +- .../main/scala/sbt/complete/UpperBound.scala | 72 +- .../src/main/scala/sbt/ErrorHandling.scala | 59 +- .../control/src/main/scala/sbt/ExitHook.scala | 25 +- .../main/scala/sbt/MessageOnlyException.scala | 18 +- util/log/src/main/scala/sbt/BasicLogger.scala | 21 +- .../src/main/scala/sbt/BufferedLogger.scala | 169 +- .../src/main/scala/sbt/ConsoleLogger.scala | 305 ++-- util/log/src/main/scala/sbt/ConsoleOut.scala | 104 +- .../log/src/main/scala/sbt/FilterLogger.scala | 58 +- util/log/src/main/scala/sbt/FullLogger.scala | 49 +- .../src/main/scala/sbt/GlobalLogging.scala | 61 +- util/log/src/main/scala/sbt/Level.scala | 39 +- util/log/src/main/scala/sbt/LogEvent.scala | 7 +- util/log/src/main/scala/sbt/Logger.scala | 233 ++- .../log/src/main/scala/sbt/LoggerWriter.scala | 84 +- util/log/src/main/scala/sbt/MainLogging.scala | 81 +- util/log/src/main/scala/sbt/MultiLogger.scala | 83 +- util/log/src/main/scala/sbt/StackTrace.scala | 97 +- .../src/main/scala/sbt/logic/Logic.scala | 509 +++--- .../src/main/scala/sbt/InheritInput.scala | 19 +- util/process/src/main/scala/sbt/Process.scala | 349 ++-- .../src/main/scala/sbt/ProcessImpl.scala | 751 ++++----- util/process/src/main/scala/sbt/SyncVar.scala | 59 +- .../src/main/scala/sbt/Relation.scala | 255 +-- 66 files changed, 6007 insertions(+), 6127 deletions(-) diff --git a/cache/src/main/scala/sbt/Cache.scala b/cache/src/main/scala/sbt/Cache.scala index 725a103a8..c241394ba 100644 --- a/cache/src/main/scala/sbt/Cache.scala +++ 
b/cache/src/main/scala/sbt/Cache.scala @@ -3,271 +3,246 @@ */ package sbt -import sbinary.{CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output => Out} -import java.io.{ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream} -import java.net.{URI, URL} +import sbinary.{ CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output => Out } +import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } +import java.net.{ URI, URL } import Types.:+: -import DefaultProtocol.{asProduct2, asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap} +import DefaultProtocol.{ asProduct2, asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap } import scala.xml.NodeSeq -trait Cache[I,O] -{ - def apply(file: File)(i: I): Either[O, O => Unit] +trait Cache[I, O] { + def apply(file: File)(i: I): Either[O, O => Unit] } -trait SBinaryFormats extends CollectionTypes with JavaFormats -{ - implicit def urlFormat: Format[URL] = DefaultProtocol.UrlFormat - implicit def uriFormat: Format[URI] = DefaultProtocol.UriFormat +trait SBinaryFormats extends CollectionTypes with JavaFormats { + implicit def urlFormat: Format[URL] = DefaultProtocol.UrlFormat + implicit def uriFormat: Format[URI] = DefaultProtocol.UriFormat } -object Cache extends CacheImplicits -{ - def cache[I,O](implicit c: Cache[I,O]): Cache[I,O] = c +object Cache extends CacheImplicits { + def cache[I, O](implicit c: Cache[I, O]): Cache[I, O] = c - def cached[I,O](file: File)(f: I => O)(implicit cache: Cache[I,O]): I => O = - in => - cache(file)(in) match - { - case Left(value) => value - case Right(store) => - val out = f(in) - store(out) - out - } + def cached[I, O](file: File)(f: I => O)(implicit cache: Cache[I, O]): I => O = + in => + cache(file)(in) match { + case Left(value) => value + case Right(store) => + val out = f(in) + store(out) + out + } - def debug[I](label: String, c: InputCache[I]): InputCache[I] = - new InputCache[I] - { - type 
Internal = c.Internal - def convert(i: I) = c.convert(i) - def read(from: Input) = - { - val v = c.read(from) - println(label + ".read: " + v) - v - } - def write(to: Out, v: Internal) - { - println(label + ".write: " + v) - c.write(to, v) - } - def equiv: Equiv[Internal] = new Equiv[Internal] { - def equiv(a: Internal, b: Internal)= - { - val equ = c.equiv.equiv(a,b) - println(label + ".equiv(" + a + ", " + b +"): " + equ) - equ - } - } - } + def debug[I](label: String, c: InputCache[I]): InputCache[I] = + new InputCache[I] { + type Internal = c.Internal + def convert(i: I) = c.convert(i) + def read(from: Input) = + { + val v = c.read(from) + println(label + ".read: " + v) + v + } + def write(to: Out, v: Internal) { + println(label + ".write: " + v) + c.write(to, v) + } + def equiv: Equiv[Internal] = new Equiv[Internal] { + def equiv(a: Internal, b: Internal) = + { + val equ = c.equiv.equiv(a, b) + println(label + ".equiv(" + a + ", " + b + "): " + equ) + equ + } + } + } } trait CacheImplicits extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits with UnionImplicits -trait BasicCacheImplicits -{ - implicit def basicCache[I, O](implicit in: InputCache[I], outFormat: Format[O]): Cache[I,O] = - new BasicCache()(in, outFormat) - def basicInput[I](implicit eq: Equiv[I], fmt: Format[I]): InputCache[I] = InputCache.basicInputCache(fmt, eq) +trait BasicCacheImplicits { + implicit def basicCache[I, O](implicit in: InputCache[I], outFormat: Format[O]): Cache[I, O] = + new BasicCache()(in, outFormat) + def basicInput[I](implicit eq: Equiv[I], fmt: Format[I]): InputCache[I] = InputCache.basicInputCache(fmt, eq) - def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } - - implicit def optInputCache[T](implicit t: InputCache[T]): InputCache[Option[T]] = - new InputCache[Option[T]] - { - type Internal = Option[t.Internal] - def convert(v: Option[T]): Internal = v.map(x => t.convert(x)) - def read(from: Input) = - { - val isDefined = 
BooleanFormat.reads(from) - if(isDefined) Some(t.read(from)) else None - } - def write(to: Out, j: Internal): Unit = - { - BooleanFormat.writes(to, j.isDefined) - j foreach { x => t.write(to, x) } - } - def equiv = optEquiv(t.equiv) - } - - def wrapEquiv[S,T](f: S => T)(implicit eqT: Equiv[T]): Equiv[S] = - new Equiv[S] { - def equiv(a: S, b: S) = - eqT.equiv( f(a), f(b) ) - } + def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } - implicit def optEquiv[T](implicit t: Equiv[T]): Equiv[Option[T]] = - new Equiv[Option[T]] { - def equiv(a: Option[T], b: Option[T]) = - (a,b) match - { - case (None, None) => true - case (Some(va), Some(vb)) => t.equiv(va, vb) - case _ => false - } - } - implicit def urlEquiv(implicit uriEq: Equiv[URI]): Equiv[URL] = wrapEquiv[URL, URI](_.toURI)(uriEq) - implicit def uriEquiv: Equiv[URI] = defaultEquiv - implicit def stringSetEquiv: Equiv[Set[String]] = defaultEquiv - implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv + implicit def optInputCache[T](implicit t: InputCache[T]): InputCache[Option[T]] = + new InputCache[Option[T]] { + type Internal = Option[t.Internal] + def convert(v: Option[T]): Internal = v.map(x => t.convert(x)) + def read(from: Input) = + { + val isDefined = BooleanFormat.reads(from) + if (isDefined) Some(t.read(from)) else None + } + def write(to: Out, j: Internal): Unit = + { + BooleanFormat.writes(to, j.isDefined) + j foreach { x => t.write(to, x) } + } + def equiv = optEquiv(t.equiv) + } - def streamFormat[T](write: (T, OutputStream) => Unit, f: InputStream => T): Format[T] = - { - val toBytes = (t: T) => { val bos = new ByteArrayOutputStream; write(t, bos); bos.toByteArray } - val fromBytes = (bs: Array[Byte]) => f(new ByteArrayInputStream(bs)) - wrap(toBytes, fromBytes)(DefaultProtocol.ByteArrayFormat) - } - - implicit def xmlInputCache(implicit strEq: InputCache[String]): InputCache[NodeSeq] = wrapIn[NodeSeq, String](_.toString, strEq) + def wrapEquiv[S, T](f: S => 
T)(implicit eqT: Equiv[T]): Equiv[S] = + new Equiv[S] { + def equiv(a: S, b: S) = + eqT.equiv(f(a), f(b)) + } - implicit def seqCache[T](implicit t: InputCache[T]): InputCache[Seq[T]] = - new InputCache[Seq[T]] - { - type Internal = Seq[t.Internal] - def convert(v: Seq[T]) = v.map(x => t.convert(x)) - def read(from: Input) = - { - val size = IntFormat.reads(from) - def next(left: Int, acc: List[t.Internal]): Internal = - if(left <= 0) acc.reverse else next(left - 1, t.read(from) :: acc) - next(size, Nil) - } - def write(to: Out, vs: Internal) - { - val size = vs.length - IntFormat.writes(to, size) - for(v <- vs) t.write(to, v) - } - def equiv: Equiv[Internal] = seqEquiv(t.equiv) - } + implicit def optEquiv[T](implicit t: Equiv[T]): Equiv[Option[T]] = + new Equiv[Option[T]] { + def equiv(a: Option[T], b: Option[T]) = + (a, b) match { + case (None, None) => true + case (Some(va), Some(vb)) => t.equiv(va, vb) + case _ => false + } + } + implicit def urlEquiv(implicit uriEq: Equiv[URI]): Equiv[URL] = wrapEquiv[URL, URI](_.toURI)(uriEq) + implicit def uriEquiv: Equiv[URI] = defaultEquiv + implicit def stringSetEquiv: Equiv[Set[String]] = defaultEquiv + implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv - implicit def arrEquiv[T](implicit t: Equiv[T]): Equiv[Array[T]] = - wrapEquiv( (x: Array[T]) => x :Seq[T] )(seqEquiv[T](t)) + def streamFormat[T](write: (T, OutputStream) => Unit, f: InputStream => T): Format[T] = + { + val toBytes = (t: T) => { val bos = new ByteArrayOutputStream; write(t, bos); bos.toByteArray } + val fromBytes = (bs: Array[Byte]) => f(new ByteArrayInputStream(bs)) + wrap(toBytes, fromBytes)(DefaultProtocol.ByteArrayFormat) + } - implicit def seqEquiv[T](implicit t: Equiv[T]): Equiv[Seq[T]] = - new Equiv[Seq[T]] - { - def equiv(a: Seq[T], b: Seq[T]) = - a.length == b.length && - ((a,b).zipped forall t.equiv) - } - implicit def seqFormat[T](implicit t: Format[T]): Format[Seq[T]] = - wrap[Seq[T], List[T]](_.toList, 
_.toSeq)(DefaultProtocol.listFormat) - - def wrapIn[I,J](implicit f: I => J, jCache: InputCache[J]): InputCache[I] = - new InputCache[I] - { - type Internal = jCache.Internal - def convert(i: I) = jCache.convert(f(i)) - def read(from: Input) = jCache.read(from) - def write(to: Out, j: Internal) = jCache.write(to, j) - def equiv = jCache.equiv - } + implicit def xmlInputCache(implicit strEq: InputCache[String]): InputCache[NodeSeq] = wrapIn[NodeSeq, String](_.toString, strEq) - def singleton[T](t: T): InputCache[T] = - basicInput(trueEquiv, asSingleton(t)) + implicit def seqCache[T](implicit t: InputCache[T]): InputCache[Seq[T]] = + new InputCache[Seq[T]] { + type Internal = Seq[t.Internal] + def convert(v: Seq[T]) = v.map(x => t.convert(x)) + def read(from: Input) = + { + val size = IntFormat.reads(from) + def next(left: Int, acc: List[t.Internal]): Internal = + if (left <= 0) acc.reverse else next(left - 1, t.read(from) :: acc) + next(size, Nil) + } + def write(to: Out, vs: Internal) { + val size = vs.length + IntFormat.writes(to, size) + for (v <- vs) t.write(to, v) + } + def equiv: Equiv[Internal] = seqEquiv(t.equiv) + } - def trueEquiv[T] = new Equiv[T] { def equiv(a: T, b: T) = true } + implicit def arrEquiv[T](implicit t: Equiv[T]): Equiv[Array[T]] = + wrapEquiv((x: Array[T]) => x: Seq[T])(seqEquiv[T](t)) + + implicit def seqEquiv[T](implicit t: Equiv[T]): Equiv[Seq[T]] = + new Equiv[Seq[T]] { + def equiv(a: Seq[T], b: Seq[T]) = + a.length == b.length && + ((a, b).zipped forall t.equiv) + } + implicit def seqFormat[T](implicit t: Format[T]): Format[Seq[T]] = + wrap[Seq[T], List[T]](_.toList, _.toSeq)(DefaultProtocol.listFormat) + + def wrapIn[I, J](implicit f: I => J, jCache: InputCache[J]): InputCache[I] = + new InputCache[I] { + type Internal = jCache.Internal + def convert(i: I) = jCache.convert(f(i)) + def read(from: Input) = jCache.read(from) + def write(to: Out, j: Internal) = jCache.write(to, j) + def equiv = jCache.equiv + } + + def singleton[T](t: 
T): InputCache[T] = + basicInput(trueEquiv, asSingleton(t)) + + def trueEquiv[T] = new Equiv[T] { def equiv(a: T, b: T) = true } } -trait HListCacheImplicits -{ - implicit def hConsCache[H, T <: HList](implicit head: InputCache[H], tail: InputCache[T]): InputCache[H :+: T] = - new InputCache[H :+: T] - { - type Internal = (head.Internal, tail.Internal) - def convert(in: H :+: T) = (head.convert(in.head), tail.convert(in.tail)) - def read(from: Input) = - { - val h = head.read(from) - val t = tail.read(from) - (h, t) - } - def write(to: Out, j: Internal) - { - head.write(to, j._1) - tail.write(to, j._2) - } - def equiv = new Equiv[Internal] - { - def equiv(a: Internal, b: Internal) = - head.equiv.equiv(a._1, b._1) && - tail.equiv.equiv(a._2, b._2) - } - } - - implicit def hNilCache: InputCache[HNil] = Cache.singleton(HNil : HNil) +trait HListCacheImplicits { + implicit def hConsCache[H, T <: HList](implicit head: InputCache[H], tail: InputCache[T]): InputCache[H :+: T] = + new InputCache[H :+: T] { + type Internal = (head.Internal, tail.Internal) + def convert(in: H :+: T) = (head.convert(in.head), tail.convert(in.tail)) + def read(from: Input) = + { + val h = head.read(from) + val t = tail.read(from) + (h, t) + } + def write(to: Out, j: Internal) { + head.write(to, j._1) + tail.write(to, j._2) + } + def equiv = new Equiv[Internal] { + def equiv(a: Internal, b: Internal) = + head.equiv.equiv(a._1, b._1) && + tail.equiv.equiv(a._2, b._2) + } + } - implicit def hConsFormat[H, T <: HList](implicit head: Format[H], tail: Format[T]): Format[H :+: T] = new Format[H :+: T] { - def reads(from: Input) = - { - val h = head.reads(from) - val t = tail.reads(from) - HCons(h, t) - } - def writes(to: Out, hc: H :+: T) - { - head.writes(to, hc.head) - tail.writes(to, hc.tail) - } - } + implicit def hNilCache: InputCache[HNil] = Cache.singleton(HNil: HNil) - implicit def hNilFormat: Format[HNil] = asSingleton(HNil) + implicit def hConsFormat[H, T <: HList](implicit head: Format[H], 
tail: Format[T]): Format[H :+: T] = new Format[H :+: T] { + def reads(from: Input) = + { + val h = head.reads(from) + val t = tail.reads(from) + HCons(h, t) + } + def writes(to: Out, hc: H :+: T) { + head.writes(to, hc.head) + tail.writes(to, hc.tail) + } + } + + implicit def hNilFormat: Format[HNil] = asSingleton(HNil) } -trait UnionImplicits -{ - def unionInputCache[UB, HL <: HList](implicit uc: UnionCache[HL, UB]): InputCache[UB] = - new InputCache[UB] - { - type Internal = Found[_] - def convert(in: UB) = uc.find(in) - def read(in: Input) = - { - val index = ByteFormat.reads(in) - val (cache, clazz) = uc.at(index) - val value = cache.read(in) - new Found[cache.Internal](cache, clazz, value, index) - } - def write(to: Out, i: Internal) - { - def write0[I](f: Found[I]) - { - ByteFormat.writes(to, f.index.toByte) - f.cache.write(to, f.value) - } - write0(i) - } - def equiv: Equiv[Internal] = new Equiv[Internal] - { - def equiv(a: Internal, b: Internal) = - { - if(a.clazz == b.clazz) - force(a.cache.equiv, a.value, b.value) - else - false - } - def force[T <: UB, UB](e: Equiv[T], a: UB, b: UB) = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) - } - } +trait UnionImplicits { + def unionInputCache[UB, HL <: HList](implicit uc: UnionCache[HL, UB]): InputCache[UB] = + new InputCache[UB] { + type Internal = Found[_] + def convert(in: UB) = uc.find(in) + def read(in: Input) = + { + val index = ByteFormat.reads(in) + val (cache, clazz) = uc.at(index) + val value = cache.read(in) + new Found[cache.Internal](cache, clazz, value, index) + } + def write(to: Out, i: Internal) { + def write0[I](f: Found[I]) { + ByteFormat.writes(to, f.index.toByte) + f.cache.write(to, f.value) + } + write0(i) + } + def equiv: Equiv[Internal] = new Equiv[Internal] { + def equiv(a: Internal, b: Internal) = + { + if (a.clazz == b.clazz) + force(a.cache.equiv, a.value, b.value) + else + false + } + def force[T <: UB, UB](e: Equiv[T], a: UB, b: UB) = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) + } + 
} - implicit def unionCons[H <: UB, UB, T <: HList](implicit head: InputCache[H], mf: Manifest[H], t: UnionCache[T, UB]): UnionCache[H :+: T, UB] = - new UnionCache[H :+: T, UB] - { - val size = 1 + t.size - def c = mf.runtimeClass - def find(value: UB): Found[_] = - if(c.isInstance(value)) new Found[head.Internal](head, c, head.convert(value.asInstanceOf[H]), size - 1) else t.find(value) - def at(i: Int): (InputCache[_ <: UB], Class[_]) = if(size == i + 1) (head, c) else t.at(i) - } + implicit def unionCons[H <: UB, UB, T <: HList](implicit head: InputCache[H], mf: Manifest[H], t: UnionCache[T, UB]): UnionCache[H :+: T, UB] = + new UnionCache[H :+: T, UB] { + val size = 1 + t.size + def c = mf.runtimeClass + def find(value: UB): Found[_] = + if (c.isInstance(value)) new Found[head.Internal](head, c, head.convert(value.asInstanceOf[H]), size - 1) else t.find(value) + def at(i: Int): (InputCache[_ <: UB], Class[_]) = if (size == i + 1) (head, c) else t.at(i) + } - implicit def unionNil[UB]: UnionCache[HNil, UB] = new UnionCache[HNil, UB] { - def size = 0 - def find(value: UB) = sys.error("No valid sum type for " + value) - def at(i: Int) = sys.error("Invalid union index " + i) - } + implicit def unionNil[UB]: UnionCache[HNil, UB] = new UnionCache[HNil, UB] { + def size = 0 + def find(value: UB) = sys.error("No valid sum type for " + value) + def at(i: Int) = sys.error("Invalid union index " + i) + } - final class Found[I](val cache: InputCache[_] { type Internal = I }, val clazz: Class[_], val value: I, val index: Int) - sealed trait UnionCache[HL <: HList, UB] - { - def size: Int - def at(i: Int): (InputCache[_ <: UB], Class[_]) - def find(forValue: UB): Found[_] - } + final class Found[I](val cache: InputCache[_] { type Internal = I }, val clazz: Class[_], val value: I, val index: Int) + sealed trait UnionCache[HL <: HList, UB] { + def size: Int + def at(i: Int): (InputCache[_ <: UB], Class[_]) + def find(forValue: UB): Found[_] + } } \ No newline at end of file 
diff --git a/cache/src/main/scala/sbt/CacheIO.scala b/cache/src/main/scala/sbt/CacheIO.scala index ac698c24e..a50da7ee7 100644 --- a/cache/src/main/scala/sbt/CacheIO.scala +++ b/cache/src/main/scala/sbt/CacheIO.scala @@ -3,43 +3,42 @@ */ package sbt -import java.io.{File, FileNotFoundException} -import sbinary.{DefaultProtocol, Format, Operations} +import java.io.{ File, FileNotFoundException } +import sbinary.{ DefaultProtocol, Format, Operations } import scala.reflect.Manifest -object CacheIO -{ - def toBytes[T](format: Format[T])(value: T)(implicit mf: Manifest[Format[T]]): Array[Byte] = - toBytes[T](value)(format, mf) - def toBytes[T](value: T)(implicit format: Format[T], mf: Manifest[Format[T]]): Array[Byte] = - Operations.toByteArray(value)(stampedFormat(format)) - def fromBytes[T](format: Format[T], default: => T)(bytes: Array[Byte])(implicit mf: Manifest[Format[T]]): T = - fromBytes(default)(bytes)(format, mf) - def fromBytes[T](default: => T)(bytes: Array[Byte])(implicit format: Format[T], mf: Manifest[Format[T]]): T = - if(bytes.isEmpty) default else Operations.fromByteArray(bytes)(stampedFormat(format)) - - def fromFile[T](format: Format[T], default: => T)(file: File)(implicit mf: Manifest[Format[T]]): T = - fromFile(file, default)(format, mf) - def fromFile[T](file: File, default: => T)(implicit format: Format[T], mf: Manifest[Format[T]]): T = - fromFile[T](file) getOrElse default - def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Option[T] = - try { Some( Operations.fromFile(file)(stampedFormat(format)) ) } - catch { case e: Exception => None } - - def toFile[T](format: Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = - toFile(value)(file)(format, mf) - def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = - { - IO.createDirectory(file.getParentFile) - Operations.toFile(value)(file)(stampedFormat(format)) - } - def stampedFormat[T](format: 
Format[T])(implicit mf: Manifest[Format[T]]): Format[T] = - { - import DefaultProtocol._ - withStamp(stamp(format))(format) - } - def stamp[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Int = typeHash(mf) - def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode - def manifest[T](implicit mf: Manifest[T]): Manifest[T] = mf - def objManifest[T](t: T)(implicit mf: Manifest[T]): Manifest[T] = mf +object CacheIO { + def toBytes[T](format: Format[T])(value: T)(implicit mf: Manifest[Format[T]]): Array[Byte] = + toBytes[T](value)(format, mf) + def toBytes[T](value: T)(implicit format: Format[T], mf: Manifest[Format[T]]): Array[Byte] = + Operations.toByteArray(value)(stampedFormat(format)) + def fromBytes[T](format: Format[T], default: => T)(bytes: Array[Byte])(implicit mf: Manifest[Format[T]]): T = + fromBytes(default)(bytes)(format, mf) + def fromBytes[T](default: => T)(bytes: Array[Byte])(implicit format: Format[T], mf: Manifest[Format[T]]): T = + if (bytes.isEmpty) default else Operations.fromByteArray(bytes)(stampedFormat(format)) + + def fromFile[T](format: Format[T], default: => T)(file: File)(implicit mf: Manifest[Format[T]]): T = + fromFile(file, default)(format, mf) + def fromFile[T](file: File, default: => T)(implicit format: Format[T], mf: Manifest[Format[T]]): T = + fromFile[T](file) getOrElse default + def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Option[T] = + try { Some(Operations.fromFile(file)(stampedFormat(format))) } + catch { case e: Exception => None } + + def toFile[T](format: Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = + toFile(value)(file)(format, mf) + def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = + { + IO.createDirectory(file.getParentFile) + Operations.toFile(value)(file)(stampedFormat(format)) + } + def stampedFormat[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Format[T] = + { + import 
DefaultProtocol._ + withStamp(stamp(format))(format) + } + def stamp[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Int = typeHash(mf) + def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode + def manifest[T](implicit mf: Manifest[T]): Manifest[T] = mf + def objManifest[T](t: T)(implicit mf: Manifest[T]): Manifest[T] = mf } \ No newline at end of file diff --git a/cache/src/main/scala/sbt/FileInfo.scala b/cache/src/main/scala/sbt/FileInfo.scala index e4706c1fa..c735adcb0 100644 --- a/cache/src/main/scala/sbt/FileInfo.scala +++ b/cache/src/main/scala/sbt/FileInfo.scala @@ -3,26 +3,22 @@ */ package sbt -import java.io.{File, IOException} -import sbinary.{DefaultProtocol, Format} +import java.io.{ File, IOException } +import sbinary.{ DefaultProtocol, Format } import DefaultProtocol._ import scala.reflect.Manifest -sealed trait FileInfo extends NotNull -{ - val file: File +sealed trait FileInfo extends NotNull { + val file: File } -sealed trait HashFileInfo extends FileInfo -{ - val hash: List[Byte] +sealed trait HashFileInfo extends FileInfo { + val hash: List[Byte] } -sealed trait ModifiedFileInfo extends FileInfo -{ - val lastModified: Long +sealed trait ModifiedFileInfo extends FileInfo { + val lastModified: Long } -sealed trait PlainFileInfo extends FileInfo -{ - def exists: Boolean +sealed trait PlainFileInfo extends FileInfo { + def exists: Boolean } sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo @@ -31,90 +27,80 @@ private final case class FileHash(file: File, hash: List[Byte]) extends HashFile private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo -object FileInfo -{ - implicit def existsInputCache: InputCache[PlainFileInfo] = exists.infoInputCache - implicit def modifiedInputCache: InputCache[ModifiedFileInfo] = lastModified.infoInputCache - implicit 
def hashInputCache: InputCache[HashFileInfo] = hash.infoInputCache - implicit def fullInputCache: InputCache[HashModifiedFileInfo] = full.infoInputCache +object FileInfo { + implicit def existsInputCache: InputCache[PlainFileInfo] = exists.infoInputCache + implicit def modifiedInputCache: InputCache[ModifiedFileInfo] = lastModified.infoInputCache + implicit def hashInputCache: InputCache[HashFileInfo] = hash.infoInputCache + implicit def fullInputCache: InputCache[HashModifiedFileInfo] = full.infoInputCache - sealed trait Style - { - type F <: FileInfo - implicit def apply(file: File): F - implicit def unapply(info: F): File = info.file - implicit val format: Format[F] - import Cache._ - implicit def fileInfoEquiv: Equiv[F] = defaultEquiv - def infoInputCache: InputCache[F] = basicInput - implicit def fileInputCache: InputCache[File] = wrapIn[File,F] - } - object full extends Style - { - type F = HashModifiedFileInfo - implicit def apply(file: File): HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) - def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) - implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), (make _).tupled) - } - object hash extends Style - { - type F = HashFileInfo - implicit def apply(file: File): HashFileInfo = make(file, computeHash(file)) - def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) - implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), (make _).tupled) - private def computeHash(file: File): List[Byte] = try { Hash(file).toList } catch { case e: Exception => Nil } - } - object lastModified extends Style - { - type F = ModifiedFileInfo - implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) - def make(file: File, lastModified: Long): ModifiedFileInfo = FileModified(file.getAbsoluteFile, lastModified) 
- implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), (make _).tupled) - } - object exists extends Style - { - type F = PlainFileInfo - implicit def apply(file: File): PlainFileInfo = make(file) - def make(file: File): PlainFileInfo = { val abs = file.getAbsoluteFile; PlainFile(abs, abs.exists) } - implicit val format: Format[PlainFileInfo] = asProduct2[PlainFileInfo, File, Boolean](PlainFile.apply)(x => (x.file, x.exists)) - } + sealed trait Style { + type F <: FileInfo + implicit def apply(file: File): F + implicit def unapply(info: F): File = info.file + implicit val format: Format[F] + import Cache._ + implicit def fileInfoEquiv: Equiv[F] = defaultEquiv + def infoInputCache: InputCache[F] = basicInput + implicit def fileInputCache: InputCache[File] = wrapIn[File, F] + } + object full extends Style { + type F = HashModifiedFileInfo + implicit def apply(file: File): HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) + def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) + implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), (make _).tupled) + } + object hash extends Style { + type F = HashFileInfo + implicit def apply(file: File): HashFileInfo = make(file, computeHash(file)) + def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) + implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), (make _).tupled) + private def computeHash(file: File): List[Byte] = try { Hash(file).toList } catch { case e: Exception => Nil } + } + object lastModified extends Style { + type F = ModifiedFileInfo + implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) + def make(file: File, lastModified: Long): ModifiedFileInfo = FileModified(file.getAbsoluteFile, lastModified) + implicit val format: Format[ModifiedFileInfo] = 
wrap(f => (f.file, f.lastModified), (make _).tupled) + } + object exists extends Style { + type F = PlainFileInfo + implicit def apply(file: File): PlainFileInfo = make(file) + def make(file: File): PlainFileInfo = { val abs = file.getAbsoluteFile; PlainFile(abs, abs.exists) } + implicit val format: Format[PlainFileInfo] = asProduct2[PlainFileInfo, File, Boolean](PlainFile.apply)(x => (x.file, x.exists)) + } } -final case class FilesInfo[F <: FileInfo] private(files: Set[F]) -object FilesInfo -{ - sealed abstract class Style - { - type F <: FileInfo - val fileStyle: FileInfo.Style { type F = Style.this.F } +final case class FilesInfo[F <: FileInfo] private (files: Set[F]) +object FilesInfo { + sealed abstract class Style { + type F <: FileInfo + val fileStyle: FileInfo.Style { type F = Style.this.F } - //def manifest: Manifest[F] = fileStyle.manifest - implicit def apply(files: Set[File]): FilesInfo[F] - implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) - implicit val formats: Format[FilesInfo[F]] - val manifest: Manifest[Format[FilesInfo[F]]] - def empty: FilesInfo[F] = new FilesInfo[F](Set.empty) - import Cache._ - def infosInputCache: InputCache[FilesInfo[F]] = basicInput - implicit def filesInputCache: InputCache[Set[File]] = wrapIn[Set[File],FilesInfo[F]] - implicit def filesInfoEquiv: Equiv[FilesInfo[F]] = defaultEquiv - } - private final class BasicStyle[FI <: FileInfo](style: FileInfo.Style { type F = FI }) - (implicit val manifest: Manifest[Format[FilesInfo[FI]]]) extends Style - { - type F = FI - val fileStyle: FileInfo.Style { type F = FI } = style - private implicit val infoFormat: Format[FI] = fileStyle.format - implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo( files.map(_.getAbsoluteFile).map(fileStyle.apply) ) - implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) - } - lazy val full: Style { type F = HashModifiedFileInfo } = new BasicStyle(FileInfo.full) - lazy val 
hash: Style { type F = HashFileInfo } = new BasicStyle(FileInfo.hash) - lazy val lastModified: Style { type F = ModifiedFileInfo } = new BasicStyle(FileInfo.lastModified) - lazy val exists: Style { type F = PlainFileInfo } = new BasicStyle(FileInfo.exists) + //def manifest: Manifest[F] = fileStyle.manifest + implicit def apply(files: Set[File]): FilesInfo[F] + implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) + implicit val formats: Format[FilesInfo[F]] + val manifest: Manifest[Format[FilesInfo[F]]] + def empty: FilesInfo[F] = new FilesInfo[F](Set.empty) + import Cache._ + def infosInputCache: InputCache[FilesInfo[F]] = basicInput + implicit def filesInputCache: InputCache[Set[File]] = wrapIn[Set[File], FilesInfo[F]] + implicit def filesInfoEquiv: Equiv[FilesInfo[F]] = defaultEquiv + } + private final class BasicStyle[FI <: FileInfo](style: FileInfo.Style { type F = FI })(implicit val manifest: Manifest[Format[FilesInfo[FI]]]) extends Style { + type F = FI + val fileStyle: FileInfo.Style { type F = FI } = style + private implicit val infoFormat: Format[FI] = fileStyle.format + implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files.map(_.getAbsoluteFile).map(fileStyle.apply)) + implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) + } + lazy val full: Style { type F = HashModifiedFileInfo } = new BasicStyle(FileInfo.full) + lazy val hash: Style { type F = HashFileInfo } = new BasicStyle(FileInfo.hash) + lazy val lastModified: Style { type F = ModifiedFileInfo } = new BasicStyle(FileInfo.lastModified) + lazy val exists: Style { type F = PlainFileInfo } = new BasicStyle(FileInfo.exists) - implicit def existsInputsCache: InputCache[FilesInfo[PlainFileInfo]] = exists.infosInputCache - implicit def hashInputsCache: InputCache[FilesInfo[HashFileInfo]] = hash.infosInputCache - implicit def modifiedInputsCache: InputCache[FilesInfo[ModifiedFileInfo]] = lastModified.infosInputCache - implicit 
def fullInputsCache: InputCache[FilesInfo[HashModifiedFileInfo]] = full.infosInputCache + implicit def existsInputsCache: InputCache[FilesInfo[PlainFileInfo]] = exists.infosInputCache + implicit def hashInputsCache: InputCache[FilesInfo[HashFileInfo]] = hash.infosInputCache + implicit def modifiedInputsCache: InputCache[FilesInfo[ModifiedFileInfo]] = lastModified.infosInputCache + implicit def fullInputsCache: InputCache[FilesInfo[HashModifiedFileInfo]] = full.infosInputCache } \ No newline at end of file diff --git a/cache/src/main/scala/sbt/SeparatedCache.scala b/cache/src/main/scala/sbt/SeparatedCache.scala index a126229bd..9d11f1f3c 100644 --- a/cache/src/main/scala/sbt/SeparatedCache.scala +++ b/cache/src/main/scala/sbt/SeparatedCache.scala @@ -4,64 +4,59 @@ package sbt import Types.:+: -import sbinary.{DefaultProtocol, Format, Input, Output => Out} +import sbinary.{ DefaultProtocol, Format, Input, Output => Out } import DefaultProtocol.ByteFormat -import java.io.{File, InputStream, OutputStream} +import java.io.{ File, InputStream, OutputStream } -trait InputCache[I] -{ - type Internal - def convert(i: I): Internal - def read(from: Input): Internal - def write(to: Out, j: Internal): Unit - def equiv: Equiv[Internal] +trait InputCache[I] { + type Internal + def convert(i: I): Internal + def read(from: Input): Internal + def write(to: Out, j: Internal): Unit + def equiv: Equiv[Internal] } -object InputCache -{ - implicit def basicInputCache[I](implicit fmt: Format[I], eqv: Equiv[I]): InputCache[I] = - new InputCache[I] - { - type Internal = I - def convert(i: I) = i - def read(from: Input): I = fmt.reads(from) - def write(to: Out, i: I) = fmt.writes(to, i) - def equiv = eqv - } - def lzy[I](mkIn: => InputCache[I]): InputCache[I] = - new InputCache[I] - { - lazy val ic = mkIn - type Internal = ic.Internal - def convert(i: I) = ic convert i - def read(from: Input): ic.Internal = ic.read(from) - def write(to: Out, i: ic.Internal) = ic.write(to, i) - def equiv = 
ic.equiv - } +object InputCache { + implicit def basicInputCache[I](implicit fmt: Format[I], eqv: Equiv[I]): InputCache[I] = + new InputCache[I] { + type Internal = I + def convert(i: I) = i + def read(from: Input): I = fmt.reads(from) + def write(to: Out, i: I) = fmt.writes(to, i) + def equiv = eqv + } + def lzy[I](mkIn: => InputCache[I]): InputCache[I] = + new InputCache[I] { + lazy val ic = mkIn + type Internal = ic.Internal + def convert(i: I) = ic convert i + def read(from: Input): ic.Internal = ic.read(from) + def write(to: Out, i: ic.Internal) = ic.write(to, i) + def equiv = ic.equiv + } } -class BasicCache[I,O](implicit input: InputCache[I], outFormat: Format[O]) extends Cache[I,O] -{ - def apply(file: File)(in: I) = - { - val j = input.convert(in) - try { applyImpl(file, j) } - catch { case e: Exception => Right(update(file)(j)) } - } - protected def applyImpl(file: File, in: input.Internal) = - { - Using.fileInputStream(file) { stream => - val previousIn = input.read(stream) - if(input.equiv.equiv(in, previousIn)) - Left(outFormat.reads(stream)) - else - Right(update(file)(in)) - } - } - protected def update(file: File)(in: input.Internal) = (out: O) => - { - Using.fileOutputStream(false)(file) { stream => - input.write(stream, in) - outFormat.writes(stream, out) - } - } +class BasicCache[I, O](implicit input: InputCache[I], outFormat: Format[O]) extends Cache[I, O] { + def apply(file: File)(in: I) = + { + val j = input.convert(in) + try { applyImpl(file, j) } + catch { case e: Exception => Right(update(file)(j)) } + } + protected def applyImpl(file: File, in: input.Internal) = + { + Using.fileInputStream(file) { stream => + val previousIn = input.read(stream) + if (input.equiv.equiv(in, previousIn)) + Left(outFormat.reads(stream)) + else + Right(update(file)(in)) + } + } + protected def update(file: File)(in: input.Internal) = (out: O) => + { + Using.fileOutputStream(false)(file) { stream => + input.write(stream, in) + outFormat.writes(stream, out) + } + 
} } \ No newline at end of file diff --git a/cache/tracking/src/main/scala/sbt/ChangeReport.scala b/cache/tracking/src/main/scala/sbt/ChangeReport.scala index 634650f20..8502f9d3f 100644 --- a/cache/tracking/src/main/scala/sbt/ChangeReport.scala +++ b/cache/tracking/src/main/scala/sbt/ChangeReport.scala @@ -3,71 +3,68 @@ */ package sbt -object ChangeReport -{ - def modified[T](files: Set[T]) = - new EmptyChangeReport[T] - { - override def checked = files - override def modified = files - override def markAllModified = this - } - def unmodified[T](files: Set[T]) = - new EmptyChangeReport[T] - { - override def checked = files - override def unmodified = files - } +object ChangeReport { + def modified[T](files: Set[T]) = + new EmptyChangeReport[T] { + override def checked = files + override def modified = files + override def markAllModified = this + } + def unmodified[T](files: Set[T]) = + new EmptyChangeReport[T] { + override def checked = files + override def unmodified = files + } } /** The result of comparing some current set of objects against a previous set of objects.*/ -trait ChangeReport[T] extends NotNull -{ - /** The set of all of the objects in the current set.*/ - def checked: Set[T] - /** All of the objects that are in the same state in the current and reference sets.*/ - def unmodified: Set[T] - /** All checked objects that are not in the same state as the reference. This includes objects that are in both - * sets but have changed and files that are only in one set.*/ - def modified: Set[T] // all changes, including added - /** All objects that are only in the current set.*/ - def added: Set[T] - /** All objects only in the previous set*/ - def removed: Set[T] - def +++(other: ChangeReport[T]): ChangeReport[T] = new CompoundChangeReport(this, other) - /** Generate a new report with this report's unmodified set included in the new report's modified set. The new report's - * unmodified set is empty. 
The new report's added, removed, and checked sets are the same as in this report. */ - def markAllModified: ChangeReport[T] = - new ChangeReport[T] - { - def checked = ChangeReport.this.checked - def unmodified = Set.empty[T] - def modified = ChangeReport.this.checked - def added = ChangeReport.this.added - def removed = ChangeReport.this.removed - override def markAllModified = this - } - override def toString = - { - val labels = List("Checked", "Modified", "Unmodified", "Added", "Removed") - val sets = List(checked, modified, unmodified, added, removed) - val keyValues = labels.zip(sets).map{ case (label, set) => label + ": " + set.mkString(", ") } - keyValues.mkString("Change report:\n\t", "\n\t", "") - } +trait ChangeReport[T] extends NotNull { + /** The set of all of the objects in the current set.*/ + def checked: Set[T] + /** All of the objects that are in the same state in the current and reference sets.*/ + def unmodified: Set[T] + /** + * All checked objects that are not in the same state as the reference. This includes objects that are in both + * sets but have changed and files that are only in one set. + */ + def modified: Set[T] // all changes, including added + /** All objects that are only in the current set.*/ + def added: Set[T] + /** All objects only in the previous set*/ + def removed: Set[T] + def +++(other: ChangeReport[T]): ChangeReport[T] = new CompoundChangeReport(this, other) + /** + * Generate a new report with this report's unmodified set included in the new report's modified set. The new report's + * unmodified set is empty. The new report's added, removed, and checked sets are the same as in this report. 
+ */ + def markAllModified: ChangeReport[T] = + new ChangeReport[T] { + def checked = ChangeReport.this.checked + def unmodified = Set.empty[T] + def modified = ChangeReport.this.checked + def added = ChangeReport.this.added + def removed = ChangeReport.this.removed + override def markAllModified = this + } + override def toString = + { + val labels = List("Checked", "Modified", "Unmodified", "Added", "Removed") + val sets = List(checked, modified, unmodified, added, removed) + val keyValues = labels.zip(sets).map { case (label, set) => label + ": " + set.mkString(", ") } + keyValues.mkString("Change report:\n\t", "\n\t", "") + } } -class EmptyChangeReport[T] extends ChangeReport[T] -{ - def checked = Set.empty[T] - def unmodified = Set.empty[T] - def modified = Set.empty[T] - def added = Set.empty[T] - def removed = Set.empty[T] - override def toString = "No changes" +class EmptyChangeReport[T] extends ChangeReport[T] { + def checked = Set.empty[T] + def unmodified = Set.empty[T] + def modified = Set.empty[T] + def added = Set.empty[T] + def removed = Set.empty[T] + override def toString = "No changes" } -private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) extends ChangeReport[T] -{ - lazy val checked = a.checked ++ b.checked - lazy val unmodified = a.unmodified ++ b.unmodified - lazy val modified = a.modified ++ b.modified - lazy val added = a.added ++ b.added - lazy val removed = a.removed ++ b.removed +private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) extends ChangeReport[T] { + lazy val checked = a.checked ++ b.checked + lazy val unmodified = a.unmodified ++ b.unmodified + lazy val modified = a.modified ++ b.modified + lazy val added = a.added ++ b.added + lazy val removed = a.removed ++ b.removed } \ No newline at end of file diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala index fb0747ed9..c851ef9a5 100644 --- 
a/cache/tracking/src/main/scala/sbt/Tracked.scala +++ b/cache/tracking/src/main/scala/sbt/Tracked.scala @@ -4,204 +4,202 @@ package sbt import java.io.File -import CacheIO.{fromFile, toFile} +import CacheIO.{ fromFile, toFile } import sbinary.Format import scala.reflect.Manifest import scala.collection.mutable -import IO.{delete, read, write} +import IO.{ delete, read, write } +object Tracked { + /** + * Creates a tracker that provides the last time it was evaluated. + * If 'useStartTime' is true, the recorded time is the start of the evaluated function. + * If 'useStartTime' is false, the recorded time is when the evaluated function completes. + * In both cases, the timestamp is not updated if the function throws an exception. + */ + def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) + /** Creates a tracker that only evaluates a function when the input has changed.*/ + //def changed[O](cacheFile: File)(implicit format: Format[O], equiv: Equiv[O]): Changed[O] = + // new Changed[O](cacheFile) -object Tracked -{ - /** Creates a tracker that provides the last time it was evaluated. - * If 'useStartTime' is true, the recorded time is the start of the evaluated function. - * If 'useStartTime' is false, the recorded time is when the evaluated function completes. 
- * In both cases, the timestamp is not updated if the function throws an exception.*/ - def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) - /** Creates a tracker that only evaluates a function when the input has changed.*/ - //def changed[O](cacheFile: File)(implicit format: Format[O], equiv: Equiv[O]): Changed[O] = - // new Changed[O](cacheFile) - - /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ - def diffInputs(cache: File, style: FilesInfo.Style): Difference = - Difference.inputs(cache, style) - /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ - def diffOutputs(cache: File, style: FilesInfo.Style): Difference = - Difference.outputs(cache, style) + /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ + def diffInputs(cache: File, style: FilesInfo.Style): Difference = + Difference.inputs(cache, style) + /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ + def diffOutputs(cache: File, style: FilesInfo.Style): Difference = + Difference.outputs(cache, style) - def lastOutput[I,O](cacheFile: File)(f: (I,Option[O]) => O)(implicit o: Format[O], mf: Manifest[Format[O]]): I => O = in => - { - val previous: Option[O] = fromFile[O](cacheFile) - val next = f(in, previous) - toFile(next)(cacheFile) - next - } + def lastOutput[I, O](cacheFile: File)(f: (I, Option[O]) => O)(implicit o: Format[O], mf: Manifest[Format[O]]): I => O = in => + { + val previous: Option[O] = fromFile[O](cacheFile) + val next = f(in, previous) + toFile(next)(cacheFile) + next + } - def inputChanged[I,O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in => - { - val help = new CacheHelp(ic) - val conv = help.convert(in) - val changed = help.changed(cacheFile, conv) - val 
result = f(changed, in) - - if(changed) - help.save(cacheFile, conv) + def inputChanged[I, O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in => + { + val help = new CacheHelp(ic) + val conv = help.convert(in) + val changed = help.changed(cacheFile, conv) + val result = f(changed, in) - result - } - def outputChanged[I,O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): (() => I) => O = in => - { - val initial = in() - val help = new CacheHelp(ic) - val changed = help.changed(cacheFile, help.convert(initial)) - val result = f(changed, initial) - - if(changed) - help.save(cacheFile, help.convert(in())) + if (changed) + help.save(cacheFile, conv) - result - } - final class CacheHelp[I](val ic: InputCache[I]) - { - def convert(i: I): ic.Internal = ic.convert(i) - def save(cacheFile: File, value: ic.Internal): Unit = - Using.fileOutputStream()(cacheFile)(out => ic.write(out, value) ) - def changed(cacheFile: File, converted: ic.Internal): Boolean = - try { - val prev = Using.fileInputStream(cacheFile)(x => ic.read(x)) - !ic.equiv.equiv(converted, prev) - } catch { case e: Exception => true } - } + result + } + def outputChanged[I, O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): (() => I) => O = in => + { + val initial = in() + val help = new CacheHelp(ic) + val changed = help.changed(cacheFile, help.convert(initial)) + val result = f(changed, initial) + + if (changed) + help.save(cacheFile, help.convert(in())) + + result + } + final class CacheHelp[I](val ic: InputCache[I]) { + def convert(i: I): ic.Internal = ic.convert(i) + def save(cacheFile: File, value: ic.Internal): Unit = + Using.fileOutputStream()(cacheFile)(out => ic.write(out, value)) + def changed(cacheFile: File, converted: ic.Internal): Boolean = + try { + val prev = Using.fileInputStream(cacheFile)(x => ic.read(x)) + !ic.equiv.equiv(converted, prev) + } catch { case e: Exception => true } + } } -trait Tracked -{ - /** Cleans outputs 
and clears the cache.*/ - def clean(): Unit +trait Tracked { + /** Cleans outputs and clears the cache.*/ + def clean(): Unit } -class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked -{ - def clean() = delete(cacheFile) - /** Reads the previous timestamp, evaluates the provided function, - * and then updates the timestamp if the function completes normally.*/ - def apply[T](f: Long => T): T = - { - val start = now() - val result = f(readTimestamp) - write(cacheFile, (if(useStartTime) start else now()).toString) - result - } - private def now() = System.currentTimeMillis - def readTimestamp: Long = - try { read(cacheFile).toLong } - catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } +class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked { + def clean() = delete(cacheFile) + /** + * Reads the previous timestamp, evaluates the provided function, + * and then updates the timestamp if the function completes normally. + */ + def apply[T](f: Long => T): T = + { + val start = now() + val result = f(readTimestamp) + write(cacheFile, (if (useStartTime) start else now()).toString) + result + } + private def now() = System.currentTimeMillis + def readTimestamp: Long = + try { read(cacheFile).toLong } + catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } } -class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O]) extends Tracked -{ - def clean() = delete(cacheFile) - def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => - { - if(uptodate(value)) - ifUnchanged(value) - else - { - update(value) - ifChanged(value) - } - } +class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O]) extends Tracked { + def clean() = delete(cacheFile) + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => + { + if (uptodate(value)) + ifUnchanged(value) + else { + update(value) + ifChanged(value) + } + } - 
def update(value: O): Unit = Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) - def uptodate(value: O): Boolean = - try { - Using.fileInputStream(cacheFile) { - stream => equiv.equiv(value, format.reads(stream)) - } - } catch { - case _: Exception => false - } + def update(value: O): Unit = Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) + def uptodate(value: O): Boolean = + try { + Using.fileInputStream(cacheFile) { + stream => equiv.equiv(value, format.reads(stream)) + } + } catch { + case _: Exception => false + } } -object Difference -{ - def constructor(defineClean: Boolean, filesAreOutputs: Boolean): (File, FilesInfo.Style) => Difference = - (cache, style) => new Difference(cache, style, defineClean, filesAreOutputs) +object Difference { + def constructor(defineClean: Boolean, filesAreOutputs: Boolean): (File, FilesInfo.Style) => Difference = + (cache, style) => new Difference(cache, style, defineClean, filesAreOutputs) - /** Provides a constructor for a Difference that removes the files from the previous run on a call to 'clean' and saves the - * hash/last modified time of the files as they are after running the function. This means that this information must be evaluated twice: - * before and after running the function.*/ - val outputs = constructor(true, true) - /** Provides a constructor for a Difference that does nothing on a call to 'clean' and saves the - * hash/last modified time of the files as they were prior to running the function.*/ - val inputs = constructor(false, false) + /** + * Provides a constructor for a Difference that removes the files from the previous run on a call to 'clean' and saves the + * hash/last modified time of the files as they are after running the function. This means that this information must be evaluated twice: + * before and after running the function. 
+ */ + val outputs = constructor(true, true) + /** + * Provides a constructor for a Difference that does nothing on a call to 'clean' and saves the + * hash/last modified time of the files as they were prior to running the function. + */ + val inputs = constructor(false, false) } -class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked -{ - def clean() = - { - if(defineClean) delete(raw(cachedFilesInfo)) else () - clearCache() - } - private def clearCache() = delete(cache) - - private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files - private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) - - def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = - { - val lastFilesInfo = cachedFilesInfo - apply(files, lastFilesInfo)(f)(_ => files) - } - - def apply[T](f: ChangeReport[File] => T)(implicit toFiles: T => Set[File]): T = - { - val lastFilesInfo = cachedFilesInfo - apply(raw(lastFilesInfo), lastFilesInfo)(f)(toFiles) - } - - private def abs(files: Set[File]) = files.map(_.getAbsoluteFile) - private[this] def apply[T](files: Set[File], lastFilesInfo: Set[style.F])(f: ChangeReport[File] => T)(extractFiles: T => Set[File]): T = - { - val lastFiles = raw(lastFilesInfo) - val currentFiles = abs(files) - val currentFilesInfo = style(currentFiles) +class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { + def clean() = + { + if (defineClean) delete(raw(cachedFilesInfo)) else () + clearCache() + } + private def clearCache() = delete(cache) - val report = new ChangeReport[File] - { - lazy val checked = currentFiles - lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. - lazy val added = checked -- lastFiles // all files included now but not previously. 
This is independent of whether the files exist. - lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added - lazy val unmodified = checked -- modified - } + private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files + private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) - val result = f(report) - val info = if(filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo - toFile(style.formats)(info)(cache)(style.manifest) - result - } + def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = + { + val lastFilesInfo = cachedFilesInfo + apply(files, lastFilesInfo)(f)(_ => files) + } + + def apply[T](f: ChangeReport[File] => T)(implicit toFiles: T => Set[File]): T = + { + val lastFilesInfo = cachedFilesInfo + apply(raw(lastFilesInfo), lastFilesInfo)(f)(toFiles) + } + + private def abs(files: Set[File]) = files.map(_.getAbsoluteFile) + private[this] def apply[T](files: Set[File], lastFilesInfo: Set[style.F])(f: ChangeReport[File] => T)(extractFiles: T => Set[File]): T = + { + val lastFiles = raw(lastFilesInfo) + val currentFiles = abs(files) + val currentFilesInfo = style(currentFiles) + + val report = new ChangeReport[File] { + lazy val checked = currentFiles + lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. + lazy val added = checked -- lastFiles // all files included now but not previously. This is independent of whether the files exist. 
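
The `ChangeReport` fields being laid out in this hunk are plain set algebra over the previous and current file sets. A minimal restatement of the same relations over strings — hypothetical standalone code, not the sbt types:

```scala
// The removed/added relations from ChangeReport, restated over plain Sets.
// This only models the membership part; the real report also folds
// hash/last-modified differences from FilesInfo into `modified`.
def diffSets(last: Set[String], current: Set[String]): (Set[String], Set[String], Set[String]) = {
  val removed = last -- current       // previously included, not included now
  val added   = current -- last       // included now, not previously
  val common  = current intersect last // candidates for "unmodified"
  (removed, added, common)
}
```

As the comments in the hunk note, these relations are about inclusion in the tracked set, independent of whether the files actually exist on disk.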
+ lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added + lazy val unmodified = checked -- modified + } + + val result = f(report) + val info = if (filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo + toFile(style.formats)(info)(cache)(style.manifest) + result + } } object FileFunction { - type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => Set[File] - - def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style = FilesInfo.lastModified, outStyle: FilesInfo.Style = FilesInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = - cached(cacheBaseDirectory)(inStyle, outStyle)( (in, out) => action(in.checked) ) - - def cached(cacheBaseDirectory: File)(inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = - { - import Path._ - lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle) - lazy val outCache = Difference.outputs(cacheBaseDirectory / "out-cache", outStyle) - inputs => - { - inCache(inputs) { inReport => - outCache { outReport => - if(inReport.modified.isEmpty && outReport.modified.isEmpty) - outReport.checked - else - action(inReport, outReport) - } - } - } - } + type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => Set[File] + + def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style = FilesInfo.lastModified, outStyle: FilesInfo.Style = FilesInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = + cached(cacheBaseDirectory)(inStyle, outStyle)((in, out) => action(in.checked)) + + def cached(cacheBaseDirectory: File)(inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = + { + import Path._ + lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle) + lazy val outCache = Difference.outputs(cacheBaseDirectory / "out-cache", outStyle) + inputs => + { + inCache(inputs) { inReport => + outCache { outReport => + if 
(inReport.modified.isEmpty && outReport.modified.isEmpty) + outReport.checked + else + action(inReport, outReport) + } + } + } + } } \ No newline at end of file diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index fe1baa696..29a962de7 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -1,245 +1,260 @@ package sbt package appmacro - import scala.reflect._ - import macros._ - import scala.tools.nsc.Global - import ContextUtil.{DynamicDependencyError, DynamicReferenceError} +import scala.reflect._ +import macros._ +import scala.tools.nsc.Global +import ContextUtil.{ DynamicDependencyError, DynamicReferenceError } object ContextUtil { - final val DynamicDependencyError = "Illegal dynamic dependency" - final val DynamicReferenceError = "Illegal dynamic reference" + final val DynamicDependencyError = "Illegal dynamic dependency" + final val DynamicReferenceError = "Illegal dynamic reference" - /** Constructs an object with utility methods for operating in the provided macro context `c`. - * Callers should explicitly specify the type parameter as `c.type` in order to preserve the path dependent types. */ - def apply[C <: Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) + /** + * Constructs an object with utility methods for operating in the provided macro context `c`. + * Callers should explicitly specify the type parameter as `c.type` in order to preserve the path dependent types. + */ + def apply[C <: Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) + /** + * Helper for implementing a no-argument macro that is introduced via an implicit. + * This method removes the implicit conversion and evaluates the function `f` on the target of the conversion. 
+ * + * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this + * method, the result of this method is `f()`. + */ + def selectMacroImpl[T: c.WeakTypeTag](c: Context)(f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = + { + import c.universe._ + c.macroApplication match { + case s @ Select(Apply(_, t :: Nil), tp) => f(c.Expr[Any](t), s.pos) + case x => unexpectedTree(x) + } + } - /** Helper for implementing a no-argument macro that is introduced via an implicit. - * This method removes the implicit conversion and evaluates the function `f` on the target of the conversion. - * - * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this - * method, the result of this method is `f()`. */ - def selectMacroImpl[T: c.WeakTypeTag](c: Context)(f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = - { - import c.universe._ - c.macroApplication match { - case s @ Select(Apply(_, t :: Nil), tp) => f( c.Expr[Any](t), s.pos ) - case x => unexpectedTree(x) - } - } - - def unexpectedTree[C <: Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) + def unexpectedTree[C <: Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) } // TODO 2.11 Remove this after dropping 2.10.x support. private object HasCompat { val compat = ??? }; import HasCompat._ -/** Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). -* This is not thread safe due to the underlying Context and related data structures not being thread safe. -* Use `ContextUtil[c.type](c)` to construct. */ -final class ContextUtil[C <: Context](val ctx: C) -{ - import ctx.universe.{Apply=>ApplyTree,_} - import compat._ +/** + * Utility methods for macros. 
Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). + * This is not thread safe due to the underlying Context and related data structures not being thread safe. + * Use `ContextUtil[c.type](c)` to construct. + */ +final class ContextUtil[C <: Context](val ctx: C) { + import ctx.universe.{ Apply => ApplyTree, _ } + import compat._ - val powerContext = ctx.asInstanceOf[reflect.macros.runtime.Context] - val global: powerContext.universe.type = powerContext.universe - def callsiteTyper: global.analyzer.Typer = powerContext.callsiteTyper - val initialOwner: Symbol = callsiteTyper.context.owner.asInstanceOf[ctx.universe.Symbol] + val powerContext = ctx.asInstanceOf[reflect.macros.runtime.Context] + val global: powerContext.universe.type = powerContext.universe + def callsiteTyper: global.analyzer.Typer = powerContext.callsiteTyper + val initialOwner: Symbol = callsiteTyper.context.owner.asInstanceOf[ctx.universe.Symbol] - lazy val alistType = ctx.typeOf[AList[KList]] - lazy val alist: Symbol = alistType.typeSymbol.companionSymbol - lazy val alistTC: Type = alistType.typeConstructor + lazy val alistType = ctx.typeOf[AList[KList]] + lazy val alist: Symbol = alistType.typeSymbol.companionSymbol + lazy val alistTC: Type = alistType.typeConstructor - /** Modifiers for a local val.*/ - lazy val localModifiers = Modifiers(NoFlags) + /** Modifiers for a local val.*/ + lazy val localModifiers = Modifiers(NoFlags) - def getPos(sym: Symbol) = if(sym eq null) NoPosition else sym.pos + def getPos(sym: Symbol) = if (sym eq null) NoPosition else sym.pos - /** Constructs a unique term name with the given prefix within this Context. - * (The current implementation uses Context.fresh, which increments*/ - def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) + /** + * Constructs a unique term name with the given prefix within this Context. 
+ * (The current implementation uses Context.fresh, which increments + */ + def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) - /** Constructs a new, synthetic, local ValDef Type `tpe`, a unique name, - * Position `pos`, an empty implementation (no rhs), and owned by `owner`. */ - def freshValDef(tpe: Type, pos: Position, owner: Symbol): ValDef = - { - val SYNTHETIC = (1 << 21).toLong.asInstanceOf[FlagSet] - val sym = owner.newTermSymbol(freshTermName("q"), pos, SYNTHETIC) - setInfo(sym, tpe) - val vd = ValDef(sym, EmptyTree) - vd.setPos(pos) - vd - } + /** + * Constructs a new, synthetic, local ValDef Type `tpe`, a unique name, + * Position `pos`, an empty implementation (no rhs), and owned by `owner`. + */ + def freshValDef(tpe: Type, pos: Position, owner: Symbol): ValDef = + { + val SYNTHETIC = (1 << 21).toLong.asInstanceOf[FlagSet] + val sym = owner.newTermSymbol(freshTermName("q"), pos, SYNTHETIC) + setInfo(sym, tpe) + val vd = ValDef(sym, EmptyTree) + vd.setPos(pos) + vd + } - lazy val parameterModifiers = Modifiers(Flag.PARAM) + lazy val parameterModifiers = Modifiers(Flag.PARAM) - /** Collects all definitions in the tree for use in checkReferences. - * This excludes definitions in wrapped expressions because checkReferences won't allow nested dereferencing anyway. */ - def collectDefs(tree: Tree, isWrapper: (String, Type, Tree) => Boolean): collection.Set[Symbol] = - { - val defs = new collection.mutable.HashSet[Symbol] - // adds the symbols for all non-Ident subtrees to `defs`. - val process = new Traverser { - override def traverse(t: Tree) = t match { - case _: Ident => () - case ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) if isWrapper(nme.decoded, tpe.tpe, qual) => () - case tree => - if(tree.symbol ne null) defs += tree.symbol; - super.traverse(tree) - } - } - process.traverse(tree) - defs - } + /** + * Collects all definitions in the tree for use in checkReferences. 
+ * This excludes definitions in wrapped expressions because checkReferences won't allow nested dereferencing anyway. + */ + def collectDefs(tree: Tree, isWrapper: (String, Type, Tree) => Boolean): collection.Set[Symbol] = + { + val defs = new collection.mutable.HashSet[Symbol] + // adds the symbols for all non-Ident subtrees to `defs`. + val process = new Traverser { + override def traverse(t: Tree) = t match { + case _: Ident => () + case ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) if isWrapper(nme.decoded, tpe.tpe, qual) => () + case tree => + if (tree.symbol ne null) defs += tree.symbol; + super.traverse(tree) + } + } + process.traverse(tree) + defs + } - /** A reference is illegal if it is to an M instance defined within the scope of the macro call. - * As an approximation, disallow referenced to any local definitions `defs`. */ - def illegalReference(defs: collection.Set[Symbol], sym: Symbol): Boolean = - sym != null && sym != NoSymbol && defs.contains(sym) + /** + * A reference is illegal if it is to an M instance defined within the scope of the macro call. + * As an approximation, disallow referenced to any local definitions `defs`. + */ + def illegalReference(defs: collection.Set[Symbol], sym: Symbol): Boolean = + sym != null && sym != NoSymbol && defs.contains(sym) - /** A function that checks the provided tree for illegal references to M instances defined in the - * expression passed to the macro and for illegal dereferencing of M instances. 
*/ - def checkReferences(defs: collection.Set[Symbol], isWrapper: (String, Type, Tree) => Boolean): Tree => Unit = { - case s @ ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) => - if(isWrapper(nme.decoded, tpe.tpe, qual)) ctx.error(s.pos, DynamicDependencyError) - case id @ Ident(name) if illegalReference(defs, id.symbol) => ctx.error(id.pos, DynamicReferenceError + ": " + name) - case _ => () - } + /** + * A function that checks the provided tree for illegal references to M instances defined in the + * expression passed to the macro and for illegal dereferencing of M instances. + */ + def checkReferences(defs: collection.Set[Symbol], isWrapper: (String, Type, Tree) => Boolean): Tree => Unit = { + case s @ ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) => + if (isWrapper(nme.decoded, tpe.tpe, qual)) ctx.error(s.pos, DynamicDependencyError) + case id @ Ident(name) if illegalReference(defs, id.symbol) => ctx.error(id.pos, DynamicReferenceError + ": " + name) + case _ => () + } - /** Constructs a ValDef with a parameter modifier, a unique name, with the provided Type and with an empty rhs. */ - def freshMethodParameter(tpe: Type): ValDef = - ValDef(parameterModifiers, freshTermName("p"), TypeTree(tpe), EmptyTree) + /** Constructs a ValDef with a parameter modifier, a unique name, with the provided Type and with an empty rhs. */ + def freshMethodParameter(tpe: Type): ValDef = + ValDef(parameterModifiers, freshTermName("p"), TypeTree(tpe), EmptyTree) - /** Constructs a ValDef with local modifiers and a unique name. */ - def localValDef(tpt: Tree, rhs: Tree): ValDef = - ValDef(localModifiers, freshTermName("q"), tpt, rhs) + /** Constructs a ValDef with local modifiers and a unique name. 
*/ + def localValDef(tpt: Tree, rhs: Tree): ValDef = + ValDef(localModifiers, freshTermName("q"), tpt, rhs) - /** Constructs a tuple value of the right TupleN type from the provided inputs.*/ - def mkTuple(args: List[Tree]): Tree = - global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] + /** Constructs a tuple value of the right TupleN type from the provided inputs.*/ + def mkTuple(args: List[Tree]): Tree = + global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] - def setSymbol[Tree](t: Tree, sym: Symbol): Unit = - t.asInstanceOf[global.Tree].setSymbol(sym.asInstanceOf[global.Symbol]) - def setInfo[Tree](sym: Symbol, tpe: Type): Unit = - sym.asInstanceOf[global.Symbol].setInfo(tpe.asInstanceOf[global.Type]) + def setSymbol[Tree](t: Tree, sym: Symbol): Unit = + t.asInstanceOf[global.Tree].setSymbol(sym.asInstanceOf[global.Symbol]) + def setInfo[Tree](sym: Symbol, tpe: Type): Unit = + sym.asInstanceOf[global.Symbol].setInfo(tpe.asInstanceOf[global.Type]) - /** Creates a new, synthetic type variable with the specified `owner`. */ - def newTypeVariable(owner: Symbol, prefix: String = "T0"): TypeSymbol = - owner.asInstanceOf[global.Symbol].newSyntheticTypeParam(prefix, 0L).asInstanceOf[ctx.universe.TypeSymbol] + /** Creates a new, synthetic type variable with the specified `owner`. */ + def newTypeVariable(owner: Symbol, prefix: String = "T0"): TypeSymbol = + owner.asInstanceOf[global.Symbol].newSyntheticTypeParam(prefix, 0L).asInstanceOf[ctx.universe.TypeSymbol] - /** The type representing the type constructor `[X] X` */ - lazy val idTC: Type = - { - val tvar = newTypeVariable(NoSymbol) - polyType(tvar :: Nil, refVar(tvar)) - } - /** A Type that references the given type variable. */ - def refVar(variable: TypeSymbol): Type = variable.toTypeConstructor - /** Constructs a new, synthetic type variable that is a type constructor. For example, in type Y[L[x]], L is such a type variable. 
*/ - def newTCVariable(owner: Symbol): TypeSymbol = - { - val tc = newTypeVariable(owner) - val arg = newTypeVariable(tc, "x") - tc.setTypeSignature(PolyType(arg :: Nil, emptyTypeBounds)) - tc - } - /** >: Nothing <: Any */ - def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) + /** The type representing the type constructor `[X] X` */ + lazy val idTC: Type = + { + val tvar = newTypeVariable(NoSymbol) + polyType(tvar :: Nil, refVar(tvar)) + } + /** A Type that references the given type variable. */ + def refVar(variable: TypeSymbol): Type = variable.toTypeConstructor + /** Constructs a new, synthetic type variable that is a type constructor. For example, in type Y[L[x]], L is such a type variable. */ + def newTCVariable(owner: Symbol): TypeSymbol = + { + val tc = newTypeVariable(owner) + val arg = newTypeVariable(tc, "x") + tc.setTypeSignature(PolyType(arg :: Nil, emptyTypeBounds)) + tc + } + /** >: Nothing <: Any */ + def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) - /** Creates a new anonymous function symbol with Position `pos`. */ - def functionSymbol(pos: Position): Symbol = - callsiteTyper.context.owner.newAnonymousFunctionValue(pos.asInstanceOf[global.Position]).asInstanceOf[ctx.universe.Symbol] + /** Creates a new anonymous function symbol with Position `pos`. 
*/ + def functionSymbol(pos: Position): Symbol = + callsiteTyper.context.owner.newAnonymousFunctionValue(pos.asInstanceOf[global.Position]).asInstanceOf[ctx.universe.Symbol] - def functionType(args: List[Type], result: Type): Type = - { - val tpe = global.definitions.functionType(args.asInstanceOf[List[global.Type]], result.asInstanceOf[global.Type]) - tpe.asInstanceOf[Type] - } + def functionType(args: List[Type], result: Type): Type = + { + val tpe = global.definitions.functionType(args.asInstanceOf[List[global.Type]], result.asInstanceOf[global.Type]) + tpe.asInstanceOf[Type] + } - /** Create a Tree that references the `val` represented by `vd`, copying attributes from `replaced`. */ - def refVal(replaced: Tree, vd: ValDef): Tree = - treeCopy.Ident(replaced, vd.name).setSymbol(vd.symbol) + /** Create a Tree that references the `val` represented by `vd`, copying attributes from `replaced`. */ + def refVal(replaced: Tree, vd: ValDef): Tree = + treeCopy.Ident(replaced, vd.name).setSymbol(vd.symbol) - /** Creates a Function tree using `functionSym` as the Symbol and changing `initialOwner` to `functionSym` in `body`.*/ - def createFunction(params: List[ValDef], body: Tree, functionSym: Symbol): Tree = - { - changeOwner(body, initialOwner, functionSym) - val f = Function(params, body) - setSymbol(f, functionSym) - f - } + /** Creates a Function tree using `functionSym` as the Symbol and changing `initialOwner` to `functionSym` in `body`.*/ + def createFunction(params: List[ValDef], body: Tree, functionSym: Symbol): Tree = + { + changeOwner(body, initialOwner, functionSym) + val f = Function(params, body) + setSymbol(f, functionSym) + f + } - def changeOwner(tree: Tree, prev: Symbol, next: Symbol): Unit = - new ChangeOwnerAndModuleClassTraverser(prev.asInstanceOf[global.Symbol], next.asInstanceOf[global.Symbol]).traverse(tree.asInstanceOf[global.Tree]) + def changeOwner(tree: Tree, prev: Symbol, next: Symbol): Unit = + new 
ChangeOwnerAndModuleClassTraverser(prev.asInstanceOf[global.Symbol], next.asInstanceOf[global.Symbol]).traverse(tree.asInstanceOf[global.Tree]) - // Workaround copied from scala/async:can be removed once https://github.com/scala/scala/pull/3179 is merged. - private[this] class ChangeOwnerAndModuleClassTraverser(oldowner: global.Symbol, newowner: global.Symbol) extends global.ChangeOwnerTraverser(oldowner, newowner) - { - override def traverse(tree: global.Tree) { - tree match { - case _: global.DefTree => change(tree.symbol.moduleClass) - case _ => - } - super.traverse(tree) - } - } + // Workaround copied from scala/async:can be removed once https://github.com/scala/scala/pull/3179 is merged. + private[this] class ChangeOwnerAndModuleClassTraverser(oldowner: global.Symbol, newowner: global.Symbol) extends global.ChangeOwnerTraverser(oldowner, newowner) { + override def traverse(tree: global.Tree) { + tree match { + case _: global.DefTree => change(tree.symbol.moduleClass) + case _ => + } + super.traverse(tree) + } + } - /** Returns the Symbol that references the statically accessible singleton `i`. */ - def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = - it.tpe match { - case SingleType(_, sym) if !sym.isFreeTerm && sym.isStatic => sym - case x => sys.error("Instance must be static (was " + x + ").") - } + /** Returns the Symbol that references the statically accessible singleton `i`. */ + def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = + it.tpe match { + case SingleType(_, sym) if !sym.isFreeTerm && sym.isStatic => sym + case x => sys.error("Instance must be static (was " + x + ").") + } - def select(t: Tree, name: String): Tree = Select(t, newTermName(name)) + def select(t: Tree, name: String): Tree = Select(t, newTermName(name)) - /** Returns the symbol for the non-private method named `name` for the class/module `obj`. 
*/ - def method(obj: Symbol, name: String): Symbol = { - val ts: Type = obj.typeSignature - val m: global.Symbol = ts.asInstanceOf[global.Type].nonPrivateMember(global.newTermName(name)) - m.asInstanceOf[Symbol] - } + /** Returns the symbol for the non-private method named `name` for the class/module `obj`. */ + def method(obj: Symbol, name: String): Symbol = { + val ts: Type = obj.typeSignature + val m: global.Symbol = ts.asInstanceOf[global.Type].nonPrivateMember(global.newTermName(name)) + m.asInstanceOf[Symbol] + } - /** Returns a Type representing the type constructor tcp.. For example, given - * `object Demo { type M[x] = List[x] }`, the call `extractTC(Demo, "M")` will return a type representing - * the type constructor `[x] List[x]`. - **/ - def extractTC(tcp: AnyRef with Singleton, name: String)(implicit it: ctx.TypeTag[tcp.type]): ctx.Type = - { - val itTpe = it.tpe.asInstanceOf[global.Type] - val m = itTpe.nonPrivateMember(global.newTypeName(name)) - val tc = itTpe.memberInfo(m).asInstanceOf[ctx.universe.Type] - assert(tc != NoType && tc.takesTypeArgs, "Invalid type constructor: " + tc) - tc - } + /** + * Returns a Type representing the type constructor tcp.. For example, given + * `object Demo { type M[x] = List[x] }`, the call `extractTC(Demo, "M")` will return a type representing + * the type constructor `[x] List[x]`. + */ + def extractTC(tcp: AnyRef with Singleton, name: String)(implicit it: ctx.TypeTag[tcp.type]): ctx.Type = + { + val itTpe = it.tpe.asInstanceOf[global.Type] + val m = itTpe.nonPrivateMember(global.newTypeName(name)) + val tc = itTpe.memberInfo(m).asInstanceOf[ctx.universe.Type] + assert(tc != NoType && tc.takesTypeArgs, "Invalid type constructor: " + tc) + tc + } - /** Substitutes wrappers in tree `t` with the result of `subWrapper`. - * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(, , .target) returns true. - * Typically, `f` is a `Select` or `Ident`. 
- * The wrapper is replaced with the result of `subWrapper(, , )` */ - def transformWrappers(t: Tree, subWrapper: (String, Type, Tree, Tree) => Converted[ctx.type]): Tree = - { - // the main tree transformer that replaces calls to InputWrapper.wrap(x) with - // plain Idents that reference the actual input value - object appTransformer extends Transformer - { - override def transform(tree: Tree): Tree = - tree match { - case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => - subWrapper(nme.decoded, targ.tpe, qual, tree) match { - case Converted.Success(t, finalTx) => - changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 - finalTx(t) - case Converted.Failure(p,m) => ctx.abort(p, m) - case _: Converted.NotApplicable[_] => super.transform(tree) - } - case _ => super.transform(tree) - } - } - appTransformer.atOwner(initialOwner) { - appTransformer.transform(t) - } - } + /** + * Substitutes wrappers in tree `t` with the result of `subWrapper`. + * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(, , .target) returns true. + * Typically, `f` is a `Select` or `Ident`. 
+ * The wrapper is replaced with the result of `subWrapper(, , )` + */ + def transformWrappers(t: Tree, subWrapper: (String, Type, Tree, Tree) => Converted[ctx.type]): Tree = + { + // the main tree transformer that replaces calls to InputWrapper.wrap(x) with + // plain Idents that reference the actual input value + object appTransformer extends Transformer { + override def transform(tree: Tree): Tree = + tree match { + case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => + subWrapper(nme.decoded, targ.tpe, qual, tree) match { + case Converted.Success(t, finalTx) => + changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 + finalTx(t) + case Converted.Failure(p, m) => ctx.abort(p, m) + case _: Converted.NotApplicable[_] => super.transform(tree) + } + case _ => super.transform(tree) + } + } + appTransformer.atOwner(initialOwner) { + appTransformer.transform(t) + } + } } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala b/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala index 6dedf776b..3a2e562a6 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala @@ -1,38 +1,37 @@ package sbt package appmacro - import scala.reflect._ - import macros._ - import Types.idFun +import scala.reflect._ +import macros._ +import Types.idFun -abstract class Convert -{ - def apply[T: c.WeakTypeTag](c: Context)(nme: String, in: c.Tree): Converted[c.type] - def asPredicate(c: Context): (String, c.Type, c.Tree) => Boolean = - (n,tpe,tree) => { - val tag = c.WeakTypeTag(tpe) - apply(c)(n,tree)(tag).isSuccess - } +abstract class Convert { + def apply[T: c.WeakTypeTag](c: Context)(nme: String, in: c.Tree): Converted[c.type] + def asPredicate(c: Context): (String, c.Type, c.Tree) => Boolean = + (n, tpe, tree) => { + val tag = c.WeakTypeTag(tpe) + apply(c)(n, tree)(tag).isSuccess + } } sealed trait Converted[C <: Context with Singleton] { - 
def isSuccess: Boolean - def transform(f: C#Tree => C#Tree): Converted[C] + def isSuccess: Boolean + def transform(f: C#Tree => C#Tree): Converted[C] } object Converted { - def NotApplicable[C <: Context with Singleton] = new NotApplicable[C] - final case class Failure[C <: Context with Singleton](position: C#Position, message: String) extends Converted[C] { - def isSuccess = false - def transform(f: C#Tree => C#Tree): Converted[C] = new Failure(position, message) - } - final class NotApplicable[C <: Context with Singleton] extends Converted[C] { - def isSuccess = false - def transform(f: C#Tree => C#Tree): Converted[C] = this - } - final case class Success[C <: Context with Singleton](tree: C#Tree, finalTransform: C#Tree => C#Tree) extends Converted[C] { - def isSuccess = true - def transform(f: C#Tree => C#Tree): Converted[C] = Success(f(tree), finalTransform) - } - object Success { - def apply[C <: Context with Singleton](tree: C#Tree): Success[C] = Success(tree, idFun) - } + def NotApplicable[C <: Context with Singleton] = new NotApplicable[C] + final case class Failure[C <: Context with Singleton](position: C#Position, message: String) extends Converted[C] { + def isSuccess = false + def transform(f: C#Tree => C#Tree): Converted[C] = new Failure(position, message) + } + final class NotApplicable[C <: Context with Singleton] extends Converted[C] { + def isSuccess = false + def transform(f: C#Tree => C#Tree): Converted[C] = this + } + final case class Success[C <: Context with Singleton](tree: C#Tree, finalTransform: C#Tree => C#Tree) extends Converted[C] { + def isSuccess = true + def transform(f: C#Tree => C#Tree): Converted[C] = Success(f(tree), finalTransform) + } + object Success { + def apply[C <: Context with Singleton](tree: C#Tree): Success[C] = Success(tree, idFun) + } } \ No newline at end of file diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala index 043ad8731..7a63feca5 
100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala @@ -1,214 +1,210 @@ package sbt package appmacro - import Classes.Applicative - import Types.Id +import Classes.Applicative +import Types.Id -/** The separate hierarchy from Applicative/Monad is for two reasons. -* -* 1. The type constructor is represented as an abstract type because a TypeTag cannot represent a type constructor directly. -* 2. The applicative interface is uncurried. -*/ -trait Instance -{ - type M[x] - def app[K[L[x]], Z](in: K[M], f: K[Id] => Z)(implicit a: AList[K]): M[Z] - def map[S,T](in: M[S], f: S => T): M[T] - def pure[T](t: () => T): M[T] +/** + * The separate hierarchy from Applicative/Monad is for two reasons. + * + * 1. The type constructor is represented as an abstract type because a TypeTag cannot represent a type constructor directly. + * 2. The applicative interface is uncurried. + */ +trait Instance { + type M[x] + def app[K[L[x]], Z](in: K[M], f: K[Id] => Z)(implicit a: AList[K]): M[Z] + def map[S, T](in: M[S], f: S => T): M[T] + def pure[T](t: () => T): M[T] } -trait MonadInstance extends Instance -{ - def flatten[T](in: M[M[T]]): M[T] +trait MonadInstance extends Instance { + def flatten[T](in: M[M[T]]): M[T] } - import scala.reflect._ - import macros._ - import reflect.internal.annotations.compileTimeOnly +import scala.reflect._ +import macros._ +import reflect.internal.annotations.compileTimeOnly -object Instance -{ - final val ApplyName = "app" - final val FlattenName = "flatten" - final val PureName = "pure" - final val MapName = "map" - final val InstanceTCName = "M" +object Instance { + final val ApplyName = "app" + final val FlattenName = "flatten" + final val PureName = "pure" + final val MapName = "map" + final val InstanceTCName = "M" - final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef) - trait Transform[C <: Context with Singleton, 
N[_]] { - def apply(in: C#Tree): C#Tree - } - def idTransform[C <: Context with Singleton]: Transform[C,Id] = new Transform[C,Id] { - def apply(in: C#Tree): C#Tree = in - } + final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef) + trait Transform[C <: Context with Singleton, N[_]] { + def apply(in: C#Tree): C#Tree + } + def idTransform[C <: Context with Singleton]: Transform[C, Id] = new Transform[C, Id] { + def apply(in: C#Tree): C#Tree = in + } - /** Implementation of a macro that provides a direct syntax for applicative functors and monads. - * It is intended to be used in conjunction with another macro that conditions the inputs. - * - * This method processes the Tree `t` to find inputs of the form `wrap[T]( input )` - * This form is typically constructed by another macro that pretends to be able to get a value of type `T` - * from a value convertible to `M[T]`. This `wrap(input)` form has two main purposes. - * First, it identifies the inputs that should be transformed. - * Second, it allows the input trees to be wrapped for later conversion into the appropriate `M[T]` type by `convert`. - * This wrapping is necessary because applying the first macro must preserve the original type, - * but it is useful to delay conversion until the outer, second macro is called. The `wrap` method accomplishes this by - * allowing the original `Tree` and `Type` to be hidden behind the raw `T` type. This method will remove the call to `wrap` - * so that it is not actually called at runtime. - * - * Each `input` in each expression of the form `wrap[T]( input )` is transformed by `convert`. - * This transformation converts the input Tree to a Tree of type `M[T]`. - * The original wrapped expression `wrap(input)` is replaced by a reference to a new local `val $x: T`, where `$x` is a fresh name. - * These converted inputs are passed to `builder` as well as the list of these synthetic `ValDef`s. 
- * The `TupleBuilder` instance constructs a tuple (Tree) from the inputs and defines the right hand side of the vals - * that unpacks the tuple containing the results of the inputs. - * - * The constructed tuple of inputs and the code that unpacks the results of the inputs are then passed to the `i`, - * which is an implementation of `Instance` that is statically accessible. - * An Instance defines a applicative functor associated with a specific type constructor and, if it implements MonadInstance as well, a monad. - * Typically, it will be either a top-level module or a stable member of a top-level module (such as a val or a nested module). - * The `with Singleton` part of the type verifies some cases at macro compilation time, - * while the full check for static accessibility is done at macro expansion time. - * Note: Ideally, the types would verify that `i: MonadInstance` when `t.isRight`. - * With the various dependent types involved, this is not worth it. - * - * The `t` argument is the argument of the macro that will be transformed as described above. - * If the macro that calls this method is for a multi-input map (app followed by map), - * `t` should be the argument wrapped in Left. - * If this is for multi-input flatMap (app followed by flatMap), - * this should be the argument wrapped in Right. - */ - def contImpl[T,N[_]](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type,N])( - implicit tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type]): c.Expr[i.M[N[T]]] = - { - import c.universe.{Apply=>ApplyTree,_} + /** + * Implementation of a macro that provides a direct syntax for applicative functors and monads. + * It is intended to be used in conjunction with another macro that conditions the inputs. 
+ * + * This method processes the Tree `t` to find inputs of the form `wrap[T]( input )` + * This form is typically constructed by another macro that pretends to be able to get a value of type `T` + * from a value convertible to `M[T]`. This `wrap(input)` form has two main purposes. + * First, it identifies the inputs that should be transformed. + * Second, it allows the input trees to be wrapped for later conversion into the appropriate `M[T]` type by `convert`. + * This wrapping is necessary because applying the first macro must preserve the original type, + * but it is useful to delay conversion until the outer, second macro is called. The `wrap` method accomplishes this by + * allowing the original `Tree` and `Type` to be hidden behind the raw `T` type. This method will remove the call to `wrap` + * so that it is not actually called at runtime. + * + * Each `input` in each expression of the form `wrap[T]( input )` is transformed by `convert`. + * This transformation converts the input Tree to a Tree of type `M[T]`. + * The original wrapped expression `wrap(input)` is replaced by a reference to a new local `val $x: T`, where `$x` is a fresh name. + * These converted inputs are passed to `builder` as well as the list of these synthetic `ValDef`s. + * The `TupleBuilder` instance constructs a tuple (Tree) from the inputs and defines the right hand side of the vals + * that unpacks the tuple containing the results of the inputs. + * + * The constructed tuple of inputs and the code that unpacks the results of the inputs are then passed to the `i`, + * which is an implementation of `Instance` that is statically accessible. + * An Instance defines a applicative functor associated with a specific type constructor and, if it implements MonadInstance as well, a monad. + * Typically, it will be either a top-level module or a stable member of a top-level module (such as a val or a nested module). 
+ * The `with Singleton` part of the type verifies some cases at macro compilation time, + * while the full check for static accessibility is done at macro expansion time. + * Note: Ideally, the types would verify that `i: MonadInstance` when `t.isRight`. + * With the various dependent types involved, this is not worth it. + * + * The `t` argument is the argument of the macro that will be transformed as described above. + * If the macro that calls this method is for a multi-input map (app followed by map), + * `t` should be the argument wrapped in Left. + * If this is for multi-input flatMap (app followed by flatMap), + * this should be the argument wrapped in Right. + */ + def contImpl[T, N[_]](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type, N])( + implicit tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type]): c.Expr[i.M[N[T]]] = + { + import c.universe.{ Apply => ApplyTree, _ } - val util = ContextUtil[c.type](c) - val mTC: Type = util.extractTC(i, InstanceTCName) - val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).normalize + val util = ContextUtil[c.type](c) + val mTC: Type = util.extractTC(i, InstanceTCName) + val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).normalize - // the tree for the macro argument - val (tree, treeType) = t match { - case Left(l) => (l.tree, nt.tpe.normalize) - case Right(r) => (r.tree, mttpe) - } - // the Symbol for the anonymous function passed to the appropriate Instance.map/flatMap/pure method - // this Symbol needs to be known up front so that it can be used as the owner of synthetic vals - val functionSym = util.functionSymbol(tree.pos) + // the tree for the macro argument + val (tree, treeType) = t match { + case Left(l) => (l.tree, nt.tpe.normalize) + case Right(r) => (r.tree, mttpe) + } + // the Symbol for the anonymous function passed to the appropriate Instance.map/flatMap/pure method + // this Symbol needs to be 
known up front so that it can be used as the owner of synthetic vals + val functionSym = util.functionSymbol(tree.pos) - val instanceSym = util.singleton(i) - // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... - val instance = Ident(instanceSym) + val instanceSym = util.singleton(i) + // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... + val instance = Ident(instanceSym) - val isWrapper: (String, Type, Tree) => Boolean = convert.asPredicate(c) + val isWrapper: (String, Type, Tree) => Boolean = convert.asPredicate(c) - // Local definitions `defs` in the macro. This is used to ensure references are to M instances defined outside of the macro call. - // Also `refCount` is the number of references, which is used to create the private, synthetic method containing the body - val defs = util.collectDefs(tree, isWrapper) - val checkQual: Tree => Unit = util.checkReferences(defs, isWrapper) + // Local definitions `defs` in the macro. This is used to ensure references are to M instances defined outside of the macro call. 
+ // Also `refCount` is the number of references, which is used to create the private, synthetic method containing the body + val defs = util.collectDefs(tree, isWrapper) + val checkQual: Tree => Unit = util.checkReferences(defs, isWrapper) - type In = Input[c.universe.type] - var inputs = List[In]() + type In = Input[c.universe.type] + var inputs = List[In]() + // transforms the original tree into calls to the Instance functions pure, map, ..., + // resulting in a value of type M[T] + def makeApp(body: Tree): Tree = + inputs match { + case Nil => pure(body) + case x :: Nil => single(body, x) + case xs => arbArity(body, xs) + } - // transforms the original tree into calls to the Instance functions pure, map, ..., - // resulting in a value of type M[T] - def makeApp(body: Tree): Tree = - inputs match { - case Nil => pure(body) - case x :: Nil => single(body, x) - case xs => arbArity(body, xs) - } + // no inputs, so construct M[T] via Instance.pure or pure+flatten + def pure(body: Tree): Tree = + { + val typeApplied = TypeApply(util.select(instance, PureName), TypeTree(treeType) :: Nil) + val f = util.createFunction(Nil, body, functionSym) + val p = ApplyTree(typeApplied, f :: Nil) + if (t.isLeft) p else flatten(p) + } + // m should have type M[M[T]] + // the returned Tree will have type M[T] + def flatten(m: Tree): Tree = + { + val typedFlatten = TypeApply(util.select(instance, FlattenName), TypeTree(tt.tpe) :: Nil) + ApplyTree(typedFlatten, m :: Nil) + } - // no inputs, so construct M[T] via Instance.pure or pure+flatten - def pure(body: Tree): Tree = - { - val typeApplied = TypeApply(util.select(instance, PureName), TypeTree(treeType) :: Nil) - val f = util.createFunction(Nil, body, functionSym) - val p = ApplyTree(typeApplied, f :: Nil) - if(t.isLeft) p else flatten(p) - } - // m should have type M[M[T]] - // the returned Tree will have type M[T] - def flatten(m: Tree): Tree = - { - val typedFlatten = TypeApply(util.select(instance, FlattenName), TypeTree(tt.tpe) 
:: Nil) - ApplyTree(typedFlatten, m :: Nil) - } + // calls Instance.map or flatmap directly, skipping the intermediate Instance.app that is unnecessary for a single input + def single(body: Tree, input: In): Tree = + { + val variable = input.local + val param = treeCopy.ValDef(variable, util.parameterModifiers, variable.name, variable.tpt, EmptyTree) + val typeApplied = TypeApply(util.select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) + val f = util.createFunction(param :: Nil, body, functionSym) + val mapped = ApplyTree(typeApplied, input.expr :: f :: Nil) + if (t.isLeft) mapped else flatten(mapped) + } - // calls Instance.map or flatmap directly, skipping the intermediate Instance.app that is unnecessary for a single input - def single(body: Tree, input: In): Tree = - { - val variable = input.local - val param = treeCopy.ValDef(variable, util.parameterModifiers, variable.name, variable.tpt, EmptyTree) - val typeApplied = TypeApply(util.select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) - val f = util.createFunction(param :: Nil, body, functionSym) - val mapped = ApplyTree(typeApplied, input.expr :: f :: Nil) - if(t.isLeft) mapped else flatten(mapped) - } + // calls Instance.app to get the values for all inputs and then calls Instance.map or flatMap to evaluate the body + def arbArity(body: Tree, inputs: List[In]): Tree = + { + val result = builder.make(c)(mTC, inputs) + val param = util.freshMethodParameter(appliedType(result.representationC, util.idTC :: Nil)) + val bindings = result.extract(param) + val f = util.createFunction(param :: Nil, Block(bindings, body), functionSym) + val ttt = TypeTree(treeType) + val typedApp = TypeApply(util.select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) + val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) + if (t.isLeft) app else flatten(app) + } - // calls Instance.app to get the values for all inputs and then calls 
Instance.map or flatMap to evaluate the body - def arbArity(body: Tree, inputs: List[In]): Tree = - { - val result = builder.make(c)(mTC, inputs) - val param = util.freshMethodParameter( appliedType(result.representationC, util.idTC :: Nil) ) - val bindings = result.extract(param) - val f = util.createFunction(param :: Nil, Block(bindings, body), functionSym) - val ttt = TypeTree(treeType) - val typedApp = TypeApply(util.select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) - val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) - if(t.isLeft) app else flatten(app) - } + // Called when transforming the tree to add an input. + // For `qual` of type M[A], and a `selection` qual.value, + // the call is addType(Type A, Tree qual) + // The result is a Tree representing a reference to + // the bound value of the input. + def addType(tpe: Type, qual: Tree, selection: Tree): Tree = + { + qual.foreach(checkQual) + val vd = util.freshValDef(tpe, qual.pos, functionSym) + inputs ::= new Input(tpe, qual, vd) + util.refVal(selection, vd) + } + def sub(name: String, tpe: Type, qual: Tree, replace: Tree): Converted[c.type] = + { + val tag = c.WeakTypeTag[T](tpe) + convert[T](c)(name, qual)(tag) transform { tree => + addType(tpe, tree, replace) + } + } - // Called when transforming the tree to add an input. - // For `qual` of type M[A], and a `selection` qual.value, - // the call is addType(Type A, Tree qual) - // The result is a Tree representing a reference to - // the bound value of the input. 
- def addType(tpe: Type, qual: Tree, selection: Tree): Tree = - { - qual.foreach(checkQual) - val vd = util.freshValDef(tpe, qual.pos, functionSym) - inputs ::= new Input(tpe, qual, vd) - util.refVal(selection, vd) - } - def sub(name: String, tpe: Type, qual: Tree, replace: Tree): Converted[c.type] = - { - val tag = c.WeakTypeTag[T](tpe) - convert[T](c)(name, qual)(tag) transform { tree => - addType(tpe, tree, replace) - } - } + // applies the transformation + val tx = util.transformWrappers(tree, (n, tpe, t, replace) => sub(n, tpe, t, replace)) + // resetting attributes must be: a) local b) done here and not wider or else there are obscure errors + val tr = makeApp(inner(tx)) + c.Expr[i.M[N[T]]](tr) + } - // applies the transformation - val tx = util.transformWrappers(tree, (n,tpe,t,replace) => sub(n,tpe,t,replace)) - // resetting attributes must be: a) local b) done here and not wider or else there are obscure errors - val tr = makeApp( inner(tx) ) - c.Expr[i.M[N[T]]](tr) - } + import Types._ - import Types._ + implicit def applicativeInstance[A[_]](implicit ap: Applicative[A]): Instance { type M[x] = A[x] } = new Instance { + type M[x] = A[x] + def app[K[L[x]], Z](in: K[A], f: K[Id] => Z)(implicit a: AList[K]) = a.apply[A, Z](in, f) + def map[S, T](in: A[S], f: S => T) = ap.map(f, in) + def pure[S](s: () => S): M[S] = ap.pure(s()) + } - implicit def applicativeInstance[A[_]](implicit ap: Applicative[A]): Instance { type M[x] = A[x] } = new Instance - { - type M[x] = A[x] - def app[ K[L[x]], Z ](in: K[A], f: K[Id] => Z)(implicit a: AList[K]) = a.apply[A,Z](in, f) - def map[S,T](in: A[S], f: S => T) = ap.map(f, in) - def pure[S](s: () => S): M[S] = ap.pure(s()) - } - - type AI[A[_]] = Instance { type M[x] = A[x] } - def compose[A[_], B[_]](implicit a: AI[A], b: AI[B]): Instance { type M[x] = A[B[x]] } = new Composed[A,B](a,b) - // made a public, named, unsealed class because of trouble with macros and inference when the Instance is not an object - class 
Composed[A[_], B[_]](a: AI[A], b: AI[B]) extends Instance - { - type M[x] = A[B[x]] - def pure[S](s: () => S): A[B[S]] = a.pure(() => b.pure(s)) - def map[S,T](in: M[S], f: S => T): M[T] = a.map(in, (bv: B[S]) => b.map(bv, f)) - def app[ K[L[x]], Z ](in: K[M], f: K[Id] => Z)(implicit alist: AList[K]): A[B[Z]] = - { - val g: K[B] => B[Z] = in => b.app[K, Z](in, f) - type Split[ L[x] ] = K[ (L ∙ B)#l ] - a.app[Split, B[Z]](in, g)(AList.asplit(alist)) - } - } + type AI[A[_]] = Instance { type M[x] = A[x] } + def compose[A[_], B[_]](implicit a: AI[A], b: AI[B]): Instance { type M[x] = A[B[x]] } = new Composed[A, B](a, b) + // made a public, named, unsealed class because of trouble with macros and inference when the Instance is not an object + class Composed[A[_], B[_]](a: AI[A], b: AI[B]) extends Instance { + type M[x] = A[B[x]] + def pure[S](s: () => S): A[B[S]] = a.pure(() => b.pure(s)) + def map[S, T](in: M[S], f: S => T): M[T] = a.map(in, (bv: B[S]) => b.map(bv, f)) + def app[K[L[x]], Z](in: K[M], f: K[Id] => Z)(implicit alist: AList[K]): A[B[Z]] = + { + val g: K[B] => B[Z] = in => b.app[K, Z](in, f) + type Split[L[x]] = K[(L ∙ B)#l] + a.app[Split, B[Z]](in, g)(AList.asplit(alist)) + } + } } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index d9dbebe42..b5c2878f3 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -1,72 +1,71 @@ package sbt package appmacro - import Types.Id - import scala.tools.nsc.Global - import scala.reflect._ - import macros._ +import Types.Id +import scala.tools.nsc.Global +import scala.reflect._ +import macros._ /** A `TupleBuilder` that uses a KList as the tuple representation.*/ -object KListBuilder extends TupleBuilder -{ - // TODO 2.11 Remove this after dropping 2.10.x support. - private object HasCompat { val compat = ??? 
}; import HasCompat._ +object KListBuilder extends TupleBuilder { + // TODO 2.11 Remove this after dropping 2.10.x support. + private object HasCompat { val compat = ??? }; import HasCompat._ - def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] - { - val ctx: c.type = c - val util = ContextUtil[c.type](c) - import c.universe.{Apply=>ApplyTree,_} - import compat._ - import util._ + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { + val ctx: c.type = c + val util = ContextUtil[c.type](c) + import c.universe.{ Apply => ApplyTree, _ } + import compat._ + import util._ - val knilType = c.typeOf[KNil] - val knil = Ident(knilType.typeSymbol.companionSymbol) - val kconsTpe = c.typeOf[KCons[Int,KNil,List]] - val kcons = kconsTpe.typeSymbol.companionSymbol - val mTC: Type = mt.asInstanceOf[c.universe.Type] - val kconsTC: Type = kconsTpe.typeConstructor + val knilType = c.typeOf[KNil] + val knil = Ident(knilType.typeSymbol.companionSymbol) + val kconsTpe = c.typeOf[KCons[Int, KNil, List]] + val kcons = kconsTpe.typeSymbol.companionSymbol + val mTC: Type = mt.asInstanceOf[c.universe.Type] + val kconsTC: Type = kconsTpe.typeConstructor - /** This is the L in the type function [L[x]] ... */ - val tcVariable: TypeSymbol = newTCVariable(util.initialOwner) + /** This is the L in the type function [L[x]] ... 
*/ + val tcVariable: TypeSymbol = newTCVariable(util.initialOwner) - /** Instantiates KCons[h, t <: KList[L], L], where L is the type constructor variable */ - def kconsType(h: Type, t: Type): Type = - appliedType(kconsTC, h :: t :: refVar(tcVariable) :: Nil) + /** Instantiates KCons[h, t <: KList[L], L], where L is the type constructor variable */ + def kconsType(h: Type, t: Type): Type = + appliedType(kconsTC, h :: t :: refVar(tcVariable) :: Nil) - def bindKList(prev: ValDef, revBindings: List[ValDef], params: List[ValDef]): List[ValDef] = - params match - { - case (x @ ValDef(mods, name, tpt, _)) :: xs => - val rhs = select(Ident(prev.name), "head") - val head = treeCopy.ValDef(x, mods, name, tpt, rhs) - util.setSymbol(head, x.symbol) - val tail = localValDef(TypeTree(), select(Ident(prev.name), "tail")) - val base = head :: revBindings - bindKList(tail, if(xs.isEmpty) base else tail :: base, xs) - case Nil => revBindings.reverse - } + def bindKList(prev: ValDef, revBindings: List[ValDef], params: List[ValDef]): List[ValDef] = + params match { + case (x @ ValDef(mods, name, tpt, _)) :: xs => + val rhs = select(Ident(prev.name), "head") + val head = treeCopy.ValDef(x, mods, name, tpt, rhs) + util.setSymbol(head, x.symbol) + val tail = localValDef(TypeTree(), select(Ident(prev.name), "tail")) + val base = head :: revBindings + bindKList(tail, if (xs.isEmpty) base else tail :: base, xs) + case Nil => revBindings.reverse + } - private[this] def makeKList(revInputs: Inputs[c.universe.type], klist: Tree, klistType: Type): Tree = - revInputs match { - case in :: tail => - val next = ApplyTree(TypeApply(Ident(kcons), TypeTree(in.tpe) :: TypeTree(klistType) :: TypeTree(mTC) :: Nil), in.expr :: klist :: Nil) - makeKList(tail, next, appliedType(kconsTC, in.tpe :: klistType :: mTC :: Nil)) - case Nil => klist - } + private[this] def makeKList(revInputs: Inputs[c.universe.type], klist: Tree, klistType: Type): Tree = + revInputs match { + case in :: tail => + val next = 
ApplyTree(TypeApply(Ident(kcons), TypeTree(in.tpe) :: TypeTree(klistType) :: TypeTree(mTC) :: Nil), in.expr :: klist :: Nil) + makeKList(tail, next, appliedType(kconsTC, in.tpe :: klistType :: mTC :: Nil)) + case Nil => klist + } - /** The input trees combined in a KList */ - val klist = makeKList(inputs.reverse, knil, knilType) + /** The input trees combined in a KList */ + val klist = makeKList(inputs.reverse, knil, knilType) - /** The input types combined in a KList type. The main concern is tracking the heterogeneous types. - * The type constructor is tcVariable, so that it can be applied to [X] X or M later. - * When applied to `M`, this type gives the type of the `input` KList. */ - val klistType: Type = (inputs :\ knilType)( (in, klist) => kconsType(in.tpe, klist) ) + /** + * The input types combined in a KList type. The main concern is tracking the heterogeneous types. + * The type constructor is tcVariable, so that it can be applied to [X] X or M later. + * When applied to `M`, this type gives the type of the `input` KList. 
+ */ + val klistType: Type = (inputs :\ knilType)((in, klist) => kconsType(in.tpe, klist)) - val representationC = PolyType(tcVariable :: Nil, klistType) - val resultType = appliedType(representationC, idTC :: Nil) - val input = klist - val alistInstance: ctx.universe.Tree = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) - def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) - } + val representationC = PolyType(tcVariable :: Nil, klistType) + val resultType = appliedType(representationC, idTC :: Nil) + val input = klist + val alistInstance: ctx.universe.Tree = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) + def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) + } } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala index e58adb2b0..019dc8b20 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala @@ -1,16 +1,17 @@ package sbt package appmacro - import scala.reflect._ - import macros._ +import scala.reflect._ +import macros._ -/** A builder that uses `TupleN` as the representation for small numbers of inputs (up to `TupleNBuilder.MaxInputs`) -* and `KList` for larger numbers of inputs. This builder cannot handle fewer than 2 inputs.*/ -object MixedBuilder extends TupleBuilder -{ - def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = - { - val delegate = if(inputs.size > TupleNBuilder.MaxInputs) KListBuilder else TupleNBuilder - delegate.make(c)(mt, inputs) - } +/** + * A builder that uses `TupleN` as the representation for small numbers of inputs (up to `TupleNBuilder.MaxInputs`) + * and `KList` for larger numbers of inputs. This builder cannot handle fewer than 2 inputs. 
+ */ +object MixedBuilder extends TupleBuilder { + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = + { + val delegate = if (inputs.size > TupleNBuilder.MaxInputs) KListBuilder else TupleNBuilder + delegate.make(c)(mt, inputs) + } } \ No newline at end of file diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala index f6442cb02..a6ea2d84c 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala @@ -1,56 +1,57 @@ package sbt package appmacro - import Types.Id - import scala.tools.nsc.Global - import scala.reflect._ - import macros._ +import Types.Id +import scala.tools.nsc.Global +import scala.reflect._ +import macros._ -/** -* A `TupleBuilder` abstracts the work of constructing a tuple data structure such as a `TupleN` or `KList` -* and extracting values from it. The `Instance` macro implementation will (roughly) traverse the tree of its argument -* and ultimately obtain a list of expressions with type `M[T]` for different types `T`. -* The macro constructs an `Input` value for each of these expressions that contains the `Type` for `T`, -* the `Tree` for the expression, and a `ValDef` that will hold the value for the input. -* -* `TupleBuilder.apply` is provided with the list of `Input`s and is expected to provide three values in the returned BuilderResult. -* First, it returns the constructed tuple data structure Tree in `input`. -* Next, it provides the type constructor `representationC` that, when applied to M, gives the type of tuple data structure. -* For example, a builder that constructs a `Tuple3` for inputs `M[Int]`, `M[Boolean]`, and `M[String]` -* would provide a Type representing `[L[x]] (L[Int], L[Boolean], L[String])`. 
The `input` method -* would return a value whose type is that type constructor applied to M, or `(M[Int], M[Boolean], M[String])`. -* -* Finally, the `extract` method provides a list of vals that extract information from the applied input. -* The type of the applied input is the type constructor applied to `Id` (`[X] X`). -* The returned list of ValDefs should be the ValDefs from `inputs`, but with non-empty right-hand sides. -*/ +/** + * A `TupleBuilder` abstracts the work of constructing a tuple data structure such as a `TupleN` or `KList` + * and extracting values from it. The `Instance` macro implementation will (roughly) traverse the tree of its argument + * and ultimately obtain a list of expressions with type `M[T]` for different types `T`. + * The macro constructs an `Input` value for each of these expressions that contains the `Type` for `T`, + * the `Tree` for the expression, and a `ValDef` that will hold the value for the input. + * + * `TupleBuilder.apply` is provided with the list of `Input`s and is expected to provide three values in the returned BuilderResult. + * First, it returns the constructed tuple data structure Tree in `input`. + * Next, it provides the type constructor `representationC` that, when applied to M, gives the type of tuple data structure. + * For example, a builder that constructs a `Tuple3` for inputs `M[Int]`, `M[Boolean]`, and `M[String]` + * would provide a Type representing `[L[x]] (L[Int], L[Boolean], L[String])`. The `input` method + * would return a value whose type is that type constructor applied to M, or `(M[Int], M[Boolean], M[String])`. + * + * Finally, the `extract` method provides a list of vals that extract information from the applied input. + * The type of the applied input is the type constructor applied to `Id` (`[X] X`). + * The returned list of ValDefs should be the ValDefs from `inputs`, but with non-empty right-hand sides. 
+ */ trait TupleBuilder { - /** A convenience alias for a list of inputs (associated with a Universe of type U). */ - type Inputs[U <: Universe with Singleton] = List[Instance.Input[U]] + /** A convenience alias for a list of inputs (associated with a Universe of type U). */ + type Inputs[U <: Universe with Singleton] = List[Instance.Input[U]] - /** Constructs a one-time use Builder for Context `c` and type constructor `tcType`. */ - def make(c: Context)(tcType: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] + /** Constructs a one-time use Builder for Context `c` and type constructor `tcType`. */ + def make(c: Context)(tcType: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] } -trait BuilderResult[C <: Context with Singleton] -{ - val ctx: C - import ctx.universe._ +trait BuilderResult[C <: Context with Singleton] { + val ctx: C + import ctx.universe._ - /** Represents the higher-order type constructor `[L[x]] ...` where `...` is the - * type of the data structure containing the added expressions, - * except that it is abstracted over the type constructor applied to each heterogeneous part of the type . */ - def representationC: PolyType + /** + * Represents the higher-order type constructor `[L[x]] ...` where `...` is the + * type of the data structure containing the added expressions, + * except that it is abstracted over the type constructor applied to each heterogeneous part of the type . + */ + def representationC: PolyType - /** The instance of AList for the input. For a `representationC` of `[L[x]]`, this `Tree` should have a `Type` of `AList[L]`*/ - def alistInstance: Tree + /** The instance of AList for the input. For a `representationC` of `[L[x]]`, this `Tree` should have a `Type` of `AList[L]`*/ + def alistInstance: Tree - /** Returns the completed value containing all expressions added to the builder. */ - def input: Tree + /** Returns the completed value containing all expressions added to the builder. 
*/ + def input: Tree - /* The list of definitions that extract values from a value of type `$representationC[Id]`. + /* The list of definitions that extract values from a value of type `$representationC[Id]`. * The returned value should be identical to the `ValDef`s provided to the `TupleBuilder.make` method but with * non-empty right hand sides. Each `ValDef` may refer to `param` and previous `ValDef`s in the list.*/ - def extract(param: ValDef): List[ValDef] + def extract(param: ValDef): List[ValDef] } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala index 28fa581a4..232174c81 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala @@ -1,57 +1,56 @@ package sbt package appmacro - import Types.Id - import scala.tools.nsc.Global - import scala.reflect._ - import macros._ +import Types.Id +import scala.tools.nsc.Global +import scala.reflect._ +import macros._ -/** A builder that uses a TupleN as the tuple representation. -* It is limited to tuples of size 2 to `MaxInputs`. */ -object TupleNBuilder extends TupleBuilder -{ - /** The largest number of inputs that this builder can handle. */ - final val MaxInputs = 11 - final val TupleMethodName = "tuple" +/** + * A builder that uses a TupleN as the tuple representation. + * It is limited to tuples of size 2 to `MaxInputs`. + */ +object TupleNBuilder extends TupleBuilder { + /** The largest number of inputs that this builder can handle. */ + final val MaxInputs = 11 + final val TupleMethodName = "tuple" - // TODO 2.11 Remove this after dropping 2.10.x support. - private object HasCompat { val compat = ??? }; import HasCompat._ + // TODO 2.11 Remove this after dropping 2.10.x support. + private object HasCompat { val compat = ??? 
}; import HasCompat._ - def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] - { - val util = ContextUtil[c.type](c) - import c.universe.{Apply=>ApplyTree,_} - import compat._ - import util._ + def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { + val util = ContextUtil[c.type](c) + import c.universe.{ Apply => ApplyTree, _ } + import compat._ + import util._ - val global: Global = c.universe.asInstanceOf[Global] - val mTC: Type = mt.asInstanceOf[c.universe.Type] + val global: Global = c.universe.asInstanceOf[Global] + val mTC: Type = mt.asInstanceOf[c.universe.Type] - val ctx: c.type = c - val representationC: PolyType = { - val tcVariable: Symbol = newTCVariable(util.initialOwner) - val tupleTypeArgs = inputs.map(in => typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) - val tuple = global.definitions.tupleType(tupleTypeArgs) - PolyType(tcVariable :: Nil, tuple.asInstanceOf[Type] ) - } - val resultType = appliedType(representationC, idTC :: Nil) + val ctx: c.type = c + val representationC: PolyType = { + val tcVariable: Symbol = newTCVariable(util.initialOwner) + val tupleTypeArgs = inputs.map(in => typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) + val tuple = global.definitions.tupleType(tupleTypeArgs) + PolyType(tcVariable :: Nil, tuple.asInstanceOf[Type]) + } + val resultType = appliedType(representationC, idTC :: Nil) - val input: Tree = mkTuple(inputs.map(_.expr)) - val alistInstance: Tree = { - val selectTree = select(Ident(alist), TupleMethodName + inputs.size.toString) - TypeApply(selectTree, inputs.map(in => TypeTree(in.tpe))) - } - def extract(param: ValDef): List[ValDef] = bindTuple(param, Nil, inputs.map(_.local), 1) + val input: Tree = mkTuple(inputs.map(_.expr)) + val alistInstance: Tree = { + val selectTree = select(Ident(alist), TupleMethodName + inputs.size.toString) + 
TypeApply(selectTree, inputs.map(in => TypeTree(in.tpe))) + } + def extract(param: ValDef): List[ValDef] = bindTuple(param, Nil, inputs.map(_.local), 1) - def bindTuple(param: ValDef, revBindings: List[ValDef], params: List[ValDef], i: Int): List[ValDef] = - params match - { - case (x @ ValDef(mods, name, tpt, _)) :: xs => - val rhs = select(Ident(param.name), "_" + i.toString) - val newVal = treeCopy.ValDef(x, mods, name, tpt, rhs) - util.setSymbol(newVal, x.symbol) - bindTuple(param, newVal :: revBindings, xs, i+1) - case Nil => revBindings.reverse - } - } + def bindTuple(param: ValDef, revBindings: List[ValDef], params: List[ValDef], i: Int): List[ValDef] = + params match { + case (x @ ValDef(mods, name, tpt, _)) :: xs => + val rhs = select(Ident(param.name), "_" + i.toString) + val newVal = treeCopy.ValDef(x, mods, name, tpt, rhs) + util.setSymbol(newVal, x.symbol) + bindTuple(param, newVal :: revBindings, xs, i + 1) + case Nil => revBindings.reverse + } + } } diff --git a/util/collection/src/main/scala/sbt/AList.scala b/util/collection/src/main/scala/sbt/AList.scala index 1bc361e0d..10e1454e7 100644 --- a/util/collection/src/main/scala/sbt/AList.scala +++ b/util/collection/src/main/scala/sbt/AList.scala @@ -1,217 +1,212 @@ package sbt - import Classes.Applicative - import Types._ +import Classes.Applicative +import Types._ -/** An abstraction over a higher-order type constructor `K[x[y]]` with the purpose of abstracting -* over heterogeneous sequences like `KList` and `TupleN` with elements with a common type -* constructor as well as homogeneous sequences `Seq[M[T]]`. 
*/ -trait AList[K[L[x]] ] -{ - def transform[M[_], N[_]](value: K[M], f: M ~> N): K[N] - def traverse[M[_], N[_], P[_]](value: K[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[K[P]] - def foldr[M[_], A](value: K[M], f: (M[_], A) => A, init: A): A +/** + * An abstraction over a higher-order type constructor `K[x[y]]` with the purpose of abstracting + * over heterogeneous sequences like `KList` and `TupleN` with elements with a common type + * constructor as well as homogeneous sequences `Seq[M[T]]`. + */ +trait AList[K[L[x]]] { + def transform[M[_], N[_]](value: K[M], f: M ~> N): K[N] + def traverse[M[_], N[_], P[_]](value: K[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[K[P]] + def foldr[M[_], A](value: K[M], f: (M[_], A) => A, init: A): A - def toList[M[_]](value: K[M]): List[M[_]] = foldr[M, List[M[_]]](value, _ :: _, Nil) - def apply[M[_], C](value: K[M], f: K[Id] => C)(implicit a: Applicative[M]): M[C] = - a.map(f, traverse[M, M, Id](value, idK[M])(a)) + def toList[M[_]](value: K[M]): List[M[_]] = foldr[M, List[M[_]]](value, _ :: _, Nil) + def apply[M[_], C](value: K[M], f: K[Id] => C)(implicit a: Applicative[M]): M[C] = + a.map(f, traverse[M, M, Id](value, idK[M])(a)) } -object AList -{ - type Empty = AList[({ type l[L[x]] = Unit})#l] - /** AList for Unit, which represents a sequence that is always empty.*/ - val empty: Empty = new Empty { - def transform[M[_], N[_]](in: Unit, f: M ~> N) = () - def foldr[M[_], T](in: Unit, f: (M[_], T) => T, init: T) = init - override def apply[M[_], C](in: Unit, f: Unit => C)(implicit app: Applicative[M]): M[C] = app.pure( f( () ) ) - def traverse[M[_], N[_], P[_]](in: Unit, f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Unit] = np.pure( () ) - } +object AList { + type Empty = AList[({ type l[L[x]] = Unit })#l] + /** AList for Unit, which represents a sequence that is always empty.*/ + val empty: Empty = new Empty { + def transform[M[_], N[_]](in: Unit, f: M ~> N) = () + def foldr[M[_], T](in: Unit, f: 
(M[_], T) => T, init: T) = init + override def apply[M[_], C](in: Unit, f: Unit => C)(implicit app: Applicative[M]): M[C] = app.pure(f(())) + def traverse[M[_], N[_], P[_]](in: Unit, f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Unit] = np.pure(()) + } - type SeqList[T] = AList[({ type l[L[x]] = List[L[T]] })#l] - /** AList for a homogeneous sequence. */ - def seq[T]: SeqList[T] = new SeqList[T] - { - def transform[M[_], N[_]](s: List[M[T]], f: M ~> N) = s.map(f.fn[T]) - def foldr[M[_], A](s: List[M[T]], f: (M[_], A) => A, init: A): A = (init /: s.reverse)( (t, m) => f(m,t)) - override def apply[M[_], C](s: List[M[T]], f: List[T] => C)(implicit ap: Applicative[M]): M[C] = - { - def loop[V](in: List[M[T]], g: List[T] => V): M[V] = - in match { - case Nil => ap.pure(g(Nil)) - case x :: xs => - val h = (ts: List[T]) => (t: T) => g(t :: ts) - ap.apply( loop(xs, h), x ) - } - loop(s, f) - } - def traverse[M[_], N[_], P[_]](s: List[M[T]], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[List[P[T]]] = ??? - } - - /** AList for the abitrary arity data structure KList. */ - def klist[KL[M[_]] <: KList[M] { type Transform[N[_]] = KL[N] }]: AList[KL] = new AList[KL] { - def transform[M[_], N[_]](k: KL[M], f: M ~> N) = k.transform(f) - def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) - override def apply[M[_], C](k: KL[M], f: KL[Id] => C)(implicit app: Applicative[M]): M[C] = k.apply(f)(app) - def traverse[M[_], N[_], P[_]](k: KL[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KL[P]] = k.traverse[N,P](f)(np) - override def toList[M[_]](k: KL[M]) = k.toList - } - - /** AList for a single value. 
*/ - type Single[A] = AList[({ type l[L[x]] = L[A]})#l] - def single[A]: Single[A] = new Single[A] { - def transform[M[_], N[_]](a: M[A], f: M ~> N) = f(a) - def foldr[M[_], T](a: M[A], f: (M[_], T) => T, init: T): T = f(a, init) - def traverse[M[_], N[_], P[_]](a: M[A], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[P[A]] = f(a) - } - type ASplit[K[L[x]], B[x]] = AList[ ({ type l[L[x]] = K[ (L ∙ B)#l] })#l ] - /** AList that operates on the outer type constructor `A` of a composition `[x] A[B[x]]` for type constructors `A` and `B`*/ - def asplit[ K[L[x]], B[x] ](base: AList[K]): ASplit[K,B] = new ASplit[K, B] - { - type Split[ L[x] ] = K[ (L ∙ B)#l ] - def transform[M[_], N[_]](value: Split[M], f: M ~> N): Split[N] = - base.transform[(M ∙ B)#l, (N ∙ B)#l](value, nestCon[M,N,B](f)) - - def traverse[M[_], N[_], P[_]](value: Split[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Split[P]] = + type SeqList[T] = AList[({ type l[L[x]] = List[L[T]] })#l] + /** AList for a homogeneous sequence. */ + def seq[T]: SeqList[T] = new SeqList[T] { + def transform[M[_], N[_]](s: List[M[T]], f: M ~> N) = s.map(f.fn[T]) + def foldr[M[_], A](s: List[M[T]], f: (M[_], A) => A, init: A): A = (init /: s.reverse)((t, m) => f(m, t)) + override def apply[M[_], C](s: List[M[T]], f: List[T] => C)(implicit ap: Applicative[M]): M[C] = + { + def loop[V](in: List[M[T]], g: List[T] => V): M[V] = + in match { + case Nil => ap.pure(g(Nil)) + case x :: xs => + val h = (ts: List[T]) => (t: T) => g(t :: ts) + ap.apply(loop(xs, h), x) + } + loop(s, f) + } + def traverse[M[_], N[_], P[_]](s: List[M[T]], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[List[P[T]]] = ??? + } + + /** AList for the arbitrary-arity data structure KList.
*/ + def klist[KL[M[_]] <: KList[M] { type Transform[N[_]] = KL[N] }]: AList[KL] = new AList[KL] { + def transform[M[_], N[_]](k: KL[M], f: M ~> N) = k.transform(f) + def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) + override def apply[M[_], C](k: KL[M], f: KL[Id] => C)(implicit app: Applicative[M]): M[C] = k.apply(f)(app) + def traverse[M[_], N[_], P[_]](k: KL[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KL[P]] = k.traverse[N, P](f)(np) + override def toList[M[_]](k: KL[M]) = k.toList + } + + /** AList for a single value. */ + type Single[A] = AList[({ type l[L[x]] = L[A] })#l] + def single[A]: Single[A] = new Single[A] { + def transform[M[_], N[_]](a: M[A], f: M ~> N) = f(a) + def foldr[M[_], T](a: M[A], f: (M[_], T) => T, init: T): T = f(a, init) + def traverse[M[_], N[_], P[_]](a: M[A], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[P[A]] = f(a) + } + + type ASplit[K[L[x]], B[x]] = AList[({ type l[L[x]] = K[(L ∙ B)#l] })#l] + /** AList that operates on the outer type constructor `A` of a composition `[x] A[B[x]]` for type constructors `A` and `B`*/ + def asplit[K[L[x]], B[x]](base: AList[K]): ASplit[K, B] = new ASplit[K, B] { + type Split[L[x]] = K[(L ∙ B)#l] + def transform[M[_], N[_]](value: Split[M], f: M ~> N): Split[N] = + base.transform[(M ∙ B)#l, (N ∙ B)#l](value, nestCon[M, N, B](f)) + + def traverse[M[_], N[_], P[_]](value: Split[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Split[P]] = + { + val g = nestCon[M, (N ∙ P)#l, B](f) + base.traverse[(M ∙ B)#l, N, (P ∙ B)#l](value, g)(np) } - def foldr[M[_], A](value: Split[M], f: (M[_], A) => A, init: A): A = - base.foldr[(M ∙ B)#l, A](value, f, init) - } + def foldr[M[_], A](value: Split[M], f: (M[_], A) => A, init: A): A = + base.foldr[(M ∙ B)#l, A](value, f, init) + } - // TODO: auto-generate - sealed trait T2K[A,B] { type l[L[x]] = (L[A], L[B]) } - type T2List[A,B] = AList[T2K[A,B]#l] - def tuple2[A, B]: T2List[A,B] = new T2List[A,B] - { - type 
T2[M[_]] = (M[A], M[B]) - def transform[M[_], N[_]](t: T2[M], f: M ~> N): T2[N] = (f(t._1), f(t._2)) - def foldr[M[_], T](t: T2[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, init)) - def traverse[M[_], N[_], P[_]](t: T2[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T2[P]] = - { - val g = (Tuple2.apply[P[A], P[B]] _).curried - np.apply( np.map(g, f(t._1)), f(t._2) ) - } - } + // TODO: auto-generate + sealed trait T2K[A, B] { type l[L[x]] = (L[A], L[B]) } + type T2List[A, B] = AList[T2K[A, B]#l] + def tuple2[A, B]: T2List[A, B] = new T2List[A, B] { + type T2[M[_]] = (M[A], M[B]) + def transform[M[_], N[_]](t: T2[M], f: M ~> N): T2[N] = (f(t._1), f(t._2)) + def foldr[M[_], T](t: T2[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, init)) + def traverse[M[_], N[_], P[_]](t: T2[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T2[P]] = + { + val g = (Tuple2.apply[P[A], P[B]] _).curried + np.apply(np.map(g, f(t._1)), f(t._2)) + } + } - sealed trait T3K[A,B,C] { type l[L[x]] = (L[A], L[B], L[C]) } - type T3List[A,B,C] = AList[T3K[A,B,C]#l] - def tuple3[A, B, C]: T3List[A,B,C] = new T3List[A,B,C] - { - type T3[M[_]] = (M[A], M[B], M[C]) - def transform[M[_], N[_]](t: T3[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3)) - def foldr[M[_], T](t: T3[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, init))) - def traverse[M[_], N[_], P[_]](t: T3[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T3[P]] = - { - val g = (Tuple3.apply[P[A],P[B],P[C]] _).curried - np.apply( np.apply( np.map(g, f(t._1)), f(t._2) ), f(t._3) ) - } - } + sealed trait T3K[A, B, C] { type l[L[x]] = (L[A], L[B], L[C]) } + type T3List[A, B, C] = AList[T3K[A, B, C]#l] + def tuple3[A, B, C]: T3List[A, B, C] = new T3List[A, B, C] { + type T3[M[_]] = (M[A], M[B], M[C]) + def transform[M[_], N[_]](t: T3[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3)) + def foldr[M[_], T](t: T3[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, init))) + def traverse[M[_], N[_], P[_]](t: 
T3[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T3[P]] = + { + val g = (Tuple3.apply[P[A], P[B], P[C]] _).curried + np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)) + } + } - sealed trait T4K[A,B,C,D] { type l[L[x]] = (L[A], L[B], L[C], L[D]) } - type T4List[A,B,C,D] = AList[T4K[A,B,C,D]#l] - def tuple4[A, B, C, D]: T4List[A,B,C,D] = new T4List[A,B,C,D] - { - type T4[M[_]] = (M[A], M[B], M[C], M[D]) - def transform[M[_], N[_]](t: T4[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4)) - def foldr[M[_], T](t: T4[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, init)))) - def traverse[M[_], N[_], P[_]](t: T4[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T4[P]] = - { - val g = (Tuple4.apply[P[A], P[B], P[C], P[D]] _).curried - np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)) - } - } + sealed trait T4K[A, B, C, D] { type l[L[x]] = (L[A], L[B], L[C], L[D]) } + type T4List[A, B, C, D] = AList[T4K[A, B, C, D]#l] + def tuple4[A, B, C, D]: T4List[A, B, C, D] = new T4List[A, B, C, D] { + type T4[M[_]] = (M[A], M[B], M[C], M[D]) + def transform[M[_], N[_]](t: T4[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4)) + def foldr[M[_], T](t: T4[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, init)))) + def traverse[M[_], N[_], P[_]](t: T4[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T4[P]] = + { + val g = (Tuple4.apply[P[A], P[B], P[C], P[D]] _).curried + np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)) + } + } - sealed trait T5K[A,B,C,D,E] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E]) } - type T5List[A,B,C,D,E] = AList[T5K[A,B,C,D,E]#l] - def tuple5[A, B, C, D, E]: T5List[A,B,C,D,E] = new T5List[A,B,C,D,E] { - type T5[M[_]] = (M[A], M[B], M[C], M[D], M[E]) - def transform[M[_], N[_]](t: T5[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5)) - def foldr[M[_], T](t: T5[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, 
f(t._4, f(t._5, init))))) - def traverse[M[_], N[_], P[_]](t: T5[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T5[P]] = - { - val g = (Tuple5.apply[P[A],P[B],P[C],P[D],P[E]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5) ) - } - } + sealed trait T5K[A, B, C, D, E] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E]) } + type T5List[A, B, C, D, E] = AList[T5K[A, B, C, D, E]#l] + def tuple5[A, B, C, D, E]: T5List[A, B, C, D, E] = new T5List[A, B, C, D, E] { + type T5[M[_]] = (M[A], M[B], M[C], M[D], M[E]) + def transform[M[_], N[_]](t: T5[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5)) + def foldr[M[_], T](t: T5[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, init))))) + def traverse[M[_], N[_], P[_]](t: T5[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T5[P]] = + { + val g = (Tuple5.apply[P[A], P[B], P[C], P[D], P[E]] _).curried + np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)) + } + } - sealed trait T6K[A,B,C,D,E,F] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F]) } - type T6List[A,B,C,D,E,F] = AList[T6K[A,B,C,D,E,F]#l] - def tuple6[A, B, C, D, E, F]: T6List[A,B,C,D,E,F] = new T6List[A,B,C,D,E,F] { - type T6[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F]) - def transform[M[_], N[_]](t: T6[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6)) - def foldr[M[_], T](t: T6[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, init)))))) - def traverse[M[_], N[_], P[_]](t: T6[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T6[P]] = - { - val g = (Tuple6.apply[P[A],P[B],P[C],P[D],P[E],P[F]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)) - } - } + sealed trait T6K[A, B, C, D, E, F] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F]) } + type T6List[A, B, C, D, E, 
F] = AList[T6K[A, B, C, D, E, F]#l] + def tuple6[A, B, C, D, E, F]: T6List[A, B, C, D, E, F] = new T6List[A, B, C, D, E, F] { + type T6[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F]) + def transform[M[_], N[_]](t: T6[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6)) + def foldr[M[_], T](t: T6[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, init)))))) + def traverse[M[_], N[_], P[_]](t: T6[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T6[P]] = + { + val g = (Tuple6.apply[P[A], P[B], P[C], P[D], P[E], P[F]] _).curried + np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)) + } + } - sealed trait T7K[A,B,C,D,E,F,G] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G]) } - type T7List[A,B,C,D,E,F,G] = AList[T7K[A,B,C,D,E,F,G]#l] - def tuple7[A,B,C,D,E,F,G]: T7List[A,B,C,D,E,F,G] = new T7List[A,B,C,D,E,F,G] { - type T7[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G]) - def transform[M[_], N[_]](t: T7[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7)) - def foldr[M[_], T](t: T7[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, init))))))) - def traverse[M[_], N[_], P[_]](t: T7[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T7[P]] = - { - val g = (Tuple7.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)) - } - } - sealed trait T8K[A,B,C,D,E,F,G,H] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H]) } - type T8List[A,B,C,D,E,F,G,H] = AList[T8K[A,B,C,D,E,F,G,H]#l] - def tuple8[A,B,C,D,E,F,G,H]: T8List[A,B,C,D,E,F,G,H] = new T8List[A,B,C,D,E,F,G,H] { - type T8[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H]) - def transform[M[_], N[_]](t: T8[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), 
f(t._5), f(t._6), f(t._7), f(t._8)) - def foldr[M[_], T](t: T8[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, init)))))))) - def traverse[M[_], N[_], P[_]](t: T8[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T8[P]] = - { - val g = (Tuple8.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)) - } - } + sealed trait T7K[A, B, C, D, E, F, G] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G]) } + type T7List[A, B, C, D, E, F, G] = AList[T7K[A, B, C, D, E, F, G]#l] + def tuple7[A, B, C, D, E, F, G]: T7List[A, B, C, D, E, F, G] = new T7List[A, B, C, D, E, F, G] { + type T7[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G]) + def transform[M[_], N[_]](t: T7[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7)) + def foldr[M[_], T](t: T7[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, init))))))) + def traverse[M[_], N[_], P[_]](t: T7[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T7[P]] = + { + val g = (Tuple7.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G]] _).curried + np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)) + } + } + sealed trait T8K[A, B, C, D, E, F, G, H] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H]) } + type T8List[A, B, C, D, E, F, G, H] = AList[T8K[A, B, C, D, E, F, G, H]#l] + def tuple8[A, B, C, D, E, F, G, H]: T8List[A, B, C, D, E, F, G, H] = new T8List[A, B, C, D, E, F, G, H] { + type T8[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H]) + def transform[M[_], N[_]](t: T8[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8)) + def foldr[M[_], T](t: T8[M], f: (M[_], T) => T, init: T): T = f(t._1, 
f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, init)))))))) + def traverse[M[_], N[_], P[_]](t: T8[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T8[P]] = + { + val g = (Tuple8.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H]] _).curried + np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)) + } + } - sealed trait T9K[A,B,C,D,E,F,G,H,I] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I]) } - type T9List[A,B,C,D,E,F,G,H,I] = AList[T9K[A,B,C,D,E,F,G,H,I]#l] - def tuple9[A,B,C,D,E,F,G,H,I]: T9List[A,B,C,D,E,F,G,H,I] = new T9List[A,B,C,D,E,F,G,H,I] { - type T9[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I]) - def transform[M[_], N[_]](t: T9[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9)) - def foldr[M[_], T](t: T9[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, init))))))))) - def traverse[M[_], N[_], P[_]](t: T9[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T9[P]] = - { - val g = (Tuple9.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H],P[I]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)) - } - } + sealed trait T9K[A, B, C, D, E, F, G, H, I] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I]) } + type T9List[A, B, C, D, E, F, G, H, I] = AList[T9K[A, B, C, D, E, F, G, H, I]#l] + def tuple9[A, B, C, D, E, F, G, H, I]: T9List[A, B, C, D, E, F, G, H, I] = new T9List[A, B, C, D, E, F, G, H, I] { + type T9[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I]) + def transform[M[_], N[_]](t: T9[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9)) + def foldr[M[_], T](t: T9[M], f: (M[_], T) => 
T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, init))))))))) + def traverse[M[_], N[_], P[_]](t: T9[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T9[P]] = + { + val g = (Tuple9.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H], P[I]] _).curried + np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)) + } + } - sealed trait T10K[A,B,C,D,E,F,G,H,I,J] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J]) } - type T10List[A,B,C,D,E,F,G,H,I,J] = AList[T10K[A,B,C,D,E,F,G,H,I,J]#l] - def tuple10[A,B,C,D,E,F,G,H,I,J]: T10List[A,B,C,D,E,F,G,H,I,J] = new T10List[A,B,C,D,E,F,G,H,I,J] { - type T10[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J]) - def transform[M[_], N[_]](t: T10[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10)) - def foldr[M[_], T](t: T10[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, init)))))))))) - def traverse[M[_], N[_], P[_]](t: T10[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T10[P]] = - { - val g = (Tuple10.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H],P[I],P[J]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)) - } - } + sealed trait T10K[A, B, C, D, E, F, G, H, I, J] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J]) } + type T10List[A, B, C, D, E, F, G, H, I, J] = AList[T10K[A, B, C, D, E, F, G, H, I, J]#l] + def tuple10[A, B, C, D, E, F, G, H, I, J]: T10List[A, B, C, D, E, F, G, H, I, J] = new T10List[A, B, C, D, E, F, G, H, I, J] { + type T10[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J]) + 
def transform[M[_], N[_]](t: T10[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10)) + def foldr[M[_], T](t: T10[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, init)))))))))) + def traverse[M[_], N[_], P[_]](t: T10[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T10[P]] = + { + val g = (Tuple10.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H], P[I], P[J]] _).curried + np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)) + } + } - sealed trait T11K[A,B,C,D,E,F,G,H,I,J,K] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J], L[K]) } - type T11List[A,B,C,D,E,F,G,H,I,J,K] = AList[T11K[A,B,C,D,E,F,G,H,I,J,K]#l] - def tuple11[A,B,C,D,E,F,G,H,I,J,K]: T11List[A,B,C,D,E,F,G,H,I,J,K] = new T11List[A,B,C,D,E,F,G,H,I,J,K] { - type T11[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J], M[K]) - def transform[M[_], N[_]](t: T11[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10), f(t._11)) - def foldr[M[_], T](t: T11[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, f(t._11,init))))))))))) - def traverse[M[_], N[_], P[_]](t: T11[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T11[P]] = - { - val g = (Tuple11.apply[P[A],P[B],P[C],P[D],P[E],P[F],P[G],P[H],P[I],P[J],P[K]] _ ).curried - np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.apply( np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)), f(t._11)) - } - } + sealed trait T11K[A, B, C, D, E, F, G, H, I, J, K] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J], 
L[K]) } + type T11List[A, B, C, D, E, F, G, H, I, J, K] = AList[T11K[A, B, C, D, E, F, G, H, I, J, K]#l] + def tuple11[A, B, C, D, E, F, G, H, I, J, K]: T11List[A, B, C, D, E, F, G, H, I, J, K] = new T11List[A, B, C, D, E, F, G, H, I, J, K] { + type T11[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J], M[K]) + def transform[M[_], N[_]](t: T11[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10), f(t._11)) + def foldr[M[_], T](t: T11[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, f(t._11, init))))))))))) + def traverse[M[_], N[_], P[_]](t: T11[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T11[P]] = + { + val g = (Tuple11.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H], P[I], P[J], P[K]] _).curried + np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)), f(t._11)) + } + } } diff --git a/util/collection/src/main/scala/sbt/Attributes.scala b/util/collection/src/main/scala/sbt/Attributes.scala index 456a74482..64f379012 100644 --- a/util/collection/src/main/scala/sbt/Attributes.scala +++ b/util/collection/src/main/scala/sbt/Attributes.scala @@ -10,190 +10,201 @@ import scala.reflect.Manifest // Because it is sealed and the only instances go through AttributeKey.apply, // a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts -/** A key in an [[AttributeMap]] that constrains its associated value to be of type `T`. -* The key is uniquely defined by its [[label]] and type `T`, represented at runtime by [[manifest]]. */ +/** + * A key in an [[AttributeMap]] that constrains its associated value to be of type `T`. + * The key is uniquely defined by its [[label]] and type `T`, represented at runtime by [[manifest]]. 
+ */ sealed trait AttributeKey[T] { - /** The runtime evidence for `T` */ - def manifest: Manifest[T] + /** The runtime evidence for `T` */ + def manifest: Manifest[T] - @deprecated("Should only be used for compatibility during the transition from hyphenated labels to camelCase labels.", "0.13.0") - def rawLabel: String + @deprecated("Should only be used for compatibility during the transition from hyphenated labels to camelCase labels.", "0.13.0") + def rawLabel: String - /** The label is the identifier for the key and is camelCase by convention. */ - def label: String + /** The label is the identifier for the key and is camelCase by convention. */ + def label: String - /** An optional, brief description of the key. */ - def description: Option[String] + /** An optional, brief description of the key. */ + def description: Option[String] - /** In environments that support delegation, looking up this key when it has no associated value will delegate to the values associated with these keys. - * The delegation proceeds in order the keys are returned here.*/ - def extend: Seq[AttributeKey[_]] + /** + * In environments that support delegation, looking up this key when it has no associated value will delegate to the values associated with these keys. + * The delegation proceeds in the order the keys are returned here. + */ + def extend: Seq[AttributeKey[_]] - /** Specifies whether this key is a local, anonymous key (`true`) or not (`false`). - * This is typically only used for programmatic, intermediate keys that should not be referenced outside of a specific scope. */ - def isLocal: Boolean + /** + * Specifies whether this key is a local, anonymous key (`true`) or not (`false`). + * This is typically only used for programmatic, intermediate keys that should not be referenced outside of a specific scope.
+ */ + def isLocal: Boolean - /** Identifies the relative importance of a key among other keys.*/ - def rank: Int + /** Identifies the relative importance of a key among other keys.*/ + def rank: Int } private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { - override final def toString = label - override final def hashCode = label.hashCode - override final def equals(o: Any) = (this eq o.asInstanceOf[AnyRef]) || (o match { - case a: SharedAttributeKey[t] => a.label == this.label && a.manifest == this.manifest - case _ => false - }) - final def isLocal: Boolean = false + override final def toString = label + override final def hashCode = label.hashCode + override final def equals(o: Any) = (this eq o.asInstanceOf[AnyRef]) || (o match { + case a: SharedAttributeKey[t] => a.label == this.label && a.manifest == this.manifest + case _ => false + }) + final def isLocal: Boolean = false } -object AttributeKey -{ - def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = - make(name, None, Nil, Int.MaxValue) +object AttributeKey { + def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = + make(name, None, Nil, Int.MaxValue) - def apply[T](name: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = - make(name, None, Nil, rank) + def apply[T](name: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + make(name, None, Nil, rank) - def apply[T](name: String, description: String)(implicit mf: Manifest[T]): AttributeKey[T] = - apply(name, description, Nil) + def apply[T](name: String, description: String)(implicit mf: Manifest[T]): AttributeKey[T] = + apply(name, description, Nil) - def apply[T](name: String, description: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = - apply(name, description, Nil, rank) + def apply[T](name: String, description: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + apply(name, description, Nil, rank) - def apply[T](name: String, 
description: String, extend: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = - apply(name, description, extend, Int.MaxValue) + def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = + apply(name, description, extend, Int.MaxValue) - def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]], rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = - make(name, Some(description), extend, rank) + def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]], rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + make(name, Some(description), extend, rank) - private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { - def manifest = mf - def rawLabel = name - val label = Util.hyphenToCamel(name) - def description = description0 - def extend = extend0 - def rank = rank0 - } - private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { - def manifest = mf - def rawLabel = LocalLabel - def label = LocalLabel - def description = None - def extend = Nil - override def toString = label - def isLocal: Boolean = true - def rank = Int.MaxValue - } - private[sbt] final val LocalLabel = "$local" + private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { + def manifest = mf + def rawLabel = name + val label = Util.hyphenToCamel(name) + def description = description0 + def extend = extend0 + def rank = rank0 + } + private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + def manifest = mf + def rawLabel = LocalLabel + def label = LocalLabel + def description = None + def extend = Nil + override def toString = label + def isLocal: Boolean = 
true + def rank = Int.MaxValue + } + private[sbt] final val LocalLabel = "$local" } -/** An immutable map where a key is the tuple `(String,T)` for a fixed type `T` and can only be associated with values of type `T`. -* It is therefore possible for this map to contain mappings for keys with the same label but different types. -* Excluding this possibility is the responsibility of the client if desired. */ -trait AttributeMap -{ - /** Gets the value of type `T` associated with the key `k`. - * If a key with the same label but different type is defined, this method will fail. */ - def apply[T](k: AttributeKey[T]): T +/** + * An immutable map where a key is the tuple `(String,T)` for a fixed type `T` and can only be associated with values of type `T`. + * It is therefore possible for this map to contain mappings for keys with the same label but different types. + * Excluding this possibility is the responsibility of the client if desired. + */ +trait AttributeMap { + /** + * Gets the value of type `T` associated with the key `k`. + * If a key with the same label but different type is defined, this method will fail. + */ + def apply[T](k: AttributeKey[T]): T - /** Gets the value of type `T` associated with the key `k` or `None` if no value is associated. - * If a key with the same label but a different type is defined, this method will return `None`. */ - def get[T](k: AttributeKey[T]): Option[T] + /** + * Gets the value of type `T` associated with the key `k` or `None` if no value is associated. + * If a key with the same label but a different type is defined, this method will return `None`. + */ + def get[T](k: AttributeKey[T]): Option[T] - /** Returns this map without the mapping for `k`. - * This method will not remove a mapping for a key with the same label but a different type. */ - def remove[T](k: AttributeKey[T]): AttributeMap + /** + * Returns this map without the mapping for `k`. 
+ * This method will not remove a mapping for a key with the same label but a different type. + */ + def remove[T](k: AttributeKey[T]): AttributeMap - /** Returns true if this map contains a mapping for `k`. - * If a key with the same label but a different type is defined in this map, this method will return `false`. */ - def contains[T](k: AttributeKey[T]): Boolean + /** + * Returns true if this map contains a mapping for `k`. + * If a key with the same label but a different type is defined in this map, this method will return `false`. + */ + def contains[T](k: AttributeKey[T]): Boolean - /** Adds the mapping `k -> value` to this map, replacing any existing mapping for `k`. - * Any mappings for keys with the same label but different types are unaffected. */ - def put[T](k: AttributeKey[T], value: T): AttributeMap + /** + * Adds the mapping `k -> value` to this map, replacing any existing mapping for `k`. + * Any mappings for keys with the same label but different types are unaffected. + */ + def put[T](k: AttributeKey[T], value: T): AttributeMap - /** All keys with defined mappings. There may be multiple keys with the same `label`, but different types. */ - def keys: Iterable[AttributeKey[_]] + /** All keys with defined mappings. There may be multiple keys with the same `label`, but different types. 
*/ + def keys: Iterable[AttributeKey[_]] - /** Adds the mappings in `o` to this map, with mappings in `o` taking precedence over existing mappings.*/ - def ++(o: Iterable[AttributeEntry[_]]): AttributeMap + /** Adds the mappings in `o` to this map, with mappings in `o` taking precedence over existing mappings.*/ + def ++(o: Iterable[AttributeEntry[_]]): AttributeMap - /** Combines the mappings in `o` with the mappings in this map, with mappings in `o` taking precedence over existing mappings.*/ - def ++(o: AttributeMap): AttributeMap + /** Combines the mappings in `o` with the mappings in this map, with mappings in `o` taking precedence over existing mappings.*/ + def ++(o: AttributeMap): AttributeMap - /** All mappings in this map. The [[AttributeEntry]] type preserves the typesafety of mappings, although the specific types are unknown.*/ - def entries: Iterable[AttributeEntry[_]] + /** All mappings in this map. The [[AttributeEntry]] type preserves the typesafety of mappings, although the specific types are unknown.*/ + def entries: Iterable[AttributeEntry[_]] - /** `true` if there are no mappings in this map, `false` if there are. */ - def isEmpty: Boolean + /** `true` if there are no mappings in this map, `false` if there are. */ + def isEmpty: Boolean } -object AttributeMap -{ - /** An [[AttributeMap]] without any mappings. */ - val empty: AttributeMap = new BasicAttributeMap(Map.empty) +object AttributeMap { + /** An [[AttributeMap]] without any mappings. */ + val empty: AttributeMap = new BasicAttributeMap(Map.empty) - /** Constructs an [[AttributeMap]] containing the given `entries`. */ - def apply(entries: Iterable[AttributeEntry[_]]): AttributeMap = empty ++ entries + /** Constructs an [[AttributeMap]] containing the given `entries`. 
*/ + def apply(entries: Iterable[AttributeEntry[_]]): AttributeMap = empty ++ entries - /** Constructs an [[AttributeMap]] containing the given `entries`.*/ - def apply(entries: AttributeEntry[_]*): AttributeMap = empty ++ entries + /** Constructs an [[AttributeMap]] containing the given `entries`.*/ + def apply(entries: AttributeEntry[_]*): AttributeMap = empty ++ entries - /** Presents an `AttributeMap` as a natural transformation. */ - implicit def toNatTrans(map: AttributeMap): AttributeKey ~> Id = new (AttributeKey ~> Id) { - def apply[T](key: AttributeKey[T]): T = map(key) - } + /** Presents an `AttributeMap` as a natural transformation. */ + implicit def toNatTrans(map: AttributeMap): AttributeKey ~> Id = new (AttributeKey ~> Id) { + def apply[T](key: AttributeKey[T]): T = map(key) + } } -private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) extends AttributeMap -{ - def isEmpty: Boolean = backing.isEmpty - def apply[T](k: AttributeKey[T]) = backing(k).asInstanceOf[T] - def get[T](k: AttributeKey[T]) = backing.get(k).asInstanceOf[Option[T]] - def remove[T](k: AttributeKey[T]): AttributeMap = new BasicAttributeMap( backing - k ) - def contains[T](k: AttributeKey[T]) = backing.contains(k) - def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap( backing.updated(k, value) ) - def keys: Iterable[AttributeKey[_]] = backing.keys - def ++(o: Iterable[AttributeEntry[_]]): AttributeMap = - { - val newBacking = (backing /: o) { case (b, AttributeEntry(key, value)) => b.updated(key, value) } - new BasicAttributeMap(newBacking) - } - def ++(o: AttributeMap): AttributeMap = - o match { - case bam: BasicAttributeMap => new BasicAttributeMap(backing ++ bam.backing) - case _ => o ++ this - } - def entries: Iterable[AttributeEntry[_]] = - for( (k: AttributeKey[kt], v) <- backing) yield AttributeEntry(k, v.asInstanceOf[kt]) - override def toString = entries.mkString("(", ", ", ")") +private class BasicAttributeMap(private val 
backing: Map[AttributeKey[_], Any]) extends AttributeMap { + def isEmpty: Boolean = backing.isEmpty + def apply[T](k: AttributeKey[T]) = backing(k).asInstanceOf[T] + def get[T](k: AttributeKey[T]) = backing.get(k).asInstanceOf[Option[T]] + def remove[T](k: AttributeKey[T]): AttributeMap = new BasicAttributeMap(backing - k) + def contains[T](k: AttributeKey[T]) = backing.contains(k) + def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap(backing.updated(k, value)) + def keys: Iterable[AttributeKey[_]] = backing.keys + def ++(o: Iterable[AttributeEntry[_]]): AttributeMap = + { + val newBacking = (backing /: o) { case (b, AttributeEntry(key, value)) => b.updated(key, value) } + new BasicAttributeMap(newBacking) + } + def ++(o: AttributeMap): AttributeMap = + o match { + case bam: BasicAttributeMap => new BasicAttributeMap(backing ++ bam.backing) + case _ => o ++ this + } + def entries: Iterable[AttributeEntry[_]] = + for ((k: AttributeKey[kt], v) <- backing) yield AttributeEntry(k, v.asInstanceOf[kt]) + override def toString = entries.mkString("(", ", ", ")") } // type inference required less generality /** A map entry where `key` is constrained to only be associated with a fixed value of type `T`. */ -final case class AttributeEntry[T](key: AttributeKey[T], value: T) -{ - override def toString = key.label + ": " + value +final case class AttributeEntry[T](key: AttributeKey[T], value: T) { + override def toString = key.label + ": " + value } /** Associates a `metadata` map with `data`. */ -final case class Attributed[D](data: D)(val metadata: AttributeMap) -{ - /** Retrieves the associated value of `key` from the metadata. */ - def get[T](key: AttributeKey[T]): Option[T] = metadata.get(key) +final case class Attributed[D](data: D)(val metadata: AttributeMap) { + /** Retrieves the associated value of `key` from the metadata. 
*/ + def get[T](key: AttributeKey[T]): Option[T] = metadata.get(key) - /** Defines a mapping `key -> value` in the metadata. */ - def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) + /** Defines a mapping `key -> value` in the metadata. */ + def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) - /** Transforms the data by applying `f`. */ - def map[T](f: D => T): Attributed[T] = Attributed(f(data))(metadata) + /** Transforms the data by applying `f`. */ + def map[T](f: D => T): Attributed[T] = Attributed(f(data))(metadata) } -object Attributed -{ - /** Extracts the underlying data from the sequence `in`. */ - def data[T](in: Seq[Attributed[T]]): Seq[T] = in.map(_.data) +object Attributed { + /** Extracts the underlying data from the sequence `in`. */ + def data[T](in: Seq[Attributed[T]]): Seq[T] = in.map(_.data) - /** Associates empty metadata maps with each entry of `in`.*/ - def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank + /** Associates empty metadata maps with each entry of `in`.*/ + def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank - /** Associates an empty metadata map with `data`. */ - def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) + /** Associates an empty metadata map with `data`. 
*/ + def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/Classes.scala b/util/collection/src/main/scala/sbt/Classes.scala index 74796c829..1db644f96 100644 --- a/util/collection/src/main/scala/sbt/Classes.scala +++ b/util/collection/src/main/scala/sbt/Classes.scala @@ -1,27 +1,24 @@ package sbt -object Classes -{ - trait Applicative[M[_]] - { - def apply[S,T](f: M[S => T], v: M[S]): M[T] - def pure[S](s: => S): M[S] - def map[S, T](f: S => T, v: M[S]): M[T] - } - trait Monad[M[_]] extends Applicative[M] - { - def flatten[T](m: M[M[T]]): M[T] - } - implicit val optionMonad: Monad[Option] = new Monad[Option] { - def apply[S,T](f: Option[S => T], v: Option[S]) = (f, v) match { case (Some(fv), Some(vv)) => Some(fv(vv)); case _ => None } - def pure[S](s: => S) = Some(s) - def map[S, T](f: S => T, v: Option[S]) = v map f - def flatten[T](m: Option[Option[T]]): Option[T] = m.flatten - } - implicit val listMonad: Monad[List] = new Monad[List] { - def apply[S,T](f: List[S => T], v: List[S]) = for(fv <- f; vv <- v) yield fv(vv) - def pure[S](s: => S) = s :: Nil - def map[S, T](f: S => T, v: List[S]) = v map f - def flatten[T](m: List[List[T]]): List[T] = m.flatten - } +object Classes { + trait Applicative[M[_]] { + def apply[S, T](f: M[S => T], v: M[S]): M[T] + def pure[S](s: => S): M[S] + def map[S, T](f: S => T, v: M[S]): M[T] + } + trait Monad[M[_]] extends Applicative[M] { + def flatten[T](m: M[M[T]]): M[T] + } + implicit val optionMonad: Monad[Option] = new Monad[Option] { + def apply[S, T](f: Option[S => T], v: Option[S]) = (f, v) match { case (Some(fv), Some(vv)) => Some(fv(vv)); case _ => None } + def pure[S](s: => S) = Some(s) + def map[S, T](f: S => T, v: Option[S]) = v map f + def flatten[T](m: Option[Option[T]]): Option[T] = m.flatten + } + implicit val listMonad: Monad[List] = new Monad[List] { + def apply[S, T](f: List[S => T], v: List[S]) = for (fv <- f; vv <- 
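As an aside on the `AttributeMap`/`Attributed` hunks above: the map is type-indexed, so a key carries its value type and lookup is only well-typed against that key. A minimal standalone sketch of the idea (simplified stand-in names, not sbt's actual classes, which also compare keys by label and `Manifest`):

```scala
// Sketch of a type-indexed map in the spirit of AttributeMap: the phantom
// type parameter on Key[T] ties each key to its value type, so get/put
// need no casts at the call site.
final class Key[T](val label: String)

final class TypedMap private (backing: Map[Key[_], Any]) {
  def put[T](k: Key[T], v: T): TypedMap = new TypedMap(backing.updated(k, v))
  def get[T](k: Key[T]): Option[T] = backing.get(k).map(_.asInstanceOf[T])
}
object TypedMap { val empty = new TypedMap(Map.empty) }

object TypedMapDemo extends App {
  val name  = new Key[String]("name")
  val count = new Key[Int]("count")
  val m = TypedMap.empty.put(name, "sbt").put(count, 3)
  assert(m.get(name) == Some("sbt")) // typed as Option[String]
  assert(m.get(count) == Some(3))    // typed as Option[Int]
}
```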
v) yield fv(vv) + def pure[S](s: => S) = s :: Nil + def map[S, T](f: S => T, v: List[S]) = v map f + def flatten[T](m: List[List[T]]): List[T] = m.flatten + } } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala index f0594ed50..7c0fd6f2c 100644 --- a/util/collection/src/main/scala/sbt/Dag.scala +++ b/util/collection/src/main/scala/sbt/Dag.scala @@ -3,130 +3,126 @@ */ package sbt; -trait Dag[Node <: Dag[Node]]{ - self : Node => +trait Dag[Node <: Dag[Node]] { + self: Node => - def dependencies : Iterable[Node] - def topologicalSort = Dag.topologicalSort(self)(_.dependencies) + def dependencies: Iterable[Node] + def topologicalSort = Dag.topologicalSort(self)(_.dependencies) } -object Dag -{ - import scala.collection.{mutable, JavaConverters} - import JavaConverters.asScalaSetConverter +object Dag { + import scala.collection.{ mutable, JavaConverters } + import JavaConverters.asScalaSetConverter - def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies) + def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies) - def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = - { - val discovered = new mutable.HashSet[T] - val finished = (new java.util.LinkedHashSet[T]).asScala + def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = + { + val discovered = new mutable.HashSet[T] + val finished = (new java.util.LinkedHashSet[T]).asScala - def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(node : T){ - if (!discovered(node)) { - discovered(node) = true; - try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } - finished += node; - } - else if(!finished(node)) - throw new Cyclic(node) - } + def visitAll(nodes: Iterable[T]) = nodes foreach visit + def visit(node: T) { + if 
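The `Classes.scala` hunk above defines `Applicative`/`Monad` instances for `Option` and `List`. A standalone sketch of the `Option` applicative's `apply`, mirroring the pattern match in the hunk (free-standing function, not sbt's type class):

```scala
// Applying a lifted function to a lifted value: the result is Some only
// when both the function and the argument are defined.
def ap[S, T](f: Option[S => T], v: Option[S]): Option[T] =
  (f, v) match {
    case (Some(fv), Some(vv)) => Some(fv(vv))
    case _                    => None
  }

assert(ap(Some((_: Int) + 1), Some(41)) == Some(42))
assert(ap(Option.empty[Int => Int], Some(1)) == None)
```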
(!discovered(node)) { + discovered(node) = true; + try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } + finished += node; + } else if (!finished(node)) + throw new Cyclic(node) + } - visitAll(nodes); + visitAll(nodes); - finished.toList; - } - // doesn't check for cycles - def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = topologicalSortUnchecked(node :: Nil)(dependencies) + finished.toList; + } + // doesn't check for cycles + def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = topologicalSortUnchecked(node :: Nil)(dependencies) - def topologicalSortUnchecked[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = - { - val discovered = new mutable.HashSet[T] - var finished: List[T] = Nil + def topologicalSortUnchecked[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = + { + val discovered = new mutable.HashSet[T] + var finished: List[T] = Nil - def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(node : T){ - if (!discovered(node)) { - discovered(node) = true; - visitAll(dependencies(node)) - finished ::= node; - } - } + def visitAll(nodes: Iterable[T]) = nodes foreach visit + def visit(node: T) { + if (!discovered(node)) { + discovered(node) = true; + visitAll(dependencies(node)) + finished ::= node; + } + } - visitAll(nodes); - finished; - } - final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) - extends Exception( "Cyclic reference involving " + - (if(complete) all.mkString("\n ", "\n ", "") else value) - ) - { - def this(value: Any) = this(value, value :: Nil, false) - override def toString = getMessage - def ::(a: Any): Cyclic = - if(complete) - this - else if(a == value) - new Cyclic(value, all, true) - else - new Cyclic(value, a :: all, false) - } + visitAll(nodes); + finished; + } + final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) + extends Exception("Cyclic reference 
involving " + + (if (complete) all.mkString("\n ", "\n ", "") else value) + ) { + def this(value: Any) = this(value, value :: Nil, false) + override def toString = getMessage + def ::(a: Any): Cyclic = + if (complete) + this + else if (a == value) + new Cyclic(value, all, true) + else + new Cyclic(value, a :: all, false) + } - /** A directed graph with edges labeled positive or negative. */ - private[sbt] trait DirectedSignedGraph[Node] - { - /** Directed edge type that tracks the sign and target (head) vertex. - * The sign can be obtained via [[isNegative]] and the target vertex via [[head]]. */ - type Arrow - /** List of initial nodes. */ - def nodes: List[Arrow] - /** Outgoing edges for `n`. */ - def dependencies(n: Node): List[Arrow] - /** `true` if the edge `a` is "negative", false if it is "positive". */ - def isNegative(a: Arrow): Boolean - /** The target of the directed edge `a`. */ - def head(a: Arrow): Node - } + /** A directed graph with edges labeled positive or negative. */ + private[sbt] trait DirectedSignedGraph[Node] { + /** + * Directed edge type that tracks the sign and target (head) vertex. + * The sign can be obtained via [[isNegative]] and the target vertex via [[head]]. + */ + type Arrow + /** List of initial nodes. */ + def nodes: List[Arrow] + /** Outgoing edges for `n`. */ + def dependencies(n: Node): List[Arrow] + /** `true` if the edge `a` is "negative", false if it is "positive". */ + def isNegative(a: Arrow): Boolean + /** The target of the directed edge `a`. */ + def head(a: Arrow): Node + } - /** Traverses a directed graph defined by `graph` looking for a cycle that includes a "negative" edge. - * The directed edges are weighted by the caller as "positive" or "negative". - * If a cycle containing a "negative" edge is detected, its member edges are returned in order. - * Otherwise, the empty list is returned. 
*/ - private[sbt] def findNegativeCycle[Node](graph: DirectedSignedGraph[Node]): List[graph.Arrow] = - { - import scala.annotation.tailrec - import graph._ - val finished = new mutable.HashSet[Node] - val visited = new mutable.HashSet[Node] + /** + * Traverses a directed graph defined by `graph` looking for a cycle that includes a "negative" edge. + * The directed edges are weighted by the caller as "positive" or "negative". + * If a cycle containing a "negative" edge is detected, its member edges are returned in order. + * Otherwise, the empty list is returned. + */ + private[sbt] def findNegativeCycle[Node](graph: DirectedSignedGraph[Node]): List[graph.Arrow] = + { + import scala.annotation.tailrec + import graph._ + val finished = new mutable.HashSet[Node] + val visited = new mutable.HashSet[Node] - def visit(edges: List[Arrow], stack: List[Arrow]): List[Arrow] = edges match { - case Nil => Nil - case edge :: tail => - val node = head(edge) - if(!visited(node)) - { - visited += node - visit(dependencies(node), edge :: stack) match { - case Nil => - finished += node - visit(tail, stack) - case cycle => cycle - } - } - else if(!finished(node)) - { - // cycle. If a negative edge is involved, it is an error. - val between = edge :: stack.takeWhile(f => head(f) != node) - if(between exists isNegative) - between - else - visit(tail, stack) - } - else - visit(tail, stack) - } + def visit(edges: List[Arrow], stack: List[Arrow]): List[Arrow] = edges match { + case Nil => Nil + case edge :: tail => + val node = head(edge) + if (!visited(node)) { + visited += node + visit(dependencies(node), edge :: stack) match { + case Nil => + finished += node + visit(tail, stack) + case cycle => cycle + } + } else if (!finished(node)) { + // cycle. If a negative edge is involved, it is an error. 
+ val between = edge :: stack.takeWhile(f => head(f) != node) + if (between exists isNegative) + between + else + visit(tail, stack) + } else + visit(tail, stack) + } - visit(graph.nodes, Nil) - } + visit(graph.nodes, Nil) + } } diff --git a/util/collection/src/main/scala/sbt/HList.scala b/util/collection/src/main/scala/sbt/HList.scala index cb76594d0..23f5488c6 100644 --- a/util/collection/src/main/scala/sbt/HList.scala +++ b/util/collection/src/main/scala/sbt/HList.scala @@ -5,30 +5,28 @@ package sbt import Types._ -/** A minimal heterogeneous list type. For background, see -* http://apocalisp.wordpress.com/2010/07/06/type-level-programming-in-scala-part-6a-heterogeneous-list basics/ */ -sealed trait HList -{ - type Wrap[M[_]] <: HList +/** + * A minimal heterogeneous list type. For background, see + * http://apocalisp.wordpress.com/2010/07/06/type-level-programming-in-scala-part-6a-heterogeneous-list basics/ + */ +sealed trait HList { + type Wrap[M[_]] <: HList } -sealed trait HNil extends HList -{ - type Wrap[M[_]] = HNil - def :+: [G](g: G): G :+: HNil = HCons(g, this) +sealed trait HNil extends HList { + type Wrap[M[_]] = HNil + def :+:[G](g: G): G :+: HNil = HCons(g, this) - override def toString = "HNil" + override def toString = "HNil" } object HNil extends HNil -final case class HCons[H, T <: HList](head : H, tail : T) extends HList -{ - type Wrap[M[_]] = M[H] :+: T#Wrap[M] - def :+: [G](g: G): G :+: H :+: T = HCons(g, this) +final case class HCons[H, T <: HList](head: H, tail: T) extends HList { + type Wrap[M[_]] = M[H] :+: T#Wrap[M] + def :+:[G](g: G): G :+: H :+: T = HCons(g, this) - override def toString = head + " :+: " + tail.toString + override def toString = head + " :+: " + tail.toString } -object HList -{ - // contains no type information: not even A - implicit def fromList[A](list: Traversable[A]): HList = ((HNil: HList) /: list) ( (hl,v) => HCons(v, hl) ) +object HList { + // contains no type information: not even A + implicit def 
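To make the `Dag.topologicalSort` hunks above concrete: a node is appended to `finished` only after all of its dependencies have been, so the result lists dependencies before dependents, and revisiting a discovered-but-unfinished node signals a cycle. A compressed standalone sketch (simplified error handling in place of sbt's `Cyclic` accumulation):

```scala
import scala.collection.mutable

// Depth-first topological sort: insertion order of the LinkedHashSet is
// the finish order, i.e. dependencies first.
def topoSort[T](roots: List[T])(deps: T => List[T]): List[T] = {
  val discovered = mutable.HashSet[T]()
  val finished   = mutable.LinkedHashSet[T]()
  def visit(n: T): Unit =
    if (!discovered(n)) {
      discovered += n
      deps(n).foreach(visit)
      finished += n
    } else if (!finished(n)) sys.error("cycle involving " + n)
  roots.foreach(visit)
  finished.toList
}

val edges = Map("a" -> List("b", "c"), "b" -> List("c"), "c" -> Nil)
assert(topoSort(List("a"))(edges) == List("c", "b", "a"))
```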
fromList[A](list: Traversable[A]): HList = ((HNil: HList) /: list)((hl, v) => HCons(v, hl)) } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/IDSet.scala b/util/collection/src/main/scala/sbt/IDSet.scala index 43a0d6f16..4f5245a26 100644 --- a/util/collection/src/main/scala/sbt/IDSet.scala +++ b/util/collection/src/main/scala/sbt/IDSet.scala @@ -4,44 +4,42 @@ package sbt /** A mutable set interface that uses object identity to test for set membership.*/ -trait IDSet[T] -{ - def apply(t: T): Boolean - def contains(t: T): Boolean - def += (t: T): Unit - def ++=(t: Iterable[T]): Unit - def -= (t: T): Boolean - def all: collection.Iterable[T] - def toList: List[T] - def isEmpty: Boolean - def foreach(f: T => Unit): Unit - def process[S](t: T)(ifSeen: S)(ifNew: => S): S +trait IDSet[T] { + def apply(t: T): Boolean + def contains(t: T): Boolean + def +=(t: T): Unit + def ++=(t: Iterable[T]): Unit + def -=(t: T): Boolean + def all: collection.Iterable[T] + def toList: List[T] + def isEmpty: Boolean + def foreach(f: T => Unit): Unit + def process[S](t: T)(ifSeen: S)(ifNew: => S): S } -object IDSet -{ - implicit def toTraversable[T]: IDSet[T] => Traversable[T] = _.all - def apply[T](values: T*): IDSet[T] = apply(values) - def apply[T](values: Iterable[T]): IDSet[T] = - { - val s = create[T] - s ++= values - s - } - def create[T]: IDSet[T] = new IDSet[T] { - private[this] val backing = new java.util.IdentityHashMap[T, AnyRef] - private[this] val Dummy: AnyRef = "" +object IDSet { + implicit def toTraversable[T]: IDSet[T] => Traversable[T] = _.all + def apply[T](values: T*): IDSet[T] = apply(values) + def apply[T](values: Iterable[T]): IDSet[T] = + { + val s = create[T] + s ++= values + s + } + def create[T]: IDSet[T] = new IDSet[T] { + private[this] val backing = new java.util.IdentityHashMap[T, AnyRef] + private[this] val Dummy: AnyRef = "" - def apply(t: T) = contains(t) - def contains(t: T) = backing.containsKey(t) - def foreach(f: T => Unit) = 
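A quick illustration of the `HList` hunk above: because `:+:` ends in a colon it is right-associative, and each cons records its head type, so element access is fully typed without casts. A standalone sketch with renamed stand-ins (`HL`, `HNilS`, `HConsS`) to avoid clashing with sbt's definitions:

```scala
// Minimal heterogeneous list: the element types live in the list's type.
sealed trait HL
case object HNilS extends HL {
  def :+:[G](g: G): HConsS[G, HNilS.type] = HConsS(g, this)
}
final case class HConsS[H, T <: HL](head: H, tail: T) extends HL {
  def :+:[G](g: G): HConsS[G, HConsS[H, T]] = HConsS(g, this)
}

val xs = 1 :+: "two" :+: HNilS // HConsS[Int, HConsS[String, HNilS.type]]
val i: Int    = xs.head        // no cast needed
val s: String = xs.tail.head
assert(i == 1 && s == "two")
```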
all foreach f - def += (t: T) = backing.put(t, Dummy) - def ++=(t: Iterable[T]) = t foreach += - def -= (t:T) = if(backing.remove(t) eq null) false else true - def all = collection.JavaConversions.collectionAsScalaIterable(backing.keySet) - def toList = all.toList - def isEmpty = backing.isEmpty - def process[S](t: T)(ifSeen: S)(ifNew: => S) = if(contains(t)) ifSeen else { this += t ; ifNew } - override def toString = backing.toString - } + def apply(t: T) = contains(t) + def contains(t: T) = backing.containsKey(t) + def foreach(f: T => Unit) = all foreach f + def +=(t: T) = backing.put(t, Dummy) + def ++=(t: Iterable[T]) = t foreach += + def -=(t: T) = if (backing.remove(t) eq null) false else true + def all = collection.JavaConversions.collectionAsScalaIterable(backing.keySet) + def toList = all.toList + def isEmpty = backing.isEmpty + def process[S](t: T)(ifSeen: S)(ifNew: => S) = if (contains(t)) ifSeen else { this += t; ifNew } + override def toString = backing.toString + } } diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index 67b1c5b36..d56a22485 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -1,179 +1,177 @@ package sbt - import java.lang.Runnable - import java.util.concurrent.{atomic, Executor, LinkedBlockingQueue} - import atomic.{AtomicBoolean, AtomicInteger} - import Types.{:+:, ConstK, Id} +import java.lang.Runnable +import java.util.concurrent.{ atomic, Executor, LinkedBlockingQueue } +import atomic.{ AtomicBoolean, AtomicInteger } +import Types.{ :+:, ConstK, Id } object EvaluationState extends Enumeration { - val New, Blocked, Ready, Calling, Evaluated = Value + val New, Blocked, Ready, Calling, Evaluated = Value } -abstract class EvaluateSettings[Scope] -{ - protected val init: Init[Scope] - import init._ - protected def executor: Executor - protected def compiledSettings: Seq[Compiled[_]] +abstract class 
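The `IDSet` hunk above backs the set with a `java.util.IdentityHashMap`, which is the whole point of the abstraction: membership is decided by reference identity (`eq`), not `equals`. A small sketch of the observable difference:

```scala
import java.util.IdentityHashMap

// Two equal-but-distinct String instances: a == b, but !(a eq b).
val a = new String("x")
val b = new String("x")

val backing = new IdentityHashMap[String, AnyRef]
backing.put(a, "")

assert(backing.containsKey(a))  // same reference: member
assert(!backing.containsKey(b)) // equal value, different reference: not a member
assert(Set(a).contains(b))      // an equals-based Set says true instead
```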
EvaluateSettings[Scope] { + protected val init: Init[Scope] + import init._ + protected def executor: Executor + protected def compiledSettings: Seq[Compiled[_]] - import EvaluationState.{Value => EvaluationState, _} + import EvaluationState.{ Value => EvaluationState, _ } - private[this] val complete = new LinkedBlockingQueue[Option[Throwable]] - private[this] val static = PMap.empty[ScopedKey, INode] - private[this] val allScopes: Set[Scope] = compiledSettings.map(_.key.scope).toSet - private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse sys.error("Illegal reference to key " + key) + private[this] val complete = new LinkedBlockingQueue[Option[Throwable]] + private[this] val static = PMap.empty[ScopedKey, INode] + private[this] val allScopes: Set[Scope] = compiledSettings.map(_.key.scope).toSet + private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse sys.error("Illegal reference to key " + key) - private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { def apply[T](i: Initialize[T]): INode[T] = i match { - case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) - case a: Apply[k,T] => new MixedNode[k,T]( a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) - case b: Bind[s,T] => new BindNode[s,T]( transform(b.in), x => transform(b.f(x))) - case init.StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] - case v: Value[T] => constant(v.value) - case v: ValidationCapture[T] => strictConstant(v.key) - case t: TransformCapture => strictConstant(t.f) - case o: Optional[s,T] => o.a match { - case None => constant( () => o.f(None) ) - case Some(i) => single[s,T](transform(i), x => o.f(Some(x))) - } - }} - private[this] lazy val roots: Seq[INode[_]] = compiledSettings flatMap { cs => - (cs.settings map { s => - val t = transform(s.init) - static(s.key) = t - t - }): Seq[INode[_]] - } - 
private[this] var running = new AtomicInteger - private[this] var cancel = new AtomicBoolean(false) + private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { + def apply[T](i: Initialize[T]): INode[T] = i match { + case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) + case a: Apply[k, T] => new MixedNode[k, T](a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) + case b: Bind[s, T] => new BindNode[s, T](transform(b.in), x => transform(b.f(x))) + case init.StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] + case v: Value[T] => constant(v.value) + case v: ValidationCapture[T] => strictConstant(v.key) + case t: TransformCapture => strictConstant(t.f) + case o: Optional[s, T] => o.a match { + case None => constant(() => o.f(None)) + case Some(i) => single[s, T](transform(i), x => o.f(Some(x))) + } + } + } + private[this] lazy val roots: Seq[INode[_]] = compiledSettings flatMap { cs => + (cs.settings map { s => + val t = transform(s.init) + static(s.key) = t + t + }): Seq[INode[_]] + } + private[this] var running = new AtomicInteger + private[this] var cancel = new AtomicBoolean(false) - def run(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = - { - assert(running.get() == 0, "Already running") - startWork() - roots.foreach( _.registerIfNew() ) - workComplete() - complete.take() foreach { ex => - cancel.set(true) - throw ex - } - getResults(delegates) - } - private[this] def getResults(implicit delegates: Scope => Seq[Scope]) = - (empty /: static.toTypedSeq) { case (ss, static.TPair(key, node)) => - if(key.key.isLocal) ss else ss.set(key.scope, key.key, node.get) - } - private[this] val getValue = new (INode ~> Id) { def apply[T](node: INode[T]) = node.get } + def run(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = + { + assert(running.get() == 0, "Already running") + startWork() + roots.foreach(_.registerIfNew()) + 
workComplete() + complete.take() foreach { ex => + cancel.set(true) + throw ex + } + getResults(delegates) + } + private[this] def getResults(implicit delegates: Scope => Seq[Scope]) = + (empty /: static.toTypedSeq) { + case (ss, static.TPair(key, node)) => + if (key.key.isLocal) ss else ss.set(key.scope, key.key, node.get) + } + private[this] val getValue = new (INode ~> Id) { def apply[T](node: INode[T]) = node.get } - private[this] def submitEvaluate(node: INode[_]) = submit(node.evaluate()) - private[this] def submitCallComplete[T](node: BindNode[_, T], value: T) = submit(node.callComplete(value)) - private[this] def submit(work: => Unit): Unit = - { - startWork() - executor.execute(new Runnable { def run = if(!cancel.get()) run0(work) }) - } - private[this] def run0(work: => Unit): Unit = - { - try { work } catch { case e: Throwable => complete.put( Some(e) ) } - workComplete() - } + private[this] def submitEvaluate(node: INode[_]) = submit(node.evaluate()) + private[this] def submitCallComplete[T](node: BindNode[_, T], value: T) = submit(node.callComplete(value)) + private[this] def submit(work: => Unit): Unit = + { + startWork() + executor.execute(new Runnable { def run = if (!cancel.get()) run0(work) }) + } + private[this] def run0(work: => Unit): Unit = + { + try { work } catch { case e: Throwable => complete.put(Some(e)) } + workComplete() + } - private[this] def startWork(): Unit = running.incrementAndGet() - private[this] def workComplete(): Unit = - if(running.decrementAndGet() == 0) - complete.put( None ) + private[this] def startWork(): Unit = running.incrementAndGet() + private[this] def workComplete(): Unit = + if (running.decrementAndGet() == 0) + complete.put(None) - private[this] sealed abstract class INode[T] - { - private[this] var state: EvaluationState = New - private[this] var value: T = _ - private[this] val blocking = new collection.mutable.ListBuffer[INode[_]] - private[this] var blockedOn: Int = 0 - private[this] val calledBy = new 
collection.mutable.ListBuffer[BindNode[_, T]] + private[this] sealed abstract class INode[T] { + private[this] var state: EvaluationState = New + private[this] var value: T = _ + private[this] val blocking = new collection.mutable.ListBuffer[INode[_]] + private[this] var blockedOn: Int = 0 + private[this] val calledBy = new collection.mutable.ListBuffer[BindNode[_, T]] - override def toString = getClass.getName + " (state=" + state + ",blockedOn=" + blockedOn + ",calledBy=" + calledBy.size + ",blocking=" + blocking.size + "): " + - keyString + override def toString = getClass.getName + " (state=" + state + ",blockedOn=" + blockedOn + ",calledBy=" + calledBy.size + ",blocking=" + blocking.size + "): " + + keyString - private[this] def keyString = - (static.toSeq.flatMap { case (key, value) => if(value eq this) init.showFullKey(key) :: Nil else Nil }).headOption getOrElse "non-static" + private[this] def keyString = + (static.toSeq.flatMap { case (key, value) => if (value eq this) init.showFullKey(key) :: Nil else Nil }).headOption getOrElse "non-static" - final def get: T = synchronized { - assert(value != null, toString + " not evaluated") - value - } - final def doneOrBlock(from: INode[_]): Boolean = synchronized { - val ready = state == Evaluated - if(!ready) blocking += from - registerIfNew() - ready - } - final def isDone: Boolean = synchronized { state == Evaluated } - final def isNew: Boolean = synchronized { state == New } - final def isCalling: Boolean = synchronized { state == Calling } - final def registerIfNew(): Unit = synchronized { if(state == New) register() } - private[this] def register() - { - assert(state == New, "Already registered and: " + toString) - val deps = dependsOn - blockedOn = deps.size - deps.count(_.doneOrBlock(this)) - if(blockedOn == 0) - schedule() - else - state = Blocked - } + final def get: T = synchronized { + assert(value != null, toString + " not evaluated") + value + } + final def doneOrBlock(from: INode[_]): Boolean = 
synchronized { + val ready = state == Evaluated + if (!ready) blocking += from + registerIfNew() + ready + } + final def isDone: Boolean = synchronized { state == Evaluated } + final def isNew: Boolean = synchronized { state == New } + final def isCalling: Boolean = synchronized { state == Calling } + final def registerIfNew(): Unit = synchronized { if (state == New) register() } + private[this] def register() { + assert(state == New, "Already registered and: " + toString) + val deps = dependsOn + blockedOn = deps.size - deps.count(_.doneOrBlock(this)) + if (blockedOn == 0) + schedule() + else + state = Blocked + } - final def schedule(): Unit = synchronized { - assert(state == New || state == Blocked, "Invalid state for schedule() call: " + toString) - state = Ready - submitEvaluate(this) - } - final def unblocked(): Unit = synchronized { - assert(state == Blocked, "Invalid state for unblocked() call: " + toString) - blockedOn -= 1 - assert(blockedOn >= 0, "Negative blockedOn: " + blockedOn + " for " + toString) - if(blockedOn == 0) schedule() - } - final def evaluate(): Unit = synchronized { evaluate0() } - protected final def makeCall(source: BindNode[_, T], target: INode[T]) { - assert(state == Ready, "Invalid state for call to makeCall: " + toString) - state = Calling - target.call(source) - } - protected final def setValue(v: T) { - assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) - if(v == null) sys.error("Setting value cannot be null: " + keyString) - value = v - state = Evaluated - blocking foreach { _.unblocked() } - blocking.clear() - calledBy foreach { node => submitCallComplete(node, value) } - calledBy.clear() - } - final def call(by: BindNode[_, T]): Unit = synchronized { - registerIfNew() - state match { - case Evaluated => submitCallComplete(by, value) - case _ => calledBy += by - } - } - protected def dependsOn: Seq[INode[_]] - protected def evaluate0(): Unit - } + final def schedule(): Unit = 
synchronized { + assert(state == New || state == Blocked, "Invalid state for schedule() call: " + toString) + state = Ready + submitEvaluate(this) + } + final def unblocked(): Unit = synchronized { + assert(state == Blocked, "Invalid state for unblocked() call: " + toString) + blockedOn -= 1 + assert(blockedOn >= 0, "Negative blockedOn: " + blockedOn + " for " + toString) + if (blockedOn == 0) schedule() + } + final def evaluate(): Unit = synchronized { evaluate0() } + protected final def makeCall(source: BindNode[_, T], target: INode[T]) { + assert(state == Ready, "Invalid state for call to makeCall: " + toString) + state = Calling + target.call(source) + } + protected final def setValue(v: T) { + assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) + if (v == null) sys.error("Setting value cannot be null: " + keyString) + value = v + state = Evaluated + blocking foreach { _.unblocked() } + blocking.clear() + calledBy foreach { node => submitCallComplete(node, value) } + calledBy.clear() + } + final def call(by: BindNode[_, T]): Unit = synchronized { + registerIfNew() + state match { + case Evaluated => submitCallComplete(by, value) + case _ => calledBy += by + } + } + protected def dependsOn: Seq[INode[_]] + protected def evaluate0(): Unit + } - private[this] def strictConstant[T](v: T): INode[T] = constant(() => v) - private[this] def constant[T](f: () => T): INode[T] = new MixedNode[ConstK[Unit]#l, T]((), _ => f(), AList.empty) - private[this] def single[S,T](in: INode[S], f: S => T): INode[T] = new MixedNode[ ({ type l[L[x]] = L[S] })#l, T](in, f, AList.single[S]) - private[this] final class BindNode[S,T](in: INode[S], f: S => INode[T]) extends INode[T] - { - protected def dependsOn = in :: Nil - protected def evaluate0(): Unit = makeCall(this, f(in.get) ) - def callComplete(value: T): Unit = synchronized { - assert(isCalling, "Invalid state for callComplete(" + value + "): " + toString) - setValue(value) - } - } - 
private[this] final class MixedNode[K[L[x]], T](in: K[INode], f: K[Id] => T, alist: AList[K]) extends INode[T] - { - protected def dependsOn = alist.toList(in) - protected def evaluate0(): Unit = setValue( f( alist.transform(in, getValue) ) ) - } + private[this] def strictConstant[T](v: T): INode[T] = constant(() => v) + private[this] def constant[T](f: () => T): INode[T] = new MixedNode[ConstK[Unit]#l, T]((), _ => f(), AList.empty) + private[this] def single[S, T](in: INode[S], f: S => T): INode[T] = new MixedNode[({ type l[L[x]] = L[S] })#l, T](in, f, AList.single[S]) + private[this] final class BindNode[S, T](in: INode[S], f: S => INode[T]) extends INode[T] { + protected def dependsOn = in :: Nil + protected def evaluate0(): Unit = makeCall(this, f(in.get)) + def callComplete(value: T): Unit = synchronized { + assert(isCalling, "Invalid state for callComplete(" + value + "): " + toString) + setValue(value) + } + } + private[this] final class MixedNode[K[L[x]], T](in: K[INode], f: K[Id] => T, alist: AList[K]) extends INode[T] { + protected def dependsOn = alist.toList(in) + protected def evaluate0(): Unit = setValue(f(alist.transform(in, getValue))) + } } diff --git a/util/collection/src/main/scala/sbt/KList.scala b/util/collection/src/main/scala/sbt/KList.scala index 7ecc6ba6a..0b09ac9b1 100644 --- a/util/collection/src/main/scala/sbt/KList.scala +++ b/util/collection/src/main/scala/sbt/KList.scala @@ -1,56 +1,53 @@ package sbt - import Types._ - import Classes.Applicative +import Types._ +import Classes.Applicative /** Heterogeneous list with each element having type M[T] for some type T.*/ -sealed trait KList[+M[_]] -{ - type Transform[N[_]] <: KList[N] +sealed trait KList[+M[_]] { + type Transform[N[_]] <: KList[N] - /** Apply the natural transformation `f` to each element. */ - def transform[N[_]](f: M ~> N): Transform[N] + /** Apply the natural transformation `f` to each element. 
*/ + def transform[N[_]](f: M ~> N): Transform[N] - /** Folds this list using a function that operates on the homogeneous type of the elements of this list. */ - def foldr[T](f: (M[_], T) => T, init: T): T = init // had trouble defining it in KNil + /** Folds this list using a function that operates on the homogeneous type of the elements of this list. */ + def foldr[T](f: (M[_], T) => T, init: T): T = init // had trouble defining it in KNil - /** Applies `f` to the elements of this list in the applicative functor defined by `ap`. */ - def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] + /** Applies `f` to the elements of this list in the applicative functor defined by `ap`. */ + def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] - /** Equivalent to `transform(f) . apply(x => x)`, this is the essence of the iterator at the level of natural transformations.*/ - def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] + /** Equivalent to `transform(f) . apply(x => x)`, this is the essence of the iterator at the level of natural transformations.*/ + def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] - /** Discards the heterogeneous type information and constructs a plain List from this KList's elements. */ - def toList: List[M[_]] + /** Discards the heterogeneous type information and constructs a plain List from this KList's elements. 
*/ + def toList: List[M[_]] } -final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KList[M] -{ - final type Transform[N[_]] = KCons[H, tail.Transform[N], N] +final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KList[M] { + final type Transform[N[_]] = KCons[H, tail.Transform[N], N] - def transform[N[_]](f: M ~> N) = KCons(f(head), tail.transform(f)) - def toList: List[M[_]] = head :: tail.toList - def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] = - { - val g = (t: tail.Transform[Id]) => (h: H) =>f( KCons[H, tail.Transform[Id], Id](h, t) ) - ap.apply( tail.apply[N, H => Z](g), head ) - } - def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] = - { - val tt: N[tail.Transform[P]] = tail.traverse[N,P](f) - val g = (t: tail.Transform[P]) => (h: P[H]) => KCons(h, t) - np.apply(np.map(g, tt), f(head)) - } - def :^:[A,N[x] >: M[x]](h: N[A]) = KCons(h, this) - override def foldr[T](f: (M[_], T) => T, init: T): T = f(head, tail.foldr(f, init)) + def transform[N[_]](f: M ~> N) = KCons(f(head), tail.transform(f)) + def toList: List[M[_]] = head :: tail.toList + def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] = + { + val g = (t: tail.Transform[Id]) => (h: H) => f(KCons[H, tail.Transform[Id], Id](h, t)) + ap.apply(tail.apply[N, H => Z](g), head) + } + def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] = + { + val tt: N[tail.Transform[P]] = tail.traverse[N, P](f) + val g = (t: tail.Transform[P]) => (h: P[H]) => KCons(h, t) + np.apply(np.map(g, tt), f(head)) + } + def :^:[A, N[x] >: M[x]](h: N[A]) = KCons(h, this) + override def foldr[T](f: (M[_], T) => T, init: T): T = f(head, tail.foldr(f, init)) } -sealed abstract class KNil extends KList[Nothing] -{ - final type Transform[N[_]] = KNil - final def transform[N[_]](f: Nothing ~> N): Transform[N] = KNil - final def toList = Nil - 
final def apply[N[x], Z](f: KNil => Z)(implicit ap: Applicative[N]): N[Z] = ap.pure(f(KNil)) - final def traverse[N[_], P[_]](f: Nothing ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KNil] = np.pure(KNil) +sealed abstract class KNil extends KList[Nothing] { + final type Transform[N[_]] = KNil + final def transform[N[_]](f: Nothing ~> N): Transform[N] = KNil + final def toList = Nil + final def apply[N[x], Z](f: KNil => Z)(implicit ap: Applicative[N]): N[Z] = ap.pure(f(KNil)) + final def traverse[N[_], P[_]](f: Nothing ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KNil] = np.pure(KNil) } case object KNil extends KNil { - def :^:[M[_], H](h: M[H]): KCons[H, KNil, M] = KCons(h, this) + def :^:[M[_], H](h: M[H]): KCons[H, KNil, M] = KCons(h, this) } diff --git a/util/collection/src/main/scala/sbt/PMap.scala b/util/collection/src/main/scala/sbt/PMap.scala index 67a8899cd..51c942112 100644 --- a/util/collection/src/main/scala/sbt/PMap.scala +++ b/util/collection/src/main/scala/sbt/PMap.scala @@ -3,112 +3,106 @@ */ package sbt - import collection.mutable +import collection.mutable -trait RMap[K[_], V[_]] -{ - def apply[T](k: K[T]): V[T] - def get[T](k: K[T]): Option[V[T]] - def contains[T](k: K[T]): Boolean - def toSeq: Seq[(K[_], V[_])] - def toTypedSeq: Seq[TPair[_]] = toSeq.map{ case (k: K[t],v) => TPair[t](k,v.asInstanceOf[V[t]]) } - def keys: Iterable[K[_]] - def values: Iterable[V[_]] - def isEmpty: Boolean +trait RMap[K[_], V[_]] { + def apply[T](k: K[T]): V[T] + def get[T](k: K[T]): Option[V[T]] + def contains[T](k: K[T]): Boolean + def toSeq: Seq[(K[_], V[_])] + def toTypedSeq: Seq[TPair[_]] = toSeq.map { case (k: K[t], v) => TPair[t](k, v.asInstanceOf[V[t]]) } + def keys: Iterable[K[_]] + def values: Iterable[V[_]] + def isEmpty: Boolean - final case class TPair[T](key: K[T], value: V[T]) + final case class TPair[T](key: K[T], value: V[T]) } -trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] -{ - def put[T](k: K[T], v: V[T]): IMap[K,V] - def remove[T](k: 
K[T]): IMap[K,V] - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): IMap[K,V] - def mapValues[V2[_]](f: V ~> V2): IMap[K,V2] - def mapSeparate[VL[_], VR[_]](f: V ~> ({type l[T] = Either[VL[T], VR[T]]})#l ): (IMap[K,VL], IMap[K,VR]) +trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K, V] { + def put[T](k: K[T], v: V[T]): IMap[K, V] + def remove[T](k: K[T]): IMap[K, V] + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): IMap[K, V] + def mapValues[V2[_]](f: V ~> V2): IMap[K, V2] + def mapSeparate[VL[_], VR[_]](f: V ~> ({ type l[T] = Either[VL[T], VR[T]] })#l): (IMap[K, VL], IMap[K, VR]) } -trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K,V] -{ - def update[T](k: K[T], v: V[T]): Unit - def remove[T](k: K[T]): Option[V[T]] - def getOrUpdate[T](k: K[T], make: => V[T]): V[T] - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] +trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K, V] { + def update[T](k: K[T], v: V[T]): Unit + def remove[T](k: K[T]): Option[V[T]] + def getOrUpdate[T](k: K[T], make: => V[T]): V[T] + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] } -object PMap -{ - implicit def toFunction[K[_], V[_]](map: PMap[K,V]): K[_] => V[_] = k => map(k) - def empty[K[_], V[_]]: PMap[K,V] = new DelegatingPMap[K,V](new mutable.HashMap) +object PMap { + implicit def toFunction[K[_], V[_]](map: PMap[K, V]): K[_] => V[_] = k => map(k) + def empty[K[_], V[_]]: PMap[K, V] = new DelegatingPMap[K, V](new mutable.HashMap) } -object IMap -{ - /** - * Only suitable for K that is invariant in its type parameter. - * Option and List keys are not suitable, for example, - * because None <:< Option[String] and None <: Option[Int]. - */ - def empty[K[_], V[_]]: IMap[K,V] = new IMap0[K,V](Map.empty) +object IMap { + /** + * Only suitable for K that is invariant in its type parameter. + * Option and List keys are not suitable, for example, + * because None <:< Option[String] and None <: Option[Int]. 
+ */ + def empty[K[_], V[_]]: IMap[K, V] = new IMap0[K, V](Map.empty) - private[this] class IMap0[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractRMap[K,V] with IMap[K,V] - { - def get[T](k: K[T]): Option[V[T]] = ( backing get k ).asInstanceOf[Option[V[T]]] - def put[T](k: K[T], v: V[T]) = new IMap0[K,V]( backing.updated(k, v) ) - def remove[T](k: K[T]) = new IMap0[K,V]( backing - k ) + private[this] class IMap0[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractRMap[K, V] with IMap[K, V] { + def get[T](k: K[T]): Option[V[T]] = (backing get k).asInstanceOf[Option[V[T]]] + def put[T](k: K[T], v: V[T]) = new IMap0[K, V](backing.updated(k, v)) + def remove[T](k: K[T]) = new IMap0[K, V](backing - k) - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]) = - put(k, f(this get k getOrElse init)) + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]) = + put(k, f(this get k getOrElse init)) - def mapValues[V2[_]](f: V ~> V2) = - new IMap0[K,V2](backing.mapValues(x => f(x)).toMap) + def mapValues[V2[_]](f: V ~> V2) = + new IMap0[K, V2](backing.mapValues(x => f(x)).toMap) - def mapSeparate[VL[_], VR[_]](f: V ~> ({type l[T] = Either[VL[T], VR[T]]})#l ) = - { - val mapped = backing.iterator.map { case (k,v) => f(v) match { - case Left(l) => Left((k, l)) - case Right(r) => Right((k, r)) - }} - val (l, r) = Util.separateE[(K[_],VL[_]), (K[_],VR[_])]( mapped.toList ) - (new IMap0[K,VL](l.toMap), new IMap0[K,VR](r.toMap)) - } + def mapSeparate[VL[_], VR[_]](f: V ~> ({ type l[T] = Either[VL[T], VR[T]] })#l) = + { + val mapped = backing.iterator.map { + case (k, v) => f(v) match { + case Left(l) => Left((k, l)) + case Right(r) => Right((k, r)) + } + } + val (l, r) = Util.separateE[(K[_], VL[_]), (K[_], VR[_])](mapped.toList) + (new IMap0[K, VL](l.toMap), new IMap0[K, VR](r.toMap)) + } - def toSeq = backing.toSeq - def keys = backing.keys - def values = backing.values - def isEmpty = backing.isEmpty + def toSeq = backing.toSeq + def keys = backing.keys + def values = 
backing.values + def isEmpty = backing.isEmpty - override def toString = backing.toString - } + override def toString = backing.toString + } } -abstract class AbstractRMap[K[_], V[_]] extends RMap[K,V] -{ - def apply[T](k: K[T]): V[T] = get(k).get - def contains[T](k: K[T]): Boolean = get(k).isDefined +abstract class AbstractRMap[K[_], V[_]] extends RMap[K, V] { + def apply[T](k: K[T]): V[T] = get(k).get + def contains[T](k: K[T]): Boolean = get(k).isDefined } /** -* Only suitable for K that is invariant in its type parameter. -* Option and List keys are not suitable, for example, -* because None <:< Option[String] and None <: Option[Int]. -*/ -class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends AbstractRMap[K,V] with PMap[K,V] -{ - def get[T](k: K[T]): Option[V[T]] = cast[T]( backing.get(k) ) - def update[T](k: K[T], v: V[T]) { backing(k) = v } - def remove[T](k: K[T]) = cast( backing.remove(k) ) - def getOrUpdate[T](k: K[T], make: => V[T]) = cast[T]( backing.getOrElseUpdate(k, make) ) - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] = - { - val v = f(this get k getOrElse init) - update(k, v) - v - } - def toSeq = backing.toSeq - def keys = backing.keys - def values = backing.values - def isEmpty = backing.isEmpty + * Only suitable for K that is invariant in its type parameter. + * Option and List keys are not suitable, for example, + * because None <:< Option[String] and None <: Option[Int]. 
+ */ +class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends AbstractRMap[K, V] with PMap[K, V] { + def get[T](k: K[T]): Option[V[T]] = cast[T](backing.get(k)) + def update[T](k: K[T], v: V[T]) { backing(k) = v } + def remove[T](k: K[T]) = cast(backing.remove(k)) + def getOrUpdate[T](k: K[T], make: => V[T]) = cast[T](backing.getOrElseUpdate(k, make)) + def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] = + { + val v = f(this get k getOrElse init) + update(k, v) + v + } + def toSeq = backing.toSeq + def keys = backing.keys + def values = backing.values + def isEmpty = backing.isEmpty - private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] - private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] + private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] + private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] - override def toString = backing.toString + override def toString = backing.toString } diff --git a/util/collection/src/main/scala/sbt/Param.scala b/util/collection/src/main/scala/sbt/Param.scala index 3271465d9..6f674efdc 100644 --- a/util/collection/src/main/scala/sbt/Param.scala +++ b/util/collection/src/main/scala/sbt/Param.scala @@ -6,26 +6,25 @@ package sbt import Types._ // Used to emulate ~> literals -trait Param[A[_], B[_]] -{ - type T - def in: A[T] - def ret(out: B[T]) - def ret: B[T] +trait Param[A[_], B[_]] { + type T + def in: A[T] + def ret(out: B[T]) + def ret: B[T] } -object Param -{ - implicit def pToT[A[_], B[_]](p: Param[A,B] => Unit): A~>B = new (A ~> B) { - def apply[s](a: A[s]): B[s] = { - val v: Param[A,B] { type T = s} = new Param[A,B] { type T = s - def in = a - private var r: B[T] = _ - def ret(b: B[T]) {r = b} - def ret: B[T] = r - } - p(v) - v.ret - } - } +object Param { + implicit def pToT[A[_], B[_]](p: Param[A, B] => Unit): A ~> B = new (A ~> B) { + def apply[s](a: A[s]): B[s] = { + val v: Param[A, B] { type T = s } = new Param[A, B] { + type T = s 
+ def in = a + private var r: B[T] = _ + def ret(b: B[T]) { r = b } + def ret: B[T] = r + } + p(v) + v.ret + } + } } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/Positions.scala b/util/collection/src/main/scala/sbt/Positions.scala index f52c583b0..5d7e1915d 100755 --- a/util/collection/src/main/scala/sbt/Positions.scala +++ b/util/collection/src/main/scala/sbt/Positions.scala @@ -3,8 +3,8 @@ package sbt sealed trait SourcePosition sealed trait FilePosition extends SourcePosition { - def path: String - def startLine: Int + def path: String + def startLine: Int } case object NoPosition extends SourcePosition @@ -12,9 +12,9 @@ case object NoPosition extends SourcePosition final case class LinePosition(path: String, startLine: Int) extends FilePosition final case class LineRange(start: Int, end: Int) { - def shift(n: Int) = new LineRange(start + n, end + n) + def shift(n: Int) = new LineRange(start + n, end + n) } final case class RangePosition(path: String, range: LineRange) extends FilePosition { - def startLine = range.start + def startLine = range.start } diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 7a6a7b7ee..96393f917 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -5,638 +5,630 @@ package sbt import Types._ -sealed trait Settings[Scope] -{ - def data: Map[Scope, AttributeMap] - def keys(scope: Scope): Set[AttributeKey[_]] - def scopes: Set[Scope] - def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] - def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] - def get[T](scope: Scope, key: AttributeKey[T]): Option[T] - def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] - def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] +sealed trait Settings[Scope] { + def data: Map[Scope, AttributeMap] + def keys(scope: Scope): 
Set[AttributeKey[_]] + def scopes: Set[Scope] + def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] + def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] + def get[T](scope: Scope, key: AttributeKey[T]): Option[T] + def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] + def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] } -private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] -{ - def scopes: Set[Scope] = data.keySet.toSet - def keys(scope: Scope) = data(scope).keys.toSet - def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq; +private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] { + def scopes: Set[Scope] = data.keySet.toSet + def keys(scope: Scope) = data(scope).keys.toSet + def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq; - def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = - delegates(scope).toStream.flatMap(sc => getDirect(sc, key) ).headOption - def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] = - delegates(scope).toStream.filter(sc => getDirect(sc, key).isDefined ).headOption + def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = + delegates(scope).toStream.flatMap(sc => getDirect(sc, key)).headOption + def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] = + delegates(scope).toStream.filter(sc => getDirect(sc, key).isDefined).headOption - def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] = - (data get scope).flatMap(_ get key) + def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] = + (data get scope).flatMap(_ get key) - def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] = - { - val 
map = (data get scope) getOrElse AttributeMap.empty - val newData = data.updated(scope, map.put(key, value)) - new Settings0(newData, delegates) - } + def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] = + { + val map = (data get scope) getOrElse AttributeMap.empty + val newData = data.updated(scope, map.put(key, value)) + new Settings0(newData, delegates) + } } // delegates should contain the input Scope as the first entry // this trait is intended to be mixed into an object -trait Init[Scope] -{ - /** The Show instance used when a detailed String needs to be generated. It is typically used when no context is available.*/ - def showFullKey: Show[ScopedKey[_]] +trait Init[Scope] { + /** The Show instance used when a detailed String needs to be generated. It is typically used when no context is available.*/ + def showFullKey: Show[ScopedKey[_]] - final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) extends KeyedInitialize[T] { - def scopedKey = this - } + final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) extends KeyedInitialize[T] { + def scopedKey = this + } - type SettingSeq[T] = Seq[Setting[T]] - type ScopedMap = IMap[ScopedKey, SettingSeq] - type CompiledMap = Map[ScopedKey[_], Compiled[_]] - type MapScoped = ScopedKey ~> ScopedKey - type ValidatedRef[T] = Either[Undefined, ScopedKey[T]] - type ValidatedInit[T] = Either[Seq[Undefined], Initialize[T]] - type ValidateRef = ScopedKey ~> ValidatedRef - type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] - type MapConstant = ScopedKey ~> Option + type SettingSeq[T] = Seq[Setting[T]] + type ScopedMap = IMap[ScopedKey, SettingSeq] + type CompiledMap = Map[ScopedKey[_], Compiled[_]] + type MapScoped = ScopedKey ~> ScopedKey + type ValidatedRef[T] = Either[Undefined, ScopedKey[T]] + type ValidatedInit[T] = Either[Seq[Undefined], Initialize[T]] + type ValidateRef = ScopedKey ~> ValidatedRef + type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] + type MapConstant = ScopedKey 
~> Option - private[sbt] abstract class ValidateKeyRef { - def apply[T](key: ScopedKey[T], selfRefOk: Boolean): ValidatedRef[T] - } + private[sbt] abstract class ValidateKeyRef { + def apply[T](key: ScopedKey[T], selfRefOk: Boolean): ValidatedRef[T] + } - /** The result of this initialization is the composition of applied transformations. - * This can be useful when dealing with dynamic Initialize values. */ - lazy val capturedTransformations: Initialize[Initialize ~> Initialize] = new TransformCapture(idK[Initialize]) + /** + * The result of this initialization is the composition of applied transformations. + * This can be useful when dealing with dynamic Initialize values. + */ + lazy val capturedTransformations: Initialize[Initialize ~> Initialize] = new TransformCapture(idK[Initialize]) - def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) - def valueStrict[T](value: T): Initialize[T] = pure(() => value) - def value[T](value: => T): Initialize[T] = pure(value _) - def pure[T](value: () => T): Initialize[T] = new Value(value) - def optional[T,U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = setting[T](key, map(key)(f), NoPosition) - def bind[S,T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) - def map[S,T](in: Initialize[S])(f: S => T): Initialize[T] = new Apply[ ({ type l[L[x]] = L[S] })#l, T](f, in, AList.single[S]) - def app[K[L[x]], T](inputs: K[Initialize])(f: K[Id] => T)(implicit alist: AList[K]): Initialize[T] = new Apply[K, T](f, inputs, alist) - def uniform[S,T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = - new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) + def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) + def 
valueStrict[T](value: T): Initialize[T] = pure(() => value) + def value[T](value: => T): Initialize[T] = pure(value _) + def pure[T](value: () => T): Initialize[T] = new Value(value) + def optional[T, U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) + def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = setting[T](key, map(key)(f), NoPosition) + def bind[S, T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) + def map[S, T](in: Initialize[S])(f: S => T): Initialize[T] = new Apply[({ type l[L[x]] = L[S] })#l, T](f, in, AList.single[S]) + def app[K[L[x]], T](inputs: K[Initialize])(f: K[Id] => T)(implicit alist: AList[K]): Initialize[T] = new Apply[K, T](f, inputs, alist) + def uniform[S, T](inputs: Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = + new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) - /** The result of this initialization is the validated `key`. - * No dependency is introduced on `key`. If `selfRefOk` is true, validation will not fail if the key is referenced by a definition of `key`. - * That is, key := f(validated(key).value) is allowed only if `selfRefOk == true`. */ - private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) + /** + * The result of this initialization is the validated `key`. + * No dependency is introduced on `key`. If `selfRefOk` is true, validation will not fail if the key is referenced by a definition of `key`. + * That is, key := f(validated(key).value) is allowed only if `selfRefOk == true`. + */ + private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) - /** Constructs a derived setting that will be automatically defined in every scope where one of its dependencies - * is explicitly defined and the where the scope matches `filter`. 
- * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. - * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. */ - final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true), default: Boolean = false): Setting[T] = { - deriveAllowed(s, allowDynamic) foreach error - val d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) - if (default) d.default() else d - } - def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { - case _: Bind[_,_] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") - case _ => None - } - // id is used for equality - private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s.default() - private[sbt] def defaultSettings(ss: Seq[Setting[_]]): Seq[Setting[_]] = ss.map(s => defaultSetting(s)) - private[this] final val nextID = new java.util.concurrent.atomic.AtomicLong - private[this] final def nextDefaultID(): Long = nextID.incrementAndGet() + /** + * Constructs a derived setting that will be automatically defined in every scope where one of its dependencies + * is explicitly defined and the where the scope matches `filter`. + * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. + * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. 
+   */
+  final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true), default: Boolean = false): Setting[T] = {
+    deriveAllowed(s, allowDynamic) foreach error
+    val d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger)
+    if (default) d.default() else d
+  }
+  def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match {
+    case _: Bind[_, _] if !allowDynamic => Some("Cannot derive from dynamic dependencies.")
+    case _ => None
+  }
+  // id is used for equality
+  private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s.default()
+  private[sbt] def defaultSettings(ss: Seq[Setting[_]]): Seq[Setting[_]] = ss.map(s => defaultSetting(s))
+  private[this] final val nextID = new java.util.concurrent.atomic.AtomicLong
+  private[this] final def nextDefaultID(): Long = nextID.incrementAndGet()
+  def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates)
+  def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) {
+    def apply[T](k: ScopedKey[T]): T = getValue(s, k)
+  }
+  def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse (throw new InvalidReference(k))
+  def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k)
+  def mapScope(f: Scope => Scope): MapScoped = new MapScoped {
+    def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope))
+  }
+  private final class InvalidReference(val key: ScopedKey[_]) extends RuntimeException("Internal settings error: invalid reference to " + showFullKey(key))
-  def empty(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = new Settings0(Map.empty, delegates)
-  def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) {
-    def apply[T](k: ScopedKey[T]): T = getValue(s, k)
-  }
-  def getValue[T](s: Settings[Scope], k: ScopedKey[T]) =
-    s.get(k.scope, k.key) getOrElse( throw new InvalidReference(k) )
-  def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k)
-  def mapScope(f: Scope => Scope): MapScoped = new MapScoped {
-    def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope))
-  }
-  private final class InvalidReference(val key: ScopedKey[_]) extends RuntimeException("Internal settings error: invalid reference to " + showFullKey(key))
+  private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] =
+    {
+      val (defaults, others) = Util.separate[Setting[_], DefaultSetting[_], Setting[_]](ss) { case u: DefaultSetting[_] => Left(u); case s => Right(s) }
+      defaults.distinct ++ others
+    }
-  private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] =
-  {
-    val (defaults, others) = Util.separate[Setting[_], DefaultSetting[_], Setting[_]](ss) { case u: DefaultSetting[_] => Left(u); case s => Right(s) }
-    defaults.distinct ++ others
-  }
+  def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap =
+    {
+      val initDefaults = applyDefaults(init)
+      // inject derived settings into scopes where their dependencies are directly defined
+      // and prepend per-scope settings
+      val derived = deriveAndLocal(initDefaults)
+      // group by Scope/Key, dropping dead initializations
+      val sMap: ScopedMap = grouped(derived)
+      // delegate references to undefined values according to 'delegates'
+      val dMap: ScopedMap = if (actual) delegate(sMap)(delegates, display) else sMap
+      // merge Seq[Setting[_]] into Compiled
+      compile(dMap)
+    }
+  def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): Settings[Scope] =
+    {
+      val cMap = compiled(init)(delegates, scopeLocal, display)
+      // order the initializations. cyclic references are detected here.
+      val ordered: Seq[Compiled[_]] = sort(cMap)
+      // evaluation: apply the initializations.
+      try { applyInits(ordered) }
+      catch { case rru: RuntimeUndefined => throw Uninitialized(cMap.keys.toSeq, delegates, rru.undefined, true) }
+    }
+  def sort(cMap: CompiledMap): Seq[Compiled[_]] =
+    Dag.topologicalSort(cMap.values)(_.dependencies.map(cMap))
-  def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap =
-  {
-    val initDefaults = applyDefaults(init)
-    // inject derived settings into scopes where their dependencies are directly defined
-    // and prepend per-scope settings
-    val derived = deriveAndLocal(initDefaults)
-    // group by Scope/Key, dropping dead initializations
-    val sMap: ScopedMap = grouped(derived)
-    // delegate references to undefined values according to 'delegates'
-    val dMap: ScopedMap = if(actual) delegate(sMap)(delegates, display) else sMap
-    // merge Seq[Setting[_]] into Compiled
-    compile(dMap)
-  }
-  def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): Settings[Scope] =
-  {
-    val cMap = compiled(init)(delegates, scopeLocal, display)
-    // order the initializations. cyclic references are detected here.
-    val ordered: Seq[Compiled[_]] = sort(cMap)
-    // evaluation: apply the initializations.
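For context on the `sort` call in the hunk above: `Dag.topologicalSort` orders the compiled settings so that every initialization is evaluated after its dependencies, and a cyclic reference surfaces as an error during the sort. A minimal standalone sketch of that idea (a hypothetical helper, not sbt's actual `Dag` implementation):

```scala
// Depth-first topological sort with cycle detection over an adjacency function.
// `nodes` plays the role of cMap.values; `deps` the role of `_.dependencies.map(cMap)`.
def topoSort[A](nodes: Iterable[A])(deps: A => Iterable[A]): List[A] = {
  val visiting = collection.mutable.LinkedHashSet[A]() // nodes on the current DFS path
  val done = collection.mutable.LinkedHashSet[A]()     // finished nodes, dependencies first
  def visit(n: A): Unit =
    if (!done(n)) {
      if (!visiting.add(n)) sys.error("cyclic reference involving " + n)
      deps(n) foreach visit
      visiting -= n
      done += n
    }
  nodes foreach visit
  done.toList // every node appears after all of its dependencies
}

// topoSort(List("a", "b"))(Map("a" -> List("b"), "b" -> Nil)) returns List("b", "a")
```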
-    try { applyInits(ordered) }
-    catch { case rru: RuntimeUndefined => throw Uninitialized(cMap.keys.toSeq, delegates, rru.undefined, true) }
-  }
-  def sort(cMap: CompiledMap): Seq[Compiled[_]] =
-    Dag.topologicalSort(cMap.values)(_.dependencies.map(cMap))
+  def compile(sMap: ScopedMap): CompiledMap =
+    sMap.toTypedSeq.map {
+      case sMap.TPair(k, ss) =>
+        val deps = ss flatMap { _.dependencies } toSet;
+        (k, new Compiled(k, deps, ss))
+    } toMap;
-  def compile(sMap: ScopedMap): CompiledMap =
-    sMap.toTypedSeq.map { case sMap.TPair(k, ss) =>
-      val deps = ss flatMap { _.dependencies } toSet;
-      (k, new Compiled(k, deps, ss))
-    } toMap;
+  def grouped(init: Seq[Setting[_]]): ScopedMap =
+    ((IMap.empty: ScopedMap) /: init)((m, s) => add(m, s))
-  def grouped(init: Seq[Setting[_]]): ScopedMap =
-    ((IMap.empty : ScopedMap) /: init) ( (m,s) => add(m,s) )
+  def add[T](m: ScopedMap, s: Setting[T]): ScopedMap =
+    m.mapValue[T](s.key, Nil, ss => append(ss, s))
-  def add[T](m: ScopedMap, s: Setting[T]): ScopedMap =
-    m.mapValue[T]( s.key, Nil, ss => append(ss, s))
+  def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] =
+    if (s.definitive) s :: Nil else ss :+ s
-  def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] =
-    if(s.definitive) s :: Nil else ss :+ s
+  def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): Seq[Setting[_]] =
+    init.flatMap(_.dependencies flatMap scopeLocal) ++ init
-  def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): Seq[Setting[_]] =
-    init.flatMap( _.dependencies flatMap scopeLocal ) ++ init
+  def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap =
+    {
+      def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateKeyRef {
+        def apply[T](k: ScopedKey[T], selfRefOk: Boolean) =
+          delegateForKey(sMap, k, delegates(k.scope), ref, selfRefOk || !isFirst)
+      }
+      type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]]
+      val f = new (SettingSeq ~> ValidatedSettings) {
+        def apply[T](ks: Seq[Setting[T]]) = {
+          val (undefs, valid) = Util.separate(ks.zipWithIndex) { case (s, i) => s validateKeyReferenced refMap(s, i == 0) }
+          if (undefs.isEmpty) Right(valid) else Left(undefs.flatten)
+        }
+      }
+      type Undefs[_] = Seq[Undefined]
+      val (undefineds, result) = sMap.mapSeparate[Undefs, SettingSeq](f)
+      if (undefineds.isEmpty)
+        result
+      else
+        throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false)
+    }
+  private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], selfRefOk: Boolean): Either[Undefined, ScopedKey[T]] =
+    {
+      val skeys = scopes.iterator.map(x => ScopedKey(x, k.key))
+      val definedAt = skeys.find(sk => (selfRefOk || ref.key != sk) && (sMap contains sk))
+      definedAt.toRight(Undefined(ref, k))
+    }
-  def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap =
-  {
-    def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateKeyRef { def apply[T](k: ScopedKey[T], selfRefOk: Boolean) =
-      delegateForKey(sMap, k, delegates(k.scope), ref, selfRefOk || !isFirst)
-    }
-    type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]]
-    val f = new (SettingSeq ~> ValidatedSettings) { def apply[T](ks: Seq[Setting[T]]) = {
-      val (undefs, valid) = Util.separate(ks.zipWithIndex){ case (s,i) => s validateKeyReferenced refMap(s, i == 0) }
-      if(undefs.isEmpty) Right(valid) else Left(undefs.flatten)
-    }}
-    type Undefs[_] = Seq[Undefined]
-    val (undefineds, result) = sMap.mapSeparate[Undefs, SettingSeq]( f )
-    if(undefineds.isEmpty)
-      result
-    else
-      throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false)
-  }
-  private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], scopes: Seq[Scope], ref: Setting[_], selfRefOk: Boolean): Either[Undefined, ScopedKey[T]] =
-  {
-    val skeys = scopes.iterator.map(x => ScopedKey(x, k.key))
-    val definedAt = skeys.find( sk => (selfRefOk || ref.key != sk) && (sMap contains sk))
-    definedAt.toRight(Undefined(ref, k))
-  }
+  private[this] def applyInits(ordered: Seq[Compiled[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] =
+    {
+      val x = java.util.concurrent.Executors.newFixedThreadPool(Runtime.getRuntime.availableProcessors)
+      try {
+        val eval: EvaluateSettings[Scope] = new EvaluateSettings[Scope] {
+          override val init: Init.this.type = Init.this
+          def compiledSettings = ordered
+          def executor = x
+        }
+        eval.run
+      } finally { x.shutdown() }
+    }
-  private[this] def applyInits(ordered: Seq[Compiled[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] =
-  {
-    val x = java.util.concurrent.Executors.newFixedThreadPool(Runtime.getRuntime.availableProcessors)
-    try {
-      val eval: EvaluateSettings[Scope] = new EvaluateSettings[Scope] {
-        override val init: Init.this.type = Init.this
-        def compiledSettings = ordered
-        def executor = x
-      }
-      eval.run
-    } finally { x.shutdown() }
-  }
+  def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String =
+    {
+      val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey)
+      val derived = u.defining.isDerived
+      val refString = display(u.defining.key)
+      val sourceString = if (derived) "" else parenPosString(u.defining)
+      val guessedString = if (derived) "" else guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString
+      val derivedString = if (derived) ", which is a derived setting that needs this key to be defined in this scope." else ""
+      display(u.referencedKey) + " from " + refString + sourceString + derivedString + guessedString
+    }
+  private[this] def parenPosString(s: Setting[_]): String =
+    s.positionString match { case None => ""; case Some(s) => " (" + s + ")" }
-  def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String =
-  {
-    val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey)
-    val derived = u.defining.isDerived
-    val refString = display(u.defining.key)
-    val sourceString = if(derived) "" else parenPosString(u.defining)
-    val guessedString = if(derived) "" else guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString
-    val derivedString = if(derived) ", which is a derived setting that needs this key to be defined in this scope." else ""
-    display(u.referencedKey) + " from " + refString + sourceString + derivedString + guessedString
-  }
-  private[this] def parenPosString(s: Setting[_]): String =
-    s.positionString match { case None => ""; case Some(s) => " (" + s + ")" }
+  def guessIntendedScope(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] =
+    {
+      val distances = validKeys.flatMap { validKey => refinedDistance(delegates, validKey, key).map(dist => (dist, validKey)) }
+      distances.sortBy(_._1).map(_._2).headOption
+    }
+  def refinedDistance(delegates: Scope => Seq[Scope], a: ScopedKey[_], b: ScopedKey[_]): Option[Int] =
+    if (a.key != b.key || a == b) None
+    else {
+      val dist = delegates(a.scope).indexOf(b.scope)
+      if (dist < 0) None else Some(dist)
+    }
-  def guessIntendedScope(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] =
-  {
-    val distances = validKeys.flatMap { validKey => refinedDistance(delegates, validKey, key).map( dist => (dist, validKey) ) }
-    distances.sortBy(_._1).map(_._2).headOption
-  }
-  def refinedDistance(delegates: Scope => Seq[Scope], a: ScopedKey[_], b: ScopedKey[_]): Option[Int] =
-    if(a.key != b.key || a == b) None
-    else
-    {
-      val dist = delegates(a.scope).indexOf(b.scope)
-      if(dist < 0) None else Some(dist)
-    }
+  final class Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString)
+  final class Undefined private[sbt] (val defining: Setting[_], val referencedKey: ScopedKey[_]) {
+    @deprecated("For compatibility only, use `defining` directly.", "0.13.1")
+    val definingKey = defining.key
+    @deprecated("For compatibility only, use `defining` directly.", "0.13.1")
+    val derived: Boolean = defining.isDerived
+    @deprecated("Use the non-deprecated Undefined factory method.", "0.13.1")
+    def this(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean) = this(fakeUndefinedSetting(definingKey, derived), referencedKey)
+  }
+  final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.")
-  final class Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString)
-  final class Undefined private[sbt](val defining: Setting[_], val referencedKey: ScopedKey[_])
-  {
-    @deprecated("For compatibility only, use `defining` directly.", "0.13.1")
-    val definingKey = defining.key
-    @deprecated("For compatibility only, use `defining` directly.", "0.13.1")
-    val derived: Boolean = defining.isDerived
-    @deprecated("Use the non-deprecated Undefined factory method.", "0.13.1")
-    def this(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean) = this( fakeUndefinedSetting(definingKey, derived), referencedKey)
-  }
-  final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.")
+  @deprecated("Use the other overload.", "0.13.1")
+  def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined =
+    new Undefined(fakeUndefinedSetting(definingKey, derived), referencedKey)
+  private[this] def fakeUndefinedSetting[T](definingKey: ScopedKey[T], d: Boolean): Setting[T] =
+    {
+      val init: Initialize[T] = pure(() => error("Dummy setting for compatibility only."))
+      new Setting(definingKey, init, NoPosition) { override def isDerived = d }
+    }
-  @deprecated("Use the other overload.", "0.13.1")
-  def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined =
-    new Undefined(fakeUndefinedSetting(definingKey, derived), referencedKey)
-  private[this] def fakeUndefinedSetting[T](definingKey: ScopedKey[T], d: Boolean): Setting[T] =
-  {
-    val init: Initialize[T] = pure(() => error("Dummy setting for compatibility only."))
-    new Setting(definingKey, init, NoPosition) { override def isDerived = d }
-  }
+  def Undefined(defining: Setting[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(defining, referencedKey)
+  def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized =
+    {
+      assert(!keys.isEmpty)
+      val suffix = if (keys.length > 1) "s" else ""
+      val prefix = if (runtime) "Runtime reference" else "Reference"
+      val keysString = keys.map(u => showUndefined(u, validKeys, delegates)).mkString("\n\n  ", "\n\n  ", "")
+      new Uninitialized(keys, prefix + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ")
+    }
+  final class Compiled[T](val key: ScopedKey[T], val dependencies: Iterable[ScopedKey[_]], val settings: Seq[Setting[T]]) {
+    override def toString = showFullKey(key)
+  }
+  final class Flattened(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]])
-  def Undefined(defining: Setting[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(defining, referencedKey)
-  def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized =
-  {
-    assert(!keys.isEmpty)
-    val suffix = if(keys.length > 1) "s" else ""
-    val prefix = if(runtime) "Runtime reference" else "Reference"
-    val keysString = keys.map(u => showUndefined(u, validKeys, delegates)).mkString("\n\n  ", "\n\n  ", "")
-    new Uninitialized(keys, prefix + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ")
-  }
-  final class Compiled[T](val key: ScopedKey[T], val dependencies: Iterable[ScopedKey[_]], val settings: Seq[Setting[T]])
-  {
-    override def toString = showFullKey(key)
-  }
-  final class Flattened(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]])
+  def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_], Flattened] =
+    {
+      import collection.breakOut
+      val locals = compiled flatMap { case (key, comp) => if (key.key.isLocal) Seq[Compiled[_]](comp) else Nil }
+      val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if (dep.key.isLocal) Seq[Compiled[_]](compiled(dep)) else Nil))
+      def flatten(cmap: Map[ScopedKey[_], Flattened], key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened =
+        new Flattened(key, deps.flatMap(dep => if (dep.key.isLocal) cmap(dep).dependencies else dep :: Nil))
-  def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_],Flattened] =
-  {
-    import collection.breakOut
-    val locals = compiled flatMap { case (key, comp) => if(key.key.isLocal) Seq[Compiled[_]](comp) else Nil }
-    val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if(dep.key.isLocal) Seq[Compiled[_]](compiled(dep)) else Nil))
-    def flatten(cmap: Map[ScopedKey[_],Flattened], key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened =
-      new Flattened(key, deps.flatMap(dep => if(dep.key.isLocal) cmap(dep).dependencies else dep :: Nil))
+      val empty = Map.empty[ScopedKey[_], Flattened]
+      val flattenedLocals = (empty /: ordered) { (cmap, c) => cmap.updated(c.key, flatten(cmap, c.key, c.dependencies)) }
+      compiled flatMap {
+        case (key, comp) =>
+          if (key.key.isLocal)
+            Nil
+          else
+            Seq[(ScopedKey[_], Flattened)]((key, flatten(flattenedLocals, key, comp.dependencies)))
+      }
+    }
-    val empty = Map.empty[ScopedKey[_],Flattened]
-    val flattenedLocals = (empty /: ordered) { (cmap, c) => cmap.updated(c.key, flatten(cmap, c.key, c.dependencies)) }
-    compiled flatMap{ case (key, comp) =>
-      if(key.key.isLocal)
-        Nil
-      else
-        Seq[ (ScopedKey[_], Flattened)]( (key, flatten(flattenedLocals, key, comp.dependencies)) )
-    }
-  }
+  def definedAtString(settings: Seq[Setting[_]]): String =
+    {
+      val posDefined = settings.flatMap(_.positionString.toList)
+      if (posDefined.size > 0) {
+        val header = if (posDefined.size == settings.size) "defined at:" else
+          "some of the defining occurrences:"
+        header + (posDefined.distinct mkString ("\n\t", "\n\t", "\n"))
+      } else ""
+    }
-  def definedAtString(settings: Seq[Setting[_]]): String =
-  {
-    val posDefined = settings.flatMap(_.positionString.toList)
-    if (posDefined.size > 0) {
-      val header = if (posDefined.size == settings.size) "defined at:" else
-        "some of the defining occurrences:"
-      header + (posDefined.distinct mkString ("\n\t", "\n\t", "\n"))
-    } else ""
-  }
+  /**
+   * Intersects two scopes, returning the more specific one if they intersect, or None otherwise.
+   */
+  private[sbt] def intersect(s1: Scope, s2: Scope)(implicit delegates: Scope => Seq[Scope]): Option[Scope] =
+    if (delegates(s1).contains(s2)) Some(s1) // s1 is more specific
+    else if (delegates(s2).contains(s1)) Some(s2) // s2 is more specific
+    else None
-  /**
-   * Intersects two scopes, returning the more specific one if they intersect, or None otherwise.
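The `delegateForKey`/`refinedDistance` pair in the hunk above resolves a reference to an undefined key by walking the delegate chain of the referencing scope and taking the first scope that defines the key; the position in that same chain is the "distance" used for the "Did you mean" suggestion. A self-contained sketch of the lookup, with plain strings standing in for sbt's `Scope` (the scope names and keys here are hypothetical):

```scala
// The delegate chain lists the search path for a scope, starting with the
// scope itself, mirroring how sbt's Scope delegation is consulted.
val delegates: String => Seq[String] = Map(
  "proj/Compile" -> Seq("proj/Compile", "proj", "Global"),
  "proj"         -> Seq("proj", "Global"),
  "Global"       -> Seq("Global")
)

// First scope in the chain where `key` has a definition, like delegateForKey;
// None models the Undefined error case.
def resolve(defined: Set[(String, String)], scope: String, key: String): Option[String] =
  delegates(scope).find(s => defined((s, key)))

val defined = Set(("Global", "version"), ("proj", "scalacOptions"))
// resolve(defined, "proj/Compile", "version") returns Some("Global")
// resolve(defined, "proj/Compile", "missing") returns None
```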
-   */
-  private[sbt] def intersect(s1: Scope, s2: Scope)(implicit delegates: Scope => Seq[Scope]): Option[Scope] =
-    if (delegates(s1).contains(s2)) Some(s1) // s1 is more specific
-    else if (delegates(s2).contains(s1)) Some(s2) // s2 is more specific
-    else None
+  private[this] def deriveAndLocal(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Seq[Setting[_]] =
+    {
+      import collection.mutable
-  private[this] def deriveAndLocal(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Seq[Setting[_]] =
-  {
-    import collection.mutable
+      final class Derived(val setting: DerivedSetting[_]) {
+        val dependencies = setting.dependencies.map(_.key)
+        def triggeredBy = dependencies.filter(setting.trigger)
+        val inScopes = new mutable.HashSet[Scope]
+        val outputs = new mutable.ListBuffer[Setting[_]]
+      }
+      final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) {
+        def dependencies = settings.flatMap(_.dependencies)
+        // This is mainly for use in the cyclic reference error message
+        override def toString = s"Derived settings for ${key.label}, ${definedAtString(settings.map(_.setting))}"
+      }
-    final class Derived(val setting: DerivedSetting[_]) {
-      val dependencies = setting.dependencies.map(_.key)
-      def triggeredBy = dependencies.filter(setting.trigger)
-      val inScopes = new mutable.HashSet[Scope]
-      val outputs = new mutable.ListBuffer[Setting[_]]
-    }
-    final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) {
-      def dependencies = settings.flatMap(_.dependencies)
-      // This is mainly for use in the cyclic reference error message
-      override def toString = s"Derived settings for ${key.label}, ${definedAtString(settings.map(_.setting))}"
-    }
+      // separate `derived` settings from normal settings (`defs`)
+      val (derived, rawDefs) = Util.separate[Setting[_], Derived, Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) }
+      val defs = addLocal(rawDefs)(scopeLocal)
-    // separate `derived` settings from normal settings (`defs`)
-    val (derived, rawDefs) = Util.separate[Setting[_],Derived,Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) }
-    val defs = addLocal(rawDefs)(scopeLocal)
+      // group derived settings by the key they define
+      val derivsByDef = new mutable.HashMap[AttributeKey[_], Deriveds]
+      for (s <- derived) {
+        val key = s.setting.key.key
+        derivsByDef.getOrElseUpdate(key, new Deriveds(key, new mutable.ListBuffer)).settings += s
+      }
+      // sort derived settings so that dependencies come first
+      // this is necessary when verifying that a derived setting's dependencies exist
+      val ddeps = (d: Deriveds) => d.dependencies.flatMap(derivsByDef.get)
+      val sortedDerivs = Dag.topologicalSort(derivsByDef.values)(ddeps)
-    // group derived settings by the key they define
-    val derivsByDef = new mutable.HashMap[AttributeKey[_], Deriveds]
-    for(s <- derived) {
-      val key = s.setting.key.key
-      derivsByDef.getOrElseUpdate(key, new Deriveds(key, new mutable.ListBuffer)).settings += s
-    }
+      // index derived settings by triggering key. This maps a key to the list of settings potentially derived from it.
+      val derivedBy = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived]]
+      for (s <- derived; d <- s.triggeredBy)
+        derivedBy.getOrElseUpdate(d, new mutable.ListBuffer) += s
-    // sort derived settings so that dependencies come first
-    // this is necessary when verifying that a derived setting's dependencies exist
-    val ddeps = (d: Deriveds) => d.dependencies.flatMap(derivsByDef.get)
-    val sortedDerivs = Dag.topologicalSort(derivsByDef.values)(ddeps)
+      // Map a DerivedSetting[_] to the `Derived` struct wrapping it. Used to ultimately replace a DerivedSetting with
+      // the `Setting`s that were actually derived from it: `Derived.outputs`
+      val derivedToStruct: Map[DerivedSetting[_], Derived] = (derived map { s => s.setting -> s }).toMap
-    // index derived settings by triggering key. This maps a key to the list of settings potentially derived from it.
-    val derivedBy = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived]]
-    for(s <- derived; d <- s.triggeredBy)
-      derivedBy.getOrElseUpdate(d, new mutable.ListBuffer) += s
+      // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present
+      val defined = new mutable.HashSet[ScopedKey[_]]
+      def addDefs(ss: Seq[Setting[_]]) { for (s <- ss) defined += s.key }
+      addDefs(defs)
-    // Map a DerivedSetting[_] to the `Derived` struct wrapping it. Used to ultimately replace a DerivedSetting with
-    // the `Setting`s that were actually derived from it: `Derived.outputs`
-    val derivedToStruct: Map[DerivedSetting[_], Derived] = (derived map { s => s.setting -> s }).toMap
+      // true iff the scoped key is in `defined`, taking delegation into account
+      def isDefined(key: AttributeKey[_], scope: Scope) =
+        delegates(scope).exists(s => defined.contains(ScopedKey(s, key)))
-    // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present
-    val defined = new mutable.HashSet[ScopedKey[_]]
-    def addDefs(ss: Seq[Setting[_]]) { for(s <- ss) defined += s.key }
-    addDefs(defs)
+      // true iff all dependencies of derived setting `d` have a value (potentially via delegation) in `scope`
+      def allDepsDefined(d: Derived, scope: Scope, local: Set[AttributeKey[_]]): Boolean =
+        d.dependencies.forall(dep => local(dep) || isDefined(dep, scope))
-    // true iff the scoped key is in `defined`, taking delegation into account
-    def isDefined(key: AttributeKey[_], scope: Scope) =
-      delegates(scope).exists(s => defined.contains(ScopedKey(s, key)))
+      // Returns the list of injectable derived settings and their local settings for `sk`.
+      // The settings are to be injected under `outputScope` = whichever scope is more specific of:
+      //  * the dependency's (`sk`) scope
+      //  * the DerivedSetting's scope in which it has been declared, `definingScope`
+      // provided that these two scopes intersect.
+      // A derived setting is injectable if:
+      //  1. it has not been previously injected into outputScope
+      //  2. it applies to outputScope (as determined by its `filter`)
+      //  3. all of its dependencies are defined for outputScope (allowing for delegation)
+      // This needs to handle local settings because a derived setting wouldn't be injected if it's local setting didn't exist yet.
+      val deriveFor = (sk: ScopedKey[_]) => {
+        val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten
+        val scope = sk.scope
+        def localAndDerived(d: Derived): Seq[Setting[_]] = {
+          def definingScope = d.setting.key.scope
+          val outputScope = intersect(scope, definingScope)
+          outputScope collect {
+            case s if !d.inScopes.contains(s) && d.setting.filter(s) =>
+              val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(s, dep)))
+              if (allDepsDefined(d, s, local.map(_.key.key).toSet)) {
+                d.inScopes.add(s)
+                val out = local :+ d.setting.setScope(s)
+                d.outputs ++= out
+                out
+              } else
+                Nil
+          } getOrElse Nil
+        }
+        derivedForKey.flatMap(localAndDerived)
+      }
-    // true iff all dependencies of derived setting `d` have a value (potentially via delegation) in `scope`
-    def allDepsDefined(d: Derived, scope: Scope, local: Set[AttributeKey[_]]): Boolean =
-      d.dependencies.forall(dep => local(dep) || isDefined(dep, scope))
+      val processed = new mutable.HashSet[ScopedKey[_]]
-    // Returns the list of injectable derived settings and their local settings for `sk`.
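The `deriveFor`/`process` machinery in this hunk is a worklist: each newly defined key may trigger derived settings, and those derived settings may in turn trigger more, until a fixed point is reached. A hypothetical standalone sketch of that transitive-closure pattern, with strings standing in for scoped keys:

```scala
// Worklist closure: start from the initially defined keys and repeatedly add
// whatever `deriveFor` produces for a not-yet-processed key, echoing `process`.
def deriveAll(initial: List[String], deriveFor: String => List[String]): Set[String] = {
  val processed = collection.mutable.HashSet[String]()
  def loop(rem: List[String]): Unit = rem match {
    case k :: ks =>
      val ds = if (processed.add(k)) deriveFor(k) else Nil // derive each key once
      loop(ds ::: ks)
    case Nil =>
  }
  loop(initial)
  processed.toSet
}

// deriveAll(List("a"), Map("a" -> List("b"), "b" -> List("c"), "c" -> Nil))
// returns Set("a", "b", "c")
```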
-    // The settings are to be injected under `outputScope` = whichever scope is more specific of:
-    //  * the dependency's (`sk`) scope
-    //  * the DerivedSetting's scope in which it has been declared, `definingScope`
-    // provided that these two scopes intersect.
-    // A derived setting is injectable if:
-    //  1. it has not been previously injected into outputScope
-    //  2. it applies to outputScope (as determined by its `filter`)
-    //  3. all of its dependencies are defined for outputScope (allowing for delegation)
-    // This needs to handle local settings because a derived setting wouldn't be injected if it's local setting didn't exist yet.
-    val deriveFor = (sk: ScopedKey[_]) => {
-      val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten
-      val scope = sk.scope
-      def localAndDerived(d: Derived): Seq[Setting[_]] = {
-        def definingScope = d.setting.key.scope
-        val outputScope = intersect(scope, definingScope)
-        outputScope collect { case s if !d.inScopes.contains(s) && d.setting.filter(s) =>
-          val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(s, dep)))
-          if(allDepsDefined(d, s, local.map(_.key.key).toSet)) {
-            d.inScopes.add(s)
-            val out = local :+ d.setting.setScope(s)
-            d.outputs ++= out
-            out
-          } else
-            Nil
-        } getOrElse Nil
-      }
-      derivedForKey.flatMap(localAndDerived)
-    }
+      // derives settings, transitively so that a derived setting can trigger another
+      def process(rem: List[Setting[_]]): Unit = rem match {
+        case s :: ss =>
+          val sk = s.key
+          val ds = if (processed.add(sk)) deriveFor(sk) else Nil
+          addDefs(ds)
+          process(ds ::: ss)
+        case Nil =>
+      }
+      process(defs.toList)
-    val processed = new mutable.HashSet[ScopedKey[_]]
+      // Take all the original defs and DerivedSettings along with locals, replace each DerivedSetting with the actual
+      // settings that were derived.
+      val allDefs = addLocal(init)(scopeLocal)
+      allDefs flatMap { case d: DerivedSetting[_] => (derivedToStruct get d map (_.outputs)).toStream.flatten; case s => Stream(s) }
+    }
-    // derives settings, transitively so that a derived setting can trigger another
-    def process(rem: List[Setting[_]]): Unit = rem match {
-      case s :: ss =>
-        val sk = s.key
-        val ds = if(processed.add(sk)) deriveFor(sk) else Nil
-        addDefs(ds)
-        process(ds ::: ss)
-      case Nil =>
-    }
-    process(defs.toList)
+  sealed trait Initialize[T] {
+    def dependencies: Seq[ScopedKey[_]]
+    def apply[S](g: T => S): Initialize[S]
-    // Take all the original defs and DerivedSettings along with locals, replace each DerivedSetting with the actual
-    // settings that were derived.
-    val allDefs = addLocal(init)(scopeLocal)
-    allDefs flatMap { case d: DerivedSetting[_] => (derivedToStruct get d map (_.outputs)).toStream.flatten; case s => Stream(s) }
-  }
+    @deprecated("Will be made private.", "0.13.2")
+    def mapReferenced(g: MapScoped): Initialize[T]
+    @deprecated("Will be made private.", "0.13.2")
+    def mapConstant(g: MapConstant): Initialize[T]
-  sealed trait Initialize[T]
-  {
-    def dependencies: Seq[ScopedKey[_]]
-    def apply[S](g: T => S): Initialize[S]
+    @deprecated("Will be made private.", "0.13.2")
+    def validateReferenced(g: ValidateRef): ValidatedInit[T] =
+      validateKeyReferenced(new ValidateKeyRef { def apply[T](key: ScopedKey[T], selfRefOk: Boolean) = g(key) })
-    @deprecated("Will be made private.", "0.13.2")
-    def mapReferenced(g: MapScoped): Initialize[T]
-    @deprecated("Will be made private.", "0.13.2")
-    def mapConstant(g: MapConstant): Initialize[T]
+    private[sbt] def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T]
-    @deprecated("Will be made private.", "0.13.2")
-    def validateReferenced(g: ValidateRef): ValidatedInit[T] =
-      validateKeyReferenced( new ValidateKeyRef { def apply[T](key: ScopedKey[T], selfRefOk: Boolean) = g(key) })
+    def evaluate(map: Settings[Scope]): T
+    def zip[S](o: Initialize[S]): Initialize[(T, S)] = zipTupled(o)(idFun)
+    def zipWith[S, U](o: Initialize[S])(f: (T, S) => U): Initialize[U] = zipTupled(o)(f.tupled)
+    private[this] def zipTupled[S, U](o: Initialize[S])(f: ((T, S)) => U): Initialize[U] =
+      new Apply[({ type l[L[x]] = (L[T], L[S]) })#l, U](f, (this, o), AList.tuple2[T, S])
+    /** A fold on the static attributes of this and nested Initializes. */
+    private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S
+  }
+  object Initialize {
+    implicit def joinInitialize[T](s: Seq[Initialize[T]]): JoinInitSeq[T] = new JoinInitSeq(s)
+    final class JoinInitSeq[T](s: Seq[Initialize[T]]) {
+      def joinWith[S](f: Seq[T] => S): Initialize[S] = uniform(s)(f)
+      def join: Initialize[Seq[T]] = uniform(s)(idFun)
+    }
+    def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = uniform(inits)(idFun)
+    def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]] =
+      join(inits.asInstanceOf[Seq[Initialize[M[Any]]]]).asInstanceOf[Initialize[Seq[M[T] forSome { type T }]]]
+  }
+  object SettingsDefinition {
+    implicit def unwrapSettingsDefinition(d: SettingsDefinition): Seq[Setting[_]] = d.settings
+    implicit def wrapSettingsDefinition(ss: Seq[Setting[_]]): SettingsDefinition = new SettingList(ss)
+  }
+  sealed trait SettingsDefinition {
+    def settings: Seq[Setting[_]]
+  }
+  final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition
+  sealed class Setting[T] private[Init] (val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition {
+    def settings = this :: Nil
+    def definitive: Boolean = !init.dependencies.contains(key)
+    def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key)
+    @deprecated("Will be made private.", "0.13.2")
+    def mapReferenced(g: MapScoped): Setting[T] = make(key, init mapReferenced g, pos)
+    @deprecated("Will be made private.", "0.13.2")
+    def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => make(key, newI, pos))
-    private[sbt] def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T]
+    private[sbt] def validateKeyReferenced(g: ValidateKeyRef): Either[Seq[Undefined], Setting[T]] =
+      (init validateKeyReferenced g).right.map(newI => make(key, newI, pos))
-    def evaluate(map: Settings[Scope]): T
-    def zip[S](o: Initialize[S]): Initialize[(T,S)] = zipTupled(o)(idFun)
-    def zipWith[S,U](o: Initialize[S])(f: (T,S) => U): Initialize[U] = zipTupled(o)(f.tupled)
-    private[this] def zipTupled[S,U](o: Initialize[S])(f: ((T,S)) => U): Initialize[U] =
-      new Apply[({ type l[L[x]] = (L[T], L[S]) })#l, U](f, (this, o), AList.tuple2[T,S])
-    /** A fold on the static attributes of this and nested Initializes. */
-    private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S
-  }
-  object Initialize
-  {
-    implicit def joinInitialize[T](s: Seq[Initialize[T]]): JoinInitSeq[T] = new JoinInitSeq(s)
-    final class JoinInitSeq[T](s: Seq[Initialize[T]])
-    {
-      def joinWith[S](f: Seq[T] => S): Initialize[S] = uniform(s)(f)
-      def join: Initialize[Seq[T]] = uniform(s)(idFun)
-    }
-    def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = uniform(inits)(idFun)
-    def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]] =
-      join(inits.asInstanceOf[Seq[Initialize[M[Any]]]]).asInstanceOf[Initialize[Seq[M[T] forSome { type T }]]]
-  }
-  object SettingsDefinition {
-    implicit def unwrapSettingsDefinition(d: SettingsDefinition): Seq[Setting[_]] = d.settings
-    implicit def wrapSettingsDefinition(ss: Seq[Setting[_]]): SettingsDefinition = new SettingList(ss)
-  }
-  sealed trait SettingsDefinition {
-    def settings: Seq[Setting[_]]
-  }
-  final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition
-  sealed class Setting[T] private[Init](val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition
-  {
-    def settings = this ::
Nil - def definitive: Boolean = !init.dependencies.contains(key) - def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - @deprecated("Will be made private.", "0.13.2") - def mapReferenced(g: MapScoped): Setting[T] = make(key, init mapReferenced g, pos) - @deprecated("Will be made private.", "0.13.2") - def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => make(key, newI, pos)) + def mapKey(g: MapScoped): Setting[T] = make(g(key), init, pos) + def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = make(key, init(t => f(key, t)), pos) + @deprecated("Will be made private.", "0.13.2") + def mapConstant(g: MapConstant): Setting[T] = make(key, init mapConstant g, pos) + def withPos(pos: SourcePosition) = make(key, init, pos) + def positionString: Option[String] = pos match { + case pos: FilePosition => Some(pos.path + ":" + pos.startLine) + case NoPosition => None + } + private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = make(key, f(init), pos) + override def toString = "setting(" + key + ") at " + pos - private[sbt] def validateKeyReferenced(g: ValidateKeyRef): Either[Seq[Undefined], Setting[T]] = - (init validateKeyReferenced g).right.map(newI => make(key, newI, pos)) + protected[this] def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new Setting[T](key, init, pos) + protected[sbt] def isDerived: Boolean = false + private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) + /** Turn this setting into a `DefaultSetting` if it's not already, otherwise returns `this` */ + private[sbt] def default(id: => Long = nextDefaultID()): DefaultSetting[T] = DefaultSetting(key, init, pos, id) + } + private[Init] sealed class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends 
Setting[T](sk, i, p) { + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger) + protected[sbt] override def isDerived: Boolean = true + override def default(_id: => Long): DefaultSetting[T] = new DerivedSetting[T](sk, i, p, filter, trigger) with DefaultSetting[T] { val id = _id } + override def toString = "derived " + super.toString + } + // Only keep the first occurence of this setting and move it to the front so that it has lower precedence than non-defaults. + // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. + private[Init] sealed trait DefaultSetting[T] extends Setting[T] { + val id: Long + override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = super.make(key, init, pos) default id + override final def hashCode = id.hashCode + override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } + override def toString = s"default($id) " + super.toString + override def default(id: => Long) = this + } - def mapKey(g: MapScoped): Setting[T] = make(g(key), init, pos) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = make(key, init(t => f(key,t)), pos) - @deprecated("Will be made private.", "0.13.2") - def mapConstant(g: MapConstant): Setting[T] = make(key, init mapConstant g, pos) - def withPos(pos: SourcePosition) = make(key, init, pos) - def positionString: Option[String] = pos match { - case pos: FilePosition => Some(pos.path + ":" + pos.startLine) - case NoPosition => None - } - private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = make(key, f(init), pos) - override def toString = "setting(" + key + ") at " + pos + object DefaultSetting { + def apply[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, _id: Long) = new Setting[T](sk, i, p) with DefaultSetting[T] { val id = _id } + } 
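The `DefaultSetting` trait introduced above compares instances purely by `id` (`hashCode`/`equals` both delegate to it), which is what lets the settings compiler keep only the first occurrence of a default and drop later duplicates. A minimal standalone sketch of that identity scheme — the `Entry` and `dedupKeepFirst` names are illustrative, not sbt's:

```scala
// Illustrative sketch: id-based equality, mirroring DefaultSetting's
// overridden hashCode/equals that look only at the id field.
final class Entry(val id: Long, val payload: String) {
  override def hashCode: Int = id.hashCode
  override def equals(o: Any): Boolean = o match {
    case e: Entry => e.id == id
    case _        => false
  }
}

// Keep only the first occurrence of each id, preserving order, in the
// spirit of "only keep the first occurrence of this setting".
def dedupKeepFirst(es: List[Entry]): List[Entry] = {
  val seen = scala.collection.mutable.HashSet[Entry]()
  es.filter(seen.add) // add returns false for an id already seen
}
```

Because equality ignores `payload`, two registrations with the same `id` collapse to whichever came first — the same reason sbt can re-register a default safely.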
- protected[this] def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new Setting[T](key, init, pos) - protected[sbt] def isDerived: Boolean = false - private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) - /** Turn this setting into a `DefaultSetting` if it's not already, otherwise returns `this` */ - private[sbt] def default(id: => Long = nextDefaultID()): DefaultSetting[T] = DefaultSetting(key, init, pos, id) - } - private[Init] sealed class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends Setting[T](sk, i, p) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger) - protected[sbt] override def isDerived: Boolean = true - override def default(_id: => Long): DefaultSetting[T] = new DerivedSetting[T](sk, i, p, filter, trigger) with DefaultSetting[T] { val id = _id } - override def toString = "derived " + super.toString - } - // Only keep the first occurence of this setting and move it to the front so that it has lower precedence than non-defaults. - // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. 
- private[Init] sealed trait DefaultSetting[T] extends Setting[T] { - val id: Long - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = super.make(key, init, pos) default id - override final def hashCode = id.hashCode - override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } - override def toString = s"default($id) " + super.toString - override def default(id: => Long) = this - } + private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { + case Left(undefs) => throw new RuntimeUndefined(undefs) + case Right(x) => x + } - object DefaultSetting { - def apply[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, _id: Long) = new Setting[T](sk, i, p) with DefaultSetting[T] { val id = _id } - } + private[this] lazy val getValidated = + new (ValidatedInit ~> Initialize) { def apply[T](v: ValidatedInit[T]) = handleUndefined[T](v) } + // mainly for reducing generated class count + private[this] def validateKeyReferencedT(g: ValidateKeyRef) = + new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateKeyReferenced g } - private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { - case Left(undefs) => throw new RuntimeUndefined(undefs) - case Right(x) => x - } + private[this] def mapReferencedT(g: MapScoped) = + new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapReferenced g } - private[this] lazy val getValidated = - new (ValidatedInit ~> Initialize) { def apply[T](v: ValidatedInit[T]) = handleUndefined[T](v) } + private[this] def mapConstantT(g: MapConstant) = + new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapConstant g } - // mainly for reducing generated class count - private[this] def validateKeyReferencedT(g: ValidateKeyRef) = - new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateKeyReferenced g } + private[this] def 
evaluateT(g: Settings[Scope]) = + new (Initialize ~> Id) { def apply[T](i: Initialize[T]) = i evaluate g } - private[this] def mapReferencedT(g: MapScoped) = - new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapReferenced g } + private[this] def deps(ls: Seq[Initialize[_]]): Seq[ScopedKey[_]] = ls.flatMap(_.dependencies) - private[this] def mapConstantT(g: MapConstant) = - new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapConstant g } + sealed trait Keyed[S, T] extends Initialize[T] { + def scopedKey: ScopedKey[S] + def transform: S => T + final def dependencies = scopedKey :: Nil + final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) + final def evaluate(ss: Settings[Scope]): T = transform(getValue(ss, scopedKey)) + final def mapReferenced(g: MapScoped): Initialize[T] = new GetValue(g(scopedKey), transform) + private[sbt] final def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] = g(scopedKey, false) match { + case Left(un) => Left(un :: Nil) + case Right(nk) => Right(new GetValue(nk, transform)) + } + final def mapConstant(g: MapConstant): Initialize[T] = g(scopedKey) match { + case None => this + case Some(const) => new Value(() => transform(const)) + } + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + } + private[this] final class GetValue[S, T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] + trait KeyedInitialize[T] extends Keyed[T, T] { + final val transform = idFun[T] + } - private[this] def evaluateT(g: Settings[Scope]) = - new (Initialize ~> Id) { def apply[T](i: Initialize[T]) = i evaluate g } + private[sbt] final class TransformCapture(val f: Initialize ~> Initialize) extends Initialize[Initialize ~> Initialize] { + def dependencies = Nil + def apply[Z](g2: (Initialize ~> Initialize) => Z): Initialize[Z] = map(this)(g2) + def evaluate(ss: Settings[Scope]): Initialize ~> Initialize = f + def 
mapReferenced(g: MapScoped) = new TransformCapture(mapReferencedT(g) ∙ f) + def mapConstant(g: MapConstant) = new TransformCapture(mapConstantT(g) ∙ f) + def validateKeyReferenced(g: ValidateKeyRef) = Right(new TransformCapture(getValidated ∙ validateKeyReferencedT(g) ∙ f)) + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + } + private[sbt] final class ValidationCapture[T](val key: ScopedKey[T], val selfRefOk: Boolean) extends Initialize[ScopedKey[T]] { + def dependencies = Nil + def apply[Z](g2: ScopedKey[T] => Z): Initialize[Z] = map(this)(g2) + def evaluate(ss: Settings[Scope]) = key + def mapReferenced(g: MapScoped) = new ValidationCapture(g(key), selfRefOk) + def mapConstant(g: MapConstant) = this + def validateKeyReferenced(g: ValidateKeyRef) = g(key, selfRefOk) match { + case Left(un) => Left(un :: Nil) + case Right(k) => Right(new ValidationCapture(k, selfRefOk)) + } - private[this] def deps(ls: Seq[Initialize[_]]): Seq[ScopedKey[_]] = ls.flatMap(_.dependencies) + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + } + private[sbt] final class Bind[S, T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] { + def dependencies = in.dependencies + def apply[Z](g: T => Z): Initialize[Z] = new Bind[S, Z](s => f(s)(g), in) + def evaluate(ss: Settings[Scope]): T = f(in evaluate ss) evaluate ss + def mapReferenced(g: MapScoped) = new Bind[S, T](s => f(s) mapReferenced g, in mapReferenced g) + def validateKeyReferenced(g: ValidateKeyRef) = (in validateKeyReferenced g).right.map { validIn => + new Bind[S, T](s => handleUndefined(f(s) validateKeyReferenced g), validIn) + } + def mapConstant(g: MapConstant) = new Bind[S, T](s => f(s) mapConstant g, in mapConstant g) + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = in.processAttributes(init)(f) + } + private[sbt] final class Optional[S, T](val a: Option[Initialize[S]], val f: Option[S] => T) 
extends Initialize[T] { + def dependencies = deps(a.toList) + def apply[Z](g: T => Z): Initialize[Z] = new Optional[S, Z](a, g compose f) + def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) + def validateKeyReferenced(g: ValidateKeyRef) = a match { + case None => Right(this) + case Some(i) => Right(new Optional(i.validateKeyReferenced(g).right.toOption, f)) + } + def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) + def evaluate(ss: Settings[Scope]): T = f(a.flatMap(i => trapBadRef(evaluateT(ss)(i)))) + // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either be used + private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReference => None } + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = a match { + case None => init + case Some(i) => i.processAttributes(init)(f) + } + } + private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { + def dependencies = Nil + def mapReferenced(g: MapScoped) = this + def validateKeyReferenced(g: ValidateKeyRef) = Right(this) + def apply[S](g: T => S) = new Value[S](() => g(value())) + def mapConstant(g: MapConstant) = this + def evaluate(map: Settings[Scope]): T = value() + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + } + private[sbt] final object StaticScopes extends Initialize[Set[Scope]] { + def dependencies = Nil + def mapReferenced(g: MapScoped) = this + def validateKeyReferenced(g: ValidateKeyRef) = Right(this) + def apply[S](g: Set[Scope] => S) = map(this)(g) + def mapConstant(g: MapConstant) = this + def evaluate(map: Settings[Scope]) = map.scopes + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + } + private[sbt] final class Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) extends Initialize[T] { 
+ def dependencies = deps(alist.toList(inputs)) + def mapReferenced(g: MapScoped) = mapInputs(mapReferencedT(g)) + def apply[S](g: T => S) = new Apply(g compose f, inputs, alist) + def mapConstant(g: MapConstant) = mapInputs(mapConstantT(g)) + def mapInputs(g: Initialize ~> Initialize): Initialize[T] = new Apply(f, alist.transform(inputs, g), alist) + def evaluate(ss: Settings[Scope]) = f(alist.transform(inputs, evaluateT(ss))) + def validateKeyReferenced(g: ValidateKeyRef) = + { + val tx = alist.transform(inputs, validateKeyReferencedT(g)) + val undefs = alist.toList(tx).flatMap(_.left.toSeq.flatten) + val get = new (ValidatedInit ~> Initialize) { def apply[T](vr: ValidatedInit[T]) = vr.right.get } + if (undefs.isEmpty) Right(new Apply(f, alist.transform(tx, get), alist)) else Left(undefs) + } - sealed trait Keyed[S, T] extends Initialize[T] - { - def scopedKey: ScopedKey[S] - def transform: S => T - final def dependencies = scopedKey :: Nil - final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) - final def evaluate(ss: Settings[Scope]): T = transform(getValue(ss, scopedKey)) - final def mapReferenced(g: MapScoped): Initialize[T] = new GetValue( g(scopedKey), transform) - private[sbt] final def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] = g(scopedKey, false) match { - case Left(un) => Left(un :: Nil) - case Right(nk) => Right(new GetValue(nk, transform)) - } - final def mapConstant(g: MapConstant): Initialize[T] = g(scopedKey) match { - case None => this - case Some(const) => new Value(() => transform(const)) - } - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[this] final class GetValue[S,T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] - trait KeyedInitialize[T] extends Keyed[T, T] { - final val transform = idFun[T] - } - - private[sbt] final class TransformCapture(val f: Initialize ~> Initialize) extends Initialize[Initialize ~> 
Initialize] - { - def dependencies = Nil - def apply[Z](g2: (Initialize ~> Initialize) => Z): Initialize[Z] = map(this)(g2) - def evaluate(ss: Settings[Scope]): Initialize ~> Initialize = f - def mapReferenced(g: MapScoped) = new TransformCapture(mapReferencedT(g) ∙ f) - def mapConstant(g: MapConstant) = new TransformCapture(mapConstantT(g) ∙ f) - def validateKeyReferenced(g: ValidateKeyRef) = Right(new TransformCapture(getValidated ∙ validateKeyReferencedT(g) ∙ f)) - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final class ValidationCapture[T](val key: ScopedKey[T], val selfRefOk: Boolean) extends Initialize[ScopedKey[T]] { - def dependencies = Nil - def apply[Z](g2: ScopedKey[T] => Z): Initialize[Z] = map(this)(g2) - def evaluate(ss: Settings[Scope]) = key - def mapReferenced(g: MapScoped) = new ValidationCapture(g(key), selfRefOk) - def mapConstant(g: MapConstant) = this - def validateKeyReferenced(g: ValidateKeyRef) = g(key, selfRefOk) match { - case Left(un) => Left(un :: Nil) - case Right(k) => Right(new ValidationCapture(k, selfRefOk)) - } - - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final class Bind[S,T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] - { - def dependencies = in.dependencies - def apply[Z](g: T => Z): Initialize[Z] = new Bind[S,Z](s => f(s)(g), in) - def evaluate(ss: Settings[Scope]): T = f(in evaluate ss) evaluate ss - def mapReferenced(g: MapScoped) = new Bind[S,T](s => f(s) mapReferenced g, in mapReferenced g) - def validateKeyReferenced(g: ValidateKeyRef) = (in validateKeyReferenced g).right.map { validIn => - new Bind[S,T](s => handleUndefined( f(s) validateKeyReferenced g), validIn) - } - def mapConstant(g: MapConstant) = new Bind[S,T](s => f(s) mapConstant g, in mapConstant g) - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = in.processAttributes(init)(f) - } - 
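The `∙` used by `TransformCapture` above is ordinary composition of natural transformations, i.e. of the `~>` values this patch reformats. A small self-contained sketch of that shape (simplified: sbt's real `~>` also carries variance annotations and overloads `∙`; `compose` here stands in for `∙`):

```scala
// Simplified natural transformation: a function polymorphic in T
// from one container A[_] to another B[_].
trait ~>[A[_], B[_]] { outer =>
  def apply[T](a: A[T]): B[T]
  // Compose with another transformation: run g first, then this.
  def compose[C[_]](g: C ~> A): C ~> B =
    new (C ~> B) { def apply[T](c: C[T]) = outer(g(c)) }
}

// Example instances: Option to List, and List to Option (head element).
val optToList: Option ~> List =
  new (Option ~> List) { def apply[T](o: Option[T]) = o.toList }
val headOpt: List ~> Option =
  new (List ~> Option) { def apply[T](l: List[T]) = l.headOption }
```

`optToList.compose(headOpt)` is then a `List ~> List` that keeps at most the first element — the same pattern by which `getValidated ∙ validateKeyReferencedT(g)` chains validation and unwrapping in the code above.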
private[sbt] final class Optional[S,T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] - { - def dependencies = deps(a.toList) - def apply[Z](g: T => Z): Initialize[Z] = new Optional[S,Z](a, g compose f) - def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) - def validateKeyReferenced(g: ValidateKeyRef) = a match { - case None => Right(this) - case Some(i) => Right( new Optional(i.validateKeyReferenced(g).right.toOption, f) ) - } - def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) - def evaluate(ss: Settings[Scope]): T = f( a.flatMap( i => trapBadRef(evaluateT(ss)(i)) ) ) - // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either be used - private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReference => None } - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = a match { - case None => init - case Some(i) => i.processAttributes(init)(f) - } - } - private[sbt] final class Value[T](val value: () => T) extends Initialize[T] - { - def dependencies = Nil - def mapReferenced(g: MapScoped) = this - def validateKeyReferenced(g: ValidateKeyRef) = Right(this) - def apply[S](g: T => S) = new Value[S](() => g(value())) - def mapConstant(g: MapConstant) = this - def evaluate(map: Settings[Scope]): T = value() - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final object StaticScopes extends Initialize[Set[Scope]] - { - def dependencies = Nil - def mapReferenced(g: MapScoped) = this - def validateKeyReferenced(g: ValidateKeyRef) = Right(this) - def apply[S](g: Set[Scope] => S) = map(this)(g) - def mapConstant(g: MapConstant) = this - def evaluate(map: Settings[Scope]) = map.scopes - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final class 
Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) extends Initialize[T] - { - def dependencies = deps(alist.toList(inputs)) - def mapReferenced(g: MapScoped) = mapInputs( mapReferencedT(g) ) - def apply[S](g: T => S) = new Apply(g compose f, inputs, alist) - def mapConstant(g: MapConstant) = mapInputs( mapConstantT(g) ) - def mapInputs(g: Initialize ~> Initialize): Initialize[T] = new Apply(f, alist.transform(inputs, g), alist) - def evaluate(ss: Settings[Scope]) = f(alist.transform(inputs, evaluateT(ss))) - def validateKeyReferenced(g: ValidateKeyRef) = - { - val tx = alist.transform(inputs, validateKeyReferencedT(g)) - val undefs = alist.toList(tx).flatMap(_.left.toSeq.flatten) - val get = new (ValidatedInit ~> Initialize) { def apply[T](vr: ValidatedInit[T]) = vr.right.get } - if(undefs.isEmpty) Right(new Apply(f, alist.transform(tx, get), alist)) else Left(undefs) - } - - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = - (init /: alist.toList(inputs)) { (v, i) => i.processAttributes(v)(f) } - } - private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) + private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = + (init /: alist.toList(inputs)) { (v, i) => i.processAttributes(v)(f) } + } + private def remove[T](s: Seq[T], v: T) = s filterNot (_ == v) } diff --git a/util/collection/src/main/scala/sbt/Show.scala b/util/collection/src/main/scala/sbt/Show.scala index fe4e85950..1f8e9703b 100644 --- a/util/collection/src/main/scala/sbt/Show.scala +++ b/util/collection/src/main/scala/sbt/Show.scala @@ -1,9 +1,8 @@ package sbt trait Show[T] { - def apply(t: T): String + def apply(t: T): String } -object Show -{ - def apply[T](f: T => String): Show[T] = new Show[T] { def apply(t: T): String = f(t) } +object Show { + def apply[T](f: T => String): Show[T] = new Show[T] { def apply(t: T): String = f(t) } } \ No newline at end of file diff --git 
a/util/collection/src/main/scala/sbt/Signal.scala b/util/collection/src/main/scala/sbt/Signal.scala index 0069e4b53..e8c9e7e6c 100644 --- a/util/collection/src/main/scala/sbt/Signal.scala +++ b/util/collection/src/main/scala/sbt/Signal.scala @@ -1,91 +1,85 @@ package sbt -object Signals -{ - val CONT = "CONT" - val INT = "INT" - def withHandler[T](handler: () => Unit, signal: String = INT)(action: () => T): T = - { - val result = - try - { - val signals = new Signals0 - signals.withHandler(signal, handler, action) - } - catch { case e: LinkageError => Right(action()) } +object Signals { + val CONT = "CONT" + val INT = "INT" + def withHandler[T](handler: () => Unit, signal: String = INT)(action: () => T): T = + { + val result = + try { + val signals = new Signals0 + signals.withHandler(signal, handler, action) + } catch { case e: LinkageError => Right(action()) } - result match { - case Left(e) => throw e - case Right(v) => v - } - } + result match { + case Left(e) => throw e + case Right(v) => v + } + } - /** Helper interface so we can expose internals of signal-isms to others. */ - sealed trait Registration { - def remove(): Unit - } - /** Register a signal handler that can be removed later. - * NOTE: Does not stack with other signal handlers!!!! - */ - def register(handler: () => Unit, signal: String = INT): Registration = - // TODO - Maybe we can just ignore things if not is-supported. - if(supported(signal)) { - import sun.misc.{Signal,SignalHandler} - val intSignal = new Signal(signal) - val newHandler = new SignalHandler { - def handle(sig: Signal) { handler() } - } - val oldHandler = Signal.handle(intSignal, newHandler) - object unregisterNewHandler extends Registration { - override def remove(): Unit = { - Signal.handle(intSignal, oldHandler) - } - } - unregisterNewHandler - } else { - // TODO - Maybe we should just throw an exception if we don't support signals... 
- object NullUnregisterNewHandler extends Registration { - override def remove(): Unit = () - } - NullUnregisterNewHandler - } + /** Helper interface so we can expose internals of signal-isms to others. */ + sealed trait Registration { + def remove(): Unit + } + /** + * Register a signal handler that can be removed later. + * NOTE: Does not stack with other signal handlers!!!! + */ + def register(handler: () => Unit, signal: String = INT): Registration = + // TODO - Maybe we can just ignore things if not is-supported. + if (supported(signal)) { + import sun.misc.{ Signal, SignalHandler } + val intSignal = new Signal(signal) + val newHandler = new SignalHandler { + def handle(sig: Signal) { handler() } + } + val oldHandler = Signal.handle(intSignal, newHandler) + object unregisterNewHandler extends Registration { + override def remove(): Unit = { + Signal.handle(intSignal, oldHandler) + } + } + unregisterNewHandler + } else { + // TODO - Maybe we should just throw an exception if we don't support signals... + object NullUnregisterNewHandler extends Registration { + override def remove(): Unit = () + } + NullUnregisterNewHandler + } - - def supported(signal: String): Boolean = - try - { - val signals = new Signals0 - signals.supported(signal) - } - catch { case e: LinkageError => false } + def supported(signal: String): Boolean = + try { + val signals = new Signals0 + signals.supported(signal) + } catch { case e: LinkageError => false } } // Must only be referenced using a // try { } catch { case e: LinkageError => ... 
} // block to -private final class Signals0 -{ - def supported(signal: String): Boolean = - { - import sun.misc.Signal - try { new Signal(signal); true } - catch { case e: IllegalArgumentException => false } - } +private final class Signals0 { + def supported(signal: String): Boolean = + { + import sun.misc.Signal + try { new Signal(signal); true } + catch { case e: IllegalArgumentException => false } + } - // returns a LinkageError in `action` as Left(t) in order to avoid it being - // incorrectly swallowed as missing Signal/SignalHandler - def withHandler[T](signal: String, handler: () => Unit, action: () => T): Either[Throwable, T] = - { - import sun.misc.{Signal,SignalHandler} - val intSignal = new Signal(signal) - val newHandler = new SignalHandler { - def handle(sig: Signal) { handler() } - } + // returns a LinkageError in `action` as Left(t) in order to avoid it being + // incorrectly swallowed as missing Signal/SignalHandler + def withHandler[T](signal: String, handler: () => Unit, action: () => T): Either[Throwable, T] = + { + import sun.misc.{ Signal, SignalHandler } + val intSignal = new Signal(signal) + val newHandler = new SignalHandler { + def handle(sig: Signal) { handler() } + } - val oldHandler = Signal.handle(intSignal, newHandler) + val oldHandler = Signal.handle(intSignal, newHandler) - try Right(action()) - catch { case e: LinkageError => Left(e) } - finally Signal.handle(intSignal, oldHandler) - } + try Right(action()) + catch { case e: LinkageError => Left(e) } + finally Signal.handle(intSignal, oldHandler) + } } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/TypeFunctions.scala b/util/collection/src/main/scala/sbt/TypeFunctions.scala index 6a4978750..74f0a7d99 100644 --- a/util/collection/src/main/scala/sbt/TypeFunctions.scala +++ b/util/collection/src/main/scala/sbt/TypeFunctions.scala @@ -3,51 +3,48 @@ */ package sbt -trait TypeFunctions -{ - type Id[X] = X - sealed trait Const[A] { type Apply[B] = A } - sealed 
trait ConstK[A] { type l[L[x]] = A } - sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } - sealed trait ∙[A[_], B[_]] { type l[T] = A[B[T]] } - sealed trait P1of2[M[_,_], A] { type Apply[B] = M[A,B]; type Flip[B] = M[B, A] } +trait TypeFunctions { + type Id[X] = X + sealed trait Const[A] { type Apply[B] = A } + sealed trait ConstK[A] { type l[L[x]] = A } + sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } + sealed trait ∙[A[_], B[_]] { type l[T] = A[B[T]] } + sealed trait P1of2[M[_, _], A] { type Apply[B] = M[A, B]; type Flip[B] = M[B, A] } - final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } - final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } - final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } - final def idFun[T] = (t: T) => t - final def const[A,B](b: B): A=> B = _ => b - final def idK[M[_]]: M ~> M = new (M ~> M) { def apply[T](m: M[T]): M[T] = m } - - def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = - f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. castless version: - /* new ( (M ∙ G)#l ~> (N ∙ G)#l ) { + final val left = new (Id ~> P1of2[Left, Nothing]#Flip) { def apply[T](t: T) = Left(t) } + final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } + final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } + final def idFun[T] = (t: T) => t + final def const[A, B](b: B): A => B = _ => b + final def idK[M[_]]: M ~> M = new (M ~> M) { def apply[T](m: M[T]): M[T] = m } + + def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = + f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. 
castless version: + /* new ( (M ∙ G)#l ~> (N ∙ G)#l ) { def apply[T](mg: M[G[T]]): N[G[T]] = f(mg) }*/ - implicit def toFn1[A,B](f: A => B): Fn1[A,B] = new Fn1[A,B] { - def ∙[C](g: C => A) = f compose g - } - - type Endo[T] = T=>T - type ~>|[A[_],B[_]] = A ~> Compose[Option, B]#Apply + implicit def toFn1[A, B](f: A => B): Fn1[A, B] = new Fn1[A, B] { + def ∙[C](g: C => A) = f compose g + } + + type Endo[T] = T => T + type ~>|[A[_], B[_]] = A ~> Compose[Option, B]#Apply } object TypeFunctions extends TypeFunctions -trait ~>[-A[_], +B[_]] -{ outer => - def apply[T](a: A[T]): B[T] - // directly on ~> because of type inference limitations - final def ∙[C[_]](g: C ~> A): C ~> B = new (C ~> B) { def apply[T](c: C[T]) = outer.apply(g(c)) } - final def ∙[C,D](g: C => D)(implicit ev: D <:< A[D]): C => B[D] = i => apply(ev(g(i)) ) - final def fn[T] = (t: A[T]) => apply[T](t) +trait ~>[-A[_], +B[_]] { outer => + def apply[T](a: A[T]): B[T] + // directly on ~> because of type inference limitations + final def ∙[C[_]](g: C ~> A): C ~> B = new (C ~> B) { def apply[T](c: C[T]) = outer.apply(g(c)) } + final def ∙[C, D](g: C => D)(implicit ev: D <:< A[D]): C => B[D] = i => apply(ev(g(i))) + final def fn[T] = (t: A[T]) => apply[T](t) } -object ~> -{ - import TypeFunctions._ - val Id: Id ~> Id = new (Id ~> Id) { def apply[T](a: T): T = a } - implicit def tcIdEquals: (Id ~> Id) = Id +object ~> { + import TypeFunctions._ + val Id: Id ~> Id = new (Id ~> Id) { def apply[T](a: T): T = a } + implicit def tcIdEquals: (Id ~> Id) = Id } trait Fn1[A, B] { - def ∙[C](g: C => A): C => B + def ∙[C](g: C => A): C => B } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/Types.scala b/util/collection/src/main/scala/sbt/Types.scala index d3a3420b0..29994f3d1 100644 --- a/util/collection/src/main/scala/sbt/Types.scala +++ b/util/collection/src/main/scala/sbt/Types.scala @@ -5,9 +5,8 @@ package sbt object Types extends Types -trait Types extends TypeFunctions -{ - val :^: = 
KCons - type :+:[H, T <: HList] = HCons[H,T] - val :+: = HCons +trait Types extends TypeFunctions { + val :^: = KCons + type :+:[H, T <: HList] = HCons[H, T] + val :+: = HCons } diff --git a/util/collection/src/main/scala/sbt/Util.scala b/util/collection/src/main/scala/sbt/Util.scala index 27b32dd87..befc7b5a9 100644 --- a/util/collection/src/main/scala/sbt/Util.scala +++ b/util/collection/src/main/scala/sbt/Util.scala @@ -5,41 +5,39 @@ package sbt import java.util.Locale -object Util -{ - def makeList[T](size: Int, value: T): List[T] = List.fill(size)(value) +object Util { + def makeList[T](size: Int, value: T): List[T] = List.fill(size)(value) - def separateE[A,B](ps: Seq[Either[A,B]]): (Seq[A], Seq[B]) = - separate(ps)(Types.idFun) + def separateE[A, B](ps: Seq[Either[A, B]]): (Seq[A], Seq[B]) = + separate(ps)(Types.idFun) - def separate[T,A,B](ps: Seq[T])(f: T => Either[A,B]): (Seq[A], Seq[B]) = - { - val (a,b) = ((Nil: Seq[A], Nil: Seq[B]) /: ps)( (xs, y) => prependEither(xs, f(y)) ) - (a.reverse, b.reverse) - } + def separate[T, A, B](ps: Seq[T])(f: T => Either[A, B]): (Seq[A], Seq[B]) = + { + val (a, b) = ((Nil: Seq[A], Nil: Seq[B]) /: ps)((xs, y) => prependEither(xs, f(y))) + (a.reverse, b.reverse) + } - def prependEither[A,B](acc: (Seq[A], Seq[B]), next: Either[A,B]): (Seq[A], Seq[B]) = - next match - { - case Left(l) => (l +: acc._1, acc._2) - case Right(r) => (acc._1, r +: acc._2) - } + def prependEither[A, B](acc: (Seq[A], Seq[B]), next: Either[A, B]): (Seq[A], Seq[B]) = + next match { + case Left(l) => (l +: acc._1, acc._2) + case Right(r) => (acc._1, r +: acc._2) + } - def pairID[A,B] = (a: A, b: B) => (a,b) + def pairID[A, B] = (a: A, b: B) => (a, b) - private[this] lazy val Hypen = """-(\p{javaLowerCase})""".r - def hasHyphen(s: String): Boolean = s.indexOf('-') >= 0 - @deprecated("Use the properly spelled version: hyphenToCamel", "0.13.0") - def hypenToCamel(s: String): String = hyphenToCamel(s) - def hyphenToCamel(s: String): String = - 
if(hasHyphen(s)) - Hypen.replaceAllIn(s, _.group(1).toUpperCase(Locale.ENGLISH)) - else - s + private[this] lazy val Hypen = """-(\p{javaLowerCase})""".r + def hasHyphen(s: String): Boolean = s.indexOf('-') >= 0 + @deprecated("Use the properly spelled version: hyphenToCamel", "0.13.0") + def hypenToCamel(s: String): String = hyphenToCamel(s) + def hyphenToCamel(s: String): String = + if (hasHyphen(s)) + Hypen.replaceAllIn(s, _.group(1).toUpperCase(Locale.ENGLISH)) + else + s - private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r - def camelToHypen(s: String): String = - Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase(Locale.ENGLISH)) + private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r + def camelToHypen(s: String): String = + Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase(Locale.ENGLISH)) - def quoteIfKeyword(s: String): String = if(ScalaKeywords.values(s)) '`' + s + '`' else s + def quoteIfKeyword(s: String): String = if (ScalaKeywords.values(s)) '`' + s + '`' else s } diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 9fba225f4..8f9fc219f 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -3,144 +3,137 @@ */ package sbt - import jline.console.ConsoleReader - import jline.console.history.{FileHistory, MemoryHistory} - import java.io.{File, InputStream, PrintWriter} - import complete.Parser - import java.util.concurrent.atomic.AtomicBoolean +import jline.console.ConsoleReader +import jline.console.history.{ FileHistory, MemoryHistory } +import java.io.{ File, InputStream, PrintWriter } +import complete.Parser +import java.util.concurrent.atomic.AtomicBoolean -abstract class JLine extends LineReader -{ - protected[this] val handleCONT: Boolean - protected[this] val reader: ConsoleReader +abstract class JLine extends LineReader { + 
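The hyphen/camel conversions in the `Util.scala` hunk above boil down to two regex rewrites. A standalone sketch (using correctly spelled names; the misspelled `hypenToCamel` in the source survives only as a deprecated alias):

```scala
import java.util.Locale

// "-x" (hyphen followed by a lowercase letter) becomes the uppercased letter.
val Hyphen = """-(\p{javaLowerCase})""".r
def hyphenToCamel(s: String): String =
  Hyphen.replaceAllIn(s, m => m.group(1).toUpperCase(Locale.ENGLISH))

// A lowercase/uppercase boundary gets a hyphen inserted.
val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r
def camelToHyphen(s: String): String =
  Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase(Locale.ENGLISH))
```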
protected[this] val handleCONT: Boolean + protected[this] val reader: ConsoleReader - def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } + def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } - private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]) = - readLineWithHistory(prompt, mask) match - { - case null => None - case x => Some(x.trim) - } + private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]) = + readLineWithHistory(prompt, mask) match { + case null => None + case x => Some(x.trim) + } - private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): String = - reader.getHistory match - { - case fh: FileHistory => - try { readLineDirect(prompt, mask) } - finally { fh.flush() } - case _ => readLineDirect(prompt, mask) - } + private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): String = + reader.getHistory match { + case fh: FileHistory => + try { readLineDirect(prompt, mask) } + finally { fh.flush() } + case _ => readLineDirect(prompt, mask) + } - private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = - if(handleCONT) - Signals.withHandler(() => resume(), signal = Signals.CONT)( () => readLineDirectRaw(prompt, mask) ) - else - readLineDirectRaw(prompt, mask) - private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = - { - val newprompt = handleMultilinePrompt(prompt) - mask match { - case Some(m) => reader.readLine(newprompt, m) - case None => reader.readLine(newprompt) - } - } + private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = + if (handleCONT) + Signals.withHandler(() => resume(), signal = Signals.CONT)(() => readLineDirectRaw(prompt, mask)) + else + readLineDirectRaw(prompt, mask) + private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = + { + val 
newprompt = handleMultilinePrompt(prompt) + mask match { + case Some(m) => reader.readLine(newprompt, m) + case None => reader.readLine(newprompt) + } + } - private[this] def handleMultilinePrompt(prompt: String): String = { - val lines = """\r?\n""".r.split(prompt) - lines.size match { - case 0 | 1 => prompt - case _ => reader.print(lines.init.mkString("\n") + "\n"); lines.last; - } - } + private[this] def handleMultilinePrompt(prompt: String): String = { + val lines = """\r?\n""".r.split(prompt) + lines.size match { + case 0 | 1 => prompt + case _ => reader.print(lines.init.mkString("\n") + "\n"); lines.last; + } + } - private[this] def resume() - { - jline.TerminalFactory.reset - JLine.terminal.init - reader.drawLine() - reader.flush() - } + private[this] def resume() { + jline.TerminalFactory.reset + JLine.terminal.init + reader.drawLine() + reader.flush() + } } -private object JLine -{ - private[this] val TerminalProperty = "jline.terminal" +private object JLine { + private[this] val TerminalProperty = "jline.terminal" - fixTerminalProperty() + fixTerminalProperty() - // translate explicit class names to type in order to support - // older Scala, since it shaded classes but not the system property - private[sbt] def fixTerminalProperty() { - val newValue = System.getProperty(TerminalProperty) match { - case "jline.UnixTerminal" => "unix" - case null if System.getProperty("sbt.cygwin") != null => "unix" - case "jline.WindowsTerminal" => "windows" - case "jline.AnsiWindowsTerminal" => "windows" - case "jline.UnsupportedTerminal" => "none" - case x => x - } - if(newValue != null) System.setProperty(TerminalProperty, newValue) - } + // translate explicit class names to type in order to support + // older Scala, since it shaded classes but not the system property + private[sbt] def fixTerminalProperty() { + val newValue = System.getProperty(TerminalProperty) match { + case "jline.UnixTerminal" => "unix" + case null if System.getProperty("sbt.cygwin") != null => 
"unix" + case "jline.WindowsTerminal" => "windows" + case "jline.AnsiWindowsTerminal" => "windows" + case "jline.UnsupportedTerminal" => "none" + case x => x + } + if (newValue != null) System.setProperty(TerminalProperty, newValue) + } - // When calling this, ensure that enableEcho has been or will be called. - // TerminalFactory.get will initialize the terminal to disable echo. - private def terminal = jline.TerminalFactory.get - private def withTerminal[T](f: jline.Terminal => T): T = - synchronized - { - val t = terminal - t.synchronized { f(t) } - } - /** For accessing the JLine Terminal object. - * This ensures synchronized access as well as re-enabling echo after getting the Terminal. */ - def usingTerminal[T](f: jline.Terminal => T): T = - withTerminal { t => - t.restore - f(t) - } - def createReader(): ConsoleReader = createReader(None) - def createReader(historyPath: Option[File]): ConsoleReader = - usingTerminal { t => - val cr = new ConsoleReader - cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650 - cr.setBellEnabled(false) - val h = historyPath match { - case None => new MemoryHistory - case Some(file) => new FileHistory(file) - } - h.setMaxSize(MaxHistorySize) - cr.setHistory(h) - cr - } - def withJLine[T](action: => T): T = - withTerminal { t => - t.init - try { action } - finally { t.restore } - } + // When calling this, ensure that enableEcho has been or will be called. + // TerminalFactory.get will initialize the terminal to disable echo. + private def terminal = jline.TerminalFactory.get + private def withTerminal[T](f: jline.Terminal => T): T = + synchronized { + val t = terminal + t.synchronized { f(t) } + } + /** + * For accessing the JLine Terminal object. + * This ensures synchronized access as well as re-enabling echo after getting the Terminal. 
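The `fixTerminalProperty` hunk above is, at its core, a pure translation table from old shaded jline class names to the short terminal types newer jline expects. Extracted as a function for illustration (the name `translateTerminal` is hypothetical, not part of sbt's API):

```scala
// Maps an explicit jline class name (possibly null) to the short terminal type.
def translateTerminal(oldValue: String, cygwin: Boolean): String =
  oldValue match {
    case "jline.UnixTerminal" => "unix"
    case null if cygwin => "unix"
    case "jline.WindowsTerminal" | "jline.AnsiWindowsTerminal" => "windows"
    case "jline.UnsupportedTerminal" => "none"
    case x => x // unknown values (including null) pass through unchanged
  }
```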
+ */ + def usingTerminal[T](f: jline.Terminal => T): T = + withTerminal { t => + t.restore + f(t) + } + def createReader(): ConsoleReader = createReader(None) + def createReader(historyPath: Option[File]): ConsoleReader = + usingTerminal { t => + val cr = new ConsoleReader + cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650 + cr.setBellEnabled(false) + val h = historyPath match { + case None => new MemoryHistory + case Some(file) => new FileHistory(file) + } + h.setMaxSize(MaxHistorySize) + cr.setHistory(h) + cr + } + def withJLine[T](action: => T): T = + withTerminal { t => + t.init + try { action } + finally { t.restore } + } - def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) - val MaxHistorySize = 500 - val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") && Signals.supported(Signals.CONT) + def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) + val MaxHistorySize = 500 + val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") && Signals.supported(Signals.CONT) } -trait LineReader -{ - def readLine(prompt: String, mask: Option[Char] = None): Option[String] +trait LineReader { + def readLine(prompt: String, mask: Option[Char] = None): Option[String] } -final class FullReader(historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT) extends JLine -{ - protected[this] val reader = - { - val cr = JLine.createReader(historyPath) - sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) - cr - } +final class FullReader(historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT) extends JLine { + protected[this] val reader = + { + val cr = JLine.createReader(historyPath) + sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) + cr + } } -class SimpleReader private[sbt] 
(historyPath: Option[File], val handleCONT: Boolean) extends JLine -{ - protected[this] val reader = JLine.createReader(historyPath) +class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean) extends JLine { + protected[this] val reader = JLine.createReader(historyPath) } object SimpleReader extends SimpleReader(None, JLine.HandleCONT) diff --git a/util/complete/src/main/scala/sbt/complete/Completions.scala b/util/complete/src/main/scala/sbt/complete/Completions.scala index 594a9b9da..5237ad26d 100644 --- a/util/complete/src/main/scala/sbt/complete/Completions.scala +++ b/util/complete/src/main/scala/sbt/complete/Completions.scala @@ -4,148 +4,141 @@ package sbt.complete /** -* Represents a set of completions. -* It exists instead of implicitly defined operations on top of Set[Completion] -* for laziness. -*/ -sealed trait Completions -{ - def get: Set[Completion] - final def x(o: Completions): Completions = flatMap(_ x o) - final def ++(o: Completions): Completions = Completions( get ++ o.get ) - final def +:(o: Completion): Completions = Completions(get + o) - final def filter(f: Completion => Boolean): Completions = Completions(get filter f) - final def filterS(f: String => Boolean): Completions = filter(c => f(c.append)) - override def toString = get.mkString("Completions(",",",")") - final def flatMap(f: Completion => Completions): Completions = Completions(get.flatMap(c => f(c).get)) - final def map(f: Completion => Completion): Completions = Completions(get map f) - override final def hashCode = get.hashCode - override final def equals(o: Any) = o match { case c: Completions => get == c.get; case _ => false } + * Represents a set of completions. + * It exists instead of implicitly defined operations on top of Set[Completion] + * for laziness. 
+ */ +sealed trait Completions { + def get: Set[Completion] + final def x(o: Completions): Completions = flatMap(_ x o) + final def ++(o: Completions): Completions = Completions(get ++ o.get) + final def +:(o: Completion): Completions = Completions(get + o) + final def filter(f: Completion => Boolean): Completions = Completions(get filter f) + final def filterS(f: String => Boolean): Completions = filter(c => f(c.append)) + override def toString = get.mkString("Completions(", ",", ")") + final def flatMap(f: Completion => Completions): Completions = Completions(get.flatMap(c => f(c).get)) + final def map(f: Completion => Completion): Completions = Completions(get map f) + override final def hashCode = get.hashCode + override final def equals(o: Any) = o match { case c: Completions => get == c.get; case _ => false } } -object Completions -{ - /** Returns a lazy Completions instance using the provided Completion Set. */ - def apply(cs: => Set[Completion]): Completions = new Completions { - lazy val get = cs - } +object Completions { + /** Returns a lazy Completions instance using the provided Completion Set. */ + def apply(cs: => Set[Completion]): Completions = new Completions { + lazy val get = cs + } - /** Returns a strict Completions instance using the provided Completion Set. */ - def strict(cs: Set[Completion]): Completions = apply(cs) + /** Returns a strict Completions instance using the provided Completion Set. */ + def strict(cs: Set[Completion]): Completions = apply(cs) - /** No suggested completions, not even the empty Completion. - * This typically represents invalid input. */ - val nil: Completions = strict(Set.empty) + /** + * No suggested completions, not even the empty Completion. + * This typically represents invalid input. + */ + val nil: Completions = strict(Set.empty) - /** Only includes an empty Suggestion. - * This typically represents valid input that either has no completions or accepts no further input. 
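The `Completions.apply` factory above takes its `Set` by name and caches it in a `lazy val`, which is the laziness the scaladoc refers to. A minimal sketch of the pattern (using plain `String` in place of `Completion` for brevity):

```scala
trait Completions {
  def get: Set[String]
}
object Completions {
  // By-name parameter plus lazy val: `cs` is evaluated at most once, on first `get`.
  def apply(cs: => Set[String]): Completions = new Completions {
    lazy val get = cs
  }
}

var evaluated = false
val cs = Completions { evaluated = true; Set("compile", "clean") }
// `evaluated` is still false here; forcing `cs.get` flips it exactly once.
```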
*/ - val empty: Completions = strict(Set.empty + Completion.empty) + /** + * Only includes an empty Suggestion. + * This typically represents valid input that either has no completions or accepts no further input. + */ + val empty: Completions = strict(Set.empty + Completion.empty) - /** Returns a strict Completions instance containing only the provided Completion.*/ - def single(c: Completion): Completions = strict(Set.empty + c) + /** Returns a strict Completions instance containing only the provided Completion.*/ + def single(c: Completion): Completions = strict(Set.empty + c) } /** -* Represents a completion. -* The abstract members `display` and `append` are best explained with an example. -* -* Assuming space-delimited tokens, processing this: -* am is are w -* could produce these Completions: -* Completion { display = "was"; append = "as" } -* Completion { display = "were"; append = "ere" } -* to suggest the tokens "was" and "were". -* -* In this way, two pieces of information are preserved: -* 1) what needs to be appended to the current input if a completion is selected -* 2) the full token being completed, which is useful for presenting a user with choices to select -*/ -sealed trait Completion -{ - /** The proposed suffix to append to the existing input to complete the last token in the input.*/ - def append: String - /** The string to present to the user to represent the full token being suggested.*/ - def display: String - /** True if this Completion is suggesting the empty string.*/ - def isEmpty: Boolean + * Represents a completion. + * The abstract members `display` and `append` are best explained with an example. + * + * Assuming space-delimited tokens, processing this: + * am is are w + * could produce these Completions: + * Completion { display = "was"; append = "as" } + * Completion { display = "were"; append = "ere" } + * to suggest the tokens "was" and "were". 
+ * + * In this way, two pieces of information are preserved: + * 1) what needs to be appended to the current input if a completion is selected + * 2) the full token being completed, which is useful for presenting a user with choices to select + */ +sealed trait Completion { + /** The proposed suffix to append to the existing input to complete the last token in the input.*/ + def append: String + /** The string to present to the user to represent the full token being suggested.*/ + def display: String + /** True if this Completion is suggesting the empty string.*/ + def isEmpty: Boolean - /** Appends the completions in `o` with the completions in this Completion.*/ - def ++(o: Completion): Completion = Completion.concat(this, o) - final def x(o: Completions): Completions = if(Completion evaluatesRight this) o.map(this ++ _) else Completions.strict(Set.empty + this) - override final lazy val hashCode = Completion.hashCode(this) - override final def equals(o: Any) = o match { case c: Completion => Completion.equal(this, c); case _ => false } + /** Appends the completions in `o` with the completions in this Completion.*/ + def ++(o: Completion): Completion = Completion.concat(this, o) + final def x(o: Completions): Completions = if (Completion evaluatesRight this) o.map(this ++ _) else Completions.strict(Set.empty + this) + override final lazy val hashCode = Completion.hashCode(this) + override final def equals(o: Any) = o match { case c: Completion => Completion.equal(this, c); case _ => false } } -final class DisplayOnly(val display: String) extends Completion -{ - def isEmpty = display.isEmpty - def append = "" - override def toString = "{" + display + "}" +final class DisplayOnly(val display: String) extends Completion { + def isEmpty = display.isEmpty + def append = "" + override def toString = "{" + display + "}" } -final class Token(val display: String, val append: String) extends Completion -{ - @deprecated("Retained only for compatibility. 
All information is now in `display` and `append`.", "0.12.1") - lazy val prepend = display.stripSuffix(append) - def isEmpty = display.isEmpty && append.isEmpty - override final def toString = "[" + display + "]++" + append +final class Token(val display: String, val append: String) extends Completion { + @deprecated("Retained only for compatibility. All information is now in `display` and `append`.", "0.12.1") + lazy val prepend = display.stripSuffix(append) + def isEmpty = display.isEmpty && append.isEmpty + override final def toString = "[" + display + "]++" + append } -final class Suggestion(val append: String) extends Completion -{ - def isEmpty = append.isEmpty - def display = append - override def toString = append +final class Suggestion(val append: String) extends Completion { + def isEmpty = append.isEmpty + def display = append + override def toString = append } -object Completion -{ - def concat(a: Completion, b: Completion): Completion = - (a,b) match - { - case (as: Suggestion, bs: Suggestion) => suggestion(as.append + bs.append) - case (at: Token, _) if at.append.isEmpty => b - case _ if a.isEmpty => b - case _ => a - } - def evaluatesRight(a: Completion): Boolean = - a match - { - case _: Suggestion => true - case at: Token if at.append.isEmpty => true - case _ => a.isEmpty - } +object Completion { + def concat(a: Completion, b: Completion): Completion = + (a, b) match { + case (as: Suggestion, bs: Suggestion) => suggestion(as.append + bs.append) + case (at: Token, _) if at.append.isEmpty => b + case _ if a.isEmpty => b + case _ => a + } + def evaluatesRight(a: Completion): Boolean = + a match { + case _: Suggestion => true + case at: Token if at.append.isEmpty => true + case _ => a.isEmpty + } - def equal(a: Completion, b: Completion): Boolean = - (a,b) match - { - case (as: Suggestion, bs: Suggestion) => as.append == bs.append - case (ad: DisplayOnly, bd: DisplayOnly) => ad.display == bd.display - case (at: Token, bt: Token) => at.display == 
bt.display && at.append == bt.append - case _ => false - } + def equal(a: Completion, b: Completion): Boolean = + (a, b) match { + case (as: Suggestion, bs: Suggestion) => as.append == bs.append + case (ad: DisplayOnly, bd: DisplayOnly) => ad.display == bd.display + case (at: Token, bt: Token) => at.display == bt.display && at.append == bt.append + case _ => false + } - def hashCode(a: Completion): Int = - a match - { - case as: Suggestion => (0, as.append).hashCode - case ad: DisplayOnly => (1, ad.display).hashCode - case at: Token => (2, at.display, at.append).hashCode - } + def hashCode(a: Completion): Int = + a match { + case as: Suggestion => (0, as.append).hashCode + case ad: DisplayOnly => (1, ad.display).hashCode + case at: Token => (2, at.display, at.append).hashCode + } - val empty: Completion = suggestion("") - def single(c: Char): Completion = suggestion(c.toString) - - // TODO: make strict in 0.13.0 to match DisplayOnly - def displayOnly(value: => String): Completion = new DisplayOnly(value) - @deprecated("Use displayOnly.", "0.12.1") - def displayStrict(value: String): Completion = displayOnly(value) + val empty: Completion = suggestion("") + def single(c: Char): Completion = suggestion(c.toString) - // TODO: make strict in 0.13.0 to match Token - def token(prepend: => String, append: => String): Completion = new Token(prepend+append, append) - @deprecated("Use token.", "0.12.1") - def tokenStrict(prepend: String, append: String): Completion = token(prepend, append) + // TODO: make strict in 0.13.0 to match DisplayOnly + def displayOnly(value: => String): Completion = new DisplayOnly(value) + @deprecated("Use displayOnly.", "0.12.1") + def displayStrict(value: String): Completion = displayOnly(value) - /** @since 0.12.1 */ - def tokenDisplay(append: String, display: String): Completion = new Token(display, append) + // TODO: make strict in 0.13.0 to match Token + def token(prepend: => String, append: => String): Completion = new Token(prepend + 
append, append) + @deprecated("Use token.", "0.12.1") + def tokenStrict(prepend: String, append: String): Completion = token(prepend, append) - // TODO: make strict in 0.13.0 to match Suggestion - def suggestion(value: => String): Completion = new Suggestion(value) - @deprecated("Use suggestion.", "0.12.1") - def suggestStrict(value: String): Completion = suggestion(value) + /** @since 0.12.1 */ + def tokenDisplay(append: String, display: String): Completion = new Token(display, append) + + // TODO: make strict in 0.13.0 to match Suggestion + def suggestion(value: => String): Completion = new Suggestion(value) + @deprecated("Use suggestion.", "0.12.1") + def suggestStrict(value: String): Completion = suggestion(value) } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/EditDistance.scala b/util/complete/src/main/scala/sbt/complete/EditDistance.scala index 5e4cb277f..95ed0c91f 100644 --- a/util/complete/src/main/scala/sbt/complete/EditDistance.scala +++ b/util/complete/src/main/scala/sbt/complete/EditDistance.scala @@ -1,41 +1,41 @@ package sbt.complete - import java.lang.Character.{toLowerCase => lower} +import java.lang.Character.{ toLowerCase => lower } /** @author Paul Phillips*/ object EditDistance { - /** Translated from the java version at - * http://www.merriampark.com/ld.htm - * which is declared to be public domain. - */ - def levenshtein(s: String, t: String, insertCost: Int = 1, deleteCost: Int = 1, subCost: Int = 1, transposeCost: Int = 1, matchCost: Int = 0, caseCost: Int = 1, transpositions: Boolean = false): Int = { - val n = s.length - val m = t.length - if (n == 0) return m - if (m == 0) return n + /** + * Translated from the java version at + * http://www.merriampark.com/ld.htm + * which is declared to be public domain. 
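The `levenshtein` method in the EditDistance hunk here is the standard dynamic-programming edit distance, generalized with configurable costs. A standalone sketch of its core, assuming unit costs and no transpositions:

```scala
// Edit distance between s and t with unit insert/delete/substitute costs.
def levenshtein(s: String, t: String): Int = {
  val n = s.length
  val m = t.length
  if (n == 0) m
  else if (m == 0) n
  else {
    // d(i)(j) = distance between the first i chars of s and first j chars of t
    val d = Array.ofDim[Int](n + 1, m + 1)
    for (i <- 0 to n) d(i)(0) = i
    for (j <- 0 to m) d(0)(j) = j
    for (i <- 1 to n; j <- 1 to m) {
      val cost = if (s(i - 1) == t(j - 1)) 0 else 1
      d(i)(j) = (d(i - 1)(j) + 1) min (d(i)(j - 1) + 1) min (d(i - 1)(j - 1) + cost)
    }
    d(n)(m)
  }
}
```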
+ */ + def levenshtein(s: String, t: String, insertCost: Int = 1, deleteCost: Int = 1, subCost: Int = 1, transposeCost: Int = 1, matchCost: Int = 0, caseCost: Int = 1, transpositions: Boolean = false): Int = { + val n = s.length + val m = t.length + if (n == 0) return m + if (m == 0) return n - val d = Array.ofDim[Int](n + 1, m + 1) - 0 to n foreach (x => d(x)(0) = x) - 0 to m foreach (x => d(0)(x) = x) + val d = Array.ofDim[Int](n + 1, m + 1) + 0 to n foreach (x => d(x)(0) = x) + 0 to m foreach (x => d(0)(x) = x) - for (i <- 1 to n ; s_i = s(i - 1) ; j <- 1 to m) { - val t_j = t(j - 1) - val cost = if (s_i == t_j) matchCost else if(lower(s_i) == lower(t_j)) caseCost else subCost - val tcost = if (s_i == t_j) matchCost else transposeCost - + for (i <- 1 to n; s_i = s(i - 1); j <- 1 to m) { + val t_j = t(j - 1) + val cost = if (s_i == t_j) matchCost else if (lower(s_i) == lower(t_j)) caseCost else subCost + val tcost = if (s_i == t_j) matchCost else transposeCost - val c1 = d(i - 1)(j) + deleteCost - val c2 = d(i)(j - 1) + insertCost - val c3 = d(i - 1)(j - 1) + cost + val c1 = d(i - 1)(j) + deleteCost + val c2 = d(i)(j - 1) + insertCost + val c3 = d(i - 1)(j - 1) + cost - d(i)(j) = c1 min c2 min c3 + d(i)(j) = c1 min c2 min c3 - if (transpositions) { - if (i > 1 && j > 1 && s(i - 1) == t(j - 2) && s(i - 2) == t(j - 1)) - d(i)(j) = d(i)(j) min (d(i - 2)(j - 2) + cost) - } - } + if (transpositions) { + if (i > 1 && j > 1 && s(i - 1) == t(j - 2) && s(i - 2) == t(j - 1)) + d(i)(j) = d(i)(j) min (d(i - 2)(j - 2) + cost) + } + } - d(n)(m) - } + d(n)(m) + } } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala index 565a8c3f1..6d0469aa0 100644 --- a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -8,35 +8,33 @@ import sbt.IO._ * [[sbt.complete.FileExamples]] class, which 
provides a list of suggested files to the user as they press the * TAB key in the console. */ -trait ExampleSource -{ - /** - * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The - * user's input is incremented with calls to [[withAddedPrefix]]. - */ - def apply(): Iterable[String] +trait ExampleSource { + /** + * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The + * user's input is incremented with calls to [[withAddedPrefix]]. + */ + def apply(): Iterable[String] - /** - * @param addedPrefix a string that just typed in by the user. - * @return a new source of only those examples that start with the string typed by the user so far (with addition of - * the just added prefix). - */ - def withAddedPrefix(addedPrefix: String): ExampleSource + /** + * @param addedPrefix a string that just typed in by the user. + * @return a new source of only those examples that start with the string typed by the user so far (with addition of + * the just added prefix). + */ + def withAddedPrefix(addedPrefix: String): ExampleSource } /** * A convenience example source that wraps any collection of strings into a source of examples. * @param examples the examples that will be displayed to the user when they press the TAB key. 
*/ -sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource -{ - override def withAddedPrefix(addedPrefix: String): ExampleSource = FixedSetExamples(examplesWithRemovedPrefix(addedPrefix)) +sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource { + override def withAddedPrefix(addedPrefix: String): ExampleSource = FixedSetExamples(examplesWithRemovedPrefix(addedPrefix)) - override def apply(): Iterable[String] = examples + override def apply(): Iterable[String] = examples - private def examplesWithRemovedPrefix(prefix: String) = examples.collect { - case example if example startsWith prefix => example substring prefix.length - } + private def examplesWithRemovedPrefix(prefix: String) = examples.collect { + case example if example startsWith prefix => example substring prefix.length + } } /** @@ -44,19 +42,18 @@ sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSo * @param base the directory within which this class will search for completion examples. * @param prefix the part of the path already written by the user. 
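`FixedSetExamples` above narrows its example set as the user types: each added prefix filters the stored strings and strips the matched portion. The behaviour can be sketched independently of sbt:

```scala
// Simplified FixedSetExamples: completions are stored as full strings and
// progressively narrowed/trimmed as the user's prefix grows.
case class FixedSetExamples(examples: Iterable[String]) {
  def apply(): Iterable[String] = examples
  def withAddedPrefix(addedPrefix: String): FixedSetExamples =
    FixedSetExamples(examples.collect {
      case e if e.startsWith(addedPrefix) => e.substring(addedPrefix.length)
    })
}

val src = FixedSetExamples(List("compile", "clean", "console"))
```

After typing "co", only the continuations of "compile" and "console" remain.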
*/ -class FileExamples(base: File, prefix: String = "") extends ExampleSource -{ - override def apply(): Stream[String] = files(base).map(_ substring prefix.length) +class FileExamples(base: File, prefix: String = "") extends ExampleSource { + override def apply(): Stream[String] = files(base).map(_ substring prefix.length) - override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) + override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) - protected def files(directory: File): Stream[String] = { - val childPaths = directory.listFiles().toStream - val prefixedDirectChildPaths = childPaths.map(relativize(base, _).get).filter(_ startsWith prefix) - val dirsToRecurseInto = childPaths.filter(_.isDirectory).map(relativize(base, _).get).filter(dirStartsWithPrefix) - prefixedDirectChildPaths append dirsToRecurseInto.flatMap(dir => files(new File(base, dir))) - } + protected def files(directory: File): Stream[String] = { + val childPaths = directory.listFiles().toStream + val prefixedDirectChildPaths = childPaths.map(relativize(base, _).get).filter(_ startsWith prefix) + val dirsToRecurseInto = childPaths.filter(_.isDirectory).map(relativize(base, _).get).filter(dirStartsWithPrefix) + prefixedDirectChildPaths append dirsToRecurseInto.flatMap(dir => files(new File(base, dir))) + } - private def dirStartsWithPrefix(relativizedPath: String): Boolean = - (relativizedPath startsWith prefix) || (prefix startsWith relativizedPath) + private def dirStartsWithPrefix(relativizedPath: String): Boolean = + (relativizedPath startsWith prefix) || (prefix startsWith relativizedPath) } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/History.scala b/util/complete/src/main/scala/sbt/complete/History.scala index 9c36f2605..ca394abf8 100644 --- a/util/complete/src/main/scala/sbt/complete/History.scala +++ 
b/util/complete/src/main/scala/sbt/complete/History.scala @@ -4,47 +4,42 @@ package sbt package complete - import History.number - import java.io.File +import History.number +import java.io.File -final class History private(val lines: IndexedSeq[String], val path: Option[File], error: String => Unit) extends NotNull -{ - private def reversed = lines.reverse +final class History private (val lines: IndexedSeq[String], val path: Option[File], error: String => Unit) extends NotNull { + private def reversed = lines.reverse - def all: Seq[String] = lines - def size = lines.length - def !! : Option[String] = !- (1) - def apply(i: Int): Option[String] = if(0 <= i && i < size) Some( lines(i) ) else { error("Invalid history index: " + i); None } - def !(i: Int): Option[String] = apply(i) + def all: Seq[String] = lines + def size = lines.length + def !! : Option[String] = !-(1) + def apply(i: Int): Option[String] = if (0 <= i && i < size) Some(lines(i)) else { error("Invalid history index: " + i); None } + def !(i: Int): Option[String] = apply(i) - def !(s: String): Option[String] = - number(s) match - { - case Some(n) => if(n < 0) !- (-n) else apply(n) - case None => nonEmpty(s) { reversed.find(_.startsWith(s)) } - } - def !- (n: Int): Option[String] = apply(size - n - 1) + def !(s: String): Option[String] = + number(s) match { + case Some(n) => if (n < 0) !-(-n) else apply(n) + case None => nonEmpty(s) { reversed.find(_.startsWith(s)) } + } + def !-(n: Int): Option[String] = apply(size - n - 1) - def !?(s: String): Option[String] = nonEmpty(s) { reversed.drop(1).find(_.contains(s)) } + def !?(s: String): Option[String] = nonEmpty(s) { reversed.drop(1).find(_.contains(s)) } - private def nonEmpty[T](s: String)(act: => Option[T]): Option[T] = - if(s.isEmpty) - { - error("No action specified to history command") - None - } - else - act + private def nonEmpty[T](s: String)(act: => Option[T]): Option[T] = + if (s.isEmpty) { + error("No action specified to history command") + 
None + } else + act - def list(historySize: Int, show: Int): Seq[String] = - lines.toList.drop((lines.size - historySize) max 0).zipWithIndex.map { case (line, number) => " " + number + " " + line }.takeRight(show max 1) + def list(historySize: Int, show: Int): Seq[String] = + lines.toList.drop((lines.size - historySize) max 0).zipWithIndex.map { case (line, number) => " " + number + " " + line }.takeRight(show max 1) } -object History -{ - def apply(lines: Seq[String], path: Option[File], error: String => Unit): History = new History(lines.toIndexedSeq, path, error) +object History { + def apply(lines: Seq[String], path: Option[File], error: String => Unit): History = new History(lines.toIndexedSeq, path, error) - def number(s: String): Option[Int] = - try { Some(s.toInt) } - catch { case e: NumberFormatException => None } + def number(s: String): Option[Int] = + try { Some(s.toInt) } + catch { case e: NumberFormatException => None } } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala b/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala index 906aa328a..762f48c6d 100644 --- a/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala @@ -4,69 +4,70 @@ package sbt package complete - import java.io.File +import java.io.File -object HistoryCommands -{ - val Start = "!" - // second characters - val Contains = "?" - val Last = "!" - val ListCommands = ":" +object HistoryCommands { + val Start = "!" + // second characters + val Contains = "?" + val Last = "!" 
+ val ListCommands = ":" - def ContainsFull = h(Contains) - def LastFull = h(Last) - def ListFull = h(ListCommands) + def ContainsFull = h(Contains) + def LastFull = h(Last) + def ListFull = h(ListCommands) - def ListN = ListFull + "n" - def ContainsString = ContainsFull + "string" - def StartsWithString = Start + "string" - def Previous = Start + "-n" - def Nth = Start + "n" - - private def h(s: String) = Start + s - def plainCommands = Seq(ListFull, Start, LastFull, ContainsFull) + def ListN = ListFull + "n" + def ContainsString = ContainsFull + "string" + def StartsWithString = Start + "string" + def Previous = Start + "-n" + def Nth = Start + "n" - def descriptions = Seq( - LastFull -> "Execute the last command again", - ListFull -> "Show all previous commands", - ListN -> "Show the last n commands", - Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), - Previous -> "Execute the nth command before this one", - StartsWithString -> "Execute the most recent command starting with 'string'", - ContainsString -> "Execute the most recent command containing 'string'" - ) - def helpString = "History commands:\n " + (descriptions.map{ case (c,d) => c + " " + d}).mkString("\n ") - def printHelp(): Unit = - println(helpString) - def printHistory(history: complete.History, historySize: Int, show: Int): Unit = - history.list(historySize, show).foreach(println) + private def h(s: String) = Start + s + def plainCommands = Seq(ListFull, Start, LastFull, ContainsFull) - import DefaultParsers._ + def descriptions = Seq( + LastFull -> "Execute the last command again", + ListFull -> "Show all previous commands", + ListN -> "Show the last n commands", + Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), + Previous -> "Execute the nth command before this one", + StartsWithString -> "Execute the most recent command starting with 'string'", + ContainsString -> "Execute the most recent command containing 'string'" + 
) + def helpString = "History commands:\n " + (descriptions.map { case (c, d) => c + " " + d }).mkString("\n ") + def printHelp(): Unit = + println(helpString) + def printHistory(history: complete.History, historySize: Int, show: Int): Unit = + history.list(historySize, show).foreach(println) - val MaxLines = 500 - lazy val num = token(NatBasic, "") - lazy val last = Last ^^^ { execute(_ !!) } - lazy val list = ListCommands ~> (num ?? Int.MaxValue) map { show => - (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } - } - lazy val execStr = flag('?') ~ token(any.+.string, "") map { case (contains, str) => - execute(h => if(contains) h !? str else h ! str) - } - lazy val execInt = flag('-') ~ num map { case (neg, value) => - execute(h => if(neg) h !- value else h ! value) - } - lazy val help = success( (h: History) => { printHelp(); Some(Nil) } ) + import DefaultParsers._ - def execute(f: History => Option[String]): History => Option[List[String]] = (h: History) => - { - val command = f(h) - val lines = h.lines.toArray - command.foreach(lines(lines.length - 1) = _) - h.path foreach { h => IO.writeLines(h, lines) } - Some(command.toList) - } + val MaxLines = 500 + lazy val num = token(NatBasic, "") + lazy val last = Last ^^^ { execute(_ !!) } + lazy val list = ListCommands ~> (num ?? Int.MaxValue) map { show => + (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } + } + lazy val execStr = flag('?') ~ token(any.+.string, "") map { + case (contains, str) => + execute(h => if (contains) h !? str else h ! str) + } + lazy val execInt = flag('-') ~ num map { + case (neg, value) => + execute(h => if (neg) h !- value else h ! 
value) + } + lazy val help = success((h: History) => { printHelp(); Some(Nil) }) - val actionParser: Parser[complete.History => Option[List[String]]] = - Start ~> (help | last | execInt | list | execStr ) // execStr must come last + def execute(f: History => Option[String]): History => Option[List[String]] = (h: History) => + { + val command = f(h) + val lines = h.lines.toArray + command.foreach(lines(lines.length - 1) = _) + h.path foreach { h => IO.writeLines(h, lines) } + Some(command.toList) + } + + val actionParser: Parser[complete.History => Option[List[String]]] = + Start ~> (help | last | execInt | list | execStr) // execStr must come last } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala index 1aae8e826..1d876f0ba 100644 --- a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala +++ b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala @@ -3,157 +3,154 @@ */ package sbt.complete - import jline.console.ConsoleReader - import jline.console.completer.{CandidateListCompletionHandler,Completer,CompletionHandler} - import scala.annotation.tailrec - import collection.JavaConversions +import jline.console.ConsoleReader +import jline.console.completer.{ CandidateListCompletionHandler, Completer, CompletionHandler } +import scala.annotation.tailrec +import collection.JavaConversions -object JLineCompletion -{ - def installCustomCompletor(reader: ConsoleReader, parser: Parser[_]): Unit = - installCustomCompletor(reader)(parserAsCompletor(parser)) - def installCustomCompletor(reader: ConsoleReader)(complete: (String, Int) => (Seq[String], Seq[String])): Unit = - installCustomCompletor(customCompletor(complete), reader) - def installCustomCompletor(complete: (ConsoleReader, Int) => Boolean, reader: ConsoleReader): Unit = - { - reader.removeCompleter(DummyCompletor) - reader.addCompleter(DummyCompletor) - 
reader.setCompletionHandler(new CustomHandler(complete)) - } +object JLineCompletion { + def installCustomCompletor(reader: ConsoleReader, parser: Parser[_]): Unit = + installCustomCompletor(reader)(parserAsCompletor(parser)) + def installCustomCompletor(reader: ConsoleReader)(complete: (String, Int) => (Seq[String], Seq[String])): Unit = + installCustomCompletor(customCompletor(complete), reader) + def installCustomCompletor(complete: (ConsoleReader, Int) => Boolean, reader: ConsoleReader): Unit = + { + reader.removeCompleter(DummyCompletor) + reader.addCompleter(DummyCompletor) + reader.setCompletionHandler(new CustomHandler(complete)) + } - private[this] final class CustomHandler(completeImpl: (ConsoleReader, Int) => Boolean) extends CompletionHandler - { - private[this] var previous: Option[(String,Int)] = None - private[this] var level: Int = 1 - override def complete(reader: ConsoleReader, candidates: java.util.List[CharSequence], position: Int) = { - val current = Some(bufferSnapshot(reader)) - level = if(current == previous) level + 1 else 1 - previous = current - try completeImpl(reader, level) - catch { case e: Exception => - reader.print("\nException occurred while determining completions.") - e.printStackTrace() - false - } - } - } - - // always provides dummy completions so that the custom completion handler gets called - // (ConsoleReader doesn't call the handler if there aren't any completions) - // the custom handler will then throw away the candidates and call the custom function - private[this] final object DummyCompletor extends Completer - { - override def complete(buffer: String, cursor: Int, candidates: java.util.List[CharSequence]): Int = - { - candidates.asInstanceOf[java.util.List[String]] add "dummy" - 0 - } - } + private[this] final class CustomHandler(completeImpl: (ConsoleReader, Int) => Boolean) extends CompletionHandler { + private[this] var previous: Option[(String, Int)] = None + private[this] var level: Int = 1 + override def 
complete(reader: ConsoleReader, candidates: java.util.List[CharSequence], position: Int) = { + val current = Some(bufferSnapshot(reader)) + level = if (current == previous) level + 1 else 1 + previous = current + try completeImpl(reader, level) + catch { + case e: Exception => + reader.print("\nException occurred while determining completions.") + e.printStackTrace() + false + } + } + } - def parserAsCompletor(p: Parser[_]): (String, Int) => (Seq[String], Seq[String]) = - (str, level) => convertCompletions(Parser.completions(p, str, level)) + // always provides dummy completions so that the custom completion handler gets called + // (ConsoleReader doesn't call the handler if there aren't any completions) + // the custom handler will then throw away the candidates and call the custom function + private[this] final object DummyCompletor extends Completer { + override def complete(buffer: String, cursor: Int, candidates: java.util.List[CharSequence]): Int = + { + candidates.asInstanceOf[java.util.List[String]] add "dummy" + 0 + } + } - def convertCompletions(c: Completions): (Seq[String], Seq[String]) = - { - val cs = c.get - if(cs.isEmpty) - (Nil, "{invalid input}" :: Nil) - else - convertCompletions(cs) - } - def convertCompletions(cs: Set[Completion]): (Seq[String], Seq[String]) = - { - val (insert, display) = - ( (Set.empty[String], Set.empty[String]) /: cs) { case ( t @ (insert,display), comp) => - if(comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display)) - } - (insert.toSeq, display.toSeq.sorted) - } - def appendNonEmpty(set: Set[String], add: String) = if(add.trim.isEmpty) set else set + add + def parserAsCompletor(p: Parser[_]): (String, Int) => (Seq[String], Seq[String]) = + (str, level) => convertCompletions(Parser.completions(p, str, level)) - def customCompletor(f: (String, Int) => (Seq[String], Seq[String])): (ConsoleReader, Int) => Boolean = - (reader, level) => { - val success = complete(beforeCursor(reader), reader => 
f(reader, level), reader) - reader.flush() - success - } + def convertCompletions(c: Completions): (Seq[String], Seq[String]) = + { + val cs = c.get + if (cs.isEmpty) + (Nil, "{invalid input}" :: Nil) + else + convertCompletions(cs) + } + def convertCompletions(cs: Set[Completion]): (Seq[String], Seq[String]) = + { + val (insert, display) = + ((Set.empty[String], Set.empty[String]) /: cs) { + case (t @ (insert, display), comp) => + if (comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display)) + } + (insert.toSeq, display.toSeq.sorted) + } + def appendNonEmpty(set: Set[String], add: String) = if (add.trim.isEmpty) set else set + add - def bufferSnapshot(reader: ConsoleReader): (String, Int) = - { - val b = reader.getCursorBuffer - (b.buffer.toString, b.cursor) - } - def beforeCursor(reader: ConsoleReader): String = - { - val b = reader.getCursorBuffer - b.buffer.substring(0, b.cursor) - } + def customCompletor(f: (String, Int) => (Seq[String], Seq[String])): (ConsoleReader, Int) => Boolean = + (reader, level) => { + val success = complete(beforeCursor(reader), reader => f(reader, level), reader) + reader.flush() + success + } - // returns false if there was nothing to insert and nothing to display - def complete(beforeCursor: String, completions: String => (Seq[String],Seq[String]), reader: ConsoleReader): Boolean = - { - val (insert,display) = completions(beforeCursor) - val common = commonPrefix(insert) - if(common.isEmpty) - if(display.isEmpty) - () - else - showCompletions(display, reader) - else - appendCompletion(common, reader) + def bufferSnapshot(reader: ConsoleReader): (String, Int) = + { + val b = reader.getCursorBuffer + (b.buffer.toString, b.cursor) + } + def beforeCursor(reader: ConsoleReader): String = + { + val b = reader.getCursorBuffer + b.buffer.substring(0, b.cursor) + } - !(common.isEmpty && display.isEmpty) - } + // returns false if there was nothing to insert and nothing to display + def complete(beforeCursor: String, 
completions: String => (Seq[String], Seq[String]), reader: ConsoleReader): Boolean = + { + val (insert, display) = completions(beforeCursor) + val common = commonPrefix(insert) + if (common.isEmpty) + if (display.isEmpty) + () + else + showCompletions(display, reader) + else + appendCompletion(common, reader) - def appendCompletion(common: String, reader: ConsoleReader) - { - reader.getCursorBuffer.write(common) - reader.redrawLine() - } + !(common.isEmpty && display.isEmpty) + } - /** `display` is assumed to be the exact strings requested to be displayed. - * In particular, duplicates should have been removed already. */ - def showCompletions(display: Seq[String], reader: ConsoleReader) - { - printCompletions(display, reader) - reader.drawLine() - } - def printCompletions(cs: Seq[String], reader: ConsoleReader) - { - val print = shouldPrint(cs, reader) - reader.println() - if(print) printLinesAndColumns(cs, reader) - } - def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader) - { - val (lines, columns) = cs partition hasNewline - for(line <- lines) { - reader.print(line) - if(line.charAt(line.length - 1) != '\n') - reader.println() - } - reader.printColumns(JavaConversions.seqAsJavaList(columns.map(_.trim))) - } - def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0 - def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = - { - val size = cs.size - (size <= reader.getAutoprintThreshold) || - confirm("Display all %d possibilities? 
(y or n) ".format(size), 'y', 'n', reader) - } - def confirm(prompt: String, trueC: Char, falseC: Char, reader: ConsoleReader): Boolean = - { - reader.println() - reader.print(prompt) - reader.flush() - reader.readCharacter(trueC, falseC) == trueC - } + def appendCompletion(common: String, reader: ConsoleReader) { + reader.getCursorBuffer.write(common) + reader.redrawLine() + } - def commonPrefix(s: Seq[String]): String = if(s.isEmpty) "" else s reduceLeft commonPrefix - def commonPrefix(a: String, b: String): String = - { - val len = a.length min b.length - @tailrec def loop(i: Int): Int = if(i >= len) len else if(a(i) != b(i)) i else loop(i+1) - a.substring(0, loop(0)) - } + /** + * `display` is assumed to be the exact strings requested to be displayed. + * In particular, duplicates should have been removed already. + */ + def showCompletions(display: Seq[String], reader: ConsoleReader) { + printCompletions(display, reader) + reader.drawLine() + } + def printCompletions(cs: Seq[String], reader: ConsoleReader) { + val print = shouldPrint(cs, reader) + reader.println() + if (print) printLinesAndColumns(cs, reader) + } + def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader) { + val (lines, columns) = cs partition hasNewline + for (line <- lines) { + reader.print(line) + if (line.charAt(line.length - 1) != '\n') + reader.println() + } + reader.printColumns(JavaConversions.seqAsJavaList(columns.map(_.trim))) + } + def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0 + def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = + { + val size = cs.size + (size <= reader.getAutoprintThreshold) || + confirm("Display all %d possibilities? 
(y or n) ".format(size), 'y', 'n', reader) + } + def confirm(prompt: String, trueC: Char, falseC: Char, reader: ConsoleReader): Boolean = + { + reader.println() + reader.print(prompt) + reader.flush() + reader.readCharacter(trueC, falseC) == trueC + } + + def commonPrefix(s: Seq[String]): String = if (s.isEmpty) "" else s reduceLeft commonPrefix + def commonPrefix(a: String, b: String): String = + { + val len = a.length min b.length + @tailrec def loop(i: Int): Int = if (i >= len) len else if (a(i) != b(i)) i else loop(i + 1) + a.substring(0, loop(0)) + } } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 575cc5ec6..393501792 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -3,725 +3,730 @@ */ package sbt.complete - import Parser._ - import sbt.Types.{left, right, some} - import sbt.Util.{makeList,separate} +import Parser._ +import sbt.Types.{ left, right, some } +import sbt.Util.{ makeList, separate } -/** A String parser that provides semi-automatic tab completion. -* A successful parse results in a value of type `T`. -* The methods in this trait are what must be implemented to define a new Parser implementation, but are not typically useful for common usage. -* Instead, most useful methods for combining smaller parsers into larger parsers are implicitly added by the [[RichParser]] type. -*/ -sealed trait Parser[+T] -{ - def derive(i: Char): Parser[T] - def resultEmpty: Result[T] - def result: Option[T] - def completions(level: Int): Completions - def failure: Option[Failure] - def isTokenStart = false - def ifValid[S](p: => Parser[S]): Parser[S] - def valid: Boolean +/** + * A String parser that provides semi-automatic tab completion. + * A successful parse results in a value of type `T`. 
+ * The methods in this trait are what must be implemented to define a new Parser implementation, but are not typically useful for common usage. + * Instead, most useful methods for combining smaller parsers into larger parsers are implicitly added by the [[RichParser]] type. + */ +sealed trait Parser[+T] { + def derive(i: Char): Parser[T] + def resultEmpty: Result[T] + def result: Option[T] + def completions(level: Int): Completions + def failure: Option[Failure] + def isTokenStart = false + def ifValid[S](p: => Parser[S]): Parser[S] + def valid: Boolean } -sealed trait RichParser[A] -{ - /** Apply the original Parser and then apply `next` (in order). The result of both is provides as a pair. */ - def ~[B](next: Parser[B]): Parser[(A,B)] +sealed trait RichParser[A] { + /** Apply the original Parser and then apply `next` (in order). The result of both is provided as a pair. */ + def ~[B](next: Parser[B]): Parser[(A, B)] - /** Apply the original Parser one or more times and provide the non-empty sequence of results.*/ - def + : Parser[Seq[A]] + /** Apply the original Parser one or more times and provide the non-empty sequence of results.*/ + def + : Parser[Seq[A]] - /** Apply the original Parser zero or more times and provide the (potentially empty) sequence of results.*/ - def * : Parser[Seq[A]] + /** Apply the original Parser zero or more times and provide the (potentially empty) sequence of results.*/ + def * : Parser[Seq[A]] - /** Apply the original Parser zero or one times, returning None if it was applied zero times or the result wrapped in Some if it was applied once.*/ - def ? : Parser[Option[A]] + /** Apply the original Parser zero or one times, returning None if it was applied zero times or the result wrapped in Some if it was applied once.*/ + def ?
: Parser[Option[A]] - /** Apply either the original Parser or `b`.*/ - def |[B >: A](b: Parser[B]): Parser[B] + /** Apply either the original Parser or `b`.*/ + def |[B >: A](b: Parser[B]): Parser[B] - /** Apply either the original Parser or `b`.*/ - def ||[B](b: Parser[B]): Parser[Either[A,B]] + /** Apply either the original Parser or `b`.*/ + def ||[B](b: Parser[B]): Parser[Either[A, B]] - /** Apply the original Parser to the input and then apply `f` to the result.*/ - def map[B](f: A => B): Parser[B] + /** Apply the original Parser to the input and then apply `f` to the result.*/ + def map[B](f: A => B): Parser[B] - /** Returns the original parser. This is useful for converting literals to Parsers. - * For example, `'c'.id` or `"asdf".id`*/ - def id: Parser[A] + /** + * Returns the original parser. This is useful for converting literals to Parsers. + * For example, `'c'.id` or `"asdf".id` + */ + def id: Parser[A] - /** Apply the original Parser, but provide `value` as the result if it succeeds. */ - def ^^^[B](value: B): Parser[B] + /** Apply the original Parser, but provide `value` as the result if it succeeds. */ + def ^^^[B](value: B): Parser[B] - /** Apply the original Parser, but provide `alt` as the result if it fails.*/ - def ??[B >: A](alt: B): Parser[B] + /** Apply the original Parser, but provide `alt` as the result if it fails.*/ + def ??[B >: A](alt: B): Parser[B] - /** Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of `next`. - * (The arrow point in the direction of the retained result.)*/ - def <~[B](b: Parser[B]): Parser[A] + /** + * Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of `next`. + * (The arrow points in the direction of the retained result.)
+ */ + def <~[B](b: Parser[B]): Parser[A] - /** Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of the original parser. - * (The arrow point in the direction of the retained result.)*/ - def ~>[B](b: Parser[B]): Parser[B] + /** + * Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of the original parser. + * (The arrow points in the direction of the retained result.) + */ + def ~>[B](b: Parser[B]): Parser[B] - /** Uses the specified message if the original Parser fails.*/ - def !!!(msg: String): Parser[A] + /** Uses the specified message if the original Parser fails.*/ + def !!!(msg: String): Parser[A] - /** If an exception is thrown by the original Parser, - * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing.*/ - def failOnException: Parser[A] + /** + * If an exception is thrown by the original Parser, + * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing. + */ + def failOnException: Parser[A] - @deprecated("Use `not` and explicitly provide the failure message", "0.12.2") - def unary_- : Parser[Unit] + @deprecated("Use `not` and explicitly provide the failure message", "0.12.2") + def unary_- : Parser[Unit] - /** Apply the original parser, but only succeed if `o` also succeeds. - * Note that `o` does not need to consume the same amount of input to satisfy this condition.*/ - def & (o: Parser[_]): Parser[A] + /** + * Apply the original parser, but only succeed if `o` also succeeds. + * Note that `o` does not need to consume the same amount of input to satisfy this condition.
+ */ + def &(o: Parser[_]): Parser[A] - @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.12.2") - def - (o: Parser[_]): Parser[A] + @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.12.2") + def -(o: Parser[_]): Parser[A] - /** Explicitly defines the completions for the original Parser.*/ - def examples(s: String*): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ + def examples(s: String*): Parser[A] - /** Explicitly defines the completions for the original Parser.*/ - def examples(s: Set[String], check: Boolean = false): Parser[A] + /** Explicitly defines the completions for the original Parser.*/ + def examples(s: Set[String], check: Boolean = false): Parser[A] - /** - * @param exampleSource the source of examples when displaying completions to the user. - * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can - * prevent lengthy pauses and avoids bad interactive user experience. - * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the - * given parser). Invalid examples will be filtered out and only valid suggestions will - * be displayed. - * @return a new parser with a new source of completions. - */ - def examples(exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] + /** + * @param exampleSource the source of examples when displaying completions to the user. + * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can + * prevent lengthy pauses and avoid a bad interactive user experience. + * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the + * given parser). Invalid examples will be filtered out and only valid suggestions will + * be displayed.
+ * @return a new parser with a new source of completions. + */ + def examples(exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] - /** - * @param exampleSource the source of examples when displaying completions to the user. - * @return a new parser with a new source of completions. It displays at most 25 completion examples and does not - * remove invalid examples. - */ - def examples(exampleSource: ExampleSource): Parser[A] = examples(exampleSource, maxNumberOfExamples = 25, removeInvalidExamples = false) + /** + * @param exampleSource the source of examples when displaying completions to the user. + * @return a new parser with a new source of completions. It displays at most 25 completion examples and does not + * remove invalid examples. + */ + def examples(exampleSource: ExampleSource): Parser[A] = examples(exampleSource, maxNumberOfExamples = 25, removeInvalidExamples = false) - /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ - def string(implicit ev: A <:< Seq[Char]): Parser[String] + /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ + def string(implicit ev: A <:< Seq[Char]): Parser[String] - /** Produces a Parser that filters the original parser. - * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails. - * The failure message is constructed by applying `msg` to the String that was successfully parsed by the original parser. */ - def filter(f: A => Boolean, msg: String => String): Parser[A] + /** + * Produces a Parser that filters the original parser. + * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails. + * The failure message is constructed by applying `msg` to the String that was successfully parsed by the original parser. 
+ */ + def filter(f: A => Boolean, msg: String => String): Parser[A] - /** Applies the original parser, applies `f` to the result to get the next parser, and applies that parser and uses its result for the overall result. */ - def flatMap[B](f: A => Parser[B]): Parser[B] + /** Applies the original parser, applies `f` to the result to get the next parser, and applies that parser and uses its result for the overall result. */ + def flatMap[B](f: A => Parser[B]): Parser[B] } /** Contains Parser implementation helper methods not typically needed for using parsers. */ -object Parser extends ParserMain -{ - sealed abstract class Result[+T] { - def isFailure: Boolean - def isValid: Boolean - def errors: Seq[String] - def or[B >: T](b: => Result[B]): Result[B] - def either[B](b: => Result[B]): Result[Either[T,B]] - def map[B](f: T => B): Result[B] - def flatMap[B](f: T => Result[B]): Result[B] - def &&(b: => Result[_]): Result[T] - def filter(f: T => Boolean, msg: => String): Result[T] - def seq[B](b: => Result[B]): Result[(T,B)] = app(b)( (m,n) => (m,n) ) - def app[B,C](b: => Result[B])(f: (T, B) => C): Result[C] - def toEither: Either[() => Seq[String], T] - } - final case class Value[+T](value: T) extends Result[T] { - def isFailure = false - def isValid: Boolean = true - def errors = Nil - def app[B,C](b: => Result[B])(f: (T, B) => C): Result[C] = b match { - case fail: Failure => fail - case Value(bv) => Value(f(value, bv)) - } - def &&(b: => Result[_]): Result[T] = b match { case f: Failure => f; case _ => this } - def or[B >: T](b: => Result[B]): Result[B] = this - def either[B](b: => Result[B]): Result[Either[T,B]] = Value(Left(value)) - def map[B](f: T => B): Result[B] = Value(f(value)) - def flatMap[B](f: T => Result[B]): Result[B] = f(value) - def filter(f: T => Boolean, msg: => String): Result[T] = if(f(value)) this else mkFailure(msg) - def toEither = Right(value) - } - final class Failure private[sbt](mkErrors: => Seq[String], val definitive: Boolean) extends 
Result[Nothing] {
-    lazy val errors: Seq[String] = mkErrors
-    def isFailure = true
-    def isValid = false
-    def map[B](f: Nothing => B) = this
-    def flatMap[B](f: Nothing => Result[B]) = this
-    def or[B](b: => Result[B]): Result[B] = b match {
-      case v: Value[B] => v
-      case f: Failure => if(definitive) this else this ++ f
-    }
-    def either[B](b: => Result[B]): Result[Either[Nothing,B]] = b match {
-      case Value(v) => Value(Right(v))
-      case f: Failure => if(definitive) this else this ++ f
-    }
-    def filter(f: Nothing => Boolean, msg: => String) = this
-    def app[B,C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this
-    def &&(b: => Result[_]) = this
-    def toEither = Left(() => errors)
+object Parser extends ParserMain {
+  sealed abstract class Result[+T] {
+    def isFailure: Boolean
+    def isValid: Boolean
+    def errors: Seq[String]
+    def or[B >: T](b: => Result[B]): Result[B]
+    def either[B](b: => Result[B]): Result[Either[T, B]]
+    def map[B](f: T => B): Result[B]
+    def flatMap[B](f: T => Result[B]): Result[B]
+    def &&(b: => Result[_]): Result[T]
+    def filter(f: T => Boolean, msg: => String): Result[T]
+    def seq[B](b: => Result[B]): Result[(T, B)] = app(b)((m, n) => (m, n))
+    def app[B, C](b: => Result[B])(f: (T, B) => C): Result[C]
+    def toEither: Either[() => Seq[String], T]
+  }
+  final case class Value[+T](value: T) extends Result[T] {
+    def isFailure = false
+    def isValid: Boolean = true
+    def errors = Nil
+    def app[B, C](b: => Result[B])(f: (T, B) => C): Result[C] = b match {
+      case fail: Failure => fail
+      case Value(bv) => Value(f(value, bv))
+    }
+    def &&(b: => Result[_]): Result[T] = b match { case f: Failure => f; case _ => this }
+    def or[B >: T](b: => Result[B]): Result[B] = this
+    def either[B](b: => Result[B]): Result[Either[T, B]] = Value(Left(value))
+    def map[B](f: T => B): Result[B] = Value(f(value))
+    def flatMap[B](f: T => Result[B]): Result[B] = f(value)
+    def filter(f: T => Boolean, msg: => String): Result[T] = if (f(value)) this else mkFailure(msg)
+    def toEither = Right(value)
+  }
+  final class Failure private[sbt] (mkErrors: => Seq[String], val definitive: Boolean) extends Result[Nothing] {
+    lazy val errors: Seq[String] = mkErrors
+    def isFailure = true
+    def isValid = false
+    def map[B](f: Nothing => B) = this
+    def flatMap[B](f: Nothing => Result[B]) = this
+    def or[B](b: => Result[B]): Result[B] = b match {
+      case v: Value[B] => v
+      case f: Failure => if (definitive) this else this ++ f
+    }
+    def either[B](b: => Result[B]): Result[Either[Nothing, B]] = b match {
+      case Value(v) => Value(Right(v))
+      case f: Failure => if (definitive) this else this ++ f
+    }
+    def filter(f: Nothing => Boolean, msg: => String) = this
+    def app[B, C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this
+    def &&(b: => Result[_]) = this
+    def toEither = Left(() => errors)
 
-    private[sbt] def ++(f: Failure) = mkFailures(errors ++ f.errors)
-  }
-  def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive)
-  def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive)
+    private[sbt] def ++(f: Failure) = mkFailures(errors ++ f.errors)
+  }
+  def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive)
+  def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive)
 
-  @deprecated("This method is deprecated and will be removed in the next major version. Use the parser directly to check for invalid completions.", since = "0.13.2")
-  def checkMatches(a: Parser[_], completions: Seq[String])
-  {
-    val bad = completions.filter( apply(a)(_).resultEmpty.isFailure)
-    if(!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'"))
-  }
+  @deprecated("This method is deprecated and will be removed in the next major version. Use the parser directly to check for invalid completions.", since = "0.13.2")
+  def checkMatches(a: Parser[_], completions: Seq[String]) {
+    val bad = completions.filter(apply(a)(_).resultEmpty.isFailure)
+    if (!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'"))
+  }
 
-  def tuple[A,B](a: Option[A], b: Option[B]): Option[(A,B)] =
-    (a,b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None }
+  def tuple[A, B](a: Option[A], b: Option[B]): Option[(A, B)] =
+    (a, b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None }
 
-  def mapParser[A,B](a: Parser[A], f: A => B): Parser[B] =
-    a.ifValid {
-      a.result match
-      {
-        case Some(av) => success( f(av) )
-        case None => new MapParser(a, f)
-      }
-    }
+  def mapParser[A, B](a: Parser[A], f: A => B): Parser[B] =
+    a.ifValid {
+      a.result match {
+        case Some(av) => success(f(av))
+        case None => new MapParser(a, f)
+      }
+    }
 
-  def bindParser[A,B](a: Parser[A], f: A => Parser[B]): Parser[B] =
-    a.ifValid {
-      a.result match
-      {
-        case Some(av) => f(av)
-        case None => new BindParser(a, f)
-      }
-    }
+  def bindParser[A, B](a: Parser[A], f: A => Parser[B]): Parser[B] =
+    a.ifValid {
+      a.result match {
+        case Some(av) => f(av)
+        case None => new BindParser(a, f)
+      }
+    }
 
-  def filterParser[T](a: Parser[T], f: T => Boolean, seen: String, msg: String => String): Parser[T] =
-    a.ifValid {
-      a.result match
-      {
-        case Some(av) if f(av) => success( av )
-        case _ => new Filter(a, f, seen, msg)
-      }
-    }
+  def filterParser[T](a: Parser[T], f: T => Boolean, seen: String, msg: String => String): Parser[T] =
+    a.ifValid {
+      a.result match {
+        case Some(av) if f(av) => success(av)
+        case _ => new Filter(a, f, seen, msg)
+      }
+    }
 
-  def seqParser[A,B](a: Parser[A], b: Parser[B]): Parser[(A,B)] =
-    a.ifValid { b.ifValid {
-      (a.result, b.result) match {
-        case (Some(av), Some(bv)) => success( (av, bv) )
-        case (Some(av), None) => b map { bv => (av, bv) }
-        case (None, Some(bv)) => a map { av => (av, bv) }
-        case (None, None) => new SeqParser(a,b)
-      }
-    }}
+  def seqParser[A, B](a: Parser[A], b: Parser[B]): Parser[(A, B)] =
+    a.ifValid {
+      b.ifValid {
+        (a.result, b.result) match {
+          case (Some(av), Some(bv)) => success((av, bv))
+          case (Some(av), None) => b map { bv => (av, bv) }
+          case (None, Some(bv)) => a map { av => (av, bv) }
+          case (None, None) => new SeqParser(a, b)
+        }
+      }
+    }
 
-  def choiceParser[A,B](a: Parser[A], b: Parser[B]): Parser[Either[A,B]] =
-    if(a.valid)
-      if(b.valid) new HetParser(a,b) else a.map( left.fn )
-    else
-      b.map( right.fn )
+  def choiceParser[A, B](a: Parser[A], b: Parser[B]): Parser[Either[A, B]] =
+    if (a.valid)
+      if (b.valid) new HetParser(a, b) else a.map(left.fn)
+    else
+      b.map(right.fn)
 
-  def opt[T](a: Parser[T]): Parser[Option[T]] =
-    if(a.valid) new Optional(a) else success(None)
+  def opt[T](a: Parser[T]): Parser[Option[T]] =
+    if (a.valid) new Optional(a) else success(None)
 
-  def onFailure[T](delegate: Parser[T], msg: String): Parser[T] =
-    if(delegate.valid) new OnFailure(delegate, msg) else failure(msg)
-  def trapAndFail[T](delegate: Parser[T]): Parser[T] =
-    delegate.ifValid( new TrapAndFail(delegate) )
+  def onFailure[T](delegate: Parser[T], msg: String): Parser[T] =
+    if (delegate.valid) new OnFailure(delegate, msg) else failure(msg)
+  def trapAndFail[T](delegate: Parser[T]): Parser[T] =
+    delegate.ifValid(new TrapAndFail(delegate))
 
-  def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite)
-  def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite)
+  def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite)
+  def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite)
 
-  def repeat[T](p: Parser[T], min: Int = 0, max: UpperBound = Infinite): Parser[Seq[T]] =
-    repeat(None, p, min, max, Nil)
-  private[complete] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] =
-  {
-    assume(min >= 0, "Minimum must be greater than or equal to zero (was " + min + ")")
-    assume(max >= min, "Minimum must be less than or equal to maximum (min: " + min + ", max: " + max + ")")
+  def repeat[T](p: Parser[T], min: Int = 0, max: UpperBound = Infinite): Parser[Seq[T]] =
+    repeat(None, p, min, max, Nil)
+  private[complete] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] =
+    {
+      assume(min >= 0, "Minimum must be greater than or equal to zero (was " + min + ")")
+      assume(max >= min, "Minimum must be less than or equal to maximum (min: " + min + ", max: " + max + ")")
 
-    def checkRepeated(invalidButOptional: => Parser[Seq[T]]): Parser[Seq[T]] =
-      repeated match
-      {
-        case i: Invalid if min == 0 => invalidButOptional
-        case i: Invalid => i
-        case _ =>
-          repeated.result match
-          {
-            case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here
-            case None => if(max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc)
-          }
-      }
+      def checkRepeated(invalidButOptional: => Parser[Seq[T]]): Parser[Seq[T]] =
+        repeated match {
+          case i: Invalid if min == 0 => invalidButOptional
+          case i: Invalid => i
+          case _ =>
+            repeated.result match {
+              case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here
+              case None => if (max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc)
+            }
+        }
 
-    partial match
-    {
-      case Some(part) =>
-        part.ifValid {
-          part.result match
-          {
-            case Some(value) => repeat(None, repeated, min, max, value :: revAcc)
-            case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse))
-          }
-        }
-      case None => checkRepeated(success(Nil))
-    }
-  }
+      partial match {
+        case Some(part) =>
+          part.ifValid {
+            part.result match {
+              case Some(value) => repeat(None, repeated, min, max, value :: revAcc)
+              case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse))
+            }
+          }
+        case None => checkRepeated(success(Nil))
+      }
+    }
 
-  @deprecated("Explicitly call `and` and `not` to provide the failure message.", "0.12.2")
-  def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b))
+  @deprecated("Explicitly call `and` and `not` to provide the failure message.", "0.12.2")
+  def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b))
 
-  def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid( b.ifValid( new And(a, b) ))
+  def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid(b.ifValid(new And(a, b)))
 }
 
-trait ParserMain
-{
-  /** Provides combinators for Parsers.*/
-  implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A]
-  {
-    def ~[B](b: Parser[B]) = seqParser(a, b)
-    def ||[B](b: Parser[B]) = choiceParser(a,b)
-    def |[B >: A](b: Parser[B]) = homParser(a,b)
-    def ? = opt(a)
-    def * = zeroOrMore(a)
-    def + = oneOrMore(a)
-    def map[B](f: A => B) = mapParser(a, f)
-    def id = a
+trait ParserMain {
+  /** Provides combinators for Parsers.*/
+  implicit def richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] {
+    def ~[B](b: Parser[B]) = seqParser(a, b)
+    def ||[B](b: Parser[B]) = choiceParser(a, b)
+    def |[B >: A](b: Parser[B]) = homParser(a, b)
+    def ? = opt(a)
+    def * = zeroOrMore(a)
+    def + = oneOrMore(a)
+    def map[B](f: A => B) = mapParser(a, f)
+    def id = a
 
-    def ^^^[B](value: B): Parser[B] = a map { _ => value }
-    def ??[B >: A](alt: B): Parser[B] = a.? map { _ getOrElse alt }
-    def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av }
-    def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv }
-    def !!!(msg: String): Parser[A] = onFailure(a, msg)
-    def failOnException: Parser[A] = trapAndFail(a)
+    def ^^^[B](value: B): Parser[B] = a map { _ => value }
+    def ??[B >: A](alt: B): Parser[B] = a.? map { _ getOrElse alt }
+    def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av }
+    def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv }
+    def !!!(msg: String): Parser[A] = onFailure(a, msg)
+    def failOnException: Parser[A] = trapAndFail(a)
 
-    def unary_- = not(a)
-    def & (o: Parser[_]) = and(a, o)
-    def - (o: Parser[_]) = sub(a, o)
-    def examples(s: String*): Parser[A] = examples(s.toSet)
-    def examples(s: Set[String], check: Boolean = false): Parser[A] = examples(new FixedSetExamples(s), s.size, check)
-    def examples(s: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = Parser.examples(a, s, maxNumberOfExamples, removeInvalidExamples)
-    def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg)
-    def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString)
-    def flatMap[B](f: A => Parser[B]) = bindParser(a, f)
-  }
+    def unary_- = not(a)
+    def &(o: Parser[_]) = and(a, o)
+    def -(o: Parser[_]) = sub(a, o)
+    def examples(s: String*): Parser[A] = examples(s.toSet)
+    def examples(s: Set[String], check: Boolean = false): Parser[A] = examples(new FixedSetExamples(s), s.size, check)
+    def examples(s: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = Parser.examples(a, s, maxNumberOfExamples, removeInvalidExamples)
+    def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg)
+    def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString)
+    def flatMap[B](f: A => Parser[B]) = bindParser(a, f)
+  }
 
-  implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c)
-  implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s)
+  implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c)
+  implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s)
 
-  /** Construct a parser that is valid, but has no valid result. This is used as a way
-  * to provide a definitive Failure when a parser doesn't match empty input. For example,
-  * in `softFailure(...) | p`, if `p` doesn't match the empty sequence, the failure will come
-  * from the Parser constructed by the `softFailure` method. */
-  private[sbt] def softFailure(msg: => String, definitive: Boolean = false): Parser[Nothing] =
-    SoftInvalid( mkFailures(msg :: Nil, definitive) )
+  /**
+   * Construct a parser that is valid, but has no valid result. This is used as a way
+   * to provide a definitive Failure when a parser doesn't match empty input. For example,
+   * in `softFailure(...) | p`, if `p` doesn't match the empty sequence, the failure will come
+   * from the Parser constructed by the `softFailure` method.
+   */
+  private[sbt] def softFailure(msg: => String, definitive: Boolean = false): Parser[Nothing] =
+    SoftInvalid(mkFailures(msg :: Nil, definitive))
 
-  /** Defines a parser that always fails on any input with messages `msgs`.
-  * If `definitive` is `true`, any failures by later alternatives are discarded.*/
-  def invalid(msgs: => Seq[String], definitive: Boolean = false): Parser[Nothing] = Invalid(mkFailures(msgs, definitive))
+  /**
+   * Defines a parser that always fails on any input with messages `msgs`.
+   * If `definitive` is `true`, any failures by later alternatives are discarded.
+   */
+  def invalid(msgs: => Seq[String], definitive: Boolean = false): Parser[Nothing] = Invalid(mkFailures(msgs, definitive))
 
-  /** Defines a parser that always fails on any input with message `msg`.
-  * If `definitive` is `true`, any failures by later alternatives are discarded.*/
-  def failure(msg: => String, definitive: Boolean = false): Parser[Nothing] = invalid(msg :: Nil, definitive)
+  /**
+   * Defines a parser that always fails on any input with message `msg`.
+   * If `definitive` is `true`, any failures by later alternatives are discarded.
+   */
+  def failure(msg: => String, definitive: Boolean = false): Parser[Nothing] = invalid(msg :: Nil, definitive)
 
-  /** Defines a parser that always succeeds on empty input with the result `value`.*/
-  def success[T](value: T): Parser[T] = new ValidParser[T] {
-    override def result = Some(value)
-    def resultEmpty = Value(value)
-    def derive(c: Char) = Parser.failure("Expected end of input.")
-    def completions(level: Int) = Completions.empty
-    override def toString = "success(" + value + ")"
-  }
+  /** Defines a parser that always succeeds on empty input with the result `value`.*/
+  def success[T](value: T): Parser[T] = new ValidParser[T] {
+    override def result = Some(value)
+    def resultEmpty = Value(value)
+    def derive(c: Char) = Parser.failure("Expected end of input.")
+    def completions(level: Int) = Completions.empty
+    override def toString = "success(" + value + ")"
+  }
 
-  /** Presents a Char range as a Parser. A single Char is parsed only if it is in the given range.*/
-  implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] =
-    charClass(r contains _).examples(r.map(_.toString) : _*)
+  /** Presents a Char range as a Parser. A single Char is parsed only if it is in the given range.*/
+  implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] =
+    charClass(r contains _).examples(r.map(_.toString): _*)
 
-  /** Defines a Parser that parses a single character only if it is contained in `legal`.*/
-  def chars(legal: String): Parser[Char] =
-  {
-    val set = legal.toSet
-    charClass(set, "character in '" + legal + "'") examples(set.map(_.toString))
-  }
+  /** Defines a Parser that parses a single character only if it is contained in `legal`.*/
+  def chars(legal: String): Parser[Char] =
+    {
+      val set = legal.toSet
+      charClass(set, "character in '" + legal + "'") examples (set.map(_.toString))
+    }
 
-  /** Defines a Parser that parses a single character only if the predicate `f` returns true for that character.
-  * If this parser fails, `label` is used as the failure message. */
-  def charClass(f: Char => Boolean, label: String = ""): Parser[Char] = new CharacterClass(f, label)
+  /**
+   * Defines a Parser that parses a single character only if the predicate `f` returns true for that character.
+   * If this parser fails, `label` is used as the failure message.
+   */
+  def charClass(f: Char => Boolean, label: String = ""): Parser[Char] = new CharacterClass(f, label)
 
-  /** Presents a single Char `ch` as a Parser that only parses that exact character. */
-  implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] {
-    def result = None
-    def resultEmpty = mkFailure( "Expected '" + ch + "'" )
-    def derive(c: Char) = if(c == ch) success(ch) else new Invalid(resultEmpty)
-    def completions(level: Int) = Completions.single(Completion.suggestStrict(ch.toString))
-    override def toString = "'" + ch + "'"
-  }
-  /** Presents a literal String `s` as a Parser that only parses that exact text and provides it as the result.*/
-  implicit def literal(s: String): Parser[String] = stringLiteral(s, 0)
+  /** Presents a single Char `ch` as a Parser that only parses that exact character. */
+  implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] {
+    def result = None
+    def resultEmpty = mkFailure("Expected '" + ch + "'")
+    def derive(c: Char) = if (c == ch) success(ch) else new Invalid(resultEmpty)
+    def completions(level: Int) = Completions.single(Completion.suggestStrict(ch.toString))
+    override def toString = "'" + ch + "'"
+  }
+  /** Presents a literal String `s` as a Parser that only parses that exact text and provides it as the result.*/
+  implicit def literal(s: String): Parser[String] = stringLiteral(s, 0)
 
-  /** See [[unapply]]. */
-  object ~ {
-    /** Convenience for destructuring a tuple that mirrors the `~` combinator.*/
-    def unapply[A,B](t: (A,B)): Some[(A,B)] = Some(t)
-  }
+  /** See [[unapply]]. */
+  object ~ {
+    /** Convenience for destructuring a tuple that mirrors the `~` combinator.*/
+    def unapply[A, B](t: (A, B)): Some[(A, B)] = Some(t)
+  }
 
-  /** Parses input `str` using `parser`. If successful, the result is provided wrapped in `Right`. If unsuccesful, an error message is provided in `Left`.*/
-  def parse[T](str: String, parser: Parser[T]): Either[String, T] =
-    Parser.result(parser, str).left.map { failures =>
-      val (msgs,pos) = failures()
-      ProcessError(str, msgs, pos)
-    }
+  /** Parses input `str` using `parser`. If successful, the result is provided wrapped in `Right`. If unsuccesful, an error message is provided in `Left`.*/
+  def parse[T](str: String, parser: Parser[T]): Either[String, T] =
+    Parser.result(parser, str).left.map { failures =>
+      val (msgs, pos) = failures()
+      ProcessError(str, msgs, pos)
+    }
 
-  /** Convenience method to use when developing a parser.
-  * `parser` is applied to the input `str`.
-  * If `completions` is true, the available completions for the input are displayed.
-  * Otherwise, the result of parsing is printed using the result's `toString` method.
-  * If parsing fails, the error message is displayed.
-  *
-  * See also [[sampleParse]] and [[sampleCompletions]]. */
-  def sample(str: String, parser: Parser[_], completions: Boolean = false): Unit =
-    if(completions) sampleCompletions(str, parser) else sampleParse(str, parser)
+  /**
+   * Convenience method to use when developing a parser.
+   * `parser` is applied to the input `str`.
+   * If `completions` is true, the available completions for the input are displayed.
+   * Otherwise, the result of parsing is printed using the result's `toString` method.
+   * If parsing fails, the error message is displayed.
+   *
+   * See also [[sampleParse]] and [[sampleCompletions]].
+   */
+  def sample(str: String, parser: Parser[_], completions: Boolean = false): Unit =
+    if (completions) sampleCompletions(str, parser) else sampleParse(str, parser)
 
-  /** Convenience method to use when developing a parser.
-  * `parser` is applied to the input `str` and the result of parsing is printed using the result's `toString` method.
-  * If parsing fails, the error message is displayed. */
-  def sampleParse(str: String, parser: Parser[_]): Unit =
-    parse(str, parser) match {
-      case Left(msg) => println(msg)
-      case Right(v) => println(v)
-    }
+  /**
+   * Convenience method to use when developing a parser.
+   * `parser` is applied to the input `str` and the result of parsing is printed using the result's `toString` method.
+   * If parsing fails, the error message is displayed.
+   */
+  def sampleParse(str: String, parser: Parser[_]): Unit =
+    parse(str, parser) match {
+      case Left(msg) => println(msg)
+      case Right(v) => println(v)
+    }
 
-  /** Convenience method to use when developing a parser.
-  * `parser` is applied to the input `str` and the available completions are displayed on separate lines.
-  * If parsing fails, the error message is displayed. */
-  def sampleCompletions(str: String, parser: Parser[_], level: Int = 1): Unit =
-    Parser.completions(parser, str, level).get foreach println
+  /**
+   * Convenience method to use when developing a parser.
+   * `parser` is applied to the input `str` and the available completions are displayed on separate lines.
+   * If parsing fails, the error message is displayed.
+   */
+  def sampleCompletions(str: String, parser: Parser[_], level: Int = 1): Unit =
+    Parser.completions(parser, str, level).get foreach println
 
-  // intended to be temporary pending proper error feedback
-  def result[T](p: Parser[T], s: String): Either[() => (Seq[String],Int), T] =
-  {
-    def loop(i: Int, a: Parser[T]): Either[() => (Seq[String],Int), T] =
-      a match
-      {
-        case Invalid(f) => Left( () => (f.errors, i) )
-        case _ =>
-          val ci = i+1
-          if(ci >= s.length)
-            a.resultEmpty.toEither.left.map { msgs0 => () =>
-              val msgs = msgs0()
-              val nonEmpty = if(msgs.isEmpty) "Unexpected end of input" :: Nil else msgs
-              (nonEmpty, ci)
-            }
-          else
-            loop(ci, a derive s(ci) )
-      }
-    loop(-1, p)
-  }
+  // intended to be temporary pending proper error feedback
+  def result[T](p: Parser[T], s: String): Either[() => (Seq[String], Int), T] =
+    {
+      def loop(i: Int, a: Parser[T]): Either[() => (Seq[String], Int), T] =
+        a match {
+          case Invalid(f) => Left(() => (f.errors, i))
+          case _ =>
+            val ci = i + 1
+            if (ci >= s.length)
+              a.resultEmpty.toEither.left.map { msgs0 =>
+                () =>
+                  val msgs = msgs0()
+                  val nonEmpty = if (msgs.isEmpty) "Unexpected end of input" :: Nil else msgs
+                  (nonEmpty, ci)
+              }
+            else
+              loop(ci, a derive s(ci))
+        }
+      loop(-1, p)
+    }
 
-  /** Applies parser `p` to input `s`. */
-  def apply[T](p: Parser[T])(s: String): Parser[T] =
-    (p /: s)(derive1)
+  /** Applies parser `p` to input `s`. */
+  def apply[T](p: Parser[T])(s: String): Parser[T] =
+    (p /: s)(derive1)
 
-  /** Applies parser `p` to a single character of input. */
-  def derive1[T](p: Parser[T], c: Char): Parser[T] =
-    if(p.valid) p.derive(c) else p
+  /** Applies parser `p` to a single character of input. */
+  def derive1[T](p: Parser[T], c: Char): Parser[T] =
+    if (p.valid) p.derive(c) else p
 
-  /** Applies parser `p` to input `s` and returns the completions at verbosity `level`.
-  * The interpretation of `level` is up to parser definitions, but 0 is the default by convention,
-  * with increasing positive numbers corresponding to increasing verbosity. Typically no more than
-  * a few levels are defined. */
-  def completions(p: Parser[_], s: String, level: Int): Completions =
-    // The x Completions.empty removes any trailing token completions where append.isEmpty
-    apply(p)(s).completions(level) x Completions.empty
+  /**
+   * Applies parser `p` to input `s` and returns the completions at verbosity `level`.
+   * The interpretation of `level` is up to parser definitions, but 0 is the default by convention,
+   * with increasing positive numbers corresponding to increasing verbosity. Typically no more than
+   * a few levels are defined.
+   */
+  def completions(p: Parser[_], s: String, level: Int): Completions =
+    // The x Completions.empty removes any trailing token completions where append.isEmpty
+    apply(p)(s).completions(level) x Completions.empty
 
-  def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] =
-    examples(a, new FixedSetExamples(completions), completions.size, check)
+  def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] =
+    examples(a, new FixedSetExamples(completions), completions.size, check)
 
-  /**
-  * @param a the parser to decorate with a source of examples. All validation and parsing is delegated to this parser,
-  * only [[Parser.completions]] is modified.
-  * @param completions the source of examples when displaying completions to the user.
-  * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can
-  * prevent lengthy pauses and avoids bad interactive user experience.
-  * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the given parser). An
-  * exception is thrown if the example source contains no valid completion suggestions.
-  * @tparam A the type of values that are returned by the parser.
-  * @return
-  */
-  def examples[A](a: Parser[A], completions: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] =
-    if(a.valid) {
-      a.result match
-      {
-        case Some(av) => success( av )
-        case None =>
-          new ParserWithExamples(a, completions, maxNumberOfExamples, removeInvalidExamples)
-      }
-    }
-    else a
+  /**
+   * @param a the parser to decorate with a source of examples. All validation and parsing is delegated to this parser,
+   *          only [[Parser.completions]] is modified.
+   * @param completions the source of examples when displaying completions to the user.
+   * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can
+   *                            prevent lengthy pauses and avoids bad interactive user experience.
+   * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the given parser). An
+   *                              exception is thrown if the example source contains no valid completion suggestions.
+   * @tparam A the type of values that are returned by the parser.
+   * @return
+   */
+  def examples[A](a: Parser[A], completions: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] =
+    if (a.valid) {
+      a.result match {
+        case Some(av) => success(av)
+        case None =>
+          new ParserWithExamples(a, completions, maxNumberOfExamples, removeInvalidExamples)
+      }
+    } else a
 
-  def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] =
-    t match
-    {
-      case i: Invalid => if(partial && !seen.isEmpty) success(seen.mkString) else i
-      case _ =>
-        if(t.result.isEmpty)
-          new MatchedString(t, seen, partial)
-        else
-          success(seen.mkString)
-    }
+  def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] =
+    t match {
+      case i: Invalid => if (partial && !seen.isEmpty) success(seen.mkString) else i
+      case _ =>
+        if (t.result.isEmpty)
+          new MatchedString(t, seen, partial)
+        else
+          success(seen.mkString)
    }

-  /** Establishes delegate parser `t` as a single token of tab completion.
-  * When tab completion of part of this token is requested, the completions provided by the delegate `t` or a later derivative are appended to
-  * the prefix String already seen by this parser. */
-  def token[T](t: Parser[T]): Parser[T] = token(t, TokenCompletions.default)
+  /**
+   * Establishes delegate parser `t` as a single token of tab completion.
+   * When tab completion of part of this token is requested, the completions provided by the delegate `t` or a later derivative are appended to
+   * the prefix String already seen by this parser.
+   */
+  def token[T](t: Parser[T]): Parser[T] = token(t, TokenCompletions.default)
 
-  /** Establishes delegate parser `t` as a single token of tab completion.
-  * When tab completion of part of this token is requested, no completions are returned if `hide` returns true for the current tab completion level.
-  * Otherwise, the completions provided by the delegate `t` or a later derivative are appended to the prefix String already seen by this parser.*/
-  def token[T](t: Parser[T], hide: Int => Boolean): Parser[T] = token(t, TokenCompletions.default.hideWhen(hide))
+  /**
+   * Establishes delegate parser `t` as a single token of tab completion.
+   * When tab completion of part of this token is requested, no completions are returned if `hide` returns true for the current tab completion level.
+   * Otherwise, the completions provided by the delegate `t` or a later derivative are appended to the prefix String already seen by this parser.
+   */
+  def token[T](t: Parser[T], hide: Int => Boolean): Parser[T] = token(t, TokenCompletions.default.hideWhen(hide))
 
-  /** Establishes delegate parser `t` as a single token of tab completion.
-  * When tab completion of part of this token is requested, `description` is displayed for suggestions and no completions are ever performed. */
-  def token[T](t: Parser[T], description: String): Parser[T] = token(t, TokenCompletions.displayOnly(description))
+  /**
+   * Establishes delegate parser `t` as a single token of tab completion.
+   * When tab completion of part of this token is requested, `description` is displayed for suggestions and no completions are ever performed.
+   */
+  def token[T](t: Parser[T], description: String): Parser[T] = token(t, TokenCompletions.displayOnly(description))
 
-  /** Establishes delegate parser `t` as a single token of tab completion.
-  * When tab completion of part of this token is requested, `display` is used as the printed suggestion, but the completions from the delegate
-  * parser `t` are used to complete if unambiguous. */
-  def tokenDisplay[T](t: Parser[T], display: String): Parser[T] =
-    token(t, TokenCompletions.overrideDisplay(display))
+  /**
+   * Establishes delegate parser `t` as a single token of tab completion.
+   * When tab completion of part of this token is requested, `display` is used as the printed suggestion, but the completions from the delegate
+   * parser `t` are used to complete if unambiguous.
+   */
+  def tokenDisplay[T](t: Parser[T], display: String): Parser[T] =
+    token(t, TokenCompletions.overrideDisplay(display))
 
-  def token[T](t: Parser[T], complete: TokenCompletions): Parser[T] =
-    mkToken(t, "", complete)
+  def token[T](t: Parser[T], complete: TokenCompletions): Parser[T] =
+    mkToken(t, "", complete)
 
-  @deprecated("Use a different `token` overload.", "0.12.1")
-  def token[T](t: Parser[T], seen: String, track: Boolean, hide: Int => Boolean): Parser[T] =
-  {
-    val base = if(track) TokenCompletions.default else TokenCompletions.displayOnly(seen)
-    token(t, base.hideWhen(hide))
-  }
+  @deprecated("Use a different `token` overload.", "0.12.1")
+  def token[T](t: Parser[T], seen: String, track: Boolean, hide: Int => Boolean): Parser[T] =
+    {
+      val base = if (track) TokenCompletions.default else TokenCompletions.displayOnly(seen)
+      token(t, base.hideWhen(hide))
+    }
 
-  private[sbt] def mkToken[T](t: Parser[T], seen: String, complete: TokenCompletions): Parser[T] =
-    if(t.valid && !t.isTokenStart)
-      if(t.result.isEmpty) new TokenStart(t, seen, complete) else t
-    else
-      t
+  private[sbt] def mkToken[T](t: Parser[T], seen: String, complete: TokenCompletions): Parser[T] =
+    if (t.valid && !t.isTokenStart)
+      if (t.result.isEmpty) new TokenStart(t, seen, complete) else t
+    else
+      t
 
-  def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = (a,b) match {
-    case (Invalid(af), Invalid(bf)) => Invalid(af ++ bf)
-    case (Invalid(_), bv) => bv
-    case (av, Invalid(_)) => av
-    case (av, bv) => new HomParser(a, b)
-  }
+  def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = (a, b) match {
+    case (Invalid(af), Invalid(bf)) => Invalid(af ++ bf)
+    case (Invalid(_), bv) => bv
+    case (av, Invalid(_)) => av
+    case (av, bv) => new HomParser(a, b)
+  }
 
-  @deprecated("Explicitly specify the failure message.", "0.12.2")
-  def not(p: Parser[_]): Parser[Unit] = not(p, "Excluded.")
+  @deprecated("Explicitly specify the failure message.", "0.12.2")
+  def not(p: Parser[_]): Parser[Unit] = not(p, "Excluded.")
 
-  def not(p: Parser[_], failMessage: String): Parser[Unit] = p.result match {
-    case None => new Not(p, failMessage)
-    case Some(_) => failure(failMessage)
-  }
+  def not(p: Parser[_], failMessage: String): Parser[Unit] = p.result match {
+    case None => new Not(p, failMessage)
+    case Some(_) => failure(failMessage)
+  }
 
-  def oneOf[T](p: Seq[Parser[T]]): Parser[T] = p.reduceLeft(_ | _)
-  def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil)
-  def seq0[T](p: Seq[Parser[T]], errors: => Seq[String]): Parser[Seq[T]] =
-  {
-    val (newErrors, valid) = separate(p) { case Invalid(f) => Left(f.errors); case ok => Right(ok) }
-    def combinedErrors = errors ++ newErrors.flatten
-    if(valid.isEmpty) invalid(combinedErrors) else new ParserSeq(valid, combinedErrors)
-  }
+  def oneOf[T](p: Seq[Parser[T]]): Parser[T] = p.reduceLeft(_ | _)
+  def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil)
+  def seq0[T](p: Seq[Parser[T]], errors: => Seq[String]): Parser[Seq[T]] =
+    {
+      val (newErrors, valid) = separate(p) { case Invalid(f) => Left(f.errors); case ok => Right(ok) }
+      def combinedErrors = errors ++ newErrors.flatten
+      if (valid.isEmpty) invalid(combinedErrors) else new ParserSeq(valid, combinedErrors)
+    }
 
-  def stringLiteral(s: String, start: Int): Parser[String] =
-  {
-    val len = s.length
-    if(len == 0) sys.error("String literal cannot be empty") else if(start >= len) success(s) else new StringLiteral(s, start)
-  }
+  def stringLiteral(s: String, start: Int): Parser[String] =
+    {
+      val len = s.length
+      if (len == 0) sys.error("String literal cannot be empty") else if (start >= len) success(s) else new StringLiteral(s, start)
+    }
 }
 
-sealed trait ValidParser[T] extends Parser[T]
-{
-  final def valid = true
-  final def failure = None
-  final def ifValid[S](p: => Parser[S]): Parser[S] = p
+sealed trait ValidParser[T] extends Parser[T] {
+  final def valid = true
+  final def failure = None
+  final def ifValid[S](p: => Parser[S]): Parser[S] = p
 }
 
-private final case class Invalid(fail: Failure) extends Parser[Nothing]
-{
-  def failure = Some(fail)
-  def result = None
-  def resultEmpty = fail
-  def derive(c: Char) = sys.error("Invalid.")
-  def completions(level: Int) = Completions.nil
-  override def toString = fail.errors.mkString("; ")
-  def valid = false
-  def ifValid[S](p: => Parser[S]): Parser[S] = this
+private final case class Invalid(fail: Failure) extends Parser[Nothing] {
+  def failure = Some(fail)
+  def result = None
+  def resultEmpty = fail
+  def derive(c: Char) = sys.error("Invalid.")
+  def completions(level: Int) = Completions.nil
+  override def toString = fail.errors.mkString("; ")
+  def valid = false
+  def ifValid[S](p: => Parser[S]): Parser[S] = this
 }
 
-private final case class SoftInvalid(fail: Failure) extends ValidParser[Nothing]
-{
-  def result = None
-  def resultEmpty = fail
-  def derive(c: Char) = Invalid(fail)
-  def completions(level: Int) = Completions.nil
-  override def toString = fail.errors.mkString("; ")
+private final case class SoftInvalid(fail: Failure) extends ValidParser[Nothing] {
+  def result = None
+  def resultEmpty = fail
+  def derive(c: Char) = Invalid(fail)
+  def completions(level: Int) = Completions.nil
+  override def toString = fail.errors.mkString("; ")
 }
 
-private final class TrapAndFail[A](a: Parser[A]) extends ValidParser[A]
-{
-  def result = try { a.result } catch { case e: Exception => None }
-  def resultEmpty = try { a.resultEmpty } catch { case e: Exception => fail(e) }
-  def derive(c: Char) = try { trapAndFail(a derive c) } catch { case e: Exception => Invalid(fail(e)) }
-  def completions(level: Int) = try { a.completions(level) } catch { case e: Exception => Completions.nil }
-  override def toString = "trap(" + a + ")"
-  override def isTokenStart = a.isTokenStart
-  private[this] def fail(e: Exception): Failure = mkFailure(e.toString)
+private final class TrapAndFail[A](a: Parser[A]) extends ValidParser[A] {
+  def result = try { a.result } catch { case e: Exception => None }
+  def resultEmpty = try { a.resultEmpty } catch { case e: Exception => fail(e) }
+  def derive(c: Char) = try { trapAndFail(a derive c) } catch { case e: Exception => Invalid(fail(e)) }
+  def completions(level: Int) = try { a.completions(level) } catch { case e: Exception => Completions.nil }
+  override def toString = "trap(" + a + ")"
+  override def isTokenStart = a.isTokenStart
+  private[this] def fail(e: Exception): Failure = mkFailure(e.toString)
 }
 
-private final class OnFailure[A](a: Parser[A], message: String) extends ValidParser[A]
-{
-  def result = a.result
-  def resultEmpty = a.resultEmpty match { case f: Failure => mkFailure(message); case v: Value[A] => v }
-  def derive(c: Char) = onFailure(a derive c, message)
-  def completions(level: Int) = a.completions(level)
-  override def toString = "(" + a + " !!! \"" + message + "\" )"
-  override def isTokenStart = a.isTokenStart
+private final class OnFailure[A](a: Parser[A], message: String) extends ValidParser[A] {
+  def result = a.result
+  def resultEmpty = a.resultEmpty match { case f: Failure => mkFailure(message); case v: Value[A] => v }
+  def derive(c: Char) = onFailure(a derive c, message)
+  def completions(level: Int) = a.completions(level)
+  override def toString = "(" + a + " !!! \"" + message + "\" )"
+  override def isTokenStart = a.isTokenStart
 }
 
-private final class SeqParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[(A,B)]
-{
-  lazy val result = tuple(a.result,b.result)
-  lazy val resultEmpty = a.resultEmpty seq b.resultEmpty
-  def derive(c: Char) =
-  {
-    val common = a.derive(c) ~ b
-    a.resultEmpty match
-    {
-      case Value(av) => common | b.derive(c).map(br => (av,br))
-      case _: Failure => common
-    }
-  }
-  def completions(level: Int) = a.completions(level) x b.completions(level)
-  override def toString = "(" + a + " ~ " + b + ")"
+private final class SeqParser[A, B](a: Parser[A], b: Parser[B]) extends ValidParser[(A, B)] {
+  lazy val result = tuple(a.result, b.result)
+  lazy val resultEmpty = a.resultEmpty seq b.resultEmpty
+  def derive(c: Char) =
+    {
+      val common = a.derive(c) ~ b
+      a.resultEmpty match {
+        case Value(av) => common | b.derive(c).map(br => (av, br))
+        case _: Failure => common
+      }
+    }
+  def completions(level: Int) = a.completions(level) x b.completions(level)
+  override def toString = "(" + a + " ~ " + b + ")"
 }
 
-private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser[A]
-{
-  lazy val result = tuple(a.result, b.result) map (_._1)
-  def derive(c: Char) = (a derive c) | (b derive c)
-  lazy val resultEmpty = a.resultEmpty or b.resultEmpty
-  def completions(level: Int) = a.completions(level) ++ b.completions(level)
-  override def toString = "(" + a + " | " + b + ")"
+private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser[A] {
+  lazy val result = tuple(a.result, b.result) map (_._1)
+  def derive(c: Char) = (a derive c) | (b derive c)
+  lazy val resultEmpty = a.resultEmpty or b.resultEmpty
+  def completions(level: Int) = a.completions(level) ++ b.completions(level)
+  override def toString = "(" + a + " | " + b + ")"
 }
 
-private final class HetParser[A,B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A,B]]
-{
-  lazy val result = tuple(a.result, b.result) map {
case (a,b) => Left(a) } - def derive(c: Char) = (a derive c) || (b derive c) - lazy val resultEmpty = a.resultEmpty either b.resultEmpty - def completions(level: Int) = a.completions(level) ++ b.completions(level) - override def toString = "(" + a + " || " + b + ")" +private final class HetParser[A, B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A, B]] { + lazy val result = tuple(a.result, b.result) map { case (a, b) => Left(a) } + def derive(c: Char) = (a derive c) || (b derive c) + lazy val resultEmpty = a.resultEmpty either b.resultEmpty + def completions(level: Int) = a.completions(level) ++ b.completions(level) + override def toString = "(" + a + " || " + b + ")" } -private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] -{ - assert(!a.isEmpty) - lazy val resultEmpty: Result[Seq[T]] = - { - val res = a.map(_.resultEmpty) - val (failures, values) = separate(res)(_.toEither) -// if(failures.isEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) - if(values.nonEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) - } - def result = { - val success = a.flatMap(_.result) - if(success.length == a.length) Some(success) else None - } - def completions(level: Int) = a.map(_.completions(level)).reduceLeft(_ ++ _) - def derive(c: Char) = seq0(a.map(_ derive c), errors) - override def toString = "seq(" + a + ")" +private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] { + assert(!a.isEmpty) + lazy val resultEmpty: Result[Seq[T]] = + { + val res = a.map(_.resultEmpty) + val (failures, values) = separate(res)(_.toEither) + // if(failures.isEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) + if (values.nonEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) + } + def result = { + val success = a.flatMap(_.result) + if (success.length == a.length) Some(success) else None + } + def completions(level: 
Int) = a.map(_.completions(level)).reduceLeft(_ ++ _) + def derive(c: Char) = seq0(a.map(_ derive c), errors) + override def toString = "seq(" + a + ")" } -private final class BindParser[A,B](a: Parser[A], f: A => Parser[B]) extends ValidParser[B] -{ - lazy val result = a.result flatMap { av => f(av).result } - lazy val resultEmpty = a.resultEmpty flatMap { av => f(av).resultEmpty } - def completions(level: Int) = - a.completions(level) flatMap { c => - apply(a)(c.append).resultEmpty match { - case _: Failure => Completions.strict(Set.empty + c) - case Value(av) => c x f(av).completions(level) - } - } +private final class BindParser[A, B](a: Parser[A], f: A => Parser[B]) extends ValidParser[B] { + lazy val result = a.result flatMap { av => f(av).result } + lazy val resultEmpty = a.resultEmpty flatMap { av => f(av).resultEmpty } + def completions(level: Int) = + a.completions(level) flatMap { c => + apply(a)(c.append).resultEmpty match { + case _: Failure => Completions.strict(Set.empty + c) + case Value(av) => c x f(av).completions(level) + } + } - def derive(c: Char) = - { - val common = a derive c flatMap f - a.resultEmpty match - { - case Value(av) => common | derive1(f(av), c) - case _: Failure => common - } - } - override def isTokenStart = a.isTokenStart - override def toString = "bind(" + a + ")" + def derive(c: Char) = + { + val common = a derive c flatMap f + a.resultEmpty match { + case Value(av) => common | derive1(f(av), c) + case _: Failure => common + } + } + override def isTokenStart = a.isTokenStart + override def toString = "bind(" + a + ")" } -private final class MapParser[A,B](a: Parser[A], f: A => B) extends ValidParser[B] -{ - lazy val result = a.result map f - lazy val resultEmpty = a.resultEmpty map f - def derive(c: Char) = (a derive c) map f - def completions(level: Int) = a.completions(level) - override def isTokenStart = a.isTokenStart - override def toString = "map(" + a + ")" +private final class MapParser[A, B](a: Parser[A], f: A => B) 
extends ValidParser[B] { + lazy val result = a.result map f + lazy val resultEmpty = a.resultEmpty map f + def derive(c: Char) = (a derive c) map f + def completions(level: Int) = a.completions(level) + override def isTokenStart = a.isTokenStart + override def toString = "map(" + a + ")" } -private final class Filter[T](p: Parser[T], f: T => Boolean, seen: String, msg: String => String) extends ValidParser[T] -{ - def filterResult(r: Result[T]) = r.filter(f, msg(seen)) - lazy val result = p.result filter f - lazy val resultEmpty = filterResult(p.resultEmpty) - def derive(c: Char) = filterParser(p derive c, f, seen + c, msg) - def completions(level: Int) = p.completions(level) filterS { s => filterResult(apply(p)(s).resultEmpty).isValid } - override def toString = "filter(" + p + ")" - override def isTokenStart = p.isTokenStart +private final class Filter[T](p: Parser[T], f: T => Boolean, seen: String, msg: String => String) extends ValidParser[T] { + def filterResult(r: Result[T]) = r.filter(f, msg(seen)) + lazy val result = p.result filter f + lazy val resultEmpty = filterResult(p.resultEmpty) + def derive(c: Char) = filterParser(p derive c, f, seen + c, msg) + def completions(level: Int) = p.completions(level) filterS { s => filterResult(apply(p)(s).resultEmpty).isValid } + override def toString = "filter(" + p + ")" + override def isTokenStart = p.isTokenStart } -private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], partial: Boolean) extends ValidParser[String] -{ - lazy val seen = seenV.mkString - def derive(c: Char) = matched(delegate derive c, seenV :+ c, partial) - def completions(level: Int) = delegate.completions(level) - def result = if(delegate.result.isDefined) Some(seen) else None - def resultEmpty = delegate.resultEmpty match { case f: Failure if !partial => f; case _ => Value(seen) } - override def isTokenStart = delegate.isTokenStart - override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" +private 
final class MatchedString(delegate: Parser[_], seenV: Vector[Char], partial: Boolean) extends ValidParser[String] { + lazy val seen = seenV.mkString + def derive(c: Char) = matched(delegate derive c, seenV :+ c, partial) + def completions(level: Int) = delegate.completions(level) + def result = if (delegate.result.isDefined) Some(seen) else None + def resultEmpty = delegate.resultEmpty match { case f: Failure if !partial => f; case _ => Value(seen) } + override def isTokenStart = delegate.isTokenStart + override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" } -private final class TokenStart[T](delegate: Parser[T], seen: String, complete: TokenCompletions) extends ValidParser[T] -{ - def derive(c: Char) = mkToken( delegate derive c, seen + c, complete) - def completions(level: Int) = complete match { - case dc: TokenCompletions.Delegating => dc.completions(seen, level, delegate.completions(level)) - case fc: TokenCompletions.Fixed => fc.completions(seen, level) - } - def result = delegate.result - def resultEmpty = delegate.resultEmpty - override def isTokenStart = true - override def toString = "token('" + complete + ", " + delegate + ")" +private final class TokenStart[T](delegate: Parser[T], seen: String, complete: TokenCompletions) extends ValidParser[T] { + def derive(c: Char) = mkToken(delegate derive c, seen + c, complete) + def completions(level: Int) = complete match { + case dc: TokenCompletions.Delegating => dc.completions(seen, level, delegate.completions(level)) + case fc: TokenCompletions.Fixed => fc.completions(seen, level) + } + def result = delegate.result + def resultEmpty = delegate.resultEmpty + override def isTokenStart = true + override def toString = "token('" + complete + ", " + delegate + ")" } -private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] -{ - lazy val result = tuple(a.result,b.result) map { _._1 } - def derive(c: Char) = (a derive c) & (b derive c) - def completions(level: Int) = 
a.completions(level).filterS(s => apply(b)(s).resultEmpty.isValid ) - lazy val resultEmpty = a.resultEmpty && b.resultEmpty - override def toString = "(%s) && (%s)".format(a,b) +private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] { + lazy val result = tuple(a.result, b.result) map { _._1 } + def derive(c: Char) = (a derive c) & (b derive c) + def completions(level: Int) = a.completions(level).filterS(s => apply(b)(s).resultEmpty.isValid) + lazy val resultEmpty = a.resultEmpty && b.resultEmpty + override def toString = "(%s) && (%s)".format(a, b) } -private final class Not(delegate: Parser[_], failMessage: String) extends ValidParser[Unit] -{ - def derive(c: Char) = if(delegate.valid) not(delegate derive c, failMessage) else this - def completions(level: Int) = Completions.empty - def result = None - lazy val resultEmpty = delegate.resultEmpty match { - case f: Failure => Value(()) - case v: Value[_] => mkFailure(failMessage) - } - override def toString = " -(%s)".format(delegate) +private final class Not(delegate: Parser[_], failMessage: String) extends ValidParser[Unit] { + def derive(c: Char) = if (delegate.valid) not(delegate derive c, failMessage) else this + def completions(level: Int) = Completions.empty + def result = None + lazy val resultEmpty = delegate.resultEmpty match { + case f: Failure => Value(()) + case v: Value[_] => mkFailure(failMessage) + } + override def toString = " -(%s)".format(delegate) } /** @@ -739,115 +744,105 @@ private final class Not(delegate: Parser[_], failMessage: String) extends ValidP * @param removeInvalidExamples indicates whether to remove examples that are deemed invalid by the delegate parser. * @tparam T the type of value produced by the parser. 
*/ -private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] -{ - def derive(c: Char) = - examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) +private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] { + def derive(c: Char) = + examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) - def result = delegate.result + def result = delegate.result - lazy val resultEmpty = delegate.resultEmpty + lazy val resultEmpty = delegate.resultEmpty - def completions(level: Int) = { - if(exampleSource().isEmpty) - if(resultEmpty.isValid) Completions.nil else Completions.empty - else { - val examplesBasedOnTheResult = filteredExamples.take(maxNumberOfExamples).toSet - Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) - } - } + def completions(level: Int) = { + if (exampleSource().isEmpty) + if (resultEmpty.isValid) Completions.nil else Completions.empty + else { + val examplesBasedOnTheResult = filteredExamples.take(maxNumberOfExamples).toSet + Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) + } + } - override def toString = "examples(" + delegate + ", " + exampleSource().take(2).toList + ")" + override def toString = "examples(" + delegate + ", " + exampleSource().take(2).toList + ")" - private def filteredExamples: Iterable[String] = { - if (removeInvalidExamples) - exampleSource().filter(isExampleValid) - else - exampleSource() - } + private def filteredExamples: Iterable[String] = { + if (removeInvalidExamples) + exampleSource().filter(isExampleValid) + else + exampleSource() + } - private def isExampleValid(example: String): Boolean = { - 
apply(delegate)(example).resultEmpty.isValid - } + private def isExampleValid(example: String): Boolean = { + apply(delegate)(example).resultEmpty.isValid + } } -private final class StringLiteral(str: String, start: Int) extends ValidParser[String] -{ - assert(0 <= start && start < str.length) - def failMsg = "Expected '" + str + "'" - def resultEmpty = mkFailure(failMsg) - def result = None - def derive(c: Char) = if(str.charAt(start) == c) stringLiteral(str, start+1) else new Invalid(resultEmpty) - def completions(level: Int) = Completions.single(Completion.suggestion(str.substring(start))) - override def toString = '"' + str + '"' +private final class StringLiteral(str: String, start: Int) extends ValidParser[String] { + assert(0 <= start && start < str.length) + def failMsg = "Expected '" + str + "'" + def resultEmpty = mkFailure(failMsg) + def result = None + def derive(c: Char) = if (str.charAt(start) == c) stringLiteral(str, start + 1) else new Invalid(resultEmpty) + def completions(level: Int) = Completions.single(Completion.suggestion(str.substring(start))) + override def toString = '"' + str + '"' } -private final class CharacterClass(f: Char => Boolean, label: String) extends ValidParser[Char] -{ - def result = None - def resultEmpty = mkFailure("Expected " + label) - def derive(c: Char) = if( f(c) ) success(c) else Invalid(resultEmpty) - def completions(level: Int) = Completions.empty - override def toString = "class(" + label + ")" +private final class CharacterClass(f: Char => Boolean, label: String) extends ValidParser[Char] { + def result = None + def resultEmpty = mkFailure("Expected " + label) + def derive(c: Char) = if (f(c)) success(c) else Invalid(resultEmpty) + def completions(level: Int) = Completions.empty + override def toString = "class(" + label + ")" } -private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] -{ - def result = delegate.result map some.fn - def resultEmpty = Value(None) - def derive(c: Char) = 
(delegate derive c).map(some.fn) - def completions(level: Int) = Completion.empty +: delegate.completions(level) - override def toString = delegate.toString + "?" +private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] { + def result = delegate.result map some.fn + def resultEmpty = Value(None) + def derive(c: Char) = (delegate derive c).map(some.fn) + def completions(level: Int) = Completion.empty +: delegate.completions(level) + override def toString = delegate.toString + "?" } -private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends ValidParser[Seq[T]] -{ - assume(0 <= min, "Minimum occurences must be non-negative") - assume(max >= min, "Minimum occurences must be less than the maximum occurences") +private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends ValidParser[Seq[T]] { + assume(0 <= min, "Minimum occurrences must be non-negative") + assume(max >= min, "Minimum occurrences must be less than or equal to the maximum occurrences") - def derive(c: Char) = - partial match - { - case Some(part) => - val partD = repeat(Some(part derive c), repeated, min, max, accumulatedReverse) - part.resultEmpty match - { - case Value(pv) => partD | repeatDerive(c, pv :: accumulatedReverse) - case _: Failure => partD - } - case None => repeatDerive(c, accumulatedReverse) - } + def derive(c: Char) = + partial match { + case Some(part) => + val partD = repeat(Some(part derive c), repeated, min, max, accumulatedReverse) + part.resultEmpty match { + case Value(pv) => partD | repeatDerive(c, pv :: accumulatedReverse) + case _: Failure => partD + } + case None => repeatDerive(c, accumulatedReverse) + } - def repeatDerive(c: Char, accRev: List[T]): Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, (min - 1) max 0, max.decrement, accRev) + def repeatDerive(c: Char, accRev: List[T]): 
Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, (min - 1) max 0, max.decrement, accRev) - def completions(level: Int) = - { - def pow(comp: Completions, exp: Completions, n: Int): Completions = - if(n == 1) comp else pow(comp x exp, exp, n - 1) + def completions(level: Int) = + { + def pow(comp: Completions, exp: Completions, n: Int): Completions = + if (n == 1) comp else pow(comp x exp, exp, n - 1) - val repC = repeated.completions(level) - val fin = if(min == 0) Completion.empty +: repC else pow(repC, repC, min) - partial match - { - case Some(p) => p.completions(level) x fin - case None => fin - } - } - def result = None - lazy val resultEmpty: Result[Seq[T]] = - { - val partialAccumulatedOption = - partial match - { - case None => Value(accumulatedReverse) - case Some(partialPattern) => partialPattern.resultEmpty.map(_ :: accumulatedReverse) - } - (partialAccumulatedOption app repeatedParseEmpty)(_ reverse_::: _) - } - private def repeatedParseEmpty: Result[List[T]] = - { - if(min == 0) - Value(Nil) - else - // forced determinism - for(value <- repeated.resultEmpty) yield - makeList(min, value) - } - override def toString = "repeat(" + min + "," + max +"," + partial + "," + repeated + ")" + val repC = repeated.completions(level) + val fin = if (min == 0) Completion.empty +: repC else pow(repC, repC, min) + partial match { + case Some(p) => p.completions(level) x fin + case None => fin + } + } + def result = None + lazy val resultEmpty: Result[Seq[T]] = + { + val partialAccumulatedOption = + partial match { + case None => Value(accumulatedReverse) + case Some(partialPattern) => partialPattern.resultEmpty.map(_ :: accumulatedReverse) + } + (partialAccumulatedOption app repeatedParseEmpty)(_ reverse_::: _) + } + private def repeatedParseEmpty: Result[List[T]] = + { + if (min == 0) + Value(Nil) + else + // forced determinism + for (value <- repeated.resultEmpty) yield makeList(min, value) + } + override def toString = "repeat(" + min + "," + max + "," + 
partial + "," + repeated + ")" } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/util/complete/src/main/scala/sbt/complete/Parsers.scala index cb1b15d1a..3183929e8 100644 --- a/util/complete/src/main/scala/sbt/complete/Parsers.scala +++ b/util/complete/src/main/scala/sbt/complete/Parsers.scala @@ -3,244 +3,266 @@ */ package sbt.complete - import Parser._ - import java.io.File - import java.net.URI - import java.lang.Character.{getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL} +import Parser._ +import java.io.File +import java.net.URI +import java.lang.Character.{ getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL } /** Provides standard implementations of commonly useful [[Parser]]s. */ -trait Parsers -{ - /** Matches the end of input, providing no useful result on success. */ - lazy val EOF = not(any) +trait Parsers { + /** Matches the end of input, providing no useful result on success. */ + lazy val EOF = not(any) - /** Parses any single character and provides that character as the result. */ - lazy val any: Parser[Char] = charClass(_ => true, "any character") + /** Parses any single character and provides that character as the result. 
*/ + lazy val any: Parser[Char] = charClass(_ => true, "any character") - /** Set that contains each digit in a String representation.*/ - lazy val DigitSet = Set("0","1","2","3","4","5","6","7","8","9") + /** Set that contains each digit in a String representation.*/ + lazy val DigitSet = Set("0", "1", "2", "3", "4", "5", "6", "7", "8", "9") - /** Parses any single digit and provides that digit as a Char as the result.*/ - lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet + /** Parses any single digit and provides that digit as a Char as the result.*/ + lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet - /** Set containing Chars for hexadecimal digits 0-9 and A-F (but not a-f). */ - lazy val HexDigitSet = Set('0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F') + /** Set containing Chars for hexadecimal digits 0-9 and A-F (but not a-f). */ + lazy val HexDigitSet = Set('0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'A', 'B', 'C', 'D', 'E', 'F') - /** Parses a single hexadecimal digit (0-9, a-f, A-F). */ - lazy val HexDigit = charClass(c => HexDigitSet(c.toUpper), "hex digit") examples HexDigitSet.map(_.toString) + /** Parses a single hexadecimal digit (0-9, a-f, A-F). */ + lazy val HexDigit = charClass(c => HexDigitSet(c.toUpper), "hex digit") examples HexDigitSet.map(_.toString) - /** Parses a single letter, according to Char.isLetter, into a Char. */ - lazy val Letter = charClass(_.isLetter, "letter") + /** Parses a single letter, according to Char.isLetter, into a Char. */ + lazy val Letter = charClass(_.isLetter, "letter") - /** Parses the first Char in an sbt identifier, which must be a [[Letter]].*/ - def IDStart = Letter + /** Parses the first Char in an sbt identifier, which must be a [[Letter]].*/ + def IDStart = Letter - /** Parses an identifier Char other than the first character. 
This includes letters, digits, dash `-`, and underscore `_`.*/ - lazy val IDChar = charClass(isIDChar, "ID character") + /** Parses an identifier Char other than the first character. This includes letters, digits, dash `-`, and underscore `_`.*/ + lazy val IDChar = charClass(isIDChar, "ID character") - /** Parses an identifier String, which must start with [[IDStart]] and contain zero or more [[IDChar]]s after that. */ - lazy val ID = identifier(IDStart, IDChar) + /** Parses an identifier String, which must start with [[IDStart]] and contain zero or more [[IDChar]]s after that. */ + lazy val ID = identifier(IDStart, IDChar) - /** Parses a single operator Char, as allowed by [[isOpChar]]. */ - lazy val OpChar = charClass(isOpChar, "symbol") + /** Parses a single operator Char, as allowed by [[isOpChar]]. */ + lazy val OpChar = charClass(isOpChar, "symbol") - /** Parses a non-empty operator String, which consists only of characters allowed by [[OpChar]]. */ - lazy val Op = OpChar.+.string + /** Parses a non-empty operator String, which consists only of characters allowed by [[OpChar]]. */ + lazy val Op = OpChar.+.string - /** Parses either an operator String defined by [[Op]] or a non-symbolic identifier defined by [[ID]]. */ - lazy val OpOrID = ID | Op + /** Parses either an operator String defined by [[Op]] or a non-symbolic identifier defined by [[ID]]. */ + lazy val OpOrID = ID | Op - /** Parses a single, non-symbolic Scala identifier Char. Valid characters are letters, digits, and the underscore character `_`. */ - lazy val ScalaIDChar = charClass(isScalaIDChar, "Scala identifier character") + /** Parses a single, non-symbolic Scala identifier Char. Valid characters are letters, digits, and the underscore character `_`. */ + lazy val ScalaIDChar = charClass(isScalaIDChar, "Scala identifier character") - /** Parses a non-symbolic Scala-like identifier. 
The identifier must start with [[IDStart]] and contain zero or more [[ScalaIDChar]]s after that.*/ - lazy val ScalaID = identifier(IDStart, ScalaIDChar) + /** Parses a non-symbolic Scala-like identifier. The identifier must start with [[IDStart]] and contain zero or more [[ScalaIDChar]]s after that.*/ + lazy val ScalaID = identifier(IDStart, ScalaIDChar) - /** Parses a String that starts with `start` and is followed by zero or more characters parsed by `rep`.*/ - def identifier(start: Parser[Char], rep: Parser[Char]): Parser[String] = - start ~ rep.* map { case x ~ xs => (x +: xs).mkString } + /** Parses a String that starts with `start` and is followed by zero or more characters parsed by `rep`.*/ + def identifier(start: Parser[Char], rep: Parser[Char]): Parser[String] = + start ~ rep.* map { case x ~ xs => (x +: xs).mkString } - def opOrIDSpaced(s: String): Parser[Char] = - if(DefaultParsers.matches(ID, s)) - OpChar | SpaceClass - else if(DefaultParsers.matches(Op, s)) - IDChar | SpaceClass - else - any + def opOrIDSpaced(s: String): Parser[Char] = + if (DefaultParsers.matches(ID, s)) + OpChar | SpaceClass + else if (DefaultParsers.matches(Op, s)) + IDChar | SpaceClass + else + any - /** Returns true if `c` an operator character. */ - def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) - def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } - /** Returns true if `c` is a dash `-`, a letter, digit, or an underscore `_`. */ - def isIDChar(c: Char) = isScalaIDChar(c) || c == '-' + /** Returns true if `c` is an operator character. 
*/ + def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) + def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } + /** Returns true if `c` is a dash `-`, a letter, digit, or an underscore `_`. */ + def isIDChar(c: Char) = isScalaIDChar(c) || c == '-' - /** Returns true if `c` is a letter, digit, or an underscore `_`. */ - def isScalaIDChar(c: Char) = c.isLetterOrDigit || c == '_' + /** Returns true if `c` is a letter, digit, or an underscore `_`. */ + def isScalaIDChar(c: Char) = c.isLetterOrDigit || c == '_' - def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */',' | '.' => true ; case _ => false } + def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */ ',' | '.' => true; case _ => false } - /** Matches a single character that is not a whitespace character. */ - lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") + /** Matches a single character that is not a whitespace character. */ + lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") - /** Matches a single whitespace character, as determined by Char.isWhitespace.*/ - lazy val SpaceClass = charClass(_.isWhitespace, "whitespace character") + /** Matches a single whitespace character, as determined by Char.isWhitespace.*/ + lazy val SpaceClass = charClass(_.isWhitespace, "whitespace character") - /** Matches a non-empty String consisting of non-whitespace characters. */ - lazy val NotSpace = NotSpaceClass.+.string + /** Matches a non-empty String consisting of non-whitespace characters. */ + lazy val NotSpace = NotSpaceClass.+.string - /** Matches a possibly empty String consisting of non-whitespace characters. */ - lazy val OptNotSpace = NotSpaceClass.*.string + /** Matches a possibly empty String consisting of non-whitespace characters. 
*/ + lazy val OptNotSpace = NotSpaceClass.*.string - /** Matches a non-empty String consisting of whitespace characters. - * The suggested tab completion is a single, constant space character.*/ - lazy val Space = SpaceClass.+.examples(" ") + /** + * Matches a non-empty String consisting of whitespace characters. + * The suggested tab completion is a single, constant space character. + */ + lazy val Space = SpaceClass.+.examples(" ") - /** Matches a possibly empty String consisting of whitespace characters. - * The suggested tab completion is a single, constant space character.*/ - lazy val OptSpace = SpaceClass.*.examples(" ") + /** + * Matches a possibly empty String consisting of whitespace characters. + * The suggested tab completion is a single, constant space character. + */ + lazy val OptSpace = SpaceClass.*.examples(" ") - /** Parses a non-empty String that contains only valid URI characters, as defined by [[URIChar]].*/ - lazy val URIClass = URIChar.+.string !!! "Invalid URI" + /** Parses a non-empty String that contains only valid URI characters, as defined by [[URIChar]].*/ + lazy val URIClass = URIChar.+.string !!! "Invalid URI" - /** Triple-quotes, as used for verbatim quoting.*/ - lazy val VerbatimDQuotes = "\"\"\"" + /** Triple-quotes, as used for verbatim quoting.*/ + lazy val VerbatimDQuotes = "\"\"\"" - /** Double quote character. */ - lazy val DQuoteChar = '\"' + /** Double quote character. */ + lazy val DQuoteChar = '\"' - /** Backslash character. */ - lazy val BackslashChar = '\\' + /** Backslash character. */ + lazy val BackslashChar = '\\' - /** Matches a single double quote. */ - lazy val DQuoteClass = charClass(_ == DQuoteChar, "double-quote character") + /** Matches a single double quote. */ + lazy val DQuoteClass = charClass(_ == DQuoteChar, "double-quote character") - /** Matches any character except a double quote or whitespace. 
*/ - lazy val NotDQuoteSpaceClass = - charClass({ c: Char => (c != DQuoteChar) && !c.isWhitespace }, "non-double-quote-space character") + /** Matches any character except a double quote or whitespace. */ + lazy val NotDQuoteSpaceClass = + charClass({ c: Char => (c != DQuoteChar) && !c.isWhitespace }, "non-double-quote-space character") - /** Matches any character except a double quote or backslash. */ - lazy val NotDQuoteBackslashClass = - charClass({ c: Char => (c != DQuoteChar) && (c != BackslashChar) }, "non-double-quote-backslash character") + /** Matches any character except a double quote or backslash. */ + lazy val NotDQuoteBackslashClass = + charClass({ c: Char => (c != DQuoteChar) && (c != BackslashChar) }, "non-double-quote-backslash character") - /** Matches a single character that is valid somewhere in a URI. */ - lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") + /** Matches a single character that is valid somewhere in a URI. */ + lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") - /** Returns true if `c` is an ASCII letter or digit. */ - def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') + /** Returns true if `c` is an ASCII letter or digit. */ + def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - /** - * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this - * directory will be proposed. - * @return the file that was parsed from the input string. The returned path may or may not exist. - */ - def fileParser(base: File): Parser[File] = - OptSpace ~> StringBasic - .examples(new FileExamples(base)) - .map(new File(_)) + /** + * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this + * directory will be proposed. + * @return the file that was parsed from the input string. 
The returned path may or may not exist. + */ + def fileParser(base: File): Parser[File] = + OptSpace ~> StringBasic + .examples(new FileExamples(base)) + .map(new File(_)) - /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ - lazy val Port = token(IntBasic, "") + /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ + lazy val Port = token(IntBasic, "") - /** Parses a signed integer. */ - lazy val IntBasic = mapOrFail( '-'.? ~ Digit.+ )( Function.tupled(toInt) ) + /** Parses a signed integer. */ + lazy val IntBasic = mapOrFail('-'.? ~ Digit.+)(Function.tupled(toInt)) - /** Parses an unsigned integer. */ - lazy val NatBasic = mapOrFail( Digit.+ )( _.mkString.toInt ) + /** Parses an unsigned integer. */ + lazy val NatBasic = mapOrFail(Digit.+)(_.mkString.toInt) - private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = - (neg.toSeq ++ digits).mkString.toInt + private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = + (neg.toSeq ++ digits).mkString.toInt - /** Parses the lower-case values `true` and `false` into their respesct Boolean values. */ - lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) + /** Parses the lower-case values `true` and `false` into their respective Boolean values. */ + lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) - /** Parses a potentially quoted String value. The value may be verbatim quoted ([[StringVerbatim]]), - * quoted with interpreted escapes ([[StringEscapable]]), or unquoted ([[NotQuoted]]). */ - lazy val StringBasic = StringVerbatim | StringEscapable | NotQuoted + /** + * Parses a potentially quoted String value. The value may be verbatim quoted ([[StringVerbatim]]), + * quoted with interpreted escapes ([[StringEscapable]]), or unquoted ([[NotQuoted]]).
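The `IntBasic`/`toInt` pairing in this hunk is plain string assembly: an optional `'-'` plus the digit characters, joined and converted. As a quick standalone check (a sketch that mirrors the private helper, not sbt's own code):

```scala
// Standalone mirror of the private toInt helper above: an optional sign
// character plus digit characters, assembled into a String and parsed.
object ToIntDemo {
  def toInt(neg: Option[Char], digits: Seq[Char]): Int =
    (neg.toSeq ++ digits).mkString.toInt
}
```

This is why `IntBasic` can reuse `Function.tupled(toInt)`: the parser result is exactly the `(Option[Char], Seq[Char])` pair the helper expects.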
+ */ + lazy val StringBasic = StringVerbatim | StringEscapable | NotQuoted - /** Parses a verbatim quoted String value, discarding the quotes in the result. This kind of quoted text starts with triple quotes `"""` - * and ends at the next triple quotes and may contain any character in between. */ - lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> - any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ - VerbatimDQuotes + /** + * Parses a verbatim quoted String value, discarding the quotes in the result. This kind of quoted text starts with triple quotes `"""` + * and ends at the next triple quotes and may contain any character in between. + */ + lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> + any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ + VerbatimDQuotes - /** Parses a string value, interpreting escapes and discarding the surrounding quotes in the result. - * See [[EscapeSequence]] for supported escapes. */ - lazy val StringEscapable: Parser[String] = - (DQuoteChar ~> (NotDQuoteBackslashClass | EscapeSequence).+.string <~ DQuoteChar | - (DQuoteChar ~ DQuoteChar) ^^^ "") + /** + * Parses a string value, interpreting escapes and discarding the surrounding quotes in the result. + * See [[EscapeSequence]] for supported escapes. + */ + lazy val StringEscapable: Parser[String] = + (DQuoteChar ~> (NotDQuoteBackslashClass | EscapeSequence).+.string <~ DQuoteChar | + (DQuoteChar ~ DQuoteChar) ^^^ "") - /** Parses a single escape sequence into the represented Char. - * Escapes start with a backslash and are followed by `u` for a [[UnicodeEscape]] or by `b`, `t`, `n`, `f`, `r`, `"`, `'`, `\` for standard escapes. 
*/ - lazy val EscapeSequence: Parser[Char] = - BackslashChar ~> ('b' ^^^ '\b' | 't' ^^^ '\t' | 'n' ^^^ '\n' | 'f' ^^^ '\f' | 'r' ^^^ '\r' | - '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) + /** + * Parses a single escape sequence into the represented Char. + * Escapes start with a backslash and are followed by `u` for a [[UnicodeEscape]] or by `b`, `t`, `n`, `f`, `r`, `"`, `'`, `\` for standard escapes. + */ + lazy val EscapeSequence: Parser[Char] = + BackslashChar ~> ('b' ^^^ '\b' | 't' ^^^ '\t' | 'n' ^^^ '\n' | 'f' ^^^ '\f' | 'r' ^^^ '\r' | + '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) - /** Parses a single unicode escape sequence into the represented Char. - * A unicode escape begins with a backslash, followed by a `u` and 4 hexadecimal digits representing the unicode value. */ - lazy val UnicodeEscape: Parser[Char] = - ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } + /** + * Parses a single unicode escape sequence into the represented Char. + * A unicode escape begins with a backslash, followed by a `u` and 4 hexadecimal digits representing the unicode value. + */ + lazy val UnicodeEscape: Parser[Char] = + ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } - /** Parses an unquoted, non-empty String value that cannot start with a double quote and cannot contain whitespace.*/ - lazy val NotQuoted = (NotDQuoteSpaceClass ~ OptNotSpace) map { case (c, s) => c.toString + s } + /** Parses an unquoted, non-empty String value that cannot start with a double quote and cannot contain whitespace.*/ + lazy val NotQuoted = (NotDQuoteSpaceClass ~ OptNotSpace) map { case (c, s) => c.toString + s } - /** Applies `rep` zero or more times, separated by `sep`. - * The result is the (possibly empty) sequence of results from the multiple `rep` applications. The `sep` results are discarded. 
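The escape handling above (`EscapeSequence` and `UnicodeEscape`) can be sketched without sbt's combinators. The following standalone Scala object (the name `EscapeDemo` is hypothetical, not part of sbt) interprets the same set of escapes, taking the text that follows the backslash:

```scala
// Standalone sketch of the escape interpretation performed by
// EscapeSequence/UnicodeEscape above. Input is the text after the
// leading backslash; returns None for an unrecognized escape.
object EscapeDemo {
  // The standard single-character escapes recognized by EscapeSequence.
  private val standard = Map(
    'b' -> '\b', 't' -> '\t', 'n' -> '\n', 'f' -> '\f', 'r' -> '\r',
    '"' -> '\"', '\'' -> '\'', '\\' -> '\\')

  def decodeEscape(s: String): Option[Char] =
    if (s.length == 1) standard.get(s.head)
    else if (s.length == 5 && s.head == 'u' &&
             s.tail.forall(c => Character.digit(c, 16) >= 0))
      Some(Integer.parseInt(s.tail, 16).toChar) // a \ u escape: exactly 4 hex digits
    else None
}
```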
*/ - def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = - rep1sep(rep, sep) ?? Nil + /** + * Applies `rep` zero or more times, separated by `sep`. + * The result is the (possibly empty) sequence of results from the multiple `rep` applications. The `sep` results are discarded. + */ + def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = + rep1sep(rep, sep) ?? Nil - /** Applies `rep` one or more times, separated by `sep`. - * The result is the non-empty sequence of results from the multiple `rep` applications. The `sep` results are discarded. */ - def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = - (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } + /** + * Applies `rep` one or more times, separated by `sep`. + * The result is the non-empty sequence of results from the multiple `rep` applications. The `sep` results are discarded. + */ + def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = + (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } - /** Wraps the result of `p` in `Some`.*/ - def some[T](p: Parser[T]): Parser[Option[T]] = p map { v => Some(v) } + /** Wraps the result of `p` in `Some`.*/ + def some[T](p: Parser[T]): Parser[Option[T]] = p map { v => Some(v) } - /** Applies `f` to the result of `p`, transforming any exception when evaluating - * `f` into a parse failure with the exception `toString` as the message.*/ - def mapOrFail[S,T](p: Parser[S])(f: S => T): Parser[T] = - p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } + /** + * Applies `f` to the result of `p`, transforming any exception when evaluating + * `f` into a parse failure with the exception `toString` as the message. + */ + def mapOrFail[S, T](p: Parser[S])(f: S => T): Parser[T] = + p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } - /** Parses a space-delimited, possibly empty sequence of arguments. 
- * The arguments may use quotes and escapes according to [[StringBasic]]. */ - def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(StringBasic, display)).* <~ SpaceClass.* + /** + * Parses a space-delimited, possibly empty sequence of arguments. + * The arguments may use quotes and escapes according to [[StringBasic]]. + */ + def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(StringBasic, display)).* <~ SpaceClass.* - /** Applies `p` and uses `true` as the result if it succeeds and turns failure into a result of `false`. */ - def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false + /** Applies `p` and uses `true` as the result if it succeeds and turns failure into a result of `false`. */ + def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false - /** Defines a sequence parser where the parser used for each part depends on the previously parsed values. - * `p` is applied to the (possibly empty) sequence of already parsed values to obtain the next parser to use. - * The parsers obtained in this way are separated by `sep`, whose result is discarded and only the sequence - * of values from the parsers returned by `p` is used for the result. */ - def repeatDep[A](p: Seq[A] => Parser[A], sep: Parser[Any]): Parser[Seq[A]] = - { - def loop(acc: Seq[A]): Parser[Seq[A]] = { - val next = (sep ~> p(acc)) flatMap { result => loop(acc :+ result) } - next ?? acc - } - p(Vector()) flatMap { first => loop(Seq(first)) } - } + /** + * Defines a sequence parser where the parser used for each part depends on the previously parsed values. + * `p` is applied to the (possibly empty) sequence of already parsed values to obtain the next parser to use. + * The parsers obtained in this way are separated by `sep`, whose result is discarded and only the sequence + * of values from the parsers returned by `p` is used for the result. 
+ */ + def repeatDep[A](p: Seq[A] => Parser[A], sep: Parser[Any]): Parser[Seq[A]] = + { + def loop(acc: Seq[A]): Parser[Seq[A]] = { + val next = (sep ~> p(acc)) flatMap { result => loop(acc :+ result) } + next ?? acc + } + p(Vector()) flatMap { first => loop(Seq(first)) } + } - /** Applies String.trim to the result of `p`. */ - def trimmed(p: Parser[String]) = p map { _.trim } + /** Applies String.trim to the result of `p`. */ + def trimmed(p: Parser[String]) = p map { _.trim } - /** Parses a URI that is valid according to the single argument java.net.URI constructor. */ - lazy val basicUri = mapOrFail(URIClass)( uri => new URI(uri)) + /** Parses a URI that is valid according to the single argument java.net.URI constructor. */ + lazy val basicUri = mapOrFail(URIClass)(uri => new URI(uri)) - /** Parses a URI that is valid according to the single argument java.net.URI constructor, using `ex` as tab completion examples. */ - def Uri(ex: Set[URI]) = basicUri examples(ex.map(_.toString)) + /** Parses a URI that is valid according to the single argument java.net.URI constructor, using `ex` as tab completion examples. */ + def Uri(ex: Set[URI]) = basicUri examples (ex.map(_.toString)) } /** Provides standard [[Parser]] implementations. */ object Parsers extends Parsers /** Provides common [[Parser]] implementations and helper methods.*/ -object DefaultParsers extends Parsers with ParserMain -{ - /** Applies parser `p` to input `s` and returns `true` if the parse was successful. */ - def matches(p: Parser[_], s: String): Boolean = - apply(p)(s).resultEmpty.isValid +object DefaultParsers extends Parsers with ParserMain { + /** Applies parser `p` to input `s` and returns `true` if the parse was successful. 
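The accumulation loop in `repeatDep` above is easier to see with parsers modeled as plain functions. This is a standalone sketch (the toy `P[A]` type and `word` parser are illustrative assumptions, not sbt's `Parser`):

```scala
// Toy model of repeatDep: a "parser" consumes a prefix of the input or
// fails, and the parser for each part is chosen from the values parsed
// so far.
object RepeatDepDemo {
  type P[A] = String => Option[(A, String)]

  // Parses one comma-free word not already seen, showing how the next
  // parser can depend on previously parsed values.
  def word(seen: Seq[String]): P[String] = { in =>
    val w = in.takeWhile(_ != ',')
    if (w.nonEmpty && !seen.contains(w)) Some((w, in.drop(w.length))) else None
  }

  // Comma separator; its result is discarded, as in repeatDep.
  val sep: P[Unit] = in => if (in.startsWith(",")) Some(((), in.tail)) else None

  // The same loop as repeatDep: keep applying sep followed by p(acc).
  def repeatDep[A](p: Seq[A] => P[A]): P[Seq[A]] = { in =>
    def loop(acc: Seq[A], rest: String): (Seq[A], String) =
      sep(rest).flatMap { case (_, r1) =>
        p(acc)(r1).map { case (a, r2) => (acc :+ a, r2) }
      } match {
        case Some((acc2, r2)) => loop(acc2, r2)
        case None             => (acc, rest)
      }
    p(Nil)(in).map { case (first, r) => loop(Seq(first), r) }
  }
}
```

Here `word(seen)` rejects values already parsed, so a duplicate terminates the repetition exactly as a failing `p(acc)` would in `repeatDep`.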
*/ + def matches(p: Parser[_], s: String): Boolean = + apply(p)(s).resultEmpty.isValid - /** Returns `true` if `s` parses successfully according to [[ID]].*/ - def validID(s: String): Boolean = matches(ID, s) + /** Returns `true` if `s` parses successfully according to [[ID]].*/ + def validID(s: String): Boolean = matches(ID, s) } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/ProcessError.scala b/util/complete/src/main/scala/sbt/complete/ProcessError.scala index 76ea2f71d..7e6c9794e 100644 --- a/util/complete/src/main/scala/sbt/complete/ProcessError.scala +++ b/util/complete/src/main/scala/sbt/complete/ProcessError.scala @@ -1,30 +1,29 @@ package sbt.complete -object ProcessError -{ - def apply(command: String, msgs: Seq[String], index: Int): String = - { - val (line, modIndex) = extractLine(command, index) - val point = pointerSpace(command, modIndex) - msgs.mkString("\n") + "\n" + line + "\n" + point + "^" - } - def extractLine(s: String, i: Int): (String, Int) = - { - val notNewline = (c: Char) => c != '\n' && c != '\r' - val left = takeRightWhile( s.substring(0, i) )( notNewline ) - val right = s substring i takeWhile notNewline - (left + right, left.length) - } - def takeRightWhile(s: String)(pred: Char => Boolean): String = - { - def loop(i: Int): String = - if(i < 0) - s - else if( pred(s(i)) ) - loop(i-1) - else - s.substring(i+1) - loop(s.length - 1) - } - def pointerSpace(s: String, i: Int): String = (s take i) map { case '\t' => '\t'; case _ => ' ' } mkString; +object ProcessError { + def apply(command: String, msgs: Seq[String], index: Int): String = + { + val (line, modIndex) = extractLine(command, index) + val point = pointerSpace(command, modIndex) + msgs.mkString("\n") + "\n" + line + "\n" + point + "^" + } + def extractLine(s: String, i: Int): (String, Int) = + { + val notNewline = (c: Char) => c != '\n' && c != '\r' + val left = takeRightWhile(s.substring(0, i))(notNewline) + val right = s substring i takeWhile 
notNewline + (left + right, left.length) + } + def takeRightWhile(s: String)(pred: Char => Boolean): String = + { + def loop(i: Int): String = + if (i < 0) + s + else if (pred(s(i))) + loop(i - 1) + else + s.substring(i + 1) + loop(s.length - 1) + } + def pointerSpace(s: String, i: Int): String = (s take i) map { case '\t' => '\t'; case _ => ' ' } mkString; } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/TokenCompletions.scala b/util/complete/src/main/scala/sbt/complete/TokenCompletions.scala index aee6353db..96e70d2f1 100644 --- a/util/complete/src/main/scala/sbt/complete/TokenCompletions.scala +++ b/util/complete/src/main/scala/sbt/complete/TokenCompletions.scala @@ -1,38 +1,37 @@ package sbt.complete - import Completion.{displayStrict, token => ctoken, tokenDisplay} +import Completion.{ displayStrict, token => ctoken, tokenDisplay } sealed trait TokenCompletions { - def hideWhen(f: Int => Boolean): TokenCompletions + def hideWhen(f: Int => Boolean): TokenCompletions } -object TokenCompletions -{ - private[sbt] abstract class Delegating extends TokenCompletions { outer => - def completions(seen: String, level: Int, delegate: Completions): Completions - final def hideWhen(hide: Int => Boolean): TokenCompletions = new Delegating { - def completions(seen: String, level: Int, delegate: Completions): Completions = - if(hide(level)) Completions.nil else outer.completions(seen, level, delegate) - } - } - private[sbt] abstract class Fixed extends TokenCompletions { outer => - def completions(seen: String, level: Int): Completions - final def hideWhen(hide: Int => Boolean): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = - if(hide(level)) Completions.nil else outer.completions(seen, level) - } - } +object TokenCompletions { + private[sbt] abstract class Delegating extends TokenCompletions { outer => + def completions(seen: String, level: Int, delegate: Completions): Completions + final def hideWhen(hide: 
Int => Boolean): TokenCompletions = new Delegating { + def completions(seen: String, level: Int, delegate: Completions): Completions = + if (hide(level)) Completions.nil else outer.completions(seen, level, delegate) + } + } + private[sbt] abstract class Fixed extends TokenCompletions { outer => + def completions(seen: String, level: Int): Completions + final def hideWhen(hide: Int => Boolean): TokenCompletions = new Fixed { + def completions(seen: String, level: Int) = + if (hide(level)) Completions.nil else outer.completions(seen, level) + } + } - val default: TokenCompletions = mapDelegateCompletions((seen,level,c) => ctoken(seen, c.append)) + val default: TokenCompletions = mapDelegateCompletions((seen, level, c) => ctoken(seen, c.append)) - def displayOnly(msg: String): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = Completions.single(displayStrict(msg)) - } - def overrideDisplay(msg: String): TokenCompletions = mapDelegateCompletions((seen,level,c) => tokenDisplay(display = msg, append = c.append)) + def displayOnly(msg: String): TokenCompletions = new Fixed { + def completions(seen: String, level: Int) = Completions.single(displayStrict(msg)) + } + def overrideDisplay(msg: String): TokenCompletions = mapDelegateCompletions((seen, level, c) => tokenDisplay(display = msg, append = c.append)) - def fixed(f: (String, Int) => Completions): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = f(seen, level) - } - def mapDelegateCompletions(f: (String, Int, Completion) => Completion): TokenCompletions = new Delegating { - def completions(seen: String, level: Int, delegate: Completions) = Completions( delegate.get.map(c => f(seen, level, c)) ) - } + def fixed(f: (String, Int) => Completions): TokenCompletions = new Fixed { + def completions(seen: String, level: Int) = f(seen, level) + } + def mapDelegateCompletions(f: (String, Int, Completion) => Completion): TokenCompletions = new Delegating { + def 
completions(seen: String, level: Int, delegate: Completions) = Completions(delegate.get.map(c => f(seen, level, c))) + } } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/TypeString.scala b/util/complete/src/main/scala/sbt/complete/TypeString.scala index 976b672e2..6bf89ac05 100644 --- a/util/complete/src/main/scala/sbt/complete/TypeString.scala +++ b/util/complete/src/main/scala/sbt/complete/TypeString.scala @@ -1,77 +1,79 @@ package sbt.complete - import DefaultParsers._ - import TypeString._ +import DefaultParsers._ +import TypeString._ -/** Basic representation of types parsed from Manifest.toString. -* This can only represent the structure of parameterized types. -* All other types are represented by a TypeString with an empty `args`. */ -private[sbt] final class TypeString(val base: String, val args: List[TypeString]) -{ - override def toString = - if(base.startsWith(FunctionName)) - args.dropRight(1).mkString("(", ",", ")") + " => " + args.last - else if(base.startsWith(TupleName)) - args.mkString("(",",",")") - else - cleanupTypeName(base) + (if(args.isEmpty) "" else args.mkString("[", ",", "]")) +/** + * Basic representation of types parsed from Manifest.toString. + * This can only represent the structure of parameterized types. + * All other types are represented by a TypeString with an empty `args`. 
+ */ +private[sbt] final class TypeString(val base: String, val args: List[TypeString]) { + override def toString = + if (base.startsWith(FunctionName)) + args.dropRight(1).mkString("(", ",", ")") + " => " + args.last + else if (base.startsWith(TupleName)) + args.mkString("(", ",", ")") + else + cleanupTypeName(base) + (if (args.isEmpty) "" else args.mkString("[", ",", "]")) } -private[sbt] object TypeString -{ - /** Makes the string representation of a type as returned by Manifest.toString more readable.*/ - def cleanup(typeString: String): String = - parse(typeString, typeStringParser) match { - case Right(ts) => ts.toString - case Left(err) => typeString - } +private[sbt] object TypeString { + /** Makes the string representation of a type as returned by Manifest.toString more readable.*/ + def cleanup(typeString: String): String = + parse(typeString, typeStringParser) match { + case Right(ts) => ts.toString + case Left(err) => typeString + } - /** Makes a fully qualified type name provided by Manifest.toString more readable. - * The argument should be just a name (like scala.Tuple2) and not a full type (like scala.Tuple2[Int,Boolean])*/ - def cleanupTypeName(base: String): String = - dropPrefix(base).replace('$', '.') + /** + * Makes a fully qualified type name provided by Manifest.toString more readable. + * The argument should be just a name (like scala.Tuple2) and not a full type (like scala.Tuple2[Int,Boolean]) + */ + def cleanupTypeName(base: String): String = + dropPrefix(base).replace('$', '.') - /** Removes prefixes from a fully qualified type name that are unnecessary in the presence of standard imports for an sbt setting. 
- * This does not use the compiler and is therefore a conservative approximation.*/ - def dropPrefix(base: String): String = - if(base.startsWith(SbtPrefix)) - base.substring(SbtPrefix.length) - else if(base.startsWith(CollectionPrefix)) - { - val simple = base.substring(CollectionPrefix.length) - if(ShortenCollection(simple)) simple else base - } - else if(base.startsWith(ScalaPrefix)) - base.substring(ScalaPrefix.length) - else if(base.startsWith(JavaPrefix)) - base.substring(JavaPrefix.length) - else - TypeMap.getOrElse(base, base) + /** + * Removes prefixes from a fully qualified type name that are unnecessary in the presence of standard imports for an sbt setting. + * This does not use the compiler and is therefore a conservative approximation. + */ + def dropPrefix(base: String): String = + if (base.startsWith(SbtPrefix)) + base.substring(SbtPrefix.length) + else if (base.startsWith(CollectionPrefix)) { + val simple = base.substring(CollectionPrefix.length) + if (ShortenCollection(simple)) simple else base + } else if (base.startsWith(ScalaPrefix)) + base.substring(ScalaPrefix.length) + else if (base.startsWith(JavaPrefix)) + base.substring(JavaPrefix.length) + else + TypeMap.getOrElse(base, base) - final val CollectionPrefix = "scala.collection." - final val FunctionName = "scala.Function" - final val TupleName = "scala.Tuple" - final val SbtPrefix = "sbt." - final val ScalaPrefix = "scala." - final val JavaPrefix = "java.lang." - /* scala.collection.X -> X */ - val ShortenCollection = Set("Seq", "List", "Set", "Map", "Iterable") - val TypeMap = Map( - "java.io.File" -> "File", - "java.net.URL" -> "URL", - "java.net.URI" -> "URI" - ) + final val CollectionPrefix = "scala.collection." + final val FunctionName = "scala.Function" + final val TupleName = "scala.Tuple" + final val SbtPrefix = "sbt." + final val ScalaPrefix = "scala." + final val JavaPrefix = "java.lang." 
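The prefix-dropping rules above can be exercised in isolation. This simplified standalone copy (covering only the `scala.collection` and `TypeMap` cases of `dropPrefix`, not the code as shipped) shows why unknown collection types keep their full name:

```scala
// Simplified sketch of TypeString.dropPrefix, restricted to the
// scala.collection special case and the TypeMap fallback.
object DropPrefixDemo {
  val ShortenCollection = Set("Seq", "List", "Set", "Map", "Iterable")
  val TypeMap = Map(
    "java.io.File" -> "File",
    "java.net.URL" -> "URL",
    "java.net.URI" -> "URI")

  def dropPrefix(base: String): String =
    if (base.startsWith("scala.collection.")) {
      val simple = base.stripPrefix("scala.collection.")
      // Only well-known collection names are shortened; anything else
      // keeps its full prefix, so the approximation stays conservative.
      if (ShortenCollection(simple)) simple else base
    } else TypeMap.getOrElse(base, base)
}
```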
+ /* scala.collection.X -> X */ + val ShortenCollection = Set("Seq", "List", "Set", "Map", "Iterable") + val TypeMap = Map( + "java.io.File" -> "File", + "java.net.URL" -> "URL", + "java.net.URI" -> "URI" + ) - /** A Parser that extracts basic structure from the string representation of a type from Manifest.toString. - * This is rudimentary and essentially only decomposes the string into names and arguments for parameterized types. - * */ - lazy val typeStringParser: Parser[TypeString] = - { - def isFullScalaIDChar(c: Char) = isScalaIDChar(c) || c == '.' || c == '$' - lazy val fullScalaID = identifier(IDStart, charClass(isFullScalaIDChar, "Scala identifier character") ) - lazy val tpe: Parser[TypeString] = - for( id <- fullScalaID; args <- ('[' ~> rep1sep(tpe, ',') <~ ']').?) yield - new TypeString(id, args.toList.flatten) - tpe - } + /** + * A Parser that extracts basic structure from the string representation of a type from Manifest.toString. + * This is rudimentary and essentially only decomposes the string into names and arguments for parameterized types. + */ + lazy val typeStringParser: Parser[TypeString] = + { + def isFullScalaIDChar(c: Char) = isScalaIDChar(c) || c == '.' || c == '$' + lazy val fullScalaID = identifier(IDStart, charClass(isFullScalaIDChar, "Scala identifier character")) + lazy val tpe: Parser[TypeString] = + for (id <- fullScalaID; args <- ('[' ~> rep1sep(tpe, ',') <~ ']').?) 
yield new TypeString(id, args.toList.flatten) + tpe + } } \ No newline at end of file diff --git a/util/complete/src/main/scala/sbt/complete/UpperBound.scala b/util/complete/src/main/scala/sbt/complete/UpperBound.scala index ba1a69ef9..66a32e1a2 100644 --- a/util/complete/src/main/scala/sbt/complete/UpperBound.scala +++ b/util/complete/src/main/scala/sbt/complete/UpperBound.scala @@ -3,45 +3,45 @@ */ package sbt.complete -sealed trait UpperBound -{ - /** True if and only if the given value meets this bound.*/ - def >=(min: Int): Boolean - /** True if and only if this bound is one.*/ - def isOne: Boolean - /** True if and only if this bound is zero.*/ - def isZero: Boolean - /** If this bound is zero or Infinite, `decrement` returns this bound. - * Otherwise, this bound is finite and greater than zero and `decrement` returns the bound that is one less than this bound.*/ - def decrement: UpperBound - /** True if and only if this is unbounded.*/ - def isInfinite: Boolean +sealed trait UpperBound { + /** True if and only if the given value meets this bound.*/ + def >=(min: Int): Boolean + /** True if and only if this bound is one.*/ + def isOne: Boolean + /** True if and only if this bound is zero.*/ + def isZero: Boolean + /** + * If this bound is zero or Infinite, `decrement` returns this bound. + * Otherwise, this bound is finite and greater than zero and `decrement` returns the bound that is one less than this bound. + */ + def decrement: UpperBound + /** True if and only if this is unbounded.*/ + def isInfinite: Boolean } /** Represents unbounded. */ -case object Infinite extends UpperBound -{ - /** All finite numbers meet this bound. */ - def >=(min: Int) = true - def isOne = false - def isZero = false - def decrement = this - def isInfinite = true - override def toString = "Infinity" +case object Infinite extends UpperBound { + /** All finite numbers meet this bound. 
*/ + def >=(min: Int) = true + def isOne = false + def isZero = false + def decrement = this + def isInfinite = true + override def toString = "Infinity" } -/** Represents a finite upper bound. The maximum allowed value is 'value', inclusive. -* It must positive. */ -final case class Finite(value: Int) extends UpperBound -{ - assume(value >= 0, "Maximum occurences must be nonnegative.") +/** + * Represents a finite upper bound. The maximum allowed value is 'value', inclusive. + * It must be positive. + */ +final case class Finite(value: Int) extends UpperBound { + assume(value >= 0, "Maximum occurrences must be nonnegative.") - def >=(min: Int) = value >= min - def isOne = value == 1 - def isZero = value == 0 - def decrement = Finite( (value - 1) max 0 ) - def isInfinite = false - override def toString = value.toString + def >=(min: Int) = value >= min + def isOne = value == 1 + def isZero = value == 0 + def decrement = Finite((value - 1) max 0) + def isInfinite = false + override def toString = value.toString } -object UpperBound -{ - implicit def intToFinite(i: Int): Finite = Finite(i) +object UpperBound { + implicit def intToFinite(i: Int): Finite = Finite(i) } \ No newline at end of file diff --git a/util/control/src/main/scala/sbt/ErrorHandling.scala b/util/control/src/main/scala/sbt/ErrorHandling.scala index b6e616ae3..70eba7d2f 100644 --- a/util/control/src/main/scala/sbt/ErrorHandling.scala +++ b/util/control/src/main/scala/sbt/ErrorHandling.scala @@ -3,41 +3,36 @@ */ package sbt - import java.io.IOException +import java.io.IOException -object ErrorHandling -{ - def translate[T](msg: => String)(f: => T) = - try { f } - catch { - case e: IOException => throw new TranslatedIOException(msg + e.toString, e) - case e: Exception => throw new TranslatedException(msg + e.toString, e) - } +object ErrorHandling { + def translate[T](msg: => String)(f: => T) = + try { f } + catch { + case e: IOException => throw new TranslatedIOException(msg + e.toString, e) + case e:
Exception => throw new TranslatedException(msg + e.toString, e) + } - def wideConvert[T](f: => T): Either[Throwable, T] = - try { Right(f) } - catch - { - case ex @ (_: Exception | _: StackOverflowError) => Left(ex) - case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err - case x: Throwable => Left(x) - } + def wideConvert[T](f: => T): Either[Throwable, T] = + try { Right(f) } + catch { + case ex @ (_: Exception | _: StackOverflowError) => Left(ex) + case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err + case x: Throwable => Left(x) + } - def convert[T](f: => T): Either[Exception, T] = - try { Right(f) } - catch { case e: Exception => Left(e) } + def convert[T](f: => T): Either[Exception, T] = + try { Right(f) } + catch { case e: Exception => Left(e) } - def reducedToString(e: Throwable): String = - if(e.getClass == classOf[RuntimeException]) - { - val msg = e.getMessage - if(msg == null || msg.isEmpty) e.toString else msg - } - else - e.toString + def reducedToString(e: Throwable): String = + if (e.getClass == classOf[RuntimeException]) { + val msg = e.getMessage + if (msg == null || msg.isEmpty) e.toString else msg + } else + e.toString } -sealed class TranslatedException private[sbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) -{ - override def toString = msg +sealed class TranslatedException private[sbt] (msg: String, cause: Throwable) extends RuntimeException(msg, cause) { + override def toString = msg } -final class TranslatedIOException private[sbt](msg: String, cause: IOException) extends TranslatedException(msg, cause) +final class TranslatedIOException private[sbt] (msg: String, cause: IOException) extends TranslatedException(msg, cause) diff --git a/util/control/src/main/scala/sbt/ExitHook.scala b/util/control/src/main/scala/sbt/ExitHook.scala index de85bff42..8ee5ddf86 100644 --- a/util/control/src/main/scala/sbt/ExitHook.scala +++ b/util/control/src/main/scala/sbt/ExitHook.scala @@ -4,21 +4,18 @@ package 
sbt /** Defines a function to call as sbt exits.*/ -trait ExitHook -{ - /** Subclasses should implement this method, which is called when this hook is executed. */ - def runBeforeExiting(): Unit +trait ExitHook { + /** Subclasses should implement this method, which is called when this hook is executed. */ + def runBeforeExiting(): Unit } -object ExitHook -{ - def apply(f: => Unit): ExitHook = new ExitHook { def runBeforeExiting() = f } +object ExitHook { + def apply(f: => Unit): ExitHook = new ExitHook { def runBeforeExiting() = f } } -object ExitHooks -{ - /** Calls each registered exit hook, trapping any exceptions so that each hook is given a chance to run. */ - def runExitHooks(exitHooks: Seq[ExitHook]): Seq[Throwable] = - exitHooks.flatMap( hook => - ErrorHandling.wideConvert( hook.runBeforeExiting() ).left.toOption - ) +object ExitHooks { + /** Calls each registered exit hook, trapping any exceptions so that each hook is given a chance to run. */ + def runExitHooks(exitHooks: Seq[ExitHook]): Seq[Throwable] = + exitHooks.flatMap(hook => + ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption + ) } \ No newline at end of file diff --git a/util/control/src/main/scala/sbt/MessageOnlyException.scala b/util/control/src/main/scala/sbt/MessageOnlyException.scala index 75b7737d8..ab4727b95 100644 --- a/util/control/src/main/scala/sbt/MessageOnlyException.scala +++ b/util/control/src/main/scala/sbt/MessageOnlyException.scala @@ -5,14 +5,20 @@ package sbt final class MessageOnlyException(override val toString: String) extends RuntimeException(toString) -/** A dummy exception for the top-level exception handler to know that an exception -* has been handled, but is being passed further up to indicate general failure. */ +/** + * A dummy exception for the top-level exception handler to know that an exception + * has been handled, but is being passed further up to indicate general failure. 
+ */ final class AlreadyHandledException(val underlying: Throwable) extends RuntimeException -/** A marker trait for a top-level exception handler to know that this exception -* doesn't make sense to display. */ +/** + * A marker trait for a top-level exception handler to know that this exception + * doesn't make sense to display. + */ trait UnprintableException extends Throwable -/** A marker trait that refines UnprintableException to indicate to a top-level exception handler -* that the code throwing this exception has already provided feedback to the user about the error condition. */ +/** + * A marker trait that refines UnprintableException to indicate to a top-level exception handler + * that the code throwing this exception has already provided feedback to the user about the error condition. + */ trait FeedbackProvidedException extends UnprintableException diff --git a/util/log/src/main/scala/sbt/BasicLogger.scala b/util/log/src/main/scala/sbt/BasicLogger.scala index c58dc57c6..7fe59e8c0 100644 --- a/util/log/src/main/scala/sbt/BasicLogger.scala +++ b/util/log/src/main/scala/sbt/BasicLogger.scala @@ -4,15 +4,14 @@ package sbt /** Implements the level-setting methods of Logger.*/ -abstract class BasicLogger extends AbstractLogger -{ - private var traceEnabledVar = java.lang.Integer.MAX_VALUE - private var level: Level.Value = Level.Info - private var successEnabledVar = true - def successEnabled = synchronized { successEnabledVar } - def setSuccessEnabled(flag: Boolean): Unit = synchronized { successEnabledVar = flag } - def getLevel = synchronized { level } - def setLevel(newLevel: Level.Value): Unit = synchronized { level = newLevel } - def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level } - def getTrace = synchronized { traceEnabledVar } +abstract class BasicLogger extends AbstractLogger { + private var traceEnabledVar = java.lang.Integer.MAX_VALUE + private var level: Level.Value = Level.Info + private var successEnabledVar = true + def 
successEnabled = synchronized { successEnabledVar } + def setSuccessEnabled(flag: Boolean): Unit = synchronized { successEnabledVar = flag } + def getLevel = synchronized { level } + def setLevel(newLevel: Level.Value): Unit = synchronized { level = newLevel } + def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level } + def getTrace = synchronized { traceEnabledVar } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/BufferedLogger.scala b/util/log/src/main/scala/sbt/BufferedLogger.scala index 0b9d7a593..a40d3f1be 100644 --- a/util/log/src/main/scala/sbt/BufferedLogger.scala +++ b/util/log/src/main/scala/sbt/BufferedLogger.scala @@ -3,94 +3,93 @@ */ package sbt - import scala.collection.mutable.ListBuffer +import scala.collection.mutable.ListBuffer -/** A logger that can buffer the logging done on it and then can flush the buffer -* to the delegate logger provided in the constructor. Use 'startRecording' to -* start buffering and then 'play' from to flush the buffer to the backing logger. -* The logging level set at the time a message is originally logged is used, not -* the level at the time 'play' is called. -* -* This class assumes that it is the only client of the delegate logger. -* */ -class BufferedLogger(delegate: AbstractLogger) extends BasicLogger -{ - private[this] val buffer = new ListBuffer[LogEvent] - private[this] var recording = false +/** + * A logger that can buffer the logging done on it and then can flush the buffer + * to the delegate logger provided in the constructor. Use 'record' to + * start buffering and then 'play' to flush the buffer to the backing logger. + * The logging level set at the time a message is originally logged is used, not + * the level at the time 'play' is called. + * + * This class assumes that it is the only client of the delegate logger.
+ */ +class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { + private[this] val buffer = new ListBuffer[LogEvent] + private[this] var recording = false - /** Enables buffering. */ - def record() = synchronized { recording = true } - def buffer[T](f: => T): T = { - record() - try { f } - finally { stopQuietly() } - } - def bufferQuietly[T](f: => T): T = { - record() - try - { - val result = f - clear() - result - } - catch { case e: Throwable => stopQuietly(); throw e } - } - def stopQuietly() = synchronized { try { stop() } catch { case e: Exception => () } } + /** Enables buffering. */ + def record() = synchronized { recording = true } + def buffer[T](f: => T): T = { + record() + try { f } + finally { stopQuietly() } + } + def bufferQuietly[T](f: => T): T = { + record() + try { + val result = f + clear() + result + } catch { case e: Throwable => stopQuietly(); throw e } + } + def stopQuietly() = synchronized { try { stop() } catch { case e: Exception => () } } - /** Flushes the buffer to the delegate logger. This method calls logAll on the delegate - * so that the messages are written consecutively. The buffer is cleared in the process. */ - def play(): Unit = synchronized { delegate.logAll(buffer.readOnly); buffer.clear() } - /** Clears buffered events and disables buffering. */ - def clear(): Unit = synchronized { buffer.clear(); recording = false } - /** Plays buffered events and disables buffering. */ - def stop(): Unit = synchronized { play(); clear() } + /** + * Flushes the buffer to the delegate logger. This method calls logAll on the delegate + * so that the messages are written consecutively. The buffer is cleared in the process. + */ + def play(): Unit = synchronized { delegate.logAll(buffer.readOnly); buffer.clear() } + /** Clears buffered events and disables buffering. */ + def clear(): Unit = synchronized { buffer.clear(); recording = false } + /** Plays buffered events and disables buffering. 
*/ + def stop(): Unit = synchronized { play(); clear() } - override def ansiCodesSupported = delegate.ansiCodesSupported - override def setLevel(newLevel: Level.Value): Unit = synchronized { - super.setLevel(newLevel) - if(recording) - buffer += new SetLevel(newLevel) - else - delegate.setLevel(newLevel) - } - override def setSuccessEnabled(flag: Boolean): Unit = synchronized { - super.setSuccessEnabled(flag) - if(recording) - buffer += new SetSuccess(flag) - else - delegate.setSuccessEnabled(flag) - } - override def setTrace(level: Int): Unit = synchronized { - super.setTrace(level) - if(recording) - buffer += new SetTrace(level) - else - delegate.setTrace(level) - } + override def ansiCodesSupported = delegate.ansiCodesSupported + override def setLevel(newLevel: Level.Value): Unit = synchronized { + super.setLevel(newLevel) + if (recording) + buffer += new SetLevel(newLevel) + else + delegate.setLevel(newLevel) + } + override def setSuccessEnabled(flag: Boolean): Unit = synchronized { + super.setSuccessEnabled(flag) + if (recording) + buffer += new SetSuccess(flag) + else + delegate.setSuccessEnabled(flag) + } + override def setTrace(level: Int): Unit = synchronized { + super.setTrace(level) + if (recording) + buffer += new SetTrace(level) + else + delegate.setTrace(level) + } - def trace(t: => Throwable): Unit = - doBufferableIf(traceEnabled, new Trace(t), _.trace(t)) - def success(message: => String): Unit = - doBufferable(Level.Info, new Success(message), _.success(message)) - def log(level: Level.Value, message: => String): Unit = - doBufferable(level, new Log(level, message), _.log(level, message)) - def logAll(events: Seq[LogEvent]): Unit = synchronized { - if(recording) - buffer ++= events - else - delegate.logAll(events) - } - def control(event: ControlEvent.Value, message: => String): Unit = - doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message)) - private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, 
doUnbuffered: AbstractLogger => Unit): Unit = - doBufferableIf(atLevel(level), appendIfBuffered, doUnbuffered) - private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = synchronized { - if(condition) - { - if(recording) - buffer += appendIfBuffered - else - doUnbuffered(delegate) - } - } + def trace(t: => Throwable): Unit = + doBufferableIf(traceEnabled, new Trace(t), _.trace(t)) + def success(message: => String): Unit = + doBufferable(Level.Info, new Success(message), _.success(message)) + def log(level: Level.Value, message: => String): Unit = + doBufferable(level, new Log(level, message), _.log(level, message)) + def logAll(events: Seq[LogEvent]): Unit = synchronized { + if (recording) + buffer ++= events + else + delegate.logAll(events) + } + def control(event: ControlEvent.Value, message: => String): Unit = + doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message)) + private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = + doBufferableIf(atLevel(level), appendIfBuffered, doUnbuffered) + private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = synchronized { + if (condition) { + if (recording) + buffer += appendIfBuffered + else + doUnbuffered(delegate) + } + } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index e5c8f040f..a614e4315 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -3,182 +3,175 @@ */ package sbt -import java.io.{BufferedWriter, PrintStream, PrintWriter} +import java.io.{ BufferedWriter, PrintStream, PrintWriter } import java.util.Locale -object ConsoleLogger -{ - @deprecated("Moved to ConsoleOut", "0.13.0") - def systemOut: ConsoleOut = 
ConsoleOut.systemOut +object ConsoleLogger { + @deprecated("Moved to ConsoleOut", "0.13.0") + def systemOut: ConsoleOut = ConsoleOut.systemOut - @deprecated("Moved to ConsoleOut", "0.13.0") - def overwriteContaining(s: String): (String,String) => Boolean = ConsoleOut.overwriteContaining(s) + @deprecated("Moved to ConsoleOut", "0.13.0") + def overwriteContaining(s: String): (String, String) => Boolean = ConsoleOut.overwriteContaining(s) - @deprecated("Moved to ConsoleOut", "0.13.0") - def systemOutOverwrite(f: (String,String) => Boolean): ConsoleOut = ConsoleOut.systemOutOverwrite(f) + @deprecated("Moved to ConsoleOut", "0.13.0") + def systemOutOverwrite(f: (String, String) => Boolean): ConsoleOut = ConsoleOut.systemOutOverwrite(f) - @deprecated("Moved to ConsoleOut", "0.13.0") - def printStreamOut(out: PrintStream): ConsoleOut = ConsoleOut.printStreamOut(out) + @deprecated("Moved to ConsoleOut", "0.13.0") + def printStreamOut(out: PrintStream): ConsoleOut = ConsoleOut.printStreamOut(out) - @deprecated("Moved to ConsoleOut", "0.13.0") - def printWriterOut(out: PrintWriter): ConsoleOut = ConsoleOut.printWriterOut(out) + @deprecated("Moved to ConsoleOut", "0.13.0") + def printWriterOut(out: PrintWriter): ConsoleOut = ConsoleOut.printWriterOut(out) - @deprecated("Moved to ConsoleOut", "0.13.0") - def bufferedWriterOut(out: BufferedWriter): ConsoleOut = bufferedWriterOut(out) + @deprecated("Moved to ConsoleOut", "0.13.0") + def bufferedWriterOut(out: BufferedWriter): ConsoleOut = ConsoleOut.bufferedWriterOut(out) - /** Escape character, used to introduce an escape sequence. */ - final val ESC = '\u001B' + /** Escape character, used to introduce an escape sequence. */ + final val ESC = '\u001B' - /** An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). - * It is the final character in an escape sequence.
*/ - def isEscapeTerminator(c: Char): Boolean = - c >= '@' && c <= '~' + /** + * An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). + * It is the final character in an escape sequence. + */ + def isEscapeTerminator(c: Char): Boolean = + c >= '@' && c <= '~' - /** Returns true if the string contains the ESC character. */ - def hasEscapeSequence(s: String): Boolean = - s.indexOf(ESC) >= 0 + /** Returns true if the string contains the ESC character. */ + def hasEscapeSequence(s: String): Boolean = + s.indexOf(ESC) >= 0 - /** Returns the string `s` with escape sequences removed. - * An escape sequence starts with the ESC character (decimal value 27) and ends with an escape terminator. - * @see isEscapeTerminator - */ - def removeEscapeSequences(s: String): String = - if(s.isEmpty || !hasEscapeSequence(s)) - s - else - { - val sb = new java.lang.StringBuilder - nextESC(s, 0, sb) - sb.toString - } - private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder) - { - val escIndex = s.indexOf(ESC, start) - if(escIndex < 0) - sb.append(s, start, s.length) - else { - sb.append(s, start, escIndex) - val next = skipESC(s, escIndex+1) - nextESC(s, next, sb) - } - } - + /** + * Returns the string `s` with escape sequences removed. + * An escape sequence starts with the ESC character (decimal value 27) and ends with an escape terminator. + * @see isEscapeTerminator + */ + def removeEscapeSequences(s: String): String = + if (s.isEmpty || !hasEscapeSequence(s)) + s + else { + val sb = new java.lang.StringBuilder + nextESC(s, 0, sb) + sb.toString + } + private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder) { + val escIndex = s.indexOf(ESC, start) + if (escIndex < 0) + sb.append(s, start, s.length) + else { + sb.append(s, start, escIndex) + val next = skipESC(s, escIndex + 1) + nextESC(s, next, sb) + } + } - /** Skips the escape sequence starting at `i-1`. 
`i` should be positioned at the character after the ESC that starts the sequence. */ - private[this] def skipESC(s: String, i: Int): Int = - if(i >= s.length) - i - else if( isEscapeTerminator(s.charAt(i)) ) - i+1 - else - skipESC(s, i+1) + /** Skips the escape sequence starting at `i-1`. `i` should be positioned at the character after the ESC that starts the sequence. */ + private[this] def skipESC(s: String, i: Int): Int = + if (i >= s.length) + i + else if (isEscapeTerminator(s.charAt(i))) + i + 1 + else + skipESC(s, i + 1) - val formatEnabled = - { - import java.lang.Boolean.{getBoolean, parseBoolean} - val value = System.getProperty("sbt.log.format") - if(value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) - } - private[this] def jline1to2CompatMsg = "Found class jline.Terminal, but interface was expected" + val formatEnabled = + { + import java.lang.Boolean.{ getBoolean, parseBoolean } + val value = System.getProperty("sbt.log.format") + if (value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) + } + private[this] def jline1to2CompatMsg = "Found class jline.Terminal, but interface was expected" - private[this] def ansiSupported = - try { - val terminal = jline.TerminalFactory.get - terminal.restore // #460 - terminal.isAnsiSupported - } catch { - case e: Exception => !isWindows + private[this] def ansiSupported = + try { + val terminal = jline.TerminalFactory.get + terminal.restore // #460 + terminal.isAnsiSupported + } catch { + case e: Exception => !isWindows - // sbt 0.13 drops JLine 1.0 from the launcher and uses 2.x as a normal dependency - // when 0.13 is used with a 0.12 launcher or earlier, the JLine classes from the launcher get loaded - // this results in a linkage error as detected below. 
The detection is likely jvm specific, but the priority - // is avoiding mistakenly identifying something as a launcher incompatibility when it is not - case e: IncompatibleClassChangeError if e.getMessage == jline1to2CompatMsg => - throw new IncompatibleClassChangeError("JLine incompatibility detected. Check that the sbt launcher is version 0.13.x or later.") - } + // sbt 0.13 drops JLine 1.0 from the launcher and uses 2.x as a normal dependency + // when 0.13 is used with a 0.12 launcher or earlier, the JLine classes from the launcher get loaded + // this results in a linkage error as detected below. The detection is likely jvm specific, but the priority + // is avoiding mistakenly identifying something as a launcher incompatibility when it is not + case e: IncompatibleClassChangeError if e.getMessage == jline1to2CompatMsg => + throw new IncompatibleClassChangeError("JLine incompatibility detected. Check that the sbt launcher is version 0.13.x or later.") + } - val noSuppressedMessage = (_: SuppressedTraceContext) => None + val noSuppressedMessage = (_: SuppressedTraceContext) => None - private[this] def os = System.getProperty("os.name") - private[this] def isWindows = os.toLowerCase(Locale.ENGLISH).indexOf("windows") >= 0 - - def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) - def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) - def apply(out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = formatEnabled, - useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = noSuppressedMessage): ConsoleLogger = - new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) + private[this] def os = System.getProperty("os.name") + private[this] def isWindows = os.toLowerCase(Locale.ENGLISH).indexOf("windows") >= 0 - private[this] val EscapeSequence = (27.toChar + "[^@-~]*[@-~]").r - def stripEscapeSequences(s: String): String = - 
EscapeSequence.pattern.matcher(s).replaceAll("") + def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) + def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) + def apply(out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = formatEnabled, + useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = noSuppressedMessage): ConsoleLogger = + new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) + + private[this] val EscapeSequence = (27.toChar + "[^@-~]*[@-~]").r + def stripEscapeSequences(s: String): String = + EscapeSequence.pattern.matcher(s).replaceAll("") } -/** A logger that logs to the console. On supported systems, the level labels are -* colored. -* -* This logger is not thread-safe.*/ -class ConsoleLogger private[ConsoleLogger](val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger -{ - import scala.Console.{BLUE, GREEN, RED, RESET, YELLOW} - def messageColor(level: Level.Value) = RESET - def labelColor(level: Level.Value) = - level match - { - case Level.Error => RED - case Level.Warn => YELLOW - case _ => RESET - } - def successLabelColor = GREEN - def successMessageColor = RESET - override def success(message: => String) - { - if(successEnabled) - log(successLabelColor, Level.SuccessLabel, successMessageColor, message) - } - def trace(t: => Throwable): Unit = - out.lockObject.synchronized - { - val traceLevel = getTrace - if(traceLevel >= 0) - out.print(StackTrace.trimmed(t, traceLevel)) - if(traceLevel <= 2) - for(msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useColor))) - printLabeledLine(labelColor(Level.Error), "trace", messageColor(Level.Error), msg) - } - def log(level: Level.Value, message: => String) - { - if(atLevel(level)) - log(labelColor(level), level.toString, 
messageColor(level), message) - } - private def reset(): Unit = setColor(RESET) - - private def setColor(color: String) - { - if(ansiCodesSupported && useColor) - out.lockObject.synchronized { out.print(color) } - } - private def log(labelColor: String, label: String, messageColor: String, message: String): Unit = - out.lockObject.synchronized - { - for(line <- message.split("""\n""")) - printLabeledLine(labelColor, label, messageColor, line) - } - private def printLabeledLine(labelColor: String, label: String, messageColor: String, line: String): Unit = - { - reset() - out.print("[") - setColor(labelColor) - out.print(label) - reset() - out.print("] ") - setColor(messageColor) - out.print(line) - reset() - out.println() - } +/** + * A logger that logs to the console. On supported systems, the level labels are + * colored. + * + * This logger is not thread-safe. + */ +class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger { + import scala.Console.{ BLUE, GREEN, RED, RESET, YELLOW } + def messageColor(level: Level.Value) = RESET + def labelColor(level: Level.Value) = + level match { + case Level.Error => RED + case Level.Warn => YELLOW + case _ => RESET + } + def successLabelColor = GREEN + def successMessageColor = RESET + override def success(message: => String) { + if (successEnabled) + log(successLabelColor, Level.SuccessLabel, successMessageColor, message) + } + def trace(t: => Throwable): Unit = + out.lockObject.synchronized { + val traceLevel = getTrace + if (traceLevel >= 0) + out.print(StackTrace.trimmed(t, traceLevel)) + if (traceLevel <= 2) + for (msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useColor))) + printLabeledLine(labelColor(Level.Error), "trace", messageColor(Level.Error), msg) + } + def log(level: Level.Value, message: => String) { + if 
(atLevel(level)) + log(labelColor(level), level.toString, messageColor(level), message) + } + private def reset(): Unit = setColor(RESET) - def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } - def control(event: ControlEvent.Value, message: => String) - { log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } + private def setColor(color: String) { + if (ansiCodesSupported && useColor) + out.lockObject.synchronized { out.print(color) } + } + private def log(labelColor: String, label: String, messageColor: String, message: String): Unit = + out.lockObject.synchronized { + for (line <- message.split("""\n""")) + printLabeledLine(labelColor, label, messageColor, line) + } + private def printLabeledLine(labelColor: String, label: String, messageColor: String, line: String): Unit = + { + reset() + out.print("[") + setColor(labelColor) + out.print(label) + reset() + out.print("] ") + setColor(messageColor) + out.print(line) + reset() + out.println() + } + + def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } + def control(event: ControlEvent.Value, message: => String) { log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } } final class SuppressedTraceContext(val traceLevel: Int, val useColor: Boolean) diff --git a/util/log/src/main/scala/sbt/ConsoleOut.scala b/util/log/src/main/scala/sbt/ConsoleOut.scala index 07f17ff72..41367757b 100644 --- a/util/log/src/main/scala/sbt/ConsoleOut.scala +++ b/util/log/src/main/scala/sbt/ConsoleOut.scala @@ -1,62 +1,62 @@ package sbt - import java.io.{BufferedWriter, PrintStream, PrintWriter} +import java.io.{ BufferedWriter, PrintStream, PrintWriter } -sealed trait ConsoleOut -{ - val lockObject: AnyRef - def print(s: String): Unit - def println(s: String): Unit - def println(): Unit +sealed trait ConsoleOut { + val lockObject: AnyRef + def print(s: String): Unit + def println(s: String): Unit + def println(): Unit } -object ConsoleOut -{ - 
def systemOut: ConsoleOut = printStreamOut(System.out) +object ConsoleOut { + def systemOut: ConsoleOut = printStreamOut(System.out) - def overwriteContaining(s: String): (String,String) => Boolean = (cur, prev) => - cur.contains(s) && prev.contains(s) + def overwriteContaining(s: String): (String, String) => Boolean = (cur, prev) => + cur.contains(s) && prev.contains(s) - /** Move to beginning of previous line and clear the line. */ - private[this] final val OverwriteLine = "\r\u001BM\u001B[2K" + /** Move to beginning of previous line and clear the line. */ + private[this] final val OverwriteLine = "\r\u001BM\u001B[2K" - /** ConsoleOut instance that is backed by System.out. It overwrites the previously printed line - * if the function `f(lineToWrite, previousLine)` returns true. - * - * The ConsoleOut returned by this method assumes that the only newlines are from println calls - * and not in the String arguments. */ - def systemOutOverwrite(f: (String,String) => Boolean): ConsoleOut = new ConsoleOut { - val lockObject = System.out - private[this] var last: Option[String] = None - private[this] var current = new java.lang.StringBuffer - def print(s: String): Unit = synchronized { current.append(s) } - def println(s: String): Unit = synchronized { current.append(s); println() } - def println(): Unit = synchronized { - val s = current.toString - if(ConsoleLogger.formatEnabled && last.exists(lmsg => f(s, lmsg))) - lockObject.print(OverwriteLine) - lockObject.println(s) - last = Some(s) - current = new java.lang.StringBuffer - } - } + /** + * ConsoleOut instance that is backed by System.out. It overwrites the previously printed line + * if the function `f(lineToWrite, previousLine)` returns true. + * + * The ConsoleOut returned by this method assumes that the only newlines are from println calls + * and not in the String arguments. 
+ */ + def systemOutOverwrite(f: (String, String) => Boolean): ConsoleOut = new ConsoleOut { + val lockObject = System.out + private[this] var last: Option[String] = None + private[this] var current = new java.lang.StringBuffer + def print(s: String): Unit = synchronized { current.append(s) } + def println(s: String): Unit = synchronized { current.append(s); println() } + def println(): Unit = synchronized { + val s = current.toString + if (ConsoleLogger.formatEnabled && last.exists(lmsg => f(s, lmsg))) + lockObject.print(OverwriteLine) + lockObject.println(s) + last = Some(s) + current = new java.lang.StringBuffer + } + } - def printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { - val lockObject = out - def print(s: String) = out.print(s) - def println(s: String) = out.println(s) - def println() = out.println() - } - def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { - val lockObject = out - def print(s: String) = out.print(s) - def println(s: String) = { out.println(s); out.flush() } - def println() = { out.println(); out.flush() } - } - def bufferedWriterOut(out: BufferedWriter): ConsoleOut = new ConsoleOut { - val lockObject = out - def print(s: String) = out.write(s) - def println(s: String) = { out.write(s); println() } - def println() = { out.newLine(); out.flush() } - } + def printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.print(s) + def println(s: String) = out.println(s) + def println() = out.println() + } + def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.print(s) + def println(s: String) = { out.println(s); out.flush() } + def println() = { out.println(); out.flush() } + } + def bufferedWriterOut(out: BufferedWriter): ConsoleOut = new ConsoleOut { + val lockObject = out + def print(s: String) = out.write(s) + def println(s: String) = { out.write(s); println() } + def println() = { 
out.newLine(); out.flush() } + } } diff --git a/util/log/src/main/scala/sbt/FilterLogger.scala b/util/log/src/main/scala/sbt/FilterLogger.scala index 59048c381..d3547f34f 100644 --- a/util/log/src/main/scala/sbt/FilterLogger.scala +++ b/util/log/src/main/scala/sbt/FilterLogger.scala @@ -3,35 +3,31 @@ */ package sbt -/** A filter logger is used to delegate messages but not the logging level to another logger. This means -* that messages are logged at the higher of the two levels set by this logger and its delegate. -* */ -class FilterLogger(delegate: AbstractLogger) extends BasicLogger -{ - override lazy val ansiCodesSupported = delegate.ansiCodesSupported - def trace(t: => Throwable) - { - if(traceEnabled) - delegate.trace(t) - } - override def setSuccessEnabled(flag: Boolean) { delegate.setSuccessEnabled(flag) } - override def successEnabled = delegate.successEnabled - override def setTrace(level: Int) { delegate.setTrace(level) } - override def getTrace = delegate.getTrace - def log(level: Level.Value, message: => String) - { - if(atLevel(level)) - delegate.log(level, message) - } - def success(message: => String) - { - if(successEnabled) - delegate.success(message) - } - def control(event: ControlEvent.Value, message: => String) - { - if(atLevel(Level.Info)) - delegate.control(event, message) - } - def logAll(events: Seq[LogEvent]): Unit = delegate.logAll(events) +/** + * A filter logger is used to delegate messages but not the logging level to another logger. This means + * that messages are logged at the higher of the two levels set by this logger and its delegate. 
+ */ +class FilterLogger(delegate: AbstractLogger) extends BasicLogger { + override lazy val ansiCodesSupported = delegate.ansiCodesSupported + def trace(t: => Throwable) { + if (traceEnabled) + delegate.trace(t) + } + override def setSuccessEnabled(flag: Boolean) { delegate.setSuccessEnabled(flag) } + override def successEnabled = delegate.successEnabled + override def setTrace(level: Int) { delegate.setTrace(level) } + override def getTrace = delegate.getTrace + def log(level: Level.Value, message: => String) { + if (atLevel(level)) + delegate.log(level, message) + } + def success(message: => String) { + if (successEnabled) + delegate.success(message) + } + def control(event: ControlEvent.Value, message: => String) { + if (atLevel(Level.Info)) + delegate.control(event, message) + } + def logAll(events: Seq[LogEvent]): Unit = delegate.logAll(events) } diff --git a/util/log/src/main/scala/sbt/FullLogger.scala b/util/log/src/main/scala/sbt/FullLogger.scala index ca88f0b4d..968712317 100644 --- a/util/log/src/main/scala/sbt/FullLogger.scala +++ b/util/log/src/main/scala/sbt/FullLogger.scala @@ -4,32 +4,27 @@ package sbt /** Promotes the simple Logger interface to the full AbstractLogger interface. 
*/ -class FullLogger(delegate: Logger) extends BasicLogger -{ - override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported - def trace(t: => Throwable) - { - if(traceEnabled) - delegate.trace(t) - } - def log(level: Level.Value, message: => String) - { - if(atLevel(level)) - delegate.log(level, message) - } - def success(message: => String): Unit = - if(successEnabled) - delegate.success(message) - def control(event: ControlEvent.Value, message: => String): Unit = - info(message) - def logAll(events: Seq[LogEvent]): Unit = events.foreach(log) +class FullLogger(delegate: Logger) extends BasicLogger { + override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported + def trace(t: => Throwable) { + if (traceEnabled) + delegate.trace(t) + } + def log(level: Level.Value, message: => String) { + if (atLevel(level)) + delegate.log(level, message) + } + def success(message: => String): Unit = + if (successEnabled) + delegate.success(message) + def control(event: ControlEvent.Value, message: => String): Unit = + info(message) + def logAll(events: Seq[LogEvent]): Unit = events.foreach(log) } -object FullLogger -{ - def apply(delegate: Logger): AbstractLogger = - delegate match - { - case d: AbstractLogger => d - case _ => new FullLogger(delegate) - } +object FullLogger { + def apply(delegate: Logger): AbstractLogger = + delegate match { + case d: AbstractLogger => d + case _ => new FullLogger(delegate) + } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/GlobalLogging.scala b/util/log/src/main/scala/sbt/GlobalLogging.scala index 63eb9805a..1cd32653b 100644 --- a/util/log/src/main/scala/sbt/GlobalLogging.scala +++ b/util/log/src/main/scala/sbt/GlobalLogging.scala @@ -3,41 +3,44 @@ */ package sbt - import java.io.{File, PrintWriter} +import java.io.{ File, PrintWriter } -/** Provides the current global logging configuration. -* -* `full` is the current global logger. 
It should not be set directly because it is generated as needed from `backing.newLogger`. -* `console` is where all logging from all ConsoleLoggers should go. -* `backed` is the Logger that other loggers should feed into. -* `backing` tracks the files that persist the global logging. -* `newLogger` creates a new global logging configuration from a sink and backing configuration. -*/ +/** + * Provides the current global logging configuration. + * + * `full` is the current global logger. It should not be set directly because it is generated as needed from `backing.newLogger`. + * `console` is where all logging from all ConsoleLoggers should go. + * `backed` is the Logger that other loggers should feed into. + * `backing` tracks the files that persist the global logging. + * `newLogger` creates a new global logging configuration from a sink and backing configuration. + */ final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking, newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging) -/** Tracks the files that persist the global logging. -* `file` is the current backing file. `last` is the previous backing file, if there is one. -* `newBackingFile` creates a new temporary location for the next backing file. */ -final case class GlobalLogBacking(file: File, last: Option[File], newBackingFile: () => File) -{ - /** Shifts the current backing file to `last` and sets the current backing to `newFile`. */ - def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newBackingFile) +/** + * Tracks the files that persist the global logging. + * `file` is the current backing file. `last` is the previous backing file, if there is one. + * `newBackingFile` creates a new temporary location for the next backing file. + */ +final case class GlobalLogBacking(file: File, last: Option[File], newBackingFile: () => File) { + /** Shifts the current backing file to `last` and sets the current backing to `newFile`. 
*/ + def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newBackingFile) - /** Shifts the current backing file to `last` and sets the current backing to a new temporary file generated by `newBackingFile`. */ - def shiftNew() = shift(newBackingFile()) + /** Shifts the current backing file to `last` and sets the current backing to a new temporary file generated by `newBackingFile`. */ + def shiftNew() = shift(newBackingFile()) - /** If there is a previous backing file in `last`, that becomes the current backing file and the previous backing is cleared. - * Otherwise, no changes are made. */ - def unshift = GlobalLogBacking(last getOrElse file, None, newBackingFile) + /** + * If there is a previous backing file in `last`, that becomes the current backing file and the previous backing is cleared. + * Otherwise, no changes are made. + */ + def unshift = GlobalLogBacking(last getOrElse file, None, newBackingFile) } object GlobalLogBacking { - def apply(newBackingFile: => File): GlobalLogBacking = GlobalLogBacking(newBackingFile, None, newBackingFile _) + def apply(newBackingFile: => File): GlobalLogBacking = GlobalLogBacking(newBackingFile, None, newBackingFile _) } -object GlobalLogging -{ - def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = - { - val log = ConsoleLogger(console) - GlobalLogging(log, console, log, GlobalLogBacking(newBackingFile), newLogger) - } +object GlobalLogging { + def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = + { + val log = ConsoleLogger(console) + GlobalLogging(log, console, log, GlobalLogBacking(newBackingFile), newLogger) + } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/Level.scala b/util/log/src/main/scala/sbt/Level.scala index f501cd40c..7744b9495 100644 --- a/util/log/src/main/scala/sbt/Level.scala +++ 
b/util/log/src/main/scala/sbt/Level.scala @@ -1,25 +1,28 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ - package sbt +package sbt -/** An enumeration defining the levels available for logging. A level includes all of the levels -* with id larger than its own id. For example, Warn (id=3) includes Error (id=4).*/ -object Level extends Enumeration -{ - val Debug = Value(1, "debug") - val Info = Value(2, "info") - val Warn = Value(3, "warn") - val Error = Value(4, "error") - /** Defines the label to use for success messages. - * Because the label for levels is defined in this module, the success label is also defined here. */ - val SuccessLabel = "success" +/** + * An enumeration defining the levels available for logging. A level includes all of the levels + * with id larger than its own id. For example, Warn (id=3) includes Error (id=4). + */ +object Level extends Enumeration { + val Debug = Value(1, "debug") + val Info = Value(2, "info") + val Warn = Value(3, "warn") + val Error = Value(4, "error") + /** + * Defines the label to use for success messages. + * Because the label for levels is defined in this module, the success label is also defined here. + */ + val SuccessLabel = "success" - def union(a: Value, b: Value) = if(a.id < b.id) a else b - def unionAll(vs: Seq[Value]) = vs reduceLeft union + def union(a: Value, b: Value) = if (a.id < b.id) a else b + def unionAll(vs: Seq[Value]) = vs reduceLeft union - /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. */ - def apply(s: String) = values.find(s == _.toString) - /** Same as apply, defined for use in pattern matching. */ - private[sbt] def unapply(s: String) = apply(s) + /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. */ + def apply(s: String) = values.find(s == _.toString) + /** Same as apply, defined for use in pattern matching. 
*/ + private[sbt] def unapply(s: String) = apply(s) } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/LogEvent.scala b/util/log/src/main/scala/sbt/LogEvent.scala index 7bd91c2a4..d48957c75 100644 --- a/util/log/src/main/scala/sbt/LogEvent.scala +++ b/util/log/src/main/scala/sbt/LogEvent.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ - package sbt +package sbt sealed trait LogEvent extends NotNull final class Success(val msg: String) extends LogEvent @@ -12,7 +12,6 @@ final class SetTrace(val level: Int) extends LogEvent final class SetSuccess(val enabled: Boolean) extends LogEvent final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends LogEvent -object ControlEvent extends Enumeration -{ - val Start, Header, Finish = Value +object ControlEvent extends Enumeration { + val Start, Header, Finish = Value } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index c556f620c..c507484ce 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -1,138 +1,133 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ - package sbt +package sbt - import xsbti.{Logger => xLogger, F0} - import xsbti.{Maybe,Position,Problem,Severity} +import xsbti.{ Logger => xLogger, F0 } +import xsbti.{ Maybe, Position, Problem, Severity } - import java.io.File +import java.io.File -abstract class AbstractLogger extends Logger -{ - def getLevel: Level.Value - def setLevel(newLevel: Level.Value) - def setTrace(flag: Int) - def getTrace: Int - final def traceEnabled = getTrace >= 0 - def successEnabled: Boolean - def setSuccessEnabled(flag: Boolean): Unit +abstract class AbstractLogger extends Logger { + def getLevel: Level.Value + def setLevel(newLevel: Level.Value) + def setTrace(flag: Int) + def getTrace: Int + final def traceEnabled = getTrace >= 0 + def successEnabled: 
Boolean + def setSuccessEnabled(flag: Boolean): Unit - def atLevel(level: Level.Value) = level.id >= getLevel.id - def control(event: ControlEvent.Value, message: => String): Unit + def atLevel(level: Level.Value) = level.id >= getLevel.id + def control(event: ControlEvent.Value, message: => String): Unit - def logAll(events: Seq[LogEvent]): Unit - /** Defined in terms of other methods in Logger and should not be called from them. */ - final def log(event: LogEvent) - { - event match - { - case s: Success => success(s.msg) - case l: Log => log(l.level, l.msg) - case t: Trace => trace(t.exception) - case setL: SetLevel => setLevel(setL.newLevel) - case setT: SetTrace => setTrace(setT.level) - case setS: SetSuccess => setSuccessEnabled(setS.enabled) - case c: ControlEvent => control(c.event, c.msg) - } - } + def logAll(events: Seq[LogEvent]): Unit + /** Defined in terms of other methods in Logger and should not be called from them. */ + final def log(event: LogEvent) { + event match { + case s: Success => success(s.msg) + case l: Log => log(l.level, l.msg) + case t: Trace => trace(t.exception) + case setL: SetLevel => setLevel(setL.newLevel) + case setT: SetTrace => setTrace(setT.level) + case setS: SetSuccess => setSuccessEnabled(setS.enabled) + case c: ControlEvent => control(c.event, c.msg) + } + } } -object Logger -{ - def transferLevels(oldLog: AbstractLogger, newLog: AbstractLogger) { - newLog.setLevel(oldLog.getLevel) - newLog.setTrace(oldLog.getTrace) - } +object Logger { + def transferLevels(oldLog: AbstractLogger, newLog: AbstractLogger) { + newLog.setLevel(oldLog.getLevel) + newLog.setTrace(oldLog.getTrace) + } - // make public in 0.13 - private[sbt] val Null: AbstractLogger = new AbstractLogger { - def getLevel: Level.Value = Level.Error - def setLevel(newLevel: Level.Value) {} - def getTrace = 0 - def setTrace(flag: Int) {} - def successEnabled = false - def setSuccessEnabled(flag: Boolean) {} - def control(event: ControlEvent.Value, message: => String) 
{} - def logAll(events: Seq[LogEvent]) {} - def trace(t: => Throwable) {} - def success(message: => String) {} - def log(level: Level.Value, message: => String) {} - } + // make public in 0.13 + private[sbt] val Null: AbstractLogger = new AbstractLogger { + def getLevel: Level.Value = Level.Error + def setLevel(newLevel: Level.Value) {} + def getTrace = 0 + def setTrace(flag: Int) {} + def successEnabled = false + def setSuccessEnabled(flag: Boolean) {} + def control(event: ControlEvent.Value, message: => String) {} + def logAll(events: Seq[LogEvent]) {} + def trace(t: => Throwable) {} + def success(message: => String) {} + def log(level: Level.Value, message: => String) {} + } - implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger - implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) - implicit def xlog2Log(lg: xLogger): Logger = lg match { - case l: Logger => l - case _ => wrapXLogger(lg) - } - private[this] def wrapXLogger(lg: xLogger): Logger = new Logger { - override def debug(msg: F0[String]): Unit = lg.debug(msg) - override def warn(msg: F0[String]): Unit = lg.warn(msg) - override def info(msg: F0[String]): Unit = lg.info(msg) - override def error(msg: F0[String]): Unit = lg.error(msg) - override def trace(msg: F0[Throwable]) = lg.trace(msg) - override def log(level: Level.Value, msg: F0[String]) = lg.log(level, msg) - def trace(t: => Throwable) = trace(f0(t)) - def success(s: => String) = info(f0(s)) - def log(level: Level.Value, msg: => String) = - { - val fmsg = f0(msg) - level match - { - case Level.Debug => lg.debug(fmsg) - case Level.Info => lg.info(fmsg) - case Level.Warn => lg.warn(fmsg) - case Level.Error => lg.error(fmsg) - } - } - } - def f0[T](t: =>T): F0[T] = new F0[T] { def apply = t } + implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger + implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new 
FullLogger(log)) + implicit def xlog2Log(lg: xLogger): Logger = lg match { + case l: Logger => l + case _ => wrapXLogger(lg) + } + private[this] def wrapXLogger(lg: xLogger): Logger = new Logger { + override def debug(msg: F0[String]): Unit = lg.debug(msg) + override def warn(msg: F0[String]): Unit = lg.warn(msg) + override def info(msg: F0[String]): Unit = lg.info(msg) + override def error(msg: F0[String]): Unit = lg.error(msg) + override def trace(msg: F0[Throwable]) = lg.trace(msg) + override def log(level: Level.Value, msg: F0[String]) = lg.log(level, msg) + def trace(t: => Throwable) = trace(f0(t)) + def success(s: => String) = info(f0(s)) + def log(level: Level.Value, msg: => String) = + { + val fmsg = f0(msg) + level match { + case Level.Debug => lg.debug(fmsg) + case Level.Info => lg.info(fmsg) + case Level.Warn => lg.warn(fmsg) + case Level.Error => lg.error(fmsg) + } + } + } + def f0[T](t: => T): F0[T] = new F0[T] { def apply = t } - def m2o[S](m: Maybe[S]): Option[S] = if(m.isDefined) Some(m.get) else None - def o2m[S](o: Option[S]): Maybe[S] = o match { case Some(v) => Maybe.just(v); case None => Maybe.nothing() } + def m2o[S](m: Maybe[S]): Option[S] = if (m.isDefined) Some(m.get) else None + def o2m[S](o: Option[S]): Maybe[S] = o match { case Some(v) => Maybe.just(v); case None => Maybe.nothing() } - def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = - new Position { - val line = o2m(line0) - val lineContent = content - val offset = o2m(offset0) - val pointer = o2m(pointer0) - val pointerSpace = o2m(pointerSpace0) - val sourcePath = o2m(sourcePath0) - val sourceFile = o2m(sourceFile0) - } + def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = + new 
Position { + val line = o2m(line0) + val lineContent = content + val offset = o2m(offset0) + val pointer = o2m(pointer0) + val pointerSpace = o2m(pointerSpace0) + val sourcePath = o2m(sourcePath0) + val sourceFile = o2m(sourceFile0) + } - def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = - new Problem - { - val category = cat - val position = pos - val message = msg - val severity = sev - } + def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = + new Problem { + val category = cat + val position = pos + val message = msg + val severity = sev + } } -/** This is intended to be the simplest logging interface for use by code that wants to log. -* It does not include configuring the logger. */ -trait Logger extends xLogger -{ - final def verbose(message: => String): Unit = debug(message) - final def debug(message: => String): Unit = log(Level.Debug, message) - final def info(message: => String): Unit = log(Level.Info, message) - final def warn(message: => String): Unit = log(Level.Warn, message) - final def error(message: => String): Unit = log(Level.Error, message) +/** + * This is intended to be the simplest logging interface for use by code that wants to log. + * It does not include configuring the logger. 
+ */ +trait Logger extends xLogger { + final def verbose(message: => String): Unit = debug(message) + final def debug(message: => String): Unit = log(Level.Debug, message) + final def info(message: => String): Unit = log(Level.Info, message) + final def warn(message: => String): Unit = log(Level.Warn, message) + final def error(message: => String): Unit = log(Level.Error, message) - def ansiCodesSupported = false - - def trace(t: => Throwable): Unit - def success(message: => String): Unit - def log(level: Level.Value, message: => String): Unit - - def debug(msg: F0[String]): Unit = log(Level.Debug, msg) - def warn(msg: F0[String]): Unit = log(Level.Warn, msg) - def info(msg: F0[String]): Unit = log(Level.Info, msg) - def error(msg: F0[String]): Unit = log(Level.Error, msg) - def trace(msg: F0[Throwable]) = trace(msg.apply) - def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) + def ansiCodesSupported = false + + def trace(t: => Throwable): Unit + def success(message: => String): Unit + def log(level: Level.Value, message: => String): Unit + + def debug(msg: F0[String]): Unit = log(Level.Debug, msg) + def warn(msg: F0[String]): Unit = log(Level.Warn, msg) + def info(msg: F0[String]): Unit = log(Level.Info, msg) + def error(msg: F0[String]): Unit = log(Level.Error, msg) + def trace(msg: F0[Throwable]) = trace(msg.apply) + def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/LoggerWriter.scala b/util/log/src/main/scala/sbt/LoggerWriter.scala index aeb67ce72..0165676f5 100644 --- a/util/log/src/main/scala/sbt/LoggerWriter.scala +++ b/util/log/src/main/scala/sbt/LoggerWriter.scala @@ -3,49 +3,47 @@ */ package sbt -/** Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. 
-* A line is delimited by `nl`, which is by default the platform line separator.*/ -class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: String = System.getProperty("line.separator")) extends java.io.Writer -{ - def this(delegate: Logger, level: Level.Value) = this(delegate, Some(level)) - def this(delegate: Logger) = this(delegate, None) - - private[this] val buffer = new StringBuilder - private[this] val lines = new collection.mutable.ListBuffer[String] +/** + * Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. + * A line is delimited by `nl`, which is by default the platform line separator. + */ +class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: String = System.getProperty("line.separator")) extends java.io.Writer { + def this(delegate: Logger, level: Level.Value) = this(delegate, Some(level)) + def this(delegate: Logger) = this(delegate, None) - override def close() = flush() - override def flush(): Unit = - synchronized { - if(buffer.length > 0) - { - log(buffer.toString) - buffer.clear() - } - } - def flushLines(level: Level.Value): Unit = - synchronized { - for(line <- lines) - delegate.log(level, line) - lines.clear() - } - override def write(content: Array[Char], offset: Int, length: Int): Unit = - synchronized { - buffer.appendAll(content, offset, length) - process() - } + private[this] val buffer = new StringBuilder + private[this] val lines = new collection.mutable.ListBuffer[String] - private[this] def process() - { - val i = buffer.indexOf(nl) - if(i >= 0) - { - log(buffer.substring(0, i)) - buffer.delete(0, i + nl.length) - process() - } - } - private[this] def log(s: String): Unit = unbufferedLevel match { - case None => lines += s - case Some(level) => delegate.log(level, s) - } + override def close() = flush() + override def flush(): Unit = + synchronized { + if (buffer.length > 0) { + log(buffer.toString) + buffer.clear() + } + } + def 
flushLines(level: Level.Value): Unit = + synchronized { + for (line <- lines) + delegate.log(level, line) + lines.clear() + } + override def write(content: Array[Char], offset: Int, length: Int): Unit = + synchronized { + buffer.appendAll(content, offset, length) + process() + } + + private[this] def process() { + val i = buffer.indexOf(nl) + if (i >= 0) { + log(buffer.substring(0, i)) + buffer.delete(0, i + nl.length) + process() + } + } + private[this] def log(s: String): Unit = unbufferedLevel match { + case None => lines += s + case Some(level) => delegate.log(level, s) + } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/MainLogging.scala b/util/log/src/main/scala/sbt/MainLogging.scala index 5611dbd48..48015ad44 100644 --- a/util/log/src/main/scala/sbt/MainLogging.scala +++ b/util/log/src/main/scala/sbt/MainLogging.scala @@ -1,52 +1,51 @@ package sbt - import java.io.PrintWriter +import java.io.PrintWriter -object MainLogging -{ - def multiLogger(config: MultiLoggerConfig): Logger = - { - import config._ - val multi = new MultiLogger(console :: backed :: extra) - // sets multi to the most verbose for clients that inspect the current level - multi setLevel Level.unionAll(backingLevel :: screenLevel :: extra.map(_.getLevel)) - // set the specific levels - console setLevel screenLevel - backed setLevel backingLevel - console setTrace screenTrace - backed setTrace backingTrace - multi: Logger - } +object MainLogging { + def multiLogger(config: MultiLoggerConfig): Logger = + { + import config._ + val multi = new MultiLogger(console :: backed :: extra) + // sets multi to the most verbose for clients that inspect the current level + multi setLevel Level.unionAll(backingLevel :: screenLevel :: extra.map(_.getLevel)) + // set the specific levels + console setLevel screenLevel + backed setLevel backingLevel + console setTrace screenTrace + backed setTrace backingTrace + multi: Logger + } - def globalDefault(console: ConsoleOut): (PrintWriter, 
GlobalLogBacking) => GlobalLogging = - { - lazy val f: (PrintWriter, GlobalLogBacking) => GlobalLogging = (writer, backing) => { - val backed = defaultBacked()(writer) - val full = multiLogger(defaultMultiConfig(console, backed ) ) - GlobalLogging(full, console, backed, backing, f) - } - f - } + def globalDefault(console: ConsoleOut): (PrintWriter, GlobalLogBacking) => GlobalLogging = + { + lazy val f: (PrintWriter, GlobalLogBacking) => GlobalLogging = (writer, backing) => { + val backed = defaultBacked()(writer) + val full = multiLogger(defaultMultiConfig(console, backed)) + GlobalLogging(full, console, backed, backing, f) + } + f + } - @deprecated("Explicitly specify the console output.", "0.13.0") - def defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = - defaultMultiConfig(ConsoleOut.systemOut, backing) - def defaultMultiConfig(console: ConsoleOut, backing: AbstractLogger): MultiLoggerConfig = - new MultiLoggerConfig(defaultScreen(console, ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) + @deprecated("Explicitly specify the console output.", "0.13.0") + def defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = + defaultMultiConfig(ConsoleOut.systemOut, backing) + def defaultMultiConfig(console: ConsoleOut, backing: AbstractLogger): MultiLoggerConfig = + new MultiLoggerConfig(defaultScreen(console, ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) - @deprecated("Explicitly specify the console output.", "0.13.0") - def defaultScreen(): AbstractLogger = ConsoleLogger() + @deprecated("Explicitly specify the console output.", "0.13.0") + def defaultScreen(): AbstractLogger = ConsoleLogger() - @deprecated("Explicitly specify the console output.", "0.13.0") - def defaultScreen(suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(suppressedMessage = suppressedMessage) + @deprecated("Explicitly specify the console 
output.", "0.13.0") + def defaultScreen(suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(suppressedMessage = suppressedMessage) - def defaultScreen(console: ConsoleOut): AbstractLogger = ConsoleLogger(console) - def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = - ConsoleLogger(console, suppressedMessage = suppressedMessage) - - def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = - to => ConsoleLogger(ConsoleOut.printWriterOut(to), useColor = useColor) + def defaultScreen(console: ConsoleOut): AbstractLogger = ConsoleLogger(console) + def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = + ConsoleLogger(console, suppressedMessage = suppressedMessage) + + def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = + to => ConsoleLogger(ConsoleOut.printWriterOut(to), useColor = useColor) } final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], - screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) \ No newline at end of file + screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/MultiLogger.scala b/util/log/src/main/scala/sbt/MultiLogger.scala index cd73bf2c3..77c4c11d4 100644 --- a/util/log/src/main/scala/sbt/MultiLogger.scala +++ b/util/log/src/main/scala/sbt/MultiLogger.scala @@ -6,50 +6,45 @@ package sbt // note that setting the logging level on this logger has no effect on its behavior, only // on the behavior of the delegates. 
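The MultiLogger diff below keeps the same dispatch policy as before the reformat: every event fans out to every delegate, and delegates that cannot render ANSI escape codes receive a copy with the escapes stripped (stripping happens once, only when some delegate needs it). A minimal standalone sketch of that policy follows; the `Sink` type and the escape-stripping regex here are illustrative stand-ins, not sbt's actual `AbstractLogger`/`ConsoleLogger.removeEscapeSequences` API:

```scala
// Illustrative sketch of MultiLogger-style fan-out. Sink and the regex are
// hypothetical stand-ins for sbt's AbstractLogger and
// ConsoleLogger.removeEscapeSequences.
object FanOutSketch {
  // Matches simple ANSI CSI sequences such as "\u001b[31m".
  private val escapes = "\u001b\\[[0-9;]*[A-Za-z]".r
  def removeEscapeSequences(s: String): String = escapes.replaceAllIn(s, "")

  final case class Sink(ansiCodesSupported: Boolean, out: StringBuilder = new StringBuilder) {
    def log(msg: String): Unit = out.append(msg)
  }

  // Strip escapes at most once per message, then deliver the appropriate
  // copy (colored or plain) to each sink.
  def dispatch(msg: String, sinks: List[Sink]): Unit = {
    val allSupportCodes = sinks.forall(_.ansiCodesSupported)
    val plain = if (allSupportCodes) msg else removeEscapeSequences(msg)
    for (s <- sinks) s.log(if (s.ansiCodesSupported) msg else plain)
  }
}
```

The real implementation dispatches `LogEvent` values rather than strings, so level and trace changes replay to the delegates as well, but the colored-versus-plain branching is the same.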
-class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger -{ - override lazy val ansiCodesSupported = delegates exists supported - private[this] lazy val allSupportCodes = delegates forall supported - private[this] def supported = (_: AbstractLogger).ansiCodesSupported +class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { + override lazy val ansiCodesSupported = delegates exists supported + private[this] lazy val allSupportCodes = delegates forall supported + private[this] def supported = (_: AbstractLogger).ansiCodesSupported - override def setLevel(newLevel: Level.Value) - { - super.setLevel(newLevel) - dispatch(new SetLevel(newLevel)) - } - override def setTrace(level: Int) - { - super.setTrace(level) - dispatch(new SetTrace(level)) - } - override def setSuccessEnabled(flag: Boolean) - { - super.setSuccessEnabled(flag) - dispatch(new SetSuccess(flag)) - } - def trace(t: => Throwable) { dispatch(new Trace(t)) } - def log(level: Level.Value, message: => String) { dispatch(new Log(level, message)) } - def success(message: => String) { dispatch(new Success(message)) } - def logAll(events: Seq[LogEvent]) { delegates.foreach(_.logAll(events)) } - def control(event: ControlEvent.Value, message: => String) { delegates.foreach(_.control(event, message)) } - private[this] def dispatch(event: LogEvent) - { - val plainEvent = if(allSupportCodes) event else removeEscapes(event) - for( d <- delegates) - if(d.ansiCodesSupported) - d.log(event) - else - d.log(plainEvent) - } + override def setLevel(newLevel: Level.Value) { + super.setLevel(newLevel) + dispatch(new SetLevel(newLevel)) + } + override def setTrace(level: Int) { + super.setTrace(level) + dispatch(new SetTrace(level)) + } + override def setSuccessEnabled(flag: Boolean) { + super.setSuccessEnabled(flag) + dispatch(new SetSuccess(flag)) + } + def trace(t: => Throwable) { dispatch(new Trace(t)) } + def log(level: Level.Value, message: => String) { dispatch(new Log(level, message)) } + 
def success(message: => String) { dispatch(new Success(message)) } + def logAll(events: Seq[LogEvent]) { delegates.foreach(_.logAll(events)) } + def control(event: ControlEvent.Value, message: => String) { delegates.foreach(_.control(event, message)) } + private[this] def dispatch(event: LogEvent) { + val plainEvent = if (allSupportCodes) event else removeEscapes(event) + for (d <- delegates) + if (d.ansiCodesSupported) + d.log(event) + else + d.log(plainEvent) + } - private[this] def removeEscapes(event: LogEvent): LogEvent = - { - import ConsoleLogger.{removeEscapeSequences => rm} - event match { - case s: Success => new Success(rm(s.msg)) - case l: Log => new Log(l.level, rm(l.msg)) - case ce: ControlEvent => new ControlEvent(ce.event, rm(ce.msg)) - case _: Trace | _: SetLevel | _: SetTrace | _: SetSuccess => event - } - } + private[this] def removeEscapes(event: LogEvent): LogEvent = + { + import ConsoleLogger.{ removeEscapeSequences => rm } + event match { + case s: Success => new Success(rm(s.msg)) + case l: Log => new Log(l.level, rm(l.msg)) + case ce: ControlEvent => new ControlEvent(ce.event, rm(ce.msg)) + case _: Trace | _: SetLevel | _: SetTrace | _: SetSuccess => event + } + } } \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/StackTrace.scala b/util/log/src/main/scala/sbt/StackTrace.scala index 1ecd6e8bf..70554c5ec 100644 --- a/util/log/src/main/scala/sbt/StackTrace.scala +++ b/util/log/src/main/scala/sbt/StackTrace.scala @@ -3,61 +3,60 @@ */ package sbt -object StackTrace -{ - def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt") - /** - * Return a printable representation of the stack trace associated - * with t. Information about t and its Throwable causes is included. - * The number of lines to be included for each Throwable is configured - * via d which should be greater than or equal to zero. 
If d is zero, - * then all elements are included up to (but not including) the first - * element that comes from sbt. If d is greater than zero, then up to - * that many lines are included, where the line for the Throwable is - * counted plus one line for each stack element. Less lines will be - * included if there are not enough stack elements. - */ - def trimmed(t : Throwable, d : Int) : String = { - require(d >= 0) - val b = new StringBuilder () +object StackTrace { + def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt") + /** + * Return a printable representation of the stack trace associated + * with t. Information about t and its Throwable causes is included. + * The number of lines to be included for each Throwable is configured + * via d which should be greater than or equal to zero. If d is zero, + * then all elements are included up to (but not including) the first + * element that comes from sbt. If d is greater than zero, then up to + * that many lines are included, where the line for the Throwable is + * counted plus one line for each stack element. Less lines will be + * included if there are not enough stack elements. 
+ */ + def trimmed(t: Throwable, d: Int): String = { + require(d >= 0) + val b = new StringBuilder() - def appendStackTrace (t : Throwable, first : Boolean) { + def appendStackTrace(t: Throwable, first: Boolean) { - val include : StackTraceElement => Boolean = - if (d == 0) - element => !isSbtClass(element.getClassName) - else { - var count = d - 1 - (_ => { count -= 1; count >= 0 }) - } + val include: StackTraceElement => Boolean = + if (d == 0) + element => !isSbtClass(element.getClassName) + else { + var count = d - 1 + (_ => { count -= 1; count >= 0 }) + } - def appendElement (e : StackTraceElement) { - b.append ("\tat ") - b.append (e) - b.append ('\n') - } + def appendElement(e: StackTraceElement) { + b.append("\tat ") + b.append(e) + b.append('\n') + } - if (!first) - b.append ("Caused by: ") - b.append (t) - b.append ('\n') + if (!first) + b.append("Caused by: ") + b.append(t) + b.append('\n') - val els = t.getStackTrace () - var i = 0 - while ((i < els.size) && include (els (i))) { - appendElement (els (i)) - i += 1 - } + val els = t.getStackTrace() + var i = 0 + while ((i < els.size) && include(els(i))) { + appendElement(els(i)) + i += 1 + } - } + } - appendStackTrace (t, true) - var c = t - while (c.getCause () != null) { - c = c.getCause () - appendStackTrace (c, false) - } - b.toString () + appendStackTrace(t, true) + var c = t + while (c.getCause() != null) { + c = c.getCause() + appendStackTrace(c, false) + } + b.toString() - } + } } \ No newline at end of file diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala index 4eb8e64b1..7ec73c15e 100644 --- a/util/logic/src/main/scala/sbt/logic/Logic.scala +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -1,8 +1,8 @@ package sbt package logic - import scala.annotation.tailrec - import Formula.{And, True} +import scala.annotation.tailrec +import Formula.{ And, True } /* Defines a propositional logic with negation as failure and only allows 
stratified rule sets (negation must be acyclic) in order to have a unique minimal model. @@ -26,10 +26,9 @@ as is this: + http://www.w3.org/2005/rules/wg/wiki/negation */ - /** Disjunction (or) of the list of clauses. */ final case class Clauses(clauses: List[Clause]) { - assert(clauses.nonEmpty, "At least one clause is required.") + assert(clauses.nonEmpty, "At least one clause is required.") } /** When the `body` Formula succeeds, atoms in `head` are true. */ @@ -37,289 +36,301 @@ final case class Clause(body: Formula, head: Set[Atom]) /** A literal is an [[Atom]] or its [[negation|Negated]]. */ sealed abstract class Literal extends Formula { - /** The underlying (positive) atom. */ - def atom: Atom - /** Negates this literal.*/ - def unary_! : Literal + /** The underlying (positive) atom. */ + def atom: Atom + /** Negates this literal.*/ + def unary_! : Literal } /** A variable with name `label`. */ final case class Atom(label: String) extends Literal { - def atom = this - def unary_! : Negated = Negated(this) + def atom = this + def unary_! : Negated = Negated(this) } -/** A negated atom, in the sense of negation as failure, not logical negation. -* That is, it is true if `atom` is not known/defined. */ +/** + * A negated atom, in the sense of negation as failure, not logical negation. + * That is, it is true if `atom` is not known/defined. + */ final case class Negated(atom: Atom) extends Literal { - def unary_! : Atom = atom + def unary_! : Atom = atom } -/** A formula consists of variables, negation, and conjunction (and). -* (Disjunction is not currently included- it is modeled at the level of a sequence of clauses. -* This is less convenient when defining clauses, but is not less powerful.) */ +/** + * A formula consists of variables, negation, and conjunction (and). + * (Disjunction is not currently included- it is modeled at the level of a sequence of clauses. + * This is less convenient when defining clauses, but is not less powerful.) 
+ */ sealed abstract class Formula { - /** Constructs a clause that proves `atoms` when this formula is true. */ - def proves(atom: Atom, atoms: Atom*): Clause = Clause(this, (atom +: atoms).toSet) + /** Constructs a clause that proves `atoms` when this formula is true. */ + def proves(atom: Atom, atoms: Atom*): Clause = Clause(this, (atom +: atoms).toSet) - /** Constructs a formula that is true iff this formula and `f` are both true.*/ - def && (f: Formula): Formula = (this, f) match { - case (True, x) => x - case (x, True) => x - case (And(as), And(bs)) => And(as ++ bs) - case (And(as), b: Literal) => And(as + b) - case (a: Literal, And(bs)) => And(bs + a) - case (a: Literal, b: Literal) => And( Set(a,b) ) - } + /** Constructs a formula that is true iff this formula and `f` are both true.*/ + def &&(f: Formula): Formula = (this, f) match { + case (True, x) => x + case (x, True) => x + case (And(as), And(bs)) => And(as ++ bs) + case (And(as), b: Literal) => And(as + b) + case (a: Literal, And(bs)) => And(bs + a) + case (a: Literal, b: Literal) => And(Set(a, b)) + } } - object Formula { - /** A conjunction of literals. */ - final case class And(literals: Set[Literal]) extends Formula { - assert(literals.nonEmpty, "'And' requires at least one literal.") - } - final case object True extends Formula + /** A conjunction of literals. */ + final case class And(literals: Set[Literal]) extends Formula { + assert(literals.nonEmpty, "'And' requires at least one literal.") + } + final case object True extends Formula } -object Logic -{ - def reduceAll(clauses: List[Clause], initialFacts: Set[Literal]): Either[LogicException, Matched] = - reduce(Clauses(clauses), initialFacts) +object Logic { + def reduceAll(clauses: List[Clause], initialFacts: Set[Literal]): Either[LogicException, Matched] = + reduce(Clauses(clauses), initialFacts) - /** Computes the variables in the unique stable model for the program represented by `clauses` and `initialFacts`. 
- * `clause` may not have any negative feedback (that is, negation is acyclic) - * and `initialFacts` cannot be in the head of any clauses in `clause`. - * These restrictions ensure that the logic program has a unique minimal model. */ - def reduce(clauses: Clauses, initialFacts: Set[Literal]): Either[LogicException, Matched] = - { - val (posSeq, negSeq) = separate(initialFacts.toSeq) - val (pos, neg) = (posSeq.toSet, negSeq.toSet) + /** + * Computes the variables in the unique stable model for the program represented by `clauses` and `initialFacts`. + * `clauses` may not have any negative feedback (that is, negation is acyclic) + * and `initialFacts` cannot be in the head of any clauses in `clauses`. + * These restrictions ensure that the logic program has a unique minimal model. + */ + def reduce(clauses: Clauses, initialFacts: Set[Literal]): Either[LogicException, Matched] = + { + val (posSeq, negSeq) = separate(initialFacts.toSeq) + val (pos, neg) = (posSeq.toSet, negSeq.toSet) - val problem = - checkContradictions(pos, neg) orElse - checkOverlap(clauses, pos) orElse - checkAcyclic(clauses) + val problem = + checkContradictions(pos, neg) orElse + checkOverlap(clauses, pos) orElse + checkAcyclic(clauses) - problem.toLeft( - reduce0(clauses, initialFacts, Matched.empty) - ) - } + problem.toLeft( + reduce0(clauses, initialFacts, Matched.empty) + ) + } + /** + * Verifies `initialFacts` are not in the head of any `clauses`. + * This avoids the situation where an atom is proved but no clauses prove it. + * This isn't necessarily a problem, but the main sbt use case expects + * a proven atom to have at least one clause satisfied. + */ + private[this] def checkOverlap(clauses: Clauses, initialFacts: Set[Atom]): Option[InitialOverlap] = { + val as = atoms(clauses) + val initialOverlap = initialFacts.filter(as.inHead) + if (initialOverlap.nonEmpty) Some(new InitialOverlap(initialOverlap)) else None + } - /** Verifies `initialFacts` are not in the head of any `clauses`.
- * This avoids the situation where an atom is proved but no clauses prove it. - * This isn't necessarily a problem, but the main sbt use cases expects - * a proven atom to have at least one clause satisfied. */ - private[this] def checkOverlap(clauses: Clauses, initialFacts: Set[Atom]): Option[InitialOverlap] = { - val as = atoms(clauses) - val initialOverlap = initialFacts.filter(as.inHead) - if(initialOverlap.nonEmpty) Some(new InitialOverlap(initialOverlap)) else None - } + private[this] def checkContradictions(pos: Set[Atom], neg: Set[Atom]): Option[InitialContradictions] = { + val contradictions = pos intersect neg + if (contradictions.nonEmpty) Some(new InitialContradictions(contradictions)) else None + } - private[this] def checkContradictions(pos: Set[Atom], neg: Set[Atom]): Option[InitialContradictions] = { - val contradictions = pos intersect neg - if(contradictions.nonEmpty) Some(new InitialContradictions(contradictions)) else None - } + private[this] def checkAcyclic(clauses: Clauses): Option[CyclicNegation] = { + val deps = dependencyMap(clauses) + val cycle = Dag.findNegativeCycle(graph(deps)) + if (cycle.nonEmpty) Some(new CyclicNegation(cycle)) else None + } + private[this] def graph(deps: Map[Atom, Set[Literal]]) = new Dag.DirectedSignedGraph[Atom] { + type Arrow = Literal + def nodes = deps.keys.toList + def dependencies(a: Atom) = deps.getOrElse(a, Set.empty).toList + def isNegative(b: Literal) = b match { + case Negated(_) => true + case Atom(_) => false + } + def head(b: Literal) = b.atom + } - private[this] def checkAcyclic(clauses: Clauses): Option[CyclicNegation] = { - val deps = dependencyMap(clauses) - val cycle = Dag.findNegativeCycle(graph(deps)) - if(cycle.nonEmpty) Some(new CyclicNegation(cycle)) else None - } - private[this] def graph(deps: Map[Atom, Set[Literal]]) = new Dag.DirectedSignedGraph[Atom] { - type Arrow = Literal - def nodes = deps.keys.toList - def dependencies(a: Atom) = deps.getOrElse(a, Set.empty).toList - def 
isNegative(b: Literal) = b match { - case Negated(_) => true - case Atom(_) => false - } - def head(b: Literal) = b.atom - } + private[this] def dependencyMap(clauses: Clauses): Map[Atom, Set[Literal]] = + (Map.empty[Atom, Set[Literal]] /: clauses.clauses) { + case (m, Clause(formula, heads)) => + val deps = literals(formula) + (m /: heads) { (n, head) => n.updated(head, n.getOrElse(head, Set.empty) ++ deps) } + } - private[this] def dependencyMap(clauses: Clauses): Map[Atom, Set[Literal]] = - (Map.empty[Atom, Set[Literal]] /: clauses.clauses) { - case (m, Clause(formula, heads)) => - val deps = literals(formula) - (m /: heads) { (n, head) => n.updated(head, n.getOrElse(head, Set.empty) ++ deps) } - } + sealed abstract class LogicException(override val toString: String) + final class InitialContradictions(val literals: Set[Atom]) extends LogicException("Initial facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) + final class InitialOverlap(val literals: Set[Atom]) extends LogicException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) + final class CyclicNegation(val cycle: List[Literal]) extends LogicException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) - sealed abstract class LogicException(override val toString: String) - final class InitialContradictions(val literals: Set[Atom]) extends LogicException("Initial facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) - final class InitialOverlap(val literals: Set[Atom]) extends LogicException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) - final class CyclicNegation(val cycle: List[Literal]) extends LogicException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) + /** Tracks proven atoms in the reverse order they were proved. 
*/ + final class Matched private (val provenSet: Set[Atom], reverseOrdered: List[Atom]) { + def add(atoms: Set[Atom]): Matched = add(atoms.toList) + def add(atoms: List[Atom]): Matched = { + val newOnly = atoms.filterNot(provenSet) + new Matched(provenSet ++ newOnly, newOnly ::: reverseOrdered) + } + def ordered: List[Atom] = reverseOrdered.reverse + override def toString = ordered.map(_.label).mkString("Matched(", ",", ")") + } + object Matched { + val empty = new Matched(Set.empty, Nil) + } - /** Tracks proven atoms in the reverse order they were proved. */ - final class Matched private(val provenSet: Set[Atom], reverseOrdered: List[Atom]) { - def add(atoms: Set[Atom]): Matched = add(atoms.toList) - def add(atoms: List[Atom]): Matched = { - val newOnly = atoms.filterNot(provenSet) - new Matched(provenSet ++ newOnly, newOnly ::: reverseOrdered) - } - def ordered: List[Atom] = reverseOrdered.reverse - override def toString = ordered.map(_.label).mkString("Matched(", ",", ")") - } - object Matched { - val empty = new Matched(Set.empty, Nil) - } + /** Separates a sequence of literals into `(pos, neg)` atom sequences. */ + private[this] def separate(lits: Seq[Literal]): (Seq[Atom], Seq[Atom]) = Util.separate(lits) { + case a: Atom => Left(a) + case Negated(n) => Right(n) + } - /** Separates a sequence of literals into `(pos, neg)` atom sequences. */ - private[this] def separate(lits: Seq[Literal]): (Seq[Atom], Seq[Atom]) = Util.separate(lits) { - case a: Atom => Left(a) - case Negated(n) => Right(n) - } + /** + * Finds clauses that have no body and thus prove their head. + * Returns `(proven atoms, unproven clauses)`. + */ + private[this] def findProven(c: Clauses): (Set[Atom], List[Clause]) = + { + val (proven, unproven) = c.clauses.partition(_.body == True) + (proven.flatMap(_.head).toSet, unproven) + } + private[this] def keepPositive(lits: Set[Literal]): Set[Atom] = + lits.collect { case a: Atom => a }.toSet - /** Finds clauses that have no body and thus prove their head. - * Returns `(, )`.
*/ - private[this] def findProven(c: Clauses): (Set[Atom], List[Clause]) = - { - val (proven, unproven) = c.clauses.partition(_.body == True) - (proven.flatMap(_.head).toSet, unproven) - } - private[this] def keepPositive(lits: Set[Literal]): Set[Atom] = - lits.collect{ case a: Atom => a}.toSet + // precondition: factsToProcess contains no contradictions + @tailrec + private[this] def reduce0(clauses: Clauses, factsToProcess: Set[Literal], state: Matched): Matched = + applyAll(clauses, factsToProcess) match { + case None => // all of the remaining clauses failed on the new facts + state + case Some(applied) => + val (proven, unprovenClauses) = findProven(applied) + val processedFacts = state add keepPositive(factsToProcess) + val newlyProven = proven -- processedFacts.provenSet + val newState = processedFacts add newlyProven + if (unprovenClauses.isEmpty) + newState // no remaining clauses, done. + else { + val unproven = Clauses(unprovenClauses) + val nextFacts: Set[Literal] = if (newlyProven.nonEmpty) newlyProven.toSet else inferFailure(unproven) + reduce0(unproven, nextFacts, newState) + } + } - // precondition: factsToProcess contains no contradictions - @tailrec - private[this] def reduce0(clauses: Clauses, factsToProcess: Set[Literal], state: Matched): Matched = - applyAll(clauses, factsToProcess) match { - case None => // all of the remaining clauses failed on the new facts - state - case Some(applied) => - val (proven, unprovenClauses) = findProven(applied) - val processedFacts = state add keepPositive(factsToProcess) - val newlyProven = proven -- processedFacts.provenSet - val newState = processedFacts add newlyProven - if(unprovenClauses.isEmpty) - newState // no remaining clauses, done. 
- else { - val unproven = Clauses(unprovenClauses) - val nextFacts: Set[Literal] = if(newlyProven.nonEmpty) newlyProven.toSet else inferFailure(unproven) - reduce0(unproven, nextFacts, newState) - } - } - - /** Finds negated atoms under the negation as failure rule and returns them. - * This should be called only after there are no more known atoms to be substituted. */ - private[this] def inferFailure(clauses: Clauses): Set[Literal] = - { - /* At this point, there is at least one clause and one of the following is the case as the result of the acyclic negation rule: + /** + * Finds negated atoms under the negation as failure rule and returns them. + * This should be called only after there are no more known atoms to be substituted. + */ + private[this] def inferFailure(clauses: Clauses): Set[Literal] = + { + /* At this point, there is at least one clause and one of the following is the case as the result of the acyclic negation rule: i. there is at least one variable that occurs in a clause body but not in the head of a clause ii. there is at least one variable that occurs in the head of a clause and does not transitively depend on a negated variable In either case, each such variable x cannot be proven true and therefore proves 'not x' (negation as failure, !x in the code). 
*/ - val allAtoms = atoms(clauses) - val newFacts: Set[Literal] = negated(allAtoms.triviallyFalse) - if(newFacts.nonEmpty) - newFacts - else { - val possiblyTrue = hasNegatedDependency(clauses.clauses, Relation.empty, Relation.empty) - val newlyFalse: Set[Literal] = negated(allAtoms.inHead -- possiblyTrue) - if(newlyFalse.nonEmpty) - newlyFalse - else // should never happen due to the acyclic negation rule - error(s"No progress:\n\tclauses: $clauses\n\tpossibly true: $possiblyTrue") - } - } + val allAtoms = atoms(clauses) + val newFacts: Set[Literal] = negated(allAtoms.triviallyFalse) + if (newFacts.nonEmpty) + newFacts + else { + val possiblyTrue = hasNegatedDependency(clauses.clauses, Relation.empty, Relation.empty) + val newlyFalse: Set[Literal] = negated(allAtoms.inHead -- possiblyTrue) + if (newlyFalse.nonEmpty) + newlyFalse + else // should never happen due to the acyclic negation rule + error(s"No progress:\n\tclauses: $clauses\n\tpossibly true: $possiblyTrue") + } + } - private[this] def negated(atoms: Set[Atom]): Set[Literal] = atoms.map(a => Negated(a)) + private[this] def negated(atoms: Set[Atom]): Set[Literal] = atoms.map(a => Negated(a)) - /** Computes the set of atoms in `clauses` that directly or transitively take a negated atom as input. 
- * For example, for the following clauses, this method would return `List(a, d)` : - * a :- b, not c - * d :- a - */ - @tailrec - def hasNegatedDependency(clauses: Seq[Clause], posDeps: Relation[Atom, Atom], negDeps: Relation[Atom, Atom]): List[Atom] = - clauses match { - case Seq() => - // because cycles between positive literals are allowed, this isn't strictly a topological sort - Dag.topologicalSortUnchecked(negDeps._1s)(posDeps.reverse) - case Clause(formula, head) +: tail => - // collect direct positive and negative literals and track them in separate graphs - val (pos, neg) = directDeps(formula) - val (newPos, newNeg) = ( (posDeps, negDeps) /: head) { case ( (pdeps, ndeps), d) => - (pdeps + (d, pos), ndeps + (d, neg) ) - } - hasNegatedDependency(tail, newPos, newNeg) - } + /** + * Computes the set of atoms in `clauses` that directly or transitively take a negated atom as input. + * For example, for the following clauses, this method would return `List(a, d)` : + * a :- b, not c + * d :- a + */ + @tailrec + def hasNegatedDependency(clauses: Seq[Clause], posDeps: Relation[Atom, Atom], negDeps: Relation[Atom, Atom]): List[Atom] = + clauses match { + case Seq() => + // because cycles between positive literals are allowed, this isn't strictly a topological sort + Dag.topologicalSortUnchecked(negDeps._1s)(posDeps.reverse) + case Clause(formula, head) +: tail => + // collect direct positive and negative literals and track them in separate graphs + val (pos, neg) = directDeps(formula) + val (newPos, newNeg) = ((posDeps, negDeps) /: head) { + case ((pdeps, ndeps), d) => + (pdeps + (d, pos), ndeps + (d, neg)) + } + hasNegatedDependency(tail, newPos, newNeg) + } - /** Computes the `(positive, negative)` literals in `formula`. 
*/ - private[this] def directDeps(formula: Formula): (Seq[Atom], Seq[Atom]) = - Util.separate(literals(formula).toSeq) { - case Negated(a) => Right(a) - case a: Atom => Left(a) - } - private[this] def literals(formula: Formula): Set[Literal] = formula match { - case And(lits) => lits - case l: Literal => Set(l) - case True => Set.empty - } + /** Computes the `(positive, negative)` literals in `formula`. */ + private[this] def directDeps(formula: Formula): (Seq[Atom], Seq[Atom]) = + Util.separate(literals(formula).toSeq) { + case Negated(a) => Right(a) + case a: Atom => Left(a) + } + private[this] def literals(formula: Formula): Set[Literal] = formula match { + case And(lits) => lits + case l: Literal => Set(l) + case True => Set.empty + } - /** Computes the atoms in the heads and bodies of the clauses in `clause`. */ - def atoms(cs: Clauses): Atoms = cs.clauses.map(c => Atoms(c.head, atoms(c.body))).reduce(_ ++ _) + /** Computes the atoms in the heads and bodies of the clauses in `clause`. */ + def atoms(cs: Clauses): Atoms = cs.clauses.map(c => Atoms(c.head, atoms(c.body))).reduce(_ ++ _) - /** Computes the set of all atoms in `formula`. */ - def atoms(formula: Formula): Set[Atom] = formula match { - case And(lits) => lits.map(_.atom) - case Negated(lit) => Set(lit) - case a: Atom => Set(a) - case True => Set() - } + /** Computes the set of all atoms in `formula`. */ + def atoms(formula: Formula): Set[Atom] = formula match { + case And(lits) => lits.map(_.atom) + case Negated(lit) => Set(lit) + case a: Atom => Set(a) + case True => Set() + } - /** Represents the set of atoms in the heads of clauses and in the bodies (formulas) of clauses. */ - final case class Atoms(val inHead: Set[Atom], val inFormula: Set[Atom]) { - /** Concatenates this with `as`. */ - def ++ (as: Atoms): Atoms = Atoms(inHead ++ as.inHead, inFormula ++ as.inFormula) - /** Atoms that cannot be true because they do not occur in a head. 
*/ - def triviallyFalse: Set[Atom] = inFormula -- inHead - } + /** Represents the set of atoms in the heads of clauses and in the bodies (formulas) of clauses. */ + final case class Atoms(val inHead: Set[Atom], val inFormula: Set[Atom]) { + /** Concatenates this with `as`. */ + def ++(as: Atoms): Atoms = Atoms(inHead ++ as.inHead, inFormula ++ as.inFormula) + /** Atoms that cannot be true because they do not occur in a head. */ + def triviallyFalse: Set[Atom] = inFormula -- inHead + } - /** Applies known facts to `clause`s, deriving a new, possibly empty list of clauses. - * 1. If a fact is in the body of a clause, the derived clause has that fact removed from the body. - * 2. If the negation of a fact is in a body of a clause, that clause fails and is removed. - * 3. If a fact or its negation is in the head of a clause, the derived clause has that fact (or its negation) removed from the head. - * 4. If a head is empty, the clause proves nothing and is removed. - * - * NOTE: empty bodies do not cause a clause to succeed yet. - * All known facts must be applied before this can be done in order to avoid inconsistencies. - * Precondition: no contradictions in `facts` - * Postcondition: no atom in `facts` is present in the result - * Postcondition: No clauses have an empty head - * */ - def applyAll(cs: Clauses, facts: Set[Literal]): Option[Clauses] = - { - val newClauses = - if(facts.isEmpty) - cs.clauses.filter(_.head.nonEmpty) // still need to drop clauses with an empty head - else - cs.clauses.map(c => applyAll(c, facts)).flatMap(_.toList) - if(newClauses.isEmpty) None else Some(Clauses(newClauses)) - } + /** + * Applies known facts to `clause`s, deriving a new, possibly empty list of clauses. + * 1. If a fact is in the body of a clause, the derived clause has that fact removed from the body. + * 2. If the negation of a fact is in a body of a clause, that clause fails and is removed. + * 3. 
If a fact or its negation is in the head of a clause, the derived clause has that fact (or its negation) removed from the head. + * 4. If a head is empty, the clause proves nothing and is removed. + * + * NOTE: empty bodies do not cause a clause to succeed yet. + * All known facts must be applied before this can be done in order to avoid inconsistencies. + * Precondition: no contradictions in `facts` + * Postcondition: no atom in `facts` is present in the result + * Postcondition: No clauses have an empty head + */ + def applyAll(cs: Clauses, facts: Set[Literal]): Option[Clauses] = + { + val newClauses = + if (facts.isEmpty) + cs.clauses.filter(_.head.nonEmpty) // still need to drop clauses with an empty head + else + cs.clauses.map(c => applyAll(c, facts)).flatMap(_.toList) + if (newClauses.isEmpty) None else Some(Clauses(newClauses)) + } - def applyAll(c: Clause, facts: Set[Literal]): Option[Clause] = - { - val atoms = facts.map(_.atom) - val newHead = c.head -- atoms // 3. - if(newHead.isEmpty) // 4. empty head - None - else - substitute(c.body, facts).map( f => Clause(f, newHead) ) // 1, 2 - } + def applyAll(c: Clause, facts: Set[Literal]): Option[Clause] = + { + val atoms = facts.map(_.atom) + val newHead = c.head -- atoms // 3. + if (newHead.isEmpty) // 4. empty head + None + else + substitute(c.body, facts).map(f => Clause(f, newHead)) // 1, 2 + } - /** Derives the formula that results from substituting `facts` into `formula`. */ - @tailrec - def substitute(formula: Formula, facts: Set[Literal]): Option[Formula] = formula match { - case And(lits) => - def negated(lits: Set[Literal]): Set[Literal] = lits.map(a => !a) - if( lits.exists( negated(facts) ) ) // 2. - None - else { - val newLits = lits -- facts - val newF = if(newLits.isEmpty) True else And(newLits) - Some(newF) // 1. 
- } - case True => Some(True) - case lit: Literal => // define in terms of And - substitute(And(Set(lit)), facts) - } + /** Derives the formula that results from substituting `facts` into `formula`. */ + @tailrec + def substitute(formula: Formula, facts: Set[Literal]): Option[Formula] = formula match { + case And(lits) => + def negated(lits: Set[Literal]): Set[Literal] = lits.map(a => !a) + if (lits.exists(negated(facts))) // 2. + None + else { + val newLits = lits -- facts + val newF = if (newLits.isEmpty) True else And(newLits) + Some(newF) // 1. + } + case True => Some(True) + case lit: Literal => // define in terms of And + substitute(And(Set(lit)), facts) + } } diff --git a/util/process/src/main/scala/sbt/InheritInput.scala b/util/process/src/main/scala/sbt/InheritInput.scala index 1c9ef0ee8..9502cee49 100755 --- a/util/process/src/main/scala/sbt/InheritInput.scala +++ b/util/process/src/main/scala/sbt/InheritInput.scala @@ -3,18 +3,19 @@ */ package sbt -import java.lang.{ProcessBuilder => JProcessBuilder} +import java.lang.{ ProcessBuilder => JProcessBuilder } /** On java 7, inherit System.in for a ProcessBuilder. 
*/ private[sbt] object InheritInput { - def apply(p: JProcessBuilder): Boolean = (redirectInput, inherit) match { - case (Some(m), Some(f)) => m.invoke(p, f); true - case _ => false - } + def apply(p: JProcessBuilder): Boolean = (redirectInput, inherit) match { + case (Some(m), Some(f)) => + m.invoke(p, f); true + case _ => false + } - private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") - private[this] val redirectClass = pbClass.getClasses find (_.getSimpleName == "Redirect") + private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") + private[this] val redirectClass = pbClass.getClasses find (_.getSimpleName == "Redirect") - private[this] val redirectInput = redirectClass map (pbClass.getMethod("redirectInput", _)) - private[this] val inherit = redirectClass map (_ getField "INHERIT" get null) + private[this] val redirectInput = redirectClass map (pbClass.getMethod("redirectInput", _)) + private[this] val inherit = redirectClass map (_ getField "INHERIT" get null) } diff --git a/util/process/src/main/scala/sbt/Process.scala b/util/process/src/main/scala/sbt/Process.scala index a370048e4..66b7e03c6 100644 --- a/util/process/src/main/scala/sbt/Process.scala +++ b/util/process/src/main/scala/sbt/Process.scala @@ -3,196 +3,219 @@ */ package sbt -import java.lang.{Process => JProcess, ProcessBuilder => JProcessBuilder} -import java.io.{Closeable, File, IOException} -import java.io.{BufferedReader, InputStream, InputStreamReader, OutputStream, PipedInputStream, PipedOutputStream} +import java.lang.{ Process => JProcess, ProcessBuilder => JProcessBuilder } +import java.io.{ Closeable, File, IOException } +import java.io.{ BufferedReader, InputStream, InputStreamReader, OutputStream, PipedInputStream, PipedOutputStream } import java.net.URL -trait ProcessExtra -{ - import Process._ - implicit def builderToProcess(builder: JProcessBuilder): ProcessBuilder = apply(builder) - implicit def fileToProcess(file: File): FilePartialBuilder = 
apply(file) - implicit def urlToProcess(url: URL): URLPartialBuilder = apply(url) - @deprecated("Use string interpolation", "0.13.0") - implicit def xmlToProcess(command: scala.xml.Elem): ProcessBuilder = apply(command) - implicit def buildersToProcess[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = applySeq(builders) +trait ProcessExtra { + import Process._ + implicit def builderToProcess(builder: JProcessBuilder): ProcessBuilder = apply(builder) + implicit def fileToProcess(file: File): FilePartialBuilder = apply(file) + implicit def urlToProcess(url: URL): URLPartialBuilder = apply(url) + @deprecated("Use string interpolation", "0.13.0") + implicit def xmlToProcess(command: scala.xml.Elem): ProcessBuilder = apply(command) + implicit def buildersToProcess[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = applySeq(builders) - implicit def stringToProcess(command: String): ProcessBuilder = apply(command) - implicit def stringSeqToProcess(command: Seq[String]): ProcessBuilder = apply(command) + implicit def stringToProcess(command: String): ProcessBuilder = apply(command) + implicit def stringSeqToProcess(command: Seq[String]): ProcessBuilder = apply(command) } /** Methods for constructing simple commands that can then be combined. 
*/ -object Process extends ProcessExtra -{ - def apply(command: String): ProcessBuilder = apply(command, None) +object Process extends ProcessExtra { + def apply(command: String): ProcessBuilder = apply(command, None) - def apply(command: Seq[String]): ProcessBuilder = apply (command.toArray, None) + def apply(command: Seq[String]): ProcessBuilder = apply(command.toArray, None) - def apply(command: String, arguments: Seq[String]): ProcessBuilder = apply(command :: arguments.toList, None) - /** create ProcessBuilder with working dir set to File and extra environment variables */ - def apply(command: String, cwd: File, extraEnv: (String,String)*): ProcessBuilder = - apply(command, Some(cwd), extraEnv : _*) - /** create ProcessBuilder with working dir set to File and extra environment variables */ - def apply(command: Seq[String], cwd: File, extraEnv: (String,String)*): ProcessBuilder = - apply(command, Some(cwd), extraEnv : _*) - /** create ProcessBuilder with working dir optionaly set to File and extra environment variables */ - def apply(command: String, cwd: Option[File], extraEnv: (String,String)*): ProcessBuilder = { - apply(command.split("""\s+"""), cwd, extraEnv : _*) - // not smart to use this on windows, because CommandParser uses \ to escape ". 
- /*CommandParser.parse(command) match { + def apply(command: String, arguments: Seq[String]): ProcessBuilder = apply(command :: arguments.toList, None) + /** create ProcessBuilder with working dir set to File and extra environment variables */ + def apply(command: String, cwd: File, extraEnv: (String, String)*): ProcessBuilder = + apply(command, Some(cwd), extraEnv: _*) + /** create ProcessBuilder with working dir set to File and extra environment variables */ + def apply(command: Seq[String], cwd: File, extraEnv: (String, String)*): ProcessBuilder = + apply(command, Some(cwd), extraEnv: _*) + /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ + def apply(command: String, cwd: Option[File], extraEnv: (String, String)*): ProcessBuilder = { + apply(command.split("""\s+"""), cwd, extraEnv: _*) + // not smart to use this on windows, because CommandParser uses \ to escape ". + /*CommandParser.parse(command) match { case Left(errorMsg) => error(errorMsg) case Right((cmd, args)) => apply(cmd :: args, cwd, extraEnv : _*) }*/ - } - /** create ProcessBuilder with working dir optionaly set to File and extra environment variables */ - def apply(command: Seq[String], cwd: Option[File], extraEnv: (String,String)*): ProcessBuilder = { - val jpb = new JProcessBuilder(command.toArray : _*) - cwd.foreach(jpb directory _) - extraEnv.foreach { case (k, v) => jpb.environment.put(k, v) } - apply(jpb) - } - def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) - def apply(file: File): FilePartialBuilder = new FileBuilder(file) - def apply(url: URL): URLPartialBuilder = new URLBuilder(url) - @deprecated("Use string interpolation", "0.13.0") - def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) - def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) + } + /** create ProcessBuilder with working dir optionally
set to File and extra environment variables */ + def apply(command: Seq[String], cwd: Option[File], extraEnv: (String, String)*): ProcessBuilder = { + val jpb = new JProcessBuilder(command.toArray: _*) + cwd.foreach(jpb directory _) + extraEnv.foreach { case (k, v) => jpb.environment.put(k, v) } + apply(jpb) + } + def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) + def apply(file: File): FilePartialBuilder = new FileBuilder(file) + def apply(url: URL): URLPartialBuilder = new URLBuilder(url) + @deprecated("Use string interpolation", "0.13.0") + def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) + def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) - def apply(value: Boolean): ProcessBuilder = apply(value.toString, if(value) 0 else 1) - def apply(name: String, exitValue: => Int): ProcessBuilder = new DummyProcessBuilder(name, exitValue) + def apply(value: Boolean): ProcessBuilder = apply(value.toString, if (value) 0 else 1) + def apply(name: String, exitValue: => Int): ProcessBuilder = new DummyProcessBuilder(name, exitValue) - def cat(file: SourcePartialBuilder, files: SourcePartialBuilder*): ProcessBuilder = cat(file :: files.toList) - def cat(files: Seq[SourcePartialBuilder]): ProcessBuilder = - { - require(!files.isEmpty) - files.map(_.cat).reduceLeft(_ #&& _) - } + def cat(file: SourcePartialBuilder, files: SourcePartialBuilder*): ProcessBuilder = cat(file :: files.toList) + def cat(files: Seq[SourcePartialBuilder]): ProcessBuilder = + { + require(!files.isEmpty) + files.map(_.cat).reduceLeft(_ #&& _) + } } -trait SourcePartialBuilder extends NotNull -{ - /** Writes the output stream of this process to the given file. */ - def #> (f: File): ProcessBuilder = toFile(f, false) - /** Appends the output stream of this process to the given file. 
*/ - def #>> (f: File): ProcessBuilder = toFile(f, true) - /** Writes the output stream of this process to the given OutputStream. The - * argument is call-by-name, so the stream is recreated, written, and closed each - * time this process is executed. */ - def #>(out: => OutputStream): ProcessBuilder = #> (new OutputStreamBuilder(out)) - def #>(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(toSource, b, false, ExitCodes.firstIfNonzero) - private def toFile(f: File, append: Boolean) = #> (new FileOutput(f, append)) - def cat = toSource - protected def toSource: ProcessBuilder +trait SourcePartialBuilder extends NotNull { + /** Writes the output stream of this process to the given file. */ + def #>(f: File): ProcessBuilder = toFile(f, false) + /** Appends the output stream of this process to the given file. */ + def #>>(f: File): ProcessBuilder = toFile(f, true) + /** + * Writes the output stream of this process to the given OutputStream. The + * argument is call-by-name, so the stream is recreated, written, and closed each + * time this process is executed. + */ + def #>(out: => OutputStream): ProcessBuilder = #>(new OutputStreamBuilder(out)) + def #>(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(toSource, b, false, ExitCodes.firstIfNonzero) + private def toFile(f: File, append: Boolean) = #>(new FileOutput(f, append)) + def cat = toSource + protected def toSource: ProcessBuilder } -trait SinkPartialBuilder extends NotNull -{ - /** Reads the given file into the input stream of this process. */ - def #< (f: File): ProcessBuilder = #< (new FileInput(f)) - /** Reads the given URL into the input stream of this process. */ - def #< (f: URL): ProcessBuilder = #< (new URLInput(f)) - /** Reads the given InputStream into the input stream of this process. The - * argument is call-by-name, so the stream is recreated, read, and closed each - * time this process is executed. 
*/ - def #<(in: => InputStream): ProcessBuilder = #< (new InputStreamBuilder(in)) - def #<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, toSink, false, ExitCodes.firstIfNonzero) - protected def toSink: ProcessBuilder +trait SinkPartialBuilder extends NotNull { + /** Reads the given file into the input stream of this process. */ + def #<(f: File): ProcessBuilder = #<(new FileInput(f)) + /** Reads the given URL into the input stream of this process. */ + def #<(f: URL): ProcessBuilder = #<(new URLInput(f)) + /** + * Reads the given InputStream into the input stream of this process. The + * argument is call-by-name, so the stream is recreated, read, and closed each + * time this process is executed. + */ + def #<(in: => InputStream): ProcessBuilder = #<(new InputStreamBuilder(in)) + def #<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, toSink, false, ExitCodes.firstIfNonzero) + protected def toSink: ProcessBuilder } trait URLPartialBuilder extends SourcePartialBuilder -trait FilePartialBuilder extends SinkPartialBuilder with SourcePartialBuilder -{ - def #<<(f: File): ProcessBuilder - def #<<(u: URL): ProcessBuilder - def #<<(i: => InputStream): ProcessBuilder - def #<<(p: ProcessBuilder): ProcessBuilder +trait FilePartialBuilder extends SinkPartialBuilder with SourcePartialBuilder { + def #<<(f: File): ProcessBuilder + def #<<(u: URL): ProcessBuilder + def #<<(i: => InputStream): ProcessBuilder + def #<<(p: ProcessBuilder): ProcessBuilder } -/** Represents a process that is running or has finished running. -* It may be a compound process with several underlying native processes (such as 'a #&& b`).*/ -trait Process extends NotNull -{ - /** Blocks until this process exits and returns the exit code.*/ - def exitValue(): Int - /** Destroys this process. */ - def destroy(): Unit +/** + * Represents a process that is running or has finished running. 
+ * It may be a compound process with several underlying native processes (such as `a #&& b`). + */ +trait Process extends NotNull { + /** Blocks until this process exits and returns the exit code.*/ + def exitValue(): Int + /** Destroys this process. */ + def destroy(): Unit } /** Represents a runnable process. */ -trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder -{ - /** Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is - * sent to the console. If the exit code is non-zero, an exception is thrown.*/ - def !! : String - /** Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is - * sent to the provided ProcessLogger. If the exit code is non-zero, an exception is thrown.*/ - def !!(log: ProcessLogger) : String - /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination and then throw an exception. */ - def lines: Stream[String] - /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination but will not throw an exception. */ - def lines(log: ProcessLogger): Stream[String] - /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the console. 
If the process exits with a non-zero value, - * the Stream will provide all lines up to termination but will not throw an exception. */ - def lines_! : Stream[String] - /** Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination but will not throw an exception. */ - def lines_!(log: ProcessLogger): Stream[String] - /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the console.*/ - def ! : Int - /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given ProcessLogger.*/ - def !(log: ProcessLogger): Int - /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the console. The newly started process reads from standard input of the current process.*/ - def !< : Int - /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given ProcessLogger. The newly started process reads from standard input of the current process.*/ - def !<(log: ProcessLogger) : Int - /** Starts the process represented by this builder. Standard output and error are sent to the console.*/ - def run(): Process - /** Starts the process represented by this builder. Standard output and error are sent to the given ProcessLogger.*/ - def run(log: ProcessLogger): Process - /** Starts the process represented by this builder. I/O is handled by the given ProcessIO instance.*/ - def run(io: ProcessIO): Process - /** Starts the process represented by this builder. 
Standard output and error are sent to the console. - * The newly started process reads from standard input of the current process if `connectInput` is true.*/ - def run(connectInput: Boolean): Process - /** Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given ProcessLogger. - * The newly started process reads from standard input of the current process if `connectInput` is true.*/ - def run(log: ProcessLogger, connectInput: Boolean): Process +trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder { + /** + * Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is + * sent to the console. If the exit code is non-zero, an exception is thrown. + */ + def !! : String + /** + * Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is + * sent to the provided ProcessLogger. If the exit code is non-zero, an exception is thrown. + */ + def !!(log: ProcessLogger): String + /** + * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination and then throw an exception. + */ + def lines: Stream[String] + /** + * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination but will not throw an exception. + */ + def lines(log: ProcessLogger): Stream[String] + /** + * Starts the process represented by this builder. 
The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination but will not throw an exception. + */ + def lines_! : Stream[String] + /** + * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available + * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, + * the Stream will provide all lines up to termination but will not throw an exception. + */ + def lines_!(log: ProcessLogger): Stream[String] + /** + * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the console. + */ + def ! : Int + /** + * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the given ProcessLogger. + */ + def !(log: ProcessLogger): Int + /** + * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the console. The newly started process reads from standard input of the current process. + */ + def !< : Int + /** + * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the given ProcessLogger. The newly started process reads from standard input of the current process. + */ + def !<(log: ProcessLogger): Int + /** Starts the process represented by this builder. Standard output and error are sent to the console.*/ + def run(): Process + /** Starts the process represented by this builder. 
Standard output and error are sent to the given ProcessLogger.*/ + def run(log: ProcessLogger): Process + /** Starts the process represented by this builder. I/O is handled by the given ProcessIO instance.*/ + def run(io: ProcessIO): Process + /** + * Starts the process represented by this builder. Standard output and error are sent to the console. + * The newly started process reads from standard input of the current process if `connectInput` is true. + */ + def run(connectInput: Boolean): Process + /** + * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are + * sent to the given ProcessLogger. + * The newly started process reads from standard input of the current process if `connectInput` is true. + */ + def run(log: ProcessLogger, connectInput: Boolean): Process - def runBuffered(log: ProcessLogger, connectInput: Boolean): Process + def runBuffered(log: ProcessLogger, connectInput: Boolean): Process - /** Constructs a command that runs this command first and then `other` if this command succeeds.*/ - def #&& (other: ProcessBuilder): ProcessBuilder - /** Constructs a command that runs this command first and then `other` if this command does not succeed.*/ - def #|| (other: ProcessBuilder): ProcessBuilder - /** Constructs a command that will run this command and pipes the output to `other`. - * `other` must be a simple command. - * The exit code will be that of `other` regardless of whether this command succeeds. */ - def #| (other: ProcessBuilder): ProcessBuilder - /** Constructs a command that will run this command and then `other`. 
The exit code will be the exit code of `other`.*/ - def ### (other: ProcessBuilder): ProcessBuilder + /** Constructs a command that runs this command first and then `other` if this command succeeds.*/ + def #&&(other: ProcessBuilder): ProcessBuilder + /** Constructs a command that runs this command first and then `other` if this command does not succeed.*/ + def #||(other: ProcessBuilder): ProcessBuilder + /** + * Constructs a command that will run this command and pipes the output to `other`. + * `other` must be a simple command. + * The exit code will be that of `other` regardless of whether this command succeeds. + */ + def #|(other: ProcessBuilder): ProcessBuilder + /** Constructs a command that will run this command and then `other`. The exit code will be the exit code of `other`.*/ + def ###(other: ProcessBuilder): ProcessBuilder - def canPipeTo: Boolean + def canPipeTo: Boolean } /** Each method will be called in a separate thread.*/ -final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit, val inheritInput: JProcessBuilder => Boolean) extends NotNull -{ - def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError, inheritInput) - def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process, inheritInput) - def withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError, inheritInput) +final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit, val inheritInput: JProcessBuilder => Boolean) extends NotNull { + def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError, inheritInput) + def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process, inheritInput) + def 
withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError, inheritInput) } -trait ProcessLogger -{ - def info(s: => String): Unit - def error(s: => String): Unit - def buffer[T](f: => T): T +trait ProcessLogger { + def info(s: => String): Unit + def error(s: => String): Unit + def buffer[T](f: => T): T } diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index 9a3aae606..10c2460ad 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -3,423 +3,385 @@ */ package sbt -import java.lang.{Process => JProcess, ProcessBuilder => JProcessBuilder} -import java.io.{BufferedReader, Closeable, InputStream, InputStreamReader, IOException, OutputStream, PrintStream} -import java.io.{FilterInputStream, FilterOutputStream, PipedInputStream, PipedOutputStream} -import java.io.{File, FileInputStream, FileOutputStream} +import java.lang.{ Process => JProcess, ProcessBuilder => JProcessBuilder } +import java.io.{ BufferedReader, Closeable, InputStream, InputStreamReader, IOException, OutputStream, PrintStream } +import java.io.{ FilterInputStream, FilterOutputStream, PipedInputStream, PipedOutputStream } +import java.io.{ File, FileInputStream, FileOutputStream } import java.net.URL /** Runs provided code in a new Thread and returns the Thread instance. 
*/ -private object Spawn -{ - def apply(f: => Unit): Thread = apply(f, false) - def apply(f: => Unit, daemon: Boolean): Thread = - { - val thread = new Thread() { override def run() = { f } } - thread.setDaemon(daemon) - thread.start() - thread - } +private object Spawn { + def apply(f: => Unit): Thread = apply(f, false) + def apply(f: => Unit, daemon: Boolean): Thread = + { + val thread = new Thread() { override def run() = { f } } + thread.setDaemon(daemon) + thread.start() + thread + } } -private object Future -{ - def apply[T](f: => T): () => T = - { - val result = new SyncVar[Either[Throwable, T]] - def run(): Unit = - try { result.set(Right(f)) } - catch { case e: Exception => result.set(Left(e)) } - Spawn(run) - () => - result.get match - { - case Right(value) => value - case Left(exception) => throw exception - } - } +private object Future { + def apply[T](f: => T): () => T = + { + val result = new SyncVar[Either[Throwable, T]] + def run(): Unit = + try { result.set(Right(f)) } + catch { case e: Exception => result.set(Left(e)) } + Spawn(run) + () => + result.get match { + case Right(value) => value + case Left(exception) => throw exception + } + } } -object BasicIO -{ - def apply(buffer: StringBuffer, log: Option[ProcessLogger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log), inheritInput(withIn)) - def apply(log: ProcessLogger, withIn: Boolean) = new ProcessIO(input(withIn), processInfoFully(log), processErrFully(log), inheritInput(withIn)) +object BasicIO { + def apply(buffer: StringBuffer, log: Option[ProcessLogger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log), inheritInput(withIn)) + def apply(log: ProcessLogger, withIn: Boolean) = new ProcessIO(input(withIn), processInfoFully(log), processErrFully(log), inheritInput(withIn)) - def getErr(log: Option[ProcessLogger]) = log match { case Some(lg) => processErrFully(lg); case None => toStdErr } + def getErr(log: Option[ProcessLogger]) 
= log match { case Some(lg) => processErrFully(lg); case None => toStdErr } - private def processErrFully(log: ProcessLogger) = processFully(s => log.error(s)) - private def processInfoFully(log: ProcessLogger) = processFully(s => log.info(s)) + private def processErrFully(log: ProcessLogger) = processFully(s => log.error(s)) + private def processInfoFully(log: ProcessLogger) = processFully(s => log.info(s)) - def closeOut = (_: OutputStream).close() - final val BufferSize = 8192 - final val Newline = System.getProperty("line.separator") + def closeOut = (_: OutputStream).close() + final val BufferSize = 8192 + final val Newline = System.getProperty("line.separator") - def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } - def processFully(buffer: Appendable): InputStream => Unit = processFully(appendLine(buffer)) - def processFully(processLine: String => Unit): InputStream => Unit = - in => - { - val reader = new BufferedReader(new InputStreamReader(in)) - processLinesFully(processLine)(reader.readLine) - reader.close() - } - def processLinesFully(processLine: String => Unit)(readLine: () => String) - { - def readFully() - { - val line = readLine() - if(line != null) - { - processLine(line) - readFully() - } - } - readFully() - } - def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } - def input(connect: Boolean): OutputStream => Unit = if(connect) connectToIn else closeOut - def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput), inheritInput(connectInput)) - def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Boolean): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) + def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } + def processFully(buffer: Appendable): InputStream => Unit = processFully(appendLine(buffer)) + def processFully(processLine: String => Unit): InputStream => Unit = + in => + { 
+ val reader = new BufferedReader(new InputStreamReader(in)) + processLinesFully(processLine)(reader.readLine) + reader.close() + } + def processLinesFully(processLine: String => Unit)(readLine: () => String) { + def readFully() { + val line = readLine() + if (line != null) { + processLine(line) + readFully() + } + } + readFully() + } + def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } + def input(connect: Boolean): OutputStream => Unit = if (connect) connectToIn else closeOut + def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput), inheritInput(connectInput)) + def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Boolean): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) - def toStdErr = (in: InputStream) => transferFully(in, System.err) - def toStdOut = (in: InputStream) => transferFully(in, System.out) + def toStdErr = (in: InputStream) => transferFully(in, System.err) + def toStdOut = (in: InputStream) => transferFully(in, System.out) - def transferFully(in: InputStream, out: OutputStream): Unit = - try { transferFullyImpl(in, out) } - catch { case _: InterruptedException => () } + def transferFully(in: InputStream, out: OutputStream): Unit = + try { transferFullyImpl(in, out) } + catch { case _: InterruptedException => () } - private[this] def appendLine(buffer: Appendable): String => Unit = - line => - { - buffer.append(line) - buffer.append(Newline) - } + private[this] def appendLine(buffer: Appendable): String => Unit = + line => + { + buffer.append(line) + buffer.append(Newline) + } - private[this] def transferFullyImpl(in: InputStream, out: OutputStream) - { - val continueCount = 1//if(in.isInstanceOf[PipedInputStream]) 1 else 0 - val buffer = new Array[Byte](BufferSize) - def read() - { - val byteCount = in.read(buffer) - if(byteCount >= continueCount) - { - out.write(buffer, 0, byteCount) - out.flush() - read - } - } - read - in.close() - } + private[this] def 
transferFullyImpl(in: InputStream, out: OutputStream) { + val continueCount = 1 //if(in.isInstanceOf[PipedInputStream]) 1 else 0 + val buffer = new Array[Byte](BufferSize) + def read() { + val byteCount = in.read(buffer) + if (byteCount >= continueCount) { + out.write(buffer, 0, byteCount) + out.flush() + read + } + } + read + in.close() + } - def inheritInput(connect: Boolean) = { p: JProcessBuilder => if (connect) InheritInput(p) else false } + def inheritInput(connect: Boolean) = { p: JProcessBuilder => if (connect) InheritInput(p) else false } } private[sbt] object ExitCodes { - def ignoreFirst: (Int, Int) => Int = (a,b) => b - def firstIfNonzero: (Int, Int) => Int = (a,b) => if(a != 0) a else b + def ignoreFirst: (Int, Int) => Int = (a, b) => b + def firstIfNonzero: (Int, Int) => Int = (a, b) => if (a != 0) a else b } +private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPartialBuilder with SourcePartialBuilder { + def #&&(other: ProcessBuilder): ProcessBuilder = new AndProcessBuilder(this, other) + def #||(other: ProcessBuilder): ProcessBuilder = new OrProcessBuilder(this, other) + def #|(other: ProcessBuilder): ProcessBuilder = + { + require(other.canPipeTo, "Piping to multiple processes is not supported.") + new PipedProcessBuilder(this, other, false, exitCode = ExitCodes.ignoreFirst) + } + def ###(other: ProcessBuilder): ProcessBuilder = new SequenceProcessBuilder(this, other) -private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPartialBuilder with SourcePartialBuilder -{ - def #&&(other: ProcessBuilder): ProcessBuilder = new AndProcessBuilder(this, other) - def #||(other: ProcessBuilder): ProcessBuilder = new OrProcessBuilder(this, other) - def #|(other: ProcessBuilder): ProcessBuilder = - { - require(other.canPipeTo, "Piping to multiple processes is not supported.") - new PipedProcessBuilder(this, other, false, exitCode = ExitCodes.ignoreFirst) - } - def ###(other: ProcessBuilder): ProcessBuilder = new 
SequenceProcessBuilder(this, other) - - protected def toSource = this - protected def toSink = this - - def run(): Process = run(false) - def run(connectInput: Boolean): Process = run(BasicIO.standard(connectInput)) - def run(log: ProcessLogger): Process = run(log, false) - def run(log: ProcessLogger, connectInput: Boolean): Process = run(BasicIO(log, connectInput)) + protected def toSource = this + protected def toSink = this - private[this] def getString(log: Option[ProcessLogger], withIn: Boolean): String = - { - val buffer = new StringBuffer - val code = this ! BasicIO(buffer, log, withIn) - if(code == 0) buffer.toString else error("Nonzero exit value: " + code) - } - def !! = getString(None, false) - def !!(log: ProcessLogger) = getString(Some(log), false) - def !!< = getString(None, true) - def !!<(log: ProcessLogger) = getString(Some(log), true) + def run(): Process = run(false) + def run(connectInput: Boolean): Process = run(BasicIO.standard(connectInput)) + def run(log: ProcessLogger): Process = run(log, false) + def run(log: ProcessLogger, connectInput: Boolean): Process = run(BasicIO(log, connectInput)) - def lines: Stream[String] = lines(false, true, None) - def lines(log: ProcessLogger): Stream[String] = lines(false, true, Some(log)) - def lines_! : Stream[String] = lines(false, false, None) - def lines_!(log: ProcessLogger): Stream[String] = lines(false, false, Some(log)) + private[this] def getString(log: Option[ProcessLogger], withIn: Boolean): String = + { + val buffer = new StringBuffer + val code = this ! BasicIO(buffer, log, withIn) + if (code == 0) buffer.toString else error("Nonzero exit value: " + code) + } + def !! 
= getString(None, false) + def !!(log: ProcessLogger) = getString(Some(log), false) + def !!< = getString(None, true) + def !!<(log: ProcessLogger) = getString(Some(log), true) - private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[ProcessLogger]): Stream[String] = - { - val streamed = Streamed[String](nonZeroException) - val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log), BasicIO.inheritInput(withInput))) - Spawn { streamed.done(process.exitValue()) } - streamed.stream() - } + def lines: Stream[String] = lines(false, true, None) + def lines(log: ProcessLogger): Stream[String] = lines(false, true, Some(log)) + def lines_! : Stream[String] = lines(false, false, None) + def lines_!(log: ProcessLogger): Stream[String] = lines(false, false, Some(log)) - def ! = run(false).exitValue() - def !< = run(true).exitValue() - def !(log: ProcessLogger) = runBuffered(log, false).exitValue() - def !<(log: ProcessLogger) = runBuffered(log, true).exitValue() - def runBuffered(log: ProcessLogger, connectInput: Boolean) = - log.buffer { run(log, connectInput) } - def !(io: ProcessIO) = run(io).exitValue() + private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[ProcessLogger]): Stream[String] = + { + val streamed = Streamed[String](nonZeroException) + val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log), BasicIO.inheritInput(withInput))) + Spawn { streamed.done(process.exitValue()) } + streamed.stream() + } - def canPipeTo = false + def ! 
= run(false).exitValue() + def !< = run(true).exitValue() + def !(log: ProcessLogger) = runBuffered(log, false).exitValue() + def !<(log: ProcessLogger) = runBuffered(log, true).exitValue() + def runBuffered(log: ProcessLogger, connectInput: Boolean) = + log.buffer { run(log, connectInput) } + def !(io: ProcessIO) = run(io).exitValue() + + def canPipeTo = false } -private[sbt] class URLBuilder(url: URL) extends URLPartialBuilder with SourcePartialBuilder -{ - protected def toSource = new URLInput(url) +private[sbt] class URLBuilder(url: URL) extends URLPartialBuilder with SourcePartialBuilder { + protected def toSource = new URLInput(url) } -private[sbt] class FileBuilder(base: File) extends FilePartialBuilder with SinkPartialBuilder with SourcePartialBuilder -{ - protected def toSource = new FileInput(base) - protected def toSink = new FileOutput(base, false) - def #<<(f: File): ProcessBuilder = #<<(new FileInput(f)) - def #<<(u: URL): ProcessBuilder = #<<(new URLInput(u)) - def #<<(s: => InputStream): ProcessBuilder = #<<(new InputStreamBuilder(s)) - def #<<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, new FileOutput(base, true), false, ExitCodes.firstIfNonzero) +private[sbt] class FileBuilder(base: File) extends FilePartialBuilder with SinkPartialBuilder with SourcePartialBuilder { + protected def toSource = new FileInput(base) + protected def toSink = new FileOutput(base, false) + def #<<(f: File): ProcessBuilder = #<<(new FileInput(f)) + def #<<(u: URL): ProcessBuilder = #<<(new URLInput(u)) + def #<<(s: => InputStream): ProcessBuilder = #<<(new InputStreamBuilder(s)) + def #<<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, new FileOutput(base, true), false, ExitCodes.firstIfNonzero) } -private abstract class BasicBuilder extends AbstractProcessBuilder -{ - protected[this] def checkNotThis(a: ProcessBuilder) = require(a != this, "Compound process '" + a + "' cannot contain itself.") - final def run(io: ProcessIO): Process = 
- { - val p = createProcess(io) - p.start() - p - } - protected[this] def createProcess(io: ProcessIO): BasicProcess +private abstract class BasicBuilder extends AbstractProcessBuilder { + protected[this] def checkNotThis(a: ProcessBuilder) = require(a != this, "Compound process '" + a + "' cannot contain itself.") + final def run(io: ProcessIO): Process = + { + val p = createProcess(io) + p.start() + p + } + protected[this] def createProcess(io: ProcessIO): BasicProcess } -private abstract class BasicProcess extends Process -{ - def start(): Unit +private abstract class BasicProcess extends Process { + def start(): Unit } -private abstract class CompoundProcess extends BasicProcess -{ - def destroy() { destroyer() } - def exitValue() = getExitValue().getOrElse(error("No exit code: process destroyed.")) +private abstract class CompoundProcess extends BasicProcess { + def destroy() { destroyer() } + def exitValue() = getExitValue().getOrElse(error("No exit code: process destroyed.")) - def start() = getExitValue - - protected lazy val (getExitValue, destroyer) = - { - val code = new SyncVar[Option[Int]]() - code.set(None) - val thread = Spawn(code.set(runAndExitValue())) - - ( - Future { thread.join(); code.get }, - () => thread.interrupt() - ) - } - - /** Start and block until the exit value is available and then return it in Some. Return None if destroyed (use 'run')*/ - protected[this] def runAndExitValue(): Option[Int] + def start() = getExitValue - protected[this] def runInterruptible[T](action: => T)(destroyImpl: => Unit): Option[T] = - { - try { Some(action) } - catch { case _: InterruptedException => destroyImpl; None } - } + protected lazy val (getExitValue, destroyer) = + { + val code = new SyncVar[Option[Int]]() + code.set(None) + val thread = Spawn(code.set(runAndExitValue())) + + ( + Future { thread.join(); code.get }, + () => thread.interrupt() + ) + } + + /** Start and block until the exit value is available and then return it in Some. 
Return None if destroyed (use 'run')*/ + protected[this] def runAndExitValue(): Option[Int] + + protected[this] def runInterruptible[T](action: => T)(destroyImpl: => Unit): Option[T] = + { + try { Some(action) } + catch { case _: InterruptedException => destroyImpl; None } + } } -private abstract class SequentialProcessBuilder(a: ProcessBuilder, b: ProcessBuilder, operatorString: String) extends BasicBuilder -{ - checkNotThis(a) - checkNotThis(b) - override def toString = " ( " + a + " " + operatorString + " " + b + " ) " +private abstract class SequentialProcessBuilder(a: ProcessBuilder, b: ProcessBuilder, operatorString: String) extends BasicBuilder { + checkNotThis(a) + checkNotThis(b) + override def toString = " ( " + a + " " + operatorString + " " + b + " ) " } -private class PipedProcessBuilder(first: ProcessBuilder, second: ProcessBuilder, toError: Boolean, exitCode: (Int,Int) => Int) extends SequentialProcessBuilder(first, second, if(toError) "#|!" else "#|") -{ - override def createProcess(io: ProcessIO) = new PipedProcesses(first, second, io, toError, exitCode) +private class PipedProcessBuilder(first: ProcessBuilder, second: ProcessBuilder, toError: Boolean, exitCode: (Int, Int) => Int) extends SequentialProcessBuilder(first, second, if (toError) "#|!" 
else "#|") { + override def createProcess(io: ProcessIO) = new PipedProcesses(first, second, io, toError, exitCode) } -private class AndProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#&&") -{ - override def createProcess(io: ProcessIO) = new AndProcess(first, second, io) +private class AndProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#&&") { + override def createProcess(io: ProcessIO) = new AndProcess(first, second, io) } -private class OrProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#||") -{ - override def createProcess(io: ProcessIO) = new OrProcess(first, second, io) +private class OrProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#||") { + override def createProcess(io: ProcessIO) = new OrProcess(first, second, io) } -private class SequenceProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "###") -{ - override def createProcess(io: ProcessIO) = new ProcessSequence(first, second, io) +private class SequenceProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "###") { + override def createProcess(io: ProcessIO) = new ProcessSequence(first, second, io) } -private class SequentialProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO, evaluateSecondProcess: Int => Boolean) extends CompoundProcess -{ - protected[this] override def runAndExitValue() = - { - val first = a.run(io) - runInterruptible(first.exitValue)(first.destroy()) flatMap - { codeA => - if(evaluateSecondProcess(codeA)) - { - val second = b.run(io) - runInterruptible(second.exitValue)(second.destroy()) - } - else - Some(codeA) - } - } +private class SequentialProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO, 
evaluateSecondProcess: Int => Boolean) extends CompoundProcess { + protected[this] override def runAndExitValue() = + { + val first = a.run(io) + runInterruptible(first.exitValue)(first.destroy()) flatMap + { codeA => + if (evaluateSecondProcess(codeA)) { + val second = b.run(io) + runInterruptible(second.exitValue)(second.destroy()) + } else + Some(codeA) + } + } } private class AndProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, _ == 0) private class OrProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, _ != 0) private class ProcessSequence(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, ignore => true) +private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: ProcessIO, toError: Boolean, exitCode: (Int, Int) => Int) extends CompoundProcess { + protected[this] override def runAndExitValue() = + { + val currentSource = new SyncVar[Option[InputStream]] + val pipeOut = new PipedOutputStream + val source = new PipeSource(currentSource, pipeOut, a.toString) + source.start() -private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: ProcessIO, toError: Boolean, exitCode: (Int, Int) => Int) extends CompoundProcess -{ - protected[this] override def runAndExitValue() = - { - val currentSource = new SyncVar[Option[InputStream]] - val pipeOut = new PipedOutputStream - val source = new PipeSource(currentSource, pipeOut, a.toString) - source.start() - - val pipeIn = new PipedInputStream(pipeOut) - val currentSink = new SyncVar[Option[OutputStream]] - val sink = new PipeSink(pipeIn, currentSink, b.toString) - sink.start() + val pipeIn = new PipedInputStream(pipeOut) + val currentSink = new SyncVar[Option[OutputStream]] + val sink = new PipeSink(pipeIn, currentSink, b.toString) + sink.start() - def handleOutOrError(fromOutput: InputStream) = currentSource.put(Some(fromOutput)) + def 
handleOutOrError(fromOutput: InputStream) = currentSource.put(Some(fromOutput)) - val firstIO = - if(toError) - defaultIO.withError(handleOutOrError) - else - defaultIO.withOutput(handleOutOrError) - val secondIO = defaultIO.withInput(toInput => currentSink.put(Some(toInput)) ) - - val second = b.run(secondIO) - val first = a.run(firstIO) - try - { - runInterruptible { - val firstResult = first.exitValue - currentSource.put(None) - currentSink.put(None) - val secondResult = second.exitValue - exitCode(firstResult, secondResult) - } { - first.destroy() - second.destroy() - } - } - finally - { - BasicIO.close(pipeIn) - BasicIO.close(pipeOut) - } - } + val firstIO = + if (toError) + defaultIO.withError(handleOutOrError) + else + defaultIO.withOutput(handleOutOrError) + val secondIO = defaultIO.withInput(toInput => currentSink.put(Some(toInput))) + + val second = b.run(secondIO) + val first = a.run(firstIO) + try { + runInterruptible { + val firstResult = first.exitValue + currentSource.put(None) + currentSink.put(None) + val secondResult = second.exitValue + exitCode(firstResult, secondResult) + } { + first.destroy() + second.destroy() + } + } finally { + BasicIO.close(pipeIn) + BasicIO.close(pipeOut) + } + } } -private class PipeSource(currentSource: SyncVar[Option[InputStream]], pipe: PipedOutputStream, label: => String) extends Thread -{ - final override def run() - { - currentSource.get match - { - case Some(source) => - try { BasicIO.transferFully(source, pipe) } - catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } - finally - { - BasicIO.close(source) - currentSource.unset() - } - run() - case None => - currentSource.unset() - BasicIO.close(pipe) - } - } +private class PipeSource(currentSource: SyncVar[Option[InputStream]], pipe: PipedOutputStream, label: => String) extends Thread { + final override def run() { + currentSource.get match { + case Some(source) => + try { 
BasicIO.transferFully(source, pipe) } + catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } + finally { + BasicIO.close(source) + currentSource.unset() + } + run() + case None => + currentSource.unset() + BasicIO.close(pipe) + } + } } -private class PipeSink(pipe: PipedInputStream, currentSink: SyncVar[Option[OutputStream]], label: => String) extends Thread -{ - final override def run() - { - currentSink.get match - { - case Some(sink) => - try { BasicIO.transferFully(pipe, sink) } - catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } - finally - { - BasicIO.close(sink) - currentSink.unset() - } - run() - case None => - currentSink.unset() - } - } +private class PipeSink(pipe: PipedInputStream, currentSink: SyncVar[Option[OutputStream]], label: => String) extends Thread { + final override def run() { + currentSink.get match { + case Some(sink) => + try { BasicIO.transferFully(pipe, sink) } + catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } + finally { + BasicIO.close(sink) + currentSink.unset() + } + run() + case None => + currentSink.unset() + } + } } -private[sbt] class DummyProcessBuilder(override val toString: String, exitValue : => Int) extends AbstractProcessBuilder -{ - override def run(io: ProcessIO): Process = new DummyProcess(exitValue) - override def canPipeTo = true +private[sbt] class DummyProcessBuilder(override val toString: String, exitValue: => Int) extends AbstractProcessBuilder { + override def run(io: ProcessIO): Process = new DummyProcess(exitValue) + override def canPipeTo = true } -/** A thin wrapper around a java.lang.Process. `ioThreads` are the Threads created to do I/O. -* The implementation of `exitValue` waits until these threads die before returning. 
*/ -private class DummyProcess(action: => Int) extends Process -{ - private[this] val exitCode = Future(action) - override def exitValue() = exitCode() - override def destroy() {} +/** + * A thin wrapper around a java.lang.Process. `ioThreads` are the Threads created to do I/O. + * The implementation of `exitValue` waits until these threads die before returning. + */ +private class DummyProcess(action: => Int) extends Process { + private[this] val exitCode = Future(action) + override def exitValue() = exitCode() + override def destroy() {} } /** Represents a simple command without any redirection or combination. */ -private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProcessBuilder -{ - override def run(io: ProcessIO): Process = - { - import io._ - val inherited = inheritInput(p) - val process = p.start() +private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProcessBuilder { + override def run(io: ProcessIO): Process = + { + import io._ + val inherited = inheritInput(p) + val process = p.start() - // spawn threads that process the output and error streams, and also write input if not inherited. - if (!inherited) - Spawn(writeInput(process.getOutputStream)) - val outThread = Spawn(processOutput(process.getInputStream)) - val errorThread = - if(!p.redirectErrorStream) - Spawn(processError(process.getErrorStream)) :: Nil - else - Nil - new SimpleProcess(process, outThread :: errorThread) - } - override def toString = p.command.toString - override def canPipeTo = true + // spawn threads that process the output and error streams, and also write input if not inherited. 
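The "spawn threads" comment above marks the key step of `SimpleProcessBuilder.run`: each output stream is pumped on its own thread, and `SimpleProcess.exitValue` later joins those threads so no output is lost after `waitFor` returns. A minimal standalone sketch of that drain-and-join pattern (illustrative names only, not sbt's API):

```scala
import java.io.{ ByteArrayOutputStream, InputStream }

// Drain an InputStream on a background thread; the caller joins the
// returned thread after the process exits to be sure all bytes arrived.
object DrainSketch {
  def spawn(body: => Unit): Thread = {
    val t = new Thread(new Runnable { def run(): Unit = body })
    t.setDaemon(true)
    t.start()
    t
  }

  def drain(in: InputStream): (Thread, ByteArrayOutputStream) = {
    val buf = new ByteArrayOutputStream
    val t = spawn {
      val bytes = new Array[Byte](8192)
      var n = in.read(bytes)
      while (n != -1) { buf.write(bytes, 0, n); n = in.read(bytes) }
    }
    (t, buf)
  }
}
```

Joining before reading `buf` is what `outputThreads.foreach(_.join())` accomplishes in `SimpleProcess.exitValue`: `Process.waitFor` only reports that the child exited, not that its output has been fully read.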
+ if (!inherited) + Spawn(writeInput(process.getOutputStream)) + val outThread = Spawn(processOutput(process.getInputStream)) + val errorThread = + if (!p.redirectErrorStream) + Spawn(processError(process.getErrorStream)) :: Nil + else + Nil + new SimpleProcess(process, outThread :: errorThread) + } + override def toString = p.command.toString + override def canPipeTo = true } -/** A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the -* output and error streams of the process. -* The implementation of `exitValue` wait for the process to finish and then waits until the threads reading output and error streams die before -* returning. Note that the thread that reads the input stream cannot be interrupted, see https://github.com/sbt/sbt/issues/327 and -* http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4514257 */ -private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Process -{ - override def exitValue() = - { - try { - p.waitFor() - } catch { - case _: InterruptedException => p.destroy() - } - outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) - p.exitValue() - } - override def destroy() = p.destroy() +/** + * A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the + * output and error streams of the process. + * The implementation of `exitValue` waits for the process to finish and then waits until the threads reading output and error streams die before + * returning. 
Note that the thread that reads the input stream cannot be interrupted, see https://github.com/sbt/sbt/issues/327 and + * http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4514257 + */ +private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Process { + override def exitValue() = + { + try { + p.waitFor() + } catch { + case _: InterruptedException => p.destroy() + } + outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) + p.exitValue() + } + override def destroy() = p.destroy() } private class FileOutput(file: File, append: Boolean) extends OutputStreamBuilder(new FileOutputStream(file, append), file.getAbsolutePath) @@ -427,55 +389,48 @@ private class URLInput(url: URL) extends InputStreamBuilder(url.openStream, url. private class FileInput(file: File) extends InputStreamBuilder(new FileInputStream(file), file.getAbsolutePath) import Uncloseable.protect -private class OutputStreamBuilder(stream: => OutputStream, label: String) extends ThreadProcessBuilder(label, _.writeInput(protect(stream))) -{ - def this(stream: => OutputStream) = this(stream, "") +private class OutputStreamBuilder(stream: => OutputStream, label: String) extends ThreadProcessBuilder(label, _.writeInput(protect(stream))) { + def this(stream: => OutputStream) = this(stream, "") } -private class InputStreamBuilder(stream: => InputStream, label: String) extends ThreadProcessBuilder(label, _.processOutput(protect(stream))) -{ - def this(stream: => InputStream) = this(stream, "") +private class InputStreamBuilder(stream: => InputStream, label: String) extends ThreadProcessBuilder(label, _.processOutput(protect(stream))) { + def this(stream: => InputStream) = this(stream, "") } -private abstract class ThreadProcessBuilder(override val toString: String, runImpl: ProcessIO => Unit) extends AbstractProcessBuilder -{ - override def run(io: ProcessIO): Process = - { - val success = new SyncVar[Boolean] - 
success.put(false) - new ThreadProcess(Spawn {runImpl(io); success.set(true) }, success) - } +private abstract class ThreadProcessBuilder(override val toString: String, runImpl: ProcessIO => Unit) extends AbstractProcessBuilder { + override def run(io: ProcessIO): Process = + { + val success = new SyncVar[Boolean] + success.put(false) + new ThreadProcess(Spawn { runImpl(io); success.set(true) }, success) + } } -private final class ThreadProcess(thread: Thread, success: SyncVar[Boolean]) extends Process -{ - override def exitValue() = - { - thread.join() - if(success.get) 0 else 1 - } - override def destroy() { thread.interrupt() } +private final class ThreadProcess(thread: Thread, success: SyncVar[Boolean]) extends Process { + override def exitValue() = + { + thread.join() + if (success.get) 0 else 1 + } + override def destroy() { thread.interrupt() } } -object Uncloseable -{ - def apply(in: InputStream): InputStream = new FilterInputStream(in) { override def close() {} } - def apply(out: OutputStream): OutputStream = new FilterOutputStream(out) { override def close() {} } - def protect(in: InputStream): InputStream = if(in eq System.in) Uncloseable(in) else in - def protect(out: OutputStream): OutputStream = if( (out eq System.out) || (out eq System.err)) Uncloseable(out) else out +object Uncloseable { + def apply(in: InputStream): InputStream = new FilterInputStream(in) { override def close() {} } + def apply(out: OutputStream): OutputStream = new FilterOutputStream(out) { override def close() {} } + def protect(in: InputStream): InputStream = if (in eq System.in) Uncloseable(in) else in + def protect(out: OutputStream): OutputStream = if ((out eq System.out) || (out eq System.err)) Uncloseable(out) else out } -private object Streamed -{ - def apply[T](nonzeroException: Boolean): Streamed[T] = - { - val q = new java.util.concurrent.LinkedBlockingQueue[Either[Int, T]] - def next(): Stream[T] = - q.take match - { - case Left(0) => Stream.empty - case Left(code) => 
if(nonzeroException) error("Nonzero exit code: " + code) else Stream.empty - case Right(s) => Stream.cons(s, next) - } - new Streamed((s: T) => q.put(Right(s)), code => q.put(Left(code)), () => next()) - } +private object Streamed { + def apply[T](nonzeroException: Boolean): Streamed[T] = + { + val q = new java.util.concurrent.LinkedBlockingQueue[Either[Int, T]] + def next(): Stream[T] = + q.take match { + case Left(0) => Stream.empty + case Left(code) => if (nonzeroException) error("Nonzero exit code: " + code) else Stream.empty + case Right(s) => Stream.cons(s, next) + } + new Streamed((s: T) => q.put(Right(s)), code => q.put(Left(code)), () => next()) + } } private final class Streamed[T](val process: T => Unit, val done: Int => Unit, val stream: () => Stream[T]) extends NotNull diff --git a/util/process/src/main/scala/sbt/SyncVar.scala b/util/process/src/main/scala/sbt/SyncVar.scala index a04675851..c268aac3d 100644 --- a/util/process/src/main/scala/sbt/SyncVar.scala +++ b/util/process/src/main/scala/sbt/SyncVar.scala @@ -1,40 +1,39 @@ package sbt // minimal copy of scala.concurrent.SyncVar since that version deprecated put and unset -private[sbt] final class SyncVar[A] -{ - private[this] var isDefined: Boolean = false - private[this] var value: Option[A] = None +private[sbt] final class SyncVar[A] { + private[this] var isDefined: Boolean = false + private[this] var value: Option[A] = None - /** Waits until a value is set and then gets it. Does not clear the value */ - def get: A = synchronized { - while (!isDefined) wait() - value.get - } + /** Waits until a value is set and then gets it. Does not clear the value */ + def get: A = synchronized { + while (!isDefined) wait() + value.get + } - /** Waits until a value is set, gets it, and finally clears the value. */ - def take(): A = synchronized { - try get finally unset() - } + /** Waits until a value is set, gets it, and finally clears the value. 
*/ + def take(): A = synchronized { + try get finally unset() + } - /** Sets the value, whether or not it is currently defined. */ - def set(x: A): Unit = synchronized { - isDefined = true - value = Some(x) - notifyAll() - } + /** Sets the value, whether or not it is currently defined. */ + def set(x: A): Unit = synchronized { + isDefined = true + value = Some(x) + notifyAll() + } - /** Sets the value, first waiting until it is undefined if it is currently defined. */ - def put(x: A): Unit = synchronized { - while (isDefined) wait() - set(x) - } + /** Sets the value, first waiting until it is undefined if it is currently defined. */ + def put(x: A): Unit = synchronized { + while (isDefined) wait() + set(x) + } - /** Clears the value, whether or not it is current defined. */ - def unset(): Unit = synchronized { - isDefined = false - value = None - notifyAll() - } + /** Clears the value, whether or not it is currently defined. */ + def unset(): Unit = synchronized { + isDefined = false + value = None + notifyAll() + } } diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index 77c0b70c2..987aafb14 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -3,165 +3,170 @@ */ package sbt - import Relation._ +import Relation._ -object Relation -{ - /** Constructs a new immutable, finite relation that is initially empty. */ - def empty[A,B]: Relation[A,B] = make(Map.empty, Map.empty) +object Relation { + /** Constructs a new immutable, finite relation that is initially empty. */ + def empty[A, B]: Relation[A, B] = make(Map.empty, Map.empty) - /** Constructs a [[Relation]] from underlying `forward` and `reverse` representations, without checking that they are consistent. - * This is a low-level constructor and the alternatives [[empty]] and [[reconstruct]] should be preferred. 
*/ - def make[A,B](forward: Map[A,Set[B]], reverse: Map[B, Set[A]]): Relation[A,B] = new MRelation(forward, reverse) + /** + * Constructs a [[Relation]] from underlying `forward` and `reverse` representations, without checking that they are consistent. + * This is a low-level constructor and the alternatives [[empty]] and [[reconstruct]] should be preferred. + */ + def make[A, B](forward: Map[A, Set[B]], reverse: Map[B, Set[A]]): Relation[A, B] = new MRelation(forward, reverse) - /** Constructs a relation such that for every entry `_1 -> _2s` in `forward` and every `_2` in `_2s`, `(_1, _2)` is in the relation. */ - def reconstruct[A,B](forward: Map[A, Set[B]]): Relation[A,B] = - { - val reversePairs = for( (a,bs) <- forward.view; b <- bs.view) yield (b, a) - val reverse = (Map.empty[B,Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } - make(forward filter { case (a, bs) => bs.nonEmpty }, reverse) - } + /** Constructs a relation such that for every entry `_1 -> _2s` in `forward` and every `_2` in `_2s`, `(_1, _2)` is in the relation. 
*/ + def reconstruct[A, B](forward: Map[A, Set[B]]): Relation[A, B] = + { + val reversePairs = for ((a, bs) <- forward.view; b <- bs.view) yield (b, a) + val reverse = (Map.empty[B, Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } + make(forward filter { case (a, bs) => bs.nonEmpty }, reverse) + } - def merge[A,B](rels: Traversable[Relation[A,B]]): Relation[A,B] = (Relation.empty[A, B] /: rels)(_ ++ _) + def merge[A, B](rels: Traversable[Relation[A, B]]): Relation[A, B] = (Relation.empty[A, B] /: rels)(_ ++ _) - private[sbt] def remove[X,Y](map: M[X,Y], from: X, to: Y): M[X,Y] = - map.get(from) match { - case Some(tos) => - val newSet = tos - to - if(newSet.isEmpty) map - from else map.updated(from, newSet) - case None => map - } + private[sbt] def remove[X, Y](map: M[X, Y], from: X, to: Y): M[X, Y] = + map.get(from) match { + case Some(tos) => + val newSet = tos - to + if (newSet.isEmpty) map - from else map.updated(from, newSet) + case None => map + } - private[sbt] def combine[X,Y](a: M[X,Y], b: M[X,Y]): M[X,Y] = - (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } + private[sbt] def combine[X, Y](a: M[X, Y], b: M[X, Y]): M[X, Y] = + (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } - private[sbt] def add[X,Y](map: M[X,Y], from: X, to: Traversable[Y]): M[X,Y] = - map.updated(from, get(map, from) ++ to) + private[sbt] def add[X, Y](map: M[X, Y], from: X, to: Traversable[Y]): M[X, Y] = + map.updated(from, get(map, from) ++ to) - private[sbt] def get[X,Y](map: M[X,Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) + private[sbt] def get[X, Y](map: M[X, Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) - private[sbt] type M[X,Y] = Map[X, Set[Y]] + private[sbt] type M[X, Y] = Map[X, Set[Y]] } /** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. */ -trait Relation[A,B] -{ - /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. 
*/ - def forward(_1: A): Set[B] - /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ - def reverse(_2: B): Set[A] - /** Includes `pair` in the relation. */ - def +(pair: (A, B)): Relation[A,B] - /** Includes `(a, b)` in the relation. */ - def +(a: A, b: B): Relation[A,B] - /** Includes in the relation `(a, b)` for all `b` in `bs`. */ - def +(a: A, bs: Traversable[B]): Relation[A,B] - /** Returns the union of the relation `r` with this relation. */ - def ++(r: Relation[A,B]): Relation[A,B] - /** Includes the given pairs in this relation. */ - def ++(rs: Traversable[(A,B)]): Relation[A,B] - /** Removes all elements `(_1, _2)` for all `_1` in `_1s` from this relation. */ - def --(_1s: Traversable[A]): Relation[A,B] - /** Removes all `pairs` from this relation. */ - def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] - /** Removes all `relations` from this relation. */ - def --(relations: Relation[A,B]): Relation[A,B] - /** Removes all pairs `(_1, _2)` from this relation. */ - def -(_1: A): Relation[A,B] - /** Removes `pair` from this relation. */ - def -(pair: (A,B)): Relation[A,B] - /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ - def _1s: collection.Set[A] - /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ - def _2s: collection.Set[B] - /** Returns the number of pairs in this relation */ - def size: Int +trait Relation[A, B] { + /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ + def forward(_1: A): Set[B] + /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ + def reverse(_2: B): Set[A] + /** Includes `pair` in the relation. */ + def +(pair: (A, B)): Relation[A, B] + /** Includes `(a, b)` in the relation. */ + def +(a: A, b: B): Relation[A, B] + /** Includes in the relation `(a, b)` for all `b` in `bs`. */ + def +(a: A, bs: Traversable[B]): Relation[A, B] + /** Returns the union of the relation `r` with this relation. 
*/ + def ++(r: Relation[A, B]): Relation[A, B] + /** Includes the given pairs in this relation. */ + def ++(rs: Traversable[(A, B)]): Relation[A, B] + /** Removes all elements `(_1, _2)` for all `_1` in `_1s` from this relation. */ + def --(_1s: Traversable[A]): Relation[A, B] + /** Removes all `pairs` from this relation. */ + def --(pairs: TraversableOnce[(A, B)]): Relation[A, B] + /** Removes all `relations` from this relation. */ + def --(relations: Relation[A, B]): Relation[A, B] + /** Removes all pairs `(_1, _2)` from this relation. */ + def -(_1: A): Relation[A, B] + /** Removes `pair` from this relation. */ + def -(pair: (A, B)): Relation[A, B] + /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ + def _1s: collection.Set[A] + /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ + def _2s: collection.Set[B] + /** Returns the number of pairs in this relation */ + def size: Int - /** Returns true iff `(a,b)` is in this relation*/ - def contains(a: A, b: B): Boolean + /** Returns true iff `(a,b)` is in this relation*/ + def contains(a: A, b: B): Boolean - /** Returns a relation with only pairs `(a,b)` for which `f(a,b)` is true.*/ - def filter(f: (A,B) => Boolean): Relation[A,B] + /** Returns a relation with only pairs `(a,b)` for which `f(a,b)` is true.*/ + def filter(f: (A, B) => Boolean): Relation[A, B] - /** Returns a pair of relations: the first contains only pairs `(a,b)` for which `f(a,b)` is true and - * the other only pairs `(a,b)` for which `f(a,b)` is false. */ - def partition(f: (A,B) => Boolean): (Relation[A,B], Relation[A,B]) + /** + * Returns a pair of relations: the first contains only pairs `(a,b)` for which `f(a,b)` is true and + * the other only pairs `(a,b)` for which `f(a,b)` is false. + */ + def partition(f: (A, B) => Boolean): (Relation[A, B], Relation[A, B]) - /** Partitions this relation into a map of relations according to some discriminator function. 
*/ - def groupBy[K](discriminator: ((A,B)) => K): Map[K, Relation[A,B]] + /** Partitions this relation into a map of relations according to some discriminator function. */ + def groupBy[K](discriminator: ((A, B)) => K): Map[K, Relation[A, B]] - /** Returns all pairs in this relation.*/ - def all: Traversable[(A,B)] + /** Returns all pairs in this relation.*/ + def all: Traversable[(A, B)] - /** Represents this relation as a `Map` from a `_1` to the set of `_2`s such that `(_1, _2)` is in this relation. - * - * Specifically, there is one entry for each `_1` such that `(_1, _2)` is in this relation for some `_2`. - * The value associated with a given `_1` is the set of all `_2`s such that `(_1, _2)` is in this relation.*/ - def forwardMap: Map[A, Set[B]] + /** + * Represents this relation as a `Map` from a `_1` to the set of `_2`s such that `(_1, _2)` is in this relation. + * + * Specifically, there is one entry for each `_1` such that `(_1, _2)` is in this relation for some `_2`. + * The value associated with a given `_1` is the set of all `_2`s such that `(_1, _2)` is in this relation. + */ + def forwardMap: Map[A, Set[B]] - /** Represents this relation as a `Map` from a `_2` to the set of `_1`s such that `(_1, _2)` is in this relation. - * - * Specifically, there is one entry for each `_2` such that `(_1, _2)` is in this relation for some `_1`. - * The value associated with a given `_2` is the set of all `_1`s such that `(_1, _2)` is in this relation.*/ - def reverseMap: Map[B, Set[A]] + /** + * Represents this relation as a `Map` from a `_2` to the set of `_1`s such that `(_1, _2)` is in this relation. + * + * Specifically, there is one entry for each `_2` such that `(_1, _2)` is in this relation for some `_1`. + * The value associated with a given `_2` is the set of all `_1`s such that `(_1, _2)` is in this relation. + */ + def reverseMap: Map[B, Set[A]] } // Note that we assume without checking that fwd and rev are consistent. 
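The consistency assumption noted above can be seen in a minimal sketch of the same design (illustrative only, not sbt's `MRelation`): every insertion touches the forward and reverse maps together, which is what makes lookups in both directions single map accesses.

```scala
// A tiny bidirectional relation: fwd and rev are kept in sync on every
// insertion, so forward and reverse lookups never scan the other map.
object RelationSketch {
  type M[X, Y] = Map[X, Set[Y]]

  private def add[X, Y](m: M[X, Y], k: X, v: Y): M[X, Y] =
    m.updated(k, m.getOrElse(k, Set.empty[Y]) + v)

  final case class Rel[A, B](fwd: M[A, B], rev: M[B, A]) {
    def +(a: A, b: B): Rel[A, B] = Rel(add(fwd, a, b), add(rev, b, a))
    def forward(a: A): Set[B] = fwd.getOrElse(a, Set.empty)
    def reverse(b: B): Set[A] = rev.getOrElse(b, Set.empty)
  }

  def empty[A, B]: Rel[A, B] = Rel(Map.empty, Map.empty)
}
```

For example, after adding `("a", 1)` and `("b", 1)`, `reverse(1)` answers `Set("a", "b")` directly, without scanning the forward map.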
-private final class MRelation[A,B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A,B] -{ - def forwardMap = fwd - def reverseMap = rev +private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A, B] { + def forwardMap = fwd + def reverseMap = rev - def forward(t: A) = get(fwd, t) - def reverse(t: B) = get(rev, t) + def forward(t: A) = get(fwd, t) + def reverse(t: B) = get(rev, t) - def _1s = fwd.keySet - def _2s = rev.keySet + def _1s = fwd.keySet + def _2s = rev.keySet - def size = (fwd.valuesIterator map { _.size }).foldLeft(0)(_ + _) + def size = (fwd.valuesIterator map { _.size }).foldLeft(0)(_ + _) - def all: Traversable[(A,B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map( b => (a,b) ) }.toTraversable + def all: Traversable[(A, B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map(b => (a, b)) }.toTraversable - def +(pair: (A,B)) = this + (pair._1, Set(pair._2)) - def +(from: A, to: B) = this + (from, to :: Nil) - def +(from: A, to: Traversable[B]) = if(to.isEmpty) this else - new MRelation( add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) + def +(pair: (A, B)) = this + (pair._1, Set(pair._2)) + def +(from: A, to: B) = this + (from, to :: Nil) + def +(from: A, to: Traversable[B]) = if (to.isEmpty) this else + new MRelation(add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) - def ++(rs: Traversable[(A,B)]) = ((this: Relation[A,B]) /: rs) { _ + _ } - def ++(other: Relation[A,B]) = new MRelation[A,B]( combine(fwd, other.forwardMap), combine(rev, other.reverseMap) ) + def ++(rs: Traversable[(A, B)]) = ((this: Relation[A, B]) /: rs) { _ + _ } + def ++(other: Relation[A, B]) = new MRelation[A, B](combine(fwd, other.forwardMap), combine(rev, other.reverseMap)) - def --(ts: Traversable[A]): Relation[A,B] = ((this: Relation[A,B]) /: ts) { _ - _ } - def --(pairs: TraversableOnce[(A,B)]): Relation[A,B] = ((this: Relation[A,B]) /: pairs) { _ - _ } - def 
--(relations: Relation[A,B]): Relation[A,B] = --(relations.all) - def -(pair: (A,B)): Relation[A,B] = - new MRelation( remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1) ) - def -(t: A): Relation[A,B] = - fwd.get(t) match { - case Some(rs) => - val upRev = (rev /: rs) { (map, r) => remove(map, r, t) } - new MRelation(fwd - t, upRev) - case None => this - } + def --(ts: Traversable[A]): Relation[A, B] = ((this: Relation[A, B]) /: ts) { _ - _ } + def --(pairs: TraversableOnce[(A, B)]): Relation[A, B] = ((this: Relation[A, B]) /: pairs) { _ - _ } + def --(relations: Relation[A, B]): Relation[A, B] = --(relations.all) + def -(pair: (A, B)): Relation[A, B] = + new MRelation(remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1)) + def -(t: A): Relation[A, B] = + fwd.get(t) match { + case Some(rs) => + val upRev = (rev /: rs) { (map, r) => remove(map, r, t) } + new MRelation(fwd - t, upRev) + case None => this + } - def filter(f: (A,B) => Boolean): Relation[A,B] = Relation.empty[A,B] ++ all.filter(f.tupled) + def filter(f: (A, B) => Boolean): Relation[A, B] = Relation.empty[A, B] ++ all.filter(f.tupled) - def partition(f: (A,B) => Boolean): (Relation[A,B], Relation[A,B]) = { - val (y, n) = all.partition(f.tupled) - (Relation.empty[A,B] ++ y, Relation.empty[A,B] ++ n) - } + def partition(f: (A, B) => Boolean): (Relation[A, B], Relation[A, B]) = { + val (y, n) = all.partition(f.tupled) + (Relation.empty[A, B] ++ y, Relation.empty[A, B] ++ n) + } - def groupBy[K](discriminator: ((A,B)) => K): Map[K, Relation[A,B]] = all.groupBy(discriminator) mapValues { Relation.empty[A,B] ++ _ } + def groupBy[K](discriminator: ((A, B)) => K): Map[K, Relation[A, B]] = all.groupBy(discriminator) mapValues { Relation.empty[A, B] ++ _ } - def contains(a: A, b: B): Boolean = forward(a)(b) + def contains(a: A, b: B): Boolean = forward(a)(b) - override def equals(other: Any) = other match { - // We assume that the forward and reverse maps are consistent, so we only use the 
forward map - // for equality. Note that key -> Empty is semantically the same as key not existing. - case o: MRelation[A,B] => forwardMap.filterNot(_._2.isEmpty) == o.forwardMap.filterNot(_._2.isEmpty) - case _ => false - } + override def equals(other: Any) = other match { + // We assume that the forward and reverse maps are consistent, so we only use the forward map + // for equality. Note that key -> Empty is semantically the same as key not existing. + case o: MRelation[A, B] => forwardMap.filterNot(_._2.isEmpty) == o.forwardMap.filterNot(_._2.isEmpty) + case _ => false + } - override def hashCode = fwd.filterNot(_._2.isEmpty).hashCode() + override def hashCode = fwd.filterNot(_._2.isEmpty).hashCode() - override def toString = all.map { case (a,b) => a + " -> " + b }.mkString("Relation [", ", ", "]") + override def toString = all.map { case (a, b) => a + " -> " + b }.mkString("Relation [", ", ", "]") } From 40fae635c6b029c2c7ec7123eb048c75bda17b79 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 2 May 2014 18:07:05 -0400 Subject: [PATCH 431/823] some more source getting formatted --- util/process/src/main/scala/sbt/InheritInput.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/process/src/main/scala/sbt/InheritInput.scala b/util/process/src/main/scala/sbt/InheritInput.scala index 9502cee49..a9828b04d 100755 --- a/util/process/src/main/scala/sbt/InheritInput.scala +++ b/util/process/src/main/scala/sbt/InheritInput.scala @@ -10,7 +10,7 @@ private[sbt] object InheritInput { def apply(p: JProcessBuilder): Boolean = (redirectInput, inherit) match { case (Some(m), Some(f)) => m.invoke(p, f); true - case _ => false + case _ => false } private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") From 9f9de600ee8b762d2bf2e363b3aded1177eb7e62 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Wed, 7 May 2014 11:52:23 -0400 Subject: [PATCH 432/823] Scalariforming test code --- cache/src/test/scala/CacheTest.scala | 41 ++- 
.../src/test/scala/DagSpecification.scala | 78 +++-- util/collection/src/test/scala/KeyTest.scala | 55 ++-- .../src/test/scala/LiteralTest.scala | 8 +- util/collection/src/test/scala/PMapTest.scala | 19 +- .../src/test/scala/SettingsExample.scala | 116 +++---- .../src/test/scala/SettingsTest.scala | 303 +++++++++--------- util/complete/src/test/scala/ParserTest.scala | 250 +++++++-------- .../scala/sbt/complete/FileExamplesTest.scala | 132 ++++---- .../sbt/complete/FixedSetExamplesTest.scala | 32 +- .../sbt/complete/ParserWithExamplesTest.scala | 136 ++++---- util/log/src/test/scala/Escapes.scala | 133 ++++---- util/log/src/test/scala/LogWriterTest.scala | 252 +++++++-------- util/log/src/test/scala/TestLogger.scala | 15 +- .../logic/src/test/scala/sbt/logic/Test.scala | 194 ++++++----- .../src/test/scala/ProcessSpecification.scala | 226 +++++++------ .../src/test/scala/TestedProcess.scala | 91 +++--- .../src/test/scala/RelationTest.scala | 128 ++++---- 18 files changed, 1080 insertions(+), 1129 deletions(-) diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index 481bfb9b6..cbb7319b7 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -3,30 +3,29 @@ package sbt import java.io.File import Types.:+: -object CacheTest// extends Properties("Cache test") +object CacheTest // extends Properties("Cache test") { - val lengthCache = new File("/tmp/length-cache") - val cCache = new File("/tmp/c-cache") + val lengthCache = new File("/tmp/length-cache") + val cCache = new File("/tmp/c-cache") - import Cache._ - import FileInfo.hash._ - import Ordering._ - import sbinary.DefaultProtocol.FileFormat - def test - { - lazy val create = new File("test") + import Cache._ + import FileInfo.hash._ + import Ordering._ + import sbinary.DefaultProtocol.FileFormat + def test { + lazy val create = new File("test") - val length = cached(lengthCache) { - (f: File) => { println("File length: " + f.length); 
f.length } - } + val length = cached(lengthCache) { + (f: File) => { println("File length: " + f.length); f.length } + } - lazy val fileLength = length(create) + lazy val fileLength = length(create) - val c = cached(cCache) { (in: (File :+: Long :+: HNil)) => - val file :+: len :+: HNil = in - println("File: " + file + " (" + file.exists + "), length: " + len) - (len+1) :+: file :+: HNil - } - c(create :+: fileLength :+: HNil) - } + val c = cached(cCache) { (in: (File :+: Long :+: HNil)) => + val file :+: len :+: HNil = in + println("File: " + file + " (" + file.exists + "), length: " + len) + (len + 1) :+: file :+: HNil + } + c(create :+: fileLength :+: HNil) + } } \ No newline at end of file diff --git a/util/collection/src/test/scala/DagSpecification.scala b/util/collection/src/test/scala/DagSpecification.scala index 77ff80120..abf9ddf28 100644 --- a/util/collection/src/test/scala/DagSpecification.scala +++ b/util/collection/src/test/scala/DagSpecification.scala @@ -8,49 +8,43 @@ import Prop._ import scala.collection.mutable.HashSet -object DagSpecification extends Properties("Dag") -{ - property("No repeated nodes") = forAll{ (dag: TestDag) => isSet(dag.topologicalSort) } - property("Sort contains node") = forAll{ (dag: TestDag) => dag.topologicalSort.contains(dag) } - property("Dependencies precede node") = forAll{ (dag: TestDag) => dependenciesPrecedeNodes(dag.topologicalSort) } +object DagSpecification extends Properties("Dag") { + property("No repeated nodes") = forAll { (dag: TestDag) => isSet(dag.topologicalSort) } + property("Sort contains node") = forAll { (dag: TestDag) => dag.topologicalSort.contains(dag) } + property("Dependencies precede node") = forAll { (dag: TestDag) => dependenciesPrecedeNodes(dag.topologicalSort) } - implicit lazy val arbTestDag: Arbitrary[TestDag] = Arbitrary(Gen.sized(dagGen)) - private def dagGen(nodeCount: Int): Gen[TestDag] = - { - val nodes = new HashSet[TestDag] - def nonterminalGen(p: Gen.Parameters): Gen[TestDag] = - { 
- for(i <- 0 until nodeCount; nextDeps <- Gen.someOf(nodes).apply(p)) - nodes += new TestDag(i, nextDeps) - for(nextDeps <- Gen.someOf(nodes)) yield - new TestDag(nodeCount, nextDeps) - } - Gen.parameterized(nonterminalGen) - } + implicit lazy val arbTestDag: Arbitrary[TestDag] = Arbitrary(Gen.sized(dagGen)) + private def dagGen(nodeCount: Int): Gen[TestDag] = + { + val nodes = new HashSet[TestDag] + def nonterminalGen(p: Gen.Parameters): Gen[TestDag] = + { + for (i <- 0 until nodeCount; nextDeps <- Gen.someOf(nodes).apply(p)) + nodes += new TestDag(i, nextDeps) + for (nextDeps <- Gen.someOf(nodes)) yield new TestDag(nodeCount, nextDeps) + } + Gen.parameterized(nonterminalGen) + } - private def isSet[T](c: Seq[T]) = Set(c: _*).size == c.size - private def dependenciesPrecedeNodes(sort: List[TestDag]) = - { - val seen = new HashSet[TestDag] - def iterate(remaining: List[TestDag]): Boolean = - { - remaining match - { - case Nil => true - case node :: tail => - if(node.dependencies.forall(seen.contains) && !seen.contains(node)) - { - seen += node - iterate(tail) - } - else - false - } - } - iterate(sort) - } + private def isSet[T](c: Seq[T]) = Set(c: _*).size == c.size + private def dependenciesPrecedeNodes(sort: List[TestDag]) = + { + val seen = new HashSet[TestDag] + def iterate(remaining: List[TestDag]): Boolean = + { + remaining match { + case Nil => true + case node :: tail => + if (node.dependencies.forall(seen.contains) && !seen.contains(node)) { + seen += node + iterate(tail) + } else + false + } + } + iterate(sort) + } } -class TestDag(id: Int, val dependencies: Iterable[TestDag]) extends Dag[TestDag] -{ - override def toString = id + "->" + dependencies.mkString("[", ",", "]") +class TestDag(id: Int, val dependencies: Iterable[TestDag]) extends Dag[TestDag] { + override def toString = id + "->" + dependencies.mkString("[", ",", "]") } \ No newline at end of file diff --git a/util/collection/src/test/scala/KeyTest.scala 
b/util/collection/src/test/scala/KeyTest.scala index 9ac4f86bb..f48e3742a 100644 --- a/util/collection/src/test/scala/KeyTest.scala +++ b/util/collection/src/test/scala/KeyTest.scala @@ -1,35 +1,32 @@ package sbt - import org.scalacheck._ - import Prop._ +import org.scalacheck._ +import Prop._ -object KeyTest extends Properties("AttributeKey") -{ - property("equality") = { - compare(AttributeKey[Int]("test"), AttributeKey[Int]("test"), true) && - compare(AttributeKey[Int]("test"), AttributeKey[Int]("test", "description"), true) && - compare(AttributeKey[Int]("test", "a"), AttributeKey[Int]("test", "b"), true) && - compare(AttributeKey[Int]("test"), AttributeKey[Int]("tests"), false) && - compare(AttributeKey[Int]("test"), AttributeKey[Double]("test"), false) && - compare(AttributeKey[java.lang.Integer]("test"), AttributeKey[Int]("test"), false) && - compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, String]]("test"), true) && - compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, _]]("test"), false) - } +object KeyTest extends Properties("AttributeKey") { + property("equality") = { + compare(AttributeKey[Int]("test"), AttributeKey[Int]("test"), true) && + compare(AttributeKey[Int]("test"), AttributeKey[Int]("test", "description"), true) && + compare(AttributeKey[Int]("test", "a"), AttributeKey[Int]("test", "b"), true) && + compare(AttributeKey[Int]("test"), AttributeKey[Int]("tests"), false) && + compare(AttributeKey[Int]("test"), AttributeKey[Double]("test"), false) && + compare(AttributeKey[java.lang.Integer]("test"), AttributeKey[Int]("test"), false) && + compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, String]]("test"), true) && + compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, _]]("test"), false) + } - def compare(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = - ("a.label: " + a.label) |: - ("a.manifest: " + a.manifest) |: - ("b.label: " + b.label) |: - ("b.manifest: " + 
b.manifest) |: - ("expected equal? " + same) |: - compare0(a, b, same) + def compare(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = + ("a.label: " + a.label) |: + ("a.manifest: " + a.manifest) |: + ("b.label: " + b.label) |: + ("b.manifest: " + b.manifest) |: + ("expected equal? " + same) |: + compare0(a, b, same) - def compare0(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = - if(same) - { - ("equality" |: (a == b)) && - ("hash" |: (a.hashCode == b.hashCode)) - } - else - ("equality" |: (a != b)) + def compare0(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = + if (same) { + ("equality" |: (a == b)) && + ("hash" |: (a.hashCode == b.hashCode)) + } else + ("equality" |: (a != b)) } \ No newline at end of file diff --git a/util/collection/src/test/scala/LiteralTest.scala b/util/collection/src/test/scala/LiteralTest.scala index 76fffe80a..35ef373ca 100644 --- a/util/collection/src/test/scala/LiteralTest.scala +++ b/util/collection/src/test/scala/LiteralTest.scala @@ -7,11 +7,11 @@ import Types._ // compilation test object LiteralTest { - def x[A[_],B[_]](f: A ~> B) = f + def x[A[_], B[_]](f: A ~> B) = f import Param._ - val f = x { (p: Param[Option,List]) => p.ret( p.in.toList ) } + val f = x { (p: Param[Option, List]) => p.ret(p.in.toList) } - val a: List[Int] = f( Some(3) ) - val b: List[String] = f( Some("aa") ) + val a: List[Int] = f(Some(3)) + val b: List[String] = f(Some("aa")) } \ No newline at end of file diff --git a/util/collection/src/test/scala/PMapTest.scala b/util/collection/src/test/scala/PMapTest.scala index 7970e175e..6a6c558c1 100644 --- a/util/collection/src/test/scala/PMapTest.scala +++ b/util/collection/src/test/scala/PMapTest.scala @@ -6,14 +6,13 @@ package sbt import Types._ // compilation test -object PMapTest -{ - val mp = new DelegatingPMap[Some, Id](new collection.mutable.HashMap) - mp(Some("asdf")) = "a" - mp(Some(3)) = 9 - val x = Some(3) :^: Some("asdf") :^: KNil - val y = x.transform[Id](mp) - assert(y.head == 
9) - assert(y.tail.head == "a") - assert(y.tail.tail == KNil) +object PMapTest { + val mp = new DelegatingPMap[Some, Id](new collection.mutable.HashMap) + mp(Some("asdf")) = "a" + mp(Some(3)) = 9 + val x = Some(3) :^: Some("asdf") :^: KNil + val y = x.transform[Id](mp) + assert(y.head == 9) + assert(y.tail.head == "a") + assert(y.tail.tail == KNil) } \ No newline at end of file diff --git a/util/collection/src/test/scala/SettingsExample.scala b/util/collection/src/test/scala/SettingsExample.scala index 9d863be31..b48bb27fc 100644 --- a/util/collection/src/test/scala/SettingsExample.scala +++ b/util/collection/src/test/scala/SettingsExample.scala @@ -10,78 +10,78 @@ final case class Scope(nestIndex: Int, idAtIndex: Int = 0) // Lots of type constructors would become binary, which as you may know requires lots of type lambdas // when you want a type function with only one parameter. // That would be a general pain.) -object SettingsExample extends Init[Scope] -{ - // Provides a way of showing a Scope+AttributeKey[_] - val showFullKey: Show[ScopedKey[_]] = new Show[ScopedKey[_]] { - def apply(key: ScopedKey[_]) = s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" - } +object SettingsExample extends Init[Scope] { + // Provides a way of showing a Scope+AttributeKey[_] + val showFullKey: Show[ScopedKey[_]] = new Show[ScopedKey[_]] { + def apply(key: ScopedKey[_]) = s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" + } - // A sample delegation function that delegates to a Scope with a lower index. - val delegates: Scope => Seq[Scope] = { case s @ Scope(index, proj) => - s +: (if(index <= 0) Nil else { (if (proj > 0) List(Scope(index)) else Nil) ++: delegates(Scope(index-1)) }) - } + // A sample delegation function that delegates to a Scope with a lower index. 
+ val delegates: Scope => Seq[Scope] = { + case s @ Scope(index, proj) => + s +: (if (index <= 0) Nil else { (if (proj > 0) List(Scope(index)) else Nil) ++: delegates(Scope(index - 1)) }) + } - // Not using this feature in this example. - val scopeLocal: ScopeLocal = _ => Nil + // Not using this feature in this example. + val scopeLocal: ScopeLocal = _ => Nil - // These three functions + a scope (here, Scope) are sufficient for defining our settings system. + // These three functions + a scope (here, Scope) are sufficient for defining our settings system. } /** Usage Example **/ -object SettingsUsage -{ - import SettingsExample._ - import Types._ +object SettingsUsage { + import SettingsExample._ + import Types._ - // Define some keys - val a = AttributeKey[Int]("a") - val b = AttributeKey[Int]("b") + // Define some keys + val a = AttributeKey[Int]("a") + val b = AttributeKey[Int]("b") - // Scope these keys - val a3 = ScopedKey(Scope(3), a) - val a4 = ScopedKey(Scope(4), a) - val a5 = ScopedKey(Scope(5), a) + // Scope these keys + val a3 = ScopedKey(Scope(3), a) + val a4 = ScopedKey(Scope(4), a) + val a5 = ScopedKey(Scope(5), a) - val b4 = ScopedKey(Scope(4), b) + val b4 = ScopedKey(Scope(4), b) - // Define some settings - val mySettings: Seq[Setting[_]] = Seq( - setting( a3, value( 3 ) ), - setting( b4, map(a4)(_ * 3)), - update(a5)(_ + 1) - ) + // Define some settings + val mySettings: Seq[Setting[_]] = Seq( + setting(a3, value(3)), + setting(b4, map(a4)(_ * 3)), + update(a5)(_ + 1) + ) - // "compiles" and applies the settings. - // This can be split into multiple steps to access intermediate results if desired. - // The 'inspect' command operates on the output of 'compile', for example. - val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal, showFullKey) + // "compiles" and applies the settings. + // This can be split into multiple steps to access intermediate results if desired. 
+ // The 'inspect' command operates on the output of 'compile', for example. + val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal, showFullKey) - // Show results. -/* for(i <- 0 to 5; k <- Seq(a, b)) { + // Show results. + /* for(i <- 0 to 5; k <- Seq(a, b)) { println( k.label + i + " = " + applied.get( Scope(i), k) ) }*/ -/** Output: -* For the None results, we never defined the value and there was no value to delegate to. -* For a3, we explicitly defined it to be 3. -* a4 wasn't defined, so it delegates to a3 according to our delegates function. -* b4 gets the value for a4 (which delegates to a3, so it is 3) and multiplies by 3 -* a5 is defined as the previous value of a5 + 1 and -* since no previous value of a5 was defined, it delegates to a4, resulting in 3+1=4. -* b5 isn't defined explicitly, so it delegates to b4 and is therefore equal to 9 as well -a0 = None -b0 = None -a1 = None -b1 = None -a2 = None -b2 = None -a3 = Some(3) -b3 = None -a4 = Some(3) -b4 = Some(9) -a5 = Some(4) -b5 = Some(9) -**/ + /** + * Output: + * For the None results, we never defined the value and there was no value to delegate to. + * For a3, we explicitly defined it to be 3. + * a4 wasn't defined, so it delegates to a3 according to our delegates function. + * b4 gets the value for a4 (which delegates to a3, so it is 3) and multiplies by 3 + * a5 is defined as the previous value of a5 + 1 and + * since no previous value of a5 was defined, it delegates to a4, resulting in 3+1=4. 
+ * b5 isn't defined explicitly, so it delegates to b4 and is therefore equal to 9 as well + * a0 = None + * b0 = None + * a1 = None + * b1 = None + * a2 = None + * b2 = None + * a3 = Some(3) + * b3 = None + * a4 = Some(3) + * b4 = Some(9) + * a5 = Some(4) + * b5 = Some(9) + */ } diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index 1bdea8f38..f8c99a735 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -5,175 +5,174 @@ import Prop._ import SettingsUsage._ import SettingsExample._ -object SettingsTest extends Properties("settings") -{ +object SettingsTest extends Properties("settings") { - import scala.reflect.Manifest + import scala.reflect.Manifest - final val ChainMax = 5000 - lazy val chainLengthGen = Gen.choose(1, ChainMax) + final val ChainMax = 5000 + lazy val chainLengthGen = Gen.choose(1, ChainMax) - property("Basic settings test") = secure( all( tests: _*) ) + property("Basic settings test") = secure(all(tests: _*)) - property("Basic chain") = forAll(chainLengthGen) { (i: Int) => - val abs = math.abs(i) - singleIntTest( chain( abs, value(0)), abs ) - } - property("Basic bind chain") = forAll(chainLengthGen) { (i: Int) => - val abs = math.abs(i) - singleIntTest( chainBind(value(abs)), 0 ) - } + property("Basic chain") = forAll(chainLengthGen) { (i: Int) => + val abs = math.abs(i) + singleIntTest(chain(abs, value(0)), abs) + } + property("Basic bind chain") = forAll(chainLengthGen) { (i: Int) => + val abs = math.abs(i) + singleIntTest(chainBind(value(abs)), 0) + } - property("Allows references to completed settings") = forAllNoShrink(30) { allowedReference } - final def allowedReference(intermediate: Int): Prop = - { - val top = value(intermediate) - def iterate(init: Initialize[Int]): Initialize[Int] = - bind(init) { t => - if(t <= 0) - top - else - iterate(value(t-1) ) - } - evaluate( setting(chk, iterate(top)) :: Nil); 
true - } + property("Allows references to completed settings") = forAllNoShrink(30) { allowedReference } + final def allowedReference(intermediate: Int): Prop = + { + val top = value(intermediate) + def iterate(init: Initialize[Int]): Initialize[Int] = + bind(init) { t => + if (t <= 0) + top + else + iterate(value(t - 1)) + } + evaluate(setting(chk, iterate(top)) :: Nil); true + } - property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } - final def derivedSettings(nr: Int): Prop = - { - val genScopedKeys = { - val attrKeys = mkAttrKeys[Int](nr) - attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) - } - forAll(genScopedKeys) { scopedKeys => - val last = scopedKeys.last - val derivedSettings: Seq[Setting[Int]] = ( - for { - List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 - nextInit = if (scoped0 == chk) chk - else (scoped0 zipWith chk) { (p, _) => p + 1 } - } yield derive(setting(scoped1, nextInit)) - ).toSeq + property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } + final def derivedSettings(nr: Int): Prop = + { + val genScopedKeys = { + val attrKeys = mkAttrKeys[Int](nr) + attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) + } + forAll(genScopedKeys) { scopedKeys => + val last = scopedKeys.last + val derivedSettings: Seq[Setting[Int]] = ( + for { + List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 + nextInit = if (scoped0 == chk) chk + else (scoped0 zipWith chk) { (p, _) => p + 1 } + } yield derive(setting(scoped1, nextInit)) + ).toSeq - { checkKey(last, Some(nr-1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" } && - { checkKey( last, None, evaluate(derivedSettings)) :| "Should not be derived" } - } - } + { checkKey(last, Some(nr - 1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" 
} && + { checkKey(last, None, evaluate(derivedSettings)) :| "Should not be derived" } + } + } - private def mkAttrKeys[T](nr: Int)(implicit mf: Manifest[T]): Gen[List[AttributeKey[T]]] = - { - val alphaStr = Gen.alphaStr - for { - list <- Gen.listOfN(nr, alphaStr) suchThat (l => l.size == l.distinct.size) - item <- list - } yield AttributeKey[T](item) - } + private def mkAttrKeys[T](nr: Int)(implicit mf: Manifest[T]): Gen[List[AttributeKey[T]]] = + { + val alphaStr = Gen.alphaStr + for { + list <- Gen.listOfN(nr, alphaStr) suchThat (l => l.size == l.distinct.size) + item <- list + } yield AttributeKey[T](item) + } - property("Derived setting(s) replace DerivedSetting in the Seq[Setting[_]]") = derivedKeepsPosition - final def derivedKeepsPosition: Prop = - { - val a: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("a")) - val b: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("b")) - val prop1 = { - val settings: Seq[Setting[_]] = Seq( - setting(a, value(3)), - setting(b, value(6)), - derive(setting(b, a)), - setting(a, value(5)), - setting(b, value(8)) - ) - val ev = evaluate(settings) - checkKey(a, Some(5), ev) && checkKey(b, Some(8), ev) - } - val prop2 = { - val settings: Seq[Setting[Int]] = Seq( - setting(a, value(3)), - setting(b, value(6)), - derive(setting(b, a)), - setting(a, value(5)) - ) - val ev = evaluate(settings) - checkKey(a, Some(5), ev) && checkKey(b, Some(5), ev) - } - prop1 && prop2 - } + property("Derived setting(s) replace DerivedSetting in the Seq[Setting[_]]") = derivedKeepsPosition + final def derivedKeepsPosition: Prop = + { + val a: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("a")) + val b: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("b")) + val prop1 = { + val settings: Seq[Setting[_]] = Seq( + setting(a, value(3)), + setting(b, value(6)), + derive(setting(b, a)), + setting(a, value(5)), + setting(b, value(8)) + ) + val ev = evaluate(settings) + checkKey(a, Some(5), ev) && checkKey(b, Some(8), ev) + } 
+ val prop2 = { + val settings: Seq[Setting[Int]] = Seq( + setting(a, value(3)), + setting(b, value(6)), + derive(setting(b, a)), + setting(a, value(5)) + ) + val ev = evaluate(settings) + checkKey(a, Some(5), ev) && checkKey(b, Some(5), ev) + } + prop1 && prop2 + } - property("DerivedSetting in ThisBuild scopes derived settings under projects thus allowing safe +=") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettingsScope } - final def derivedSettingsScope(nrProjects: Int): Prop = - { - forAll(mkAttrKeys[Int](2)) { case List(key, derivedKey) => - val projectKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), key) - val projectDerivedKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), derivedKey) - val globalKey = ScopedKey(Scope(0), key) - val globalDerivedKey = ScopedKey(Scope(0), derivedKey) - // Each project defines an initial value, but the update is defined in globalKey. - // However, the derived Settings that come from this should be scoped in each project. 
- val settings: Seq[Setting[_]] = - derive(setting(globalDerivedKey, SettingsExample.map(globalKey)(_ + 1))) +: projectKeys.map(pk => setting(pk, value(0))) - val ev = evaluate(settings) - // Also check that the key has no value at the "global" scope - val props = for { pk <- projectDerivedKeys } yield checkKey(pk, Some(1), ev) - checkKey(globalDerivedKey, None, ev) && Prop.all(props: _*) - } - } + property("DerivedSetting in ThisBuild scopes derived settings under projects thus allowing safe +=") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettingsScope } + final def derivedSettingsScope(nrProjects: Int): Prop = + { + forAll(mkAttrKeys[Int](2)) { + case List(key, derivedKey) => + val projectKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), key) + val projectDerivedKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), derivedKey) + val globalKey = ScopedKey(Scope(0), key) + val globalDerivedKey = ScopedKey(Scope(0), derivedKey) + // Each project defines an initial value, but the update is defined in globalKey. + // However, the derived Settings that come from this should be scoped in each project. + val settings: Seq[Setting[_]] = + derive(setting(globalDerivedKey, SettingsExample.map(globalKey)(_ + 1))) +: projectKeys.map(pk => setting(pk, value(0))) + val ev = evaluate(settings) + // Also check that the key has no value at the "global" scope + val props = for { pk <- projectDerivedKeys } yield checkKey(pk, Some(1), ev) + checkKey(globalDerivedKey, None, ev) && Prop.all(props: _*) + } + } -// Circular (dynamic) references currently loop infinitely. -// This is the expected behavior (detecting dynamic cycles is expensive), -// but it may be necessary to provide an option to detect them (with a performance hit) -// This would test that cycle detection. 
-// property("Catches circular references") = forAll(chainLengthGen) { checkCircularReferences _ } - final def checkCircularReferences(intermediate: Int): Prop = - { - val ccr = new CCR(intermediate) - try { evaluate( setting(chk, ccr.top) :: Nil); false } - catch { case e: java.lang.Exception => true } - } + // Circular (dynamic) references currently loop infinitely. + // This is the expected behavior (detecting dynamic cycles is expensive), + // but it may be necessary to provide an option to detect them (with a performance hit) + // This would test that cycle detection. + // property("Catches circular references") = forAll(chainLengthGen) { checkCircularReferences _ } + final def checkCircularReferences(intermediate: Int): Prop = + { + val ccr = new CCR(intermediate) + try { evaluate(setting(chk, ccr.top) :: Nil); false } + catch { case e: java.lang.Exception => true } + } - def tests = - for(i <- 0 to 5; k <- Seq(a, b)) yield { - val expected = expectedValues(2*i + (if(k == a) 0 else 1)) - checkKey[Int]( ScopedKey( Scope(i), k ), expected, applied) - } + def tests = + for (i <- 0 to 5; k <- Seq(a, b)) yield { + val expected = expectedValues(2 * i + (if (k == a) 0 else 1)) + checkKey[Int](ScopedKey(Scope(i), k), expected, applied) + } - lazy val expectedValues = None :: None :: None :: None :: None :: None :: Some(3) :: None :: Some(3) :: Some(9) :: Some(4) :: Some(9) :: Nil + lazy val expectedValues = None :: None :: None :: None :: None :: None :: Some(3) :: None :: Some(3) :: Some(9) :: Some(4) :: Some(9) :: Nil - lazy val ch = AttributeKey[Int]("ch") - lazy val chk = ScopedKey( Scope(0), ch) - def chain(i: Int, prev: Initialize[Int]): Initialize[Int] = - if(i <= 0) prev else chain(i - 1, prev(_ + 1)) + lazy val ch = AttributeKey[Int]("ch") + lazy val chk = ScopedKey(Scope(0), ch) + def chain(i: Int, prev: Initialize[Int]): Initialize[Int] = + if (i <= 0) prev else chain(i - 1, prev(_ + 1)) - def chainBind(prev: Initialize[Int]): Initialize[Int] = - 
bind(prev) { v => - if(v <= 0) prev else chainBind(value(v - 1) ) - } - def singleIntTest(i: Initialize[Int], expected: Int) = - { - val eval = evaluate( setting( chk, i ) :: Nil ) - checkKey( chk, Some(expected), eval ) - } + def chainBind(prev: Initialize[Int]): Initialize[Int] = + bind(prev) { v => + if (v <= 0) prev else chainBind(value(v - 1)) + } + def singleIntTest(i: Initialize[Int], expected: Int) = + { + val eval = evaluate(setting(chk, i) :: Nil) + checkKey(chk, Some(expected), eval) + } - def checkKey[T](key: ScopedKey[T], expected: Option[T], settings: Settings[Scope]) = - { - val value = settings.get( key.scope, key.key) - ("Key: " + key) |: - ("Value: " + value) |: - ("Expected: " + expected) |: - (value == expected) - } + def checkKey[T](key: ScopedKey[T], expected: Option[T], settings: Settings[Scope]) = + { + val value = settings.get(key.scope, key.key) + ("Key: " + key) |: + ("Value: " + value) |: + ("Expected: " + expected) |: + (value == expected) + } - def evaluate(settings: Seq[Setting[_]]): Settings[Scope] = - try { make(settings)(delegates, scopeLocal, showFullKey) } - catch { case e: Throwable => e.printStackTrace; throw e } + def evaluate(settings: Seq[Setting[_]]): Settings[Scope] = + try { make(settings)(delegates, scopeLocal, showFullKey) } + catch { case e: Throwable => e.printStackTrace; throw e } } // This setup is a workaround for module synchronization issues -final class CCR(intermediate: Int) -{ - lazy val top = iterate(value(intermediate), intermediate) - def iterate(init: Initialize[Int], i: Int): Initialize[Int] = - bind(init) { t => - if(t <= 0) - top - else - iterate(value(t - 1), t-1) - } +final class CCR(intermediate: Int) { + lazy val top = iterate(value(intermediate), intermediate) + def iterate(init: Initialize[Int], i: Int): Initialize[Int] = + bind(init) { t => + if (t <= 0) + top + else + iterate(value(t - 1), t - 1) + } } diff --git a/util/complete/src/test/scala/ParserTest.scala 
b/util/complete/src/test/scala/ParserTest.scala index 78ee28dc0..53d6cb1db 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -1,154 +1,148 @@ package sbt.complete -object JLineTest -{ - import DefaultParsers._ +object JLineTest { + import DefaultParsers._ - val one = "blue" | "green" | "black" - val two = token("color" ~> Space) ~> token(one) - val three = token("color" ~> Space) ~> token(ID.examples("blue", "green", "black")) - val four = token("color" ~> Space) ~> token(ID, "") + val one = "blue" | "green" | "black" + val two = token("color" ~> Space) ~> token(one) + val three = token("color" ~> Space) ~> token(ID.examples("blue", "green", "black")) + val four = token("color" ~> Space) ~> token(ID, "") - val num = token(NatBasic) - val five = (num ~ token("+" | "-") ~ num) <~ token('=') flatMap { - case a ~ "+" ~ b => token((a+b).toString) - case a ~ "-" ~ b => token((a-b).toString) - } + val num = token(NatBasic) + val five = (num ~ token("+" | "-") ~ num) <~ token('=') flatMap { + case a ~ "+" ~ b => token((a + b).toString) + case a ~ "-" ~ b => token((a - b).toString) + } - val parsers = Map("1" -> one, "2" -> two, "3" -> three, "4" -> four, "5" -> five) - def main(args: Array[String]) - { - import jline.TerminalFactory - import jline.console.ConsoleReader - val reader = new ConsoleReader() - TerminalFactory.get.init + val parsers = Map("1" -> one, "2" -> two, "3" -> three, "4" -> four, "5" -> five) + def main(args: Array[String]) { + import jline.TerminalFactory + import jline.console.ConsoleReader + val reader = new ConsoleReader() + TerminalFactory.get.init - val parser = parsers(args(0)) - JLineCompletion.installCustomCompletor(reader, parser) - def loop() { - val line = reader.readLine("> ") - if(line ne null) { - println("Result: " + apply(parser)(line).resultEmpty) - loop() - } - } - loop() - } + val parser = parsers(args(0)) + JLineCompletion.installCustomCompletor(reader, parser) + def 
loop() { + val line = reader.readLine("> ") + if (line ne null) { + println("Result: " + apply(parser)(line).resultEmpty) + loop() + } + } + loop() + } } - import Parser._ - import org.scalacheck._ +import Parser._ +import org.scalacheck._ -object ParserTest extends Properties("Completing Parser") -{ - import Parsers._ - import DefaultParsers.matches +object ParserTest extends Properties("Completing Parser") { + import Parsers._ + import DefaultParsers.matches - val nested = (token("a1") ~ token("b2")) ~ "c3" - val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3" + val nested = (token("a1") ~ token("b2")) ~ "c3" + val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3" - val spacePort = (token(Space) ~> Port) + val spacePort = (token(Space) ~> Port) - def p[T](f: T): T = { println(f); f } + def p[T](f: T): T = { println(f); f } - def checkSingle(in: String, expect: Completion)(expectDisplay: Completion = expect) = - ( ("token '" + in + "'") |: checkOne(in, nested, expect)) && - ( ("display '" + in + "'") |: checkOne(in, nestedDisplay, expectDisplay) ) - - def checkOne(in: String, parser: Parser[_], expect: Completion): Prop = - completions(parser, in, 1) == Completions.single(expect) + def checkSingle(in: String, expect: Completion)(expectDisplay: Completion = expect) = + (("token '" + in + "'") |: checkOne(in, nested, expect)) && + (("display '" + in + "'") |: checkOne(in, nestedDisplay, expectDisplay)) - def checkAll(in: String, parser: Parser[_], expect: Completions): Prop = - { - val cs = completions(parser, in, 1) - ("completions: " + cs) |: ("Expected: " + expect) |: ( (cs == expect): Prop) - } - - def checkInvalid(in: String) = - ( ("token '" + in + "'") |: checkInv(in, nested) ) && - ( ("display '" + in + "'") |: checkInv(in, nestedDisplay) ) + def checkOne(in: String, parser: Parser[_], expect: Completion): Prop = + completions(parser, in, 1) == Completions.single(expect) - def checkInv(in: String, parser: Parser[_]): Prop = - { - val cs = 
completions(parser, in, 1) - ("completions: " + cs) |: (( cs == Completions.nil): Prop) - } - - property("nested tokens a") = checkSingle("", Completion.tokenStrict("","a1") )( Completion.displayStrict("")) - property("nested tokens a1") = checkSingle("a", Completion.tokenStrict("a","1") )( Completion.displayStrict("")) - property("nested tokens a inv") = checkInvalid("b") - property("nested tokens b") = checkSingle("a1", Completion.tokenStrict("","b2") )( Completion.displayStrict("")) - property("nested tokens b2") = checkSingle("a1b", Completion.tokenStrict("b","2") )( Completion.displayStrict("")) - property("nested tokens b inv") = checkInvalid("a1a") - property("nested tokens c") = checkSingle("a1b2", Completion.suggestStrict("c3") )() - property("nested tokens c3") = checkSingle("a1b2c", Completion.suggestStrict("3"))() - property("nested tokens c inv") = checkInvalid("a1b2a") + def checkAll(in: String, parser: Parser[_], expect: Completions): Prop = + { + val cs = completions(parser, in, 1) + ("completions: " + cs) |: ("Expected: " + expect) |: ((cs == expect): Prop) + } - property("suggest space") = checkOne("", spacePort, Completion.tokenStrict("", " ")) - property("suggest port") = checkOne(" ", spacePort, Completion.displayStrict("") ) - property("no suggest at end") = checkOne("asdf", "asdf", Completion.suggestStrict("")) - property("no suggest at token end") = checkOne("asdf", token("asdf"), Completion.suggestStrict("")) - property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestStrict("")) - property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestStrict("")) + def checkInvalid(in: String) = + (("token '" + in + "'") |: checkInv(in, nested)) && + (("display '" + in + "'") |: checkInv(in, nestedDisplay)) - val colors = Set("blue", "green", "red") - val base = (seen: Seq[String]) => token( ID examples (colors -- seen) ) - val sep = token( 
Space ) - val repeat = repeatDep( base, sep) - def completionStrings(ss: Set[String]): Completions = Completions(ss.map { s => Completion.tokenStrict("", s) }) + def checkInv(in: String, parser: Parser[_]): Prop = + { + val cs = completions(parser, in, 1) + ("completions: " + cs) |: ((cs == Completions.nil): Prop) + } - property("repeatDep no suggestions for bad input") = checkInv(".", repeat) - property("repeatDep suggest all") = checkAll("", repeat, completionStrings(colors)) - property("repeatDep suggest remaining two") = { - val first = colors.toSeq.head - checkAll(first + " ", repeat, completionStrings(colors - first)) - } - property("repeatDep suggest remaining one") = { - val take = colors.toSeq.take(2) - checkAll(take.mkString("", " ", " "), repeat, completionStrings(colors -- take)) - } - property("repeatDep requires at least one token") = !matches(repeat, "") - property("repeatDep accepts one token") = matches(repeat, colors.toSeq.head) - property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) + property("nested tokens a") = checkSingle("", Completion.tokenStrict("", "a1"))(Completion.displayStrict("")) + property("nested tokens a1") = checkSingle("a", Completion.tokenStrict("a", "1"))(Completion.displayStrict("")) + property("nested tokens a inv") = checkInvalid("b") + property("nested tokens b") = checkSingle("a1", Completion.tokenStrict("", "b2"))(Completion.displayStrict("")) + property("nested tokens b2") = checkSingle("a1b", Completion.tokenStrict("b", "2"))(Completion.displayStrict("")) + property("nested tokens b inv") = checkInvalid("a1a") + property("nested tokens c") = checkSingle("a1b2", Completion.suggestStrict("c3"))() + property("nested tokens c3") = checkSingle("a1b2c", Completion.suggestStrict("3"))() + property("nested tokens c inv") = checkInvalid("a1b2a") + + property("suggest space") = checkOne("", spacePort, Completion.tokenStrict("", " ")) + property("suggest port") = checkOne(" ", spacePort, 
Completion.displayStrict("")) + property("no suggest at end") = checkOne("asdf", "asdf", Completion.suggestStrict("")) + property("no suggest at token end") = checkOne("asdf", token("asdf"), Completion.suggestStrict("")) + property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestStrict("")) + property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestStrict("")) + + val colors = Set("blue", "green", "red") + val base = (seen: Seq[String]) => token(ID examples (colors -- seen)) + val sep = token(Space) + val repeat = repeatDep(base, sep) + def completionStrings(ss: Set[String]): Completions = Completions(ss.map { s => Completion.tokenStrict("", s) }) + + property("repeatDep no suggestions for bad input") = checkInv(".", repeat) + property("repeatDep suggest all") = checkAll("", repeat, completionStrings(colors)) + property("repeatDep suggest remaining two") = { + val first = colors.toSeq.head + checkAll(first + " ", repeat, completionStrings(colors - first)) + } + property("repeatDep suggest remaining one") = { + val take = colors.toSeq.take(2) + checkAll(take.mkString("", " ", " "), repeat, completionStrings(colors -- take)) + } + property("repeatDep requires at least one token") = !matches(repeat, "") + property("repeatDep accepts one token") = matches(repeat, colors.toSeq.head) + property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) } -object ParserExample -{ - val ws = charClass(_.isWhitespace)+ - val notws = charClass(!_.isWhitespace)+ +object ParserExample { + val ws = charClass(_.isWhitespace)+ + val notws = charClass(!_.isWhitespace)+ - val name = token("test") - val options = (ws ~> token("quick" | "failed" | "new") )* - val exampleSet = Set("am", "is", "are", "was", "were") - val include = (ws ~> token(examples(notws.string, new FixedSetExamples(exampleSet), exampleSet.size, false )) )* + val name = 
token("test") + val options = (ws ~> token("quick" | "failed" | "new"))* + val exampleSet = Set("am", "is", "are", "was", "were") + val include = (ws ~> token(examples(notws.string, new FixedSetExamples(exampleSet), exampleSet.size, false)))* - val t = name ~ options ~ include + val t = name ~ options ~ include - // Get completions for some different inputs - println(completions(t, "te", 1)) - println(completions(t, "test ",1)) - println(completions(t, "test w", 1)) + // Get completions for some different inputs + println(completions(t, "te", 1)) + println(completions(t, "test ", 1)) + println(completions(t, "test w", 1)) - // Get the parsed result for different inputs - println(apply(t)("te").resultEmpty) - println(apply(t)("test").resultEmpty) - println(apply(t)("test w").resultEmpty) - println(apply(t)("test was were").resultEmpty) + // Get the parsed result for different inputs + println(apply(t)("te").resultEmpty) + println(apply(t)("test").resultEmpty) + println(apply(t)("test w").resultEmpty) + println(apply(t)("test was were").resultEmpty) - def run(n: Int) - { - val a = 'a'.id - val aq = a.? - val aqn = repeat(aq, min = n, max = n) - val an = repeat(a, min = n, max = n) - val ann = aqn ~ an + def run(n: Int) { + val a = 'a'.id + val aq = a.? 
+ val aqn = repeat(aq, min = n, max = n) + val an = repeat(a, min = n, max = n) + val ann = aqn ~ an - def r = apply(ann)("a"*(n*2)).resultEmpty - println(r.isValid) - } - def run2(n: Int) - { - val ab = "ab".?.* - val r = apply(ab)("a"*n).resultEmpty - println(r) - } + def r = apply(ann)("a" * (n * 2)).resultEmpty + println(r.isValid) + } + def run2(n: Int) { + val ab = "ab".?.* + val r = apply(ab)("a" * n).resultEmpty + println(r) + } } \ No newline at end of file diff --git a/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala index 08c9a5884..03b495bf0 100644 --- a/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala +++ b/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala @@ -6,87 +6,85 @@ import sbt.IO.withTemporaryDirectory import java.io.File import sbt.IO._ -class FileExamplesTest extends Specification -{ +class FileExamplesTest extends Specification { - "listing all files in an absolute base directory" should { - "produce the entire base directory's contents" in new directoryStructure { - fileExamples().toList should containTheSameElementsAs(allRelativizedPaths) - } - } + "listing all files in an absolute base directory" should { + "produce the entire base directory's contents" in new directoryStructure { + fileExamples().toList should containTheSameElementsAs(allRelativizedPaths) + } + } - "listing files with a prefix that matches none" should { - "produce an empty list" in new directoryStructure(withCompletionPrefix = "z") { - fileExamples().toList should beEmpty - } - } + "listing files with a prefix that matches none" should { + "produce an empty list" in new directoryStructure(withCompletionPrefix = "z") { + fileExamples().toList should beEmpty + } + } - "listing single-character prefixed files" should { - "produce matching paths only" in new directoryStructure(withCompletionPrefix = "f") { - fileExamples().toList should 
containTheSameElementsAs(prefixedPathsOnly) - } - } + "listing single-character prefixed files" should { + "produce matching paths only" in new directoryStructure(withCompletionPrefix = "f") { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + } - "listing directory-prefixed files" should { - "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far") { - fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) - } + "listing directory-prefixed files" should { + "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far") { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } - "produce sub-dir contents only when appending a file separator to the directory" in new directoryStructure(withCompletionPrefix = "far" + File.separator) { - fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) - } - } + "produce sub-dir contents only when appending a file separator to the directory" in new directoryStructure(withCompletionPrefix = "far" + File.separator) { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + } - "listing files with a sub-path prefix" should { - "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far" + File.separator + "ba") { - fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) - } - } + "listing files with a sub-path prefix" should { + "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far" + File.separator + "ba") { + fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly) + } + } - "completing a full path" should { - "produce a list with an empty string" in new directoryStructure(withCompletionPrefix = "bazaar") { - fileExamples().toList shouldEqual List("") - } - } + "completing a full path" should { + "produce a list with an empty string" in new directoryStructure(withCompletionPrefix = 
"bazaar") { + fileExamples().toList shouldEqual List("") + } + } - class directoryStructure(withCompletionPrefix: String = "") extends Scope with DelayedInit - { - var fileExamples: FileExamples = _ - var baseDir: File = _ - var childFiles: List[File] = _ - var childDirectories: List[File] = _ - var nestedFiles: List[File] = _ - var nestedDirectories: List[File] = _ + class directoryStructure(withCompletionPrefix: String = "") extends Scope with DelayedInit { + var fileExamples: FileExamples = _ + var baseDir: File = _ + var childFiles: List[File] = _ + var childDirectories: List[File] = _ + var nestedFiles: List[File] = _ + var nestedDirectories: List[File] = _ - def allRelativizedPaths: List[String] = - (childFiles ++ childDirectories ++ nestedFiles ++ nestedDirectories).map(relativize(baseDir, _).get) + def allRelativizedPaths: List[String] = + (childFiles ++ childDirectories ++ nestedFiles ++ nestedDirectories).map(relativize(baseDir, _).get) - def prefixedPathsOnly: List[String] = - allRelativizedPaths.filter(_ startsWith withCompletionPrefix).map(_ substring withCompletionPrefix.length) + def prefixedPathsOnly: List[String] = + allRelativizedPaths.filter(_ startsWith withCompletionPrefix).map(_ substring withCompletionPrefix.length) - override def delayedInit(testBody: => Unit): Unit = { - withTemporaryDirectory { - tempDir => - createSampleDirStructure(tempDir) - fileExamples = new FileExamples(baseDir, withCompletionPrefix) - testBody - } - } + override def delayedInit(testBody: => Unit): Unit = { + withTemporaryDirectory { + tempDir => + createSampleDirStructure(tempDir) + fileExamples = new FileExamples(baseDir, withCompletionPrefix) + testBody + } + } - private def createSampleDirStructure(tempDir: File): Unit = { - childFiles = toChildFiles(tempDir, List("foo", "bar", "bazaar")) - childDirectories = toChildFiles(tempDir, List("moo", "far")) - nestedFiles = toChildFiles(childDirectories(1), List("farfile1", "barfile2")) - nestedDirectories = 
toChildFiles(childDirectories(1), List("fardir1", "bardir2")) + private def createSampleDirStructure(tempDir: File): Unit = { + childFiles = toChildFiles(tempDir, List("foo", "bar", "bazaar")) + childDirectories = toChildFiles(tempDir, List("moo", "far")) + nestedFiles = toChildFiles(childDirectories(1), List("farfile1", "barfile2")) + nestedDirectories = toChildFiles(childDirectories(1), List("fardir1", "bardir2")) - (childDirectories ++ nestedDirectories).map(_.mkdirs()) - (childFiles ++ nestedFiles).map(_.createNewFile()) + (childDirectories ++ nestedDirectories).map(_.mkdirs()) + (childFiles ++ nestedFiles).map(_.createNewFile()) - // NOTE: Creating a new file here because `tempDir.listFiles()` returned an empty list. - baseDir = new File(tempDir.getCanonicalPath) - } + // NOTE: Creating a new file here because `tempDir.listFiles()` returned an empty list. + baseDir = new File(tempDir.getCanonicalPath) + } - private def toChildFiles(baseDir: File, files: List[String]): List[File] = files.map(new File(baseDir, _)) - } + private def toChildFiles(baseDir: File, files: List[String]): List[File] = files.map(new File(baseDir, _)) + } } diff --git a/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala index b9a5b2de2..b5aa14250 100644 --- a/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala +++ b/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala @@ -5,22 +5,22 @@ import org.specs2.specification.Scope class FixedSetExamplesTest extends Specification { - "adding a prefix" should { - "produce a smaller set of examples with the prefix removed" in new examples { - fixedSetExamples.withAddedPrefix("f")() must containTheSameElementsAs(List("oo", "ool", "u")) - fixedSetExamples.withAddedPrefix("fo")() must containTheSameElementsAs(List("o", "ol")) - fixedSetExamples.withAddedPrefix("b")() must containTheSameElementsAs(List("ar")) - } - } + "adding a 
prefix" should { + "produce a smaller set of examples with the prefix removed" in new examples { + fixedSetExamples.withAddedPrefix("f")() must containTheSameElementsAs(List("oo", "ool", "u")) + fixedSetExamples.withAddedPrefix("fo")() must containTheSameElementsAs(List("o", "ol")) + fixedSetExamples.withAddedPrefix("b")() must containTheSameElementsAs(List("ar")) + } + } - "without a prefix" should { - "produce the original set" in new examples { - fixedSetExamples() mustEqual exampleSet - } - } + "without a prefix" should { + "produce the original set" in new examples { + fixedSetExamples() mustEqual exampleSet + } + } - trait examples extends Scope { - val exampleSet = List("foo", "bar", "fool", "fu") - val fixedSetExamples = FixedSetExamples(exampleSet) - } + trait examples extends Scope { + val exampleSet = List("foo", "bar", "fool", "fu") + val fixedSetExamples = FixedSetExamples(exampleSet) + } } diff --git a/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala index 1151e1b0d..dff68803c 100644 --- a/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala +++ b/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala @@ -6,88 +6,88 @@ import Completion._ class ParserWithExamplesTest extends Specification { - "listing a limited number of completions" should { - "grab only the needed number of elements from the iterable source of examples" in new parserWithLazyExamples { - parserWithExamples.completions(0) - examples.size shouldEqual maxNumberOfExamples - } - } + "listing a limited number of completions" should { + "grab only the needed number of elements from the iterable source of examples" in new parserWithLazyExamples { + parserWithExamples.completions(0) + examples.size shouldEqual maxNumberOfExamples + } + } - "listing only valid completions" should { - "use the delegate parser to remove invalid examples" in new parserWithValidExamples 
{ - val validCompletions = Completions(Set( - suggestion("blue"), - suggestion("red") - )) - parserWithExamples.completions(0) shouldEqual validCompletions - } - } + "listing only valid completions" should { + "use the delegate parser to remove invalid examples" in new parserWithValidExamples { + val validCompletions = Completions(Set( + suggestion("blue"), + suggestion("red") + )) + parserWithExamples.completions(0) shouldEqual validCompletions + } + } - "listing valid completions in a derived parser" should { - "produce only valid examples that start with the character of the derivation" in new parserWithValidExamples { - val derivedCompletions = Completions(Set( - suggestion("lue") - )) - parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions - } - } + "listing valid completions in a derived parser" should { + "produce only valid examples that start with the character of the derivation" in new parserWithValidExamples { + val derivedCompletions = Completions(Set( + suggestion("lue") + )) + parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions + } + } - "listing valid and invalid completions" should { - "produce the entire source of examples" in new parserWithAllExamples { - val completions = Completions(examples.map(suggestion(_)).toSet) - parserWithExamples.completions(0) shouldEqual completions - } - } + "listing valid and invalid completions" should { + "produce the entire source of examples" in new parserWithAllExamples { + val completions = Completions(examples.map(suggestion(_)).toSet) + parserWithExamples.completions(0) shouldEqual completions + } + } - "listing valid and invalid completions in a derived parser" should { - "produce only examples that start with the character of the derivation" in new parserWithAllExamples { - val derivedCompletions = Completions(Set( - suggestion("lue"), - suggestion("lock") - )) - parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions - } - } + "listing 
valid and invalid completions in a derived parser" should { + "produce only examples that start with the character of the derivation" in new parserWithAllExamples { + val derivedCompletions = Completions(Set( + suggestion("lue"), + suggestion("lock") + )) + parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions + } + } - class parserWithLazyExamples extends parser(GrowableSourceOfExamples(), maxNumberOfExamples = 5, removeInvalidExamples = false) + class parserWithLazyExamples extends parser(GrowableSourceOfExamples(), maxNumberOfExamples = 5, removeInvalidExamples = false) - class parserWithValidExamples extends parser(removeInvalidExamples = true) + class parserWithValidExamples extends parser(removeInvalidExamples = true) - class parserWithAllExamples extends parser(removeInvalidExamples = false) + class parserWithAllExamples extends parser(removeInvalidExamples = false) - case class parser(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), - maxNumberOfExamples: Int = 25, - removeInvalidExamples: Boolean) extends Scope { + case class parser(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), + maxNumberOfExamples: Int = 25, + removeInvalidExamples: Boolean) extends Scope { - import DefaultParsers._ + import DefaultParsers._ - val colorParser = "blue" | "green" | "black" | "red" - val parserWithExamples: Parser[String] = new ParserWithExamples[String]( - colorParser, - FixedSetExamples(examples), - maxNumberOfExamples, - removeInvalidExamples - ) - } + val colorParser = "blue" | "green" | "black" | "red" + val parserWithExamples: Parser[String] = new ParserWithExamples[String]( + colorParser, + FixedSetExamples(examples), + maxNumberOfExamples, + removeInvalidExamples + ) + } - case class GrowableSourceOfExamples() extends Iterable[String] { - private var numberOfIteratedElements: Int = 0 + case class GrowableSourceOfExamples() extends Iterable[String] { + private var 
numberOfIteratedElements: Int = 0 - override def iterator: Iterator[String] = { - new Iterator[String] { - var currentElement = 0 + override def iterator: Iterator[String] = { + new Iterator[String] { + var currentElement = 0 - override def next(): String = { - currentElement += 1 - numberOfIteratedElements = Math.max(currentElement, numberOfIteratedElements) - numberOfIteratedElements.toString - } + override def next(): String = { + currentElement += 1 + numberOfIteratedElements = Math.max(currentElement, numberOfIteratedElements) + numberOfIteratedElements.toString + } - override def hasNext: Boolean = true - } - } + override def hasNext: Boolean = true + } + } - override def size: Int = numberOfIteratedElements - } + override def size: Int = numberOfIteratedElements + } } diff --git a/util/log/src/test/scala/Escapes.scala b/util/log/src/test/scala/Escapes.scala index f90499574..f780d25bf 100644 --- a/util/log/src/test/scala/Escapes.scala +++ b/util/log/src/test/scala/Escapes.scala @@ -2,90 +2,85 @@ package sbt import org.scalacheck._ import Prop._ -import Gen.{listOf, oneOf} +import Gen.{ listOf, oneOf } -import ConsoleLogger.{ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences} +import ConsoleLogger.{ ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences } -object Escapes extends Properties("Escapes") -{ - property("genTerminator only generates terminators") = - forAllNoShrink(genTerminator) { (c: Char) => isEscapeTerminator(c) } +object Escapes extends Properties("Escapes") { + property("genTerminator only generates terminators") = + forAllNoShrink(genTerminator) { (c: Char) => isEscapeTerminator(c) } - property("genWithoutTerminator only generates terminators") = - forAllNoShrink(genWithoutTerminator) { (s: String) => - s.forall { c => !isEscapeTerminator(c) } - } + property("genWithoutTerminator only generates terminators") = + forAllNoShrink(genWithoutTerminator) { (s: String) => + s.forall { c => !isEscapeTerminator(c) } + } - 
property("hasEscapeSequence is false when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => - !hasEscapeSequence(s) - } + property("hasEscapeSequence is false when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => + !hasEscapeSequence(s) + } - property("hasEscapeSequence is true when escape character is present") = forAllNoShrink(genWithRandomEscapes) { (s: String) => - hasEscapeSequence(s) - } + property("hasEscapeSequence is true when escape character is present") = forAllNoShrink(genWithRandomEscapes) { (s: String) => + hasEscapeSequence(s) + } - property("removeEscapeSequences is the identity when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => - val removed: String = removeEscapeSequences(s) - ("Escape sequence removed: '" + removed + "'") |: - (removed == s) - } + property("removeEscapeSequences is the identity when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => + val removed: String = removeEscapeSequences(s) + ("Escape sequence removed: '" + removed + "'") |: + (removed == s) + } - property("No escape characters remain after removeEscapeSequences") = forAll { (s: String) => - val removed: String = removeEscapeSequences(s) - ("Escape sequence removed: '" + removed + "'") |: - !hasEscapeSequence(removed) - } + property("No escape characters remain after removeEscapeSequences") = forAll { (s: String) => + val removed: String = removeEscapeSequences(s) + ("Escape sequence removed: '" + removed + "'") |: + !hasEscapeSequence(removed) + } - property("removeEscapeSequences returns string without escape sequences") = - forAllNoShrink( genWithoutEscape, genEscapePairs ) { (start: String, escapes: List[EscapeAndNot]) => - val withEscapes: String = start + escapes.map { ean => ean.escape.makeString + ean.notEscape } - val removed: String = removeEscapeSequences(withEscapes) - val original = start + escapes.map(_.notEscape) - 
("Input string with escapes: '" + withEscapes + "'") |: - ("Escapes removed '" + removed + "'") |: - (original == removed) - } + property("removeEscapeSequences returns string without escape sequences") = + forAllNoShrink(genWithoutEscape, genEscapePairs) { (start: String, escapes: List[EscapeAndNot]) => + val withEscapes: String = start + escapes.map { ean => ean.escape.makeString + ean.notEscape } + val removed: String = removeEscapeSequences(withEscapes) + val original = start + escapes.map(_.notEscape) + ("Input string with escapes: '" + withEscapes + "'") |: + ("Escapes removed '" + removed + "'") |: + (original == removed) + } - final case class EscapeAndNot(escape: EscapeSequence, notEscape: String) - final case class EscapeSequence(content: String, terminator: Char) - { - assert( content.forall(c => !isEscapeTerminator(c) ), "Escape sequence content contains an escape terminator: '" + content + "'" ) - assert( isEscapeTerminator(terminator) ) - def makeString: String = ESC + content + terminator - } - private[this] def noEscape(s: String): String = s.replace(ESC, ' ') + final case class EscapeAndNot(escape: EscapeSequence, notEscape: String) + final case class EscapeSequence(content: String, terminator: Char) { + assert(content.forall(c => !isEscapeTerminator(c)), "Escape sequence content contains an escape terminator: '" + content + "'") + assert(isEscapeTerminator(terminator)) + def makeString: String = ESC + content + terminator + } + private[this] def noEscape(s: String): String = s.replace(ESC, ' ') - lazy val genEscapeSequence: Gen[EscapeSequence] = oneOf(genKnownSequence, genArbitraryEscapeSequence) - lazy val genEscapePair: Gen[EscapeAndNot] = for(esc <- genEscapeSequence; not <- genWithoutEscape) yield EscapeAndNot(esc, not) - lazy val genEscapePairs: Gen[List[EscapeAndNot]] = listOf(genEscapePair) + lazy val genEscapeSequence: Gen[EscapeSequence] = oneOf(genKnownSequence, genArbitraryEscapeSequence) + lazy val genEscapePair: Gen[EscapeAndNot] = 
for (esc <- genEscapeSequence; not <- genWithoutEscape) yield EscapeAndNot(esc, not) + lazy val genEscapePairs: Gen[List[EscapeAndNot]] = listOf(genEscapePair) - lazy val genArbitraryEscapeSequence: Gen[EscapeSequence] = - for(content <- genWithoutTerminator; term <- genTerminator) yield - new EscapeSequence(content, term) - - lazy val genKnownSequence: Gen[EscapeSequence] = - oneOf((misc ++ setGraphicsMode ++ setMode ++ resetMode).map(toEscapeSequence)) - - def toEscapeSequence(s: String): EscapeSequence = EscapeSequence(s.init, s.last) + lazy val genArbitraryEscapeSequence: Gen[EscapeSequence] = + for (content <- genWithoutTerminator; term <- genTerminator) yield new EscapeSequence(content, term) - lazy val misc = Seq("14;23H", "5;3f", "2A", "94B", "19C", "85D", "s", "u", "2J", "K") + lazy val genKnownSequence: Gen[EscapeSequence] = + oneOf((misc ++ setGraphicsMode ++ setMode ++ resetMode).map(toEscapeSequence)) - lazy val setGraphicsMode: Seq[String] = - for(txt <- 0 to 8; fg <- 30 to 37; bg <- 40 to 47) yield - txt.toString + ";" + fg.toString + ";" + bg.toString + "m" + def toEscapeSequence(s: String): EscapeSequence = EscapeSequence(s.init, s.last) - lazy val resetMode = setModeLike('I') - lazy val setMode = setModeLike('h') - def setModeLike(term: Char): Seq[String] = (0 to 19).map(i => "=" + i.toString + term) - - lazy val genWithoutTerminator = genRawString.map( _.filter { c => !isEscapeTerminator(c) } ) + lazy val misc = Seq("14;23H", "5;3f", "2A", "94B", "19C", "85D", "s", "u", "2J", "K") - lazy val genTerminator: Gen[Char] = Gen.choose('@', '~') - lazy val genWithoutEscape: Gen[String] = genRawString.map(noEscape) + lazy val setGraphicsMode: Seq[String] = + for (txt <- 0 to 8; fg <- 30 to 37; bg <- 40 to 47) yield txt.toString + ";" + fg.toString + ";" + bg.toString + "m" - def genWithRandomEscapes: Gen[String] = - for(ls <- listOf(genRawString); end <- genRawString) yield - ls.mkString("", ESC.toString, ESC.toString + end) + lazy val resetMode = 
setModeLike('I') + lazy val setMode = setModeLike('h') + def setModeLike(term: Char): Seq[String] = (0 to 19).map(i => "=" + i.toString + term) - private def genRawString = Arbitrary.arbString.arbitrary + lazy val genWithoutTerminator = genRawString.map(_.filter { c => !isEscapeTerminator(c) }) + + lazy val genTerminator: Gen[Char] = Gen.choose('@', '~') + lazy val genWithoutEscape: Gen[String] = genRawString.map(noEscape) + + def genWithRandomEscapes: Gen[String] = + for (ls <- listOf(genRawString); end <- genRawString) yield ls.mkString("", ESC.toString, ESC.toString + end) + + private def genRawString = Arbitrary.arbString.arbitrary } diff --git a/util/log/src/test/scala/LogWriterTest.scala b/util/log/src/test/scala/LogWriterTest.scala index 95736d524..d51919ad7 100644 --- a/util/log/src/test/scala/LogWriterTest.scala +++ b/util/log/src/test/scala/LogWriterTest.scala @@ -4,157 +4,147 @@ package sbt import org.scalacheck._ -import Arbitrary.{arbitrary => arb, _} -import Gen.{listOfN, oneOf} +import Arbitrary.{ arbitrary => arb, _ } +import Gen.{ listOfN, oneOf } import Prop._ import java.io.Writer -object LogWriterTest extends Properties("Log Writer") -{ - final val MaxLines = 100 - final val MaxSegments = 10 +object LogWriterTest extends Properties("Log Writer") { + final val MaxLines = 100 + final val MaxSegments = 10 - /* Tests that content written through a LoggerWriter is properly passed to the underlying Logger. + /* Tests that content written through a LoggerWriter is properly passed to the underlying Logger. * Each line, determined by the specified newline separator, must be logged at the correct logging level. 
*/ - property("properly logged") = forAll { (output: Output, newLine: NewLine) => - import output.{lines, level} - val log = new RecordingLogger - val writer = new LoggerWriter(log, Some(level), newLine.str) - logLines(writer, lines, newLine.str) - val events = log.getEvents - ("Recorded:\n" + events.map(show).mkString("\n")) |: - check( toLines(lines), events, level) - } - - /** Displays a LogEvent in a useful format for debugging. In particular, we are only interested in `Log` types - * and non-printable characters should be escaped*/ - def show(event: LogEvent): String = - event match - { - case l: Log => "Log('" + Escape(l.msg) + "', " + l.level + ")" - case _ => "Not Log" - } - /** Writes the given lines to the Writer. `lines` is taken to be a list of lines, which are - * represented as separately written segments (ToLog instances). ToLog.`byCharacter` - * indicates whether to write the segment by character (true) or all at once (false)*/ - def logLines(writer: Writer, lines: List[List[ToLog]], newLine: String) - { - for(line <- lines; section <- line) - { - val content = section.content - val normalized = Escape.newline(content, newLine) - if(section.byCharacter) - normalized.foreach { c => writer.write(c.toInt) } - else - writer.write(normalized) - } - writer.flush() - } - - /** Converts the given lines in segments to lines as Strings for checking the results of the test.*/ - def toLines(lines: List[List[ToLog]]): List[String] = - lines.map(_.map(_.contentOnly).mkString) - /** Checks that the expected `lines` were recorded as `events` at level `Lvl`.*/ - def check(lines: List[String], events: List[LogEvent], Lvl: Level.Value): Boolean = - (lines zip events) forall { - case (line, log : Log) => log.level == Lvl && line == log.msg - case _ => false - } - - /* The following are implicit generators to build up a write sequence. 
+ property("properly logged") = forAll { (output: Output, newLine: NewLine) => + import output.{ lines, level } + val log = new RecordingLogger + val writer = new LoggerWriter(log, Some(level), newLine.str) + logLines(writer, lines, newLine.str) + val events = log.getEvents + ("Recorded:\n" + events.map(show).mkString("\n")) |: + check(toLines(lines), events, level) + } + + /** + * Displays a LogEvent in a useful format for debugging. In particular, we are only interested in `Log` types + * and non-printable characters should be escaped + */ + def show(event: LogEvent): String = + event match { + case l: Log => "Log('" + Escape(l.msg) + "', " + l.level + ")" + case _ => "Not Log" + } + /** + * Writes the given lines to the Writer. `lines` is taken to be a list of lines, which are + * represented as separately written segments (ToLog instances). ToLog.`byCharacter` + * indicates whether to write the segment by character (true) or all at once (false) + */ + def logLines(writer: Writer, lines: List[List[ToLog]], newLine: String) { + for (line <- lines; section <- line) { + val content = section.content + val normalized = Escape.newline(content, newLine) + if (section.byCharacter) + normalized.foreach { c => writer.write(c.toInt) } + else + writer.write(normalized) + } + writer.flush() + } + + /** Converts the given lines in segments to lines as Strings for checking the results of the test.*/ + def toLines(lines: List[List[ToLog]]): List[String] = + lines.map(_.map(_.contentOnly).mkString) + /** Checks that the expected `lines` were recorded as `events` at level `Lvl`.*/ + def check(lines: List[String], events: List[LogEvent], Lvl: Level.Value): Boolean = + (lines zip events) forall { + case (line, log: Log) => log.level == Lvl && line == log.msg + case _ => false + } + + /* The following are implicit generators to build up a write sequence. * ToLog represents a written segment. NewLine represents one of the possible * newline separators. 
A List[ToLog] represents a full line and always includes a * final ToLog with a trailing '\n'. Newline characters are otherwise not present in * the `content` of a ToLog instance.*/ - - implicit lazy val arbOut: Arbitrary[Output] = Arbitrary(genOutput) - implicit lazy val arbLog: Arbitrary[ToLog] = Arbitrary(genLog) - implicit lazy val arbLine: Arbitrary[List[ToLog]] = Arbitrary(genLine) - implicit lazy val arbNewLine: Arbitrary[NewLine] = Arbitrary(genNewLine) - implicit lazy val arbLevel : Arbitrary[Level.Value] = Arbitrary(genLevel) - - implicit def genLine(implicit logG: Gen[ToLog]): Gen[List[ToLog]] = - for(l <- listOf[ToLog](MaxSegments); last <- logG) yield - (addNewline(last) :: l.filter(!_.content.isEmpty)).reverse - implicit def genLog(implicit content: Arbitrary[String], byChar: Arbitrary[Boolean]): Gen[ToLog] = - for(c <- content.arbitrary; by <- byChar.arbitrary) yield - { - assert(c != null) - new ToLog(removeNewlines(c), by) - } - - implicit lazy val genNewLine: Gen[NewLine] = - for(str <- oneOf("\n", "\r", "\r\n")) yield - new NewLine(str) - - implicit lazy val genLevel: Gen[Level.Value] = - oneOf(Level.values.toSeq) - - implicit lazy val genOutput: Gen[Output] = - for(ls <- listOf[List[ToLog]](MaxLines); lv <- genLevel) yield - new Output(ls, lv) - - def removeNewlines(s: String) = s.replaceAll("""[\n\r]+""", "") - def addNewline(l: ToLog): ToLog = - new ToLog(l.content + "\n", l.byCharacter) // \n will be replaced by a random line terminator for all lines + implicit lazy val arbOut: Arbitrary[Output] = Arbitrary(genOutput) + implicit lazy val arbLog: Arbitrary[ToLog] = Arbitrary(genLog) + implicit lazy val arbLine: Arbitrary[List[ToLog]] = Arbitrary(genLine) + implicit lazy val arbNewLine: Arbitrary[NewLine] = Arbitrary(genNewLine) + implicit lazy val arbLevel: Arbitrary[Level.Value] = Arbitrary(genLevel) - def listOf[T](max: Int)(implicit content: Arbitrary[T]): Gen[List[T]] = - Gen.choose(0, max) flatMap { sz => listOfN(sz, content.arbitrary) } 
+ implicit def genLine(implicit logG: Gen[ToLog]): Gen[List[ToLog]] = + for (l <- listOf[ToLog](MaxSegments); last <- logG) yield (addNewline(last) :: l.filter(!_.content.isEmpty)).reverse + + implicit def genLog(implicit content: Arbitrary[String], byChar: Arbitrary[Boolean]): Gen[ToLog] = + for (c <- content.arbitrary; by <- byChar.arbitrary) yield { + assert(c != null) + new ToLog(removeNewlines(c), by) + } + + implicit lazy val genNewLine: Gen[NewLine] = + for (str <- oneOf("\n", "\r", "\r\n")) yield new NewLine(str) + + implicit lazy val genLevel: Gen[Level.Value] = + oneOf(Level.values.toSeq) + + implicit lazy val genOutput: Gen[Output] = + for (ls <- listOf[List[ToLog]](MaxLines); lv <- genLevel) yield new Output(ls, lv) + + def removeNewlines(s: String) = s.replaceAll("""[\n\r]+""", "") + def addNewline(l: ToLog): ToLog = + new ToLog(l.content + "\n", l.byCharacter) // \n will be replaced by a random line terminator for all lines + + def listOf[T](max: Int)(implicit content: Arbitrary[T]): Gen[List[T]] = + Gen.choose(0, max) flatMap { sz => listOfN(sz, content.arbitrary) } } /* Helper classes*/ -final class Output(val lines: List[List[ToLog]], val level: Level.Value) extends NotNull -{ - override def toString = - "Level: " + level + "\n" + lines.map(_.mkString).mkString("\n") +final class Output(val lines: List[List[ToLog]], val level: Level.Value) extends NotNull { + override def toString = + "Level: " + level + "\n" + lines.map(_.mkString).mkString("\n") } -final class NewLine(val str: String) extends NotNull -{ - override def toString = Escape(str) +final class NewLine(val str: String) extends NotNull { + override def toString = Escape(str) } -final class ToLog(val content: String, val byCharacter: Boolean) extends NotNull -{ - def contentOnly = Escape.newline(content, "") - override def toString = if(content.isEmpty) "" else "ToLog('" + Escape(contentOnly) + "', " + byCharacter + ")" +final class ToLog(val content: String, val byCharacter: Boolean) 
extends NotNull { + def contentOnly = Escape.newline(content, "") + override def toString = if (content.isEmpty) "" else "ToLog('" + Escape(contentOnly) + "', " + byCharacter + ")" } /** Defines some utility methods for escaping unprintable characters.*/ -object Escape -{ - /** Escapes characters with code less than 20 by printing them as unicode escapes.*/ - def apply(s: String): String = - { - val builder = new StringBuilder(s.length) - for(c <- s) - { - def escaped = pad(c.toInt.toHexString.toUpperCase, 4, '0') - if(c < 20) builder.append("\\u").append(escaped) else builder.append(c) - } - builder.toString - } - def pad(s: String, minLength: Int, extra: Char) = - { - val diff = minLength - s.length - if(diff <= 0) s else List.make(diff, extra).mkString("", "", s) - } - /** Replaces a \n character at the end of a string `s` with `nl`.*/ - def newline(s: String, nl: String): String = - if(s.endsWith("\n")) s.substring(0, s.length - 1) + nl else s +object Escape { + /** Escapes characters with code less than 20 by printing them as unicode escapes.*/ + def apply(s: String): String = + { + val builder = new StringBuilder(s.length) + for (c <- s) { + def escaped = pad(c.toInt.toHexString.toUpperCase, 4, '0') + if (c < 20) builder.append("\\u").append(escaped) else builder.append(c) + } + builder.toString + } + def pad(s: String, minLength: Int, extra: Char) = + { + val diff = minLength - s.length + if (diff <= 0) s else List.make(diff, extra).mkString("", "", s) + } + /** Replaces a \n character at the end of a string `s` with `nl`.*/ + def newline(s: String, nl: String): String = + if (s.endsWith("\n")) s.substring(0, s.length - 1) + nl else s } /** Records logging events for later retrieval.*/ -final class RecordingLogger extends BasicLogger -{ - private var events: List[LogEvent] = Nil - - def getEvents = events.reverse - - override def ansiCodesSupported = true - def trace(t: => Throwable) { events ::= new Trace(t) } - def log(level: Level.Value, message: => 
String) { events ::= new Log(level, message) } - def success(message: => String) { events ::= new Success(message) } - def logAll(es: Seq[LogEvent]) { events :::= es.toList } - def control(event: ControlEvent.Value, message: => String) { events ::= new ControlEvent(event, message) } - +final class RecordingLogger extends BasicLogger { + private var events: List[LogEvent] = Nil + + def getEvents = events.reverse + + override def ansiCodesSupported = true + def trace(t: => Throwable) { events ::= new Trace(t) } + def log(level: Level.Value, message: => String) { events ::= new Log(level, message) } + def success(message: => String) { events ::= new Success(message) } + def logAll(es: Seq[LogEvent]) { events :::= es.toList } + def control(event: ControlEvent.Value, message: => String) { events ::= new ControlEvent(event, message) } + } \ No newline at end of file diff --git a/util/log/src/test/scala/TestLogger.scala b/util/log/src/test/scala/TestLogger.scala index edf2b00dd..e7b6bee49 100644 --- a/util/log/src/test/scala/TestLogger.scala +++ b/util/log/src/test/scala/TestLogger.scala @@ -1,11 +1,10 @@ package sbt -object TestLogger -{ - def apply[T](f: Logger => T): T = - { - val log = new BufferedLogger(ConsoleLogger()) - log.setLevel(Level.Debug) - log.bufferQuietly(f(log)) - } +object TestLogger { + def apply[T](f: Logger => T): T = + { + val log = new BufferedLogger(ConsoleLogger()) + log.setLevel(Level.Debug) + log.bufferQuietly(f(log)) + } } \ No newline at end of file diff --git a/util/logic/src/test/scala/sbt/logic/Test.scala b/util/logic/src/test/scala/sbt/logic/Test.scala index cf50ef9fd..e66a3b9b2 100644 --- a/util/logic/src/test/scala/sbt/logic/Test.scala +++ b/util/logic/src/test/scala/sbt/logic/Test.scala @@ -1,117 +1,115 @@ package sbt package logic - import org.scalacheck._ - import Prop.secure - import Logic.{LogicException, Matched} +import org.scalacheck._ +import Prop.secure +import Logic.{ LogicException, Matched } -object LogicTest extends 
Properties("Logic") -{ - import TestClauses._ +object LogicTest extends Properties("Logic") { + import TestClauses._ - property("Handles trivial resolution.") = secure( expect(trivial, Set(A) ) ) - property("Handles less trivial resolution.") = secure( expect(lessTrivial, Set(B,A,D)) ) - property("Handles cycles without negation") = secure( expect(cycles, Set(F,A,B)) ) - property("Handles basic exclusion.") = secure( expect(excludedPos, Set()) ) - property("Handles exclusion of head proved by negation.") = secure( expect(excludedNeg, Set()) ) - // TODO: actually check ordering, probably as part of a check that dependencies are satisfied - property("Properly orders results.") = secure( expect(ordering, Set(B,A,C,E,F))) - property("Detects cyclic negation") = secure( - Logic.reduceAll(badClauses, Set()) match { - case Right(res) => false - case Left(err: Logic.CyclicNegation) => true - case Left(err) => error(s"Expected cyclic error, got: $err") - } - ) + property("Handles trivial resolution.") = secure(expect(trivial, Set(A))) + property("Handles less trivial resolution.") = secure(expect(lessTrivial, Set(B, A, D))) + property("Handles cycles without negation") = secure(expect(cycles, Set(F, A, B))) + property("Handles basic exclusion.") = secure(expect(excludedPos, Set())) + property("Handles exclusion of head proved by negation.") = secure(expect(excludedNeg, Set())) + // TODO: actually check ordering, probably as part of a check that dependencies are satisfied + property("Properly orders results.") = secure(expect(ordering, Set(B, A, C, E, F))) + property("Detects cyclic negation") = secure( + Logic.reduceAll(badClauses, Set()) match { + case Right(res) => false + case Left(err: Logic.CyclicNegation) => true + case Left(err) => error(s"Expected cyclic error, got: $err") + } + ) - def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = result match { - case Left(err) => false - case Right(res) => - val actual = res.provenSet - (actual ==
expected) || error(s"Expected to prove $expected, but actually proved $actual") - } + def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = result match { + case Left(err) => false + case Right(res) => + val actual = res.provenSet + (actual == expected) || error(s"Expected to prove $expected, but actually proved $actual") + } } -object TestClauses -{ +object TestClauses { - val A = Atom("A") - val B = Atom("B") - val C = Atom("C") - val D = Atom("D") - val E = Atom("E") - val F = Atom("F") - val G = Atom("G") + val A = Atom("A") + val B = Atom("B") + val C = Atom("C") + val D = Atom("D") + val E = Atom("E") + val F = Atom("F") + val G = Atom("G") - val clauses = - A.proves(B) :: - A.proves(F) :: - B.proves(F) :: - F.proves(A) :: - (!C).proves(F) :: - D.proves(C) :: - C.proves(D) :: - Nil + val clauses = + A.proves(B) :: + A.proves(F) :: + B.proves(F) :: + F.proves(A) :: + (!C).proves(F) :: + D.proves(C) :: + C.proves(D) :: + Nil - val cycles = Logic.reduceAll(clauses, Set()) + val cycles = Logic.reduceAll(clauses, Set()) - val badClauses = - A.proves(D) :: - clauses + val badClauses = + A.proves(D) :: + clauses - val excludedNeg = { - val cs = - (!A).proves(B) :: - Nil - val init = - (!A) :: - (!B) :: - Nil - Logic.reduceAll(cs, init.toSet) - } + val excludedNeg = { + val cs = + (!A).proves(B) :: + Nil + val init = + (!A) :: + (!B) :: + Nil + Logic.reduceAll(cs, init.toSet) + } - val excludedPos = { - val cs = - A.proves(B) :: - Nil - val init = - A :: - (!B) :: - Nil - Logic.reduceAll(cs, init.toSet) - } + val excludedPos = { + val cs = + A.proves(B) :: + Nil + val init = + A :: + (!B) :: + Nil + Logic.reduceAll(cs, init.toSet) + } - val trivial = { - val cs = - Formula.True.proves(A) :: - Nil - Logic.reduceAll(cs, Set.empty) - } + val trivial = { + val cs = + Formula.True.proves(A) :: + Nil + Logic.reduceAll(cs, Set.empty) + } - val lessTrivial = { - val cs = - Formula.True.proves(A) :: - Formula.True.proves(B) :: - (A && B && (!C)).proves(D) 
:: - Nil - Logic.reduceAll(cs, Set()) - } + val lessTrivial = { + val cs = + Formula.True.proves(A) :: + Formula.True.proves(B) :: + (A && B && (!C)).proves(D) :: + Nil + Logic.reduceAll(cs, Set()) + } - val ordering = { - val cs = - E.proves(F) :: - (C && !D).proves(E) :: - (A && B).proves(C) :: - Nil - Logic.reduceAll(cs, Set(A,B)) - } + val ordering = { + val cs = + E.proves(F) :: + (C && !D).proves(E) :: + (A && B).proves(C) :: + Nil + Logic.reduceAll(cs, Set(A, B)) + } - def all { - println(s"Cycles: $cycles") - println(s"xNeg: $excludedNeg") - println(s"xPos: $excludedPos") - println(s"trivial: $trivial") - println(s"lessTrivial: $lessTrivial") - println(s"ordering: $ordering") - } + def all { + println(s"Cycles: $cycles") + println(s"xNeg: $excludedNeg") + println(s"xPos: $excludedPos") + println(s"trivial: $trivial") + println(s"lessTrivial: $lessTrivial") + println(s"ordering: $ordering") + } } diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/util/process/src/test/scala/ProcessSpecification.scala index 6298ce544..67bd5e625 100644 --- a/util/process/src/test/scala/ProcessSpecification.scala +++ b/util/process/src/test/scala/ProcessSpecification.scala @@ -1,133 +1,131 @@ package sbt import java.io.File -import org.scalacheck.{Arbitrary, Gen, Prop, Properties} +import org.scalacheck.{ Arbitrary, Gen, Prop, Properties } import Prop._ import Process._ -object ProcessSpecification extends Properties("Process I/O") -{ - implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary( - for(size <- Gen.choose(0, 10); - l <- Gen.listOfN[Byte](size, Arbitrary.arbByte.arbitrary)) - yield - l.toArray - ) +object ProcessSpecification extends Properties("Process I/O") { + implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary( + for ( + size <- Gen.choose(0, 10); + l <- Gen.listOfN[Byte](size, Arbitrary.arbByte.arbitrary) + ) yield l.toArray + ) - /*property("Correct exit code") = forAll( (exitCode: Byte) => checkExit(exitCode)) + 
/*property("Correct exit code") = forAll( (exitCode: Byte) => checkExit(exitCode)) property("#&& correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #&& _)(_ && _)) property("#|| correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #|| _)(_ || _)) property("### correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ ### _)( (x,latest) => latest))*/ - property("Pipe to output file") = forAll( (data: Array[Byte]) => checkFileOut(data)) - property("Pipe from input file") = forAll( (data: Array[Byte]) => checkFileIn(data)) - property("Pipe to process") = forAll( (data: Array[Byte]) => checkPipe(data)) - property("Pipe to process ignores input exit code") = forAll( (data: Array[Byte], code: Byte) => checkPipeExit(data, code)) - property("Pipe from input file to bad process preserves correct exit code.") = forAll( (data: Array[Byte], code: Byte) => checkFileInExit(data, code)) - property("Pipe to output file from bad process preserves correct exit code.") = forAll( (data: Array[Byte], code: Byte) => checkFileOutExit(data, code)) + property("Pipe to output file") = forAll((data: Array[Byte]) => checkFileOut(data)) + property("Pipe from input file") = forAll((data: Array[Byte]) => checkFileIn(data)) + property("Pipe to process") = forAll((data: Array[Byte]) => checkPipe(data)) + property("Pipe to process ignores input exit code") = forAll((data: Array[Byte], code: Byte) => checkPipeExit(data, code)) + property("Pipe from input file to bad process preserves correct exit code.") = forAll((data: Array[Byte], code: Byte) => checkFileInExit(data, code)) + property("Pipe to output file from bad process preserves correct exit code.") = forAll((data: Array[Byte], code: Byte) => checkFileOutExit(data, code)) - private def checkBinary(codes: Array[Byte])(reduceProcesses: (ProcessBuilder, ProcessBuilder) => ProcessBuilder)(reduceExit: (Boolean, Boolean) => Boolean) = - { - (codes.length > 1) ==> - { - val unsignedCodes = 
codes.map(unsigned) - val exitCode = unsignedCodes.map(code => Process(process("sbt.exit " + code))).reduceLeft(reduceProcesses) ! - val expectedExitCode = unsignedCodes.map(toBoolean).reduceLeft(reduceExit) - toBoolean(exitCode) == expectedExitCode - } - } - private def toBoolean(exitCode: Int) = exitCode == 0 - private def checkExit(code: Byte) = - { - val exitCode = unsigned(code) - (process("sbt.exit " + exitCode) !) == exitCode - } - private def checkFileOut(data: Array[Byte]) = - { - withData(data) { (temporaryFile, temporaryFile2) => - val catCommand = process("sbt.cat " + temporaryFile.getAbsolutePath) - catCommand #> temporaryFile2 - } - } - private def checkFileIn(data: Array[Byte]) = - { - withData(data) { (temporaryFile, temporaryFile2) => - val catCommand = process("sbt.cat") - temporaryFile #> catCommand #> temporaryFile2 - } - } - private def checkPipe(data: Array[Byte]) = - { - withData(data) { (temporaryFile, temporaryFile2) => - val catCommand = process("sbt.cat") - temporaryFile #> catCommand #| catCommand #> temporaryFile2 - } - } - private def checkPipeExit(data: Array[Byte], code: Byte) = - withTempFiles { (a,b) => - IO.write(a, data) - val catCommand = process("sbt.cat") - val exitCommand = process(s"sbt.exit $code") - val exit = (a #> exitCommand #| catCommand #> b).! - (s"Exit code: $exit") |: - (s"Output file length: ${b.length}") |: - (exit == 0) && - (b.length == 0) - } + private def checkBinary(codes: Array[Byte])(reduceProcesses: (ProcessBuilder, ProcessBuilder) => ProcessBuilder)(reduceExit: (Boolean, Boolean) => Boolean) = + { + (codes.length > 1) ==> + { + val unsignedCodes = codes.map(unsigned) + val exitCode = unsignedCodes.map(code => Process(process("sbt.exit " + code))).reduceLeft(reduceProcesses) ! 
+ val expectedExitCode = unsignedCodes.map(toBoolean).reduceLeft(reduceExit) + toBoolean(exitCode) == expectedExitCode + } + } + private def toBoolean(exitCode: Int) = exitCode == 0 + private def checkExit(code: Byte) = + { + val exitCode = unsigned(code) + (process("sbt.exit " + exitCode) !) == exitCode + } + private def checkFileOut(data: Array[Byte]) = + { + withData(data) { (temporaryFile, temporaryFile2) => + val catCommand = process("sbt.cat " + temporaryFile.getAbsolutePath) + catCommand #> temporaryFile2 + } + } + private def checkFileIn(data: Array[Byte]) = + { + withData(data) { (temporaryFile, temporaryFile2) => + val catCommand = process("sbt.cat") + temporaryFile #> catCommand #> temporaryFile2 + } + } + private def checkPipe(data: Array[Byte]) = + { + withData(data) { (temporaryFile, temporaryFile2) => + val catCommand = process("sbt.cat") + temporaryFile #> catCommand #| catCommand #> temporaryFile2 + } + } + private def checkPipeExit(data: Array[Byte], code: Byte) = + withTempFiles { (a, b) => + IO.write(a, data) + val catCommand = process("sbt.cat") + val exitCommand = process(s"sbt.exit $code") + val exit = (a #> exitCommand #| catCommand #> b).! + (s"Exit code: $exit") |: + (s"Output file length: ${b.length}") |: + (exit == 0) && + (b.length == 0) + } - private def checkFileOutExit(data: Array[Byte], exitCode: Byte) = - withTempFiles { (a,b) => - IO.write(a, data) - val code = unsigned(exitCode) - val command = process(s"sbt.exit $code") - val exit = (a #> command #> b).! - (s"Exit code: $exit, expected: $code") |: - (s"Output file length: ${b.length}") |: - (exit == code) && - (b.length == 0) - } + private def checkFileOutExit(data: Array[Byte], exitCode: Byte) = + withTempFiles { (a, b) => + IO.write(a, data) + val code = unsigned(exitCode) + val command = process(s"sbt.exit $code") + val exit = (a #> command #> b).! 
+ (s"Exit code: $exit, expected: $code") |: + (s"Output file length: ${b.length}") |: + (exit == code) && + (b.length == 0) + } - private def checkFileInExit(data: Array[Byte], exitCode: Byte) = - withTempFiles { (a,b) => - IO.write(a, data) - val code = unsigned(exitCode) - val command = process(s"sbt.exit $code") - val exit = (a #> command).! - (s"Exit code: $exit, expected: $code") |: - (exit == code) - } + private def checkFileInExit(data: Array[Byte], exitCode: Byte) = + withTempFiles { (a, b) => + IO.write(a, data) + val code = unsigned(exitCode) + val command = process(s"sbt.exit $code") + val exit = (a #> command).! + (s"Exit code: $exit, expected: $code") |: + (exit == code) + } - private def temp() = File.createTempFile("sbt", "") - private def withData(data: Array[Byte])(f: (File, File) => ProcessBuilder) = - withTempFiles { (a, b) => - IO.write(a, data) - val process = f(a, b) - ( process ! ) == 0 && sameFiles(a, b) - } - private def sameFiles(a: File, b: File) = - IO.readBytes(a) sameElements IO.readBytes(b) + private def temp() = File.createTempFile("sbt", "") + private def withData(data: Array[Byte])(f: (File, File) => ProcessBuilder) = + withTempFiles { (a, b) => + IO.write(a, data) + val process = f(a, b) + (process !) 
== 0 && sameFiles(a, b) + } + private def sameFiles(a: File, b: File) = + IO.readBytes(a) sameElements IO.readBytes(b) - private def withTempFiles[T](f: (File, File) => T): T = - { - val temporaryFile1 = temp() - val temporaryFile2 = temp() - try f(temporaryFile1, temporaryFile2) - finally - { - temporaryFile1.delete() - temporaryFile2.delete() - } - } - private def unsigned(b: Int): Int = ((b: Int) +256) % 256 - private def unsigned(b: Byte): Int = unsigned(b: Int) - private def process(command: String) = - { - val ignore = echo // just for the compile dependency so that this test is rerun when TestedProcess.scala changes, not used otherwise + private def withTempFiles[T](f: (File, File) => T): T = + { + val temporaryFile1 = temp() + val temporaryFile2 = temp() + try f(temporaryFile1, temporaryFile2) + finally { + temporaryFile1.delete() + temporaryFile2.delete() + } + } + private def unsigned(b: Int): Int = ((b: Int) + 256) % 256 + private def unsigned(b: Byte): Int = unsigned(b: Int) + private def process(command: String) = + { + val ignore = echo // just for the compile dependency so that this test is rerun when TestedProcess.scala changes, not used otherwise - val thisClasspath = List(getSource[Product], getSource[IO.type], getSource[SourceTag]).mkString(File.pathSeparator) - "java -cp " + thisClasspath + " " + command - } - private def getSource[T : Manifest]: String = - IO.classLocationFile[T].getAbsolutePath + val thisClasspath = List(getSource[Product], getSource[IO.type], getSource[SourceTag]).mkString(File.pathSeparator) + "java -cp " + thisClasspath + " " + command + } + private def getSource[T: Manifest]: String = + IO.classLocationFile[T].getAbsolutePath } private trait SourceTag diff --git a/util/process/src/test/scala/TestedProcess.scala b/util/process/src/test/scala/TestedProcess.scala index c013de531..5daea8bab 100644 --- a/util/process/src/test/scala/TestedProcess.scala +++ b/util/process/src/test/scala/TestedProcess.scala @@ -1,56 +1,47 @@ 
package sbt -import java.io.{File, FileNotFoundException, IOException} +import java.io.{ File, FileNotFoundException, IOException } -object exit -{ - def main(args: Array[String]) - { - System.exit(java.lang.Integer.parseInt(args(0))) - } +object exit { + def main(args: Array[String]) { + System.exit(java.lang.Integer.parseInt(args(0))) + } } -object cat -{ - def main(args: Array[String]) - { - try { - if(args.length == 0) - IO.transfer(System.in, System.out) - else - catFiles(args.toList) - System.exit(0) - } catch { - case e => - e.printStackTrace() - System.err.println("Error: " + e.toString) - System.exit(1) - } - } - private def catFiles(filenames: List[String]): Option[String] = - { - filenames match - { - case head :: tail => - val file = new File(head) - if(file.isDirectory) - throw new IOException("Is directory: " + file) - else if(file.exists) - { - Using.fileInputStream(file) { stream => - IO.transfer(stream, System.out) - } - catFiles(tail) - } - else - throw new FileNotFoundException("No such file or directory: " + file) - case Nil => None - } - } +object cat { + def main(args: Array[String]) { + try { + if (args.length == 0) + IO.transfer(System.in, System.out) + else + catFiles(args.toList) + System.exit(0) + } catch { + case e => + e.printStackTrace() + System.err.println("Error: " + e.toString) + System.exit(1) + } + } + private def catFiles(filenames: List[String]): Option[String] = + { + filenames match { + case head :: tail => + val file = new File(head) + if (file.isDirectory) + throw new IOException("Is directory: " + file) + else if (file.exists) { + Using.fileInputStream(file) { stream => + IO.transfer(stream, System.out) + } + catFiles(tail) + } else + throw new FileNotFoundException("No such file or directory: " + file) + case Nil => None + } + } } -object echo -{ - def main(args: Array[String]) - { - System.out.println(args.mkString(" ")) - } +object echo { + def main(args: Array[String]) { + System.out.println(args.mkString(" ")) + } } \ 
No newline at end of file diff --git a/util/relation/src/test/scala/RelationTest.scala b/util/relation/src/test/scala/RelationTest.scala index 3dcc03f38..558935bdb 100644 --- a/util/relation/src/test/scala/RelationTest.scala +++ b/util/relation/src/test/scala/RelationTest.scala @@ -6,79 +6,79 @@ package sbt import org.scalacheck._ import Prop._ -object RelationTest extends Properties("Relation") -{ - property("Added entry check") = forAll { (pairs: List[(Int, Double)]) => - val r = Relation.empty[Int, Double] ++ pairs - check(r, pairs) - } - def check(r: Relation[Int, Double], pairs: Seq[(Int, Double)]) = - { - val _1s = pairs.map(_._1).toSet - val _2s = pairs.map(_._2).toSet - - r._1s == _1s && r.forwardMap.keySet == _1s && - r._2s == _2s && r.reverseMap.keySet == _2s && - pairs.forall { case (a, b) => - (r.forward(a) contains b) && - (r.reverse(b) contains a) && - (r.forwardMap(a) contains b) && - (r.reverseMap(b) contains a) - } - } - - property("Does not contain removed entries") = forAll { (pairs: List[(Int, Double, Boolean)]) => - val add = pairs.map { case (a,b,c) => (a,b) } - val added = Relation.empty[Int, Double] ++ add - - val removeFine = pairs.collect { case (a,b,true) => (a,b) } - val removeCoarse = removeFine.map(_._1) - val r = added -- removeCoarse - - def notIn[X,Y](map: Map[X, Set[Y]], a: X, b: Y) = map.get(a).forall(set => ! 
(set contains b) ) - - all(removeCoarse) { rem => - ("_1s does not contain removed" |: (!r._1s.contains(rem)) ) && - ("Forward does not contain removed" |: r.forward(rem).isEmpty ) && - ("Forward map does not contain removed" |: !r.forwardMap.contains(rem) ) && - ("Removed is not a value in reverse map" |: !r.reverseMap.values.toSet.contains(rem) ) - } && - all(removeFine) { case (a, b) => - ("Forward does not contain removed" |: ( !r.forward(a).contains(b) ) ) && - ("Reverse does not contain removed" |: ( !r.reverse(b).contains(a) ) ) && - ("Forward map does not contain removed" |: ( notIn(r.forwardMap, a, b) ) ) && - ("Reverse map does not contain removed" |: ( notIn(r.reverseMap, b, a) ) ) - } - } +object RelationTest extends Properties("Relation") { + property("Added entry check") = forAll { (pairs: List[(Int, Double)]) => + val r = Relation.empty[Int, Double] ++ pairs + check(r, pairs) + } + def check(r: Relation[Int, Double], pairs: Seq[(Int, Double)]) = + { + val _1s = pairs.map(_._1).toSet + val _2s = pairs.map(_._2).toSet - property("Groups correctly") = forAll { (entries: List[(Int, Double)], randomInt: Int) => - val splitInto = math.abs(randomInt) % 10 + 1 // Split into 1-10 groups. 
- val rel = Relation.empty[Int, Double] ++ entries - val grouped = rel groupBy (_._1 % splitInto) - all(grouped.toSeq) { - case (k, rel_k) => rel_k._1s forall { _ % splitInto == k } - } - } + r._1s == _1s && r.forwardMap.keySet == _1s && + r._2s == _2s && r.reverseMap.keySet == _2s && + pairs.forall { + case (a, b) => + (r.forward(a) contains b) && + (r.reverse(b) contains a) && + (r.forwardMap(a) contains b) && + (r.reverseMap(b) contains a) + } + } + + property("Does not contain removed entries") = forAll { (pairs: List[(Int, Double, Boolean)]) => + val add = pairs.map { case (a, b, c) => (a, b) } + val added = Relation.empty[Int, Double] ++ add + + val removeFine = pairs.collect { case (a, b, true) => (a, b) } + val removeCoarse = removeFine.map(_._1) + val r = added -- removeCoarse + + def notIn[X, Y](map: Map[X, Set[Y]], a: X, b: Y) = map.get(a).forall(set => !(set contains b)) + + all(removeCoarse) { rem => + ("_1s does not contain removed" |: (!r._1s.contains(rem))) && + ("Forward does not contain removed" |: r.forward(rem).isEmpty) && + ("Forward map does not contain removed" |: !r.forwardMap.contains(rem)) && + ("Removed is not a value in reverse map" |: !r.reverseMap.values.toSet.contains(rem)) + } && + all(removeFine) { + case (a, b) => + ("Forward does not contain removed" |: (!r.forward(a).contains(b))) && + ("Reverse does not contain removed" |: (!r.reverse(b).contains(a))) && + ("Forward map does not contain removed" |: (notIn(r.forwardMap, a, b))) && + ("Reverse map does not contain removed" |: (notIn(r.reverseMap, b, a))) + } + } + + property("Groups correctly") = forAll { (entries: List[(Int, Double)], randomInt: Int) => + val splitInto = math.abs(randomInt) % 10 + 1 // Split into 1-10 groups. 
+ val rel = Relation.empty[Int, Double] ++ entries + val grouped = rel groupBy (_._1 % splitInto) + all(grouped.toSeq) { + case (k, rel_k) => rel_k._1s forall { _ % splitInto == k } + } + } property("Computes size correctly") = forAll { (entries: List[(Int, Double)]) => val rel = Relation.empty[Int, Double] ++ entries - val expected = rel.all.size // Note: not entries.length, as entries may have duplicates. + val expected = rel.all.size // Note: not entries.length, as entries may have duplicates. val computed = rel.size "Expected size: %d. Computed size: %d.".format(expected, computed) |: expected == computed } - def all[T](s: Seq[T])(p: T => Prop): Prop = - if(s.isEmpty) true else s.map(p).reduceLeft(_ && _) + def all[T](s: Seq[T])(p: T => Prop): Prop = + if (s.isEmpty) true else s.map(p).reduceLeft(_ && _) } -object EmptyRelationTest extends Properties("Empty relation") -{ - lazy val e = Relation.empty[Int, Double] +object EmptyRelationTest extends Properties("Empty relation") { + lazy val e = Relation.empty[Int, Double] - property("Forward empty") = forAll { (i: Int) => e.forward(i).isEmpty } - property("Reverse empty") = forAll { (i: Double) => e.reverse(i).isEmpty } - property("Forward map empty") = e.forwardMap.isEmpty - property("Reverse map empty") = e.reverseMap.isEmpty - property("_1 empty") = e._1s.isEmpty - property("_2 empty") = e._2s.isEmpty + property("Forward empty") = forAll { (i: Int) => e.forward(i).isEmpty } + property("Reverse empty") = forAll { (i: Double) => e.reverse(i).isEmpty } + property("Forward map empty") = e.forwardMap.isEmpty + property("Reverse map empty") = e.reverseMap.isEmpty + property("_1 empty") = e._1s.isEmpty + property("_2 empty") = e._2s.isEmpty } \ No newline at end of file From 629a8ca6ebf30568af9977a5a434186c5c1da700 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Wed, 14 May 2014 12:30:56 -0400 Subject: [PATCH 433/823] Bump expected 2.11 module versions so we can compile with 2.11 Add scala 2.11 test/build 
verification. * Add 2.11 build configuration to travis ci * Create command which runs `safe` unit tests * Create command to test the scala 2.11 build * Update scalacheck to 1.11.4 * Update specs2 to 2.3.11 * Fix various 2.11/deprecation removals and other changes. Fix eval test failure in scala 2.11 with XML not existing. --- util/log/src/test/scala/LogWriterTest.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/log/src/test/scala/LogWriterTest.scala b/util/log/src/test/scala/LogWriterTest.scala index d51919ad7..db8250139 100644 --- a/util/log/src/test/scala/LogWriterTest.scala +++ b/util/log/src/test/scala/LogWriterTest.scala @@ -128,7 +128,7 @@ object Escape { def pad(s: String, minLength: Int, extra: Char) = { val diff = minLength - s.length - if (diff <= 0) s else List.make(diff, extra).mkString("", "", s) + if (diff <= 0) s else List.fill(diff)(extra).mkString("", "", s) } /** Replaces a \n character at the end of a string `s` with `nl`.*/ def newline(s: String, nl: String): String = From 4119bcf93634bffd27df39acfb000192453ce482 Mon Sep 17 00:00:00 2001 From: Dan Sanduleac Date: Tue, 20 May 2014 08:34:28 +0100 Subject: [PATCH 434/823] Don't allow generated strings (key names) to be empty --- util/collection/src/test/scala/SettingsTest.scala | 7 +++++-- 1 file changed, 5 insertions(+), 2 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index f8c99a735..fb5511c99 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -61,9 +61,12 @@ object SettingsTest extends Properties("settings") { private def mkAttrKeys[T](nr: Int)(implicit mf: Manifest[T]): Gen[List[AttributeKey[T]]] = { - val alphaStr = Gen.alphaStr + import Gen._ + val nonEmptyAlphaStr = + nonEmptyListOf(alphaChar).map(_.mkString).suchThat(_.forall(_.isLetter)) + for { - list <- Gen.listOfN(nr, alphaStr) suchThat (l => l.size ==
l.distinct.size) + list <- Gen.listOfN(nr, nonEmptyAlphaStr) suchThat (l => l.size == l.distinct.size) item <- list } yield AttributeKey[T](item) } From a398e7d78f12f226cc7ac67ea38190bee171b78d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 27 Jul 2014 12:26:12 -0400 Subject: [PATCH 435/823] Implements eviction warning stories. #1200 This implements all stories from https://github.com/sbt/sbt/wiki/User-Stories%3A-Conflict-Warning. When scalaVersion is no longer effective, an eviction warning will display. Scala version was updated by one of library dependencies: * org.scala-lang:scala-library:2.10.2 -> 2.10.3 When there are suspected incompatibilities in directly depended Java libraries, eviction warnings will display. There may be incompatibilities among your library dependencies. Here are some of the libraries that were evicted: * commons-io:commons-io:1.4 -> 2.4 When there is a suspected incompatibility in directly depended Scala libraries, eviction warnings will display. There may be incompatibilities among your library dependencies. Here are some of the libraries that were evicted: * com.typesafe.akka:akka-actor_2.10:2.1.4 -> 2.3.4 This also adds an 'evicted' task, which displays more detailed eviction warnings.
--- .../collection/src/main/scala/sbt/ShowLines.scala | 15 +++++++++++++++ 1 file changed, 15 insertions(+) create mode 100644 util/collection/src/main/scala/sbt/ShowLines.scala diff --git a/util/collection/src/main/scala/sbt/ShowLines.scala b/util/collection/src/main/scala/sbt/ShowLines.scala new file mode 100644 index 000000000..126b6360e --- /dev/null +++ b/util/collection/src/main/scala/sbt/ShowLines.scala @@ -0,0 +1,15 @@ +package sbt + +trait ShowLines[A] { + def showLines(a: A): Seq[String] +} +object ShowLines { + def apply[A](f: A => Seq[String]): ShowLines[A] = + new ShowLines[A] { + def showLines(a: A): Seq[String] = f(a) + } + + implicit class ShowLinesOp[A: ShowLines](a: A) { + def lines: Seq[String] = implicitly[ShowLines[A]].showLines(a) + } +} From 515ec1f2e29b5472d0c417ac1043662b08a40005 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Mon, 11 Aug 2014 14:54:33 -0400 Subject: [PATCH 436/823] Fixes flaky no-such-element exception from bad generation of random tests. --- util/collection/src/test/scala/SettingsTest.scala | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index fb5511c99..2f3d9d4ae 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -41,10 +41,11 @@ object SettingsTest extends Properties("settings") { final def derivedSettings(nr: Int): Prop = { val genScopedKeys = { - val attrKeys = mkAttrKeys[Int](nr) + val attrKeys = mkAttrKeys[Int](nr).filter(!_.isEmpty) attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) - } + }.label("scopedKeys") forAll(genScopedKeys) { scopedKeys => + // Note; It's evil to grab last IF you haven't verified the set can't be empty. 
val last = scopedKeys.last val derivedSettings: Seq[Setting[Int]] = ( for { @@ -65,10 +66,10 @@ object SettingsTest extends Properties("settings") { val nonEmptyAlphaStr = nonEmptyListOf(alphaChar).map(_.mkString).suchThat(_.forall(_.isLetter)) - for { + (for { list <- Gen.listOfN(nr, nonEmptyAlphaStr) suchThat (l => l.size == l.distinct.size) item <- list - } yield AttributeKey[T](item) + } yield AttributeKey[T](item)).label(s"mkAttrKeys($nr)") } property("Derived setting(s) replace DerivedSetting in the Seq[Setting[_]]") = derivedKeepsPosition From dbc8b2643f8c52d03a2a7ada03f2064091bb4abc Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Mon, 11 Aug 2014 16:13:26 -0400 Subject: [PATCH 437/823] Ok, this is actually the flaky issue with the test. We use the ch key for testing, so it can't be part of the autogenerated set. --- .../src/test/scala/SettingsTest.scala | 41 ++++++++++++------- 1 file changed, 27 insertions(+), 14 deletions(-) diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index 2f3d9d4ae..dbad035c6 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -37,26 +37,39 @@ object SettingsTest extends Properties("settings") { evaluate(setting(chk, iterate(top)) :: Nil); true } - property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettings } + property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100).label("numSettings")) { derivedSettings } final def derivedSettings(nr: Int): Prop = { val genScopedKeys = { - val attrKeys = mkAttrKeys[Int](nr).filter(!_.isEmpty) + // We want to generate lists of keys that DO NOT include the "ch" key we use to check things.
+ val attrKeys = mkAttrKeys[Int](nr).filter(_.forall(_.label != "ch")) attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) - }.label("scopedKeys") + }.label("scopedKeys").filter(!_.isEmpty) forAll(genScopedKeys) { scopedKeys => - // Note; It's evil to grab last IF you haven't verified the set can't be empty. - val last = scopedKeys.last - val derivedSettings: Seq[Setting[Int]] = ( - for { - List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 - nextInit = if (scoped0 == chk) chk - else (scoped0 zipWith chk) { (p, _) => p + 1 } - } yield derive(setting(scoped1, nextInit)) - ).toSeq + try { + // Note; It's evil to grab last IF you haven't verified the set can't be empty. + val last = scopedKeys.last + val derivedSettings: Seq[Setting[Int]] = ( + for { + List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 + nextInit = if (scoped0 == chk) chk + else (scoped0 zipWith chk) { (p, _) => p + 1 } + } yield derive(setting(scoped1, nextInit)) + ).toSeq - { checkKey(last, Some(nr - 1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" } && - { checkKey(last, None, evaluate(derivedSettings)) :| "Should not be derived" } + { + // Note: This causes a cyclic reference error, quite frequently. + checkKey(last, Some(nr - 1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" + } && { + checkKey(last, None, evaluate(derivedSettings)) :| "Should not be derived" + } + } catch { + case t: Throwable => + // TODO - For debugging only. + t.printStackTrace(System.err) + throw t + } } } From f6c43b917d7f6f6d61b7ff0fc5e1b9bc36f079f9 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 15 Aug 2014 03:52:54 -0400 Subject: [PATCH 438/823] Fixes #1530.
Fixes NPE by using IO.listFiles --- .../src/main/scala/sbt/complete/ExampleSource.scala | 8 ++++---- .../src/test/scala/sbt/complete/FileExamplesTest.scala | 3 +-- 2 files changed, 5 insertions(+), 6 deletions(-) diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala index 6d0469aa0..52d96246b 100644 --- a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/util/complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -1,7 +1,7 @@ package sbt.complete import java.io.File -import sbt.IO._ +import sbt.IO /** * These sources of examples are used in parsers for user input completion. An example of such a source is the @@ -48,9 +48,9 @@ class FileExamples(base: File, prefix: String = "") extends ExampleSource { override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) protected def files(directory: File): Stream[String] = { - val childPaths = directory.listFiles().toStream - val prefixedDirectChildPaths = childPaths.map(relativize(base, _).get).filter(_ startsWith prefix) - val dirsToRecurseInto = childPaths.filter(_.isDirectory).map(relativize(base, _).get).filter(dirStartsWithPrefix) + val childPaths = IO.listFiles(directory).toStream + val prefixedDirectChildPaths = childPaths map { IO.relativize(base, _).get } filter { _ startsWith prefix } + val dirsToRecurseInto = childPaths filter { _.isDirectory } map { IO.relativize(base, _).get } filter { dirStartsWithPrefix } prefixedDirectChildPaths append dirsToRecurseInto.flatMap(dir => files(new File(base, dir))) } diff --git a/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala index 03b495bf0..effd9be78 100644 --- a/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala +++ b/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala @@ -80,8 +80,7 @@ class 
FileExamplesTest extends Specification { (childDirectories ++ nestedDirectories).map(_.mkdirs()) (childFiles ++ nestedFiles).map(_.createNewFile()) - // NOTE: Creating a new file here because `tempDir.listFiles()` returned an empty list. - baseDir = new File(tempDir.getCanonicalPath) + baseDir = tempDir } private def toChildFiles(baseDir: File, files: List[String]): List[File] = files.map(new File(baseDir, _)) From e0dd2e6a4c590dff1b64fdc7d4c8a7d43eec393b Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Thu, 11 Sep 2014 02:04:17 +0200 Subject: [PATCH 439/823] This commit reverts part of 322f6de6551665cade7d56b532348ea5dc3d54db The implementation of Relation should in theory make no difference whether an element is unmapped, or whether it is mapped to an empty set. One of the changes in 322f6de6551665cade7d56b532348ea5dc3d54db introduced an optimization to the '+' operation on Relations that, in theory, should have made no difference to the semantic. The result of that optimization is that some mappings of the form "elem -> Set()" are no longer inserted in the forwardMap of the Relation. Unfortunately, the change resulted in the breakage of #1430, causing "set every" to behave incorrectly. There must be, somewhere in the code, a test on the presence of a key rather than an access via .get(), or some other access that bypasses the supposed semantic equivalence described above. I spent several hours trying to track down exactly the offending test, without success. By undoing the relevant change in 322f6de6551665cade, "set every" works again. That however offers no guarantee that everything else will keep working correctly; the underlying quirk in the code that depends on this supposedly inessential detail is also still lurking in the code, which is less than ideal. 
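The "supposed semantic equivalence" this message appeals to can be demonstrated with plain maps: reads that go through `getOrElse` cannot distinguish "mapped to an empty set" from "unmapped", but any code that tests key presence directly can, and that is exactly the kind of hidden dependency suspected here. A minimal sketch (plain `Map`s, not sbt's `Relation`):

```scala
// Two representations that are supposed to be semantically equivalent.
val mappedToEmpty: Map[String, Set[Int]] = Map("a" -> Set.empty[Int])
val unmapped: Map[String, Set[Int]] = Map.empty

// The "safe" access path: identical results for both representations.
def forward(m: Map[String, Set[Int]], k: String): Set[Int] =
  m.getOrElse(k, Set.empty)

// A presence test, however, observes the difference -- code relying on
// `contains` (or iteration over keys) breaks when `"a" -> Set()` entries
// stop being inserted.
def hasKey(m: Map[String, Set[Int]], k: String): Boolean = m.contains(k)
```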
--- util/relation/src/main/scala/sbt/Relation.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index 987aafb14..dcd38fa90 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -129,7 +129,7 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ex def +(pair: (A, B)) = this + (pair._1, Set(pair._2)) def +(from: A, to: B) = this + (from, to :: Nil) - def +(from: A, to: Traversable[B]) = if (to.isEmpty) this else + def +(from: A, to: Traversable[B]) = new MRelation(add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) def ++(rs: Traversable[(A, B)]) = ((this: Relation[A, B]) /: rs) { _ + _ } From f65f30d15296a9d09474d794168fe5c3226d867b Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jean-R=C3=A9mi=20Desjardins?= Date: Wed, 10 Sep 2014 22:55:43 -0700 Subject: [PATCH 440/823] Remove some compiler warnings --- util/collection/src/main/scala/sbt/INode.scala | 16 ++++++++-------- 1 file changed, 8 insertions(+), 8 deletions(-) diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index d56a22485..1af592f77 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -24,14 +24,14 @@ abstract class EvaluateSettings[Scope] { private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { def apply[T](i: Initialize[T]): INode[T] = i match { - case k: Keyed[s, T] => single(getStatic(k.scopedKey), k.transform) - case a: Apply[k, T] => new MixedNode[k, T](a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) - case b: Bind[s, T] => new BindNode[s, T](transform(b.in), x => transform(b.f(x))) - case init.StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == 
Set[Scope] - case v: Value[T] => constant(v.value) - case v: ValidationCapture[T] => strictConstant(v.key) - case t: TransformCapture => strictConstant(t.f) - case o: Optional[s, T] => o.a match { + case k: Keyed[s, T] @unchecked => single(getStatic(k.scopedKey), k.transform) + case a: Apply[k, T] @unchecked => new MixedNode[k, T](a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) + case b: Bind[s, T] @unchecked => new BindNode[s, T](transform(b.in), x => transform(b.f(x))) + case init.StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] + case v: Value[T] @unchecked => constant(v.value) + case v: ValidationCapture[T] @unchecked => strictConstant(v.key) + case t: TransformCapture => strictConstant(t.f) + case o: Optional[s, T] @unchecked => o.a match { case None => constant(() => o.f(None)) case Some(i) => single[s, T](transform(i), x => o.f(Some(x))) } From 77691eac40918ed5e9670f48a2e04626218b6ec5 Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Fri, 12 Sep 2014 20:51:04 +0200 Subject: [PATCH 441/823] Undone the revert on the optimization, and fixed setAll() The optimization, and therefore the change in the behavior of Relation, is now needed by the class Logic, and cannot be reverted. This patch (written by Josh) therefore changes the implementation of setAll() so that _1s is no longer used. 
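The optimization in question can be sketched with a toy relation (illustrative only, not sbt's actual `MRelation`): adding an empty collection of targets returns the relation unchanged, so no `elem -> Set()` entry is ever created in the forward map.

```scala
// Toy stand-in for sbt's Relation forward map, showing the optimized '+':
// an empty `to` short-circuits, so `from` never appears as a key.
final case class SimpleRelation[A, B](forwardMap: Map[A, Set[B]]) {
  def +(from: A, to: Iterable[B]): SimpleRelation[A, B] =
    if (to.isEmpty) this
    else SimpleRelation(forwardMap.updated(from, forwardMap.getOrElse(from, Set.empty[B]) ++ to))
}
```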
--- util/relation/src/main/scala/sbt/Relation.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index dcd38fa90..987aafb14 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -129,7 +129,7 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ex def +(pair: (A, B)) = this + (pair._1, Set(pair._2)) def +(from: A, to: B) = this + (from, to :: Nil) - def +(from: A, to: Traversable[B]) = + def +(from: A, to: Traversable[B]) = if (to.isEmpty) this else new MRelation(add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) def ++(rs: Traversable[(A, B)]) = ((this: Relation[A, B]) /: rs) { _ + _ } From a39e105b1f2cfe26fc0ba9bdf81caa8bbe704fbd Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 10 Oct 2014 15:42:26 -0400 Subject: [PATCH 442/823] enable -deprecation for Scala 2.10 Enable -deprecation flag to catch old code being used when we migrate things. In this commit I moved error to sys.error. --- util/collection/src/main/scala/sbt/Settings.scala | 4 ++-- util/complete/src/main/scala/sbt/complete/History.scala | 6 +++--- util/logic/src/main/scala/sbt/logic/Logic.scala | 2 +- util/logic/src/test/scala/sbt/logic/Test.scala | 4 ++-- util/process/src/main/scala/sbt/ProcessImpl.scala | 6 +++--- 5 files changed, 11 insertions(+), 11 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 96393f917..9edc46ca7 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -92,7 +92,7 @@ trait Init[Scope] { * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either.
*/ final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true), default: Boolean = false): Setting[T] = { - deriveAllowed(s, allowDynamic) foreach error + deriveAllowed(s, allowDynamic) foreach sys.error val d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) if (default) d.default() else d } @@ -248,7 +248,7 @@ trait Init[Scope] { new Undefined(fakeUndefinedSetting(definingKey, derived), referencedKey) private[this] def fakeUndefinedSetting[T](definingKey: ScopedKey[T], d: Boolean): Setting[T] = { - val init: Initialize[T] = pure(() => error("Dummy setting for compatibility only.")) + val init: Initialize[T] = pure(() => sys.error("Dummy setting for compatibility only.")) new Setting(definingKey, init, NoPosition) { override def isDerived = d } } diff --git a/util/complete/src/main/scala/sbt/complete/History.scala b/util/complete/src/main/scala/sbt/complete/History.scala index ca394abf8..26d0a27c6 100644 --- a/util/complete/src/main/scala/sbt/complete/History.scala +++ b/util/complete/src/main/scala/sbt/complete/History.scala @@ -13,7 +13,7 @@ final class History private (val lines: IndexedSeq[String], val path: Option[Fil def all: Seq[String] = lines def size = lines.length def !! 
: Option[String] = !-(1) - def apply(i: Int): Option[String] = if (0 <= i && i < size) Some(lines(i)) else { error("Invalid history index: " + i); None } + def apply(i: Int): Option[String] = if (0 <= i && i < size) Some(lines(i)) else { sys.error("Invalid history index: " + i); None } def !(i: Int): Option[String] = apply(i) def !(s: String): Option[String] = @@ -27,7 +27,7 @@ final class History private (val lines: IndexedSeq[String], val path: Option[Fil private def nonEmpty[T](s: String)(act: => Option[T]): Option[T] = if (s.isEmpty) { - error("No action specified to history command") + sys.error("No action specified to history command") None } else act @@ -37,7 +37,7 @@ final class History private (val lines: IndexedSeq[String], val path: Option[Fil } object History { - def apply(lines: Seq[String], path: Option[File], error: String => Unit): History = new History(lines.toIndexedSeq, path, error) + def apply(lines: Seq[String], path: Option[File], error: String => Unit): History = new History(lines.toIndexedSeq, path, sys.error) def number(s: String): Option[Int] = try { Some(s.toInt) } diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala index 7ec73c15e..72f2b2f64 100644 --- a/util/logic/src/main/scala/sbt/logic/Logic.scala +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -225,7 +225,7 @@ object Logic { if (newlyFalse.nonEmpty) newlyFalse else // should never happen due to the acyclic negation rule - error(s"No progress:\n\tclauses: $clauses\n\tpossibly true: $possiblyTrue") + sys.error(s"No progress:\n\tclauses: $clauses\n\tpossibly true: $possiblyTrue") } } diff --git a/util/logic/src/test/scala/sbt/logic/Test.scala b/util/logic/src/test/scala/sbt/logic/Test.scala index e66a3b9b2..a5277582c 100644 --- a/util/logic/src/test/scala/sbt/logic/Test.scala +++ b/util/logic/src/test/scala/sbt/logic/Test.scala @@ -19,7 +19,7 @@ object LogicTest extends Properties("Logic") { Logic.reduceAll(badClauses, 
Set()) match { case Right(res) => false case Left(err: Logic.CyclicNegation) => true - case Left(err) => error(s"Expected cyclic error, got: $err") + case Left(err) => sys.error(s"Expected cyclic error, got: $err") } ) @@ -27,7 +27,7 @@ object LogicTest extends Properties("Logic") { case Left(err) => false case Right(res) => val actual = res.provenSet - (actual == expected) || error(s"Expected to prove $expected, but actually proved $actual") + (actual == expected) || sys.error(s"Expected to prove $expected, but actually proved $actual") } } diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index 10c2460ad..0800e8b49 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -131,7 +131,7 @@ private abstract class AbstractProcessBuilder extends ProcessBuilder with SinkPa { val buffer = new StringBuffer val code = this ! BasicIO(buffer, log, withIn) - if (code == 0) buffer.toString else error("Nonzero exit value: " + code) + if (code == 0) buffer.toString else sys.error("Nonzero exit value: " + code) } def !! 
= getString(None, false) def !!(log: ProcessLogger) = getString(Some(log), false) @@ -190,7 +190,7 @@ private abstract class BasicProcess extends Process { private abstract class CompoundProcess extends BasicProcess { def destroy() { destroyer() } - def exitValue() = getExitValue().getOrElse(error("No exit code: process destroyed.")) + def exitValue() = getExitValue().getOrElse(sys.error("No exit code: process destroyed.")) def start() = getExitValue @@ -426,7 +426,7 @@ private object Streamed { def next(): Stream[T] = q.take match { case Left(0) => Stream.empty - case Left(code) => if (nonzeroException) error("Nonzero exit code: " + code) else Stream.empty + case Left(code) => if (nonzeroException) sys.error("Nonzero exit code: " + code) else Stream.empty case Right(s) => Stream.cons(s, next) } new Streamed((s: T) => q.put(Right(s)), code => q.put(Left(code)), () => next()) From 9c36fa7508643c885f75f3dfce779d20cecb746f Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 28 Oct 2014 10:21:41 -0400 Subject: [PATCH 443/823] Fix BC issue discovered in #1696. Def.derive has a new parameter, so we add an override which delegates down to the new method. 
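The binary-compatibility pattern described here can be sketched as follows (hypothetical `Derive` object with illustrative names, not sbt's actual `Init` trait): the old signature is retained, marked deprecated, and simply forwards to the new method that gained the extra parameter.

```scala
object Derive {
  // New method: gained a `default` parameter (with a default value).
  def derive(name: String, allowDynamic: Boolean, default: Boolean = false): String =
    if (default) name + " (default)" else name

  // Old signature, kept so previously compiled callers still link;
  // it delegates down to the new method.
  @deprecated("Use the version with the extra `default` parameter.", "0.13.7")
  def derive(name: String, allowDynamic: Boolean): String =
    derive(name, allowDynamic, false)
}
```

Calling the two-argument overload still compiles (with a deprecation warning) and behaves exactly like the new method with `default = false`.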
--- util/collection/src/main/scala/sbt/Settings.scala | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 9edc46ca7..bc4aca4ce 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -85,6 +85,10 @@ trait Init[Scope] { */ private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) + + @deprecated("0.13.7", "Use the version with default arguments and default parameter.") + final def derive[T](s: Setting[T], allowDynamic: Boolean, filter: Scope => Boolean, trigger: AttributeKey[_] => Boolean): Setting[T] = + derive(s, allowDynamic, filter, trigger, false) /** * Constructs a derived setting that will be automatically defined in every scope where one of its dependencies * is explicitly defined and where the scope matches `filter`. From f0a8f5d44fd7767a4cd978d6fa00f624a073c8eb Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 28 Oct 2014 16:44:23 -0400 Subject: [PATCH 444/823] Create a new API for calling Java toolchains. * Create a new sbt.compiler.javac package * Create new interfaces to control running `javac` and `javadoc` whether forked or local. * Ensure new interfaces make use of `xsbti.Reporter`. * Create new method on `xsbti.compiler.JavaCompiler` which takes a `xsbti.Reporter` * Create a new mechanism to parse (more accurately) Warnings + Errors, to distinguish the two. * Ensure older xsbti.Compiler implementations still succeed via catching NoSuchMethodError. * Feed new toolchain through sbt.actions.Compiler API via dirty hackery until we can break things in sbt 1.0 * Added a set of unit tests for parsing errors from Javac/Javadoc * Added a new integration test for hidden compilerReporter key, including testing threading of javac reports.
Fixes #875, Fixes #1542, Related #1178 could be looked into/cleaned up. --- .../main/java/xsbti/compile/JavaCompiler.java | 16 ++++++++++++++-- util/log/src/main/scala/sbt/Logger.scala | 1 + 2 files changed, 15 insertions(+), 2 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/JavaCompiler.java b/interface/src/main/java/xsbti/compile/JavaCompiler.java index ff6b83cc3..95f9fb992 100644 --- a/interface/src/main/java/xsbti/compile/JavaCompiler.java +++ b/interface/src/main/java/xsbti/compile/JavaCompiler.java @@ -2,6 +2,7 @@ package xsbti.compile; import java.io.File; import xsbti.Logger; +import xsbti.Reporter; /** * Interface to a Java compiler. @@ -9,6 +10,17 @@ import xsbti.Logger; public interface JavaCompiler { /** Compiles Java sources using the provided classpath, output directory, and additional options. - * Output should be sent to the provided logger.*/ + * Output should be sent to the provided logger. + * + * @deprecated 0.13.8 - Use compileWithReporter instead + */ void compile(File[] sources, File[] classpath, Output output, String[] options, Logger log); -} + + /** + * Compiles java sources using the provided classpath, output directory and additional options. + * + * Output should be sent to the provided logger. + * Failures should be passed to the provided Reporter. 
+ */ + void compileWithReporter(File[] sources, File[] classpath, Output output, String[] options, Reporter reporter, Logger log); +} \ No newline at end of file diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index c507484ce..3c3dd92e1 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -104,6 +104,7 @@ object Logger { val position = pos val message = msg val severity = sev + override def toString = s"[$severity] $pos: $message" } } From 7f8bbe01930351a69a9985c01a3b9716572f5c3e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 4 Nov 2014 17:48:46 -0500 Subject: [PATCH 445/823] scalariform --- util/collection/src/main/scala/sbt/Settings.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index bc4aca4ce..de7d9a8fb 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -85,7 +85,6 @@ trait Init[Scope] { */ private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) - @deprecated("0.13.7", "Use the version with default arguments and default parameter.") final def derive[T](s: Setting[T], allowDynamic: Boolean, filter: Scope => Boolean, trigger: AttributeKey[_] => Boolean): Setting[T] = derive(s, allowDynamic, filter, trigger, false) From f3d500029737501659391ce0291325960d4f5d36 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Tue, 18 Nov 2014 23:42:00 +0100 Subject: [PATCH 446/823] Remove trait `DependencyContext` in favor of enum Since `DependencyContext` is needed in the compiler interface subproject, it has to be defined in this same subproject. `DependencyContext` is needed in this subproject because the `AnalysisCallback` interface uses it.
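The enum's two values, and how the old boolean `publicInherited` flag maps onto them, can be sketched with a self-contained Scala analogue (the real `DependencyContext` is a Java enum in `xsbti`; this stand-in is for illustration only):

```scala
// Scala analogue of the xsbti.DependencyContext Java enum.
sealed trait DependencyContext
case object DependencyByMemberRef extends DependencyContext   // object Bar { def foo = Foo }
case object DependencyByInheritance extends DependencyContext // class B extends A

// The old callback API passed a boolean; the new one passes a context.
def contextOf(publicInherited: Boolean): DependencyContext =
  if (publicInherited) DependencyByInheritance else DependencyByMemberRef
```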
--- .../main/java/xsbti/DependencyContext.java | 22 +++++++++++++++++++ 1 file changed, 22 insertions(+) create mode 100644 interface/src/main/java/xsbti/DependencyContext.java diff --git a/interface/src/main/java/xsbti/DependencyContext.java b/interface/src/main/java/xsbti/DependencyContext.java new file mode 100644 index 000000000..15cfa76d1 --- /dev/null +++ b/interface/src/main/java/xsbti/DependencyContext.java @@ -0,0 +1,22 @@ +package xsbti; + +/** + * Enumeration of existing dependency contexts. + * Dependency contexts represent the various kinds of dependencies that + * can exist between symbols. + */ +public enum DependencyContext { + /** + * Represents a direct dependency between two symbols: + * object Foo + * object Bar { def foo = Foo } + */ + DependencyByMemberRef, + + /** + * Represents an inheritance dependency between two symbols: + * class A + * class B extends A + */ + DependencyByInheritance +} From 7bad42c6711621c6a8b55ff0ce6f20bddfe733ab Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Wed, 19 Nov 2014 08:52:52 +0100 Subject: [PATCH 447/823] Abstract over dependency context in Compile This commit completes the abstraction over dependency kinds in the incremental compiler, started with #1340. --- .../src/main/java/xsbti/AnalysisCallback.java | 20 +++++++++++++++++-- .../src/test/scala/xsbti/TestCallback.scala | 17 ++++++++++++---- 2 files changed, 31 insertions(+), 6 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 0e083d4eb..88b190e80 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -12,13 +12,29 @@ public interface AnalysisCallback * passed to this method. Dependencies on classes generated by sources not in the current compilation will * be passed as class dependencies to the classDependency method.
* If publicInherited is true, this dependency is a result of inheritance by a - * template accessible outside of the source file. */ + * template accessible outside of the source file. + * @deprecated Use `sourceDependency(File dependsOn, File source, DependencyContext context)` instead. */ + @Deprecated public void sourceDependency(File dependsOn, File source, boolean publicInherited); + /** Called to indicate that the source file source depends on the source file + dependsOn. Note that only source files included in the current compilation will + be passed to this method. Dependencies on classes generated by sources not in the current compilation will + be passed as class dependencies to the classDependency method. + context gives information about the context in which this dependency has been extracted. + See xsbti.DependencyContext for the list of existing dependency contexts. */ + public void sourceDependency(File dependsOn, File source, DependencyContext context); /** Called to indicate that the source file source depends on the top-level * class named name from class or jar file binary. * If publicInherited is true, this dependency is a result of inheritance by a - * template accessible outside of the source file. */ + * template accessible outside of the source file. + * @deprecated Use `binaryDependency(File binary, String name, File source, DependencyContext context)` instead. */ + @Deprecated public void binaryDependency(File binary, String name, File source, boolean publicInherited); + /** Called to indicate that the source file source depends on the top-level + class named name from class or jar file binary. + context gives information about the context in which this dependency has been extracted. + See xsbti.DependencyContext for the list of existing dependency contexts.
*/ + public void binaryDependency(File binary, String name, File source, DependencyContext context); /** Called to indicate that the source file source produces a class file at * module contain class name.*/ public void generatedClass(File source, File module, String name); diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala index 3ea7e32e1..13b65df79 100644 --- a/interface/src/test/scala/xsbti/TestCallback.scala +++ b/interface/src/test/scala/xsbti/TestCallback.scala @@ -3,17 +3,26 @@ package xsbti import java.io.File import scala.collection.mutable.ArrayBuffer import xsbti.api.SourceAPI +import xsbti.DependencyContext._ class TestCallback(override val nameHashing: Boolean = false) extends AnalysisCallback { - val sourceDependencies = new ArrayBuffer[(File, File, Boolean)] - val binaryDependencies = new ArrayBuffer[(File, String, File, Boolean)] + val sourceDependencies = new ArrayBuffer[(File, File, DependencyContext)] + val binaryDependencies = new ArrayBuffer[(File, String, File, DependencyContext)] val products = new ArrayBuffer[(File, File, String)] val usedNames = scala.collection.mutable.Map.empty[File, Set[String]].withDefaultValue(Set.empty) val apis: scala.collection.mutable.Map[File, SourceAPI] = scala.collection.mutable.Map.empty - def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { sourceDependencies += ((dependsOn, source, inherited)) } - def binaryDependency(binary: File, name: String, source: File, inherited: Boolean) { binaryDependencies += ((binary, name, source, inherited)) } + def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { + val context = if(inherited) DependencyByInheritance else DependencyByMemberRef + sourceDependency(dependsOn, source, context) + } + def sourceDependency(dependsOn: File, source: File, context: DependencyContext) { sourceDependencies += ((dependsOn, source, context)) } + def binaryDependency(binary: File, name: 
String, source: File, inherited: Boolean) { + val context = if(inherited) DependencyByInheritance else DependencyByMemberRef + binaryDependency(binary, name, source, context) + } + def binaryDependency(binary: File, name: String, source: File, context: DependencyContext) { binaryDependencies += ((binary, name, source, context)) } def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } def usedName(source: File, name: String) { usedNames(source) += name } From 4a42aa0027e21cb20211fc9a082da68af4476264 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 4 Nov 2014 11:08:50 -0500 Subject: [PATCH 448/823] Deprecating old APIs and attempting to document behavior correctly. * Removed as many binary incompatibilities as I could find. * Deprecating old APIs * Attempt to construct new nomenclature that fits the design of Incremental API. * Add as much documentation as I was comfortable writing (from my understanding of things). --- interface/src/main/java/xsbti/compile/GlobalsCache.java | 3 +++ 1 file changed, 3 insertions(+) diff --git a/interface/src/main/java/xsbti/compile/GlobalsCache.java b/interface/src/main/java/xsbti/compile/GlobalsCache.java index a3a412836..d9aa1c017 100644 --- a/interface/src/main/java/xsbti/compile/GlobalsCache.java +++ b/interface/src/main/java/xsbti/compile/GlobalsCache.java @@ -3,6 +3,9 @@ package xsbti.compile; import xsbti.Logger; import xsbti.Reporter; +/** + * An interface which lets us know how to retrieve cached compiler instances from the current JVM. + */ public interface GlobalsCache { public CachedCompiler apply(String[] args, Output output, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); From 8e81aabed075191aeac139283f43b8ed2b867e39 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Wed, 5 Nov 2014 14:31:26 -0500 Subject: [PATCH 449/823] First set of refactorings from review. * Split Java analyzing compile into its own class.
* MixedAnalyzingCompiler now only does the mixing * Start moving methods around to more-final locations * Static analyzingCompile method now constructs a MixedAnalyzingCompiler and delegates to incremental compile. --- .../src/main/java/xsbti/compile/CompileProgress.java | 5 +++++ .../main/java/xsbti/compile/IncrementalCompiler.java | 10 ++++++++++ 2 files changed, 15 insertions(+) diff --git a/interface/src/main/java/xsbti/compile/CompileProgress.java b/interface/src/main/java/xsbti/compile/CompileProgress.java index 902a50018..17174ff6a 100755 --- a/interface/src/main/java/xsbti/compile/CompileProgress.java +++ b/interface/src/main/java/xsbti/compile/CompileProgress.java @@ -1,5 +1,10 @@ package xsbti.compile; +/** + * An API for reporting when files are being compiled. + * + * Note: This is tied VERY SPECIFICALLY to Scala. + */ public interface CompileProgress { void startUnit(String phase, String unitPath); diff --git a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java index f2323111d..c98263e7f 100644 --- a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java +++ b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java @@ -44,8 +44,18 @@ public interface IncrementalCompiler * @param instance The Scala version to use * @param interfaceJar The compiler interface jar compiled for the Scala version being used * @param options Configures how arguments to the underlying Scala compiler will be built. + * */ + @Deprecated ScalaCompiler newScalaCompiler(ScalaInstance instance, File interfaceJar, ClasspathOptions options, Logger log); + /** + * Creates a compiler instance that can be used by the `compile` method. + * + * @param instance The Scala version to use + * @param interfaceJar The compiler interface jar compiled for the Scala version being used + * @param options Configures how arguments to the underlying Scala compiler will be built.
+ */ + ScalaCompiler newScalaCompiler(ScalaInstance instance, File interfaceJar, ClasspathOptions options); /** * Compiles the source interface for a Scala version. The resulting jar can then be used by the `newScalaCompiler` method From 07c3d51a0c109cc37fb3eba8d563807779e0fc31 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jean-R=C3=A9mi=20Desjardins?= Date: Wed, 3 Dec 2014 09:56:34 -0800 Subject: [PATCH 450/823] Minor code cleanup --- interface/src/main/java/xsbti/api/Modifiers.java | 4 ++-- .../src/main/java/xsbti/compile/CompileOrder.java | 2 +- util/collection/src/main/scala/sbt/AList.scala | 2 +- util/collection/src/main/scala/sbt/Settings.scala | 10 +++++----- util/collection/src/test/scala/SettingsTest.scala | 4 ++-- util/complete/src/main/scala/sbt/complete/Parser.scala | 8 ++++---- util/logic/src/main/scala/sbt/logic/Logic.scala | 2 +- util/process/src/main/scala/sbt/Process.scala | 6 +++--- util/process/src/test/scala/TestedProcess.scala | 2 +- 9 files changed, 20 insertions(+), 20 deletions(-) diff --git a/interface/src/main/java/xsbti/api/Modifiers.java b/interface/src/main/java/xsbti/api/Modifiers.java index 78fa13901..5e103c7ec 100644 --- a/interface/src/main/java/xsbti/api/Modifiers.java +++ b/interface/src/main/java/xsbti/api/Modifiers.java @@ -10,7 +10,7 @@ public final class Modifiers implements java.io.Serializable private static final int LazyBit = 5; private static final int MacroBit = 6; - private static final int flag(boolean set, int bit) + private static int flag(boolean set, int bit) { return set ? 
(1 << bit) : 0; } @@ -30,7 +30,7 @@ public final class Modifiers implements java.io.Serializable private final byte flags; - private final boolean flag(int bit) + private boolean flag(int bit) { return (flags & (1 << bit)) != 0; } diff --git a/interface/src/main/java/xsbti/compile/CompileOrder.java b/interface/src/main/java/xsbti/compile/CompileOrder.java index 62b15bf1f..5683f75d9 100644 --- a/interface/src/main/java/xsbti/compile/CompileOrder.java +++ b/interface/src/main/java/xsbti/compile/CompileOrder.java @@ -30,5 +30,5 @@ public enum CompileOrder * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. * The Scala classes compiled in the first step are included on the classpath to the Java compiler. */ - ScalaThenJava; + ScalaThenJava } \ No newline at end of file diff --git a/util/collection/src/main/scala/sbt/AList.scala b/util/collection/src/main/scala/sbt/AList.scala index 10e1454e7..24368219b 100644 --- a/util/collection/src/main/scala/sbt/AList.scala +++ b/util/collection/src/main/scala/sbt/AList.scala @@ -46,7 +46,7 @@ object AList { def traverse[M[_], N[_], P[_]](s: List[M[T]], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[List[P[T]]] = ??? } - /** AList for the abitrary arity data structure KList. */ + /** AList for the arbitrary arity data structure KList. 
*/ def klist[KL[M[_]] <: KList[M] { type Transform[N[_]] = KL[N] }]: AList[KL] = new AList[KL] { def transform[M[_], N[_]](k: KL[M], f: M ~> N) = k.transform(f) def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index de7d9a8fb..a8e5b1d6c 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -24,14 +24,14 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = delegates(scope).toStream.flatMap(sc => getDirect(sc, key)).headOption def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] = - delegates(scope).toStream.filter(sc => getDirect(sc, key).isDefined).headOption + delegates(scope).toStream.find(sc => getDirect(sc, key).isDefined) def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] = (data get scope).flatMap(_ get key) def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] = { - val map = (data get scope) getOrElse AttributeMap.empty + val map = data getOrElse(scope, AttributeMap.empty) val newData = data.updated(scope, map.put(key, value)) new Settings0(newData, delegates) } @@ -85,7 +85,7 @@ trait Init[Scope] { */ private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) - @deprecated("0.13.7", "Use the version with default arguments and default paramter.") + @deprecated("0.13.7", "Use the version with default arguments and default parameter.") final def derive[T](s: Setting[T], allowDynamic: Boolean, filter: Scope => Boolean, trigger: AttributeKey[_] => Boolean): Setting[T] = derive(s, allowDynamic, filter, trigger, false) /** @@ -258,7 +258,7 @@ trait Init[Scope] { def Undefined(defining: Setting[_], referencedKey: ScopedKey[_]): Undefined = 
new Undefined(defining, referencedKey) def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized = { - assert(!keys.isEmpty) + assert(keys.nonEmpty) val suffix = if (keys.length > 1) "s" else "" val prefix = if (runtime) "Runtime reference" else "Reference" val keysString = keys.map(u => showUndefined(u, validKeys, delegates)).mkString("\n\n ", "\n\n ", "") @@ -487,7 +487,7 @@ trait Init[Scope] { override def default(_id: => Long): DefaultSetting[T] = new DerivedSetting[T](sk, i, p, filter, trigger) with DefaultSetting[T] { val id = _id } override def toString = "derived " + super.toString } - // Only keep the first occurence of this setting and move it to the front so that it has lower precedence than non-defaults. + // Only keep the first occurrence of this setting and move it to the front so that it has lower precedence than non-defaults. // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. private[Init] sealed trait DefaultSetting[T] extends Setting[T] { val id: Long diff --git a/util/collection/src/test/scala/SettingsTest.scala b/util/collection/src/test/scala/SettingsTest.scala index dbad035c6..d97b1056a 100644 --- a/util/collection/src/test/scala/SettingsTest.scala +++ b/util/collection/src/test/scala/SettingsTest.scala @@ -42,10 +42,10 @@ object SettingsTest extends Properties("settings") { { val genScopedKeys = { // We wan - // t to generate lists of keys that DO NOT inclue the "ch" key we use to check thigns. + // t to generate lists of keys that DO NOT inclue the "ch" key we use to check things. 
val attrKeys = mkAttrKeys[Int](nr).filter(_.forall(_.label != "ch")) attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) - }.label("scopedKeys").filter(!_.isEmpty) + }.label("scopedKeys").filter(_.nonEmpty) forAll(genScopedKeys) { scopedKeys => try { // Note; It's evil to grab last IF you haven't verified the set can't be empty. diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index 393501792..c52d16b91 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -187,7 +187,7 @@ object Parser extends ParserMain { @deprecated("This method is deprecated and will be removed in the next major version. Use the parser directly to check for invalid completions.", since = "0.13.2") def checkMatches(a: Parser[_], completions: Seq[String]) { val bad = completions.filter(apply(a)(_).resultEmpty.isFailure) - if (!bad.isEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) + if (bad.nonEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) } def tuple[A, B](a: Option[A], b: Option[B]): Option[(A, B)] = @@ -378,7 +378,7 @@ trait ParserMain { def unapply[A, B](t: (A, B)): Some[(A, B)] = Some(t) } - /** Parses input `str` using `parser`. If successful, the result is provided wrapped in `Right`. If unsuccesful, an error message is provided in `Left`.*/ + /** Parses input `str` using `parser`. If successful, the result is provided wrapped in `Right`. 
If unsuccessful, an error message is provided in `Left`.*/ def parse[T](str: String, parser: Parser[T]): Either[String, T] = Parser.result(parser, str).left.map { failures => val (msgs, pos) = failures() @@ -480,7 +480,7 @@ trait ParserMain { def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] = t match { - case i: Invalid => if (partial && !seen.isEmpty) success(seen.mkString) else i + case i: Invalid => if (partial && seen.nonEmpty) success(seen.mkString) else i case _ => if (t.result.isEmpty) new MatchedString(t, seen, partial) @@ -634,7 +634,7 @@ private final class HetParser[A, B](a: Parser[A], b: Parser[B]) extends ValidPar override def toString = "(" + a + " || " + b + ")" } private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] { - assert(!a.isEmpty) + assert(a.nonEmpty) lazy val resultEmpty: Result[Seq[T]] = { val res = a.map(_.resultEmpty) diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala b/util/logic/src/main/scala/sbt/logic/Logic.scala index 72f2b2f64..856394251 100644 --- a/util/logic/src/main/scala/sbt/logic/Logic.scala +++ b/util/logic/src/main/scala/sbt/logic/Logic.scala @@ -277,7 +277,7 @@ object Logic { } /** Represents the set of atoms in the heads of clauses and in the bodies (formulas) of clauses. */ - final case class Atoms(val inHead: Set[Atom], val inFormula: Set[Atom]) { + final case class Atoms(inHead: Set[Atom], inFormula: Set[Atom]) { /** Concatenates this with `as`. */ def ++(as: Atoms): Atoms = Atoms(inHead ++ as.inHead, inFormula ++ as.inFormula) /** Atoms that cannot be true because they do not occur in a head. 
*/ diff --git a/util/process/src/main/scala/sbt/Process.scala b/util/process/src/main/scala/sbt/Process.scala index 66b7e03c6..79435367d 100644 --- a/util/process/src/main/scala/sbt/Process.scala +++ b/util/process/src/main/scala/sbt/Process.scala @@ -34,7 +34,7 @@ object Process extends ProcessExtra { /** create ProcessBuilder with working dir set to File and extra environment variables */ def apply(command: Seq[String], cwd: File, extraEnv: (String, String)*): ProcessBuilder = apply(command, Some(cwd), extraEnv: _*) - /** create ProcessBuilder with working dir optionaly set to File and extra environment variables */ + /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ def apply(command: String, cwd: Option[File], extraEnv: (String, String)*): ProcessBuilder = { apply(command.split("""\s+"""), cwd, extraEnv: _*) // not smart to use this on windows, because CommandParser uses \ to escape ". @@ -43,7 +43,7 @@ object Process extends ProcessExtra { case Right((cmd, args)) => apply(cmd :: args, cwd, extraEnv : _*) }*/ } - /** create ProcessBuilder with working dir optionaly set to File and extra environment variables */ + /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ def apply(command: Seq[String], cwd: Option[File], extraEnv: (String, String)*): ProcessBuilder = { val jpb = new JProcessBuilder(command.toArray: _*) cwd.foreach(jpb directory _) @@ -63,7 +63,7 @@ object Process extends ProcessExtra { def cat(file: SourcePartialBuilder, files: SourcePartialBuilder*): ProcessBuilder = cat(file :: files.toList) def cat(files: Seq[SourcePartialBuilder]): ProcessBuilder = { - require(!files.isEmpty) + require(files.nonEmpty) files.map(_.cat).reduceLeft(_ #&& _) } } diff --git a/util/process/src/test/scala/TestedProcess.scala b/util/process/src/test/scala/TestedProcess.scala index 5daea8bab..f83207a07 100644 --- a/util/process/src/test/scala/TestedProcess.scala +++ 
b/util/process/src/test/scala/TestedProcess.scala @@ -16,7 +16,7 @@ object cat { catFiles(args.toList) System.exit(0) } catch { - case e => + case e: Throwable => e.printStackTrace() System.err.println("Error: " + e.toString) System.exit(1) From 25a91c161cc71fba57b2ae716abf38d209bb8462 Mon Sep 17 00:00:00 2001 From: Derek Wickern Date: Thu, 11 Dec 2014 08:17:21 -0800 Subject: [PATCH 451/823] Fix logger not overwriting the previous line in alternate shells When running the 'update' task in bash, the output is all collapsed onto one line. On Windows, even using an ANSI capable shell, running 'update' spams the console. Tested with dash in Ubuntu; ANSICON, Console2 and ConsoleZ in Windows. --- util/log/src/main/scala/sbt/ConsoleOut.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/log/src/main/scala/sbt/ConsoleOut.scala b/util/log/src/main/scala/sbt/ConsoleOut.scala index 41367757b..3ce0d8cdf 100644 --- a/util/log/src/main/scala/sbt/ConsoleOut.scala +++ b/util/log/src/main/scala/sbt/ConsoleOut.scala @@ -16,7 +16,7 @@ object ConsoleOut { cur.contains(s) && prev.contains(s) /** Move to beginning of previous line and clear the line. */ - private[this] final val OverwriteLine = "\r\u001BM\u001B[2K" + private[this] final val OverwriteLine = "\u001B[A\r\u001B[2K" /** * ConsoleOut instance that is backed by System.out. 
It overwrites the previously printed line From d16297615f6c13f7089fcedbd2e0bdbc088e1f2e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 17 Dec 2014 23:38:10 -0500 Subject: [PATCH 452/823] Multi-project build.sbt --- build.sbt | 511 ++++++++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 511 insertions(+) create mode 100644 build.sbt diff --git a/build.sbt b/build.sbt new file mode 100644 index 000000000..bd8776e3d --- /dev/null +++ b/build.sbt @@ -0,0 +1,511 @@ +import Project.Initialize +import Util._ +import Common._ +import Licensed._ +import Scope.ThisScope +import LaunchProguard.{ proguard, Proguard } +import Scripted._ +import StringUtilities.normalize +import Sxr.sxr + +def commonSettings: Seq[Setting[_]] = Seq( + organization := "org.scala-sbt", + version := "0.13.8-SNAPSHOT", + publishArtifact in packageDoc := false, + scalaVersion := "2.10.4", + publishMavenStyle := false, + componentID := None, + crossPaths := false, + resolvers += Resolver.typesafeIvyRepo("releases"), + concurrentRestrictions in Global += Util.testExclusiveRestriction, + testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), + javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), + incOptions := incOptions.value.withNameHashing(true) +) + +//override lazy val settings = super.settings ++ buildSettings ++ Status.settings ++ nightlySettings +def minimalSettings: Seq[Setting[_]] = + commonSettings ++ customCommands ++ Status.settings ++ nightlySettings ++ + Seq( + crossVersion in update <<= (crossVersion, nightly211) { (cv, n) => if (n) CrossVersion.full else cv }, + resolvers += Resolver.typesafeIvyRepo("releases") + ) + +def baseSettings: Seq[Setting[_]] = + minimalSettings ++ Seq(projectComponent) ++ baseScalacOptions ++ Licensed.settings ++ Formatting.settings + +def testedBaseSettings: Seq[Setting[_]] = + baseSettings ++ testDependencies + +lazy val root: Project = (project in file(".")). 
+ configs(Sxr.sxrConf, Proguard). + aggregate(nonRoots: _*). + settings(minimalSettings ++ rootSettings: _*) + +/* ** Projproject declarations ** */ + +// defines the Java interfaces through which the launcher and the launched application communicate +lazy val launchInterfaceProj = (project in launchPath / "interface"). + settings(minimalSettings ++ javaOnlySettings: _*). + settings( + name := "Launcher Interface" + ) + +// the launcher. Retrieves, loads, and runs applications based on a configuration file. +lazy val launchProj = (project in launchPath). + dependsOn(ioProj % "test->test", interfaceProj % Test, launchInterfaceProj). + settings(testedBaseSettings: _*). + settings( + name := "Launcher", + ivy, + compile in Test <<= compile in Test dependsOn (publishLocal in interfaceProj, publishLocal in testSamples, publishLocal in launchInterfaceProj) + ). + settings(inConfig(Compile)(Transform.configSettings): _*). + settings(inConfig(Compile)(Transform.transSourceSettings ++ Seq( + Transform.inputSourceDirectory <<= (sourceDirectory in crossProj) / "input_sources", + Transform.sourceProperties := Map("cross.package0" -> "xsbt", "cross.package1" -> "boot") + )): _*) + +// used to test the retrieving and loading of an application: sample app is packaged and published to the local repository +lazy val testSamples = (project in launchPath / "test-sample"). + dependsOn(interfaceProj, launchInterfaceProj). + settings(baseSettings ++ noPublishSettings: _*). + settings(scalaCompiler) + +// defines Java structures used across Scala versions, such as the API structures and relationships extracted by +// the analysis compiler phases and passed back to sbt. The API structures are defined in a simple +// format from which Java sources are generated by the datatype generator Projproject +lazy val interfaceProj = (project in file("interface")). + settings(minimalSettings ++ javaOnlySettings: _*). 
+ settings( + name := "Interface", + projectComponent, + exportJars := true, + componentID := Some("xsbti"), + watchSources <++= apiDefinitions, + resourceGenerators in Compile <+= (version, resourceManaged, streams, compile in Compile) map generateVersionFile, + apiDefinitions <<= baseDirectory map { base => (base / "definition") :: (base / "other") :: (base / "type") :: Nil }, + sourceGenerators in Compile <+= (cacheDirectory, apiDefinitions, fullClasspath in Compile in datatypeProj, sourceManaged in Compile, mainClass in datatypeProj in Compile, runner, streams) map generateAPICached + ) + +// defines operations on the API of a source, including determining whether it has changed and converting it to a string +// and discovery of Projclasses and annotations +lazy val apiProj = (project in compilePath / "api"). + dependsOn(interfaceProj). + settings(testedBaseSettings: _*). + settings( + name := "API" + ) + +/* **** Utilities **** */ + +lazy val controlProj = (project in utilPath / "control"). + settings(baseSettings ++ Util.crossBuild: _*). + settings( + name := "Control" + ) + +lazy val collectionProj = (project in utilPath / "collection"). + settings(testedBaseSettings ++ Util.keywordsSettings ++ Util.crossBuild: _*). + settings( + name := "Collections" + ) + +lazy val applyMacroProj = (project in utilPath / "appmacro"). + dependsOn(collectionProj). + settings(testedBaseSettings: _*). + settings( + name := "Apply Macro", + scalaCompiler + ) + +// The API for forking, combining, and doing I/O with system processes +lazy val processProj = (project in utilPath / "process"). + dependsOn(ioProj % "test->test"). + settings(baseSettings: _*). + settings( + name := "Process", + scalaXml + ) + +// Path, IO (formerly FileUtilities), NameFilter and other I/O utility classes +lazy val ioProj = (project in utilPath / "io"). + dependsOn(controlProj). + settings(testedBaseSettings ++ Util.crossBuild: _*). 
+ settings( + name := "IO", + libraryDependencies += { "org.scala-lang" % "scala-compiler" % scalaVersion.value % Test } + ) + +// Utilities related to reflection, managing Scala versions, and custom class loaders +lazy val classpathProj = (project in utilPath / "classpath"). + dependsOn(launchInterfaceProj, interfaceProj, ioProj). + settings(testedBaseSettings: _*). + settings( + name := "Classpath", + scalaCompiler + ) + +// Command line-related utilities. +lazy val completeProj = (project in utilPath / "complete"). + dependsOn(collectionProj, controlProj, ioProj). + settings(testedBaseSettings ++ Util.crossBuild: _*). + settings( + name := "Completion", + jline + ) + +// logging +lazy val logProj = (project in utilPath / "log"). + dependsOn(interfaceProj, processProj). + settings(testedBaseSettings: _*). + settings( + name := "Logging", + jline + ) + +// Relation +lazy val relationProj = (project in utilPath / "relation"). + dependsOn(interfaceProj, processProj). + settings(testedBaseSettings: _*). + settings( + name := "Relation" + ) + +// class file reader and analyzer +lazy val classfileProj = (project in utilPath / "classfile"). + dependsOn(ioProj, interfaceProj, logProj). + settings(testedBaseSettings: _*). + settings( + name := "Classfile" + ) + +// generates immutable or mutable Java data types according to a simple input format +lazy val datatypeProj = (project in utilPath / "datatype"). + dependsOn(ioProj). + settings(baseSettings: _*). + settings( + name := "Datatype Generator" + ) + +// cross versioning +lazy val crossProj = (project in utilPath / "cross"). + settings(baseSettings: _*). + settings(inConfig(Compile)(Transform.crossGenSettings): _*). + settings( + name := "Cross" + ) + +// A logic with restricted negation as failure for a unique, stable model +lazy val logicProj = (project in utilPath / "logic"). + dependsOn(collectionProj, relationProj). + settings(testedBaseSettings: _*). 
+ settings( + name := "Logic" + ) + +/* **** Intermediate-level Modules **** */ + +// Apache Ivy integration +lazy val ivyProj = (project in file("ivy")). + dependsOn(interfaceProj, launchInterfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", launchProj % "test->test", collectionProj). + settings(baseSettings: _*). + settings( + name := "Ivy", + ivy, jsch, testExclusive, json4sNative, jawnParser, jawnJson4s) + +// Runner for uniform test interface +lazy val testingProj = (project in file("testing")). + dependsOn(ioProj, classpathProj, logProj, launchInterfaceProj, testAgentProj). + settings(baseSettings: _*). + settings( + name := "Testing", + testInterface + ) + +// Testing agent for running tests in a separate process. +lazy val testAgentProj = (project in file("testing") / "agent"). + settings(minimalSettings: _*). + settings( + name := "Test Agent", + testInterface + ) + +// Basic task engine +lazy val taskProj = (project in tasksPath). + dependsOn(controlProj, collectionProj). + settings(testedBaseSettings: _*). + settings( + name := "Tasks" + ) + +// Standard task system. This provides map, flatMap, join, and more on top of the basic task model. +lazy val stdTaskProj = (project in tasksPath / "standard"). + dependsOn (taskProj % "compile;test->test", collectionProj, logProj, ioProj, processProj). + settings(testedBaseSettings: _*). + settings( + name := "Task System", + testExclusive + ) + +// Persisted caching based on SBinary +lazy val cacheProj = (project in cachePath). + dependsOn (ioProj, collectionProj). + settings(baseSettings: _*). + settings( + name := "Cache", + sbinary, scalaXml + ) + +// baseProject(cachePath, "Cache") dependsOn (ioProj, collectionProj) settings (sbinary, scalaXml) + +// Builds on cache to provide caching for filesystem-related operations +lazy val trackingProj = (project in cachePath / "tracking"). + dependsOn(cacheProj, ioProj). + settings(baseSettings: _*). 
+ settings( + name := "Tracking" + ) + +// Embedded Scala code runner +lazy val runProj = (project in file("run")). + dependsOn (ioProj, logProj % "compile;test->test", classpathProj, processProj % "compile;test->test"). + settings(testedBaseSettings: _*). + settings( + name := "Run" + ) + +// Compiler-side interface to compiler that is compiled against the compiler being used either in advance or on the fly. +// Includes API and Analyzer phases that extract source API and relationships. +lazy val compileInterfaceProj = (project in compilePath / "interface"). + dependsOn(interfaceProj % "compile;test->test", ioProj % "test->test", logProj % "test->test", launchProj % "test->test", apiProj % "test->test"). + settings(baseSettings ++ precompiledSettings: _*). + settings( + name := "Compiler Interface", + exportJars := true, + // we need to fork because in unit tests we set usejavacp = true which means + // we are expecting all of our dependencies to be on classpath so Scala compiler + // can use them while constructing its own classpath for compilation + fork in Test := true, + // needed because we fork tests and tests are ran in parallel so we have multiple Scala + // compiler instances that are memory hungry + javaOptions in Test += "-Xmx1G", + artifact in (Compile, packageSrc) := Artifact(srcID).copy(configurations = Compile :: Nil).extra("e:component" -> srcID) + ) + +lazy val precompiled282 = precompiled("2.8.2") +lazy val precompiled292 = precompiled("2.9.2") +lazy val precompiled293 = precompiled("2.9.3") + +// Implements the core functionality of detecting and propagating changes incrementally. +// Defines the data structures for representing file fingerprints and relationships and the overall source analysis +lazy val compileIncrementalProj = (project in compilePath / "inc"). + dependsOn (apiProj, ioProj, logProj, classpathProj, relationProj). + settings(testedBaseSettings: _*). 
+ settings( + name := "Incremental Compiler" + ) + +// Persists the incremental data structures using SBinary +lazy val compilePersistProj = (project in compilePath / "persist"). + dependsOn(compileIncrementalProj, apiProj, compileIncrementalProj % "test->test"). + settings(testedBaseSettings: _*). + settings( + name := "Persist", + sbinary + ) + +// sbt-side interface to compiler. Calls compiler-side interface reflectively +lazy val compilerProj = (project in compilePath). + dependsOn(launchInterfaceProj, interfaceProj % "compile;test->test", logProj, ioProj, classpathProj, apiProj, classfileProj, + logProj % "test->test", launchProj % "test->test"). + settings(testedBaseSettings: _*). + settings( + name := "Compile", + libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-compiler" % _ % Test), + unmanagedJars in Test <<= (packageSrc in compileInterfaceProj in Compile).map(x => Seq(x).classpath) + ) + +lazy val compilerIntegrationProj = (project in (compilePath / "integration")). + dependsOn(compileIncrementalProj, compilerProj, compilePersistProj, apiProj, classfileProj). + settings(baseSettings: _*). + settings( + name := "Compiler Integration" + ) + +lazy val compilerIvyProj = (project in compilePath / "ivy"). + dependsOn (ivyProj, compilerProj). + settings(baseSettings: _*). + settings( + name := "Compiler Ivy Integration" + ) + +lazy val scriptedBaseProj = (project in scriptedPath / "base"). + dependsOn (ioProj, processProj). + settings(testedBaseSettings: _*). + settings( + name := "Scripted Framework", + scalaParsers + ) + +lazy val scriptedSbtProj = (project in scriptedPath / "sbt"). + dependsOn (ioProj, logProj, processProj, scriptedBaseProj, launchInterfaceProj % "provided"). + settings(baseSettings: _*). + settings( + name := "Scripted sbt" + ) + +lazy val scriptedPluginProj = (project in scriptedPath / "plugin"). + dependsOn (sbtProj, classpathProj). + settings(baseSettings: _*). 
+ settings( + name := "Scripted Plugin" + ) + +// Implementation and support code for defining actions. +lazy val actionsProj = (project in mainPath / "actions"). + dependsOn (classpathProj, completeProj, apiProj, compilerIntegrationProj, compilerIvyProj, + interfaceProj, ioProj, ivyProj, logProj, processProj, runProj, relationProj, stdTaskProj, + taskProj, trackingProj, testingProj). + settings(testedBaseSettings: _*). + settings( + name := "Actions" + ) + +// General command support and core commands not specific to a build system +lazy val commandProj = (project in mainPath / "command"). + dependsOn(interfaceProj, ioProj, launchInterfaceProj, logProj, completeProj, classpathProj, crossProj). + settings(testedBaseSettings: _*). + settings( + name := "Command" + ) + +// Fixes scope=Scope for Setting (core defined in collectionProj) to define the settings system used in build definitions +lazy val mainSettingsProj = (project in mainPath / "settings"). + dependsOn (applyMacroProj, interfaceProj, ivyProj, relationProj, logProj, ioProj, commandProj, + completeProj, classpathProj, stdTaskProj, processProj). + settings(testedBaseSettings: _*). + settings( + name := "Main Settings", + sbinary + ) + +// The main integration project for sbt. It brings all of the Projsystems together, configures them, and provides for overriding conventions. +lazy val mainProj = (project in mainPath). + dependsOn (actionsProj, mainSettingsProj, interfaceProj, ioProj, ivyProj, launchInterfaceProj, logProj, logicProj, processProj, runProj, commandProj). + settings(testedBaseSettings: _*). 
+ settings( + name := "Main", + scalaXml + ) + +// Strictly for bringing implicits and aliases from subsystems into the top-level sbt namespace through a single package object +// technically, we need a dependency on all of mainProj's dependencies, but we don't do that since this is strictly an integration project +// with the sole purpose of providing certain identifiers without qualification (with a package object) +lazy val sbtProj = (project in sbtPath). + dependsOn(mainProj, compileInterfaceProj, precompiled282, precompiled292, precompiled293, scriptedSbtProj % "test->test"). + settings(baseSettings: _*). + settings( + name := "sbt", + normalizedName := "sbt" + ) + +def scriptedTask: Initialize[InputTask[Unit]] = InputTask(scriptedSource(dir => (s: State) => scriptedParser(dir))) { result => + (proguard in Proguard, fullClasspath in scriptedSbtProj in Test, scalaInstance in scriptedSbtProj, publishAll, scriptedSource, result) map { + (launcher, scriptedSbtClasspath, scriptedSbtInstance, _, sourcePath, args) => + doScripted(launcher, scriptedSbtClasspath, scriptedSbtInstance, sourcePath, args) + } +} + +def scriptedUnpublishedTask: Initialize[InputTask[Unit]] = InputTask(scriptedSource(dir => (s: State) => scriptedParser(dir))) { result => + (proguard in Proguard, fullClasspath in scriptedSbtProj in Test, scalaInstance in scriptedSbtProj, scriptedSource, result) map doScripted +} + +lazy val publishAll = TaskKey[Unit]("publish-all") +lazy val publishLauncher = TaskKey[Unit]("publish-launcher") + +lazy val myProvided = config("provided") intransitive + +def allProjects = Seq(launchInterfaceProj, launchProj, testSamples, interfaceProj, apiProj, + controlProj, collectionProj, applyMacroProj, processProj, ioProj, classpathProj, completeProj, + logProj, relationProj, classfileProj, datatypeProj, crossProj, logicProj, ivyProj, + testingProj, testAgentProj, taskProj, stdTaskProj, cacheProj, trackingProj, runProj, + compileInterfaceProj, compileIncrementalProj, 
compilePersistProj, compilerProj, + compilerIntegrationProj, compilerIvyProj, + scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, + actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj) +def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided)) +lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) + +def deepTasks[T](scoped: TaskKey[Seq[T]]): Initialize[Task[Seq[T]]] = deep(scoped.task) { _.join.map(_.flatten.distinct) } +def deep[T](scoped: SettingKey[T]): Initialize[Seq[T]] = + Util.inAllProjects(projectsWithMyProvided filterNot Set(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj) map { p => + LocalProject(p.id) }, scoped) + +def releaseSettings = Release.settings(nonRoots, proguard in Proguard) +def rootSettings = releaseSettings ++ fullDocSettings ++ LaunchProguard.settings ++ LaunchProguard.specific(launchProj) ++ + Util.publishPomSettings ++ otherRootSettings ++ proguardedLauncherSettings ++ Formatting.sbtFilesSettings ++ + Transform.conscriptSettings(launchProj) +def otherRootSettings = Seq( + Scripted.scripted <<= scriptedTask, + Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, + Scripted.scriptedSource <<= (sourceDirectory in sbtProj) / "sbt-test", + publishAll <<= inAll(nonRoots, publishLocal.task), + publishAll <<= (publishAll, publishLocal).map((x, y) => ()) // publish all normal deps as well as the sbt-launch jar +) +def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq( + scalacOptions += "-Ymacro-no-expand", // for both sxr and doc + sources in sxr <<= deepTasks(sources in Compile), //sxr + sources in (Compile, doc) <<= sources in sxr, // doc + Sxr.sourceDirectories <<= deep(sourceDirectories in Compile).map(_.flatten), // to properly relativize the source paths + fullClasspath in sxr <<= (externalDependencyClasspath in Compile in sbtProj), + dependencyClasspath in (Compile, doc) <<= 
fullClasspath in sxr
+)
+
+// the launcher is published with metadata so that the scripted plugin can pull it in
+// being proguarded, it shouldn't ever be on a classpath with other jars, however
+def proguardedLauncherSettings = Seq(
+  publishArtifact in packageSrc := false,
+  moduleName := "sbt-launch",
+  autoScalaLibrary := false,
+  description := "sbt application launcher",
+  publishLauncher <<= Release.deployLauncher,
+  packageBin in Compile <<= proguard in Proguard
+)
+
+/* Nested subproject paths */
+def sbtPath = file("sbt")
+def cachePath = file("cache")
+def tasksPath = file("tasks")
+def launchPath = file("launch")
+def utilPath = file("util")
+def compilePath = file("compile")
+def mainPath = file("main")
+
+def precompiledSettings = Seq(
+  artifact in packageBin <<= (appConfiguration, scalaVersion) { (app, sv) =>
+    val launcher = app.provider.scalaProvider.launcher
+    val bincID = binID + "_" + ScalaInstance(sv, launcher).actualVersion
+    Artifact(binID) extra ("e:component" -> bincID)
+  },
+  target <<= (target, scalaVersion) { (base, sv) => base / ("precompiled_" + sv) },
+  scalacOptions := Nil,
+  ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) },
+  exportedProducts in Compile := Nil,
+  libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-compiler" % _ % "provided")
+)
+
+def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " + scalav.replace('.', '_')), base = compilePath / "interface").
+  dependsOn(interfaceProj).
+  settings(baseSettings ++ precompiledSettings: _*).
+  settings(
+    name := "Precompiled " + scalav.replace('.', '_'),
+    scalaHome := None,
+    scalaVersion <<= (scalaVersion in ThisBuild) { sbtScalaV =>
+      assert(sbtScalaV != scalav, "Precompiled compiler interface cannot have the same Scala version (" + scalav + ") as sbt.")
+      scalav
+    },
+    // we disable compiling and running tests in precompiled subprojects of compiler interface
+    // so we do not need to worry about cross-versioning testing dependencies
+    sources in Test := Nil
+  )

From b674b462c27ee04d5d9ab470345f4afb18182e53 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Thu, 18 Dec 2014 07:57:05 -0500
Subject: [PATCH 453/823] Factor out dependencies

---
 build.sbt | 43 ++++++++++++++++++++++---------------------
 1 file changed, 22 insertions(+), 21 deletions(-)

diff --git a/build.sbt b/build.sbt
index bd8776e3d..0462159ad 100644
--- a/build.sbt
+++ b/build.sbt
@@ -1,6 +1,6 @@
 import Project.Initialize
 import Util._
-import Common._
+import Dependencies._
 import Licensed._
 import Scope.ThisScope
 import LaunchProguard.{ proguard, Proguard }
@@ -23,7 +23,6 @@ def commonSettings: Seq[Setting[_]] = Seq(
   incOptions := incOptions.value.withNameHashing(true)
 )
 
-//override lazy val settings = super.settings ++ buildSettings ++ Status.settings ++ nightlySettings
 def minimalSettings: Seq[Setting[_]] =
   commonSettings ++ customCommands ++ Status.settings ++ nightlySettings ++
   Seq(
@@ -57,7 +56,7 @@ lazy val launchProj = (project in launchPath).
   settings(testedBaseSettings: _*).
   settings(
     name := "Launcher",
-    ivy,
+    libraryDependencies += ivy,
     compile in Test <<= compile in Test dependsOn (publishLocal in interfaceProj, publishLocal in testSamples, publishLocal in launchInterfaceProj)
   ).
   settings(inConfig(Compile)(Transform.configSettings): _*).
@@ -70,7 +69,10 @@ lazy val launchProj = (project in launchPath).
 lazy val testSamples = (project in launchPath / "test-sample").
   dependsOn(interfaceProj, launchInterfaceProj).
   settings(baseSettings ++ noPublishSettings: _*).
- settings(scalaCompiler) + settings( + name := "Test Sample", + libraryDependencies += scalaCompiler.value + ) // defines Java structures used across Scala versions, such as the API structures and relationships extracted by // the analysis compiler phases and passed back to sbt. The API structures are defined in a simple @@ -116,7 +118,7 @@ lazy val applyMacroProj = (project in utilPath / "appmacro"). settings(testedBaseSettings: _*). settings( name := "Apply Macro", - scalaCompiler + libraryDependencies += scalaCompiler.value ) // The API for forking, combining, and doing I/O with system processes @@ -125,7 +127,7 @@ lazy val processProj = (project in utilPath / "process"). settings(baseSettings: _*). settings( name := "Process", - scalaXml + libraryDependencies ++= scalaXml.value ) // Path, IO (formerly FileUtilities), NameFilter and other I/O utility classes @@ -143,7 +145,7 @@ lazy val classpathProj = (project in utilPath / "classpath"). settings(testedBaseSettings: _*). settings( name := "Classpath", - scalaCompiler + libraryDependencies += scalaCompiler.value ) // Command line-related utilities. @@ -152,7 +154,7 @@ lazy val completeProj = (project in utilPath / "complete"). settings(testedBaseSettings ++ Util.crossBuild: _*). settings( name := "Completion", - jline + libraryDependencies += jline ) // logging @@ -161,7 +163,7 @@ lazy val logProj = (project in utilPath / "log"). settings(testedBaseSettings: _*). settings( name := "Logging", - jline + libraryDependencies += jline ) // Relation @@ -212,7 +214,8 @@ lazy val ivyProj = (project in file("ivy")). settings(baseSettings: _*). settings( name := "Ivy", - ivy, jsch, testExclusive, json4sNative, jawnParser, jawnJson4s) + libraryDependencies ++= Seq(ivy, jsch, json4sNative, jawnParser, jawnJson4s), + testExclusive) // Runner for uniform test interface lazy val testingProj = (project in file("testing")). @@ -220,7 +223,7 @@ lazy val testingProj = (project in file("testing")). settings(baseSettings: _*). 
settings( name := "Testing", - testInterface + libraryDependencies += testInterface ) // Testing agent for running tests in a separate process. @@ -228,7 +231,7 @@ lazy val testAgentProj = (project in file("testing") / "agent"). settings(minimalSettings: _*). settings( name := "Test Agent", - testInterface + libraryDependencies += testInterface ) // Basic task engine @@ -254,11 +257,9 @@ lazy val cacheProj = (project in cachePath). settings(baseSettings: _*). settings( name := "Cache", - sbinary, scalaXml + libraryDependencies ++= Seq(sbinary) ++ scalaXml.value ) -// baseProject(cachePath, "Cache") dependsOn (ioProj, collectionProj) settings (sbinary, scalaXml) - // Builds on cache to provide caching for filesystem-related operations lazy val trackingProj = (project in cachePath / "tracking"). dependsOn(cacheProj, ioProj). @@ -312,7 +313,7 @@ lazy val compilePersistProj = (project in compilePath / "persist"). settings(testedBaseSettings: _*). settings( name := "Persist", - sbinary + libraryDependencies += sbinary ) // sbt-side interface to compiler. Calls compiler-side interface reflectively @@ -322,7 +323,7 @@ lazy val compilerProj = (project in compilePath). settings(testedBaseSettings: _*). settings( name := "Compile", - libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-compiler" % _ % Test), + libraryDependencies += scalaCompiler.value % Test, unmanagedJars in Test <<= (packageSrc in compileInterfaceProj in Compile).map(x => Seq(x).classpath) ) @@ -345,7 +346,7 @@ lazy val scriptedBaseProj = (project in scriptedPath / "base"). settings(testedBaseSettings: _*). settings( name := "Scripted Framework", - scalaParsers + libraryDependencies ++= scalaParsers.value ) lazy val scriptedSbtProj = (project in scriptedPath / "sbt"). @@ -387,7 +388,7 @@ lazy val mainSettingsProj = (project in mainPath / "settings"). settings(testedBaseSettings: _*). 
   settings(
     name := "Main Settings",
-    sbinary
+    libraryDependencies += sbinary
   )
 
 // The main integration project for sbt. It brings all of the subsystems together, configures them, and provides for overriding conventions.
@@ -396,7 +397,7 @@ lazy val mainProj = (project in mainPath).
   settings(testedBaseSettings: _*).
   settings(
     name := "Main",
-    scalaXml
+    libraryDependencies ++= scalaXml.value
   )
 
 // Strictly for bringing implicits and aliases from subsystems into the top-level sbt namespace through a single package object
@@ -492,7 +493,7 @@ def precompiledSettings = Seq(
   scalacOptions := Nil,
   ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) },
   exportedProducts in Compile := Nil,
-  libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-compiler" % _ % "provided")
+  libraryDependencies += scalaCompiler.value % "provided"
 )
 
 def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " + scalav.replace('.', '_')), base = compilePath / "interface").

From 7e277a2b7abb5ae03976813c5dd94ce3e9bb5c32 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Thu, 18 Dec 2014 13:14:04 -0500
Subject: [PATCH 454/823] Fix java version check, and use scope filter

---
 build.sbt | 33 ++++++++++++++++++++-------------
 1 file changed, 20 insertions(+), 13 deletions(-)

diff --git a/build.sbt b/build.sbt
index 0462159ad..f98bb1c97 100644
--- a/build.sbt
+++ b/build.sbt
@@ -25,6 +25,7 @@ def commonSettings: Seq[Setting[_]] = Seq(
 
 def minimalSettings: Seq[Setting[_]] =
   commonSettings ++ customCommands ++ Status.settings ++ nightlySettings ++
+  publishPomSettings ++ Release.javaVersionCheckSettings ++
   Seq(
     crossVersion in update <<= (crossVersion, nightly211) { (cv, n) => if (n) CrossVersion.full else cv },
     resolvers += Resolver.typesafeIvyRepo("releases")
@@ -41,7 +42,7 @@ lazy val root: Project = (project in file(".")).
   aggregate(nonRoots: _*).
   settings(minimalSettings ++ rootSettings: _*)
 
-/* ** Subproject declarations ** */
+/* ** subproject declarations ** */
 
 // defines the Java interfaces through which the launcher and the launched application communicate
 lazy val launchInterfaceProj = (project in launchPath / "interface").
@@ -438,11 +439,6 @@ def allProjects = Seq(launchInterfaceProj, launchProj, testSamples, interfacePro
 def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided))
 lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id))
 
-def deepTasks[T](scoped: TaskKey[Seq[T]]): Initialize[Task[Seq[T]]] = deep(scoped.task) { _.join.map(_.flatten.distinct) }
-def deep[T](scoped: SettingKey[T]): Initialize[Seq[T]] =
-  Util.inAllProjects(projectsWithMyProvided filterNot Set(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj) map { p =>
-    LocalProject(p.id) }, scoped)
-
 def releaseSettings = Release.settings(nonRoots, proguard in Proguard)
 def rootSettings = releaseSettings ++ fullDocSettings ++ LaunchProguard.settings ++ LaunchProguard.specific(launchProj) ++
   Util.publishPomSettings ++ otherRootSettings ++ proguardedLauncherSettings ++ Formatting.sbtFilesSettings ++
@@ -451,16 +447,27 @@ def otherRootSettings = Seq(
   Scripted.scripted <<= scriptedTask,
   Scripted.scriptedUnpublished <<= scriptedUnpublishedTask,
   Scripted.scriptedSource <<= (sourceDirectory in sbtProj) / "sbt-test",
-  publishAll <<= inAll(nonRoots, publishLocal.task),
-  publishAll <<= (publishAll, publishLocal).map((x, y) => ()) // publish all normal deps as well as the sbt-launch jar
+  publishAll := {
+    (publishLocal).all(ScopeFilter(inAnyProject)).value
+  }
+)
+lazy val docProjects: ScopeFilter = ScopeFilter(
+  inAnyProject -- inProjects(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj),
+  inConfigurations(Compile)
 )
 def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq(
   scalacOptions
+= "-Ymacro-no-expand", // for both sxr and doc - sources in sxr <<= deepTasks(sources in Compile), //sxr - sources in (Compile, doc) <<= sources in sxr, // doc - Sxr.sourceDirectories <<= deep(sourceDirectories in Compile).map(_.flatten), // to properly relativize the source paths - fullClasspath in sxr <<= (externalDependencyClasspath in Compile in sbtProj), - dependencyClasspath in (Compile, doc) <<= fullClasspath in sxr + sources in sxr := { + val allSources = (sources ?? Nil).all(docProjects).value + allSources.flatten.distinct + }, //sxr + sources in (Compile, doc) := (sources in sxr).value, // doc + Sxr.sourceDirectories := { + val allSourceDirectories = (sourceDirectories ?? Nil).all(docProjects).value + allSourceDirectories.flatten + }, + fullClasspath in sxr := (externalDependencyClasspath in Compile in sbtProj).value, + dependencyClasspath in (Compile, doc) := (fullClasspath in sxr).value ) // the launcher is published with metadata so that the scripted plugin can pull it in From 56d9413f47cd60bca1f85838c1fb5a4a040bb68e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 18 Dec 2014 17:40:20 -0500 Subject: [PATCH 455/823] Fixing cross building --- build.sbt | 67 ++++++++++++++++++++++++++++++++++++++++++++++--------- 1 file changed, 56 insertions(+), 11 deletions(-) diff --git a/build.sbt b/build.sbt index f98bb1c97..2bb89248a 100644 --- a/build.sbt +++ b/build.sbt @@ -11,8 +11,8 @@ import Sxr.sxr def commonSettings: Seq[Setting[_]] = Seq( organization := "org.scala-sbt", version := "0.13.8-SNAPSHOT", + scalaVersion in ThisBuild := "2.10.4", publishArtifact in packageDoc := false, - scalaVersion := "2.10.4", publishMavenStyle := false, componentID := None, crossPaths := false, @@ -24,12 +24,8 @@ def commonSettings: Seq[Setting[_]] = Seq( ) def minimalSettings: Seq[Setting[_]] = - commonSettings ++ customCommands ++ Status.settings ++ nightlySettings ++ - publishPomSettings ++ Release.javaVersionCheckSettings ++ - Seq( - crossVersion in update <<= 
(crossVersion, nightly211) { (cv, n) => if (n) CrossVersion.full else cv }, - resolvers += Resolver.typesafeIvyRepo("releases") - ) + commonSettings ++ customCommands ++ Status.settings ++ + publishPomSettings ++ Release.javaVersionCheckSettings def baseSettings: Seq[Setting[_]] = minimalSettings ++ Seq(projectComponent) ++ baseScalacOptions ++ Licensed.settings ++ Formatting.settings @@ -105,13 +101,15 @@ lazy val apiProj = (project in compilePath / "api"). lazy val controlProj = (project in utilPath / "control"). settings(baseSettings ++ Util.crossBuild: _*). settings( - name := "Control" + name := "Control", + crossScalaVersions := Seq(scala210, scala211) ) lazy val collectionProj = (project in utilPath / "collection"). settings(testedBaseSettings ++ Util.keywordsSettings ++ Util.crossBuild: _*). settings( - name := "Collections" + name := "Collections", + crossScalaVersions := Seq(scala210, scala211) ) lazy val applyMacroProj = (project in utilPath / "appmacro"). @@ -137,7 +135,8 @@ lazy val ioProj = (project in utilPath / "io"). settings(testedBaseSettings ++ Util.crossBuild: _*). settings( name := "IO", - libraryDependencies += { "org.scala-lang" % "scala-compiler" % scalaVersion.value % Test } + libraryDependencies += scalaCompiler.value % Test, + crossScalaVersions := Seq(scala210, scala211) ) // Utilities related to reflection, managing Scala versions, and custom class loaders @@ -155,7 +154,8 @@ lazy val completeProj = (project in utilPath / "complete"). settings(testedBaseSettings ++ Util.crossBuild: _*). 
settings( name := "Completion", - libraryDependencies += jline + libraryDependencies += jline, + crossScalaVersions := Seq(scala210, scala211) ) // logging @@ -517,3 +517,48 @@ def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " // so we do not need to worry about cross-versioning testing dependencies sources in Test := Nil ) + +lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2.11)") +lazy val safeProjects: ScopeFilter = ScopeFilter( + inProjects(launchProj, mainSettingsProj, mainProj, ivyProj, completeProj, + actionsProj, classpathProj, collectionProj, compileIncrementalProj, + logProj, runProj, stdTaskProj), + inConfigurations(Test) +) + +def customCommands: Seq[Setting[_]] = Seq( + commands += Command.command("setupBuildScala211") { state => + s"""set scalaVersion in ThisBuild := "$scala211" """ :: + state + }, + // This is invoked by Travis + commands += Command.command("checkBuildScala211") { state => + s"++ $scala211" :: + // First compile everything before attempting to test + "all compile test:compile" :: + // Now run known working tests. 
+ safeUnitTests.key.label :: + state + }, + safeUnitTests := { + test.all(safeProjects).value + }, + commands += Command.command("release-sbt-local") { state => + "clean" :: + "so compile" :: + "so publishLocal" :: + "reload" :: + state + }, + commands += Command.command("release-sbt") { state => + // TODO - Any sort of validation + "clean" :: + "checkCredentials" :: + "conscript-configs" :: + "so compile" :: + "so publishSigned" :: + "publishLauncher" :: + "release-libs-211" :: + state + } +) From 71fb4648f57985bdc28304bf5c472be3d97a7d42 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 18 Dec 2014 20:09:06 -0500 Subject: [PATCH 456/823] Fix Launch Test project's name --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 2bb89248a..06cbf7a66 100644 --- a/build.sbt +++ b/build.sbt @@ -67,7 +67,7 @@ lazy val testSamples = (project in launchPath / "test-sample"). dependsOn(interfaceProj, launchInterfaceProj). settings(baseSettings ++ noPublishSettings: _*). settings( - name := "Test Sample", + name := "Launch Test", libraryDependencies += scalaCompiler.value ) From fbe390eefa584c38dc72dc86090fe056391e6499 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 16 Dec 2014 13:14:14 -0500 Subject: [PATCH 457/823] Create a new Ivy DependencyResolver which uses Aether. * Here we wire Aether into the Ivy dependency chain * Add hooks into Aether to use Ivy's http library (so credentials are configured the same) * Create the actual Resolver which extracts metadata information from Aether * Deprecate old Ivy-Maven integrations * Create hooks in existing Resolver facilities to expose a flag to enable the new behavior. * Create notes documenting the feature. * Create a new resolver type `MavenCache` which denotes how to read/write local maven cache metadata correctly. We use this type for publishM2 and mavenLocal. 
* Update failing -SNAPSHOT related tests to use new Aether resolver * Create specification for expected behavior from the new resolvers. Known to fix #1322, #321, #647, #1616 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 06cbf7a66..1faa99e75 100644 --- a/build.sbt +++ b/build.sbt @@ -215,7 +215,7 @@ lazy val ivyProj = (project in file("ivy")). settings(baseSettings: _*). settings( name := "Ivy", - libraryDependencies ++= Seq(ivy, jsch, json4sNative, jawnParser, jawnJson4s), + libraryDependencies ++= Seq(ivy, jsch, json4sNative, jawnParser, jawnJson4s) ++ aetherLibs, testExclusive) // Runner for uniform test interface From 48cb1444cf88a346f6c6ff43b6da071d450f20a7 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 8 Jan 2015 18:02:59 -0500 Subject: [PATCH 458/823] Turn Aether integration into sbt-maven-resolver --- build.sbt | 13 +++++++++++-- 1 file changed, 11 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 1faa99e75..e2e301bb5 100644 --- a/build.sbt +++ b/build.sbt @@ -215,7 +215,7 @@ lazy val ivyProj = (project in file("ivy")). settings(baseSettings: _*). settings( name := "Ivy", - libraryDependencies ++= Seq(ivy, jsch, json4sNative, jawnParser, jawnJson4s) ++ aetherLibs, + libraryDependencies ++= Seq(ivy, jsch, json4sNative, jawnParser, jawnJson4s), testExclusive) // Runner for uniform test interface @@ -412,6 +412,15 @@ lazy val sbtProj = (project in sbtPath). normalizedName := "sbt" ) +lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). + dependsOn(sbtProj). + settings(baseSettings: _*). 
+ settings( + name := "sbt-maven-resolver", + libraryDependencies ++= aetherLibs, + sbtPlugin := true + ) + def scriptedTask: Initialize[InputTask[Unit]] = InputTask(scriptedSource(dir => (s: State) => scriptedParser(dir))) { result => (proguard in Proguard, fullClasspath in scriptedSbtProj in Test, scalaInstance in scriptedSbtProj, publishAll, scriptedSource, result) map { (launcher, scriptedSbtClasspath, scriptedSbtInstance, _, sourcePath, args) => @@ -435,7 +444,7 @@ def allProjects = Seq(launchInterfaceProj, launchProj, testSamples, interfacePro compileInterfaceProj, compileIncrementalProj, compilePersistProj, compilerProj, compilerIntegrationProj, compilerIvyProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, - actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj) + actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj, mavenResolverPluginProj) def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided)) lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) From f4cffa98b7e6a5b9cfed1246acef334ea5ae2a1f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 10 Jan 2015 22:55:50 -0500 Subject: [PATCH 459/823] Adjust tests. --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index e2e301bb5..7e16726bd 100644 --- a/build.sbt +++ b/build.sbt @@ -413,7 +413,7 @@ lazy val sbtProj = (project in sbtPath). ) lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). - dependsOn(sbtProj). + dependsOn(sbtProj, ivyProj % "test->test"). settings(baseSettings: _*). 
settings( name := "sbt-maven-resolver", From 18c4aba58deba1aca3d29ac6fb9f1f6689540630 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 12 Jan 2015 22:01:16 -0500 Subject: [PATCH 460/823] Run scripted with sbt-maven-resolver --- build.sbt | 30 ++++++++++++++++++++++-------- 1 file changed, 22 insertions(+), 8 deletions(-) diff --git a/build.sbt b/build.sbt index 7e16726bd..f72038ecd 100644 --- a/build.sbt +++ b/build.sbt @@ -421,15 +421,17 @@ lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). sbtPlugin := true ) -def scriptedTask: Initialize[InputTask[Unit]] = InputTask(scriptedSource(dir => (s: State) => scriptedParser(dir))) { result => - (proguard in Proguard, fullClasspath in scriptedSbtProj in Test, scalaInstance in scriptedSbtProj, publishAll, scriptedSource, result) map { - (launcher, scriptedSbtClasspath, scriptedSbtInstance, _, sourcePath, args) => - doScripted(launcher, scriptedSbtClasspath, scriptedSbtInstance, sourcePath, args) - } +def scriptedTask: Initialize[InputTask[Unit]] = Def.inputTask { + val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed + publishAll.value + doScripted((proguard in Proguard).value, (fullClasspath in scriptedSbtProj in Test).value, + (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } -def scriptedUnpublishedTask: Initialize[InputTask[Unit]] = InputTask(scriptedSource(dir => (s: State) => scriptedParser(dir))) { result => - (proguard in Proguard, fullClasspath in scriptedSbtProj in Test, scalaInstance in scriptedSbtProj, scriptedSource, result) map doScripted +def scriptedUnpublishedTask: Initialize[InputTask[Unit]] = Def.inputTask { + val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed + doScripted((proguard in Proguard).value, (fullClasspath in scriptedSbtProj in Test).value, + (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } lazy val publishAll 
= TaskKey[Unit]("publish-all") @@ -453,13 +455,25 @@ def rootSettings = releaseSettings ++ fullDocSettings ++ LaunchProguard.settings Util.publishPomSettings ++ otherRootSettings ++ proguardedLauncherSettings ++ Formatting.sbtFilesSettings ++ Transform.conscriptSettings(launchProj) def otherRootSettings = Seq( + Scripted.scriptedPrescripted := { _ => }, Scripted.scripted <<= scriptedTask, Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, Scripted.scriptedSource <<= (sourceDirectory in sbtProj) / "sbt-test", publishAll := { (publishLocal).all(ScopeFilter(inAnyProject)).value } -) +) ++ inConfig(Scripted.MavenResolverPluginTest)(Seq( + Scripted.scripted <<= scriptedTask, + Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, + Scripted.scriptedPrescripted := { f => + val inj = f / "project" / "maven.sbt" + if (!inj.exists) { + IO.write(inj, """libraryDependencies += Defaults.sbtPluginExtra("org.scala-sbt" % "sbt-maven-resolver" % sbtVersion.value, + |sbtBinaryVersion.value, scalaBinaryVersion.value)""".stripMargin) + // sLog.value.info(s"""Injected project/maven.sbt to $f""") + } + } +)) lazy val docProjects: ScopeFilter = ScopeFilter( inAnyProject -- inProjects(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj), inConfigurations(Compile) From 8b38780f21f4068b71c6e830fb941ec5b5a74c96 Mon Sep 17 00:00:00 2001 From: Indrajit Raychaudhuri Date: Sat, 17 Jan 2015 08:25:57 +0530 Subject: [PATCH 461/823] Fix params order in `@deprecated` --- util/collection/src/main/scala/sbt/Settings.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index a8e5b1d6c..326ebc9c5 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -85,7 +85,7 @@ trait Init[Scope] { */ private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new 
ValidationCapture(key, selfRefOk) - @deprecated("0.13.7", "Use the version with default arguments and default parameter.") + @deprecated("Use the version with default arguments and default parameter.", "0.13.7") final def derive[T](s: Setting[T], allowDynamic: Boolean, filter: Scope => Boolean, trigger: AttributeKey[_] => Boolean): Setting[T] = derive(s, allowDynamic, filter, trigger, false) /** From 2906edb872c0c055acbbee18b8474b30df1fbe44 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 2 Feb 2015 14:56:13 -0500 Subject: [PATCH 462/823] Auto style fix --- util/collection/src/main/scala/sbt/Settings.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 326ebc9c5..38d2c2f5f 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -31,7 +31,7 @@ private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val del def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] = { - val map = data getOrElse(scope, AttributeMap.empty) + val map = data getOrElse (scope, AttributeMap.empty) val newData = data.updated(scope, map.put(key, value)) new Settings0(newData, delegates) } From b201bbab8c9dc643d9a2562ca40859cc7687fb32 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 2 Feb 2015 14:57:25 -0500 Subject: [PATCH 463/823] Adds addMavenResolverPlugin. 
#1808/#1793 --- build.sbt | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index f72038ecd..2cffb3cd8 100644 --- a/build.sbt +++ b/build.sbt @@ -468,8 +468,7 @@ def otherRootSettings = Seq( Scripted.scriptedPrescripted := { f => val inj = f / "project" / "maven.sbt" if (!inj.exists) { - IO.write(inj, """libraryDependencies += Defaults.sbtPluginExtra("org.scala-sbt" % "sbt-maven-resolver" % sbtVersion.value, - |sbtBinaryVersion.value, scalaBinaryVersion.value)""".stripMargin) + IO.write(inj, "addMavenResolverPlugin") // sLog.value.info(s"""Injected project/maven.sbt to $f""") } } From 4fe9fadea393e40dae63c141096fa1540a80635f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 2 Feb 2015 22:44:02 -0500 Subject: [PATCH 464/823] Fix build --- build.sbt | 59 ++++++++++++++++++++++++++++++++++++++++--------------- 1 file changed, 43 insertions(+), 16 deletions(-) diff --git a/build.sbt b/build.sbt index 2cffb3cd8..ecf602e70 100644 --- a/build.sbt +++ b/build.sbt @@ -11,7 +11,7 @@ import Sxr.sxr def commonSettings: Seq[Setting[_]] = Seq( organization := "org.scala-sbt", version := "0.13.8-SNAPSHOT", - scalaVersion in ThisBuild := "2.10.4", + scalaVersion := "2.10.4", publishArtifact in packageDoc := false, publishMavenStyle := false, componentID := None, @@ -20,7 +20,8 @@ def commonSettings: Seq[Setting[_]] = Seq( concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), - incOptions := incOptions.value.withNameHashing(true) + incOptions := incOptions.value.withNameHashing(true), + crossScalaVersions := Seq(scala210) ) def minimalSettings: Seq[Setting[_]] = @@ -84,7 +85,13 @@ lazy val interfaceProj = (project in file("interface")). 
watchSources <++= apiDefinitions, resourceGenerators in Compile <+= (version, resourceManaged, streams, compile in Compile) map generateVersionFile, apiDefinitions <<= baseDirectory map { base => (base / "definition") :: (base / "other") :: (base / "type") :: Nil }, - sourceGenerators in Compile <+= (cacheDirectory, apiDefinitions, fullClasspath in Compile in datatypeProj, sourceManaged in Compile, mainClass in datatypeProj in Compile, runner, streams) map generateAPICached + sourceGenerators in Compile <+= (cacheDirectory, + apiDefinitions, + fullClasspath in Compile in datatypeProj, + sourceManaged in Compile, + mainClass in datatypeProj in Compile, + runner, + streams) map generateAPICached ) // defines operations on the API of a source, including determining whether it has changed and converting it to a string @@ -101,15 +108,15 @@ lazy val apiProj = (project in compilePath / "api"). lazy val controlProj = (project in utilPath / "control"). settings(baseSettings ++ Util.crossBuild: _*). settings( - name := "Control", - crossScalaVersions := Seq(scala210, scala211) + name := "Control" + // crossScalaVersions := Seq(scala210, scala211) ) lazy val collectionProj = (project in utilPath / "collection"). settings(testedBaseSettings ++ Util.keywordsSettings ++ Util.crossBuild: _*). settings( - name := "Collections", - crossScalaVersions := Seq(scala210, scala211) + name := "Collections" + // crossScalaVersions := Seq(scala210, scala211) ) lazy val applyMacroProj = (project in utilPath / "appmacro"). @@ -135,8 +142,8 @@ lazy val ioProj = (project in utilPath / "io"). settings(testedBaseSettings ++ Util.crossBuild: _*). 
settings( name := "IO", - libraryDependencies += scalaCompiler.value % Test, - crossScalaVersions := Seq(scala210, scala211) + libraryDependencies += scalaCompiler.value % Test + // crossScalaVersions := Seq(scala210, scala211) ) // Utilities related to reflection, managing Scala versions, and custom class loaders @@ -154,8 +161,8 @@ lazy val completeProj = (project in utilPath / "complete"). settings(testedBaseSettings ++ Util.crossBuild: _*). settings( name := "Completion", - libraryDependencies += jline, - crossScalaVersions := Seq(scala210, scala211) + libraryDependencies += jline + // crossScalaVersions := Seq(scala210, scala211) ) // logging @@ -295,9 +302,9 @@ lazy val compileInterfaceProj = (project in compilePath / "interface"). artifact in (Compile, packageSrc) := Artifact(srcID).copy(configurations = Compile :: Nil).extra("e:component" -> srcID) ) -lazy val precompiled282 = precompiled("2.8.2") -lazy val precompiled292 = precompiled("2.9.2") -lazy val precompiled293 = precompiled("2.9.3") +lazy val precompiled282 = precompiled(scala282) +lazy val precompiled292 = precompiled(scala292) +lazy val precompiled293 = precompiled(scala293) // Implements the core functionality of detecting and propagating changes incrementally. 
 // Defines the data structures for representing file fingerprints and relationships and the overall source analysis
@@ -447,6 +454,7 @@ def allProjects = Seq(launchInterfaceProj, launchProj, testSamples, interfacePro
     compilerIntegrationProj, compilerIvyProj,
     scriptedBaseProj, scriptedSbtProj, scriptedPluginProj,
     actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj, mavenResolverPluginProj)
+
 def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided))
 lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id))
 
@@ -460,7 +468,7 @@ def otherRootSettings = Seq(
   Scripted.scriptedUnpublished <<= scriptedUnpublishedTask,
   Scripted.scriptedSource <<= (sourceDirectory in sbtProj) / "sbt-test",
   publishAll := {
-    (publishLocal).all(ScopeFilter(inAnyProject)).value
+    val _ = (publishLocal).all(ScopeFilter(inAnyProject)).value
   }
 ) ++ inConfig(Scripted.MavenResolverPluginTest)(Seq(
   Scripted.scripted <<= scriptedTask,
@@ -535,6 +543,7 @@ def precompiled(scalav: String): Project = Project(id = normalize("Precompiled "
       assert(sbtScalaV != scalav, "Precompiled compiler interface cannot have the same Scala version (" + scalav + ") as sbt.")
       scalav
     },
+    crossScalaVersions := Seq(scalav),
     // we disable compiling and running tests in precompiled subprojects of compiler interface
     // so we do not need to worry about cross-versioning testing dependencies
     sources in Test := Nil
@@ -567,7 +576,14 @@ def customCommands: Seq[Setting[_]] = Seq(
   },
   commands += Command.command("release-sbt-local") { state =>
     "clean" ::
+      "interfaceProj/compile" :: // Java project needs to compile first
+      "precompiled-2_8_2/compile" ::
+      "precompiled-2_9_2/compile" ::
+      "precompiled-2_9_3/compile" ::
       "so compile" ::
+      "precompiled-2_8_2/publishLocal" ::
+      "precompiled-2_9_2/publishLocal" ::
+      "precompiled-2_9_3/publishLocal" ::
       "so publishLocal" ::
       "reload" ::
       state
@@ -577,10 +593,21 @@ def customCommands: Seq[Setting[_]]
= Seq( "clean" :: "checkCredentials" :: "conscript-configs" :: + "interfaceProj/compile" :: // Java project needs to compile first + "precompiled-2_8_2/compile" :: + "precompiled-2_9_2/compile" :: + "precompiled-2_9_3/compile" :: "so compile" :: "so publishSigned" :: + "precompiled-2_8_2/publishSigned" :: + "precompiled-2_9_2/publishSigned" :: + "precompiled-2_9_3/publishSigned" :: "publishLauncher" :: - "release-libs-211" :: + "++2.11.1" :: + "controlProj/publishSigned" :: + "collectionProj/publishSigned" :: + "ioProj/publishSigned" :: + "completeProj/publishSigned" :: state } ) From 9cf264cd84ec4b6edaa3a45663327fe7760d4a76 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 3 Feb 2015 00:40:19 -0500 Subject: [PATCH 465/823] sbt-doge 0.1.3 --- build.sbt | 21 ++++++++------------- 1 file changed, 8 insertions(+), 13 deletions(-) diff --git a/build.sbt b/build.sbt index ecf602e70..8eecd222c 100644 --- a/build.sbt +++ b/build.sbt @@ -108,15 +108,15 @@ lazy val apiProj = (project in compilePath / "api"). lazy val controlProj = (project in utilPath / "control"). settings(baseSettings ++ Util.crossBuild: _*). settings( - name := "Control" - // crossScalaVersions := Seq(scala210, scala211) + name := "Control", + crossScalaVersions := Seq(scala210, scala211) ) lazy val collectionProj = (project in utilPath / "collection"). settings(testedBaseSettings ++ Util.keywordsSettings ++ Util.crossBuild: _*). settings( - name := "Collections" - // crossScalaVersions := Seq(scala210, scala211) + name := "Collections", + crossScalaVersions := Seq(scala210, scala211) ) lazy val applyMacroProj = (project in utilPath / "appmacro"). @@ -142,8 +142,8 @@ lazy val ioProj = (project in utilPath / "io"). settings(testedBaseSettings ++ Util.crossBuild: _*). 
settings( name := "IO", - libraryDependencies += scalaCompiler.value % Test - // crossScalaVersions := Seq(scala210, scala211) + libraryDependencies += scalaCompiler.value % Test, + crossScalaVersions := Seq(scala210, scala211) ) // Utilities related to reflection, managing Scala versions, and custom class loaders @@ -161,8 +161,8 @@ lazy val completeProj = (project in utilPath / "complete"). settings(testedBaseSettings ++ Util.crossBuild: _*). settings( name := "Completion", - libraryDependencies += jline - // crossScalaVersions := Seq(scala210, scala211) + libraryDependencies += jline, + crossScalaVersions := Seq(scala210, scala211) ) // logging @@ -603,11 +603,6 @@ def customCommands: Seq[Setting[_]] = Seq( "precompiled-2_9_2/publishSigned" :: "precompiled-2_9_3/publishSigned" :: "publishLauncher" :: - "++2.11.1" :: - "controlProj/publishSigned" :: - "collectionProj/publishSigned" :: - "ioProj/publishSigned" :: - "completeProj/publishSigned" :: state } ) From 2a95a1f09b71ffb3c21367d7c9dcc838a253f5ac Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 3 Feb 2015 11:48:30 -0500 Subject: [PATCH 466/823] Removes interfaceProj/compile and adds comment. --- build.sbt | 18 ++++++++++++++++-- 1 file changed, 16 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 8eecd222c..9a8c1ca75 100644 --- a/build.sbt +++ b/build.sbt @@ -576,7 +576,6 @@ def customCommands: Seq[Setting[_]] = Seq( }, commands += Command.command("release-sbt-local") { state => "clean" :: - "interfaceProj/compile" :: // Java project needs to compile first "precompiled-2_8_2/compile" :: "precompiled-2_9_2/compile" :: "precompiled-2_9_3/compile" :: @@ -588,12 +587,27 @@ def customCommands: Seq[Setting[_]] = Seq( "reload" :: state }, + /** There are several complications with sbt's build. 
+ * First is the fact that interface project is a Java-only project + * that uses source generator from datatype subproject in Scala 2.10.4, + * which is depended on by Scala 2.8.2, Scala 2.9.2, and Scala 2.9.3 precompiled projects. + * + * Second is the fact that sbt project (currently using Scala 2.10.4) depends on + * the precompiled projects (that use Scala 2.8.2 etc.) + * + * Finally, there's the fact that all subprojects are released with crossPaths + * turned off for the sbt's Scala version 2.10.4, but some of them are also + * cross published against 2.11.1 with crossPaths turned on. + * + * Because of the way ++ (and its improved version wow) is implemented, + * precompiled compiler bridges are handled outside of doge aggregation on root. + * `so compile` handles 2.10.x/2.11.x cross building. + */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation "clean" :: "checkCredentials" :: "conscript-configs" :: - "interfaceProj/compile" :: // Java project needs to compile first "precompiled-2_8_2/compile" :: "precompiled-2_9_2/compile" :: "precompiled-2_9_3/compile" :: From 1e3cbb5c0d4dbb3a86ee4364224de1ab5ee8d4b8 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 3 Feb 2015 19:36:45 -0500 Subject: [PATCH 467/823] remove precompiled compiler bridges - Scala 2.8.x or 2.9.x are no longer used that often. - Precompiled is a cross build liability as sbt (2.10.4) depends on 2.8.x/2.9.x code. - Scripted test was modified to check 2.8 and 2.9 compilation --- build.sbt | 74 ++++++++++--------------------------------------- 1 file changed, 13 insertions(+), 61 deletions(-) diff --git a/build.sbt b/build.sbt index 9a8c1ca75..47417d5f6 100644 --- a/build.sbt +++ b/build.sbt @@ -302,9 +302,18 @@ lazy val compileInterfaceProj = (project in compilePath / "interface").
artifact in (Compile, packageSrc) := Artifact(srcID).copy(configurations = Compile :: Nil).extra("e:component" -> srcID) ) -lazy val precompiled282 = precompiled(scala282) -lazy val precompiled292 = precompiled(scala292) -lazy val precompiled293 = precompiled(scala293) +def precompiledSettings = Seq( + artifact in packageBin <<= (appConfiguration, scalaVersion) { (app, sv) => + val launcher = app.provider.scalaProvider.launcher + val bincID = binID + "_" + ScalaInstance(sv, launcher).actualVersion + Artifact(binID) extra ("e:component" -> bincID) + }, + target <<= (target, scalaVersion) { (base, sv) => base / ("precompiled_" + sv) }, + scalacOptions := Nil, + ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) }, + exportedProducts in Compile := Nil, + libraryDependencies += scalaCompiler.value % "provided" +) // Implements the core functionality of detecting and propagating changes incrementally. // Defines the data structures for representing file fingerprints and relationships and the overall source analysis @@ -412,7 +421,7 @@ lazy val mainProj = (project in mainPath). // technically, we need a dependency on all of mainProj's dependencies, but we don't do that since this is strictly an integration project // with the sole purpose of providing certain identifiers without qualification (with a package object) lazy val sbtProj = (project in sbtPath). - dependsOn(mainProj, compileInterfaceProj, precompiled282, precompiled292, precompiled293, scriptedSbtProj % "test->test"). + dependsOn(mainProj, compileInterfaceProj, scriptedSbtProj % "test->test"). settings(baseSettings: _*). 
settings( name := "sbt", @@ -520,35 +529,6 @@ def utilPath = file("util") def compilePath = file("compile") def mainPath = file("main") -def precompiledSettings = Seq( - artifact in packageBin <<= (appConfiguration, scalaVersion) { (app, sv) => - val launcher = app.provider.scalaProvider.launcher - val bincID = binID + "_" + ScalaInstance(sv, launcher).actualVersion - Artifact(binID) extra ("e:component" -> bincID) - }, - target <<= (target, scalaVersion) { (base, sv) => base / ("precompiled_" + sv) }, - scalacOptions := Nil, - ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) }, - exportedProducts in Compile := Nil, - libraryDependencies += scalaCompiler.value % "provided" -) - -def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " + scalav.replace('.', '_')), base = compilePath / "interface"). - dependsOn(interfaceProj). - settings(baseSettings ++ precompiledSettings: _*). - settings( - name := "Precompiled " + scalav.replace('.', '_'), - scalaHome := None, - scalaVersion <<= (scalaVersion in ThisBuild) { sbtScalaV => - assert(sbtScalaV != scalav, "Precompiled compiler interface cannot have the same Scala version (" + scalav + ") as sbt.") - scalav - }, - crossScalaVersions := Seq(scalav), - // we disable compiling and running tests in precompiled Projprojects of compiler interface - // so we do not need to worry about cross-versioning testing dependencies - sources in Test := Nil - ) - lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2.11)") lazy val safeProjects: ScopeFilter = ScopeFilter( inProjects(launchProj, mainSettingsProj, mainProj, ivyProj, completeProj, @@ -576,46 +556,18 @@ def customCommands: Seq[Setting[_]] = Seq( }, commands += Command.command("release-sbt-local") { state => "clean" :: - "precompiled-2_8_2/compile" :: - "precompiled-2_9_2/compile" :: - "precompiled-2_9_3/compile" :: "so compile" :: - "precompiled-2_8_2/publishLocal" :: - 
"precompiled-2_9_2/publishLocal" :: - "precompiled-2_9_3/publishLocal" :: "so publishLocal" :: "reload" :: state }, - /** There are several complications with sbt's build. - * First is the fact that interface project is a Java-only project - * that uses source generator from datatype subproject in Scala 2.10.4, - * which is depended on by Scala 2.8.2, Scala 2.9.2, and Scala 2.9.3 precompiled project. - * - * Second is the fact that sbt project (currently using Scala 2.10.4) depends on - * the precompiled projects (that uses Scala 2.8.2 etc.) - * - * Finally, there's the fact that all subprojects are released with crossPaths - * turned off for the sbt's Scala version 2.10.4, but some of them are also - * cross published against 2.11.1 with crossPaths turned on. - * - * Because of the way ++ (and its improved version wow) is implemented - * precompiled compiler briges are handled outside of doge aggregation on root. - * `so compile` handles 2.10.x/2.11.x cross building. - */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation "clean" :: "checkCredentials" :: "conscript-configs" :: - "precompiled-2_8_2/compile" :: - "precompiled-2_9_2/compile" :: - "precompiled-2_9_3/compile" :: "so compile" :: "so publishSigned" :: - "precompiled-2_8_2/publishSigned" :: - "precompiled-2_9_2/publishSigned" :: - "precompiled-2_9_3/publishSigned" :: "publishLauncher" :: state } From 83803f631fa42af5318f46de244a0820c6666ba2 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 6 Feb 2015 13:48:12 -0500 Subject: [PATCH 468/823] Fix stamp-version for nightlies --- build.sbt | 10 ++++++++-- 1 file changed, 8 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 47417d5f6..afff6b2f9 100644 --- a/build.sbt +++ b/build.sbt @@ -8,9 +8,14 @@ import Scripted._ import StringUtilities.normalize import Sxr.sxr +// ThisBuild settings take lower precedence, +// but can be shared across the multi projects. 
+def buildLevelSettings: Seq[Setting[_]] = Seq( + organization in ThisBuild := "org.scala-sbt", + version in ThisBuild := "0.13.8-SNAPSHOT" +) + def commonSettings: Seq[Setting[_]] = Seq( - organization := "org.scala-sbt", - version := "0.13.8-SNAPSHOT", scalaVersion := "2.10.4", publishArtifact in packageDoc := false, publishMavenStyle := false, @@ -37,6 +42,7 @@ def testedBaseSettings: Seq[Setting[_]] = lazy val root: Project = (project in file(".")). configs(Sxr.sxrConf, Proguard). aggregate(nonRoots: _*). + settings(buildLevelSettings: _*). settings(minimalSettings ++ rootSettings: _*) /* ** subproject declarations ** */ From 6dc8f149bbf33e86b06c8549e6bcf1a8d8c95b32 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 6 Feb 2015 14:35:32 -0500 Subject: [PATCH 469/823] Split proguarded launcher out to a subproject --- build.sbt | 52 +++++++++++++++++++++++++++++++++------------------- 1 file changed, 33 insertions(+), 19 deletions(-) diff --git a/build.sbt b/build.sbt index afff6b2f9..086c0e361 100644 --- a/build.sbt +++ b/build.sbt @@ -40,10 +40,17 @@ def testedBaseSettings: Seq[Setting[_]] = baseSettings ++ testDependencies lazy val root: Project = (project in file(".")). - configs(Sxr.sxrConf, Proguard). + configs(Sxr.sxrConf). aggregate(nonRoots: _*). settings(buildLevelSettings: _*). - settings(minimalSettings ++ rootSettings: _*) + settings(minimalSettings ++ rootSettings: _*). + settings( + publish := {}, + publishLocal := { + val p = (proguard in (proguardedLauncherProj, Proguard)).value + IO.copyFile(p, target.value / p.getName) + } + ) /* ** subproject declarations ** */ @@ -69,6 +76,24 @@ lazy val launchProj = (project in launchPath). 
Transform.sourceProperties := Map("cross.package0" -> "xsbt", "cross.package1" -> "boot") )): _*) +// the proguarded launcher +// the launcher is published with metadata so that the scripted plugin can pull it in +// being proguarded, it shouldn't ever be on a classpath with other jars, however +lazy val proguardedLauncherProj = (project in file("sbt-launch")). + configs(Proguard). + settings(minimalSettings ++ LaunchProguard.settings ++ LaunchProguard.specific(launchProj) ++ + Release.launcherSettings(proguard in Proguard): _*). + settings( + name := "sbt-launch", + moduleName := "sbt-launch", + description := "sbt application launcher", + publishArtifact in packageSrc := false, + autoScalaLibrary := false, + publish <<= Seq(publish, Release.deployLauncher).dependOn, + publishLauncher <<= Release.deployLauncher, + packageBin in Compile <<= proguard in Proguard + ) + // used to test the retrieving and loading of an application: sample app is packaged and published to the local repository lazy val testSamples = (project in launchPath / "test-sample"). dependsOn(interfaceProj, launchInterfaceProj). @@ -446,13 +471,13 @@ lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). 
def scriptedTask: Initialize[InputTask[Unit]] = Def.inputTask { val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed publishAll.value - doScripted((proguard in Proguard).value, (fullClasspath in scriptedSbtProj in Test).value, + doScripted((proguard in Proguard in proguardedLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } def scriptedUnpublishedTask: Initialize[InputTask[Unit]] = Def.inputTask { val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed - doScripted((proguard in Proguard).value, (fullClasspath in scriptedSbtProj in Test).value, + doScripted((proguard in Proguard in proguardedLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } @@ -461,7 +486,8 @@ lazy val publishLauncher = TaskKey[Unit]("publish-launcher") lazy val myProvided = config("provided") intransitive -def allProjects = Seq(launchInterfaceProj, launchProj, testSamples, interfaceProj, apiProj, +def allProjects = Seq(launchInterfaceProj, launchProj, proguardedLauncherProj, + testSamples, interfaceProj, apiProj, controlProj, collectionProj, applyMacroProj, processProj, ioProj, classpathProj, completeProj, logProj, relationProj, classfileProj, datatypeProj, crossProj, logicProj, ivyProj, testingProj, testAgentProj, taskProj, stdTaskProj, cacheProj, trackingProj, runProj, @@ -473,9 +499,8 @@ def allProjects = Seq(launchInterfaceProj, launchProj, testSamples, interfacePro def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided)) lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) -def releaseSettings = Release.settings(nonRoots, proguard in Proguard) -def rootSettings = releaseSettings ++ fullDocSettings ++ 
LaunchProguard.settings ++ LaunchProguard.specific(launchProj) ++ - Util.publishPomSettings ++ otherRootSettings ++ proguardedLauncherSettings ++ Formatting.sbtFilesSettings ++ +def rootSettings = Release.releaseSettings ++ fullDocSettings ++ + Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings ++ Transform.conscriptSettings(launchProj) def otherRootSettings = Seq( Scripted.scriptedPrescripted := { _ => }, @@ -515,17 +540,6 @@ def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ dependencyClasspath in (Compile, doc) := (fullClasspath in sxr).value ) -// the launcher is published with metadata so that the scripted plugin can pull it in -// being proguarded, it shouldn't ever be on a classpath with other jars, however -def proguardedLauncherSettings = Seq( - publishArtifact in packageSrc := false, - moduleName := "sbt-launch", - autoScalaLibrary := false, - description := "sbt application launcher", - publishLauncher <<= Release.deployLauncher, - packageBin in Compile <<= proguard in Proguard -) - /* Nested Projproject paths */ def sbtPath = file("sbt") def cachePath = file("cache") From 1f0a120a6f2f5a7386369615654162e23654f5a9 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 17 Feb 2015 13:33:54 -0500 Subject: [PATCH 470/823] Revert "remove precompiled compiler bridges" This reverts commit f2e5d48b6b6d9aba7dc24f14b0bc5c9a84c2997f. --- build.sbt | 74 +++++++++++++++++++++++++++++++++++++++++++++---------- 1 file changed, 61 insertions(+), 13 deletions(-) diff --git a/build.sbt b/build.sbt index 086c0e361..04652aec5 100644 --- a/build.sbt +++ b/build.sbt @@ -333,18 +333,9 @@ lazy val compileInterfaceProj = (project in compilePath / "interface"). 
artifact in (Compile, packageSrc) := Artifact(srcID).copy(configurations = Compile :: Nil).extra("e:component" -> srcID) ) -def precompiledSettings = Seq( - artifact in packageBin <<= (appConfiguration, scalaVersion) { (app, sv) => - val launcher = app.provider.scalaProvider.launcher - val bincID = binID + "_" + ScalaInstance(sv, launcher).actualVersion - Artifact(binID) extra ("e:component" -> bincID) - }, - target <<= (target, scalaVersion) { (base, sv) => base / ("precompiled_" + sv) }, - scalacOptions := Nil, - ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) }, - exportedProducts in Compile := Nil, - libraryDependencies += scalaCompiler.value % "provided" -) +lazy val precompiled282 = precompiled(scala282) +lazy val precompiled292 = precompiled(scala292) +lazy val precompiled293 = precompiled(scala293) // Implements the core functionality of detecting and propagating changes incrementally. // Defines the data structures for representing file fingerprints and relationships and the overall source analysis @@ -452,7 +443,7 @@ lazy val mainProj = (project in mainPath). // technically, we need a dependency on all of mainProj's dependencies, but we don't do that since this is strictly an integration project // with the sole purpose of providing certain identifiers without qualification (with a package object) lazy val sbtProj = (project in sbtPath). - dependsOn(mainProj, compileInterfaceProj, scriptedSbtProj % "test->test"). + dependsOn(mainProj, compileInterfaceProj, precompiled282, precompiled292, precompiled293, scriptedSbtProj % "test->test"). settings(baseSettings: _*). 
settings( name := "sbt", @@ -549,6 +540,35 @@ def utilPath = file("util") def compilePath = file("compile") def mainPath = file("main") +def precompiledSettings = Seq( + artifact in packageBin <<= (appConfiguration, scalaVersion) { (app, sv) => + val launcher = app.provider.scalaProvider.launcher + val bincID = binID + "_" + ScalaInstance(sv, launcher).actualVersion + Artifact(binID) extra ("e:component" -> bincID) + }, + target <<= (target, scalaVersion) { (base, sv) => base / ("precompiled_" + sv) }, + scalacOptions := Nil, + ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) }, + exportedProducts in Compile := Nil, + libraryDependencies += scalaCompiler.value % "provided" +) + +def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " + scalav.replace('.', '_')), base = compilePath / "interface"). + dependsOn(interfaceProj). + settings(baseSettings ++ precompiledSettings: _*). + settings( + name := "Precompiled " + scalav.replace('.', '_'), + scalaHome := None, + scalaVersion <<= (scalaVersion in ThisBuild) { sbtScalaV => + assert(sbtScalaV != scalav, "Precompiled compiler interface cannot have the same Scala version (" + scalav + ") as sbt.") + scalav + }, + crossScalaVersions := Seq(scalav), + // we disable compiling and running tests in precompiled subprojects of compiler interface + // so we do not need to worry about cross-versioning testing dependencies + sources in Test := Nil + ) + lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2.11)") lazy val safeProjects: ScopeFilter = ScopeFilter( inProjects(launchProj, mainSettingsProj, mainProj, ivyProj, completeProj, @@ -576,18 +596,46 @@ def customCommands: Seq[Setting[_]] = Seq( }, commands += Command.command("release-sbt-local") { state => "clean" :: + "precompiled-2_8_2/compile" :: + "precompiled-2_9_2/compile" :: + "precompiled-2_9_3/compile" :: "so compile" :: + "precompiled-2_8_2/publishLocal" :: +
"precompiled-2_9_2/publishLocal" :: + "precompiled-2_9_3/publishLocal" :: "so publishLocal" :: "reload" :: state }, + /** There are several complications with sbt's build. + * First is the fact that interface project is a Java-only project + * that uses source generator from datatype subproject in Scala 2.10.4, + * which is depended on by Scala 2.8.2, Scala 2.9.2, and Scala 2.9.3 precompiled projects. + * + * Second is the fact that sbt project (currently using Scala 2.10.4) depends on + * the precompiled projects (that use Scala 2.8.2 etc.) + * + * Finally, there's the fact that all subprojects are released with crossPaths + * turned off for the sbt's Scala version 2.10.4, but some of them are also + * cross published against 2.11.1 with crossPaths turned on. + * + * Because of the way ++ (and its improved version wow) is implemented, + * precompiled compiler bridges are handled outside of doge aggregation on root. + * `so compile` handles 2.10.x/2.11.x cross building. + */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation "clean" :: "checkCredentials" :: "conscript-configs" :: + "precompiled-2_8_2/compile" :: + "precompiled-2_9_2/compile" :: + "precompiled-2_9_3/compile" :: "so compile" :: "so publishSigned" :: + "precompiled-2_8_2/publishSigned" :: + "precompiled-2_9_2/publishSigned" :: + "precompiled-2_9_3/publishSigned" :: "publishLauncher" :: state } From a236c5cff292e065f37594f558f9f7b4f33c3f1d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 17 Feb 2015 13:45:51 -0500 Subject: [PATCH 471/823] Exclude precompiled from doc --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 04652aec5..f4c67caf7 100644 --- a/build.sbt +++ b/build.sbt @@ -513,7 +513,7 @@ def otherRootSettings = Seq( } )) lazy val docProjects: ScopeFilter = ScopeFilter( - inAnyProject --
inProjects(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, precompiled282, precompiled292, precompiled293), inConfigurations(Compile) ) def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq( From afa97cb833f608c33398c93feb01285c953fbd85 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 14 Feb 2015 22:29:24 -0500 Subject: [PATCH 472/823] Fix nightly publishing location by demoting publish-status to ThisBuild --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index f4c67caf7..447c3de82 100644 --- a/build.sbt +++ b/build.sbt @@ -30,7 +30,7 @@ def commonSettings: Seq[Setting[_]] = Seq( ) def minimalSettings: Seq[Setting[_]] = - commonSettings ++ customCommands ++ Status.settings ++ + commonSettings ++ customCommands ++ publishPomSettings ++ Release.javaVersionCheckSettings def baseSettings: Seq[Setting[_]] = From 80c3a44bb6be437663e63d59e613dc1ff83dd0d7 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 20 Feb 2015 21:09:43 -0500 Subject: [PATCH 473/823] Adds release-nightly command --- build.sbt | 43 +++++++++++++++++++++++++++++-------------- 1 file changed, 29 insertions(+), 14 deletions(-) diff --git a/build.sbt b/build.sbt index 447c3de82..71b5b4c8f 100644 --- a/build.sbt +++ b/build.sbt @@ -52,6 +52,15 @@ lazy val root: Project = (project in file(".")). } ) +// This is used only for command aggregation +lazy val allPrecompiled: Project = (project in file("all-precompiled")). + aggregate(precompiled282, precompiled292, precompiled293). + settings(buildLevelSettings ++ minimalSettings: _*). 
+ settings( + publish := {}, + publishLocal := {} + ) + /* ** subproject declarations ** */ // defines the Java interfaces through which the launcher and the launched application communicate @@ -595,14 +604,11 @@ def customCommands: Seq[Setting[_]] = Seq( test.all(safeProjects).value }, commands += Command.command("release-sbt-local") { state => - "clean" :: - "precompiled-2_8_2/compile" :: - "precompiled-2_9_2/compile" :: - "precompiled-2_9_3/compile" :: + "so clean" :: + "allPrecompiled/clean" :: + "allPrecompiled/compile" :: "so compile" :: - "precompiled-2_8_2/publishLocal" :: - "precompiled-2_9_2/publishLocal" :: - "precompiled-2_9_3/publishLocal" :: + "allPrecompiled/publishLocal" :: "so publishLocal" :: "reload" :: state @@ -625,18 +631,27 @@ def customCommands: Seq[Setting[_]] = Seq( */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation - "clean" :: + "so clean" :: + "allPrecompiled/clean" :: "checkCredentials" :: "conscript-configs" :: - "precompiled-2_8_2/compile" :: - "precompiled-2_9_2/compile" :: - "precompiled-2_9_3/compile" :: + "allPrecompiled/compile" :: "so compile" :: "so publishSigned" :: - "precompiled-2_8_2/publishSigned" :: - "precompiled-2_9_2/publishSigned" :: - "precompiled-2_9_3/publishSigned" :: + "allPrecompiled/publishSigned" :: + "publishLauncher" :: + state + }, + commands += Command.command("release-nightly") { state => + "stamp-version" :: + "so clean" :: + "allPrecompiled/clean" :: + "allPrecompiled/compile" :: + "so compile" :: + "so publish" :: + "allPrecompiled/publish" :: "publishLauncher" :: state } ) + From 4bfeb7f88d7243ede273c5884d2e23821bc451b2 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 25 Feb 2015 12:31:21 -0500 Subject: [PATCH 474/823] Fixing precompiled and nightly build We noticed that -SNAPSHOT is being published as our nightly. This is because "wow" command (or ++) does not replay version injected by stamp-version. 
--- build.sbt | 25 +++++++++++++------------ 1 file changed, 13 insertions(+), 12 deletions(-) diff --git a/build.sbt b/build.sbt index 71b5b4c8f..76684c703 100644 --- a/build.sbt +++ b/build.sbt @@ -604,11 +604,12 @@ def customCommands: Seq[Setting[_]] = Seq( test.all(safeProjects).value }, commands += Command.command("release-sbt-local") { state => - "so clean" :: + "clean" :: "allPrecompiled/clean" :: "allPrecompiled/compile" :: - "so compile" :: "allPrecompiled/publishLocal" :: + "so clean" :: + "so compile" :: "so publishLocal" :: "reload" :: state @@ -631,27 +632,27 @@ def customCommands: Seq[Setting[_]] = Seq( */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation - "so clean" :: - "allPrecompiled/clean" :: - "checkCredentials" :: - "conscript-configs" :: + "checkCredentials" :: + "clean" :: + "allPrecompiled/clean" :: "allPrecompiled/compile" :: + "allPrecompiled/publishSigned" :: + "so clean" :: + "conscript-configs" :: "so compile" :: "so publishSigned" :: - "allPrecompiled/publishSigned" :: "publishLauncher" :: state }, + // stamp-version doesn't work with ++ or "so". commands += Command.command("release-nightly") { state => "stamp-version" :: - "so clean" :: + "clean" :: "allPrecompiled/clean" :: "allPrecompiled/compile" :: - "so compile" :: - "so publish" :: "allPrecompiled/publish" :: - "publishLauncher" :: + "compile" :: + "publish" :: state } ) - From 7cd6b18cf3b71f4600e775428a1059eee5fa06a9 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Fri, 27 Feb 2015 14:43:25 -0500 Subject: [PATCH 475/823] Fix ANSI escape sequences. Now we handle the CSI (ESC + [). 
Fixes #1143 --- .../src/main/scala/sbt/ConsoleLogger.scala | 50 ++++++++++++++++--- util/log/src/test/scala/Escapes.scala | 45 +++++++++++++---- 2 files changed, 80 insertions(+), 15 deletions(-) diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index a614e4315..f86d7f2f5 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -31,11 +31,40 @@ object ConsoleLogger { /** * An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). * It is the final character in an escape sequence. + * + * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes */ + @deprecated("No longer public.", "0.13.8") def isEscapeTerminator(c: Char): Boolean = c >= '@' && c <= '~' - /** Returns true if the string contains the ESC character. */ + /** + * Test if the character AFTER an ESC is the ANSI CSI. + * + * see: http://en.wikipedia.org/wiki/ANSI_escape_code + * + * The CSI (control sequence introducer) codes start with ESC + '['. This is for testing the second character. + * + * There is an additional CSI (one character) that we could test for, but is not frequently used, and we don't + * check for it. + * + * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes + */ + private def isCSI(c: Char): Boolean = c == '[' + + /** + * Tests whether or not a character needs to immediately terminate the ANSI sequence. + * + * c.f. http://en.wikipedia.org/wiki/ANSI_escape_code#Sequence_elements + */ + private def isAnsiTwoCharacterTerminator(c: Char): Boolean = + (c >= '@') && (c <= '_') + + /** + * Returns true if the string contains the ESC character.
+ * + * TODO - this should handle raw CSI (not used much) + */ def hasEscapeSequence(s: String): Boolean = s.indexOf(ESC) >= 0 @@ -58,19 +87,28 @@ object ConsoleLogger { sb.append(s, start, s.length) else { sb.append(s, start, escIndex) - val next = skipESC(s, escIndex + 1) + val next: Int = + // If it's a CSI we skip past it and then look for a terminator. + if (isCSI(s.charAt(escIndex + 1))) skipESC(s, escIndex + 2) + else if (isAnsiTwoCharacterTerminator(s.charAt(escIndex + 1))) escIndex + 2 + else { + // There could be non-ANSI character sequences we should make sure we handle here. + skipESC(s, escIndex + 1) + } nextESC(s, next, sb) } } /** Skips the escape sequence starting at `i-1`. `i` should be positioned at the character after the ESC that starts the sequence. */ - private[this] def skipESC(s: String, i: Int): Int = - if (i >= s.length) + private[this] def skipESC(s: String, i: Int): Int = { + if (i >= s.length) { i - else if (isEscapeTerminator(s.charAt(i))) + } else if (isEscapeTerminator(s.charAt(i))) { i + 1 - else + } else { skipESC(s, i + 1) + } + } val formatEnabled = { diff --git a/util/log/src/test/scala/Escapes.scala b/util/log/src/test/scala/Escapes.scala index f780d25bf..408ec5e23 100644 --- a/util/log/src/test/scala/Escapes.scala +++ b/util/log/src/test/scala/Escapes.scala @@ -37,28 +37,50 @@ object Escapes extends Properties("Escapes") { property("removeEscapeSequences returns string without escape sequences") = forAllNoShrink(genWithoutEscape, genEscapePairs) { (start: String, escapes: List[EscapeAndNot]) => - val withEscapes: String = start + escapes.map { ean => ean.escape.makeString + ean.notEscape } + val withEscapes: String = start + (escapes.map { ean => ean.escape.makeString + ean.notEscape }).mkString("") val removed: String = removeEscapeSequences(withEscapes) - val original = start + escapes.map(_.notEscape) - ("Input string with escapes: '" + withEscapes + "'") |: - ("Escapes removed '" + removed + "'") |: + val original = start 
+        escapes.map(_.notEscape).mkString("")
+      val diffCharString = diffIndex(original, removed)
+      ("Input string : '" + withEscapes + "'") |:
+        ("Expected : '" + original + "'") |:
+        ("Escapes removed : '" + removed + "'") |:
+        (diffCharString) |:
         (original == removed)
     }

-  final case class EscapeAndNot(escape: EscapeSequence, notEscape: String)
+  def diffIndex(expect: String, original: String): String = {
+    var i = 0;
+    while (i < expect.length && i < original.length) {
+      if (expect.charAt(i) != original.charAt(i)) return ("Differing character, idx: " + i + ", char: " + original.charAt(i) + ", expected: " + expect.charAt(i))
+      i += 1
+    }
+    if (expect.length != original.length) return s"Strings are different lengths!"
+    "No differences found"
+  }
+
+  final case class EscapeAndNot(escape: EscapeSequence, notEscape: String) {
+    override def toString = s"EscapeAntNot(escape = [$escape], notEscape = [${notEscape.map(_.toInt)}])"
+  }

   final case class EscapeSequence(content: String, terminator: Char) {
-    assert(content.forall(c => !isEscapeTerminator(c)), "Escape sequence content contains an escape terminator: '" + content + "'")
+    if (!content.isEmpty) {
+      assert(content.tail.forall(c => !isEscapeTerminator(c)), "Escape sequence content contains an escape terminator: '" + content + "'")
+      assert((content.head == '[') || !isEscapeTerminator(content.head), "Escape sequence content contains an escape terminator: '" + content.headOption + "'")
+    }
     assert(isEscapeTerminator(terminator))
     def makeString: String = ESC + content + terminator
+
+    override def toString =
+      if (content.isEmpty) s"ESC (${terminator.toInt})"
+      else s"ESC ($content) (${terminator.toInt})"
   }

   private[this] def noEscape(s: String): String = s.replace(ESC, ' ')

-  lazy val genEscapeSequence: Gen[EscapeSequence] = oneOf(genKnownSequence, genArbitraryEscapeSequence)
+  lazy val genEscapeSequence: Gen[EscapeSequence] = oneOf(genKnownSequence, genTwoCharacterSequence, genArbitraryEscapeSequence)
   lazy val genEscapePair: Gen[EscapeAndNot] = for (esc <- genEscapeSequence; not <- genWithoutEscape) yield EscapeAndNot(esc, not)
   lazy val genEscapePairs: Gen[List[EscapeAndNot]] = listOf(genEscapePair)

   lazy val genArbitraryEscapeSequence: Gen[EscapeSequence] =
-    for (content <- genWithoutTerminator; term <- genTerminator) yield new EscapeSequence(content, term)
+    for (content <- genWithoutTerminator if !content.isEmpty; term <- genTerminator) yield new EscapeSequence("[" + content, term)

   lazy val genKnownSequence: Gen[EscapeSequence] = oneOf((misc ++ setGraphicsMode ++ setMode ++ resetMode).map(toEscapeSequence))
@@ -74,7 +96,12 @@ object Escapes extends Properties("Escapes") {
   lazy val setMode = setModeLike('h')
   def setModeLike(term: Char): Seq[String] = (0 to 19).map(i => "=" + i.toString + term)

-  lazy val genWithoutTerminator = genRawString.map(_.filter { c => !isEscapeTerminator(c) })
+  lazy val genWithoutTerminator =
+    genRawString.map(_.filter { c => !isEscapeTerminator(c) && (c != '[') })
+
+  lazy val genTwoCharacterSequence =
+    // 91 == [ which is the CSI escape sequence.
+    oneOf((64 to 95)) filter (_ != 91) map (c => new EscapeSequence("", c.toChar))

   lazy val genTerminator: Gen[Char] = Gen.choose('@', '~')
   lazy val genWithoutEscape: Gen[String] = genRawString.map(noEscape)

From 6be84b62aa173500ae5b7451e2392347c5483164 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Mon, 9 Mar 2015 12:57:52 -0400
Subject: [PATCH 476/823] Remove "so clean"

---
 build.sbt | 2 --
 1 file changed, 2 deletions(-)

diff --git a/build.sbt b/build.sbt
index 76684c703..320c31fed 100644
--- a/build.sbt
+++ b/build.sbt
@@ -608,7 +608,6 @@ def customCommands: Seq[Setting[_]] = Seq(
       "allPrecompiled/clean" ::
         "allPrecompiled/compile" ::
         "allPrecompiled/publishLocal" ::
-        "so clean" ::
         "so compile" ::
         "so publishLocal" ::
         "reload" ::
@@ -637,7 +636,6 @@ def customCommands: Seq[Setting[_]] = Seq(
       "allPrecompiled/clean" ::
         "allPrecompiled/compile" ::
         "allPrecompiled/publishSigned" ::
-        "so clean" ::
         "conscript-configs" ::
         "so compile" ::
         "so publishSigned" ::

From 862e72b2dc9cc9cf85a807f92fd3a68d53bbdd12 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Wed, 4 Mar 2015 04:31:31 -0500
Subject: [PATCH 477/823] Implement pickler for UpdateReport. #1763

---
 build.sbt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/build.sbt b/build.sbt
index 320c31fed..7b79a8b20 100644
--- a/build.sbt
+++ b/build.sbt
@@ -262,7 +262,7 @@ lazy val ivyProj = (project in file("ivy")).
   settings(baseSettings: _*).
   settings(
     name := "Ivy",
-    libraryDependencies ++= Seq(ivy, jsch, json4sNative, jawnParser, jawnJson4s),
+    libraryDependencies ++= Seq(ivy, jsch, sbtSerialization),
     testExclusive)

 // Runner for uniform test interface

From e6cc43123aa3a9d3a3a5851007afa770e0ca1153 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Wed, 4 Mar 2015 05:41:57 -0500
Subject: [PATCH 478/823] Use pickler to cache UpdateReport for update task.
 #1763

---
 build.sbt                                     |  2 +-
 .../tracking/src/main/scala/sbt/Tracked.scala | 54 +++++++++++++++++++
 2 files changed, 55 insertions(+), 1 deletion(-)

diff --git a/build.sbt b/build.sbt
index 7b79a8b20..bc2cf4cb0 100644
--- a/build.sbt
+++ b/build.sbt
@@ -305,7 +305,7 @@ lazy val cacheProj = (project in cachePath).
   settings(baseSettings: _*).
   settings(
     name := "Cache",
-    libraryDependencies ++= Seq(sbinary) ++ scalaXml.value
+    libraryDependencies ++= Seq(sbinary, sbtSerialization) ++ scalaXml.value
   )

 // Builds on cache to provide caching for filesystem-related operations
diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala
index c851ef9a5..028d385c2 100644
--- a/cache/tracking/src/main/scala/sbt/Tracked.scala
+++ b/cache/tracking/src/main/scala/sbt/Tracked.scala
@@ -9,6 +9,7 @@ import sbinary.Format
 import scala.reflect.Manifest
 import scala.collection.mutable
 import IO.{ delete, read, write }
+import sbt.serialization._

 object Tracked {
   /**
@@ -36,6 +37,25 @@ object Tracked {
       toFile(next)(cacheFile)
       next
     }
+  private[sbt] def lastOuputWithJson[I, O: Pickler: Unpickler](cacheFile: File)(f: (I, Option[O]) => O): I => O = in =>
+    {
+      val previous: Option[O] = fromJsonFile[O](cacheFile)
+      val next = f(in, previous)
+      toJsonFile(next)(cacheFile)
+      next
+    }
+  private[sbt] def fromJsonFile[A: Unpickler](file: File): Option[A] =
+    try {
+      val s = IO.read(file, IO.utf8)
+      fromJsonString[A](s).toOption
+    } catch {
+      case e: Throwable => None
+    }
+  private[sbt] def toJsonFile[A: Pickler](a: A)(file: File): Unit =
+    {
+      val str = toJsonString(a)
+      IO.write(file, str, IO.utf8)
+    }

   def inputChanged[I, O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in =>
     {
@@ -44,6 +64,18 @@ object Tracked {
       val changed = help.changed(cacheFile, conv)
       val result = f(changed, in)

+      if (changed)
+        help.save(cacheFile, conv)
+
+      result
+    }
+  private[sbt] def inputChangedWithJson[I: Pickler: Unpickler, O](cacheFile: File)(f: (Boolean, I) => O): I => O = in =>
+    {
+      val help = new JsonCacheHelp[I]
+      val conv = help.convert(in)
+      val changed = help.changed(cacheFile, conv)
+      val result = f(changed, in)
+
       if (changed)
         help.save(cacheFile, conv)

@@ -56,6 +88,18 @@ object Tracked {
       val changed = help.changed(cacheFile, help.convert(initial))
       val result = f(changed, initial)

+      if (changed)
+        help.save(cacheFile, help.convert(in()))
+
+      result
+    }
+  private[sbt] def outputChangedWithJson[I: Pickler, O](cacheFile: File)(f: (Boolean, I) => O): (() => I) => O = in =>
+    {
+      val initial = in()
+      val help = new JsonCacheHelp[I]
+      val changed = help.changed(cacheFile, help.convert(initial))
+      val result = f(changed, initial)
+
       if (changed)
         help.save(cacheFile, help.convert(in()))

@@ -71,6 +115,16 @@ object Tracked {
       !ic.equiv.equiv(converted, prev)
     } catch { case e: Exception => true }
   }
+  private[sbt] final class JsonCacheHelp[I: Pickler] {
+    def convert(i: I): String = toJsonString(i)
+    def save(cacheFile: File, value: String): Unit =
+      IO.write(cacheFile, value, IO.utf8)
+    def changed(cacheFile: File, converted: String): Boolean =
+      try {
+        val prev = IO.read(cacheFile, IO.utf8)
+        converted != prev
+      } catch { case e: Exception => true }
+  }
 }

 trait Tracked {

From 3952bd8e148eccab40da206f5da78a52132d9c0d Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Fri, 6 Mar 2015 19:58:38 -0500
Subject: [PATCH 479/823] Write JSON to file without String.
 #1763

---
 .../tracking/src/main/scala/sbt/Tracked.scala | 22 ++++++-------------
 1 file changed, 7 insertions(+), 15 deletions(-)

diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala
index 028d385c2..13119df3a 100644
--- a/cache/tracking/src/main/scala/sbt/Tracked.scala
+++ b/cache/tracking/src/main/scala/sbt/Tracked.scala
@@ -39,24 +39,16 @@ object Tracked {
     }
   private[sbt] def lastOuputWithJson[I, O: Pickler: Unpickler](cacheFile: File)(f: (I, Option[O]) => O): I => O = in =>
     {
-      val previous: Option[O] = fromJsonFile[O](cacheFile)
+      val previous: Option[O] = try {
+        fromJsonFile[O](cacheFile).toOption
+      } catch {
+        case e: Throwable => None
+      }
       val next = f(in, previous)
-      toJsonFile(next)(cacheFile)
+      IO.createDirectory(cacheFile.getParentFile)
+      toJsonFile(next, cacheFile)
       next
     }
-  private[sbt] def fromJsonFile[A: Unpickler](file: File): Option[A] =
-    try {
-      val s = IO.read(file, IO.utf8)
-      fromJsonString[A](s).toOption
-    } catch {
-      case e: Throwable => None
-    }
-  private[sbt] def toJsonFile[A: Pickler](a: A)(file: File): Unit =
-    {
-      val str = toJsonString(a)
-      IO.write(file, str, IO.utf8)
-    }
-
   def inputChanged[I, O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in =>
     {
       val help = new CacheHelp(ic)

From 374ede55942492312c177b57ac5624aae1af1876 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Mon, 9 Mar 2015 21:20:52 -0400
Subject: [PATCH 480/823] Don't catch throwable

---
 cache/tracking/src/main/scala/sbt/Tracked.scala | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala
index 13119df3a..48ff86a41 100644
--- a/cache/tracking/src/main/scala/sbt/Tracked.scala
+++ b/cache/tracking/src/main/scala/sbt/Tracked.scala
@@ -3,7 +3,7 @@
  */
 package sbt

-import java.io.File
+import java.io.{ File, IOException }
 import CacheIO.{ fromFile, toFile }
 import sbinary.Format
 import scala.reflect.Manifest
@@ -42,7 +42,7 @@ object Tracked {
       val previous: Option[O] = try {
         fromJsonFile[O](cacheFile).toOption
       } catch {
-        case e: Throwable => None
+        case e: IOException => None
       }
       val next = f(in, previous)
       IO.createDirectory(cacheFile.getParentFile)

From 2959e870d77c5336ffc08c4fd35e986a06628669 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Mon, 9 Mar 2015 22:38:08 -0400
Subject: [PATCH 481/823] catch PicklingException

---
 cache/tracking/src/main/scala/sbt/Tracked.scala | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala
index 48ff86a41..5965fc135 100644
--- a/cache/tracking/src/main/scala/sbt/Tracked.scala
+++ b/cache/tracking/src/main/scala/sbt/Tracked.scala
@@ -6,6 +6,7 @@ package sbt
 import java.io.{ File, IOException }
 import CacheIO.{ fromFile, toFile }
 import sbinary.Format
+import scala.pickling.PicklingException
 import scala.reflect.Manifest
 import scala.collection.mutable
 import IO.{ delete, read, write }
@@ -42,7 +43,8 @@ object Tracked {
       val previous: Option[O] = try {
         fromJsonFile[O](cacheFile).toOption
       } catch {
-        case e: IOException => None
+        case e: PicklingException => None
+        case e: IOException => None
       }
       val next = f(in, previous)
       IO.createDirectory(cacheFile.getParentFile)

From 45813fb692a61b9f9cb7735414d31a042c438b71 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Tue, 10 Mar 2015 05:12:17 -0400
Subject: [PATCH 482/823] Roll back the use of sbt/serialization for update caching

---
 cache/tracking/src/main/scala/sbt/Tracked.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/cache/tracking/src/main/scala/sbt/Tracked.scala
index 5965fc135..0de466686 100644
--- a/cache/tracking/src/main/scala/sbt/Tracked.scala
+++ b/cache/tracking/src/main/scala/sbt/Tracked.scala
@@ -38,7 +38,8 @@ object Tracked {
       toFile(next)(cacheFile)
       next
     }
-  private[sbt] def lastOuputWithJson[I, O: Pickler: Unpickler](cacheFile: File)(f: (I, Option[O]) => O): I => O = in =>
+  // Todo: This function needs more testing.
+  private[sbt] def lastOutputWithJson[I, O: Pickler: Unpickler](cacheFile: File)(f: (I, Option[O]) => O): I => O = in =>
     {
       val previous: Option[O] = try {
         fromJsonFile[O](cacheFile).toOption

From 8764c64533bc7a96d10a9d507d6d9254b2ea36d4 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Fri, 20 Mar 2015 12:56:39 -0400
Subject: [PATCH 483/823] bumping up to 0.13.8

---
 build.sbt | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/build.sbt b/build.sbt
index bc2cf4cb0..769160087 100644
--- a/build.sbt
+++ b/build.sbt
@@ -12,7 +12,7 @@ import Sxr.sxr
 // but can be shared across the multi projects.
 def buildLevelSettings: Seq[Setting[_]] = Seq(
   organization in ThisBuild := "org.scala-sbt",
-  version in ThisBuild := "0.13.8-SNAPSHOT"
+  version in ThisBuild := "0.13.8"
 )

 def commonSettings: Seq[Setting[_]] = Seq(
@@ -522,7 +522,7 @@ def otherRootSettings = Seq(
     }
   ))
 lazy val docProjects: ScopeFilter = ScopeFilter(
-  inAnyProject -- inProjects(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, precompiled282, precompiled292, precompiled293),
+  inAnyProject -- inProjects(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, precompiled282, precompiled292, precompiled293, mavenResolverPluginProj),
   inConfigurations(Compile)
 )
 def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq(

From 08893b27b6baafc6749198a57818e9f8b1b35d4d Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Mon, 23 Mar 2015 13:27:27 -0400
Subject: [PATCH 484/823] 0.13.9-SNAPSHOT

---
 build.sbt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/build.sbt b/build.sbt
index 769160087..7cf968a89 100644
--- a/build.sbt
+++ b/build.sbt
@@ -12,7 +12,7 @@ import Sxr.sxr
 // but can be shared across the multi projects.
 def buildLevelSettings: Seq[Setting[_]] = Seq(
   organization in ThisBuild := "org.scala-sbt",
-  version in ThisBuild := "0.13.8"
+  version in ThisBuild := "0.13.9-SNAPSHOT"
 )

 def commonSettings: Seq[Setting[_]] = Seq(

From 349628884587bf27af453101fb9e8103ceb2a80f Mon Sep 17 00:00:00 2001
From: Josh Suereth
Date: Tue, 24 Mar 2015 11:14:13 -0400
Subject: [PATCH 485/823] Start using launcher interface from sbt/launcher module.

---
 build.sbt | 48 ++++++++++++++++++++++--------------------------
 1 file changed, 22 insertions(+), 26 deletions(-)

diff --git a/build.sbt b/build.sbt
index bc2cf4cb0..ff21fc8f9 100644
--- a/build.sbt
+++ b/build.sbt
@@ -63,21 +63,15 @@ lazy val allPrecompiled: Project = (project in file("all-precompiled")).
 /* ** subproject declarations ** */

-// defines the Java interfaces through which the launcher and the launched application communicate
-lazy val launchInterfaceProj = (project in launchPath / "interface").
-  settings(minimalSettings ++ javaOnlySettings: _*).
-  settings(
-    name := "Launcher Interface"
-  )
 // the launcher. Retrieves, loads, and runs applications based on a configuration file.
 lazy val launchProj = (project in launchPath).
-  dependsOn(ioProj % "test->test", interfaceProj % Test, launchInterfaceProj).
+  dependsOn(ioProj % "test->test", interfaceProj % Test).
   settings(testedBaseSettings: _*).
   settings(
     name := "Launcher",
-    libraryDependencies += ivy,
-    compile in Test <<= compile in Test dependsOn (publishLocal in interfaceProj, publishLocal in testSamples, publishLocal in launchInterfaceProj)
+    libraryDependencies ++= Seq(ivy, Dependencies.launcherInterface),
+    compile in Test <<= compile in Test dependsOn (publishLocal in interfaceProj, publishLocal in testSamples)
   ).
   settings(inConfig(Compile)(Transform.configSettings): _*).
   settings(inConfig(Compile)(Transform.transSourceSettings ++ Seq(
@@ -105,11 +99,11 @@ lazy val proguardedLauncherProj = (project in file("sbt-launch")).
// used to test the retrieving and loading of an application: sample app is packaged and published to the local repository lazy val testSamples = (project in launchPath / "test-sample"). - dependsOn(interfaceProj, launchInterfaceProj). + dependsOn(interfaceProj). settings(baseSettings ++ noPublishSettings: _*). settings( name := "Launch Test", - libraryDependencies += scalaCompiler.value + libraryDependencies ++= Seq(scalaCompiler.value, Dependencies.launcherInterface) ) // defines Java structures used across Scala versions, such as the API structures and relationships extracted by @@ -188,11 +182,11 @@ lazy val ioProj = (project in utilPath / "io"). // Utilities related to reflection, managing Scala versions, and custom class loaders lazy val classpathProj = (project in utilPath / "classpath"). - dependsOn(launchInterfaceProj, interfaceProj, ioProj). + dependsOn(interfaceProj, ioProj). settings(testedBaseSettings: _*). settings( name := "Classpath", - libraryDependencies += scalaCompiler.value + libraryDependencies ++= Seq(scalaCompiler.value,Dependencies.launcherInterface) ) // Command line-related utilities. @@ -258,20 +252,20 @@ lazy val logicProj = (project in utilPath / "logic"). // Apache Ivy integration lazy val ivyProj = (project in file("ivy")). - dependsOn(interfaceProj, launchInterfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", launchProj % "test->test", collectionProj). + dependsOn(interfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", launchProj % "test->test", collectionProj). settings(baseSettings: _*). settings( name := "Ivy", - libraryDependencies ++= Seq(ivy, jsch, sbtSerialization), + libraryDependencies ++= Seq(ivy, jsch, sbtSerialization, launcherInterface), testExclusive) // Runner for uniform test interface lazy val testingProj = (project in file("testing")). - dependsOn(ioProj, classpathProj, logProj, launchInterfaceProj, testAgentProj). 
+ dependsOn(ioProj, classpathProj, logProj, testAgentProj). settings(baseSettings: _*). settings( name := "Testing", - libraryDependencies += testInterface + libraryDependencies ++= Seq(testInterface,launcherInterface) ) // Testing agent for running tests in a separate process. @@ -366,12 +360,12 @@ lazy val compilePersistProj = (project in compilePath / "persist"). // sbt-side interface to compiler. Calls compiler-side interface reflectively lazy val compilerProj = (project in compilePath). - dependsOn(launchInterfaceProj, interfaceProj % "compile;test->test", logProj, ioProj, classpathProj, apiProj, classfileProj, + dependsOn(interfaceProj % "compile;test->test", logProj, ioProj, classpathProj, apiProj, classfileProj, logProj % "test->test", launchProj % "test->test"). settings(testedBaseSettings: _*). settings( name := "Compile", - libraryDependencies += scalaCompiler.value % Test, + libraryDependencies ++= Seq(scalaCompiler.value % Test, launcherInterface), unmanagedJars in Test <<= (packageSrc in compileInterfaceProj in Compile).map(x => Seq(x).classpath) ) @@ -398,10 +392,11 @@ lazy val scriptedBaseProj = (project in scriptedPath / "base"). ) lazy val scriptedSbtProj = (project in scriptedPath / "sbt"). - dependsOn (ioProj, logProj, processProj, scriptedBaseProj, launchInterfaceProj % "provided"). + dependsOn (ioProj, logProj, processProj, scriptedBaseProj). settings(baseSettings: _*). settings( - name := "Scripted sbt" + name := "Scripted sbt", + libraryDependencies += launcherInterface % "provided" ) lazy val scriptedPluginProj = (project in scriptedPath / "plugin"). @@ -423,10 +418,11 @@ lazy val actionsProj = (project in mainPath / "actions"). // General command support and core commands not specific to a build system lazy val commandProj = (project in mainPath / "command"). - dependsOn(interfaceProj, ioProj, launchInterfaceProj, logProj, completeProj, classpathProj, crossProj). 
+ dependsOn(interfaceProj, ioProj, logProj, completeProj, classpathProj, crossProj). settings(testedBaseSettings: _*). settings( - name := "Command" + name := "Command", + libraryDependencies += launcherInterface ) // Fixes scope=Scope for Setting (core defined in collectionProj) to define the settings system used in build definitions @@ -441,11 +437,11 @@ lazy val mainSettingsProj = (project in mainPath / "settings"). // The main integration project for sbt. It brings all of the Projsystems together, configures them, and provides for overriding conventions. lazy val mainProj = (project in mainPath). - dependsOn (actionsProj, mainSettingsProj, interfaceProj, ioProj, ivyProj, launchInterfaceProj, logProj, logicProj, processProj, runProj, commandProj). + dependsOn (actionsProj, mainSettingsProj, interfaceProj, ioProj, ivyProj, logProj, logicProj, processProj, runProj, commandProj). settings(testedBaseSettings: _*). settings( name := "Main", - libraryDependencies ++= scalaXml.value + libraryDependencies ++= scalaXml.value ++ Seq(launcherInterface) ) // Strictly for bringing implicits and aliases from subsystems into the top-level sbt namespace through a single package object @@ -486,7 +482,7 @@ lazy val publishLauncher = TaskKey[Unit]("publish-launcher") lazy val myProvided = config("provided") intransitive -def allProjects = Seq(launchInterfaceProj, launchProj, proguardedLauncherProj, +def allProjects = Seq(launchProj, proguardedLauncherProj, testSamples, interfaceProj, apiProj, controlProj, collectionProj, applyMacroProj, processProj, ioProj, classpathProj, completeProj, logProj, relationProj, classfileProj, datatypeProj, crossProj, logicProj, ivyProj, From 4f1e1f61fce83c22bb1d5251a861cf0cd5644f10 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 24 Mar 2015 16:12:51 -0400 Subject: [PATCH 486/823] Migrate to using the sbt/launcher module, rather than having the code embedded. * Remove launch/* code/tests, as these are in the sbt/launcher project. 
* Create a new project which will resolve launcher module from sonatype-snapshots, and repackage it for the currently building version of sbt. * Remove ComponentManagerTest which was relying DIRECTLY on launcher classes. We'll need to reconfigure this shortly to enable the tests again. Remaining TODOs - * Update resolvers so people can find the launcher. * Add ComponentManagerTest back. * Re-publish the sbt-launch.jar in the location it used to be published. --- build.sbt | 79 ++++++++++++++++--------------------------------------- 1 file changed, 22 insertions(+), 57 deletions(-) diff --git a/build.sbt b/build.sbt index ff21fc8f9..5d963b424 100644 --- a/build.sbt +++ b/build.sbt @@ -46,12 +46,21 @@ lazy val root: Project = (project in file(".")). settings(minimalSettings ++ rootSettings: _*). settings( publish := {}, - publishLocal := { - val p = (proguard in (proguardedLauncherProj, Proguard)).value - IO.copyFile(p, target.value / p.getName) - } + publishLocal := {} ) +// This is used to configure an sbt-launcher for this version of sbt. +lazy val bundledLauncherProj = + (project in file("launch")). + settings(minimalSettings:_*). + settings(inConfig(Compile)(Transform.configSettings):_*). + enablePlugins(SbtLauncherPlugin). + settings( + publish := {}, + publishLocal := {} + ) + + // This is used only for command aggregation lazy val allPrecompiled: Project = (project in file("all-precompiled")). aggregate(precompiled282, precompiled292, precompiled293). @@ -63,49 +72,6 @@ lazy val allPrecompiled: Project = (project in file("all-precompiled")). /* ** subproject declarations ** */ - -// the launcher. Retrieves, loads, and runs applications based on a configuration file. -lazy val launchProj = (project in launchPath). - dependsOn(ioProj % "test->test", interfaceProj % Test). - settings(testedBaseSettings: _*). 
- settings( - name := "Launcher", - libraryDependencies ++= Seq(ivy, Dependencies.launcherInterface), - compile in Test <<= compile in Test dependsOn (publishLocal in interfaceProj, publishLocal in testSamples) - ). - settings(inConfig(Compile)(Transform.configSettings): _*). - settings(inConfig(Compile)(Transform.transSourceSettings ++ Seq( - Transform.inputSourceDirectory <<= (sourceDirectory in crossProj) / "input_sources", - Transform.sourceProperties := Map("cross.package0" -> "xsbt", "cross.package1" -> "boot") - )): _*) - -// the proguarded launcher -// the launcher is published with metadata so that the scripted plugin can pull it in -// being proguarded, it shouldn't ever be on a classpath with other jars, however -lazy val proguardedLauncherProj = (project in file("sbt-launch")). - configs(Proguard). - settings(minimalSettings ++ LaunchProguard.settings ++ LaunchProguard.specific(launchProj) ++ - Release.launcherSettings(proguard in Proguard): _*). - settings( - name := "sbt-launch", - moduleName := "sbt-launch", - description := "sbt application launcher", - publishArtifact in packageSrc := false, - autoScalaLibrary := false, - publish <<= Seq(publish, Release.deployLauncher).dependOn, - publishLauncher <<= Release.deployLauncher, - packageBin in Compile <<= proguard in Proguard - ) - -// used to test the retrieving and loading of an application: sample app is packaged and published to the local repository -lazy val testSamples = (project in launchPath / "test-sample"). - dependsOn(interfaceProj). - settings(baseSettings ++ noPublishSettings: _*). - settings( - name := "Launch Test", - libraryDependencies ++= Seq(scalaCompiler.value, Dependencies.launcherInterface) - ) - // defines Java structures used across Scala versions, such as the API structures and relationships extracted by // the analysis compiler phases and passed back to sbt. 
The API structures are defined in a simple // format from which Java sources are generated by the datatype generator Projproject @@ -252,7 +218,7 @@ lazy val logicProj = (project in utilPath / "logic"). // Apache Ivy integration lazy val ivyProj = (project in file("ivy")). - dependsOn(interfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", launchProj % "test->test", collectionProj). + dependsOn(interfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", /*launchProj % "test->test",*/ collectionProj). settings(baseSettings: _*). settings( name := "Ivy", @@ -321,7 +287,7 @@ lazy val runProj = (project in file("run")). // Compiler-side interface to compiler that is compiled against the compiler being used either in advance or on the fly. // Includes API and Analyzer phases that extract source API and relationships. lazy val compileInterfaceProj = (project in compilePath / "interface"). - dependsOn(interfaceProj % "compile;test->test", ioProj % "test->test", logProj % "test->test", launchProj % "test->test", apiProj % "test->test"). + dependsOn(interfaceProj % "compile;test->test", ioProj % "test->test", logProj % "test->test", /*launchProj % "test->test",*/ apiProj % "test->test"). settings(baseSettings ++ precompiledSettings: _*). settings( name := "Compiler Interface", @@ -361,7 +327,7 @@ lazy val compilePersistProj = (project in compilePath / "persist"). // sbt-side interface to compiler. Calls compiler-side interface reflectively lazy val compilerProj = (project in compilePath). dependsOn(interfaceProj % "compile;test->test", logProj, ioProj, classpathProj, apiProj, classfileProj, - logProj % "test->test", launchProj % "test->test"). + logProj % "test->test" /*,launchProj % "test->test" */). settings(testedBaseSettings: _*). settings( name := "Compile", @@ -467,13 +433,13 @@ lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). 
def scriptedTask: Initialize[InputTask[Unit]] = Def.inputTask { val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed publishAll.value - doScripted((proguard in Proguard in proguardedLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, + doScripted((sbtLaunchJar in bundledLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } def scriptedUnpublishedTask: Initialize[InputTask[Unit]] = Def.inputTask { val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed - doScripted((proguard in Proguard in proguardedLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, + doScripted((sbtLaunchJar in bundledLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } @@ -482,8 +448,7 @@ lazy val publishLauncher = TaskKey[Unit]("publish-launcher") lazy val myProvided = config("provided") intransitive -def allProjects = Seq(launchProj, proguardedLauncherProj, - testSamples, interfaceProj, apiProj, +def allProjects = Seq(interfaceProj, apiProj, controlProj, collectionProj, applyMacroProj, processProj, ioProj, classpathProj, completeProj, logProj, relationProj, classfileProj, datatypeProj, crossProj, logicProj, ivyProj, testingProj, testAgentProj, taskProj, stdTaskProj, cacheProj, trackingProj, runProj, @@ -496,8 +461,8 @@ def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.con lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) def rootSettings = Release.releaseSettings ++ fullDocSettings ++ - Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings ++ - Transform.conscriptSettings(launchProj) + Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings /*++ + Transform.conscriptSettings(launchProj)*/ 
def otherRootSettings = Seq( Scripted.scriptedPrescripted := { _ => }, Scripted.scripted <<= scriptedTask, @@ -576,7 +541,7 @@ def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2.11)") lazy val safeProjects: ScopeFilter = ScopeFilter( - inProjects(launchProj, mainSettingsProj, mainProj, ivyProj, completeProj, + inProjects(mainSettingsProj, mainProj, ivyProj, completeProj, actionsProj, classpathProj, collectionProj, compileIncrementalProj, logProj, runProj, stdTaskProj), inConfigurations(Test) From 3121ed96781bf95f1bef0acc2b59e0ee75896aa0 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 24 Mar 2015 16:16:55 -0400 Subject: [PATCH 487/823] Add sonatype-snapshots resolver to the list for the launcher module snapshots. --- build.sbt | 1 + 1 file changed, 1 insertion(+) diff --git a/build.sbt b/build.sbt index 5d963b424..28a4359fa 100644 --- a/build.sbt +++ b/build.sbt @@ -22,6 +22,7 @@ def commonSettings: Seq[Setting[_]] = Seq( componentID := None, crossPaths := false, resolvers += Resolver.typesafeIvyRepo("releases"), + resolvers += Resolver.sonatypeRepo("snapshots"), concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), From aa7de282177d3d5b1cd51b038c6c082a7f96e4ba Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 24 Mar 2015 18:32:56 -0400 Subject: [PATCH 488/823] fix launcher re-publication. * The rebundled sbt launcher is now pushed into the old location again. --- build.sbt | 13 ++++++++++--- 1 file changed, 10 insertions(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index 28a4359fa..c0b26f9fd 100644 --- a/build.sbt +++ b/build.sbt @@ -55,10 +55,17 @@ lazy val bundledLauncherProj = (project in file("launch")). settings(minimalSettings:_*). 
settings(inConfig(Compile)(Transform.configSettings):_*). + settings(Release.launcherSettings(sbtLaunchJar):_*). enablePlugins(SbtLauncherPlugin). settings( - publish := {}, - publishLocal := {} + name := "sbt-launch", + moduleName := "sbt-launch", + description := "sbt application launcher", + publishArtifact in packageSrc := false, + autoScalaLibrary := false, + publish := Release.deployLauncher.value, + publishLauncher := Release.deployLauncher.value, + packageBin in Compile := sbtLaunchJar.value ) @@ -601,7 +608,7 @@ def customCommands: Seq[Setting[_]] = Seq( "conscript-configs" :: "so compile" :: "so publishSigned" :: - "publishLauncher" :: + "bundledLauncherProj/publishLauncher" :: state }, // stamp-version doesn't work with ++ or "so". From 41900ae0281c5e2da35d261934de25c0c2177184 Mon Sep 17 00:00:00 2001 From: Josh Suereth Date: Tue, 24 Mar 2015 18:37:57 -0400 Subject: [PATCH 489/823] Remove unused proguard configuration. --- build.sbt | 1 - 1 file changed, 1 deletion(-) diff --git a/build.sbt b/build.sbt index c0b26f9fd..a1fb67972 100644 --- a/build.sbt +++ b/build.sbt @@ -3,7 +3,6 @@ import Util._ import Dependencies._ import Licensed._ import Scope.ThisScope -import LaunchProguard.{ proguard, Proguard } import Scripted._ import StringUtilities.normalize import Sxr.sxr From c2325f590b61f34d4c903bc7cfad605639096af6 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 26 Mar 2015 23:22:24 +0000 Subject: [PATCH 490/823] Make use of the nicer Project settings syntax in 0.13.8. --- build.sbt | 120 ++++++++++++++++++++++++++++++------------------------ 1 file changed, 66 insertions(+), 54 deletions(-) diff --git a/build.sbt b/build.sbt index bece10e2b..ef9c6c9dd 100644 --- a/build.sbt +++ b/build.sbt @@ -42,9 +42,10 @@ def testedBaseSettings: Seq[Setting[_]] = lazy val root: Project = (project in file(".")). configs(Sxr.sxrConf). aggregate(nonRoots: _*). - settings(buildLevelSettings: _*). - settings(minimalSettings ++ rootSettings: _*). 
settings( + buildLevelSettings, + minimalSettings, + rootSettings, publish := {}, publishLocal := {} ) @@ -52,27 +53,30 @@ lazy val root: Project = (project in file(".")). // This is used to configure an sbt-launcher for this version of sbt. lazy val bundledLauncherProj = (project in file("launch")). - settings(minimalSettings:_*). - settings(inConfig(Compile)(Transform.configSettings):_*). - settings(Release.launcherSettings(sbtLaunchJar):_*). + settings( + minimalSettings, + inConfig(Compile)(Transform.configSettings), + Release.launcherSettings(sbtLaunchJar) + ). enablePlugins(SbtLauncherPlugin). settings( - name := "sbt-launch", - moduleName := "sbt-launch", - description := "sbt application launcher", - publishArtifact in packageSrc := false, - autoScalaLibrary := false, - publish := Release.deployLauncher.value, - publishLauncher := Release.deployLauncher.value, - packageBin in Compile := sbtLaunchJar.value + name := "sbt-launch", + moduleName := "sbt-launch", + description := "sbt application launcher", + publishArtifact in packageSrc := false, + autoScalaLibrary := false, + publish := Release.deployLauncher.value, + publishLauncher := Release.deployLauncher.value, + packageBin in Compile := sbtLaunchJar.value ) // This is used only for command aggregation lazy val allPrecompiled: Project = (project in file("all-precompiled")). aggregate(precompiled282, precompiled292, precompiled293). - settings(buildLevelSettings ++ minimalSettings: _*). settings( + buildLevelSettings, + minimalSettings, publish := {}, publishLocal := {} ) @@ -83,8 +87,9 @@ lazy val allPrecompiled: Project = (project in file("all-precompiled")). // the analysis compiler phases and passed back to sbt. The API structures are defined in a simple // format from which Java sources are generated by the datatype generator Projproject lazy val interfaceProj = (project in file("interface")). - settings(minimalSettings ++ javaOnlySettings: _*). 
settings( + minimalSettings, + javaOnlySettings, name := "Interface", projectComponent, exportJars := true, @@ -105,31 +110,34 @@ lazy val interfaceProj = (project in file("interface")). // and discovery of Projclasses and annotations lazy val apiProj = (project in compilePath / "api"). dependsOn(interfaceProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "API" ) /* **** Utilities **** */ lazy val controlProj = (project in utilPath / "control"). - settings(baseSettings ++ Util.crossBuild: _*). settings( + baseSettings, + Util.crossBuild, name := "Control", crossScalaVersions := Seq(scala210, scala211) ) lazy val collectionProj = (project in utilPath / "collection"). - settings(testedBaseSettings ++ Util.keywordsSettings ++ Util.crossBuild: _*). settings( + testedBaseSettings, + Util.keywordsSettings, + Util.crossBuild, name := "Collections", crossScalaVersions := Seq(scala210, scala211) ) lazy val applyMacroProj = (project in utilPath / "appmacro"). dependsOn(collectionProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Apply Macro", libraryDependencies += scalaCompiler.value ) @@ -137,8 +145,8 @@ lazy val applyMacroProj = (project in utilPath / "appmacro"). // The API for forking, combining, and doing I/O with system processes lazy val processProj = (project in utilPath / "process"). dependsOn(ioProj % "test->test"). - settings(baseSettings: _*). settings( + baseSettings, name := "Process", libraryDependencies ++= scalaXml.value ) @@ -146,8 +154,9 @@ lazy val processProj = (project in utilPath / "process"). // Path, IO (formerly FileUtilities), NameFilter and other I/O utility classes lazy val ioProj = (project in utilPath / "io"). dependsOn(controlProj). - settings(testedBaseSettings ++ Util.crossBuild: _*). 
settings( + testedBaseSettings, + Util.crossBuild, name := "IO", libraryDependencies += scalaCompiler.value % Test, crossScalaVersions := Seq(scala210, scala211) @@ -156,8 +165,8 @@ lazy val ioProj = (project in utilPath / "io"). // Utilities related to reflection, managing Scala versions, and custom class loaders lazy val classpathProj = (project in utilPath / "classpath"). dependsOn(interfaceProj, ioProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Classpath", libraryDependencies ++= Seq(scalaCompiler.value,Dependencies.launcherInterface) ) @@ -165,8 +174,9 @@ lazy val classpathProj = (project in utilPath / "classpath"). // Command line-related utilities. lazy val completeProj = (project in utilPath / "complete"). dependsOn(collectionProj, controlProj, ioProj). - settings(testedBaseSettings ++ Util.crossBuild: _*). settings( + testedBaseSettings, + Util.crossBuild, name := "Completion", libraryDependencies += jline, crossScalaVersions := Seq(scala210, scala211) @@ -175,8 +185,8 @@ lazy val completeProj = (project in utilPath / "complete"). // logging lazy val logProj = (project in utilPath / "log"). dependsOn(interfaceProj, processProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Logging", libraryDependencies += jline ) @@ -184,40 +194,40 @@ lazy val logProj = (project in utilPath / "log"). // Relation lazy val relationProj = (project in utilPath / "relation"). dependsOn(interfaceProj, processProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Relation" ) // class file reader and analyzer lazy val classfileProj = (project in utilPath / "classfile"). dependsOn(ioProj, interfaceProj, logProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Classfile" ) // generates immutable or mutable Java data types according to a simple input format lazy val datatypeProj = (project in utilPath / "datatype"). dependsOn(ioProj). 
- settings(baseSettings: _*). settings( + baseSettings, name := "Datatype Generator" ) // cross versioning lazy val crossProj = (project in utilPath / "cross"). - settings(baseSettings: _*). - settings(inConfig(Compile)(Transform.crossGenSettings): _*). settings( + baseSettings, + inConfig(Compile)(Transform.crossGenSettings), name := "Cross" ) // A logic with restricted negation as failure for a unique, stable model lazy val logicProj = (project in utilPath / "logic"). dependsOn(collectionProj, relationProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Logic" ) @@ -226,8 +236,8 @@ lazy val logicProj = (project in utilPath / "logic"). // Apache Ivy integration lazy val ivyProj = (project in file("ivy")). dependsOn(interfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", /*launchProj % "test->test",*/ collectionProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Ivy", libraryDependencies ++= Seq(ivy, jsch, sbtSerialization, launcherInterface), testExclusive) @@ -235,16 +245,16 @@ lazy val ivyProj = (project in file("ivy")). // Runner for uniform test interface lazy val testingProj = (project in file("testing")). dependsOn(ioProj, classpathProj, logProj, testAgentProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Testing", libraryDependencies ++= Seq(testInterface,launcherInterface) ) // Testing agent for running tests in a separate process. lazy val testAgentProj = (project in file("testing") / "agent"). - settings(minimalSettings: _*). settings( + minimalSettings, name := "Test Agent", libraryDependencies += testInterface ) @@ -252,16 +262,16 @@ lazy val testAgentProj = (project in file("testing") / "agent"). // Basic task engine lazy val taskProj = (project in tasksPath). dependsOn(controlProj, collectionProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Tasks" ) // Standard task system. 
This provides map, flatMap, join, and more on top of the basic task model. lazy val stdTaskProj = (project in tasksPath / "standard"). dependsOn (taskProj % "compile;test->test", collectionProj, logProj, ioProj, processProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Task System", testExclusive ) @@ -269,8 +279,8 @@ lazy val stdTaskProj = (project in tasksPath / "standard"). // Persisted caching based on SBinary lazy val cacheProj = (project in cachePath). dependsOn (ioProj, collectionProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Cache", libraryDependencies ++= Seq(sbinary, sbtSerialization) ++ scalaXml.value ) @@ -278,16 +288,16 @@ lazy val cacheProj = (project in cachePath). // Builds on cache to provide caching for filesystem-related operations lazy val trackingProj = (project in cachePath / "tracking"). dependsOn(cacheProj, ioProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Tracking" ) // Embedded Scala code runner lazy val runProj = (project in file("run")). dependsOn (ioProj, logProj % "compile;test->test", classpathProj, processProj % "compile;test->test"). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Run" ) @@ -295,8 +305,9 @@ lazy val runProj = (project in file("run")). // Includes API and Analyzer phases that extract source API and relationships. lazy val compileInterfaceProj = (project in compilePath / "interface"). dependsOn(interfaceProj % "compile;test->test", ioProj % "test->test", logProj % "test->test", /*launchProj % "test->test",*/ apiProj % "test->test"). - settings(baseSettings ++ precompiledSettings: _*). 
settings( + baseSettings, + precompiledSettings, name := "Compiler Interface", exportJars := true, // we need to fork because in unit tests we set usejavacp = true which means @@ -317,16 +328,16 @@ lazy val precompiled293 = precompiled(scala293) // Defines the data structures for representing file fingerprints and relationships and the overall source analysis lazy val compileIncrementalProj = (project in compilePath / "inc"). dependsOn (apiProj, ioProj, logProj, classpathProj, relationProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Incremental Compiler" ) // Persists the incremental data structures using SBinary lazy val compilePersistProj = (project in compilePath / "persist"). dependsOn(compileIncrementalProj, apiProj, compileIncrementalProj % "test->test"). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Persist", libraryDependencies += sbinary ) @@ -335,8 +346,8 @@ lazy val compilePersistProj = (project in compilePath / "persist"). lazy val compilerProj = (project in compilePath). dependsOn(interfaceProj % "compile;test->test", logProj, ioProj, classpathProj, apiProj, classfileProj, logProj % "test->test" /*,launchProj % "test->test" */). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Compile", libraryDependencies ++= Seq(scalaCompiler.value % Test, launcherInterface), unmanagedJars in Test <<= (packageSrc in compileInterfaceProj in Compile).map(x => Seq(x).classpath) @@ -344,38 +355,38 @@ lazy val compilerProj = (project in compilePath). lazy val compilerIntegrationProj = (project in (compilePath / "integration")). dependsOn(compileIncrementalProj, compilerProj, compilePersistProj, apiProj, classfileProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Compiler Integration" ) lazy val compilerIvyProj = (project in compilePath / "ivy"). dependsOn (ivyProj, compilerProj). - settings(baseSettings: _*). 
settings( + baseSettings, name := "Compiler Ivy Integration" ) lazy val scriptedBaseProj = (project in scriptedPath / "base"). dependsOn (ioProj, processProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Scripted Framework", libraryDependencies ++= scalaParsers.value ) lazy val scriptedSbtProj = (project in scriptedPath / "sbt"). dependsOn (ioProj, logProj, processProj, scriptedBaseProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Scripted sbt", libraryDependencies += launcherInterface % "provided" ) lazy val scriptedPluginProj = (project in scriptedPath / "plugin"). dependsOn (sbtProj, classpathProj). - settings(baseSettings: _*). settings( + baseSettings, name := "Scripted Plugin" ) @@ -384,16 +395,16 @@ lazy val actionsProj = (project in mainPath / "actions"). dependsOn (classpathProj, completeProj, apiProj, compilerIntegrationProj, compilerIvyProj, interfaceProj, ioProj, ivyProj, logProj, processProj, runProj, relationProj, stdTaskProj, taskProj, trackingProj, testingProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Actions" ) // General command support and core commands not specific to a build system lazy val commandProj = (project in mainPath / "command"). dependsOn(interfaceProj, ioProj, logProj, completeProj, classpathProj, crossProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Command", libraryDependencies += launcherInterface ) @@ -402,8 +413,8 @@ lazy val commandProj = (project in mainPath / "command"). lazy val mainSettingsProj = (project in mainPath / "settings"). dependsOn (applyMacroProj, interfaceProj, ivyProj, relationProj, logProj, ioProj, commandProj, completeProj, classpathProj, stdTaskProj, processProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Main Settings", libraryDependencies += sbinary ) @@ -411,8 +422,8 @@ lazy val mainSettingsProj = (project in mainPath / "settings"). 
// The main integration project for sbt. It brings all of the Projsystems together, configures them, and provides for overriding conventions. lazy val mainProj = (project in mainPath). dependsOn (actionsProj, mainSettingsProj, interfaceProj, ioProj, ivyProj, logProj, logicProj, processProj, runProj, commandProj). - settings(testedBaseSettings: _*). settings( + testedBaseSettings, name := "Main", libraryDependencies ++= scalaXml.value ++ Seq(launcherInterface) ) @@ -422,16 +433,16 @@ lazy val mainProj = (project in mainPath). // with the sole purpose of providing certain identifiers without qualification (with a package object) lazy val sbtProj = (project in sbtPath). dependsOn(mainProj, compileInterfaceProj, precompiled282, precompiled292, precompiled293, scriptedSbtProj % "test->test"). - settings(baseSettings: _*). settings( + baseSettings, name := "sbt", normalizedName := "sbt" ) lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). dependsOn(sbtProj, ivyProj % "test->test"). - settings(baseSettings: _*). settings( + baseSettings, name := "sbt-maven-resolver", libraryDependencies ++= aetherLibs, sbtPlugin := true @@ -532,8 +543,9 @@ def precompiledSettings = Seq( def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " + scalav.replace('.', '_')), base = compilePath / "interface"). dependsOn(interfaceProj). - settings(baseSettings ++ precompiledSettings: _*). settings( + baseSettings, + precompiledSettings, name := "Precompiled " + scalav.replace('.', '_'), scalaHome := None, scalaVersion <<= (scalaVersion in ThisBuild) { sbtScalaV => From 5f37efa4f530351725d6d5e5a4b825576b9b6143 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 27 Mar 2015 01:35:36 +0000 Subject: [PATCH 491/823] Rename root project to sbtRoot. This is mostly for IntelliJ IDEA. 
Currently IntelliJ IDEA's Scala (and SBT) plugin defines: * the project name (as seen in the window title and in the "open recent project" list) from `name` * the root module (as seen in the project view and in project structure) from `id` * doesn't use `moduleName` at all After this change the sbt project is no longer identified as "root". I was undecided between `sbtRoot` and `sbtRootProj`, and went with the shorter option. I'm happy to revise this decision. --- build.sbt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index bece10e2b..9e7a1cf0e 100644 --- a/build.sbt +++ b/build.sbt @@ -39,7 +39,7 @@ def baseSettings: Seq[Setting[_]] = def testedBaseSettings: Seq[Setting[_]] = baseSettings ++ testDependencies -lazy val root: Project = (project in file(".")). +lazy val sbtRoot: Project = (project in file(".")). configs(Sxr.sxrConf). aggregate(nonRoots: _*). settings(buildLevelSettings: _*). @@ -490,7 +490,7 @@ def otherRootSettings = Seq( } )) lazy val docProjects: ScopeFilter = ScopeFilter( - inAnyProject -- inProjects(root, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, precompiled282, precompiled292, precompiled293, mavenResolverPluginProj), + inAnyProject -- inProjects(sbtRoot, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, precompiled282, precompiled292, precompiled293, mavenResolverPluginProj), inConfigurations(Compile) ) def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq( From 5a7bb765df11cd536e901eb076ca5bd4026f3e69 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 20 Apr 2015 01:20:23 -0400 Subject: [PATCH 492/823] publish to bintray --- build.sbt | 13 ++++++++++--- 1 file changed, 10 insertions(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index b6a48d173..cb9b46a49 100644 --- a/build.sbt +++ b/build.sbt @@ -11,7 +11,12 @@ import Sxr.sxr // but can be shared across the multi projects. 
def buildLevelSettings: Seq[Setting[_]] = Seq( organization in ThisBuild := "org.scala-sbt", - version in ThisBuild := "0.13.9-SNAPSHOT" + version in ThisBuild := "0.13.9-SNAPSHOT", + // bintrayOrganization in ThisBuild := None, + // bintrayRepository in ThisBuild := "test-test-test", + bintrayOrganization in ThisBuild := Some("sbt"), + bintrayRepository in ThisBuild := "ivy-releases", + bintrayPackage in ThisBuild := "sbt" ) def commonSettings: Seq[Setting[_]] = Seq( @@ -26,7 +31,9 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), incOptions := incOptions.value.withNameHashing(true), - crossScalaVersions := Seq(scala210) + crossScalaVersions := Seq(scala210), + bintrayPackage := (bintrayPackage in ThisBuild).value, + bintrayRepository := (bintrayRepository in ThisBuild).value ) def minimalSettings: Seq[Setting[_]] = @@ -478,7 +485,7 @@ def allProjects = Seq(interfaceProj, apiProj, def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided)) lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) -def rootSettings = Release.releaseSettings ++ fullDocSettings ++ +def rootSettings = fullDocSettings ++ Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings /*++ Transform.conscriptSettings(launchProj)*/ def otherRootSettings = Seq( From feca1f6fad46b2b9fc7122e000f1523d27bf16db Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 20 Apr 2015 12:50:11 -0400 Subject: [PATCH 493/823] publish nightlies to bintray --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index cb9b46a49..eae15fa78 100644 --- a/build.sbt +++ b/build.sbt @@ -15,7 +15,7 @@ def buildLevelSettings: Seq[Setting[_]] = Seq( // bintrayOrganization in ThisBuild := None, // bintrayRepository in 
ThisBuild := "test-test-test", bintrayOrganization in ThisBuild := Some("sbt"), - bintrayRepository in ThisBuild := "ivy-releases", + bintrayRepository in ThisBuild := s"ivy-${(publishStatus in ThisBuild).value}", bintrayPackage in ThisBuild := "sbt" ) From ca3c29efbca2838e8f0424647a3d82cf6a64714c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 22 Apr 2015 00:28:47 -0400 Subject: [PATCH 494/823] call bintrayRelease on nightly --- build.sbt | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index eae15fa78..be13ac346 100644 --- a/build.sbt +++ b/build.sbt @@ -16,7 +16,8 @@ def buildLevelSettings: Seq[Setting[_]] = Seq( // bintrayRepository in ThisBuild := "test-test-test", bintrayOrganization in ThisBuild := Some("sbt"), bintrayRepository in ThisBuild := s"ivy-${(publishStatus in ThisBuild).value}", - bintrayPackage in ThisBuild := "sbt" + bintrayPackage in ThisBuild := "sbt", + bintrayReleaseOnPublish in ThisBuild := false ) def commonSettings: Seq[Setting[_]] = Seq( @@ -638,6 +639,7 @@ def customCommands: Seq[Setting[_]] = Seq( "allPrecompiled/publish" :: "compile" :: "publish" :: + "bintrayRelease" :: state } ) From b70acf84f3a8142221c2e3fb3118ba802059f0f6 Mon Sep 17 00:00:00 2001 From: David Perez Date: Fri, 8 May 2015 09:29:41 +0200 Subject: [PATCH 495/823] Issue 2008: provide more diagnostic info for undefined setting --- util/collection/src/main/scala/sbt/Settings.scala | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 38d2c2f5f..4266af1b6 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -244,7 +244,12 @@ trait Init[Scope] { @deprecated("Use the non-deprecated Undefined factory method.", "0.13.1") def this(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean) = 
this(fakeUndefinedSetting(definingKey, derived), referencedKey) } - final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") + final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") { + override def getMessage = + super.getMessage + undefined.map { u => + "\n" + u.defining + " referenced from " + u.referencedKey + }.mkString + } @deprecated("Use the other overload.", "0.13.1") def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined = From 8f43c2f58c5dce6356881f3f045b61f9865b8c63 Mon Sep 17 00:00:00 2001 From: Vitalii Voloshyn Date: Mon, 18 May 2015 13:33:31 +0300 Subject: [PATCH 496/823] Prevent history command(s) from going into an infinite loop [1562] --- util/complete/src/main/scala/sbt/complete/HistoryCommands.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala b/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala index 762f48c6d..1e124c583 100644 --- a/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala @@ -61,7 +61,7 @@ object HistoryCommands { def execute(f: History => Option[String]): History => Option[List[String]] = (h: History) => { - val command = f(h) + val command = f(h).filterNot(_.startsWith(Start)) val lines = h.lines.toArray command.foreach(lines(lines.length - 1) = _) h.path foreach { h => IO.writeLines(h, lines) } From 05358d6e3d7775e2d40ffc7573103901a8a11941 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 9 Jun 2015 17:20:03 +0200 Subject: [PATCH 497/823] Fixes the releasing --- build.sbt | 10 ++++++---- 1 file changed, 6 insertions(+), 4 deletions(-) diff --git a/build.sbt b/build.sbt index be13ac346..fb6db259f 100644 --- a/build.sbt +++ b/build.sbt @@ -14,7 +14,10 @@ 
def buildLevelSettings: Seq[Setting[_]] = Seq( version in ThisBuild := "0.13.9-SNAPSHOT", // bintrayOrganization in ThisBuild := None, // bintrayRepository in ThisBuild := "test-test-test", - bintrayOrganization in ThisBuild := Some("sbt"), + bintrayOrganization in ThisBuild := { + if ((publishStatus in ThisBuild).value == "releases") Some("typesafe") + else Some("sbt") + }, bintrayRepository in ThisBuild := s"ivy-${(publishStatus in ThisBuild).value}", bintrayPackage in ThisBuild := "sbt", bintrayReleaseOnPublish in ThisBuild := false @@ -487,8 +490,8 @@ def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.con lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) def rootSettings = fullDocSettings ++ - Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings /*++ - Transform.conscriptSettings(launchProj)*/ + Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings ++ + Transform.conscriptSettings(bundledLauncherProj) def otherRootSettings = Seq( Scripted.scriptedPrescripted := { _ => }, Scripted.scripted <<= scriptedTask, @@ -619,7 +622,6 @@ def customCommands: Seq[Setting[_]] = Seq( */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation - "checkCredentials" :: "clean" :: "allPrecompiled/clean" :: "allPrecompiled/compile" :: From d0473906c2381788b9aaa806582ef6643749b64e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 19 Jun 2015 13:40:10 -0400 Subject: [PATCH 498/823] Bumping up Scala version to 2.10.5/2.11.6. Fixes #1980 To pass File => Unit callback across the classloader boundary I am encoding it as a java.util.List[File] by overriding method. This was needed since Java didn't allow me to cast from one classloader to the other. 
--- build.sbt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index fb6db259f..fe699f27c 100644 --- a/build.sbt +++ b/build.sbt @@ -24,7 +24,7 @@ def buildLevelSettings: Seq[Setting[_]] = Seq( ) def commonSettings: Seq[Setting[_]] = Seq( - scalaVersion := "2.10.4", + scalaVersion := scala210, publishArtifact in packageDoc := false, publishMavenStyle := false, componentID := None, @@ -387,7 +387,7 @@ lazy val scriptedBaseProj = (project in scriptedPath / "base"). ) lazy val scriptedSbtProj = (project in scriptedPath / "sbt"). - dependsOn (ioProj, logProj, processProj, scriptedBaseProj). + dependsOn (ioProj, logProj, processProj, scriptedBaseProj, interfaceProj). settings( baseSettings, name := "Scripted sbt", From f3ff3594edcdfdc0f2d55f42be1f7ca9286ea9c2 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 13 Jun 2015 20:31:28 -0400 Subject: [PATCH 499/823] Adds bundledLauncherProj to allProj This matters when someone tries to locally build sbt from source. 
--- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index fe699f27c..69d868b24 100644 --- a/build.sbt +++ b/build.sbt @@ -484,7 +484,7 @@ def allProjects = Seq(interfaceProj, apiProj, compileInterfaceProj, compileIncrementalProj, compilePersistProj, compilerProj, compilerIntegrationProj, compilerIvyProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, - actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj, mavenResolverPluginProj) + actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj, bundledLauncherProj, mavenResolverPluginProj) def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided)) lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) From 81343707b990920a83e8c9bad7c3e7530879e208 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 20 Jun 2015 14:21:16 -0400 Subject: [PATCH 500/823] Skip process unit tests --- build.sbt | 20 ++++++++++++++++++++ 1 file changed, 20 insertions(+) diff --git a/build.sbt b/build.sbt index 69d868b24..d11c586b5 100644 --- a/build.sbt +++ b/build.sbt @@ -576,6 +576,23 @@ lazy val safeProjects: ScopeFilter = ScopeFilter( logProj, runProj, stdTaskProj), inConfigurations(Test) ) +lazy val otherUnitTests = taskKey[Unit]("Unit test other projects") +lazy val otherProjects: ScopeFilter = ScopeFilter( + inProjects(interfaceProj, apiProj, controlProj, + applyMacroProj, + // processProj, // this one is suspicious + ioProj, + relationProj, classfileProj, datatypeProj, + crossProj, logicProj, testingProj, testAgentProj, taskProj, + cacheProj, trackingProj, + compileIncrementalProj, + compilePersistProj, compilerProj, + compilerIntegrationProj, compilerIvyProj, + scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, + commandProj, mainSettingsProj, mainProj, + sbtProj, mavenResolverPluginProj), + inConfigurations(Test) +) def customCommands: Seq[Setting[_]] = Seq( commands += 
Command.command("setupBuildScala211") { state => @@ -594,6 +611,9 @@ def customCommands: Seq[Setting[_]] = Seq( safeUnitTests := { test.all(safeProjects).value }, + otherUnitTests := { + test.all(otherProjects) + } commands += Command.command("release-sbt-local") { state => "clean" :: "allPrecompiled/clean" :: From bcec8353e07fe34295c9d0ae6701f90a3ca99883 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 20 Jun 2015 14:42:26 -0400 Subject: [PATCH 501/823] Fix typo --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index d11c586b5..62f797f83 100644 --- a/build.sbt +++ b/build.sbt @@ -613,7 +613,7 @@ def customCommands: Seq[Setting[_]] = Seq( }, otherUnitTests := { test.all(otherProjects) - } + }, commands += Command.command("release-sbt-local") { state => "clean" :: "allPrecompiled/clean" :: From c9ef337b5c555584a0d64f9c6f8a4da163db95a4 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 24 Jun 2015 16:56:45 -0400 Subject: [PATCH 502/823] Fixes #2043. bintrayRelease is repeated 20x? 
--- build.sbt | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 62f797f83..132ca67fe 100644 --- a/build.sbt +++ b/build.sbt @@ -499,7 +499,8 @@ def otherRootSettings = Seq( Scripted.scriptedSource <<= (sourceDirectory in sbtProj) / "sbt-test", publishAll := { val _ = (publishLocal).all(ScopeFilter(inAnyProject)).value - } + }, + aggregate in bintrayRelease := false ) ++ inConfig(Scripted.MavenResolverPluginTest)(Seq( Scripted.scripted <<= scriptedTask, Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, From e296ca863daae916da655335b687345215334a6c Mon Sep 17 00:00:00 2001 From: Stu Hood Date: Mon, 6 Jul 2015 11:11:06 -0700 Subject: [PATCH 503/823] Add missing dependency --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index b6a48d173..dcfed8698 100644 --- a/build.sbt +++ b/build.sbt @@ -109,7 +109,7 @@ lazy val interfaceProj = (project in file("interface")). // defines operations on the API of a source, including determining whether it has changed and converting it to a string // and discovery of Projclasses and annotations lazy val apiProj = (project in compilePath / "api"). - dependsOn(interfaceProj). + dependsOn(interfaceProj, classfileProj). 
settings( testedBaseSettings, name := "API" From 91a441b8f82ca94888b860a4a42c1ff6234c3348 Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Thu, 9 Jul 2015 22:58:47 +0200 Subject: [PATCH 504/823] Make Logger.Null public --- util/log/src/main/scala/sbt/Logger.scala | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index 3c3dd92e1..a5fae46b8 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -41,8 +41,7 @@ object Logger { newLog.setTrace(oldLog.getTrace) } - // make public in 0.13 - private[sbt] val Null: AbstractLogger = new AbstractLogger { + val Null: AbstractLogger = new AbstractLogger { def getLevel: Level.Value = Level.Error def setLevel(newLevel: Level.Value) {} def getTrace = 0 From 49a8183a8dd9a40fde3e1481c595041b68ecf8cd Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Thu, 9 Jul 2015 22:59:05 +0200 Subject: [PATCH 505/823] Minor clean up --- interface/src/main/java/xsbti/Logger.java | 10 ++--- util/log/src/main/scala/sbt/Logger.scala | 48 +++++++++++------------ 2 files changed, 29 insertions(+), 29 deletions(-) diff --git a/interface/src/main/java/xsbti/Logger.java b/interface/src/main/java/xsbti/Logger.java index 3b676650d..60dfab7b5 100644 --- a/interface/src/main/java/xsbti/Logger.java +++ b/interface/src/main/java/xsbti/Logger.java @@ -5,9 +5,9 @@ package xsbti; public interface Logger { - public void error(F0 msg); - public void warn(F0 msg); - public void info(F0 msg); - public void debug(F0 msg); - public void trace(F0 exception); + void error(F0 msg); + void warn(F0 msg); + void info(F0 msg); + void debug(F0 msg); + void trace(F0 exception); } diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala index a5fae46b8..848d45b3f 100644 --- a/util/log/src/main/scala/sbt/Logger.scala +++ b/util/log/src/main/scala/sbt/Logger.scala @@ -10,19 +10,19 @@ import 
java.io.File abstract class AbstractLogger extends Logger { def getLevel: Level.Value - def setLevel(newLevel: Level.Value) - def setTrace(flag: Int) + def setLevel(newLevel: Level.Value): Unit + def setTrace(flag: Int): Unit def getTrace: Int - final def traceEnabled = getTrace >= 0 + final def traceEnabled: Boolean = getTrace >= 0 def successEnabled: Boolean def setSuccessEnabled(flag: Boolean): Unit - def atLevel(level: Level.Value) = level.id >= getLevel.id + def atLevel(level: Level.Value): Boolean = level.id >= getLevel.id def control(event: ControlEvent.Value, message: => String): Unit def logAll(events: Seq[LogEvent]): Unit /** Defined in terms of other methods in Logger and should not be called from them. */ - final def log(event: LogEvent) { + final def log(event: LogEvent): Unit = { event match { case s: Success => success(s.msg) case l: Log => log(l.level, l.msg) @@ -36,23 +36,23 @@ abstract class AbstractLogger extends Logger { } object Logger { - def transferLevels(oldLog: AbstractLogger, newLog: AbstractLogger) { + def transferLevels(oldLog: AbstractLogger, newLog: AbstractLogger): Unit = { newLog.setLevel(oldLog.getLevel) newLog.setTrace(oldLog.getTrace) } val Null: AbstractLogger = new AbstractLogger { def getLevel: Level.Value = Level.Error - def setLevel(newLevel: Level.Value) {} - def getTrace = 0 - def setTrace(flag: Int) {} - def successEnabled = false - def setSuccessEnabled(flag: Boolean) {} - def control(event: ControlEvent.Value, message: => String) {} - def logAll(events: Seq[LogEvent]) {} - def trace(t: => Throwable) {} - def success(message: => String) {} - def log(level: Level.Value, message: => String) {} + def setLevel(newLevel: Level.Value): Unit = () + def getTrace: Int = 0 + def setTrace(flag: Int): Unit = () + def successEnabled: Boolean = false + def setSuccessEnabled(flag: Boolean): Unit = () + def control(event: ControlEvent.Value, message: => String): Unit = () + def logAll(events: Seq[LogEvent]): Unit = () + def trace(t: => 
Throwable): Unit = () + def success(message: => String): Unit = () + def log(level: Level.Value, message: => String): Unit = () } implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger @@ -66,11 +66,11 @@ object Logger { override def warn(msg: F0[String]): Unit = lg.warn(msg) override def info(msg: F0[String]): Unit = lg.info(msg) override def error(msg: F0[String]): Unit = lg.error(msg) - override def trace(msg: F0[Throwable]) = lg.trace(msg) - override def log(level: Level.Value, msg: F0[String]) = lg.log(level, msg) - def trace(t: => Throwable) = trace(f0(t)) - def success(s: => String) = info(f0(s)) - def log(level: Level.Value, msg: => String) = + override def trace(msg: F0[Throwable]): Unit = lg.trace(msg) + override def log(level: Level.Value, msg: F0[String]): Unit = lg.log(level, msg) + def trace(t: => Throwable): Unit = trace(f0(t)) + def success(s: => String): Unit = info(f0(s)) + def log(level: Level.Value, msg: => String): Unit = { val fmsg = f0(msg) level match { @@ -118,7 +118,7 @@ trait Logger extends xLogger { final def warn(message: => String): Unit = log(Level.Warn, message) final def error(message: => String): Unit = log(Level.Error, message) - def ansiCodesSupported = false + def ansiCodesSupported: Boolean = false def trace(t: => Throwable): Unit def success(message: => String): Unit @@ -128,6 +128,6 @@ trait Logger extends xLogger { def warn(msg: F0[String]): Unit = log(Level.Warn, msg) def info(msg: F0[String]): Unit = log(Level.Info, msg) def error(msg: F0[String]): Unit = log(Level.Error, msg) - def trace(msg: F0[Throwable]) = trace(msg.apply) + def trace(msg: F0[Throwable]): Unit = trace(msg.apply) def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) -} \ No newline at end of file +} From 352afdb5393772b9346f71ba7bd2c539de6b5dfe Mon Sep 17 00:00:00 2001 From: Adriaan Moors Date: Thu, 9 Jul 2015 14:49:08 -0700 Subject: [PATCH 506/823] Do not use `ListBuffer#readOnly` 
It's dangerous, deprecated, and was removed in 2.12.0-M1. See https://github.com/scala/scala/pull/4140. `ListBuffer#toList` has equivalent performance, except it actually returns an immutable copy(-on-write). --- util/log/src/main/scala/sbt/BufferedLogger.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util/log/src/main/scala/sbt/BufferedLogger.scala b/util/log/src/main/scala/sbt/BufferedLogger.scala index a40d3f1be..08a60577d 100644 --- a/util/log/src/main/scala/sbt/BufferedLogger.scala +++ b/util/log/src/main/scala/sbt/BufferedLogger.scala @@ -39,7 +39,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { * Flushes the buffer to the delegate logger. This method calls logAll on the delegate * so that the messages are written consecutively. The buffer is cleared in the process. */ - def play(): Unit = synchronized { delegate.logAll(buffer.readOnly); buffer.clear() } + def play(): Unit = synchronized { delegate.logAll(buffer.toList); buffer.clear() } /** Clears buffered events and disables buffering. */ def clear(): Unit = synchronized { buffer.clear(); recording = false } /** Plays buffered events and disables buffering. */ From 351a7dae1df7c950579dd7ba775e3814185ffdb4 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 9 Jul 2015 21:20:47 -0400 Subject: [PATCH 507/823] Ref #2068. Scala version bump needs to accompany scala-reflect bump. --- build.sbt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 132ca67fe..0f802c394 100644 --- a/build.sbt +++ b/build.sbt @@ -250,7 +250,7 @@ lazy val ivyProj = (project in file("ivy")). settings( baseSettings, name := "Ivy", - libraryDependencies ++= Seq(ivy, jsch, sbtSerialization, launcherInterface), + libraryDependencies ++= Seq(ivy, jsch, sbtSerialization, scalaReflect.value, launcherInterface), testExclusive) // Runner for uniform test interface @@ -293,7 +293,7 @@ lazy val cacheProj = (project in cachePath). 
settings( baseSettings, name := "Cache", - libraryDependencies ++= Seq(sbinary, sbtSerialization) ++ scalaXml.value + libraryDependencies ++= Seq(sbinary, sbtSerialization, scalaReflect.value) ++ scalaXml.value ) // Builds on cache to provide caching for filesystem-related operations From 8f849ce2bd72927ffc016d8a9be35c3c3f34bea1 Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Fri, 10 Jul 2015 11:53:48 +0200 Subject: [PATCH 508/823] Fix most build warnings --- build.sbt | 9 ++++----- 1 file changed, 4 insertions(+), 5 deletions(-) diff --git a/build.sbt b/build.sbt index 132ca67fe..a8bb94530 100644 --- a/build.sbt +++ b/build.sbt @@ -108,8 +108,7 @@ lazy val interfaceProj = (project in file("interface")). watchSources <++= apiDefinitions, resourceGenerators in Compile <+= (version, resourceManaged, streams, compile in Compile) map generateVersionFile, apiDefinitions <<= baseDirectory map { base => (base / "definition") :: (base / "other") :: (base / "type") :: Nil }, - sourceGenerators in Compile <+= (cacheDirectory, - apiDefinitions, + sourceGenerators in Compile <+= (apiDefinitions, fullClasspath in Compile in datatypeProj, sourceManaged in Compile, mainClass in datatypeProj in Compile, @@ -459,14 +458,14 @@ lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). 
sbtPlugin := true ) -def scriptedTask: Initialize[InputTask[Unit]] = Def.inputTask { +def scriptedTask: Def.Initialize[InputTask[Unit]] = Def.inputTask { val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed publishAll.value doScripted((sbtLaunchJar in bundledLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) } -def scriptedUnpublishedTask: Initialize[InputTask[Unit]] = Def.inputTask { +def scriptedUnpublishedTask: Def.Initialize[InputTask[Unit]] = Def.inputTask { val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed doScripted((sbtLaunchJar in bundledLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) @@ -496,7 +495,7 @@ def otherRootSettings = Seq( Scripted.scriptedPrescripted := { _ => }, Scripted.scripted <<= scriptedTask, Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, - Scripted.scriptedSource <<= (sourceDirectory in sbtProj) / "sbt-test", + Scripted.scriptedSource := (sourceDirectory in sbtProj).value / "sbt-test", publishAll := { val _ = (publishLocal).all(ScopeFilter(inAnyProject)).value }, From 5d4e91d6443f4a74cdba3087634576eefb22e96b Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Sat, 11 Jul 2015 00:11:17 +0200 Subject: [PATCH 509/823] Remove redundant public modifier in Java interfaces --- .../src/main/java/xsbti/AnalysisCallback.java | 18 +++++++++--------- interface/src/main/java/xsbti/F0.java | 2 +- interface/src/main/java/xsbti/Reporter.java | 14 +++++++------- .../java/xsbti/compile/CachedCompiler.java | 4 ++-- .../main/java/xsbti/compile/GlobalsCache.java | 4 ++-- 5 files changed, 21 insertions(+), 21 deletions(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 
88b190e80..a51628f15 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -15,35 +15,35 @@ public interface AnalysisCallback * template accessible outside of the source file. * @deprecated Use `sourceDependency(File dependsOn, File source, DependencyContext context)` instead. */ @Deprecated - public void sourceDependency(File dependsOn, File source, boolean publicInherited); + void sourceDependency(File dependsOn, File source, boolean publicInherited); /** Called to indicate that the source file source depends on the source file * dependsOn. Note that only source files included in the current compilation will * passed to this method. Dependencies on classes generated by sources not in the current compilation will * be passed as class dependencies to the classDependency method. * context gives information about the context in which this dependency has been extracted. * See xsbti.DependencyContext for the list of existing dependency contexts. */ - public void sourceDependency(File dependsOn, File source, DependencyContext context); + void sourceDependency(File dependsOn, File source, DependencyContext context); /** Called to indicate that the source file source depends on the top-level * class named name from class or jar file binary. * If publicInherited is true, this dependency is a result of inheritance by a * template accessible outside of the source file. * @deprecated Use `binaryDependency(File binary, String name, File source, DependencyContext context)` instead. */ @Deprecated - public void binaryDependency(File binary, String name, File source, boolean publicInherited); + void binaryDependency(File binary, String name, File source, boolean publicInherited); /** Called to indicate that the source file source depends on the top-level * class named name from class or jar file binary. * context gives information about the context in which this dependency has been extracted. 
* See xsbti.DependencyContext for the list of existing dependency contexts. */ - public void binaryDependency(File binary, String name, File source, DependencyContext context); + void binaryDependency(File binary, String name, File source, DependencyContext context); /** Called to indicate that the source file source produces a class file at * module contain class name.*/ - public void generatedClass(File source, File module, String name); + void generatedClass(File source, File module, String name); /** Called when the public API of a source file is extracted. */ - public void api(File sourceFile, xsbti.api.SourceAPI source); - public void usedName(File sourceFile, String names); + void api(File sourceFile, xsbti.api.SourceAPI source); + void usedName(File sourceFile, String names); /** Provides problems discovered during compilation. These may be reported (logged) or unreported. * Unreported problems are usually unreported because reporting was not enabled via a command line switch. */ - public void problem(String what, Position pos, String msg, Severity severity, boolean reported); + void problem(String what, Position pos, String msg, Severity severity, boolean reported); /** * Determines whether method calls through this interface should be interpreted as serving * name hashing algorithm needs in given compiler run. @@ -58,5 +58,5 @@ public interface AnalysisCallback * NOTE: This method is an implementation detail and can be removed at any point without deprecation. * Do not depend on it, please. 
*/ - public boolean nameHashing(); + boolean nameHashing(); } \ No newline at end of file diff --git a/interface/src/main/java/xsbti/F0.java b/interface/src/main/java/xsbti/F0.java index 90e713b6b..b0091b186 100644 --- a/interface/src/main/java/xsbti/F0.java +++ b/interface/src/main/java/xsbti/F0.java @@ -5,5 +5,5 @@ package xsbti; public interface F0 { - public T apply(); + T apply(); } diff --git a/interface/src/main/java/xsbti/Reporter.java b/interface/src/main/java/xsbti/Reporter.java index 439e2738f..d76be8ea6 100644 --- a/interface/src/main/java/xsbti/Reporter.java +++ b/interface/src/main/java/xsbti/Reporter.java @@ -6,17 +6,17 @@ package xsbti; public interface Reporter { /** Resets logging, including any accumulated errors, warnings, messages, and counts.*/ - public void reset(); + void reset(); /** Returns true if this logger has seen any errors since the last call to reset.*/ - public boolean hasErrors(); + boolean hasErrors(); /** Returns true if this logger has seen any warnings since the last call to reset.*/ - public boolean hasWarnings(); + boolean hasWarnings(); /** Logs a summary of logging since the last reset.*/ - public void printSummary(); + void printSummary(); /** Returns a list of warnings and errors since the last reset.*/ - public Problem[] problems(); + Problem[] problems(); /** Logs a message.*/ - public void log(Position pos, String msg, Severity sev); + void log(Position pos, String msg, Severity sev); /** Reports a comment. 
*/ - public void comment(Position pos, String msg); + void comment(Position pos, String msg); } diff --git a/interface/src/main/java/xsbti/compile/CachedCompiler.java b/interface/src/main/java/xsbti/compile/CachedCompiler.java index 0722a68b9..1d37f0883 100644 --- a/interface/src/main/java/xsbti/compile/CachedCompiler.java +++ b/interface/src/main/java/xsbti/compile/CachedCompiler.java @@ -8,6 +8,6 @@ import java.io.File; public interface CachedCompiler { /** Returns an array of arguments representing the nearest command line equivalent of a call to run but without the command name itself.*/ - public String[] commandArguments(File[] sources); - public void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate, CompileProgress progress); + String[] commandArguments(File[] sources); + void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate, CompileProgress progress); } diff --git a/interface/src/main/java/xsbti/compile/GlobalsCache.java b/interface/src/main/java/xsbti/compile/GlobalsCache.java index d9aa1c017..c8540e2d2 100644 --- a/interface/src/main/java/xsbti/compile/GlobalsCache.java +++ b/interface/src/main/java/xsbti/compile/GlobalsCache.java @@ -8,6 +8,6 @@ import xsbti.Reporter; */ public interface GlobalsCache { - public CachedCompiler apply(String[] args, Output output, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); - public void clear(); + CachedCompiler apply(String[] args, Output output, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); + void clear(); } From 3eef2d66b5b6ed91794aeb2810ab2df929254861 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Wed, 15 Jul 2015 10:01:11 +0200 Subject: [PATCH 510/823] Find most specific version of compiler interface sources This commit introduces a mechanism that allows sbt to find the most specific version of the compiler interface sources that 
exists using Ivy. For instance, when asked for a compiler interface for Scala 2.11.8-M2, sbt will look for sources for: - 2.11.8-M2 ; - 2.11.8 ; - 2.11 ; - the default sources. This commit also modifies the build definition by removing the precompiled projects and configuring the compiler-interface project so that it publishes its source artifacts in a Maven-friendly format. --- build.sbt | 81 +++++-------------------------------------------------- 1 file changed, 6 insertions(+), 75 deletions(-) diff --git a/build.sbt b/build.sbt index 9fd4f9173..19061a285 100644 --- a/build.sbt +++ b/build.sbt @@ -81,17 +81,6 @@ lazy val bundledLauncherProj = packageBin in Compile := sbtLaunchJar.value ) - -// This is used only for command aggregation -lazy val allPrecompiled: Project = (project in file("all-precompiled")). - aggregate(precompiled282, precompiled292, precompiled293). - settings( - buildLevelSettings, - minimalSettings, - publish := {}, - publishLocal := {} - ) - /* ** subproject declarations ** */ // defines Java structures used across Scala versions, such as the API structures and relationships extracted by @@ -199,7 +188,7 @@ lazy val logProj = (project in utilPath / "log"). testedBaseSettings, name := "Logging", libraryDependencies += jline - ) + ) // Relation lazy val relationProj = (project in utilPath / "relation"). @@ -317,7 +306,7 @@ lazy val compileInterfaceProj = (project in compilePath / "interface"). dependsOn(interfaceProj % "compile;test->test", ioProj % "test->test", logProj % "test->test", /*launchProj % "test->test",*/ apiProj % "test->test"). settings( baseSettings, - precompiledSettings, + libraryDependencies += scalaCompiler.value % "provided", name := "Compiler Interface", exportJars := true, // we need to fork because in unit tests we set usejavacp = true which means @@ -327,13 +316,10 @@ lazy val compileInterfaceProj = (project in compilePath / "interface"). 
// needed because we fork tests and tests are ran in parallel so we have multiple Scala // compiler instances that are memory hungry javaOptions in Test += "-Xmx1G", - artifact in (Compile, packageSrc) := Artifact(srcID).copy(configurations = Compile :: Nil).extra("e:component" -> srcID) + publishArtifact in (Compile, packageSrc) := true, + publishMavenStyle := true ) -lazy val precompiled282 = precompiled(scala282) -lazy val precompiled292 = precompiled(scala292) -lazy val precompiled293 = precompiled(scala293) - // Implements the core functionality of detecting and propagating changes incrementally. // Defines the data structures for representing file fingerprints and relationships and the overall source analysis lazy val compileIncrementalProj = (project in compilePath / "inc"). @@ -442,7 +428,7 @@ lazy val mainProj = (project in mainPath). // technically, we need a dependency on all of mainProj's dependencies, but we don't do that since this is strictly an integration project // with the sole purpose of providing certain identifiers without qualification (with a package object) lazy val sbtProj = (project in sbtPath). - dependsOn(mainProj, compileInterfaceProj, precompiled282, precompiled292, precompiled293, scriptedSbtProj % "test->test"). + dependsOn(mainProj, compileInterfaceProj, scriptedSbtProj % "test->test"). 
settings( baseSettings, name := "sbt", @@ -512,7 +498,7 @@ def otherRootSettings = Seq( } )) lazy val docProjects: ScopeFilter = ScopeFilter( - inAnyProject -- inProjects(sbtRoot, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, precompiled282, precompiled292, precompiled293, mavenResolverPluginProj), + inAnyProject -- inProjects(sbtRoot, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, mavenResolverPluginProj), inConfigurations(Compile) ) def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq( @@ -539,36 +525,6 @@ def utilPath = file("util") def compilePath = file("compile") def mainPath = file("main") -def precompiledSettings = Seq( - artifact in packageBin <<= (appConfiguration, scalaVersion) { (app, sv) => - val launcher = app.provider.scalaProvider.launcher - val bincID = binID + "_" + ScalaInstance(sv, launcher).actualVersion - Artifact(binID) extra ("e:component" -> bincID) - }, - target <<= (target, scalaVersion) { (base, sv) => base / ("precompiled_" + sv) }, - scalacOptions := Nil, - ivyScala ~= { _.map(_.copy(checkExplicit = false, overrideScalaVersion = false)) }, - exportedProducts in Compile := Nil, - libraryDependencies += scalaCompiler.value % "provided" -) - -def precompiled(scalav: String): Project = Project(id = normalize("Precompiled " + scalav.replace('.', '_')), base = compilePath / "interface"). - dependsOn(interfaceProj). 
- settings( - baseSettings, - precompiledSettings, - name := "Precompiled " + scalav.replace('.', '_'), - scalaHome := None, - scalaVersion <<= (scalaVersion in ThisBuild) { sbtScalaV => - assert(sbtScalaV != scalav, "Precompiled compiler interface cannot have the same Scala version (" + scalav + ") as sbt.") - scalav - }, - crossScalaVersions := Seq(scalav), - // we disable compiling and running tests in precompiled Projprojects of compiler interface - // so we do not need to worry about cross-versioning testing dependencies - sources in Test := Nil - ) - lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2.11)") lazy val safeProjects: ScopeFilter = ScopeFilter( inProjects(mainSettingsProj, mainProj, ivyProj, completeProj, @@ -616,36 +572,14 @@ def customCommands: Seq[Setting[_]] = Seq( }, commands += Command.command("release-sbt-local") { state => "clean" :: - "allPrecompiled/clean" :: - "allPrecompiled/compile" :: - "allPrecompiled/publishLocal" :: "so compile" :: "so publishLocal" :: "reload" :: state }, - /** There are several complications with sbt's build. - * First is the fact that interface project is a Java-only project - * that uses source generator from datatype subproject in Scala 2.10.4, - * which is depended on by Scala 2.8.2, Scala 2.9.2, and Scala 2.9.3 precompiled project. - * - * Second is the fact that sbt project (currently using Scala 2.10.4) depends on - * the precompiled projects (that uses Scala 2.8.2 etc.) - * - * Finally, there's the fact that all subprojects are released with crossPaths - * turned off for the sbt's Scala version 2.10.4, but some of them are also - * cross published against 2.11.1 with crossPaths turned on. - * - * Because of the way ++ (and its improved version wow) is implemented - * precompiled compiler briges are handled outside of doge aggregation on root. - * `so compile` handles 2.10.x/2.11.x cross building. 
- */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation "clean" :: - "allPrecompiled/clean" :: - "allPrecompiled/compile" :: - "allPrecompiled/publishSigned" :: "conscript-configs" :: "so compile" :: "so publishSigned" :: @@ -656,9 +590,6 @@ def customCommands: Seq[Setting[_]] = Seq( commands += Command.command("release-nightly") { state => "stamp-version" :: "clean" :: - "allPrecompiled/clean" :: - "allPrecompiled/compile" :: - "allPrecompiled/publish" :: "compile" :: "publish" :: "bintrayRelease" :: From 3e60e3c4eb8779ab35baa37b05c9954cad8632a8 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Thu, 16 Jul 2015 13:53:10 +0200 Subject: [PATCH 511/823] Revive comment about sbt's build complications --- build.sbt | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/build.sbt b/build.sbt index 19061a285..4d054b823 100644 --- a/build.sbt +++ b/build.sbt @@ -577,6 +577,16 @@ def customCommands: Seq[Setting[_]] = Seq( "reload" :: state }, + /** There are several complications with sbt's build. + * First is the fact that interface project is a Java-only project + * that uses source generator from datatype subproject in Scala 2.10.5. + * + * Second is the fact that all subprojects are released with crossPaths + * turned off for the sbt's Scala version 2.10.5, but some of them are also + * cross published against 2.11.1 with crossPaths turned on. + * + * `so compile` handles 2.10.x/2.11.x cross building. 
+ */ commands += Command.command("release-sbt") { state => // TODO - Any sort of validation "clean" :: From 517e4d6abe1d7e48626e0bde99421c11011fe6f3 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Thu, 16 Jul 2015 14:02:25 +0200 Subject: [PATCH 512/823] Don't set `publishMavenStyle := true` for compiler interface --- build.sbt | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 4d054b823..a3c7cb753 100644 --- a/build.sbt +++ b/build.sbt @@ -316,8 +316,7 @@ lazy val compileInterfaceProj = (project in compilePath / "interface"). // needed because we fork tests and tests are ran in parallel so we have multiple Scala // compiler instances that are memory hungry javaOptions in Test += "-Xmx1G", - publishArtifact in (Compile, packageSrc) := true, - publishMavenStyle := true + publishArtifact in (Compile, packageSrc) := true ) // Implements the core functionality of detecting and propagating changes incrementally. From 158856feaa880cf4774c850392c49990510b10a1 Mon Sep 17 00:00:00 2001 From: fkorotkov Date: Wed, 15 Jul 2015 14:49:12 -0400 Subject: [PATCH 513/823] Line content from diagnostic classes if available --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 1569ec94e..6d4dde762 100644 --- a/build.sbt +++ b/build.sbt @@ -573,7 +573,7 @@ lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2 lazy val safeProjects: ScopeFilter = ScopeFilter( inProjects(mainSettingsProj, mainProj, ivyProj, completeProj, actionsProj, classpathProj, collectionProj, compileIncrementalProj, - logProj, runProj, stdTaskProj), + logProj, runProj, stdTaskProj, compilerProj), inConfigurations(Test) ) lazy val otherUnitTests = taskKey[Unit]("Unit test other projects") From f484dc41252f36894602db870d01fba3bc9722af Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Fri, 17 Jul 2015 08:30:13 +0200 Subject: [PATCH 514/823] Fix several warnings --- 
interface/src/main/java/xsbti/compile/JavaCompiler.java | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/interface/src/main/java/xsbti/compile/JavaCompiler.java b/interface/src/main/java/xsbti/compile/JavaCompiler.java index 95f9fb992..18b3f5bea 100644 --- a/interface/src/main/java/xsbti/compile/JavaCompiler.java +++ b/interface/src/main/java/xsbti/compile/JavaCompiler.java @@ -4,7 +4,7 @@ import java.io.File; import xsbti.Logger; import xsbti.Reporter; -/** +/** * Interface to a Java compiler. */ public interface JavaCompiler @@ -14,6 +14,7 @@ public interface JavaCompiler * * @deprecated 0.13.8 - Use compileWithReporter instead */ + @Deprecated void compile(File[] sources, File[] classpath, Output output, String[] options, Logger log); /** @@ -23,4 +24,4 @@ public interface JavaCompiler * Failures should be passed to the provided Reporter. */ void compileWithReporter(File[] sources, File[] classpath, Output output, String[] options, Reporter reporter, Logger log); -} \ No newline at end of file +} From 345cceafe63aac63e405cc7a60c69ef301de87c3 Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Sat, 1 Aug 2015 02:19:25 +0200 Subject: [PATCH 515/823] Simplify operations on collections --- util/collection/src/main/scala/sbt/Settings.scala | 2 +- util/complete/src/main/scala/sbt/LineReader.scala | 2 +- util/log/src/main/scala/sbt/LoggerWriter.scala | 2 +- util/relation/src/main/scala/sbt/Relation.scala | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 4266af1b6..0196b0a09 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -296,7 +296,7 @@ trait Init[Scope] { def definedAtString(settings: Seq[Setting[_]]): String = { val posDefined = settings.flatMap(_.positionString.toList) - if (posDefined.size > 0) { + if (posDefined.nonEmpty) { val header = if 
(posDefined.size == settings.size) "defined at:" else "some of the defining occurrences:" header + (posDefined.distinct mkString ("\n\t", "\n\t", "\n")) diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/util/complete/src/main/scala/sbt/LineReader.scala index 8f9fc219f..b85190f92 100644 --- a/util/complete/src/main/scala/sbt/LineReader.scala +++ b/util/complete/src/main/scala/sbt/LineReader.scala @@ -45,7 +45,7 @@ abstract class JLine extends LineReader { private[this] def handleMultilinePrompt(prompt: String): String = { val lines = """\r?\n""".r.split(prompt) - lines.size match { + lines.length match { case 0 | 1 => prompt case _ => reader.print(lines.init.mkString("\n") + "\n"); lines.last; } diff --git a/util/log/src/main/scala/sbt/LoggerWriter.scala b/util/log/src/main/scala/sbt/LoggerWriter.scala index 0165676f5..bc6062563 100644 --- a/util/log/src/main/scala/sbt/LoggerWriter.scala +++ b/util/log/src/main/scala/sbt/LoggerWriter.scala @@ -17,7 +17,7 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S override def close() = flush() override def flush(): Unit = synchronized { - if (buffer.length > 0) { + if (buffer.nonEmpty) { log(buffer.toString) buffer.clear() } diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/util/relation/src/main/scala/sbt/Relation.scala index 987aafb14..9a648ad64 100644 --- a/util/relation/src/main/scala/sbt/Relation.scala +++ b/util/relation/src/main/scala/sbt/Relation.scala @@ -123,7 +123,7 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ex def _1s = fwd.keySet def _2s = rev.keySet - def size = (fwd.valuesIterator map { _.size }).foldLeft(0)(_ + _) + def size = (fwd.valuesIterator map (_.size)).sum def all: Traversable[(A, B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map(b => (a, b)) }.toTraversable From c1de41f5c07d007e161354fb59e786143db6b4be Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Sat, 1 Aug 2015 12:05:35 +0200 Subject: 
[PATCH 516/823] Remove redundant collection conversions --- util/collection/src/main/scala/sbt/PMap.scala | 2 +- util/collection/src/main/scala/sbt/Settings.scala | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/util/collection/src/main/scala/sbt/PMap.scala b/util/collection/src/main/scala/sbt/PMap.scala index 51c942112..cf0454fd9 100644 --- a/util/collection/src/main/scala/sbt/PMap.scala +++ b/util/collection/src/main/scala/sbt/PMap.scala @@ -52,7 +52,7 @@ object IMap { put(k, f(this get k getOrElse init)) def mapValues[V2[_]](f: V ~> V2) = - new IMap0[K, V2](backing.mapValues(x => f(x)).toMap) + new IMap0[K, V2](backing.mapValues(x => f(x))) def mapSeparate[VL[_], VR[_]](f: V ~> ({ type l[T] = Either[VL[T], VR[T]] })#l) = { diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 0196b0a09..03173d6ce 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -17,9 +17,9 @@ sealed trait Settings[Scope] { } private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] { - def scopes: Set[Scope] = data.keySet.toSet + def scopes: Set[Scope] = data.keySet def keys(scope: Scope) = data(scope).keys.toSet - def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq; + def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = delegates(scope).toStream.flatMap(sc => getDirect(sc, key)).headOption From 9951f8c72b74cc98c5065d9658bf4460eb52274e Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 3 Aug 2015 22:16:56 +0100 Subject: [PATCH 517/823] Set version to 0.13.10-SNAPSHOT. 
---
 build.sbt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/build.sbt b/build.sbt
index 2a720e174..cdf634fe3 100644
--- a/build.sbt
+++ b/build.sbt
@@ -11,7 +11,7 @@ import Sxr.sxr
 // but can be shared across the multi projects.
 def buildLevelSettings: Seq[Setting[_]] = Seq(
   organization in ThisBuild := "org.scala-sbt",
-  version in ThisBuild := "0.13.9-SNAPSHOT",
+  version in ThisBuild := "0.13.10-SNAPSHOT",
   // bintrayOrganization in ThisBuild := None,
   // bintrayRepository in ThisBuild := "test-test-test",
   bintrayOrganization in ThisBuild := {

From 404c5e8fc628ae434afa289c9add12b2624affe7 Mon Sep 17 00:00:00 2001
From: Pierre DAL-PRA
Date: Mon, 3 Aug 2015 23:13:59 +0200
Subject: [PATCH 518/823] Replace procedure syntax by explicit Unit annotation

---
 cache/src/main/scala/sbt/Cache.scala          | 14 +++++------
 .../src/test/scala/xsbti/TestCallback.scala   | 14 +++++------
 .../main/scala/sbt/appmacro/ContextUtil.scala |  2 +-
 util/collection/src/main/scala/sbt/Dag.scala  | 14 +++++------
 .../collection/src/main/scala/sbt/INode.scala |  6 ++---
 .../collection/src/main/scala/sbt/Param.scala |  4 ++--
 .../src/main/scala/sbt/Settings.scala         |  2 +-
 .../scala/sbt/complete/JLineCompletion.scala  | 10 ++++----
 .../src/main/scala/sbt/complete/Parser.scala  |  4 ++--
 util/complete/src/test/scala/ParserTest.scala | 10 ++++----
 .../src/main/scala/sbt/ConsoleLogger.scala    | 10 ++++----
 util/log/src/main/scala/sbt/FilterLogger.scala | 12 +++++-----
 util/log/src/main/scala/sbt/FullLogger.scala  |  6 ++---
 .../log/src/main/scala/sbt/LoggerWriter.scala |  4 ++--
 util/log/src/main/scala/sbt/MultiLogger.scala | 20 ++++++++--------
 util/log/src/test/scala/LogWriterTest.scala   | 14 +++++------
 .../src/main/scala/sbt/ProcessImpl.scala      | 24 +++++++++----------
 .../src/test/scala/TestedProcess.scala        |  9 ++++---
 18 files changed, 89 insertions(+), 90 deletions(-)

diff --git a/cache/src/main/scala/sbt/Cache.scala b/cache/src/main/scala/sbt/Cache.scala
index c241394ba..bdfd8cb51 100644
---
a/cache/src/main/scala/sbt/Cache.scala +++ b/cache/src/main/scala/sbt/Cache.scala @@ -40,7 +40,7 @@ object Cache extends CacheImplicits { println(label + ".read: " + v) v } - def write(to: Out, v: Internal) { + def write(to: Out, v: Internal): Unit = { println(label + ".write: " + v) c.write(to, v) } @@ -119,7 +119,7 @@ trait BasicCacheImplicits { if (left <= 0) acc.reverse else next(left - 1, t.read(from) :: acc) next(size, Nil) } - def write(to: Out, vs: Internal) { + def write(to: Out, vs: Internal): Unit = { val size = vs.length IntFormat.writes(to, size) for (v <- vs) t.write(to, v) @@ -165,7 +165,7 @@ trait HListCacheImplicits { val t = tail.read(from) (h, t) } - def write(to: Out, j: Internal) { + def write(to: Out, j: Internal): Unit = { head.write(to, j._1) tail.write(to, j._2) } @@ -185,7 +185,7 @@ trait HListCacheImplicits { val t = tail.reads(from) HCons(h, t) } - def writes(to: Out, hc: H :+: T) { + def writes(to: Out, hc: H :+: T): Unit = { head.writes(to, hc.head) tail.writes(to, hc.tail) } @@ -205,8 +205,8 @@ trait UnionImplicits { val value = cache.read(in) new Found[cache.Internal](cache, clazz, value, index) } - def write(to: Out, i: Internal) { - def write0[I](f: Found[I]) { + def write(to: Out, i: Internal): Unit = { + def write0[I](f: Found[I]): Unit = { ByteFormat.writes(to, f.index.toByte) f.cache.write(to, f.value) } @@ -245,4 +245,4 @@ trait UnionImplicits { def at(i: Int): (InputCache[_ <: UB], Class[_]) def find(forValue: UB): Found[_] } -} \ No newline at end of file +} diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala index 13b65df79..f0658597b 100644 --- a/interface/src/test/scala/xsbti/TestCallback.scala +++ b/interface/src/test/scala/xsbti/TestCallback.scala @@ -13,22 +13,22 @@ class TestCallback(override val nameHashing: Boolean = false) extends AnalysisCa val usedNames = scala.collection.mutable.Map.empty[File, Set[String]].withDefaultValue(Set.empty) val apis: 
scala.collection.mutable.Map[File, SourceAPI] = scala.collection.mutable.Map.empty - def sourceDependency(dependsOn: File, source: File, inherited: Boolean) { + def sourceDependency(dependsOn: File, source: File, inherited: Boolean): Unit = { val context = if(inherited) DependencyByInheritance else DependencyByMemberRef sourceDependency(dependsOn, source, context) } - def sourceDependency(dependsOn: File, source: File, context: DependencyContext) { sourceDependencies += ((dependsOn, source, context)) } - def binaryDependency(binary: File, name: String, source: File, inherited: Boolean) { + def sourceDependency(dependsOn: File, source: File, context: DependencyContext): Unit = { sourceDependencies += ((dependsOn, source, context)) } + def binaryDependency(binary: File, name: String, source: File, inherited: Boolean): Unit = { val context = if(inherited) DependencyByInheritance else DependencyByMemberRef binaryDependency(binary, name, source, context) } - def binaryDependency(binary: File, name: String, source: File, context: DependencyContext) { binaryDependencies += ((binary, name, source, context)) } - def generatedClass(source: File, module: File, name: String) { products += ((source, module, name)) } + def binaryDependency(binary: File, name: String, source: File, context: DependencyContext): Unit = { binaryDependencies += ((binary, name, source, context)) } + def generatedClass(source: File, module: File, name: String): Unit = { products += ((source, module, name)) } - def usedName(source: File, name: String) { usedNames(source) += name } + def usedName(source: File, name: String): Unit = { usedNames(source) += name } def api(source: File, sourceAPI: SourceAPI): Unit = { assert(!apis.contains(source), s"The `api` method should be called once per source file: $source") apis(source) = sourceAPI } - def problem(category: String, pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean) {} + def problem(category: String, pos: 
xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean): Unit = () } diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index 29a962de7..a2f1e4e47 100644 --- a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -190,7 +190,7 @@ final class ContextUtil[C <: Context](val ctx: C) { // Workaround copied from scala/async:can be removed once https://github.com/scala/scala/pull/3179 is merged. private[this] class ChangeOwnerAndModuleClassTraverser(oldowner: global.Symbol, newowner: global.Symbol) extends global.ChangeOwnerTraverser(oldowner, newowner) { - override def traverse(tree: global.Tree) { + override def traverse(tree: global.Tree): Unit = { tree match { case _: global.DefTree => change(tree.symbol.moduleClass) case _ => diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/util/collection/src/main/scala/sbt/Dag.scala index 7c0fd6f2c..118cd0dff 100644 --- a/util/collection/src/main/scala/sbt/Dag.scala +++ b/util/collection/src/main/scala/sbt/Dag.scala @@ -21,18 +21,18 @@ object Dag { val finished = (new java.util.LinkedHashSet[T]).asScala def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(node: T) { + def visit(node: T): Unit = { if (!discovered(node)) { discovered(node) = true; try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } - finished += node; + finished += node } else if (!finished(node)) throw new Cyclic(node) } - visitAll(nodes); + visitAll(nodes) - finished.toList; + finished.toList } // doesn't check for cycles def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = topologicalSortUnchecked(node :: Nil)(dependencies) @@ -43,11 +43,11 @@ object Dag { var finished: List[T] = Nil def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(node: T) { + def visit(node: T): Unit = { if 
(!discovered(node)) { - discovered(node) = true; + discovered(node) = true visitAll(dependencies(node)) - finished ::= node; + finished ::= node } } diff --git a/util/collection/src/main/scala/sbt/INode.scala b/util/collection/src/main/scala/sbt/INode.scala index 1af592f77..ce39fadad 100644 --- a/util/collection/src/main/scala/sbt/INode.scala +++ b/util/collection/src/main/scala/sbt/INode.scala @@ -111,7 +111,7 @@ abstract class EvaluateSettings[Scope] { final def isNew: Boolean = synchronized { state == New } final def isCalling: Boolean = synchronized { state == Calling } final def registerIfNew(): Unit = synchronized { if (state == New) register() } - private[this] def register() { + private[this] def register(): Unit = { assert(state == New, "Already registered and: " + toString) val deps = dependsOn blockedOn = deps.size - deps.count(_.doneOrBlock(this)) @@ -133,12 +133,12 @@ abstract class EvaluateSettings[Scope] { if (blockedOn == 0) schedule() } final def evaluate(): Unit = synchronized { evaluate0() } - protected final def makeCall(source: BindNode[_, T], target: INode[T]) { + protected final def makeCall(source: BindNode[_, T], target: INode[T]): Unit = { assert(state == Ready, "Invalid state for call to makeCall: " + toString) state = Calling target.call(source) } - protected final def setValue(v: T) { + protected final def setValue(v: T): Unit = { assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) if (v == null) sys.error("Setting value cannot be null: " + keyString) value = v diff --git a/util/collection/src/main/scala/sbt/Param.scala b/util/collection/src/main/scala/sbt/Param.scala index 6f674efdc..19d12798a 100644 --- a/util/collection/src/main/scala/sbt/Param.scala +++ b/util/collection/src/main/scala/sbt/Param.scala @@ -20,11 +20,11 @@ object Param { type T = s def in = a private var r: B[T] = _ - def ret(b: B[T]) { r = b } + def ret(b: B[T]): Unit = { r = b } def ret: B[T] = r } p(v) v.ret } } -} \ No 
newline at end of file +} diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/util/collection/src/main/scala/sbt/Settings.scala index 03173d6ce..eb4227d09 100644 --- a/util/collection/src/main/scala/sbt/Settings.scala +++ b/util/collection/src/main/scala/sbt/Settings.scala @@ -354,7 +354,7 @@ trait Init[Scope] { // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present val defined = new mutable.HashSet[ScopedKey[_]] - def addDefs(ss: Seq[Setting[_]]) { for (s <- ss) defined += s.key } + def addDefs(ss: Seq[Setting[_]]): Unit = { for (s <- ss) defined += s.key } addDefs(defs) // true iff the scoped key is in `defined`, taking delegation into account diff --git a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala index 1d876f0ba..fed89541f 100644 --- a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala +++ b/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala @@ -104,7 +104,7 @@ object JLineCompletion { !(common.isEmpty && display.isEmpty) } - def appendCompletion(common: String, reader: ConsoleReader) { + def appendCompletion(common: String, reader: ConsoleReader): Unit = { reader.getCursorBuffer.write(common) reader.redrawLine() } @@ -113,16 +113,16 @@ object JLineCompletion { * `display` is assumed to be the exact strings requested to be displayed. * In particular, duplicates should have been removed already. 
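Patch 518's conversions are mechanical: Scala's procedure syntax (a method body in braces with no `=`) is replaced everywhere by an explicit `: Unit =` result type. A minimal hypothetical before/after sketch of that rewrite, not taken from the patch itself:

```scala
// Procedure syntax vs. explicit Unit annotation.
object ProcedureSyntaxDemo {
  // Old style (procedure syntax, deprecated in later Scala versions):
  //   def log(msg: String) { println(msg) }
  // New style, equivalent but explicit:
  def log(msg: String): Unit = { println(msg) }

  // The explicit `=` also guards against a classic pitfall: without it,
  // the last expression's value would be silently discarded as Unit.
  def sum(a: Int, b: Int): Int = a + b

  def main(args: Array[String]): Unit = {
    log("hello")
    assert(sum(1, 2) == 3)
  }
}
```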
*/ - def showCompletions(display: Seq[String], reader: ConsoleReader) { + def showCompletions(display: Seq[String], reader: ConsoleReader): Unit = { printCompletions(display, reader) reader.drawLine() } - def printCompletions(cs: Seq[String], reader: ConsoleReader) { + def printCompletions(cs: Seq[String], reader: ConsoleReader): Unit = { val print = shouldPrint(cs, reader) reader.println() if (print) printLinesAndColumns(cs, reader) } - def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader) { + def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader): Unit = { val (lines, columns) = cs partition hasNewline for (line <- lines) { reader.print(line) @@ -153,4 +153,4 @@ object JLineCompletion { @tailrec def loop(i: Int): Int = if (i >= len) len else if (a(i) != b(i)) i else loop(i + 1) a.substring(0, loop(0)) } -} \ No newline at end of file +} diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/util/complete/src/main/scala/sbt/complete/Parser.scala index c52d16b91..892119f6c 100644 --- a/util/complete/src/main/scala/sbt/complete/Parser.scala +++ b/util/complete/src/main/scala/sbt/complete/Parser.scala @@ -185,7 +185,7 @@ object Parser extends ParserMain { def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive) @deprecated("This method is deprecated and will be removed in the next major version. 
Use the parser directly to check for invalid completions.", since = "0.13.2") - def checkMatches(a: Parser[_], completions: Seq[String]) { + def checkMatches(a: Parser[_], completions: Seq[String]): Unit = { val bad = completions.filter(apply(a)(_).resultEmpty.isFailure) if (bad.nonEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) } @@ -845,4 +845,4 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m for (value <- repeated.resultEmpty) yield makeList(min, value) } override def toString = "repeat(" + min + "," + max + "," + partial + "," + repeated + ")" -} \ No newline at end of file +} diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index 53d6cb1db..efe2f3493 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -15,7 +15,7 @@ object JLineTest { } val parsers = Map("1" -> one, "2" -> two, "3" -> three, "4" -> four, "5" -> five) - def main(args: Array[String]) { + def main(args: Array[String]): Unit = { import jline.TerminalFactory import jline.console.ConsoleReader val reader = new ConsoleReader() @@ -23,7 +23,7 @@ object JLineTest { val parser = parsers(args(0)) JLineCompletion.installCustomCompletor(reader, parser) - def loop() { + def loop(): Unit = { val line = reader.readLine("> ") if (line ne null) { println("Result: " + apply(parser)(line).resultEmpty) @@ -130,7 +130,7 @@ object ParserExample { println(apply(t)("test w").resultEmpty) println(apply(t)("test was were").resultEmpty) - def run(n: Int) { + def run(n: Int): Unit = { val a = 'a'.id val aq = a.? 
val aqn = repeat(aq, min = n, max = n) @@ -140,9 +140,9 @@ object ParserExample { def r = apply(ann)("a" * (n * 2)).resultEmpty println(r.isValid) } - def run2(n: Int) { + def run2(n: Int): Unit = { val ab = "ab".?.* val r = apply(ab)("a" * n).resultEmpty println(r) } -} \ No newline at end of file +} diff --git a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/util/log/src/main/scala/sbt/ConsoleLogger.scala index f86d7f2f5..c1558cc66 100644 --- a/util/log/src/main/scala/sbt/ConsoleLogger.scala +++ b/util/log/src/main/scala/sbt/ConsoleLogger.scala @@ -81,7 +81,7 @@ object ConsoleLogger { nextESC(s, 0, sb) sb.toString } - private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder) { + private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder): Unit = { val escIndex = s.indexOf(ESC, start) if (escIndex < 0) sb.append(s, start, s.length) @@ -167,7 +167,7 @@ class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val an } def successLabelColor = GREEN def successMessageColor = RESET - override def success(message: => String) { + override def success(message: => String): Unit = { if (successEnabled) log(successLabelColor, Level.SuccessLabel, successMessageColor, message) } @@ -180,13 +180,13 @@ class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val an for (msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useColor))) printLabeledLine(labelColor(Level.Error), "trace", messageColor(Level.Error), msg) } - def log(level: Level.Value, message: => String) { + def log(level: Level.Value, message: => String): Unit = { if (atLevel(level)) log(labelColor(level), level.toString, messageColor(level), message) } private def reset(): Unit = setColor(RESET) - private def setColor(color: String) { + private def setColor(color: String): Unit = { if (ansiCodesSupported && useColor) out.lockObject.synchronized { out.print(color) } } @@ -210,6 +210,6 @@ class ConsoleLogger 
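The `nextESC` helper touched in the `ConsoleLogger` hunk above is part of sbt's logic for removing ANSI escape sequences from log output. As a rough, self-contained sketch of that idea (an assumed simplification that handles only CSI sequences such as `ESC[...m`, not sbt's actual implementation):

```scala
// Strip ANSI CSI escape sequences (ESC '[' parameters final-byte) from a string.
object StripAnsi {
  private val Esc = '\u001b'

  def strip(s: String): String = {
    val sb = new java.lang.StringBuilder
    var i = 0
    while (i < s.length) {
      if (s.charAt(i) == Esc) {
        i += 1
        if (i < s.length && s.charAt(i) == '[') {
          i += 1
          // skip parameter bytes (digits, ';') up to the final letter byte
          while (i < s.length && !s.charAt(i).isLetter) i += 1
          if (i < s.length) i += 1 // consume the final byte, e.g. 'm'
        }
      } else {
        sb.append(s.charAt(i))
        i += 1
      }
    }
    sb.toString
  }

  def main(args: Array[String]): Unit = {
    assert(strip("\u001b[32mok\u001b[0m") == "ok")
    assert(strip("plain") == "plain")
  }
}
```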
private[ConsoleLogger] (val out: ConsoleOut, override val an } def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } - def control(event: ControlEvent.Value, message: => String) { log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } + def control(event: ControlEvent.Value, message: => String): Unit = log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } final class SuppressedTraceContext(val traceLevel: Int, val useColor: Boolean) diff --git a/util/log/src/main/scala/sbt/FilterLogger.scala b/util/log/src/main/scala/sbt/FilterLogger.scala index d3547f34f..5259a6e12 100644 --- a/util/log/src/main/scala/sbt/FilterLogger.scala +++ b/util/log/src/main/scala/sbt/FilterLogger.scala @@ -9,23 +9,23 @@ package sbt */ class FilterLogger(delegate: AbstractLogger) extends BasicLogger { override lazy val ansiCodesSupported = delegate.ansiCodesSupported - def trace(t: => Throwable) { + def trace(t: => Throwable): Unit = { if (traceEnabled) delegate.trace(t) } - override def setSuccessEnabled(flag: Boolean) { delegate.setSuccessEnabled(flag) } + override def setSuccessEnabled(flag: Boolean): Unit = delegate.setSuccessEnabled(flag) override def successEnabled = delegate.successEnabled - override def setTrace(level: Int) { delegate.setTrace(level) } + override def setTrace(level: Int): Unit = delegate.setTrace(level) override def getTrace = delegate.getTrace - def log(level: Level.Value, message: => String) { + def log(level: Level.Value, message: => String): Unit = { if (atLevel(level)) delegate.log(level, message) } - def success(message: => String) { + def success(message: => String): Unit = { if (successEnabled) delegate.success(message) } - def control(event: ControlEvent.Value, message: => String) { + def control(event: ControlEvent.Value, message: => String): Unit = { if (atLevel(Level.Info)) delegate.control(event, message) } diff --git a/util/log/src/main/scala/sbt/FullLogger.scala 
b/util/log/src/main/scala/sbt/FullLogger.scala index 968712317..32873eff7 100644 --- a/util/log/src/main/scala/sbt/FullLogger.scala +++ b/util/log/src/main/scala/sbt/FullLogger.scala @@ -6,11 +6,11 @@ package sbt /** Promotes the simple Logger interface to the full AbstractLogger interface. */ class FullLogger(delegate: Logger) extends BasicLogger { override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported - def trace(t: => Throwable) { + def trace(t: => Throwable): Unit = { if (traceEnabled) delegate.trace(t) } - def log(level: Level.Value, message: => String) { + def log(level: Level.Value, message: => String): Unit = { if (atLevel(level)) delegate.log(level, message) } @@ -27,4 +27,4 @@ object FullLogger { case d: AbstractLogger => d case _ => new FullLogger(delegate) } -} \ No newline at end of file +} diff --git a/util/log/src/main/scala/sbt/LoggerWriter.scala b/util/log/src/main/scala/sbt/LoggerWriter.scala index bc6062563..9be8af409 100644 --- a/util/log/src/main/scala/sbt/LoggerWriter.scala +++ b/util/log/src/main/scala/sbt/LoggerWriter.scala @@ -34,7 +34,7 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S process() } - private[this] def process() { + private[this] def process(): Unit = { val i = buffer.indexOf(nl) if (i >= 0) { log(buffer.substring(0, i)) @@ -46,4 +46,4 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S case None => lines += s case Some(level) => delegate.log(level, s) } -} \ No newline at end of file +} diff --git a/util/log/src/main/scala/sbt/MultiLogger.scala b/util/log/src/main/scala/sbt/MultiLogger.scala index 77c4c11d4..a6de160cd 100644 --- a/util/log/src/main/scala/sbt/MultiLogger.scala +++ b/util/log/src/main/scala/sbt/MultiLogger.scala @@ -11,24 +11,24 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { private[this] lazy val allSupportCodes = delegates forall supported private[this] def supported = (_: 
AbstractLogger).ansiCodesSupported - override def setLevel(newLevel: Level.Value) { + override def setLevel(newLevel: Level.Value): Unit = { super.setLevel(newLevel) dispatch(new SetLevel(newLevel)) } - override def setTrace(level: Int) { + override def setTrace(level: Int): Unit = { super.setTrace(level) dispatch(new SetTrace(level)) } - override def setSuccessEnabled(flag: Boolean) { + override def setSuccessEnabled(flag: Boolean): Unit = { super.setSuccessEnabled(flag) dispatch(new SetSuccess(flag)) } - def trace(t: => Throwable) { dispatch(new Trace(t)) } - def log(level: Level.Value, message: => String) { dispatch(new Log(level, message)) } - def success(message: => String) { dispatch(new Success(message)) } - def logAll(events: Seq[LogEvent]) { delegates.foreach(_.logAll(events)) } - def control(event: ControlEvent.Value, message: => String) { delegates.foreach(_.control(event, message)) } - private[this] def dispatch(event: LogEvent) { + def trace(t: => Throwable): Unit = dispatch(new Trace(t)) + def log(level: Level.Value, message: => String): Unit = dispatch(new Log(level, message)) + def success(message: => String): Unit = dispatch(new Success(message)) + def logAll(events: Seq[LogEvent]): Unit = delegates.foreach(_.logAll(events)) + def control(event: ControlEvent.Value, message: => String): Unit = delegates.foreach(_.control(event, message)) + private[this] def dispatch(event: LogEvent): Unit = { val plainEvent = if (allSupportCodes) event else removeEscapes(event) for (d <- delegates) if (d.ansiCodesSupported) @@ -47,4 +47,4 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { case _: Trace | _: SetLevel | _: SetTrace | _: SetSuccess => event } } -} \ No newline at end of file +} diff --git a/util/log/src/test/scala/LogWriterTest.scala b/util/log/src/test/scala/LogWriterTest.scala index db8250139..ce96e9bc9 100644 --- a/util/log/src/test/scala/LogWriterTest.scala +++ b/util/log/src/test/scala/LogWriterTest.scala @@ -40,7 +40,7 @@ 
object LogWriterTest extends Properties("Log Writer") { * represented as separately written segments (ToLog instances). ToLog.`byCharacter` * indicates whether to write the segment by character (true) or all at once (false) */ - def logLines(writer: Writer, lines: List[List[ToLog]], newLine: String) { + def logLines(writer: Writer, lines: List[List[ToLog]], newLine: String): Unit = { for (line <- lines; section <- line) { val content = section.content val normalized = Escape.newline(content, newLine) @@ -141,10 +141,10 @@ final class RecordingLogger extends BasicLogger { def getEvents = events.reverse override def ansiCodesSupported = true - def trace(t: => Throwable) { events ::= new Trace(t) } - def log(level: Level.Value, message: => String) { events ::= new Log(level, message) } - def success(message: => String) { events ::= new Success(message) } - def logAll(es: Seq[LogEvent]) { events :::= es.toList } - def control(event: ControlEvent.Value, message: => String) { events ::= new ControlEvent(event, message) } + def trace(t: => Throwable): Unit = { events ::= new Trace(t) } + def log(level: Level.Value, message: => String): Unit = { events ::= new Log(level, message) } + def success(message: => String): Unit = { events ::= new Success(message) } + def logAll(es: Seq[LogEvent]): Unit = { events :::= es.toList } + def control(event: ControlEvent.Value, message: => String): Unit = { events ::= new ControlEvent(event, message) } -} \ No newline at end of file +} diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/util/process/src/main/scala/sbt/ProcessImpl.scala index 0800e8b49..abef81b33 100644 --- a/util/process/src/main/scala/sbt/ProcessImpl.scala +++ b/util/process/src/main/scala/sbt/ProcessImpl.scala @@ -58,8 +58,8 @@ object BasicIO { processLinesFully(processLine)(reader.readLine) reader.close() } - def processLinesFully(processLine: String => Unit)(readLine: () => String) { - def readFully() { + def processLinesFully(processLine: String => 
Unit)(readLine: () => String): Unit = { + def readFully(): Unit = { val line = readLine() if (line != null) { processLine(line) @@ -68,7 +68,7 @@ object BasicIO { } readFully() } - def connectToIn(o: OutputStream) { transferFully(Uncloseable protect System.in, o) } + def connectToIn(o: OutputStream): Unit = transferFully(Uncloseable protect System.in, o) def input(connect: Boolean): OutputStream => Unit = if (connect) connectToIn else closeOut def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput), inheritInput(connectInput)) def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Boolean): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) @@ -87,10 +87,10 @@ object BasicIO { buffer.append(Newline) } - private[this] def transferFullyImpl(in: InputStream, out: OutputStream) { + private[this] def transferFullyImpl(in: InputStream, out: OutputStream): Unit = { val continueCount = 1 //if(in.isInstanceOf[PipedInputStream]) 1 else 0 val buffer = new Array[Byte](BufferSize) - def read() { + def read(): Unit = { val byteCount = in.read(buffer) if (byteCount >= continueCount) { out.write(buffer, 0, byteCount) @@ -189,7 +189,7 @@ private abstract class BasicProcess extends Process { } private abstract class CompoundProcess extends BasicProcess { - def destroy() { destroyer() } + def destroy(): Unit = destroyer() def exitValue() = getExitValue().getOrElse(sys.error("No exit code: process destroyed.")) def start() = getExitValue @@ -294,7 +294,7 @@ private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: Pr } } private class PipeSource(currentSource: SyncVar[Option[InputStream]], pipe: PipedOutputStream, label: => String) extends Thread { - final override def run() { + final override def run(): Unit = { currentSource.get match { case Some(source) => try { BasicIO.transferFully(source, pipe) } @@ -311,7 +311,7 @@ private class PipeSource(currentSource: SyncVar[Option[InputStream]], pipe: Pipe } } private class 
PipeSink(pipe: PipedInputStream, currentSink: SyncVar[Option[OutputStream]], label: => String) extends Thread { - final override def run() { + final override def run(): Unit = { currentSink.get match { case Some(sink) => try { BasicIO.transferFully(pipe, sink) } @@ -338,7 +338,7 @@ private[sbt] class DummyProcessBuilder(override val toString: String, exitValue: private class DummyProcess(action: => Int) extends Process { private[this] val exitCode = Future(action) override def exitValue() = exitCode() - override def destroy() {} + override def destroy(): Unit = () } /** Represents a simple command without any redirection or combination. */ private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProcessBuilder { @@ -410,12 +410,12 @@ private final class ThreadProcess(thread: Thread, success: SyncVar[Boolean]) ext thread.join() if (success.get) 0 else 1 } - override def destroy() { thread.interrupt() } + override def destroy(): Unit = thread.interrupt() } object Uncloseable { - def apply(in: InputStream): InputStream = new FilterInputStream(in) { override def close() {} } - def apply(out: OutputStream): OutputStream = new FilterOutputStream(out) { override def close() {} } + def apply(in: InputStream): InputStream = new FilterInputStream(in) { override def close(): Unit = () } + def apply(out: OutputStream): OutputStream = new FilterOutputStream(out) { override def close(): Unit = () } def protect(in: InputStream): InputStream = if (in eq System.in) Uncloseable(in) else in def protect(out: OutputStream): OutputStream = if ((out eq System.out) || (out eq System.err)) Uncloseable(out) else out } diff --git a/util/process/src/test/scala/TestedProcess.scala b/util/process/src/test/scala/TestedProcess.scala index f83207a07..8dabcf381 100644 --- a/util/process/src/test/scala/TestedProcess.scala +++ b/util/process/src/test/scala/TestedProcess.scala @@ -3,12 +3,12 @@ package sbt import java.io.{ File, FileNotFoundException, IOException } object exit { - 
def main(args: Array[String]) { + def main(args: Array[String]): Unit = { System.exit(java.lang.Integer.parseInt(args(0))) } } object cat { - def main(args: Array[String]) { + def main(args: Array[String]): Unit = { try { if (args.length == 0) IO.transfer(System.in, System.out) @@ -41,7 +41,6 @@ object cat { } } object echo { - def main(args: Array[String]) { + def main(args: Array[String]): Unit = System.out.println(args.mkString(" ")) - } -} \ No newline at end of file +} From 8b8de1101ca6d5ad722c8851c4473990c2789894 Mon Sep 17 00:00:00 2001 From: Pierre DAL-PRA Date: Fri, 7 Aug 2015 00:23:14 +0200 Subject: [PATCH 519/823] Fix additional warnings --- cache/src/test/scala/CacheTest.scala | 4 +-- util/complete/src/test/scala/ParserTest.scala | 32 +++++++++---------- .../logic/src/test/scala/sbt/logic/Test.scala | 2 +- 3 files changed, 19 insertions(+), 19 deletions(-) diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index cbb7319b7..ca66ba925 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -12,7 +12,7 @@ object CacheTest // extends Properties("Cache test") import FileInfo.hash._ import Ordering._ import sbinary.DefaultProtocol.FileFormat - def test { + def test(): Unit = { lazy val create = new File("test") val length = cached(lengthCache) { @@ -28,4 +28,4 @@ object CacheTest // extends Properties("Cache test") } c(create :+: fileLength :+: HNil) } -} \ No newline at end of file +} diff --git a/util/complete/src/test/scala/ParserTest.scala b/util/complete/src/test/scala/ParserTest.scala index efe2f3493..f02431e53 100644 --- a/util/complete/src/test/scala/ParserTest.scala +++ b/util/complete/src/test/scala/ParserTest.scala @@ -44,7 +44,7 @@ object ParserTest extends Properties("Completing Parser") { val nested = (token("a1") ~ token("b2")) ~ "c3" val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3" - val spacePort = (token(Space) ~> Port) + val spacePort = token(Space) 
~> Port def p[T](f: T): T = { println(f); f } @@ -58,7 +58,7 @@ object ParserTest extends Properties("Completing Parser") { def checkAll(in: String, parser: Parser[_], expect: Completions): Prop = { val cs = completions(parser, in, 1) - ("completions: " + cs) |: ("Expected: " + expect) |: ((cs == expect): Prop) + ("completions: " + cs) |: ("Expected: " + expect) |: (cs == expect: Prop) } def checkInvalid(in: String) = @@ -68,31 +68,31 @@ object ParserTest extends Properties("Completing Parser") { def checkInv(in: String, parser: Parser[_]): Prop = { val cs = completions(parser, in, 1) - ("completions: " + cs) |: ((cs == Completions.nil): Prop) + ("completions: " + cs) |: (cs == Completions.nil: Prop) } - property("nested tokens a") = checkSingle("", Completion.tokenStrict("", "a1"))(Completion.displayStrict("")) - property("nested tokens a1") = checkSingle("a", Completion.tokenStrict("a", "1"))(Completion.displayStrict("")) + property("nested tokens a") = checkSingle("", Completion.token("", "a1"))(Completion.displayOnly("")) + property("nested tokens a1") = checkSingle("a", Completion.token("a", "1"))(Completion.displayOnly("")) property("nested tokens a inv") = checkInvalid("b") - property("nested tokens b") = checkSingle("a1", Completion.tokenStrict("", "b2"))(Completion.displayStrict("")) - property("nested tokens b2") = checkSingle("a1b", Completion.tokenStrict("b", "2"))(Completion.displayStrict("")) + property("nested tokens b") = checkSingle("a1", Completion.token("", "b2"))(Completion.displayOnly("")) + property("nested tokens b2") = checkSingle("a1b", Completion.token("b", "2"))(Completion.displayOnly("")) property("nested tokens b inv") = checkInvalid("a1a") - property("nested tokens c") = checkSingle("a1b2", Completion.suggestStrict("c3"))() - property("nested tokens c3") = checkSingle("a1b2c", Completion.suggestStrict("3"))() + property("nested tokens c") = checkSingle("a1b2", Completion.suggestion("c3"))() + property("nested tokens c3") = 
checkSingle("a1b2c", Completion.suggestion("3"))() property("nested tokens c inv") = checkInvalid("a1b2a") - property("suggest space") = checkOne("", spacePort, Completion.tokenStrict("", " ")) - property("suggest port") = checkOne(" ", spacePort, Completion.displayStrict("")) - property("no suggest at end") = checkOne("asdf", "asdf", Completion.suggestStrict("")) - property("no suggest at token end") = checkOne("asdf", token("asdf"), Completion.suggestStrict("")) - property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestStrict("")) - property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestStrict("")) + property("suggest space") = checkOne("", spacePort, Completion.token("", " ")) + property("suggest port") = checkOne(" ", spacePort, Completion.displayOnly("")) + property("no suggest at end") = checkOne("asdf", "asdf", Completion.suggestion("")) + property("no suggest at token end") = checkOne("asdf", token("asdf"), Completion.suggestion("")) + property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestion("")) + property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestion("")) val colors = Set("blue", "green", "red") val base = (seen: Seq[String]) => token(ID examples (colors -- seen)) val sep = token(Space) val repeat = repeatDep(base, sep) - def completionStrings(ss: Set[String]): Completions = Completions(ss.map { s => Completion.tokenStrict("", s) }) + def completionStrings(ss: Set[String]): Completions = Completions(ss.map { s => Completion.token("", s) }) property("repeatDep no suggestions for bad input") = checkInv(".", repeat) property("repeatDep suggest all") = checkAll("", repeat, completionStrings(colors)) diff --git a/util/logic/src/test/scala/sbt/logic/Test.scala b/util/logic/src/test/scala/sbt/logic/Test.scala index 
a5277582c..f62a9e767 100644 --- a/util/logic/src/test/scala/sbt/logic/Test.scala +++ b/util/logic/src/test/scala/sbt/logic/Test.scala @@ -104,7 +104,7 @@ object TestClauses { Logic.reduceAll(cs, Set(A, B)) } - def all { + def all(): Unit = { println(s"Cycles: $cycles") println(s"xNeg: $excludedNeg") println(s"xPos: $excludedPos") From 0fdf0ce93980a05787e3f46a2574fef3330153b5 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 10 Aug 2015 20:32:24 -0400 Subject: [PATCH 520/823] 0.13.9 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 0f802c394..94392be9a 100644 --- a/build.sbt +++ b/build.sbt @@ -11,7 +11,7 @@ import Sxr.sxr // but can be shared across the multi projects. def buildLevelSettings: Seq[Setting[_]] = Seq( organization in ThisBuild := "org.scala-sbt", - version in ThisBuild := "0.13.9-SNAPSHOT", + version in ThisBuild := "0.13.9", // bintrayOrganization in ThisBuild := None, // bintrayRepository in ThisBuild := "test-test-test", bintrayOrganization in ThisBuild := { From 81a79826d6e802d6781f0e7b787f5a89a935bd5a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 20 Aug 2015 00:18:02 -0400 Subject: [PATCH 521/823] Add build.sbt --- build.sbt | 614 +++++-------------------------------- project/Dependencies.scala | 34 ++ project/Util.scala | 38 +++ project/build.properties | 1 + 4 files changed, 152 insertions(+), 535 deletions(-) create mode 100644 project/Dependencies.scala create mode 100644 project/Util.scala create mode 100644 project/build.properties diff --git a/build.sbt b/build.sbt index cdf634fe3..66a045d8f 100644 --- a/build.sbt +++ b/build.sbt @@ -1,607 +1,151 @@ -import Project.Initialize -import Util._ import Dependencies._ -import Licensed._ -import Scope.ThisScope -import Scripted._ -import StringUtilities.normalize -import Sxr.sxr +import Util._ + +def internalPath = file("internal") +def utilPath = file("util") +def cachePath = file("cache") // ThisBuild settings take lower 
precedence, // but can be shared across the multi projects. def buildLevelSettings: Seq[Setting[_]] = Seq( - organization in ThisBuild := "org.scala-sbt", - version in ThisBuild := "0.13.10-SNAPSHOT", - // bintrayOrganization in ThisBuild := None, - // bintrayRepository in ThisBuild := "test-test-test", - bintrayOrganization in ThisBuild := { - if ((publishStatus in ThisBuild).value == "releases") Some("typesafe") - else Some("sbt") - }, - bintrayRepository in ThisBuild := s"ivy-${(publishStatus in ThisBuild).value}", - bintrayPackage in ThisBuild := "sbt", - bintrayReleaseOnPublish in ThisBuild := false + organization in ThisBuild := "org.scala-sbt.util", + version in ThisBuild := "1.0.0-SNAPSHOT" + // bintrayOrganization in ThisBuild := { + // if ((publishStatus in ThisBuild).value == "releases") Some("typesafe") + // else Some("sbt") + // }, + // bintrayRepository in ThisBuild := s"ivy-${(publishStatus in ThisBuild).value}", + // bintrayPackage in ThisBuild := "sbt", + // bintrayReleaseOnPublish in ThisBuild := false ) def commonSettings: Seq[Setting[_]] = Seq( - scalaVersion := scala210, - publishArtifact in packageDoc := false, - publishMavenStyle := false, - componentID := None, - crossPaths := false, + scalaVersion := "2.10.5", + // publishArtifact in packageDoc := false, resolvers += Resolver.typesafeIvyRepo("releases"), resolvers += Resolver.sonatypeRepo("snapshots"), - concurrentRestrictions in Global += Util.testExclusiveRestriction, + // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), - incOptions := incOptions.value.withNameHashing(true), - crossScalaVersions := Seq(scala210), - bintrayPackage := (bintrayPackage in ThisBuild).value, - bintrayRepository := (bintrayRepository in ThisBuild).value + incOptions := incOptions.value.withNameHashing(true) + // crossScalaVersions := 
Seq(scala210) + // bintrayPackage := (bintrayPackage in ThisBuild).value, + // bintrayRepository := (bintrayRepository in ThisBuild).value ) -def minimalSettings: Seq[Setting[_]] = - commonSettings ++ customCommands ++ - publishPomSettings ++ Release.javaVersionCheckSettings +// def testedBaseSettings: Seq[Setting[_]] = +// baseSettings ++ testDependencies +def testedBaseSettings: Seq[Setting[_]] = commonSettings -def baseSettings: Seq[Setting[_]] = - minimalSettings ++ Seq(projectComponent) ++ baseScalacOptions ++ Licensed.settings ++ Formatting.settings - -def testedBaseSettings: Seq[Setting[_]] = - baseSettings ++ testDependencies - -lazy val sbtRoot: Project = (project in file(".")). - configs(Sxr.sxrConf). - aggregate(nonRoots: _*). +lazy val utilRoot: Project = (project in file(".")). + // configs(Sxr.sxrConf). + aggregate( + utilInterface, utilControl, utilCollection, utilApplyMacro, utilComplete, + utilLogging, utilRelation, utilLogic, utilCache, utilTracking + ). settings( buildLevelSettings, - minimalSettings, - rootSettings, + commonSettings, + name := "Util Root", publish := {}, publishLocal := {} ) -// This is used to configure an sbt-launcher for this version of sbt. -lazy val bundledLauncherProj = - (project in file("launch")). - settings( - minimalSettings, - inConfig(Compile)(Transform.configSettings), - Release.launcherSettings(sbtLaunchJar) - ). - enablePlugins(SbtLauncherPlugin). - settings( - name := "sbt-launch", - moduleName := "sbt-launch", - description := "sbt application launcher", - publishArtifact in packageSrc := false, - autoScalaLibrary := false, - publish := Release.deployLauncher.value, - publishLauncher := Release.deployLauncher.value, - packageBin in Compile := sbtLaunchJar.value - ) - -/* ** subproject declarations ** */ - // defines Java structures used across Scala versions, such as the API structures and relationships extracted by // the analysis compiler phases and passed back to sbt. 
The API structures are defined in a simple // format from which Java sources are generated by the datatype generator subproject -lazy val interfaceProj = (project in file("interface")). +lazy val utilInterface = (project in file("interface")). settings( - minimalSettings, + commonSettings, javaOnlySettings, - name := "Interface", - projectComponent, - exportJars := true, - componentID := Some("xsbti"), - watchSources <++= apiDefinitions, - resourceGenerators in Compile <+= (version, resourceManaged, streams, compile in Compile) map generateVersionFile, - apiDefinitions <<= baseDirectory map { base => (base / "definition") :: (base / "other") :: (base / "type") :: Nil }, - sourceGenerators in Compile <+= (apiDefinitions, - fullClasspath in Compile in datatypeProj, - sourceManaged in Compile, - mainClass in datatypeProj in Compile, - runner, - streams) map generateAPICached + name := "Util Interface", + // projectComponent, + exportJars := true + // resourceGenerators in Compile <+= (version, resourceManaged, streams, compile in Compile) map generateVersionFile, + // apiDefinitions <<= baseDirectory map { base => (base / "definition") :: (base / "other") :: (base / "type") :: Nil }, + // sourceGenerators in Compile <+= (apiDefinitions, + // fullClasspath in Compile in datatypeProj, + // sourceManaged in Compile, + // mainClass in datatypeProj in Compile, + // runner, + // streams) map generateAPICached ) -// defines operations on the API of a source, including determining whether it has changed and converting it to a string -// and discovery of subclasses and annotations -lazy val apiProj = (project in compilePath / "api"). - dependsOn(interfaceProj, classfileProj). settings( - testedBaseSettings, - name := "API" - ) - -/* **** Utilities **** */ - -lazy val controlProj = (project in utilPath / "control").
- settings( - baseSettings, - Util.crossBuild, - name := "Control", + commonSettings, + // Util.crossBuild, + name := "Util Control", crossScalaVersions := Seq(scala210, scala211) ) -lazy val collectionProj = (project in utilPath / "collection"). +lazy val utilCollection = (project in utilPath / "collection"). settings( testedBaseSettings, Util.keywordsSettings, - Util.crossBuild, - name := "Collections", + // Util.crossBuild, + name := "Util Collection", crossScalaVersions := Seq(scala210, scala211) ) -lazy val applyMacroProj = (project in utilPath / "appmacro"). - dependsOn(collectionProj). +lazy val utilApplyMacro = (project in utilPath / "appmacro"). + dependsOn(utilCollection). settings( testedBaseSettings, - name := "Apply Macro", + name := "Util Apply Macro", libraryDependencies += scalaCompiler.value ) -// The API for forking, combining, and doing I/O with system processes -lazy val processProj = (project in utilPath / "process"). - dependsOn(ioProj % "test->test"). - settings( - baseSettings, - name := "Process", - libraryDependencies ++= scalaXml.value - ) - -// Path, IO (formerly FileUtilities), NameFilter and other I/O utility classes -lazy val ioProj = (project in utilPath / "io"). - dependsOn(controlProj). - settings( - testedBaseSettings, - Util.crossBuild, - name := "IO", - libraryDependencies += scalaCompiler.value % Test, - crossScalaVersions := Seq(scala210, scala211) - ) - -// Utilities related to reflection, managing Scala versions, and custom class loaders -lazy val classpathProj = (project in utilPath / "classpath"). - dependsOn(interfaceProj, ioProj). - settings( - testedBaseSettings, - name := "Classpath", - libraryDependencies ++= Seq(scalaCompiler.value,Dependencies.launcherInterface) - ) - // Command line-related utilities. -lazy val completeProj = (project in utilPath / "complete"). - dependsOn(collectionProj, controlProj, ioProj). +lazy val utilComplete = (project in utilPath / "complete"). + dependsOn(utilCollection, utilControl). 
settings( testedBaseSettings, - Util.crossBuild, - name := "Completion", - libraryDependencies += jline, + // Util.crossBuild, + name := "Util Completion", + libraryDependencies ++= Seq(jline, ioProj), crossScalaVersions := Seq(scala210, scala211) ) // logging -lazy val logProj = (project in utilPath / "log"). - dependsOn(interfaceProj, processProj). +lazy val utilLogging = (project in utilPath / "log"). + dependsOn(utilInterface/*, processProj*/). settings( testedBaseSettings, - name := "Logging", + name := "Util Logging", libraryDependencies += jline ) // Relation -lazy val relationProj = (project in utilPath / "relation"). - dependsOn(interfaceProj, processProj). +lazy val utilRelation = (project in utilPath / "relation"). + dependsOn(utilInterface/*, processProj*/). settings( testedBaseSettings, - name := "Relation" - ) - -// class file reader and analyzer -lazy val classfileProj = (project in utilPath / "classfile"). - dependsOn(ioProj, interfaceProj, logProj). - settings( - testedBaseSettings, - name := "Classfile" - ) - -// generates immutable or mutable Java data types according to a simple input format -lazy val datatypeProj = (project in utilPath / "datatype"). - dependsOn(ioProj). - settings( - baseSettings, - name := "Datatype Generator" - ) - -// cross versioning -lazy val crossProj = (project in utilPath / "cross"). - settings( - baseSettings, - inConfig(Compile)(Transform.crossGenSettings), - name := "Cross" + name := "Util Relation" ) // A logic with restricted negation as failure for a unique, stable model -lazy val logicProj = (project in utilPath / "logic"). - dependsOn(collectionProj, relationProj). +lazy val utilLogic = (project in utilPath / "logic"). + dependsOn(utilCollection, utilRelation). settings( testedBaseSettings, - name := "Logic" - ) - -/* **** Intermediate-level Modules **** */ - -// Apache Ivy integration -lazy val ivyProj = (project in file("ivy")). 
- dependsOn(interfaceProj, crossProj, logProj % "compile;test->test", ioProj % "compile;test->test", /*launchProj % "test->test",*/ collectionProj). - settings( - baseSettings, - name := "Ivy", - libraryDependencies ++= Seq(ivy, jsch, sbtSerialization, scalaReflect.value, launcherInterface), - testExclusive) - -// Runner for uniform test interface -lazy val testingProj = (project in file("testing")). - dependsOn(ioProj, classpathProj, logProj, testAgentProj). - settings( - baseSettings, - name := "Testing", - libraryDependencies ++= Seq(testInterface,launcherInterface) - ) - -// Testing agent for running tests in a separate process. -lazy val testAgentProj = (project in file("testing") / "agent"). - settings( - minimalSettings, - name := "Test Agent", - libraryDependencies += testInterface - ) - -// Basic task engine -lazy val taskProj = (project in tasksPath). - dependsOn(controlProj, collectionProj). - settings( - testedBaseSettings, - name := "Tasks" - ) - -// Standard task system. This provides map, flatMap, join, and more on top of the basic task model. -lazy val stdTaskProj = (project in tasksPath / "standard"). - dependsOn (taskProj % "compile;test->test", collectionProj, logProj, ioProj, processProj). - settings( - testedBaseSettings, - name := "Task System", - testExclusive + name := "Util Logic" ) // Persisted caching based on SBinary -lazy val cacheProj = (project in cachePath). - dependsOn (ioProj, collectionProj). +lazy val utilCache = (project in cachePath). + dependsOn(utilCollection). settings( - baseSettings, - name := "Cache", - libraryDependencies ++= Seq(sbinary, sbtSerialization, scalaReflect.value) ++ scalaXml.value + commonSettings, + name := "Util Cache", + libraryDependencies ++= Seq(sbinary, sbtSerialization, scalaReflect.value, ioProj) ++ scalaXml.value ) // Builds on cache to provide caching for filesystem-related operations -lazy val trackingProj = (project in cachePath / "tracking"). - dependsOn(cacheProj, ioProj). 
+lazy val utilTracking = (project in cachePath / "tracking"). + dependsOn(utilCache). settings( - baseSettings, - name := "Tracking" + commonSettings, + name := "Util Tracking", + libraryDependencies ++= Seq(ioProj) ) - -// Embedded Scala code runner -lazy val runProj = (project in file("run")). - dependsOn (ioProj, logProj % "compile;test->test", classpathProj, processProj % "compile;test->test"). - settings( - testedBaseSettings, - name := "Run" - ) - -// Compiler-side interface to compiler that is compiled against the compiler being used either in advance or on the fly. -// Includes API and Analyzer phases that extract source API and relationships. -lazy val compileInterfaceProj = (project in compilePath / "interface"). - dependsOn(interfaceProj % "compile;test->test", ioProj % "test->test", logProj % "test->test", /*launchProj % "test->test",*/ apiProj % "test->test"). - settings( - baseSettings, - libraryDependencies += scalaCompiler.value % "provided", - name := "Compiler Interface", - exportJars := true, - // we need to fork because in unit tests we set usejavacp = true which means - // we are expecting all of our dependencies to be on classpath so Scala compiler - // can use them while constructing its own classpath for compilation - fork in Test := true, - // needed because we fork tests and tests are ran in parallel so we have multiple Scala - // compiler instances that are memory hungry - javaOptions in Test += "-Xmx1G", - publishArtifact in (Compile, packageSrc) := true - ) - -// Implements the core functionality of detecting and propagating changes incrementally. -// Defines the data structures for representing file fingerprints and relationships and the overall source analysis -lazy val compileIncrementalProj = (project in compilePath / "inc"). - dependsOn (apiProj, ioProj, logProj, classpathProj, relationProj). 
- settings( - testedBaseSettings, - name := "Incremental Compiler" - ) - -// Persists the incremental data structures using SBinary -lazy val compilePersistProj = (project in compilePath / "persist"). - dependsOn(compileIncrementalProj, apiProj, compileIncrementalProj % "test->test"). - settings( - testedBaseSettings, - name := "Persist", - libraryDependencies += sbinary - ) - -// sbt-side interface to compiler. Calls compiler-side interface reflectively -lazy val compilerProj = (project in compilePath). - dependsOn(interfaceProj % "compile;test->test", logProj, ioProj, classpathProj, apiProj, classfileProj, - logProj % "test->test" /*,launchProj % "test->test" */). - settings( - testedBaseSettings, - name := "Compile", - libraryDependencies ++= Seq(scalaCompiler.value % Test, launcherInterface), - unmanagedJars in Test <<= (packageSrc in compileInterfaceProj in Compile).map(x => Seq(x).classpath) - ) - -lazy val compilerIntegrationProj = (project in (compilePath / "integration")). - dependsOn(compileIncrementalProj, compilerProj, compilePersistProj, apiProj, classfileProj). - settings( - baseSettings, - name := "Compiler Integration" - ) - -lazy val compilerIvyProj = (project in compilePath / "ivy"). - dependsOn (ivyProj, compilerProj). - settings( - baseSettings, - name := "Compiler Ivy Integration" - ) - -lazy val scriptedBaseProj = (project in scriptedPath / "base"). - dependsOn (ioProj, processProj). - settings( - testedBaseSettings, - name := "Scripted Framework", - libraryDependencies ++= scalaParsers.value - ) - -lazy val scriptedSbtProj = (project in scriptedPath / "sbt"). - dependsOn (ioProj, logProj, processProj, scriptedBaseProj, interfaceProj). - settings( - baseSettings, - name := "Scripted sbt", - libraryDependencies += launcherInterface % "provided" - ) - -lazy val scriptedPluginProj = (project in scriptedPath / "plugin"). - dependsOn (sbtProj, classpathProj). 
- settings( - baseSettings, - name := "Scripted Plugin" - ) - -// Implementation and support code for defining actions. -lazy val actionsProj = (project in mainPath / "actions"). - dependsOn (classpathProj, completeProj, apiProj, compilerIntegrationProj, compilerIvyProj, - interfaceProj, ioProj, ivyProj, logProj, processProj, runProj, relationProj, stdTaskProj, - taskProj, trackingProj, testingProj). - settings( - testedBaseSettings, - name := "Actions" - ) - -// General command support and core commands not specific to a build system -lazy val commandProj = (project in mainPath / "command"). - dependsOn(interfaceProj, ioProj, logProj, completeProj, classpathProj, crossProj). - settings( - testedBaseSettings, - name := "Command", - libraryDependencies += launcherInterface - ) - -// Fixes scope=Scope for Setting (core defined in collectionProj) to define the settings system used in build definitions -lazy val mainSettingsProj = (project in mainPath / "settings"). - dependsOn (applyMacroProj, interfaceProj, ivyProj, relationProj, logProj, ioProj, commandProj, - completeProj, classpathProj, stdTaskProj, processProj). - settings( - testedBaseSettings, - name := "Main Settings", - libraryDependencies += sbinary - ) - -// The main integration project for sbt. It brings all of the subsystems together, configures them, and provides for overriding conventions. -lazy val mainProj = (project in mainPath). - dependsOn (actionsProj, mainSettingsProj, interfaceProj, ioProj, ivyProj, logProj, logicProj, processProj, runProj, commandProj).
- settings( - testedBaseSettings, - name := "Main", - libraryDependencies ++= scalaXml.value ++ Seq(launcherInterface) - ) - -// Strictly for bringing implicits and aliases from subsystems into the top-level sbt namespace through a single package object -// technically, we need a dependency on all of mainProj's dependencies, but we don't do that since this is strictly an integration project -// with the sole purpose of providing certain identifiers without qualification (with a package object) -lazy val sbtProj = (project in sbtPath). - dependsOn(mainProj, compileInterfaceProj, scriptedSbtProj % "test->test"). - settings( - baseSettings, - name := "sbt", - normalizedName := "sbt" - ) - -lazy val mavenResolverPluginProj = (project in file("sbt-maven-resolver")). - dependsOn(sbtProj, ivyProj % "test->test"). - settings( - baseSettings, - name := "sbt-maven-resolver", - libraryDependencies ++= aetherLibs, - sbtPlugin := true - ) - -def scriptedTask: Def.Initialize[InputTask[Unit]] = Def.inputTask { - val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed - publishAll.value - doScripted((sbtLaunchJar in bundledLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, - (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) -} - -def scriptedUnpublishedTask: Def.Initialize[InputTask[Unit]] = Def.inputTask { - val result = scriptedSource(dir => (s: State) => scriptedParser(dir)).parsed - doScripted((sbtLaunchJar in bundledLauncherProj).value, (fullClasspath in scriptedSbtProj in Test).value, - (scalaInstance in scriptedSbtProj).value, scriptedSource.value, result, scriptedPrescripted.value) -} - -lazy val publishAll = TaskKey[Unit]("publish-all") -lazy val publishLauncher = TaskKey[Unit]("publish-launcher") - -lazy val myProvided = config("provided") intransitive - -def allProjects = Seq(interfaceProj, apiProj, - controlProj, collectionProj, applyMacroProj, processProj, ioProj, classpathProj, 
completeProj, - logProj, relationProj, classfileProj, datatypeProj, crossProj, logicProj, ivyProj, - testingProj, testAgentProj, taskProj, stdTaskProj, cacheProj, trackingProj, runProj, - compileInterfaceProj, compileIncrementalProj, compilePersistProj, compilerProj, - compilerIntegrationProj, compilerIvyProj, - scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, - actionsProj, commandProj, mainSettingsProj, mainProj, sbtProj, bundledLauncherProj, mavenResolverPluginProj) - -def projectsWithMyProvided = allProjects.map(p => p.copy(configurations = (p.configurations.filter(_ != Provided)) :+ myProvided)) -lazy val nonRoots = projectsWithMyProvided.map(p => LocalProject(p.id)) - -def rootSettings = fullDocSettings ++ - Util.publishPomSettings ++ otherRootSettings ++ Formatting.sbtFilesSettings ++ - Transform.conscriptSettings(bundledLauncherProj) -def otherRootSettings = Seq( - Scripted.scriptedPrescripted := { _ => }, - Scripted.scripted <<= scriptedTask, - Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, - Scripted.scriptedSource := (sourceDirectory in sbtProj).value / "sbt-test", - publishAll := { - val _ = (publishLocal).all(ScopeFilter(inAnyProject)).value - }, - aggregate in bintrayRelease := false -) ++ inConfig(Scripted.MavenResolverPluginTest)(Seq( - Scripted.scripted <<= scriptedTask, - Scripted.scriptedUnpublished <<= scriptedUnpublishedTask, - Scripted.scriptedPrescripted := { f => - val inj = f / "project" / "maven.sbt" - if (!inj.exists) { - IO.write(inj, "addMavenResolverPlugin") - // sLog.value.info(s"""Injected project/maven.sbt to $f""") - } - } -)) -lazy val docProjects: ScopeFilter = ScopeFilter( - inAnyProject -- inProjects(sbtRoot, sbtProj, scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, mavenResolverPluginProj), - inConfigurations(Compile) -) -def fullDocSettings = Util.baseScalacOptions ++ Docs.settings ++ Sxr.settings ++ Seq( - scalacOptions += "-Ymacro-no-expand", // for both sxr and doc - sources in sxr := { - val 
allSources = (sources ?? Nil).all(docProjects).value - allSources.flatten.distinct - }, //sxr - sources in (Compile, doc) := (sources in sxr).value, // doc - Sxr.sourceDirectories := { - val allSourceDirectories = (sourceDirectories ?? Nil).all(docProjects).value - allSourceDirectories.flatten - }, - fullClasspath in sxr := (externalDependencyClasspath in Compile in sbtProj).value, - dependencyClasspath in (Compile, doc) := (fullClasspath in sxr).value -) - -/* Nested subproject paths */ -def sbtPath = file("sbt") -def cachePath = file("cache") -def tasksPath = file("tasks") -def launchPath = file("launch") -def utilPath = file("util") -def compilePath = file("compile") -def mainPath = file("main") - -lazy val safeUnitTests = taskKey[Unit]("Known working tests (for both 2.10 and 2.11)") -lazy val safeProjects: ScopeFilter = ScopeFilter( - inProjects(mainSettingsProj, mainProj, ivyProj, completeProj, - actionsProj, classpathProj, collectionProj, compileIncrementalProj, - logProj, runProj, stdTaskProj, compilerProj), - inConfigurations(Test) -) -lazy val otherUnitTests = taskKey[Unit]("Unit test other projects") -lazy val otherProjects: ScopeFilter = ScopeFilter( - inProjects(interfaceProj, apiProj, controlProj, - applyMacroProj, - // processProj, // this one is suspicious - ioProj, - relationProj, classfileProj, datatypeProj, - crossProj, logicProj, testingProj, testAgentProj, taskProj, - cacheProj, trackingProj, - compileIncrementalProj, - compilePersistProj, compilerProj, - compilerIntegrationProj, compilerIvyProj, - scriptedBaseProj, scriptedSbtProj, scriptedPluginProj, - commandProj, mainSettingsProj, mainProj, - sbtProj, mavenResolverPluginProj), - inConfigurations(Test) -) - -def customCommands: Seq[Setting[_]] = Seq( - commands += Command.command("setupBuildScala211") { state => - s"""set scalaVersion in ThisBuild := "$scala211" """ :: - state - }, - // This is invoked by Travis - commands += Command.command("checkBuildScala211") { state => - s"++ $scala211"
:: - // First compile everything before attempting to test - "all compile test:compile" :: - // Now run known working tests. - safeUnitTests.key.label :: - state - }, - safeUnitTests := { - test.all(safeProjects).value - }, - otherUnitTests := { - test.all(otherProjects) - }, - commands += Command.command("release-sbt-local") { state => - "clean" :: - "so compile" :: - "so publishLocal" :: - "reload" :: - state - }, - /** There are several complications with sbt's build. - * First is the fact that interface project is a Java-only project - * that uses source generator from datatype subproject in Scala 2.10.5. - * - * Second is the fact that all subprojects are released with crossPaths - * turned off for the sbt's Scala version 2.10.5, but some of them are also - * cross published against 2.11.1 with crossPaths turned on. - * - * `so compile` handles 2.10.x/2.11.x cross building. - */ - commands += Command.command("release-sbt") { state => - // TODO - Any sort of validation - "clean" :: - "conscript-configs" :: - "so compile" :: - "so publishSigned" :: - "bundledLauncherProj/publishLauncher" :: - state - }, - // stamp-version doesn't work with ++ or "so". 
- commands += Command.command("release-nightly") { state => - "stamp-version" :: - "clean" :: - "compile" :: - "publish" :: - "bintrayRelease" :: - state - } -) diff --git a/project/Dependencies.scala b/project/Dependencies.scala new file mode 100644 index 000000000..f14cf23c9 --- /dev/null +++ b/project/Dependencies.scala @@ -0,0 +1,34 @@ +import sbt._ +import Keys._ + +object Dependencies { + lazy val scala210 = "2.10.5" + lazy val scala211 = "2.11.7" + + val bootstrapSbtVersion = "0.13.8" + // lazy val interfaceProj = "org.scala-sbt" % "interface" % bootstrapSbtVersion + lazy val ioProj = "org.scala-sbt" % "io" % bootstrapSbtVersion + // lazy val collectionProj = "org.scala-sbt" % "collections" % bootstrapSbtVersion + // lazy val logProj = "org.scala-sbt" % "logging" % bootstrapSbtVersion + // lazy val crossProj = "org.scala-sbt" % "cross" % bootstrapSbtVersion + + lazy val jline = "jline" % "jline" % "2.11" + // lazy val launcherInterface = "org.scala-sbt" % "launcher-interface" % "1.0.0-M1" + // lazy val ivy = "org.scala-sbt.ivy" % "ivy" % "2.3.0-sbt-927bc9ded7f8fba63297cddd0d5a3d01d6ad5d8d" + // lazy val jsch = "com.jcraft" % "jsch" % "0.1.46" intransitive () + lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" + lazy val sbinary = "org.scala-tools.sbinary" %% "sbinary" % "0.4.2" + // lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.11.4" + // lazy val specs2 = "org.specs2" %% "specs2" % "2.3.11" + // lazy val junit = "junit" % "junit" % "4.11" + lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } + lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } + private def scala211Module(name: String, moduleVersion: String) = + Def.setting { + scalaVersion.value match { + case sv if (sv startsWith "2.9.") || (sv startsWith "2.10.") => Nil + case _ => ("org.scala-lang.modules" %% name % moduleVersion) :: Nil + } + } + lazy val scalaXml = 
scala211Module("scala-xml", "1.0.1") +} diff --git a/project/Util.scala b/project/Util.scala new file mode 100644 index 000000000..221a16000 --- /dev/null +++ b/project/Util.scala @@ -0,0 +1,38 @@ +import sbt._ +import Keys._ +import StringUtilities.normalize + +object Util { + lazy val scalaKeywords = TaskKey[Set[String]]("scala-keywords") + lazy val generateKeywords = TaskKey[File]("generateKeywords") + + lazy val javaOnlySettings = Seq[Setting[_]]( + compileOrder := CompileOrder.JavaThenScala, + unmanagedSourceDirectories in Compile <<= Seq(javaSource in Compile).join + ) + + def getScalaKeywords: Set[String] = + { + val g = new scala.tools.nsc.Global(new scala.tools.nsc.Settings) + g.nme.keywords.map(_.toString) + } + def writeScalaKeywords(base: File, keywords: Set[String]): File = + { + val init = keywords.map(tn => '"' + tn + '"').mkString("Set(", ", ", ")") + val ObjectName = "ScalaKeywords" + val PackageName = "sbt" + val keywordsSrc = + """package %s +object %s { + val values = %s +}""".format(PackageName, ObjectName, init) + val out = base / PackageName.replace('.', '/') / (ObjectName + ".scala") + IO.write(out, keywordsSrc) + out + } + def keywordsSettings: Seq[Setting[_]] = inConfig(Compile)(Seq( + scalaKeywords := getScalaKeywords, + generateKeywords <<= (sourceManaged, scalaKeywords) map writeScalaKeywords, + sourceGenerators <+= generateKeywords map (x => Seq(x)) + )) +} diff --git a/project/build.properties b/project/build.properties new file mode 100644 index 000000000..817bc38df --- /dev/null +++ b/project/build.properties @@ -0,0 +1 @@ +sbt.version=0.13.9 From 7cbcb67dfa9624d78184cb1185a3048613f6dcb1 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 20 Aug 2015 00:38:50 -0400 Subject: [PATCH 522/823] Remove all interface classes except for the ones used by the logger. 
--- build.sbt | 3 +- .../src/main/java/xsbti/AnalysisCallback.java | 62 -------------- .../src/main/java/xsbti/ArtifactInfo.java | 9 -- .../src/main/java/xsbti/CompileCancelled.java | 9 -- .../src/main/java/xsbti/CompileFailed.java | 7 -- .../main/java/xsbti/DependencyContext.java | 22 ----- interface/src/main/java/xsbti/Reporter.java | 22 ----- .../src/main/java/xsbti/api/AbstractLazy.java | 26 ------ interface/src/main/java/xsbti/api/Lazy.java | 9 -- .../src/main/java/xsbti/api/Modifiers.java | 83 ------------------- .../java/xsbti/compile/CachedCompiler.java | 13 --- .../xsbti/compile/CachedCompilerProvider.java | 10 --- .../java/xsbti/compile/ClasspathOptions.java | 29 ------- .../main/java/xsbti/compile/CompileOrder.java | 34 -------- .../java/xsbti/compile/CompileProgress.java | 12 --- .../main/java/xsbti/compile/Compilers.java | 8 -- .../main/java/xsbti/compile/DefinesClass.java | 12 --- .../java/xsbti/compile/DependencyChanges.java | 13 --- .../main/java/xsbti/compile/GlobalsCache.java | 13 --- .../xsbti/compile/IncrementalCompiler.java | 71 ---------------- .../src/main/java/xsbti/compile/Inputs.java | 14 ---- .../main/java/xsbti/compile/JavaCompiler.java | 27 ------ .../java/xsbti/compile/MultipleOutput.java | 20 ----- .../src/main/java/xsbti/compile/Options.java | 27 ------ .../src/main/java/xsbti/compile/Output.java | 7 -- .../java/xsbti/compile/ScalaInstance.java | 37 --------- .../src/main/java/xsbti/compile/Setup.java | 47 ----------- .../main/java/xsbti/compile/SingleOutput.java | 12 --- util/log/src/main/scala/sbt/Logger.scala | 5 ++ 29 files changed, 6 insertions(+), 657 deletions(-) delete mode 100644 interface/src/main/java/xsbti/AnalysisCallback.java delete mode 100644 interface/src/main/java/xsbti/ArtifactInfo.java delete mode 100644 interface/src/main/java/xsbti/CompileCancelled.java delete mode 100644 interface/src/main/java/xsbti/CompileFailed.java delete mode 100644 interface/src/main/java/xsbti/DependencyContext.java delete mode 100644 
interface/src/main/java/xsbti/Reporter.java delete mode 100644 interface/src/main/java/xsbti/api/AbstractLazy.java delete mode 100644 interface/src/main/java/xsbti/api/Lazy.java delete mode 100644 interface/src/main/java/xsbti/api/Modifiers.java delete mode 100644 interface/src/main/java/xsbti/compile/CachedCompiler.java delete mode 100644 interface/src/main/java/xsbti/compile/CachedCompilerProvider.java delete mode 100644 interface/src/main/java/xsbti/compile/ClasspathOptions.java delete mode 100644 interface/src/main/java/xsbti/compile/CompileOrder.java delete mode 100755 interface/src/main/java/xsbti/compile/CompileProgress.java delete mode 100644 interface/src/main/java/xsbti/compile/Compilers.java delete mode 100644 interface/src/main/java/xsbti/compile/DefinesClass.java delete mode 100644 interface/src/main/java/xsbti/compile/DependencyChanges.java delete mode 100644 interface/src/main/java/xsbti/compile/GlobalsCache.java delete mode 100644 interface/src/main/java/xsbti/compile/IncrementalCompiler.java delete mode 100644 interface/src/main/java/xsbti/compile/Inputs.java delete mode 100644 interface/src/main/java/xsbti/compile/JavaCompiler.java delete mode 100755 interface/src/main/java/xsbti/compile/MultipleOutput.java delete mode 100644 interface/src/main/java/xsbti/compile/Options.java delete mode 100755 interface/src/main/java/xsbti/compile/Output.java delete mode 100644 interface/src/main/java/xsbti/compile/ScalaInstance.java delete mode 100644 interface/src/main/java/xsbti/compile/Setup.java delete mode 100755 interface/src/main/java/xsbti/compile/SingleOutput.java diff --git a/build.sbt b/build.sbt index 66a045d8f..fcc99e21b 100644 --- a/build.sbt +++ b/build.sbt @@ -109,7 +109,7 @@ lazy val utilComplete = (project in utilPath / "complete"). // logging lazy val utilLogging = (project in utilPath / "log"). - dependsOn(utilInterface/*, processProj*/). + dependsOn(utilInterface). 
settings( testedBaseSettings, name := "Util Logging", @@ -118,7 +118,6 @@ lazy val utilLogging = (project in utilPath / "log"). // Relation lazy val utilRelation = (project in utilPath / "relation"). - dependsOn(utilInterface/*, processProj*/). settings( testedBaseSettings, name := "Util Relation" diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java deleted file mode 100644 index a51628f15..000000000 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ /dev/null @@ -1,62 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah - */ -package xsbti; - -import java.io.File; - -public interface AnalysisCallback -{ - /** Called to indicate that the source file source depends on the source file - * dependsOn. Note that only source files included in the current compilation will - * passed to this method. Dependencies on classes generated by sources not in the current compilation will - * be passed as class dependencies to the classDependency method. - * If publicInherited is true, this dependency is a result of inheritance by a - * template accessible outside of the source file. - * @deprecated Use `sourceDependency(File dependsOn, File source, DependencyContext context)` instead. */ - @Deprecated - void sourceDependency(File dependsOn, File source, boolean publicInherited); - /** Called to indicate that the source file source depends on the source file - * dependsOn. Note that only source files included in the current compilation will - * passed to this method. Dependencies on classes generated by sources not in the current compilation will - * be passed as class dependencies to the classDependency method. - * context gives information about the context in which this dependency has been extracted. - * See xsbti.DependencyContext for the list of existing dependency contexts. 
*/ - void sourceDependency(File dependsOn, File source, DependencyContext context); - /** Called to indicate that the source file source depends on the top-level - * class named name from class or jar file binary. - * If publicInherited is true, this dependency is a result of inheritance by a - * template accessible outside of the source file. - * @deprecated Use `binaryDependency(File binary, String name, File source, DependencyContext context)` instead. */ - @Deprecated - void binaryDependency(File binary, String name, File source, boolean publicInherited); - /** Called to indicate that the source file source depends on the top-level - * class named name from class or jar file binary. - * context gives information about the context in which this dependency has been extracted. - * See xsbti.DependencyContext for the list of existing dependency contexts. */ - void binaryDependency(File binary, String name, File source, DependencyContext context); - /** Called to indicate that the source file source produces a class file at - * module contain class name.*/ - void generatedClass(File source, File module, String name); - /** Called when the public API of a source file is extracted. */ - void api(File sourceFile, xsbti.api.SourceAPI source); - void usedName(File sourceFile, String names); - /** Provides problems discovered during compilation. These may be reported (logged) or unreported. - * Unreported problems are usually unreported because reporting was not enabled via a command line switch. */ - void problem(String what, Position pos, String msg, Severity severity, boolean reported); - /** - * Determines whether method calls through this interface should be interpreted as serving - * name hashing algorithm needs in given compiler run. - * - * In particular, it indicates whether member reference and inheritance dependencies should be - * extracted. - * - * As the signature suggests, this method's implementation is meant to be side-effect free. 
It's added - * to AnalysisCallback because it indicates how other callback calls should be interpreted by both - * implementation of AnalysisCallback and it's clients. - * - * NOTE: This method is an implementation detail and can be removed at any point without deprecation. - * Do not depend on it, please. - */ - boolean nameHashing(); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/ArtifactInfo.java b/interface/src/main/java/xsbti/ArtifactInfo.java deleted file mode 100644 index 6f2eedae5..000000000 --- a/interface/src/main/java/xsbti/ArtifactInfo.java +++ /dev/null @@ -1,9 +0,0 @@ -package xsbti; - -public final class ArtifactInfo -{ - public static final String ScalaOrganization = "org.scala-lang"; - public static final String ScalaLibraryID = "scala-library"; - public static final String ScalaCompilerID = "scala-compiler"; - public static final String SbtOrganization = "org.scala-sbt"; -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/CompileCancelled.java b/interface/src/main/java/xsbti/CompileCancelled.java deleted file mode 100644 index bcd3695dd..000000000 --- a/interface/src/main/java/xsbti/CompileCancelled.java +++ /dev/null @@ -1,9 +0,0 @@ -package xsbti; - -/** - * An exception thrown when compilation cancellation has been requested during - * Scala compiler run. 
- */ -public abstract class CompileCancelled extends RuntimeException { - public abstract String[] arguments(); -} diff --git a/interface/src/main/java/xsbti/CompileFailed.java b/interface/src/main/java/xsbti/CompileFailed.java deleted file mode 100644 index f1cbbc61b..000000000 --- a/interface/src/main/java/xsbti/CompileFailed.java +++ /dev/null @@ -1,7 +0,0 @@ -package xsbti; - -public abstract class CompileFailed extends RuntimeException -{ - public abstract String[] arguments(); - public abstract Problem[] problems(); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/DependencyContext.java b/interface/src/main/java/xsbti/DependencyContext.java deleted file mode 100644 index 15cfa76d1..000000000 --- a/interface/src/main/java/xsbti/DependencyContext.java +++ /dev/null @@ -1,22 +0,0 @@ -package xsbti; - -/** - * Enumeration of existing dependency contexts. - * Dependency contexts represent the various kind of dependencies that - * can exist between symbols. - */ -public enum DependencyContext { - /** - * Represents a direct dependency between two symbols : - * object Foo - * object Bar { def foo = Foo } - */ - DependencyByMemberRef, - - /** - * Represents an inheritance dependency between two symbols : - * class A - * class B extends A - */ - DependencyByInheritance -} diff --git a/interface/src/main/java/xsbti/Reporter.java b/interface/src/main/java/xsbti/Reporter.java deleted file mode 100644 index d76be8ea6..000000000 --- a/interface/src/main/java/xsbti/Reporter.java +++ /dev/null @@ -1,22 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2009, 2010 Mark Harrah - */ -package xsbti; - -public interface Reporter -{ - /** Resets logging, including any accumulated errors, warnings, messages, and counts.*/ - void reset(); - /** Returns true if this logger has seen any errors since the last call to reset.*/ - boolean hasErrors(); - /** Returns true if this logger has seen any warnings since the last call to reset.*/ - boolean 
hasWarnings(); - /** Logs a summary of logging since the last reset.*/ - void printSummary(); - /** Returns a list of warnings and errors since the last reset.*/ - Problem[] problems(); - /** Logs a message.*/ - void log(Position pos, String msg, Severity sev); - /** Reports a comment. */ - void comment(Position pos, String msg); -} diff --git a/interface/src/main/java/xsbti/api/AbstractLazy.java b/interface/src/main/java/xsbti/api/AbstractLazy.java deleted file mode 100644 index bd21f166f..000000000 --- a/interface/src/main/java/xsbti/api/AbstractLazy.java +++ /dev/null @@ -1,26 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package xsbti.api; - - import java.io.ObjectStreamException; - -public abstract class AbstractLazy implements Lazy, java.io.Serializable -{ - private Object writeReplace() throws ObjectStreamException - { - return new StrictLazy(get()); - } - private static final class StrictLazy implements Lazy, java.io.Serializable - { - private final T value; - StrictLazy(T t) - { - value = t; - } - public T get() - { - return value; - } - } -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/api/Lazy.java b/interface/src/main/java/xsbti/api/Lazy.java deleted file mode 100644 index 1ee29b013..000000000 --- a/interface/src/main/java/xsbti/api/Lazy.java +++ /dev/null @@ -1,9 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package xsbti.api; - -public interface Lazy -{ - T get(); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/api/Modifiers.java b/interface/src/main/java/xsbti/api/Modifiers.java deleted file mode 100644 index 5e103c7ec..000000000 --- a/interface/src/main/java/xsbti/api/Modifiers.java +++ /dev/null @@ -1,83 +0,0 @@ -package xsbti.api; - -public final class Modifiers implements java.io.Serializable -{ - private static final int AbstractBit = 0; - private static final int OverrideBit = 1; - private static final int FinalBit = 2; - private static 
final int SealedBit = 3; - private static final int ImplicitBit = 4; - private static final int LazyBit = 5; - private static final int MacroBit = 6; - - private static int flag(boolean set, int bit) - { - return set ? (1 << bit) : 0; - } - - public Modifiers(boolean isAbstract, boolean isOverride, boolean isFinal, boolean isSealed, boolean isImplicit, boolean isLazy, boolean isMacro) - { - this.flags = (byte)( - flag(isAbstract, AbstractBit) | - flag(isOverride, OverrideBit) | - flag(isFinal, FinalBit) | - flag(isSealed, SealedBit) | - flag(isImplicit, ImplicitBit) | - flag(isLazy, LazyBit) | - flag(isMacro, MacroBit) - ); - } - - private final byte flags; - - private boolean flag(int bit) - { - return (flags & (1 << bit)) != 0; - } - - public final byte raw() - { - return flags; - } - - public final boolean isAbstract() - { - return flag(AbstractBit); - } - public final boolean isOverride() - { - return flag(OverrideBit); - } - public final boolean isFinal() - { - return flag(FinalBit); - } - public final boolean isSealed() - { - return flag(SealedBit); - } - public final boolean isImplicit() - { - return flag(ImplicitBit); - } - public final boolean isLazy() - { - return flag(LazyBit); - } - public final boolean isMacro() - { - return flag(MacroBit); - } - public boolean equals(Object o) - { - return (o instanceof Modifiers) && flags == ((Modifiers)o).flags; - } - public int hashCode() - { - return flags; - } - public String toString() - { - return "Modifiers(" + "isAbstract: " + isAbstract() + ", " + "isOverride: " + isOverride() + ", " + "isFinal: " + isFinal() + ", " + "isSealed: " + isSealed() + ", " + "isImplicit: " + isImplicit() + ", " + "isLazy: " + isLazy() + ", " + "isMacro: " + isMacro()+ ")"; - } -} diff --git a/interface/src/main/java/xsbti/compile/CachedCompiler.java b/interface/src/main/java/xsbti/compile/CachedCompiler.java deleted file mode 100644 index 1d37f0883..000000000 --- a/interface/src/main/java/xsbti/compile/CachedCompiler.java +++ 
/dev/null @@ -1,13 +0,0 @@ -package xsbti.compile; - -import xsbti.AnalysisCallback; -import xsbti.Logger; -import xsbti.Reporter; -import java.io.File; - -public interface CachedCompiler -{ - /** Returns an array of arguments representing the nearest command line equivalent of a call to run but without the command name itself.*/ - String[] commandArguments(File[] sources); - void run(File[] sources, DependencyChanges cpChanges, AnalysisCallback callback, Logger log, Reporter delegate, CompileProgress progress); -} diff --git a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java b/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java deleted file mode 100644 index 313f27505..000000000 --- a/interface/src/main/java/xsbti/compile/CachedCompilerProvider.java +++ /dev/null @@ -1,10 +0,0 @@ -package xsbti.compile; - -import xsbti.Logger; -import xsbti.Reporter; - -public interface CachedCompilerProvider -{ - ScalaInstance scalaInstance(); - CachedCompiler newCachedCompiler(String[] arguments, Output output, Logger log, Reporter reporter, boolean resident); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/ClasspathOptions.java b/interface/src/main/java/xsbti/compile/ClasspathOptions.java deleted file mode 100644 index e4aba32b7..000000000 --- a/interface/src/main/java/xsbti/compile/ClasspathOptions.java +++ /dev/null @@ -1,29 +0,0 @@ -package xsbti.compile; - -/** -* Configures modifications to the classpath based on the Scala instance used for compilation. -* This is typically used for the Scala compiler only and all values set to false for the Java compiler. -*/ -public interface ClasspathOptions -{ - /** If true, includes the Scala library on the boot classpath. This should usually be true.*/ - boolean bootLibrary(); - - /** If true, includes the Scala compiler on the standard classpath. - * This is typically false and is instead managed by the build tool or environment. 
- */ - boolean compiler(); - - /** If true, includes extra jars from the Scala instance on the standard classpath. - * This is typically false and is instead managed by the build tool or environment. - */ - boolean extra(); - - /** If true, automatically configures the boot classpath. This should usually be true.*/ - boolean autoBoot(); - - /** If true, the Scala library jar is filtered from the standard classpath. - * This should usually be true because the library should be included on the boot classpath of the Scala compiler and not the standard classpath. - */ - boolean filterLibrary(); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/CompileOrder.java b/interface/src/main/java/xsbti/compile/CompileOrder.java deleted file mode 100644 index 5683f75d9..000000000 --- a/interface/src/main/java/xsbti/compile/CompileOrder.java +++ /dev/null @@ -1,34 +0,0 @@ -package xsbti.compile; - -/** -* Defines the order in which Scala and Java sources are compiled when compiling a set of sources with both Java and Scala sources. -* This setting has no effect if only Java sources or only Scala sources are being compiled. -* It is generally more efficient to use JavaThenScala or ScalaThenJava when mixed compilation is not required. -*/ -public enum CompileOrder -{ - /** - * Allows Scala sources to depend on Java sources and allows Java sources to depend on Scala sources. - * - * In this mode, both Java and Scala sources are passed to the Scala compiler, which generates class files for the Scala sources. - * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. - * The Scala classes compiled in the first step are included on the classpath to the Java compiler. - */ - Mixed, - /** - * Allows Scala sources to depend on the Java sources in the compilation, but does not allow Java sources to depend on Scala sources. 
- * - * In this mode, both Java and Scala sources are passed to the Scala compiler, which generates class files for the Scala sources. - * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. - * The Scala classes compiled in the first step are included on the classpath to the Java compiler. - */ - JavaThenScala, - /** - * Allows Java sources to depend on the Scala sources in the compilation, but does not allow Scala sources to depend on Java sources. - * - * In this mode, both Java and Scala sources are passed to the Scala compiler, which generates class files for the Scala sources. - * Then, Java sources are passed to the Java compiler, which generates class files for the Java sources. - * The Scala classes compiled in the first step are included on the classpath to the Java compiler. - */ - ScalaThenJava -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/CompileProgress.java b/interface/src/main/java/xsbti/compile/CompileProgress.java deleted file mode 100755 index 17174ff6a..000000000 --- a/interface/src/main/java/xsbti/compile/CompileProgress.java +++ /dev/null @@ -1,12 +0,0 @@ -package xsbti.compile; - -/** - * An API for reporting when files are being compiled. - * - * Note; This is tied VERY SPECIFICALLY to scala. 
- */ -public interface CompileProgress { - void startUnit(String phase, String unitPath); - - boolean advance(int current, int total); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/Compilers.java b/interface/src/main/java/xsbti/compile/Compilers.java deleted file mode 100644 index 0bb194534..000000000 --- a/interface/src/main/java/xsbti/compile/Compilers.java +++ /dev/null @@ -1,8 +0,0 @@ -package xsbti.compile; - -public interface Compilers -{ - JavaCompiler javac(); - // should be cached by client if desired - ScalaCompiler scalac(); -} diff --git a/interface/src/main/java/xsbti/compile/DefinesClass.java b/interface/src/main/java/xsbti/compile/DefinesClass.java deleted file mode 100644 index 261c6ca22..000000000 --- a/interface/src/main/java/xsbti/compile/DefinesClass.java +++ /dev/null @@ -1,12 +0,0 @@ -package xsbti.compile; - -/** -* Determines if an entry on a classpath contains a class. -*/ -public interface DefinesClass -{ - /** - * Returns true if the classpath entry contains the requested class. 
- */ - boolean apply(String className); -} diff --git a/interface/src/main/java/xsbti/compile/DependencyChanges.java b/interface/src/main/java/xsbti/compile/DependencyChanges.java deleted file mode 100644 index 4f6bda55a..000000000 --- a/interface/src/main/java/xsbti/compile/DependencyChanges.java +++ /dev/null @@ -1,13 +0,0 @@ -package xsbti.compile; - - import java.io.File; - -// only includes changes to dependencies outside of the project -public interface DependencyChanges -{ - boolean isEmpty(); - // class files or jar files - File[] modifiedBinaries(); - // class names - String[] modifiedClasses(); -} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/compile/GlobalsCache.java b/interface/src/main/java/xsbti/compile/GlobalsCache.java deleted file mode 100644 index c8540e2d2..000000000 --- a/interface/src/main/java/xsbti/compile/GlobalsCache.java +++ /dev/null @@ -1,13 +0,0 @@ -package xsbti.compile; - -import xsbti.Logger; -import xsbti.Reporter; - -/** - * An interface which lets us know how to retrieve cached compiler instances form the current JVM. - */ -public interface GlobalsCache -{ - CachedCompiler apply(String[] args, Output output, boolean forceNew, CachedCompilerProvider provider, Logger log, Reporter reporter); - void clear(); -} diff --git a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java b/interface/src/main/java/xsbti/compile/IncrementalCompiler.java deleted file mode 100644 index c98263e7f..000000000 --- a/interface/src/main/java/xsbti/compile/IncrementalCompiler.java +++ /dev/null @@ -1,71 +0,0 @@ -package xsbti.compile; - -import xsbti.Logger; -import java.io.File; - -/* -* This API is subject to change. -* -* It is the client's responsibility to: -* 1. Manage class loaders. Usually the client will want to: -* i. Keep the class loader used by the ScalaInstance warm. -* ii. Keep the class loader of the incremental recompilation classes (xsbti.compile) warm. -* iii. 
Share the class loader for Scala classes between the incremental compiler implementation and the ScalaInstance where possible (must be binary compatible) -* 2. Manage the compiler interface jar. The interface must be compiled against the exact Scala version used for compilation and a compatible Java version. -* 3. Manage compilation order between different compilations. -* i. Execute a compilation for each dependency, obtaining an Analysis for each. -* ii. Provide the Analysis from previous compilations to dependent compilations in the analysis map. -* 4. Provide an implementation of JavaCompiler for compiling Java sources. -* 5. Define a function that determines if a classpath entry contains a class (Setup.definesClass). -* i. This is provided by the client so that the client can cache this information across compilations when compiling multiple sets of sources. -* ii. The cache should be cleared for each new compilation run or else recompilation will not properly account for changes to the classpath. -* 6. Provide a cache directory. -* i. This directory is used by IncrementalCompiler to persist data between compilations. -* ii. It should be a different directory for each set of sources being compiled. -* 7. Manage parallel execution. -* i. Each compilation may be performed in a different thread as long as the dependencies have been compiled already. -* ii. Implementations of all types should be immutable and arrays treated as immutable. -* 8. Ensure general invariants: -* i. The implementations of all types are immutable, except for the already discussed Setup.definesClass. -* ii. Arrays are treated as immutable. -* iii. No value is ever null. -*/ -public interface IncrementalCompiler -{ - /** - * Performs an incremental compilation as configured by `in`. - * The returned Analysis should be provided to compilations depending on the classes from this compilation. 
- */ - Analysis compile(Inputs in, Logger log); - - /** - * Creates a compiler instance that can be used by the `compile` method. - * - * @param instance The Scala version to use - * @param interfaceJar The compiler interface jar compiled for the Scala version being used - * @param options Configures how arguments to the underlying Scala compiler will be built. - * - */ - @Deprecated - ScalaCompiler newScalaCompiler(ScalaInstance instance, File interfaceJar, ClasspathOptions options, Logger log); - /** - * Creates a compiler instance that can be used by the `compile` method. - * - * @param instance The Scala version to use - * @param interfaceJar The compiler interface jar compiled for the Scala version being used - * @param options Configures how arguments to the underlying Scala compiler will be built. - */ - ScalaCompiler newScalaCompiler(ScalaInstance instance, File interfaceJar, ClasspathOptions options); - - /** - * Compiles the source interface for a Scala version. The resulting jar can then be used by the `newScalaCompiler` method - * to create a ScalaCompiler for incremental compilation. It is the client's responsibility to manage compiled jars for - * different Scala versions. - * - * @param label A brief name describing the source component for use in error messages - * @param sourceJar The jar file containing the compiler interface sources. These are published as sbt's compiler-interface-src module. - * @param targetJar Where to create the output jar file containing the compiled classes. - * @param instance The ScalaInstance to compile the compiler interface for. - * @param log The logger to use during compilation. 
*/ - void compileInterfaceJar(String label, File sourceJar, File targetJar, File interfaceJar, ScalaInstance instance, Logger log); -} diff --git a/interface/src/main/java/xsbti/compile/Inputs.java b/interface/src/main/java/xsbti/compile/Inputs.java deleted file mode 100644 index 5c9ded425..000000000 --- a/interface/src/main/java/xsbti/compile/Inputs.java +++ /dev/null @@ -1,14 +0,0 @@ -package xsbti.compile; - -/** Configures a single compilation of a single set of sources.*/ -public interface Inputs -{ - /** The Scala and Java compilers to use for compilation.*/ - Compilers compilers(); - - /** Standard compilation options, such as the sources and classpath to use. */ - Options options(); - - /** Configures incremental compilation.*/ - Setup setup(); -} diff --git a/interface/src/main/java/xsbti/compile/JavaCompiler.java b/interface/src/main/java/xsbti/compile/JavaCompiler.java deleted file mode 100644 index 18b3f5bea..000000000 --- a/interface/src/main/java/xsbti/compile/JavaCompiler.java +++ /dev/null @@ -1,27 +0,0 @@ -package xsbti.compile; - -import java.io.File; -import xsbti.Logger; -import xsbti.Reporter; - -/** -* Interface to a Java compiler. -*/ -public interface JavaCompiler -{ - /** Compiles Java sources using the provided classpath, output directory, and additional options. - * Output should be sent to the provided logger. - * - * @deprecated 0.13.8 - Use compileWithReporter instead - */ - @Deprecated - void compile(File[] sources, File[] classpath, Output output, String[] options, Logger log); - - /** - * Compiles java sources using the provided classpath, output directory and additional options. - * - * Output should be sent to the provided logger. - * Failures should be passed to the provided Reporter. 
- */
-  void compileWithReporter(File[] sources, File[] classpath, Output output, String[] options, Reporter reporter, Logger log);
-}
diff --git a/interface/src/main/java/xsbti/compile/MultipleOutput.java b/interface/src/main/java/xsbti/compile/MultipleOutput.java
deleted file mode 100755
index 6ba3479e6..000000000
--- a/interface/src/main/java/xsbti/compile/MultipleOutput.java
+++ /dev/null
@@ -1,20 +0,0 @@
-package xsbti.compile;
-
-import java.io.File;
-
-public interface MultipleOutput extends Output {
-
-  interface OutputGroup {
-    /** The directory where source files are stored for this group.
-     * Source directories should uniquely identify the group for a source file. */
-    File sourceDirectory();
-
-    /** The directory where class files should be generated.
-     * Incremental compilation will manage the class files in this directory.
-     * In particular, outdated class files will be deleted before compilation.
-     * It is important that this directory is exclusively used for one set of sources. */
-    File outputDirectory();
-  }
-
-  OutputGroup[] outputGroups();
-}
\ No newline at end of file
diff --git a/interface/src/main/java/xsbti/compile/Options.java b/interface/src/main/java/xsbti/compile/Options.java
deleted file mode 100644
index 78643202d..000000000
--- a/interface/src/main/java/xsbti/compile/Options.java
+++ /dev/null
@@ -1,27 +0,0 @@
-package xsbti.compile;
-
-import java.io.File;
-
-/** Standard compilation options.*/
-public interface Options
-{
-  /** The classpath to use for compilation.
-   * This will be modified according to the ClasspathOptions used to configure the ScalaCompiler.*/
-  File[] classpath();
-
-  /** All sources that should be recompiled.
-   * This should include Scala and Java sources, which are identified by their extension. */
-  File[] sources();
-
-  /** Output for the compilation. */
-  Output output();
-
-  /** The options to pass to the Scala compiler other than the sources and classpath to use. */
-  String[] options();
-
-  /** The options to pass to the Java compiler other than the sources and classpath to use. */
-  String[] javacOptions();
-
-  /** Controls the order in which Java and Scala sources are compiled.*/
-  CompileOrder order();
-}
diff --git a/interface/src/main/java/xsbti/compile/Output.java b/interface/src/main/java/xsbti/compile/Output.java
deleted file mode 100755
index 4f785884e..000000000
--- a/interface/src/main/java/xsbti/compile/Output.java
+++ /dev/null
@@ -1,7 +0,0 @@
-package xsbti.compile;
-/** Abstract interface denoting the output of the compilation. Inheritors are SingleOutput with a global output directory and
- * MultipleOutput that specifies the output directory per source file.
- */
-public interface Output
-{
-}
diff --git a/interface/src/main/java/xsbti/compile/ScalaInstance.java b/interface/src/main/java/xsbti/compile/ScalaInstance.java
deleted file mode 100644
index c7f3984e3..000000000
--- a/interface/src/main/java/xsbti/compile/ScalaInstance.java
+++ /dev/null
@@ -1,37 +0,0 @@
-package xsbti.compile;
-
-import java.io.File;
-
-/**
-* Defines Scala instance, which is a reference version String, a unique version String, a set of jars, and a class loader for a Scala version.
-*
-* Note that in this API a 'jar' can actually be any valid classpath entry.
-*/
-public interface ScalaInstance
-{
-  /** The version used to refer to this Scala version.
-   * It need not be unique and can be a dynamic version like 2.10.0-SNAPSHOT.
-   */
-  String version();
-
-  /** A class loader providing access to the classes and resources in the library and compiler jars. */
-  ClassLoader loader();
-
-  /**@deprecated Only `jars` can be reliably provided for modularized Scala. (Since 0.13.0) */
-  @Deprecated
-  File libraryJar();
-
-  /**@deprecated Only `jars` can be reliably provided for modularized Scala. (Since 0.13.0) */
-  @Deprecated
-  File compilerJar();
-
-  /**@deprecated Only `jars` can be reliably provided for modularized Scala. (Since 0.13.0) */
-  @Deprecated
-  File[] otherJars();
-
-  /** All jar files provided by this Scala instance.*/
-  File[] allJars();
-
-  /** The unique identifier for this Scala instance. An implementation should usually obtain this from the compiler.properties file in the compiler jar. */
-  String actualVersion();
-}
diff --git a/interface/src/main/java/xsbti/compile/Setup.java b/interface/src/main/java/xsbti/compile/Setup.java
deleted file mode 100644
index edf250b8b..000000000
--- a/interface/src/main/java/xsbti/compile/Setup.java
+++ /dev/null
@@ -1,47 +0,0 @@
-package xsbti.compile;
-
-import java.io.File;
-import java.util.Map;
-
-import xsbti.Maybe;
-import xsbti.Reporter;
-
-/** Configures incremental recompilation. */
-public interface Setup
-{
-  /** Provides the Analysis for the given classpath entry.*/
-  Maybe analysisMap(File file);
-
-  /** Provides a function to determine if classpath entry `file` contains a given class.
-   * The returned function should generally cache information about `file`, such as the list of entries in a jar.
-   */
-  DefinesClass definesClass(File file);
-
-  /** If true, no sources are actually compiled and the Analysis from the previous compilation is returned.*/
-  boolean skip();
-
-  /** The file used to cache information across compilations.
-   * This file can be removed to force a full recompilation.
-   * The file should be unique and not shared between compilations. */
-  File cacheFile();
-
-  GlobalsCache cache();
-
-  /** If returned, the progress that should be used to report scala compilation to. */
-  Maybe progress();
-
-  /** The reporter that should be used to report scala compilation to. */
-  Reporter reporter();
-
-  /**
-   * Returns incremental compiler options.
-   *
-   * @see sbt.inc.IncOptions for details
-   *
-   * You can get default options by calling sbt.inc.IncOptions.toStringMap(sbt.inc.IncOptions.Default).
-   *
-   * In the future, we'll extend API in xsbti to provide factory methods that would allow to obtain
-   * defaults values so one can depend on xsbti package only.
-   **/
-  Map incrementalCompilerOptions();
-}
diff --git a/interface/src/main/java/xsbti/compile/SingleOutput.java b/interface/src/main/java/xsbti/compile/SingleOutput.java
deleted file mode 100755
index cb200c9b7..000000000
--- a/interface/src/main/java/xsbti/compile/SingleOutput.java
+++ /dev/null
@@ -1,12 +0,0 @@
-package xsbti.compile;
-
-import java.io.File;
-
-public interface SingleOutput extends Output {
-
-  /** The directory where class files should be generated.
-   * Incremental compilation will manage the class files in this directory.
-   * In particular, outdated class files will be deleted before compilation.
-   * It is important that this directory is exclusively used for one set of sources. */
-  File outputDirectory();
-}
\ No newline at end of file
diff --git a/util/log/src/main/scala/sbt/Logger.scala b/util/log/src/main/scala/sbt/Logger.scala
index 848d45b3f..ddba892d2 100644
--- a/util/log/src/main/scala/sbt/Logger.scala
+++ b/util/log/src/main/scala/sbt/Logger.scala
@@ -5,6 +5,7 @@ package sbt
 
 import xsbti.{ Logger => xLogger, F0 }
 import xsbti.{ Maybe, Position, Problem, Severity }
+import sys.process.ProcessLogger
 
 import java.io.File
 
@@ -117,6 +118,10 @@ trait Logger extends xLogger {
   final def info(message: => String): Unit = log(Level.Info, message)
   final def warn(message: => String): Unit = log(Level.Warn, message)
   final def error(message: => String): Unit = log(Level.Error, message)
+  // Added by sys.process.ProcessLogger
+  final def err(message: => String): Unit = log(Level.Error, message)
+  // sys.process.ProcessLogger
+  final def out(message: => String): Unit = log(Level.Info, message)
 
   def ansiCodesSupported: Boolean = false

From 8b4e0486a80bf2fd042635a826ebf04ac0dea62b Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Thu, 20 Aug 2015 00:42:59 -0400
Subject: [PATCH 523/823]
Delete more interface related.
---
 interface/definition                        | 25 ------
 interface/other                             | 81 -------------------
 .../src/test/scala/xsbti/TestCallback.scala | 34 --------
 interface/type                              | 30 -------
 4 files changed, 170 deletions(-)
 delete mode 100644 interface/definition
 delete mode 100644 interface/other
 delete mode 100644 interface/src/test/scala/xsbti/TestCallback.scala
 delete mode 100644 interface/type

diff --git a/interface/definition b/interface/definition
deleted file mode 100644
index dadd41adb..000000000
--- a/interface/definition
+++ /dev/null
@@ -1,25 +0,0 @@
-Definition
-  name: String
-  access: Access
-  modifiers: Modifiers
-  annotations: Annotation*
-  FieldLike
-    tpe : Type
-    Val
-    Var
-  ParameterizedDefinition
-    typeParameters: TypeParameter*
-    Def
-      valueParameters: ParameterList*
-      returnType: Type
-    ClassLike
-      definitionType: DefinitionType
-      selfType: ~Type
-      structure: ~Structure
-      savedAnnotations: String*
-    TypeMember
-      TypeAlias
-        tpe: Type
-      TypeDeclaration
-        lowerBound: Type
-        upperBound: Type
diff --git a/interface/other b/interface/other
deleted file mode 100644
index 68e4c3a50..000000000
--- a/interface/other
+++ /dev/null
@@ -1,81 +0,0 @@
-Source
-  compilation: Compilation
-  hash: Byte*
-  api: SourceAPI
-  apiHash: Int
-  _internalOnly_nameHashes: _internalOnly_NameHashes
-  hasMacro: Boolean
-
-_internalOnly_NameHashes
-  regularMembers: _internalOnly_NameHash*
-  implicitMembers: _internalOnly_NameHash*
-
-_internalOnly_NameHash
-  name: String
-  hash: Int
-
-SourceAPI
-  packages : Package*
-  definitions: Definition*
-
-OutputSetting
-  sourceDirectory: String
-  outputDirectory: String
-
-Compilation
-  startTime: Long
-  outputs: OutputSetting*
-
-Package
-  name: String
-
-Access
-  Public
-  Qualified
-    qualifier: Qualifier
-    Protected
-    Private
-
-Qualifier
-  Unqualified
-  ThisQualifier
-  IdQualifier
-    value: String
-
-ParameterList
-  parameters: MethodParameter*
-  isImplicit: Boolean
-MethodParameter
-  name: String
-  tpe: Type
-  hasDefault: Boolean
-  modifier: ParameterModifier
-
-TypeParameter
-  id: String
-  annotations: Annotation*
-  typeParameters : TypeParameter*
-  variance: Variance
-  lowerBound: Type
-  upperBound: Type
-
-Annotation
-  base: Type
-  arguments: AnnotationArgument*
-AnnotationArgument
-  name: String
-  value: String
-
-enum Variance : Contravariant, Covariant, Invariant
-enum ParameterModifier : Repeated, Plain, ByName
-enum DefinitionType : Trait, ClassDef, Module, PackageModule
-
-Path
-  components: PathComponent*
-
-PathComponent
-  Super
-    qualifier: Path
-  This
-  Id
-    id: String
\ No newline at end of file
diff --git a/interface/src/test/scala/xsbti/TestCallback.scala b/interface/src/test/scala/xsbti/TestCallback.scala
deleted file mode 100644
index f0658597b..000000000
--- a/interface/src/test/scala/xsbti/TestCallback.scala
+++ /dev/null
@@ -1,34 +0,0 @@
-package xsbti
-
-import java.io.File
-import scala.collection.mutable.ArrayBuffer
-import xsbti.api.SourceAPI
-import xsbti.DependencyContext._
-
-class TestCallback(override val nameHashing: Boolean = false) extends AnalysisCallback
-{
-  val sourceDependencies = new ArrayBuffer[(File, File, DependencyContext)]
-  val binaryDependencies = new ArrayBuffer[(File, String, File, DependencyContext)]
-  val products = new ArrayBuffer[(File, File, String)]
-  val usedNames = scala.collection.mutable.Map.empty[File, Set[String]].withDefaultValue(Set.empty)
-  val apis: scala.collection.mutable.Map[File, SourceAPI] = scala.collection.mutable.Map.empty
-
-  def sourceDependency(dependsOn: File, source: File, inherited: Boolean): Unit = {
-    val context = if(inherited) DependencyByInheritance else DependencyByMemberRef
-    sourceDependency(dependsOn, source, context)
-  }
-  def sourceDependency(dependsOn: File, source: File, context: DependencyContext): Unit = { sourceDependencies += ((dependsOn, source, context)) }
-  def binaryDependency(binary: File, name: String, source: File, inherited: Boolean): Unit = {
-    val context = if(inherited) DependencyByInheritance else DependencyByMemberRef
-    binaryDependency(binary, name, source, context)
-  }
-  def binaryDependency(binary: File, name: String, source: File, context: DependencyContext): Unit = { binaryDependencies += ((binary, name, source, context)) }
-  def generatedClass(source: File, module: File, name: String): Unit = { products += ((source, module, name)) }
-
-  def usedName(source: File, name: String): Unit = { usedNames(source) += name }
-  def api(source: File, sourceAPI: SourceAPI): Unit = {
-    assert(!apis.contains(source), s"The `api` method should be called once per source file: $source")
-    apis(source) = sourceAPI
-  }
-  def problem(category: String, pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean): Unit = ()
-}
diff --git a/interface/type b/interface/type
deleted file mode 100644
index dbb393dd6..000000000
--- a/interface/type
+++ /dev/null
@@ -1,30 +0,0 @@
-
-Type
-  SimpleType
-    Projection
-      prefix : SimpleType
-      id: String
-    ParameterRef
-      id: String
-    Singleton
-      path: Path
-    EmptyType
-    Parameterized
-      baseType : SimpleType
-      typeArguments: Type*
-    Constant
-      baseType: Type
-      value: String
-  Annotated
-    baseType : Type
-    annotations : Annotation*
-  Structure
-    parents : ~Type*
-    declared: ~Definition*
-    inherited: ~Definition*
-  Existential
-    baseType : Type
-    clause: TypeParameter*
-  Polymorphic
-    baseType: Type
-    parameters: TypeParameter*

From 871b4f4eefbf228a8a6f6e148a92819a4987210b Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Thu, 20 Aug 2015 00:43:22 -0400
Subject: [PATCH 524/823] move modules around.
---
 build.sbt | 22 +++++++++----------
 {util => internal}/process/NOTICE | 0
 .../src/main/scala/sbt/InheritInput.scala | 0
 .../process/src/main/scala/sbt/Process.scala | 0
 .../src/main/scala/sbt/ProcessImpl.scala | 0
 .../process/src/main/scala/sbt/SyncVar.scala | 0
 .../src/test/scala/ProcessSpecification.scala | 0
 .../src/test/scala/TestedProcess.scala | 0
 .../main/scala/sbt/appmacro/ContextUtil.scala | 0
 .../src/main/scala/sbt/appmacro/Convert.scala | 0
 .../main/scala/sbt/appmacro/Instance.scala | 0
 .../scala/sbt/appmacro/KListBuilder.scala | 0
 .../scala/sbt/appmacro/MixedBuilder.scala | 0
 .../scala/sbt/appmacro/TupleBuilder.scala | 0
 .../scala/sbt/appmacro/TupleNBuilder.scala | 0
 {cache => internal/util-cache}/NOTICE | 0
 .../src/main/scala/sbt/Cache.scala | 0
 .../src/main/scala/sbt/CacheIO.scala | 0
 .../src/main/scala/sbt/FileInfo.scala | 0
 .../src/main/scala/sbt/SeparatedCache.scala | 0
 .../src/test/scala/CacheTest.scala | 0
 .../util-collection}/NOTICE | 0
 .../src/main/scala/sbt/AList.scala | 0
 .../src/main/scala/sbt/Attributes.scala | 0
 .../src/main/scala/sbt/Classes.scala | 0
 .../src/main/scala/sbt/Dag.scala | 0
 .../src/main/scala/sbt/HList.scala | 0
 .../src/main/scala/sbt/IDSet.scala | 0
 .../src/main/scala/sbt/INode.scala | 0
 .../src/main/scala/sbt/KList.scala | 0
 .../src/main/scala/sbt/PMap.scala | 0
 .../src/main/scala/sbt/Param.scala | 0
 .../src/main/scala/sbt/Positions.scala | 0
 .../src/main/scala/sbt/Settings.scala | 0
 .../src/main/scala/sbt/Show.scala | 0
 .../src/main/scala/sbt/ShowLines.scala | 0
 .../src/main/scala/sbt/Signal.scala | 0
 .../src/main/scala/sbt/TypeFunctions.scala | 0
 .../src/main/scala/sbt/Types.scala | 0
 .../src/main/scala/sbt/Util.scala | 0
 .../src/test/scala/DagSpecification.scala | 0
 .../src/test/scala/KeyTest.scala | 0
 .../src/test/scala/LiteralTest.scala | 0
 .../src/test/scala/PMapTest.scala | 0
 .../src/test/scala/SettingsExample.scala | 0
 .../src/test/scala/SettingsTest.scala | 0
 .../util-complete}/NOTICE | 0
 .../src/main/scala/sbt/LineReader.scala | 0
 .../main/scala/sbt/complete/Completions.scala | 0
 .../scala/sbt/complete/EditDistance.scala | 0
 .../scala/sbt/complete/ExampleSource.scala | 0
 .../src/main/scala/sbt/complete/History.scala | 0
 .../scala/sbt/complete/HistoryCommands.scala | 0
 .../scala/sbt/complete/JLineCompletion.scala | 0
 .../src/main/scala/sbt/complete/Parser.scala | 0
 .../src/main/scala/sbt/complete/Parsers.scala | 0
 .../scala/sbt/complete/ProcessError.scala | 0
 .../scala/sbt/complete/TokenCompletions.scala | 0
 .../main/scala/sbt/complete/TypeString.scala | 0
 .../main/scala/sbt/complete/UpperBound.scala | 0
 .../src/test/scala/ParserTest.scala | 0
 .../scala/sbt/complete/FileExamplesTest.scala | 0
 .../sbt/complete/FixedSetExamplesTest.scala | 0
 .../sbt/complete/ParserWithExamplesTest.scala | 0
 {util/control => internal/util-control}/NOTICE | 0
 .../src/main/scala/sbt/ErrorHandling.scala | 0
 .../src/main/scala/sbt/ExitHook.scala | 0
 .../main/scala/sbt/MessageOnlyException.scala | 0
 .../src/main/java/xsbti/F0.java | 0
 .../src/main/java/xsbti/Logger.java | 0
 .../src/main/java/xsbti/Maybe.java | 0
 .../src/main/java/xsbti/Position.java | 0
 .../src/main/java/xsbti/Problem.java | 0
 .../src/main/java/xsbti/Severity.java | 0
 {util/log => internal/util-logging}/NOTICE | 0
 .../src/main/scala/sbt/BasicLogger.scala | 0
 .../src/main/scala/sbt/BufferedLogger.scala | 0
 .../src/main/scala/sbt/ConsoleLogger.scala | 0
 .../src/main/scala/sbt/ConsoleOut.scala | 0
 .../src/main/scala/sbt/FilterLogger.scala | 0
 .../src/main/scala/sbt/FullLogger.scala | 0
 .../src/main/scala/sbt/GlobalLogging.scala | 0
 .../src/main/scala/sbt/Level.scala | 0
 .../src/main/scala/sbt/LogEvent.scala | 0
 .../src/main/scala/sbt/Logger.scala | 0
 .../src/main/scala/sbt/LoggerWriter.scala | 0
 .../src/main/scala/sbt/MainLogging.scala | 0
 .../src/main/scala/sbt/MultiLogger.scala | 0
 .../src/main/scala/sbt/StackTrace.scala | 0
 .../src/test/scala/Escapes.scala | 0
 .../src/test/scala/LogWriterTest.scala | 0
 .../src/test/scala/TestLogger.scala | 0
 .../src/main/scala/sbt/logic/Logic.scala | 0
 .../src/test/scala/sbt/logic/Test.scala | 0
 .../src/main/scala/sbt/Relation.scala | 0
 .../src/test/scala/RelationTest.scala | 0
 .../util-tracking}/NOTICE | 0
 .../src/main/scala/sbt/ChangeReport.scala | 0
 .../src/main/scala/sbt/Tracked.scala | 0
 99 files changed, 10 insertions(+), 12 deletions(-)
 rename {util => internal}/process/NOTICE (100%)
 rename {util => internal}/process/src/main/scala/sbt/InheritInput.scala (100%)
 rename {util => internal}/process/src/main/scala/sbt/Process.scala (100%)
 rename {util => internal}/process/src/main/scala/sbt/ProcessImpl.scala (100%)
 rename {util => internal}/process/src/main/scala/sbt/SyncVar.scala (100%)
 rename {util => internal}/process/src/test/scala/ProcessSpecification.scala (100%)
 rename {util => internal}/process/src/test/scala/TestedProcess.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/ContextUtil.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/Convert.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/Instance.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/KListBuilder.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/MixedBuilder.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/TupleBuilder.scala (100%)
 rename {util/appmacro => internal/util-appmacro}/src/main/scala/sbt/appmacro/TupleNBuilder.scala (100%)
 rename {cache => internal/util-cache}/NOTICE (100%)
 rename {cache => internal/util-cache}/src/main/scala/sbt/Cache.scala (100%)
 rename {cache => internal/util-cache}/src/main/scala/sbt/CacheIO.scala (100%)
 rename {cache => internal/util-cache}/src/main/scala/sbt/FileInfo.scala (100%)
 rename {cache => internal/util-cache}/src/main/scala/sbt/SeparatedCache.scala (100%)
 rename {cache => internal/util-cache}/src/test/scala/CacheTest.scala (100%)
 rename {util/collection => internal/util-collection}/NOTICE (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/AList.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Attributes.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Classes.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Dag.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/HList.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/IDSet.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/INode.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/KList.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/PMap.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Param.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Positions.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Settings.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Show.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/ShowLines.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Signal.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/TypeFunctions.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Types.scala (100%)
 rename {util/collection => internal/util-collection}/src/main/scala/sbt/Util.scala (100%)
 rename {util/collection => internal/util-collection}/src/test/scala/DagSpecification.scala (100%)
 rename {util/collection => internal/util-collection}/src/test/scala/KeyTest.scala (100%)
 rename {util/collection => internal/util-collection}/src/test/scala/LiteralTest.scala (100%)
 rename {util/collection => internal/util-collection}/src/test/scala/PMapTest.scala (100%)
 rename {util/collection => internal/util-collection}/src/test/scala/SettingsExample.scala (100%)
 rename {util/collection => internal/util-collection}/src/test/scala/SettingsTest.scala (100%)
 rename {util/complete => internal/util-complete}/NOTICE (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/LineReader.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/Completions.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/EditDistance.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/ExampleSource.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/History.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/HistoryCommands.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/JLineCompletion.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/Parser.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/Parsers.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/ProcessError.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/TokenCompletions.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/TypeString.scala (100%)
 rename {util/complete => internal/util-complete}/src/main/scala/sbt/complete/UpperBound.scala (100%)
 rename {util/complete => internal/util-complete}/src/test/scala/ParserTest.scala (100%)
 rename {util/complete => internal/util-complete}/src/test/scala/sbt/complete/FileExamplesTest.scala (100%)
 rename {util/complete => internal/util-complete}/src/test/scala/sbt/complete/FixedSetExamplesTest.scala (100%)
 rename {util/complete => internal/util-complete}/src/test/scala/sbt/complete/ParserWithExamplesTest.scala (100%)
 rename {util/control => internal/util-control}/NOTICE (100%)
 rename {util/control => internal/util-control}/src/main/scala/sbt/ErrorHandling.scala (100%)
 rename {util/control => internal/util-control}/src/main/scala/sbt/ExitHook.scala (100%)
 rename {util/control => internal/util-control}/src/main/scala/sbt/MessageOnlyException.scala (100%)
 rename {interface => internal/util-interface}/src/main/java/xsbti/F0.java (100%)
 rename {interface => internal/util-interface}/src/main/java/xsbti/Logger.java (100%)
 rename {interface => internal/util-interface}/src/main/java/xsbti/Maybe.java (100%)
 rename {interface => internal/util-interface}/src/main/java/xsbti/Position.java (100%)
 rename {interface => internal/util-interface}/src/main/java/xsbti/Problem.java (100%)
 rename {interface => internal/util-interface}/src/main/java/xsbti/Severity.java (100%)
 rename {util/log => internal/util-logging}/NOTICE (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/BasicLogger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/BufferedLogger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/ConsoleLogger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/ConsoleOut.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/FilterLogger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/FullLogger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/GlobalLogging.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/Level.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/LogEvent.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/Logger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/LoggerWriter.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/MainLogging.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/MultiLogger.scala (100%)
 rename {util/log => internal/util-logging}/src/main/scala/sbt/StackTrace.scala (100%)
 rename {util/log => internal/util-logging}/src/test/scala/Escapes.scala (100%)
 rename {util/log => internal/util-logging}/src/test/scala/LogWriterTest.scala (100%)
 rename {util/log => internal/util-logging}/src/test/scala/TestLogger.scala (100%)
 rename {util/logic => internal/util-logic}/src/main/scala/sbt/logic/Logic.scala (100%)
 rename {util/logic => internal/util-logic}/src/test/scala/sbt/logic/Test.scala (100%)
 rename {util/relation => internal/util-relation}/src/main/scala/sbt/Relation.scala (100%)
 rename {util/relation => internal/util-relation}/src/test/scala/RelationTest.scala (100%)
 rename {cache/tracking => internal/util-tracking}/NOTICE (100%)
 rename {cache/tracking => internal/util-tracking}/src/main/scala/sbt/ChangeReport.scala (100%)
 rename {cache/tracking => internal/util-tracking}/src/main/scala/sbt/Tracked.scala (100%)

diff --git a/build.sbt b/build.sbt
index fcc99e21b..13061c241 100644
--- a/build.sbt
+++ b/build.sbt
@@ -2,8 +2,6 @@ import Dependencies._
 import Util._
 
 def internalPath = file("internal")
-def utilPath = file("util")
-def cachePath = file("cache")
 
 // ThisBuild settings take lower precedence,
 // but can be shared across the multi projects.
@@ -54,7 +52,7 @@ lazy val utilRoot: Project = (project in file(".")).
 // defines Java structures used across Scala versions, such as the API structures and relationships extracted by
 // the analysis compiler phases and passed back to sbt. The API structures are defined in a simple
 // format from which Java sources are generated by the datatype generator Projproject
-lazy val utilInterface = (project in file("interface")).
+lazy val utilInterface = (project in internalPath / "util-interface").
   settings(
     commonSettings,
     javaOnlySettings,
@@ -71,7 +69,7 @@ lazy val utilInterface = (project in file("interface")).
     // streams) map generateAPICached
   )
 
-lazy val utilControl = (project in utilPath / "control").
+lazy val utilControl = (project in internalPath / "util-control").
   settings(
     commonSettings,
     // Util.crossBuild,
@@ -79,7 +77,7 @@ lazy val utilControl = (project in utilPath / "control").
     crossScalaVersions := Seq(scala210, scala211)
   )
 
-lazy val utilCollection = (project in utilPath / "collection").
+lazy val utilCollection = (project in internalPath / "util-collection").
   settings(
     testedBaseSettings,
     Util.keywordsSettings,
@@ -88,7 +86,7 @@ lazy val utilCollection = (project in utilPath / "collection").
     crossScalaVersions := Seq(scala210, scala211)
   )
 
-lazy val utilApplyMacro = (project in utilPath / "appmacro").
+lazy val utilApplyMacro = (project in internalPath / "util-appmacro").
   dependsOn(utilCollection).
   settings(
     testedBaseSettings,
@@ -97,7 +95,7 @@ lazy val utilApplyMacro = (project in utilPath / "appmacro").
   )
 
 // Command line-related utilities.
-lazy val utilComplete = (project in utilPath / "complete").
+lazy val utilComplete = (project in internalPath / "util-complete").
   dependsOn(utilCollection, utilControl).
   settings(
     testedBaseSettings,
@@ -108,7 +106,7 @@ lazy val utilComplete = (project in utilPath / "complete").
   )
 
 // logging
-lazy val utilLogging = (project in utilPath / "log").
+lazy val utilLogging = (project in internalPath / "util-logging").
   dependsOn(utilInterface).
   settings(
     testedBaseSettings,
@@ -117,14 +115,14 @@ lazy val utilLogging = (project in utilPath / "log").
   )
 
 // Relation
-lazy val utilRelation = (project in utilPath / "relation").
+lazy val utilRelation = (project in internalPath / "util-relation").
   settings(
     testedBaseSettings,
     name := "Util Relation"
   )
 
 // A logic with restricted negation as failure for a unique, stable model
-lazy val utilLogic = (project in utilPath / "logic").
+lazy val utilLogic = (project in internalPath / "util-logic").
   dependsOn(utilCollection, utilRelation).
   settings(
     testedBaseSettings,
@@ -132,7 +130,7 @@ lazy val utilLogic = (project in utilPath / "logic").
   )
 
 // Persisted caching based on SBinary
-lazy val utilCache = (project in cachePath).
+lazy val utilCache = (project in internalPath / "util-cache").
   dependsOn(utilCollection).
   settings(
     commonSettings,
@@ -141,7 +139,7 @@ lazy val utilCache = (project in cachePath).
   )
 
 // Builds on cache to provide caching for filesystem-related operations
-lazy val utilTracking = (project in cachePath / "tracking").
+lazy val utilTracking = (project in internalPath / "util-tracking").
   dependsOn(utilCache).
   settings(
     commonSettings,
diff --git a/util/process/NOTICE b/internal/process/NOTICE
similarity index 100%
rename from util/process/NOTICE
rename to internal/process/NOTICE
diff --git a/util/process/src/main/scala/sbt/InheritInput.scala b/internal/process/src/main/scala/sbt/InheritInput.scala
similarity index 100%
rename from util/process/src/main/scala/sbt/InheritInput.scala
rename to internal/process/src/main/scala/sbt/InheritInput.scala
diff --git a/util/process/src/main/scala/sbt/Process.scala b/internal/process/src/main/scala/sbt/Process.scala
similarity index 100%
rename from util/process/src/main/scala/sbt/Process.scala
rename to internal/process/src/main/scala/sbt/Process.scala
diff --git a/util/process/src/main/scala/sbt/ProcessImpl.scala b/internal/process/src/main/scala/sbt/ProcessImpl.scala
similarity index 100%
rename from util/process/src/main/scala/sbt/ProcessImpl.scala
rename to internal/process/src/main/scala/sbt/ProcessImpl.scala
diff --git a/util/process/src/main/scala/sbt/SyncVar.scala b/internal/process/src/main/scala/sbt/SyncVar.scala
similarity index 100%
rename from util/process/src/main/scala/sbt/SyncVar.scala
rename to internal/process/src/main/scala/sbt/SyncVar.scala
diff --git a/util/process/src/test/scala/ProcessSpecification.scala b/internal/process/src/test/scala/ProcessSpecification.scala
similarity index 100%
rename from util/process/src/test/scala/ProcessSpecification.scala
rename to internal/process/src/test/scala/ProcessSpecification.scala
diff --git a/util/process/src/test/scala/TestedProcess.scala b/internal/process/src/test/scala/TestedProcess.scala
similarity index 100%
rename from util/process/src/test/scala/TestedProcess.scala
rename to internal/process/src/test/scala/TestedProcess.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Convert.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/Convert.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/Convert.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/Convert.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/Instance.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/Instance.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/Instance.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/Instance.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala
diff --git a/util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala
similarity index 100%
rename from util/appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala
diff --git a/cache/NOTICE b/internal/util-cache/NOTICE
similarity index 100%
rename from cache/NOTICE
rename to internal/util-cache/NOTICE
diff --git a/cache/src/main/scala/sbt/Cache.scala b/internal/util-cache/src/main/scala/sbt/Cache.scala
similarity index 100%
rename from cache/src/main/scala/sbt/Cache.scala
rename to internal/util-cache/src/main/scala/sbt/Cache.scala
diff --git a/cache/src/main/scala/sbt/CacheIO.scala b/internal/util-cache/src/main/scala/sbt/CacheIO.scala
similarity index 100%
rename from cache/src/main/scala/sbt/CacheIO.scala
rename to internal/util-cache/src/main/scala/sbt/CacheIO.scala
diff --git a/cache/src/main/scala/sbt/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/FileInfo.scala
similarity index 100%
rename from cache/src/main/scala/sbt/FileInfo.scala
rename to internal/util-cache/src/main/scala/sbt/FileInfo.scala
diff --git a/cache/src/main/scala/sbt/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala
similarity index 100%
rename from cache/src/main/scala/sbt/SeparatedCache.scala
rename to internal/util-cache/src/main/scala/sbt/SeparatedCache.scala
diff --git a/cache/src/test/scala/CacheTest.scala b/internal/util-cache/src/test/scala/CacheTest.scala
similarity index 100%
rename from cache/src/test/scala/CacheTest.scala
rename to internal/util-cache/src/test/scala/CacheTest.scala
diff --git a/util/collection/NOTICE b/internal/util-collection/NOTICE
similarity index 100%
rename from util/collection/NOTICE
rename to internal/util-collection/NOTICE
diff --git a/util/collection/src/main/scala/sbt/AList.scala b/internal/util-collection/src/main/scala/sbt/AList.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/AList.scala
rename to internal/util-collection/src/main/scala/sbt/AList.scala
diff --git a/util/collection/src/main/scala/sbt/Attributes.scala b/internal/util-collection/src/main/scala/sbt/Attributes.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Attributes.scala
rename to internal/util-collection/src/main/scala/sbt/Attributes.scala
diff --git a/util/collection/src/main/scala/sbt/Classes.scala b/internal/util-collection/src/main/scala/sbt/Classes.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Classes.scala
rename to internal/util-collection/src/main/scala/sbt/Classes.scala
diff --git a/util/collection/src/main/scala/sbt/Dag.scala b/internal/util-collection/src/main/scala/sbt/Dag.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Dag.scala
rename to internal/util-collection/src/main/scala/sbt/Dag.scala
diff --git a/util/collection/src/main/scala/sbt/HList.scala b/internal/util-collection/src/main/scala/sbt/HList.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/HList.scala
rename to internal/util-collection/src/main/scala/sbt/HList.scala
diff --git a/util/collection/src/main/scala/sbt/IDSet.scala b/internal/util-collection/src/main/scala/sbt/IDSet.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/IDSet.scala
rename to internal/util-collection/src/main/scala/sbt/IDSet.scala
diff --git a/util/collection/src/main/scala/sbt/INode.scala b/internal/util-collection/src/main/scala/sbt/INode.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/INode.scala
rename to internal/util-collection/src/main/scala/sbt/INode.scala
diff --git a/util/collection/src/main/scala/sbt/KList.scala b/internal/util-collection/src/main/scala/sbt/KList.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/KList.scala
rename to internal/util-collection/src/main/scala/sbt/KList.scala
diff --git a/util/collection/src/main/scala/sbt/PMap.scala b/internal/util-collection/src/main/scala/sbt/PMap.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/PMap.scala
rename to internal/util-collection/src/main/scala/sbt/PMap.scala
diff --git a/util/collection/src/main/scala/sbt/Param.scala b/internal/util-collection/src/main/scala/sbt/Param.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Param.scala
rename to internal/util-collection/src/main/scala/sbt/Param.scala
diff --git a/util/collection/src/main/scala/sbt/Positions.scala b/internal/util-collection/src/main/scala/sbt/Positions.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Positions.scala
rename to internal/util-collection/src/main/scala/sbt/Positions.scala
diff --git a/util/collection/src/main/scala/sbt/Settings.scala b/internal/util-collection/src/main/scala/sbt/Settings.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Settings.scala
rename to internal/util-collection/src/main/scala/sbt/Settings.scala
diff --git a/util/collection/src/main/scala/sbt/Show.scala b/internal/util-collection/src/main/scala/sbt/Show.scala
similarity index 100%
rename from util/collection/src/main/scala/sbt/Show.scala
rename to internal/util-collection/src/main/scala/sbt/Show.scala
diff --git
a/util/collection/src/main/scala/sbt/ShowLines.scala b/internal/util-collection/src/main/scala/sbt/ShowLines.scala similarity index 100% rename from util/collection/src/main/scala/sbt/ShowLines.scala rename to internal/util-collection/src/main/scala/sbt/ShowLines.scala diff --git a/util/collection/src/main/scala/sbt/Signal.scala b/internal/util-collection/src/main/scala/sbt/Signal.scala similarity index 100% rename from util/collection/src/main/scala/sbt/Signal.scala rename to internal/util-collection/src/main/scala/sbt/Signal.scala diff --git a/util/collection/src/main/scala/sbt/TypeFunctions.scala b/internal/util-collection/src/main/scala/sbt/TypeFunctions.scala similarity index 100% rename from util/collection/src/main/scala/sbt/TypeFunctions.scala rename to internal/util-collection/src/main/scala/sbt/TypeFunctions.scala diff --git a/util/collection/src/main/scala/sbt/Types.scala b/internal/util-collection/src/main/scala/sbt/Types.scala similarity index 100% rename from util/collection/src/main/scala/sbt/Types.scala rename to internal/util-collection/src/main/scala/sbt/Types.scala diff --git a/util/collection/src/main/scala/sbt/Util.scala b/internal/util-collection/src/main/scala/sbt/Util.scala similarity index 100% rename from util/collection/src/main/scala/sbt/Util.scala rename to internal/util-collection/src/main/scala/sbt/Util.scala diff --git a/util/collection/src/test/scala/DagSpecification.scala b/internal/util-collection/src/test/scala/DagSpecification.scala similarity index 100% rename from util/collection/src/test/scala/DagSpecification.scala rename to internal/util-collection/src/test/scala/DagSpecification.scala diff --git a/util/collection/src/test/scala/KeyTest.scala b/internal/util-collection/src/test/scala/KeyTest.scala similarity index 100% rename from util/collection/src/test/scala/KeyTest.scala rename to internal/util-collection/src/test/scala/KeyTest.scala diff --git a/util/collection/src/test/scala/LiteralTest.scala 
b/internal/util-collection/src/test/scala/LiteralTest.scala similarity index 100% rename from util/collection/src/test/scala/LiteralTest.scala rename to internal/util-collection/src/test/scala/LiteralTest.scala diff --git a/util/collection/src/test/scala/PMapTest.scala b/internal/util-collection/src/test/scala/PMapTest.scala similarity index 100% rename from util/collection/src/test/scala/PMapTest.scala rename to internal/util-collection/src/test/scala/PMapTest.scala diff --git a/util/collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala similarity index 100% rename from util/collection/src/test/scala/SettingsExample.scala rename to internal/util-collection/src/test/scala/SettingsExample.scala diff --git a/util/collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala similarity index 100% rename from util/collection/src/test/scala/SettingsTest.scala rename to internal/util-collection/src/test/scala/SettingsTest.scala diff --git a/util/complete/NOTICE b/internal/util-complete/NOTICE similarity index 100% rename from util/complete/NOTICE rename to internal/util-complete/NOTICE diff --git a/util/complete/src/main/scala/sbt/LineReader.scala b/internal/util-complete/src/main/scala/sbt/LineReader.scala similarity index 100% rename from util/complete/src/main/scala/sbt/LineReader.scala rename to internal/util-complete/src/main/scala/sbt/LineReader.scala diff --git a/util/complete/src/main/scala/sbt/complete/Completions.scala b/internal/util-complete/src/main/scala/sbt/complete/Completions.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/Completions.scala rename to internal/util-complete/src/main/scala/sbt/complete/Completions.scala diff --git a/util/complete/src/main/scala/sbt/complete/EditDistance.scala b/internal/util-complete/src/main/scala/sbt/complete/EditDistance.scala similarity index 100% rename from 
util/complete/src/main/scala/sbt/complete/EditDistance.scala rename to internal/util-complete/src/main/scala/sbt/complete/EditDistance.scala diff --git a/util/complete/src/main/scala/sbt/complete/ExampleSource.scala b/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/ExampleSource.scala rename to internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala diff --git a/util/complete/src/main/scala/sbt/complete/History.scala b/internal/util-complete/src/main/scala/sbt/complete/History.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/History.scala rename to internal/util-complete/src/main/scala/sbt/complete/History.scala diff --git a/util/complete/src/main/scala/sbt/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/HistoryCommands.scala rename to internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala diff --git a/util/complete/src/main/scala/sbt/complete/JLineCompletion.scala b/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/JLineCompletion.scala rename to internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala diff --git a/util/complete/src/main/scala/sbt/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/complete/Parser.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/Parser.scala rename to internal/util-complete/src/main/scala/sbt/complete/Parser.scala diff --git a/util/complete/src/main/scala/sbt/complete/Parsers.scala b/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/Parsers.scala rename to 
internal/util-complete/src/main/scala/sbt/complete/Parsers.scala diff --git a/util/complete/src/main/scala/sbt/complete/ProcessError.scala b/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/ProcessError.scala rename to internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala diff --git a/util/complete/src/main/scala/sbt/complete/TokenCompletions.scala b/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/TokenCompletions.scala rename to internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala diff --git a/util/complete/src/main/scala/sbt/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/TypeString.scala rename to internal/util-complete/src/main/scala/sbt/complete/TypeString.scala diff --git a/util/complete/src/main/scala/sbt/complete/UpperBound.scala b/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala similarity index 100% rename from util/complete/src/main/scala/sbt/complete/UpperBound.scala rename to internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala diff --git a/util/complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala similarity index 100% rename from util/complete/src/test/scala/ParserTest.scala rename to internal/util-complete/src/test/scala/ParserTest.scala diff --git a/util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala similarity index 100% rename from util/complete/src/test/scala/sbt/complete/FileExamplesTest.scala rename to internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala diff --git 
a/util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala similarity index 100% rename from util/complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala rename to internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala diff --git a/util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala similarity index 100% rename from util/complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala rename to internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala diff --git a/util/control/NOTICE b/internal/util-control/NOTICE similarity index 100% rename from util/control/NOTICE rename to internal/util-control/NOTICE diff --git a/util/control/src/main/scala/sbt/ErrorHandling.scala b/internal/util-control/src/main/scala/sbt/ErrorHandling.scala similarity index 100% rename from util/control/src/main/scala/sbt/ErrorHandling.scala rename to internal/util-control/src/main/scala/sbt/ErrorHandling.scala diff --git a/util/control/src/main/scala/sbt/ExitHook.scala b/internal/util-control/src/main/scala/sbt/ExitHook.scala similarity index 100% rename from util/control/src/main/scala/sbt/ExitHook.scala rename to internal/util-control/src/main/scala/sbt/ExitHook.scala diff --git a/util/control/src/main/scala/sbt/MessageOnlyException.scala b/internal/util-control/src/main/scala/sbt/MessageOnlyException.scala similarity index 100% rename from util/control/src/main/scala/sbt/MessageOnlyException.scala rename to internal/util-control/src/main/scala/sbt/MessageOnlyException.scala diff --git a/interface/src/main/java/xsbti/F0.java b/internal/util-interface/src/main/java/xsbti/F0.java similarity index 100% rename from interface/src/main/java/xsbti/F0.java rename to internal/util-interface/src/main/java/xsbti/F0.java diff --git 
a/interface/src/main/java/xsbti/Logger.java b/internal/util-interface/src/main/java/xsbti/Logger.java similarity index 100% rename from interface/src/main/java/xsbti/Logger.java rename to internal/util-interface/src/main/java/xsbti/Logger.java diff --git a/interface/src/main/java/xsbti/Maybe.java b/internal/util-interface/src/main/java/xsbti/Maybe.java similarity index 100% rename from interface/src/main/java/xsbti/Maybe.java rename to internal/util-interface/src/main/java/xsbti/Maybe.java diff --git a/interface/src/main/java/xsbti/Position.java b/internal/util-interface/src/main/java/xsbti/Position.java similarity index 100% rename from interface/src/main/java/xsbti/Position.java rename to internal/util-interface/src/main/java/xsbti/Position.java diff --git a/interface/src/main/java/xsbti/Problem.java b/internal/util-interface/src/main/java/xsbti/Problem.java similarity index 100% rename from interface/src/main/java/xsbti/Problem.java rename to internal/util-interface/src/main/java/xsbti/Problem.java diff --git a/interface/src/main/java/xsbti/Severity.java b/internal/util-interface/src/main/java/xsbti/Severity.java similarity index 100% rename from interface/src/main/java/xsbti/Severity.java rename to internal/util-interface/src/main/java/xsbti/Severity.java diff --git a/util/log/NOTICE b/internal/util-logging/NOTICE similarity index 100% rename from util/log/NOTICE rename to internal/util-logging/NOTICE diff --git a/util/log/src/main/scala/sbt/BasicLogger.scala b/internal/util-logging/src/main/scala/sbt/BasicLogger.scala similarity index 100% rename from util/log/src/main/scala/sbt/BasicLogger.scala rename to internal/util-logging/src/main/scala/sbt/BasicLogger.scala diff --git a/util/log/src/main/scala/sbt/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala similarity index 100% rename from util/log/src/main/scala/sbt/BufferedLogger.scala rename to internal/util-logging/src/main/scala/sbt/BufferedLogger.scala diff --git 
a/util/log/src/main/scala/sbt/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala similarity index 100% rename from util/log/src/main/scala/sbt/ConsoleLogger.scala rename to internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala diff --git a/util/log/src/main/scala/sbt/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala similarity index 100% rename from util/log/src/main/scala/sbt/ConsoleOut.scala rename to internal/util-logging/src/main/scala/sbt/ConsoleOut.scala diff --git a/util/log/src/main/scala/sbt/FilterLogger.scala b/internal/util-logging/src/main/scala/sbt/FilterLogger.scala similarity index 100% rename from util/log/src/main/scala/sbt/FilterLogger.scala rename to internal/util-logging/src/main/scala/sbt/FilterLogger.scala diff --git a/util/log/src/main/scala/sbt/FullLogger.scala b/internal/util-logging/src/main/scala/sbt/FullLogger.scala similarity index 100% rename from util/log/src/main/scala/sbt/FullLogger.scala rename to internal/util-logging/src/main/scala/sbt/FullLogger.scala diff --git a/util/log/src/main/scala/sbt/GlobalLogging.scala b/internal/util-logging/src/main/scala/sbt/GlobalLogging.scala similarity index 100% rename from util/log/src/main/scala/sbt/GlobalLogging.scala rename to internal/util-logging/src/main/scala/sbt/GlobalLogging.scala diff --git a/util/log/src/main/scala/sbt/Level.scala b/internal/util-logging/src/main/scala/sbt/Level.scala similarity index 100% rename from util/log/src/main/scala/sbt/Level.scala rename to internal/util-logging/src/main/scala/sbt/Level.scala diff --git a/util/log/src/main/scala/sbt/LogEvent.scala b/internal/util-logging/src/main/scala/sbt/LogEvent.scala similarity index 100% rename from util/log/src/main/scala/sbt/LogEvent.scala rename to internal/util-logging/src/main/scala/sbt/LogEvent.scala diff --git a/util/log/src/main/scala/sbt/Logger.scala b/internal/util-logging/src/main/scala/sbt/Logger.scala similarity index 100% rename from 
util/log/src/main/scala/sbt/Logger.scala rename to internal/util-logging/src/main/scala/sbt/Logger.scala diff --git a/util/log/src/main/scala/sbt/LoggerWriter.scala b/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala similarity index 100% rename from util/log/src/main/scala/sbt/LoggerWriter.scala rename to internal/util-logging/src/main/scala/sbt/LoggerWriter.scala diff --git a/util/log/src/main/scala/sbt/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/MainLogging.scala similarity index 100% rename from util/log/src/main/scala/sbt/MainLogging.scala rename to internal/util-logging/src/main/scala/sbt/MainLogging.scala diff --git a/util/log/src/main/scala/sbt/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/MultiLogger.scala similarity index 100% rename from util/log/src/main/scala/sbt/MultiLogger.scala rename to internal/util-logging/src/main/scala/sbt/MultiLogger.scala diff --git a/util/log/src/main/scala/sbt/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/StackTrace.scala similarity index 100% rename from util/log/src/main/scala/sbt/StackTrace.scala rename to internal/util-logging/src/main/scala/sbt/StackTrace.scala diff --git a/util/log/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala similarity index 100% rename from util/log/src/test/scala/Escapes.scala rename to internal/util-logging/src/test/scala/Escapes.scala diff --git a/util/log/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala similarity index 100% rename from util/log/src/test/scala/LogWriterTest.scala rename to internal/util-logging/src/test/scala/LogWriterTest.scala diff --git a/util/log/src/test/scala/TestLogger.scala b/internal/util-logging/src/test/scala/TestLogger.scala similarity index 100% rename from util/log/src/test/scala/TestLogger.scala rename to internal/util-logging/src/test/scala/TestLogger.scala diff --git a/util/logic/src/main/scala/sbt/logic/Logic.scala 
b/internal/util-logic/src/main/scala/sbt/logic/Logic.scala similarity index 100% rename from util/logic/src/main/scala/sbt/logic/Logic.scala rename to internal/util-logic/src/main/scala/sbt/logic/Logic.scala diff --git a/util/logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala similarity index 100% rename from util/logic/src/test/scala/sbt/logic/Test.scala rename to internal/util-logic/src/test/scala/sbt/logic/Test.scala diff --git a/util/relation/src/main/scala/sbt/Relation.scala b/internal/util-relation/src/main/scala/sbt/Relation.scala similarity index 100% rename from util/relation/src/main/scala/sbt/Relation.scala rename to internal/util-relation/src/main/scala/sbt/Relation.scala diff --git a/util/relation/src/test/scala/RelationTest.scala b/internal/util-relation/src/test/scala/RelationTest.scala similarity index 100% rename from util/relation/src/test/scala/RelationTest.scala rename to internal/util-relation/src/test/scala/RelationTest.scala diff --git a/cache/tracking/NOTICE b/internal/util-tracking/NOTICE similarity index 100% rename from cache/tracking/NOTICE rename to internal/util-tracking/NOTICE diff --git a/cache/tracking/src/main/scala/sbt/ChangeReport.scala b/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala similarity index 100% rename from cache/tracking/src/main/scala/sbt/ChangeReport.scala rename to internal/util-tracking/src/main/scala/sbt/ChangeReport.scala diff --git a/cache/tracking/src/main/scala/sbt/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/Tracked.scala similarity index 100% rename from cache/tracking/src/main/scala/sbt/Tracked.scala rename to internal/util-tracking/src/main/scala/sbt/Tracked.scala From 4629053277afb703829ca2b586d4b4ed12e35e17 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 20 Aug 2015 01:00:26 -0400 Subject: [PATCH 525/823] remove process --- internal/process/NOTICE | 3 - .../src/main/scala/sbt/InheritInput.scala | 21 - 
.../process/src/main/scala/sbt/Process.scala | 221 --------- .../src/main/scala/sbt/ProcessImpl.scala | 436 ------------------ .../process/src/main/scala/sbt/SyncVar.scala | 39 -- .../src/test/scala/ProcessSpecification.scala | 131 ------ .../src/test/scala/TestedProcess.scala | 46 -- 7 files changed, 897 deletions(-) delete mode 100644 internal/process/NOTICE delete mode 100755 internal/process/src/main/scala/sbt/InheritInput.scala delete mode 100644 internal/process/src/main/scala/sbt/Process.scala delete mode 100644 internal/process/src/main/scala/sbt/ProcessImpl.scala delete mode 100644 internal/process/src/main/scala/sbt/SyncVar.scala delete mode 100644 internal/process/src/test/scala/ProcessSpecification.scala delete mode 100644 internal/process/src/test/scala/TestedProcess.scala diff --git a/internal/process/NOTICE b/internal/process/NOTICE deleted file mode 100644 index 789c9ff1f..000000000 --- a/internal/process/NOTICE +++ /dev/null @@ -1,3 +0,0 @@ -Simple Build Tool: Process Component -Copyright 2008, 2009, 2010 Mark Harrah, Vesa Vilhonen -Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/internal/process/src/main/scala/sbt/InheritInput.scala b/internal/process/src/main/scala/sbt/InheritInput.scala deleted file mode 100755 index a9828b04d..000000000 --- a/internal/process/src/main/scala/sbt/InheritInput.scala +++ /dev/null @@ -1,21 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2012 Eugene Vigdorchik - */ -package sbt - -import java.lang.{ ProcessBuilder => JProcessBuilder } - -/** On java 7, inherit System.in for a ProcessBuilder. 
*/ -private[sbt] object InheritInput { - def apply(p: JProcessBuilder): Boolean = (redirectInput, inherit) match { - case (Some(m), Some(f)) => - m.invoke(p, f); true - case _ => false - } - - private[this] val pbClass = Class.forName("java.lang.ProcessBuilder") - private[this] val redirectClass = pbClass.getClasses find (_.getSimpleName == "Redirect") - - private[this] val redirectInput = redirectClass map (pbClass.getMethod("redirectInput", _)) - private[this] val inherit = redirectClass map (_ getField "INHERIT" get null) -} diff --git a/internal/process/src/main/scala/sbt/Process.scala b/internal/process/src/main/scala/sbt/Process.scala deleted file mode 100644 index 79435367d..000000000 --- a/internal/process/src/main/scala/sbt/Process.scala +++ /dev/null @@ -1,221 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package sbt - -import java.lang.{ Process => JProcess, ProcessBuilder => JProcessBuilder } -import java.io.{ Closeable, File, IOException } -import java.io.{ BufferedReader, InputStream, InputStreamReader, OutputStream, PipedInputStream, PipedOutputStream } -import java.net.URL - -trait ProcessExtra { - import Process._ - implicit def builderToProcess(builder: JProcessBuilder): ProcessBuilder = apply(builder) - implicit def fileToProcess(file: File): FilePartialBuilder = apply(file) - implicit def urlToProcess(url: URL): URLPartialBuilder = apply(url) - @deprecated("Use string interpolation", "0.13.0") - implicit def xmlToProcess(command: scala.xml.Elem): ProcessBuilder = apply(command) - implicit def buildersToProcess[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = applySeq(builders) - - implicit def stringToProcess(command: String): ProcessBuilder = apply(command) - implicit def stringSeqToProcess(command: Seq[String]): ProcessBuilder = apply(command) -} - -/** Methods for constructing simple commands that can then be combined. 
*/ -object Process extends ProcessExtra { - def apply(command: String): ProcessBuilder = apply(command, None) - - def apply(command: Seq[String]): ProcessBuilder = apply(command.toArray, None) - - def apply(command: String, arguments: Seq[String]): ProcessBuilder = apply(command :: arguments.toList, None) - /** create ProcessBuilder with working dir set to File and extra environment variables */ - def apply(command: String, cwd: File, extraEnv: (String, String)*): ProcessBuilder = - apply(command, Some(cwd), extraEnv: _*) - /** create ProcessBuilder with working dir set to File and extra environment variables */ - def apply(command: Seq[String], cwd: File, extraEnv: (String, String)*): ProcessBuilder = - apply(command, Some(cwd), extraEnv: _*) - /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ - def apply(command: String, cwd: Option[File], extraEnv: (String, String)*): ProcessBuilder = { - apply(command.split("""\s+"""), cwd, extraEnv: _*) - // not smart to use this on windows, because CommandParser uses \ to escape ". 
- /*CommandParser.parse(command) match { - case Left(errorMsg) => error(errorMsg) - case Right((cmd, args)) => apply(cmd :: args, cwd, extraEnv : _*) - }*/ - } - /** create ProcessBuilder with working dir optionally set to File and extra environment variables */ - def apply(command: Seq[String], cwd: Option[File], extraEnv: (String, String)*): ProcessBuilder = { - val jpb = new JProcessBuilder(command.toArray: _*) - cwd.foreach(jpb directory _) - extraEnv.foreach { case (k, v) => jpb.environment.put(k, v) } - apply(jpb) - } - def apply(builder: JProcessBuilder): ProcessBuilder = new SimpleProcessBuilder(builder) - def apply(file: File): FilePartialBuilder = new FileBuilder(file) - def apply(url: URL): URLPartialBuilder = new URLBuilder(url) - @deprecated("Use string interpolation", "0.13.0") - def apply(command: scala.xml.Elem): ProcessBuilder = apply(command.text.trim) - def applySeq[T](builders: Seq[T])(implicit convert: T => SourcePartialBuilder): Seq[SourcePartialBuilder] = builders.map(convert) - - def apply(value: Boolean): ProcessBuilder = apply(value.toString, if (value) 0 else 1) - def apply(name: String, exitValue: => Int): ProcessBuilder = new DummyProcessBuilder(name, exitValue) - - def cat(file: SourcePartialBuilder, files: SourcePartialBuilder*): ProcessBuilder = cat(file :: files.toList) - def cat(files: Seq[SourcePartialBuilder]): ProcessBuilder = - { - require(files.nonEmpty) - files.map(_.cat).reduceLeft(_ #&& _) - } -} - -trait SourcePartialBuilder extends NotNull { - /** Writes the output stream of this process to the given file. */ - def #>(f: File): ProcessBuilder = toFile(f, false) - /** Appends the output stream of this process to the given file. */ - def #>>(f: File): ProcessBuilder = toFile(f, true) - /** - * Writes the output stream of this process to the given OutputStream. The - * argument is call-by-name, so the stream is recreated, written, and closed each - * time this process is executed. 
- */ - def #>(out: => OutputStream): ProcessBuilder = #>(new OutputStreamBuilder(out)) - def #>(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(toSource, b, false, ExitCodes.firstIfNonzero) - private def toFile(f: File, append: Boolean) = #>(new FileOutput(f, append)) - def cat = toSource - protected def toSource: ProcessBuilder -} -trait SinkPartialBuilder extends NotNull { - /** Reads the given file into the input stream of this process. */ - def #<(f: File): ProcessBuilder = #<(new FileInput(f)) - /** Reads the given URL into the input stream of this process. */ - def #<(f: URL): ProcessBuilder = #<(new URLInput(f)) - /** - * Reads the given InputStream into the input stream of this process. The - * argument is call-by-name, so the stream is recreated, read, and closed each - * time this process is executed. - */ - def #<(in: => InputStream): ProcessBuilder = #<(new InputStreamBuilder(in)) - def #<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, toSink, false, ExitCodes.firstIfNonzero) - protected def toSink: ProcessBuilder -} - -trait URLPartialBuilder extends SourcePartialBuilder -trait FilePartialBuilder extends SinkPartialBuilder with SourcePartialBuilder { - def #<<(f: File): ProcessBuilder - def #<<(u: URL): ProcessBuilder - def #<<(i: => InputStream): ProcessBuilder - def #<<(p: ProcessBuilder): ProcessBuilder -} - -/** - * Represents a process that is running or has finished running. - * It may be a compound process with several underlying native processes (such as 'a #&& b`). - */ -trait Process extends NotNull { - /** Blocks until this process exits and returns the exit code.*/ - def exitValue(): Int - /** Destroys this process. */ - def destroy(): Unit -} -/** Represents a runnable process. */ -trait ProcessBuilder extends SourcePartialBuilder with SinkPartialBuilder { - /** - * Starts the process represented by this builder, blocks until it exits, and returns the output as a String. 
Standard error is - * sent to the console. If the exit code is non-zero, an exception is thrown. - */ - def !! : String - /** - * Starts the process represented by this builder, blocks until it exits, and returns the output as a String. Standard error is - * sent to the provided ProcessLogger. If the exit code is non-zero, an exception is thrown. - */ - def !!(log: ProcessLogger): String - /** - * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination and then throw an exception. - */ - def lines: Stream[String] - /** - * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination but will not throw an exception. - */ - def lines(log: ProcessLogger): Stream[String] - /** - * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the console. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination but will not throw an exception. - */ - def lines_! : Stream[String] - /** - * Starts the process represented by this builder. The output is returned as a Stream that blocks when lines are not available - * but the process has not completed. Standard error is sent to the provided ProcessLogger. If the process exits with a non-zero value, - * the Stream will provide all lines up to termination but will not throw an exception. 
- */ - def lines_!(log: ProcessLogger): Stream[String] - /** - * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the console. - */ - def ! : Int - /** - * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given ProcessLogger. - */ - def !(log: ProcessLogger): Int - /** - * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the console. The newly started process reads from standard input of the current process. - */ - def !< : Int - /** - * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given ProcessLogger. The newly started process reads from standard input of the current process. - */ - def !<(log: ProcessLogger): Int - /** Starts the process represented by this builder. Standard output and error are sent to the console.*/ - def run(): Process - /** Starts the process represented by this builder. Standard output and error are sent to the given ProcessLogger.*/ - def run(log: ProcessLogger): Process - /** Starts the process represented by this builder. I/O is handled by the given ProcessIO instance.*/ - def run(io: ProcessIO): Process - /** - * Starts the process represented by this builder. Standard output and error are sent to the console. - * The newly started process reads from standard input of the current process if `connectInput` is true. - */ - def run(connectInput: Boolean): Process - /** - * Starts the process represented by this builder, blocks until it exits, and returns the exit code. Standard output and error are - * sent to the given ProcessLogger. - * The newly started process reads from standard input of the current process if `connectInput` is true. 
- */ - def run(log: ProcessLogger, connectInput: Boolean): Process - - def runBuffered(log: ProcessLogger, connectInput: Boolean): Process - - /** Constructs a command that runs this command first and then `other` if this command succeeds.*/ - def #&&(other: ProcessBuilder): ProcessBuilder - /** Constructs a command that runs this command first and then `other` if this command does not succeed.*/ - def #||(other: ProcessBuilder): ProcessBuilder - /** - * Constructs a command that will run this command and pipes the output to `other`. - * `other` must be a simple command. - * The exit code will be that of `other` regardless of whether this command succeeds. - */ - def #|(other: ProcessBuilder): ProcessBuilder - /** Constructs a command that will run this command and then `other`. The exit code will be the exit code of `other`.*/ - def ###(other: ProcessBuilder): ProcessBuilder - - def canPipeTo: Boolean -} -/** Each method will be called in a separate thread.*/ -final class ProcessIO(val writeInput: OutputStream => Unit, val processOutput: InputStream => Unit, val processError: InputStream => Unit, val inheritInput: JProcessBuilder => Boolean) extends NotNull { - def withOutput(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, process, processError, inheritInput) - def withError(process: InputStream => Unit): ProcessIO = new ProcessIO(writeInput, processOutput, process, inheritInput) - def withInput(write: OutputStream => Unit): ProcessIO = new ProcessIO(write, processOutput, processError, inheritInput) -} -trait ProcessLogger { - def info(s: => String): Unit - def error(s: => String): Unit - def buffer[T](f: => T): T -} diff --git a/internal/process/src/main/scala/sbt/ProcessImpl.scala b/internal/process/src/main/scala/sbt/ProcessImpl.scala deleted file mode 100644 index abef81b33..000000000 --- a/internal/process/src/main/scala/sbt/ProcessImpl.scala +++ /dev/null @@ -1,436 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah, Vesa 
Vilhonen - */ -package sbt - -import java.lang.{ Process => JProcess, ProcessBuilder => JProcessBuilder } -import java.io.{ BufferedReader, Closeable, InputStream, InputStreamReader, IOException, OutputStream, PrintStream } -import java.io.{ FilterInputStream, FilterOutputStream, PipedInputStream, PipedOutputStream } -import java.io.{ File, FileInputStream, FileOutputStream } -import java.net.URL - -/** Runs provided code in a new Thread and returns the Thread instance. */ -private object Spawn { - def apply(f: => Unit): Thread = apply(f, false) - def apply(f: => Unit, daemon: Boolean): Thread = - { - val thread = new Thread() { override def run() = { f } } - thread.setDaemon(daemon) - thread.start() - thread - } -} -private object Future { - def apply[T](f: => T): () => T = - { - val result = new SyncVar[Either[Throwable, T]] - def run(): Unit = - try { result.set(Right(f)) } - catch { case e: Exception => result.set(Left(e)) } - Spawn(run) - () => - result.get match { - case Right(value) => value - case Left(exception) => throw exception - } - } -} - -object BasicIO { - def apply(buffer: StringBuffer, log: Option[ProcessLogger], withIn: Boolean) = new ProcessIO(input(withIn), processFully(buffer), getErr(log), inheritInput(withIn)) - def apply(log: ProcessLogger, withIn: Boolean) = new ProcessIO(input(withIn), processInfoFully(log), processErrFully(log), inheritInput(withIn)) - - def getErr(log: Option[ProcessLogger]) = log match { case Some(lg) => processErrFully(lg); case None => toStdErr } - - private def processErrFully(log: ProcessLogger) = processFully(s => log.error(s)) - private def processInfoFully(log: ProcessLogger) = processFully(s => log.info(s)) - - def closeOut = (_: OutputStream).close() - final val BufferSize = 8192 - final val Newline = System.getProperty("line.separator") - - def close(c: java.io.Closeable) = try { c.close() } catch { case _: java.io.IOException => () } - def processFully(buffer: Appendable): InputStream => Unit = 
processFully(appendLine(buffer)) - def processFully(processLine: String => Unit): InputStream => Unit = - in => - { - val reader = new BufferedReader(new InputStreamReader(in)) - processLinesFully(processLine)(reader.readLine) - reader.close() - } - def processLinesFully(processLine: String => Unit)(readLine: () => String): Unit = { - def readFully(): Unit = { - val line = readLine() - if (line != null) { - processLine(line) - readFully() - } - } - readFully() - } - def connectToIn(o: OutputStream): Unit = transferFully(Uncloseable protect System.in, o) - def input(connect: Boolean): OutputStream => Unit = if (connect) connectToIn else closeOut - def standard(connectInput: Boolean): ProcessIO = standard(input(connectInput), inheritInput(connectInput)) - def standard(in: OutputStream => Unit, inheritIn: JProcessBuilder => Boolean): ProcessIO = new ProcessIO(in, toStdOut, toStdErr, inheritIn) - - def toStdErr = (in: InputStream) => transferFully(in, System.err) - def toStdOut = (in: InputStream) => transferFully(in, System.out) - - def transferFully(in: InputStream, out: OutputStream): Unit = - try { transferFullyImpl(in, out) } - catch { case _: InterruptedException => () } - - private[this] def appendLine(buffer: Appendable): String => Unit = - line => - { - buffer.append(line) - buffer.append(Newline) - } - - private[this] def transferFullyImpl(in: InputStream, out: OutputStream): Unit = { - val continueCount = 1 //if(in.isInstanceOf[PipedInputStream]) 1 else 0 - val buffer = new Array[Byte](BufferSize) - def read(): Unit = { - val byteCount = in.read(buffer) - if (byteCount >= continueCount) { - out.write(buffer, 0, byteCount) - out.flush() - read - } - } - read - in.close() - } - - def inheritInput(connect: Boolean) = { p: JProcessBuilder => if (connect) InheritInput(p) else false } -} -private[sbt] object ExitCodes { - def ignoreFirst: (Int, Int) => Int = (a, b) => b - def firstIfNonzero: (Int, Int) => Int = (a, b) => if (a != 0) a else b -} - -private abstract 
class AbstractProcessBuilder extends ProcessBuilder with SinkPartialBuilder with SourcePartialBuilder { - def #&&(other: ProcessBuilder): ProcessBuilder = new AndProcessBuilder(this, other) - def #||(other: ProcessBuilder): ProcessBuilder = new OrProcessBuilder(this, other) - def #|(other: ProcessBuilder): ProcessBuilder = - { - require(other.canPipeTo, "Piping to multiple processes is not supported.") - new PipedProcessBuilder(this, other, false, exitCode = ExitCodes.ignoreFirst) - } - def ###(other: ProcessBuilder): ProcessBuilder = new SequenceProcessBuilder(this, other) - - protected def toSource = this - protected def toSink = this - - def run(): Process = run(false) - def run(connectInput: Boolean): Process = run(BasicIO.standard(connectInput)) - def run(log: ProcessLogger): Process = run(log, false) - def run(log: ProcessLogger, connectInput: Boolean): Process = run(BasicIO(log, connectInput)) - - private[this] def getString(log: Option[ProcessLogger], withIn: Boolean): String = - { - val buffer = new StringBuffer - val code = this ! BasicIO(buffer, log, withIn) - if (code == 0) buffer.toString else sys.error("Nonzero exit value: " + code) - } - def !! = getString(None, false) - def !!(log: ProcessLogger) = getString(Some(log), false) - def !!< = getString(None, true) - def !!<(log: ProcessLogger) = getString(Some(log), true) - - def lines: Stream[String] = lines(false, true, None) - def lines(log: ProcessLogger): Stream[String] = lines(false, true, Some(log)) - def lines_! 
: Stream[String] = lines(false, false, None) - def lines_!(log: ProcessLogger): Stream[String] = lines(false, false, Some(log)) - - private[this] def lines(withInput: Boolean, nonZeroException: Boolean, log: Option[ProcessLogger]): Stream[String] = - { - val streamed = Streamed[String](nonZeroException) - val process = run(new ProcessIO(BasicIO.input(withInput), BasicIO.processFully(streamed.process), BasicIO.getErr(log), BasicIO.inheritInput(withInput))) - Spawn { streamed.done(process.exitValue()) } - streamed.stream() - } - - def ! = run(false).exitValue() - def !< = run(true).exitValue() - def !(log: ProcessLogger) = runBuffered(log, false).exitValue() - def !<(log: ProcessLogger) = runBuffered(log, true).exitValue() - def runBuffered(log: ProcessLogger, connectInput: Boolean) = - log.buffer { run(log, connectInput) } - def !(io: ProcessIO) = run(io).exitValue() - - def canPipeTo = false -} - -private[sbt] class URLBuilder(url: URL) extends URLPartialBuilder with SourcePartialBuilder { - protected def toSource = new URLInput(url) -} -private[sbt] class FileBuilder(base: File) extends FilePartialBuilder with SinkPartialBuilder with SourcePartialBuilder { - protected def toSource = new FileInput(base) - protected def toSink = new FileOutput(base, false) - def #<<(f: File): ProcessBuilder = #<<(new FileInput(f)) - def #<<(u: URL): ProcessBuilder = #<<(new URLInput(u)) - def #<<(s: => InputStream): ProcessBuilder = #<<(new InputStreamBuilder(s)) - def #<<(b: ProcessBuilder): ProcessBuilder = new PipedProcessBuilder(b, new FileOutput(base, true), false, ExitCodes.firstIfNonzero) -} - -private abstract class BasicBuilder extends AbstractProcessBuilder { - protected[this] def checkNotThis(a: ProcessBuilder) = require(a != this, "Compound process '" + a + "' cannot contain itself.") - final def run(io: ProcessIO): Process = - { - val p = createProcess(io) - p.start() - p - } - protected[this] def createProcess(io: ProcessIO): BasicProcess -} -private abstract class 
BasicProcess extends Process { - def start(): Unit -} - -private abstract class CompoundProcess extends BasicProcess { - def destroy(): Unit = destroyer() - def exitValue() = getExitValue().getOrElse(sys.error("No exit code: process destroyed.")) - - def start() = getExitValue - - protected lazy val (getExitValue, destroyer) = - { - val code = new SyncVar[Option[Int]]() - code.set(None) - val thread = Spawn(code.set(runAndExitValue())) - - ( - Future { thread.join(); code.get }, - () => thread.interrupt() - ) - } - - /** Start and block until the exit value is available and then return it in Some. Return None if destroyed (use 'run')*/ - protected[this] def runAndExitValue(): Option[Int] - - protected[this] def runInterruptible[T](action: => T)(destroyImpl: => Unit): Option[T] = - { - try { Some(action) } - catch { case _: InterruptedException => destroyImpl; None } - } -} - -private abstract class SequentialProcessBuilder(a: ProcessBuilder, b: ProcessBuilder, operatorString: String) extends BasicBuilder { - checkNotThis(a) - checkNotThis(b) - override def toString = " ( " + a + " " + operatorString + " " + b + " ) " -} -private class PipedProcessBuilder(first: ProcessBuilder, second: ProcessBuilder, toError: Boolean, exitCode: (Int, Int) => Int) extends SequentialProcessBuilder(first, second, if (toError) "#|!" 
else "#|") { - override def createProcess(io: ProcessIO) = new PipedProcesses(first, second, io, toError, exitCode) -} -private class AndProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#&&") { - override def createProcess(io: ProcessIO) = new AndProcess(first, second, io) -} -private class OrProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "#||") { - override def createProcess(io: ProcessIO) = new OrProcess(first, second, io) -} -private class SequenceProcessBuilder(first: ProcessBuilder, second: ProcessBuilder) extends SequentialProcessBuilder(first, second, "###") { - override def createProcess(io: ProcessIO) = new ProcessSequence(first, second, io) -} - -private class SequentialProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO, evaluateSecondProcess: Int => Boolean) extends CompoundProcess { - protected[this] override def runAndExitValue() = - { - val first = a.run(io) - runInterruptible(first.exitValue)(first.destroy()) flatMap - { codeA => - if (evaluateSecondProcess(codeA)) { - val second = b.run(io) - runInterruptible(second.exitValue)(second.destroy()) - } else - Some(codeA) - } - } -} -private class AndProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, _ == 0) -private class OrProcess(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, _ != 0) -private class ProcessSequence(a: ProcessBuilder, b: ProcessBuilder, io: ProcessIO) extends SequentialProcess(a, b, io, ignore => true) - -private class PipedProcesses(a: ProcessBuilder, b: ProcessBuilder, defaultIO: ProcessIO, toError: Boolean, exitCode: (Int, Int) => Int) extends CompoundProcess { - protected[this] override def runAndExitValue() = - { - val currentSource = new SyncVar[Option[InputStream]] - val pipeOut = new PipedOutputStream - val source = new PipeSource(currentSource, pipeOut, 
a.toString) - source.start() - - val pipeIn = new PipedInputStream(pipeOut) - val currentSink = new SyncVar[Option[OutputStream]] - val sink = new PipeSink(pipeIn, currentSink, b.toString) - sink.start() - - def handleOutOrError(fromOutput: InputStream) = currentSource.put(Some(fromOutput)) - - val firstIO = - if (toError) - defaultIO.withError(handleOutOrError) - else - defaultIO.withOutput(handleOutOrError) - val secondIO = defaultIO.withInput(toInput => currentSink.put(Some(toInput))) - - val second = b.run(secondIO) - val first = a.run(firstIO) - try { - runInterruptible { - val firstResult = first.exitValue - currentSource.put(None) - currentSink.put(None) - val secondResult = second.exitValue - exitCode(firstResult, secondResult) - } { - first.destroy() - second.destroy() - } - } finally { - BasicIO.close(pipeIn) - BasicIO.close(pipeOut) - } - } -} -private class PipeSource(currentSource: SyncVar[Option[InputStream]], pipe: PipedOutputStream, label: => String) extends Thread { - final override def run(): Unit = { - currentSource.get match { - case Some(source) => - try { BasicIO.transferFully(source, pipe) } - catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } - finally { - BasicIO.close(source) - currentSource.unset() - } - run() - case None => - currentSource.unset() - BasicIO.close(pipe) - } - } -} -private class PipeSink(pipe: PipedInputStream, currentSink: SyncVar[Option[OutputStream]], label: => String) extends Thread { - final override def run(): Unit = { - currentSink.get match { - case Some(sink) => - try { BasicIO.transferFully(pipe, sink) } - catch { case e: IOException => println("I/O error " + e.getMessage + " for process: " + label); e.printStackTrace() } - finally { - BasicIO.close(sink) - currentSink.unset() - } - run() - case None => - currentSink.unset() - } - } -} - -private[sbt] class DummyProcessBuilder(override val toString: String, exitValue: => Int) extends 
AbstractProcessBuilder { - override def run(io: ProcessIO): Process = new DummyProcess(exitValue) - override def canPipeTo = true -} -/** - * A fake Process that evaluates the provided `action` on a background thread. - * The implementation of `exitValue` blocks until that computation completes. - */ -private class DummyProcess(action: => Int) extends Process { - private[this] val exitCode = Future(action) - override def exitValue() = exitCode() - override def destroy(): Unit = () -} -/** Represents a simple command without any redirection or combination. */ -private[sbt] class SimpleProcessBuilder(p: JProcessBuilder) extends AbstractProcessBuilder { - override def run(io: ProcessIO): Process = - { - import io._ - val inherited = inheritInput(p) - val process = p.start() - - // spawn threads that process the output and error streams, and also write input if not inherited. - if (!inherited) - Spawn(writeInput(process.getOutputStream)) - val outThread = Spawn(processOutput(process.getInputStream)) - val errorThread = - if (!p.redirectErrorStream) - Spawn(processError(process.getErrorStream)) :: Nil - else - Nil - new SimpleProcess(process, outThread :: errorThread) - } - override def toString = p.command.toString - override def canPipeTo = true -} - -/** - * A thin wrapper around a java.lang.Process. `outputThreads` are the Threads created to read from the - * output and error streams of the process. - * The implementation of `exitValue` waits for the process to finish and then waits until the threads reading output and error streams die before - * returning. 
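The wait-then-join discipline this comment describes — `waitFor` the process first, then `join` the reader threads so trailing output is not lost — can be sketched directly against `java.lang.ProcessBuilder` (Scala 2.12+ for the lambda-as-`Runnable`; `echo` is only a stand-in command):

```scala
import java.io.{ BufferedReader, InputStream, InputStreamReader }

// Drain a stream on its own thread, mirroring SimpleProcess's outputThreads.
def drain(in: InputStream, sink: StringBuilder): Thread = {
  val t = new Thread(() => {
    val reader = new BufferedReader(new InputStreamReader(in))
    var line = reader.readLine()
    while (line != null) { sink.append(line); line = reader.readLine() }
    reader.close()
  })
  t.start()
  t
}

val process = new ProcessBuilder("echo", "drained").start()
val out = new StringBuilder
val err = new StringBuilder
val readers = List(drain(process.getInputStream, out), drain(process.getErrorStream, err))
process.waitFor()         // the process has exited...
readers.foreach(_.join()) // ...but join the readers too, or trailing output may be lost
val exit = process.exitValue()
```

Skipping the `join` step is exactly the bug this class guards against: `waitFor` returning does not mean the pipe buffers have been fully consumed.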
Note that the thread that reads the input stream cannot be interrupted, see https://github.com/sbt/sbt/issues/327 and - * http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4514257 - */ -private class SimpleProcess(p: JProcess, outputThreads: List[Thread]) extends Process { - override def exitValue() = - { - try { - p.waitFor() - } catch { - case _: InterruptedException => p.destroy() - } - outputThreads.foreach(_.join()) // this ensures that all output is complete before returning (waitFor does not ensure this) - p.exitValue() - } - override def destroy() = p.destroy() -} - -private class FileOutput(file: File, append: Boolean) extends OutputStreamBuilder(new FileOutputStream(file, append), file.getAbsolutePath) -private class URLInput(url: URL) extends InputStreamBuilder(url.openStream, url.toString) -private class FileInput(file: File) extends InputStreamBuilder(new FileInputStream(file), file.getAbsolutePath) - -import Uncloseable.protect -private class OutputStreamBuilder(stream: => OutputStream, label: String) extends ThreadProcessBuilder(label, _.writeInput(protect(stream))) { - def this(stream: => OutputStream) = this(stream, "") -} -private class InputStreamBuilder(stream: => InputStream, label: String) extends ThreadProcessBuilder(label, _.processOutput(protect(stream))) { - def this(stream: => InputStream) = this(stream, "") -} - -private abstract class ThreadProcessBuilder(override val toString: String, runImpl: ProcessIO => Unit) extends AbstractProcessBuilder { - override def run(io: ProcessIO): Process = - { - val success = new SyncVar[Boolean] - success.put(false) - new ThreadProcess(Spawn { runImpl(io); success.set(true) }, success) - } -} -private final class ThreadProcess(thread: Thread, success: SyncVar[Boolean]) extends Process { - override def exitValue() = - { - thread.join() - if (success.get) 0 else 1 - } - override def destroy(): Unit = thread.interrupt() -} - -object Uncloseable { - def apply(in: InputStream): InputStream = new 
FilterInputStream(in) { override def close(): Unit = () } - def apply(out: OutputStream): OutputStream = new FilterOutputStream(out) { override def close(): Unit = () } - def protect(in: InputStream): InputStream = if (in eq System.in) Uncloseable(in) else in - def protect(out: OutputStream): OutputStream = if ((out eq System.out) || (out eq System.err)) Uncloseable(out) else out -} -private object Streamed { - def apply[T](nonzeroException: Boolean): Streamed[T] = - { - val q = new java.util.concurrent.LinkedBlockingQueue[Either[Int, T]] - def next(): Stream[T] = - q.take match { - case Left(0) => Stream.empty - case Left(code) => if (nonzeroException) sys.error("Nonzero exit code: " + code) else Stream.empty - case Right(s) => Stream.cons(s, next) - } - new Streamed((s: T) => q.put(Right(s)), code => q.put(Left(code)), () => next()) - } -} - -private final class Streamed[T](val process: T => Unit, val done: Int => Unit, val stream: () => Stream[T]) extends NotNull diff --git a/internal/process/src/main/scala/sbt/SyncVar.scala b/internal/process/src/main/scala/sbt/SyncVar.scala deleted file mode 100644 index c268aac3d..000000000 --- a/internal/process/src/main/scala/sbt/SyncVar.scala +++ /dev/null @@ -1,39 +0,0 @@ -package sbt - -// minimal copy of scala.concurrent.SyncVar since that version deprecated put and unset -private[sbt] final class SyncVar[A] { - private[this] var isDefined: Boolean = false - private[this] var value: Option[A] = None - - /** Waits until a value is set and then gets it. Does not clear the value */ - def get: A = synchronized { - while (!isDefined) wait() - value.get - } - - /** Waits until a value is set, gets it, and finally clears the value. */ - def take(): A = synchronized { - try get finally unset() - } - - /** Sets the value, whether or not it is currently defined. 
*/ - def set(x: A): Unit = synchronized { - isDefined = true - value = Some(x) - notifyAll() - } - - /** Sets the value, first waiting until it is undefined if it is currently defined. */ - def put(x: A): Unit = synchronized { - while (isDefined) wait() - set(x) - } - - /** Clears the value, whether or not it is currently defined. */ - def unset(): Unit = synchronized { - isDefined = false - value = None - notifyAll() - } -} - diff --git a/internal/process/src/test/scala/ProcessSpecification.scala b/internal/process/src/test/scala/ProcessSpecification.scala deleted file mode 100644 index 67bd5e625..000000000 --- a/internal/process/src/test/scala/ProcessSpecification.scala +++ /dev/null @@ -1,131 +0,0 @@ -package sbt - -import java.io.File -import org.scalacheck.{ Arbitrary, Gen, Prop, Properties } -import Prop._ - -import Process._ - -object ProcessSpecification extends Properties("Process I/O") { - implicit val exitCodeArb: Arbitrary[Array[Byte]] = Arbitrary( - for ( - size <- Gen.choose(0, 10); - l <- Gen.listOfN[Byte](size, Arbitrary.arbByte.arbitrary) - ) yield l.toArray - ) - - /*property("Correct exit code") = forAll( (exitCode: Byte) => checkExit(exitCode)) - property("#&& correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #&& _)(_ && _)) - property("#|| correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ #|| _)(_ || _)) - property("### correct") = forAll( (exitCodes: Array[Byte]) => checkBinary(exitCodes)(_ ### _)( (x,latest) => latest))*/ - property("Pipe to output file") = forAll((data: Array[Byte]) => checkFileOut(data)) - property("Pipe from input file") = forAll((data: Array[Byte]) => checkFileIn(data)) - property("Pipe to process") = forAll((data: Array[Byte]) => checkPipe(data)) - property("Pipe to process ignores input exit code") = forAll((data: Array[Byte], code: Byte) => checkPipeExit(data, code)) - property("Pipe from input file to bad process preserves correct exit code.") = forAll((data: Array[Byte], 
code: Byte) => checkFileInExit(data, code)) - property("Pipe to output file from bad process preserves correct exit code.") = forAll((data: Array[Byte], code: Byte) => checkFileOutExit(data, code)) - - private def checkBinary(codes: Array[Byte])(reduceProcesses: (ProcessBuilder, ProcessBuilder) => ProcessBuilder)(reduceExit: (Boolean, Boolean) => Boolean) = - { - (codes.length > 1) ==> - { - val unsignedCodes = codes.map(unsigned) - val exitCode = unsignedCodes.map(code => Process(process("sbt.exit " + code))).reduceLeft(reduceProcesses) ! - val expectedExitCode = unsignedCodes.map(toBoolean).reduceLeft(reduceExit) - toBoolean(exitCode) == expectedExitCode - } - } - private def toBoolean(exitCode: Int) = exitCode == 0 - private def checkExit(code: Byte) = - { - val exitCode = unsigned(code) - (process("sbt.exit " + exitCode) !) == exitCode - } - private def checkFileOut(data: Array[Byte]) = - { - withData(data) { (temporaryFile, temporaryFile2) => - val catCommand = process("sbt.cat " + temporaryFile.getAbsolutePath) - catCommand #> temporaryFile2 - } - } - private def checkFileIn(data: Array[Byte]) = - { - withData(data) { (temporaryFile, temporaryFile2) => - val catCommand = process("sbt.cat") - temporaryFile #> catCommand #> temporaryFile2 - } - } - private def checkPipe(data: Array[Byte]) = - { - withData(data) { (temporaryFile, temporaryFile2) => - val catCommand = process("sbt.cat") - temporaryFile #> catCommand #| catCommand #> temporaryFile2 - } - } - private def checkPipeExit(data: Array[Byte], code: Byte) = - withTempFiles { (a, b) => - IO.write(a, data) - val catCommand = process("sbt.cat") - val exitCommand = process(s"sbt.exit $code") - val exit = (a #> exitCommand #| catCommand #> b).! 
- (s"Exit code: $exit") |: - (s"Output file length: ${b.length}") |: - (exit == 0) && - (b.length == 0) - } - - private def checkFileOutExit(data: Array[Byte], exitCode: Byte) = - withTempFiles { (a, b) => - IO.write(a, data) - val code = unsigned(exitCode) - val command = process(s"sbt.exit $code") - val exit = (a #> command #> b).! - (s"Exit code: $exit, expected: $code") |: - (s"Output file length: ${b.length}") |: - (exit == code) && - (b.length == 0) - } - - private def checkFileInExit(data: Array[Byte], exitCode: Byte) = - withTempFiles { (a, b) => - IO.write(a, data) - val code = unsigned(exitCode) - val command = process(s"sbt.exit $code") - val exit = (a #> command).! - (s"Exit code: $exit, expected: $code") |: - (exit == code) - } - - private def temp() = File.createTempFile("sbt", "") - private def withData(data: Array[Byte])(f: (File, File) => ProcessBuilder) = - withTempFiles { (a, b) => - IO.write(a, data) - val process = f(a, b) - (process !) == 0 && sameFiles(a, b) - } - private def sameFiles(a: File, b: File) = - IO.readBytes(a) sameElements IO.readBytes(b) - - private def withTempFiles[T](f: (File, File) => T): T = - { - val temporaryFile1 = temp() - val temporaryFile2 = temp() - try f(temporaryFile1, temporaryFile2) - finally { - temporaryFile1.delete() - temporaryFile2.delete() - } - } - private def unsigned(b: Int): Int = ((b: Int) + 256) % 256 - private def unsigned(b: Byte): Int = unsigned(b: Int) - private def process(command: String) = - { - val ignore = echo // just for the compile dependency so that this test is rerun when TestedProcess.scala changes, not used otherwise - - val thisClasspath = List(getSource[Product], getSource[IO.type], getSource[SourceTag]).mkString(File.pathSeparator) - "java -cp " + thisClasspath + " " + command - } - private def getSource[T: Manifest]: String = - IO.classLocationFile[T].getAbsolutePath -} -private trait SourceTag diff --git a/internal/process/src/test/scala/TestedProcess.scala 
b/internal/process/src/test/scala/TestedProcess.scala deleted file mode 100644 index 8dabcf381..000000000 --- a/internal/process/src/test/scala/TestedProcess.scala +++ /dev/null @@ -1,46 +0,0 @@ -package sbt - -import java.io.{ File, FileNotFoundException, IOException } - -object exit { - def main(args: Array[String]): Unit = { - System.exit(java.lang.Integer.parseInt(args(0))) - } -} -object cat { - def main(args: Array[String]): Unit = { - try { - if (args.length == 0) - IO.transfer(System.in, System.out) - else - catFiles(args.toList) - System.exit(0) - } catch { - case e: Throwable => - e.printStackTrace() - System.err.println("Error: " + e.toString) - System.exit(1) - } - } - private def catFiles(filenames: List[String]): Option[String] = - { - filenames match { - case head :: tail => - val file = new File(head) - if (file.isDirectory) - throw new IOException("Is directory: " + file) - else if (file.exists) { - Using.fileInputStream(file) { stream => - IO.transfer(stream, System.out) - } - catFiles(tail) - } else - throw new FileNotFoundException("No such file or directory: " + file) - case Nil => None - } - } -} -object echo { - def main(args: Array[String]): Unit = - System.out.println(args.mkString(" ")) -} From ef7ff653cb251acc96eb4210f87ac8d978cedb19 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 20 Aug 2015 01:02:40 -0400 Subject: [PATCH 526/823] readme --- README.md | 9 +++++++++ 1 file changed, 9 insertions(+) create mode 100644 README.md diff --git a/README.md b/README.md new file mode 100644 index 000000000..acdbb8c3a --- /dev/null +++ b/README.md @@ -0,0 +1,9 @@ +### utility modules for sbt + +``` +cd sbt-modules/util-take2 +git filter-branch --index-filter 'git rm --cached -qr -- . 
&& git reset -q $GIT_COMMIT -- build.sbt LICENSE NOTICE interface util/appmacro util/collection util/complete util/control util/log util/logic util/process util/relation cache' --prune-empty +git reset --hard +git gc --aggressive +git prune +``` From 8eae9ba726a13c690e21df0714f1ba8c7b111ba8 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 31 Aug 2015 00:57:00 +0200 Subject: [PATCH 527/823] Fix test dependencies, cross compile all projects --- build.sbt | 9 ++++----- project/Dependencies.scala | 9 +++++++-- 2 files changed, 11 insertions(+), 7 deletions(-) diff --git a/build.sbt b/build.sbt index 13061c241..0cd1a67d3 100644 --- a/build.sbt +++ b/build.sbt @@ -25,15 +25,14 @@ def commonSettings: Seq[Setting[_]] = Seq( // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), - incOptions := incOptions.value.withNameHashing(true) - // crossScalaVersions := Seq(scala210) + incOptions := incOptions.value.withNameHashing(true), + crossScalaVersions := Seq(scala210, scala211) // bintrayPackage := (bintrayPackage in ThisBuild).value, // bintrayRepository := (bintrayRepository in ThisBuild).value ) -// def testedBaseSettings: Seq[Setting[_]] = -// baseSettings ++ testDependencies -def testedBaseSettings: Seq[Setting[_]] = commonSettings +def testedBaseSettings: Seq[Setting[_]] = + commonSettings ++ testDependencies lazy val utilRoot: Project = (project in file(".")). // configs(Sxr.sxrConf). 
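For reference, the shape this commit gives `project/Dependencies.scala` — declare each module once, and expose a single setting that projects pull in through `testedBaseSettings` — looks roughly like this (a sketch of the pattern, not the full file):

```scala
import sbt._
import Keys._

object Dependencies {
  lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.11.4"
  lazy val specs2     = "org.specs2"     %% "specs2"     % "2.3.11"

  // Mixed into projects as `commonSettings ++ testDependencies`.
  lazy val testDependencies = libraryDependencies ++= Seq(
    scalaCheck,
    specs2
  )
}
```

Centralizing versions this way keeps the per-project settings in `build.sbt` down to a one-line mix-in.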
diff --git a/project/Dependencies.scala b/project/Dependencies.scala index f14cf23c9..6782137dd 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -18,8 +18,6 @@ object Dependencies { // lazy val jsch = "com.jcraft" % "jsch" % "0.1.46" intransitive () lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val sbinary = "org.scala-tools.sbinary" %% "sbinary" % "0.4.2" - // lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.11.4" - // lazy val specs2 = "org.specs2" %% "specs2" % "2.3.11" // lazy val junit = "junit" % "junit" % "4.11" lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } @@ -31,4 +29,11 @@ object Dependencies { } } lazy val scalaXml = scala211Module("scala-xml", "1.0.1") + + lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.11.4" + lazy val specs2 = "org.specs2" %% "specs2" % "2.3.11" + lazy val testDependencies = libraryDependencies ++= Seq( + scalaCheck, + specs2 + ) } From bb20c40eccbb0ee2c7484321c50bc10da431ae8c Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 31 Aug 2015 00:48:28 +0200 Subject: [PATCH 528/823] Publish tests in utilLogging project They are used by the tests in sbt/librarymanagement --- build.sbt | 1 + 1 file changed, 1 insertion(+) diff --git a/build.sbt b/build.sbt index 0cd1a67d3..b5d71daae 100644 --- a/build.sbt +++ b/build.sbt @@ -109,6 +109,7 @@ lazy val utilLogging = (project in internalPath / "util-logging"). dependsOn(utilInterface). 
settings( testedBaseSettings, + publishArtifact in (Test, packageBin) := true, name := "Util Logging", libraryDependencies += jline ) From 650b37ff2467c73a32a75ea86f8579dbf6c1f7b2 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 31 Aug 2015 01:03:37 +0200 Subject: [PATCH 529/823] Travis setup --- .travis.yml | 4 ++++ 1 file changed, 4 insertions(+) create mode 100644 .travis.yml diff --git a/.travis.yml b/.travis.yml new file mode 100644 index 000000000..a5b73c0d8 --- /dev/null +++ b/.travis.yml @@ -0,0 +1,4 @@ +language: scala +scala: + - 2.10.5 + - 2.11.7 From b4e27ce471bbab5c7db6c4cbd34ef8b3035cf3bc Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 31 Aug 2015 01:51:03 +0200 Subject: [PATCH 530/823] Update IO library to sbt/IO v1.0.0-M1 --- build.sbt | 6 +++--- .../util-cache/src/main/scala/sbt/CacheIO.scala | 3 ++- .../util-cache/src/main/scala/sbt/FileInfo.scala | 3 ++- .../src/main/scala/sbt/SeparatedCache.scala | 3 ++- .../src/main/scala/sbt/complete/ExampleSource.scala | 4 ++-- .../main/scala/sbt/complete/HistoryCommands.scala | 3 ++- .../test/scala/sbt/complete/FileExamplesTest.scala | 3 +-- .../util-tracking/src/main/scala/sbt/Tracked.scala | 6 ++++-- project/Dependencies.scala | 13 ++++--------- 9 files changed, 22 insertions(+), 22 deletions(-) diff --git a/build.sbt b/build.sbt index b5d71daae..cf41630d1 100644 --- a/build.sbt +++ b/build.sbt @@ -100,7 +100,7 @@ lazy val utilComplete = (project in internalPath / "util-complete"). testedBaseSettings, // Util.crossBuild, name := "Util Completion", - libraryDependencies ++= Seq(jline, ioProj), + libraryDependencies ++= Seq(jline, sbtIO), crossScalaVersions := Seq(scala210, scala211) ) @@ -135,7 +135,7 @@ lazy val utilCache = (project in internalPath / "util-cache"). 
settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sbinary, sbtSerialization, scalaReflect.value, ioProj) ++ scalaXml.value + libraryDependencies ++= Seq(sbinary, sbtSerialization, scalaReflect.value, sbtIO) ++ scalaXml.value ) // Builds on cache to provide caching for filesystem-related operations @@ -144,5 +144,5 @@ lazy val utilTracking = (project in internalPath / "util-tracking"). settings( commonSettings, name := "Util Tracking", - libraryDependencies ++= Seq(ioProj) + libraryDependencies ++= Seq(sbtIO) ) diff --git a/internal/util-cache/src/main/scala/sbt/CacheIO.scala b/internal/util-cache/src/main/scala/sbt/CacheIO.scala index a50da7ee7..3bb952330 100644 --- a/internal/util-cache/src/main/scala/sbt/CacheIO.scala +++ b/internal/util-cache/src/main/scala/sbt/CacheIO.scala @@ -6,6 +6,7 @@ package sbt import java.io.{ File, FileNotFoundException } import sbinary.{ DefaultProtocol, Format, Operations } import scala.reflect.Manifest +import sbt.io.IO object CacheIO { def toBytes[T](format: Format[T])(value: T)(implicit mf: Manifest[Format[T]]): Array[Byte] = @@ -41,4 +42,4 @@ object CacheIO { def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode def manifest[T](implicit mf: Manifest[T]): Manifest[T] = mf def objManifest[T](t: T)(implicit mf: Manifest[T]): Manifest[T] = mf -} \ No newline at end of file +} diff --git a/internal/util-cache/src/main/scala/sbt/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/FileInfo.scala index c735adcb0..12002c508 100644 --- a/internal/util-cache/src/main/scala/sbt/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/FileInfo.scala @@ -7,6 +7,7 @@ import java.io.{ File, IOException } import sbinary.{ DefaultProtocol, Format } import DefaultProtocol._ import scala.reflect.Manifest +import sbt.io.Hash sealed trait FileInfo extends NotNull { val file: File @@ -103,4 +104,4 @@ object FilesInfo { implicit def hashInputsCache: InputCache[FilesInfo[HashFileInfo]] = hash.infosInputCache 
implicit def modifiedInputsCache: InputCache[FilesInfo[ModifiedFileInfo]] = lastModified.infosInputCache implicit def fullInputsCache: InputCache[FilesInfo[HashModifiedFileInfo]] = full.infosInputCache -} \ No newline at end of file +} diff --git a/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala index 9d11f1f3c..1022832d6 100644 --- a/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala +++ b/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala @@ -7,6 +7,7 @@ import Types.:+: import sbinary.{ DefaultProtocol, Format, Input, Output => Out } import DefaultProtocol.ByteFormat import java.io.{ File, InputStream, OutputStream } +import sbt.internal.io.Using trait InputCache[I] { type Internal @@ -59,4 +60,4 @@ class BasicCache[I, O](implicit input: InputCache[I], outFormat: Format[O]) exte outFormat.writes(stream, out) } } -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala b/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala index 52d96246b..ab2d39b4c 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala @@ -1,7 +1,7 @@ package sbt.complete import java.io.File -import sbt.IO +import sbt.io.IO /** * These sources of examples are used in parsers for user input completion. 
An example of such a source is the @@ -56,4 +56,4 @@ class FileExamples(base: File, prefix: String = "") extends ExampleSource { private def dirStartsWithPrefix(relativizedPath: String): Boolean = (relativizedPath startsWith prefix) || (prefix startsWith relativizedPath) -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala index 1e124c583..c82ecfa3d 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala @@ -5,6 +5,7 @@ package sbt package complete import java.io.File +import sbt.io.IO object HistoryCommands { val Start = "!" @@ -70,4 +71,4 @@ object HistoryCommands { val actionParser: Parser[complete.History => Option[List[String]]] = Start ~> (help | last | execInt | list | execStr) // execStr must come last -} \ No newline at end of file +} diff --git a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala index effd9be78..f9cc77038 100644 --- a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala @@ -2,9 +2,8 @@ package sbt.complete import org.specs2.mutable.Specification import org.specs2.specification.Scope -import sbt.IO.withTemporaryDirectory import java.io.File -import sbt.IO._ +import sbt.io.IO._ class FileExamplesTest extends Specification { diff --git a/internal/util-tracking/src/main/scala/sbt/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/Tracked.scala index 0de466686..5c6979718 100644 --- a/internal/util-tracking/src/main/scala/sbt/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/Tracked.scala @@ -9,7 +9,9 @@ import sbinary.Format import scala.pickling.PicklingException import 
scala.reflect.Manifest import scala.collection.mutable -import IO.{ delete, read, write } +import sbt.io.IO.{ delete, read, write } +import sbt.io.{ IO, Path } +import sbt.internal.io.Using import sbt.serialization._ object Tracked { @@ -251,4 +253,4 @@ object FileFunction { } } } -} \ No newline at end of file +} diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 6782137dd..0313a7267 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,21 +6,15 @@ object Dependencies { lazy val scala211 = "2.11.7" val bootstrapSbtVersion = "0.13.8" - // lazy val interfaceProj = "org.scala-sbt" % "interface" % bootstrapSbtVersion - lazy val ioProj = "org.scala-sbt" % "io" % bootstrapSbtVersion - // lazy val collectionProj = "org.scala-sbt" % "collections" % bootstrapSbtVersion - // lazy val logProj = "org.scala-sbt" % "logging" % bootstrapSbtVersion - // lazy val crossProj = "org.scala-sbt" % "cross" % bootstrapSbtVersion + lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M1" lazy val jline = "jline" % "jline" % "2.11" - // lazy val launcherInterface = "org.scala-sbt" % "launcher-interface" % "1.0.0-M1" - // lazy val ivy = "org.scala-sbt.ivy" % "ivy" % "2.3.0-sbt-927bc9ded7f8fba63297cddd0d5a3d01d6ad5d8d" - // lazy val jsch = "com.jcraft" % "jsch" % "0.1.46" intransitive () lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val sbinary = "org.scala-tools.sbinary" %% "sbinary" % "0.4.2" - // lazy val junit = "junit" % "junit" % "4.11" + lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } + private def scala211Module(name: String, moduleVersion: String) = Def.setting { scalaVersion.value match { @@ -28,6 +22,7 @@ object Dependencies { case _ => ("org.scala-lang.modules" %% name % moduleVersion) :: Nil } } + lazy val scalaXml = scala211Module("scala-xml", "1.0.1") lazy 
val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.11.4" From fca7a42f3d75a8645733e0471aef250052ef9a3c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 31 Aug 2015 03:02:44 -0400 Subject: [PATCH 531/823] Add sbt-house-rules and bintray-sbt --- build.sbt | 39 ++++++++++--------- .../scala/sbt/complete/HistoryCommands.scala | 3 +- .../main/scala/sbt/complete/TypeString.scala | 3 +- .../src/main/scala/sbt/ExitHook.scala | 3 +- .../src/main/scala/sbt/logic/Logic.scala | 3 +- project/bintray.sbt | 1 + project/house.sbt | 1 + 7 files changed, 27 insertions(+), 26 deletions(-) create mode 100644 project/bintray.sbt create mode 100644 project/house.sbt diff --git a/build.sbt b/build.sbt index cf41630d1..1a09bb098 100644 --- a/build.sbt +++ b/build.sbt @@ -3,20 +3,6 @@ import Util._ def internalPath = file("internal") -// ThisBuild settings take lower precedence, -// but can be shared across the multi projects. -def buildLevelSettings: Seq[Setting[_]] = Seq( - organization in ThisBuild := "org.scala-sbt.util", - version in ThisBuild := "1.0.0-SNAPSHOT" - // bintrayOrganization in ThisBuild := { - // if ((publishStatus in ThisBuild).value == "releases") Some("typesafe") - // else Some("sbt") - // }, - // bintrayRepository in ThisBuild := s"ivy-${(publishStatus in ThisBuild).value}", - // bintrayPackage in ThisBuild := "sbt", - // bintrayReleaseOnPublish in ThisBuild := false -) - def commonSettings: Seq[Setting[_]] = Seq( scalaVersion := "2.10.5", // publishArtifact in packageDoc := false, @@ -26,9 +12,9 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), incOptions := incOptions.value.withNameHashing(true), - crossScalaVersions := Seq(scala210, scala211) - // bintrayPackage := (bintrayPackage in ThisBuild).value, - // bintrayRepository := (bintrayRepository in ThisBuild).value + crossScalaVersions := 
Seq(scala210, scala211), + bintrayPackage := (bintrayPackage in ThisBuild).value, + bintrayRepository := (bintrayRepository in ThisBuild).value ) def testedBaseSettings: Seq[Setting[_]] = @@ -41,7 +27,24 @@ lazy val utilRoot: Project = (project in file(".")). utilLogging, utilRelation, utilLogic, utilCache, utilTracking ). settings( - buildLevelSettings, + inThisBuild(Seq( + organization := "org.scala-sbt.util", + version := "0.1.0-SNAPSHOT", + homepage := Some(url("https://github.com/sbt/util")), + description := "Util module for sbt", + licenses := List("BSD New" -> url("https://github.com/sbt/sbt/blob/0.13/LICENSE")), + scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), + developers := List( + Developer("harrah", "Mark Harrah", "@harrah", url("https://github.com/harrah")), + Developer("eed3si9n", "Eugene Yokota", "@eed3si9n", url("https://github.com/eed3si9n")), + Developer("jsuereth", "Josh Suereth", "@jsuereth", url("https://github.com/jsuereth")), + Developer("dwijnand", "Dale Wijnand", "@dwijnand", url("https://github.com/dwijnand")) + ), + bintrayReleaseOnPublish := false, + bintrayOrganization := Some("sbt"), + bintrayRepository := "maven-releases", + bintrayPackage := "util" + )), commonSettings, name := "Util Root", publish := {}, diff --git a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala index c82ecfa3d..ff58fe9d2 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala @@ -34,8 +34,7 @@ object HistoryCommands { Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), Previous -> "Execute the nth command before this one", StartsWithString -> "Execute the most recent command starting with 'string'", - ContainsString -> "Execute the most recent command containing 'string'" - ) + 
ContainsString -> "Execute the most recent command containing 'string'") def helpString = "History commands:\n " + (descriptions.map { case (c, d) => c + " " + d }).mkString("\n ") def printHelp(): Unit = println(helpString) diff --git a/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala index 6bf89ac05..bd5f84f43 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala @@ -61,8 +61,7 @@ private[sbt] object TypeString { val TypeMap = Map( "java.io.File" -> "File", "java.net.URL" -> "URL", - "java.net.URI" -> "URI" - ) + "java.net.URI" -> "URI") /** * A Parser that extracts basic structure from the string representation of a type from Manifest.toString. diff --git a/internal/util-control/src/main/scala/sbt/ExitHook.scala b/internal/util-control/src/main/scala/sbt/ExitHook.scala index 8ee5ddf86..16f295c7c 100644 --- a/internal/util-control/src/main/scala/sbt/ExitHook.scala +++ b/internal/util-control/src/main/scala/sbt/ExitHook.scala @@ -16,6 +16,5 @@ object ExitHooks { /** Calls each registered exit hook, trapping any exceptions so that each hook is given a chance to run. 
*/ def runExitHooks(exitHooks: Seq[ExitHook]): Seq[Throwable] = exitHooks.flatMap(hook => - ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption - ) + ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption) } \ No newline at end of file diff --git a/internal/util-logic/src/main/scala/sbt/logic/Logic.scala b/internal/util-logic/src/main/scala/sbt/logic/Logic.scala index 856394251..0fbbe3f98 100644 --- a/internal/util-logic/src/main/scala/sbt/logic/Logic.scala +++ b/internal/util-logic/src/main/scala/sbt/logic/Logic.scala @@ -103,8 +103,7 @@ object Logic { checkAcyclic(clauses) problem.toLeft( - reduce0(clauses, initialFacts, Matched.empty) - ) + reduce0(clauses, initialFacts, Matched.empty)) } /** diff --git a/project/bintray.sbt b/project/bintray.sbt new file mode 100644 index 000000000..8dd913f98 --- /dev/null +++ b/project/bintray.sbt @@ -0,0 +1 @@ +addSbtPlugin("me.lessis" % "bintray-sbt" % "0.3.0") diff --git a/project/house.sbt b/project/house.sbt new file mode 100644 index 000000000..eefc29672 --- /dev/null +++ b/project/house.sbt @@ -0,0 +1 @@ +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.1.0") From 76d5aa49881c721227015fa9b14e7599d59f213f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 31 Aug 2015 03:19:51 -0400 Subject: [PATCH 532/823] publishArtifact := false --- build.sbt | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 1a09bb098..f4ebbe7a4 100644 --- a/build.sbt +++ b/build.sbt @@ -48,7 +48,8 @@ lazy val utilRoot: Project = (project in file(".")). commonSettings, name := "Util Root", publish := {}, - publishLocal := {} + publishLocal := {}, + publishArtifact := false ) // defines Java structures used across Scala versions, such as the API structures and relationships extracted by From 6603a948474be2fa562e47fa521b568100ce9c83 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 31 Aug 2015 10:55:02 +0200 Subject: [PATCH 533/823] Add sbt-houserules, formatting. 
--- .../src/test/scala/SettingsExample.scala | 3 +-- .../src/test/scala/SettingsTest.scala | 9 +++------ .../scala/sbt/complete/HistoryCommands.scala | 3 +-- .../src/main/scala/sbt/complete/TypeString.scala | 3 +-- .../sbt/complete/ParserWithExamplesTest.scala | 16 ++++++---------- .../src/main/scala/sbt/ExitHook.scala | 3 +-- .../src/main/scala/sbt/logic/Logic.scala | 3 +-- .../src/test/scala/sbt/logic/Test.scala | 3 +-- project/p.sbt | 1 + 9 files changed, 16 insertions(+), 28 deletions(-) create mode 100644 project/p.sbt diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index b48bb27fc..3a6bc3853 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -49,8 +49,7 @@ object SettingsUsage { val mySettings: Seq[Setting[_]] = Seq( setting(a3, value(3)), setting(b4, map(a4)(_ * 3)), - update(a5)(_ + 1) - ) + update(a5)(_ + 1)) // "compiles" and applies the settings. // This can be split into multiple steps to access intermediate results if desired. diff --git a/internal/util-collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala index d97b1056a..ab92332db 100644 --- a/internal/util-collection/src/test/scala/SettingsTest.scala +++ b/internal/util-collection/src/test/scala/SettingsTest.scala @@ -55,8 +55,7 @@ object SettingsTest extends Properties("settings") { List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 nextInit = if (scoped0 == chk) chk else (scoped0 zipWith chk) { (p, _) => p + 1 } - } yield derive(setting(scoped1, nextInit)) - ).toSeq + } yield derive(setting(scoped1, nextInit))).toSeq { // Note: This causes a cycle refernec error, quite frequently. 
@@ -96,8 +95,7 @@ object SettingsTest extends Properties("settings") { setting(b, value(6)), derive(setting(b, a)), setting(a, value(5)), - setting(b, value(8)) - ) + setting(b, value(8))) val ev = evaluate(settings) checkKey(a, Some(5), ev) && checkKey(b, Some(8), ev) } @@ -106,8 +104,7 @@ object SettingsTest extends Properties("settings") { setting(a, value(3)), setting(b, value(6)), derive(setting(b, a)), - setting(a, value(5)) - ) + setting(a, value(5))) val ev = evaluate(settings) checkKey(a, Some(5), ev) && checkKey(b, Some(5), ev) } diff --git a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala index c82ecfa3d..ff58fe9d2 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala @@ -34,8 +34,7 @@ object HistoryCommands { Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), Previous -> "Execute the nth command before this one", StartsWithString -> "Execute the most recent command starting with 'string'", - ContainsString -> "Execute the most recent command containing 'string'" - ) + ContainsString -> "Execute the most recent command containing 'string'") def helpString = "History commands:\n " + (descriptions.map { case (c, d) => c + " " + d }).mkString("\n ") def printHelp(): Unit = println(helpString) diff --git a/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala index 6bf89ac05..bd5f84f43 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala @@ -61,8 +61,7 @@ private[sbt] object TypeString { val TypeMap = Map( "java.io.File" -> "File", "java.net.URL" -> "URL", - "java.net.URI" -> "URI" - ) + "java.net.URI" -> "URI") /** * A Parser that 
extracts basic structure from the string representation of a type from Manifest.toString. diff --git a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala index dff68803c..3bcc55dd2 100644 --- a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala @@ -17,8 +17,7 @@ class ParserWithExamplesTest extends Specification { "use the delegate parser to remove invalid examples" in new parserWithValidExamples { val validCompletions = Completions(Set( suggestion("blue"), - suggestion("red") - )) + suggestion("red"))) parserWithExamples.completions(0) shouldEqual validCompletions } } @@ -26,8 +25,7 @@ class ParserWithExamplesTest extends Specification { "listing valid completions in a derived parser" should { "produce only valid examples that start with the character of the derivation" in new parserWithValidExamples { val derivedCompletions = Completions(Set( - suggestion("lue") - )) + suggestion("lue"))) parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions } } @@ -43,8 +41,7 @@ class ParserWithExamplesTest extends Specification { "produce only examples that start with the character of the derivation" in new parserWithAllExamples { val derivedCompletions = Completions(Set( suggestion("lue"), - suggestion("lock") - )) + suggestion("lock"))) parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions } } @@ -56,8 +53,8 @@ class ParserWithExamplesTest extends Specification { class parserWithAllExamples extends parser(removeInvalidExamples = false) case class parser(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), - maxNumberOfExamples: Int = 25, - removeInvalidExamples: Boolean) extends Scope { + maxNumberOfExamples: Int = 25, + removeInvalidExamples: Boolean) extends Scope { import 
DefaultParsers._ @@ -66,8 +63,7 @@ class ParserWithExamplesTest extends Specification { colorParser, FixedSetExamples(examples), maxNumberOfExamples, - removeInvalidExamples - ) + removeInvalidExamples) } case class GrowableSourceOfExamples() extends Iterable[String] { diff --git a/internal/util-control/src/main/scala/sbt/ExitHook.scala b/internal/util-control/src/main/scala/sbt/ExitHook.scala index 8ee5ddf86..16f295c7c 100644 --- a/internal/util-control/src/main/scala/sbt/ExitHook.scala +++ b/internal/util-control/src/main/scala/sbt/ExitHook.scala @@ -16,6 +16,5 @@ object ExitHooks { /** Calls each registered exit hook, trapping any exceptions so that each hook is given a chance to run. */ def runExitHooks(exitHooks: Seq[ExitHook]): Seq[Throwable] = exitHooks.flatMap(hook => - ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption - ) + ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption) } \ No newline at end of file diff --git a/internal/util-logic/src/main/scala/sbt/logic/Logic.scala b/internal/util-logic/src/main/scala/sbt/logic/Logic.scala index 856394251..0fbbe3f98 100644 --- a/internal/util-logic/src/main/scala/sbt/logic/Logic.scala +++ b/internal/util-logic/src/main/scala/sbt/logic/Logic.scala @@ -103,8 +103,7 @@ object Logic { checkAcyclic(clauses) problem.toLeft( - reduce0(clauses, initialFacts, Matched.empty) - ) + reduce0(clauses, initialFacts, Matched.empty)) } /** diff --git a/internal/util-logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala index f62a9e767..da1f0b706 100644 --- a/internal/util-logic/src/test/scala/sbt/logic/Test.scala +++ b/internal/util-logic/src/test/scala/sbt/logic/Test.scala @@ -20,8 +20,7 @@ object LogicTest extends Properties("Logic") { case Right(res) => false case Left(err: Logic.CyclicNegation) => true case Left(err) => sys.error(s"Expected cyclic error, got: $err") - } - ) + }) def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = 
result match { case Left(err) => false diff --git a/project/p.sbt b/project/p.sbt new file mode 100644 index 000000000..eefc29672 --- /dev/null +++ b/project/p.sbt @@ -0,0 +1 @@ +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.1.0") From 6175d9233848b220ad8b68f63c90e9b844903f0d Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 31 Aug 2015 15:25:10 +0200 Subject: [PATCH 534/823] Add recommended compiler flags, fix most of the warnings --- build.sbt | 38 +++++++++---------- .../main/scala/sbt/appmacro/ContextUtil.scala | 10 +++-- .../scala/sbt/appmacro/KListBuilder.scala | 2 +- .../scala/sbt/appmacro/TupleNBuilder.scala | 2 +- .../util-cache/src/main/scala/sbt/Cache.scala | 3 +- .../util-cache/src/test/scala/CacheTest.scala | 1 + .../src/main/scala/sbt/Dag.scala | 1 + .../src/main/scala/sbt/IDSet.scala | 2 +- .../src/main/scala/sbt/INode.scala | 3 +- .../src/main/scala/sbt/PMap.scala | 2 +- .../src/main/scala/sbt/Settings.scala | 10 ++--- .../src/main/scala/sbt/Signal.scala | 5 ++- .../src/main/scala/sbt/LineReader.scala | 3 +- .../src/main/scala/sbt/complete/History.scala | 11 +++--- .../scala/sbt/complete/HistoryCommands.scala | 2 +- .../scala/sbt/complete/JLineCompletion.scala | 2 +- .../src/main/scala/sbt/complete/Parser.scala | 8 ++-- .../src/main/scala/sbt/complete/Parsers.scala | 4 +- .../scala/sbt/complete/ProcessError.scala | 4 +- .../scala/sbt/complete/TokenCompletions.scala | 6 +-- .../main/scala/sbt/complete/UpperBound.scala | 4 +- .../src/test/scala/ParserTest.scala | 8 ++-- .../src/main/scala/sbt/BufferedLogger.scala | 7 +++- .../src/main/scala/sbt/ConsoleLogger.scala | 8 ++-- .../src/main/scala/sbt/ConsoleOut.scala | 2 +- .../src/main/scala/sbt/LoggerWriter.scala | 3 +- .../src/main/scala/sbt/StackTrace.scala | 3 +- .../src/test/scala/LogWriterTest.scala | 3 +- .../src/test/scala/sbt/logic/Test.scala | 5 ++- .../src/main/scala/sbt/ChangeReport.scala | 2 +- project/Dependencies.scala | 2 - 31 files changed, 92 insertions(+), 74 deletions(-) 
diff --git a/build.sbt b/build.sbt index cf41630d1..7f9a30bf1 100644 --- a/build.sbt +++ b/build.sbt @@ -25,8 +25,22 @@ def commonSettings: Seq[Setting[_]] = Seq( // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), - incOptions := incOptions.value.withNameHashing(true), - crossScalaVersions := Seq(scala210, scala211) + crossScalaVersions := Seq(scala210, scala211), + scalacOptions ++= Seq( + "-encoding", "utf8", + "-deprecation", + "-feature", + "-unchecked", + "-Xlint", + "-language:higherKinds", + "-language:implicitConversions", + // "-Xfuture", + "-Yinline-warnings", + // "-Yfatal-warnings", + "-Yno-adapted-args", + "-Ywarn-dead-code", + "-Ywarn-numeric-widen", + "-Ywarn-value-discard") // bintrayPackage := (bintrayPackage in ThisBuild).value, // bintrayRepository := (bintrayRepository in ThisBuild).value ) @@ -56,33 +70,20 @@ lazy val utilInterface = (project in internalPath / "util-interface"). commonSettings, javaOnlySettings, name := "Util Interface", - // projectComponent, exportJars := true - // resourceGenerators in Compile <+= (version, resourceManaged, streams, compile in Compile) map generateVersionFile, - // apiDefinitions <<= baseDirectory map { base => (base / "definition") :: (base / "other") :: (base / "type") :: Nil }, - // sourceGenerators in Compile <+= (apiDefinitions, - // fullClasspath in Compile in datatypeProj, - // sourceManaged in Compile, - // mainClass in datatypeProj in Compile, - // runner, - // streams) map generateAPICached ) lazy val utilControl = (project in internalPath / "util-control"). settings( commonSettings, - // Util.crossBuild, - name := "Util Control", - crossScalaVersions := Seq(scala210, scala211) + name := "Util Control" ) lazy val utilCollection = (project in internalPath / "util-collection"). 
settings( testedBaseSettings, Util.keywordsSettings, - // Util.crossBuild, - name := "Util Collection", - crossScalaVersions := Seq(scala210, scala211) + name := "Util Collection" ) lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). @@ -98,7 +99,6 @@ lazy val utilComplete = (project in internalPath / "util-complete"). dependsOn(utilCollection, utilControl). settings( testedBaseSettings, - // Util.crossBuild, name := "Util Completion", libraryDependencies ++= Seq(jline, sbtIO), crossScalaVersions := Seq(scala210, scala211) @@ -144,5 +144,5 @@ lazy val utilTracking = (project in internalPath / "util-tracking"). settings( commonSettings, name := "Util Tracking", - libraryDependencies ++= Seq(sbtIO) + libraryDependencies += sbtIO ) diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala index a2f1e4e47..1db77f24d 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala @@ -36,7 +36,7 @@ object ContextUtil { } // TODO 2.11 Remove this after dropping 2.10.x support. -private object HasCompat { val compat = ??? }; import HasCompat._ +private object HasCompat { val compat = this }; import HasCompat._ /** * Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). 
@@ -134,10 +134,14 @@ final class ContextUtil[C <: Context](val ctx: C) { def mkTuple(args: List[Tree]): Tree = global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] - def setSymbol[Tree](t: Tree, sym: Symbol): Unit = + def setSymbol[Tree](t: Tree, sym: Symbol): Unit = { t.asInstanceOf[global.Tree].setSymbol(sym.asInstanceOf[global.Symbol]) - def setInfo[Tree](sym: Symbol, tpe: Type): Unit = + () + } + def setInfo[Tree](sym: Symbol, tpe: Type): Unit = { sym.asInstanceOf[global.Symbol].setInfo(tpe.asInstanceOf[global.Type]) + () + } /** Creates a new, synthetic type variable with the specified `owner`. */ def newTypeVariable(owner: Symbol, prefix: String = "T0"): TypeSymbol = diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala index b5c2878f3..147f622bf 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala @@ -9,7 +9,7 @@ import macros._ /** A `TupleBuilder` that uses a KList as the tuple representation.*/ object KListBuilder extends TupleBuilder { // TODO 2.11 Remove this after dropping 2.10.x support. - private object HasCompat { val compat = ??? 
}; import HasCompat._ + private object HasCompat { val compat = this }; import HasCompat._ def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val ctx: c.type = c diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala index 232174c81..ddef0bee3 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala @@ -16,7 +16,7 @@ object TupleNBuilder extends TupleBuilder { final val TupleMethodName = "tuple" // TODO 2.11 Remove this after dropping 2.10.x support. - private object HasCompat { val compat = ??? }; import HasCompat._ + private object HasCompat { val compat = this }; import HasCompat._ def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val util = ContextUtil[c.type](c) diff --git a/internal/util-cache/src/main/scala/sbt/Cache.scala b/internal/util-cache/src/main/scala/sbt/Cache.scala index bdfd8cb51..2547ddb4a 100644 --- a/internal/util-cache/src/main/scala/sbt/Cache.scala +++ b/internal/util-cache/src/main/scala/sbt/Cache.scala @@ -9,6 +9,7 @@ import java.net.{ URI, URL } import Types.:+: import DefaultProtocol.{ asProduct2, asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap } import scala.xml.NodeSeq +import scala.language.existentials trait Cache[I, O] { def apply(file: File)(i: I): Either[O, O => Unit] @@ -200,7 +201,7 @@ trait UnionImplicits { def convert(in: UB) = uc.find(in) def read(in: Input) = { - val index = ByteFormat.reads(in) + val index = ByteFormat.reads(in).toInt val (cache, clazz) = uc.at(index) val value = cache.read(in) new Found[cache.Internal](cache, clazz, value, index) diff --git a/internal/util-cache/src/test/scala/CacheTest.scala b/internal/util-cache/src/test/scala/CacheTest.scala index 
ca66ba925..9208666d5 100644 --- a/internal/util-cache/src/test/scala/CacheTest.scala +++ b/internal/util-cache/src/test/scala/CacheTest.scala @@ -27,5 +27,6 @@ object CacheTest // extends Properties("Cache test") (len + 1) :+: file :+: HNil } c(create :+: fileLength :+: HNil) + () } } diff --git a/internal/util-collection/src/main/scala/sbt/Dag.scala b/internal/util-collection/src/main/scala/sbt/Dag.scala index 118cd0dff..3eb3d8ccb 100644 --- a/internal/util-collection/src/main/scala/sbt/Dag.scala +++ b/internal/util-collection/src/main/scala/sbt/Dag.scala @@ -26,6 +26,7 @@ object Dag { discovered(node) = true; try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } finished += node + () } else if (!finished(node)) throw new Cyclic(node) } diff --git a/internal/util-collection/src/main/scala/sbt/IDSet.scala b/internal/util-collection/src/main/scala/sbt/IDSet.scala index 4f5245a26..7c21c48a7 100644 --- a/internal/util-collection/src/main/scala/sbt/IDSet.scala +++ b/internal/util-collection/src/main/scala/sbt/IDSet.scala @@ -33,7 +33,7 @@ object IDSet { def apply(t: T) = contains(t) def contains(t: T) = backing.containsKey(t) def foreach(f: T => Unit) = all foreach f - def +=(t: T) = backing.put(t, Dummy) + def +=(t: T) = { backing.put(t, Dummy); () } def ++=(t: Iterable[T]) = t foreach += def -=(t: T) = if (backing.remove(t) eq null) false else true def all = collection.JavaConversions.collectionAsScalaIterable(backing.keySet) diff --git a/internal/util-collection/src/main/scala/sbt/INode.scala b/internal/util-collection/src/main/scala/sbt/INode.scala index ce39fadad..e55ee9683 100644 --- a/internal/util-collection/src/main/scala/sbt/INode.scala +++ b/internal/util-collection/src/main/scala/sbt/INode.scala @@ -79,7 +79,7 @@ abstract class EvaluateSettings[Scope] { workComplete() } - private[this] def startWork(): Unit = running.incrementAndGet() + private[this] def startWork(): Unit = { running.incrementAndGet(); () } private[this] def 
workComplete(): Unit = if (running.decrementAndGet() == 0) complete.put(None) @@ -154,6 +154,7 @@ abstract class EvaluateSettings[Scope] { case Evaluated => submitCallComplete(by, value) case _ => calledBy += by } + () } protected def dependsOn: Seq[INode[_]] protected def evaluate0(): Unit diff --git a/internal/util-collection/src/main/scala/sbt/PMap.scala b/internal/util-collection/src/main/scala/sbt/PMap.scala index cf0454fd9..979e776fa 100644 --- a/internal/util-collection/src/main/scala/sbt/PMap.scala +++ b/internal/util-collection/src/main/scala/sbt/PMap.scala @@ -15,7 +15,7 @@ trait RMap[K[_], V[_]] { def values: Iterable[V[_]] def isEmpty: Boolean - final case class TPair[T](key: K[T], value: V[T]) + sealed case class TPair[T](key: K[T], value: V[T]) } trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K, V] { diff --git a/internal/util-collection/src/main/scala/sbt/Settings.scala b/internal/util-collection/src/main/scala/sbt/Settings.scala index eb4227d09..018d5dfc0 100644 --- a/internal/util-collection/src/main/scala/sbt/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/Settings.scala @@ -19,7 +19,7 @@ sealed trait Settings[Scope] { private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] { def scopes: Set[Scope] = data.keySet def keys(scope: Scope) = data(scope).keys.toSet - def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) } toSeq + def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) }.toSeq def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = delegates(scope).toStream.flatMap(sc => getDirect(sc, key)).headOption @@ -42,7 +42,7 @@ trait Init[Scope] { /** The Show instance used when a detailed String needs to be generated. 
It is typically used when no context is available.*/ def showFullKey: Show[ScopedKey[_]] - final case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) extends KeyedInitialize[T] { + sealed case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) extends KeyedInitialize[T] { def scopedKey = this } @@ -154,9 +154,9 @@ trait Init[Scope] { def compile(sMap: ScopedMap): CompiledMap = sMap.toTypedSeq.map { case sMap.TPair(k, ss) => - val deps = ss flatMap { _.dependencies } toSet; + val deps = ss.flatMap(_.dependencies).toSet (k, new Compiled(k, deps, ss)) - } toMap; + }.toMap def grouped(init: Seq[Setting[_]]): ScopedMap = ((IMap.empty: ScopedMap) /: init)((m, s) => add(m, s)) @@ -445,7 +445,7 @@ trait Init[Scope] { def join: Initialize[Seq[T]] = uniform(s)(idFun) } def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = uniform(inits)(idFun) - def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]] = + def joinAny[M[_], T](inits: Seq[Initialize[M[T]]]): Initialize[Seq[M[_]]] = join(inits.asInstanceOf[Seq[Initialize[M[Any]]]]).asInstanceOf[Initialize[Seq[M[T] forSome { type T }]]] } object SettingsDefinition { diff --git a/internal/util-collection/src/main/scala/sbt/Signal.scala b/internal/util-collection/src/main/scala/sbt/Signal.scala index e8c9e7e6c..5aa5fc86e 100644 --- a/internal/util-collection/src/main/scala/sbt/Signal.scala +++ b/internal/util-collection/src/main/scala/sbt/Signal.scala @@ -37,6 +37,7 @@ object Signals { object unregisterNewHandler extends Registration { override def remove(): Unit = { Signal.handle(intSignal, oldHandler) + () } } unregisterNewHandler @@ -80,6 +81,6 @@ private final class Signals0 { try Right(action()) catch { case e: LinkageError => Left(e) } - finally Signal.handle(intSignal, oldHandler) + finally { Signal.handle(intSignal, oldHandler); () } } -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/LineReader.scala 
b/internal/util-complete/src/main/scala/sbt/LineReader.scala index b85190f92..41679ab6e 100644 --- a/internal/util-complete/src/main/scala/sbt/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/LineReader.scala @@ -65,7 +65,7 @@ private object JLine { // translate explicit class names to type in order to support // older Scala, since it shaded classes but not the system property - private[sbt] def fixTerminalProperty() { + private[sbt] def fixTerminalProperty(): Unit = { val newValue = System.getProperty(TerminalProperty) match { case "jline.UnixTerminal" => "unix" case null if System.getProperty("sbt.cygwin") != null => "unix" @@ -75,6 +75,7 @@ private object JLine { case x => x } if (newValue != null) System.setProperty(TerminalProperty, newValue) + () } // When calling this, ensure that enableEcho has been or will be called. diff --git a/internal/util-complete/src/main/scala/sbt/complete/History.scala b/internal/util-complete/src/main/scala/sbt/complete/History.scala index 26d0a27c6..614244b9e 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/History.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/History.scala @@ -13,7 +13,7 @@ final class History private (val lines: IndexedSeq[String], val path: Option[Fil def all: Seq[String] = lines def size = lines.length def !! 
: Option[String] = !-(1) - def apply(i: Int): Option[String] = if (0 <= i && i < size) Some(lines(i)) else { sys.error("Invalid history index: " + i); None } + def apply(i: Int): Option[String] = if (0 <= i && i < size) Some(lines(i)) else { sys.error("Invalid history index: " + i) } def !(i: Int): Option[String] = apply(i) def !(s: String): Option[String] = @@ -26,14 +26,13 @@ final class History private (val lines: IndexedSeq[String], val path: Option[Fil def !?(s: String): Option[String] = nonEmpty(s) { reversed.drop(1).find(_.contains(s)) } private def nonEmpty[T](s: String)(act: => Option[T]): Option[T] = - if (s.isEmpty) { + if (s.isEmpty) sys.error("No action specified to history command") - None - } else + else act def list(historySize: Int, show: Int): Seq[String] = - lines.toList.drop((lines.size - historySize) max 0).zipWithIndex.map { case (line, number) => " " + number + " " + line }.takeRight(show max 1) + lines.toList.drop(scala.math.max(0, lines.size - historySize)).zipWithIndex.map { case (line, number) => " " + number + " " + line }.takeRight(show max 1) } object History { @@ -42,4 +41,4 @@ object History { def number(s: String): Option[Int] = try { Some(s.toInt) } catch { case e: NumberFormatException => None } -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala index ff58fe9d2..b66d3272e 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala @@ -45,7 +45,7 @@ object HistoryCommands { val MaxLines = 500 lazy val num = token(NatBasic, "") - lazy val last = Last ^^^ { execute(_ !!) } + lazy val last = Last ^^^ { execute(_.!!) } lazy val list = ListCommands ~> (num ?? 
Int.MaxValue) map { show => (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } } diff --git a/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala b/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala index fed89541f..2445fd111 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala @@ -149,7 +149,7 @@ object JLineCompletion { def commonPrefix(s: Seq[String]): String = if (s.isEmpty) "" else s reduceLeft commonPrefix def commonPrefix(a: String, b: String): String = { - val len = a.length min b.length + val len = scala.math.min(a.length, b.length) @tailrec def loop(i: Int): Int = if (i >= len) len else if (a(i) != b(i)) i else loop(i + 1) a.substring(0, loop(0)) } diff --git a/internal/util-complete/src/main/scala/sbt/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/complete/Parser.scala index 892119f6c..7ae7ea0fd 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/Parser.scala @@ -300,9 +300,9 @@ trait ParserMain { def !!!(msg: String): Parser[A] = onFailure(a, msg) def failOnException: Parser[A] = trapAndFail(a) - def unary_- = not(a) + def unary_- = not(a, "Unexpected: " + a) def &(o: Parser[_]) = and(a, o) - def -(o: Parser[_]) = sub(a, o) + def -(o: Parser[_]) = and(a, not(o, "Unexpected: " + o)) def examples(s: String*): Parser[A] = examples(s.toSet) def examples(s: Set[String], check: Boolean = false): Parser[A] = examples(new FixedSetExamples(s), s.size, check) def examples(s: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = Parser.examples(a, s, maxNumberOfExamples, removeInvalidExamples) @@ -366,7 +366,7 @@ trait ParserMain { def result = None def resultEmpty = mkFailure("Expected '" + ch + "'") def derive(c: Char) = if (c == ch) success(ch) else new 
Invalid(resultEmpty) - def completions(level: Int) = Completions.single(Completion.suggestStrict(ch.toString)) + def completions(level: Int) = Completions.single(Completion.suggestion(ch.toString)) override def toString = "'" + ch + "'" } /** Presents a literal String `s` as a Parser that only parses that exact text and provides it as the result.*/ @@ -812,7 +812,7 @@ private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], m case None => repeatDerive(c, accumulatedReverse) } - def repeatDerive(c: Char, accRev: List[T]): Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, (min - 1) max 0, max.decrement, accRev) + def repeatDerive(c: Char, accRev: List[T]): Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, scala.math.max(0, min - 1), max.decrement, accRev) def completions(level: Int) = { diff --git a/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala b/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala index 3183929e8..af5849870 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala @@ -11,7 +11,7 @@ import java.lang.Character.{ getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATIO /** Provides standard implementations of commonly useful [[Parser]]s. */ trait Parsers { /** Matches the end of input, providing no useful result on success. */ - lazy val EOF = not(any) + lazy val EOF = not(any, "Expected EOF") /** Parses any single character and provides that character as the result. 
*/ lazy val any: Parser[Char] = charClass(_ => true, "any character") @@ -265,4 +265,4 @@ object DefaultParsers extends Parsers with ParserMain { /** Returns `true` if `s` parses successfully according to [[ID]].*/ def validID(s: String): Boolean = matches(ID, s) -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala b/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala index 7e6c9794e..d85e523c9 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala @@ -25,5 +25,5 @@ object ProcessError { s.substring(i + 1) loop(s.length - 1) } - def pointerSpace(s: String, i: Int): String = (s take i) map { case '\t' => '\t'; case _ => ' ' } mkString; -} \ No newline at end of file + def pointerSpace(s: String, i: Int): String = (s take i) map { case '\t' => '\t'; case _ => ' ' } mkString "" +} diff --git a/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala b/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala index 96e70d2f1..1285507cc 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala @@ -1,6 +1,6 @@ package sbt.complete -import Completion.{ displayStrict, token => ctoken, tokenDisplay } +import Completion.{ token => ctoken, tokenDisplay } sealed trait TokenCompletions { def hideWhen(f: Int => Boolean): TokenCompletions @@ -24,7 +24,7 @@ object TokenCompletions { val default: TokenCompletions = mapDelegateCompletions((seen, level, c) => ctoken(seen, c.append)) def displayOnly(msg: String): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = Completions.single(displayStrict(msg)) + def completions(seen: String, level: Int) = Completions.single(Completion.displayOnly(msg)) } def overrideDisplay(msg: String): 
TokenCompletions = mapDelegateCompletions((seen, level, c) => tokenDisplay(display = msg, append = c.append)) @@ -34,4 +34,4 @@ object TokenCompletions { def mapDelegateCompletions(f: (String, Int, Completion) => Completion): TokenCompletions = new Delegating { def completions(seen: String, level: Int, delegate: Completions) = Completions(delegate.get.map(c => f(seen, level, c))) } -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala b/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala index 66a32e1a2..2d954d4c5 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala +++ b/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala @@ -38,10 +38,10 @@ final case class Finite(value: Int) extends UpperBound { def >=(min: Int) = value >= min def isOne = value == 1 def isZero = value == 0 - def decrement = Finite((value - 1) max 0) + def decrement = Finite(scala.math.max(0, value - 1)) def isInfinite = false override def toString = value.toString } object UpperBound { implicit def intToFinite(i: Int): Finite = Finite(i) -} \ No newline at end of file +} diff --git a/internal/util-complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala index f02431e53..c70c6207b 100644 --- a/internal/util-complete/src/test/scala/ParserTest.scala +++ b/internal/util-complete/src/test/scala/ParserTest.scala @@ -109,13 +109,13 @@ object ParserTest extends Properties("Completing Parser") { property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) } object ParserExample { - val ws = charClass(_.isWhitespace)+ - val notws = charClass(!_.isWhitespace)+ + val ws = charClass(_.isWhitespace).+ + val notws = charClass(!_.isWhitespace).+ val name = token("test") - val options = (ws ~> token("quick" | "failed" | "new"))* + val options = (ws ~> token("quick" | "failed" | "new")).* val exampleSet = Set("am", "is", 
"are", "was", "were") - val include = (ws ~> token(examples(notws.string, new FixedSetExamples(exampleSet), exampleSet.size, false)))* + val include = (ws ~> token(examples(notws.string, new FixedSetExamples(exampleSet), exampleSet.size, false))).* val t = name ~ options ~ include diff --git a/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala index 08a60577d..488de77bb 100644 --- a/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala @@ -52,6 +52,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { buffer += new SetLevel(newLevel) else delegate.setLevel(newLevel) + () } override def setSuccessEnabled(flag: Boolean): Unit = synchronized { super.setSuccessEnabled(flag) @@ -59,6 +60,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { buffer += new SetSuccess(flag) else delegate.setSuccessEnabled(flag) + () } override def setTrace(level: Int): Unit = synchronized { super.setTrace(level) @@ -66,6 +68,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { buffer += new SetTrace(level) else delegate.setTrace(level) + () } def trace(t: => Throwable): Unit = @@ -79,6 +82,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { buffer ++= events else delegate.logAll(events) + () } def control(event: ControlEvent.Value, message: => String): Unit = doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message)) @@ -91,5 +95,6 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { else doUnbuffered(delegate) } + () } -} \ No newline at end of file +} diff --git a/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala index c1558cc66..7ecb7d186 100644 --- a/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala +++ 
b/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala @@ -34,8 +34,7 @@ object ConsoleLogger { * * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes */ - @deprecated("No longer public.", "0.13.8") - def isEscapeTerminator(c: Char): Boolean = + private[sbt] def isEscapeTerminator(c: Char): Boolean = c >= '@' && c <= '~' /** @@ -83,9 +82,10 @@ object ConsoleLogger { } private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder): Unit = { val escIndex = s.indexOf(ESC, start) - if (escIndex < 0) + if (escIndex < 0) { sb.append(s, start, s.length) - else { + () + } else { sb.append(s, start, escIndex) val next: Int = // If it's a CSI we skip past it and then look for a terminator. diff --git a/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala index 3ce0d8cdf..3d2c15abe 100644 --- a/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala @@ -29,7 +29,7 @@ object ConsoleOut { val lockObject = System.out private[this] var last: Option[String] = None private[this] var current = new java.lang.StringBuffer - def print(s: String): Unit = synchronized { current.append(s) } + def print(s: String): Unit = synchronized { current.append(s); () } def println(s: String): Unit = synchronized { current.append(s); println() } def println(): Unit = synchronized { val s = current.toString diff --git a/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala b/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala index 9be8af409..a106ff605 100644 --- a/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala +++ b/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala @@ -43,7 +43,8 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S } } private[this] def log(s: String): Unit = unbufferedLevel match { - case None => lines += s + case None => + lines += s; () case Some(level) 
=> delegate.log(level, s) } } diff --git a/internal/util-logging/src/main/scala/sbt/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/StackTrace.scala index 70554c5ec..d6504cb10 100644 --- a/internal/util-logging/src/main/scala/sbt/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/StackTrace.scala @@ -34,6 +34,7 @@ object StackTrace { b.append("\tat ") b.append(e) b.append('\n') + () } if (!first) @@ -59,4 +60,4 @@ object StackTrace { b.toString() } -} \ No newline at end of file +} diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala index ce96e9bc9..7c0125fba 100644 --- a/internal/util-logging/src/test/scala/LogWriterTest.scala +++ b/internal/util-logging/src/test/scala/LogWriterTest.scala @@ -120,7 +120,8 @@ object Escape { { val builder = new StringBuilder(s.length) for (c <- s) { - def escaped = pad(c.toInt.toHexString.toUpperCase, 4, '0') + val char = c.toInt + def escaped = pad(char.toHexString.toUpperCase, 4, '0') if (c < 20) builder.append("\\u").append(escaped) else builder.append(c) } builder.toString diff --git a/internal/util-logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala index da1f0b706..f9957414b 100644 --- a/internal/util-logic/src/test/scala/sbt/logic/Test.scala +++ b/internal/util-logic/src/test/scala/sbt/logic/Test.scala @@ -26,7 +26,10 @@ object LogicTest extends Properties("Logic") { case Left(err) => false case Right(res) => val actual = res.provenSet - (actual == expected) || sys.error(s"Expected to prove $expected, but actually proved $actual") + if (actual != expected) + sys.error(s"Expected to prove $expected, but actually proved $actual") + else + true } } diff --git a/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala b/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala index 8502f9d3f..25e3ca0ff 100644 --- 
a/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala +++ b/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala @@ -67,4 +67,4 @@ private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) ex lazy val modified = a.modified ++ b.modified lazy val added = a.added ++ b.added lazy val removed = a.removed ++ b.removed -} \ No newline at end of file +} diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 0313a7267..7da889422 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,8 +5,6 @@ object Dependencies { lazy val scala210 = "2.10.5" lazy val scala211 = "2.11.7" - val bootstrapSbtVersion = "0.13.8" - lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M1" lazy val jline = "jline" % "jline" % "2.11" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" From cf890a84a86f27ef1619a9c67643324f2f2ebfb9 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 2 Sep 2015 02:44:51 -0400 Subject: [PATCH 535/823] organization := "org.scala-sbt" --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index f4ebbe7a4..646fb5f6c 100644 --- a/build.sbt +++ b/build.sbt @@ -28,7 +28,7 @@ lazy val utilRoot: Project = (project in file(".")). ). 
settings( inThisBuild(Seq( - organization := "org.scala-sbt.util", + organization := "org.scala-sbt", version := "0.1.0-SNAPSHOT", homepage := Some(url("https://github.com/sbt/util")), description := "Util module for sbt", From 624d2320f64d184a71f3d30a5ae0a2345a5d12fd Mon Sep 17 00:00:00 2001 From: xuwei-k <6b656e6a69@gmail.com> Date: Fri, 4 Sep 2015 00:28:16 +0900 Subject: [PATCH 536/823] specs2 and scalatest should be the "test" scope --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 0313a7267..22ac8a9c3 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -30,5 +30,5 @@ object Dependencies { lazy val testDependencies = libraryDependencies ++= Seq( scalaCheck, specs2 - ) + ).map(_ % "test") } From ef14f9dc0378e3b1b82ea1fee48d7178a4f5dc40 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 4 Sep 2015 01:48:44 -0400 Subject: [PATCH 537/823] publish tests --- build.sbt | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 646fb5f6c..4aae71c49 100644 --- a/build.sbt +++ b/build.sbt @@ -14,7 +14,9 @@ def commonSettings: Seq[Setting[_]] = Seq( incOptions := incOptions.value.withNameHashing(true), crossScalaVersions := Seq(scala210, scala211), bintrayPackage := (bintrayPackage in ThisBuild).value, - bintrayRepository := (bintrayRepository in ThisBuild).value + bintrayRepository := (bintrayRepository in ThisBuild).value, + publishArtifact in Compile := true, + publishArtifact in Test := true ) def testedBaseSettings: Seq[Setting[_]] = From c9d7a0964aa4267956c6dca6e50c9e6a54a96768 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 4 Sep 2015 06:59:57 -0400 Subject: [PATCH 538/823] Removes a warning. Ref #4. Inferred type was existential. This tightens it. 
/review @Duhemm, @dwijnand --- internal/util-tracking/src/main/scala/sbt/ChangeReport.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala b/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala index 25e3ca0ff..18db36a43 100644 --- a/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala +++ b/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala @@ -4,13 +4,13 @@ package sbt object ChangeReport { - def modified[T](files: Set[T]) = + def modified[T](files: Set[T]): ChangeReport[T] = new EmptyChangeReport[T] { override def checked = files override def modified = files override def markAllModified = this } - def unmodified[T](files: Set[T]) = + def unmodified[T](files: Set[T]): ChangeReport[T] = new EmptyChangeReport[T] { override def checked = files override def unmodified = files From d482668c080900e9cda1ae13f3361cfc77c45e2f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 4 Sep 2015 07:25:16 -0400 Subject: [PATCH 539/823] Removes a warning. Ref #4 Exhaustion check was not picking up StaticScopes, which is an object. 
--- internal/util-collection/src/main/scala/sbt/INode.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-collection/src/main/scala/sbt/INode.scala b/internal/util-collection/src/main/scala/sbt/INode.scala index e55ee9683..53f45e9f1 100644 --- a/internal/util-collection/src/main/scala/sbt/INode.scala +++ b/internal/util-collection/src/main/scala/sbt/INode.scala @@ -27,7 +27,6 @@ abstract class EvaluateSettings[Scope] { case k: Keyed[s, T] @unchecked => single(getStatic(k.scopedKey), k.transform) case a: Apply[k, T] @unchecked => new MixedNode[k, T](a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) case b: Bind[s, T] @unchecked => new BindNode[s, T](transform(b.in), x => transform(b.f(x))) - case init.StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] case v: Value[T] @unchecked => constant(v.value) case v: ValidationCapture[T] @unchecked => strictConstant(v.key) case t: TransformCapture => strictConstant(t.f) @@ -35,6 +34,7 @@ abstract class EvaluateSettings[Scope] { case None => constant(() => o.f(None)) case Some(i) => single[s, T](transform(i), x => o.f(Some(x))) } + case x if x == StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] } } private[this] lazy val roots: Seq[INode[_]] = compiledSettings flatMap { cs => From bc54e035ef9810cceab1e5c4a8aa1b6d0ba6d67b Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 4 Sep 2015 12:54:09 -0400 Subject: [PATCH 540/823] Move util into sbt.util.internal package --- .../sbt/{ => util/internal}/appmacro/ContextUtil.scala | 2 +- .../scala/sbt/{ => util/internal}/appmacro/Convert.scala | 2 +- .../scala/sbt/{ => util/internal}/appmacro/Instance.scala | 2 +- .../sbt/{ => util/internal}/appmacro/KListBuilder.scala | 2 +- .../sbt/{ => util/internal}/appmacro/MixedBuilder.scala | 2 +- .../sbt/{ => 
util/internal}/appmacro/TupleBuilder.scala | 2 +- .../sbt/{ => util/internal}/appmacro/TupleNBuilder.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Cache.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/CacheIO.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/FileInfo.scala | 2 +- .../scala/sbt/{ => util/internal}/SeparatedCache.scala | 2 +- internal/util-cache/src/test/scala/CacheTest.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/AList.scala | 2 +- .../main/scala/sbt/{ => util/internal}/Attributes.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Classes.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Dag.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/HList.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/IDSet.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/INode.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/KList.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/PMap.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Param.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Positions.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Settings.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Show.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/ShowLines.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Signal.scala | 2 +- .../main/scala/sbt/{ => util/internal}/TypeFunctions.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Types.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Util.scala | 2 +- .../util-collection/src/test/scala/DagSpecification.scala | 2 +- internal/util-collection/src/test/scala/KeyTest.scala | 2 +- internal/util-collection/src/test/scala/LiteralTest.scala | 2 +- internal/util-collection/src/test/scala/PMapTest.scala | 2 +- .../util-collection/src/test/scala/SettingsExample.scala | 2 +- internal/util-collection/src/test/scala/SettingsTest.scala | 2 +- .../main/scala/sbt/{ => util/internal}/LineReader.scala | 4 
++-- .../sbt/{ => util/internal}/complete/Completions.scala | 3 ++- .../sbt/{ => util/internal}/complete/EditDistance.scala | 3 ++- .../sbt/{ => util/internal}/complete/ExampleSource.scala | 3 ++- .../scala/sbt/{ => util/internal}/complete/History.scala | 2 +- .../sbt/{ => util/internal}/complete/HistoryCommands.scala | 2 +- .../sbt/{ => util/internal}/complete/JLineCompletion.scala | 3 ++- .../scala/sbt/{ => util/internal}/complete/Parser.scala | 7 ++++--- .../scala/sbt/{ => util/internal}/complete/Parsers.scala | 3 ++- .../sbt/{ => util/internal}/complete/ProcessError.scala | 3 ++- .../{ => util/internal}/complete/TokenCompletions.scala | 3 ++- .../sbt/{ => util/internal}/complete/TypeString.scala | 3 ++- .../sbt/{ => util/internal}/complete/UpperBound.scala | 3 ++- internal/util-complete/src/test/scala/ParserTest.scala | 3 ++- .../src/test/scala/sbt/complete/FileExamplesTest.scala | 3 ++- .../src/test/scala/sbt/complete/FixedSetExamplesTest.scala | 3 ++- .../test/scala/sbt/complete/ParserWithExamplesTest.scala | 3 ++- .../main/scala/sbt/{ => util/internal}/ErrorHandling.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/ExitHook.scala | 2 +- .../sbt/{ => util/internal}/MessageOnlyException.scala | 2 +- .../main/scala/sbt/{ => util/internal}/BasicLogger.scala | 2 +- .../scala/sbt/{ => util/internal}/BufferedLogger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/ConsoleLogger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/ConsoleOut.scala | 2 +- .../main/scala/sbt/{ => util/internal}/FilterLogger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/FullLogger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/GlobalLogging.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Level.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/LogEvent.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Logger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/LoggerWriter.scala | 2 +- .../main/scala/sbt/{ => util/internal}/MainLogging.scala | 2 +- 
.../main/scala/sbt/{ => util/internal}/MultiLogger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/StackTrace.scala | 2 +- internal/util-logging/src/test/scala/Escapes.scala | 2 +- internal/util-logging/src/test/scala/LogWriterTest.scala | 2 +- internal/util-logging/src/test/scala/TestLogger.scala | 2 +- .../main/scala/sbt/{ => util/internal}/logic/Logic.scala | 2 +- internal/util-logic/src/test/scala/sbt/logic/Test.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Relation.scala | 2 +- internal/util-relation/src/test/scala/RelationTest.scala | 2 +- .../main/scala/sbt/{ => util/internal}/ChangeReport.scala | 2 +- .../src/main/scala/sbt/{ => util/internal}/Tracked.scala | 2 +- project/Util.scala | 2 +- 80 files changed, 97 insertions(+), 83 deletions(-) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/ContextUtil.scala (99%) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/Convert.scala (98%) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/Instance.scala (99%) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/KListBuilder.scala (99%) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/MixedBuilder.scala (95%) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/TupleBuilder.scala (99%) rename internal/util-appmacro/src/main/scala/sbt/{ => util/internal}/appmacro/TupleNBuilder.scala (98%) rename internal/util-cache/src/main/scala/sbt/{ => util/internal}/Cache.scala (99%) rename internal/util-cache/src/main/scala/sbt/{ => util/internal}/CacheIO.scala (98%) rename internal/util-cache/src/main/scala/sbt/{ => util/internal}/FileInfo.scala (99%) rename internal/util-cache/src/main/scala/sbt/{ => util/internal}/SeparatedCache.scala (98%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/AList.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => 
util/internal}/Attributes.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Classes.scala (97%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Dag.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/HList.scala (96%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/IDSet.scala (98%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/INode.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/KList.scala (98%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/PMap.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Param.scala (95%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Positions.scala (94%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Settings.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Show.scala (84%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/ShowLines.scala (92%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Signal.scala (99%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/TypeFunctions.scala (98%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Types.scala (88%) rename internal/util-collection/src/main/scala/sbt/{ => util/internal}/Util.scala (98%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/LineReader.scala (97%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/Completions.scala (99%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/EditDistance.scala (96%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/ExampleSource.scala (98%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/History.scala (98%) rename 
internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/HistoryCommands.scala (98%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/JLineCompletion.scala (99%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/Parser.scala (99%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/Parsers.scala (99%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/ProcessError.scala (95%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/TokenCompletions.scala (97%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/TypeString.scala (98%) rename internal/util-complete/src/main/scala/sbt/{ => util/internal}/complete/UpperBound.scala (97%) rename internal/util-control/src/main/scala/sbt/{ => util/internal}/ErrorHandling.scala (97%) rename internal/util-control/src/main/scala/sbt/{ => util/internal}/ExitHook.scala (96%) rename internal/util-control/src/main/scala/sbt/{ => util/internal}/MessageOnlyException.scala (97%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/BasicLogger.scala (96%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/BufferedLogger.scala (99%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/ConsoleLogger.scala (99%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/ConsoleOut.scala (98%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/FilterLogger.scala (97%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/FullLogger.scala (97%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/GlobalLogging.scala (98%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/Level.scala (97%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/LogEvent.scala (95%) rename internal/util-logging/src/main/scala/sbt/{ => 
util/internal}/Logger.scala (99%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/LoggerWriter.scala (98%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/MainLogging.scala (97%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/MultiLogger.scala (98%) rename internal/util-logging/src/main/scala/sbt/{ => util/internal}/StackTrace.scala (98%) rename internal/util-logic/src/main/scala/sbt/{ => util/internal}/logic/Logic.scala (99%) rename internal/util-relation/src/main/scala/sbt/{ => util/internal}/Relation.scala (99%) rename internal/util-tracking/src/main/scala/sbt/{ => util/internal}/ChangeReport.scala (99%) rename internal/util-tracking/src/main/scala/sbt/{ => util/internal}/Tracked.scala (99%) diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/ContextUtil.scala similarity index 99% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/ContextUtil.scala index 1db77f24d..4bdf92dc7 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/ContextUtil.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/ContextUtil.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal package appmacro import scala.reflect._ diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/Convert.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Convert.scala similarity index 98% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/Convert.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Convert.scala index 3a2e562a6..ecfa0ea4f 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/Convert.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Convert.scala @@ -1,4 +1,4 @@ -package sbt +package 
sbt.util.internal package appmacro import scala.reflect._ diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/Instance.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Instance.scala similarity index 99% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/Instance.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Instance.scala index 7a63feca5..111fbc3ca 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/Instance.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Instance.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal package appmacro import Classes.Applicative diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/KListBuilder.scala similarity index 99% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/KListBuilder.scala index 147f622bf..3f0a12e28 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/KListBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/KListBuilder.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal package appmacro import Types.Id diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/MixedBuilder.scala similarity index 95% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/MixedBuilder.scala index 019dc8b20..c99610275 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/MixedBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/MixedBuilder.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal package appmacro 
import scala.reflect._ diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleBuilder.scala similarity index 99% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleBuilder.scala index a6ea2d84c..ab87ead81 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleBuilder.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal package appmacro import Types.Id diff --git a/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleNBuilder.scala similarity index 98% rename from internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala rename to internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleNBuilder.scala index ddef0bee3..7c582328e 100644 --- a/internal/util-appmacro/src/main/scala/sbt/appmacro/TupleNBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleNBuilder.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal package appmacro import Types.Id diff --git a/internal/util-cache/src/main/scala/sbt/Cache.scala b/internal/util-cache/src/main/scala/sbt/util/internal/Cache.scala similarity index 99% rename from internal/util-cache/src/main/scala/sbt/Cache.scala rename to internal/util-cache/src/main/scala/sbt/util/internal/Cache.scala index 2547ddb4a..a42236c57 100644 --- a/internal/util-cache/src/main/scala/sbt/Cache.scala +++ b/internal/util-cache/src/main/scala/sbt/util/internal/Cache.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt +package sbt.util.internal import sbinary.{ CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output => Out 
} import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } diff --git a/internal/util-cache/src/main/scala/sbt/CacheIO.scala b/internal/util-cache/src/main/scala/sbt/util/internal/CacheIO.scala similarity index 98% rename from internal/util-cache/src/main/scala/sbt/CacheIO.scala rename to internal/util-cache/src/main/scala/sbt/util/internal/CacheIO.scala index 3bb952330..7c0ab222d 100644 --- a/internal/util-cache/src/main/scala/sbt/CacheIO.scala +++ b/internal/util-cache/src/main/scala/sbt/util/internal/CacheIO.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt +package sbt.util.internal import java.io.{ File, FileNotFoundException } import sbinary.{ DefaultProtocol, Format, Operations } diff --git a/internal/util-cache/src/main/scala/sbt/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/util/internal/FileInfo.scala similarity index 99% rename from internal/util-cache/src/main/scala/sbt/FileInfo.scala rename to internal/util-cache/src/main/scala/sbt/util/internal/FileInfo.scala index 12002c508..92ed6c8b6 100644 --- a/internal/util-cache/src/main/scala/sbt/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/util/internal/FileInfo.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt +package sbt.util.internal import java.io.{ File, IOException } import sbinary.{ DefaultProtocol, Format } diff --git a/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/util/internal/SeparatedCache.scala similarity index 98% rename from internal/util-cache/src/main/scala/sbt/SeparatedCache.scala rename to internal/util-cache/src/main/scala/sbt/util/internal/SeparatedCache.scala index 1022832d6..03fd2c2e5 100644 --- a/internal/util-cache/src/main/scala/sbt/SeparatedCache.scala +++ b/internal/util-cache/src/main/scala/sbt/util/internal/SeparatedCache.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * 
Copyright 2009 Mark Harrah */ -package sbt +package sbt.util.internal import Types.:+: import sbinary.{ DefaultProtocol, Format, Input, Output => Out } diff --git a/internal/util-cache/src/test/scala/CacheTest.scala b/internal/util-cache/src/test/scala/CacheTest.scala index 9208666d5..d204d1f5e 100644 --- a/internal/util-cache/src/test/scala/CacheTest.scala +++ b/internal/util-cache/src/test/scala/CacheTest.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal import java.io.File import Types.:+: diff --git a/internal/util-collection/src/main/scala/sbt/AList.scala b/internal/util-collection/src/main/scala/sbt/util/internal/AList.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/AList.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/AList.scala index 24368219b..e825b385c 100644 --- a/internal/util-collection/src/main/scala/sbt/AList.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/AList.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal import Classes.Applicative import Types._ diff --git a/internal/util-collection/src/main/scala/sbt/Attributes.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Attributes.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/Attributes.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Attributes.scala index 64f379012..fb4bb8eaa 100644 --- a/internal/util-collection/src/main/scala/sbt/Attributes.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Attributes.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal import Types._ import scala.reflect.Manifest diff --git a/internal/util-collection/src/main/scala/sbt/Classes.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Classes.scala similarity index 97% rename from internal/util-collection/src/main/scala/sbt/Classes.scala 
rename to internal/util-collection/src/main/scala/sbt/util/internal/Classes.scala index 1db644f96..678eb0651 100644 --- a/internal/util-collection/src/main/scala/sbt/Classes.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Classes.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal object Classes { trait Applicative[M[_]] { diff --git a/internal/util-collection/src/main/scala/sbt/Dag.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Dag.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/Dag.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Dag.scala index 3eb3d8ccb..d41730e88 100644 --- a/internal/util-collection/src/main/scala/sbt/Dag.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Dag.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 David MacIver, Mark Harrah */ -package sbt; +package sbt.util.internal trait Dag[Node <: Dag[Node]] { self: Node => diff --git a/internal/util-collection/src/main/scala/sbt/HList.scala b/internal/util-collection/src/main/scala/sbt/util/internal/HList.scala similarity index 96% rename from internal/util-collection/src/main/scala/sbt/HList.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/HList.scala index 23f5488c6..01cded498 100644 --- a/internal/util-collection/src/main/scala/sbt/HList.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/HList.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal import Types._ diff --git a/internal/util-collection/src/main/scala/sbt/IDSet.scala b/internal/util-collection/src/main/scala/sbt/util/internal/IDSet.scala similarity index 98% rename from internal/util-collection/src/main/scala/sbt/IDSet.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/IDSet.scala index 7c21c48a7..cefe13186 100644 --- 
a/internal/util-collection/src/main/scala/sbt/IDSet.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/IDSet.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal /** A mutable set interface that uses object identity to test for set membership.*/ trait IDSet[T] { diff --git a/internal/util-collection/src/main/scala/sbt/INode.scala b/internal/util-collection/src/main/scala/sbt/util/internal/INode.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/INode.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/INode.scala index 53f45e9f1..3e2310f2f 100644 --- a/internal/util-collection/src/main/scala/sbt/INode.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/INode.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal import java.lang.Runnable import java.util.concurrent.{ atomic, Executor, LinkedBlockingQueue } diff --git a/internal/util-collection/src/main/scala/sbt/KList.scala b/internal/util-collection/src/main/scala/sbt/util/internal/KList.scala similarity index 98% rename from internal/util-collection/src/main/scala/sbt/KList.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/KList.scala index 0b09ac9b1..1df8c4df5 100644 --- a/internal/util-collection/src/main/scala/sbt/KList.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/KList.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal import Types._ import Classes.Applicative diff --git a/internal/util-collection/src/main/scala/sbt/PMap.scala b/internal/util-collection/src/main/scala/sbt/util/internal/PMap.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/PMap.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/PMap.scala index 979e776fa..f9ac4b0a0 100644 --- a/internal/util-collection/src/main/scala/sbt/PMap.scala +++ 
b/internal/util-collection/src/main/scala/sbt/util/internal/PMap.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal import collection.mutable diff --git a/internal/util-collection/src/main/scala/sbt/Param.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Param.scala similarity index 95% rename from internal/util-collection/src/main/scala/sbt/Param.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Param.scala index 19d12798a..16bd17e49 100644 --- a/internal/util-collection/src/main/scala/sbt/Param.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Param.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal import Types._ diff --git a/internal/util-collection/src/main/scala/sbt/Positions.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Positions.scala similarity index 94% rename from internal/util-collection/src/main/scala/sbt/Positions.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Positions.scala index 5d7e1915d..fd64e4538 100755 --- a/internal/util-collection/src/main/scala/sbt/Positions.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Positions.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal sealed trait SourcePosition diff --git a/internal/util-collection/src/main/scala/sbt/Settings.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Settings.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/Settings.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Settings.scala index 018d5dfc0..9e6cdf95e 100644 --- a/internal/util-collection/src/main/scala/sbt/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Settings.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ 
-package sbt +package sbt.util.internal import Types._ diff --git a/internal/util-collection/src/main/scala/sbt/Show.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Show.scala similarity index 84% rename from internal/util-collection/src/main/scala/sbt/Show.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Show.scala index 1f8e9703b..930431e13 100644 --- a/internal/util-collection/src/main/scala/sbt/Show.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Show.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal trait Show[T] { def apply(t: T): String diff --git a/internal/util-collection/src/main/scala/sbt/ShowLines.scala b/internal/util-collection/src/main/scala/sbt/util/internal/ShowLines.scala similarity index 92% rename from internal/util-collection/src/main/scala/sbt/ShowLines.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/ShowLines.scala index 126b6360e..794f3d5cb 100644 --- a/internal/util-collection/src/main/scala/sbt/ShowLines.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/ShowLines.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal trait ShowLines[A] { def showLines(a: A): Seq[String] diff --git a/internal/util-collection/src/main/scala/sbt/Signal.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Signal.scala similarity index 99% rename from internal/util-collection/src/main/scala/sbt/Signal.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Signal.scala index 5aa5fc86e..f8e3ed17d 100644 --- a/internal/util-collection/src/main/scala/sbt/Signal.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Signal.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal object Signals { val CONT = "CONT" diff --git a/internal/util-collection/src/main/scala/sbt/TypeFunctions.scala b/internal/util-collection/src/main/scala/sbt/util/internal/TypeFunctions.scala similarity index 98% 
rename from internal/util-collection/src/main/scala/sbt/TypeFunctions.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/TypeFunctions.scala index 74f0a7d99..ed1669f35 100644 --- a/internal/util-collection/src/main/scala/sbt/TypeFunctions.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/TypeFunctions.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal trait TypeFunctions { type Id[X] = X diff --git a/internal/util-collection/src/main/scala/sbt/Types.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Types.scala similarity index 88% rename from internal/util-collection/src/main/scala/sbt/Types.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Types.scala index 29994f3d1..972f12769 100644 --- a/internal/util-collection/src/main/scala/sbt/Types.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Types.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal object Types extends Types diff --git a/internal/util-collection/src/main/scala/sbt/Util.scala b/internal/util-collection/src/main/scala/sbt/util/internal/Util.scala similarity index 98% rename from internal/util-collection/src/main/scala/sbt/Util.scala rename to internal/util-collection/src/main/scala/sbt/util/internal/Util.scala index befc7b5a9..ba158ae57 100644 --- a/internal/util-collection/src/main/scala/sbt/Util.scala +++ b/internal/util-collection/src/main/scala/sbt/util/internal/Util.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt +package sbt.util.internal import java.util.Locale diff --git a/internal/util-collection/src/test/scala/DagSpecification.scala b/internal/util-collection/src/test/scala/DagSpecification.scala index abf9ddf28..47c1b802b 100644 --- a/internal/util-collection/src/test/scala/DagSpecification.scala +++ 
b/internal/util-collection/src/test/scala/DagSpecification.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008 Mark Harrah */ -package sbt +package sbt.util.internal import org.scalacheck._ import Prop._ diff --git a/internal/util-collection/src/test/scala/KeyTest.scala b/internal/util-collection/src/test/scala/KeyTest.scala index f48e3742a..8bdcd76bf 100644 --- a/internal/util-collection/src/test/scala/KeyTest.scala +++ b/internal/util-collection/src/test/scala/KeyTest.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal import org.scalacheck._ import Prop._ diff --git a/internal/util-collection/src/test/scala/LiteralTest.scala b/internal/util-collection/src/test/scala/LiteralTest.scala index 35ef373ca..32da58a7b 100644 --- a/internal/util-collection/src/test/scala/LiteralTest.scala +++ b/internal/util-collection/src/test/scala/LiteralTest.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal import Types._ diff --git a/internal/util-collection/src/test/scala/PMapTest.scala b/internal/util-collection/src/test/scala/PMapTest.scala index 6a6c558c1..e8aa2c6fa 100644 --- a/internal/util-collection/src/test/scala/PMapTest.scala +++ b/internal/util-collection/src/test/scala/PMapTest.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal import Types._ diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index 3a6bc3853..96a8e01f6 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal /** Define our settings system */ diff --git a/internal/util-collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala index ab92332db..3321c644b 100644 --- 
a/internal/util-collection/src/test/scala/SettingsTest.scala +++ b/internal/util-collection/src/test/scala/SettingsTest.scala @@ -1,4 +1,4 @@ -package sbt +package sbt.util.internal import org.scalacheck._ import Prop._ diff --git a/internal/util-complete/src/main/scala/sbt/LineReader.scala b/internal/util-complete/src/main/scala/sbt/util/internal/LineReader.scala similarity index 97% rename from internal/util-complete/src/main/scala/sbt/LineReader.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/LineReader.scala index 41679ab6e..c01986a0f 100644 --- a/internal/util-complete/src/main/scala/sbt/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/LineReader.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ -package sbt +package sbt.util.internal import jline.console.ConsoleReader import jline.console.history.{ FileHistory, MemoryHistory } @@ -128,7 +128,7 @@ final class FullReader(historyPath: Option[File], complete: Parser[_], val handl protected[this] val reader = { val cr = JLine.createReader(historyPath) - sbt.complete.JLineCompletion.installCustomCompletor(cr, complete) + sbt.util.internal.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } diff --git a/internal/util-complete/src/main/scala/sbt/complete/Completions.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/Completions.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/complete/Completions.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/Completions.scala index 5237ad26d..b3a562dc7 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/Completions.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/Completions.scala @@ -1,7 +1,8 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.complete +package sbt.util.internal +package complete /** * Represents a set of 
completions. diff --git a/internal/util-complete/src/main/scala/sbt/complete/EditDistance.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/EditDistance.scala similarity index 96% rename from internal/util-complete/src/main/scala/sbt/complete/EditDistance.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/EditDistance.scala index 95ed0c91f..873d9035c 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/EditDistance.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/EditDistance.scala @@ -1,4 +1,5 @@ -package sbt.complete +package sbt.util.internal +package complete import java.lang.Character.{ toLowerCase => lower } diff --git a/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/ExampleSource.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/ExampleSource.scala index ab2d39b4c..932951e34 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/ExampleSource.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/ExampleSource.scala @@ -1,4 +1,5 @@ -package sbt.complete +package sbt.util.internal +package complete import java.io.File import sbt.io.IO diff --git a/internal/util-complete/src/main/scala/sbt/complete/History.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/History.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/complete/History.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/History.scala index 614244b9e..bd925cd17 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/History.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/History.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 
2010 Mark Harrah */ -package sbt +package sbt.util.internal package complete import History.number diff --git a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/HistoryCommands.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/HistoryCommands.scala index b66d3272e..662b914c0 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/HistoryCommands.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt +package sbt.util.internal package complete import java.io.File diff --git a/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/JLineCompletion.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/JLineCompletion.scala index 2445fd111..c674e21db 100644 --- a/internal/util-complete/src/main/scala/sbt/complete/JLineCompletion.scala +++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/JLineCompletion.scala @@ -1,7 +1,8 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt.complete +package sbt.util.internal +package complete import jline.console.ConsoleReader import jline.console.completer.{ CandidateListCompletionHandler, Completer, CompletionHandler } diff --git a/internal/util-complete/src/main/scala/sbt/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parser.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/complete/Parser.scala rename to 
internal/util-complete/src/main/scala/sbt/util/internal/complete/Parser.scala
index 7ae7ea0fd..659f20e7b 100644
--- a/internal/util-complete/src/main/scala/sbt/complete/Parser.scala
+++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parser.scala
@@ -1,11 +1,12 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2010, 2011 Mark Harrah
  */
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import Parser._
-import sbt.Types.{ left, right, some }
-import sbt.Util.{ makeList, separate }
+import sbt.util.internal.Types.{ left, right, some }
+import sbt.util.internal.Util.{ makeList, separate }
 
 /**
  * A String parser that provides semi-automatic tab completion.
diff --git a/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parsers.scala
similarity index 99%
rename from internal/util-complete/src/main/scala/sbt/complete/Parsers.scala
rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/Parsers.scala
index af5849870..e0b81884f 100644
--- a/internal/util-complete/src/main/scala/sbt/complete/Parsers.scala
+++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parsers.scala
@@ -1,7 +1,8 @@
 /* sbt -- Simple Build Tool
  * Copyright 2011 Mark Harrah
  */
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import Parser._
 import java.io.File
diff --git a/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/ProcessError.scala
similarity index 95%
rename from internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala
rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/ProcessError.scala
index d85e523c9..2faa17634 100644
--- a/internal/util-complete/src/main/scala/sbt/complete/ProcessError.scala
+++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/ProcessError.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 object ProcessError {
   def apply(command: String, msgs: Seq[String], index: Int): String =
diff --git a/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/TokenCompletions.scala
similarity index 97%
rename from internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala
rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/TokenCompletions.scala
index 1285507cc..6d3e5100f 100644
--- a/internal/util-complete/src/main/scala/sbt/complete/TokenCompletions.scala
+++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/TokenCompletions.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import Completion.{ token => ctoken, tokenDisplay }
diff --git a/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/TypeString.scala
similarity index 98%
rename from internal/util-complete/src/main/scala/sbt/complete/TypeString.scala
rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/TypeString.scala
index bd5f84f43..1ac092023 100644
--- a/internal/util-complete/src/main/scala/sbt/complete/TypeString.scala
+++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/TypeString.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import DefaultParsers._
 import TypeString._
diff --git a/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala b/internal/util-complete/src/main/scala/sbt/util/internal/complete/UpperBound.scala
similarity index 97%
rename from internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala
rename to internal/util-complete/src/main/scala/sbt/util/internal/complete/UpperBound.scala
index 2d954d4c5..cd42e688b 100644
--- a/internal/util-complete/src/main/scala/sbt/complete/UpperBound.scala
+++ b/internal/util-complete/src/main/scala/sbt/util/internal/complete/UpperBound.scala
@@ -1,7 +1,8 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008,2010 Mark Harrah
  */
-package sbt.complete
+package sbt.util.internal
+package complete
 
 sealed trait UpperBound {
   /** True if and only if the given value meets this bound.*/
diff --git a/internal/util-complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala
index c70c6207b..c1bd8c45c 100644
--- a/internal/util-complete/src/test/scala/ParserTest.scala
+++ b/internal/util-complete/src/test/scala/ParserTest.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 object JLineTest {
   import DefaultParsers._
diff --git a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala
index f9cc77038..a0db3d777 100644
--- a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala
+++ b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import org.specs2.mutable.Specification
 import org.specs2.specification.Scope
diff --git a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala
index b5aa14250..4323b8e24 100644
--- a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala
+++ b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import org.specs2.mutable.Specification
 import org.specs2.specification.Scope
diff --git a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala
index 3bcc55dd2..854569e88 100644
--- a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala
+++ b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala
@@ -1,4 +1,5 @@
-package sbt.complete
+package sbt.util.internal
+package complete
 
 import org.specs2.mutable.Specification
 import org.specs2.specification.Scope
diff --git a/internal/util-control/src/main/scala/sbt/ErrorHandling.scala b/internal/util-control/src/main/scala/sbt/util/internal/ErrorHandling.scala
similarity index 97%
rename from internal/util-control/src/main/scala/sbt/ErrorHandling.scala
rename to internal/util-control/src/main/scala/sbt/util/internal/ErrorHandling.scala
index 70eba7d2f..6a93451cf 100644
--- a/internal/util-control/src/main/scala/sbt/ErrorHandling.scala
+++ b/internal/util-control/src/main/scala/sbt/util/internal/ErrorHandling.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import java.io.IOException
 
diff --git a/internal/util-control/src/main/scala/sbt/ExitHook.scala b/internal/util-control/src/main/scala/sbt/util/internal/ExitHook.scala
similarity index 96%
rename from internal/util-control/src/main/scala/sbt/ExitHook.scala
rename to internal/util-control/src/main/scala/sbt/util/internal/ExitHook.scala
index 16f295c7c..80b2b9a75 100644
--- a/internal/util-control/src/main/scala/sbt/ExitHook.scala
+++ b/internal/util-control/src/main/scala/sbt/util/internal/ExitHook.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 /** Defines a function to call as sbt exits.*/
 trait ExitHook {
diff --git a/internal/util-control/src/main/scala/sbt/MessageOnlyException.scala b/internal/util-control/src/main/scala/sbt/util/internal/MessageOnlyException.scala
similarity index 97%
rename from internal/util-control/src/main/scala/sbt/MessageOnlyException.scala
rename to internal/util-control/src/main/scala/sbt/util/internal/MessageOnlyException.scala
index ab4727b95..425928a11 100644
--- a/internal/util-control/src/main/scala/sbt/MessageOnlyException.scala
+++ b/internal/util-control/src/main/scala/sbt/util/internal/MessageOnlyException.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2011 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 final class MessageOnlyException(override val toString: String)
   extends RuntimeException(toString)
diff --git a/internal/util-logging/src/main/scala/sbt/BasicLogger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/BasicLogger.scala
similarity index 96%
rename from internal/util-logging/src/main/scala/sbt/BasicLogger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/BasicLogger.scala
index 7fe59e8c0..b9a2913e2 100644
--- a/internal/util-logging/src/main/scala/sbt/BasicLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/BasicLogger.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 /** Implements the level-setting methods of Logger.*/
 abstract class BasicLogger extends AbstractLogger {
diff --git a/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/BufferedLogger.scala
similarity index 99%
rename from internal/util-logging/src/main/scala/sbt/BufferedLogger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/BufferedLogger.scala
index 488de77bb..a1aaa002a 100644
--- a/internal/util-logging/src/main/scala/sbt/BufferedLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/BufferedLogger.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import scala.collection.mutable.ListBuffer
 
diff --git a/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleLogger.scala
similarity index 99%
rename from internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/ConsoleLogger.scala
index 7ecb7d186..ae9e2c670 100644
--- a/internal/util-logging/src/main/scala/sbt/ConsoleLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleLogger.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010, 2011 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import java.io.{ BufferedWriter, PrintStream, PrintWriter }
 import java.util.Locale
diff --git a/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleOut.scala
similarity index 98%
rename from internal/util-logging/src/main/scala/sbt/ConsoleOut.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/ConsoleOut.scala
index 3d2c15abe..4c20ee329 100644
--- a/internal/util-logging/src/main/scala/sbt/ConsoleOut.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleOut.scala
@@ -1,4 +1,4 @@
-package sbt
+package sbt.util.internal
 
 import java.io.{ BufferedWriter, PrintStream, PrintWriter }
 
diff --git a/internal/util-logging/src/main/scala/sbt/FilterLogger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/FilterLogger.scala
similarity index 97%
rename from internal/util-logging/src/main/scala/sbt/FilterLogger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/FilterLogger.scala
index 5259a6e12..d0f503495 100644
--- a/internal/util-logging/src/main/scala/sbt/FilterLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/FilterLogger.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 /**
  * A filter logger is used to delegate messages but not the logging level to another logger. This means
diff --git a/internal/util-logging/src/main/scala/sbt/FullLogger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/FullLogger.scala
similarity index 97%
rename from internal/util-logging/src/main/scala/sbt/FullLogger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/FullLogger.scala
index 32873eff7..64066abdb 100644
--- a/internal/util-logging/src/main/scala/sbt/FullLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/FullLogger.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 /** Promotes the simple Logger interface to the full AbstractLogger interface. */
 class FullLogger(delegate: Logger) extends BasicLogger {
diff --git a/internal/util-logging/src/main/scala/sbt/GlobalLogging.scala b/internal/util-logging/src/main/scala/sbt/util/internal/GlobalLogging.scala
similarity index 98%
rename from internal/util-logging/src/main/scala/sbt/GlobalLogging.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/GlobalLogging.scala
index 1cd32653b..df9b5e0e3 100644
--- a/internal/util-logging/src/main/scala/sbt/GlobalLogging.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/GlobalLogging.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import java.io.{ File, PrintWriter }
 
diff --git a/internal/util-logging/src/main/scala/sbt/Level.scala b/internal/util-logging/src/main/scala/sbt/util/internal/Level.scala
similarity index 97%
rename from internal/util-logging/src/main/scala/sbt/Level.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/Level.scala
index 7744b9495..5a9d3e717 100644
--- a/internal/util-logging/src/main/scala/sbt/Level.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/Level.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 /**
  * An enumeration defining the levels available for logging. A level includes all of the levels
diff --git a/internal/util-logging/src/main/scala/sbt/LogEvent.scala b/internal/util-logging/src/main/scala/sbt/util/internal/LogEvent.scala
similarity index 95%
rename from internal/util-logging/src/main/scala/sbt/LogEvent.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/LogEvent.scala
index d48957c75..a03c306b1 100644
--- a/internal/util-logging/src/main/scala/sbt/LogEvent.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/LogEvent.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 sealed trait LogEvent extends NotNull
 final class Success(val msg: String) extends LogEvent
diff --git a/internal/util-logging/src/main/scala/sbt/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/Logger.scala
similarity index 99%
rename from internal/util-logging/src/main/scala/sbt/Logger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/Logger.scala
index ddba892d2..a3daea79a 100644
--- a/internal/util-logging/src/main/scala/sbt/Logger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/Logger.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import xsbti.{ Logger => xLogger, F0 }
 import xsbti.{ Maybe, Position, Problem, Severity }
diff --git a/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala b/internal/util-logging/src/main/scala/sbt/util/internal/LoggerWriter.scala
similarity index 98%
rename from internal/util-logging/src/main/scala/sbt/LoggerWriter.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/LoggerWriter.scala
index a106ff605..b616a045b 100644
--- a/internal/util-logging/src/main/scala/sbt/LoggerWriter.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/LoggerWriter.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 /**
  * Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`.
diff --git a/internal/util-logging/src/main/scala/sbt/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/util/internal/MainLogging.scala
similarity index 97%
rename from internal/util-logging/src/main/scala/sbt/MainLogging.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/MainLogging.scala
index 48015ad44..9a1240fc6 100644
--- a/internal/util-logging/src/main/scala/sbt/MainLogging.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/MainLogging.scala
@@ -1,4 +1,4 @@
-package sbt
+package sbt.util.internal
 
 import java.io.PrintWriter
 
diff --git a/internal/util-logging/src/main/scala/sbt/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/util/internal/MultiLogger.scala
similarity index 98%
rename from internal/util-logging/src/main/scala/sbt/MultiLogger.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/MultiLogger.scala
index a6de160cd..add4a8761 100644
--- a/internal/util-logging/src/main/scala/sbt/MultiLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/MultiLogger.scala
@@ -2,7 +2,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 // note that setting the logging level on this logger has no effect on its behavior, only
 // on the behavior of the delegates.
diff --git a/internal/util-logging/src/main/scala/sbt/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/util/internal/StackTrace.scala
similarity index 98%
rename from internal/util-logging/src/main/scala/sbt/StackTrace.scala
rename to internal/util-logging/src/main/scala/sbt/util/internal/StackTrace.scala
index d6504cb10..595529d49 100644
--- a/internal/util-logging/src/main/scala/sbt/StackTrace.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/internal/StackTrace.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Tony Sloane
  */
-package sbt
+package sbt.util.internal
 
 object StackTrace {
   def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt")
diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala
index 408ec5e23..bf2e0bdd3 100644
--- a/internal/util-logging/src/test/scala/Escapes.scala
+++ b/internal/util-logging/src/test/scala/Escapes.scala
@@ -1,4 +1,4 @@
-package sbt
+package sbt.util.internal
 
 import org.scalacheck._
 import Prop._
diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala
index 7c0125fba..5f73cdd2d 100644
--- a/internal/util-logging/src/test/scala/LogWriterTest.scala
+++ b/internal/util-logging/src/test/scala/LogWriterTest.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import org.scalacheck._
 import Arbitrary.{ arbitrary => arb, _ }
diff --git a/internal/util-logging/src/test/scala/TestLogger.scala b/internal/util-logging/src/test/scala/TestLogger.scala
index e7b6bee49..c66f358ea 100644
--- a/internal/util-logging/src/test/scala/TestLogger.scala
+++ b/internal/util-logging/src/test/scala/TestLogger.scala
@@ -1,4 +1,4 @@
-package sbt
+package sbt.util.internal
 
 object TestLogger {
   def apply[T](f: Logger => T): T =
diff --git a/internal/util-logic/src/main/scala/sbt/logic/Logic.scala b/internal/util-logic/src/main/scala/sbt/util/internal/logic/Logic.scala
similarity index 99%
rename from internal/util-logic/src/main/scala/sbt/logic/Logic.scala
rename to internal/util-logic/src/main/scala/sbt/util/internal/logic/Logic.scala
index 0fbbe3f98..3e6f8ba8c 100644
--- a/internal/util-logic/src/main/scala/sbt/logic/Logic.scala
+++ b/internal/util-logic/src/main/scala/sbt/util/internal/logic/Logic.scala
@@ -1,4 +1,4 @@
-package sbt
+package sbt.util.internal
 package logic
 
 import scala.annotation.tailrec
diff --git a/internal/util-logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala
index f9957414b..f6170f1e3 100644
--- a/internal/util-logic/src/test/scala/sbt/logic/Test.scala
+++ b/internal/util-logic/src/test/scala/sbt/logic/Test.scala
@@ -1,4 +1,4 @@
-package sbt
+package sbt.util.internal
 package logic
 
 import org.scalacheck._
diff --git a/internal/util-relation/src/main/scala/sbt/Relation.scala b/internal/util-relation/src/main/scala/sbt/util/internal/Relation.scala
similarity index 99%
rename from internal/util-relation/src/main/scala/sbt/Relation.scala
rename to internal/util-relation/src/main/scala/sbt/util/internal/Relation.scala
index 9a648ad64..74b24629d 100644
--- a/internal/util-relation/src/main/scala/sbt/Relation.scala
+++ b/internal/util-relation/src/main/scala/sbt/util/internal/Relation.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import Relation._
 
diff --git a/internal/util-relation/src/test/scala/RelationTest.scala b/internal/util-relation/src/test/scala/RelationTest.scala
index 558935bdb..071c79d52 100644
--- a/internal/util-relation/src/test/scala/RelationTest.scala
+++ b/internal/util-relation/src/test/scala/RelationTest.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import org.scalacheck._
 import Prop._
diff --git a/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala b/internal/util-tracking/src/main/scala/sbt/util/internal/ChangeReport.scala
similarity index 99%
rename from internal/util-tracking/src/main/scala/sbt/ChangeReport.scala
rename to internal/util-tracking/src/main/scala/sbt/util/internal/ChangeReport.scala
index 18db36a43..a2561d723 100644
--- a/internal/util-tracking/src/main/scala/sbt/ChangeReport.scala
+++ b/internal/util-tracking/src/main/scala/sbt/util/internal/ChangeReport.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 object ChangeReport {
   def modified[T](files: Set[T]): ChangeReport[T] =
diff --git a/internal/util-tracking/src/main/scala/sbt/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/util/internal/Tracked.scala
similarity index 99%
rename from internal/util-tracking/src/main/scala/sbt/Tracked.scala
rename to internal/util-tracking/src/main/scala/sbt/util/internal/Tracked.scala
index 5c6979718..58361e116 100644
--- a/internal/util-tracking/src/main/scala/sbt/Tracked.scala
+++ b/internal/util-tracking/src/main/scala/sbt/util/internal/Tracked.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009, 2010 Mark Harrah
  */
-package sbt
+package sbt.util.internal
 
 import java.io.{ File, IOException }
 import CacheIO.{ fromFile, toFile }
diff --git a/project/Util.scala b/project/Util.scala
index 221a16000..406c62723 100644
--- a/project/Util.scala
+++ b/project/Util.scala
@@ -20,7 +20,7 @@ object Util {
     {
       val init = keywords.map(tn => '"' + tn + '"').mkString("Set(", ", ", ")")
       val ObjectName = "ScalaKeywords"
-      val PackageName = "sbt"
+      val PackageName = "sbt.util.internal"
       val keywordsSrc = """package %s
 object %s {

From c20887853e55f52b7a0ab440536c8e9739f7893c Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Fri, 4 Sep 2015 17:40:48 -0400
Subject: [PATCH 541/823] migrate to scalatest 2.2.4

---
 build.sbt | 31 ++-
 .../scala/sbt/complete/FileExamplesTest.scala | 80 ++++++++++---------
 .../sbt/complete/FixedSetExamplesTest.scala | 23 +++---
 .../sbt/complete/ParserWithExamplesTest.scala | 79 +++++++++---------
 .../scala/sbt/util/internal/UnitSpec.scala | 5 ++
 project/Dependencies.scala | 8 +-
 6 files changed, 120 insertions(+), 106 deletions(-)
 create mode 100644 internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala

diff --git a/build.sbt b/build.sbt
index 9ac713786..cae1353d2 100644
--- a/build.sbt
+++ b/build.sbt
@@ -33,9 +33,6 @@ def commonSettings: Seq[Setting[_]] = Seq(
   publishArtifact in Test := true
 )
 
-def testedBaseSettings: Seq[Setting[_]] =
-  commonSettings ++ testDependencies
-
 lazy val utilRoot: Project = (project in file(".")).
   // configs(Sxr.sxrConf).
   aggregate(
@@ -86,8 +83,9 @@ lazy val utilControl = (project in internalPath / "util-control").
   )
 
 lazy val utilCollection = (project in internalPath / "util-collection").
+  dependsOn(utilTesting % Test).
   settings(
-    testedBaseSettings,
+    commonSettings,
     Util.keywordsSettings,
     name := "Util Collection"
   )
@@ -95,16 +93,16 @@ lazy val utilCollection = (project in internalPath / "util-collection").
 lazy val utilApplyMacro = (project in internalPath / "util-appmacro").
   dependsOn(utilCollection).
   settings(
-    testedBaseSettings,
+    commonSettings,
     name := "Util Apply Macro",
     libraryDependencies += scalaCompiler.value
   )
 
 // Command line-related utilities.
 lazy val utilComplete = (project in internalPath / "util-complete").
-  dependsOn(utilCollection, utilControl).
+  dependsOn(utilCollection, utilControl, utilTesting % Test).
   settings(
-    testedBaseSettings,
+    commonSettings,
     name := "Util Completion",
     libraryDependencies ++= Seq(jline, sbtIO),
     crossScalaVersions := Seq(scala210, scala211)
   )
@@ -112,9 +110,9 @@ lazy val utilComplete = (project in internalPath / "util-complete").
 
 // logging
 lazy val utilLogging = (project in internalPath / "util-logging").
-  dependsOn(utilInterface).
+  dependsOn(utilInterface, utilTesting % Test).
   settings(
-    testedBaseSettings,
+    commonSettings,
     publishArtifact in (Test, packageBin) := true,
     name := "Util Logging",
     libraryDependencies += jline
@@ -122,16 +120,17 @@ lazy val utilLogging = (project in internalPath / "util-logging").
 
 // Relation
 lazy val utilRelation = (project in internalPath / "util-relation").
+  dependsOn(utilTesting % Test).
   settings(
-    testedBaseSettings,
+    commonSettings,
     name := "Util Relation"
   )
 
 // A logic with restricted negation as failure for a unique, stable model
 lazy val utilLogic = (project in internalPath / "util-logic").
-  dependsOn(utilCollection, utilRelation).
+  dependsOn(utilCollection, utilRelation, utilTesting % Test).
   settings(
-    testedBaseSettings,
+    commonSettings,
     name := "Util Logic"
   )
@@ -152,3 +151,11 @@ lazy val utilTracking = (project in internalPath / "util-tracking").
     name := "Util Tracking",
     libraryDependencies += sbtIO
   )
+
+// Internal utility for testing
+lazy val utilTesting = (project in internalPath / "util-testing").
+  settings(
+    commonSettings,
+    name := "Util Testing",
+    libraryDependencies ++= Seq(scalaCheck, scalatest)
+  )
diff --git a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala
index a0db3d777..2f873cf6c 100644
--- a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala
+++ b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala
@@ -1,54 +1,60 @@
 package sbt.util.internal
 package complete
 
-import org.specs2.mutable.Specification
-import org.specs2.specification.Scope
 import java.io.File
 import sbt.io.IO._
 
-class FileExamplesTest extends Specification {
+class FileExamplesTest extends UnitSpec {
 
-  "listing all files in an absolute base directory" should {
-    "produce the entire base directory's contents" in new directoryStructure {
-      fileExamples().toList should containTheSameElementsAs(allRelativizedPaths)
+  "listing all files in an absolute base directory" should
+    "produce the entire base directory's contents" in {
+      val _ = new DirectoryStructure {
+        fileExamples().toList should contain theSameElementsAs (allRelativizedPaths)
+      }
+    }
+
+  "listing files with a prefix that matches none" should
+    "produce an empty list" in {
+      val _ = new DirectoryStructure(withCompletionPrefix = "z") {
+        fileExamples().toList shouldBe empty
+      }
+    }
+
+  "listing single-character prefixed files" should
+    "produce matching paths only" in {
+      val _ = new DirectoryStructure(withCompletionPrefix = "f") {
+        fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
+      }
+    }
+
+  "listing directory-prefixed files" should
+    "produce matching paths only" in {
+      val _ = new DirectoryStructure(withCompletionPrefix = "far") {
+        fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
+      }
+    }
+
+  it should "produce sub-dir contents only when appending a file separator to the directory" in {
+    val _ = new DirectoryStructure(withCompletionPrefix = "far" + File.separator) {
+      fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
     }
   }
 
-  "listing files with a prefix that matches none" should {
-    "produce an empty list" in new directoryStructure(withCompletionPrefix = "z") {
-      fileExamples().toList should beEmpty
-    }
-  }
-
-  "listing single-character prefixed files" should {
-    "produce matching paths only" in new directoryStructure(withCompletionPrefix = "f") {
-      fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly)
-    }
-  }
-
-  "listing directory-prefixed files" should {
-    "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far") {
-      fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly)
+  "listing files with a sub-path prefix" should
+    "produce matching paths only" in {
+      val _ = new DirectoryStructure(withCompletionPrefix = "far" + File.separator + "ba") {
+        fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
+      }
     }
 
-    "produce sub-dir contents only when appending a file separator to the directory" in new directoryStructure(withCompletionPrefix = "far" + File.separator) {
-      fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly)
+  "completing a full path" should
+    "produce a list with an empty string" in {
+      val _ = new DirectoryStructure(withCompletionPrefix = "bazaar") {
+        fileExamples().toList shouldEqual List("")
+      }
     }
-  }
 
-  "listing files with a sub-path prefix" should {
-    "produce matching paths only" in new directoryStructure(withCompletionPrefix = "far" + File.separator + "ba") {
-      fileExamples().toList should containTheSameElementsAs(prefixedPathsOnly)
-    }
-  }
-
-  "completing a full path" should {
-    "produce a list with an empty string" in new directoryStructure(withCompletionPrefix = "bazaar") {
-      fileExamples().toList shouldEqual List("")
-    }
-  }
-
-  class directoryStructure(withCompletionPrefix: String = "") extends Scope with DelayedInit {
+  class DirectoryStructure(withCompletionPrefix: String = "") extends DelayedInit {
     var fileExamples: FileExamples = _
     var baseDir: File = _
     var childFiles: List[File] = _
diff --git a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala
index 4323b8e24..db1ebe67f 100644
--- a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala
+++ b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala
@@ -1,26 +1,23 @@
 package sbt.util.internal
 package complete
 
-import org.specs2.mutable.Specification
-import org.specs2.specification.Scope
+class FixedSetExamplesTest extends UnitSpec {
 
-class FixedSetExamplesTest extends Specification {
-
-  "adding a prefix" should {
-    "produce a smaller set of examples with the prefix removed" in new examples {
-      fixedSetExamples.withAddedPrefix("f")() must containTheSameElementsAs(List("oo", "ool", "u"))
-      fixedSetExamples.withAddedPrefix("fo")() must containTheSameElementsAs(List("o", "ol"))
-      fixedSetExamples.withAddedPrefix("b")() must containTheSameElementsAs(List("ar"))
+  "adding a prefix" should "produce a smaller set of examples with the prefix removed" in {
+    val _ = new Examples {
+      fixedSetExamples.withAddedPrefix("f")() should contain theSameElementsAs (List("oo", "ool", "u"))
+      fixedSetExamples.withAddedPrefix("fo")() should contain theSameElementsAs (List("o", "ol"))
+      fixedSetExamples.withAddedPrefix("b")() should contain theSameElementsAs (List("ar"))
     }
   }
 
-  "without a prefix" should {
-    "produce the original set" in new examples {
-      fixedSetExamples() mustEqual exampleSet
+  "without a prefix" should "produce the original set" in {
+    val _ = new Examples {
+      fixedSetExamples() shouldBe exampleSet
     }
   }
 
-  trait examples extends Scope {
+  trait Examples {
     val exampleSet = List("foo", "bar", "fool", "fu")
     val fixedSetExamples = FixedSetExamples(exampleSet)
   }
diff --git a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala
index 854569e88..fc7c0ae90 100644
--- a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala
+++ b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala
@@ -1,61 +1,64 @@
 package sbt.util.internal
 package complete
 
-import org.specs2.mutable.Specification
-import org.specs2.specification.Scope
 import Completion._
 
-class ParserWithExamplesTest extends Specification {
+class ParserWithExamplesTest extends UnitSpec {
 
-  "listing a limited number of completions" should {
-    "grab only the needed number of elements from the iterable source of examples" in new parserWithLazyExamples {
-      parserWithExamples.completions(0)
-      examples.size shouldEqual maxNumberOfExamples
+  "listing a limited number of completions" should
+    "grab only the needed number of elements from the iterable source of examples" in {
+      val _ = new ParserWithLazyExamples {
+        parserWithExamples.completions(0)
+        examples.size shouldEqual maxNumberOfExamples
+      }
     }
-  }
 
-  "listing only valid completions" should {
-    "use the delegate parser to remove invalid examples" in new parserWithValidExamples {
-      val validCompletions = Completions(Set(
-        suggestion("blue"),
-        suggestion("red")))
-      parserWithExamples.completions(0) shouldEqual validCompletions
+  "listing only valid completions" should
+    "use the delegate parser to remove invalid examples" in {
+      val _ = new ParserWithValidExamples {
+        val validCompletions = Completions(Set(
+          suggestion("blue"),
+          suggestion("red")))
+        parserWithExamples.completions(0) shouldEqual validCompletions
+      }
     }
-  }
 
-  "listing valid completions in a derived parser" should {
-    "produce only valid examples that start with the character of the derivation" in new parserWithValidExamples {
-      val derivedCompletions = Completions(Set(
-        suggestion("lue")))
-      parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions
+  "listing valid completions in a derived parser" should
+    "produce only valid examples that start with the character of the derivation" in {
+      val _ = new ParserWithValidExamples {
+        val derivedCompletions = Completions(Set(
+          suggestion("lue")))
+        parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions
+      }
     }
-  }
 
-  "listing valid and invalid completions" should {
-    "produce the entire source of examples" in new parserWithAllExamples {
-      val completions = Completions(examples.map(suggestion(_)).toSet)
-      parserWithExamples.completions(0) shouldEqual completions
+  "listing valid and invalid completions" should
+    "produce the entire source of examples" in {
+      val _ = new parserWithAllExamples {
+        val completions = Completions(examples.map(suggestion(_)).toSet)
+        parserWithExamples.completions(0) shouldEqual completions
+      }
     }
-  }
 
-  "listing valid and invalid completions in a derived parser" should {
-    "produce only examples that start with the character of the derivation" in new parserWithAllExamples {
-      val derivedCompletions = Completions(Set(
-        suggestion("lue"),
-        suggestion("lock")))
-      parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions
+  "listing valid and invalid completions in a derived parser" should
+    "produce only examples that start with the character of the derivation" in {
+      val _ = new parserWithAllExamples {
+        val derivedCompletions = Completions(Set(
+          suggestion("lue"),
+          suggestion("lock")))
+        parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions
+      }
     }
-  }
 
-  class parserWithLazyExamples extends parser(GrowableSourceOfExamples(), maxNumberOfExamples = 5, removeInvalidExamples = false)
+  class ParserWithLazyExamples extends ParserExample(GrowableSourceOfExamples(), maxNumberOfExamples = 5, removeInvalidExamples = false)
 
-  class parserWithValidExamples extends parser(removeInvalidExamples = true)
+  class ParserWithValidExamples extends ParserExample(removeInvalidExamples = true)
 
-  class parserWithAllExamples extends parser(removeInvalidExamples = false)
+  class parserWithAllExamples extends ParserExample(removeInvalidExamples = false)
 
-  case class parser(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"),
+  case class ParserExample(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"),
     maxNumberOfExamples: Int = 25,
-    removeInvalidExamples: Boolean) extends Scope {
+    removeInvalidExamples: Boolean) {
 
     import DefaultParsers._
diff --git a/internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala b/internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala
new file mode 100644
index 000000000..83f17298e
--- /dev/null
+++ b/internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala
@@ -0,0 +1,5 @@
+package sbt.util.internal
+
+import org.scalatest._
+
+abstract class UnitSpec extends FlatSpec with Matchers
diff --git a/project/Dependencies.scala b/project/Dependencies.scala
index b0516b9cc..95b592d4d 100644
--- a/project/Dependencies.scala
+++ b/project/Dependencies.scala
@@ -23,10 +23,6 @@ object Dependencies {
 
   lazy val scalaXml = scala211Module("scala-xml", "1.0.1")
 
-  lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.11.4"
-  lazy val specs2 = "org.specs2" %% "specs2" % "2.3.11"
-  lazy val testDependencies = libraryDependencies ++= Seq(
-    scalaCheck,
-    specs2
-  ).map(_ % "test")
+  lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.12.4"
+  lazy val scalatest = "org.scalatest" %% "scalatest" % "2.2.4"
 }

From 0a2d39673c13c5bfc97adb6327bf0d5fa20929a4 Mon Sep 17 00:00:00 2001
From: Eugene Yokota
Date: Sat, 5 Sep 2015 00:51:58 -0400
Subject: [PATCH 542/823] sbt.util.internal -> sbt.internal.util package

---
 .../internal => internal/util}/appmacro/ContextUtil.scala | 2 +-
 .../{util/internal => internal/util}/appmacro/Convert.scala | 2 +-
 .../internal =>
internal/util}/appmacro/Instance.scala | 2 +-
 .../internal => internal/util}/appmacro/KListBuilder.scala | 2 +-
 .../internal => internal/util}/appmacro/MixedBuilder.scala | 2 +-
 .../internal => internal/util}/appmacro/TupleBuilder.scala | 2 +-
 .../internal => internal/util}/appmacro/TupleNBuilder.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Cache.scala | 2 +-
 .../sbt/{util/internal => internal/util}/CacheIO.scala | 2 +-
 .../sbt/{util/internal => internal/util}/FileInfo.scala | 2 +-
 .../{util/internal => internal/util}/SeparatedCache.scala | 2 +-
 internal/util-cache/src/test/scala/CacheTest.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/AList.scala | 2 +-
 .../sbt/{util/internal => internal/util}/Attributes.scala | 2 +-
 .../sbt/{util/internal => internal/util}/Classes.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Dag.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/HList.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/IDSet.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/INode.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/KList.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/PMap.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Param.scala | 2 +-
 .../sbt/{util/internal => internal/util}/Positions.scala | 2 +-
 .../sbt/{util/internal => internal/util}/Settings.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Show.scala | 2 +-
 .../sbt/{util/internal => internal/util}/ShowLines.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Signal.scala | 2 +-
 .../{util/internal => internal/util}/TypeFunctions.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Types.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Util.scala | 2 +-
 .../util-collection/src/test/scala/DagSpecification.scala | 2 +-
 internal/util-collection/src/test/scala/KeyTest.scala | 2 +-
 internal/util-collection/src/test/scala/LiteralTest.scala | 2 +-
 internal/util-collection/src/test/scala/PMapTest.scala | 2 +-
 .../util-collection/src/test/scala/SettingsExample.scala | 2 +-
 internal/util-collection/src/test/scala/SettingsTest.scala | 2 +-
 .../sbt/{util/internal => internal/util}/LineReader.scala | 4 ++--
 .../internal => internal/util}/complete/Completions.scala | 2 +-
 .../internal => internal/util}/complete/EditDistance.scala | 2 +-
 .../internal => internal/util}/complete/ExampleSource.scala | 2 +-
 .../{util/internal => internal/util}/complete/History.scala | 2 +-
 .../util}/complete/HistoryCommands.scala | 2 +-
 .../util}/complete/JLineCompletion.scala | 2 +-
 .../{util/internal => internal/util}/complete/Parser.scala | 6 +++---
 .../{util/internal => internal/util}/complete/Parsers.scala | 2 +-
 .../internal => internal/util}/complete/ProcessError.scala | 2 +-
 .../util}/complete/TokenCompletions.scala | 2 +-
 .../internal => internal/util}/complete/TypeString.scala | 2 +-
 .../internal => internal/util}/complete/UpperBound.scala | 2 +-
 internal/util-complete/src/test/scala/ParserTest.scala | 2 +-
 .../src/test/scala/sbt/complete/FileExamplesTest.scala | 2 +-
 .../src/test/scala/sbt/complete/FixedSetExamplesTest.scala | 2 +-
 .../test/scala/sbt/complete/ParserWithExamplesTest.scala | 2 +-
 .../{util/internal => internal/util}/ErrorHandling.scala | 2 +-
 .../sbt/{util/internal => internal/util}/ExitHook.scala | 2 +-
 .../internal => internal/util}/MessageOnlyException.scala | 2 +-
 .../sbt/{util/internal => internal/util}/BasicLogger.scala | 2 +-
 .../{util/internal => internal/util}/BufferedLogger.scala | 2 +-
 .../{util/internal => internal/util}/ConsoleLogger.scala | 2 +-
 .../sbt/{util/internal => internal/util}/ConsoleOut.scala | 2 +-
 .../sbt/{util/internal => internal/util}/FilterLogger.scala | 2 +-
 .../sbt/{util/internal => internal/util}/FullLogger.scala | 2 +-
 .../{util/internal => internal/util}/GlobalLogging.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Level.scala | 2 +-
 .../sbt/{util/internal => internal/util}/LogEvent.scala | 2 +-
 .../scala/sbt/{util/internal => internal/util}/Logger.scala | 2 +-
 .../sbt/{util/internal => internal/util}/LoggerWriter.scala | 2 +-
 .../sbt/{util/internal => internal/util}/MainLogging.scala | 2 +-
 .../sbt/{util/internal => internal/util}/MultiLogger.scala | 2 +-
 .../sbt/{util/internal => internal/util}/StackTrace.scala | 2 +-
 internal/util-logging/src/test/scala/Escapes.scala | 2 +-
 internal/util-logging/src/test/scala/LogWriterTest.scala | 2 +-
 internal/util-logging/src/test/scala/TestLogger.scala | 2 +-
 .../sbt/{util/internal => internal/util}/logic/Logic.scala | 2 +-
 internal/util-logic/src/test/scala/sbt/logic/Test.scala | 2 +-
 .../sbt/{util/internal => internal/util}/Relation.scala | 2 +-
 internal/util-relation/src/test/scala/RelationTest.scala | 2 +-
 .../sbt/{util/internal => internal/util}/UnitSpec.scala | 2 +-
 .../sbt/{util/internal => internal/util}/ChangeReport.scala | 2 +-
 .../sbt/{util/internal => internal/util}/Tracked.scala | 2 +-
 project/Util.scala | 2 +-
 81 files changed, 84 insertions(+), 84 deletions(-)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/ContextUtil.scala (99%)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/Convert.scala (98%)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/Instance.scala (99%)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/KListBuilder.scala (99%)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/MixedBuilder.scala (95%)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/TupleBuilder.scala (99%)
 rename internal/util-appmacro/src/main/scala/sbt/{util/internal => internal/util}/appmacro/TupleNBuilder.scala (98%)
 rename internal/util-cache/src/main/scala/sbt/{util/internal => internal/util}/Cache.scala (99%)
 rename internal/util-cache/src/main/scala/sbt/{util/internal => internal/util}/CacheIO.scala (98%)
 rename internal/util-cache/src/main/scala/sbt/{util/internal => internal/util}/FileInfo.scala (99%)
 rename internal/util-cache/src/main/scala/sbt/{util/internal => internal/util}/SeparatedCache.scala (98%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/AList.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Attributes.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Classes.scala (97%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Dag.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/HList.scala (96%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/IDSet.scala (98%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/INode.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/KList.scala (98%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/PMap.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Param.scala (95%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Positions.scala (94%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Settings.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Show.scala (84%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/ShowLines.scala (92%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Signal.scala (99%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/TypeFunctions.scala (98%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Types.scala (88%)
 rename internal/util-collection/src/main/scala/sbt/{util/internal => internal/util}/Util.scala (98%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/LineReader.scala (98%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/Completions.scala (99%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/EditDistance.scala (97%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/ExampleSource.scala (99%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/History.scala (98%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/HistoryCommands.scala (98%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/JLineCompletion.scala (99%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/Parser.scala (99%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/Parsers.scala (99%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/ProcessError.scala (97%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/TokenCompletions.scala (98%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/TypeString.scala (99%)
 rename internal/util-complete/src/main/scala/sbt/{util/internal => internal/util}/complete/UpperBound.scala (98%)
 rename internal/util-control/src/main/scala/sbt/{util/internal => internal/util}/ErrorHandling.scala (97%)
 rename internal/util-control/src/main/scala/sbt/{util/internal => internal/util}/ExitHook.scala (96%)
 rename internal/util-control/src/main/scala/sbt/{util/internal => internal/util}/MessageOnlyException.scala (97%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/BasicLogger.scala (96%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/BufferedLogger.scala (99%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/ConsoleLogger.scala (99%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/ConsoleOut.scala (98%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/FilterLogger.scala (97%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/FullLogger.scala (97%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/GlobalLogging.scala (98%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/Level.scala (97%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/LogEvent.scala (95%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/Logger.scala (99%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/LoggerWriter.scala (98%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/MainLogging.scala (97%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/MultiLogger.scala (98%)
 rename internal/util-logging/src/main/scala/sbt/{util/internal => internal/util}/StackTrace.scala (98%)
 rename internal/util-logic/src/main/scala/sbt/{util/internal => internal/util}/logic/Logic.scala (99%)
 rename internal/util-relation/src/main/scala/sbt/{util/internal => internal/util}/Relation.scala (99%)
 rename internal/util-testing/src/main/scala/sbt/{util/internal => internal/util}/UnitSpec.scala (75%)
 rename internal/util-tracking/src/main/scala/sbt/{util/internal => internal/util}/ChangeReport.scala (99%)
 rename internal/util-tracking/src/main/scala/sbt/{util/internal => internal/util}/Tracked.scala (99%)

diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala
similarity index 99%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/ContextUtil.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala
index 4bdf92dc7..e5cd74270 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/ContextUtil.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import scala.reflect._
diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Convert.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala
similarity index 98%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Convert.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala
index ecfa0ea4f..1d0ebede1 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Convert.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import scala.reflect._
diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Instance.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala
similarity index 99%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Instance.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala
index 111fbc3ca..2eb6f6877 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/Instance.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import Classes.Applicative
diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala
similarity index 99%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/KListBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala
index 3f0a12e28..5d19f5b6c 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/KListBuilder.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import Types.Id
diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/MixedBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala
similarity index 95%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/MixedBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala
index c99610275..cc2897ae3 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/MixedBuilder.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import scala.reflect._
diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala
similarity index 99%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala
index ab87ead81..7ed352457 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleBuilder.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import Types.Id
diff --git a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala
similarity index 98%
rename from internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleNBuilder.scala
rename to internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala
index 7c582328e..c94a781f0 100644
--- a/internal/util-appmacro/src/main/scala/sbt/util/internal/appmacro/TupleNBuilder.scala
+++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 package appmacro
 
 import Types.Id
diff --git a/internal/util-cache/src/main/scala/sbt/util/internal/Cache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala
similarity index 99%
rename from internal/util-cache/src/main/scala/sbt/util/internal/Cache.scala
rename to internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala
index a42236c57..f441fbc20 100644
--- a/internal/util-cache/src/main/scala/sbt/util/internal/Cache.scala
+++ b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import sbinary.{ CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output => Out }
 import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream }
diff --git a/internal/util-cache/src/main/scala/sbt/util/internal/CacheIO.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala
similarity index 98%
rename from internal/util-cache/src/main/scala/sbt/util/internal/CacheIO.scala
rename to internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala
index 7c0ab222d..95c00f47a 100644
--- a/internal/util-cache/src/main/scala/sbt/util/internal/CacheIO.scala
+++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import java.io.{ File, FileNotFoundException }
 import sbinary.{ DefaultProtocol, Format, Operations }
diff --git a/internal/util-cache/src/main/scala/sbt/util/internal/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala
similarity index 99%
rename from internal/util-cache/src/main/scala/sbt/util/internal/FileInfo.scala
rename to internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala
index 92ed6c8b6..923f13189 100644
--- a/internal/util-cache/src/main/scala/sbt/util/internal/FileInfo.scala
+++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import java.io.{ File, IOException }
 import sbinary.{ DefaultProtocol, Format }
diff --git a/internal/util-cache/src/main/scala/sbt/util/internal/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala
similarity index 98%
rename from internal/util-cache/src/main/scala/sbt/util/internal/SeparatedCache.scala
rename to internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala
index 03fd2c2e5..a68e46083 100644
--- a/internal/util-cache/src/main/scala/sbt/util/internal/SeparatedCache.scala
+++ b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2009 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import Types.:+:
 import sbinary.{ DefaultProtocol, Format, Input, Output => Out }
diff --git a/internal/util-cache/src/test/scala/CacheTest.scala b/internal/util-cache/src/test/scala/CacheTest.scala
index d204d1f5e..569b0bf24 100644
--- a/internal/util-cache/src/test/scala/CacheTest.scala
+++ b/internal/util-cache/src/test/scala/CacheTest.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 import java.io.File
 import Types.:+:
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/AList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/AList.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/AList.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/AList.scala
index e825b385c..3247e9a8a 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/AList.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/AList.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 import Classes.Applicative
 import Types._
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Attributes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Attributes.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala
index fb4bb8eaa..817896567 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Attributes.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import Types._
 import scala.reflect.Manifest
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Classes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala
similarity index 97%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Classes.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala
index 678eb0651..b44cb8606 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Classes.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 object Classes {
   trait Applicative[M[_]] {
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Dag.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Dag.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala
index d41730e88..3a6e1d414 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Dag.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 David MacIver, Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 trait Dag[Node <: Dag[Node]] {
   self: Node =>
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/HList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/HList.scala
similarity index 96%
rename from internal/util-collection/src/main/scala/sbt/util/internal/HList.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/HList.scala
index 01cded498..37c19dfdc 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/HList.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/HList.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import Types._
 
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/IDSet.scala b/internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala
similarity index 98%
rename from internal/util-collection/src/main/scala/sbt/util/internal/IDSet.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala
index cefe13186..d7a9f7c1a 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/IDSet.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 /** A mutable set interface that uses object identity to test for set membership.*/
 trait IDSet[T] {
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/INode.scala b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/INode.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/INode.scala
index 3e2310f2f..d7a15eee8 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/INode.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 import java.lang.Runnable
 import java.util.concurrent.{ atomic, Executor, LinkedBlockingQueue }
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/KList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala
similarity index 98%
rename from internal/util-collection/src/main/scala/sbt/util/internal/KList.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/KList.scala
index 1df8c4df5..5530ba0bc 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/KList.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 import Types._
 import Classes.Applicative
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/PMap.scala b/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/PMap.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala
index f9ac4b0a0..a62755544 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/PMap.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import collection.mutable
 
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Param.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala
similarity index 95%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Param.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Param.scala
index 16bd17e49..08a58c837 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Param.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import Types._
 
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Positions.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala
similarity index 94%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Positions.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala
index fd64e4538..a11ae9c24 100755
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Positions.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 sealed trait SourcePosition
 
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Settings.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala
index 9e6cdf95e..f742a778c 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Settings.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2011 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 import Types._
 
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Show.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Show.scala
similarity index 84%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Show.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Show.scala
index 930431e13..4a0343ed7 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Show.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Show.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 trait Show[T] {
   def apply(t: T): String
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/ShowLines.scala b/internal/util-collection/src/main/scala/sbt/internal/util/ShowLines.scala
similarity index 92%
rename from internal/util-collection/src/main/scala/sbt/util/internal/ShowLines.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/ShowLines.scala
index 794f3d5cb..f99a1394c 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/ShowLines.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/ShowLines.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 trait ShowLines[A] {
   def showLines(a: A): Seq[String]
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Signal.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala
similarity index 99%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Signal.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala
index f8e3ed17d..8631fc75b 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Signal.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala
@@ -1,4 +1,4 @@
-package sbt.util.internal
+package sbt.internal.util
 
 object Signals {
   val CONT = "CONT"
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/TypeFunctions.scala b/internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala
similarity index 98%
rename from internal/util-collection/src/main/scala/sbt/util/internal/TypeFunctions.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala
index ed1669f35..b7aac6360 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/TypeFunctions.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
  */
-package sbt.util.internal
+package sbt.internal.util
 
 trait TypeFunctions {
   type Id[X] = X
diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Types.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Types.scala
similarity index 88%
rename from internal/util-collection/src/main/scala/sbt/util/internal/Types.scala
rename to internal/util-collection/src/main/scala/sbt/internal/util/Types.scala
index 972f12769..9b6eb0733 100644
--- a/internal/util-collection/src/main/scala/sbt/util/internal/Types.scala
+++ b/internal/util-collection/src/main/scala/sbt/internal/util/Types.scala
@@ -1,7 +1,7 @@
 /* sbt -- Simple Build Tool
  * Copyright 2010 Mark Harrah
 */
-package sbt.util.internal
+package
sbt.internal.util object Types extends Types diff --git a/internal/util-collection/src/main/scala/sbt/util/internal/Util.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala similarity index 98% rename from internal/util-collection/src/main/scala/sbt/util/internal/Util.scala rename to internal/util-collection/src/main/scala/sbt/internal/util/Util.scala index ba158ae57..4f82cae9c 100644 --- a/internal/util-collection/src/main/scala/sbt/util/internal/Util.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import java.util.Locale diff --git a/internal/util-collection/src/test/scala/DagSpecification.scala b/internal/util-collection/src/test/scala/DagSpecification.scala index 47c1b802b..9e5025488 100644 --- a/internal/util-collection/src/test/scala/DagSpecification.scala +++ b/internal/util-collection/src/test/scala/DagSpecification.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import org.scalacheck._ import Prop._ diff --git a/internal/util-collection/src/test/scala/KeyTest.scala b/internal/util-collection/src/test/scala/KeyTest.scala index 8bdcd76bf..461655b88 100644 --- a/internal/util-collection/src/test/scala/KeyTest.scala +++ b/internal/util-collection/src/test/scala/KeyTest.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util import org.scalacheck._ import Prop._ diff --git a/internal/util-collection/src/test/scala/LiteralTest.scala b/internal/util-collection/src/test/scala/LiteralTest.scala index 32da58a7b..b50d02632 100644 --- a/internal/util-collection/src/test/scala/LiteralTest.scala +++ b/internal/util-collection/src/test/scala/LiteralTest.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package 
sbt.internal.util import Types._ diff --git a/internal/util-collection/src/test/scala/PMapTest.scala b/internal/util-collection/src/test/scala/PMapTest.scala index e8aa2c6fa..9e1dde2c9 100644 --- a/internal/util-collection/src/test/scala/PMapTest.scala +++ b/internal/util-collection/src/test/scala/PMapTest.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import Types._ diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index 96a8e01f6..f7b2f2cf0 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util /** Define our settings system */ diff --git a/internal/util-collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala index 3321c644b..8b77dba16 100644 --- a/internal/util-collection/src/test/scala/SettingsTest.scala +++ b/internal/util-collection/src/test/scala/SettingsTest.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util import org.scalacheck._ import Prop._ diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/util/internal/LineReader.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index c01986a0f..18cc431d7 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import jline.console.ConsoleReader import 
jline.console.history.{ FileHistory, MemoryHistory } @@ -128,7 +128,7 @@ final class FullReader(historyPath: Option[File], complete: Parser[_], val handl protected[this] val reader = { val cr = JLine.createReader(historyPath) - sbt.util.internal.complete.JLineCompletion.installCustomCompletor(cr, complete) + sbt.internal.util.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/Completions.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/Completions.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala index b3a562dc7..c035f3620 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/Completions.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete /** diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/EditDistance.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala similarity index 97% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/EditDistance.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala index 873d9035c..8cb617348 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/EditDistance.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete import java.lang.Character.{ toLowerCase => lower } diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/ExampleSource.scala 
b/internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/ExampleSource.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala index 932951e34..6539554a6 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/ExampleSource.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete import java.io.File diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/History.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/History.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala index bd925cd17..350c58dfa 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/History.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete import History.number diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/HistoryCommands.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala index 662b914c0..f74d4e448 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala @@ 
-1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete import java.io.File diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/JLineCompletion.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/JLineCompletion.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala index c674e21db..e098f59f6 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/JLineCompletion.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete import jline.console.ConsoleReader diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/Parser.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index 659f20e7b..a41c0d7d2 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -1,12 +1,12 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2010, 2011 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete import Parser._ -import sbt.util.internal.Types.{ left, right, some } -import sbt.util.internal.Util.{ makeList, separate } +import sbt.internal.util.Types.{ left, right, some } +import sbt.internal.util.Util.{ makeList, separate } /** * A String parser that provides semi-automatic tab 
completion. diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parsers.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/Parsers.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala index e0b81884f..9463d1acb 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/Parsers.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete import Parser._ diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/ProcessError.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala similarity index 97% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/ProcessError.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala index 2faa17634..6d74ed2d2 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/ProcessError.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete object ProcessError { diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/TokenCompletions.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/TokenCompletions.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala index 6d3e5100f..0d0b2980e 100644 --- 
a/internal/util-complete/src/main/scala/sbt/util/internal/complete/TokenCompletions.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete import Completion.{ token => ctoken, tokenDisplay } diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala similarity index 99% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/TypeString.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala index 1ac092023..9a308a2bf 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/TypeString.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete import DefaultParsers._ diff --git a/internal/util-complete/src/main/scala/sbt/util/internal/complete/UpperBound.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala similarity index 98% rename from internal/util-complete/src/main/scala/sbt/util/internal/complete/UpperBound.scala rename to internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala index cd42e688b..6b600f9ed 100644 --- a/internal/util-complete/src/main/scala/sbt/util/internal/complete/UpperBound.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008,2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util package complete sealed trait UpperBound { diff --git a/internal/util-complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala index c1bd8c45c..1db99b513 100644 --- 
a/internal/util-complete/src/test/scala/ParserTest.scala +++ b/internal/util-complete/src/test/scala/ParserTest.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete object JLineTest { diff --git a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala index 2f873cf6c..2af9388a7 100644 --- a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete import java.io.File diff --git a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala index db1ebe67f..b043497db 100644 --- a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete class FixedSetExamplesTest extends UnitSpec { diff --git a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala index fc7c0ae90..684cbe403 100644 --- a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package complete import Completion._ diff --git a/internal/util-control/src/main/scala/sbt/util/internal/ErrorHandling.scala b/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala similarity index 97% rename from internal/util-control/src/main/scala/sbt/util/internal/ErrorHandling.scala rename to 
internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala index 6a93451cf..ae0d5443e 100644 --- a/internal/util-control/src/main/scala/sbt/util/internal/ErrorHandling.scala +++ b/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import java.io.IOException diff --git a/internal/util-control/src/main/scala/sbt/util/internal/ExitHook.scala b/internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala similarity index 96% rename from internal/util-control/src/main/scala/sbt/util/internal/ExitHook.scala rename to internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala index 80b2b9a75..823c64b01 100644 --- a/internal/util-control/src/main/scala/sbt/util/internal/ExitHook.scala +++ b/internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util /** Defines a function to call as sbt exits.*/ trait ExitHook { diff --git a/internal/util-control/src/main/scala/sbt/util/internal/MessageOnlyException.scala b/internal/util-control/src/main/scala/sbt/internal/util/MessageOnlyException.scala similarity index 97% rename from internal/util-control/src/main/scala/sbt/util/internal/MessageOnlyException.scala rename to internal/util-control/src/main/scala/sbt/internal/util/MessageOnlyException.scala index 425928a11..32c16ee4d 100644 --- a/internal/util-control/src/main/scala/sbt/util/internal/MessageOnlyException.scala +++ b/internal/util-control/src/main/scala/sbt/internal/util/MessageOnlyException.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2011 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util final class MessageOnlyException(override val toString: String) extends RuntimeException(toString) diff --git 
a/internal/util-logging/src/main/scala/sbt/util/internal/BasicLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala similarity index 96% rename from internal/util-logging/src/main/scala/sbt/util/internal/BasicLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala index b9a2913e2..9dd58e48e 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/BasicLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util /** Implements the level-setting methods of Logger.*/ abstract class BasicLogger extends AbstractLogger { diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala similarity index 99% rename from internal/util-logging/src/main/scala/sbt/util/internal/BufferedLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index a1aaa002a..4877de61f 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import scala.collection.mutable.ListBuffer diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala similarity index 99% rename from internal/util-logging/src/main/scala/sbt/util/internal/ConsoleLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala index ae9e2c670..5be85eb24 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleLogger.scala 
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010, 2011 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import java.io.{ BufferedWriter, PrintStream, PrintWriter } import java.util.Locale diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala similarity index 98% rename from internal/util-logging/src/main/scala/sbt/util/internal/ConsoleOut.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index 4c20ee329..30da238da 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util import java.io.{ BufferedWriter, PrintStream, PrintWriter } diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/FilterLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala similarity index 97% rename from internal/util-logging/src/main/scala/sbt/util/internal/FilterLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala index d0f503495..4d5111c44 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/FilterLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util /** * A filter logger is used to delegate messages but not the logging level to another logger. 
This means diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/FullLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala similarity index 97% rename from internal/util-logging/src/main/scala/sbt/util/internal/FullLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala index 64066abdb..460012e10 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/FullLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util /** Promotes the simple Logger interface to the full AbstractLogger interface. */ class FullLogger(delegate: Logger) extends BasicLogger { diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/GlobalLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala similarity index 98% rename from internal/util-logging/src/main/scala/sbt/util/internal/GlobalLogging.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala index df9b5e0e3..d58c72eae 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/GlobalLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import java.io.{ File, PrintWriter } diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/Level.scala b/internal/util-logging/src/main/scala/sbt/internal/util/Level.scala similarity index 97% rename from internal/util-logging/src/main/scala/sbt/util/internal/Level.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/Level.scala index 5a9d3e717..a12e53fee 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/Level.scala +++ 
b/internal/util-logging/src/main/scala/sbt/internal/util/Level.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util /** * An enumeration defining the levels available for logging. A level includes all of the levels diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/LogEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/LogEvent.scala similarity index 95% rename from internal/util-logging/src/main/scala/sbt/util/internal/LogEvent.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/LogEvent.scala index a03c306b1..70a4d5031 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/LogEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/LogEvent.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util sealed trait LogEvent extends NotNull final class Success(val msg: String) extends LogEvent diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/Logger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/Logger.scala similarity index 99% rename from internal/util-logging/src/main/scala/sbt/util/internal/Logger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/Logger.scala index a3daea79a..bcd0963c2 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/Logger.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import xsbti.{ Logger => xLogger, F0 } import xsbti.{ Maybe, Position, Problem, Severity } diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/LoggerWriter.scala b/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala similarity index 98% rename from 
internal/util-logging/src/main/scala/sbt/util/internal/LoggerWriter.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala index b616a045b..eea18b15b 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/LoggerWriter.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util /** * Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala similarity index 97% rename from internal/util-logging/src/main/scala/sbt/util/internal/MainLogging.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index 9a1240fc6..95402b4ff 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util import java.io.PrintWriter diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala similarity index 98% rename from internal/util-logging/src/main/scala/sbt/util/internal/MultiLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index add4a8761..cb5ff12d6 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -2,7 +2,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util // note that setting the logging level on this logger has no effect on its 
behavior, only // on the behavior of the delegates. diff --git a/internal/util-logging/src/main/scala/sbt/util/internal/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala similarity index 98% rename from internal/util-logging/src/main/scala/sbt/util/internal/StackTrace.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala index 595529d49..af16e35d4 100644 --- a/internal/util-logging/src/main/scala/sbt/util/internal/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Tony Sloane */ -package sbt.util.internal +package sbt.internal.util object StackTrace { def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt") diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala index bf2e0bdd3..153a5da56 100644 --- a/internal/util-logging/src/test/scala/Escapes.scala +++ b/internal/util-logging/src/test/scala/Escapes.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util import org.scalacheck._ import Prop._ diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala index 5f73cdd2d..156f45455 100644 --- a/internal/util-logging/src/test/scala/LogWriterTest.scala +++ b/internal/util-logging/src/test/scala/LogWriterTest.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import org.scalacheck._ import Arbitrary.{ arbitrary => arb, _ } diff --git a/internal/util-logging/src/test/scala/TestLogger.scala b/internal/util-logging/src/test/scala/TestLogger.scala index c66f358ea..74dcbb448 100644 --- a/internal/util-logging/src/test/scala/TestLogger.scala +++ b/internal/util-logging/src/test/scala/TestLogger.scala @@ -1,4 +1,4 @@ -package sbt.util.internal 
+package sbt.internal.util object TestLogger { def apply[T](f: Logger => T): T = diff --git a/internal/util-logic/src/main/scala/sbt/util/internal/logic/Logic.scala b/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala similarity index 99% rename from internal/util-logic/src/main/scala/sbt/util/internal/logic/Logic.scala rename to internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala index 3e6f8ba8c..795423c54 100644 --- a/internal/util-logic/src/main/scala/sbt/util/internal/logic/Logic.scala +++ b/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package logic import scala.annotation.tailrec diff --git a/internal/util-logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala index f6170f1e3..59b40c34b 100644 --- a/internal/util-logic/src/test/scala/sbt/logic/Test.scala +++ b/internal/util-logic/src/test/scala/sbt/logic/Test.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util package logic import org.scalacheck._ diff --git a/internal/util-relation/src/main/scala/sbt/util/internal/Relation.scala b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala similarity index 99% rename from internal/util-relation/src/main/scala/sbt/util/internal/Relation.scala rename to internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala index 74b24629d..788f39362 100644 --- a/internal/util-relation/src/main/scala/sbt/util/internal/Relation.scala +++ b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import Relation._ diff --git a/internal/util-relation/src/test/scala/RelationTest.scala b/internal/util-relation/src/test/scala/RelationTest.scala index 071c79d52..31f68e0c3 100644 --- 
a/internal/util-relation/src/test/scala/RelationTest.scala +++ b/internal/util-relation/src/test/scala/RelationTest.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import org.scalacheck._ import Prop._ diff --git a/internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala b/internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala similarity index 75% rename from internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala rename to internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala index 83f17298e..99ad43c2d 100644 --- a/internal/util-testing/src/main/scala/sbt/util/internal/UnitSpec.scala +++ b/internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala @@ -1,4 +1,4 @@ -package sbt.util.internal +package sbt.internal.util import org.scalatest._ diff --git a/internal/util-tracking/src/main/scala/sbt/util/internal/ChangeReport.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala similarity index 99% rename from internal/util-tracking/src/main/scala/sbt/util/internal/ChangeReport.scala rename to internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala index a2561d723..10afbea6f 100644 --- a/internal/util-tracking/src/main/scala/sbt/util/internal/ChangeReport.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util object ChangeReport { def modified[T](files: Set[T]): ChangeReport[T] = diff --git a/internal/util-tracking/src/main/scala/sbt/util/internal/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala similarity index 99% rename from internal/util-tracking/src/main/scala/sbt/util/internal/Tracked.scala rename to 
internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index 58361e116..ae3e060a8 100644 --- a/internal/util-tracking/src/main/scala/sbt/util/internal/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package sbt.util.internal +package sbt.internal.util import java.io.{ File, IOException } import CacheIO.{ fromFile, toFile } diff --git a/project/Util.scala b/project/Util.scala index 406c62723..abaa4849e 100644 --- a/project/Util.scala +++ b/project/Util.scala @@ -20,7 +20,7 @@ object Util { { val init = keywords.map(tn => '"' + tn + '"').mkString("Set(", ", ", ")") val ObjectName = "ScalaKeywords" - val PackageName = "sbt.util.internal" + val PackageName = "sbt.internal.util" val keywordsSrc = """package %s object %s { From cf6e656a672435543182f336822fe48c8ab68fb5 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 5 Sep 2015 00:58:48 -0400 Subject: [PATCH 543/823] Add utilTesting to root --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index cae1353d2..bfef5d621 100644 --- a/build.sbt +++ b/build.sbt @@ -37,7 +37,7 @@ lazy val utilRoot: Project = (project in file(".")). // configs(Sxr.sxrConf). aggregate( utilInterface, utilControl, utilCollection, utilApplyMacro, utilComplete, - utilLogging, utilRelation, utilLogic, utilCache, utilTracking + utilLogging, utilRelation, utilLogic, utilCache, utilTracking, utilTesting ). 
settings( inThisBuild(Seq( From 10fa9a47099d3c1d3202b2bbe558cdc3dd494dc9 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 5 Sep 2015 01:42:34 -0400 Subject: [PATCH 544/823] Using doge to release correctly --- CONTRIBUTING.md | 3 +++ build.sbt | 19 +++++++++++++++---- project/Util.scala | 4 +++- project/doge.sbt | 1 + 4 files changed, 22 insertions(+), 5 deletions(-) create mode 100644 CONTRIBUTING.md create mode 100644 project/doge.sbt diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 000000000..994e17a23 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,3 @@ +``` +$ sbt release +``` diff --git a/build.sbt b/build.sbt index bfef5d621..20b5153f6 100644 --- a/build.sbt +++ b/build.sbt @@ -34,7 +34,6 @@ def commonSettings: Seq[Setting[_]] = Seq( ) lazy val utilRoot: Project = (project in file(".")). - // configs(Sxr.sxrConf). aggregate( utilInterface, utilControl, utilCollection, utilApplyMacro, utilComplete, utilLogging, utilRelation, utilLogic, utilCache, utilTracking, utilTesting @@ -62,7 +61,10 @@ lazy val utilRoot: Project = (project in file(".")). name := "Util Root", publish := {}, publishLocal := {}, - publishArtifact := false + publishArtifact in Compile := false, + publishArtifact in Test := false, + publishArtifact := false, + customCommands ) // defines Java structures used across Scala versions, such as the API structures and relationships extracted by @@ -104,8 +106,7 @@ lazy val utilComplete = (project in internalPath / "util-complete"). settings( commonSettings, name := "Util Completion", - libraryDependencies ++= Seq(jline, sbtIO), - crossScalaVersions := Seq(scala210, scala211) + libraryDependencies ++= Seq(jline, sbtIO) ) // logging @@ -159,3 +160,13 @@ lazy val utilTesting = (project in internalPath / "util-testing"). 
name := "Util Testing", libraryDependencies ++= Seq(scalaCheck, scalatest) ) + +def customCommands: Seq[Setting[_]] = Seq( + commands += Command.command("release") { state => + // "clean" :: + "so compile" :: + "so publishSigned" :: + "reload" :: + state + } +) diff --git a/project/Util.scala b/project/Util.scala index abaa4849e..cfe97dd1c 100644 --- a/project/Util.scala +++ b/project/Util.scala @@ -7,8 +7,10 @@ object Util { lazy val generateKeywords = TaskKey[File]("generateKeywords") lazy val javaOnlySettings = Seq[Setting[_]]( + crossPaths := false, compileOrder := CompileOrder.JavaThenScala, - unmanagedSourceDirectories in Compile <<= Seq(javaSource in Compile).join + unmanagedSourceDirectories in Compile <<= Seq(javaSource in Compile).join, + crossScalaVersions := Seq(Dependencies.scala211) ) def getScalaKeywords: Set[String] = diff --git a/project/doge.sbt b/project/doge.sbt new file mode 100644 index 000000000..fedea9490 --- /dev/null +++ b/project/doge.sbt @@ -0,0 +1 @@ +addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.3") From e6711a4e914686c07105e03c846b08288f420a16 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 7 Sep 2015 01:11:13 -0400 Subject: [PATCH 545/823] bump up 1.0.0-M3 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 95b592d4d..881899bb1 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,7 +5,7 @@ object Dependencies { lazy val scala210 = "2.10.5" lazy val scala211 = "2.11.7" - lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M1" + lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M3" lazy val jline = "jline" % "jline" % "2.11" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val sbinary = "org.scala-tools.sbinary" %% "sbinary" % "0.4.2" From dc0fd2d48bb36bab030c437165e6618cf653a058 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 7 Sep 2015 01:31:17 -0400 Subject: 
[PATCH 546/823] move sbt.internal.util.Logger into sbt.util --- .../scala/sbt/internal/util/BasicLogger.scala | 12 +-- .../sbt/internal/util/BufferedLogger.scala | 1 + .../sbt/internal/util/ConsoleLogger.scala | 1 + .../scala/sbt/internal/util/ConsoleOut.scala | 1 + .../sbt/internal/util/FilterLogger.scala | 2 + .../scala/sbt/internal/util/FullLogger.scala | 2 + .../sbt/internal/util/GlobalLogging.scala | 1 + .../sbt/internal/util/LoggerWriter.scala | 2 + .../scala/sbt/internal/util/MainLogging.scala | 1 + .../scala/sbt/internal/util/MultiLogger.scala | 2 + .../main/scala/sbt/util/AbtractLogger.scala | 28 +++++++ .../scala/sbt/{internal => }/util/Level.scala | 4 +- .../sbt/{internal => }/util/LogEvent.scala | 2 +- .../sbt/{internal => }/util/Logger.scala | 80 +++++++------------ .../src/test/scala/LogWriterTest.scala | 1 + .../src/test/scala/TestLogger.scala | 2 + 16 files changed, 81 insertions(+), 61 deletions(-) create mode 100644 internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala rename internal/util-logging/src/main/scala/sbt/{internal => }/util/Level.scala (97%) rename internal/util-logging/src/main/scala/sbt/{internal => }/util/LogEvent.scala (95%) rename internal/util-logging/src/main/scala/sbt/{internal => }/util/Logger.scala (80%) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala index 9dd58e48e..1838822c5 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BasicLogger.scala @@ -3,15 +3,17 @@ */ package sbt.internal.util +import sbt.util._ + /** Implements the level-setting methods of Logger.*/ abstract class BasicLogger extends AbstractLogger { - private var traceEnabledVar = java.lang.Integer.MAX_VALUE + private var traceEnabledVar: Int = java.lang.Integer.MAX_VALUE private var level: Level.Value = Level.Info private var 
successEnabledVar = true - def successEnabled = synchronized { successEnabledVar } + def successEnabled: Boolean = synchronized { successEnabledVar } def setSuccessEnabled(flag: Boolean): Unit = synchronized { successEnabledVar = flag } - def getLevel = synchronized { level } + def getLevel: Level.Value = synchronized { level } def setLevel(newLevel: Level.Value): Unit = synchronized { level = newLevel } def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level } - def getTrace = synchronized { traceEnabledVar } -} \ No newline at end of file + def getTrace: Int = synchronized { traceEnabledVar } +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index 4877de61f..d1f03cc72 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -3,6 +3,7 @@ */ package sbt.internal.util +import sbt.util._ import scala.collection.mutable.ListBuffer /** diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala index 5be85eb24..5ca3fe9ee 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala @@ -3,6 +3,7 @@ */ package sbt.internal.util +import sbt.util._ import java.io.{ BufferedWriter, PrintStream, PrintWriter } import java.util.Locale diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index 30da238da..cffec8781 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -1,5 +1,6 @@ package sbt.internal.util 
+import sbt.util._ import java.io.{ BufferedWriter, PrintStream, PrintWriter } sealed trait ConsoleOut { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala index 4d5111c44..d52901c7b 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala @@ -3,6 +3,8 @@ */ package sbt.internal.util +import sbt.util._ + /** * A filter logger is used to delegate messages but not the logging level to another logger. This means * that messages are logged at the higher of the two levels set by this logger and its delegate. diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala index 460012e10..1493e2d0f 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala @@ -3,6 +3,8 @@ */ package sbt.internal.util +import sbt.util._ + /** Promotes the simple Logger interface to the full AbstractLogger interface. 
*/ class FullLogger(delegate: Logger) extends BasicLogger { override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala index d58c72eae..191408393 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala @@ -3,6 +3,7 @@ */ package sbt.internal.util +import sbt.util._ import java.io.{ File, PrintWriter } /** diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala b/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala index eea18b15b..7b440c200 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala @@ -3,6 +3,8 @@ */ package sbt.internal.util +import sbt.util._ + /** * Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`. * A line is delimited by `nl`, which is by default the platform line separator. 
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index 95402b4ff..aeab7e5cd 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -1,5 +1,6 @@ package sbt.internal.util +import sbt.util._ import java.io.PrintWriter object MainLogging { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index cb5ff12d6..84168fd27 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -4,6 +4,8 @@ */ package sbt.internal.util +import sbt.util._ + // note that setting the logging level on this logger has no effect on its behavior, only // on the behavior of the delegates. class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { diff --git a/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala b/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala new file mode 100644 index 000000000..51b7f08b5 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala @@ -0,0 +1,28 @@ +package sbt.util + +abstract class AbstractLogger extends Logger { + def getLevel: Level.Value + def setLevel(newLevel: Level.Value): Unit + def setTrace(flag: Int): Unit + def getTrace: Int + final def traceEnabled: Boolean = getTrace >= 0 + def successEnabled: Boolean + def setSuccessEnabled(flag: Boolean): Unit + + def atLevel(level: Level.Value): Boolean = level.id >= getLevel.id + def control(event: ControlEvent.Value, message: => String): Unit + + def logAll(events: Seq[LogEvent]): Unit + /** Defined in terms of other methods in Logger and should not be called from them. 
*/ + final def log(event: LogEvent): Unit = { + event match { + case s: Success => success(s.msg) + case l: Log => log(l.level, l.msg) + case t: Trace => trace(t.exception) + case setL: SetLevel => setLevel(setL.newLevel) + case setT: SetTrace => setTrace(setT.level) + case setS: SetSuccess => setSuccessEnabled(setS.enabled) + case c: ControlEvent => control(c.event, c.msg) + } + } +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/Level.scala b/internal/util-logging/src/main/scala/sbt/util/Level.scala similarity index 97% rename from internal/util-logging/src/main/scala/sbt/internal/util/Level.scala rename to internal/util-logging/src/main/scala/sbt/util/Level.scala index a12e53fee..2f319cffd 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/Level.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Level.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ -package sbt.internal.util +package sbt.util /** * An enumeration defining the levels available for logging. A level includes all of the levels @@ -25,4 +25,4 @@ object Level extends Enumeration { def apply(s: String) = values.find(s == _.toString) /** Same as apply, defined for use in pattern matching. 
*/ private[sbt] def unapply(s: String) = apply(s) -} \ No newline at end of file +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/LogEvent.scala b/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala similarity index 95% rename from internal/util-logging/src/main/scala/sbt/internal/util/LogEvent.scala rename to internal/util-logging/src/main/scala/sbt/util/LogEvent.scala index 70a4d5031..b6225896f 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/LogEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009 Mark Harrah */ -package sbt.internal.util +package sbt.util sealed trait LogEvent extends NotNull final class Success(val msg: String) extends LogEvent diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala similarity index 80% rename from internal/util-logging/src/main/scala/sbt/internal/util/Logger.scala rename to internal/util-logging/src/main/scala/sbt/util/Logger.scala index bcd0963c2..17e19f902 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -1,39 +1,42 @@ /* sbt -- Simple Build Tool * Copyright 2008, 2009, 2010 Mark Harrah */ -package sbt.internal.util +package sbt.util import xsbti.{ Logger => xLogger, F0 } import xsbti.{ Maybe, Position, Problem, Severity } import sys.process.ProcessLogger +import sbt.internal.util.{ BufferedLogger, FullLogger } import java.io.File -abstract class AbstractLogger extends Logger { - def getLevel: Level.Value - def setLevel(newLevel: Level.Value): Unit - def setTrace(flag: Int): Unit - def getTrace: Int - final def traceEnabled: Boolean = getTrace >= 0 - def successEnabled: Boolean - def setSuccessEnabled(flag: Boolean): Unit +/** + * This is intended to be the simplest logging interface for use by code 
that wants to log. + * It does not include configuring the logger. + */ +abstract class Logger extends xLogger { + final def verbose(message: => String): Unit = debug(message) + final def debug(message: => String): Unit = log(Level.Debug, message) + final def info(message: => String): Unit = log(Level.Info, message) + final def warn(message: => String): Unit = log(Level.Warn, message) + final def error(message: => String): Unit = log(Level.Error, message) + // Added by sys.process.ProcessLogger + final def err(message: => String): Unit = log(Level.Error, message) + // sys.process.ProcessLogger + final def out(message: => String): Unit = log(Level.Info, message) - def atLevel(level: Level.Value): Boolean = level.id >= getLevel.id - def control(event: ControlEvent.Value, message: => String): Unit + def ansiCodesSupported: Boolean = false - def logAll(events: Seq[LogEvent]): Unit - /** Defined in terms of other methods in Logger and should not be called from them. */ - final def log(event: LogEvent): Unit = { - event match { - case s: Success => success(s.msg) - case l: Log => log(l.level, l.msg) - case t: Trace => trace(t.exception) - case setL: SetLevel => setLevel(setL.newLevel) - case setT: SetTrace => setTrace(setT.level) - case setS: SetSuccess => setSuccessEnabled(setS.enabled) - case c: ControlEvent => control(c.event, c.msg) - } - } + def trace(t: => Throwable): Unit + def success(message: => String): Unit + def log(level: Level.Value, message: => String): Unit + + def debug(msg: F0[String]): Unit = log(Level.Debug, msg) + def warn(msg: F0[String]): Unit = log(Level.Warn, msg) + def info(msg: F0[String]): Unit = log(Level.Info, msg) + def error(msg: F0[String]): Unit = log(Level.Error, msg) + def trace(msg: F0[Throwable]): Unit = trace(msg.apply) + def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) } object Logger { @@ -107,32 +110,3 @@ object Logger { override def toString = s"[$severity] $pos: $message" } } - -/** - * This is 
intended to be the simplest logging interface for use by code that wants to log. - * It does not include configuring the logger. - */ -trait Logger extends xLogger { - final def verbose(message: => String): Unit = debug(message) - final def debug(message: => String): Unit = log(Level.Debug, message) - final def info(message: => String): Unit = log(Level.Info, message) - final def warn(message: => String): Unit = log(Level.Warn, message) - final def error(message: => String): Unit = log(Level.Error, message) - // Added by sys.process.ProcessLogger - final def err(message: => String): Unit = log(Level.Error, message) - // sys.process.ProcessLogger - final def out(message: => String): Unit = log(Level.Info, message) - - def ansiCodesSupported: Boolean = false - - def trace(t: => Throwable): Unit - def success(message: => String): Unit - def log(level: Level.Value, message: => String): Unit - - def debug(msg: F0[String]): Unit = log(Level.Debug, msg) - def warn(msg: F0[String]): Unit = log(Level.Warn, msg) - def info(msg: F0[String]): Unit = log(Level.Info, msg) - def error(msg: F0[String]): Unit = log(Level.Error, msg) - def trace(msg: F0[Throwable]): Unit = trace(msg.apply) - def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) -} diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala index 156f45455..1f33a3761 100644 --- a/internal/util-logging/src/test/scala/LogWriterTest.scala +++ b/internal/util-logging/src/test/scala/LogWriterTest.scala @@ -3,6 +3,7 @@ package sbt.internal.util +import sbt.util._ import org.scalacheck._ import Arbitrary.{ arbitrary => arb, _ } import Gen.{ listOfN, oneOf } diff --git a/internal/util-logging/src/test/scala/TestLogger.scala b/internal/util-logging/src/test/scala/TestLogger.scala index 74dcbb448..b9ddda148 100644 --- a/internal/util-logging/src/test/scala/TestLogger.scala +++ b/internal/util-logging/src/test/scala/TestLogger.scala @@ 
-1,5 +1,7 @@ package sbt.internal.util +import sbt.util._ + object TestLogger { def apply[T](f: Logger => T): T = { From 70b49e9a4be3218db4c136f1386846555e1a0cce Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 7 Sep 2015 01:45:39 -0400 Subject: [PATCH 547/823] Comment on SI-8450 --- internal/util-logging/src/test/scala/Escapes.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala index 153a5da56..9078f4d59 100644 --- a/internal/util-logging/src/test/scala/Escapes.scala +++ b/internal/util-logging/src/test/scala/Escapes.scala @@ -61,6 +61,7 @@ object Escapes extends Properties("Escapes") { final case class EscapeAndNot(escape: EscapeSequence, notEscape: String) { override def toString = s"EscapeAntNot(escape = [$escape], notEscape = [${notEscape.map(_.toInt)}])" } + // 2.10.5 warns on "implicit numeric widening" but it looks like a bug: https://issues.scala-lang.org/browse/SI-8450 final case class EscapeSequence(content: String, terminator: Char) { if (!content.isEmpty) { assert(content.tail.forall(c => !isEscapeTerminator(c)), "Escape sequence content contains an escape terminator: '" + content + "'") From aec925b57fecc734249af8f68172ff025ac3b598 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Thu, 10 Sep 2015 14:39:29 +0200 Subject: [PATCH 548/823] Make `sbt.internal.util.JLine` private to sbt package It was private to `sbt.internal.util`, but it is used in sbt's codebase. 
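The visibility widening in the patch below (from `private` to `private[sbt]`) relies on Scala's qualified access modifiers: `private[P]` exposes a member to everything inside package `P`, including subpackages. A minimal sketch of the idea, using hypothetical names (`JLineDemo`, `MainDemo`) rather than the actual sbt sources:

```scala
// Qualified access: private[sbt] is wider than plain private.
package sbt {
  package internal.util {
    // Visible to everything under the enclosing `sbt` package,
    // not only to code in `sbt.internal.util` itself.
    private[sbt] object JLineDemo {
      def usingTerminal: String = "visible within sbt.*"
    }
  }

  object MainDemo {
    // Compiles: sbt.MainDemo lives inside the `sbt` package, so it
    // can reference the private[sbt] object from a subpackage.
    def run(): String = internal.util.JLineDemo.usingTerminal
  }
}

// Outside the `sbt` package, a reference to
// sbt.internal.util.JLineDemo would fail to compile.
```

With plain `private`, only `sbt.internal.util` could see the object; the qualified form is what lets the rest of sbt's codebase keep using it while still hiding it from external consumers.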
--- .../src/main/scala/sbt/internal/util/LineReader.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index 18cc431d7..e72341a23 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -58,7 +58,7 @@ abstract class JLine extends LineReader { reader.flush() } } -private object JLine { +private[sbt] object JLine { private[this] val TerminalProperty = "jline.terminal" fixTerminalProperty() From c7c697bad109ff2f6f7e441b02c2967642a40c5f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Sep 2015 21:07:41 -0400 Subject: [PATCH 549/823] Add picklers to FileInfo --- build.sbt | 2 +- .../scala/sbt/internal/util/FileInfo.scala | 38 +++++++++++++++++-- 2 files changed, 35 insertions(+), 5 deletions(-) diff --git a/build.sbt b/build.sbt index 20b5153f6..d69762670 100644 --- a/build.sbt +++ b/build.sbt @@ -4,7 +4,7 @@ import Util._ def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( - scalaVersion := "2.10.5", + scalaVersion := scala211, // publishArtifact in packageDoc := false, resolvers += Resolver.typesafeIvyRepo("releases"), resolvers += Resolver.sonatypeRepo("snapshots"), diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index 923f13189..5f7461eae 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -8,31 +8,61 @@ import sbinary.{ DefaultProtocol, Format } import DefaultProtocol._ import scala.reflect.Manifest import sbt.io.Hash +import sbt.serialization._ sealed trait FileInfo extends NotNull { val file: File } +@directSubclasses(Array(classOf[FileHash], 
classOf[HashModifiedFileInfo])) sealed trait HashFileInfo extends FileInfo { val hash: List[Byte] } +object HashFileInfo { + implicit val pickler: Pickler[HashFileInfo] with Unpickler[HashFileInfo] = PicklerUnpickler.generate[HashFileInfo] +} +@directSubclasses(Array(classOf[FileModified], classOf[HashModifiedFileInfo])) sealed trait ModifiedFileInfo extends FileInfo { val lastModified: Long } +object ModifiedFileInfo { + implicit val pickler: Pickler[ModifiedFileInfo] with Unpickler[ModifiedFileInfo] = PicklerUnpickler.generate[ModifiedFileInfo] +} +@directSubclasses(Array(classOf[PlainFile])) sealed trait PlainFileInfo extends FileInfo { def exists: Boolean } +object PlainFileInfo { + implicit val pickler: Pickler[PlainFileInfo] with Unpickler[PlainFileInfo] = PicklerUnpickler.generate[PlainFileInfo] +} +@directSubclasses(Array(classOf[FileHashModified])) sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo +object HashModifiedFileInfo { + implicit val pickler: Pickler[HashModifiedFileInfo] with Unpickler[HashModifiedFileInfo] = PicklerUnpickler.generate[HashModifiedFileInfo] +} -private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo -private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo -private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo -private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo +private[sbt] final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo +private[sbt] object PlainFile { + implicit val pickler: Pickler[PlainFile] with Unpickler[PlainFile] = PicklerUnpickler.generate[PlainFile] +} +private[sbt] final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo +private[sbt] object FileHash { + implicit val pickler: Pickler[FileHash] with Unpickler[FileHash] = PicklerUnpickler.generate[FileHash] +} +private[sbt] final 
case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo +private[sbt] object FileModified { + implicit val pickler: Pickler[FileModified] with Unpickler[FileModified] = PicklerUnpickler.generate[FileModified] +} +private[sbt] final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo +private[sbt] object FileHashModified { + implicit val pickler: Pickler[FileHashModified] with Unpickler[FileHashModified] = PicklerUnpickler.generate[FileHashModified] +} object FileInfo { implicit def existsInputCache: InputCache[PlainFileInfo] = exists.infoInputCache implicit def modifiedInputCache: InputCache[ModifiedFileInfo] = lastModified.infoInputCache implicit def hashInputCache: InputCache[HashFileInfo] = hash.infoInputCache implicit def fullInputCache: InputCache[HashModifiedFileInfo] = full.infoInputCache + implicit val pickler: Pickler[FileInfo] with Unpickler[FileInfo] = PicklerUnpickler.generate[FileInfo] sealed trait Style { type F <: FileInfo From 79b90917ec480cddf7d0b7a61b63767ac6af9420 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Sep 2015 22:06:13 -0400 Subject: [PATCH 550/823] New house rules --- build.sbt | 20 ++++--------------- .../sbt/internal/util/appmacro/Instance.scala | 4 +++- .../main/scala/sbt/internal/util/Dag.scala | 5 ++--- .../src/test/scala/SettingsExample.scala | 3 ++- .../src/test/scala/SettingsTest.scala | 9 ++++++--- .../util/complete/HistoryCommands.scala | 6 +++--- .../sbt/internal/util/complete/Parser.scala | 9 ++++----- .../internal/util/complete/TypeString.scala | 3 ++- .../sbt/complete/ParserWithExamplesTest.scala | 18 +++++++++++------ .../scala/sbt/internal/util/logic/Logic.scala | 3 ++- .../src/test/scala/sbt/logic/Test.scala | 3 ++- project/house.sbt | 2 +- 12 files changed, 43 insertions(+), 42 deletions(-) diff --git a/build.sbt b/build.sbt index d69762670..364eed2ff 100644 --- a/build.sbt +++ b/build.sbt @@ -1,6 +1,7 @@ import Dependencies._ 
import Util._ +def baseVersion: String = "0.1.0-M4" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -27,8 +28,6 @@ def commonSettings: Seq[Setting[_]] = Seq( "-Ywarn-dead-code", "-Ywarn-numeric-widen", "-Ywarn-value-discard"), - bintrayPackage := (bintrayPackage in ThisBuild).value, - bintrayRepository := (bintrayRepository in ThisBuild).value, publishArtifact in Compile := true, publishArtifact in Test := true ) @@ -40,22 +39,11 @@ lazy val utilRoot: Project = (project in file(".")). ). settings( inThisBuild(Seq( - organization := "org.scala-sbt", - version := "0.1.0-SNAPSHOT", + git.baseVersion := baseVersion, + bintrayPackage := "util", homepage := Some(url("https://github.com/sbt/util")), description := "Util module for sbt", - licenses := List("BSD New" -> url("https://github.com/sbt/sbt/blob/0.13/LICENSE")), - scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), - developers := List( - Developer("harrah", "Mark Harrah", "@harrah", url("https://github.com/harrah")), - Developer("eed3si9n", "Eugene Yokota", "@eed3si9n", url("https://github.com/eed3si9n")), - Developer("jsuereth", "Josh Suereth", "@jsuereth", url("https://github.com/jsuereth")), - Developer("dwijnand", "Dale Wijnand", "@dwijnand", url("https://github.com/dwijnand")) - ), - bintrayReleaseOnPublish := false, - bintrayOrganization := Some("sbt"), - bintrayRepository := "maven-releases", - bintrayPackage := "util" + scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")) )), commonSettings, name := "Util Root", diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala index 2eb6f6877..aa8eafe27 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala @@ 
-77,7 +77,9 @@ object Instance { * this should be the argument wrapped in Right. */ def contImpl[T, N[_]](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type, N])( - implicit tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type]): c.Expr[i.M[N[T]]] = + implicit + tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type] + ): c.Expr[i.M[N[T]]] = { import c.universe.{ Apply => ApplyTree, _ } diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala index 3a6e1d414..5cad287da 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala @@ -56,9 +56,8 @@ object Dag { finished; } final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) - extends Exception("Cyclic reference involving " + - (if (complete) all.mkString("\n ", "\n ", "") else value) - ) { + extends Exception("Cyclic reference involving " + + (if (complete) all.mkString("\n ", "\n ", "") else value)) { def this(value: Any) = this(value, value :: Nil, false) override def toString = getMessage def ::(a: Any): Cyclic = diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index f7b2f2cf0..5dd408282 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -49,7 +49,8 @@ object SettingsUsage { val mySettings: Seq[Setting[_]] = Seq( setting(a3, value(3)), setting(b4, map(a4)(_ * 3)), - update(a5)(_ + 1)) + update(a5)(_ + 1) + ) // "compiles" and applies the settings. // This can be split into multiple steps to access intermediate results if desired. 
diff --git a/internal/util-collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala index 8b77dba16..85a3760ee 100644 --- a/internal/util-collection/src/test/scala/SettingsTest.scala +++ b/internal/util-collection/src/test/scala/SettingsTest.scala @@ -55,7 +55,8 @@ object SettingsTest extends Properties("settings") { List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 nextInit = if (scoped0 == chk) chk else (scoped0 zipWith chk) { (p, _) => p + 1 } - } yield derive(setting(scoped1, nextInit))).toSeq + } yield derive(setting(scoped1, nextInit)) + ).toSeq { // Note: This causes a cycle refernec error, quite frequently. @@ -95,7 +96,8 @@ object SettingsTest extends Properties("settings") { setting(b, value(6)), derive(setting(b, a)), setting(a, value(5)), - setting(b, value(8))) + setting(b, value(8)) + ) val ev = evaluate(settings) checkKey(a, Some(5), ev) && checkKey(b, Some(8), ev) } @@ -104,7 +106,8 @@ object SettingsTest extends Properties("settings") { setting(a, value(3)), setting(b, value(6)), derive(setting(b, a)), - setting(a, value(5))) + setting(a, value(5)) + ) val ev = evaluate(settings) checkKey(a, Some(5), ev) && checkKey(b, Some(5), ev) } diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala index f74d4e448..350a36610 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala @@ -34,7 +34,8 @@ object HistoryCommands { Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), Previous -> "Execute the nth command before this one", StartsWithString -> "Execute the most recent command starting with 'string'", - ContainsString -> "Execute the most recent command containing 'string'") + ContainsString -> "Execute 
the most recent command containing 'string'" + ) def helpString = "History commands:\n " + (descriptions.map { case (c, d) => c + " " + d }).mkString("\n ") def printHelp(): Unit = println(helpString) @@ -46,8 +47,7 @@ object HistoryCommands { val MaxLines = 500 lazy val num = token(NatBasic, "") lazy val last = Last ^^^ { execute(_.!!) } - lazy val list = ListCommands ~> (num ?? Int.MaxValue) map { show => - (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } + lazy val list = ListCommands ~> (num ?? Int.MaxValue) map { show => (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } } lazy val execStr = flag('?') ~ token(any.+.string, "") map { case (contains, str) => diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index a41c0d7d2..a6d5474aa 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -426,11 +426,10 @@ trait ParserMain { case _ => val ci = i + 1 if (ci >= s.length) - a.resultEmpty.toEither.left.map { msgs0 => - () => - val msgs = msgs0() - val nonEmpty = if (msgs.isEmpty) "Unexpected end of input" :: Nil else msgs - (nonEmpty, ci) + a.resultEmpty.toEither.left.map { msgs0 => () => + val msgs = msgs0() + val nonEmpty = if (msgs.isEmpty) "Unexpected end of input" :: Nil else msgs + (nonEmpty, ci) } else loop(ci, a derive s(ci)) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala index 9a308a2bf..e96dbad4f 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala @@ -62,7 +62,8 @@ private[sbt] object TypeString { val TypeMap = Map( "java.io.File" -> 
"File", "java.net.URL" -> "URL", - "java.net.URI" -> "URI") + "java.net.URI" -> "URI" + ) /** * A Parser that extracts basic structure from the string representation of a type from Manifest.toString. diff --git a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala index 684cbe403..17891be4f 100644 --- a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala @@ -18,7 +18,8 @@ class ParserWithExamplesTest extends UnitSpec { val _ = new ParserWithValidExamples { val validCompletions = Completions(Set( suggestion("blue"), - suggestion("red"))) + suggestion("red") + )) parserWithExamples.completions(0) shouldEqual validCompletions } } @@ -27,7 +28,8 @@ class ParserWithExamplesTest extends UnitSpec { "produce only valid examples that start with the character of the derivation" in { val _ = new ParserWithValidExamples { val derivedCompletions = Completions(Set( - suggestion("lue"))) + suggestion("lue") + )) parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions } } @@ -45,7 +47,8 @@ class ParserWithExamplesTest extends UnitSpec { val _ = new parserWithAllExamples { val derivedCompletions = Completions(Set( suggestion("lue"), - suggestion("lock"))) + suggestion("lock") + )) parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions } } @@ -56,9 +59,11 @@ class ParserWithExamplesTest extends UnitSpec { class parserWithAllExamples extends ParserExample(removeInvalidExamples = false) - case class ParserExample(examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), + case class ParserExample( + examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), maxNumberOfExamples: Int = 25, - removeInvalidExamples: Boolean) { + removeInvalidExamples: Boolean + ) { import DefaultParsers._ 
@@ -67,7 +72,8 @@ class ParserWithExamplesTest extends UnitSpec { colorParser, FixedSetExamples(examples), maxNumberOfExamples, - removeInvalidExamples) + removeInvalidExamples + ) } case class GrowableSourceOfExamples() extends Iterable[String] { diff --git a/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala b/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala index 795423c54..0e15fadf2 100644 --- a/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala +++ b/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala @@ -103,7 +103,8 @@ object Logic { checkAcyclic(clauses) problem.toLeft( - reduce0(clauses, initialFacts, Matched.empty)) + reduce0(clauses, initialFacts, Matched.empty) + ) } /** diff --git a/internal/util-logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala index 59b40c34b..91ded0e69 100644 --- a/internal/util-logic/src/test/scala/sbt/logic/Test.scala +++ b/internal/util-logic/src/test/scala/sbt/logic/Test.scala @@ -20,7 +20,8 @@ object LogicTest extends Properties("Logic") { case Right(res) => false case Left(err: Logic.CyclicNegation) => true case Left(err) => sys.error(s"Expected cyclic error, got: $err") - }) + } + ) def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = result match { case Left(err) => false diff --git a/project/house.sbt b/project/house.sbt index eefc29672..fede298d1 100644 --- a/project/house.sbt +++ b/project/house.sbt @@ -1 +1 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.1.0") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.2.1") From f1edeec3515d85741ae77f576bde7dfb5876b14d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 30 Sep 2015 21:59:20 -0400 Subject: [PATCH 551/823] Fixes sbt/util#14 by rolling back 6175d9233848b220ad8b68f63c90e9b844903f0d --- .../src/main/scala/sbt/internal/util/Settings.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff 
--git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala index f742a778c..6519287da 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala @@ -3,6 +3,8 @@ */ package sbt.internal.util +import scala.language.existentials + import Types._ sealed trait Settings[Scope] { @@ -445,7 +447,7 @@ trait Init[Scope] { def join: Initialize[Seq[T]] = uniform(s)(idFun) } def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = uniform(inits)(idFun) - def joinAny[M[_], T](inits: Seq[Initialize[M[T]]]): Initialize[Seq[M[_]]] = + def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]] = join(inits.asInstanceOf[Seq[Initialize[M[Any]]]]).asInstanceOf[Initialize[Seq[M[T] forSome { type T }]]] } object SettingsDefinition { From a0fb5e11fc0b2cd58898611b3437987e255c1b7e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 30 Sep 2015 22:00:58 -0400 Subject: [PATCH 552/823] 0.1.0-M5 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 364eed2ff..0ba5c9ab4 100644 --- a/build.sbt +++ b/build.sbt @@ -1,7 +1,7 @@ import Dependencies._ import Util._ -def baseVersion: String = "0.1.0-M4" +def baseVersion: String = "0.1.0-M5" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From 95f183f9bdc3a4e70c08bde981689c4c15451016 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 13 Nov 2015 01:58:39 -0500 Subject: [PATCH 553/823] Remove scala-library from Java only subproject. 
Fixes #20 --- project/Util.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/project/Util.scala b/project/Util.scala index cfe97dd1c..860d6c84b 100644 --- a/project/Util.scala +++ b/project/Util.scala @@ -10,7 +10,8 @@ object Util { crossPaths := false, compileOrder := CompileOrder.JavaThenScala, unmanagedSourceDirectories in Compile <<= Seq(javaSource in Compile).join, - crossScalaVersions := Seq(Dependencies.scala211) + crossScalaVersions := Seq(Dependencies.scala211), + autoScalaLibrary := false ) def getScalaKeywords: Set[String] = From f84251c877c80b26faea14892fde4699992bab1a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 13 Nov 2015 01:59:18 -0500 Subject: [PATCH 554/823] 0.1.0-M6 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 0ba5c9ab4..02b37203c 100644 --- a/build.sbt +++ b/build.sbt @@ -1,7 +1,7 @@ import Dependencies._ import Util._ -def baseVersion: String = "0.1.0-M5" +def baseVersion: String = "0.1.0-M6" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From 1e5fa46cbd123c0899606b69b85ae095129f1616 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 9 Dec 2015 14:12:53 +0000 Subject: [PATCH 555/823] Upgrade to jline 2.13 Forward port of sbt/sbt#2173. 
Fixes #22 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 881899bb1..37ad0b3fe 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,7 @@ object Dependencies { lazy val scala211 = "2.11.7" lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M3" - lazy val jline = "jline" % "jline" % "2.11" + lazy val jline = "jline" % "jline" % "2.13" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val sbinary = "org.scala-tools.sbinary" %% "sbinary" % "0.4.2" From 1c2922a44b0f1b0bbb8da8efba251feb306aa40a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Dec 2015 15:27:00 -0500 Subject: [PATCH 556/823] Bump up Scala version. Fixes #24 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 37ad0b3fe..ac034d38a 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -2,7 +2,7 @@ import sbt._ import Keys._ object Dependencies { - lazy val scala210 = "2.10.5" + lazy val scala210 = "2.10.6" lazy val scala211 = "2.11.7" lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M3" From 994634fc2c0c2c34f9dbae3c9771e6f95ce97fb5 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Dec 2015 15:28:00 -0500 Subject: [PATCH 557/823] Adds xsbti.F1. 
--- internal/util-interface/src/main/java/xsbti/F1.java | 6 ++++++ 1 file changed, 6 insertions(+) create mode 100644 internal/util-interface/src/main/java/xsbti/F1.java diff --git a/internal/util-interface/src/main/java/xsbti/F1.java b/internal/util-interface/src/main/java/xsbti/F1.java new file mode 100644 index 000000000..8797e9196 --- /dev/null +++ b/internal/util-interface/src/main/java/xsbti/F1.java @@ -0,0 +1,6 @@ +package xsbti; + +public interface F1 +{ + R apply(A1 a1); +} From 200b3515529fc066748357cf5bbae2ac02e8b779 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Dec 2015 15:46:40 -0500 Subject: [PATCH 558/823] Adds xsbti.T2. --- internal/util-interface/src/main/java/xsbti/T2.java | 12 ++++++++++++ 1 file changed, 12 insertions(+) create mode 100644 internal/util-interface/src/main/java/xsbti/T2.java diff --git a/internal/util-interface/src/main/java/xsbti/T2.java b/internal/util-interface/src/main/java/xsbti/T2.java new file mode 100644 index 000000000..ee844d783 --- /dev/null +++ b/internal/util-interface/src/main/java/xsbti/T2.java @@ -0,0 +1,12 @@ +package xsbti; + +/** Used to pass a pair of values. */ +public class T2 { + public final A1 _1; + public final A2 _2; + + public T2(A1 a1, A2 a2) { + this._1 = a1; + this._2 = a2; + } + } From 121972577dd3179b81b9513d5bb1ecf0a4a72662 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Dec 2015 15:54:23 -0500 Subject: [PATCH 559/823] Define T2 as an interface. --- .../util-interface/src/main/java/xsbti/T2.java | 14 +++++--------- 1 file changed, 5 insertions(+), 9 deletions(-) diff --git a/internal/util-interface/src/main/java/xsbti/T2.java b/internal/util-interface/src/main/java/xsbti/T2.java index ee844d783..0dff08c92 100644 --- a/internal/util-interface/src/main/java/xsbti/T2.java +++ b/internal/util-interface/src/main/java/xsbti/T2.java @@ -1,12 +1,8 @@ package xsbti; /** Used to pass a pair of values. 
*/ -public class T2 { - public final A1 _1; - public final A2 _2; - - public T2(A1 a1, A2 a2) { - this._1 = a1; - this._2 = a2; - } - } +public interface T2 +{ + public A1 get1(); + public A2 get2(); +} From e39247039432ed8999ad955fc727976b18a38730 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 16 Dec 2015 17:22:02 -0500 Subject: [PATCH 560/823] Bump Scala version on Travis --- .travis.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.travis.yml b/.travis.yml index a5b73c0d8..8dc242caf 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,4 +1,4 @@ language: scala scala: - - 2.10.5 + - 2.10.6 - 2.11.7 From 277cbd12ef5a4e9ee42038c75f3265dbd2b27030 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 17 Dec 2015 00:57:24 -0500 Subject: [PATCH 561/823] Adds concrete classes --- .../main/scala/sbt/util/InterfaceUtil.scala | 85 +++++++++++++++++++ .../src/main/scala/sbt/util/Logger.scala | 30 ++----- 2 files changed, 92 insertions(+), 23 deletions(-) create mode 100644 internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala new file mode 100644 index 000000000..d88a93674 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -0,0 +1,85 @@ +package sbt.util + +import xsbti.{ Maybe, F0, F1, T2, Position, Problem, Severity } +import java.io.File + +object InterfaceUtil { + def f0[A](a: => A): F0[A] = new ConcreteF0[A](a) + def f1[A1, R](f: A1 => R): F1[A1, R] = new ConcreteF1(f) + def t2[A1, A2](x: (A1, A2)): T2[A1, A2] = new ConcreteT2(x._1, x._2) + + def m2o[A](m: Maybe[A]): Option[A] = + if (m.isDefined) Some(m.get) + else None + + def o2m[A](o: Option[A]): Maybe[A] = + o match { + case Some(v) => Maybe.just(v) + case None => Maybe.nothing() + } + + def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], 
+ pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = + new ConcretePosition(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0) + + def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = + new ConcreteProblem(cat, pos, msg, sev) + + private final class ConcreteF0[A](a: => A) extends F0[A] { + def apply: A = a + } + + private final class ConcreteF1[A1, R](f: A1 => R) extends F1[A1, R] { + def apply(a1: A1): R = f(a1) + } + + private final class ConcreteT2[A1, A2](a1: A1, a2: A2) extends T2[A1, A2] { + val get1: A1 = a1 + val get2: A2 = a2 + override def toString: String = s"ConcreteT2($a1, $a2)" + override def equals(o: Any): Boolean = o match { + case o: ConcreteT2[A1, A2] => + this.get1 == o.get1 && + this.get2 == o.get2 + case _ => false + } + override def hashCode: Int = + { + var hash = 1 + hash = hash * 31 + this.get1.## + hash = hash * 31 + this.get2.## + hash + } + } + + private final class ConcretePosition( + line0: Option[Integer], + content: String, + offset0: Option[Integer], + pointer0: Option[Integer], + pointerSpace0: Option[String], + sourcePath0: Option[String], + sourceFile0: Option[File] + ) extends Position { + val line = o2m(line0) + val lineContent = content + val offset = o2m(offset0) + val pointer = o2m(pointer0) + val pointerSpace = o2m(pointerSpace0) + val sourcePath = o2m(sourcePath0) + val sourceFile = o2m(sourceFile0) + } + + private final class ConcreteProblem( + cat: String, + pos: Position, + msg: String, + sev: Severity + ) extends Problem { + val category = cat + val position = pos + val message = msg + val severity = sev + override def toString = s"[$severity] $pos: $message" + } +} diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala index 17e19f902..08945b379 100644 --- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala +++ 
b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -85,28 +85,12 @@ object Logger { } } } - def f0[T](t: => T): F0[T] = new F0[T] { def apply = t } - - def m2o[S](m: Maybe[S]): Option[S] = if (m.isDefined) Some(m.get) else None - def o2m[S](o: Option[S]): Maybe[S] = o match { case Some(v) => Maybe.just(v); case None => Maybe.nothing() } - - def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = - new Position { - val line = o2m(line0) - val lineContent = content - val offset = o2m(offset0) - val pointer = o2m(pointer0) - val pointerSpace = o2m(pointerSpace0) - val sourcePath = o2m(sourcePath0) - val sourceFile = o2m(sourceFile0) - } - + def f0[A](a: => A): F0[A] = InterfaceUtil.f0[A](a) + def m2o[A](m: Maybe[A]): Option[A] = InterfaceUtil.m2o(m) + def o2m[A](o: Option[A]): Maybe[A] = InterfaceUtil.o2m(o) + def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], + pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = + InterfaceUtil.position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0) def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = - new Problem { - val category = cat - val position = pos - val message = msg - val severity = sev - override def toString = s"[$severity] $pos: $message" - } + InterfaceUtil.problem(cat, pos, msg, sev) } From 23698d6664f1b8220b67677fce46f9a6bd0ff571 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Wed, 23 Dec 2015 09:36:38 +0100 Subject: [PATCH 562/823] Add scripted-core --- build.sbt | 15 +- .../sbt/internal/scripted/ScriptConfig.java | 31 ++++ .../internal/scripted/CommentHandler.scala | 10 + .../sbt/internal/scripted/FileCommands.scala | 134 ++++++++++++++ .../internal/scripted/FilteredLoader.scala | 19 ++ 
.../internal/scripted/HandlersProvider.scala | 5 + .../sbt/internal/scripted/ScriptRunner.scala | 48 +++++ .../sbt/internal/scripted/ScriptedTests.scala | 173 ++++++++++++++++++ .../internal/scripted/StatementHandler.scala | 26 +++ .../internal/scripted/TestScriptParser.scala | 83 +++++++++ project/Dependencies.scala | 2 + 11 files changed, 545 insertions(+), 1 deletion(-) create mode 100644 internal/util-scripted/src/main/java/sbt/internal/scripted/ScriptConfig.java create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala create mode 100644 internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala diff --git a/build.sbt b/build.sbt index 02b37203c..175279969 100644 --- a/build.sbt +++ b/build.sbt @@ -35,7 +35,8 @@ def commonSettings: Seq[Setting[_]] = Seq( lazy val utilRoot: Project = (project in file(".")). aggregate( utilInterface, utilControl, utilCollection, utilApplyMacro, utilComplete, - utilLogging, utilRelation, utilLogic, utilCache, utilTracking, utilTesting + utilLogging, utilRelation, utilLogic, utilCache, utilTracking, utilTesting, + utilScripted ). settings( inThisBuild(Seq( @@ -149,6 +150,18 @@ lazy val utilTesting = (project in internalPath / "util-testing"). libraryDependencies ++= Seq(scalaCheck, scalatest) ) +lazy val utilScripted = (project in internalPath / "util-scripted"). 
+ dependsOn(utilLogging). + settings( + commonSettings, + name := "Util Scripted", + libraryDependencies += sbtIO, + libraryDependencies ++= { + if (scalaVersion.value startsWith "2.11") Seq(parserCombinator211) + else Seq() + } + ) + def customCommands: Seq[Setting[_]] = Seq( commands += Command.command("release") { state => // "clean" :: diff --git a/internal/util-scripted/src/main/java/sbt/internal/scripted/ScriptConfig.java b/internal/util-scripted/src/main/java/sbt/internal/scripted/ScriptConfig.java new file mode 100644 index 000000000..52cb52c4e --- /dev/null +++ b/internal/util-scripted/src/main/java/sbt/internal/scripted/ScriptConfig.java @@ -0,0 +1,31 @@ +package sbt.internal.scripted; + +import java.io.File; + +import xsbti.Logger; + +public class ScriptConfig { + + private String label; + private File testDirectory; + private Logger logger; + + public ScriptConfig(String label, File testDirectory, Logger logger) { + this.label = label; + this.testDirectory = testDirectory; + this.logger = logger; + } + + public String label() { + return this.label; + } + + public File testDirectory() { + return this.testDirectory; + } + + public Logger logger() { + return this.logger; + } + +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala new file mode 100644 index 000000000..370ae0005 --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala @@ -0,0 +1,10 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt +package internal +package scripted + +object CommentHandler extends BasicStatementHandler { + def apply(command: String, args: List[String]) = () +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala 
b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala new file mode 100644 index 000000000..ea5fc7559 --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -0,0 +1,134 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt +package internal +package scripted + +import java.io.File +import sbt.io.{ IO, Path } +import Path._ + +class FileCommands(baseDirectory: File) extends BasicStatementHandler { + lazy val commands = commandMap + def commandMap = + Map( + "touch" nonEmpty touch _, + "delete" nonEmpty delete _, + "exists" nonEmpty exists _, + "mkdir" nonEmpty makeDirectories _, + "absent" nonEmpty absent _, + // "sync" twoArg("Two directory paths", sync _), + "newer" twoArg ("Two paths", newer _), + "pause" noArg { + println("Pausing in " + baseDirectory) + /*readLine("Press enter to continue. ") */ + print("Press enter to continue. ") + System.console.readLine + println() + }, + "sleep" oneArg ("Time in milliseconds", time => Thread.sleep(time.toLong)), + "exec" nonEmpty (execute _), + "copy" copy (to => rebase(baseDirectory, to)), + "copy-file" twoArg ("Two paths", copyFile _), + "must-mirror" twoArg ("Two paths", diffFiles _), + "copy-flat" copy flat + ) + + def apply(command: String, arguments: List[String]): Unit = + commands.get(command).map(_(arguments)) match { + case Some(_) => () + case _ => scriptError("Unknown command " + command); () + } + + def scriptError(message: String): Unit = sys.error("Test script error: " + message) + def spaced[T](l: Seq[T]) = l.mkString(" ") + def fromStrings(paths: List[String]) = paths.map(fromString) + def fromString(path: String) = new File(baseDirectory, path) + def touch(paths: List[String]): Unit = IO.touch(fromStrings(paths)) + def delete(paths: List[String]): Unit = IO.delete(fromStrings(paths)) + /*def sync(from: String, to: String) = + IO.sync(fromString(from), fromString(to), log)*/ + def copyFile(from: String, to: 
String): Unit = + IO.copyFile(fromString(from), fromString(to)) + def makeDirectories(paths: List[String]) = + IO.createDirectories(fromStrings(paths)) + def diffFiles(file1: String, file2: String): Unit = { + val lines1 = IO.readLines(fromString(file1)) + val lines2 = IO.readLines(fromString(file2)) + if (lines1 != lines2) + scriptError("File contents are different:\n" + lines1.mkString("\n") + "\nAnd:\n" + lines2.mkString("\n")) + } + + def newer(a: String, b: String): Unit = + { + val pathA = fromString(a) + val pathB = fromString(b) + val isNewer = pathA.exists && (!pathB.exists || pathA.lastModified > pathB.lastModified) + if (!isNewer) { + scriptError(s"$pathA is not newer than $pathB") + } + } + def exists(paths: List[String]): Unit = { + val notPresent = fromStrings(paths).filter(!_.exists) + if (notPresent.nonEmpty) + scriptError("File(s) did not exist: " + notPresent.mkString("[ ", " , ", " ]")) + } + def absent(paths: List[String]): Unit = { + val present = fromStrings(paths).filter(_.exists) + if (present.nonEmpty) + scriptError("File(s) existed: " + present.mkString("[ ", " , ", " ]")) + } + def execute(command: List[String]): Unit = execute0(command.head, command.tail) + def execute0(command: String, args: List[String]): Unit = { + if (command.trim.isEmpty) + scriptError("Command was empty.") + else { + val exitValue = sys.process.Process(command :: args, baseDirectory).! 
+ if (exitValue != 0) + sys.error("Nonzero exit value (" + exitValue + ")") + } + } + + // these are for readability of the command list + implicit def commandBuilder(s: String): CommandBuilder = new CommandBuilder(s) + final class CommandBuilder(commandName: String) { + type NamedCommand = (String, List[String] => Unit) + def nonEmpty(action: List[String] => Unit): NamedCommand = + commandName -> { paths => + if (paths.isEmpty) + scriptError("No arguments specified for " + commandName + " command.") + else + action(paths) + } + def twoArg(requiredArgs: String, action: (String, String) => Unit): NamedCommand = + commandName -> { + case List(from, to) => action(from, to) + case other => wrongArguments(requiredArgs, other) + } + def noArg(action: => Unit): NamedCommand = + commandName -> { + case Nil => action + case other => wrongArguments(other) + } + def oneArg(requiredArgs: String, action: String => Unit): NamedCommand = + commandName -> { + case List(single) => action(single) + case other => wrongArguments(requiredArgs, other) + } + def copy(mapper: File => FileMap): NamedCommand = + commandName -> { + case Nil => scriptError("No paths specified for " + commandName + " command.") + case path :: Nil => scriptError("No destination specified for " + commandName + " command.") + case paths => + val mapped = fromStrings(paths) + val map = mapper(mapped.last) + IO.copy(mapped.init pair map) + () + } + def wrongArguments(args: List[String]): Unit = + scriptError("Command '" + commandName + "' does not accept arguments (found '" + spaced(args) + "').") + def wrongArguments(requiredArgs: String, args: List[String]): Unit = + scriptError("Wrong number of arguments to " + commandName + " command. 
" + requiredArgs + " required, found: '" + spaced(args) + "'.") + } +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala new file mode 100644 index 000000000..cb2c3100d --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala @@ -0,0 +1,19 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt +package internal +package scripted + +final class FilteredLoader(parent: ClassLoader) extends ClassLoader(parent) { + @throws(classOf[ClassNotFoundException]) + override final def loadClass(className: String, resolve: Boolean): Class[_] = + { + if (className.startsWith("java.") || className.startsWith("javax.")) + super.loadClass(className, resolve) + else + throw new ClassNotFoundException(className) + } + override def getResources(name: String) = null + override def getResource(name: String) = null +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala new file mode 100644 index 000000000..3dcb4ef6d --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala @@ -0,0 +1,5 @@ +package sbt.internal.scripted + +trait HandlersProvider { + def getHandlers(config: ScriptConfig): Map[Char, StatementHandler] +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala new file mode 100644 index 000000000..f43b54f39 --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala @@ -0,0 +1,48 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt +package internal 
+package scripted + +final class TestException(statement: Statement, msg: String, exception: Throwable) + extends RuntimeException(statement.linePrefix + " " + msg, exception) + +class ScriptRunner { + import scala.collection.mutable.HashMap + def apply(statements: List[(StatementHandler, Statement)]): Unit = { + val states = new HashMap[StatementHandler, Any] + def processStatement(handler: StatementHandler, statement: Statement): Unit = { + val state = states(handler).asInstanceOf[handler.State] + val nextState = + try { Right(handler(statement.command, statement.arguments, state)) } + catch { case e: Exception => Left(e) } + nextState match { + case Left(err) => + if (statement.successExpected) { + err match { + case t: TestFailed => throw new TestException(statement, "Command failed: " + t.getMessage, null) + case _ => throw new TestException(statement, "Command failed", err) + } + } else + () + case Right(s) => + if (statement.successExpected) + states(handler) = s + else + throw new TestException(statement, "Command succeeded but failure was expected", null) + } + } + val handlers = Set() ++ statements.map(_._1) + + try { + handlers.foreach { handler => states(handler) = handler.initialState } + statements foreach (Function.tupled(processStatement)) + } finally { + for (handler <- handlers; state <- states.get(handler)) { + try { handler.finish(state.asInstanceOf[handler.State]) } + catch { case e: Exception => () } + } + } + } +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala new file mode 100644 index 000000000..dcfb50845 --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -0,0 +1,173 @@ +package sbt +package internal +package scripted + +import java.io.File +import sbt.util.Logger +import sbt.internal.util.{ ConsoleLogger, BufferedLogger, FullLogger } 
+import sbt.io.IO.wrapNull +import sbt.io.{ DirectoryFilter, HiddenFileFilter, Path, GlobFilter } +import sbt.internal.io.Resources + +object ScriptedRunnerImpl { + def run(resourceBaseDirectory: File, bufferLog: Boolean, tests: Array[String], handlersProvider: HandlersProvider): Unit = { + val runner = new ScriptedTests(resourceBaseDirectory, bufferLog, handlersProvider) + val logger = ConsoleLogger() + val allTests = get(tests, resourceBaseDirectory, logger) flatMap { + case ScriptedTest(group, name) => + runner.scriptedTest(group, name, logger) + } + runAll(allTests) + } + def runAll(tests: Seq[() => Option[String]]): Unit = { + val errors = for (test <- tests; err <- test()) yield err + if (errors.nonEmpty) + sys.error(errors.mkString("Failed tests:\n\t", "\n\t", "\n")) + } + def get(tests: Seq[String], baseDirectory: File, log: Logger): Seq[ScriptedTest] = + if (tests.isEmpty) listTests(baseDirectory, log) else parseTests(tests) + def listTests(baseDirectory: File, log: Logger): Seq[ScriptedTest] = + (new ListTests(baseDirectory, _ => true, log)).listTests + def parseTests(in: Seq[String]): Seq[ScriptedTest] = + for (testString <- in) yield { + val Array(group, name) = testString.split("/").map(_.trim) + ScriptedTest(group, name) + } +} + +final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handlersProvider: HandlersProvider) { + // import ScriptedTests._ + private val testResources = new Resources(resourceBaseDirectory) + + val ScriptFilename = "test" + val PendingScriptFilename = "pending" + + def scriptedTest(group: String, name: String, log: xsbti.Logger): Seq[() => Option[String]] = + scriptedTest(group, name, Logger.xlog2Log(log)) + def scriptedTest(group: String, name: String, log: Logger): Seq[() => Option[String]] = + scriptedTest(group, name, { _ => () }, log) + def scriptedTest(group: String, name: String, prescripted: File => Unit, log: Logger): Seq[() => Option[String]] = { + import Path._ + import GlobFilter._ + var failed 
= false + for (groupDir <- (resourceBaseDirectory * group).get; nme <- (groupDir * name).get) yield { + val g = groupDir.getName + val n = nme.getName + val str = s"$g / $n" + () => { + println("Running " + str) + testResources.readWriteResourceDirectory(g, n) { testDirectory => + val disabled = new File(testDirectory, "disabled").isFile + if (disabled) { + log.info("D " + str + " [DISABLED]") + None + } else { + try { scriptedTest(str, testDirectory, prescripted, log); None } + catch { case _: TestException | _: PendingTestSuccessException => Some(str) } + } + } + } + } + } + + private def scriptedTest(label: String, testDirectory: File, prescripted: File => Unit, log: Logger): Unit = + { + val buffered = new BufferedLogger(new FullLogger(log)) + if (bufferLog) + buffered.record() + + def createParser() = + { + // val fileHandler = new FileCommands(testDirectory) + // // val sbtHandler = new SbtHandler(testDirectory, launcher, buffered, launchOpts) + // new TestScriptParser(Map('$' -> fileHandler, /* '>' -> sbtHandler, */ '#' -> CommentHandler)) + val scriptConfig = new ScriptConfig(label, testDirectory, buffered) + new TestScriptParser(handlersProvider getHandlers scriptConfig) + } + val (file, pending) = { + val normal = new File(testDirectory, ScriptFilename) + val pending = new File(testDirectory, PendingScriptFilename) + if (pending.isFile) (pending, true) else (normal, false) + } + val pendingString = if (pending) " [PENDING]" else "" + + def runTest() = + { + val run = new ScriptRunner + val parser = createParser() + run(parser.parse(file)) + } + def testFailed(): Unit = { + if (pending) buffered.clear() else buffered.stop() + buffered.error("x " + label + pendingString) + } + + try { + prescripted(testDirectory) + runTest() + buffered.info("+ " + label + pendingString) + if (pending) throw new PendingTestSuccessException(label) + } catch { + case e: TestException => + testFailed() + e.getCause match { + case null | _: java.net.SocketException => 
buffered.error(" " + e.getMessage) + case _ => e.printStackTrace + } + if (!pending) throw e + case e: PendingTestSuccessException => + testFailed() + buffered.error(" Mark as passing to remove this failure.") + throw e + case e: Exception => + testFailed() + if (!pending) throw e + } finally { buffered.clear() } + } +} + +// object ScriptedTests extends ScriptedRunner { +// val emptyCallback: File => Unit = { _ => () } +// } + +final case class ScriptedTest(group: String, name: String) { + override def toString = group + "/" + name +} + +object ListTests { + def list(directory: File, filter: java.io.FileFilter) = wrapNull(directory.listFiles(filter)) +} +import ListTests._ +final class ListTests(baseDirectory: File, accept: ScriptedTest => Boolean, log: Logger) { + def filter = DirectoryFilter -- HiddenFileFilter + def listTests: Seq[ScriptedTest] = + { + list(baseDirectory, filter) flatMap { group => + val groupName = group.getName + listTests(group).map(ScriptedTest(groupName, _)) + } + } + private[this] def listTests(group: File): Set[String] = + { + val groupName = group.getName + val allTests = list(group, filter) + if (allTests.isEmpty) { + log.warn("No tests in test group " + groupName) + Set.empty + } else { + val (included, skipped) = allTests.toList.partition(test => accept(ScriptedTest(groupName, test.getName))) + if (included.isEmpty) + log.warn("Test group " + groupName + " skipped.") + else if (skipped.nonEmpty) { + log.warn("Tests skipped in group " + group.getName + ":") + skipped.foreach(testName => log.warn(" " + testName.getName)) + } + Set(included.map(_.getName): _*) + } + } +} + +class PendingTestSuccessException(label: String) extends Exception { + override def getMessage: String = + s"The pending test $label succeeded. Mark this test as passing to remove this failure." 
+} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala new file mode 100644 index 000000000..15b7189ce --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala @@ -0,0 +1,26 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt +package internal +package scripted + +trait StatementHandler { + type State + def initialState: State + def apply(command: String, arguments: List[String], state: State): State + def finish(state: State): Unit +} + +trait BasicStatementHandler extends StatementHandler { + final type State = Unit + final def initialState = () + final def apply(command: String, arguments: List[String], state: Unit): Unit = apply(command, arguments) + def apply(command: String, arguments: List[String]): Unit + def finish(state: Unit) = () +} + +/** Use when a stack trace is not useful */ +final class TestFailed(msg: String) extends RuntimeException(msg) { + override def fillInStackTrace = this +} \ No newline at end of file diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala new file mode 100644 index 000000000..2e8b7f7f6 --- /dev/null +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala @@ -0,0 +1,83 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package sbt +package internal +package scripted + +import java.io.{ BufferedReader, File, InputStreamReader } +import scala.util.parsing.combinator._ +import scala.util.parsing.input.Positional +import Character.isWhitespace +import sbt.io.IO + +/* +statement* +statement ::= startChar successChar word+ nl +startChar ::= +successChar ::= '+' | '-' +word ::= [^ \[\]]+ +comment ::= '#' \S* nl +nl ::= '\r' '\n' 
| '\n' | '\r' | eof +*/ +final case class Statement(command: String, arguments: List[String], successExpected: Boolean, line: Int) { + def linePrefix = "{line " + line + "} " +} + +private object TestScriptParser { + val SuccessLiteral = "success" + val FailureLiteral = "failure" + val WordRegex = """[^ \[\]\s'\"][^ \[\]\s]*""".r +} + +import TestScriptParser._ +class TestScriptParser(handlers: Map[Char, StatementHandler]) extends RegexParsers { + require(handlers.nonEmpty) + override def skipWhitespace = false + + import IO.read + if (handlers.keys.exists(isWhitespace)) + sys.error("Start characters cannot be whitespace") + if (handlers.keys.exists(key => key == '+' || key == '-')) + sys.error("Start characters cannot be '+' or '-'") + + def parse(scriptFile: File): List[(StatementHandler, Statement)] = parse(read(scriptFile), Some(scriptFile.getAbsolutePath)) + def parse(script: String): List[(StatementHandler, Statement)] = parse(script, None) + private def parse(script: String, label: Option[String]): List[(StatementHandler, Statement)] = + { + parseAll(statements, script) match { + case Success(result, next) => result + case err: NoSuccess => + { + val labelString = label.map("'" + _ + "' ").getOrElse("") + sys.error("Could not parse test script, " + labelString + err.toString) + } + } + } + + lazy val statements = rep1(space ~> statement <~ newline) + def statement: Parser[(StatementHandler, Statement)] = + { + trait PositionalStatement extends Positional { + def tuple: (StatementHandler, Statement) + } + positioned { + val command = (word | err("expected command")) + val arguments = rep(space ~> (word | failure("expected argument"))) + (successParser ~ (space ~> startCharacterParser <~ space) ~! command ~! 
arguments) ^^ + { + case successExpected ~ start ~ command ~ arguments => + new PositionalStatement { + def tuple = (handlers(start), new Statement(command, arguments, successExpected, pos.line)) + } + } + } ^^ (_.tuple) + } + def successParser: Parser[Boolean] = ('+' ^^^ true) | ('-' ^^^ false) | success(true) + def space: Parser[String] = """[ \t]*""".r + lazy val word: Parser[String] = ("\'" ~> "[^'\n\r]*".r <~ "\'") | ("\"" ~> "[^\"\n\r]*".r <~ "\"") | WordRegex + def startCharacterParser: Parser[Char] = elem("start character", handlers.contains _) | + ((newline | err("expected start character " + handlers.keys.mkString("(", "", ")"))) ~> failure("end of input")) + + def newline = """\s*([\n\r]|$)""".r +} \ No newline at end of file diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 37ad0b3fe..20c69b87b 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -25,4 +25,6 @@ object Dependencies { lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.12.4" lazy val scalatest = "org.scalatest" %% "scalatest" % "2.2.4" + + lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" } From ee7e2889dceb2a7db7da14ccba8956888d986e70 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 23 Dec 2015 19:23:11 -0500 Subject: [PATCH 563/823] bumping up to 0.1.0-M8 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 175279969..641f7e54b 100644 --- a/build.sbt +++ b/build.sbt @@ -1,7 +1,7 @@ import Dependencies._ import Util._ -def baseVersion: String = "0.1.0-M6" +def baseVersion: String = "0.1.0-M8" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From 149122ab4daf2a57fb9e64ecadb022039d785410 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 25 Jan 2016 14:48:06 +0100 Subject: [PATCH 564/823] Hide stacktrace upon failure on pending scripted test --- 
.../src/main/scala/sbt/internal/scripted/ScriptedTests.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index dcfb50845..665d97f0f 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -112,7 +112,7 @@ final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handl testFailed() e.getCause match { case null | _: java.net.SocketException => buffered.error(" " + e.getMessage) - case _ => e.printStackTrace + case _ => if (!pending) e.printStackTrace } if (!pending) throw e case e: PendingTestSuccessException => From a48d7cb904d260e5b28b55bca2576f86c8580155 Mon Sep 17 00:00:00 2001 From: Tim Harper Date: Tue, 9 Feb 2016 02:10:03 -0700 Subject: [PATCH 565/823] add documentation for FileFunction.cached --- .../main/scala/sbt/internal/util/Tracked.scala | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index ae3e060a8..77d4b3a29 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -233,6 +233,24 @@ class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: B object FileFunction { type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => Set[File] + /** + Generic change-detection helper that lets build steps (compilation, artifact + generation, etc.) detect whether they need to run. 
Returns a function that + takes the Set of input files and, when a change is detected, executes the action function + (which does the actual work: compiles, generates resources, etc.), returning + the Set of output files that it generated. + + The state of the input files and of the resulting output files is cached in + cacheBaseDirectory. On each invocation, the state of the input and output + files from the previous run is compared against the cache, as is the set of + input files. If a change in file state or in the set of input files is detected, the + action function is re-executed. + + @param cacheBaseDirectory The directory in which to store the cached input and output file state between runs + @param inStyle The strategy by which to detect state changes in the input files from the previous run + @param outStyle The strategy by which to detect state changes in the output files from the previous run + @param action The work function, which receives the set of input files and returns the set of output files it generated + */ def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style = FilesInfo.lastModified, outStyle: FilesInfo.Style = FilesInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = cached(cacheBaseDirectory)(inStyle, outStyle)((in, out) => action(in.checked)) From 81757bcb1562a44e83bdde41c21a16355514cafe Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 14 Feb 2016 16:42:00 +0000 Subject: [PATCH 566/823] Bump sbt-houserules to add MiMa See sbt/sbt#2383 --- build.sbt | 2 ++ project/house.sbt | 2 +- 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 641f7e54b..7b48e331e 100644 --- a/build.sbt +++ b/build.sbt @@ -1,5 +1,6 @@ import Dependencies._ import Util._ +import com.typesafe.tools.mima.core._, ProblemFilters._ def baseVersion: String = "0.1.0-M8" def internalPath = file("internal") @@ -28,6 +29,7 @@ def commonSettings: Seq[Setting[_]] = Seq( "-Ywarn-dead-code", "-Ywarn-numeric-widen", "-Ywarn-value-discard"), + previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), 
publishArtifact in Compile := true, publishArtifact in Test := true ) diff --git a/project/house.sbt b/project/house.sbt index fede298d1..555559b37 100644 --- a/project/house.sbt +++ b/project/house.sbt @@ -1 +1 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.2.1") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.1") From dc410f9842884e80666ecb25af55a3c6c693d81d Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 14 Feb 2016 23:00:25 +0000 Subject: [PATCH 567/823] Unexecute Positions --- .../src/main/scala/sbt/internal/util/Positions.scala | 0 1 file changed, 0 insertions(+), 0 deletions(-) mode change 100755 => 100644 internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala old mode 100755 new mode 100644 From e1c7f39e5a819f3f0ae560f9728cf09508a5afd2 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 14 Feb 2016 23:15:32 +0000 Subject: [PATCH 568/823] Exempt out of -Xfuture in util-collection --- build.sbt | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 7b48e331e..bbfa3f30f 100644 --- a/build.sbt +++ b/build.sbt @@ -80,7 +80,9 @@ lazy val utilCollection = (project in internalPath / "util-collection"). settings( commonSettings, Util.keywordsSettings, - name := "Util Collection" + name := "Util Collection", + scalacOptions --= // scalac 2.10 rejects some HK types under -Xfuture it seems.. + (CrossVersion partialVersion scalaVersion.value collect { case (2, 10) => "-Xfuture" }).toList ) lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). 
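Editor's note: the commit above drops `-Xfuture` when util-collection is compiled on Scala 2.10 by matching on `CrossVersion partialVersion scalaVersion.value`. The version-matching logic can be sketched standalone; `partialVersion` and `droppedFlags` below are illustrative stand-ins written for this note, not sbt API (in a real build.sbt you would use sbt's own `CrossVersion.partialVersion`):

```scala
import scala.util.Try

object VersionFlags {
  // Simplified stand-in for sbt's CrossVersion.partialVersion: extracts the
  // (major, minor) pair from a full Scala version string, if well-formed.
  def partialVersion(v: String): Option[(Int, Int)] =
    v.split('.').toList match {
      case maj :: min :: _ => Try((maj.toInt, min.toInt)).toOption
      case _               => None
    }

  // Flags to remove for a given Scala version: scalac 2.10 rejects some
  // higher-kinded types under -Xfuture, so the build strips it there only.
  def droppedFlags(scalaVersion: String): List[String] =
    partialVersion(scalaVersion) match {
      case Some((2, 10)) => List("-Xfuture")
      case _             => Nil
    }
}
```

On 2.10.x this yields `List("-Xfuture")` to subtract from `scalacOptions`; on every other version it yields nothing, so the flag stays in effect.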
From 5004b8a515b2a2b8dcd85dcae60d33388065d0cb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 18 Mar 2016 02:38:06 -0400 Subject: [PATCH 569/823] Refactor nulls to Option --- .../scala/sbt/internal/util/LineReader.scala | 17 ++++++++--------- 1 file changed, 8 insertions(+), 9 deletions(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index e72341a23..ed68630bd 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -15,13 +15,12 @@ abstract class JLine extends LineReader { def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } - private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]) = - readLineWithHistory(prompt, mask) match { - case null => None - case x => Some(x.trim) + private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]): Option[String] = + readLineWithHistory(prompt, mask) map { x => + x.trim } - private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): String = + private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): Option[String] = reader.getHistory match { case fh: FileHistory => try { readLineDirect(prompt, mask) } @@ -29,17 +28,17 @@ abstract class JLine extends LineReader { case _ => readLineDirect(prompt, mask) } - private[this] def readLineDirect(prompt: String, mask: Option[Char]): String = + private[this] def readLineDirect(prompt: String, mask: Option[Char]): Option[String] = if (handleCONT) Signals.withHandler(() => resume(), signal = Signals.CONT)(() => readLineDirectRaw(prompt, mask)) else readLineDirectRaw(prompt, mask) - private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): String = + private[this] def readLineDirectRaw(prompt: String, mask: 
Option[Char]): Option[String] = { val newprompt = handleMultilinePrompt(prompt) mask match { - case Some(m) => reader.readLine(newprompt, m) - case None => reader.readLine(newprompt) + case Some(m) => Option(reader.readLine(newprompt, m)) + case None => Option(reader.readLine(newprompt)) } } From 073f2be487e2e81918e8588c7d52dfa9172eec0b Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 18 Mar 2016 21:58:40 -0400 Subject: [PATCH 570/823] Inject Thread.sleep periodically during read() to allow thread interruption --- .../scala/sbt/internal/util/LineReader.scala | 45 ++++++++++++++----- 1 file changed, 35 insertions(+), 10 deletions(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index ed68630bd..93552b518 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -5,9 +5,11 @@ package sbt.internal.util import jline.console.ConsoleReader import jline.console.history.{ FileHistory, MemoryHistory } -import java.io.{ File, InputStream, PrintWriter } +import java.io.{ File, InputStream, PrintWriter, FileInputStream, FileDescriptor, FilterInputStream } import complete.Parser import java.util.concurrent.atomic.AtomicBoolean +import scala.concurrent.duration.Duration +import scala.annotation.tailrec abstract class JLine extends LineReader { protected[this] val handleCONT: Boolean @@ -94,10 +96,14 @@ private[sbt] object JLine { t.restore f(t) } - def createReader(): ConsoleReader = createReader(None) - def createReader(historyPath: Option[File]): ConsoleReader = + def createReader(): ConsoleReader = createReader(None, true) + def createReader(historyPath: Option[File], injectThreadSleep: Boolean): ConsoleReader = usingTerminal { t => - val cr = new ConsoleReader + val cr = if (injectThreadSleep) { + val originalIn = new 
FileInputStream(FileDescriptor.in) + val in = new InputStreamWrapper(originalIn, Duration("50 ms")) + new ConsoleReader(in, System.out) + } else new ConsoleReader cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650 cr.setBellEnabled(false) val h = historyPath match { @@ -115,25 +121,44 @@ private[sbt] object JLine { finally { t.restore } } - def simple(historyPath: Option[File], handleCONT: Boolean = HandleCONT): SimpleReader = new SimpleReader(historyPath, handleCONT) + def simple( + historyPath: Option[File], + handleCONT: Boolean = HandleCONT, + injectThreadSleep: Boolean = true + ): SimpleReader = new SimpleReader(historyPath, handleCONT, injectThreadSleep) val MaxHistorySize = 500 val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") && Signals.supported(Signals.CONT) } +private[sbt] class InputStreamWrapper(is: InputStream, val poll: Duration) extends FilterInputStream(is) { + @tailrec + final override def read(): Int = + if (is.available() != 0) is.read() + else { + Thread.sleep(poll.toMillis) + read() + } +} + trait LineReader { def readLine(prompt: String, mask: Option[Char] = None): Option[String] } -final class FullReader(historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT) extends JLine { +final class FullReader( + historyPath: Option[File], + complete: Parser[_], + val handleCONT: Boolean = JLine.HandleCONT, + val injectThreadSleep: Boolean = true +) extends JLine { protected[this] val reader = { - val cr = JLine.createReader(historyPath) + val cr = JLine.createReader(historyPath, injectThreadSleep) sbt.internal.util.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } -class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean) extends JLine { - protected[this] val reader = JLine.createReader(historyPath) +class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean, val injectThreadSleep: Boolean) extends JLine { + 
protected[this] val reader = JLine.createReader(historyPath, injectThreadSleep) } -object SimpleReader extends SimpleReader(None, JLine.HandleCONT) +object SimpleReader extends SimpleReader(None, JLine.HandleCONT, true) From c1aa172467e03bc0ff6f71bdf4eb389f21c8db4f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 18 Mar 2016 22:00:02 -0400 Subject: [PATCH 571/823] 0.1.0-M9 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 641f7e54b..565b09da0 100644 --- a/build.sbt +++ b/build.sbt @@ -1,7 +1,7 @@ import Dependencies._ import Util._ -def baseVersion: String = "0.1.0-M8" +def baseVersion: String = "0.1.0-M9" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From 56e840018c9b4125f9588ff9fe1ffca1b19c1f60 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 31 Mar 2016 01:23:34 -0400 Subject: [PATCH 572/823] Bump to 0.1.0-M10 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 7888c9611..1b16a5050 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "0.1.0-M9" +def baseVersion: String = "0.1.0-M10" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From f4055e6c5f1de918e485dcadf213bd250bf7f919 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 31 Mar 2016 01:28:55 -0400 Subject: [PATCH 573/823] Fixes #32. Don't inject thread sleep by default. Thread sleeping interferes with scripted test when the build cannot be loaded. The scripted test gets stuck, and jstack shows java.lang.Thread.State: TIMED_WAITING (sleeping) at java.lang.Thread.sleep(Native Method) at sbt.internal.util.InputStreamWrapper.read(LineReader.scala:138) at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:2 45) .... 
at sbt.internal.util.JLine$.withJLine(LineReader.scala:118) at sbt.internal.util.JLine.readLine(LineReader.scala:18) at sbt.BuiltinCommands$.sbt$BuiltinCommands$$doLoadFailed(Main.scala:460) --- .../src/main/scala/sbt/internal/util/LineReader.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index 93552b518..712071684 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -124,7 +124,7 @@ private[sbt] object JLine { def simple( historyPath: Option[File], handleCONT: Boolean = HandleCONT, - injectThreadSleep: Boolean = true + injectThreadSleep: Boolean = false ): SimpleReader = new SimpleReader(historyPath, handleCONT, injectThreadSleep) val MaxHistorySize = 500 val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") && Signals.supported(Signals.CONT) @@ -147,7 +147,7 @@ final class FullReader( historyPath: Option[File], complete: Parser[_], val handleCONT: Boolean = JLine.HandleCONT, - val injectThreadSleep: Boolean = true + val injectThreadSleep: Boolean = false ) extends JLine { protected[this] val reader = { @@ -160,5 +160,5 @@ final class FullReader( class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean, val injectThreadSleep: Boolean) extends JLine { protected[this] val reader = JLine.createReader(historyPath, injectThreadSleep) } -object SimpleReader extends SimpleReader(None, JLine.HandleCONT, true) +object SimpleReader extends SimpleReader(None, JLine.HandleCONT, false) From 6eb808def934ab329c7eefbbe905bf6d198c71bb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 29 Mar 2016 23:13:45 -0400 Subject: [PATCH 574/823] Add Eval from Cats - 
https://github.com/typelevel/cats/blob/a8ba943fff5928d962101a92d5c173d51df1d626/core/src/main/scala/cats/Eval.scala --- .../main/scala/sbt/internal/util/Eval.scala | 264 ++++++++++++++++++ 1 file changed, 264 insertions(+) create mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala new file mode 100644 index 000000000..5117f1a6d --- /dev/null +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala @@ -0,0 +1,264 @@ +package sbt.internal.util + +import scala.annotation.tailrec + +// Copied from Cats (MIT license) + +/** + * Eval is a datatype, which controls evaluation. + */ +sealed abstract class Eval[A] extends Serializable { self => + /** + * Evaluate the computation and return an A value. + * + * For lazy instances (Later, Always), any necessary computation + * will be performed at this point. For eager instances (Now), a + * value will be immediately returned. + */ + def value: A + + /** + * Transform an Eval[A] into an Eval[B] given the transformation + * function `f`. + * + * This call is stack-safe -- many .map calls may be chained without + * consumed additional stack during evaluation. + * + * Computation performed in f is always lazy, even when called on an + * eager (Now) instance. + */ + def map[B](f: A => B): Eval[B] = + flatMap(a => Now(f(a))) + + /** + * Lazily perform a computation based on an Eval[A], using the + * function `f` to produce an Eval[B] given an A. + * + * This call is stack-safe -- many .flatMap calls may be chained + * without consumed additional stack during evaluation. It is also + * written to avoid left-association problems, so that repeated + * calls to .flatMap will be efficiently applied. + * + * Computation performed in f is always lazy, even when called on an + * eager (Now) instance. 
+ */ + def flatMap[B](f: A => Eval[B]): Eval[B] = + this match { + case c: Eval.Compute[A] => + new Eval.Compute[B] { + type Start = c.Start + val start = c.start + val run = (s: c.Start) => + new Eval.Compute[B] { + type Start = A + val start = () => c.run(s) + val run = f + } + } + case c: Eval.Call[A] => + new Eval.Compute[B] { + type Start = A + val start = c.thunk + val run = f + } + case _ => + new Eval.Compute[B] { + type Start = A + val start = () => self + val run = f + } + } + + /** + * Ensure that the result of the computation (if any) will be + * memoized. + * + * Practically, this means that when called on an Always[A] a + * Later[A] with an equivalent computation will be returned. + */ + def memoize: Eval[A] +} + +/** + * Construct an eager Eval[A] instance. + * + * In some sense it is equivalent to using a val. + * + * This type should be used when an A value is already in hand, or + * when the computation to produce an A value is pure and very fast. + */ +final case class Now[A](value: A) extends Eval[A] { + def memoize: Eval[A] = this +} + +/** + * Construct a lazy Eval[A] instance. + * + * This type should be used for most "lazy" values. In some sense it + * is equivalent to using a lazy val. + * + * When caching is not required or desired (e.g. if the value produced + * may be large) prefer Always. When there is no computation + * necessary, prefer Now. + * + * Once Later has been evaluated, the closure (and any values captured + * by the closure) will not be retained, and will be available for + * garbage collection. + */ +final class Later[A](f: () => A) extends Eval[A] { + private[this] var thunk: () => A = f + + // The idea here is that `f` may have captured very large + // structures, but produce a very small result. In this case, once + // we've calculated a value, we would prefer to be able to free + // everything else. 
+ // + // (For situations where `f` is small, but the output will be very + // expensive to store, consider using `Always`.) + lazy val value: A = { + val result = thunk() + thunk = null // scalastyle:off + result + } + + def memoize: Eval[A] = this +} + +object Later { + def apply[A](a: => A): Later[A] = new Later(a _) +} + +/** + * Construct a lazy Eval[A] instance. + * + * This type can be used for "lazy" values. In some sense it is + * equivalent to using a Function0 value. + * + * This type will evaluate the computation every time the value is + * required. It should be avoided except when laziness is required and + * caching must be avoided. Generally, prefer Later. + */ +final class Always[A](f: () => A) extends Eval[A] { + def value: A = f() + def memoize: Eval[A] = new Later(f) +} + +object Always { + def apply[A](a: => A): Always[A] = new Always(a _) +} + +object Eval { + + /** + * Construct an eager Eval[A] value (i.e. Now[A]). + */ + def now[A](a: A): Eval[A] = Now(a) + + /** + * Construct a lazy Eval[A] value with caching (i.e. Later[A]). + */ + def later[A](a: => A): Eval[A] = new Later(a _) + + /** + * Construct a lazy Eval[A] value without caching (i.e. Always[A]). + */ + def always[A](a: => A): Eval[A] = new Always(a _) + + /** + * Defer a computation which produces an Eval[A] value. + * + * This is useful when you want to delay execution of an expression + * which produces an Eval[A] value. Like .flatMap, it is stack-safe. + */ + def defer[A](a: => Eval[A]): Eval[A] = + new Eval.Call[A](a _) {} + + /** + * Static Eval instances for some common values. + * + * These can be useful in cases where the same values may be needed + * many times. + */ + val Unit: Eval[Unit] = Now(()) + val True: Eval[Boolean] = Now(true) + val False: Eval[Boolean] = Now(false) + val Zero: Eval[Int] = Now(0) + val One: Eval[Int] = Now(1) + + /** + * Call is a type of Eval[A] that is used to defer computations + * which produce Eval[A]. 
+ * + * Users should not instantiate Call instances themselves. Instead, + * they will be automatically created when needed. + */ + sealed abstract class Call[A](val thunk: () => Eval[A]) extends Eval[A] { + def memoize: Eval[A] = new Later(() => value) + def value: A = Call.loop(this).value + } + + object Call { + /** Collapse the call stack for eager evaluations */ + @tailrec private def loop[A](fa: Eval[A]): Eval[A] = fa match { + case call: Eval.Call[A] => + loop(call.thunk()) + case compute: Eval.Compute[A] => + new Eval.Compute[A] { + type Start = compute.Start + val start: () => Eval[Start] = () => compute.start() + val run: Start => Eval[A] = s => loop1(compute.run(s)) + } + case other => other + } + + /** + * Alias for loop that can be called in a non-tail position + * from an otherwise tailrec-optimized loop. + */ + private def loop1[A](fa: Eval[A]): Eval[A] = loop(fa) + } + + /** + * Compute is a type of Eval[A] that is used to chain computations + * involving .map and .flatMap. Along with Eval#flatMap it + * implements the trampoline that guarantees stack-safety. + * + * Users should not instantiate Compute instances + * themselves. Instead, they will be automatically created when + * needed. + * + * Unlike a traditional trampoline, the internal workings of the + * trampoline are not exposed. This allows a slightly more efficient + * implementation of the .value method. 
+ */ + sealed abstract class Compute[A] extends Eval[A] { + type Start + val start: () => Eval[Start] + val run: Start => Eval[A] + + def memoize: Eval[A] = Later(value) + + def value: A = { + type L = Eval[Any] + type C = Any => Eval[Any] + @tailrec def loop(curr: L, fs: List[C]): Any = + curr match { + case c: Compute[_] => + c.start() match { + case cc: Compute[_] => + loop( + cc.start().asInstanceOf[L], + cc.run.asInstanceOf[C] :: c.run.asInstanceOf[C] :: fs) + case xx => + loop(c.run(xx.value).asInstanceOf[L], fs) + } + case x => + fs match { + case f :: fs => loop(f(x.value), fs) + case Nil => x.value + } + } + loop(this.asInstanceOf[L], Nil).asInstanceOf[A] + } + } +} From 183f17c19235900ebb625197bc75cef0f733d7ab Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 30 Mar 2016 00:11:52 -0400 Subject: [PATCH 575/823] Rename Eval#value to get --- .../main/scala/sbt/internal/util/Eval.scala | 27 ++++++++++--------- 1 file changed, 14 insertions(+), 13 deletions(-) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala index 5117f1a6d..1596e457b 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala @@ -15,7 +15,7 @@ sealed abstract class Eval[A] extends Serializable { self => * will be performed at this point. For eager instances (Now), a * value will be immediately returned. */ - def value: A + def get: A /** * Transform an Eval[A] into an Eval[B] given the transformation @@ -87,7 +87,7 @@ sealed abstract class Eval[A] extends Serializable { self => * This type should be used when an A value is already in hand, or * when the computation to produce an A value is pure and very fast. 
*/ -final case class Now[A](value: A) extends Eval[A] { +final case class Now[A](get: A) extends Eval[A] { def memoize: Eval[A] = this } @@ -115,7 +115,7 @@ final class Later[A](f: () => A) extends Eval[A] { // // (For situations where `f` is small, but the output will be very // expensive to store, consider using `Always`.) - lazy val value: A = { + lazy val get: A = { val result = thunk() thunk = null // scalastyle:off result @@ -139,7 +139,7 @@ object Later { * caching must be avoided. Generally, prefer Later. */ final class Always[A](f: () => A) extends Eval[A] { - def value: A = f() + def get: A = f() def memoize: Eval[A] = new Later(f) } @@ -193,8 +193,8 @@ object Eval { * they will be automatically created when needed. */ sealed abstract class Call[A](val thunk: () => Eval[A]) extends Eval[A] { - def memoize: Eval[A] = new Later(() => value) - def value: A = Call.loop(this).value + def memoize: Eval[A] = new Later(() => get) + def get: A = Call.loop(this).get } object Call { @@ -229,16 +229,16 @@ object Eval { * * Unlike a traditional trampoline, the internal workings of the * trampoline are not exposed. This allows a slightly more efficient - * implementation of the .value method. + * implementation of the .get method. 
*/ sealed abstract class Compute[A] extends Eval[A] { type Start val start: () => Eval[Start] val run: Start => Eval[A] - def memoize: Eval[A] = Later(value) + def memoize: Eval[A] = Later(get) - def value: A = { + def get: A = { type L = Eval[Any] type C = Any => Eval[Any] @tailrec def loop(curr: L, fs: List[C]): Any = @@ -248,14 +248,15 @@ object Eval { case cc: Compute[_] => loop( cc.start().asInstanceOf[L], - cc.run.asInstanceOf[C] :: c.run.asInstanceOf[C] :: fs) + cc.run.asInstanceOf[C] :: c.run.asInstanceOf[C] :: fs + ) case xx => - loop(c.run(xx.value).asInstanceOf[L], fs) + loop(c.run(xx.get).asInstanceOf[L], fs) } case x => fs match { - case f :: fs => loop(f(x.value), fs) - case Nil => x.value + case f :: fs => loop(f(x.get), fs) + case Nil => x.get } } loop(this.asInstanceOf[L], Nil).asInstanceOf[A] From 299484cee6bbd3db46c2f8b05e1de69195639f4c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 1 Apr 2016 01:07:00 -0400 Subject: [PATCH 576/823] Remove some warnings --- .../main/scala/sbt/internal/util/Cache.scala | 4 +-- .../scala/sbt/internal/util/FileInfo.scala | 2 +- .../scala/sbt/internal/util/Attributes.scala | 2 +- .../main/scala/sbt/internal/util/KList.scala | 4 +-- .../main/scala/sbt/internal/util/PMap.scala | 2 +- .../main/scala/sbt/internal/util/Param.scala | 2 +- .../scala/sbt/internal/util/Settings.scala | 26 ++++++-------- .../main/scala/sbt/internal/util/Signal.scala | 4 +-- .../scala/sbt/internal/util/LineReader.scala | 2 +- .../sbt/internal/util/complete/History.scala | 2 +- .../scala/sbt/internal/util/StackTrace.scala | 4 +-- .../src/main/scala/sbt/util/LogEvent.scala | 2 +- .../src/test/scala/LogWriterTest.scala | 6 ++-- .../sbt/internal/util/ChangeReport.scala | 2 +- .../scala/sbt/internal/util/Tracked.scala | 34 +++++++++---------- 15 files changed, 47 insertions(+), 51 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala 
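The two Eval patches above (the copy from Cats and the `value` → `get` rename) can be illustrated with a small standalone sketch. This is a model of the semantics described in the comments (eager `Now`, cached `Later`, uncached `Always`), not the sbt.internal.util implementation — the real code additionally trampolines `map`/`flatMap` via `Compute` for stack safety:

```scala
// Minimal model of the Eval semantics described in the patches above.
// Now is eager, Later caches on first access, Always re-evaluates every time.
// Illustrative sketch only; the shipped class adds a stack-safe trampoline.
object EvalSketch extends App {
  sealed trait Eval[A] { def get: A }
  final case class Now[A](get: A) extends Eval[A]            // value already in hand
  final class Later[A](f: () => A) extends Eval[A] { lazy val get: A = f() }
  final class Always[A](f: () => A) extends Eval[A] { def get: A = f() }

  var evals = 0
  val always = new Always(() => { evals += 1; evals })
  always.get; always.get
  assert(evals == 2) // Always runs the thunk on every access

  val later = new Later(() => { evals += 1; evals })
  later.get; later.get
  assert(evals == 3) // Later runs the thunk once and caches the result
}
```

The `Later`-frees-its-thunk trick in the patch goes one step further than a plain `lazy val`: after the first `get`, the closure (and anything large it captured) becomes garbage-collectable.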
index f441fbc20..13710611c 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala @@ -214,14 +214,14 @@ trait UnionImplicits { write0(i) } def equiv: Equiv[Internal] = new Equiv[Internal] { - def equiv(a: Internal, b: Internal) = + def equiv(a: Internal, b: Internal): Boolean = { if (a.clazz == b.clazz) force(a.cache.equiv, a.value, b.value) else false } - def force[T <: UB, UB](e: Equiv[T], a: UB, b: UB) = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) + def force[T <: UB, UB](e: Equiv[T], a: UB, b: UB): Boolean = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) } } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index 5f7461eae..1b6ac418c 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -10,7 +10,7 @@ import scala.reflect.Manifest import sbt.io.Hash import sbt.serialization._ -sealed trait FileInfo extends NotNull { +sealed trait FileInfo { val file: File } @directSubclasses(Array(classOf[FileHash], classOf[HashModifiedFileInfo])) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala index 817896567..25cf298d7 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala @@ -89,7 +89,7 @@ object AttributeKey { def isLocal: Boolean = true def rank = Int.MaxValue } - private[sbt] final val LocalLabel = "$local" + private[sbt] final val LocalLabel = "$" + "local" } /** diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala index 5530ba0bc..3406f1b4b 100644 
--- a/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala @@ -11,7 +11,7 @@ sealed trait KList[+M[_]] { def transform[N[_]](f: M ~> N): Transform[N] /** Folds this list using a function that operates on the homogeneous type of the elements of this list. */ - def foldr[T](f: (M[_], T) => T, init: T): T = init // had trouble defining it in KNil + def foldr[B](f: (M[_], B) => B, init: B): B = init // had trouble defining it in KNil /** Applies `f` to the elements of this list in the applicative functor defined by `ap`. */ def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] @@ -39,7 +39,7 @@ final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KL np.apply(np.map(g, tt), f(head)) } def :^:[A, N[x] >: M[x]](h: N[A]) = KCons(h, this) - override def foldr[T](f: (M[_], T) => T, init: T): T = f(head, tail.foldr(f, init)) + override def foldr[B](f: (M[_], B) => B, init: B): B = f(head, tail.foldr(f, init)) } sealed abstract class KNil extends KList[Nothing] { final type Transform[N[_]] = KNil diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala b/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala index a62755544..989d657e2 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala @@ -87,7 +87,7 @@ abstract class AbstractRMap[K[_], V[_]] extends RMap[K, V] { */ class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends AbstractRMap[K, V] with PMap[K, V] { def get[T](k: K[T]): Option[V[T]] = cast[T](backing.get(k)) - def update[T](k: K[T], v: V[T]) { backing(k) = v } + def update[T](k: K[T], v: V[T]): Unit = { backing(k) = v } def remove[T](k: K[T]) = cast(backing.remove(k)) def getOrUpdate[T](k: K[T], make: => V[T]) = cast[T](backing.getOrElseUpdate(k, make)) def 
mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] = diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala index 08a58c837..68671c8ca 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala @@ -9,7 +9,7 @@ import Types._ trait Param[A[_], B[_]] { type T def in: A[T] - def ret(out: B[T]) + def ret(out: B[T]): Unit def ret: B[T] } diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala index 6519287da..a85a3faa6 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala @@ -421,14 +421,10 @@ trait Init[Scope] { def dependencies: Seq[ScopedKey[_]] def apply[S](g: T => S): Initialize[S] - @deprecated("Will be made private.", "0.13.2") - def mapReferenced(g: MapScoped): Initialize[T] - @deprecated("Will be made private.", "0.13.2") - def mapConstant(g: MapConstant): Initialize[T] - - @deprecated("Will be made private.", "0.13.2") - def validateReferenced(g: ValidateRef): ValidatedInit[T] = - validateKeyReferenced(new ValidateKeyRef { def apply[T](key: ScopedKey[T], selfRefOk: Boolean) = g(key) }) + private[sbt] def mapReferenced(g: MapScoped): Initialize[T] + private[sbt] def mapConstant(g: MapConstant): Initialize[T] + private[sbt] def validateReferenced(g: ValidateRef): ValidatedInit[T] = + validateKeyReferenced(new ValidateKeyRef { def apply[B](key: ScopedKey[B], selfRefOk: Boolean) = g(key) }) private[sbt] def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] @@ -482,14 +478,14 @@ trait Init[Scope] { private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = make(key, f(init), pos) override def toString = "setting(" + key + 
") at " + pos - protected[this] def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new Setting[T](key, init, pos) + protected[this] def make[B](key: ScopedKey[B], init: Initialize[B], pos: SourcePosition): Setting[B] = new Setting[B](key, init, pos) protected[sbt] def isDerived: Boolean = false private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) /** Turn this setting into a `DefaultSetting` if it's not already, otherwise returns `this` */ private[sbt] def default(id: => Long = nextDefaultID()): DefaultSetting[T] = DefaultSetting(key, init, pos, id) } private[Init] sealed class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends Setting[T](sk, i, p) { - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = new DerivedSetting[T](key, init, pos, filter, trigger) + override def make[B](key: ScopedKey[B], init: Initialize[B], pos: SourcePosition): Setting[B] = new DerivedSetting[B](key, init, pos, filter, trigger) protected[sbt] override def isDerived: Boolean = true override def default(_id: => Long): DefaultSetting[T] = new DerivedSetting[T](sk, i, p, filter, trigger) with DefaultSetting[T] { val id = _id } override def toString = "derived " + super.toString @@ -498,7 +494,7 @@ trait Init[Scope] { // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. 
private[Init] sealed trait DefaultSetting[T] extends Setting[T] { val id: Long - override def make[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition): Setting[T] = super.make(key, init, pos) default id + override def make[B](key: ScopedKey[B], init: Initialize[B], pos: SourcePosition): Setting[B] = super.make(key, init, pos) default id override final def hashCode = id.hashCode override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } override def toString = s"default($id) " + super.toString @@ -547,7 +543,7 @@ trait Init[Scope] { case None => this case Some(const) => new Value(() => transform(const)) } - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init + private[sbt] def processAttributes[B](init: B)(f: (B, AttributeMap) => B): B = init } private[this] final class GetValue[S, T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] trait KeyedInitialize[T] extends Keyed[T, T] { @@ -585,7 +581,7 @@ trait Init[Scope] { new Bind[S, T](s => handleUndefined(f(s) validateKeyReferenced g), validIn) } def mapConstant(g: MapConstant) = new Bind[S, T](s => f(s) mapConstant g, in mapConstant g) - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = in.processAttributes(init)(f) + private[sbt] def processAttributes[B](init: B)(f: (B, AttributeMap) => B): B = in.processAttributes(init)(f) } private[sbt] final class Optional[S, T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] { def dependencies = deps(a.toList) @@ -599,7 +595,7 @@ trait Init[Scope] { def evaluate(ss: Settings[Scope]): T = f(a.flatMap(i => trapBadRef(evaluateT(ss)(i)))) // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either be used private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReference => None } - private[sbt] def 
processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = a match { + private[sbt] def processAttributes[B](init: B)(f: (B, AttributeMap) => B): B = a match { case None => init case Some(i) => i.processAttributes(init)(f) } @@ -633,7 +629,7 @@ trait Init[Scope] { { val tx = alist.transform(inputs, validateKeyReferencedT(g)) val undefs = alist.toList(tx).flatMap(_.left.toSeq.flatten) - val get = new (ValidatedInit ~> Initialize) { def apply[T](vr: ValidatedInit[T]) = vr.right.get } + val get = new (ValidatedInit ~> Initialize) { def apply[B](vr: ValidatedInit[B]) = vr.right.get } if (undefs.isEmpty) Right(new Apply(f, alist.transform(tx, get), alist)) else Left(undefs) } diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala index 8631fc75b..0c9fac038 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala @@ -31,7 +31,7 @@ object Signals { import sun.misc.{ Signal, SignalHandler } val intSignal = new Signal(signal) val newHandler = new SignalHandler { - def handle(sig: Signal) { handler() } + def handle(sig: Signal): Unit = { handler() } } val oldHandler = Signal.handle(intSignal, newHandler) object unregisterNewHandler extends Registration { @@ -74,7 +74,7 @@ private final class Signals0 { import sun.misc.{ Signal, SignalHandler } val intSignal = new Signal(signal) val newHandler = new SignalHandler { - def handle(sig: Signal) { handler() } + def handle(sig: Signal): Unit = { handler() } } val oldHandler = Signal.handle(intSignal, newHandler) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index 712071684..23938b03d 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ 
b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -52,7 +52,7 @@ abstract class JLine extends LineReader { } } - private[this] def resume() { + private[this] def resume(): Unit = { jline.TerminalFactory.reset JLine.terminal.init reader.drawLine() diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala index 350c58dfa..d5a96836e 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala @@ -7,7 +7,7 @@ package complete import History.number import java.io.File -final class History private (val lines: IndexedSeq[String], val path: Option[File], error: String => Unit) extends NotNull { +final class History private (val lines: IndexedSeq[String], val path: Option[File], error: String => Unit) { private def reversed = lines.reverse def all: Seq[String] = lines diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala index af16e35d4..e636d914c 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala @@ -20,7 +20,7 @@ object StackTrace { require(d >= 0) val b = new StringBuilder() - def appendStackTrace(t: Throwable, first: Boolean) { + def appendStackTrace(t: Throwable, first: Boolean): Unit = { val include: StackTraceElement => Boolean = if (d == 0) @@ -30,7 +30,7 @@ object StackTrace { (_ => { count -= 1; count >= 0 }) } - def appendElement(e: StackTraceElement) { + def appendElement(e: StackTraceElement): Unit = { b.append("\tat ") b.append(e) b.append('\n') diff --git a/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala b/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala index 
b6225896f..bfc962891 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala @@ -3,7 +3,7 @@ */ package sbt.util -sealed trait LogEvent extends NotNull +sealed trait LogEvent final class Success(val msg: String) extends LogEvent final class Log(val level: Level.Value, val msg: String) extends LogEvent final class Trace(val exception: Throwable) extends LogEvent diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala index 1f33a3761..4b1aace56 100644 --- a/internal/util-logging/src/test/scala/LogWriterTest.scala +++ b/internal/util-logging/src/test/scala/LogWriterTest.scala @@ -103,14 +103,14 @@ object LogWriterTest extends Properties("Log Writer") { /* Helper classes*/ -final class Output(val lines: List[List[ToLog]], val level: Level.Value) extends NotNull { +final class Output(val lines: List[List[ToLog]], val level: Level.Value) { override def toString = "Level: " + level + "\n" + lines.map(_.mkString).mkString("\n") } -final class NewLine(val str: String) extends NotNull { +final class NewLine(val str: String) { override def toString = Escape(str) } -final class ToLog(val content: String, val byCharacter: Boolean) extends NotNull { +final class ToLog(val content: String, val byCharacter: Boolean) { def contentOnly = Escape.newline(content, "") override def toString = if (content.isEmpty) "" else "ToLog('" + Escape(contentOnly) + "', " + byCharacter + ")" } diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala index 10afbea6f..801fc22cf 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala @@ -17,7 +17,7 @@ object ChangeReport { } } /** The result of comparing some current set of 
objects against a previous set of objects.*/ -trait ChangeReport[T] extends NotNull { +trait ChangeReport[T] { /** The set of all of the objects in the current set.*/ def checked: Set[T] /** All of the objects that are in the same state in the current and reference sets.*/ diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index 77d4b3a29..e8d3be591 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -234,23 +234,23 @@ object FileFunction { type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => Set[File] /** - Generic change-detection helper used to help build / artifact generation / - etc. steps detect whether or not they need to run. Returns a function whose - input is a Set of input files, and subsequently executes the action function - (which does the actual work: compiles, generates resources, etc.), returning - a Set of output files that it generated. - - The input file and resulting output file state is cached in - cacheBaseDirectory. On each invocation, the state of the input and output - files from the previous run is compared against the cache, as is the set of - input files. If a change in file state / input files set is detected, the - action function is re-executed. - - @param cacheBaseDirectory The folder in which to store - @param inStyle The strategy by which to detect state change in the input files from the previous run - @param outStyle The strategy by which to detect state change in the output files from the previous run - @param action The work function, which receives a list of input files and returns a list of output files - */ + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run. 
Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in + * cacheBaseDirectory. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files. If a change in file state / input files set is detected, the + * action function is re-executed. + * + * @param cacheBaseDirectory The folder in which to store + * @param inStyle The strategy by which to detect state change in the input files from the previous run + * @param outStyle The strategy by which to detect state change in the output files from the previous run + * @param action The work function, which receives a list of input files and returns a list of output files + */ def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style = FilesInfo.lastModified, outStyle: FilesInfo.Style = FilesInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = cached(cacheBaseDirectory)(inStyle, outStyle)((in, out) => action(in.checked)) From 5ad5591c8ee07e48f527c350742dbc29c8d1a8f4 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Thu, 7 Apr 2016 12:47:36 +0200 Subject: [PATCH 577/823] Add target/ to .gitignore --- .gitignore | 1 + 1 file changed, 1 insertion(+) create mode 100644 .gitignore diff --git a/.gitignore b/.gitignore new file mode 100644 index 000000000..9f970225a --- /dev/null +++ b/.gitignore @@ -0,0 +1 @@ +target/ \ No newline at end of file From 89e88ff584dfc57d1382106b8607049ee86e1a86 Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Thu, 7 Apr 2016 12:48:45 +0200 Subject: [PATCH 578/823] Run scripted tests in the alphabetical order Makes the order deterministic and makes it easier to see the progress on running tests. 
--- .../main/scala/sbt/internal/scripted/ScriptedTests.scala | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index 665d97f0f..18c2f2e9e 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -147,13 +147,13 @@ final class ListTests(baseDirectory: File, accept: ScriptedTest => Boolean, log: listTests(group).map(ScriptedTest(groupName, _)) } } - private[this] def listTests(group: File): Set[String] = + private[this] def listTests(group: File): Seq[String] = { val groupName = group.getName - val allTests = list(group, filter) + val allTests = list(group, filter).sortBy(_.getName) if (allTests.isEmpty) { log.warn("No tests in test group " + groupName) - Set.empty + Seq.empty } else { val (included, skipped) = allTests.toList.partition(test => accept(ScriptedTest(groupName, test.getName))) if (included.isEmpty) @@ -162,7 +162,7 @@ final class ListTests(baseDirectory: File, accept: ScriptedTest => Boolean, log: log.warn("Tests skipped in group " + group.getName + ":") skipped.foreach(testName => log.warn(" " + testName.getName)) } - Set(included.map(_.getName): _*) + Seq(included.map(_.getName): _*) } } } From 9f9ac3a9ccbf0a9b8c51b6c237ccd4697a5d08fd Mon Sep 17 00:00:00 2001 From: Grzegorz Kossakowski Date: Tue, 12 Apr 2016 20:14:10 +0200 Subject: [PATCH 579/823] Scripted logger logs everything Do not filter any logging in scripted logger by setting the log level to Debug. The caller of ScriptedRunner passes a logger and decides the level of logging it wants to receive. Scripted shouldn't filter anything. 
--- .../src/main/scala/sbt/internal/scripted/ScriptedTests.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index 18c2f2e9e..c008f92ba 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -13,6 +13,7 @@ object ScriptedRunnerImpl { def run(resourceBaseDirectory: File, bufferLog: Boolean, tests: Array[String], handlersProvider: HandlersProvider): Unit = { val runner = new ScriptedTests(resourceBaseDirectory, bufferLog, handlersProvider) val logger = ConsoleLogger() + logger.setLevel(sbt.util.Level.Debug) val allTests = get(tests, resourceBaseDirectory, logger) flatMap { case ScriptedTest(group, name) => runner.scriptedTest(group, name, logger) @@ -73,6 +74,7 @@ final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handl private def scriptedTest(label: String, testDirectory: File, prescripted: File => Unit, log: Logger): Unit = { val buffered = new BufferedLogger(new FullLogger(log)) + buffered.setLevel(sbt.util.Level.Debug) if (bufferLog) buffered.record() From d49b6fd4207153ea152ae6e09c6ec501fe9121e3 Mon Sep 17 00:00:00 2001 From: eugene yokota Date: Sat, 23 Apr 2016 23:56:25 -0400 Subject: [PATCH 580/823] Fixes #2480. 
Workaround for Jline regression (#2570) Workaround jline/jline2#205 --- .../src/main/scala/sbt/internal/util/LineReader.scala | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index 23938b03d..a345099fd 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -48,7 +48,10 @@ abstract class JLine extends LineReader { val lines = """\r?\n""".r.split(prompt) lines.length match { case 0 | 1 => prompt - case _ => reader.print(lines.init.mkString("\n") + "\n"); lines.last; + case _ => + // Workaround for regression jline/jline2#205 + reader.getOutput.write(lines.init.mkString("\n") + "\n") + lines.last } } From 30ee653e829975dc30bc8f7760a672bd412294c8 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 2 May 2016 04:00:57 -0400 Subject: [PATCH 581/823] Bump to sbinary 0.4.3 sbinary 0.4.3 is available from Maven Central --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 2aa4efc99..71ae6af5f 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -8,7 +8,7 @@ object Dependencies { lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M3" lazy val jline = "jline" % "jline" % "2.13" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" - lazy val sbinary = "org.scala-tools.sbinary" %% "sbinary" % "0.4.2" + lazy val sbinary = "org.scala-sbt" %% "sbinary" % "0.4.3" lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } From 1f45027b3aeec6e7f577f0da57e4100733a4e04f Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 22 
Feb 2016 12:12:32 +0100 Subject: [PATCH 582/823] Completion for build-level keys sbt's shell provided completion only for keys that were relative to a defined project, but didn't provide completion for keys that belong to the build definition only. This commit fixes this issue by defining a new kind of `Parser` (from which completions are generated) which runs its input simultaneously on distinct parsers. We now define a parser for project-level keys and another parser for build-level keys. These two parsers are eventually combined, and we get the completions of both parsers. Fixes sbt/sbt#2460 --- .../sbt/internal/util/complete/Parser.scala | 28 +++++++++++++++++++ .../src/test/scala/ParserTest.scala | 9 ++++++ 2 files changed, 37 insertions(+) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index a6d5474aa..2642ad486 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -127,6 +127,9 @@ sealed trait RichParser[A] { /** Applies the original parser, applies `f` to the result to get the next parser, and applies that parser and uses its result for the overall result. */ def flatMap[B](f: A => Parser[B]): Parser[B] + + /** Applies both the original parser and `b` on the same input and returns the results produced by each parser. */ + def combinedWith(b: Parser[A]): Parser[Seq[A]] } /** Contains Parser implementation helper methods not typically needed for using parsers.
*/ @@ -230,6 +233,12 @@ object Parser extends ParserMain { } } + def combinedParser[A](a: Parser[A], b: Parser[A]): Parser[Seq[A]] = + if (a.valid) + if (b.valid) new CombiningParser(a, b) else a.map(Seq(_)) + else + b.map(Seq(_)) + def choiceParser[A, B](a: Parser[A], b: Parser[B]): Parser[Either[A, B]] = if (a.valid) if (b.valid) new HetParser(a, b) else a.map(left.fn) @@ -310,6 +319,7 @@ trait ParserMain { def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) + def combinedWith(b: Parser[A]): Parser[Seq[A]] = combinedParser(a, b) } implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) @@ -633,6 +643,24 @@ private final class HetParser[A, B](a: Parser[A], b: Parser[B]) extends ValidPar def completions(level: Int) = a.completions(level) ++ b.completions(level) override def toString = "(" + a + " || " + b + ")" } +private final class CombiningParser[T](a: Parser[T], b: Parser[T]) extends ValidParser[Seq[T]] { + lazy val result: Option[Seq[T]] = (a.result.toSeq ++ b.result.toSeq) match { case Seq() => None; case seq => Some(seq) } + def completions(level: Int) = a.completions(level) ++ b.completions(level) + def derive(i: Char) = + (a.valid, b.valid) match { + case (true, true) => new CombiningParser(a derive i, b derive i) + case (true, false) => a derive i map (Seq(_)) + case (false, true) => b derive i map (Seq(_)) + case (false, false) => new Invalid(mkFailure("No valid parser available.")) + } + def resultEmpty = + (a.resultEmpty, b.resultEmpty) match { + case (Value(ra), Value(rb)) => Value(Seq(ra, rb)) + case (Value(ra), _) => Value(Seq(ra)) + case (_, Value(rb)) => Value(Seq(rb)) + case _ => Value(Nil) + } +} private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] { assert(a.nonEmpty) lazy val resultEmpty: Result[Seq[T]] = 
diff --git a/internal/util-complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala index 1db99b513..34e35efbe 100644 --- a/internal/util-complete/src/test/scala/ParserTest.scala +++ b/internal/util-complete/src/test/scala/ParserTest.scala @@ -108,6 +108,15 @@ object ParserTest extends Properties("Completing Parser") { property("repeatDep requires at least one token") = !matches(repeat, "") property("repeatDep accepts one token") = matches(repeat, colors.toSeq.head) property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) + property("combined parser gives completion of both parsers") = { + val prefix = "fix" + val p1Suffixes = Set("", "ated", "ation") + val p2Suffixes = Set("es", "ed") + val p1: Parser[String] = p1Suffixes map (suffix => (prefix + suffix): Parser[String]) reduce (_ | _) + val p2: Parser[String] = p2Suffixes map (suffix => (prefix + suffix): Parser[String]) reduce (_ | _) + val suggestions: Set[Completion] = p1Suffixes ++ p2Suffixes map (new Suggestion(_)) + checkAll(prefix, p1 combinedWith p2, Completions(suggestions)) + } } object ParserExample { val ws = charClass(_.isWhitespace).+ From 605beef7d3b286585e63116894951f581f1cf75a Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Tue, 23 Feb 2016 15:19:04 +0100 Subject: [PATCH 583/823] Address problems reported by Codacy --- .../scala/sbt/internal/util/complete/Parser.scala | 12 +++++------- 1 file changed, 5 insertions(+), 7 deletions(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index 2642ad486..bc1229ad1 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -233,12 +233,6 @@ object Parser extends ParserMain { } } - def combinedParser[A](a: Parser[A], b: Parser[A]): 
Parser[Seq[A]] = - if (a.valid) - if (b.valid) new CombiningParser(a, b) else a.map(Seq(_)) - else - b.map(Seq(_)) - def choiceParser[A, B](a: Parser[A], b: Parser[B]): Parser[Either[A, B]] = if (a.valid) if (b.valid) new HetParser(a, b) else a.map(left.fn) @@ -319,7 +313,11 @@ trait ParserMain { def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) - def combinedWith(b: Parser[A]): Parser[Seq[A]] = combinedParser(a, b) + def combinedWith(b: Parser[A]): Parser[Seq[A]] = + if (a.valid) + if (b.valid) new CombiningParser(a, b) else a.map(Seq(_)) + else + b.map(Seq(_)) } implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) From 5a60c0eea788b7154eeb1d7fe3634eeaeac00f3f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 3 May 2016 17:09:22 -0400 Subject: [PATCH 584/823] 0.1.0-M11 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 1b16a5050..9e211758b 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "0.1.0-M10" +def baseVersion: String = "0.1.0-M11" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From 9c49a0ed9fa854496345265e5c8489d2762c37cc Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 4 May 2016 16:27:29 -0400 Subject: [PATCH 585/823] Update dependencies --- build.sbt | 3 ++- .../main/scala/sbt/internal/util/SeparatedCache.scala | 2 +- .../src/test/scala/DagSpecification.scala | 9 ++++++--- .../scala/sbt/internal/scripted/FileCommands.scala | 1 + .../scala/sbt/internal/scripted/ScriptedTests.scala | 1 + .../src/main/scala/sbt/internal/util/Tracked.scala | 3 ++- project/Dependencies.scala | 11 ++++++----- 7 files changed, 19 insertions(+), 11 deletions(-) 
diff --git a/build.sbt b/build.sbt index 9e211758b..f2d68877a 100644 --- a/build.sbt +++ b/build.sbt @@ -14,6 +14,7 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala210, scala211), + scalacOptions -= "-Yinline-warnings", scalacOptions ++= Seq( "-encoding", "utf8", "-deprecation", @@ -23,7 +24,7 @@ def commonSettings: Seq[Setting[_]] = Seq( "-language:higherKinds", "-language:implicitConversions", // "-Xfuture", - "-Yinline-warnings", + // "-Yinline-warnings", // "-Yfatal-warnings", "-Yno-adapted-args", "-Ywarn-dead-code", diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala index a68e46083..379cdbff7 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala @@ -7,7 +7,7 @@ import Types.:+: import sbinary.{ DefaultProtocol, Format, Input, Output => Out } import DefaultProtocol.ByteFormat import java.io.{ File, InputStream, OutputStream } -import sbt.internal.io.Using +import sbt.io.Using trait InputCache[I] { type Internal diff --git a/internal/util-collection/src/test/scala/DagSpecification.scala b/internal/util-collection/src/test/scala/DagSpecification.scala index 9e5025488..3b3614e39 100644 --- a/internal/util-collection/src/test/scala/DagSpecification.scala +++ b/internal/util-collection/src/test/scala/DagSpecification.scala @@ -19,8 +19,11 @@ object DagSpecification extends Properties("Dag") { val nodes = new HashSet[TestDag] def nonterminalGen(p: Gen.Parameters): Gen[TestDag] = { - for (i <- 0 until nodeCount; nextDeps <- Gen.someOf(nodes).apply(p)) - nodes += new TestDag(i, nextDeps) + val seed = rng.Seed.random() + for { + i <- 0 until nodeCount + nextDeps 
<- Gen.someOf(nodes).apply(p, seed) + } nodes += new TestDag(i, nextDeps) for (nextDeps <- Gen.someOf(nodes)) yield new TestDag(nodeCount, nextDeps) } Gen.parameterized(nonterminalGen) @@ -47,4 +50,4 @@ object DagSpecification extends Properties("Dag") { } class TestDag(id: Int, val dependencies: Iterable[TestDag]) extends Dag[TestDag] { override def toString = id + "->" + dependencies.mkString("[", ",", "]") -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index ea5fc7559..65b5af423 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -7,6 +7,7 @@ package scripted import java.io.File import sbt.io.{ IO, Path } +import sbt.io.syntax._ import Path._ class FileCommands(baseDirectory: File) extends BasicStatementHandler { diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index c008f92ba..9575f2f6c 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -7,6 +7,7 @@ import sbt.util.Logger import sbt.internal.util.{ ConsoleLogger, BufferedLogger, FullLogger } import sbt.io.IO.wrapNull import sbt.io.{ DirectoryFilter, HiddenFileFilter, Path, GlobFilter } +import sbt.io.syntax._ import sbt.internal.io.Resources object ScriptedRunnerImpl { diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index e8d3be591..28ee9c21e 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ 
b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -11,7 +11,8 @@ import scala.reflect.Manifest import scala.collection.mutable import sbt.io.IO.{ delete, read, write } import sbt.io.{ IO, Path } -import sbt.internal.io.Using +import sbt.io.Using +import sbt.io.syntax._ import sbt.serialization._ object Tracked { diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 71ae6af5f..05e9584db 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -3,9 +3,10 @@ import Keys._ object Dependencies { lazy val scala210 = "2.10.6" - lazy val scala211 = "2.11.7" + lazy val scala211 = "2.11.8" + lazy val scala212 = "2.12.0-M4" - lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M3" + lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M5" lazy val jline = "jline" % "jline" % "2.13" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val sbinary = "org.scala-sbt" %% "sbinary" % "0.4.3" @@ -21,10 +22,10 @@ object Dependencies { } } - lazy val scalaXml = scala211Module("scala-xml", "1.0.1") + lazy val scalaXml = scala211Module("scala-xml", "1.0.5") - lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.12.4" - lazy val scalatest = "org.scalatest" %% "scalatest" % "2.2.4" + val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.1" + val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" } From daaa45f494571a6c483165d296d3750338593036 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 4 May 2016 16:28:17 -0400 Subject: [PATCH 586/823] 0.1.0-M12 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index f2d68877a..60147bbf4 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "0.1.0-M11" +def baseVersion: String = 
"0.1.0-M12" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From 75ca2537750e55c1c09911ba836f9d285346d9f8 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 11 May 2016 11:30:40 -0400 Subject: [PATCH 587/823] Don't publish tests --- build.sbt | 5 ++--- project/Dependencies.scala | 2 +- 2 files changed, 3 insertions(+), 4 deletions(-) diff --git a/build.sbt b/build.sbt index 60147bbf4..926ecb3da 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "0.1.0-M12" +def baseVersion: String = "0.1.0-M13" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -32,7 +32,7 @@ def commonSettings: Seq[Setting[_]] = Seq( "-Ywarn-value-discard"), previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, - publishArtifact in Test := true + publishArtifact in Test := false ) lazy val utilRoot: Project = (project in file(".")). @@ -108,7 +108,6 @@ lazy val utilLogging = (project in internalPath / "util-logging"). dependsOn(utilInterface, utilTesting % Test). 
settings( commonSettings, - publishArtifact in (Test, packageBin) := true, name := "Util Logging", libraryDependencies += jline ) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 05e9584db..e0ea6ad2d 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,7 @@ object Dependencies { lazy val scala211 = "2.11.8" lazy val scala212 = "2.12.0-M4" - lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M5" + lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M6" lazy val jline = "jline" % "jline" % "2.13" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val sbinary = "org.scala-sbt" %% "sbinary" % "0.4.3" From 8989549cb63da218d6bf65132857edcec5d94634 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sat, 18 Jun 2016 12:08:22 +0200 Subject: [PATCH 588/823] Remove scalacOptions already set by house rules --- build.sbt | 16 ---------------- 1 file changed, 16 deletions(-) diff --git a/build.sbt b/build.sbt index 926ecb3da..802fa7b17 100644 --- a/build.sbt +++ b/build.sbt @@ -14,22 +14,6 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala210, scala211), - scalacOptions -= "-Yinline-warnings", - scalacOptions ++= Seq( - "-encoding", "utf8", - "-deprecation", - "-feature", - "-unchecked", - "-Xlint", - "-language:higherKinds", - "-language:implicitConversions", - // "-Xfuture", - // "-Yinline-warnings", - // "-Yfatal-warnings", - "-Yno-adapted-args", - "-Ywarn-dead-code", - "-Ywarn-numeric-widen", - "-Ywarn-value-discard"), previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false From d0826ff13ca8b399739c76922bd90f91c2d2e009 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 19 Jun 2016 11:14:12 +0100 Subject: [PATCH 
589/823] Fix compilation warnings, migrate to blackbox.Context --- .../internal/util/appmacro/ContextUtil.scala | 44 +++++++++---------- .../sbt/internal/util/appmacro/Convert.scala | 18 ++++---- .../sbt/internal/util/appmacro/Instance.scala | 10 ++--- .../internal/util/appmacro/KListBuilder.scala | 12 ++--- .../internal/util/appmacro/MixedBuilder.scala | 4 +- .../internal/util/appmacro/TupleBuilder.scala | 4 +- .../util/appmacro/TupleNBuilder.scala | 10 ++--- .../main/scala/sbt/internal/util/Cache.scala | 2 +- .../scala/sbt/complete/FileExamplesTest.scala | 1 + 9 files changed, 48 insertions(+), 57 deletions(-) diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala index e5cd74270..ee3b56361 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala @@ -14,7 +14,7 @@ object ContextUtil { * Constructs an object with utility methods for operating in the provided macro context `c`. * Callers should explicitly specify the type parameter as `c.type` in order to preserve the path dependent types. */ - def apply[C <: Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) + def apply[C <: blackbox.Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) /** * Helper for implementing a no-argument macro that is introduced via an implicit. @@ -23,7 +23,7 @@ object ContextUtil { * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this * method, the result of this method is `f()`. 
*/ - def selectMacroImpl[T: c.WeakTypeTag](c: Context)(f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = + def selectMacroImpl[T: c.WeakTypeTag](c: blackbox.Context)(f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = { import c.universe._ c.macroApplication match { @@ -32,20 +32,18 @@ object ContextUtil { } } - def unexpectedTree[C <: Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) + def unexpectedTree[C <: blackbox.Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) } -// TODO 2.11 Remove this after dropping 2.10.x support. -private object HasCompat { val compat = this }; import HasCompat._ - /** - * Utility methods for macros. Several methods assume that the context's universe is a full compiler (`scala.tools.nsc.Global`). + * Utility methods for macros. Several methods assume that the context's universe is a full compiler + * (`scala.tools.nsc.Global`). * This is not thread safe due to the underlying Context and related data structures not being thread safe. * Use `ContextUtil[c.type](c)` to construct. 
*/ -final class ContextUtil[C <: Context](val ctx: C) { +final class ContextUtil[C <: blackbox.Context](val ctx: C) { import ctx.universe.{ Apply => ApplyTree, _ } - import compat._ + import internal.decorators._ val powerContext = ctx.asInstanceOf[reflect.macros.runtime.Context] val global: powerContext.universe.type = powerContext.universe @@ -53,7 +51,7 @@ final class ContextUtil[C <: Context](val ctx: C) { val initialOwner: Symbol = callsiteTyper.context.owner.asInstanceOf[ctx.universe.Symbol] lazy val alistType = ctx.typeOf[AList[KList]] - lazy val alist: Symbol = alistType.typeSymbol.companionSymbol + lazy val alist: Symbol = alistType.typeSymbol.companion lazy val alistTC: Type = alistType.typeConstructor /** Modifiers for a local val.*/ @@ -63,9 +61,9 @@ final class ContextUtil[C <: Context](val ctx: C) { /** * Constructs a unique term name with the given prefix within this Context. - * (The current implementation uses Context.fresh, which increments + * (The current implementation uses Context.freshName, which increments */ - def freshTermName(prefix: String) = newTermName(ctx.fresh("$" + prefix)) + def freshTermName(prefix: String) = TermName(ctx.freshName("$" + prefix)) /** * Constructs a new, synthetic, local ValDef Type `tpe`, a unique name, @@ -76,7 +74,7 @@ final class ContextUtil[C <: Context](val ctx: C) { val SYNTHETIC = (1 << 21).toLong.asInstanceOf[FlagSet] val sym = owner.newTermSymbol(freshTermName("q"), pos, SYNTHETIC) setInfo(sym, tpe) - val vd = ValDef(sym, EmptyTree) + val vd = internal.valDef(sym, EmptyTree) vd.setPos(pos) vd } @@ -94,7 +92,7 @@ final class ContextUtil[C <: Context](val ctx: C) { val process = new Traverser { override def traverse(t: Tree) = t match { case _: Ident => () - case ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) if isWrapper(nme.decoded, tpe.tpe, qual) => () + case ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) if isWrapper(nme.decodedName.toString, tpe.tpe, qual) => () case tree 
=> if (tree.symbol ne null) defs += tree.symbol; super.traverse(tree) @@ -117,7 +115,7 @@ final class ContextUtil[C <: Context](val ctx: C) { */ def checkReferences(defs: collection.Set[Symbol], isWrapper: (String, Type, Tree) => Boolean): Tree => Unit = { case s @ ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) => - if (isWrapper(nme.decoded, tpe.tpe, qual)) ctx.error(s.pos, DynamicDependencyError) + if (isWrapper(nme.decodedName.toString, tpe.tpe, qual)) ctx.error(s.pos, DynamicDependencyError) case id @ Ident(name) if illegalReference(defs, id.symbol) => ctx.error(id.pos, DynamicReferenceError + ": " + name) case _ => () } @@ -134,11 +132,11 @@ final class ContextUtil[C <: Context](val ctx: C) { def mkTuple(args: List[Tree]): Tree = global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] - def setSymbol[Tree](t: Tree, sym: Symbol): Unit = { + def setSymbol[_Tree](t: _Tree, sym: Symbol): Unit = { t.asInstanceOf[global.Tree].setSymbol(sym.asInstanceOf[global.Symbol]) () } - def setInfo[Tree](sym: Symbol, tpe: Type): Unit = { + def setInfo(sym: Symbol, tpe: Type): Unit = { sym.asInstanceOf[global.Symbol].setInfo(tpe.asInstanceOf[global.Type]) () } @@ -151,7 +149,7 @@ final class ContextUtil[C <: Context](val ctx: C) { lazy val idTC: Type = { val tvar = newTypeVariable(NoSymbol) - polyType(tvar :: Nil, refVar(tvar)) + internal.polyType(tvar :: Nil, refVar(tvar)) } /** A Type that references the given type variable. 
*/ def refVar(variable: TypeSymbol): Type = variable.toTypeConstructor @@ -159,12 +157,12 @@ final class ContextUtil[C <: Context](val ctx: C) { def newTCVariable(owner: Symbol): TypeSymbol = { val tc = newTypeVariable(owner) - val arg = newTypeVariable(tc, "x") - tc.setTypeSignature(PolyType(arg :: Nil, emptyTypeBounds)) + val arg = newTypeVariable(tc, "x"); + tc.setInfo(internal.polyType(arg :: Nil, emptyTypeBounds)) tc } /** >: Nothing <: Any */ - def emptyTypeBounds: TypeBounds = TypeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) + def emptyTypeBounds: TypeBounds = internal.typeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) /** Creates a new anonymous function symbol with Position `pos`. */ def functionSymbol(pos: Position): Symbol = @@ -210,7 +208,7 @@ final class ContextUtil[C <: Context](val ctx: C) { case x => sys.error("Instance must be static (was " + x + ").") } - def select(t: Tree, name: String): Tree = Select(t, newTermName(name)) + def select(t: Tree, name: String): Tree = Select(t, TermName(name)) /** Returns the symbol for the non-private method named `name` for the class/module `obj`. 
*/ def method(obj: Symbol, name: String): Symbol = { @@ -247,7 +245,7 @@ final class ContextUtil[C <: Context](val ctx: C) { override def transform(tree: Tree): Tree = tree match { case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => - subWrapper(nme.decoded, targ.tpe, qual, tree) match { + subWrapper(nme.decodedName.toString, targ.tpe, qual, tree) match { case Converted.Success(t, finalTx) => changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 finalTx(t) diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala index 1d0ebede1..8accb85c6 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala @@ -6,32 +6,32 @@ import macros._ import Types.idFun abstract class Convert { - def apply[T: c.WeakTypeTag](c: Context)(nme: String, in: c.Tree): Converted[c.type] - def asPredicate(c: Context): (String, c.Type, c.Tree) => Boolean = + def apply[T: c.WeakTypeTag](c: blackbox.Context)(nme: String, in: c.Tree): Converted[c.type] + def asPredicate(c: blackbox.Context): (String, c.Type, c.Tree) => Boolean = (n, tpe, tree) => { val tag = c.WeakTypeTag(tpe) apply(c)(n, tree)(tag).isSuccess } } -sealed trait Converted[C <: Context with Singleton] { +sealed trait Converted[C <: blackbox.Context with Singleton] { def isSuccess: Boolean def transform(f: C#Tree => C#Tree): Converted[C] } object Converted { - def NotApplicable[C <: Context with Singleton] = new NotApplicable[C] - final case class Failure[C <: Context with Singleton](position: C#Position, message: String) extends Converted[C] { + def NotApplicable[C <: blackbox.Context with Singleton] = new NotApplicable[C] + final case class Failure[C <: blackbox.Context with Singleton](position: C#Position, message: String) extends Converted[C] { 
def isSuccess = false def transform(f: C#Tree => C#Tree): Converted[C] = new Failure(position, message) } - final class NotApplicable[C <: Context with Singleton] extends Converted[C] { + final class NotApplicable[C <: blackbox.Context with Singleton] extends Converted[C] { def isSuccess = false def transform(f: C#Tree => C#Tree): Converted[C] = this } - final case class Success[C <: Context with Singleton](tree: C#Tree, finalTransform: C#Tree => C#Tree) extends Converted[C] { + final case class Success[C <: blackbox.Context with Singleton](tree: C#Tree, finalTransform: C#Tree => C#Tree) extends Converted[C] { def isSuccess = true def transform(f: C#Tree => C#Tree): Converted[C] = Success(f(tree), finalTransform) } object Success { - def apply[C <: Context with Singleton](tree: C#Tree): Success[C] = Success(tree, idFun) + def apply[C <: blackbox.Context with Singleton](tree: C#Tree): Success[C] = Success(tree, idFun) } -} \ No newline at end of file +} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala index aa8eafe27..3177d59c4 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala @@ -33,10 +33,10 @@ object Instance { final val InstanceTCName = "M" final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef) - trait Transform[C <: Context with Singleton, N[_]] { + trait Transform[C <: blackbox.Context with Singleton, N[_]] { def apply(in: C#Tree): C#Tree } - def idTransform[C <: Context with Singleton]: Transform[C, Id] = new Transform[C, Id] { + def idTransform[C <: blackbox.Context with Singleton]: Transform[C, Id] = new Transform[C, Id] { def apply(in: C#Tree): C#Tree = in } @@ -76,7 +76,7 @@ object Instance { * If this is for multi-input flatMap (app followed by flatMap), * 
this should be the argument wrapped in Right. */ - def contImpl[T, N[_]](c: Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type, N])( + def contImpl[T, N[_]](c: blackbox.Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type, N])( implicit tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type] ): c.Expr[i.M[N[T]]] = @@ -85,11 +85,11 @@ object Instance { val util = ContextUtil[c.type](c) val mTC: Type = util.extractTC(i, InstanceTCName) - val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).normalize + val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).dealias // the tree for the macro argument val (tree, treeType) = t match { - case Left(l) => (l.tree, nt.tpe.normalize) + case Left(l) => (l.tree, nt.tpe.dealias) case Right(r) => (r.tree, mttpe) } // the Symbol for the anonymous function passed to the appropriate Instance.map/flatMap/pure method diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala index 5d19f5b6c..cab5058cb 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala @@ -8,20 +8,16 @@ import macros._ /** A `TupleBuilder` that uses a KList as the tuple representation.*/ object KListBuilder extends TupleBuilder { - // TODO 2.11 Remove this after dropping 2.10.x support. 
- private object HasCompat { val compat = this }; import HasCompat._ - - def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { + def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val ctx: c.type = c val util = ContextUtil[c.type](c) import c.universe.{ Apply => ApplyTree, _ } - import compat._ import util._ val knilType = c.typeOf[KNil] - val knil = Ident(knilType.typeSymbol.companionSymbol) + val knil = Ident(knilType.typeSymbol.companion) val kconsTpe = c.typeOf[KCons[Int, KNil, List]] - val kcons = kconsTpe.typeSymbol.companionSymbol + val kcons = kconsTpe.typeSymbol.companion val mTC: Type = mt.asInstanceOf[c.universe.Type] val kconsTC: Type = kconsTpe.typeConstructor @@ -62,7 +58,7 @@ object KListBuilder extends TupleBuilder { */ val klistType: Type = (inputs :\ knilType)((in, klist) => kconsType(in.tpe, klist)) - val representationC = PolyType(tcVariable :: Nil, klistType) + val representationC = internal.polyType(tcVariable :: Nil, klistType) val resultType = appliedType(representationC, idTC :: Nil) val input = klist val alistInstance: ctx.universe.Tree = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala index cc2897ae3..cd77f50ae 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala @@ -9,9 +9,9 @@ import macros._ * and `KList` for larger numbers of inputs. This builder cannot handle fewer than 2 inputs. 
*/ object MixedBuilder extends TupleBuilder { - def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = + def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = { val delegate = if (inputs.size > TupleNBuilder.MaxInputs) KListBuilder else TupleNBuilder delegate.make(c)(mt, inputs) } -} \ No newline at end of file +} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala index 7ed352457..c36baa78a 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala @@ -29,10 +29,10 @@ trait TupleBuilder { type Inputs[U <: Universe with Singleton] = List[Instance.Input[U]] /** Constructs a one-time use Builder for Context `c` and type constructor `tcType`. */ - def make(c: Context)(tcType: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] + def make(c: blackbox.Context)(tcType: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] } -trait BuilderResult[C <: Context with Singleton] { +trait BuilderResult[C <: blackbox.Context with Singleton] { val ctx: C import ctx.universe._ diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala index c94a781f0..f902db25e 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala @@ -15,13 +15,9 @@ object TupleNBuilder extends TupleBuilder { final val MaxInputs = 11 final val TupleMethodName = "tuple" - // TODO 2.11 Remove this after dropping 2.10.x support. 
- private object HasCompat { val compat = this }; import HasCompat._ - - def make(c: Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { + def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val util = ContextUtil[c.type](c) import c.universe.{ Apply => ApplyTree, _ } - import compat._ import util._ val global: Global = c.universe.asInstanceOf[Global] @@ -30,9 +26,9 @@ object TupleNBuilder extends TupleBuilder { val ctx: c.type = c val representationC: PolyType = { val tcVariable: Symbol = newTCVariable(util.initialOwner) - val tupleTypeArgs = inputs.map(in => typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) + val tupleTypeArgs = inputs.map(in => internal.typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) val tuple = global.definitions.tupleType(tupleTypeArgs) - PolyType(tcVariable :: Nil, tuple.asInstanceOf[Type]) + internal.polyType(tcVariable :: Nil, tuple.asInstanceOf[Type]) } val resultType = appliedType(representationC, idTC :: Nil) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala index 13710611c..12ae1f7e4 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala @@ -221,7 +221,7 @@ trait UnionImplicits { else false } - def force[T <: UB, UB](e: Equiv[T], a: UB, b: UB): Boolean = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) + def force[T <: UB2, UB2](e: Equiv[T], a: UB2, b: UB2): Boolean = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) } } diff --git a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala index 2af9388a7..9cb416840 100644 --- 
a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala +++ b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala @@ -54,6 +54,7 @@ class FileExamplesTest extends UnitSpec { } } + // TODO: Remove DelayedInit - https://github.com/scala/scala/releases/tag/v2.11.0-RC1 class DirectoryStructure(withCompletionPrefix: String = "") extends DelayedInit { var fileExamples: FileExamples = _ var baseDir: File = _ From 28a40163e7e9146da5746173d604278e02072164 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 19 Jun 2016 11:24:50 +0100 Subject: [PATCH 590/823] Sync Scala 2.11 version in .travis.yml --- .travis.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.travis.yml b/.travis.yml index 8dc242caf..0f9dbb590 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,4 +1,4 @@ language: scala scala: - 2.10.6 - - 2.11.7 + - 2.11.8 From 121e7f5d9e647baffa35207dbbf00d8da7f5eb1f Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 19 Jun 2016 11:42:31 +0100 Subject: [PATCH 591/823] Add -Ywarn-unused & -Ywarn-unused-import, & fix warnings --- build.sbt | 1 + .../scala/sbt/internal/util/appmacro/ContextUtil.scala | 1 - .../main/scala/sbt/internal/util/appmacro/Instance.scala | 1 - .../scala/sbt/internal/util/appmacro/KListBuilder.scala | 3 --- .../scala/sbt/internal/util/appmacro/TupleBuilder.scala | 2 -- .../scala/sbt/internal/util/appmacro/TupleNBuilder.scala | 5 +---- .../src/main/scala/sbt/internal/util/Cache.scala | 2 +- .../src/main/scala/sbt/internal/util/CacheIO.scala | 2 +- .../src/main/scala/sbt/internal/util/FileInfo.scala | 2 +- .../src/main/scala/sbt/internal/util/SeparatedCache.scala | 6 ++---- .../src/main/scala/sbt/internal/util/Dag.scala | 1 - .../src/main/scala/sbt/internal/util/INode.scala | 2 +- .../src/main/scala/sbt/internal/util/Param.scala | 2 -- .../src/main/scala/sbt/internal/util/Settings.scala | 6 ------ internal/util-collection/src/test/scala/LiteralTest.scala | 4 +--- 
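Patch 591 first switches the new warnings on in `build.sbt` and then fixes the fallout, which is why the rest of the commit is almost entirely deleted imports and dead locals. A build-configuration sketch of the pattern (flag names as used in this commit; on Scala 2.12+ the equivalent is the finer-grained `-Ywarn-unused:imports`):

```scala
// build.sbt sketch: enable unused-code warnings project-wide.
scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import")

// Optionally escalate warnings to errors so unused imports
// cannot creep back in after the cleanup:
scalacOptions += "-Xfatal-warnings"
```

The `-Xfatal-warnings` step is an assumption on my part, not something this commit does; the commit only adds the two `-Ywarn-unused*` flags.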
.../util-collection/src/test/scala/SettingsExample.scala | 1 - .../src/main/scala/sbt/internal/util/LineReader.scala | 3 +-- .../scala/sbt/internal/util/complete/EditDistance.scala | 3 +-- .../scala/sbt/internal/util/complete/HistoryCommands.scala | 1 - .../scala/sbt/internal/util/complete/JLineCompletion.scala | 2 +- .../src/main/scala/sbt/internal/util/ConsoleOut.scala | 1 - internal/util-logging/src/test/scala/LogWriterTest.scala | 2 +- .../main/scala/sbt/internal/scripted/ScriptedTests.scala | 7 ++----- .../scala/sbt/internal/scripted/TestScriptParser.scala | 4 ++-- .../src/main/scala/sbt/internal/util/Tracked.scala | 4 +--- 25 files changed, 18 insertions(+), 50 deletions(-) diff --git a/build.sbt b/build.sbt index 802fa7b17..06f510900 100644 --- a/build.sbt +++ b/build.sbt @@ -14,6 +14,7 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala210, scala211), + scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala index ee3b56361..b9b968a23 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala @@ -3,7 +3,6 @@ package appmacro import scala.reflect._ import macros._ -import scala.tools.nsc.Global import ContextUtil.{ DynamicDependencyError, DynamicReferenceError } object ContextUtil { diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala 
b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala index 3177d59c4..a10fdfb18 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala @@ -23,7 +23,6 @@ trait MonadInstance extends Instance { import scala.reflect._ import macros._ -import reflect.internal.annotations.compileTimeOnly object Instance { final val ApplyName = "app" diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala index cab5058cb..65b061e66 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala @@ -1,8 +1,6 @@ package sbt.internal.util package appmacro -import Types.Id -import scala.tools.nsc.Global import scala.reflect._ import macros._ @@ -59,7 +57,6 @@ object KListBuilder extends TupleBuilder { val klistType: Type = (inputs :\ knilType)((in, klist) => kconsType(in.tpe, klist)) val representationC = internal.polyType(tcVariable :: Nil, klistType) - val resultType = appliedType(representationC, idTC :: Nil) val input = klist val alistInstance: ctx.universe.Tree = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala index c36baa78a..1186f3549 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala @@ -1,8 +1,6 @@ package sbt.internal.util package appmacro -import Types.Id -import 
scala.tools.nsc.Global import scala.reflect._ import macros._ diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala index f902db25e..1c5430e4c 100644 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala +++ b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala @@ -1,7 +1,6 @@ package sbt.internal.util package appmacro -import Types.Id import scala.tools.nsc.Global import scala.reflect._ import macros._ @@ -17,11 +16,10 @@ object TupleNBuilder extends TupleBuilder { def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { val util = ContextUtil[c.type](c) - import c.universe.{ Apply => ApplyTree, _ } + import c.universe._ import util._ val global: Global = c.universe.asInstanceOf[Global] - val mTC: Type = mt.asInstanceOf[c.universe.Type] val ctx: c.type = c val representationC: PolyType = { @@ -30,7 +28,6 @@ object TupleNBuilder extends TupleBuilder { val tuple = global.definitions.tupleType(tupleTypeArgs) internal.polyType(tcVariable :: Nil, tuple.asInstanceOf[Type]) } - val resultType = appliedType(representationC, idTC :: Nil) val input: Tree = mkTuple(inputs.map(_.expr)) val alistInstance: Tree = { diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala index 12ae1f7e4..411771300 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala @@ -7,7 +7,7 @@ import sbinary.{ CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, O import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } import java.net.{ URI, URL } import Types.:+: -import DefaultProtocol.{ asProduct2, 
asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap } +import DefaultProtocol.{ asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap } import scala.xml.NodeSeq import scala.language.existentials diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala index 95c00f47a..afa5d12a6 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala @@ -3,7 +3,7 @@ */ package sbt.internal.util -import java.io.{ File, FileNotFoundException } +import java.io.File import sbinary.{ DefaultProtocol, Format, Operations } import scala.reflect.Manifest import sbt.io.IO diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index 1b6ac418c..8bd025397 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -3,7 +3,7 @@ */ package sbt.internal.util -import java.io.{ File, IOException } +import java.io.File import sbinary.{ DefaultProtocol, Format } import DefaultProtocol._ import scala.reflect.Manifest diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala index 379cdbff7..ff735a528 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala @@ -3,10 +3,8 @@ */ package sbt.internal.util -import Types.:+: -import sbinary.{ DefaultProtocol, Format, Input, Output => Out } -import DefaultProtocol.ByteFormat -import java.io.{ File, InputStream, OutputStream } +import sbinary.{ Format, Input, Output => Out } +import java.io.File import sbt.io.Using trait InputCache[I] { diff --git 
a/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala index 5cad287da..1c9d93ea0 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala @@ -94,7 +94,6 @@ object Dag { */ private[sbt] def findNegativeCycle[Node](graph: DirectedSignedGraph[Node]): List[graph.Arrow] = { - import scala.annotation.tailrec import graph._ val finished = new mutable.HashSet[Node] val visited = new mutable.HashSet[Node] diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala index d7a15eee8..d85cadf3f 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala @@ -3,7 +3,7 @@ package sbt.internal.util import java.lang.Runnable import java.util.concurrent.{ atomic, Executor, LinkedBlockingQueue } import atomic.{ AtomicBoolean, AtomicInteger } -import Types.{ :+:, ConstK, Id } +import Types.{ ConstK, Id } object EvaluationState extends Enumeration { val New, Blocked, Ready, Calling, Evaluated = Value diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala index 68671c8ca..dbded9292 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala @@ -3,8 +3,6 @@ */ package sbt.internal.util -import Types._ - // Used to emulate ~> literals trait Param[A[_], B[_]] { type T diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala index a85a3faa6..3bee6fb9c 100644 --- 
a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala @@ -278,7 +278,6 @@ trait Init[Scope] { def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_], Flattened] = { - import collection.breakOut val locals = compiled flatMap { case (key, comp) => if (key.key.isLocal) Seq[Compiled[_]](comp) else Nil } val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if (dep.key.isLocal) Seq[Compiled[_]](compiled(dep)) else Nil)) def flatten(cmap: Map[ScopedKey[_], Flattened], key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened = @@ -340,11 +339,6 @@ trait Init[Scope] { derivsByDef.getOrElseUpdate(key, new Deriveds(key, new mutable.ListBuffer)).settings += s } - // sort derived settings so that dependencies come first - // this is necessary when verifying that a derived setting's dependencies exist - val ddeps = (d: Deriveds) => d.dependencies.flatMap(derivsByDef.get) - val sortedDerivs = Dag.topologicalSort(derivsByDef.values)(ddeps) - // index derived settings by triggering key. This maps a key to the list of settings potentially derived from it. 
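The block deleted from `Settings.scala` above had computed a dependencies-first ordering of derived settings via `Dag.topologicalSort(derivsByDef.values)(ddeps)`; the result was never used, hence its removal under `-Ywarn-unused`. A self-contained sketch (a hypothetical helper, not sbt's internal `Dag`, and without the cycle detection a production version would need) of the dependencies-first ordering that call provides:

```scala
// Depth-first topological sort: each node is emitted only after
// everything it depends on has been emitted.
object TopoSort {
  def sort[A](nodes: Iterable[A])(deps: A => Iterable[A]): List[A] = {
    val visited = scala.collection.mutable.LinkedHashSet.empty[A]
    def visit(n: A): Unit =
      if (!visited(n)) { deps(n).foreach(visit); visited += n }
    nodes.foreach(visit)
    visited.toList // insertion order = dependencies first
  }

  def main(args: Array[String]): Unit = {
    val edges = Map("a" -> List("b"), "b" -> List("c"), "c" -> Nil)
    println(sort(List("a"))(edges)) // List(c, b, a)
  }
}
```

sbt's real `Dag.topologicalSort` has the same shape, `topologicalSort(nodes)(dependencies)`, as seen in the surviving call inside `flattenLocals` above.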
val derivedBy = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived]] for (s <- derived; d <- s.triggeredBy) diff --git a/internal/util-collection/src/test/scala/LiteralTest.scala b/internal/util-collection/src/test/scala/LiteralTest.scala index b50d02632..9353a07bf 100644 --- a/internal/util-collection/src/test/scala/LiteralTest.scala +++ b/internal/util-collection/src/test/scala/LiteralTest.scala @@ -3,8 +3,6 @@ */ package sbt.internal.util -import Types._ - // compilation test object LiteralTest { def x[A[_], B[_]](f: A ~> B) = f @@ -14,4 +12,4 @@ object LiteralTest { val a: List[Int] = f(Some(3)) val b: List[String] = f(Some("aa")) -} \ No newline at end of file +} diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index 5dd408282..0dd910773 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -32,7 +32,6 @@ object SettingsExample extends Init[Scope] { object SettingsUsage { import SettingsExample._ - import Types._ // Define some keys val a = AttributeKey[Int]("a") diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index a345099fd..c05a7427d 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -5,9 +5,8 @@ package sbt.internal.util import jline.console.ConsoleReader import jline.console.history.{ FileHistory, MemoryHistory } -import java.io.{ File, InputStream, PrintWriter, FileInputStream, FileDescriptor, FilterInputStream } +import java.io.{ File, InputStream, FileInputStream, FileDescriptor, FilterInputStream } import complete.Parser -import java.util.concurrent.atomic.AtomicBoolean import scala.concurrent.duration.Duration import 
scala.annotation.tailrec diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala index 8cb617348..79f488554 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala @@ -23,7 +23,6 @@ object EditDistance { for (i <- 1 to n; s_i = s(i - 1); j <- 1 to m) { val t_j = t(j - 1) val cost = if (s_i == t_j) matchCost else if (lower(s_i) == lower(t_j)) caseCost else subCost - val tcost = if (s_i == t_j) matchCost else transposeCost val c1 = d(i - 1)(j) + deleteCost val c2 = d(i)(j - 1) + insertCost @@ -39,4 +38,4 @@ object EditDistance { d(n)(m) } -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala index 350a36610..f18f1619f 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala @@ -4,7 +4,6 @@ package sbt.internal.util package complete -import java.io.File import sbt.io.IO object HistoryCommands { diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala index e098f59f6..0b8b50502 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala @@ -5,7 +5,7 @@ package sbt.internal.util package complete import jline.console.ConsoleReader -import jline.console.completer.{ CandidateListCompletionHandler, Completer, CompletionHandler } +import jline.console.completer.{ 
Completer, CompletionHandler } import scala.annotation.tailrec import collection.JavaConversions diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index cffec8781..30da238da 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -1,6 +1,5 @@ package sbt.internal.util -import sbt.util._ import java.io.{ BufferedWriter, PrintStream, PrintWriter } sealed trait ConsoleOut { diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala index 4b1aace56..f00663b4b 100644 --- a/internal/util-logging/src/test/scala/LogWriterTest.scala +++ b/internal/util-logging/src/test/scala/LogWriterTest.scala @@ -5,7 +5,7 @@ package sbt.internal.util import sbt.util._ import org.scalacheck._ -import Arbitrary.{ arbitrary => arb, _ } +import Arbitrary._ import Gen.{ listOfN, oneOf } import Prop._ diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index 9575f2f6c..197a58403 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -6,7 +6,7 @@ import java.io.File import sbt.util.Logger import sbt.internal.util.{ ConsoleLogger, BufferedLogger, FullLogger } import sbt.io.IO.wrapNull -import sbt.io.{ DirectoryFilter, HiddenFileFilter, Path, GlobFilter } +import sbt.io.{ DirectoryFilter, HiddenFileFilter } import sbt.io.syntax._ import sbt.internal.io.Resources @@ -49,9 +49,6 @@ final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handl def scriptedTest(group: String, name: String, log: Logger): Seq[() => Option[String]] = scriptedTest(group, 
name, { _ => () }, log) def scriptedTest(group: String, name: String, prescripted: File => Unit, log: Logger): Seq[() => Option[String]] = { - import Path._ - import GlobFilter._ - var failed = false for (groupDir <- (resourceBaseDirectory * group).get; nme <- (groupDir * name).get) yield { val g = groupDir.getName val n = nme.getName @@ -173,4 +170,4 @@ final class ListTests(baseDirectory: File, accept: ScriptedTest => Boolean, log: class PendingTestSuccessException(label: String) extends Exception { override def getMessage: String = s"The pending test $label succeeded. Mark this test as passing to remove this failure." -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala index 2e8b7f7f6..74f96eb81 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala @@ -5,7 +5,7 @@ package sbt package internal package scripted -import java.io.{ BufferedReader, File, InputStreamReader } +import java.io.File import scala.util.parsing.combinator._ import scala.util.parsing.input.Positional import Character.isWhitespace @@ -80,4 +80,4 @@ class TestScriptParser(handlers: Map[Char, StatementHandler]) extends RegexParse ((newline | err("expected start character " + handlers.keys.mkString("(", "", ")"))) ~> failure("end of input")) def newline = """\s*([\n\r]|$)""".r -} \ No newline at end of file +} diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index 28ee9c21e..51e70a8a5 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -8,9 +8,8 @@ import CacheIO.{ fromFile, toFile } import 
sbinary.Format import scala.pickling.PicklingException import scala.reflect.Manifest -import scala.collection.mutable import sbt.io.IO.{ delete, read, write } -import sbt.io.{ IO, Path } +import sbt.io.IO import sbt.io.Using import sbt.io.syntax._ import sbt.serialization._ @@ -257,7 +256,6 @@ object FileFunction { def cached(cacheBaseDirectory: File)(inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = { - import Path._ lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle) lazy val outCache = Difference.outputs(cacheBaseDirectory / "out-cache", outStyle) inputs => From 4e4aa08a1a13bd85c70d60d7129438acd1819406 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 19 Jun 2016 13:06:43 +0100 Subject: [PATCH 592/823] Drop Scala 2.10 --- .travis.yml | 4 +--- build.sbt | 6 ++---- project/Dependencies.scala | 1 - 3 files changed, 3 insertions(+), 8 deletions(-) diff --git a/.travis.yml b/.travis.yml index 0f9dbb590..639444bb0 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,4 +1,2 @@ language: scala -scala: - - 2.10.6 - - 2.11.8 +scala: 2.11.8 diff --git a/build.sbt b/build.sbt index 06f510900..bc5816052 100644 --- a/build.sbt +++ b/build.sbt @@ -13,7 +13,7 @@ def commonSettings: Seq[Setting[_]] = Seq( // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), - crossScalaVersions := Seq(scala210, scala211), + crossScalaVersions := Seq(scala211), scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, @@ -66,9 +66,7 @@ lazy val utilCollection = (project in internalPath / "util-collection"). 
settings( commonSettings, Util.keywordsSettings, - name := "Util Collection", - scalacOptions --= // scalac 2.10 rejects some HK types under -Xfuture it seems.. - (CrossVersion partialVersion scalaVersion.value collect { case (2, 10) => "-Xfuture" }).toList + name := "Util Collection" ) lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). diff --git a/project/Dependencies.scala b/project/Dependencies.scala index e0ea6ad2d..23d6b1f77 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -2,7 +2,6 @@ import sbt._ import Keys._ object Dependencies { - lazy val scala210 = "2.10.6" lazy val scala211 = "2.11.8" lazy val scala212 = "2.12.0-M4" From 745bf4dc6d99e936283224990c5c4fc3abc5f27c Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 19 Jun 2016 21:21:08 +0100 Subject: [PATCH 593/823] Remove Attribute#rawLabel --- .../src/main/scala/sbt/internal/util/Attributes.scala | 7 +------ 1 file changed, 1 insertion(+), 6 deletions(-) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala index 25cf298d7..33591506c 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala @@ -19,9 +19,6 @@ sealed trait AttributeKey[T] { /** The runtime evidence for `T` */ def manifest: Manifest[T] - @deprecated("Should only be used for compatibility during the transition from hyphenated labels to camelCase labels.", "0.13.0") - def rawLabel: String - /** The label is the identifier for the key and is camelCase by convention. 
*/ def label: String @@ -73,7 +70,6 @@ object AttributeKey { private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf - def rawLabel = name val label = Util.hyphenToCamel(name) def description = description0 def extend = extend0 @@ -81,7 +77,6 @@ object AttributeKey { } private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { def manifest = mf - def rawLabel = LocalLabel def label = LocalLabel def description = None def extend = Nil @@ -207,4 +202,4 @@ object Attributed { /** Associates an empty metadata map with `data`. */ def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) -} \ No newline at end of file +} From 5ecfc4d59f5ba05603d1877d6e37675cd741445c Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Sun, 19 Jun 2016 23:16:59 +0100 Subject: [PATCH 594/823] Un-deprecate now-private methods --- .../src/main/scala/sbt/internal/util/Settings.scala | 3 --- 1 file changed, 3 deletions(-) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala index 3bee6fb9c..ed23e377a 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala @@ -452,9 +452,7 @@ trait Init[Scope] { def settings = this :: Nil def definitive: Boolean = !init.dependencies.contains(key) def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - @deprecated("Will be made private.", "0.13.2") def mapReferenced(g: MapScoped): Setting[T] = make(key, init mapReferenced g, pos) - @deprecated("Will be made private.", "0.13.2") def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => make(key, newI, pos)) private[sbt] def 
validateKeyReferenced(g: ValidateKeyRef): Either[Seq[Undefined], Setting[T]] = @@ -462,7 +460,6 @@ trait Init[Scope] { def mapKey(g: MapScoped): Setting[T] = make(g(key), init, pos) def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = make(key, init(t => f(key, t)), pos) - @deprecated("Will be made private.", "0.13.2") def mapConstant(g: MapConstant): Setting[T] = make(key, init mapConstant g, pos) def withPos(pos: SourcePosition) = make(key, init, pos) def positionString: Option[String] = pos match { From 4cffccc8c8c9e3dd5f7e61d391d1e42a8f1e42f2 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 27 Jun 2016 15:27:42 +0200 Subject: [PATCH 595/823] Caching based on sjsonnew --- build.sbt | 12 +- .../sbt/internal/util/AdditionalFormats.scala | 52 ++++ .../internal/util/BasicCacheImplicits.scala | 59 ++++ .../main/scala/sbt/internal/util/Cache.scala | 291 ++++-------------- .../scala/sbt/internal/util/CacheIO.scala | 45 --- .../sbt/internal/util/CacheImplicits.scala | 17 + .../scala/sbt/internal/util/CacheStore.scala | 93 ++++++ .../scala/sbt/internal/util/FileInfo.scala | 239 ++++++++------ .../main/scala/sbt/internal/util/Input.scala | 45 +++ .../main/scala/sbt/internal/util/Output.scala | 32 ++ .../sbt/internal/util/SeparatedCache.scala | 100 +++--- .../sbt/internal/util/StampedFormat.scala | 44 +++ .../util-cache/src/test/scala/CacheSpec.scala | 76 +++++ .../util-cache/src/test/scala/CacheTest.scala | 32 -- .../src/test/scala/SingletonCacheSpec.scala | 91 ++++++ .../scalajson/unsafe/FixedParser.scala | 29 ++ .../scala/sbt/internal/util/Tracked.scala | 206 +++++-------- .../scala/sbt/internal/util/TrackedSpec.scala | 140 +++++++++ .../scalajson/unsafe/FixedParser.scala | 29 ++ project/Dependencies.scala | 4 +- 20 files changed, 1037 insertions(+), 599 deletions(-) create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala delete 
mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/Input.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/Output.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala create mode 100644 internal/util-cache/src/test/scala/CacheSpec.scala delete mode 100644 internal/util-cache/src/test/scala/CacheTest.scala create mode 100644 internal/util-cache/src/test/scala/SingletonCacheSpec.scala create mode 100644 internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala create mode 100644 internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala create mode 100644 internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala diff --git a/build.sbt b/build.sbt index bc5816052..2a335cd79 100644 --- a/build.sbt +++ b/build.sbt @@ -111,22 +111,24 @@ lazy val utilLogic = (project in internalPath / "util-logic"). name := "Util Logic" ) -// Persisted caching based on SBinary +// Persisted caching based on sjson-new lazy val utilCache = (project in internalPath / "util-cache"). - dependsOn(utilCollection). + dependsOn(utilCollection, utilTesting % Test). settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sbinary, sbtSerialization, scalaReflect.value, sbtIO) ++ scalaXml.value + libraryDependencies ++= Seq(datatypeCodecs, sbtSerialization, scalaReflect.value, sbtIO) ++ scalaXml.value, + libraryDependencies += sjsonnewScalaJson % Test ) // Builds on cache to provide caching for filesystem-related operations lazy val utilTracking = (project in internalPath / "util-tracking"). - dependsOn(utilCache). + dependsOn(utilCache, utilTesting % Test). 
settings( commonSettings, name := "Util Tracking", - libraryDependencies += sbtIO + libraryDependencies += sbtIO, + libraryDependencies += sjsonnewScalaJson % Test ) // Internal utility for testing diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala new file mode 100644 index 000000000..6e74c26c7 --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -0,0 +1,52 @@ +package sbt.internal.util + +import sbt.datatype.StringFormat +import sbt.internal.util.Types.:+: + +import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } +import sjsonnew.BasicJsonProtocol.{ wrap, asSingleton } + +import java.io.File + +import java.net.{ URI, URL } + +trait URIFormat { self: StringFormat => + implicit def URIFormat: JsonFormat[URI] = wrap(_.toString, new URI(_: String)) +} + +trait URLFormat { self: StringFormat => + implicit def URLFormat: JsonFormat[URL] = wrap(_.toString, new URL(_: String)) +} + +trait FileFormat { self: StringFormat => + implicit def FileFormat: JsonFormat[File] = wrap(_.toString, new File(_: String)) +} + +trait HListFormat { + implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = + new JsonFormat[H :+: T] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): H :+: T = + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val h = unbuilder.readField[H]("h") + val t = unbuilder.readField[T]("t") + unbuilder.endObject() + + HCons(h, t) + + case None => + deserializationError("Expected JValue but found None") + } + + override def write[J](obj: H :+: T, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("h", obj.head) + builder.addField("t", obj.tail) + builder.endObject() + } + } + + implicit val HNilFormat: JsonFormat[HNil] = asSingleton(HNil) + +} diff --git
a/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala new file mode 100644 index 000000000..7829e8e22 --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala @@ -0,0 +1,59 @@ +package sbt.internal.util + +import sbt.datatype.{ ArrayFormat, BooleanFormat, ByteFormat, IntFormat } + +import java.net.{ URI, URL } + +import sjsonnew.JsonFormat +import sjsonnew.BasicJsonProtocol.asSingleton + +trait BasicCacheImplicits { self: ArrayFormat with BooleanFormat with ByteFormat with IntFormat => + + implicit def basicCache[I: JsonFormat: Equiv, O: JsonFormat]: Cache[I, O] = + new BasicCache[I, O]() + + def defaultEquiv[T]: Equiv[T] = + new Equiv[T] { def equiv(a: T, b: T) = a == b } + + def wrapEquiv[S, T](f: S => T)(implicit eqT: Equiv[T]): Equiv[S] = + new Equiv[S] { + def equiv(a: S, b: S) = + eqT.equiv(f(a), f(b)) + } + + implicit def optEquiv[T](implicit t: Equiv[T]): Equiv[Option[T]] = + new Equiv[Option[T]] { + def equiv(a: Option[T], b: Option[T]) = + (a, b) match { + case (None, None) => true + case (Some(va), Some(vb)) => t.equiv(va, vb) + case _ => false + } + } + implicit def urlEquiv(implicit uriEq: Equiv[URI]): Equiv[URL] = wrapEquiv[URL, URI](_.toURI)(uriEq) + implicit def uriEquiv: Equiv[URI] = defaultEquiv + implicit def stringSetEquiv: Equiv[Set[String]] = defaultEquiv + implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv + + implicit def arrEquiv[T](implicit t: Equiv[T]): Equiv[Array[T]] = + wrapEquiv((x: Array[T]) => x: Seq[T])(seqEquiv[T](t)) + + implicit def seqEquiv[T](implicit t: Equiv[T]): Equiv[Seq[T]] = + new Equiv[Seq[T]] { + def equiv(a: Seq[T], b: Seq[T]) = + a.length == b.length && + ((a, b).zipped forall t.equiv) + } + + def wrapIn[I, J](implicit f: I => J, g: J => I, jCache: SingletonCache[J]): SingletonCache[I] = + new SingletonCache[I] { + override def read(from: Input): I = 
g(jCache.read(from)) + override def write(to: Output, value: I) = jCache.write(to, f(value)) + override def equiv: Equiv[I] = wrapEquiv(f)(jCache.equiv) + } + + def singleton[T](t: T): SingletonCache[T] = + SingletonCache.basicSingletonCache(asSingleton(t), trueEquiv) + + def trueEquiv[T] = new Equiv[T] { def equiv(a: T, b: T) = true } +} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala index 411771300..0a04dbcdd 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala @@ -3,247 +3,72 @@ */ package sbt.internal.util -import sbinary.{ CollectionTypes, DefaultProtocol, Format, Input, JavaFormats, Output => Out } -import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } -import java.net.{ URI, URL } -import Types.:+: -import DefaultProtocol.{ asSingleton, BooleanFormat, ByteFormat, IntFormat, wrap } -import scala.xml.NodeSeq -import scala.language.existentials +/** The result of a cache query */ +sealed trait CacheResult[K] +/** A successful hit on the cache */ +case class Hit[O](value: O) extends CacheResult[O] + +/** + * A cache miss. + * `update` associates the missing key with `O` in the cache. + */ +case class Miss[O](update: O => Unit) extends CacheResult[O] + +/** + * A simple cache with keys of type `I` and values of type `O` + */ trait Cache[I, O] { - def apply(file: File)(i: I): Either[O, O => Unit] + /** + * Queries the cache backed with store `store` for key `key`. + */ + def apply(store: CacheStore)(key: I): CacheResult[O] } -trait SBinaryFormats extends CollectionTypes with JavaFormats { - implicit def urlFormat: Format[URL] = DefaultProtocol.UrlFormat - implicit def uriFormat: Format[URI] = DefaultProtocol.UriFormat -} -object Cache extends CacheImplicits { + +object Cache { + + /** + * Materializes a cache. 
+ */ def cache[I, O](implicit c: Cache[I, O]): Cache[I, O] = c - def cached[I, O](file: File)(f: I => O)(implicit cache: Cache[I, O]): I => O = - in => - cache(file)(in) match { - case Left(value) => value - case Right(store) => - val out = f(in) - store(out) - out + /** + * Returns a function that represents a cache that inserts on miss. + * + * @param store The store that backs this cache. + * @param default A function that computes a default value to insert on a miss. + */ + def cached[I, O](store: CacheStore)(default: I => O)(implicit cache: Cache[I, O]): I => O = + key => + cache(store)(key) match { + case Hit(value) => + value + + case Miss(update) => + val result = default(key) + update(result) + result } - def debug[I](label: String, c: InputCache[I]): InputCache[I] = - new InputCache[I] { - type Internal = c.Internal - def convert(i: I) = c.convert(i) - def read(from: Input) = - { - val v = c.read(from) - println(label + ".read: " + v) - v - } - def write(to: Out, v: Internal): Unit = { - println(label + ".write: " + v) - c.write(to, v) + def debug[I](label: String, cache: SingletonCache[I]): SingletonCache[I] = + new SingletonCache[I] { + override def read(from: Input): I = { + val value = cache.read(from) + println(label + ".read: " + value) + value } - def equiv: Equiv[Internal] = new Equiv[Internal] { - def equiv(a: Internal, b: Internal) = - { - val equ = c.equiv.equiv(a, b) - println(label + ".equiv(" + a + ", " + b + "): " + equ) - equ - } + + override def write(to: Output, value: I): Unit = { + println(label + ".write: " + value) + cache.write(to, value) + } + + override def equiv: Equiv[I] = new Equiv[I] { + def equiv(a: I, b: I) = { + val equ = cache.equiv.equiv(a, b) + println(label + ".equiv(" + a + ", " + b + "): " + equ) + equ + } } } } -trait CacheImplicits extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits with UnionImplicits -trait BasicCacheImplicits { - implicit def basicCache[I, O](implicit in: InputCache[I], outFormat:
Format[O]): Cache[I, O] = - new BasicCache()(in, outFormat) - def basicInput[I](implicit eq: Equiv[I], fmt: Format[I]): InputCache[I] = InputCache.basicInputCache(fmt, eq) - - def defaultEquiv[T]: Equiv[T] = new Equiv[T] { def equiv(a: T, b: T) = a == b } - - implicit def optInputCache[T](implicit t: InputCache[T]): InputCache[Option[T]] = - new InputCache[Option[T]] { - type Internal = Option[t.Internal] - def convert(v: Option[T]): Internal = v.map(x => t.convert(x)) - def read(from: Input) = - { - val isDefined = BooleanFormat.reads(from) - if (isDefined) Some(t.read(from)) else None - } - def write(to: Out, j: Internal): Unit = - { - BooleanFormat.writes(to, j.isDefined) - j foreach { x => t.write(to, x) } - } - def equiv = optEquiv(t.equiv) - } - - def wrapEquiv[S, T](f: S => T)(implicit eqT: Equiv[T]): Equiv[S] = - new Equiv[S] { - def equiv(a: S, b: S) = - eqT.equiv(f(a), f(b)) - } - - implicit def optEquiv[T](implicit t: Equiv[T]): Equiv[Option[T]] = - new Equiv[Option[T]] { - def equiv(a: Option[T], b: Option[T]) = - (a, b) match { - case (None, None) => true - case (Some(va), Some(vb)) => t.equiv(va, vb) - case _ => false - } - } - implicit def urlEquiv(implicit uriEq: Equiv[URI]): Equiv[URL] = wrapEquiv[URL, URI](_.toURI)(uriEq) - implicit def uriEquiv: Equiv[URI] = defaultEquiv - implicit def stringSetEquiv: Equiv[Set[String]] = defaultEquiv - implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv - - def streamFormat[T](write: (T, OutputStream) => Unit, f: InputStream => T): Format[T] = - { - val toBytes = (t: T) => { val bos = new ByteArrayOutputStream; write(t, bos); bos.toByteArray } - val fromBytes = (bs: Array[Byte]) => f(new ByteArrayInputStream(bs)) - wrap(toBytes, fromBytes)(DefaultProtocol.ByteArrayFormat) - } - - implicit def xmlInputCache(implicit strEq: InputCache[String]): InputCache[NodeSeq] = wrapIn[NodeSeq, String](_.toString, strEq) - - implicit def seqCache[T](implicit t: InputCache[T]): InputCache[Seq[T]] = - new 
InputCache[Seq[T]] { - type Internal = Seq[t.Internal] - def convert(v: Seq[T]) = v.map(x => t.convert(x)) - def read(from: Input) = - { - val size = IntFormat.reads(from) - def next(left: Int, acc: List[t.Internal]): Internal = - if (left <= 0) acc.reverse else next(left - 1, t.read(from) :: acc) - next(size, Nil) - } - def write(to: Out, vs: Internal): Unit = { - val size = vs.length - IntFormat.writes(to, size) - for (v <- vs) t.write(to, v) - } - def equiv: Equiv[Internal] = seqEquiv(t.equiv) - } - - implicit def arrEquiv[T](implicit t: Equiv[T]): Equiv[Array[T]] = - wrapEquiv((x: Array[T]) => x: Seq[T])(seqEquiv[T](t)) - - implicit def seqEquiv[T](implicit t: Equiv[T]): Equiv[Seq[T]] = - new Equiv[Seq[T]] { - def equiv(a: Seq[T], b: Seq[T]) = - a.length == b.length && - ((a, b).zipped forall t.equiv) - } - implicit def seqFormat[T](implicit t: Format[T]): Format[Seq[T]] = - wrap[Seq[T], List[T]](_.toList, _.toSeq)(DefaultProtocol.listFormat) - - def wrapIn[I, J](implicit f: I => J, jCache: InputCache[J]): InputCache[I] = - new InputCache[I] { - type Internal = jCache.Internal - def convert(i: I) = jCache.convert(f(i)) - def read(from: Input) = jCache.read(from) - def write(to: Out, j: Internal) = jCache.write(to, j) - def equiv = jCache.equiv - } - - def singleton[T](t: T): InputCache[T] = - basicInput(trueEquiv, asSingleton(t)) - - def trueEquiv[T] = new Equiv[T] { def equiv(a: T, b: T) = true } -} - -trait HListCacheImplicits { - implicit def hConsCache[H, T <: HList](implicit head: InputCache[H], tail: InputCache[T]): InputCache[H :+: T] = - new InputCache[H :+: T] { - type Internal = (head.Internal, tail.Internal) - def convert(in: H :+: T) = (head.convert(in.head), tail.convert(in.tail)) - def read(from: Input) = - { - val h = head.read(from) - val t = tail.read(from) - (h, t) - } - def write(to: Out, j: Internal): Unit = { - head.write(to, j._1) - tail.write(to, j._2) - } - def equiv = new Equiv[Internal] { - def equiv(a: Internal, b: Internal) = - 
head.equiv.equiv(a._1, b._1) && - tail.equiv.equiv(a._2, b._2) - } - } - - implicit def hNilCache: InputCache[HNil] = Cache.singleton(HNil: HNil) - - implicit def hConsFormat[H, T <: HList](implicit head: Format[H], tail: Format[T]): Format[H :+: T] = new Format[H :+: T] { - def reads(from: Input) = - { - val h = head.reads(from) - val t = tail.reads(from) - HCons(h, t) - } - def writes(to: Out, hc: H :+: T): Unit = { - head.writes(to, hc.head) - tail.writes(to, hc.tail) - } - } - - implicit def hNilFormat: Format[HNil] = asSingleton(HNil) -} -trait UnionImplicits { - def unionInputCache[UB, HL <: HList](implicit uc: UnionCache[HL, UB]): InputCache[UB] = - new InputCache[UB] { - type Internal = Found[_] - def convert(in: UB) = uc.find(in) - def read(in: Input) = - { - val index = ByteFormat.reads(in).toInt - val (cache, clazz) = uc.at(index) - val value = cache.read(in) - new Found[cache.Internal](cache, clazz, value, index) - } - def write(to: Out, i: Internal): Unit = { - def write0[I](f: Found[I]): Unit = { - ByteFormat.writes(to, f.index.toByte) - f.cache.write(to, f.value) - } - write0(i) - } - def equiv: Equiv[Internal] = new Equiv[Internal] { - def equiv(a: Internal, b: Internal): Boolean = - { - if (a.clazz == b.clazz) - force(a.cache.equiv, a.value, b.value) - else - false - } - def force[T <: UB2, UB2](e: Equiv[T], a: UB2, b: UB2): Boolean = e.equiv(a.asInstanceOf[T], b.asInstanceOf[T]) - } - } - - implicit def unionCons[H <: UB, UB, T <: HList](implicit head: InputCache[H], mf: Manifest[H], t: UnionCache[T, UB]): UnionCache[H :+: T, UB] = - new UnionCache[H :+: T, UB] { - val size = 1 + t.size - def c = mf.runtimeClass - def find(value: UB): Found[_] = - if (c.isInstance(value)) new Found[head.Internal](head, c, head.convert(value.asInstanceOf[H]), size - 1) else t.find(value) - def at(i: Int): (InputCache[_ <: UB], Class[_]) = if (size == i + 1) (head, c) else t.at(i) - } - - implicit def unionNil[UB]: UnionCache[HNil, UB] = new UnionCache[HNil, UB] { - 
def size = 0 - def find(value: UB) = sys.error("No valid sum type for " + value) - def at(i: Int) = sys.error("Invalid union index " + i) - } - - final class Found[I](val cache: InputCache[_] { type Internal = I }, val clazz: Class[_], val value: I, val index: Int) - sealed trait UnionCache[HL <: HList, UB] { - def size: Int - def at(i: Int): (InputCache[_ <: UB], Class[_]) - def find(forValue: UB): Found[_] - } -} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala deleted file mode 100644 index afa5d12a6..000000000 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheIO.scala +++ /dev/null @@ -1,45 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package sbt.internal.util - -import java.io.File -import sbinary.{ DefaultProtocol, Format, Operations } -import scala.reflect.Manifest -import sbt.io.IO - -object CacheIO { - def toBytes[T](format: Format[T])(value: T)(implicit mf: Manifest[Format[T]]): Array[Byte] = - toBytes[T](value)(format, mf) - def toBytes[T](value: T)(implicit format: Format[T], mf: Manifest[Format[T]]): Array[Byte] = - Operations.toByteArray(value)(stampedFormat(format)) - def fromBytes[T](format: Format[T], default: => T)(bytes: Array[Byte])(implicit mf: Manifest[Format[T]]): T = - fromBytes(default)(bytes)(format, mf) - def fromBytes[T](default: => T)(bytes: Array[Byte])(implicit format: Format[T], mf: Manifest[Format[T]]): T = - if (bytes.isEmpty) default else Operations.fromByteArray(bytes)(stampedFormat(format)) - - def fromFile[T](format: Format[T], default: => T)(file: File)(implicit mf: Manifest[Format[T]]): T = - fromFile(file, default)(format, mf) - def fromFile[T](file: File, default: => T)(implicit format: Format[T], mf: Manifest[Format[T]]): T = - fromFile[T](file) getOrElse default - def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Option[T] = - try { 
Some(Operations.fromFile(file)(stampedFormat(format))) } - catch { case e: Exception => None } - - def toFile[T](format: Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = - toFile(value)(file)(format, mf) - def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = - { - IO.createDirectory(file.getParentFile) - Operations.toFile(value)(file)(stampedFormat(format)) - } - def stampedFormat[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Format[T] = - { - import DefaultProtocol._ - withStamp(stamp(format))(format) - } - def stamp[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Int = typeHash(mf) - def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode - def manifest[T](implicit mf: Manifest[T]): Manifest[T] = mf - def objManifest[T](t: T)(implicit mf: Manifest[T]): Manifest[T] = mf -} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala new file mode 100644 index 000000000..190282a6e --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -0,0 +1,17 @@ +package sbt.internal.util + +import sbt.datatype.{ ArrayFormat, BooleanFormat, ByteFormat, IntFormat, LongFormat, StringFormat } +import sjsonnew.{ CollectionFormats, TupleFormats } + +object CacheImplicits extends BasicCacheImplicits + with ArrayFormat + with BooleanFormat + with ByteFormat + with FileFormat + with IntFormat + with LongFormat + with StringFormat + with URIFormat + with URLFormat + with TupleFormats + with CollectionFormats \ No newline at end of file diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala new file mode 100644 index 000000000..16ef54e95 --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala @@ -0,0 +1,93 @@ +package 
sbt.internal.util + +import sjsonnew.{ IsoString, JsonReader, JsonWriter, SupportConverter } + +import java.io.{ File, InputStream, OutputStream } + +import sbt.io.{ IO, Using } +import sbt.io.syntax.fileToRichFile + +/** + * A `CacheStore` is used by the caching infrastructure to persist cached information. + */ +trait CacheStore extends Input with Output { + /** Delete the persisted information. */ + def delete(): Unit +} + +/** + * Factory that can derive new stores. + */ +trait CacheStoreFactory { + /** Create a new store. */ + def derive(identifier: String): CacheStore +} + +/** + * A factory that creates new stores persisted in `base`. + */ +class DirectoryStoreFactory[J: IsoString](base: File, converter: SupportConverter[J]) extends CacheStoreFactory { + + IO.createDirectory(base) + + override def derive(identifier: String): CacheStore = + new FileBasedStore(base / identifier, converter) +} + +/** + * A `CacheStore` that persists information in `file`. + */ +class FileBasedStore[J: IsoString](file: File, converter: SupportConverter[J]) extends CacheStore { + + IO.touch(file, setModified = false) + + override def delete(): Unit = + IO.delete(file) + + override def read[T: JsonReader](): T = + Using.fileInputStream(file) { stream => + val input = new PlainInput(stream, converter) + input.read() + } + + override def read[T: JsonReader](default: => T): T = + try read[T]() + catch { case _: Exception => default } + + override def write[T: JsonWriter](value: T): Unit = + Using.fileOutputStream(append = false)(file) { stream => + val output = new PlainOutput(stream, converter) + output.write(value) + } + + override def close(): Unit = () + +} + +/** + * A store that reads from `inputStream` and writes to `outputStream`. + */ +class StreamBasedStore[J: IsoString](inputStream: InputStream, outputStream: OutputStream, converter: SupportConverter[J]) extends CacheStore { + + override def delete(): Unit = () + + override def read[T: JsonReader](): T = { + val input = new
PlainInput(inputStream, converter) + input.read() + } + + override def read[T: JsonReader](default: => T): T = + try read[T]() + catch { case _: Exception => default } + + override def write[T: JsonWriter](value: T): Unit = { + val output = new PlainOutput(outputStream, converter) + output.write(value) + } + + override def close(): Unit = { + inputStream.close() + outputStream.close() + } + +} \ No newline at end of file diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index 8bd025397..cf83d01dd 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -3,135 +3,172 @@ */ package sbt.internal.util -import java.io.File -import sbinary.{ DefaultProtocol, Format } -import DefaultProtocol._ -import scala.reflect.Manifest import sbt.io.Hash -import sbt.serialization._ + +import java.io.File +import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } +import CacheImplicits._ sealed trait FileInfo { - val file: File + def file: File } -@directSubclasses(Array(classOf[FileHash], classOf[HashModifiedFileInfo])) + sealed trait HashFileInfo extends FileInfo { - val hash: List[Byte] + def hash: List[Byte] } -object HashFileInfo { - implicit val pickler: Pickler[HashFileInfo] with Unpickler[HashFileInfo] = PicklerUnpickler.generate[HashFileInfo] -} -@directSubclasses(Array(classOf[FileModified], classOf[HashModifiedFileInfo])) + sealed trait ModifiedFileInfo extends FileInfo { - val lastModified: Long + def lastModified: Long } -object ModifiedFileInfo { - implicit val pickler: Pickler[ModifiedFileInfo] with Unpickler[ModifiedFileInfo] = PicklerUnpickler.generate[ModifiedFileInfo] -} -@directSubclasses(Array(classOf[PlainFile])) + sealed trait PlainFileInfo extends FileInfo { def exists: Boolean } -object PlainFileInfo { - implicit val pickler: 
Pickler[PlainFileInfo] with Unpickler[PlainFileInfo] = PicklerUnpickler.generate[PlainFileInfo] -} -@directSubclasses(Array(classOf[FileHashModified])) -sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo -object HashModifiedFileInfo { - implicit val pickler: Pickler[HashModifiedFileInfo] with Unpickler[HashModifiedFileInfo] = PicklerUnpickler.generate[HashModifiedFileInfo] -} -private[sbt] final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo -private[sbt] object PlainFile { - implicit val pickler: Pickler[PlainFile] with Unpickler[PlainFile] = PicklerUnpickler.generate[PlainFile] -} -private[sbt] final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo -private[sbt] object FileHash { - implicit val pickler: Pickler[FileHash] with Unpickler[FileHash] = PicklerUnpickler.generate[FileHash] -} -private[sbt] final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo -private[sbt] object FileModified { - implicit val pickler: Pickler[FileModified] with Unpickler[FileModified] = PicklerUnpickler.generate[FileModified] -} -private[sbt] final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo -private[sbt] object FileHashModified { - implicit val pickler: Pickler[FileHashModified] with Unpickler[FileHashModified] = PicklerUnpickler.generate[FileHashModified] -} +sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo +private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo +private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo +private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo +private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo object FileInfo { - implicit def existsInputCache: InputCache[PlainFileInfo] = exists.infoInputCache - 
implicit def modifiedInputCache: InputCache[ModifiedFileInfo] = lastModified.infoInputCache - implicit def hashInputCache: InputCache[HashFileInfo] = hash.infoInputCache - implicit def fullInputCache: InputCache[HashModifiedFileInfo] = full.infoInputCache - implicit val pickler: Pickler[FileInfo] with Unpickler[FileInfo] = PicklerUnpickler.generate[FileInfo] sealed trait Style { type F <: FileInfo - implicit def apply(file: File): F - implicit def unapply(info: F): File = info.file - implicit val format: Format[F] - import Cache._ - implicit def fileInfoEquiv: Equiv[F] = defaultEquiv - def infoInputCache: InputCache[F] = basicInput - implicit def fileInputCache: InputCache[File] = wrapIn[File, F] + implicit val format: JsonFormat[F] + + def apply(file: File): F + + def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files map apply) + + def unapply(info: F): File = info.file + + def unapply(infos: FilesInfo[F]): Set[File] = infos.files map (_.file) } + object full extends Style { - type F = HashModifiedFileInfo - implicit def apply(file: File): HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) - def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) - implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), (make _).tupled) + override type F = HashModifiedFileInfo + + override implicit val format: JsonFormat[HashModifiedFileInfo] = new JsonFormat[HashModifiedFileInfo] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): HashModifiedFileInfo = + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val hash = unbuilder.readField[List[Byte]]("hash") + val lastModified = unbuilder.readField[Long]("lastModified") + unbuilder.endObject() + FileHashModified(file, hash, lastModified) + case None => + deserializationError("Expected JsObject but found 
None") + } + + override def write[J](obj: HashModifiedFileInfo, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("file", obj.file) + builder.addField("hash", obj.hash) + builder.addField("lastModified", obj.lastModified) + builder.endObject() + } + } + + override implicit def apply(file: File): HashModifiedFileInfo = + FileHashModified(file.getAbsoluteFile, Hash(file).toList, file.lastModified) } + object hash extends Style { - type F = HashFileInfo - implicit def apply(file: File): HashFileInfo = make(file, computeHash(file)) - def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) - implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), (make _).tupled) - private def computeHash(file: File): List[Byte] = try { Hash(file).toList } catch { case e: Exception => Nil } + override type F = HashFileInfo + + override implicit val format: JsonFormat[HashFileInfo] = new JsonFormat[HashFileInfo] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): HashFileInfo = + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val hash = unbuilder.readField[List[Byte]]("hash") + unbuilder.endObject() + FileHash(file, hash) + case None => + deserializationError("Expected JsObject but found None") + } + + override def write[J](obj: HashFileInfo, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("file", obj.file) + builder.addField("hash", obj.hash) + builder.endObject() + } + } + + override implicit def apply(file: File): HashFileInfo = + FileHash(file.getAbsoluteFile, computeHash(file)) + + private def computeHash(file: File): List[Byte] = + try Hash(file).toList + catch { case _: Exception => Nil } } + object lastModified extends Style { - type F = ModifiedFileInfo - implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) - def make(file: File, lastModified: Long): ModifiedFileInfo = 
FileModified(file.getAbsoluteFile, lastModified) - implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), (make _).tupled) + override type F = ModifiedFileInfo + + override implicit val format: JsonFormat[ModifiedFileInfo] = new JsonFormat[ModifiedFileInfo] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): ModifiedFileInfo = + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val lastModified = unbuilder.readField[Long]("lastModified") + unbuilder.endObject() + FileModified(file, lastModified) + case None => + deserializationError("Expected JsObject but found None") + } + + override def write[J](obj: ModifiedFileInfo, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("file", obj.file) + builder.addField("lastModified", obj.lastModified) + builder.endObject() + } + } + + override implicit def apply(file: File): ModifiedFileInfo = + FileModified(file.getAbsoluteFile, file.lastModified) } + object exists extends Style { - type F = PlainFileInfo - implicit def apply(file: File): PlainFileInfo = make(file) - def make(file: File): PlainFileInfo = { val abs = file.getAbsoluteFile; PlainFile(abs, abs.exists) } - implicit val format: Format[PlainFileInfo] = asProduct2[PlainFileInfo, File, Boolean](PlainFile.apply)(x => (x.file, x.exists)) + override type F = PlainFileInfo + + override implicit val format: JsonFormat[PlainFileInfo] = new JsonFormat[PlainFileInfo] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): PlainFileInfo = + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val exists = unbuilder.readField[Boolean]("exists") + unbuilder.endObject() + PlainFile(file, exists) + case None => + deserializationError("Expected JsObject but found None") + } + + override def write[J](obj: PlainFileInfo, builder: Builder[J]): Unit = { + builder.beginObject() + 
builder.addField("file", obj.file) + builder.addField("exists", obj.exists) + builder.endObject() + } + } + + override implicit def apply(file: File): PlainFileInfo = { + val abs = file.getAbsoluteFile + PlainFile(abs, abs.exists) + } } } final case class FilesInfo[F <: FileInfo] private (files: Set[F]) object FilesInfo { - sealed abstract class Style { - type F <: FileInfo - val fileStyle: FileInfo.Style { type F = Style.this.F } - - //def manifest: Manifest[F] = fileStyle.manifest - implicit def apply(files: Set[File]): FilesInfo[F] - implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) - implicit val formats: Format[FilesInfo[F]] - val manifest: Manifest[Format[FilesInfo[F]]] - def empty: FilesInfo[F] = new FilesInfo[F](Set.empty) - import Cache._ - def infosInputCache: InputCache[FilesInfo[F]] = basicInput - implicit def filesInputCache: InputCache[Set[File]] = wrapIn[Set[File], FilesInfo[F]] - implicit def filesInfoEquiv: Equiv[FilesInfo[F]] = defaultEquiv - } - private final class BasicStyle[FI <: FileInfo](style: FileInfo.Style { type F = FI })(implicit val manifest: Manifest[Format[FilesInfo[FI]]]) extends Style { - type F = FI - val fileStyle: FileInfo.Style { type F = FI } = style - private implicit val infoFormat: Format[FI] = fileStyle.format - implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files.map(_.getAbsoluteFile).map(fileStyle.apply)) - implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) - } - lazy val full: Style { type F = HashModifiedFileInfo } = new BasicStyle(FileInfo.full) - lazy val hash: Style { type F = HashFileInfo } = new BasicStyle(FileInfo.hash) - lazy val lastModified: Style { type F = ModifiedFileInfo } = new BasicStyle(FileInfo.lastModified) - lazy val exists: Style { type F = PlainFileInfo } = new BasicStyle(FileInfo.exists) - - implicit def existsInputsCache: InputCache[FilesInfo[PlainFileInfo]] = exists.infosInputCache - implicit def 
hashInputsCache: InputCache[FilesInfo[HashFileInfo]] = hash.infosInputCache - implicit def modifiedInputsCache: InputCache[FilesInfo[ModifiedFileInfo]] = lastModified.infosInputCache - implicit def fullInputsCache: InputCache[FilesInfo[HashModifiedFileInfo]] = full.infosInputCache + implicit def format[F <: FileInfo]: JsonFormat[FilesInfo[F]] = implicitly + def empty[F <: FileInfo]: FilesInfo[F] = FilesInfo(Set.empty[F]) } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala new file mode 100644 index 000000000..75eb92463 --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala @@ -0,0 +1,45 @@ +package sbt.internal.util + +import sbt.io.{ IO, Using } + +import java.io.{ Closeable, InputStream } + +import scala.util.{ Failure, Success } + +import sjsonnew.{ IsoString, JsonReader, SupportConverter } + +trait Input extends Closeable { + def read[T: JsonReader](): T + def read[T: JsonReader](default: => T): T +} + +class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J]) extends Input { + val isoFormat: IsoString[J] = implicitly + private def readFully(): String = { + Using.streamReader(input, IO.utf8) { reader => + val builder = new StringBuilder() + val bufferSize = 1024 + val buffer = new Array[Char](bufferSize) + var read = 0 + while ({ read = reader.read(buffer, 0, bufferSize); read != -1 }) { + builder.append(String.valueOf(buffer.take(read))) + } + builder.toString() + } + } + + override def read[T: JsonReader](): T = { + val string = readFully() + val json = isoFormat.from(string) + converter.fromJson(json) match { + case Success(value) => value + case Failure(ex) => throw ex + } + } + + override def read[T: JsonReader](default: => T): T = + try read[T]() + catch { case _: Exception => default } + + override def close(): Unit = input.close() +} diff --git 
a/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala new file mode 100644 index 000000000..6e99db9ac --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala @@ -0,0 +1,32 @@ +package sbt.internal.util + +import sbt.io.Using + +import java.io.{ Closeable, OutputStream } + +import scala.util.{ Failure, Success } + +import sjsonnew.{ IsoString, JsonWriter, SupportConverter } + +trait Output extends Closeable { + def write[T: JsonWriter](value: T): Unit +} + +class PlainOutput[J: IsoString](output: OutputStream, converter: SupportConverter[J]) extends Output { + val isoFormat: IsoString[J] = implicitly + override def write[T: JsonWriter](value: T): Unit = { + converter.toJson(value) match { + case Success(js) => + val asString = isoFormat.to(js) + Using.bufferedOutputStream(output) { writer => + val out = new java.io.PrintWriter(writer) + out.print(asString) + out.flush() + } + case Failure(ex) => + throw ex + } + } + + override def close(): Unit = output.close() +} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala index ff735a528..be8f11a38 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala @@ -3,59 +3,61 @@ */ package sbt.internal.util -import sbinary.{ Format, Input, Output => Out } -import java.io.File -import sbt.io.Using +import scala.util.Try -trait InputCache[I] { - type Internal - def convert(i: I): Internal - def read(from: Input): Internal - def write(to: Out, j: Internal): Unit - def equiv: Equiv[Internal] +import sjsonnew.JsonFormat + +import CacheImplicits._ + +/** + * A cache that stores a single value. + */ +trait SingletonCache[T] { + /** Reads the cache from the backing `from`. 
*/ + def read(from: Input): T + + /** Writes `value` to the backing `to`. */ + def write(to: Output, value: T): Unit + + /** Equivalence for elements of type `T`. */ + def equiv: Equiv[T] } -object InputCache { - implicit def basicInputCache[I](implicit fmt: Format[I], eqv: Equiv[I]): InputCache[I] = - new InputCache[I] { - type Internal = I - def convert(i: I) = i - def read(from: Input): I = fmt.reads(from) - def write(to: Out, i: I) = fmt.writes(to, i) - def equiv = eqv + +object SingletonCache { + + implicit def basicSingletonCache[T: JsonFormat: Equiv]: SingletonCache[T] = + new SingletonCache[T] { + override def read(from: Input): T = from.read[T] + override def write(to: Output, value: T) = to.write(value) + override def equiv: Equiv[T] = implicitly } - def lzy[I](mkIn: => InputCache[I]): InputCache[I] = - new InputCache[I] { - lazy val ic = mkIn - type Internal = ic.Internal - def convert(i: I) = ic convert i - def read(from: Input): ic.Internal = ic.read(from) - def write(to: Out, i: ic.Internal) = ic.write(to, i) - def equiv = ic.equiv + + /** A lazy `SingletonCache` */ + def lzy[T: JsonFormat: Equiv](mkCache: => SingletonCache[T]): SingletonCache[T] = + new SingletonCache[T] { + lazy val cache = mkCache + override def read(from: Input): T = cache.read(from) + override def write(to: Output, value: T) = cache.write(to, value) + override def equiv = cache.equiv } } -class BasicCache[I, O](implicit input: InputCache[I], outFormat: Format[O]) extends Cache[I, O] { - def apply(file: File)(in: I) = - { - val j = input.convert(in) - try { applyImpl(file, j) } - catch { case e: Exception => Right(update(file)(j)) } - } - protected def applyImpl(file: File, in: input.Internal) = - { - Using.fileInputStream(file) { stream => - val previousIn = input.read(stream) - if (input.equiv.equiv(in, previousIn)) - Left(outFormat.reads(stream)) - else - Right(update(file)(in)) - } - } - protected def update(file: File)(in: input.Internal) = (out: O) => - { - 
Using.fileOutputStream(false)(file) { stream => - input.write(stream, in) - outFormat.writes(stream, out) - } - } +/** + * Simple key-value cache. + */ +class BasicCache[I: JsonFormat: Equiv, O: JsonFormat] extends Cache[I, O] { + private val singletonCache: SingletonCache[(I, O)] = implicitly + val equiv: Equiv[I] = implicitly + override def apply(store: CacheStore)(key: I): CacheResult[O] = + Try { + val (previousKey, previousValue) = singletonCache.read(store) + if (equiv.equiv(key, previousKey)) + Hit(previousValue) + else + Miss(update(store)(key)) + } getOrElse Miss(update(store)(key)) + + private def update(store: CacheStore)(key: I) = (value: O) => { + singletonCache.write(store, (key, value)) + } } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala b/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala new file mode 100644 index 000000000..1d3a6d9fc --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala @@ -0,0 +1,44 @@ +package sbt.internal.util + +import scala.reflect.Manifest + +import sbt.datatype.IntFormat + +import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } + +object StampedFormat extends IntFormat { + + def apply[T](format: JsonFormat[T])(implicit mf: Manifest[JsonFormat[T]]): JsonFormat[T] = { + withStamp(stamp(format))(format) + } + + def withStamp[T, S](stamp: S)(format: JsonFormat[T])(implicit formatStamp: JsonFormat[S], equivStamp: Equiv[S]): JsonFormat[T] = + new JsonFormat[T] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = + jsOpt match { + case Some(js) => + unbuilder.extractArray(js) match { + case Vector(readStamp, readValue) => + val actualStamp = formatStamp.read(Some(readStamp), unbuilder) + if (equivStamp.equiv(actualStamp, stamp)) format.read(Some(readValue), unbuilder) + else sys.error(s"Incorrect stamp. 
Expected: $stamp, Found: $readStamp") + + case other => + deserializationError(s"Expected JsArray of size 2, but found JsArray of size ${other.size}") + } + + case None => + deserializationError("Expected JsArray but found None.") + } + + override def write[J](obj: T, builder: Builder[J]): Unit = { + builder.beginArray() + formatStamp.write(stamp, builder) + format.write(obj, builder) + builder.endArray() + } + } + private def stamp[T](format: JsonFormat[T])(implicit mf: Manifest[JsonFormat[T]]): Int = typeHash(mf) + private def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode + +} \ No newline at end of file diff --git a/internal/util-cache/src/test/scala/CacheSpec.scala b/internal/util-cache/src/test/scala/CacheSpec.scala new file mode 100644 index 000000000..8e2ebd12a --- /dev/null +++ b/internal/util-cache/src/test/scala/CacheSpec.scala @@ -0,0 +1,76 @@ +package sbt.internal.util + +import sbt.io.IO +import sbt.io.syntax._ + +import CacheImplicits._ + +import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } + +import scala.json.ast.unsafe.JValue + +class CacheSpec extends UnitSpec { + + implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, FixedParser.parseUnsafe) + + "A cache" should "NOT throw an exception if read without being written previously" in { + testCache[String, Int] { + case (cache, store) => + cache(store)("missing") match { + case Hit(_) => fail + case Miss(_) => () + } + } + } + + it should "write a very simple value" in { + testCache[String, Int] { + case (cache, store) => + cache(store)("missing") match { + case Hit(_) => fail + case Miss(update) => update(5) + } + } + } + + it should "be updatable" in { + testCache[String, Int] { + case (cache, store) => + val value = 5 + cache(store)("someKey") match { + case Hit(_) => fail + case Miss(update) => update(value) + } + + cache(store)("someKey") 
match { + case Hit(read) => assert(read === value) + case Miss(_) => fail + } + } + } + + it should "return the value that has been previously written" in { + testCache[String, Int] { + case (cache, store) => + val key = "someKey" + val value = 5 + cache(store)(key) match { + case Hit(_) => fail + case Miss(update) => update(value) + } + + cache(store)(key) match { + case Hit(read) => assert(read === value) + case Miss(_) => fail + } + } + } + + private def testCache[K, V](f: (Cache[K, V], CacheStore) => Unit)(implicit cache: Cache[K, V]): Unit = + IO.withTemporaryDirectory { tmp => + val store = new FileBasedStore(tmp / "cache-store", Converter) + f(cache, store) + } + +} \ No newline at end of file diff --git a/internal/util-cache/src/test/scala/CacheTest.scala b/internal/util-cache/src/test/scala/CacheTest.scala deleted file mode 100644 index 569b0bf24..000000000 --- a/internal/util-cache/src/test/scala/CacheTest.scala +++ /dev/null @@ -1,32 +0,0 @@ -package sbt.internal.util - -import java.io.File -import Types.:+: - -object CacheTest // extends Properties("Cache test") -{ - val lengthCache = new File("/tmp/length-cache") - val cCache = new File("/tmp/c-cache") - - import Cache._ - import FileInfo.hash._ - import Ordering._ - import sbinary.DefaultProtocol.FileFormat - def test(): Unit = { - lazy val create = new File("test") - - val length = cached(lengthCache) { - (f: File) => { println("File length: " + f.length); f.length } - } - - lazy val fileLength = length(create) - - val c = cached(cCache) { (in: (File :+: Long :+: HNil)) => - val file :+: len :+: HNil = in - println("File: " + file + " (" + file.exists + "), length: " + len) - (len + 1) :+: file :+: HNil - } - c(create :+: fileLength :+: HNil) - () - } -} diff --git a/internal/util-cache/src/test/scala/SingletonCacheSpec.scala b/internal/util-cache/src/test/scala/SingletonCacheSpec.scala new file mode 100644 index 000000000..42c774a2f --- /dev/null +++ 
b/internal/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -0,0 +1,91 @@ +package sbt.internal.util + +import sbt.io.IO +import sbt.io.syntax._ + +import CacheImplicits._ + +import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } + +import scala.json.ast.unsafe.JValue + +class SingletonCacheSpec extends UnitSpec { + + case class ComplexType(val x: Int, y: String, z: List[Int]) + object ComplexType { + implicit val format: JsonFormat[ComplexType] = + new JsonFormat[ComplexType] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): ComplexType = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val x = unbuilder.readField[Int]("x") + val y = unbuilder.readField[String]("y") + val z = unbuilder.readField[List[Int]]("z") + unbuilder.endObject() + ComplexType(x, y, z) + + case None => + deserializationError("Expected JObject but found None") + } + } + + override def write[J](obj: ComplexType, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("x", obj.x) + builder.addField("y", obj.y) + builder.addField("z", obj.z) + builder.endObject() + } + } + } + + implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, FixedParser.parseUnsafe) + + "A singleton cache" should "throw an exception if read without being written previously" in { + testCache[Int] { + case (cache, store) => + intercept[Exception] { + cache.read(store) + } + () + } + } + + it should "write a very simple value" in { + testCache[Int] { + case (cache, store) => + cache.write(store, 5) + } + } + + it should "return the simple value that has been previously written" in { + testCache[Int] { + case (cache, store) => + val value = 5 + cache.write(store, value) + val read = cache.read(store) + + assert(read === value) + } + } + + it should "write a complex value" in { + testCache[ComplexType] { + case (cache, store) => + 
val value = ComplexType(1, "hello, world!", (1 to 10 by 3).toList) + cache.write(store, value) + val read = cache.read(store) + + assert(read === value) + } + } + + private def testCache[T](f: (SingletonCache[T], CacheStore) => Unit)(implicit cache: SingletonCache[T]): Unit = + IO.withTemporaryDirectory { tmp => + val store = new FileBasedStore(tmp / "cache-store", Converter) + f(cache, store) + } + +} \ No newline at end of file diff --git a/internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala b/internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala new file mode 100644 index 000000000..7f9f759dc --- /dev/null +++ b/internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala @@ -0,0 +1,29 @@ +package sjsonnew +package support.scalajson.unsafe + +import scala.json.ast.unsafe._ +import scala.collection.mutable +import jawn.{ SupportParser, MutableFacade } + +object FixedParser extends SupportParser[JValue] { + implicit val facade: MutableFacade[JValue] = + new MutableFacade[JValue] { + def jnull() = JNull + def jfalse() = JFalse + def jtrue() = JTrue + def jnum(s: String) = JNumber(s) + def jint(s: String) = JNumber(s) + def jstring(s: String) = JString(s) + def jarray(vs: mutable.ArrayBuffer[JValue]) = JArray(vs.toArray) + def jobject(vs: mutable.Map[String, JValue]) = { + val array = new Array[JField](vs.size) + var i = 0 + vs.foreach { + case (key, value) => + array(i) = JField(key, value) + i += 1 + } + JObject(array) + } + } +} \ No newline at end of file diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index 51e70a8a5..5aaf42e10 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -3,133 +3,76 @@ */ package sbt.internal.util -import java.io.{ File, 
IOException } -import CacheIO.{ fromFile, toFile } -import sbinary.Format -import scala.pickling.PicklingException -import scala.reflect.Manifest -import sbt.io.IO.{ delete, read, write } +import scala.util.{ Failure, Try, Success } + +import java.io.File import sbt.io.IO -import sbt.io.Using import sbt.io.syntax._ -import sbt.serialization._ + +import sjsonnew.JsonFormat object Tracked { + + import CacheImplicits.LongFormat + /** * Creates a tracker that provides the last time it was evaluated. * If 'useStartTime' is true, the recorded time is the start of the evaluated function. * If 'useStartTime' is false, the recorded time is when the evaluated function completes. * In both cases, the timestamp is not updated if the function throws an exception. */ - def tstamp(cacheFile: File, useStartTime: Boolean = true): Timestamp = new Timestamp(cacheFile, useStartTime) - /** Creates a tracker that only evaluates a function when the input has changed.*/ - //def changed[O](cacheFile: File)(implicit format: Format[O], equiv: Equiv[O]): Changed[O] = - // new Changed[O](cacheFile) + def tstamp(store: CacheStore, useStartTime: Boolean = true): Timestamp = new Timestamp(store, useStartTime) /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ - def diffInputs(cache: File, style: FilesInfo.Style): Difference = - Difference.inputs(cache, style) + def diffInputs(store: CacheStore, style: FileInfo.Style): Difference = + Difference.inputs(store, style) + /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ - def diffOutputs(cache: File, style: FilesInfo.Style): Difference = - Difference.outputs(cache, style) + def diffOutputs(store: CacheStore, style: FileInfo.Style): Difference = + Difference.outputs(store, style) - def lastOutput[I, O](cacheFile: File)(f: (I, Option[O]) => O)(implicit o: Format[O], mf: Manifest[Format[O]]): I => O = in => - { - val previous: Option[O] 
= fromFile[O](cacheFile) - val next = f(in, previous) - toFile(next)(cacheFile) - next + /** Creates a tracker that provides the output of the most recent invocation of the function */ + def lastOutput[I, O: JsonFormat](store: CacheStore)(f: (I, Option[O]) => O): I => O = { in => + val previous = Try { store.read[O] }.toOption + val next = f(in, previous) + store.write(next) + next + } + + /** + * Creates a tracker that indicates whether the arguments given to f have changed since the most + * recent invocation. + */ + def inputChanged[I: JsonFormat: SingletonCache, O](store: CacheStore)(f: (Boolean, I) => O): I => O = { in => + val cache: SingletonCache[I] = implicitly + val help = new CacheHelp(cache) + val changed = help.changed(store, in) + val result = f(changed, in) + if (changed) + help.save(store, in) + result + } + + private final class CacheHelp[I: JsonFormat](val sc: SingletonCache[I]) { + def save(store: CacheStore, value: I): Unit = { + store.write(value) } - // Todo: This function needs more testing. 
- private[sbt] def lastOutputWithJson[I, O: Pickler: Unpickler](cacheFile: File)(f: (I, Option[O]) => O): I => O = in => - { - val previous: Option[O] = try { - fromJsonFile[O](cacheFile).toOption - } catch { - case e: PicklingException => None - case e: IOException => None + + def changed(store: CacheStore, value: I): Boolean = + Try { store.read[I] } match { + case Success(prev) => !sc.equiv.equiv(value, prev) + case Failure(_) => true } - val next = f(in, previous) - IO.createDirectory(cacheFile.getParentFile) - toJsonFile(next, cacheFile) - next - } - def inputChanged[I, O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): I => O = in => - { - val help = new CacheHelp(ic) - val conv = help.convert(in) - val changed = help.changed(cacheFile, conv) - val result = f(changed, in) - - if (changed) - help.save(cacheFile, conv) - - result - } - private[sbt] def inputChangedWithJson[I: Pickler: Unpickler, O](cacheFile: File)(f: (Boolean, I) => O): I => O = in => - { - val help = new JsonCacheHelp[I] - val conv = help.convert(in) - val changed = help.changed(cacheFile, conv) - val result = f(changed, in) - - if (changed) - help.save(cacheFile, conv) - - result - } - def outputChanged[I, O](cacheFile: File)(f: (Boolean, I) => O)(implicit ic: InputCache[I]): (() => I) => O = in => - { - val initial = in() - val help = new CacheHelp(ic) - val changed = help.changed(cacheFile, help.convert(initial)) - val result = f(changed, initial) - - if (changed) - help.save(cacheFile, help.convert(in())) - - result - } - private[sbt] def outputChangedWithJson[I: Pickler, O](cacheFile: File)(f: (Boolean, I) => O): (() => I) => O = in => - { - val initial = in() - val help = new JsonCacheHelp[I] - val changed = help.changed(cacheFile, help.convert(initial)) - val result = f(changed, initial) - - if (changed) - help.save(cacheFile, help.convert(in())) - - result - } - final class CacheHelp[I](val ic: InputCache[I]) { - def convert(i: I): ic.Internal = ic.convert(i) - def 
save(cacheFile: File, value: ic.Internal): Unit = - Using.fileOutputStream()(cacheFile)(out => ic.write(out, value)) - def changed(cacheFile: File, converted: ic.Internal): Boolean = - try { - val prev = Using.fileInputStream(cacheFile)(x => ic.read(x)) - !ic.equiv.equiv(converted, prev) - } catch { case e: Exception => true } - } - private[sbt] final class JsonCacheHelp[I: Pickler] { - def convert(i: I): String = toJsonString(i) - def save(cacheFile: File, value: String): Unit = - IO.write(cacheFile, value, IO.utf8) - def changed(cacheFile: File, converted: String): Boolean = - try { - val prev = IO.read(cacheFile, IO.utf8) - converted != prev - } catch { case e: Exception => true } } + } trait Tracked { /** Cleans outputs and clears the cache.*/ def clean(): Unit } -class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked { - def clean() = delete(cacheFile) +class Timestamp(val store: CacheStore, useStartTime: Boolean)(implicit format: JsonFormat[Long]) extends Tracked { + def clean() = store.delete() /** * Reads the previous timestamp, evaluates the provided function, * and then updates the timestamp if the function completes normally. 
@@ -138,17 +81,16 @@ class Timestamp(val cacheFile: File, useStartTime: Boolean) extends Tracked { { val start = now() val result = f(readTimestamp) - write(cacheFile, (if (useStartTime) start else now()).toString) + store.write(if (useStartTime) start else now()) result } private def now() = System.currentTimeMillis def readTimestamp: Long = - try { read(cacheFile).toLong } - catch { case _: NumberFormatException | _: java.io.FileNotFoundException => 0 } + Try { store.read[Long] } getOrElse 0 } -class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O]) extends Tracked { - def clean() = delete(cacheFile) +class Changed[O: Equiv: JsonFormat](val store: CacheStore) extends Tracked { + def clean() = store.delete() def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => { if (uptodate(value)) @@ -159,19 +101,15 @@ class Changed[O](val cacheFile: File)(implicit equiv: Equiv[O], format: Format[O } } - def update(value: O): Unit = Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) - def uptodate(value: O): Boolean = - try { - Using.fileInputStream(cacheFile) { - stream => equiv.equiv(value, format.reads(stream)) - } - } catch { - case _: Exception => false - } + def update(value: O): Unit = store.write(value) + def uptodate(value: O): Boolean = { + val equiv: Equiv[O] = implicitly + Try { equiv.equiv(value, store.read[O]) } getOrElse false + } } object Difference { - def constructor(defineClean: Boolean, filesAreOutputs: Boolean): (File, FilesInfo.Style) => Difference = - (cache, style) => new Difference(cache, style, defineClean, filesAreOutputs) + def constructor(defineClean: Boolean, filesAreOutputs: Boolean): (CacheStore, FileInfo.Style) => Difference = - (store, style) => new Difference(store, style, defineClean, filesAreOutputs) /** * Provides a constructor for a Difference that removes the files from the previous run on a call to 
'clean' and saves the @@ -185,15 +123,15 @@ object Difference { */ val inputs = constructor(false, false) } -class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { +class Difference(val store: CacheStore, val style: FileInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { def clean() = { - if (defineClean) delete(raw(cachedFilesInfo)) else () + if (defineClean) IO.delete(raw(cachedFilesInfo)) else () clearCache() } - private def clearCache() = delete(cache) + private def clearCache() = store.delete() - private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files + private def cachedFilesInfo = store.read(default = FilesInfo.empty[style.F]).files //(style.formats).files private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = @@ -225,7 +163,9 @@ class Difference(val cache: File, val style: FilesInfo.Style, val defineClean: B val result = f(report) val info = if (filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo - toFile(style.formats)(info)(cache)(style.manifest) + + store.write(info) + result } } @@ -240,24 +180,24 @@ object FileFunction { * (which does the actual work: compiles, generates resources, etc.), returning * a Set of output files that it generated. * - * The input file and resulting output file state is cached in - * cacheBaseDirectory. On each invocation, the state of the input and output + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output * files from the previous run is compared against the cache, as is the set of * input files. If a change in file state / input files set is detected, the * action function is re-executed. 
* - * @param cacheBaseDirectory The folder in which to store + * @param storeFactory The factory to use to get stores for the input and output files. * @param inStyle The strategy by which to detect state change in the input files from the previous run * @param outStyle The strategy by which to detect state change in the output files from the previous run * @param action The work function, which receives a list of input files and returns a list of output files */ - def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style = FilesInfo.lastModified, outStyle: FilesInfo.Style = FilesInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = - cached(cacheBaseDirectory)(inStyle, outStyle)((in, out) => action(in.checked)) + def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style = FileInfo.lastModified, outStyle: FileInfo.Style = FileInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = + cached(storeFactory)(inStyle, outStyle)((in, out) => action(in.checked)) - def cached(cacheBaseDirectory: File)(inStyle: FilesInfo.Style, outStyle: FilesInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = + def cached(storeFactory: CacheStoreFactory)(inStyle: FileInfo.Style, outStyle: FileInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = { - lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle) - lazy val outCache = Difference.outputs(cacheBaseDirectory / "out-cache", outStyle) + lazy val inCache = Difference.inputs(storeFactory.derive("in-cache"), inStyle) + lazy val outCache = Difference.outputs(storeFactory.derive("out-cache"), outStyle) inputs => { inCache(inputs) { inReport => diff --git a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala b/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala new file mode 100644 index 000000000..b23c191dc --- /dev/null +++ b/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala @@ -0,0 +1,140 
@@ +package sbt.internal.util + +import sbt.io.IO +import sbt.io.syntax._ + +import CacheImplicits._ + +import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } + +import scala.json.ast.unsafe.JValue + +class TrackedSpec extends UnitSpec { + + implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, FixedParser.parseUnsafe) + + "lastOutput" should "store the last output" in { + withStore { store => + + val value = 5 + val otherValue = 10 + + val res0 = + Tracked.lastOutput[Int, Int](store) { + case (in, None) => + assert(in === value) + in + case (in, Some(_)) => + fail() + }(implicitly)(value) + assert(res0 === value) + + val res1 = + Tracked.lastOutput[Int, Int](store) { + case (in, None) => + fail() + case (in, Some(read)) => + assert(in === otherValue) + assert(read === value) + read + }(implicitly)(otherValue) + assert(res1 === value) + + val res2 = + Tracked.lastOutput[Int, Int](store) { + case (in, None) => + fail() + case (in, Some(read)) => + assert(in === otherValue) + assert(read === value) + read + }(implicitly)(otherValue) + assert(res2 === value) + } + } + + "inputChanged" should "detect that the input has not changed" in { + withStore { store => + val input0 = 0 + + val res0 = + Tracked.inputChanged[Int, Int](store) { + case (true, in) => + assert(in === input0) + in + case (false, in) => + fail() + }(implicitly, implicitly)(input0) + assert(res0 === input0) + + val res1 = + Tracked.inputChanged[Int, Int](store) { + case (true, in) => + fail() + case (false, in) => + assert(in === input0) + in + }(implicitly, implicitly)(input0) + assert(res1 === input0) + + } + } + + it should "detect that the input has changed" in { + withStore { store => + val input0 = 0 + val input1 = 1 + + val res0 = + Tracked.inputChanged[Int, Int](store) { + case (true, in) => + assert(in === input0) + in + case (false, in) => + fail() + 
}(implicitly, implicitly)(input0) + assert(res0 === input0) + + val res1 = + Tracked.inputChanged[Int, Int](store) { + case (true, in) => + assert(in === input1) + in + case (false, in) => + fail() + }(implicitly, implicitly)(input1) + assert(res1 === input1) + + } + } + + "tstamp tracker" should "have a timestamp of 0 on first invocation" in { + withStore { store => + Tracked.tstamp(store) { last => + assert(last === 0) + } + } + } + + it should "provide the last time a function has been evaluated" in { + withStore { store => + + Tracked.tstamp(store) { last => + assert(last === 0) + } + + Tracked.tstamp(store) { last => + val difference = System.currentTimeMillis - last + assert(difference < 1000) + } + } + } + + private def withStore(f: CacheStore => Unit): Unit = + IO.withTemporaryDirectory { tmp => + val store = new FileBasedStore(tmp / "cache-store", Converter) + f(store) + } + +} \ No newline at end of file diff --git a/internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala b/internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala new file mode 100644 index 000000000..7f9f759dc --- /dev/null +++ b/internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala @@ -0,0 +1,29 @@ +package sjsonnew +package support.scalajson.unsafe + +import scala.json.ast.unsafe._ +import scala.collection.mutable +import jawn.{ SupportParser, MutableFacade } + +object FixedParser extends SupportParser[JValue] { + implicit val facade: MutableFacade[JValue] = + new MutableFacade[JValue] { + def jnull() = JNull + def jfalse() = JFalse + def jtrue() = JTrue + def jnum(s: String) = JNumber(s) + def jint(s: String) = JNumber(s) + def jstring(s: String) = JString(s) + def jarray(vs: mutable.ArrayBuffer[JValue]) = JArray(vs.toArray) + def jobject(vs: mutable.Map[String, JValue]) = { + val array = new Array[JField](vs.size) + var i = 0 + vs.foreach { + case (key, value) => + array(i) = 
JField(key, value) + i += 1 + } + JObject(array) + } + } +} \ No newline at end of file diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 23d6b1f77..3db532055 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -8,7 +8,6 @@ object Dependencies { lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M6" lazy val jline = "jline" % "jline" % "2.13" lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" - lazy val sbinary = "org.scala-sbt" %% "sbinary" % "0.4.3" lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } @@ -27,4 +26,7 @@ object Dependencies { val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" + + lazy val datatypeCodecs = "org.scala-sbt" %% "datatype-codecs" % "1.0.0-SNAPSHOT" + lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % "0.4.0" } From eda708dfeb7548dc169090ea536435575228fe8d Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Tue, 28 Jun 2016 19:44:32 +0200 Subject: [PATCH 596/823] Add `StreamFormat` and `HListFormat` --- .../sbt/internal/util/AdditionalFormats.scala | 20 +++++++++++++++++-- .../sbt/internal/util/CacheImplicits.scala | 2 ++ 2 files changed, 20 insertions(+), 2 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala index 6e74c26c7..79c90e9e4 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -1,12 +1,12 @@ package sbt.internal.util -import sbt.datatype.StringFormat +import sbt.datatype.{ ArrayFormat, ByteFormat, StringFormat } import sbt.internal.util.Types.:+: import 
sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } import sjsonnew.BasicJsonProtocol.{ wrap, asSingleton } -import java.io.File +import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } import java.net.{ URI, URL } @@ -50,3 +50,19 @@ trait HListFormat { implicit val HNilFormat: JsonFormat[HNil] = asSingleton(HNil) } + +trait StreamFormat { self: ArrayFormat with ByteFormat => + def streamFormat[T](write: (T, OutputStream) => Unit, read: InputStream => T): JsonFormat[T] = { + lazy val byteArrayFormat = implicitly[JsonFormat[Array[Byte]]] + val toBytes = (t: T) => { val bos = new ByteArrayOutputStream(); write(t, bos); bos.toByteArray } + val fromBytes = (bs: Array[Byte]) => read(new ByteArrayInputStream(bs)) + + new JsonFormat[T] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = + fromBytes(byteArrayFormat.read(jsOpt, unbuilder)) + + override def write[J](obj: T, builder: Builder[J]): Unit = + byteArrayFormat.write(toBytes(obj), builder) + } + } +} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala index 190282a6e..9e45b90c6 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -9,9 +9,11 @@ object CacheImplicits extends BasicCacheImplicits with ByteFormat with FileFormat with IntFormat + with HListFormat with LongFormat with StringFormat with URIFormat with URLFormat + with StreamFormat with TupleFormats with CollectionFormats \ No newline at end of file From c395bd14a8194b67a79fc53d3bbd6358a3880fde Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Tue, 28 Jun 2016 19:44:51 +0200 Subject: [PATCH 597/823] Add `sub` to `CacheStoreFactory` --- .../src/main/scala/sbt/internal/util/CacheStore.scala | 6 ++++++ 1 file changed, 6 insertions(+) diff --git 
a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala index 16ef54e95..175525658 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala @@ -21,6 +21,9 @@ trait CacheStore extends Input with Output { trait CacheStoreFactory { /** Create a new store. */ def derive(identifier: String): CacheStore + + /** Create a new `CacheStoreFactory` from this factory. */ + def sub(identifier: String): CacheStoreFactory } /** @@ -32,6 +35,9 @@ class DirectoryStoreFactory[J: IsoString](base: File, converter: SupportConverte override def derive(identifier: String): CacheStore = new FileBasedStore(base / identifier, converter) + + override def sub(identifier: String): CacheStoreFactory = + new DirectoryStoreFactory(base / identifier, converter) } /** From 465774b13cbf4d9241718c95d28ef7d27e601e66 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Sun, 24 Jul 2016 21:10:36 +0100 Subject: [PATCH 598/823] Adapt to recent changes to sbt-datatype --- build.sbt | 2 +- .../sbt/internal/util/AdditionalFormats.scala | 15 +++++++++------ .../sbt/internal/util/BasicCacheImplicits.scala | 7 ++----- .../scala/sbt/internal/util/CacheImplicits.scala | 12 ++---------- .../scala/sbt/internal/util/StampedFormat.scala | 6 ++---- .../util-cache/src/test/scala/CacheSpec.scala | 2 +- .../main/scala/sbt/internal/util/Tracked.scala | 2 +- .../scala/sbt/internal/util/TrackedSpec.scala | 2 +- project/Dependencies.scala | 6 +++--- 9 files changed, 22 insertions(+), 32 deletions(-) diff --git a/build.sbt b/build.sbt index 2a335cd79..6b1655678 100644 --- a/build.sbt +++ b/build.sbt @@ -117,7 +117,7 @@ lazy val utilCache = (project in internalPath / "util-cache"). 
settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(datatypeCodecs, sbtSerialization, scalaReflect.value, sbtIO) ++ scalaXml.value, + libraryDependencies ++= Seq(sjsonnew, scalaReflect.value, sbtIO) ++ scalaXml.value, libraryDependencies += sjsonnewScalaJson % Test ) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala index 79c90e9e4..00fc195ef 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -1,27 +1,30 @@ package sbt.internal.util -import sbt.datatype.{ ArrayFormat, ByteFormat, StringFormat } import sbt.internal.util.Types.:+: import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } -import sjsonnew.BasicJsonProtocol.{ wrap, asSingleton } +import sjsonnew.BasicJsonProtocol, BasicJsonProtocol.asSingleton import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } import java.net.{ URI, URL } -trait URIFormat { self: StringFormat => +trait URIFormat { self: BasicJsonProtocol => implicit def URIFormat: JsonFormat[URI] = wrap(_.toString, new URI(_: String)) } -trait URLFormat { self: StringFormat => +trait URLFormat { self: BasicJsonProtocol => implicit def URLFormat: JsonFormat[URL] = wrap(_.toString, new URL(_: String)) } -trait FileFormat { self: StringFormat => +trait FileFormat { self: BasicJsonProtocol => implicit def FileFormat: JsonFormat[File] = wrap(_.toString, new File(_: String)) } +trait SetFormat { self: BasicJsonProtocol => + implicit def SetFormat[T: JsonFormat]: JsonFormat[Set[T]] = wrap(_.toSeq, (_: Seq[T]).toSet) +} + trait HListFormat { implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = new JsonFormat[H :+: T] { @@ -51,7 +54,7 @@ trait HListFormat { } -trait StreamFormat { self: ArrayFormat with 
ByteFormat => +trait StreamFormat { self: BasicJsonProtocol => def streamFormat[T](write: (T, OutputStream) => Unit, read: InputStream => T): JsonFormat[T] = { lazy val byteArrayFormat = implicitly[JsonFormat[Array[Byte]]] val toBytes = (t: T) => { val bos = new ByteArrayOutputStream(); write(t, bos); bos.toByteArray } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala index 7829e8e22..1d1ebe16d 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala @@ -1,13 +1,10 @@ package sbt.internal.util -import sbt.datatype.{ ArrayFormat, BooleanFormat, ByteFormat, IntFormat } - import java.net.{ URI, URL } -import sjsonnew.JsonFormat -import sjsonnew.BasicJsonProtocol.asSingleton +import sjsonnew.{ BasicJsonProtocol, JsonFormat } -trait BasicCacheImplicits { self: ArrayFormat with BooleanFormat with ByteFormat with IntFormat => +trait BasicCacheImplicits { self: BasicJsonProtocol => implicit def basicCache[I: JsonFormat: Equiv, O: JsonFormat]: Cache[I, O] = new BasicCache[I, O]() diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala index 9e45b90c6..0ebdf134b 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -1,19 +1,11 @@ package sbt.internal.util -import sbt.datatype.{ ArrayFormat, BooleanFormat, ByteFormat, IntFormat, LongFormat, StringFormat } -import sjsonnew.{ CollectionFormats, TupleFormats } +import sjsonnew.BasicJsonProtocol object CacheImplicits extends BasicCacheImplicits - with ArrayFormat - with BooleanFormat - with ByteFormat + with BasicJsonProtocol with FileFormat - with IntFormat with 
HListFormat - with LongFormat - with StringFormat with URIFormat with URLFormat with StreamFormat - with TupleFormats - with CollectionFormats \ No newline at end of file diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala b/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala index 1d3a6d9fc..1e11214bd 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala @@ -2,11 +2,9 @@ package sbt.internal.util import scala.reflect.Manifest -import sbt.datatype.IntFormat +import sjsonnew.{ BasicJsonProtocol, Builder, deserializationError, JsonFormat, Unbuilder } -import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } - -object StampedFormat extends IntFormat { +object StampedFormat extends BasicJsonProtocol { def apply[T](format: JsonFormat[T])(implicit mf: Manifest[JsonFormat[T]]): JsonFormat[T] = { withStamp(stamp(format))(format) diff --git a/internal/util-cache/src/test/scala/CacheSpec.scala b/internal/util-cache/src/test/scala/CacheSpec.scala index 8e2ebd12a..7b9924b6e 100644 --- a/internal/util-cache/src/test/scala/CacheSpec.scala +++ b/internal/util-cache/src/test/scala/CacheSpec.scala @@ -5,7 +5,7 @@ import sbt.io.syntax._ import CacheImplicits._ -import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } +import sjsonnew.IsoString import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } import scala.json.ast.unsafe.JValue diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index 5aaf42e10..47f0b0f00 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -13,7 +13,7 @@ import sjsonnew.JsonFormat object Tracked { - import 
CacheImplicits.LongFormat + import CacheImplicits.LongJsonFormat /** * Creates a tracker that provides the last time it was evaluated. diff --git a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala b/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala index b23c191dc..3c38a2c4a 100644 --- a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala +++ b/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala @@ -5,7 +5,7 @@ import sbt.io.syntax._ import CacheImplicits._ -import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } +import sjsonnew.IsoString import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } import scala.json.ast.unsafe.JValue diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 3db532055..4777365c4 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,6 @@ object Dependencies { lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M6" lazy val jline = "jline" % "jline" % "2.13" - lazy val sbtSerialization = "org.scala-sbt" %% "serialization" % "0.1.2" lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } @@ -27,6 +26,7 @@ object Dependencies { lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - lazy val datatypeCodecs = "org.scala-sbt" %% "datatype-codecs" % "1.0.0-SNAPSHOT" - lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % "0.4.0" + lazy val sjsonnewVersion = "0.4.0" + lazy val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion + lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From feda07b896a5be3e7bfd548049e198e90636b01b Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Sun, 24 Jul 2016 21:42:03 +0100 
Subject: [PATCH 599/823] Adapt to recent changes to sjson-new --- .../sbt/internal/util/AdditionalFormats.scala | 8 ++++---- .../scala/sbt/internal/util/StampedFormat.scala | 16 +++++++--------- project/Dependencies.scala | 2 +- 3 files changed, 12 insertions(+), 14 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala index 00fc195ef..8008cf90a 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -10,19 +10,19 @@ import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, import java.net.{ URI, URL } trait URIFormat { self: BasicJsonProtocol => - implicit def URIFormat: JsonFormat[URI] = wrap(_.toString, new URI(_: String)) + implicit def URIFormat: JsonFormat[URI] = project(_.toString, new URI(_: String)) } trait URLFormat { self: BasicJsonProtocol => - implicit def URLFormat: JsonFormat[URL] = wrap(_.toString, new URL(_: String)) + implicit def URLFormat: JsonFormat[URL] = project(_.toString, new URL(_: String)) } trait FileFormat { self: BasicJsonProtocol => - implicit def FileFormat: JsonFormat[File] = wrap(_.toString, new File(_: String)) + implicit def FileFormat: JsonFormat[File] = project(_.toString, new File(_: String)) } trait SetFormat { self: BasicJsonProtocol => - implicit def SetFormat[T: JsonFormat]: JsonFormat[Set[T]] = wrap(_.toSeq, (_: Seq[T]).toSet) + implicit def SetFormat[T: JsonFormat]: JsonFormat[Set[T]] = project(_.toSeq, (_: Seq[T]).toSet) } trait HListFormat { diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala b/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala index 1e11214bd..213f50ec5 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala +++ 
b/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala @@ -15,15 +15,13 @@ object StampedFormat extends BasicJsonProtocol { override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = jsOpt match { case Some(js) => - unbuilder.extractArray(js) match { - case Vector(readStamp, readValue) => - val actualStamp = formatStamp.read(Some(readStamp), unbuilder) - if (equivStamp.equiv(actualStamp, stamp)) format.read(Some(readValue), unbuilder) - else sys.error(s"Incorrect stamp. Expected: $stamp, Found: $readStamp") - - case other => - deserializationError(s"Expected JsArray of size 2, but found JsArray of size ${other.size}") - } + val stampedLength = unbuilder.beginArray(js) + if (stampedLength != 2) sys.error(s"Expected JsArray of size 2, found JsArray of size $stampedLength.") + val readStamp = unbuilder.nextElement + val readValue = unbuilder.nextElement + val actualStamp = formatStamp.read(Some(readStamp), unbuilder) + if (equivStamp.equiv(actualStamp, stamp)) format.read(Some(readValue), unbuilder) + else sys.error(s"Incorrect stamp. Expected: $stamp, Found: $readStamp") case None => deserializationError("Expected JsArray but found None.") diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 4777365c4..8be45ba91 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -26,7 +26,7 @@ object Dependencies { lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - lazy val sjsonnewVersion = "0.4.0" + lazy val sjsonnewVersion = "0.4.1" lazy val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From 432c93b0bbe85435dbe2c1e61c8748ee98008772 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Fri, 19 Aug 2016 15:41:56 +0200 Subject: [PATCH 600/823] Implement hashCode, equals and toString in Maybe This brings Maybe's behavior closer to scala's Option. 
--- .../src/main/java/xsbti/Maybe.java | 21 ++++++++++++++++++- 1 file changed, 20 insertions(+), 1 deletion(-) diff --git a/internal/util-interface/src/main/java/xsbti/Maybe.java b/internal/util-interface/src/main/java/xsbti/Maybe.java index f730ef918..c52671f77 100644 --- a/internal/util-interface/src/main/java/xsbti/Maybe.java +++ b/internal/util-interface/src/main/java/xsbti/Maybe.java @@ -14,6 +14,16 @@ public abstract class Maybe return new Maybe() { public boolean isDefined() { return true; } public s get() { return v; } + public int hashCode() { return 17 + (v == null ? 0 : v.hashCode()); } + public String toString() { return "Maybe(" + v + ")"; } + public boolean equals(Object o) { + if (o == null) return false; + if (!(o instanceof Maybe)) return false; + Maybe other = (Maybe) o; + if (!other.isDefined()) return false; + if (v == null) return other.get() == null; + return v.equals(other.get()); + } }; } public static Maybe nothing() @@ -21,10 +31,19 @@ public abstract class Maybe return new Maybe() { public boolean isDefined() { return false; } public s get() { throw new UnsupportedOperationException("nothing.get"); } + public int hashCode() { return 1; } + public String toString() { return "Nothing"; } + public boolean equals(Object o) { + if (o == null) return false; + if (!(o instanceof Maybe)) return false; + Maybe other = (Maybe) o; + return !other.isDefined(); + } }; + } public final boolean isEmpty() { return !isDefined(); } public abstract boolean isDefined(); public abstract t get(); -} \ No newline at end of file +} From 4e233d81f960d0b1d79a6786214040756d9156c3 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Sun, 21 Aug 2016 14:24:34 +0200 Subject: [PATCH 601/823] Make Maybe's toString closer to the actual code --- internal/util-interface/src/main/java/xsbti/Maybe.java | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/internal/util-interface/src/main/java/xsbti/Maybe.java 
b/internal/util-interface/src/main/java/xsbti/Maybe.java index c52671f77..3280e5990 100644 --- a/internal/util-interface/src/main/java/xsbti/Maybe.java +++ b/internal/util-interface/src/main/java/xsbti/Maybe.java @@ -15,7 +15,7 @@ public abstract class Maybe public boolean isDefined() { return true; } public s get() { return v; } public int hashCode() { return 17 + (v == null ? 0 : v.hashCode()); } - public String toString() { return "Maybe(" + v + ")"; } + public String toString() { return "Maybe.just(" + v + ")"; } public boolean equals(Object o) { if (o == null) return false; if (!(o instanceof Maybe)) return false; @@ -32,7 +32,7 @@ public abstract class Maybe public boolean isDefined() { return false; } public s get() { throw new UnsupportedOperationException("nothing.get"); } public int hashCode() { return 1; } - public String toString() { return "Nothing"; } + public String toString() { return "Maybe.nothing()"; } public boolean equals(Object o) { if (o == null) return false; if (!(o instanceof Maybe)) return false; From cee43575ce3988b1565489db3166281cd7899cde Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Sun, 21 Aug 2016 17:52:08 +0200 Subject: [PATCH 602/823] Remove unused additional formats --- .../sbt/internal/util/AdditionalFormats.scala | 71 ------------------- .../sbt/internal/util/CacheImplicits.scala | 5 -- 2 files changed, 76 deletions(-) delete mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala deleted file mode 100644 index 8008cf90a..000000000 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ /dev/null @@ -1,71 +0,0 @@ -package sbt.internal.util - -import sbt.internal.util.Types.:+: - -import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } -import sjsonnew.BasicJsonProtocol, 
BasicJsonProtocol.asSingleton - -import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } - -import java.net.{ URI, URL } - -trait URIFormat { self: BasicJsonProtocol => - implicit def URIFormat: JsonFormat[URI] = project(_.toString, new URI(_: String)) -} - -trait URLFormat { self: BasicJsonProtocol => - implicit def URLFormat: JsonFormat[URL] = project(_.toString, new URL(_: String)) -} - -trait FileFormat { self: BasicJsonProtocol => - implicit def FileFormat: JsonFormat[File] = project(_.toString, new File(_: String)) -} - -trait SetFormat { self: BasicJsonProtocol => - implicit def SetFormat[T: JsonFormat]: JsonFormat[Set[T]] = project(_.toSeq, (_: Seq[T]).toSet) -} - -trait HListFormat { - implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = - new JsonFormat[H :+: T] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): H :+: T = - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) - val h = unbuilder.readField[H]("h") - val t = unbuilder.readField[T]("t") - unbuilder.endObject() - - HCons(h, t) - - case None => - deserializationError("Expect JValue but found None") - } - - override def write[J](obj: H :+: T, builder: Builder[J]): Unit = { - builder.beginObject() - builder.addField("h", obj.head) - builder.addField("t", obj.tail) - builder.endObject() - } - } - - implicit val HNilFormat: JsonFormat[HNil] = asSingleton(HNil) - -} - -trait StreamFormat { self: BasicJsonProtocol => - def streamFormat[T](write: (T, OutputStream) => Unit, read: InputStream => T): JsonFormat[T] = { - lazy val byteArrayFormat = implicitly[JsonFormat[Array[Byte]]] - val toBytes = (t: T) => { val bos = new ByteArrayOutputStream(); write(t, bos); bos.toByteArray } - val fromBytes = (bs: Array[Byte]) => read(new ByteArrayInputStream(bs)) - - new JsonFormat[T] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = - fromBytes(byteArrayFormat.read(jsOpt, unbuilder)) - - 
override def write[J](obj: T, builder: Builder[J]): Unit = - byteArrayFormat.write(toBytes(obj), builder) - } - } -} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala index 0ebdf134b..78e6e301c 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -4,8 +4,3 @@ import sjsonnew.BasicJsonProtocol object CacheImplicits extends BasicCacheImplicits with BasicJsonProtocol - with FileFormat - with HListFormat - with URIFormat - with URLFormat - with StreamFormat From 8956da53a8aa5c8689717b7dc9433e026d7098a4 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Sun, 21 Aug 2016 19:56:31 +0200 Subject: [PATCH 603/823] Update to latest revision of sjsonnew --- .../util-cache/src/test/scala/CacheSpec.scala | 4 +-- .../src/test/scala/SingletonCacheSpec.scala | 4 +-- .../scalajson/unsafe/FixedParser.scala | 29 ------------------- .../scala/sbt/internal/util/TrackedSpec.scala | 4 +-- .../scalajson/unsafe/FixedParser.scala | 29 ------------------- project/Dependencies.scala | 2 +- 6 files changed, 7 insertions(+), 65 deletions(-) delete mode 100644 internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala delete mode 100644 internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala diff --git a/internal/util-cache/src/test/scala/CacheSpec.scala b/internal/util-cache/src/test/scala/CacheSpec.scala index 7b9924b6e..a3b0dd5e1 100644 --- a/internal/util-cache/src/test/scala/CacheSpec.scala +++ b/internal/util-cache/src/test/scala/CacheSpec.scala @@ -6,13 +6,13 @@ import sbt.io.syntax._ import CacheImplicits._ import sjsonnew.IsoString -import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } import 
scala.json.ast.unsafe.JValue class CacheSpec extends UnitSpec { - implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, FixedParser.parseUnsafe) + implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) "A cache" should "NOT throw an exception if read without being written previously" in { testCache[String, Int] { diff --git a/internal/util-cache/src/test/scala/SingletonCacheSpec.scala b/internal/util-cache/src/test/scala/SingletonCacheSpec.scala index 42c774a2f..da883d446 100644 --- a/internal/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/internal/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -6,7 +6,7 @@ import sbt.io.syntax._ import CacheImplicits._ import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } -import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } import scala.json.ast.unsafe.JValue @@ -41,7 +41,7 @@ class SingletonCacheSpec extends UnitSpec { } } - implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, FixedParser.parseUnsafe) + implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) "A singleton cache" should "throw an exception if read without being written previously" in { testCache[Int] { diff --git a/internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala b/internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala deleted file mode 100644 index 7f9f759dc..000000000 --- a/internal/util-cache/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala +++ /dev/null @@ -1,29 +0,0 @@ -package sjsonnew -package support.scalajson.unsafe - -import scala.json.ast.unsafe._ -import scala.collection.mutable -import jawn.{ SupportParser, MutableFacade } - -object FixedParser extends 
SupportParser[JValue] { - implicit val facade: MutableFacade[JValue] = - new MutableFacade[JValue] { - def jnull() = JNull - def jfalse() = JTrue - def jtrue() = JFalse - def jnum(s: String) = JNumber(s) - def jint(s: String) = JNumber(s) - def jstring(s: String) = JString(s) - def jarray(vs: mutable.ArrayBuffer[JValue]) = JArray(vs.toArray) - def jobject(vs: mutable.Map[String, JValue]) = { - val array = new Array[JField](vs.size) - var i = 0 - vs.foreach { - case (key, value) => - array(i) = JField(key, value) - i += 1 - } - JObject(array) - } - } -} \ No newline at end of file diff --git a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala b/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala index 3c38a2c4a..df23ae8e7 100644 --- a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala +++ b/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala @@ -6,13 +6,13 @@ import sbt.io.syntax._ import CacheImplicits._ import sjsonnew.IsoString -import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, FixedParser } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } import scala.json.ast.unsafe.JValue class TrackedSpec extends UnitSpec { - implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, FixedParser.parseUnsafe) + implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) "lastOutput" should "store the last output" in { withStore { store => diff --git a/internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala b/internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala deleted file mode 100644 index 7f9f759dc..000000000 --- a/internal/util-tracking/src/test/scala/sjsonnew/support/scalajson/unsafe/FixedParser.scala +++ /dev/null @@ -1,29 +0,0 @@ -package sjsonnew -package support.scalajson.unsafe - -import 
scala.json.ast.unsafe._ -import scala.collection.mutable -import jawn.{ SupportParser, MutableFacade } - -object FixedParser extends SupportParser[JValue] { - implicit val facade: MutableFacade[JValue] = - new MutableFacade[JValue] { - def jnull() = JNull - def jfalse() = JTrue - def jtrue() = JFalse - def jnum(s: String) = JNumber(s) - def jint(s: String) = JNumber(s) - def jstring(s: String) = JString(s) - def jarray(vs: mutable.ArrayBuffer[JValue]) = JArray(vs.toArray) - def jobject(vs: mutable.Map[String, JValue]) = { - val array = new Array[JField](vs.size) - var i = 0 - vs.foreach { - case (key, value) => - array(i) = JField(key, value) - i += 1 - } - JObject(array) - } - } -} \ No newline at end of file diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 8be45ba91..18e4189c4 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -26,7 +26,7 @@ object Dependencies { lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - lazy val sjsonnewVersion = "0.4.1" + lazy val sjsonnewVersion = "0.4.2" lazy val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From bc32cb4c6fcbcea2a5c4b7813daa29d61643457c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 8 Aug 2016 01:11:33 -0400 Subject: [PATCH 604/823] Trying to make readline timeout --- .../scala/sbt/internal/util/LineReader.scala | 16 ++++++++++++++++ 1 file changed, 16 insertions(+) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index c05a7427d..eb838824f 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -140,6 +140,22 @@ private[sbt] class InputStreamWrapper(is: InputStream, val poll: Duration) 
exten Thread.sleep(poll.toMillis) read() } + + @tailrec + final override def read(b: Array[Byte]): Int = + if (is.available() != 0) is.read(b) + else { + Thread.sleep(poll.toMillis) + read(b) + } + + @tailrec + final override def read(b: Array[Byte], off: Int, len: Int): Int = + if (is.available() != 0) is.read(b, off, len) + else { + Thread.sleep(poll.toMillis) + read(b, off, len) + } } trait LineReader { From a38e100678a60918c3e9c1d6f223b01fdd4aeeff Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 12 Sep 2016 23:00:16 -0400 Subject: [PATCH 605/823] Handle sleep interruption --- .java-version | 1 + .../scala/sbt/internal/util/LineReader.scala | 37 +++++++++++-------- 2 files changed, 23 insertions(+), 15 deletions(-) create mode 100644 .java-version diff --git a/.java-version b/.java-version new file mode 100644 index 000000000..d3bdbdf1f --- /dev/null +++ b/.java-version @@ -0,0 +1 @@ +1.7 diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala index eb838824f..b4d5d5f83 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala @@ -11,9 +11,10 @@ import scala.concurrent.duration.Duration import scala.annotation.tailrec abstract class JLine extends LineReader { - protected[this] val handleCONT: Boolean - protected[this] val reader: ConsoleReader - + protected[this] def handleCONT: Boolean + protected[this] def reader: ConsoleReader + protected[this] def injectThreadSleep: Boolean + protected[this] val in: InputStream = JLine.makeInputStream(injectThreadSleep) def readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]): Option[String] = @@ -37,9 +38,13 @@ abstract class JLine extends LineReader { private[this] def 
readLineDirectRaw(prompt: String, mask: Option[Char]): Option[String] = { val newprompt = handleMultilinePrompt(prompt) - mask match { - case Some(m) => Option(reader.readLine(newprompt, m)) - case None => Option(reader.readLine(newprompt)) + try { + mask match { + case Some(m) => Option(reader.readLine(newprompt, m)) + case None => Option(reader.readLine(newprompt)) + } + } catch { + case e: InterruptedException => Option("") } } @@ -81,6 +86,11 @@ private[sbt] object JLine { () } + protected[this] val originalIn = new FileInputStream(FileDescriptor.in) + private[sbt] def makeInputStream(injectThreadSleep: Boolean): InputStream = + if (injectThreadSleep) new InputStreamWrapper(originalIn, Duration("50 ms")) + else originalIn + // When calling this, ensure that enableEcho has been or will be called. // TerminalFactory.get will initialize the terminal to disable echo. private def terminal = jline.TerminalFactory.get @@ -98,14 +108,10 @@ private[sbt] object JLine { t.restore f(t) } - def createReader(): ConsoleReader = createReader(None, true) - def createReader(historyPath: Option[File], injectThreadSleep: Boolean): ConsoleReader = + def createReader(): ConsoleReader = createReader(None, JLine.makeInputStream(true)) + def createReader(historyPath: Option[File], in: InputStream): ConsoleReader = usingTerminal { t => - val cr = if (injectThreadSleep) { - val originalIn = new FileInputStream(FileDescriptor.in) - val in = new InputStreamWrapper(originalIn, Duration("50 ms")) - new ConsoleReader(in, System.out) - } else new ConsoleReader + val cr = new ConsoleReader(in, System.out) cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650 cr.setBellEnabled(false) val h = historyPath match { @@ -169,14 +175,15 @@ final class FullReader( ) extends JLine { protected[this] val reader = { - val cr = JLine.createReader(historyPath, injectThreadSleep) + val cr = JLine.createReader(historyPath, in) 
sbt.internal.util.complete.JLineCompletion.installCustomCompletor(cr, complete) cr } } class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean, val injectThreadSleep: Boolean) extends JLine { - protected[this] val reader = JLine.createReader(historyPath, injectThreadSleep) + protected[this] val reader = JLine.createReader(historyPath, in) + } object SimpleReader extends SimpleReader(None, JLine.HandleCONT, false) From 881ab0f298e838dd3b12e1908a3a5b4203a2c33e Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 27 Oct 2016 11:13:45 +0100 Subject: [PATCH 606/823] Expose Eval. Fixes sbt/sbt#2616 --- .../src/main/scala/sbt/{internal => }/util/Eval.scala | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename internal/util-collection/src/main/scala/sbt/{internal => }/util/Eval.scala (100%) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala b/internal/util-collection/src/main/scala/sbt/util/Eval.scala similarity index 100% rename from internal/util-collection/src/main/scala/sbt/internal/util/Eval.scala rename to internal/util-collection/src/main/scala/sbt/util/Eval.scala From c2b88760ad18b1aff865fe49774d7a487e0aa7e1 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 27 Oct 2016 11:43:34 +0100 Subject: [PATCH 607/823] Remove scala-xml, unused --- build.sbt | 2 +- project/Dependencies.scala | 10 ---------- 2 files changed, 1 insertion(+), 11 deletions(-) diff --git a/build.sbt b/build.sbt index 6b1655678..e7e40064a 100644 --- a/build.sbt +++ b/build.sbt @@ -117,7 +117,7 @@ lazy val utilCache = (project in internalPath / "util-cache"). 
settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnew, scalaReflect.value, sbtIO) ++ scalaXml.value, + libraryDependencies ++= Seq(sjsonnew, scalaReflect.value, sbtIO), libraryDependencies += sjsonnewScalaJson % Test ) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 18e4189c4..e19f9246f 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -11,16 +11,6 @@ object Dependencies { lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } - private def scala211Module(name: String, moduleVersion: String) = - Def.setting { - scalaVersion.value match { - case sv if (sv startsWith "2.9.") || (sv startsWith "2.10.") => Nil - case _ => ("org.scala-lang.modules" %% name % moduleVersion) :: Nil - } - } - - lazy val scalaXml = scala211Module("scala-xml", "1.0.5") - val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.1" val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" From 8d2f106f7adb5864b71d3141d98c74bdc029b5cd Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 28 Oct 2016 12:19:41 +0100 Subject: [PATCH 608/823] Really expose Eval. 
Fixes error in #50 --- internal/util-collection/src/main/scala/sbt/util/Eval.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-collection/src/main/scala/sbt/util/Eval.scala b/internal/util-collection/src/main/scala/sbt/util/Eval.scala index 1596e457b..8c142e336 100644 --- a/internal/util-collection/src/main/scala/sbt/util/Eval.scala +++ b/internal/util-collection/src/main/scala/sbt/util/Eval.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import scala.annotation.tailrec From ea56f331a159efdc1346d38b485cdbeb5550e193 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 31 Oct 2016 09:23:38 +0000 Subject: [PATCH 609/823] Lazily concatenate failed errors for completion [forwardport] (#53) * Fixes [sbt/sbt#2781] * When using `` completion the failed errors were always computed for matching projects even if there was no failure, leading to excessive computation of Levenshtein distances and a large lag (seconds) on builds with many matching projects.
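The laziness described in this commit message — keeping each failure's error list as a thunk and only concatenating on demand — can be sketched in plain Java with `Supplier` (hypothetical names and counters for illustration; this is not sbt's actual Parser code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class LazyErrors {
    static int expensiveCalls = 0;

    // Stands in for the costly Levenshtein-based suggestion computation.
    static List<String> expensiveErrors() {
        expensiveCalls++;
        List<String> errs = new ArrayList<>();
        errs.add("not found: fooo (did you mean foo?)");
        return errs;
    }

    // Error list stored only as a thunk; the parse succeeds, so it is never forced.
    static int lazyDemo() {
        expensiveCalls = 0;
        Supplier<List<String>> errors = LazyErrors::expensiveErrors; // not forced yet
        boolean parseSucceeded = true;
        if (!parseSucceeded) {
            errors.get(); // would force the computation only on failure
        }
        return expensiveCalls;
    }

    // Same scenario with an eagerly computed error list: the cost is always paid.
    static int eagerDemo() {
        expensiveCalls = 0;
        List<String> errors = expensiveErrors(); // always runs, even on success
        return expensiveCalls;
    }

    public static void main(String[] args) {
        System.out.println(lazyDemo() + " vs " + eagerDemo()); // 0 vs 1
    }
}
```

Forcing the thunk only on the failure path is what removes the lag: on a successful parse the suggestion work never runs at all.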
--- .../src/main/scala/sbt/internal/util/complete/Parser.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index bc1229ad1..0e12fd28a 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -559,8 +559,8 @@ trait ParserMain { def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil) def seq0[T](p: Seq[Parser[T]], errors: => Seq[String]): Parser[Seq[T]] = { - val (newErrors, valid) = separate(p) { case Invalid(f) => Left(f.errors); case ok => Right(ok) } - def combinedErrors = errors ++ newErrors.flatten + val (newErrors, valid) = separate(p) { case Invalid(f) => Left(f.errors _); case ok => Right(ok) } + def combinedErrors = errors ++ newErrors.flatMap(_()) if (valid.isEmpty) invalid(combinedErrors) else new ParserSeq(valid, combinedErrors) } From 41c7e9b85d12e66b69d24a6fdcf3d18bde442771 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 31 Oct 2016 15:11:00 +0000 Subject: [PATCH 610/823] Improve xsbti.Maybe * Make Nothing a singleton * Make Maybe's constructor private * Optimise equals to use reference equality first * Optimise Just.equals by having proper, non-anonymous subclasses * Having non-anonymous subclasses makes them have nicer classnames * Give Just a value() method --- .../src/main/java/xsbti/Maybe.java | 77 +++++++++---------- 1 file changed, 38 insertions(+), 39 deletions(-) diff --git a/internal/util-interface/src/main/java/xsbti/Maybe.java b/internal/util-interface/src/main/java/xsbti/Maybe.java index 3280e5990..0c5ea23b8 100644 --- a/internal/util-interface/src/main/java/xsbti/Maybe.java +++ b/internal/util-interface/src/main/java/xsbti/Maybe.java @@ -4,46 +4,45 @@ package xsbti; /** Intended as a lightweight carrier for scala.Option. 
*/ -public abstract class Maybe<t> -{ - // private pending Scala bug #3642 - protected Maybe() {} +public abstract class Maybe<t> { + private Maybe() {} - public static <s> Maybe<s> just(final s v) - { - return new Maybe<s>() { - public boolean isDefined() { return true; } - public s get() { return v; } - public int hashCode() { return 17 + (v == null ? 0 : v.hashCode()); } - public String toString() { return "Maybe.just(" + v + ")"; } - public boolean equals(Object o) { - if (o == null) return false; - if (!(o instanceof Maybe)) return false; - Maybe other = (Maybe) o; - if (!other.isDefined()) return false; - if (v == null) return other.get() == null; - return v.equals(other.get()); - } - }; - } - public static <s> Maybe<s> nothing() - { - return new Maybe<s>() { - public boolean isDefined() { return false; } - public s get() { throw new UnsupportedOperationException("nothing.get"); } - public int hashCode() { return 1; } - public String toString() { return "Maybe.nothing()"; } - public boolean equals(Object o) { - if (o == null) return false; - if (!(o instanceof Maybe)) return false; - Maybe other = (Maybe) o; - return !other.isDefined(); - } - }; + @SuppressWarnings("unchecked") + public static <s> Maybe<s> nothing() { return (Maybe<s>) Nothing.INSTANCE; } + public static <s> Maybe<s> just(final s v) { return new Just<s>(v); } - } + public static final class Just<s> extends Maybe<s> { + private final s v; - public final boolean isEmpty() { return !isDefined(); } - public abstract boolean isDefined(); - public abstract t get(); + public Just(final s v) { this.v = v; } + + public s value() { return v; } + + public boolean isDefined() { return true; } + public s get() { return v; } + public int hashCode() { return 17 + (v == null ? 0 : v.hashCode()); } + public String toString() { return "Maybe.just(" + v + ")"; } + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || !(o instanceof Just)) return false; + final Just that = (Just) o; + return v == null ?
that.v == null : v.equals(that.v); + } + } + + public static final class Nothing extends Maybe<Object> { + public static final Nothing INSTANCE = new Nothing(); + private Nothing() { } + + public boolean isDefined() { return false; } + public Object get() { throw new UnsupportedOperationException("nothing.get"); } + + public int hashCode() { return 1; } + public String toString() { return "Maybe.nothing()"; } + public boolean equals(Object o) { return this == o || o != null && o instanceof Nothing; } + } + + public final boolean isEmpty() { return !isDefined(); } + public abstract boolean isDefined(); + public abstract t get(); } From 0d86bbdd0e9e1ca8c18a87de5d43af986d367757 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 27 Oct 2016 12:05:29 +0100 Subject: [PATCH 611/823] Add props to define source dependency on io Use either -Dsbtio.path on the command line or sbtio.path= in project/local.properties --- build.sbt | 18 ++++++++++-------- project/Dependencies.scala | 21 ++++++++++++++++++++- 2 files changed, 30 insertions(+), 9 deletions(-) diff --git a/build.sbt b/build.sbt index e7e40064a..cd552be01 100644 --- a/build.sbt +++ b/build.sbt @@ -83,8 +83,9 @@ lazy val utilComplete = (project in internalPath / "util-complete"). settings( commonSettings, name := "Util Completion", - libraryDependencies ++= Seq(jline, sbtIO) - ) + libraryDependencies += jline + ). + configure(addSbtIO) // logging lazy val utilLogging = (project in internalPath / "util-logging"). @@ -117,9 +118,10 @@ lazy val utilCache = (project in internalPath / "util-cache"). settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnew, scalaReflect.value, sbtIO), + libraryDependencies ++= Seq(sjsonnew, scalaReflect.value), libraryDependencies += sjsonnewScalaJson % Test - ) + ). + configure(addSbtIO) // Builds on cache to provide caching for filesystem-related operations lazy val utilTracking = (project in internalPath / "util-tracking").
@@ -127,9 +129,9 @@ lazy val utilTracking = (project in internalPath / "util-tracking"). settings( commonSettings, name := "Util Tracking", - libraryDependencies += sbtIO, libraryDependencies += sjsonnewScalaJson % Test - ) + ). + configure(addSbtIO) // Internal utility for testing lazy val utilTesting = (project in internalPath / "util-testing"). @@ -144,12 +146,12 @@ lazy val utilScripted = (project in internalPath / "util-scripted"). settings( commonSettings, name := "Util Scripted", - libraryDependencies += sbtIO, libraryDependencies ++= { if (scalaVersion.value startsWith "2.11") Seq(parserCombinator211) else Seq() } - ) + ). + configure(addSbtIO) def customCommands: Seq[Setting[_]] = Seq( commands += Command.command("release") { state => diff --git a/project/Dependencies.scala b/project/Dependencies.scala index e19f9246f..ff5159413 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,7 +5,26 @@ object Dependencies { lazy val scala211 = "2.11.8" lazy val scala212 = "2.12.0-M4" - lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M6" + private lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M6" + + def getSbtModulePath(key: String, name: String) = { + val localProps = new java.util.Properties() + IO.load(localProps, file("project/local.properties")) + val path = Option(localProps getProperty key) orElse (sys.props get key) + path foreach (f => println(s"Using $name from $f")) + path + } + + lazy val sbtIoPath = getSbtModulePath("sbtio.path", "sbt/io") + + def addSbtModule(p: Project, path: Option[String], projectName: String, m: ModuleID, c: Option[Configuration] = None) = + path match { + case Some(f) => p dependsOn c.fold[ClasspathDependency](ProjectRef(file(f), projectName))(ProjectRef(file(f), projectName) % _) + case None => p settings (libraryDependencies += c.fold(m)(m % _)) + } + + def addSbtIO(p: Project): Project = addSbtModule(p, sbtIoPath, "io", sbtIO) + lazy val jline = "jline" % "jline" % "2.13" lazy val scalaCompiler = 
Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } From d6ebb4bc1e40b010055d2b9ca96c78dc5c520b1e Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 9 Nov 2016 14:06:48 +0000 Subject: [PATCH 612/823] Upgrade to sbt 0.13.13 (#56) --- project/Util.scala | 7 +++---- project/build.properties | 2 +- 2 files changed, 4 insertions(+), 5 deletions(-) diff --git a/project/Util.scala b/project/Util.scala index 860d6c84b..adb7cef5b 100644 --- a/project/Util.scala +++ b/project/Util.scala @@ -1,6 +1,5 @@ import sbt._ import Keys._ -import StringUtilities.normalize object Util { lazy val scalaKeywords = TaskKey[Set[String]]("scala-keywords") @@ -9,7 +8,7 @@ object Util { lazy val javaOnlySettings = Seq[Setting[_]]( crossPaths := false, compileOrder := CompileOrder.JavaThenScala, - unmanagedSourceDirectories in Compile <<= Seq(javaSource in Compile).join, + unmanagedSourceDirectories in Compile := Seq((javaSource in Compile).value), crossScalaVersions := Seq(Dependencies.scala211), autoScalaLibrary := false ) @@ -35,7 +34,7 @@ object %s { } def keywordsSettings: Seq[Setting[_]] = inConfig(Compile)(Seq( scalaKeywords := getScalaKeywords, - generateKeywords <<= (sourceManaged, scalaKeywords) map writeScalaKeywords, - sourceGenerators <+= generateKeywords map (x => Seq(x)) + generateKeywords := writeScalaKeywords(sourceManaged.value, scalaKeywords.value), + sourceGenerators += (generateKeywords map (x => Seq(x))).taskValue )) } diff --git a/project/build.properties b/project/build.properties index 817bc38df..27e88aa11 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=0.13.9 +sbt.version=0.13.13 From 033adfe4eaccb379976f55f7a652203c5205bdc0 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 15 Nov 2016 17:14:10 +0000 Subject: [PATCH 613/823] Remove deprecated methods --- .../scala/sbt/internal/util/Settings.scala | 23 +++-------------- .../main/scala/sbt/internal/util/Util.scala | 14 +++++------ 
.../internal/util/complete/Completions.scala | 10 +------- .../sbt/internal/util/complete/Parser.scala | 25 ------------------- .../sbt/internal/util/ConsoleLogger.scala | 20 +-------------- .../scala/sbt/internal/util/MainLogging.scala | 11 +------- 6 files changed, 12 insertions(+), 91 deletions(-) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala index ed23e377a..97117e2dc 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala @@ -87,9 +87,6 @@ trait Init[Scope] { */ private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) - @deprecated("Use the version with default arguments and default parameter.", "0.13.7") - final def derive[T](s: Setting[T], allowDynamic: Boolean, filter: Scope => Boolean, trigger: AttributeKey[_] => Boolean): Setting[T] = - derive(s, allowDynamic, filter, trigger, false) /** * Constructs a derived setting that will be automatically defined in every scope where one of its dependencies * is explicitly defined and the where the scope matches `filter`. 
@@ -101,10 +98,12 @@ trait Init[Scope] { val d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) if (default) d.default() else d } + def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { case _: Bind[_, _] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") case _ => None } + // id is used for equality private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s.default() private[sbt] def defaultSettings(ss: Seq[Setting[_]]): Seq[Setting[_]] = ss.map(s => defaultSetting(s)) @@ -238,14 +237,7 @@ trait Init[Scope] { } final class Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString) - final class Undefined private[sbt] (val defining: Setting[_], val referencedKey: ScopedKey[_]) { - @deprecated("For compatibility only, use `defining` directly.", "0.13.1") - val definingKey = defining.key - @deprecated("For compatibility only, use `defining` directly.", "0.13.1") - val derived: Boolean = defining.isDerived - @deprecated("Use the non-deprecated Undefined factory method.", "0.13.1") - def this(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean) = this(fakeUndefinedSetting(definingKey, derived), referencedKey) - } + final class Undefined private[sbt] (val defining: Setting[_], val referencedKey: ScopedKey[_]) final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") { override def getMessage = super.getMessage + undefined.map { u => @@ -253,15 +245,6 @@ trait Init[Scope] { }.mkString } - @deprecated("Use the other overload.", "0.13.1") - def Undefined(definingKey: ScopedKey[_], referencedKey: ScopedKey[_], derived: Boolean): Undefined = - new Undefined(fakeUndefinedSetting(definingKey, derived), referencedKey) - private[this] def fakeUndefinedSetting[T](definingKey: ScopedKey[T], d: Boolean): Setting[T] = - { - val init: Initialize[T] = pure(() => 
sys.error("Dummy setting for compatibility only.")) - new Setting(definingKey, init, NoPosition) { override def isDerived = d } - } - def Undefined(defining: Setting[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(defining, referencedKey) def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized = { diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala index 4f82cae9c..75b8224c1 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala @@ -25,18 +25,16 @@ object Util { def pairID[A, B] = (a: A, b: B) => (a, b) - private[this] lazy val Hypen = """-(\p{javaLowerCase})""".r + private[this] lazy val Hyphen = """-(\p{javaLowerCase})""".r + def hasHyphen(s: String): Boolean = s.indexOf('-') >= 0 - @deprecated("Use the properly spelled version: hyphenToCamel", "0.13.0") - def hypenToCamel(s: String): String = hyphenToCamel(s) + def hyphenToCamel(s: String): String = - if (hasHyphen(s)) - Hypen.replaceAllIn(s, _.group(1).toUpperCase(Locale.ENGLISH)) - else - s + if (hasHyphen(s)) Hyphen.replaceAllIn(s, _.group(1).toUpperCase(Locale.ENGLISH)) else s private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r - def camelToHypen(s: String): String = + + def camelToHyphen(s: String): String = Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase(Locale.ENGLISH)) def quoteIfKeyword(s: String): String = if (ScalaKeywords.values(s)) '`' + s + '`' else s diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala index c035f3620..47dbb3b4f 100644 --- 
a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala @@ -82,8 +82,6 @@ final class DisplayOnly(val display: String) extends Completion { override def toString = "{" + display + "}" } final class Token(val display: String, val append: String) extends Completion { - @deprecated("Retained only for compatibility. All information is now in `display` and `append`.", "0.12.1") - lazy val prepend = display.stripSuffix(append) def isEmpty = display.isEmpty && append.isEmpty override final def toString = "[" + display + "]++" + append } @@ -127,19 +125,13 @@ object Completion { // TODO: make strict in 0.13.0 to match DisplayOnly def displayOnly(value: => String): Completion = new DisplayOnly(value) - @deprecated("Use displayOnly.", "0.12.1") - def displayStrict(value: String): Completion = displayOnly(value) // TODO: make strict in 0.13.0 to match Token def token(prepend: => String, append: => String): Completion = new Token(prepend + append, append) - @deprecated("Use token.", "0.12.1") - def tokenStrict(prepend: String, append: String): Completion = token(prepend, append) /** @since 0.12.1 */ def tokenDisplay(append: String, display: String): Completion = new Token(display, append) // TODO: make strict in 0.13.0 to match Suggestion def suggestion(value: => String): Completion = new Suggestion(value) - @deprecated("Use suggestion.", "0.12.1") - def suggestStrict(value: String): Completion = suggestion(value) -} \ No newline at end of file +} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index 0e12fd28a..67651bfbd 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -79,18 +79,12 @@ sealed trait RichParser[A] { 
*/ def failOnException: Parser[A] - @deprecated("Use `not` and explicitly provide the failure message", "0.12.2") - def unary_- : Parser[Unit] - /** * Apply the original parser, but only succeed if `o` also succeeds. * Note that `o` does not need to consume the same amount of input to satisfy this condition. */ def &(o: Parser[_]): Parser[A] - @deprecated("Use `and` and `not` and explicitly provide the failure message", "0.12.2") - def -(o: Parser[_]): Parser[A] - /** Explicitly defines the completions for the original Parser.*/ def examples(s: String*): Parser[A] @@ -188,12 +182,6 @@ object Parser extends ParserMain { def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive) def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive) - @deprecated("This method is deprecated and will be removed in the next major version. Use the parser directly to check for invalid completions.", since = "0.13.2") - def checkMatches(a: Parser[_], completions: Seq[String]): Unit = { - val bad = completions.filter(apply(a)(_).resultEmpty.isFailure) - if (bad.nonEmpty) sys.error("Invalid example completions: " + bad.mkString("'", "', '", "'")) - } - def tuple[A, B](a: Option[A], b: Option[B]): Option[(A, B)] = (a, b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } @@ -280,9 +268,6 @@ object Parser extends ParserMain { } } - @deprecated("Explicitly call `and` and `not` to provide the failure message.", "0.12.2") - def sub[T](a: Parser[T], b: Parser[_]): Parser[T] = and(a, not(b)) - def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid(b.ifValid(new And(a, b))) } trait ParserMain { @@ -527,13 +512,6 @@ trait ParserMain { def token[T](t: Parser[T], complete: TokenCompletions): Parser[T] = mkToken(t, "", complete) - @deprecated("Use a different `token` overload.", "0.12.1") - def token[T](t: Parser[T], seen: String, track: Boolean, hide: Int => 
Boolean): Parser[T] = - { - val base = if (track) TokenCompletions.default else TokenCompletions.displayOnly(seen) - token(t, base.hideWhen(hide)) - } - private[sbt] def mkToken[T](t: Parser[T], seen: String, complete: TokenCompletions): Parser[T] = if (t.valid && !t.isTokenStart) if (t.result.isEmpty) new TokenStart(t, seen, complete) else t @@ -547,9 +525,6 @@ trait ParserMain { case (av, bv) => new HomParser(a, b) } - @deprecated("Explicitly specify the failure message.", "0.12.2") - def not(p: Parser[_]): Parser[Unit] = not(p, "Excluded.") - def not(p: Parser[_], failMessage: String): Parser[Unit] = p.result match { case None => new Not(p, failMessage) case Some(_) => failure(failMessage) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala index 5ca3fe9ee..9e562a770 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala @@ -4,28 +4,10 @@ package sbt.internal.util import sbt.util._ -import java.io.{ BufferedWriter, PrintStream, PrintWriter } +import java.io.{ PrintStream, PrintWriter } import java.util.Locale object ConsoleLogger { - @deprecated("Moved to ConsoleOut", "0.13.0") - def systemOut: ConsoleOut = ConsoleOut.systemOut - - @deprecated("Moved to ConsoleOut", "0.13.0") - def overwriteContaining(s: String): (String, String) => Boolean = ConsoleOut.overwriteContaining(s) - - @deprecated("Moved to ConsoleOut", "0.13.0") - def systemOutOverwrite(f: (String, String) => Boolean): ConsoleOut = ConsoleOut.systemOutOverwrite(f) - - @deprecated("Moved to ConsoleOut", "0.13.0") - def printStreamOut(out: PrintStream): ConsoleOut = ConsoleOut.printStreamOut(out) - - @deprecated("Moved to ConsoleOut", "0.13.0") - def printWriterOut(out: PrintWriter): ConsoleOut = ConsoleOut.printWriterOut(out) - - @deprecated("Moved to ConsoleOut", "0.13.0") - 
def bufferedWriterOut(out: BufferedWriter): ConsoleOut = bufferedWriterOut(out) - /** Escape character, used to introduce an escape sequence. */ final val ESC = '\u001B' diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index aeab7e5cd..6cea97edb 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -28,18 +28,9 @@ object MainLogging { f } - @deprecated("Explicitly specify the console output.", "0.13.0") - def defaultMultiConfig(backing: AbstractLogger): MultiLoggerConfig = - defaultMultiConfig(ConsoleOut.systemOut, backing) def defaultMultiConfig(console: ConsoleOut, backing: AbstractLogger): MultiLoggerConfig = new MultiLoggerConfig(defaultScreen(console, ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) - @deprecated("Explicitly specify the console output.", "0.13.0") - def defaultScreen(): AbstractLogger = ConsoleLogger() - - @deprecated("Explicitly specify the console output.", "0.13.0") - def defaultScreen(suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(suppressedMessage = suppressedMessage) - def defaultScreen(console: ConsoleOut): AbstractLogger = ConsoleLogger(console) def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = ConsoleLogger(console, suppressedMessage = suppressedMessage) @@ -49,4 +40,4 @@ object MainLogging { } final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], - screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) \ No newline at end of file + screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) From f3adb2953c833b6c8aa08a03578674e1398ef3c6 
Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 15 Nov 2016 23:05:48 +0000 Subject: [PATCH 614/823] Add sbt-pgp, required by publishSigned in release --- project/pgp.sbt | 1 + 1 file changed, 1 insertion(+) create mode 100644 project/pgp.sbt diff --git a/project/pgp.sbt b/project/pgp.sbt new file mode 100644 index 000000000..4ce4d9ed4 --- /dev/null +++ b/project/pgp.sbt @@ -0,0 +1 @@ +addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.0.0") From 20deba4b24d536df2b09dbf0687613e6d188eea7 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 15 Nov 2016 23:03:18 +0000 Subject: [PATCH 615/823] 0.1.0-M15 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index cd552be01..7cfafd386 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "0.1.0-M13" +def baseVersion: String = "0.1.0-M15" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From a2c73e77f90e1bd018e770850b100733d06f0182 Mon Sep 17 00:00:00 2001 From: eugene yokota Date: Tue, 22 Nov 2016 18:41:29 -0500 Subject: [PATCH 616/823] Cross publish util-logging and util-testing (#59) --- .travis.yml | 4 ++++ build.sbt | 4 ++++ project/Dependencies.scala | 1 + 3 files changed, 9 insertions(+) diff --git a/.travis.yml b/.travis.yml index 639444bb0..895984e79 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,2 +1,6 @@ language: scala + scala: 2.11.8 + +script: + - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "so test" diff --git a/build.sbt b/build.sbt index 7cfafd386..c08b7a214 100644 --- a/build.sbt +++ b/build.sbt @@ -15,6 +15,8 @@ def commonSettings: Seq[Setting[_]] = Seq( javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala211), scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), + scalacOptions --= // scalac 2.10 rejects some 
HK types under -Xfuture it seems.. + (CrossVersion partialVersion scalaVersion.value collect { case (2, 10) => List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") }).toList.flatten, previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false @@ -92,6 +94,7 @@ lazy val utilLogging = (project in internalPath / "util-logging"). dependsOn(utilInterface, utilTesting % Test). settings( commonSettings, + crossScalaVersions := Seq(scala210, scala211), name := "Util Logging", libraryDependencies += jline ) @@ -137,6 +140,7 @@ lazy val utilTracking = (project in internalPath / "util-tracking"). lazy val utilTesting = (project in internalPath / "util-testing"). settings( commonSettings, + crossScalaVersions := Seq(scala210, scala211), name := "Util Testing", libraryDependencies ++= Seq(scalaCheck, scalatest) ) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index ff5159413..8e9f07b08 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -2,6 +2,7 @@ import sbt._ import Keys._ object Dependencies { + lazy val scala210 = "2.10.6" lazy val scala211 = "2.11.8" lazy val scala212 = "2.12.0-M4" From ab08e1a9d551fb1d647f0486cfdbc2427f5244a9 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 24 Nov 2016 15:48:47 +0000 Subject: [PATCH 617/823] Add back additional formats. At least a subset of these are required for sbt/sbt to migrate away from sbinary. This reverts commit cee43575ce3988b1565489db3166281cd7899cde. 
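Among the formats this commit restores, `streamFormat` adapts a stream-based writer/reader pair into a JSON format by round-tripping values through a byte array. The core round-trip can be sketched in plain Java (illustrative code with made-up names, not the sjson-new API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

public class StreamRoundTrip {
    // Capture a stream-based serialization as a byte array,
    // mirroring the toBytes helper inside streamFormat.
    static byte[] toBytes(String value) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            new DataOutputStream(bos).writeUTF(value);
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Rehydrate the value with the matching stream-based reader,
    // mirroring the fromBytes helper.
    static String fromBytes(byte[] bytes) {
        try {
            return new DataInputStream(new ByteArrayInputStream(bytes)).readUTF();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        String original = "cache-entry";
        System.out.println(fromBytes(toBytes(original)).equals(original)); // true
    }
}
```

With this adapter in place, the JSON side only needs a format for `Array[Byte]`: any type that already has stream-based binary serialization (as sbinary users do) gets a `JsonFormat` for free.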
--- .../sbt/internal/util/AdditionalFormats.scala | 71 +++++++++++++++++++ .../sbt/internal/util/CacheImplicits.scala | 5 ++ 2 files changed, 76 insertions(+) create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala new file mode 100644 index 000000000..8008cf90a --- /dev/null +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -0,0 +1,71 @@ +package sbt.internal.util + +import sbt.internal.util.Types.:+: + +import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } +import sjsonnew.BasicJsonProtocol, BasicJsonProtocol.asSingleton + +import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } + +import java.net.{ URI, URL } + +trait URIFormat { self: BasicJsonProtocol => + implicit def URIFormat: JsonFormat[URI] = project(_.toString, new URI(_: String)) +} + +trait URLFormat { self: BasicJsonProtocol => + implicit def URLFormat: JsonFormat[URL] = project(_.toString, new URL(_: String)) +} + +trait FileFormat { self: BasicJsonProtocol => + implicit def FileFormat: JsonFormat[File] = project(_.toString, new File(_: String)) +} + +trait SetFormat { self: BasicJsonProtocol => + implicit def SetFormat[T: JsonFormat]: JsonFormat[Set[T]] = project(_.toSeq, (_: Seq[T]).toSet) +} + +trait HListFormat { + implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = + new JsonFormat[H :+: T] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): H :+: T = + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val h = unbuilder.readField[H]("h") + val t = unbuilder.readField[T]("t") + unbuilder.endObject() + + HCons(h, t) + + case None => + deserializationError("Expect JValue but found None") + } + + override def write[J](obj: H :+: T, 
builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("h", obj.head) + builder.addField("t", obj.tail) + builder.endObject() + } + } + + implicit val HNilFormat: JsonFormat[HNil] = asSingleton(HNil) + +} + +trait StreamFormat { self: BasicJsonProtocol => + def streamFormat[T](write: (T, OutputStream) => Unit, read: InputStream => T): JsonFormat[T] = { + lazy val byteArrayFormat = implicitly[JsonFormat[Array[Byte]]] + val toBytes = (t: T) => { val bos = new ByteArrayOutputStream(); write(t, bos); bos.toByteArray } + val fromBytes = (bs: Array[Byte]) => read(new ByteArrayInputStream(bs)) + + new JsonFormat[T] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = + fromBytes(byteArrayFormat.read(jsOpt, unbuilder)) + + override def write[J](obj: T, builder: Builder[J]): Unit = + byteArrayFormat.write(toBytes(obj), builder) + } + } +} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala index 78e6e301c..0ebdf134b 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -4,3 +4,8 @@ import sjsonnew.BasicJsonProtocol object CacheImplicits extends BasicCacheImplicits with BasicJsonProtocol + with FileFormat + with HListFormat + with URIFormat + with URLFormat + with StreamFormat From ab9165ab0425fab4406aabd6133da32bcff63c6f Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 25 Nov 2016 21:24:39 +0000 Subject: [PATCH 618/823] Remove formats already present upstream in sjson-new --- .../sbt/internal/util/AdditionalFormats.scala | 20 +------------------ .../sbt/internal/util/CacheImplicits.scala | 3 --- 2 files changed, 1 insertion(+), 22 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala 
b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala index 8008cf90a..7ec9ecf17 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -5,25 +5,7 @@ import sbt.internal.util.Types.:+: import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } import sjsonnew.BasicJsonProtocol, BasicJsonProtocol.asSingleton -import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, File, InputStream, OutputStream } - -import java.net.{ URI, URL } - -trait URIFormat { self: BasicJsonProtocol => - implicit def URIFormat: JsonFormat[URI] = project(_.toString, new URI(_: String)) -} - -trait URLFormat { self: BasicJsonProtocol => - implicit def URLFormat: JsonFormat[URL] = project(_.toString, new URL(_: String)) -} - -trait FileFormat { self: BasicJsonProtocol => - implicit def FileFormat: JsonFormat[File] = project(_.toString, new File(_: String)) -} - -trait SetFormat { self: BasicJsonProtocol => - implicit def SetFormat[T: JsonFormat]: JsonFormat[Set[T]] = project(_.toSeq, (_: Seq[T]).toSet) -} +import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, InputStream, OutputStream } trait HListFormat { implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala index 0ebdf134b..b3d660074 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -4,8 +4,5 @@ import sjsonnew.BasicJsonProtocol object CacheImplicits extends BasicCacheImplicits with BasicJsonProtocol - with FileFormat with HListFormat - with URIFormat - with URLFormat with StreamFormat From bcd5e800c4526e705b659bd743e9601d25c68172 Mon Sep 17 00:00:00 2001 From: 
Dale Wijnand Date: Mon, 28 Nov 2016 11:11:10 +0000 Subject: [PATCH 619/823] Remove InputStream/OutputStream support --- .../sbt/internal/util/AdditionalFormats.scala | 19 ------------------- .../sbt/internal/util/CacheImplicits.scala | 1 - 2 files changed, 20 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala index 7ec9ecf17..34fbbcab7 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala @@ -5,8 +5,6 @@ import sbt.internal.util.Types.:+: import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } import sjsonnew.BasicJsonProtocol, BasicJsonProtocol.asSingleton -import java.io.{ ByteArrayInputStream, ByteArrayOutputStream, InputStream, OutputStream } - trait HListFormat { implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = new JsonFormat[H :+: T] { @@ -33,21 +31,4 @@ trait HListFormat { } implicit val HNilFormat: JsonFormat[HNil] = asSingleton(HNil) - -} - -trait StreamFormat { self: BasicJsonProtocol => - def streamFormat[T](write: (T, OutputStream) => Unit, read: InputStream => T): JsonFormat[T] = { - lazy val byteArrayFormat = implicitly[JsonFormat[Array[Byte]]] - val toBytes = (t: T) => { val bos = new ByteArrayOutputStream(); write(t, bos); bos.toByteArray } - val fromBytes = (bs: Array[Byte]) => read(new ByteArrayInputStream(bs)) - - new JsonFormat[T] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = - fromBytes(byteArrayFormat.read(jsOpt, unbuilder)) - - override def write[J](obj: T, builder: Builder[J]): Unit = - byteArrayFormat.write(toBytes(obj), builder) - } - } } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala index 
b3d660074..00bb6beaa 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala @@ -5,4 +5,3 @@ import sjsonnew.BasicJsonProtocol object CacheImplicits extends BasicCacheImplicits with BasicJsonProtocol with HListFormat - with StreamFormat From 998cffd9ab304e6b36e019a861fd31d9ca1a50ed Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 28 Nov 2016 17:16:20 +0000 Subject: [PATCH 620/823] Change hlist format to serialise to flat array --- build.sbt | 2 + .../sbt/internal/util/AdditionalFormats.scala | 34 ----------------- .../scala/sbt/internal/util/HListFormat.scala | 37 +++++++++++++++++++ .../src/test/scala/HListFormatSpec.scala | 26 +++++++++++++ project/Dependencies.scala | 2 +- 5 files changed, 66 insertions(+), 35 deletions(-) delete mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala create mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala create mode 100644 internal/util-cache/src/test/scala/HListFormatSpec.scala diff --git a/build.sbt b/build.sbt index c08b7a214..5c41d937c 100644 --- a/build.sbt +++ b/build.sbt @@ -17,6 +17,8 @@ def commonSettings: Seq[Setting[_]] = Seq( scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), scalacOptions --= // scalac 2.10 rejects some HK types under -Xfuture it seems.. 
(CrossVersion partialVersion scalaVersion.value collect { case (2, 10) => List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") }).toList.flatten, + scalacOptions in console in Compile -= "-Ywarn-unused-import", + scalacOptions in console in Test -= "-Ywarn-unused-import", previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala b/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala deleted file mode 100644 index 34fbbcab7..000000000 --- a/internal/util-cache/src/main/scala/sbt/internal/util/AdditionalFormats.scala +++ /dev/null @@ -1,34 +0,0 @@ -package sbt.internal.util - -import sbt.internal.util.Types.:+: - -import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } -import sjsonnew.BasicJsonProtocol, BasicJsonProtocol.asSingleton - -trait HListFormat { - implicit def HConsFormat[H: JsonFormat, T <: HList: JsonFormat]: JsonFormat[H :+: T] = - new JsonFormat[H :+: T] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): H :+: T = - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) - val h = unbuilder.readField[H]("h") - val t = unbuilder.readField[T]("t") - unbuilder.endObject() - - HCons(h, t) - - case None => - deserializationError("Expect JValue but found None") - } - - override def write[J](obj: H :+: T, builder: Builder[J]): Unit = { - builder.beginObject() - builder.addField("h", obj.head) - builder.addField("t", obj.tail) - builder.endObject() - } - } - - implicit val HNilFormat: JsonFormat[HNil] = asSingleton(HNil) -} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala b/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala new file mode 100644 index 000000000..82ecf9ba3 --- /dev/null +++ 
b/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala
@@ -0,0 +1,37 @@
+package sbt.internal.util
+
+import sjsonnew._
+import Types.:+:
+
+trait HListFormat {
+  implicit val lnilFormat1: JsonFormat[HNil] = forHNil(HNil)
+  implicit val lnilFormat2: JsonFormat[HNil.type] = forHNil(HNil)
+
+  private def forHNil[A <: HNil](hnil: A): JsonFormat[A] = new JsonFormat[A] {
+    def write[J](x: A, builder: Builder[J]): Unit = {
+      if (builder.state != BuilderState.InArray) builder.beginArray()
+      builder.endArray()
+    }
+
+    def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): A = {
+      if (unbuilder.state == UnbuilderState.InArray) unbuilder.endArray()
+      hnil
+    }
+  }
+
+  implicit def hconsFormat[H, T <: HList](implicit hf: JsonFormat[H], tf: JsonFormat[T]): JsonFormat[H :+: T] =
+    new JsonFormat[H :+: T] {
+      def write[J](hcons: H :+: T, builder: Builder[J]) = {
+        if (builder.state != BuilderState.InArray) builder.beginArray()
+        hf.write(hcons.head, builder)
+        tf.write(hcons.tail, builder)
+      }
+
+      def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match {
+        case None => HCons(hf.read(None, unbuilder), tf.read(None, unbuilder))
+        case Some(js) =>
+          if (unbuilder.state != UnbuilderState.InArray) unbuilder.beginArray(js)
+          HCons(hf.read(Some(unbuilder.nextElement), unbuilder), tf.read(Some(js), unbuilder))
+      }
+    }
+}
diff --git a/internal/util-cache/src/test/scala/HListFormatSpec.scala b/internal/util-cache/src/test/scala/HListFormatSpec.scala
new file mode 100644
index 000000000..23e4cde0f
--- /dev/null
+++ b/internal/util-cache/src/test/scala/HListFormatSpec.scala
@@ -0,0 +1,26 @@
+package sbt.internal.util
+
+import scala.json.ast.unsafe._
+import sjsonnew._, support.scalajson.unsafe._
+import CacheImplicits._
+
+class HListFormatSpec extends UnitSpec {
+  val quux = 23 :+: "quux" :+: true :+: HNil
+
+  it should "round trip quux" in assertRoundTrip(quux)
+  it should "round trip hnil" in assertRoundTrip(HNil)
+
+  it should "have a flat structure for
quux" in assertJsonString(quux, """[23,"quux",true]""") + it should "have a flat structure for hnil" in assertJsonString(HNil, "[]") + + def assertRoundTrip[A: JsonWriter: JsonReader](x: A) = { + val jsonString: String = toJsonString(x) + val jValue: JValue = Parser.parseUnsafe(jsonString) + val y: A = Converter.fromJson[A](jValue).get + assert(x === y) + } + + def assertJsonString[A: JsonWriter](x: A, s: String) = assert(toJsonString(x) === s) + + def toJsonString[A: JsonWriter](x: A): String = CompactPrinter(Converter.toJson(x).get) +} diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 8e9f07b08..7ebe4c8ed 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -36,7 +36,7 @@ object Dependencies { lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - lazy val sjsonnewVersion = "0.4.2" + lazy val sjsonnewVersion = "0.5.1" lazy val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From 89a03b7dae92be54879c5a1a73708916e7c2deb3 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 29 Nov 2016 10:53:22 +0000 Subject: [PATCH 621/823] Scala 2.12.0 & sbt/io 1.0.0-M7 --- project/Dependencies.scala | 22 +++++++++++----------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 7ebe4c8ed..f9fd1706f 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -2,11 +2,11 @@ import sbt._ import Keys._ object Dependencies { - lazy val scala210 = "2.10.6" - lazy val scala211 = "2.11.8" - lazy val scala212 = "2.12.0-M4" + val scala210 = "2.10.6" + val scala211 = "2.11.8" + val scala212 = "2.12.0" - private lazy val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M6" + private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7" def getSbtModulePath(key: String, name: String) = { val localProps = new 
java.util.Properties() @@ -26,17 +26,17 @@ object Dependencies { def addSbtIO(p: Project): Project = addSbtModule(p, sbtIoPath, "io", sbtIO) - lazy val jline = "jline" % "jline" % "2.13" + val jline = "jline" % "jline" % "2.13" - lazy val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } - lazy val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } + val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } + val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.1" val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" - lazy val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" + val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - lazy val sjsonnewVersion = "0.5.1" - lazy val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion - lazy val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion + val sjsonnewVersion = "0.5.1" + val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion + val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From d86eab7a7a1e90082de121245cbd32397a1246ec Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 29 Nov 2016 11:07:36 +0000 Subject: [PATCH 622/823] sbt-doge 0.1.5 for plz --- project/doge.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/doge.sbt b/project/doge.sbt index fedea9490..e1274c941 100644 --- a/project/doge.sbt +++ b/project/doge.sbt @@ -1 +1 @@ -addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.3") +addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.5") From 875a30cc70acae2b3aff1219ae052879f23e3753 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 30 Nov 2016 15:03:38 +0000 Subject: [PATCH 623/823] Cleanup FileInfo --- 
.../scala/sbt/internal/util/FileInfo.scala | 145 ++++++++---------- 1 file changed, 61 insertions(+), 84 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index cf83d01dd..28cf7de86 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -3,115 +3,101 @@ */ package sbt.internal.util -import sbt.io.Hash - import java.io.File -import sjsonnew.{ Builder, deserializationError, JsonFormat, Unbuilder } +import sbt.io.Hash +import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } import CacheImplicits._ -sealed trait FileInfo { - def file: File -} - -sealed trait HashFileInfo extends FileInfo { - def hash: List[Byte] -} - -sealed trait ModifiedFileInfo extends FileInfo { - def lastModified: Long -} - -sealed trait PlainFileInfo extends FileInfo { - def exists: Boolean -} +sealed trait FileInfo { def file: File } +sealed trait HashFileInfo extends FileInfo { def hash: List[Byte] } +sealed trait ModifiedFileInfo extends FileInfo { def lastModified: Long } +sealed trait PlainFileInfo extends FileInfo { def exists: Boolean } sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo + private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo -object FileInfo { +final case class FilesInfo[F <: FileInfo] private (files: Set[F]) +object FilesInfo { + implicit def format[F <: FileInfo]: JsonFormat[FilesInfo[F]] = implicitly + def empty[F <: FileInfo]: FilesInfo[F] = FilesInfo(Set.empty[F]) +} +object FileInfo { 
sealed trait Style { type F <: FileInfo + implicit val format: JsonFormat[F] def apply(file: File): F - def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files map apply) def unapply(info: F): File = info.file - def unapply(infos: FilesInfo[F]): Set[File] = infos.files map (_.file) } object full extends Style { - override type F = HashModifiedFileInfo + type F = HashModifiedFileInfo - override implicit val format: JsonFormat[HashModifiedFileInfo] = new JsonFormat[HashModifiedFileInfo] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): HashModifiedFileInfo = - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) - val file = unbuilder.readField[File]("file") - val hash = unbuilder.readField[List[Byte]]("hash") - val lastModified = unbuilder.readField[Long]("lastModified") - unbuilder.endObject() - FileHashModified(file, hash, lastModified) - case None => - deserializationError("Expected JsObject but found None") - } - - override def write[J](obj: HashModifiedFileInfo, builder: Builder[J]): Unit = { + implicit val format: JsonFormat[HashModifiedFileInfo] = new JsonFormat[HashModifiedFileInfo] { + def write[J](obj: HashModifiedFileInfo, builder: Builder[J]) = { builder.beginObject() builder.addField("file", obj.file) builder.addField("hash", obj.hash) builder.addField("lastModified", obj.lastModified) builder.endObject() } + + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val hash = unbuilder.readField[List[Byte]]("hash") + val lastModified = unbuilder.readField[Long]("lastModified") + unbuilder.endObject() + FileHashModified(file, hash, lastModified) + case None => deserializationError("Expected JsObject but found None") + } } - override implicit def apply(file: File): HashModifiedFileInfo = + implicit def apply(file: File): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, Hash(file).toList, 
file.lastModified) } object hash extends Style { - override type F = HashFileInfo + type F = HashFileInfo - override implicit val format: JsonFormat[HashFileInfo] = new JsonFormat[HashFileInfo] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): HashFileInfo = - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) - val file = unbuilder.readField[File]("file") - val hash = unbuilder.readField[List[Byte]]("hash") - unbuilder.endObject() - FileHash(file, hash) - case None => - deserializationError("Expected JsObject but found None") - } - - override def write[J](obj: HashFileInfo, builder: Builder[J]): Unit = { + implicit val format: JsonFormat[HashFileInfo] = new JsonFormat[HashFileInfo] { + def write[J](obj: HashFileInfo, builder: Builder[J]) = { builder.beginObject() builder.addField("file", obj.file) builder.addField("hash", obj.hash) builder.endObject() } + + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val hash = unbuilder.readField[List[Byte]]("hash") + unbuilder.endObject() + FileHash(file, hash) + case None => deserializationError("Expected JsObject but found None") + } } - override implicit def apply(file: File): HashFileInfo = - FileHash(file.getAbsoluteFile, computeHash(file)) + implicit def apply(file: File): HashFileInfo = FileHash(file.getAbsoluteFile, computeHash(file)) - private def computeHash(file: File): List[Byte] = - try Hash(file).toList - catch { case _: Exception => Nil } + private def computeHash(file: File): List[Byte] = try Hash(file).toList catch { case _: Exception => Nil } } object lastModified extends Style { - override type F = ModifiedFileInfo + type F = ModifiedFileInfo - override implicit val format: JsonFormat[ModifiedFileInfo] = new JsonFormat[ModifiedFileInfo] { + implicit val format: JsonFormat[ModifiedFileInfo] = new JsonFormat[ModifiedFileInfo] { override def read[J](jsOpt: 
Option[J], unbuilder: Unbuilder[J]): ModifiedFileInfo = jsOpt match { case Some(js) => @@ -132,43 +118,34 @@ object FileInfo { } } - override implicit def apply(file: File): ModifiedFileInfo = - FileModified(file.getAbsoluteFile, file.lastModified) + implicit def apply(file: File): ModifiedFileInfo = FileModified(file.getAbsoluteFile, file.lastModified) } object exists extends Style { - override type F = PlainFileInfo + type F = PlainFileInfo - override implicit val format: JsonFormat[PlainFileInfo] = new JsonFormat[PlainFileInfo] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): PlainFileInfo = - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) - val file = unbuilder.readField[File]("file") - val exists = unbuilder.readField[Boolean]("exists") - unbuilder.endObject() - PlainFile(file, exists) - case None => - deserializationError("Expected JsObject but found None") - } - - override def write[J](obj: PlainFileInfo, builder: Builder[J]): Unit = { + implicit val format: JsonFormat[PlainFileInfo] = new JsonFormat[PlainFileInfo] { + def write[J](obj: PlainFileInfo, builder: Builder[J]): Unit = { builder.beginObject() builder.addField("file", obj.file) builder.addField("exists", obj.exists) builder.endObject() } + + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val file = unbuilder.readField[File]("file") + val exists = unbuilder.readField[Boolean]("exists") + unbuilder.endObject() + PlainFile(file, exists) + case None => deserializationError("Expected JsObject but found None") + } } - override implicit def apply(file: File): PlainFileInfo = { + implicit def apply(file: File): PlainFileInfo = { val abs = file.getAbsoluteFile PlainFile(abs, abs.exists) } } } - -final case class FilesInfo[F <: FileInfo] private (files: Set[F]) -object FilesInfo { - implicit def format[F <: FileInfo]: JsonFormat[FilesInfo[F]] = implicitly - def empty[F <: FileInfo]: FilesInfo[F] = 
FilesInfo(Set.empty[F]) -} From 71d104da3df26ae223d2b1f489a0b015e5f111eb Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 30 Nov 2016 15:03:53 +0000 Subject: [PATCH 624/823] Tweaks in FileInfo --- .../src/main/scala/sbt/internal/util/FileInfo.scala | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index 28cf7de86..b9d89b594 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -4,6 +4,7 @@ package sbt.internal.util import java.io.File +import scala.util.control.NonFatal import sbt.io.Hash import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } import CacheImplicits._ @@ -30,7 +31,7 @@ object FileInfo { sealed trait Style { type F <: FileInfo - implicit val format: JsonFormat[F] + implicit def format: JsonFormat[F] def apply(file: File): F def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files map apply) @@ -91,7 +92,7 @@ object FileInfo { implicit def apply(file: File): HashFileInfo = FileHash(file.getAbsoluteFile, computeHash(file)) - private def computeHash(file: File): List[Byte] = try Hash(file).toList catch { case _: Exception => Nil } + private def computeHash(file: File): List[Byte] = try Hash(file).toList catch { case NonFatal(_) => Nil } } object lastModified extends Style { From c6e793b03cbc0763d6af0ed45478b712df79950d Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 30 Nov 2016 15:04:27 +0000 Subject: [PATCH 625/823] Test FilesInfo brokenness --- .../src/test/scala/FileInfoSpec.scala | 23 +++++++++++++++++++ 1 file changed, 23 insertions(+) create mode 100644 internal/util-cache/src/test/scala/FileInfoSpec.scala diff --git a/internal/util-cache/src/test/scala/FileInfoSpec.scala b/internal/util-cache/src/test/scala/FileInfoSpec.scala new file mode 100644 index 
000000000..974956bc4
--- /dev/null
+++ b/internal/util-cache/src/test/scala/FileInfoSpec.scala
@@ -0,0 +1,23 @@
+package sbt.internal.util
+
+import scala.json.ast.unsafe._
+import sjsonnew._, support.scalajson.unsafe._
+
+class FileInfoSpec extends UnitSpec {
+  val file = new java.io.File(".")
+  val fileInfo: ModifiedFileInfo = FileModified(file, file.lastModified())
+  val filesInfo = FilesInfo(Set(fileInfo))
+
+  it should "round trip" in assertRoundTrip(filesInfo)
+
+  def assertRoundTrip[A: JsonWriter: JsonReader](x: A) = {
+    val jsonString: String = toJsonString(x)
+    val jValue: JValue = Parser.parseUnsafe(jsonString)
+    val y: A = Converter.fromJson[A](jValue).get
+    assert(x === y)
+  }
+
+  def assertJsonString[A: JsonWriter](x: A, s: String) = assert(toJsonString(x) === s)
+
+  def toJsonString[A: JsonWriter](x: A): String = CompactPrinter(Converter.toJson(x).get)
+}

From 1368f5f9dbcd03fe35b02bbbbc8d035635ffc520 Mon Sep 17 00:00:00 2001
From: Dale Wijnand
Date: Wed, 30 Nov 2016 16:21:37 +0000
Subject: [PATCH 626/823] Cleanup Input/Output/CacheStore

---
 .../scala/sbt/internal/util/CacheStore.scala | 88 +++++-------------
 .../main/scala/sbt/internal/util/Input.scala | 25 ++----
 .../main/scala/sbt/internal/util/Output.scala | 27 +++---
 3 files changed, 36 insertions(+), 104 deletions(-)

diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala
index 175525658..c7e2674b7 100644
--- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala
+++ b/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala
@@ -1,23 +1,17 @@
 package sbt.internal.util
+import java.io.{ File, InputStream, OutputStream }
+import sbt.io.syntax.fileToRichFile
+import sbt.io.{ IO, Using }
 import sjsonnew.{ IsoString, JsonReader, JsonWriter, SupportConverter }
-import java.io.{ File, InputStream, OutputStream }
-
-import sbt.io.{ IO, Using }
-import
sbt.io.syntax.fileToRichFile - -/** - * A `CacheStore` is used by the caching infrastructure to persist cached information. - */ +/** A `CacheStore` is used by the caching infrastructure to persist cached information. */ trait CacheStore extends Input with Output { /** Delete the persisted information. */ def delete(): Unit } -/** - * Factory that can derive new stores. - */ +/** Factory that can derive new stores. */ trait CacheStoreFactory { /** Create a new store. */ def derive(identifier: String): CacheStore @@ -26,74 +20,32 @@ trait CacheStoreFactory { def sub(identifier: String): CacheStoreFactory } -/** - * A factory that creates new stores persisted in `base`. - */ +/** A factory that creates new stores persisted in `base`. */ class DirectoryStoreFactory[J: IsoString](base: File, converter: SupportConverter[J]) extends CacheStoreFactory { - IO.createDirectory(base) - override def derive(identifier: String): CacheStore = - new FileBasedStore(base / identifier, converter) + def derive(identifier: String): CacheStore = new FileBasedStore(base / identifier, converter) - override def sub(identifier: String): CacheStoreFactory = - new DirectoryStoreFactory(base / identifier, converter) + def sub(identifier: String): CacheStoreFactory = new DirectoryStoreFactory(base / identifier, converter) } -/** - * A `CacheStore` that persists information in `file`. - */ +/** A `CacheStore` that persists information in `file`. 
*/ class FileBasedStore[J: IsoString](file: File, converter: SupportConverter[J]) extends CacheStore { - IO.touch(file, setModified = false) - override def delete(): Unit = - IO.delete(file) + def read[T: JsonReader]() = Using.fileInputStream(file)(stream => new PlainInput(stream, converter).read()) - override def read[T: JsonReader](): T = - Using.fileInputStream(file) { stream => - val input = new PlainInput(stream, converter) - input.read() - } - - override def read[T: JsonReader](default: => T): T = - try read[T]() - catch { case _: Exception => default } - - override def write[T: JsonWriter](value: T): Unit = - Using.fileOutputStream(append = false)(file) { stream => - val output = new PlainOutput(stream, converter) - output.write(value) - } - - override def close(): Unit = () + def write[T: JsonWriter](value: T) = + Using.fileOutputStream(append = false)(file)(stream => new PlainOutput(stream, converter).write(value)) + def delete() = IO.delete(file) + def close() = () } -/** - * A store that reads from `inputStream` and writes to `outputStream - */ +/** A store that reads from `inputStream` and writes to `outputStream`. 
*/ class StreamBasedStore[J: IsoString](inputStream: InputStream, outputStream: OutputStream, converter: SupportConverter[J]) extends CacheStore { - - override def delete(): Unit = () - - override def read[T: JsonReader](): T = { - val input = new PlainInput(inputStream, converter) - input.read() - } - - override def read[T: JsonReader](default: => T): T = - try read[T]() - catch { case _: Exception => default } - - override def write[T: JsonWriter](value: T): Unit = { - val output = new PlainOutput(outputStream, converter) - output.write(value) - } - - override def close(): Unit = { - inputStream.close() - outputStream.close() - } - -} \ No newline at end of file + def read[T: JsonReader]() = new PlainInput(inputStream, converter).read() + def write[T: JsonWriter](value: T) = new PlainOutput(outputStream, converter).write(value) + def delete() = () + def close() = { inputStream.close(); outputStream.close() } +} diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala index 75eb92463..3426c117d 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala @@ -1,20 +1,18 @@ package sbt.internal.util -import sbt.io.{ IO, Using } - import java.io.{ Closeable, InputStream } - -import scala.util.{ Failure, Success } - +import scala.util.control.NonFatal import sjsonnew.{ IsoString, JsonReader, SupportConverter } +import sbt.io.{ IO, Using } trait Input extends Closeable { def read[T: JsonReader](): T - def read[T: JsonReader](default: => T): T + def read[T: JsonReader](default: => T): T = try read[T]() catch { case NonFatal(_) => default } } class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J]) extends Input { val isoFormat: IsoString[J] = implicitly + private def readFully(): String = { Using.streamReader(input, IO.utf8) { reader => val builder = new StringBuilder() 
@@ -28,18 +26,7 @@ class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J } } - override def read[T: JsonReader](): T = { - val string = readFully() - val json = isoFormat.from(string) - converter.fromJson(json) match { - case Success(value) => value - case Failure(ex) => throw ex - } - } + def read[T: JsonReader]() = converter.fromJson(isoFormat.from(readFully())).get - override def read[T: JsonReader](default: => T): T = - try read[T]() - catch { case _: Exception => default } - - override def close(): Unit = input.close() + def close() = input.close() } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala b/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala index 6e99db9ac..0472adee4 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala @@ -1,12 +1,8 @@ package sbt.internal.util -import sbt.io.Using - import java.io.{ Closeable, OutputStream } - -import scala.util.{ Failure, Success } - import sjsonnew.{ IsoString, JsonWriter, SupportConverter } +import sbt.io.Using trait Output extends Closeable { def write[T: JsonWriter](value: T): Unit @@ -14,19 +10,16 @@ trait Output extends Closeable { class PlainOutput[J: IsoString](output: OutputStream, converter: SupportConverter[J]) extends Output { val isoFormat: IsoString[J] = implicitly - override def write[T: JsonWriter](value: T): Unit = { - converter.toJson(value) match { - case Success(js) => - val asString = isoFormat.to(js) - Using.bufferedOutputStream(output) { writer => - val out = new java.io.PrintWriter(writer) - out.print(asString) - out.flush() - } - case Failure(ex) => - throw ex + + def write[T: JsonWriter](value: T) = { + val js = converter.toJson(value).get + val asString = isoFormat.to(js) + Using.bufferedOutputStream(output) { writer => + val out = new java.io.PrintWriter(writer) + out.print(asString) + out.flush() } } - override def 
close(): Unit = output.close() + def close() = output.close() } From 92e90c559ba01a5acdaa2dadb37e7def13cf5782 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 30 Nov 2016 16:49:59 +0000 Subject: [PATCH 627/823] Fix stackoverflow in implicit FilesInfo JsonFormat Fixes #61 * move it back into style * use that in Tracked (uncomment some code) * specify the style in the spec test * must define absolute file.. (doesn't pass with relative, which is wrong imo) --- .../src/main/scala/sbt/internal/util/FileInfo.scala | 2 +- internal/util-cache/src/test/scala/FileInfoSpec.scala | 4 ++-- .../src/main/scala/sbt/internal/util/Tracked.scala | 4 ++-- 3 files changed, 5 insertions(+), 5 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index b9d89b594..b86068cfd 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -23,7 +23,6 @@ private final case class FileHashModified(file: File, hash: List[Byte], lastModi final case class FilesInfo[F <: FileInfo] private (files: Set[F]) object FilesInfo { - implicit def format[F <: FileInfo]: JsonFormat[FilesInfo[F]] = implicitly def empty[F <: FileInfo]: FilesInfo[F] = FilesInfo(Set.empty[F]) } @@ -32,6 +31,7 @@ object FileInfo { type F <: FileInfo implicit def format: JsonFormat[F] + implicit def formats: JsonFormat[FilesInfo[F]] = project(_.files, (fs: Set[F]) => FilesInfo(fs)) def apply(file: File): F def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files map apply) diff --git a/internal/util-cache/src/test/scala/FileInfoSpec.scala b/internal/util-cache/src/test/scala/FileInfoSpec.scala index 974956bc4..55ad18666 100644 --- a/internal/util-cache/src/test/scala/FileInfoSpec.scala +++ b/internal/util-cache/src/test/scala/FileInfoSpec.scala @@ -4,11 +4,11 @@ import scala.json.ast.unsafe._ import sjsonnew._, 
support.scalajson.unsafe._ class FileInfoSpec extends UnitSpec { - val file = new java.io.File(".") + val file = new java.io.File(".").getAbsoluteFile val fileInfo: ModifiedFileInfo = FileModified(file, file.lastModified()) val filesInfo = FilesInfo(Set(fileInfo)) - it should "round trip" in assertRoundTrip(filesInfo) + it should "round trip" in assertRoundTrip(filesInfo)(FileInfo.lastModified.formats, FileInfo.lastModified.formats) def assertRoundTrip[A: JsonWriter: JsonReader](x: A) = { val jsonString: String = toJsonString(x) diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala index 47f0b0f00..4db2acc0c 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala @@ -131,7 +131,7 @@ class Difference(val store: CacheStore, val style: FileInfo.Style, val defineCle } private def clearCache() = store.delete() - private def cachedFilesInfo = store.read(default = FilesInfo.empty[style.F]).files //(style.formats).files + private def cachedFilesInfo = store.read(default = FilesInfo.empty[style.F])(style.formats).files private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = @@ -164,7 +164,7 @@ class Difference(val store: CacheStore, val style: FileInfo.Style, val defineCle val result = f(report) val info = if (filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo - store.write(info) + store.write(info)(style.formats) result } From ac14fc8de88d81475a4f2c4726c0a9f0924b063a Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 1 Dec 2016 14:38:40 +0000 Subject: [PATCH 628/823] Find a way to give FilesInfo an implicit JsonFormat --- .../main/scala/sbt/internal/util/FileInfo.scala | 16 ++++++++++++++++ .../util-cache/src/test/scala/FileInfoSpec.scala | 2 +- 2 files changed, 17 insertions(+), 
1 deletion(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala index b86068cfd..5f4a3fd1e 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala @@ -16,6 +16,19 @@ sealed trait PlainFileInfo extends FileInfo { def exists: Boolean } sealed trait HashModifiedFileInfo extends HashFileInfo with ModifiedFileInfo +object HashFileInfo { + implicit val format: JsonFormat[HashFileInfo] = FileInfo.hash.format +} +object ModifiedFileInfo { + implicit val format: JsonFormat[ModifiedFileInfo] = FileInfo.lastModified.format +} +object PlainFileInfo { + implicit val format: JsonFormat[PlainFileInfo] = FileInfo.exists.format +} +object HashModifiedFileInfo { + implicit val format: JsonFormat[HashModifiedFileInfo] = FileInfo.full.format +} + private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo @@ -24,6 +37,9 @@ private final case class FileHashModified(file: File, hash: List[Byte], lastModi final case class FilesInfo[F <: FileInfo] private (files: Set[F]) object FilesInfo { def empty[F <: FileInfo]: FilesInfo[F] = FilesInfo(Set.empty[F]) + + implicit def format[F <: FileInfo: JsonFormat]: JsonFormat[FilesInfo[F]] = + project(_.files, (fs: Set[F]) => FilesInfo(fs)) } object FileInfo { diff --git a/internal/util-cache/src/test/scala/FileInfoSpec.scala b/internal/util-cache/src/test/scala/FileInfoSpec.scala index 55ad18666..ed7b4ec28 100644 --- a/internal/util-cache/src/test/scala/FileInfoSpec.scala +++ b/internal/util-cache/src/test/scala/FileInfoSpec.scala @@ -8,7 +8,7 @@ class FileInfoSpec extends UnitSpec { val fileInfo: ModifiedFileInfo = FileModified(file, 
file.lastModified()) val filesInfo = FilesInfo(Set(fileInfo)) - it should "round trip" in assertRoundTrip(filesInfo)(FileInfo.lastModified.formats, FileInfo.lastModified.formats) + it should "round trip" in assertRoundTrip(filesInfo) def assertRoundTrip[A: JsonWriter: JsonReader](x: A) = { val jsonString: String = toJsonString(x) From 9a7abcb9c1b845e3a52e626f266f18bf9416661b Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 30 Nov 2016 12:13:51 +0000 Subject: [PATCH 629/823] sbt/io upgrade --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index f9fd1706f..c35a8020a 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,7 @@ object Dependencies { val scala211 = "2.11.8" val scala212 = "2.12.0" - private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7" + private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7-6c64b6b5b29e4e12e95b09ceda6d2e8dd6092f00" def getSbtModulePath(key: String, name: String) = { val localProps = new java.util.Properties() From 10183dcf3d4fc984b02d275ec80071cdf2ae99dd Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 6 Dec 2016 10:05:29 +0000 Subject: [PATCH 630/823] Upgrade sjson-new --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index c35a8020a..3c1f7df76 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -36,7 +36,7 @@ object Dependencies { val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnewVersion = "0.5.1" + val sjsonnewVersion = "6.0.0" val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From 11efe6846f9f85e02bc09efa0786f09025b85701 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 6 Dec 2016 11:27:17 +0000 Subject: 
[PATCH 631/823] new sjson --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 3c1f7df76..0e7040ab1 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -36,7 +36,7 @@ object Dependencies { val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnewVersion = "6.0.0" + val sjsonnewVersion = "0.6.0-dnw" val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From 65ffbfd1f24f843a591a575607da779aa8a526e1 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 6 Dec 2016 15:34:03 +0000 Subject: [PATCH 632/823] Add mavenLocal --- build.sbt | 1 + 1 file changed, 1 insertion(+) diff --git a/build.sbt b/build.sbt index 5c41d937c..5fa4b8919 100644 --- a/build.sbt +++ b/build.sbt @@ -10,6 +10,7 @@ def commonSettings: Seq[Setting[_]] = Seq( // publishArtifact in packageDoc := false, resolvers += Resolver.typesafeIvyRepo("releases"), resolvers += Resolver.sonatypeRepo("snapshots"), + resolvers += Resolver.mavenLocal, // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), From b0536f1a4d31f3cc5f4723a01dcf6ffaa06eee15 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 15 Dec 2016 18:02:39 +0000 Subject: [PATCH 633/823] new sjson-new --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 0e7040ab1..2c9535d9c 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -36,7 +36,7 @@ object Dependencies { val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnewVersion = 
"0.6.0-dnw" + val sjsonnewVersion = "0.6.1-dnw" val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From 7688de459b613e7d16db232c42d44fba0662203d Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 19 Dec 2016 15:23:51 +0000 Subject: [PATCH 634/823] Fix JsonFormat[HList] Introduce HListJF, used as an inner TC so the outer HCons/HNil JsonFormat can manage when the array starts and ends. --- .../scala/sbt/internal/util/HListFormat.scala | 45 +++++++++++++++---- 1 file changed, 37 insertions(+), 8 deletions(-) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala b/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala index 82ecf9ba3..6594896c8 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala +++ b/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala @@ -9,29 +9,58 @@ trait HListFormat { private def forHNil[A <: HNil](hnil: A): JsonFormat[A] = new JsonFormat[A] { def write[J](x: A, builder: Builder[J]): Unit = { - if (builder.state != BuilderState.InArray) builder.beginArray() + builder.beginArray() builder.endArray() } - def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): A = { - if (unbuilder.state == UnbuilderState.InArray) unbuilder.endArray() - hnil + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): A = jsOpt match { + case None => hnil + case Some(js) => unbuilder.beginArray(js); unbuilder.endArray(); hnil } } - implicit def hconsFormat[H, T <: HList](implicit hf: JsonFormat[H], tf: JsonFormat[T]): JsonFormat[H :+: T] = + implicit def hconsFormat[H, T <: HList](implicit hf: JsonFormat[H], tf: HListJF[T]): JsonFormat[H :+: T] = new JsonFormat[H :+: T] { def write[J](hcons: H :+: T, builder: Builder[J]) = { - if (builder.state != BuilderState.InArray) builder.beginArray() + builder.beginArray() hf.write(hcons.head, builder) tf.write(hcons.tail, builder) 
+ builder.endArray() } def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { case None => HCons(hf.read(None, unbuilder), tf.read(None, unbuilder)) case Some(js) => - if (unbuilder.state != UnbuilderState.InArray) unbuilder.beginArray(js) - HCons(hf.read(Some(unbuilder.nextElement), unbuilder), tf.read(Some(js), unbuilder)) + unbuilder.beginArray(js) + val hcons = HCons(hf.read(Some(unbuilder.nextElement), unbuilder), tf.read(Some(js), unbuilder)) + unbuilder.endArray() + hcons } } + + trait HListJF[A <: HList] { + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): A + def write[J](obj: A, builder: Builder[J]): Unit + } + + implicit def hconsHListJF[H, T <: HList](implicit hf: JsonFormat[H], tf: HListJF[T]): HListJF[H :+: T] = + new HListJF[H :+: T] { + def write[J](hcons: H :+: T, builder: Builder[J]) = { + hf.write(hcons.head, builder) + tf.write(hcons.tail, builder) + } + + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { + case None => HCons(hf.read(None, unbuilder), tf.read(None, unbuilder)) + case Some(js) => HCons(hf.read(Some(unbuilder.nextElement), unbuilder), tf.read(Some(js), unbuilder)) + } + } + + implicit val lnilHListJF1: HListJF[HNil] = hnilHListJF(HNil) + implicit val lnilHListJF2: HListJF[HNil.type] = hnilHListJF(HNil) + + implicit def hnilHListJF[A <: HNil](hnil: A): HListJF[A] = new HListJF[A] { + def write[J](hcons: A, builder: Builder[J]) = () + def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = hnil + } } From 5014e7f6b7f8cee0aada45b941782a1cef98dd94 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 22 Dec 2016 11:19:48 -0500 Subject: [PATCH 635/823] sjson-new 0.7.0 --- build.sbt | 2 +- project/Dependencies.scala | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index 5fa4b8919..4ac7d0e02 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def 
baseVersion: String = "0.1.0-M15" +def baseVersion: String = "0.1.0-M16" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 2c9535d9c..fe0fb53a5 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,7 @@ object Dependencies { val scala211 = "2.11.8" val scala212 = "2.12.0" - private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7-6c64b6b5b29e4e12e95b09ceda6d2e8dd6092f00" + private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7" def getSbtModulePath(key: String, name: String) = { val localProps = new java.util.Properties() @@ -36,7 +36,7 @@ object Dependencies { val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnewVersion = "0.6.1-dnw" + val sjsonnewVersion = "0.7.0" val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion } From 61bdfd4367b4da9acd74b6a3f238002af8de3a49 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 22 Dec 2016 11:50:34 -0500 Subject: [PATCH 636/823] Update Eval https://github.com/typelevel/cats/blob/e2335730a958ce605f0d75f1b0d838454 2336aaf/core/src/main/scala/cats/Eval.scala --- .../src/main/scala/sbt/util/Eval.scala | 63 ++++++++++++++----- 1 file changed, 47 insertions(+), 16 deletions(-) diff --git a/internal/util-collection/src/main/scala/sbt/util/Eval.scala b/internal/util-collection/src/main/scala/sbt/util/Eval.scala index 8c142e336..abfb38070 100644 --- a/internal/util-collection/src/main/scala/sbt/util/Eval.scala +++ b/internal/util-collection/src/main/scala/sbt/util/Eval.scala @@ -5,9 +5,37 @@ import scala.annotation.tailrec // Copied from Cats (MIT license) /** - * Eval is a datatype, which controls evaluation. + * Eval is a monad which controls evaluation. 
+ * + * This type wraps a value (or a computation that produces a value) + * and can produce it on command via the `.value` method. + * + * There are three basic evaluation strategies: + * + * - Now: evaluated immediately + * - Later: evaluated once when value is needed + * - Always: evaluated every time value is needed + * + * The Later and Always are both lazy strategies while Now is eager. + * Later and Always are distinguished from each other only by + * memoization: once evaluated Later will save the value to be returned + * immediately if it is needed again. Always will run its computation + * every time. + * + * Eval supports stack-safe lazy computation via the .map and .flatMap + * methods, which use an internal trampoline to avoid stack overflows. + * Computation done within .map and .flatMap is always done lazily, + * even when applied to a Now instance. + * + * It is not generally good style to pattern-match on Eval instances. + * Rather, use .map and .flatMap to chain computation, and use .value + * to get the result when needed. It is also not good style to create + * Eval instances whose computation involves calling .value on another + * Eval instance -- this can defeat the trampolining and lead to stack + * overflows. */ -sealed abstract class Eval[A] extends Serializable { self => +sealed abstract class Eval[+A] extends Serializable { self => + /** * Evaluate the computation and return an A value. * @@ -15,7 +43,7 @@ sealed abstract class Eval[A] extends Serializable { self => * will be performed at this point. For eager instances (Now), a * value will be immediately returned. 
*/ - def get: A + def value: A /** * Transform an Eval[A] into an Eval[B] given the transformation @@ -47,8 +75,11 @@ sealed abstract class Eval[A] extends Serializable { self => case c: Eval.Compute[A] => new Eval.Compute[B] { type Start = c.Start - val start = c.start - val run = (s: c.Start) => + // See https://issues.scala-lang.org/browse/SI-9931 for an explanation + // of why the type annotations are necessary in these two lines on + // Scala 2.12.0. + val start: () => Eval[Start] = c.start + val run: Start => Eval[B] = (s: c.Start) => new Eval.Compute[B] { type Start = A val start = () => c.run(s) @@ -87,7 +118,7 @@ sealed abstract class Eval[A] extends Serializable { self => * This type should be used when an A value is already in hand, or * when the computation to produce an A value is pure and very fast. */ -final case class Now[A](get: A) extends Eval[A] { +final case class Now[A](value: A) extends Eval[A] { def memoize: Eval[A] = this } @@ -115,7 +146,7 @@ final class Later[A](f: () => A) extends Eval[A] { // // (For situations where `f` is small, but the output will be very // expensive to store, consider using `Always`.) - lazy val get: A = { + lazy val value: A = { val result = thunk() thunk = null // scalastyle:off result @@ -139,7 +170,7 @@ object Later { * caching must be avoided. Generally, prefer Later. */ final class Always[A](f: () => A) extends Eval[A] { - def get: A = f() + def value: A = f() def memoize: Eval[A] = new Later(f) } @@ -193,8 +224,8 @@ object Eval { * they will be automatically created when needed. */ sealed abstract class Call[A](val thunk: () => Eval[A]) extends Eval[A] { - def memoize: Eval[A] = new Later(() => get) - def get: A = Call.loop(this).get + def memoize: Eval[A] = new Later(() => value) + def value: A = Call.loop(this).value } object Call { @@ -229,16 +260,16 @@ object Eval { * * Unlike a traditional trampoline, the internal workings of the * trampoline are not exposed. 
This allows a slightly more efficient - * implementation of the .get method. + * implementation of the .value method. */ sealed abstract class Compute[A] extends Eval[A] { type Start val start: () => Eval[Start] val run: Start => Eval[A] - def memoize: Eval[A] = Later(get) + def memoize: Eval[A] = Later(value) - def get: A = { + def value: A = { type L = Eval[Any] type C = Any => Eval[Any] @tailrec def loop(curr: L, fs: List[C]): Any = @@ -251,12 +282,12 @@ object Eval { cc.run.asInstanceOf[C] :: c.run.asInstanceOf[C] :: fs ) case xx => - loop(c.run(xx.get).asInstanceOf[L], fs) + loop(c.run(xx.value), fs) } case x => fs match { - case f :: fs => loop(f(x.get), fs) - case Nil => x.get + case f :: fs => loop(f(x.value), fs) + case Nil => x.value } } loop(this.asInstanceOf[L], Nil).asInstanceOf[A] From 2573c0f0921433f741801bf635bbb88da5f3b26a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 22 Dec 2016 11:50:53 -0500 Subject: [PATCH 637/823] Scala 2.12 --- .java-version | 2 +- .travis.yml | 9 +++++++-- build.sbt | 11 +++++++---- project/Dependencies.scala | 6 +++--- project/house.sbt | 2 +- 5 files changed, 19 insertions(+), 11 deletions(-) diff --git a/.java-version b/.java-version index d3bdbdf1f..625934097 100644 --- a/.java-version +++ b/.java-version @@ -1 +1 @@ -1.7 +1.8 diff --git a/.travis.yml b/.travis.yml index 895984e79..e19b3364a 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,6 +1,11 @@ language: scala -scala: 2.11.8 +scala: + - 2.11.8 + - 2.12.1 script: - - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "so test" + - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "plz $TRAVIS_SCALA_VERSION test" + +jdk: + - oraclejdk8 diff --git a/build.sbt b/build.sbt index 4ac7d0e02..472e444b2 100644 --- a/build.sbt +++ b/build.sbt @@ -14,13 +14,13 @@ def commonSettings: Seq[Setting[_]] = Seq( // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), 
javacOptions in compile ++= Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), - crossScalaVersions := Seq(scala211), + crossScalaVersions := Seq(scala211, scala212), scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), scalacOptions --= // scalac 2.10 rejects some HK types under -Xfuture it seems.. (CrossVersion partialVersion scalaVersion.value collect { case (2, 10) => List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") }).toList.flatten, scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", - previousArtifact := None, // Some(organization.value %% moduleName.value % "1.0.0"), + mimaPreviousArtifacts := Set(), // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false ) @@ -154,8 +154,11 @@ lazy val utilScripted = (project in internalPath / "util-scripted"). commonSettings, name := "Util Scripted", libraryDependencies ++= { - if (scalaVersion.value startsWith "2.11") Seq(parserCombinator211) - else Seq() + scalaVersion.value match { + case sv if sv startsWith "2.11" => Seq(parserCombinator211) + case sv if sv startsWith "2.12" => Seq(parserCombinator211) + case _ => Seq() + } } ). 
configure(addSbtIO) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index fe0fb53a5..31dcb5381 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -4,7 +4,7 @@ import Keys._ object Dependencies { val scala210 = "2.10.6" val scala211 = "2.11.8" - val scala212 = "2.12.0" + val scala212 = "2.12.1" private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7" @@ -31,8 +31,8 @@ object Dependencies { val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } - val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.1" - val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" + val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.4" + val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" diff --git a/project/house.sbt b/project/house.sbt index 555559b37..bad061ebe 100644 --- a/project/house.sbt +++ b/project/house.sbt @@ -1 +1 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.1") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.2") From 008f9bee2ee5651fe4bf1b88ca38bf019c124723 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 22 Dec 2016 14:34:37 -0500 Subject: [PATCH 638/823] Work around Scala 2.12 init deadlock (SI-9824) --- build.sbt | 2 +- .../util-collection/src/test/scala/SettingsExample.scala | 6 +++--- .../util-collection/src/test/scala/SettingsTest.scala | 9 ++++++--- 3 files changed, 10 insertions(+), 7 deletions(-) diff --git a/build.sbt b/build.sbt index 472e444b2..58897af1e 100644 --- a/build.sbt +++ b/build.sbt @@ -13,7 +13,7 @@ def commonSettings: Seq[Setting[_]] = Seq( resolvers += Resolver.mavenLocal, // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), - javacOptions in compile ++= 
Seq("-target", "6", "-source", "6", "-Xlint", "-Xlint:-serial"), + javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala211, scala212), scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), scalacOptions --= // scalac 2.10 rejects some HK types under -Xfuture it seems.. diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index 0dd910773..cf65d6c68 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -10,7 +10,7 @@ final case class Scope(nestIndex: Int, idAtIndex: Int = 0) // Lots of type constructors would become binary, which as you may know requires lots of type lambdas // when you want a type function with only one parameter. // That would be a general pain.) -object SettingsExample extends Init[Scope] { +case class SettingsExample() extends Init[Scope] { // Provides a way of showing a Scope+AttributeKey[_] val showFullKey: Show[ScopedKey[_]] = new Show[ScopedKey[_]] { def apply(key: ScopedKey[_]) = s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" @@ -30,8 +30,8 @@ object SettingsExample extends Init[Scope] { /** Usage Example **/ -object SettingsUsage { - import SettingsExample._ +case class SettingsUsage(val settingsExample: SettingsExample) { + import settingsExample._ // Define some keys val a = AttributeKey[Int]("a") diff --git a/internal/util-collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala index 85a3760ee..3e0bf0c0f 100644 --- a/internal/util-collection/src/test/scala/SettingsTest.scala +++ b/internal/util-collection/src/test/scala/SettingsTest.scala @@ -2,10 +2,12 @@ package sbt.internal.util import org.scalacheck._ import Prop._ -import SettingsUsage._ -import SettingsExample._ object SettingsTest extends Properties("settings") { + val settingsExample: 
SettingsExample = SettingsExample() + import settingsExample._ + val settingsUsage = SettingsUsage(settingsExample) + import settingsUsage._ import scala.reflect.Manifest @@ -126,7 +128,7 @@ object SettingsTest extends Properties("settings") { // Each project defines an initial value, but the update is defined in globalKey. // However, the derived Settings that come from this should be scoped in each project. val settings: Seq[Setting[_]] = - derive(setting(globalDerivedKey, SettingsExample.map(globalKey)(_ + 1))) +: projectKeys.map(pk => setting(pk, value(0))) + derive(setting(globalDerivedKey, settingsExample.map(globalKey)(_ + 1))) +: projectKeys.map(pk => setting(pk, value(0))) val ev = evaluate(settings) // Also check that the key has no value at the "global" scope val props = for { pk <- projectDerivedKeys } yield checkKey(pk, Some(1), ev) @@ -184,6 +186,7 @@ object SettingsTest extends Properties("settings") { } // This setup is a workaround for module synchronization issues final class CCR(intermediate: Int) { + import SettingsTest.settingsExample._ lazy val top = iterate(value(intermediate), intermediate) def iterate(init: Initialize[Int], i: Int): Initialize[Int] = bind(init) { t => From 496e8d3e4fb03e2fd828e6bec75cf01e2c5f348f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 22 Dec 2016 22:30:45 -0500 Subject: [PATCH 639/823] Cross building --- build.sbt | 14 +++++++++----- 1 file changed, 9 insertions(+), 5 deletions(-) diff --git a/build.sbt b/build.sbt index 58897af1e..bc11d37f1 100644 --- a/build.sbt +++ b/build.sbt @@ -15,9 +15,13 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala211, scala212), - scalacOptions ++= Seq("-Ywarn-unused", "-Ywarn-unused-import"), - scalacOptions --= // scalac 2.10 rejects some HK types under -Xfuture it seems.. 
- (CrossVersion partialVersion scalaVersion.value collect { case (2, 10) => List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") }).toList.flatten, + scalacOptions := { + val old = scalacOptions.value + scalaVersion.value match { + case sv if sv.startsWith("2.10") => old diff List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") + case _ => old ++ List("-Ywarn-unused", "-Ywarn-unused-import") + } + }, scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", mimaPreviousArtifacts := Set(), // Some(organization.value %% moduleName.value % "1.0.0"), @@ -97,7 +101,7 @@ lazy val utilLogging = (project in internalPath / "util-logging"). dependsOn(utilInterface, utilTesting % Test). settings( commonSettings, - crossScalaVersions := Seq(scala210, scala211), + crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", libraryDependencies += jline ) @@ -143,7 +147,7 @@ lazy val utilTracking = (project in internalPath / "util-tracking"). lazy val utilTesting = (project in internalPath / "util-testing"). 
settings( commonSettings, - crossScalaVersions := Seq(scala210, scala211), + crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Testing", libraryDependencies ++= Seq(scalaCheck, scalatest) ) From b7fefb367ff523e3d898d946a3d8f561b5d84e05 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 3 Jan 2017 10:28:08 +0000 Subject: [PATCH 640/823] Bump sbt/io to 1.0.0-M8, w/ fix to IO.relativize --- project/Dependencies.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 31dcb5381..f020e4384 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,9 @@ object Dependencies { val scala211 = "2.11.8" val scala212 = "2.12.1" - private val sbtIO = "org.scala-sbt" %% "io" % "1.0.0-M7" + private val ioVersion = "1.0.0-M9" + + private val sbtIO = "org.scala-sbt" %% "io" % ioVersion def getSbtModulePath(key: String, name: String) = { val localProps = new java.util.Properties() From 36eeb4578d7233d6a0d8873efa0ca42ae7e76307 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 13 Jan 2017 02:49:12 -0500 Subject: [PATCH 641/823] Reimplement multi-logger using log4j2 This introduces ManagedLogger, which is a wrapper around Log4j2's async logging. Log4j2 separates the notion of logger (the code that collects events) and appender (the code that acts on events). The old code is kept around intentionally to minimize breakage during transition. 
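[Editor's note, not part of the patch] The commit message above describes the key design idea behind ManagedLogger: Log4j2 separates the logger (the code that collects events) from the appender (the code that acts on events). As a rough conceptual sketch of that split in plain Scala — all names here (`Appender`, `EventLogger`, `ConsoleAppender`) are illustrative stand-ins, not sbt's or Log4j2's real API — the separation looks like this:

```scala
// Conceptual model of the logger/appender split described in the commit
// message. A logger only collects events and forwards them; what happens
// to each event (console, file, network) is decided by the appenders
// attached to it, which can be swapped without touching logging call sites.
trait Appender {
  def append(level: String, message: String): Unit
}

final class ConsoleAppender extends Appender {
  def append(level: String, message: String): Unit =
    println(s"[$level] $message")
}

final class EventLogger(appenders: Seq[Appender]) {
  def info(message: String): Unit = appenders.foreach(_.append("info", message))
  def warn(message: String): Unit = appenders.foreach(_.append("warn", message))
}

object Demo extends App {
  val log = new EventLogger(Seq(new ConsoleAppender))
  log.info("compilation started")
}
```

This mirrors why the old `ConsoleLogger` could be renamed to `ConsoleAppender` in the diff below: under this model the console-writing code is an appender, one of possibly many sinks behind a single managed logger.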
--- build.sbt | 9 +- .../sbt/internal/util/AbstractEntry.scala | 27 +++ .../sbt/internal/util/ChannelLogEntry.scala | 51 ++++++ .../util/codec/AbstractEntryFormats.scala | 10 ++ .../util/codec/ChannelLogEntryFormats.scala | 33 ++++ .../internal/util/codec/JsonProtocol.scala | 10 ++ .../src/main/contraband/logging.contra | 16 ++ ...soleLogger.scala => ConsoleAppender.scala} | 164 +++++++++++++++--- .../scala/sbt/internal/util/ConsoleOut.scala | 2 +- .../sbt/internal/util/GlobalLogging.scala | 30 +++- .../scala/sbt/internal/util/MainLogging.scala | 77 +++++--- .../sbt/internal/util/ManagedLogger.scala | 25 +++ .../scala/sbt/internal/util/MultiLogger.scala | 2 +- .../src/main/scala/sbt/util/LogExchange.scala | 75 ++++++++ .../resources/log4j2.component.properties | 1 + .../util-logging/src/test/scala/Escapes.scala | 2 +- .../src/test/scala/ManagedLoggerSpec.scala | 52 ++++++ project/Dependencies.scala | 7 +- project/contraband.sbt | 1 + 19 files changed, 533 insertions(+), 61 deletions(-) create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala create mode 100644 internal/util-logging/src/main/contraband/logging.contra rename internal/util-logging/src/main/scala/sbt/internal/util/{ConsoleLogger.scala => ConsoleAppender.scala} (55%) create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala create mode 100644 internal/util-logging/src/main/scala/sbt/util/LogExchange.scala create mode 100644 
internal/util-logging/src/test/resources/log4j2.component.properties create mode 100644 internal/util-logging/src/test/scala/ManagedLoggerSpec.scala create mode 100644 project/contraband.sbt diff --git a/build.sbt b/build.sbt index bc11d37f1..74535786f 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "0.1.0-M16" +def baseVersion: String = "1.0.0-M18" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -98,12 +98,14 @@ lazy val utilComplete = (project in internalPath / "util-complete"). // logging lazy val utilLogging = (project in internalPath / "util-logging"). + enablePlugins(ContrabandPlugin, JsonCodecPlugin). dependsOn(utilInterface, utilTesting % Test). settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", - libraryDependencies += jline + libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson), + sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala" ) // Relation @@ -150,7 +152,8 @@ lazy val utilTesting = (project in internalPath / "util-testing"). crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Testing", libraryDependencies ++= Seq(scalaCheck, scalatest) - ) + ). + configure(addSbtIO) lazy val utilScripted = (project in internalPath / "util-scripted"). dependsOn(utilLogging). diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala new file mode 100644 index 000000000..1c08e3cad --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala @@ -0,0 +1,27 @@ +/** + * This code is generated using sbt-datatype. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +abstract class AbstractEntry( + val channelName: Option[String], + val execId: Option[String]) extends Serializable { + + + + + override def equals(o: Any): Boolean = o match { + case x: AbstractEntry => (this.channelName == x.channelName) && (this.execId == x.execId) + case _ => false + } + override def hashCode: Int = { + 37 * (37 * (17 + channelName.##) + execId.##) + } + override def toString: String = { + "AbstractEntry(" + channelName + ", " + execId + ")" + } +} +object AbstractEntry { + +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala new file mode 100644 index 000000000..b7350efee --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala @@ -0,0 +1,51 @@ +/** + * This code is generated using sbt-datatype. + */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +final class ChannelLogEntry private ( + val level: String, + val message: String, + channelName: Option[String], + execId: Option[String]) extends sbt.internal.util.AbstractEntry(channelName, execId) with Serializable { + + + + override def equals(o: Any): Boolean = o match { + case x: ChannelLogEntry => (this.level == x.level) && (this.message == x.message) && (this.channelName == x.channelName) && (this.execId == x.execId) + case _ => false + } + override def hashCode: Int = { + 37 * (37 * (37 * (37 * (17 + level.##) + message.##) + channelName.##) + execId.##) + } + override def toString: String = { + "ChannelLogEntry(" + level + ", " + message + ", " + channelName + ", " + execId + ")" + } + protected[this] def copy(level: String = level, message: String = message, channelName: Option[String] = channelName, execId: Option[String] = execId): ChannelLogEntry = { + new ChannelLogEntry(level, message, channelName, execId) + } + def 
withLevel(level: String): ChannelLogEntry = { + copy(level = level) + } + def withMessage(message: String): ChannelLogEntry = { + copy(message = message) + } + def withChannelName(channelName: Option[String]): ChannelLogEntry = { + copy(channelName = channelName) + } + def withChannelName(channelName: String): ChannelLogEntry = { + copy(channelName = Option(channelName)) + } + def withExecId(execId: Option[String]): ChannelLogEntry = { + copy(execId = execId) + } + def withExecId(execId: String): ChannelLogEntry = { + copy(execId = Option(execId)) + } +} +object ChannelLogEntry { + + def apply(level: String, message: String, channelName: Option[String], execId: Option[String]): ChannelLogEntry = new ChannelLogEntry(level, message, channelName, execId) + def apply(level: String, message: String, channelName: String, execId: String): ChannelLogEntry = new ChannelLogEntry(level, message, Option(channelName), Option(execId)) +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala new file mode 100644 index 000000000..b797af060 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala @@ -0,0 +1,10 @@ +/** + * This code is generated using sbt-datatype. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ deserializationError, serializationError, Builder, JsonFormat, Unbuilder } +trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.ChannelLogEntryFormats => +implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat1[sbt.internal.util.AbstractEntry, sbt.internal.util.ChannelLogEntry]("type") +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala new file mode 100644 index 000000000..4c8d666d9 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala @@ -0,0 +1,33 @@ +/** + * This code is generated using sbt-datatype. + */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ deserializationError, serializationError, Builder, JsonFormat, Unbuilder } +trait ChannelLogEntryFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val ChannelLogEntryFormat: JsonFormat[sbt.internal.util.ChannelLogEntry] = new JsonFormat[sbt.internal.util.ChannelLogEntry] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ChannelLogEntry = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val level = unbuilder.readField[String]("level") + val message = unbuilder.readField[String]("message") + val channelName = unbuilder.readField[Option[String]]("channelName") + val execId = unbuilder.readField[Option[String]]("execId") + unbuilder.endObject() + sbt.internal.util.ChannelLogEntry(level, message, channelName, execId) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: sbt.internal.util.ChannelLogEntry, builder: Builder[J]): Unit = { + 
builder.beginObject() + builder.addField("level", obj.level) + builder.addField("message", obj.message) + builder.addField("channelName", obj.channelName) + builder.addField("execId", obj.execId) + builder.endObject() + } +} +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala new file mode 100644 index 000000000..a2bfe0f25 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -0,0 +1,10 @@ +/** + * This code is generated using sbt-datatype. + */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +trait JsonProtocol extends sjsonnew.BasicJsonProtocol + with sbt.internal.util.codec.ChannelLogEntryFormats + with sbt.internal.util.codec.AbstractEntryFormats +object JsonProtocol extends JsonProtocol \ No newline at end of file diff --git a/internal/util-logging/src/main/contraband/logging.contra b/internal/util-logging/src/main/contraband/logging.contra new file mode 100644 index 000000000..085044ed8 --- /dev/null +++ b/internal/util-logging/src/main/contraband/logging.contra @@ -0,0 +1,16 @@ +package sbt.internal.util +@target(Scala) +@codecPackage("sbt.internal.util.codec") +@fullCodec("JsonProtocol") + +interface AbstractEntry { + channelName: String + execId: String +} + +type ChannelLogEntry implements sbt.internal.util.AbstractEntry { + level: String! + message: String! 
+ channelName: String + execId: String +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala similarity index 55% rename from internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 9e562a770..e39acb1dc 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -1,13 +1,69 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2009, 2010, 2011 Mark Harrah - */ package sbt.internal.util import sbt.util._ import java.io.{ PrintStream, PrintWriter } import java.util.Locale +import java.util.concurrent.atomic.AtomicInteger +import org.apache.logging.log4j.{ Level => XLevel } +import org.apache.logging.log4j.message.{ Message, ParameterizedMessage, ObjectMessage } +import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } +import org.apache.logging.log4j.core.appender.AbstractAppender +import org.apache.logging.log4j.core.layout.PatternLayout +import org.apache.logging.log4j.core.async.RingBufferLogEvent + +import ConsoleAppender._ object ConsoleLogger { + // These are provided so other modules do not break immediately. 
+ @deprecated("Use ConsoleAppender.", "0.13.x") + final val ESC = ConsoleAppender.ESC + @deprecated("Use ConsoleAppender.", "0.13.x") + private[sbt] def isEscapeTerminator(c: Char): Boolean = ConsoleAppender.isEscapeTerminator(c) + @deprecated("Use ConsoleAppender.", "0.13.x") + def hasEscapeSequence(s: String): Boolean = ConsoleAppender.hasEscapeSequence(s) + @deprecated("Use ConsoleAppender.", "0.13.x") + def removeEscapeSequences(s: String): String = ConsoleAppender.removeEscapeSequences(s) + @deprecated("Use ConsoleAppender.", "0.13.x") + val formatEnabled = ConsoleAppender.formatEnabled + @deprecated("Use ConsoleAppender.", "0.13.x") + val noSuppressedMessage = ConsoleAppender.noSuppressedMessage + + def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) + def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) + def apply(out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = ConsoleAppender.formatEnabled, + useColor: Boolean = ConsoleAppender.formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = ConsoleAppender.noSuppressedMessage): ConsoleLogger = + new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) +} + +/** + * A logger that logs to the console. On supported systems, the level labels are + * colored. 
+ */ +class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger { + private[sbt] val appender = ConsoleAppender(generateName, out, ansiCodesSupported, useColor, suppressedMessage) + + override def control(event: ControlEvent.Value, message: => String): Unit = + appender.control(event, message) + override def log(level: Level.Value, message: => String): Unit = + { + if (atLevel(level)) { + appender.appendLog(level, message) + } + } + + override def success(message: => String): Unit = + { + if (successEnabled) { + appender.success(message) + } + } + override def trace(t: => Throwable): Unit = + appender.trace(t, getTrace) + + override def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } +} + +object ConsoleAppender { /** Escape character, used to introduce an escape sequence. */ final val ESC = '\u001B' @@ -122,25 +178,83 @@ object ConsoleLogger { private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase(Locale.ENGLISH).indexOf("windows") >= 0 - def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) - def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) - def apply(out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = formatEnabled, - useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = noSuppressedMessage): ConsoleLogger = - new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) + def apply(out: PrintStream): ConsoleAppender = apply(generateName, ConsoleOut.printStreamOut(out)) + def apply(out: PrintWriter): ConsoleAppender = apply(generateName, ConsoleOut.printWriterOut(out)) + def apply(name: String = generateName, out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = formatEnabled, + 
useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = noSuppressedMessage): ConsoleAppender = + { + val appender = new ConsoleAppender(name, out, ansiCodesSupported, useColor, suppressedMessage) + appender.start + appender + } + def generateName: String = + "out-" + generateId.incrementAndGet + private val generateId: AtomicInteger = new AtomicInteger private[this] val EscapeSequence = (27.toChar + "[^@-~]*[@-~]").r def stripEscapeSequences(s: String): String = EscapeSequence.pattern.matcher(s).replaceAll("") + + def toLevel(level: XLevel): Level.Value = + level match { + case XLevel.OFF => Level.Debug + case XLevel.FATAL => Level.Error + case XLevel.ERROR => Level.Error + case XLevel.WARN => Level.Warn + case XLevel.INFO => Level.Info + case XLevel.DEBUG => Level.Debug + case _ => Level.Debug + } + def toXLevel(level: Level.Value): XLevel = + level match { + case Level.Error => XLevel.ERROR + case Level.Warn => XLevel.WARN + case Level.Info => XLevel.INFO + case Level.Debug => XLevel.DEBUG + } } +// See http://stackoverflow.com/questions/24205093/how-to-create-a-custom-appender-in-log4j2 +// for custom appender using Java. +// http://logging.apache.org/log4j/2.x/manual/customconfig.html +// https://logging.apache.org/log4j/2.x/log4j-core/apidocs/index.html + /** * A logger that logs to the console. On supported systems, the level labels are * colored. * * This logger is not thread-safe. 
*/ -class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger { +class ConsoleAppender private[ConsoleAppender] ( + val name: String, + val out: ConsoleOut, + val ansiCodesSupported: Boolean, + val useColor: Boolean, + val suppressedMessage: SuppressedTraceContext => Option[String] +) extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { import scala.Console.{ BLUE, GREEN, RED, RESET, YELLOW } + + def append(event: XLogEvent): Unit = + { + val level = ConsoleAppender.toLevel(event.getLevel) + val message = event.getMessage + val str = messageToString(message) + appendLog(level, str) + } + + def messageToString(msg: Message): String = + msg match { + case p: ParameterizedMessage => p.getFormattedMessage + case r: RingBufferLogEvent => r.getFormattedMessage + case o: ObjectMessage => objectToString(o.getParameter) + case _ => msg.toString + } + def objectToString(o: AnyRef): String = + o match { + case x: ChannelLogEntry => x.message + case _ => o.toString + } + def messageColor(level: Level.Value) = RESET def labelColor(level: Level.Value) = level match { @@ -148,24 +262,29 @@ class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val an case Level.Warn => YELLOW case _ => RESET } - def successLabelColor = GREEN - def successMessageColor = RESET - override def success(message: => String): Unit = { - if (successEnabled) - log(successLabelColor, Level.SuccessLabel, successMessageColor, message) + + // success is called by ConsoleLogger. + // This should turn into an event. 
+ private[sbt] def success(message: => String): Unit = { + appendLog(successLabelColor, Level.SuccessLabel, successMessageColor, message) } - def trace(t: => Throwable): Unit = + private[sbt] def successLabelColor = GREEN + private[sbt] def successMessageColor = RESET + + def trace(t: => Throwable, traceLevel: Int): Unit = out.lockObject.synchronized { - val traceLevel = getTrace if (traceLevel >= 0) out.print(StackTrace.trimmed(t, traceLevel)) if (traceLevel <= 2) for (msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useColor))) printLabeledLine(labelColor(Level.Error), "trace", messageColor(Level.Error), msg) } - def log(level: Level.Value, message: => String): Unit = { - if (atLevel(level)) - log(labelColor(level), level.toString, messageColor(level), message) + + def control(event: ControlEvent.Value, message: => String): Unit = + appendLog(labelColor(Level.Info), Level.Info.toString, BLUE, message) + + def appendLog(level: Level.Value, message: => String): Unit = { + appendLog(labelColor(level), level.toString, messageColor(level), message) } private def reset(): Unit = setColor(RESET) @@ -173,7 +292,7 @@ class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val an if (ansiCodesSupported && useColor) out.lockObject.synchronized { out.print(color) } } - private def log(labelColor: String, label: String, messageColor: String, message: String): Unit = + private def appendLog(labelColor: String, label: String, messageColor: String, message: String): Unit = out.lockObject.synchronized { for (line <- message.split("""\n""")) printLabeledLine(labelColor, label, messageColor, line) @@ -189,10 +308,9 @@ class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val an setColor(messageColor) out.print(line) reset() + out.print(s" ($name)") out.println() } - - def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } - def control(event: ControlEvent.Value, message: => 
String): Unit = log(labelColor(Level.Info), Level.Info.toString, BLUE, message) } + final class SuppressedTraceContext(val traceLevel: Int, val useColor: Boolean) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index 30da238da..72fa01594 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -33,7 +33,7 @@ object ConsoleOut { def println(s: String): Unit = synchronized { current.append(s); println() } def println(): Unit = synchronized { val s = current.toString - if (ConsoleLogger.formatEnabled && last.exists(lmsg => f(s, lmsg))) + if (ConsoleAppender.formatEnabled && last.exists(lmsg => f(s, lmsg))) lockObject.print(OverwriteLine) lockObject.println(s) last = Some(s) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala index 191408393..1dcf9d9f9 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala @@ -5,6 +5,7 @@ package sbt.internal.util import sbt.util._ import java.io.{ File, PrintWriter } +import org.apache.logging.log4j.core.Appender /** * Provides the current global logging configuration. @@ -15,7 +16,10 @@ import java.io.{ File, PrintWriter } * `backing` tracks the files that persist the global logging. * `newLogger` creates a new global logging configuration from a sink and backing configuration. 
*/ -final case class GlobalLogging(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking, newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging) +final case class GlobalLogging(full: ManagedLogger, console: ConsoleOut, backed: Appender, + backing: GlobalLogBacking, newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging) + +final case class GlobalLogging1(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking, newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging1) /** * Tracks the files that persist the global logging. @@ -38,10 +42,28 @@ final case class GlobalLogBacking(file: File, last: Option[File], newBackingFile object GlobalLogBacking { def apply(newBackingFile: => File): GlobalLogBacking = GlobalLogBacking(newBackingFile, None, newBackingFile _) } + object GlobalLogging { - def initial(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = + import java.util.concurrent.atomic.AtomicInteger + private def generateName: String = + "GlobalLogging" + generateId.incrementAndGet + private val generateId: AtomicInteger = new AtomicInteger + + def initial1(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging1, newBackingFile: => File, console: ConsoleOut): GlobalLogging1 = { val log = ConsoleLogger(console) - GlobalLogging(log, console, log, GlobalLogBacking(newBackingFile), newLogger) + GlobalLogging1(log, console, log, GlobalLogBacking(newBackingFile), newLogger) } -} \ No newline at end of file + + def initial(newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging = + { + val loggerName = generateName + val log = LogExchange.logger(loggerName) + val appender = ConsoleAppender(ConsoleAppender.generateName, console) + LogExchange.bindLoggerAppenders( + loggerName, List(appender -> Level.Info) + ) + 
GlobalLogging(log, console, appender, GlobalLogBacking(newBackingFile), newAppender) + } +} + diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index 6cea97edb..75eb08298 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -2,42 +2,65 @@ package sbt.internal.util import sbt.util._ import java.io.PrintWriter +import org.apache.logging.log4j.core.Appender -object MainLogging { - def multiLogger(config: MultiLoggerConfig): Logger = +object MainAppender { + import java.util.concurrent.atomic.AtomicInteger + private def generateGlobalBackingName: String = + "GlobalBacking" + generateId.incrementAndGet + private val generateId: AtomicInteger = new AtomicInteger + + def multiLogger(log: ManagedLogger, config: MainAppenderConfig): ManagedLogger = { import config._ - val multi = new MultiLogger(console :: backed :: extra) - // sets multi to the most verbose for clients that inspect the current level - multi setLevel Level.unionAll(backingLevel :: screenLevel :: extra.map(_.getLevel)) - // set the specific levels - console setLevel screenLevel - backed setLevel backingLevel - console setTrace screenTrace - backed setTrace backingTrace - multi: Logger + // TODO + // console setTrace screenTrace + // backed setTrace backingTrace + // multi: Logger + + // val log = LogExchange.logger(loggerName) + LogExchange.unbindLoggerAppenders(log.name) + LogExchange.bindLoggerAppenders( + log.name, + (consoleOpt.toList map { _ -> screenLevel }) ::: + List(backed -> backingLevel) ::: + (extra map { x => (x -> Level.Info) }) + ) + log } - def globalDefault(console: ConsoleOut): (PrintWriter, GlobalLogBacking) => GlobalLogging = + def globalDefault(console: ConsoleOut): (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging = { - lazy val f: (PrintWriter, 
GlobalLogBacking) => GlobalLogging = (writer, backing) => { - val backed = defaultBacked()(writer) - val full = multiLogger(defaultMultiConfig(console, backed)) - GlobalLogging(full, console, backed, backing, f) + lazy val newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging = (log, writer, backing) => { + val backed: Appender = defaultBacked(generateGlobalBackingName)(writer) + val full = multiLogger(log, defaultMultiConfig(Option(console), backed, Nil)) + GlobalLogging(full, console, backed, backing, newAppender) } - f + newAppender } - def defaultMultiConfig(console: ConsoleOut, backing: AbstractLogger): MultiLoggerConfig = - new MultiLoggerConfig(defaultScreen(console, ConsoleLogger.noSuppressedMessage), backing, Nil, Level.Info, Level.Debug, -1, Int.MaxValue) + def defaultMultiConfig(consoleOpt: Option[ConsoleOut], backing: Appender, extra: List[Appender]): MainAppenderConfig = + MainAppenderConfig(consoleOpt map { defaultScreen(_, ConsoleAppender.noSuppressedMessage) }, backing, extra, + Level.Info, Level.Debug, -1, Int.MaxValue) + def defaultScreen(console: ConsoleOut): Appender = ConsoleAppender(ConsoleAppender.generateName, console) + def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): Appender = + ConsoleAppender(ConsoleAppender.generateName, console, suppressedMessage = suppressedMessage) + def defaultScreen(name: String, console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): Appender = + ConsoleAppender(name, console, suppressedMessage = suppressedMessage) - def defaultScreen(console: ConsoleOut): AbstractLogger = ConsoleLogger(console) - def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): AbstractLogger = - ConsoleLogger(console, suppressedMessage = suppressedMessage) + def defaultBacked( + loggerName: String = generateGlobalBackingName, + useColor: Boolean = ConsoleAppender.formatEnabled + ): PrintWriter 
=> Appender = + to => { + ConsoleAppender( + ConsoleAppender.generateName, + ConsoleOut.printWriterOut(to), useColor = useColor + ) + } - def defaultBacked(useColor: Boolean = ConsoleLogger.formatEnabled): PrintWriter => ConsoleLogger = - to => ConsoleLogger(ConsoleOut.printWriterOut(to), useColor = useColor) + final case class MainAppenderConfig( + consoleOpt: Option[Appender], backed: Appender, extra: List[Appender], + screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int + ) } - -final case class MultiLoggerConfig(console: AbstractLogger, backed: AbstractLogger, extra: List[AbstractLogger], - screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala new file mode 100644 index 000000000..90c84b6aa --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -0,0 +1,25 @@ +package sbt.internal.util + +import sbt.util._ +import org.apache.logging.log4j.{ Logger => XLogger } +import org.apache.logging.log4j.message.ObjectMessage + +/** + * Delegates log events to the associated LogExchange. 
+ */ +class ManagedLogger( + val name: String, + val channelName: Option[String], + val execId: Option[String], + xlogger: XLogger +) extends Logger { + override def trace(t: => Throwable): Unit = () // exchange.appendLog(new Trace(t)) + override def log(level: Level.Value, message: => String): Unit = + { + xlogger.log( + ConsoleAppender.toXLevel(level), + new ObjectMessage(ChannelLogEntry(level.toString, message, channelName, execId)) + ) + } + override def success(message: => String): Unit = xlogger.info(message) +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index 84168fd27..c5f7d1103 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -41,7 +41,7 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { private[this] def removeEscapes(event: LogEvent): LogEvent = { - import ConsoleLogger.{ removeEscapeSequences => rm } + import ConsoleAppender.{ removeEscapeSequences => rm } event match { case s: Success => new Success(rm(s.msg)) case l: Log => new Log(l.level, rm(l.msg)) diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala new file mode 100644 index 000000000..0f967da61 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -0,0 +1,75 @@ +package sbt.util + +import sbt.internal.util._ +import org.apache.logging.log4j.{ LogManager => XLogManager, Level => XLevel } +import org.apache.logging.log4j.core._ +import org.apache.logging.log4j.core.appender.AsyncAppender +import org.apache.logging.log4j.core.config.{ AppenderRef, LoggerConfig } +import scala.collection.JavaConverters._ + +// http://logging.apache.org/log4j/2.x/manual/customconfig.html +// 
https://logging.apache.org/log4j/2.x/log4j-core/apidocs/index.html + +sealed abstract class LogExchange { + private[sbt] lazy val context: LoggerContext = init() + private[sbt] lazy val asyncStdout: AsyncAppender = buildAsyncStdout + + def logger(name: String): ManagedLogger = logger(name, None, None) + def logger(name: String, channelName: Option[String], execId: Option[String]): ManagedLogger = { + val _ = context + val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } + val config = ctx.getConfiguration + val loggerConfig = LoggerConfig.createLogger(false, XLevel.DEBUG, name, + "true", Array[AppenderRef](), null, config, null) + config.addLogger(name, loggerConfig) + ctx.updateLoggers + val logger = ctx.getLogger(name) + new ManagedLogger(name, channelName, execId, logger) + } + def unbindLoggerAppenders(loggerName: String): Unit = { + val lc = loggerConfig(loggerName) + lc.getAppenders.asScala foreach { + case (k, v) => lc.removeAppender(k) + } + } + def bindLoggerAppenders(loggerName: String, appenders: List[(Appender, Level.Value)]): Unit = { + val lc = loggerConfig(loggerName) + appenders foreach { + case (x, lv) => lc.addAppender(x, ConsoleAppender.toXLevel(lv), null) + } + } + def loggerConfig(loggerName: String): LoggerConfig = { + val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } + val config = ctx.getConfiguration + config.getLoggerConfig(loggerName) + } + private[sbt] def buildAsyncStdout: AsyncAppender = { + val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } + val config = ctx.getConfiguration + // val layout = PatternLayout.newBuilder + // .withPattern(PatternLayout.SIMPLE_CONVERSION_PATTERN) + // .build + val appender = ConsoleAppender("Stdout") + // CustomConsoleAppenderImpl.createAppender("Stdout", layout, null, null) + appender.start + config.addAppender(appender) + val asyncAppender: AsyncAppender = (AsyncAppender.newBuilder(): AsyncAppender.Builder) + 
.setName("AsyncStdout") + .setAppenderRefs(Array(AppenderRef.createAppenderRef("Stdout", XLevel.DEBUG, null))) + .setBlocking(false) + .setConfiguration(config) + .build + asyncAppender.start + config.addAppender(asyncAppender) + asyncAppender + } + private[sbt] def init(): LoggerContext = { + import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilderFactory + import org.apache.logging.log4j.core.config.Configurator + val builder = ConfigurationBuilderFactory.newConfigurationBuilder + builder.setConfigurationName("sbt.util.logging") + val ctx = Configurator.initialize(builder.build()) + ctx match { case x: LoggerContext => x } + } +} +object LogExchange extends LogExchange diff --git a/internal/util-logging/src/test/resources/log4j2.component.properties b/internal/util-logging/src/test/resources/log4j2.component.properties new file mode 100644 index 000000000..ee7c90784 --- /dev/null +++ b/internal/util-logging/src/test/resources/log4j2.component.properties @@ -0,0 +1 @@ +Log4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala index 9078f4d59..a226e4d3b 100644 --- a/internal/util-logging/src/test/scala/Escapes.scala +++ b/internal/util-logging/src/test/scala/Escapes.scala @@ -4,7 +4,7 @@ import org.scalacheck._ import Prop._ import Gen.{ listOf, oneOf } -import ConsoleLogger.{ ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences } +import ConsoleAppender.{ ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences } object Escapes extends Properties("Escapes") { property("genTerminator only generates terminators") = diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala new file mode 100644 index 000000000..d063fbf2c --- /dev/null +++ b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -0,0 
+1,52 @@ +package sbt.internal.util + +import org.scalatest._ +import sbt.util._ +import java.io.{ File, PrintWriter } +import sbt.io.Using + +class ManagedLoggerSpec extends FlatSpec with Matchers { + "ManagedLogger" should "log to console" in { + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + log.info("test") + log.debug("test") + } + + "global logging" should "log immediately after initialization" in { + // this is passed into State normally + val global0 = initialGlobalLogging + val full = global0.full + (1 to 3).toList foreach { x => full.info(s"test$x") } + } + + // This is done in Mainloop.scala + it should "create a new backing with newAppender" in { + val global0 = initialGlobalLogging + val logBacking0 = global0.backing + val global1 = Using.fileWriter(append = true)(logBacking0.file) { writer => + val out = new PrintWriter(writer) + val g = global0.newAppender(global0.full, out, logBacking0) + val full = g.full + (1 to 3).toList foreach { x => full.info(s"newAppender $x") } + assert(logBacking0.file.exists) + g + } + val logBacking1 = global1.backing + Using.fileWriter(append = true)(logBacking1.file) { writer => + val out = new PrintWriter(writer) + val g = global1.newAppender(global1.full, out, logBacking1) + val full = g.full + (1 to 3).toList foreach { x => full.info(s"newAppender $x") } + // println(logBacking.file) + // print("Press enter to continue. 
") + // System.console.readLine + assert(logBacking1.file.exists) + } + } + + val console = ConsoleOut.systemOut + def initialGlobalLogging: GlobalLogging = GlobalLogging.initial( + MainAppender.globalDefault(console), File.createTempFile("sbt", ".log"), console + ) +} diff --git a/project/Dependencies.scala b/project/Dependencies.scala index f020e4384..20e317e57 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -35,10 +35,15 @@ object Dependencies { val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.4" val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" - val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" val sjsonnewVersion = "0.7.0" val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion + + def log4jVersion = "2.7" + val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion + val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion + val log4jSlf4jImpl = "org.apache.logging.log4j" % "log4j-slf4j-impl" % log4jVersion + val disruptor = "com.lmax" % "disruptor" % "3.3.6" } diff --git a/project/contraband.sbt b/project/contraband.sbt new file mode 100644 index 000000000..88961b8f9 --- /dev/null +++ b/project/contraband.sbt @@ -0,0 +1 @@ +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M3") From ef2d0794947c9078fa6d4cc88b9513828e45d078 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 16 Jan 2017 13:26:31 -0500 Subject: [PATCH 642/823] Revert "Merge pull request #41 from eed3si9n/wip/2469" This reverts commit 0da2f30ee8b895933c3fd88d2401fc82d1a3e01a, reversing changes made to 93418589b7a0839baec551a45f6525000af03d4d. 
--- .../sbt/internal/util/complete/Parser.scala | 26 ------------------- .../src/test/scala/ParserTest.scala | 9 ------- 2 files changed, 35 deletions(-) diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala index 67651bfbd..003862c5e 100644 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala @@ -121,9 +121,6 @@ sealed trait RichParser[A] { /** Applies the original parser, applies `f` to the result to get the next parser, and applies that parser and uses its result for the overall result. */ def flatMap[B](f: A => Parser[B]): Parser[B] - - /** Applied both the original parser and `b` on the same input and returns the results produced by each parser */ - def combinedWith(b: Parser[A]): Parser[Seq[A]] } /** Contains Parser implementation helper methods not typically needed for using parsers. 
*/ @@ -298,11 +295,6 @@ trait ParserMain { def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) def flatMap[B](f: A => Parser[B]) = bindParser(a, f) - def combinedWith(b: Parser[A]): Parser[Seq[A]] = - if (a.valid) - if (b.valid) new CombiningParser(a, b) else a.map(Seq(_)) - else - b.map(Seq(_)) } implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) @@ -616,24 +608,6 @@ private final class HetParser[A, B](a: Parser[A], b: Parser[B]) extends ValidPar def completions(level: Int) = a.completions(level) ++ b.completions(level) override def toString = "(" + a + " || " + b + ")" } -private final class CombiningParser[T](a: Parser[T], b: Parser[T]) extends ValidParser[Seq[T]] { - lazy val result: Option[Seq[T]] = (a.result.toSeq ++ b.result.toSeq) match { case Seq() => None; case seq => Some(seq) } - def completions(level: Int) = a.completions(level) ++ b.completions(level) - def derive(i: Char) = - (a.valid, b.valid) match { - case (true, true) => new CombiningParser(a derive i, b derive i) - case (true, false) => a derive i map (Seq(_)) - case (false, true) => b derive i map (Seq(_)) - case (false, false) => new Invalid(mkFailure("No valid parser available.")) - } - def resultEmpty = - (a.resultEmpty, b.resultEmpty) match { - case (Value(ra), Value(rb)) => Value(Seq(ra, rb)) - case (Value(ra), _) => Value(Seq(ra)) - case (_, Value(rb)) => Value(Seq(rb)) - case _ => Value(Nil) - } -} private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] { assert(a.nonEmpty) lazy val resultEmpty: Result[Seq[T]] = diff --git a/internal/util-complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala index 34e35efbe..1db99b513 100644 --- a/internal/util-complete/src/test/scala/ParserTest.scala +++ b/internal/util-complete/src/test/scala/ParserTest.scala @@ -108,15 
+108,6 @@ object ParserTest extends Properties("Completing Parser") { property("repeatDep requires at least one token") = !matches(repeat, "") property("repeatDep accepts one token") = matches(repeat, colors.toSeq.head) property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) - property("combined parser gives completion of both parsers") = { - val prefix = "fix" - val p1Suffixes = Set("", "ated", "ation") - val p2Suffixes = Set("es", "ed") - val p1: Parser[String] = p1Suffixes map (suffix => (prefix + suffix): Parser[String]) reduce (_ | _) - val p2: Parser[String] = p2Suffixes map (suffix => (prefix + suffix): Parser[String]) reduce (_ | _) - val suggestions: Set[Completion] = p1Suffixes ++ p2Suffixes map (new Suggestion(_)) - checkAll(prefix, p1 combinedWith p2, Completions(suggestions)) - } } object ParserExample { val ws = charClass(_.isWhitespace).+ From ca6a0be6028b5fc542cb0c128103d03ed8bf8bfd Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 18 Jan 2017 06:54:15 -0500 Subject: [PATCH 643/823] Handle ReusableObjectMessage When log4j2 is not using async logging, it sends the ObjectMessage using ReusableObjectMessage. 
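The dispatch change this commit makes to `ConsoleAppender.messageToString` can be sketched in isolation. The message types below are hypothetical, stripped-down stand-ins for the log4j2 interfaces in `org.apache.logging.log4j.message`, used only to make the fallback logic self-contained; in real log4j2, `ReusableObjectMessage` does not extend `ObjectMessage` (it is mutated and reused by the garbage-free path), which is why a formatter matching `ObjectMessage` alone falls through to the formatted-string branch and needs the extra case:

```java
// Hypothetical stand-ins for the log4j2 message types, reduced to what the
// dispatch logic needs. The real classes live in org.apache.logging.log4j.message.
interface Message { String getFormattedMessage(); }

final class ObjectMessage implements Message {
    private final Object parameter;
    ObjectMessage(Object parameter) { this.parameter = parameter; }
    Object getParameter() { return parameter; }
    public String getFormattedMessage() { return String.valueOf(parameter); }
}

// In real log4j2 this instance is mutated and reused between log calls,
// and it does not share an object-bearing supertype with ObjectMessage.
final class ReusableObjectMessage implements Message {
    private Object parameter;
    void set(Object parameter) { this.parameter = parameter; }
    Object getParameter() { return parameter; }
    public String getFormattedMessage() { return String.valueOf(parameter); }
}

class MessageDispatch {
    // Mirrors the shape of ConsoleAppender.messageToString after this commit:
    // unwrap the payload when the message carries an object, whichever wrapper
    // delivered it, otherwise fall back to the formatted text.
    static String messageToString(Message msg) {
        if (msg instanceof ObjectMessage) {
            return objectToString(((ObjectMessage) msg).getParameter());
        } else if (msg instanceof ReusableObjectMessage) {
            return objectToString(((ReusableObjectMessage) msg).getParameter());
        }
        return msg.getFormattedMessage();
    }

    // Placeholder for the appender's object rendering.
    static String objectToString(Object o) { return "payload:" + o; }

    public static void main(String[] args) {
        ReusableObjectMessage reusable = new ReusableObjectMessage();
        reusable.set("event");
        System.out.println(messageToString(new ObjectMessage("event"))); // prints "payload:event"
        System.out.println(messageToString(reusable));                   // prints "payload:event"
    }
}
```

Because the two wrappers expose `getParameter` independently rather than through a shared interface, matching on a common supertype is not an option; the commit instead adds a parallel case for `ReusableObjectMessage`, and the final fallback switches from `msg.toString` to `msg.getFormattedMessage`.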
--- build.sbt | 4 ++-- .../scala/sbt/internal/util/ConsoleAppender.scala | 11 ++++++----- 2 files changed, 8 insertions(+), 7 deletions(-) diff --git a/build.sbt b/build.sbt index 74535786f..becdb12cf 100644 --- a/build.sbt +++ b/build.sbt @@ -2,11 +2,11 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "1.0.0-M18" +def baseVersion: String = "1.0.0-M19" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( - scalaVersion := scala211, + scalaVersion := scala212, // publishArtifact in packageDoc := false, resolvers += Resolver.typesafeIvyRepo("releases"), resolvers += Resolver.sonatypeRepo("snapshots"), diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index e39acb1dc..3c9bf0e9e 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -5,7 +5,7 @@ import java.io.{ PrintStream, PrintWriter } import java.util.Locale import java.util.concurrent.atomic.AtomicInteger import org.apache.logging.log4j.{ Level => XLevel } -import org.apache.logging.log4j.message.{ Message, ParameterizedMessage, ObjectMessage } +import org.apache.logging.log4j.message.{ Message, ParameterizedMessage, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } import org.apache.logging.log4j.core.appender.AbstractAppender import org.apache.logging.log4j.core.layout.PatternLayout @@ -244,10 +244,11 @@ class ConsoleAppender private[ConsoleAppender] ( def messageToString(msg: Message): String = msg match { - case p: ParameterizedMessage => p.getFormattedMessage - case r: RingBufferLogEvent => r.getFormattedMessage - case o: ObjectMessage => objectToString(o.getParameter) - case _ => msg.toString + case p: ParameterizedMessage 
=> p.getFormattedMessage + case r: RingBufferLogEvent => r.getFormattedMessage + case o: ObjectMessage => objectToString(o.getParameter) + case o: ReusableObjectMessage => objectToString(o.getParameter) + case _ => msg.getFormattedMessage } def objectToString(o: AnyRef): String = o match { From 08e9ce95260ed438edde51579e67f305fbbd817c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 23 Jan 2017 16:38:14 -0500 Subject: [PATCH 644/823] Implement basic event logging --- .../sbt/internal/util/ConsoleAppender.scala | 5 +++-- .../scala/sbt/internal/util/ManagedLogger.scala | 17 +++++++++++++++++ .../sbt/internal/util/ObjectLogEntry.scala | 17 +++++++++++++++++ .../src/main/scala/sbt/util/LogExchange.scala | 10 ++++++++++ .../src/test/scala/ManagedLoggerSpec.scala | 7 +++++++ 5 files changed, 54 insertions(+), 2 deletions(-) create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 3c9bf0e9e..e46fea4b8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -252,8 +252,9 @@ class ConsoleAppender private[ConsoleAppender] ( } def objectToString(o: AnyRef): String = o match { - case x: ChannelLogEntry => x.message - case _ => o.toString + case x: ChannelLogEntry => x.message + case x: ObjectLogEntry[_] => x.message.toString + case _ => o.toString } def messageColor(level: Level.Value) = RESET diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 90c84b6aa..e3c110e3b 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ 
b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -3,6 +3,7 @@ package sbt.internal.util import sbt.util._ import org.apache.logging.log4j.{ Logger => XLogger } import org.apache.logging.log4j.message.ObjectMessage +import sjsonnew.JsonFormat /** * Delegates log events to the associated LogExchange. @@ -22,4 +23,20 @@ class ManagedLogger( ) } override def success(message: => String): Unit = xlogger.info(message) + + final def debugEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Debug, event) + final def infoEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Info, event) + final def warnEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Warn, event) + final def errorEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Error, event) + def logEvent[A: JsonFormat](level: Level.Value, event: => A): Unit = + { + val v: A = event + val clazz: Class[A] = v.getClass.asInstanceOf[Class[A]] + val ev = LogExchange.getOrElseUpdateJsonCodec(clazz, implicitly[JsonFormat[A]]) + val entry: ObjectLogEntry[A] = new ObjectLogEntry(level, v, channelName, execId, ev, clazz) + xlogger.log( + ConsoleAppender.toXLevel(level), + new ObjectMessage(entry) + ) + } } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala new file mode 100644 index 000000000..64ee0db71 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala @@ -0,0 +1,17 @@ +package sbt +package internal +package util + +import sbt.util.Level +import sjsonnew.JsonFormat + +final class ObjectLogEntry[A]( + val level: Level.Value, + val message: A, + val channelName: Option[String], + val execId: Option[String], + val ev: JsonFormat[A], + val clazz: Class[A] +) extends Serializable { + +} diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala 
b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index 0f967da61..8914b4998 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -6,6 +6,8 @@ import org.apache.logging.log4j.core._ import org.apache.logging.log4j.core.appender.AsyncAppender import org.apache.logging.log4j.core.config.{ AppenderRef, LoggerConfig } import scala.collection.JavaConverters._ +import scala.collection.concurrent +import sjsonnew.JsonFormat // http://logging.apache.org/log4j/2.x/manual/customconfig.html // https://logging.apache.org/log4j/2.x/log4j-core/apidocs/index.html @@ -13,6 +15,7 @@ import scala.collection.JavaConverters._ sealed abstract class LogExchange { private[sbt] lazy val context: LoggerContext = init() private[sbt] lazy val asyncStdout: AsyncAppender = buildAsyncStdout + private[sbt] val jsonCodecs: concurrent.Map[Class[_], JsonFormat[_]] = concurrent.TrieMap() def logger(name: String): ManagedLogger = logger(name, None, None) def logger(name: String, channelName: Option[String], execId: Option[String]): ManagedLogger = { @@ -43,6 +46,13 @@ sealed abstract class LogExchange { val config = ctx.getConfiguration config.getLoggerConfig(loggerName) } + def jsonCodec[A](clazz: Class[A]): Option[JsonFormat[A]] = + jsonCodecs.get(clazz) map { _.asInstanceOf[JsonFormat[A]] } + def hasJsonCodec[A](clazz: Class[A]): Boolean = + jsonCodecs.contains(clazz) + def getOrElseUpdateJsonCodec[A](clazz: Class[A], v: JsonFormat[A]): JsonFormat[A] = + jsonCodecs.getOrElseUpdate(clazz, v).asInstanceOf[JsonFormat[A]] + private[sbt] def buildAsyncStdout: AsyncAppender = { val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } val config = ctx.getConfiguration diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala index d063fbf2c..f7871053f 100644 --- 
a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala +++ b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -13,6 +13,13 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { log.debug("test") } + it should "support event logging" in { + import sjsonnew.BasicJsonProtocol._ + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + log.infoEvent(1) + } + "global logging" should "log immediately after initialization" in { // this is passed into State normally val global0 = initialGlobalLogging From 56b51df66ba075d3b4015978beaf950ab1a4219c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 24 Jan 2017 20:23:56 -0500 Subject: [PATCH 645/823] Avoid default params --- .../scala/sbt/internal/util/ConsoleAppender.scala | 14 +++++++++++--- .../main/scala/sbt/internal/util/MainLogging.scala | 11 ++++++----- 2 files changed, 17 insertions(+), 8 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index e46fea4b8..c12f587ae 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -149,7 +149,7 @@ object ConsoleAppender { } } - val formatEnabled = + val formatEnabled: Boolean = { import java.lang.Boolean.{ getBoolean, parseBoolean } val value = System.getProperty("sbt.log.format") @@ -180,8 +180,16 @@ object ConsoleAppender { def apply(out: PrintStream): ConsoleAppender = apply(generateName, ConsoleOut.printStreamOut(out)) def apply(out: PrintWriter): ConsoleAppender = apply(generateName, ConsoleOut.printWriterOut(out)) - def apply(name: String = generateName, out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = formatEnabled, - useColor: Boolean = formatEnabled, suppressedMessage: SuppressedTraceContext => 
Option[String] = noSuppressedMessage): ConsoleAppender = + def apply(): ConsoleAppender = apply(generateName, ConsoleOut.systemOut) + def apply(name: String): ConsoleAppender = apply(name, ConsoleOut.systemOut, formatEnabled, formatEnabled, noSuppressedMessage) + def apply(out: ConsoleOut): ConsoleAppender = apply(generateName, out, formatEnabled, formatEnabled, noSuppressedMessage) + def apply(name: String, out: ConsoleOut): ConsoleAppender = apply(name, out, formatEnabled, formatEnabled, noSuppressedMessage) + def apply(name: String, out: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = + apply(name, out, formatEnabled, formatEnabled, suppressedMessage) + def apply(name: String, out: ConsoleOut, useColor: Boolean): ConsoleAppender = + apply(name, out, formatEnabled, useColor, noSuppressedMessage) + def apply(name: String, out: ConsoleOut, ansiCodesSupported: Boolean, + useColor: Boolean, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = { val appender = new ConsoleAppender(name, out, ansiCodesSupported, useColor, suppressedMessage) appender.start diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index 75eb08298..37dac9b70 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -48,14 +48,15 @@ object MainAppender { def defaultScreen(name: String, console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): Appender = ConsoleAppender(name, console, suppressedMessage = suppressedMessage) - def defaultBacked( - loggerName: String = generateGlobalBackingName, - useColor: Boolean = ConsoleAppender.formatEnabled - ): PrintWriter => Appender = + def defaultBacked: PrintWriter => Appender = defaultBacked(generateGlobalBackingName, 
ConsoleAppender.formatEnabled) + def defaultBacked(loggerName: String): PrintWriter => Appender = defaultBacked(loggerName, ConsoleAppender.formatEnabled) + def defaultBacked(useColor: Boolean): PrintWriter => Appender = defaultBacked(generateGlobalBackingName, useColor) + def defaultBacked(loggerName: String, useColor: Boolean): PrintWriter => Appender = to => { ConsoleAppender( ConsoleAppender.generateName, - ConsoleOut.printWriterOut(to), useColor = useColor + ConsoleOut.printWriterOut(to), + useColor = useColor ) } From 51f9f910381c925e0ad8272936fa4e52b8d889fb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 24 Jan 2017 21:13:38 -0500 Subject: [PATCH 646/823] Adds BufferedAppender --- .../sbt/internal/util/BufferedLogger.scala | 70 +++++++++++++++++++ 1 file changed, 70 insertions(+) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index d1f03cc72..d3b6972bc 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -5,6 +5,76 @@ package sbt.internal.util import sbt.util._ import scala.collection.mutable.ListBuffer +import org.apache.logging.log4j.core.{ LogEvent => XLogEvent, Appender } +import org.apache.logging.log4j.core.appender.AbstractAppender +import org.apache.logging.log4j.core.layout.PatternLayout +import java.util.concurrent.atomic.AtomicInteger + +object BufferedAppender { + def generateName: String = + "buffered-" + generateId.incrementAndGet + private val generateId: AtomicInteger = new AtomicInteger + def apply(delegate: Appender): BufferedAppender = + apply(generateName, delegate) + def apply(name: String, delegate: Appender): BufferedAppender = + { + val appender = new BufferedAppender(name, delegate) + appender.start + appender + } +} + +/** + * An appender that can buffer the logging done on it and then 
can flush the buffer + * to the delegate appender provided in the constructor. Use 'record()' to + * start buffering and then 'play' to flush the buffer to the backing appender. + * The logging level set at the time a message is originally logged is used, not + * the level at the time 'play' is called. + */ +class BufferedAppender private[BufferedAppender] (name: String, delegate: Appender) extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { + private[this] val buffer = new ListBuffer[XLogEvent] + private[this] var recording = false + + def append(event: XLogEvent): Unit = + { + if (recording) { + buffer += event + } else delegate.append(event) + } + + /** Enables buffering. */ + def record() = synchronized { recording = true } + def buffer[T](f: => T): T = { + record() + try { f } + finally { stopQuietly() } + } + def bufferQuietly[T](f: => T): T = { + record() + try { + val result = f + clearBuffer() + result + } catch { case e: Throwable => stopQuietly(); throw e } + } + def stopQuietly() = synchronized { try { stopBuffer() } catch { case e: Exception => () } } + + /** + * Flushes the buffer to the delegate logger. This method calls logAll on the delegate + * so that the messages are written consecutively. The buffer is cleared in the process. + */ + def play(): Unit = + synchronized { + buffer.toList foreach { + delegate.append + } + buffer.clear() + } + /** Clears buffered events and disables buffering. */ + def clearBuffer(): Unit = synchronized { buffer.clear(); recording = false } + /** Plays buffered events and disables buffering. 
*/ + def stopBuffer(): Unit = synchronized { play(); clearBuffer() } +} /** * A logger that can buffer the logging done on it and then can flush the buffer From c985d9cdc075c31b9314ee80c96fb6c076cab270 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 24 Jan 2017 21:13:58 -0500 Subject: [PATCH 647/823] Switch Scripted tests to use ManagedLogger --- .../sbt/internal/scripted/ScriptedTests.scala | 50 +++++++++++-------- 1 file changed, 29 insertions(+), 21 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index 197a58403..81a04721a 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -3,18 +3,18 @@ package internal package scripted import java.io.File -import sbt.util.Logger -import sbt.internal.util.{ ConsoleLogger, BufferedLogger, FullLogger } +import sbt.util.{ Logger, LogExchange, Level } +import sbt.internal.util.{ ManagedLogger, ConsoleOut, MainAppender, ConsoleAppender, BufferedAppender } import sbt.io.IO.wrapNull import sbt.io.{ DirectoryFilter, HiddenFileFilter } import sbt.io.syntax._ import sbt.internal.io.Resources +import java.util.concurrent.atomic.AtomicInteger object ScriptedRunnerImpl { def run(resourceBaseDirectory: File, bufferLog: Boolean, tests: Array[String], handlersProvider: HandlersProvider): Unit = { val runner = new ScriptedTests(resourceBaseDirectory, bufferLog, handlersProvider) - val logger = ConsoleLogger() - logger.setLevel(sbt.util.Level.Debug) + val logger = newLogger val allTests = get(tests, resourceBaseDirectory, logger) flatMap { case ScriptedTest(group, name) => runner.scriptedTest(group, name, logger) @@ -26,29 +26,36 @@ object ScriptedRunnerImpl { if (errors.nonEmpty) sys.error(errors.mkString("Failed tests:\n\t", "\n\t", "\n")) } - def get(tests: 
Seq[String], baseDirectory: File, log: Logger): Seq[ScriptedTest] = + def get(tests: Seq[String], baseDirectory: File, log: ManagedLogger): Seq[ScriptedTest] = if (tests.isEmpty) listTests(baseDirectory, log) else parseTests(tests) - def listTests(baseDirectory: File, log: Logger): Seq[ScriptedTest] = + def listTests(baseDirectory: File, log: ManagedLogger): Seq[ScriptedTest] = (new ListTests(baseDirectory, _ => true, log)).listTests def parseTests(in: Seq[String]): Seq[ScriptedTest] = for (testString <- in) yield { val Array(group, name) = testString.split("/").map(_.trim) ScriptedTest(group, name) } + private[sbt] val generateId: AtomicInteger = new AtomicInteger + private[sbt] def newLogger: ManagedLogger = + { + val loggerName = "scripted-" + generateId.incrementAndGet + val x = LogExchange.logger(loggerName) + x + } } final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handlersProvider: HandlersProvider) { - // import ScriptedTests._ private val testResources = new Resources(resourceBaseDirectory) + private val consoleAppender: ConsoleAppender = ConsoleAppender() val ScriptFilename = "test" val PendingScriptFilename = "pending" def scriptedTest(group: String, name: String, log: xsbti.Logger): Seq[() => Option[String]] = scriptedTest(group, name, Logger.xlog2Log(log)) - def scriptedTest(group: String, name: String, log: Logger): Seq[() => Option[String]] = + def scriptedTest(group: String, name: String, log: ManagedLogger): Seq[() => Option[String]] = scriptedTest(group, name, { _ => () }, log) - def scriptedTest(group: String, name: String, prescripted: File => Unit, log: Logger): Seq[() => Option[String]] = { + def scriptedTest(group: String, name: String, prescripted: File => Unit, log: ManagedLogger): Seq[() => Option[String]] = { for (groupDir <- (resourceBaseDirectory * group).get; nme <- (groupDir * name).get) yield { val g = groupDir.getName val n = nme.getName @@ -69,19 +76,20 @@ final class ScriptedTests(resourceBaseDirectory: 
File, bufferLog: Boolean, handl } } - private def scriptedTest(label: String, testDirectory: File, prescripted: File => Unit, log: Logger): Unit = + private def scriptedTest(label: String, testDirectory: File, prescripted: File => Unit, log: ManagedLogger): Unit = { - val buffered = new BufferedLogger(new FullLogger(log)) - buffered.setLevel(sbt.util.Level.Debug) - if (bufferLog) + val buffered = BufferedAppender(consoleAppender) + LogExchange.unbindLoggerAppenders(log.name) + LogExchange.bindLoggerAppenders(log.name, (buffered -> Level.Debug) :: Nil) + if (bufferLog) { buffered.record() - + } def createParser() = { // val fileHandler = new FileCommands(testDirectory) // // val sbtHandler = new SbtHandler(testDirectory, launcher, buffered, launchOpts) // new TestScriptParser(Map('$' -> fileHandler, /* '>' -> sbtHandler, */ '#' -> CommentHandler)) - val scriptConfig = new ScriptConfig(label, testDirectory, buffered) + val scriptConfig = new ScriptConfig(label, testDirectory, log) new TestScriptParser(handlersProvider getHandlers scriptConfig) } val (file, pending) = { @@ -98,31 +106,31 @@ final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handl run(parser.parse(file)) } def testFailed(): Unit = { - if (pending) buffered.clear() else buffered.stop() - buffered.error("x " + label + pendingString) + if (pending) buffered.clearBuffer() else buffered.stopBuffer() + log.error("x " + label + pendingString) } try { prescripted(testDirectory) runTest() - buffered.info("+ " + label + pendingString) + log.info("+ " + label + pendingString) if (pending) throw new PendingTestSuccessException(label) } catch { case e: TestException => testFailed() e.getCause match { - case null | _: java.net.SocketException => buffered.error(" " + e.getMessage) + case null | _: java.net.SocketException => log.error(" " + e.getMessage) case _ => if (!pending) e.printStackTrace } if (!pending) throw e case e: PendingTestSuccessException => testFailed() - buffered.error(" Mark 
as passing to remove this failure.") + log.error(" Mark as passing to remove this failure.") throw e case e: Exception => testFailed() if (!pending) throw e - } finally { buffered.clear() } + } finally { buffered.clearBuffer() } } } From f76e3aa2bb27dd82c53d15acb4907b01da31455a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 25 Jan 2017 20:42:15 -0500 Subject: [PATCH 648/823] use java.util.Optional in Position --- .../src/main/java/xsbti/Position.java | 17 ++++--- .../sbt/internal/util/AbstractEntry.scala | 2 +- .../sbt/internal/util/ChannelLogEntry.scala | 2 +- .../util/codec/AbstractEntryFormats.scala | 2 +- .../util/codec/ChannelLogEntryFormats.scala | 2 +- .../internal/util/codec/JsonProtocol.scala | 2 +- .../src/main/contraband/interface.contra.txt | 19 +++++++ .../util/codecs/PositionFormats.scala | 49 +++++++++++++++++++ .../util/codecs/SeverityFormats.scala | 33 +++++++++++++ .../main/scala/sbt/util/InterfaceUtil.scala | 23 ++++++--- .../src/main/scala/sbt/util/Logger.scala | 3 ++ project/contraband.sbt | 2 +- 12 files changed, 137 insertions(+), 19 deletions(-) create mode 100644 internal/util-logging/src/main/contraband/interface.contra.txt create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/codecs/PositionFormats.scala create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/codecs/SeverityFormats.scala diff --git a/internal/util-interface/src/main/java/xsbti/Position.java b/internal/util-interface/src/main/java/xsbti/Position.java index 96c60ebb2..be0239046 100644 --- a/internal/util-interface/src/main/java/xsbti/Position.java +++ b/internal/util-interface/src/main/java/xsbti/Position.java @@ -3,16 +3,19 @@ */ package xsbti; +import java.io.File; +import java.util.Optional; + public interface Position { - Maybe line(); + Optional line(); String lineContent(); - Maybe offset(); + Optional offset(); // pointer to the column position of the error/warning - Maybe pointer(); - Maybe pointerSpace(); + Optional 
pointer(); + Optional pointerSpace(); - Maybe sourcePath(); - Maybe sourceFile(); -} \ No newline at end of file + Optional sourcePath(); + Optional sourceFile(); +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala index 1c08e3cad..c20b78e1b 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala @@ -1,5 +1,5 @@ /** - * This code is generated using sbt-datatype. + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. */ // DO NOT EDIT MANUALLY diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala index b7350efee..ec3a282dc 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala @@ -1,5 +1,5 @@ /** - * This code is generated using sbt-datatype. + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. */ // DO NOT EDIT MANUALLY diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala index b797af060..2711ef949 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala @@ -1,5 +1,5 @@ /** - * This code is generated using sbt-datatype. + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
*/ // DO NOT EDIT MANUALLY diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala index 4c8d666d9..c8db52bec 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala @@ -1,5 +1,5 @@ /** - * This code is generated using sbt-datatype. + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. */ // DO NOT EDIT MANUALLY diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala index a2bfe0f25..bd721cdfe 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -1,5 +1,5 @@ /** - * This code is generated using sbt-datatype. + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. */ // DO NOT EDIT MANUALLY diff --git a/internal/util-logging/src/main/contraband/interface.contra.txt b/internal/util-logging/src/main/contraband/interface.contra.txt new file mode 100644 index 000000000..dd3091750 --- /dev/null +++ b/internal/util-logging/src/main/contraband/interface.contra.txt @@ -0,0 +1,19 @@ +package sbt.internal.util +@target(Java) +@codecPackage("sbt.internal.util.codec") +@fullCodec("JsonProtocol") + +enum Severity +{ + Info, Warn, Error +} + +type Position { + line: Int + lineContent: String! 
+ offset: Int + pointer: Int + pointerSpace: String + sourcePath: String + sourceFile: java.io.File +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codecs/PositionFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codecs/PositionFormats.scala new file mode 100644 index 000000000..d6ddf8049 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codecs/PositionFormats.scala @@ -0,0 +1,49 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. + */ + +package sbt.internal.util.codec +import _root_.sjsonnew.{ deserializationError, Builder, JsonFormat, Unbuilder } +import xsbti.Position +import java.util.Optional + +trait PositionFormats { self: sjsonnew.BasicJsonProtocol => + implicit lazy val PositionFormat: JsonFormat[Position] = new JsonFormat[Position] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Position = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val line0 = unbuilder.readField[Optional[java.lang.Integer]]("line") + val lineContent0 = unbuilder.readField[String]("lineContent") + val offset0 = unbuilder.readField[Optional[java.lang.Integer]]("offset") + val pointer0 = unbuilder.readField[Optional[java.lang.Integer]]("pointer") + val pointerSpace0 = unbuilder.readField[Optional[String]]("pointerSpace") + val sourcePath0 = unbuilder.readField[Optional[String]]("sourcePath") + val sourceFile0 = unbuilder.readField[Optional[java.io.File]]("sourceFile") + unbuilder.endObject() + new Position() { + override val line = line0 + override val lineContent = lineContent0 + override val offset = offset0 + override val pointer = pointer0 + override val pointerSpace = pointerSpace0 + override val sourcePath = sourcePath0 + override val sourceFile = sourceFile0 + } + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: Position, builder: Builder[J]): Unit = { + 
builder.beginObject() + builder.addField("line", obj.line) + builder.addField("lineContent", obj.lineContent) + builder.addField("offset", obj.offset) + builder.addField("pointer", obj.pointer) + builder.addField("pointerSpace", obj.pointerSpace) + builder.addField("sourcePath", obj.sourcePath) + builder.addField("sourceFile", obj.sourceFile) + builder.endObject() + } + } +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codecs/SeverityFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codecs/SeverityFormats.scala new file mode 100644 index 000000000..7548a2ff1 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codecs/SeverityFormats.scala @@ -0,0 +1,33 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. + */ + +package sbt.internal.util.codec + +import _root_.sjsonnew.{ deserializationError, Builder, JsonFormat, Unbuilder } +import xsbti.Severity; + +trait SeverityFormats { self: sjsonnew.BasicJsonProtocol => + implicit lazy val SeverityFormat: JsonFormat[Severity] = new JsonFormat[Severity] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Severity = { + jsOpt match { + case Some(js) => + unbuilder.readString(js) match { + case "Info" => Severity.Info + case "Warn" => Severity.Warn + case "Error" => Severity.Error + } + case None => + deserializationError("Expected JsString but found None") + } + } + override def write[J](obj: Severity, builder: Builder[J]): Unit = { + val str = obj match { + case Severity.Info => "Info" + case Severity.Warn => "Warn" + case Severity.Error => "Error" + } + builder.writeString(str) + } + } +} diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index d88a93674..328ba8bc7 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ 
b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -2,6 +2,7 @@ package sbt.util import xsbti.{ Maybe, F0, F1, T2, Position, Problem, Severity } import java.io.File +import java.util.Optional object InterfaceUtil { def f0[A](a: => A): F0[A] = new ConcreteF0[A](a) @@ -18,6 +19,16 @@ object InterfaceUtil { case None => Maybe.nothing() } + def jo2o[A](o: Optional[A]): Option[A] = + if (o.isPresent) Some(o.get) + else None + + def o2jo[A](o: Option[A]): Optional[A] = + o match { + case Some(v) => Optional.ofNullable(v) + case None => Optional.empty[A]() + } + def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = new ConcretePosition(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0) @@ -61,13 +72,13 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: Option[File] ) extends Position { - val line = o2m(line0) + val line = o2jo(line0) val lineContent = content - val offset = o2m(offset0) - val pointer = o2m(pointer0) - val pointerSpace = o2m(pointerSpace0) - val sourcePath = o2m(sourcePath0) - val sourceFile = o2m(sourceFile0) + val offset = o2jo(offset0) + val pointer = o2jo(pointer0) + val pointerSpace = o2jo(pointerSpace0) + val sourcePath = o2jo(sourcePath0) + val sourceFile = o2jo(sourceFile0) } private final class ConcreteProblem( diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala index 08945b379..fd5b34a60 100644 --- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -9,6 +9,7 @@ import sys.process.ProcessLogger import sbt.internal.util.{ BufferedLogger, FullLogger } import java.io.File +import java.util.Optional /** * This is intended to be the simplest logging interface for use by code that wants 
to log. @@ -88,6 +89,8 @@ object Logger { def f0[A](a: => A): F0[A] = InterfaceUtil.f0[A](a) def m2o[A](m: Maybe[A]): Option[A] = InterfaceUtil.m2o(m) def o2m[A](o: Option[A]): Maybe[A] = InterfaceUtil.o2m(o) + def jo2o[A](o: Optional[A]): Option[A] = InterfaceUtil.jo2o(o) + def o2jo[A](o: Option[A]): Optional[A] = InterfaceUtil.o2jo(o) def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = InterfaceUtil.position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0) diff --git a/project/contraband.sbt b/project/contraband.sbt index 88961b8f9..8a80f6ea1 100644 --- a/project/contraband.sbt +++ b/project/contraband.sbt @@ -1 +1 @@ -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M3") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M4") From 1320c9695331fea3ea7d1c751ba7a99a456f7bff Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 25 Jan 2017 23:15:31 -0500 Subject: [PATCH 649/823] Rename log events --- ...hannelLogEntry.scala => StringEvent.scala} | 28 +++++++++---------- .../util/codec/AbstractEntryFormats.scala | 4 +-- .../internal/util/codec/JsonProtocol.scala | 2 +- ...Formats.scala => StringEventFormats.scala} | 10 +++---- .../src/main/contraband/logging.contra | 2 +- .../sbt/internal/util/ConsoleAppender.scala | 6 ++-- .../sbt/internal/util/ManagedLogger.scala | 4 +-- ...ObjectLogEntry.scala => ObjectEvent.scala} | 2 +- 8 files changed, 29 insertions(+), 29 deletions(-) rename internal/util-logging/src/main/contraband-scala/sbt/internal/util/{ChannelLogEntry.scala => StringEvent.scala} (51%) rename internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/{ChannelLogEntryFormats.scala => StringEventFormats.scala} (70%) rename internal/util-logging/src/main/scala/sbt/internal/util/{ObjectLogEntry.scala => ObjectEvent.scala} (89%) diff --git 
a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala similarity index 51% rename from internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala rename to internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala index ec3a282dc..4ac959836 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ChannelLogEntry.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala @@ -4,7 +4,7 @@ // DO NOT EDIT MANUALLY package sbt.internal.util -final class ChannelLogEntry private ( +final class StringEvent private ( val level: String, val message: String, channelName: Option[String], @@ -13,39 +13,39 @@ final class ChannelLogEntry private ( override def equals(o: Any): Boolean = o match { - case x: ChannelLogEntry => (this.level == x.level) && (this.message == x.message) && (this.channelName == x.channelName) && (this.execId == x.execId) + case x: StringEvent => (this.level == x.level) && (this.message == x.message) && (this.channelName == x.channelName) && (this.execId == x.execId) case _ => false } override def hashCode: Int = { 37 * (37 * (37 * (37 * (17 + level.##) + message.##) + channelName.##) + execId.##) } override def toString: String = { - "ChannelLogEntry(" + level + ", " + message + ", " + channelName + ", " + execId + ")" + "StringEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" } - protected[this] def copy(level: String = level, message: String = message, channelName: Option[String] = channelName, execId: Option[String] = execId): ChannelLogEntry = { - new ChannelLogEntry(level, message, channelName, execId) + protected[this] def copy(level: String = level, message: String = message, channelName: Option[String] = channelName, execId: Option[String] = execId): StringEvent = { + new StringEvent(level, 
message, channelName, execId) } - def withLevel(level: String): ChannelLogEntry = { + def withLevel(level: String): StringEvent = { copy(level = level) } - def withMessage(message: String): ChannelLogEntry = { + def withMessage(message: String): StringEvent = { copy(message = message) } - def withChannelName(channelName: Option[String]): ChannelLogEntry = { + def withChannelName(channelName: Option[String]): StringEvent = { copy(channelName = channelName) } - def withChannelName(channelName: String): ChannelLogEntry = { + def withChannelName(channelName: String): StringEvent = { copy(channelName = Option(channelName)) } - def withExecId(execId: Option[String]): ChannelLogEntry = { + def withExecId(execId: Option[String]): StringEvent = { copy(execId = execId) } - def withExecId(execId: String): ChannelLogEntry = { + def withExecId(execId: String): StringEvent = { copy(execId = Option(execId)) } } -object ChannelLogEntry { +object StringEvent { - def apply(level: String, message: String, channelName: Option[String], execId: Option[String]): ChannelLogEntry = new ChannelLogEntry(level, message, channelName, execId) - def apply(level: String, message: String, channelName: String, execId: String): ChannelLogEntry = new ChannelLogEntry(level, message, Option(channelName), Option(execId)) + def apply(level: String, message: String, channelName: Option[String], execId: Option[String]): StringEvent = new StringEvent(level, message, channelName, execId) + def apply(level: String, message: String, channelName: String, execId: String): StringEvent = new StringEvent(level, message, Option(channelName), Option(execId)) } diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala index 2711ef949..4eed06c7b 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala +++ 
b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala @@ -5,6 +5,6 @@ // DO NOT EDIT MANUALLY package sbt.internal.util.codec import _root_.sjsonnew.{ deserializationError, serializationError, Builder, JsonFormat, Unbuilder } -trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.ChannelLogEntryFormats => -implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat1[sbt.internal.util.AbstractEntry, sbt.internal.util.ChannelLogEntry]("type") +trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats => +implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat1[sbt.internal.util.AbstractEntry, sbt.internal.util.StringEvent]("type") } diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala index bd721cdfe..39484f2e0 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -5,6 +5,6 @@ // DO NOT EDIT MANUALLY package sbt.internal.util.codec trait JsonProtocol extends sjsonnew.BasicJsonProtocol - with sbt.internal.util.codec.ChannelLogEntryFormats + with sbt.internal.util.codec.StringEventFormats with sbt.internal.util.codec.AbstractEntryFormats object JsonProtocol extends JsonProtocol \ No newline at end of file diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala similarity index 70% rename from internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala rename to 
internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala index c8db52bec..c005071e7 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ChannelLogEntryFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala @@ -5,9 +5,9 @@ // DO NOT EDIT MANUALLY package sbt.internal.util.codec import _root_.sjsonnew.{ deserializationError, serializationError, Builder, JsonFormat, Unbuilder } -trait ChannelLogEntryFormats { self: sjsonnew.BasicJsonProtocol => -implicit lazy val ChannelLogEntryFormat: JsonFormat[sbt.internal.util.ChannelLogEntry] = new JsonFormat[sbt.internal.util.ChannelLogEntry] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ChannelLogEntry = { +trait StringEventFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val StringEventFormat: JsonFormat[sbt.internal.util.StringEvent] = new JsonFormat[sbt.internal.util.StringEvent] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.StringEvent = { jsOpt match { case Some(js) => unbuilder.beginObject(js) @@ -16,12 +16,12 @@ implicit lazy val ChannelLogEntryFormat: JsonFormat[sbt.internal.util.ChannelLog val channelName = unbuilder.readField[Option[String]]("channelName") val execId = unbuilder.readField[Option[String]]("execId") unbuilder.endObject() - sbt.internal.util.ChannelLogEntry(level, message, channelName, execId) + sbt.internal.util.StringEvent(level, message, channelName, execId) case None => deserializationError("Expected JsObject but found None") } } - override def write[J](obj: sbt.internal.util.ChannelLogEntry, builder: Builder[J]): Unit = { + override def write[J](obj: sbt.internal.util.StringEvent, builder: Builder[J]): Unit = { builder.beginObject() builder.addField("level", obj.level) builder.addField("message", obj.message) diff --git 
a/internal/util-logging/src/main/contraband/logging.contra b/internal/util-logging/src/main/contraband/logging.contra index 085044ed8..67a4b3a04 100644 --- a/internal/util-logging/src/main/contraband/logging.contra +++ b/internal/util-logging/src/main/contraband/logging.contra @@ -8,7 +8,7 @@ interface AbstractEntry { execId: String } -type ChannelLogEntry implements sbt.internal.util.AbstractEntry { +type StringEvent implements sbt.internal.util.AbstractEntry { level: String! message: String! channelName: String diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index c12f587ae..540f84436 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -260,9 +260,9 @@ class ConsoleAppender private[ConsoleAppender] ( } def objectToString(o: AnyRef): String = o match { - case x: ChannelLogEntry => x.message - case x: ObjectLogEntry[_] => x.message.toString - case _ => o.toString + case x: StringEvent => x.message + case x: ObjectEvent[_] => x.message.toString + case _ => o.toString } def messageColor(level: Level.Value) = RESET diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index e3c110e3b..1564b224e 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -19,7 +19,7 @@ class ManagedLogger( { xlogger.log( ConsoleAppender.toXLevel(level), - new ObjectMessage(ChannelLogEntry(level.toString, message, channelName, execId)) + new ObjectMessage(StringEvent(level.toString, message, channelName, execId)) ) } override def success(message: => String): Unit = xlogger.info(message) @@ -33,7 +33,7 @@ class ManagedLogger( 
val v: A = event val clazz: Class[A] = v.getClass.asInstanceOf[Class[A]] val ev = LogExchange.getOrElseUpdateJsonCodec(clazz, implicitly[JsonFormat[A]]) - val entry: ObjectLogEntry[A] = new ObjectLogEntry(level, v, channelName, execId, ev, clazz) + val entry: ObjectEvent[A] = new ObjectEvent(level, v, channelName, execId, ev, clazz) xlogger.log( ConsoleAppender.toXLevel(level), new ObjectMessage(entry) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala similarity index 89% rename from internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala index 64ee0db71..611af321b 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectLogEntry.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala @@ -5,7 +5,7 @@ package util import sbt.util.Level import sjsonnew.JsonFormat -final class ObjectLogEntry[A]( +final class ObjectEvent[A]( val level: Level.Value, val message: A, val channelName: Option[String], From 6e2f77f852c943e3fe38d6fd06357a91b817a01f Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 25 Jan 2017 23:25:24 -0500 Subject: [PATCH 650/823] ProblemFormats --- .../src/main/contraband/interface.contra.txt | 7 ++++ .../internal/util/codecs/ProblemFormats.scala | 36 +++++++++++++++++++ 2 files changed, 43 insertions(+) create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/codecs/ProblemFormats.scala diff --git a/internal/util-logging/src/main/contraband/interface.contra.txt b/internal/util-logging/src/main/contraband/interface.contra.txt index dd3091750..795e6a4c3 100644 --- a/internal/util-logging/src/main/contraband/interface.contra.txt +++ b/internal/util-logging/src/main/contraband/interface.contra.txt @@ -17,3 +17,10 @@ type Position { sourcePath: String sourceFile: java.io.File } + 
+type Problem { + category: String! + severity: Severity! + message: String! + position: Position! +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codecs/ProblemFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codecs/ProblemFormats.scala new file mode 100644 index 000000000..cbb5f0010 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codecs/ProblemFormats.scala @@ -0,0 +1,36 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. + */ + +package sbt.internal.util.codec + +import xsbti.{ Problem, Severity, Position } +import sbt.util.InterfaceUtil.problem +import _root_.sjsonnew.{ deserializationError, Builder, JsonFormat, Unbuilder } + +trait ProblemFormats { self: SeverityFormats with PositionFormats with sjsonnew.BasicJsonProtocol => + implicit lazy val ProblemFormat: JsonFormat[Problem] = new JsonFormat[Problem] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Problem = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val category = unbuilder.readField[String]("category") + val severity = unbuilder.readField[Severity]("severity") + val message = unbuilder.readField[String]("message") + val position = unbuilder.readField[Position]("position") + unbuilder.endObject() + problem(category, position, message, severity) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: Problem, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("category", obj.category) + builder.addField("severity", obj.severity) + builder.addField("message", obj.message) + builder.addField("position", obj.position) + builder.endObject() + } + } +} From a9377ce4a6b9d87b771b8668044478c26d73da48 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 30 Jan 2017 19:20:06 -0500 Subject: [PATCH 651/823] Implements registerStringCodec Uses TypeTag to recover the full name of 
type parameter, which is calculated by StringTypeTag. This is sent along in ObjectEvent. On the other end, we can lookup typeclass instances using the tag key. --- build.sbt | 4 +-- .../main/scala/sbt/internal/util/INode.scala | 2 +- .../scala/sbt/internal/util/Settings.scala | 11 +++--- .../main/scala/sbt/internal/util/Show.scala | 8 ----- .../src/main/scala/sbt/util/Show.scala | 12 +++++++ .../sbt/{internal => }/util/ShowLines.scala | 2 +- .../src/test/scala/SettingsExample.scala | 9 +++-- .../sbt/internal/util/ConsoleAppender.scala | 34 +++++++++++-------- .../sbt/internal/util/ManagedLogger.scala | 25 +++++++++----- .../scala/sbt/internal/util/ObjectEvent.scala | 4 +-- .../sbt/internal/util/StringTypeTag.scala | 29 ++++++++++++++++ .../src/main/scala/sbt/util/LogExchange.scala | 21 ++++++++---- .../src/test/scala/ManagedLoggerSpec.scala | 27 +++++++++++++++ 13 files changed, 135 insertions(+), 53 deletions(-) delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Show.scala create mode 100644 internal/util-collection/src/main/scala/sbt/util/Show.scala rename internal/util-collection/src/main/scala/sbt/{internal => }/util/ShowLines.scala (92%) create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala diff --git a/build.sbt b/build.sbt index becdb12cf..86898c559 100644 --- a/build.sbt +++ b/build.sbt @@ -99,12 +99,12 @@ lazy val utilComplete = (project in internalPath / "util-complete"). // logging lazy val utilLogging = (project in internalPath / "util-logging"). enablePlugins(ContrabandPlugin, JsonCodecPlugin). - dependsOn(utilInterface, utilTesting % Test). + dependsOn(utilInterface, utilCollection, utilTesting % Test). 
settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", - libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson), + libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson, scalaReflect.value), sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala" ) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala index d85cadf3f..3c159ec11 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala @@ -95,7 +95,7 @@ abstract class EvaluateSettings[Scope] { keyString private[this] def keyString = - (static.toSeq.flatMap { case (key, value) => if (value eq this) init.showFullKey(key) :: Nil else Nil }).headOption getOrElse "non-static" + (static.toSeq.flatMap { case (key, value) => if (value eq this) init.showFullKey.show(key) :: Nil else Nil }).headOption getOrElse "non-static" final def get: T = synchronized { assert(value != null, toString + " not evaluated") diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala index 97117e2dc..afa02c150 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala @@ -6,6 +6,7 @@ package sbt.internal.util import scala.language.existentials import Types._ +import sbt.util.Show sealed trait Settings[Scope] { def data: Map[Scope, AttributeMap] @@ -119,7 +120,7 @@ trait Init[Scope] { def mapScope(f: Scope => Scope): MapScoped = new MapScoped { def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) } - private final class InvalidReference(val key: ScopedKey[_]) 
extends RuntimeException("Internal settings error: invalid reference to " + showFullKey(key)) + private final class InvalidReference(val key: ScopedKey[_]) extends RuntimeException("Internal settings error: invalid reference to " + showFullKey.show(key)) private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] = { @@ -215,11 +216,11 @@ trait Init[Scope] { { val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey) val derived = u.defining.isDerived - val refString = display(u.defining.key) + val refString = display.show(u.defining.key) val sourceString = if (derived) "" else parenPosString(u.defining) - val guessedString = if (derived) "" else guessed.map(g => "\n Did you mean " + display(g) + " ?").toList.mkString + val guessedString = if (derived) "" else guessed.map(g => "\n Did you mean " + display.show(g) + " ?").toList.mkString val derivedString = if (derived) ", which is a derived setting that needs this key to be defined in this scope." else "" - display(u.referencedKey) + " from " + refString + sourceString + derivedString + guessedString + display.show(u.referencedKey) + " from " + refString + sourceString + derivedString + guessedString } private[this] def parenPosString(s: Setting[_]): String = s.positionString match { case None => ""; case Some(s) => " (" + s + ")" } @@ -255,7 +256,7 @@ trait Init[Scope] { new Uninitialized(keys, prefix + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ") } final class Compiled[T](val key: ScopedKey[T], val dependencies: Iterable[ScopedKey[_]], val settings: Seq[Setting[T]]) { - override def toString = showFullKey(key) + override def toString = showFullKey.show(key) } final class Flattened(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]]) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Show.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Show.scala deleted file mode 100644 index 4a0343ed7..000000000 --- 
a/internal/util-collection/src/main/scala/sbt/internal/util/Show.scala +++ /dev/null @@ -1,8 +0,0 @@ -package sbt.internal.util - -trait Show[T] { - def apply(t: T): String -} -object Show { - def apply[T](f: T => String): Show[T] = new Show[T] { def apply(t: T): String = f(t) } -} \ No newline at end of file diff --git a/internal/util-collection/src/main/scala/sbt/util/Show.scala b/internal/util-collection/src/main/scala/sbt/util/Show.scala new file mode 100644 index 000000000..20ac0565d --- /dev/null +++ b/internal/util-collection/src/main/scala/sbt/util/Show.scala @@ -0,0 +1,12 @@ +package sbt.util + +trait Show[A] { + def show(a: A): String +} +object Show { + def apply[A](f: A => String): Show[A] = new Show[A] { def show(a: A): String = f(a) } + + def fromToString[A]: Show[A] = new Show[A] { + def show(a: A): String = a.toString + } +} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/ShowLines.scala b/internal/util-collection/src/main/scala/sbt/util/ShowLines.scala similarity index 92% rename from internal/util-collection/src/main/scala/sbt/internal/util/ShowLines.scala rename to internal/util-collection/src/main/scala/sbt/util/ShowLines.scala index f99a1394c..65729d747 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/ShowLines.scala +++ b/internal/util-collection/src/main/scala/sbt/util/ShowLines.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util trait ShowLines[A] { def showLines(a: A): Seq[String] diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala index cf65d6c68..20536a5ef 100644 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ b/internal/util-collection/src/test/scala/SettingsExample.scala @@ -1,5 +1,7 @@ package sbt.internal.util +import sbt.util.Show + /** Define our settings system */ // A basic scope indexed by an integer. 
@@ -12,9 +14,10 @@ final case class Scope(nestIndex: Int, idAtIndex: Int = 0) // That would be a general pain.) case class SettingsExample() extends Init[Scope] { // Provides a way of showing a Scope+AttributeKey[_] - val showFullKey: Show[ScopedKey[_]] = new Show[ScopedKey[_]] { - def apply(key: ScopedKey[_]) = s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" - } + val showFullKey: Show[ScopedKey[_]] = Show[ScopedKey[_]]((key: ScopedKey[_]) => + { + s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" + }) // A sample delegation function that delegates to a Scope with a lower index. val delegates: Scope => Seq[Scope] = { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 540f84436..98f404d34 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -5,11 +5,10 @@ import java.io.{ PrintStream, PrintWriter } import java.util.Locale import java.util.concurrent.atomic.AtomicInteger import org.apache.logging.log4j.{ Level => XLevel } -import org.apache.logging.log4j.message.{ Message, ParameterizedMessage, ObjectMessage, ReusableObjectMessage } +import org.apache.logging.log4j.message.{ Message, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } import org.apache.logging.log4j.core.appender.AbstractAppender import org.apache.logging.log4j.core.layout.PatternLayout -import org.apache.logging.log4j.core.async.RingBufferLogEvent import ConsoleAppender._ @@ -246,25 +245,30 @@ class ConsoleAppender private[ConsoleAppender] ( { val level = ConsoleAppender.toLevel(event.getLevel) val message = event.getMessage - val str = messageToString(message) - appendLog(level, str) + // val str = messageToString(message) + appendMessage(level, message) } - def 
messageToString(msg: Message): String = + def appendMessage(level: Level.Value, msg: Message): Unit = msg match { - case p: ParameterizedMessage => p.getFormattedMessage - case r: RingBufferLogEvent => r.getFormattedMessage - case o: ObjectMessage => objectToString(o.getParameter) - case o: ReusableObjectMessage => objectToString(o.getParameter) - case _ => msg.getFormattedMessage + case o: ObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } + case o: ReusableObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } + case _ => appendLog(level, msg.getFormattedMessage) } - def objectToString(o: AnyRef): String = + def objectToLines(o: AnyRef): Vector[String] = o match { - case x: StringEvent => x.message - case x: ObjectEvent[_] => x.message.toString - case _ => o.toString + case x: StringEvent => Vector(x.message) + case x: ObjectEvent[_] => objectEventToLines(x) + case _ => Vector(o.toString) + } + def objectEventToLines(oe: ObjectEvent[_]): Vector[String] = + { + val tag = oe.tag + LogExchange.stringCodec[AnyRef](tag) match { + case Some(codec) => codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector + case _ => Vector(oe.message.toString) + } } - def messageColor(level: Level.Value) = RESET def labelColor(level: Level.Value) = level match { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 1564b224e..a7b84990a 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -4,6 +4,7 @@ import sbt.util._ import org.apache.logging.log4j.{ Logger => XLogger } import org.apache.logging.log4j.message.ObjectMessage import sjsonnew.JsonFormat +import scala.reflect.runtime.universe.TypeTag /** * Delegates log events to the associated LogExchange. 
@@ -24,16 +25,24 @@ class ManagedLogger( } override def success(message: => String): Unit = xlogger.info(message) - final def debugEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Debug, event) - final def infoEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Info, event) - final def warnEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Warn, event) - final def errorEvent[A: JsonFormat](event: => A): Unit = logEvent(Level.Error, event) - def logEvent[A: JsonFormat](level: Level.Value, event: => A): Unit = + def registerStringCodec[A: ShowLines: TypeTag]: Unit = + { + val tag = StringTypeTag[A] + val ev = implicitly[ShowLines[A]] + // println(s"registerStringCodec ${tag.key}") + val _ = LogExchange.getOrElseUpdateStringCodec(tag.key, ev) + } + final def debugEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Debug, event) + final def infoEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Info, event) + final def warnEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Warn, event) + final def errorEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Error, event) + def logEvent[A: JsonFormat: TypeTag](level: Level.Value, event: => A): Unit = { val v: A = event - val clazz: Class[A] = v.getClass.asInstanceOf[Class[A]] - val ev = LogExchange.getOrElseUpdateJsonCodec(clazz, implicitly[JsonFormat[A]]) - val entry: ObjectEvent[A] = new ObjectEvent(level, v, channelName, execId, ev, clazz) + val tag = StringTypeTag[A] + LogExchange.getOrElseUpdateJsonCodec(tag.key, implicitly[JsonFormat[A]]) + // println("logEvent " + tag.key) + val entry: ObjectEvent[A] = new ObjectEvent(level, v, channelName, execId, tag.key) xlogger.log( ConsoleAppender.toXLevel(level), new ObjectMessage(entry) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala index 611af321b..13cddfb83 100644 --- 
a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala @@ -10,8 +10,6 @@ final class ObjectEvent[A]( val message: A, val channelName: Option[String], val execId: Option[String], - val ev: JsonFormat[A], - val clazz: Class[A] + val tag: String ) extends Serializable { - } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala new file mode 100644 index 000000000..20d226884 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala @@ -0,0 +1,29 @@ +package sbt.internal.util + +import scala.reflect.runtime.universe._ + +/** This is used to carry type information in JSON. */ +final case class StringTypeTag[A](key: String) { + override def toString: String = key +} + +object StringTypeTag { + def apply[A: TypeTag]: StringTypeTag[A] = + { + val tag = implicitly[TypeTag[A]] + val tpe = tag.tpe + val k = typeToString(tpe) + // println(tpe.getClass.toString + " " + k) + StringTypeTag[A](k) + } + def typeToString(tpe: Type): String = + tpe match { + case ref: TypeRef => + if (ref.args.nonEmpty) { + val typeCon = ref.typeConstructor.typeSymbol.asType.fullName + val typeArgs = ref.typeArgs map typeToString + s"""$typeCon[${typeArgs.mkString(",")}]""" + } else tpe.toString + case _ => tpe.toString + } +} diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index 8914b4998..1dc17804d 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -15,7 +15,8 @@ import sjsonnew.JsonFormat sealed abstract class LogExchange { private[sbt] lazy val context: LoggerContext = init() private[sbt] lazy val asyncStdout: AsyncAppender = buildAsyncStdout - private[sbt] val 
jsonCodecs: concurrent.Map[Class[_], JsonFormat[_]] = concurrent.TrieMap() + private[sbt] val jsonCodecs: concurrent.Map[String, JsonFormat[_]] = concurrent.TrieMap() + private[sbt] val stringCodecs: concurrent.Map[String, ShowLines[_]] = concurrent.TrieMap() def logger(name: String): ManagedLogger = logger(name, None, None) def logger(name: String, channelName: Option[String], execId: Option[String]): ManagedLogger = { @@ -46,12 +47,18 @@ sealed abstract class LogExchange { val config = ctx.getConfiguration config.getLoggerConfig(loggerName) } - def jsonCodec[A](clazz: Class[A]): Option[JsonFormat[A]] = - jsonCodecs.get(clazz) map { _.asInstanceOf[JsonFormat[A]] } - def hasJsonCodec[A](clazz: Class[A]): Boolean = - jsonCodecs.contains(clazz) - def getOrElseUpdateJsonCodec[A](clazz: Class[A], v: JsonFormat[A]): JsonFormat[A] = - jsonCodecs.getOrElseUpdate(clazz, v).asInstanceOf[JsonFormat[A]] + def jsonCodec[A](tag: String): Option[JsonFormat[A]] = + jsonCodecs.get(tag) map { _.asInstanceOf[JsonFormat[A]] } + def hasJsonCodec(tag: String): Boolean = + jsonCodecs.contains(tag) + def getOrElseUpdateJsonCodec[A](tag: String, v: JsonFormat[A]): JsonFormat[A] = + jsonCodecs.getOrElseUpdate(tag, v).asInstanceOf[JsonFormat[A]] + def stringCodec[A](tag: String): Option[ShowLines[A]] = + stringCodecs.get(tag) map { _.asInstanceOf[ShowLines[A]] } + def hasStringCodec(tag: String): Boolean = + stringCodecs.contains(tag) + def getOrElseUpdateStringCodec[A](tag: String, v: ShowLines[A]): ShowLines[A] = + stringCodecs.getOrElseUpdate(tag, v).asInstanceOf[ShowLines[A]] private[sbt] def buildAsyncStdout: AsyncAppender = { val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala index f7871053f..80e42a64d 100644 --- a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala +++ 
b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -20,6 +20,33 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { log.infoEvent(1) } + it should "allow registering Show[Int]" in { + import sjsonnew.BasicJsonProtocol._ + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + implicit val intShow: ShowLines[Int] = ShowLines({ (x: Int) => Vector(s"String representation of $x") }) + log.registerStringCodec[Int] + log.infoEvent(1) + } + + it should "allow registering Show[Array[Int]]" in { + import sjsonnew.BasicJsonProtocol._ + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + implicit val intArrayShow: ShowLines[Array[Int]] = ShowLines({ (x: Array[Int]) => Vector(s"String representation of ${x.mkString}") }) + log.registerStringCodec[Array[Int]] + log.infoEvent(Array(1, 2, 3)) + } + + it should "allow registering Show[Vector[Vector[Int]]]" in { + import sjsonnew.BasicJsonProtocol._ + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + implicit val intVectorShow: ShowLines[Vector[Vector[Int]]] = ShowLines({ (xss: Vector[Vector[Int]]) => Vector(s"String representation of $xss") }) + log.registerStringCodec[Vector[Vector[Int]]] + log.infoEvent(Vector(Vector(1, 2, 3))) + } + "global logging" should "log immediately after initialization" in { // this is passed into State normally val global0 = initialGlobalLogging From a12045ed4282b26179584593bf553a06b42ad1cc Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 15 Feb 2017 18:52:04 -0500 Subject: [PATCH 652/823] some change for Scala 2.10 --- .../src/main/scala/sbt/internal/util/StringTypeTag.scala | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala 
b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala index 20d226884..5b90d9a12 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala @@ -18,10 +18,10 @@ object StringTypeTag { } def typeToString(tpe: Type): String = tpe match { - case ref: TypeRef => - if (ref.args.nonEmpty) { - val typeCon = ref.typeConstructor.typeSymbol.asType.fullName - val typeArgs = ref.typeArgs map typeToString + case TypeRef(_, sym, args) => + if (args.nonEmpty) { + val typeCon = tpe.typeSymbol.fullName + val typeArgs = args map typeToString s"""$typeCon[${typeArgs.mkString(",")}]""" } else tpe.toString case _ => tpe.toString From bbeecae0b1d701629321c69e79feb9b86c433441 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 21 Mar 2017 00:09:07 +0000 Subject: [PATCH 653/823] Define OptJsonWriter & put it on AttributeKey --- build.sbt | 3 ++- .../scala/sbt/internal/util/Attributes.scala | 20 +++++++++++-------- .../sbt/internal/util/OptJsonWriter.scala | 14 +++++++++++++ 3 files changed, 28 insertions(+), 9 deletions(-) create mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala diff --git a/build.sbt b/build.sbt index 86898c559..a7ce52220 100644 --- a/build.sbt +++ b/build.sbt @@ -75,7 +75,8 @@ lazy val utilCollection = (project in internalPath / "util-collection"). settings( commonSettings, Util.keywordsSettings, - name := "Util Collection" + name := "Util Collection", + libraryDependencies ++= Seq(sjsonnew) ) lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). 
diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala index 33591506c..b8ed058a1 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala @@ -39,6 +39,8 @@ sealed trait AttributeKey[T] { /** Identifies the relative importance of a key among other keys.*/ def rank: Int + + def optJsonWriter: OptJsonWriter[T] } private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { override final def toString = label @@ -50,32 +52,33 @@ private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { final def isLocal: Boolean = false } object AttributeKey { - def apply[T](name: String)(implicit mf: Manifest[T]): AttributeKey[T] = + def apply[T](name: String)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = make(name, None, Nil, Int.MaxValue) - def apply[T](name: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + def apply[T](name: String, rank: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = make(name, None, Nil, rank) - def apply[T](name: String, description: String)(implicit mf: Manifest[T]): AttributeKey[T] = + def apply[T](name: String, description: String)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = apply(name, description, Nil) - def apply[T](name: String, description: String, rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + def apply[T](name: String, description: String, rank: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = apply(name, description, Nil, rank) - def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]])(implicit mf: Manifest[T]): AttributeKey[T] = + def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]])(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): 
AttributeKey[T] = apply(name, description, extend, Int.MaxValue) - def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]], rank: Int)(implicit mf: Manifest[T]): AttributeKey[T] = + def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]], rank: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = make(name, Some(description), extend, rank) - private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T]): AttributeKey[T] = new SharedAttributeKey[T] { + private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = new SharedAttributeKey[T] { def manifest = mf val label = Util.hyphenToCamel(name) def description = description0 def extend = extend0 def rank = rank0 + def optJsonWriter = ojw } - private[sbt] def local[T](implicit mf: Manifest[T]): AttributeKey[T] = new AttributeKey[T] { + private[sbt] def local[T](implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = new AttributeKey[T] { def manifest = mf def label = LocalLabel def description = None @@ -83,6 +86,7 @@ object AttributeKey { override def toString = label def isLocal: Boolean = true def rank = Int.MaxValue + val optJsonWriter = ojw } private[sbt] final val LocalLabel = "$" + "local" } diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala b/internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala new file mode 100644 index 000000000..f44ef44e8 --- /dev/null +++ b/internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala @@ -0,0 +1,14 @@ +package sbt.internal.util + +import sjsonnew.JsonWriter + +sealed trait OptJsonWriter[A] +final case class NoJsonWriter[A]() extends OptJsonWriter[A] +final case class SomeJsonWriter[A](value: JsonWriter[A]) extends 
OptJsonWriter[A] + +trait OptJsonWriter0 { + implicit def fallback[A]: NoJsonWriter[A] = NoJsonWriter() +} +object OptJsonWriter extends OptJsonWriter0 { + implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) +} From e984875b776be292ef2ffcb237482ebb2c286357 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 21 Mar 2017 10:46:58 +0000 Subject: [PATCH 654/823] Move OptJsonWriter to public API --- .../src/main/scala/sbt/internal/util/Attributes.scala | 1 + .../src/main/scala/sbt/{internal => }/util/OptJsonWriter.scala | 2 +- 2 files changed, 2 insertions(+), 1 deletion(-) rename internal/util-collection/src/main/scala/sbt/{internal => }/util/OptJsonWriter.scala (94%) diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala index b8ed058a1..5cf6fb65b 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala @@ -5,6 +5,7 @@ package sbt.internal.util import Types._ import scala.reflect.Manifest +import sbt.util.OptJsonWriter // T must be invariant to work properly. 
// Because it is sealed and the only instances go through AttributeKey.apply, diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala similarity index 94% rename from internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala rename to internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala index f44ef44e8..3a2a23f25 100644 --- a/internal/util-collection/src/main/scala/sbt/internal/util/OptJsonWriter.scala +++ b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import sjsonnew.JsonWriter From 6c7f99005ee26a06791b44d88bc1e8dbbcbacf59 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 21 Mar 2017 17:12:10 +0000 Subject: [PATCH 655/823] Allow opting out of the fallback OptJsonWriter Simply import OptJsonWriter.OptOut._ And you'll get the implicit lift, but not the implicit fallback. 
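The opt-out relies on a standard implicit-priority trick: the low-priority trait defines two identical fallback instances, so any resolution that would have reached the fallback becomes ambiguous and fails to compile, while the higher-priority `lift` still wins whenever a real `JsonWriter` is in scope. A self-contained sketch of the mechanism — `Writer`, `Opt`, and `Foo` are hypothetical stand-ins for the sjson-new and sbt types:

```scala
trait Writer[A] // stands in for sjsonnew.JsonWriter

sealed trait Opt[A]
final case class Lifted[A](w: Writer[A]) extends Opt[A]
final case class Fallback[A]() extends Opt[A]

// Subclass implicits outrank inherited ones, so lift (defined in the
// object) beats the two conflicting fallbacks (defined in the trait).
trait LowPriority {
  implicit def conflict1[A]: Fallback[A] = Fallback()
  implicit def conflict2[A]: Fallback[A] = Fallback()
}
object StrictOpt extends LowPriority {
  implicit def lift[A](implicit w: Writer[A]): Lifted[A] = Lifted(w)
}

object Demo extends App {
  import StrictOpt._
  implicit val intWriter: Writer[Int] = new Writer[Int] {}
  implicitly[Opt[Int]]       // compiles: lift wins via intWriter
  // implicitly[Opt[String]] // rejected: conflict1/conflict2 are ambiguous
}
```

The duplicated fallback is deliberate: a single low-priority fallback would resolve silently, whereas two indistinguishable candidates force the compile error shown in the commit message.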
You get an ambiguous compile error like this: [error] /d/sbt-util/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala:28: ambiguous implicit values: [error] both method conflictingFallback1 in trait OptOut0 of type [A]=> sbt.util.NoJsonWriter[A] [error] and method conflictingFallback2 in trait OptOut0 of type [A]=> sbt.util.NoJsonWriter[A] [error] match expected type sbt.util.OptJsonWriter[Foo] [error] val x = implicitly[OptJsonWriter[Foo]] [error] ^ --- .../src/main/scala/sbt/util/OptJsonWriter.scala | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala index 3a2a23f25..b7a793b46 100644 --- a/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala +++ b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala @@ -11,4 +11,12 @@ trait OptJsonWriter0 { } object OptJsonWriter extends OptJsonWriter0 { implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) + + trait OptOut0 { + implicit def conflictingFallback1[A]: NoJsonWriter[A] = NoJsonWriter() + implicit def conflictingFallback2[A]: NoJsonWriter[A] = NoJsonWriter() + } + object OptOut extends OptOut0 { + implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) + } } From d91c3de736ea393c188017bea0c7030bcfa1f6dd Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 21 Mar 2017 21:50:45 +0000 Subject: [PATCH 656/823] Rename to StrictMode --- .../src/main/scala/sbt/util/OptJsonWriter.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala index b7a793b46..e840bc689 100644 --- a/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala +++ 
b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala @@ -12,11 +12,11 @@ trait OptJsonWriter0 { object OptJsonWriter extends OptJsonWriter0 { implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) - trait OptOut0 { + trait StrictMode0 { implicit def conflictingFallback1[A]: NoJsonWriter[A] = NoJsonWriter() implicit def conflictingFallback2[A]: NoJsonWriter[A] = NoJsonWriter() } - object OptOut extends OptOut0 { + object StrictMode extends StrictMode0 { implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) } } From 66f345a3030bedcc0e4077f10dd3a013984d4211 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 3 Apr 2017 03:20:36 -0400 Subject: [PATCH 657/823] Cross publish utilCollection Fixes sbt/util#72 --- build.sbt | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index a7ce52220..a2f98a67c 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "1.0.0-M19" +def baseVersion: String = "1.0.0-M21" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -73,6 +73,7 @@ lazy val utilControl = (project in internalPath / "util-control"). lazy val utilCollection = (project in internalPath / "util-collection"). dependsOn(utilTesting % Test). 
settings( + crossScalaVersions := Seq(scala210, scala211, scala212), commonSettings, Util.keywordsSettings, name := "Util Collection", From 94a2e6cb1274537d4df4fc085f9f8e86d0357329 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 3 Apr 2017 03:21:14 -0400 Subject: [PATCH 658/823] Fix directory name --- .../sbt/internal/util/{codecs => codec}/PositionFormats.scala | 0 .../sbt/internal/util/{codecs => codec}/ProblemFormats.scala | 0 .../sbt/internal/util/{codecs => codec}/SeverityFormats.scala | 0 3 files changed, 0 insertions(+), 0 deletions(-) rename internal/util-logging/src/main/scala/sbt/internal/util/{codecs => codec}/PositionFormats.scala (100%) rename internal/util-logging/src/main/scala/sbt/internal/util/{codecs => codec}/ProblemFormats.scala (100%) rename internal/util-logging/src/main/scala/sbt/internal/util/{codecs => codec}/SeverityFormats.scala (100%) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codecs/PositionFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala similarity index 100% rename from internal/util-logging/src/main/scala/sbt/internal/util/codecs/PositionFormats.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codecs/ProblemFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala similarity index 100% rename from internal/util-logging/src/main/scala/sbt/internal/util/codecs/ProblemFormats.scala rename to internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codecs/SeverityFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SeverityFormats.scala similarity index 100% rename from internal/util-logging/src/main/scala/sbt/internal/util/codecs/SeverityFormats.scala rename to 
internal/util-logging/src/main/scala/sbt/internal/util/codec/SeverityFormats.scala From e21c78ebb0d6733711e62e885856d5e60c171c51 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 3 Apr 2017 03:22:04 -0400 Subject: [PATCH 659/823] Move JValue format here --- .../internal/util/codec/JValueFormats.scala | 47 +++++++++++++++++++ 1 file changed, 47 insertions(+) create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala new file mode 100644 index 000000000..2ff681825 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala @@ -0,0 +1,47 @@ +/* + * Copyright (C) 2017 Lightbend Inc. + */ + +package sbt +package internal +package util.codec + +import sjsonnew.{ JsonWriter => JW, JsonReader => JR, JsonFormat => JF, _ } +import scala.json.ast.unsafe._ + +trait JValueFormats { self: sjsonnew.BasicJsonProtocol => + implicit val JNullFormat: JF[JNull.type] = new JF[JNull.type] { + def write[J](x: JNull.type, b: Builder[J]) = b.writeNull() + def read[J](j: Option[J], u: Unbuilder[J]) = JNull + } + + implicit val JBooleanFormat: JF[JBoolean] = project(_.get, (x: Boolean) => JBoolean(x)) + implicit val JStringFormat: JF[JString] = project(_.value, (x: String) => JString(x)) + implicit val JNumberFormat: JF[JNumber] = project(x => BigDecimal(x.value), (x: BigDecimal) => JNumber(x.toString)) + implicit val JArrayFormat: JF[JArray] = project[JArray, Array[JValue]](_.value, JArray(_)) + + implicit lazy val JObjectJsonWriter: JW[JObject] = new JW[JObject] { + def write[J](x: JObject, b: Builder[J]) = { + b.beginObject() + x.value foreach (jsonField => JValueFormat.addField(jsonField.field, jsonField.value, b)) + b.endObject() + } + } + + implicit lazy val JValueJsonWriter: JW[JValue] = new JW[JValue] { + def write[J](x: 
JValue, b: Builder[J]) = x match { + case x: JNull.type => JNullFormat.write(x, b) + case x: JBoolean => JBooleanFormat.write(x, b) + case x: JString => JStringFormat.write(x, b) + case x: JNumber => JNumberFormat.write(x, b) + case x: JArray => JArrayFormat.write(x, b) + case x: JObject => JObjectJsonWriter.write(x, b) + } + } + + implicit lazy val JValueJsonReader: JR[JValue] = new JR[JValue] { + def read[J](j: Option[J], u: Unbuilder[J]) = ??? // Is this even possible? with no Manifest[J]? + } + + implicit lazy val JValueFormat: JF[JValue] = jsonFormat[JValue](JValueJsonReader, JValueJsonWriter) +} From 1dab826ffd845f3f8390ab009e8c329f7e130b46 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 3 Apr 2017 03:22:33 -0400 Subject: [PATCH 660/823] Store JValue into ObjectEvent --- .../sbt/internal/util/ConsoleAppender.scala | 4 ++-- .../scala/sbt/internal/util/ManagedLogger.scala | 2 +- .../scala/sbt/internal/util/ObjectEvent.scala | 17 ++++++++++++++++- 3 files changed, 19 insertions(+), 4 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 98f404d34..39d75cfce 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -263,8 +263,8 @@ class ConsoleAppender private[ConsoleAppender] ( } def objectEventToLines(oe: ObjectEvent[_]): Vector[String] = { - val tag = oe.tag - LogExchange.stringCodec[AnyRef](tag) match { + val contentType = oe.contentType + LogExchange.stringCodec[AnyRef](contentType) match { case Some(codec) => codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector case _ => Vector(oe.message.toString) } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index a7b84990a..24b9a8e02 100644 
--- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -42,7 +42,7 @@ class ManagedLogger( val tag = StringTypeTag[A] LogExchange.getOrElseUpdateJsonCodec(tag.key, implicitly[JsonFormat[A]]) // println("logEvent " + tag.key) - val entry: ObjectEvent[A] = new ObjectEvent(level, v, channelName, execId, tag.key) + val entry: ObjectEvent[A] = ObjectEvent(level, v, channelName, execId, tag.key) xlogger.log( ConsoleAppender.toXLevel(level), new ObjectMessage(entry) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala index 13cddfb83..fdf4d8789 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala @@ -4,12 +4,27 @@ package util import sbt.util.Level import sjsonnew.JsonFormat +import sjsonnew.support.scalajson.unsafe.Converter +import scala.json.ast.unsafe.JValue final class ObjectEvent[A]( val level: Level.Value, val message: A, val channelName: Option[String], val execId: Option[String], - val tag: String + val contentType: String, + val json: JValue ) extends Serializable { } + +object ObjectEvent { + def apply[A: JsonFormat]( + level: Level.Value, + message: A, + channelName: Option[String], + execId: Option[String], + contentType: String + ): ObjectEvent[A] = + new ObjectEvent(level, message, channelName, execId, contentType, + Converter.toJsonUnsafe(message)) +} From 069104b989bd403648380944f023a27925f8f75a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 3 Apr 2017 03:22:47 -0400 Subject: [PATCH 661/823] Bump up log4j2 to 2.8.1 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 20e317e57..f132800f9 100644 --- 
a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -41,7 +41,7 @@ object Dependencies { val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion - def log4jVersion = "2.7" + def log4jVersion = "2.8.1" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion val log4jSlf4jImpl = "org.apache.logging.log4j" % "log4j-slf4j-impl" % log4jVersion From d088d16d78006b48374e6bbabe0f10342b2202e4 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 4 Apr 2017 14:14:31 +0100 Subject: [PATCH 662/823] Drop (out-X) from the log output Fixes sbt/sbt#3056 --- .../sbt/internal/util/ConsoleAppender.scala | 21 +++++++++++-------- 1 file changed, 12 insertions(+), 9 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 39d75cfce..d730df112 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -177,16 +177,19 @@ object ConsoleAppender { private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase(Locale.ENGLISH).indexOf("windows") >= 0 - def apply(out: PrintStream): ConsoleAppender = apply(generateName, ConsoleOut.printStreamOut(out)) - def apply(out: PrintWriter): ConsoleAppender = apply(generateName, ConsoleOut.printWriterOut(out)) - def apply(): ConsoleAppender = apply(generateName, ConsoleOut.systemOut) - def apply(name: String): ConsoleAppender = apply(name, ConsoleOut.systemOut, formatEnabled, formatEnabled, noSuppressedMessage) - def apply(out: ConsoleOut): ConsoleAppender = apply(generateName, out, formatEnabled, formatEnabled, noSuppressedMessage) - def apply(name: String, out: ConsoleOut): 
ConsoleAppender = apply(name, out, formatEnabled, formatEnabled, noSuppressedMessage) + def apply(out: PrintStream): ConsoleAppender = apply(ConsoleOut.printStreamOut(out)) + def apply(out: PrintWriter): ConsoleAppender = apply(ConsoleOut.printWriterOut(out)) + def apply(): ConsoleAppender = apply(ConsoleOut.systemOut) + def apply(name: String): ConsoleAppender = apply(name, ConsoleOut.systemOut) + def apply(out: ConsoleOut): ConsoleAppender = apply(generateName, out) + def apply(name: String, out: ConsoleOut): ConsoleAppender = apply(name, out, formatEnabled) + def apply(name: String, out: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = apply(name, out, formatEnabled, formatEnabled, suppressedMessage) + def apply(name: String, out: ConsoleOut, useColor: Boolean): ConsoleAppender = apply(name, out, formatEnabled, useColor, noSuppressedMessage) + def apply(name: String, out: ConsoleOut, ansiCodesSupported: Boolean, useColor: Boolean, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = { @@ -194,8 +197,9 @@ object ConsoleAppender { appender.start appender } - def generateName: String = - "out-" + generateId.incrementAndGet + + def generateName: String = "out-" + generateId.incrementAndGet + private val generateId: AtomicInteger = new AtomicInteger private[this] val EscapeSequence = (27.toChar + "[^@-~]*[@-~]").r @@ -322,7 +326,6 @@ class ConsoleAppender private[ConsoleAppender] ( setColor(messageColor) out.print(line) reset() - out.print(s" ($name)") out.println() } } From 70a2b4c1691d2af6d3e3b243684a6125248b554c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 4 Apr 2017 16:49:18 -0400 Subject: [PATCH 663/823] JLine 2.14.3 Fixes sbt/sbt#1855 See also jline/jline2#127 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index f132800f9..c6a6c4638 100644 --- a/project/Dependencies.scala +++ 
b/project/Dependencies.scala @@ -28,7 +28,7 @@ object Dependencies { def addSbtIO(p: Project): Project = addSbtModule(p, sbtIoPath, "io", sbtIO) - val jline = "jline" % "jline" % "2.13" + val jline = "jline" % "jline" % "2.14.3" val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } From ed8034f12db670b4d8b39198f7dba8bf6e4f6d6d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 4 Apr 2017 23:55:39 -0400 Subject: [PATCH 664/823] Bump IO --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index c6a6c4638..c39981812 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,7 @@ object Dependencies { val scala211 = "2.11.8" val scala212 = "2.12.1" - private val ioVersion = "1.0.0-M9" + private val ioVersion = "1.0.0-M10" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From 061b259d1d7f26792b70ea477bc9c48bc3467498 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 4 Apr 2017 23:56:54 -0400 Subject: [PATCH 665/823] Fix build --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index a2f98a67c..78c9048f4 100644 --- a/build.sbt +++ b/build.sbt @@ -73,8 +73,8 @@ lazy val utilControl = (project in internalPath / "util-control"). lazy val utilCollection = (project in internalPath / "util-collection"). dependsOn(utilTesting % Test). 
settings( - crossScalaVersions := Seq(scala210, scala211, scala212), commonSettings, + crossScalaVersions := Seq(scala210, scala211, scala212), Util.keywordsSettings, name := "Util Collection", libraryDependencies ++= Seq(sjsonnew) From f48848e5d4474aa7a658325930c2c6b8b93261c7 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 17 Apr 2017 03:20:41 -0400 Subject: [PATCH 666/823] Adds overrides for File-based caching sbt/util#45 implemented caching using sjson-new. Now many of the functions take a `CacheStore` that abstracts the caching ability. sbt/sbt#3109 demonstrates that setting up a CacheStore requires boilerplate involving concepts introduced in sbt 1. This change adds back overrides using File, on the assumption that the majority of the time we would want the standard JSON converter. --- CONTRIBUTING.md | 10 ++ README.md | 8 - build.sbt | 10 +- .../sbt/internal/util/CacheImplicits.scala | 7 - project/build.properties | 2 +- {internal/util-cache => util-cache}/NOTICE | 0 .../sbt/internal/util/HListFormats.scala | 6 +- .../scala/sbt}/util/BasicCacheImplicits.scala | 2 +- .../src/main/scala/sbt}/util/Cache.scala | 13 +- .../main/scala/sbt/util/CacheImplicits.scala | 9 ++ .../src/main/scala/sbt}/util/CacheStore.scala | 37 ++++- .../src/main/scala/sbt}/util/FileInfo.scala | 7 +- .../src/main/scala/sbt}/util/Input.scala | 2 +- .../src/main/scala/sbt}/util/Output.scala | 2 +- .../main/scala/sbt}/util/SeparatedCache.scala | 2 +- .../main/scala/sbt}/util/StampedFormat.scala | 2 +- .../src/test/scala/CacheSpec.scala | 3 +- .../src/test/scala/FileInfoSpec.scala | 3 +- .../src/test/scala/HListFormatSpec.scala | 3 +- .../src/test/scala/SingletonCacheSpec.scala | 3 +- .../util-tracking => util-tracking}/NOTICE | 0 .../main/scala/sbt}/util/ChangeReport.scala | 2 +- .../main/scala/sbt/util/FileFunction.scala | 143 ++++++++++++++++++ .../src/main/scala/sbt}/util/Tracked.scala | 83 +++++----- .../test/scala/sbt}/util/TrackedSpec.scala | 11 +- 25 files changed, 276 insertions(+), 94
deletions(-) delete mode 100644 internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala rename {internal/util-cache => util-cache}/NOTICE (100%) rename internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala => util-cache/src/main/scala/sbt/internal/util/HListFormats.scala (97%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/BasicCacheImplicits.scala (98%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/Cache.scala (82%) create mode 100644 util-cache/src/main/scala/sbt/util/CacheImplicits.scala rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/CacheStore.scala (55%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/FileInfo.scala (96%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/Input.scala (97%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/Output.scala (96%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/SeparatedCache.scala (98%) rename {internal/util-cache/src/main/scala/sbt/internal => util-cache/src/main/scala/sbt}/util/StampedFormat.scala (98%) rename {internal/util-cache => util-cache}/src/test/scala/CacheSpec.scala (97%) rename {internal/util-cache => util-cache}/src/test/scala/FileInfoSpec.scala (93%) rename {internal/util-cache => util-cache}/src/test/scala/HListFormatSpec.scala (93%) rename {internal/util-cache => util-cache}/src/test/scala/SingletonCacheSpec.scala (98%) rename {internal/util-tracking => util-tracking}/NOTICE (100%) rename {internal/util-tracking/src/main/scala/sbt/internal => util-tracking/src/main/scala/sbt}/util/ChangeReport.scala (99%) create mode 100644 util-tracking/src/main/scala/sbt/util/FileFunction.scala rename {internal/util-tracking/src/main/scala/sbt/internal => 
util-tracking/src/main/scala/sbt}/util/Tracked.scala (72%) rename {internal/util-tracking/src/test/scala/sbt/internal => util-tracking/src/test/scala/sbt}/util/TrackedSpec.scala (90%) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 994e17a23..c947404ad 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,3 +1,13 @@ ``` $ sbt release ``` + +### Historical note + +``` +cd sbt-modules/util-take2 +git filter-branch --index-filter 'git rm --cached -qr -- . && git reset -q $GIT_COMMIT -- build.sbt LICENSE NOTICE interface util/appmacro util/collection util/complete util/control util/log util/logic util/process util/relation cache' --prune-empty +git reset --hard +git gc --aggressive +git prune +``` diff --git a/README.md b/README.md index acdbb8c3a..87ddd558f 100644 --- a/README.md +++ b/README.md @@ -1,9 +1 @@ ### utility modules for sbt - -``` -cd sbt-modules/util-take2 -git filter-branch --index-filter 'git rm --cached -qr -- . && git reset -q $GIT_COMMIT -- build.sbt LICENSE NOTICE interface util/appmacro util/collection util/complete util/control util/log util/logic util/process util/relation cache' --prune-empty -git reset --hard -git gc --aggressive -git prune -``` diff --git a/build.sbt b/build.sbt index 78c9048f4..eb9ede57b 100644 --- a/build.sbt +++ b/build.sbt @@ -127,23 +127,21 @@ lazy val utilLogic = (project in internalPath / "util-logic"). ) // Persisted caching based on sjson-new -lazy val utilCache = (project in internalPath / "util-cache"). +lazy val utilCache = (project in file("util-cache")). dependsOn(utilCollection, utilTesting % Test). settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnew, scalaReflect.value), - libraryDependencies += sjsonnewScalaJson % Test + libraryDependencies ++= Seq(sjsonnewScalaJson, scalaReflect.value) ). configure(addSbtIO) // Builds on cache to provide caching for filesystem-related operations -lazy val utilTracking = (project in internalPath / "util-tracking"). 
+lazy val utilTracking = (project in file("util-tracking")). dependsOn(utilCache, utilTesting % Test). settings( commonSettings, - name := "Util Tracking", - libraryDependencies += sjsonnewScalaJson % Test + name := "Util Tracking" ). configure(addSbtIO) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala b/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala deleted file mode 100644 index 00bb6beaa..000000000 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheImplicits.scala +++ /dev/null @@ -1,7 +0,0 @@ -package sbt.internal.util - -import sjsonnew.BasicJsonProtocol - -object CacheImplicits extends BasicCacheImplicits - with BasicJsonProtocol - with HListFormat diff --git a/project/build.properties b/project/build.properties index 27e88aa11..64317fdae 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=0.13.13 +sbt.version=0.13.15 diff --git a/internal/util-cache/NOTICE b/util-cache/NOTICE similarity index 100% rename from internal/util-cache/NOTICE rename to util-cache/NOTICE diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala b/util-cache/src/main/scala/sbt/internal/util/HListFormats.scala similarity index 97% rename from internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala rename to util-cache/src/main/scala/sbt/internal/util/HListFormats.scala index 6594896c8..bf69b4db8 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/HListFormat.scala +++ b/util-cache/src/main/scala/sbt/internal/util/HListFormats.scala @@ -1,9 +1,11 @@ -package sbt.internal.util +package sbt +package internal +package util import sjsonnew._ import Types.:+: -trait HListFormat { +trait HListFormats { implicit val lnilFormat1: JsonFormat[HNil] = forHNil(HNil) implicit val lnilFormat2: JsonFormat[HNil.type] = forHNil(HNil) diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala 
b/util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala similarity index 98% rename from internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala rename to util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala index 1d1ebe16d..92e69bddb 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/BasicCacheImplicits.scala +++ b/util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import java.net.{ URI, URL } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala b/util-cache/src/main/scala/sbt/util/Cache.scala similarity index 82% rename from internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala rename to util-cache/src/main/scala/sbt/util/Cache.scala index 0a04dbcdd..3b85a8dc2 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Cache.scala +++ b/util-cache/src/main/scala/sbt/util/Cache.scala @@ -1,7 +1,9 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt.internal.util +package sbt.util + +import java.io.File /** The result of a cache query */ sealed trait CacheResult[K] @@ -32,6 +34,15 @@ object Cache { */ def cache[I, O](implicit c: Cache[I, O]): Cache[I, O] = c + /** + * Returns a function that represents a cache that inserts on miss. + * + * @param cacheFile The file that backs this cache. + * @param default A function that computes a default value to insert on a miss. + */ + def cached[I, O](cacheFile: File)(default: I => O)(implicit cache: Cache[I, O]): I => O = + cached(CacheStore(cacheFile))(default) + /** * Returns a function that represents a cache that inserts on miss.
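The new `File`-based `Cache.cached` override above can be used roughly like this (the cache path, key/value types, and implicit derivation are illustrative assumptions, not part of the patch; it presumes an implicit `Cache[Int, String]` is derivable from the converters in `CacheImplicits`):

```scala
import java.io.File
import sbt.util.Cache
import sbt.util.CacheImplicits._

// Hypothetical sketch: memoize an expensive Int => String computation,
// persisting the last input/output pair under target/demo-cache so it
// survives across runs. The body is recomputed only on a cache miss.
val squared: Int => String =
  Cache.cached[Int, String](new File("target/demo-cache")) { n =>
    (n.toLong * n).toString
  }
```

Compared to the pre-patch API, the caller no longer has to construct a `CacheStore` and pick a JSON converter explicitly; the `File` overload supplies the standard converter.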
* diff --git a/util-cache/src/main/scala/sbt/util/CacheImplicits.scala b/util-cache/src/main/scala/sbt/util/CacheImplicits.scala new file mode 100644 index 000000000..2eb1639cd --- /dev/null +++ b/util-cache/src/main/scala/sbt/util/CacheImplicits.scala @@ -0,0 +1,9 @@ +package sbt.util + +import sjsonnew.BasicJsonProtocol +import sbt.internal.util.HListFormats + +object CacheImplicits extends CacheImplicits +trait CacheImplicits extends BasicCacheImplicits + with BasicJsonProtocol + with HListFormats diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala b/util-cache/src/main/scala/sbt/util/CacheStore.scala similarity index 55% rename from internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala rename to util-cache/src/main/scala/sbt/util/CacheStore.scala index c7e2674b7..073a52281 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/CacheStore.scala +++ b/util-cache/src/main/scala/sbt/util/CacheStore.scala @@ -1,30 +1,55 @@ -package sbt.internal.util +package sbt.util import java.io.{ File, InputStream, OutputStream } import sbt.io.syntax.fileToRichFile import sbt.io.{ IO, Using } import sjsonnew.{ IsoString, JsonReader, JsonWriter, SupportConverter } +import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } +import scala.json.ast.unsafe.JValue /** A `CacheStore` is used by the caching infrastructure to persist cached information. */ -trait CacheStore extends Input with Output { +abstract class CacheStore extends Input with Output { /** Delete the persisted information. */ def delete(): Unit } -/** Factory that can derive new stores. */ -trait CacheStoreFactory { +object CacheStore { + implicit lazy val jvalueIsoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) + + /** Returns file-based CacheStore using standard JSON converter. 
*/ + def apply(cacheFile: File): CacheStore = file(cacheFile) + + /** Returns file-based CacheStore using standard JSON converter. */ + def file(cacheFile: File): CacheStore = new FileBasedStore[JValue](cacheFile, Converter) +} + +/** Factory that can make new stores. */ +abstract class CacheStoreFactory { /** Create a new store. */ - def derive(identifier: String): CacheStore + def make(identifier: String): CacheStore /** Create a new `CacheStoreFactory` from this factory. */ def sub(identifier: String): CacheStoreFactory + + /** A symbolic alias for `sub`. */ + final def /(identifier: String): CacheStoreFactory = sub(identifier) +} + +object CacheStoreFactory { + implicit lazy val jvalueIsoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) + + /** Returns directory-based CacheStoreFactory using standard JSON converter. */ + def apply(base: File): CacheStoreFactory = directory(base) + + /** Returns directory-based CacheStoreFactory using standard JSON converter. */ + def directory(base: File): CacheStoreFactory = new DirectoryStoreFactory[JValue](base, Converter) } /** A factory that creates new stores persisted in `base`. 
*/ class DirectoryStoreFactory[J: IsoString](base: File, converter: SupportConverter[J]) extends CacheStoreFactory { IO.createDirectory(base) - def derive(identifier: String): CacheStore = new FileBasedStore(base / identifier, converter) + def make(identifier: String): CacheStore = new FileBasedStore(base / identifier, converter) def sub(identifier: String): CacheStoreFactory = new DirectoryStoreFactory(base / identifier, converter) } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala similarity index 96% rename from internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala rename to util-cache/src/main/scala/sbt/util/FileInfo.scala index 5f4a3fd1e..6c42422d0 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt.internal.util +package sbt.util import java.io.File import scala.util.control.NonFatal @@ -40,6 +40,11 @@ object FilesInfo { implicit def format[F <: FileInfo: JsonFormat]: JsonFormat[FilesInfo[F]] = project(_.files, (fs: Set[F]) => FilesInfo(fs)) + + def full: FileInfo.Style = FileInfo.full + def hash: FileInfo.Style = FileInfo.hash + def lastModified: FileInfo.Style = FileInfo.lastModified + def exists: FileInfo.Style = FileInfo.exists } object FileInfo { diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala b/util-cache/src/main/scala/sbt/util/Input.scala similarity index 97% rename from internal/util-cache/src/main/scala/sbt/internal/util/Input.scala rename to util-cache/src/main/scala/sbt/util/Input.scala index 3426c117d..646660e59 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Input.scala +++ b/util-cache/src/main/scala/sbt/util/Input.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import java.io.{ Closeable, InputStream } import 
scala.util.control.NonFatal diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala b/util-cache/src/main/scala/sbt/util/Output.scala similarity index 96% rename from internal/util-cache/src/main/scala/sbt/internal/util/Output.scala rename to util-cache/src/main/scala/sbt/util/Output.scala index 0472adee4..cf4b27f12 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/Output.scala +++ b/util-cache/src/main/scala/sbt/util/Output.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import java.io.{ Closeable, OutputStream } import sjsonnew.{ IsoString, JsonWriter, SupportConverter } diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala b/util-cache/src/main/scala/sbt/util/SeparatedCache.scala similarity index 98% rename from internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala rename to util-cache/src/main/scala/sbt/util/SeparatedCache.scala index be8f11a38..c8772e034 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/SeparatedCache.scala +++ b/util-cache/src/main/scala/sbt/util/SeparatedCache.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009 Mark Harrah */ -package sbt.internal.util +package sbt.util import scala.util.Try diff --git a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala b/util-cache/src/main/scala/sbt/util/StampedFormat.scala similarity index 98% rename from internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala rename to util-cache/src/main/scala/sbt/util/StampedFormat.scala index 213f50ec5..c78186d1c 100644 --- a/internal/util-cache/src/main/scala/sbt/internal/util/StampedFormat.scala +++ b/util-cache/src/main/scala/sbt/util/StampedFormat.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import scala.reflect.Manifest diff --git a/internal/util-cache/src/test/scala/CacheSpec.scala b/util-cache/src/test/scala/CacheSpec.scala similarity index 97% rename from 
internal/util-cache/src/test/scala/CacheSpec.scala rename to util-cache/src/test/scala/CacheSpec.scala index a3b0dd5e1..109fd1247 100644 --- a/internal/util-cache/src/test/scala/CacheSpec.scala +++ b/util-cache/src/test/scala/CacheSpec.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import sbt.io.IO import sbt.io.syntax._ @@ -9,6 +9,7 @@ import sjsonnew.IsoString import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } import scala.json.ast.unsafe.JValue +import sbt.internal.util.UnitSpec class CacheSpec extends UnitSpec { diff --git a/internal/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala similarity index 93% rename from internal/util-cache/src/test/scala/FileInfoSpec.scala rename to util-cache/src/test/scala/FileInfoSpec.scala index ed7b4ec28..cae8e15b1 100644 --- a/internal/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -1,7 +1,8 @@ -package sbt.internal.util +package sbt.util import scala.json.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ +import sbt.internal.util.UnitSpec class FileInfoSpec extends UnitSpec { val file = new java.io.File(".").getAbsoluteFile diff --git a/internal/util-cache/src/test/scala/HListFormatSpec.scala b/util-cache/src/test/scala/HListFormatSpec.scala similarity index 93% rename from internal/util-cache/src/test/scala/HListFormatSpec.scala rename to util-cache/src/test/scala/HListFormatSpec.scala index 23e4cde0f..e2d5d38fa 100644 --- a/internal/util-cache/src/test/scala/HListFormatSpec.scala +++ b/util-cache/src/test/scala/HListFormatSpec.scala @@ -1,8 +1,9 @@ -package sbt.internal.util +package sbt.util import scala.json.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import CacheImplicits._ +import sbt.internal.util.{ UnitSpec, HNil } class HListFormatSpec extends UnitSpec { val quux = 23 :+: "quux" :+: true :+: HNil diff --git 
a/internal/util-cache/src/test/scala/SingletonCacheSpec.scala b/util-cache/src/test/scala/SingletonCacheSpec.scala similarity index 98% rename from internal/util-cache/src/test/scala/SingletonCacheSpec.scala rename to util-cache/src/test/scala/SingletonCacheSpec.scala index da883d446..5956746de 100644 --- a/internal/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import sbt.io.IO import sbt.io.syntax._ @@ -9,6 +9,7 @@ import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilde import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } import scala.json.ast.unsafe.JValue +import sbt.internal.util.UnitSpec class SingletonCacheSpec extends UnitSpec { diff --git a/internal/util-tracking/NOTICE b/util-tracking/NOTICE similarity index 100% rename from internal/util-tracking/NOTICE rename to util-tracking/NOTICE diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala b/util-tracking/src/main/scala/sbt/util/ChangeReport.scala similarity index 99% rename from internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala rename to util-tracking/src/main/scala/sbt/util/ChangeReport.scala index 801fc22cf..af8154729 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/ChangeReport.scala +++ b/util-tracking/src/main/scala/sbt/util/ChangeReport.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package sbt.internal.util +package sbt.util object ChangeReport { def modified[T](files: Set[T]): ChangeReport[T] = diff --git a/util-tracking/src/main/scala/sbt/util/FileFunction.scala b/util-tracking/src/main/scala/sbt/util/FileFunction.scala new file mode 100644 index 000000000..cdc1cc4e8 --- /dev/null +++ b/util-tracking/src/main/scala/sbt/util/FileFunction.scala @@ -0,0 +1,143 @@ +package sbt.util + +import java.io.File + 
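The `FileFunction.cached` overloads defined in this new file can be used roughly as follows (the cache directory and the work function are hypothetical; the returned function re-runs the work only when the tracked input or output state changes):

```scala
import java.io.File
import sbt.util.FileFunction

// Hypothetical sketch: a generator step that is skipped when the inputs
// are unchanged (by last-modified time, the default inStyle) and the
// outputs still exist (the default outStyle).
val cachedGenerate: Set[File] => Set[File] =
  FileFunction.cached(new File("target/filefn-cache")) { inputs =>
    inputs.map(in => new File("target/generated", in.getName)) // stand-in work
  }
```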
+object FileFunction { + type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => Set[File] + private val defaultInStyle = FileInfo.lastModified + private val defaultOutStyle = FileInfo.exists + + /** + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run. Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files. If a change in file state / input files set is detected, the + * action function is re-executed. + * + * @param cacheBaseDirectory The folder in which to store the cache + * @param action The work function, which receives a list of input files and returns a list of output files + */ + def cached(cacheBaseDirectory: File)(action: Set[File] => Set[File]): Set[File] => Set[File] = + cached(cacheBaseDirectory, inStyle = defaultInStyle, outStyle = defaultOutStyle)(action) + + /** + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run. Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files.
If a change in file state / input files set is detected, the + * action function is re-executed. + * + * @param cacheBaseDirectory The folder in which to store the cache + * @param inStyle The strategy by which to detect state change in the input files from the previous run + * @param action The work function, which receives a list of input files and returns a list of output files + */ + def cached(cacheBaseDirectory: File, inStyle: FileInfo.Style)(action: Set[File] => Set[File]): Set[File] => Set[File] = + cached(cacheBaseDirectory, inStyle = inStyle, outStyle = defaultOutStyle)(action) + + /** + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run. Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files. If a change in file state / input files set is detected, the + * action function is re-executed.
+ * + * @param cacheBaseDirectory The folder in which to store the cache + * @param inStyle The strategy by which to detect state change in the input files from the previous run + * @param outStyle The strategy by which to detect state change in the output files from the previous run + * @param action The work function, which receives a list of input files and returns a list of output files + */ + def cached(cacheBaseDirectory: File, inStyle: FileInfo.Style, outStyle: FileInfo.Style)(action: Set[File] => Set[File]): Set[File] => Set[File] = + cached(CacheStoreFactory(cacheBaseDirectory), inStyle, outStyle)((in, out) => action(in.checked)) + + /** + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run. Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files. If a change in file state / input files set is detected, the + * action function is re-executed. + * + * @param storeFactory The factory to use to get stores for the input and output files. + * @param action The work function, which receives a list of input files and returns a list of output files + */ + def cached(storeFactory: CacheStoreFactory)(action: UpdateFunction): Set[File] => Set[File] = + cached(storeFactory, inStyle = defaultInStyle, outStyle = defaultOutStyle)(action) + + /** + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run.
Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files. If a change in file state / input files set is detected, the + * action function is re-executed. + * + * @param storeFactory The factory to use to get stores for the input and output files. + * @param inStyle The strategy by which to detect state change in the input files from the previous run + * @param action The work function, which receives a list of input files and returns a list of output files + */ + def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = + cached(storeFactory, inStyle = inStyle, outStyle = defaultOutStyle)(action) + + /** + * Generic change-detection helper used to help build / artifact generation / + * etc. steps detect whether or not they need to run. Returns a function whose + * input is a Set of input files, and subsequently executes the action function + * (which does the actual work: compiles, generates resources, etc.), returning + * a Set of output files that it generated. + * + * The input file and resulting output file state is cached in stores issued by + * `storeFactory`. On each invocation, the state of the input and output + * files from the previous run is compared against the cache, as is the set of + * input files. If a change in file state / input files set is detected, the + * action function is re-executed. + * + * @param storeFactory The factory to use to get stores for the input and output files. 
+ * @param inStyle The strategy by which to detect state change in the input files from the previous run + * @param outStyle The strategy by which to detect state change in the output files from the previous run + * @param action The work function, which receives a list of input files and returns a list of output files + */ + def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style, outStyle: FileInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = + { + lazy val inCache = Difference.inputs(storeFactory.make("in-cache"), inStyle) + lazy val outCache = Difference.outputs(storeFactory.make("out-cache"), outStyle) + inputs => + { + inCache(inputs) { inReport => + outCache { outReport => + if (inReport.modified.isEmpty && outReport.modified.isEmpty) + outReport.checked + else + action(inReport, outReport) + } + } + } + } +} diff --git a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala similarity index 72% rename from internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala rename to util-tracking/src/main/scala/sbt/util/Tracked.scala index 4db2acc0c..dcd5658e9 100644 --- a/internal/util-tracking/src/main/scala/sbt/internal/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -1,7 +1,7 @@ /* sbt -- Simple Build Tool * Copyright 2009, 2010 Mark Harrah */ -package sbt.internal.util +package sbt.util import scala.util.{ Failure, Try, Success } @@ -15,22 +15,48 @@ object Tracked { import CacheImplicits.LongJsonFormat + /** + * Creates a tracker that provides the last time it was evaluated. + * The timestamp is not updated if the function throws an exception. + */ + def tstamp(store: CacheStore): Timestamp = tstamp(store, true) + + /** + * Creates a tracker that provides the last time it was evaluated. + * The timestamp is not updated if the function throws an exception.
+ */ + def tstamp(cacheFile: File): Timestamp = tstamp(CacheStore(cacheFile)) + /** * Creates a tracker that provides the last time it was evaluated. * If 'useStartTime' is true, the recorded time is the start of the evaluated function. * If 'useStartTime' is false, the recorded time is when the evaluated function completes. * In both cases, the timestamp is not updated if the function throws an exception. */ - def tstamp(store: CacheStore, useStartTime: Boolean = true): Timestamp = new Timestamp(store, useStartTime) + def tstamp(store: CacheStore, useStartTime: Boolean): Timestamp = new Timestamp(store, useStartTime) + + /** + * Creates a tracker that provides the last time it was evaluated. + * If 'useStartTime' is true, the recorded time is the start of the evaluated function. + * If 'useStartTime' is false, the recorded time is when the evaluated function completes. + * In both cases, the timestamp is not updated if the function throws an exception. + */ + def tstamp(cacheFile: File, useStartTime: Boolean): Timestamp = tstamp(CacheStore(cacheFile), useStartTime) /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ def diffInputs(store: CacheStore, style: FileInfo.Style): Difference = Difference.inputs(store, style) + /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ + def diffInputs(cacheFile: File, style: FileInfo.Style): Difference = diffInputs(CacheStore(cacheFile), style) + /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ def diffOutputs(store: CacheStore, style: FileInfo.Style): Difference = Difference.outputs(store, style) + /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ + def diffOutputs(cacheFile: File, style: FileInfo.Style): Difference = diffOutputs(CacheStore(cacheFile), style) + /** Creates a tracker that 
provides the output of the most recent invocation of the function */ def lastOutput[I, O: JsonFormat](store: CacheStore)(f: (I, Option[O]) => O): I => O = { in => val previous = Try { store.read[O] }.toOption @@ -39,6 +65,10 @@ object Tracked { next } + /** Creates a tracker that provides the output of the most recent invocation of the function */ + def lastOutput[I, O: JsonFormat](cacheFile: File)(f: (I, Option[O]) => O): I => O = + lastOutput(CacheStore(cacheFile))(f) + /** * Creates a tracker that indicates whether the arguments given to f have changed since the most * recent invocation. @@ -53,6 +83,13 @@ object Tracked { result } + /** + * Creates a tracker that indicates whether the arguments given to f have changed since the most + * recent invocation. + */ + def inputChanged[I: JsonFormat: SingletonCache, O](cacheFile: File)(f: (Boolean, I) => O): I => O = + inputChanged(CacheStore(cacheFile))(f) + private final class CacheHelp[I: JsonFormat](val sc: SingletonCache[I]) { def save(store: CacheStore, value: I): Unit = { store.write(value) @@ -169,45 +206,3 @@ class Difference(val store: CacheStore, val style: FileInfo.Style, val defineCle result } } - -object FileFunction { - type UpdateFunction = (ChangeReport[File], ChangeReport[File]) => Set[File] - - /** - * Generic change-detection helper used to help build / artifact generation / - * etc. steps detect whether or not they need to run. Returns a function whose - * input is a Set of input files, and subsequently executes the action function - * (which does the actual work: compiles, generates resources, etc.), returning - * a Set of output files that it generated. - * - * The input file and resulting output file state is cached in stores issued by - * `storeFactory`. On each invocation, the state of the input and output - * files from the previous run is compared against the cache, as is the set of - * input files. 
If a change in file state / input files set is detected, the - * action function is re-executed. - * - * @param storeFactory The factory to use to get stores for the input and output files. - * @param inStyle The strategy by which to detect state change in the input files from the previous run - * @param outStyle The strategy by which to detect state change in the output files from the previous run - * @param action The work function, which receives a list of input files and returns a list of output files - */ - def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style = FileInfo.lastModified, outStyle: FileInfo.Style = FileInfo.exists)(action: Set[File] => Set[File]): Set[File] => Set[File] = - cached(storeFactory)(inStyle, outStyle)((in, out) => action(in.checked)) - - def cached(storeFactory: CacheStoreFactory)(inStyle: FileInfo.Style, outStyle: FileInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = - { - lazy val inCache = Difference.inputs(storeFactory.derive("in-cache"), inStyle) - lazy val outCache = Difference.outputs(storeFactory.derive("out-cache"), outStyle) - inputs => - { - inCache(inputs) { inReport => - outCache { outReport => - if (inReport.modified.isEmpty && outReport.modified.isEmpty) - outReport.checked - else - action(inReport, outReport) - } - } - } - } -} diff --git a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala similarity index 90% rename from internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala rename to util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index df23ae8e7..90513cad6 100644 --- a/internal/util-tracking/src/test/scala/sbt/internal/util/TrackedSpec.scala +++ b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -1,4 +1,4 @@ -package sbt.internal.util +package sbt.util import sbt.io.IO import sbt.io.syntax._ @@ -6,14 +6,9 @@ import sbt.io.syntax._ import CacheImplicits._ import 
sjsonnew.IsoString -import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } - -import scala.json.ast.unsafe.JValue +import sbt.internal.util.UnitSpec class TrackedSpec extends UnitSpec { - - implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) - "lastOutput" should "store the last output" in { withStore { store => @@ -133,7 +128,7 @@ class TrackedSpec extends UnitSpec { private def withStore(f: CacheStore => Unit): Unit = IO.withTemporaryDirectory { tmp => - val store = new FileBasedStore(tmp / "cache-store", Converter) + val store = CacheStore(tmp / "cache-store") f(store) } From 122c738913ebf259496306937d58bfc1158ad955 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 17 May 2017 00:18:24 -0400 Subject: [PATCH 667/823] bump version --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index eb9ede57b..60aa33c28 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "1.0.0-M21" +def baseVersion: String = "1.0.0-M24" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From feeb6291cd7f58807917bb5efb75a59ce523a456 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 18 May 2017 00:34:41 -0400 Subject: [PATCH 668/823] Add toString for ObjectEvent --- .../src/main/scala/sbt/internal/util/ObjectEvent.scala | 2 ++ 1 file changed, 2 insertions(+) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala index fdf4d8789..c75e09a1e 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala @@ -15,6 +15,8 @@ final class ObjectEvent[A]( val contentType: String, val json: JValue ) extends Serializable { 
+ override def toString: String = + s"ObjectEvent($level, $message, $channelName, $execId, $contentType, $json)" } object ObjectEvent { From f2cc5ee775d74e9c8ca2db6a091a735fc0d72933 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 18 May 2017 00:36:24 -0400 Subject: [PATCH 669/823] convert log4j async LogEvent to an immutable one This fixes the buffered log not showing up for tests. Ref sbt/sbt#3198 --- .../src/main/scala/sbt/internal/util/BufferedLogger.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index d3b6972bc..a8399e2b6 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -38,7 +38,7 @@ class BufferedAppender private[BufferedAppender] (name: String, delegate: Append def append(event: XLogEvent): Unit = { if (recording) { - buffer += event + buffer += event.toImmutable } else delegate.append(event) } From 44e64437b0bcc06b81f96448ffba5adcf9dfbb47 Mon Sep 17 00:00:00 2001 From: Rogach Date: Tue, 13 Jun 2017 15:19:45 +0300 Subject: [PATCH 670/823] Upgrade to jline 2.14.4 A recent ncurses upgrade breaks older jlines. 
https://github.com/sbt/sbt/issues/3240 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index c39981812..278b298dc 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -28,7 +28,7 @@ object Dependencies { def addSbtIO(p: Project): Project = addSbtModule(p, sbtIoPath, "io", sbtIO) - val jline = "jline" % "jline" % "2.14.3" + val jline = "jline" % "jline" % "2.14.4" val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } From 244bf0f6e14a32e0fdac8cf7c1bb433adccb1f55 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 14 Jun 2017 01:30:26 -0400 Subject: [PATCH 671/823] Bump to using sbt 1.0.0-M6 --- .travis.yml | 6 +++--- build.sbt | 9 +++++---- project/Dependencies.scala | 16 +++++++++++----- project/bintray.sbt | 1 - project/build.properties | 2 +- project/contraband.sbt | 1 - project/doge.sbt | 1 - project/house.sbt | 1 - project/pgp.sbt | 1 - project/plugins.sbt | 4 ++++ 10 files changed, 24 insertions(+), 18 deletions(-) delete mode 100644 project/bintray.sbt delete mode 100644 project/contraband.sbt delete mode 100644 project/doge.sbt delete mode 100644 project/house.sbt delete mode 100644 project/pgp.sbt create mode 100644 project/plugins.sbt diff --git a/.travis.yml b/.travis.yml index e19b3364a..d428e1dbc 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,11 +1,11 @@ language: scala scala: - - 2.11.8 - - 2.12.1 + - 2.11.11 + - 2.12.2 script: - - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "plz $TRAVIS_SCALA_VERSION test" + - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" test jdk: - oraclejdk8 diff --git a/build.sbt b/build.sbt index 60aa33c28..36db22d09 100644 --- a/build.sbt +++ b/build.sbt @@ -1,6 +1,6 @@ import Dependencies._ import Util._ -import 
com.typesafe.tools.mima.core._, ProblemFilters._ +// import com.typesafe.tools.mima.core._, ProblemFilters._ def baseVersion: String = "1.0.0-M24" def internalPath = file("internal") @@ -19,12 +19,13 @@ def commonSettings: Seq[Setting[_]] = Seq( val old = scalacOptions.value scalaVersion.value match { case sv if sv.startsWith("2.10") => old diff List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") - case _ => old ++ List("-Ywarn-unused", "-Ywarn-unused-import") + case sv if sv.startsWith("2.11") => old ++ List("-Ywarn-unused", "-Ywarn-unused-import") + case _ => old ++ List("-Ywarn-unused", "-Ywarn-unused-import", "-YdisableFlatCpCaching") } }, scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", - mimaPreviousArtifacts := Set(), // Some(organization.value %% moduleName.value % "1.0.0"), + // mimaPreviousArtifacts := Set(), // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false ) @@ -156,7 +157,7 @@ lazy val utilTesting = (project in internalPath / "util-testing"). configure(addSbtIO) lazy val utilScripted = (project in internalPath / "util-scripted"). - dependsOn(utilLogging). + dependsOn(utilLogging, utilInterface). 
settings( commonSettings, name := "Util Scripted", diff --git a/project/Dependencies.scala b/project/Dependencies.scala index c39981812..225987869 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -3,8 +3,8 @@ import Keys._ object Dependencies { val scala210 = "2.10.6" - val scala211 = "2.11.8" - val scala212 = "2.12.1" + val scala211 = "2.11.11" + val scala212 = "2.12.2" private val ioVersion = "1.0.0-M10" @@ -20,10 +20,16 @@ object Dependencies { lazy val sbtIoPath = getSbtModulePath("sbtio.path", "sbt/io") - def addSbtModule(p: Project, path: Option[String], projectName: String, m: ModuleID, c: Option[Configuration] = None) = + def addSbtModule(p: Project, + path: Option[String], + projectName: String, + m: ModuleID, + c: Option[Configuration] = None) = path match { - case Some(f) => p dependsOn c.fold[ClasspathDependency](ProjectRef(file(f), projectName))(ProjectRef(file(f), projectName) % _) - case None => p settings (libraryDependencies += c.fold(m)(m % _)) + case Some(f) => + p dependsOn c.fold[ClasspathDep[ProjectReference]](ProjectRef(file(f), projectName))( + ProjectRef(file(f), projectName) % _) + case None => p settings (libraryDependencies += c.fold(m)(m % _)) } def addSbtIO(p: Project): Project = addSbtModule(p, sbtIoPath, "io", sbtIO) diff --git a/project/bintray.sbt b/project/bintray.sbt deleted file mode 100644 index 8dd913f98..000000000 --- a/project/bintray.sbt +++ /dev/null @@ -1 +0,0 @@ -addSbtPlugin("me.lessis" % "bintray-sbt" % "0.3.0") diff --git a/project/build.properties b/project/build.properties index 64317fdae..cd66fd542 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=0.13.15 +sbt.version=1.0.0-M6 diff --git a/project/contraband.sbt b/project/contraband.sbt deleted file mode 100644 index 8a80f6ea1..000000000 --- a/project/contraband.sbt +++ /dev/null @@ -1 +0,0 @@ -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M4") diff --git a/project/doge.sbt 
b/project/doge.sbt deleted file mode 100644 index e1274c941..000000000 --- a/project/doge.sbt +++ /dev/null @@ -1 +0,0 @@ -addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.5") diff --git a/project/house.sbt b/project/house.sbt deleted file mode 100644 index bad061ebe..000000000 --- a/project/house.sbt +++ /dev/null @@ -1 +0,0 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.2") diff --git a/project/pgp.sbt b/project/pgp.sbt deleted file mode 100644 index 4ce4d9ed4..000000000 --- a/project/pgp.sbt +++ /dev/null @@ -1 +0,0 @@ -addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.0.0") diff --git a/project/plugins.sbt b/project/plugins.sbt new file mode 100644 index 000000000..c22194daf --- /dev/null +++ b/project/plugins.sbt @@ -0,0 +1,4 @@ +addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.4.0") +addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.0-M1") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M5") From 03213f84c8e6b12d7fcfc367f7214cf93e69bf43 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 21 Jun 2017 11:36:02 +0100 Subject: [PATCH 672/823] Get rid of Eval --- .../src/main/scala/sbt/util/Eval.scala | 296 ------------------ 1 file changed, 296 deletions(-) delete mode 100644 internal/util-collection/src/main/scala/sbt/util/Eval.scala diff --git a/internal/util-collection/src/main/scala/sbt/util/Eval.scala b/internal/util-collection/src/main/scala/sbt/util/Eval.scala deleted file mode 100644 index abfb38070..000000000 --- a/internal/util-collection/src/main/scala/sbt/util/Eval.scala +++ /dev/null @@ -1,296 +0,0 @@ -package sbt.util - -import scala.annotation.tailrec - -// Copied from Cats (MIT license) - -/** - * Eval is a monad which controls evaluation. - * - * This type wraps a value (or a computation that produces a value) - * and can produce it on command via the `.value` method. 
- * - * There are three basic evaluation strategies: - * - * - Now: evaluated immediately - * - Later: evaluated once when value is needed - * - Always: evaluated every time value is needed - * - * The Later and Always are both lazy strategies while Now is eager. - * Later and Always are distinguished from each other only by - * memoization: once evaluated Later will save the value to be returned - * immediately if it is needed again. Always will run its computation - * every time. - * - * Eval supports stack-safe lazy computation via the .map and .flatMap - * methods, which use an internal trampoline to avoid stack overflows. - * Computation done within .map and .flatMap is always done lazily, - * even when applied to a Now instance. - * - * It is not generally good style to pattern-match on Eval instances. - * Rather, use .map and .flatMap to chain computation, and use .value - * to get the result when needed. It is also not good style to create - * Eval instances whose computation involves calling .value on another - * Eval instance -- this can defeat the trampolining and lead to stack - * overflows. - */ -sealed abstract class Eval[+A] extends Serializable { self => - - /** - * Evaluate the computation and return an A value. - * - * For lazy instances (Later, Always), any necessary computation - * will be performed at this point. For eager instances (Now), a - * value will be immediately returned. - */ - def value: A - - /** - * Transform an Eval[A] into an Eval[B] given the transformation - * function `f`. - * - * This call is stack-safe -- many .map calls may be chained without - * consumed additional stack during evaluation. - * - * Computation performed in f is always lazy, even when called on an - * eager (Now) instance. - */ - def map[B](f: A => B): Eval[B] = - flatMap(a => Now(f(a))) - - /** - * Lazily perform a computation based on an Eval[A], using the - * function `f` to produce an Eval[B] given an A. 
- * - * This call is stack-safe -- many .flatMap calls may be chained - * without consumed additional stack during evaluation. It is also - * written to avoid left-association problems, so that repeated - * calls to .flatMap will be efficiently applied. - * - * Computation performed in f is always lazy, even when called on an - * eager (Now) instance. - */ - def flatMap[B](f: A => Eval[B]): Eval[B] = - this match { - case c: Eval.Compute[A] => - new Eval.Compute[B] { - type Start = c.Start - // See https://issues.scala-lang.org/browse/SI-9931 for an explanation - // of why the type annotations are necessary in these two lines on - // Scala 2.12.0. - val start: () => Eval[Start] = c.start - val run: Start => Eval[B] = (s: c.Start) => - new Eval.Compute[B] { - type Start = A - val start = () => c.run(s) - val run = f - } - } - case c: Eval.Call[A] => - new Eval.Compute[B] { - type Start = A - val start = c.thunk - val run = f - } - case _ => - new Eval.Compute[B] { - type Start = A - val start = () => self - val run = f - } - } - - /** - * Ensure that the result of the computation (if any) will be - * memoized. - * - * Practically, this means that when called on an Always[A] a - * Later[A] with an equivalent computation will be returned. - */ - def memoize: Eval[A] -} - -/** - * Construct an eager Eval[A] instance. - * - * In some sense it is equivalent to using a val. - * - * This type should be used when an A value is already in hand, or - * when the computation to produce an A value is pure and very fast. - */ -final case class Now[A](value: A) extends Eval[A] { - def memoize: Eval[A] = this -} - -/** - * Construct a lazy Eval[A] instance. - * - * This type should be used for most "lazy" values. In some sense it - * is equivalent to using a lazy val. - * - * When caching is not required or desired (e.g. if the value produced - * may be large) prefer Always. When there is no computation - * necessary, prefer Now. 
- * - * Once Later has been evaluated, the closure (and any values captured - * by the closure) will not be retained, and will be available for - * garbage collection. - */ -final class Later[A](f: () => A) extends Eval[A] { - private[this] var thunk: () => A = f - - // The idea here is that `f` may have captured very large - // structures, but produce a very small result. In this case, once - // we've calculated a value, we would prefer to be able to free - // everything else. - // - // (For situations where `f` is small, but the output will be very - // expensive to store, consider using `Always`.) - lazy val value: A = { - val result = thunk() - thunk = null // scalastyle:off - result - } - - def memoize: Eval[A] = this -} - -object Later { - def apply[A](a: => A): Later[A] = new Later(a _) -} - -/** - * Construct a lazy Eval[A] instance. - * - * This type can be used for "lazy" values. In some sense it is - * equivalent to using a Function0 value. - * - * This type will evaluate the computation every time the value is - * required. It should be avoided except when laziness is required and - * caching must be avoided. Generally, prefer Later. - */ -final class Always[A](f: () => A) extends Eval[A] { - def value: A = f() - def memoize: Eval[A] = new Later(f) -} - -object Always { - def apply[A](a: => A): Always[A] = new Always(a _) -} - -object Eval { - - /** - * Construct an eager Eval[A] value (i.e. Now[A]). - */ - def now[A](a: A): Eval[A] = Now(a) - - /** - * Construct a lazy Eval[A] value with caching (i.e. Later[A]). - */ - def later[A](a: => A): Eval[A] = new Later(a _) - - /** - * Construct a lazy Eval[A] value without caching (i.e. Always[A]). - */ - def always[A](a: => A): Eval[A] = new Always(a _) - - /** - * Defer a computation which produces an Eval[A] value. - * - * This is useful when you want to delay execution of an expression - * which produces an Eval[A] value. Like .flatMap, it is stack-safe. 
- */ - def defer[A](a: => Eval[A]): Eval[A] = - new Eval.Call[A](a _) {} - - /** - * Static Eval instances for some common values. - * - * These can be useful in cases where the same values may be needed - * many times. - */ - val Unit: Eval[Unit] = Now(()) - val True: Eval[Boolean] = Now(true) - val False: Eval[Boolean] = Now(false) - val Zero: Eval[Int] = Now(0) - val One: Eval[Int] = Now(1) - - /** - * Call is a type of Eval[A] that is used to defer computations - * which produce Eval[A]. - * - * Users should not instantiate Call instances themselves. Instead, - * they will be automatically created when needed. - */ - sealed abstract class Call[A](val thunk: () => Eval[A]) extends Eval[A] { - def memoize: Eval[A] = new Later(() => value) - def value: A = Call.loop(this).value - } - - object Call { - /** Collapse the call stack for eager evaluations */ - @tailrec private def loop[A](fa: Eval[A]): Eval[A] = fa match { - case call: Eval.Call[A] => - loop(call.thunk()) - case compute: Eval.Compute[A] => - new Eval.Compute[A] { - type Start = compute.Start - val start: () => Eval[Start] = () => compute.start() - val run: Start => Eval[A] = s => loop1(compute.run(s)) - } - case other => other - } - - /** - * Alias for loop that can be called in a non-tail position - * from an otherwise tailrec-optimized loop. - */ - private def loop1[A](fa: Eval[A]): Eval[A] = loop(fa) - } - - /** - * Compute is a type of Eval[A] that is used to chain computations - * involving .map and .flatMap. Along with Eval#flatMap it - * implements the trampoline that guarantees stack-safety. - * - * Users should not instantiate Compute instances - * themselves. Instead, they will be automatically created when - * needed. - * - * Unlike a traditional trampoline, the internal workings of the - * trampoline are not exposed. This allows a slightly more efficient - * implementation of the .value method. 
- */ - sealed abstract class Compute[A] extends Eval[A] { - type Start - val start: () => Eval[Start] - val run: Start => Eval[A] - - def memoize: Eval[A] = Later(value) - - def value: A = { - type L = Eval[Any] - type C = Any => Eval[Any] - @tailrec def loop(curr: L, fs: List[C]): Any = - curr match { - case c: Compute[_] => - c.start() match { - case cc: Compute[_] => - loop( - cc.start().asInstanceOf[L], - cc.run.asInstanceOf[C] :: c.run.asInstanceOf[C] :: fs - ) - case xx => - loop(c.run(xx.value), fs) - } - case x => - fs match { - case f :: fs => loop(f(x.value), fs) - case Nil => x.value - } - } - loop(this.asInstanceOf[L], Nil).asInstanceOf[A] - } - } -} From 32d6bf38cf78e08316442159900d6135b8f417db Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 26 Jun 2017 23:43:21 -0400 Subject: [PATCH 673/823] send TraceEvent on crash Fixes sbt/sbt#3234 --- build.sbt | 15 +++++- .../sbt/internal/util/AbstractEntry.scala | 2 +- .../sbt/internal/util/StringEvent.scala | 2 +- .../sbt/internal/util/TraceEvent.scala | 51 +++++++++++++++++++ .../util/codec/AbstractEntryFormats.scala | 7 +-- .../internal/util/codec/JsonProtocol.scala | 1 + .../util/codec/StringEventFormats.scala | 2 +- .../util/codec/TraceEventFormats.scala | 33 ++++++++++++ .../src/main/contraband/logging.contra | 7 +++ .../sbt/internal/util/ManagedLogger.scala | 10 +++- .../internal/util/codec/JValueFormats.scala | 8 +-- .../util/codec/ThrowableShowLines.scala | 25 +++++++++ project/Dependencies.scala | 2 +- project/plugins.sbt | 2 +- .../src/main/scala/sbt/util/FileInfo.scala | 4 +- 15 files changed, 153 insertions(+), 18 deletions(-) create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala diff --git a/build.sbt b/build.sbt index 
36db22d09..1b77cc349 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ // import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion: String = "1.0.0-M24" +def baseVersion = "1.0.0-SNAPSHOT" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -39,6 +39,11 @@ lazy val utilRoot: Project = (project in file(".")). settings( inThisBuild(Seq( git.baseVersion := baseVersion, + version := { + val v = version.value + if (v contains "SNAPSHOT") git.baseVersion.value + else v + }, bintrayPackage := "util", homepage := Some(url("https://github.com/sbt/util")), description := "Util module for sbt", @@ -108,7 +113,13 @@ lazy val utilLogging = (project in internalPath / "util-logging"). crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson, scalaReflect.value), - sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala" + sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala", + contrabandFormatsForType in generateContrabands in Compile := { tpe => + val old = (contrabandFormatsForType in generateContrabands in Compile).value + val name = tpe.removeTypeParameters.name + if (name == "Throwable") Nil + else old(tpe) + }, ) // Relation diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala index c20b78e1b..5f8b37c07 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala @@ -16,7 +16,7 @@ abstract class AbstractEntry( case _ => false } override def hashCode: Int = { - 37 * (37 * (17 + channelName.##) + execId.##) + 37 * (37 
* (37 * (17 + "AbstractEntry".##) + channelName.##) + execId.##) } override def toString: String = { "AbstractEntry(" + channelName + ", " + execId + ")" diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala index 4ac959836..71763458c 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala @@ -17,7 +17,7 @@ final class StringEvent private ( case _ => false } override def hashCode: Int = { - 37 * (37 * (37 * (37 * (17 + level.##) + message.##) + channelName.##) + execId.##) + 37 * (37 * (37 * (37 * (37 * (17 + "StringEvent".##) + level.##) + message.##) + channelName.##) + execId.##) } override def toString: String = { "StringEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala new file mode 100644 index 000000000..85312aff4 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala @@ -0,0 +1,51 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +final class TraceEvent private ( + val level: String, + val message: Throwable, + channelName: Option[String], + execId: Option[String]) extends sbt.internal.util.AbstractEntry(channelName, execId) with Serializable { + + + + override def equals(o: Any): Boolean = o match { + case x: TraceEvent => (this.level == x.level) && (this.message == x.message) && (this.channelName == x.channelName) && (this.execId == x.execId) + case _ => false + } + override def hashCode: Int = { + 37 * (37 * (37 * (37 * (37 * (17 + "TraceEvent".##) + level.##) + message.##) + channelName.##) + execId.##) + } + override def toString: String = { + "TraceEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" + } + protected[this] def copy(level: String = level, message: Throwable = message, channelName: Option[String] = channelName, execId: Option[String] = execId): TraceEvent = { + new TraceEvent(level, message, channelName, execId) + } + def withLevel(level: String): TraceEvent = { + copy(level = level) + } + def withMessage(message: Throwable): TraceEvent = { + copy(message = message) + } + def withChannelName(channelName: Option[String]): TraceEvent = { + copy(channelName = channelName) + } + def withChannelName(channelName: String): TraceEvent = { + copy(channelName = Option(channelName)) + } + def withExecId(execId: Option[String]): TraceEvent = { + copy(execId = execId) + } + def withExecId(execId: String): TraceEvent = { + copy(execId = Option(execId)) + } +} +object TraceEvent { + + def apply(level: String, message: Throwable, channelName: Option[String], execId: Option[String]): TraceEvent = new TraceEvent(level, message, channelName, execId) + def apply(level: String, message: Throwable, channelName: String, execId: String): TraceEvent = new TraceEvent(level, message, Option(channelName), Option(execId)) +} diff --git 
a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala index 4eed06c7b..55784f9ac 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala @@ -4,7 +4,8 @@ // DO NOT EDIT MANUALLY package sbt.internal.util.codec -import _root_.sjsonnew.{ deserializationError, serializationError, Builder, JsonFormat, Unbuilder } -trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats => -implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat1[sbt.internal.util.AbstractEntry, sbt.internal.util.StringEvent]("type") + +import _root_.sjsonnew.JsonFormat +trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats with sbt.internal.util.codec.TraceEventFormats => +implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat2[sbt.internal.util.AbstractEntry, sbt.internal.util.StringEvent, sbt.internal.util.TraceEvent]("type") } diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala index 39484f2e0..4696c9612 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -6,5 +6,6 @@ package sbt.internal.util.codec trait JsonProtocol extends sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats + with sbt.internal.util.codec.TraceEventFormats with sbt.internal.util.codec.AbstractEntryFormats object 
JsonProtocol extends JsonProtocol \ No newline at end of file diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala index c005071e7..2d142f6ec 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala @@ -4,7 +4,7 @@ // DO NOT EDIT MANUALLY package sbt.internal.util.codec -import _root_.sjsonnew.{ deserializationError, serializationError, Builder, JsonFormat, Unbuilder } +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait StringEventFormats { self: sjsonnew.BasicJsonProtocol => implicit lazy val StringEventFormat: JsonFormat[sbt.internal.util.StringEvent] = new JsonFormat[sbt.internal.util.StringEvent] { override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.StringEvent = { diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala new file mode 100644 index 000000000..379196a9b --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala @@ -0,0 +1,33 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } +trait TraceEventFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val TraceEventFormat: JsonFormat[sbt.internal.util.TraceEvent] = new JsonFormat[sbt.internal.util.TraceEvent] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.TraceEvent = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val level = unbuilder.readField[String]("level") + val message = unbuilder.readField[Throwable]("message") + val channelName = unbuilder.readField[Option[String]]("channelName") + val execId = unbuilder.readField[Option[String]]("execId") + unbuilder.endObject() + sbt.internal.util.TraceEvent(level, message, channelName, execId) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: sbt.internal.util.TraceEvent, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("level", obj.level) + builder.addField("message", obj.message) + builder.addField("channelName", obj.channelName) + builder.addField("execId", obj.execId) + builder.endObject() + } +} +} diff --git a/internal/util-logging/src/main/contraband/logging.contra b/internal/util-logging/src/main/contraband/logging.contra index 67a4b3a04..5cd31c230 100644 --- a/internal/util-logging/src/main/contraband/logging.contra +++ b/internal/util-logging/src/main/contraband/logging.contra @@ -14,3 +14,10 @@ type StringEvent implements sbt.internal.util.AbstractEntry { channelName: String execId: String } + +type TraceEvent implements sbt.internal.util.AbstractEntry { + level: String! + message: Throwable! 
+ channelName: String + execId: String +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 24b9a8e02..c66ceb4d2 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -1,10 +1,13 @@ package sbt.internal.util import sbt.util._ -import org.apache.logging.log4j.{ Logger => XLogger } +import org.apache.logging.log4j.{ Logger => XLogger, Level => XLevel } import org.apache.logging.log4j.message.ObjectMessage import sjsonnew.JsonFormat import scala.reflect.runtime.universe.TypeTag +import sbt.internal.util.codec.ThrowableShowLines._ +import sbt.internal.util.codec.TraceEventShowLines._ +import sbt.internal.util.codec.JsonProtocol._ /** * Delegates log events to the associated LogExchange. @@ -15,7 +18,8 @@ class ManagedLogger( val execId: Option[String], xlogger: XLogger ) extends Logger { - override def trace(t: => Throwable): Unit = () // exchange.appendLog(new Trace(t)) + override def trace(t: => Throwable): Unit = + logEvent(Level.Error, TraceEvent("Error", t, channelName, execId)) override def log(level: Level.Value, message: => String): Unit = { xlogger.log( @@ -32,6 +36,8 @@ class ManagedLogger( // println(s"registerStringCodec ${tag.key}") val _ = LogExchange.getOrElseUpdateStringCodec(tag.key, ev) } + registerStringCodec[Throwable] + registerStringCodec[TraceEvent] final def debugEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Debug, event) final def infoEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Info, event) final def warnEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Warn, event) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala index 
2ff681825..c3ce6299b 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala @@ -15,10 +15,10 @@ trait JValueFormats { self: sjsonnew.BasicJsonProtocol => def read[J](j: Option[J], u: Unbuilder[J]) = JNull } - implicit val JBooleanFormat: JF[JBoolean] = project(_.get, (x: Boolean) => JBoolean(x)) - implicit val JStringFormat: JF[JString] = project(_.value, (x: String) => JString(x)) - implicit val JNumberFormat: JF[JNumber] = project(x => BigDecimal(x.value), (x: BigDecimal) => JNumber(x.toString)) - implicit val JArrayFormat: JF[JArray] = project[JArray, Array[JValue]](_.value, JArray(_)) + implicit val JBooleanFormat: JF[JBoolean] = projectFormat(_.get, (x: Boolean) => JBoolean(x)) + implicit val JStringFormat: JF[JString] = projectFormat(_.value, (x: String) => JString(x)) + implicit val JNumberFormat: JF[JNumber] = projectFormat(x => BigDecimal(x.value), (x: BigDecimal) => JNumber(x.toString)) + implicit val JArrayFormat: JF[JArray] = projectFormat[JArray, Array[JValue]](_.value, JArray(_)) implicit lazy val JObjectJsonWriter: JW[JObject] = new JW[JObject] { def write[J](x: JObject, b: Builder[J]) = { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala new file mode 100644 index 000000000..13abbdf8a --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala @@ -0,0 +1,25 @@ +package sbt +package internal.util.codec + +import sbt.util.ShowLines +import sbt.internal.util.{ StackTrace, TraceEvent } + +trait ThrowableShowLines { + implicit val sbtThrowableShowLines: ShowLines[Throwable] = + ShowLines[Throwable]( (t: Throwable) => { + // 0 means enabled with default behavior. See StackTrace.scala. 
+ val traceLevel = 0 + List(StackTrace.trimmed(t, traceLevel)) + }) +} + +object ThrowableShowLines extends ThrowableShowLines + +trait TraceEventShowLines { + implicit val sbtTraceEventShowLines: ShowLines[TraceEvent] = + ShowLines[TraceEvent]( (t: TraceEvent) => { + ThrowableShowLines.sbtThrowableShowLines.showLines(t.message) + }) +} + +object TraceEventShowLines extends TraceEventShowLines diff --git a/project/Dependencies.scala b/project/Dependencies.scala index aba5a8317..8f380f99c 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -43,7 +43,7 @@ object Dependencies { val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnewVersion = "0.7.0" + val sjsonnewVersion = "0.8.0-M1" val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % sjsonnewVersion diff --git a/project/plugins.sbt b/project/plugins.sbt index c22194daf..1af048fe4 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,4 @@ addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.4.0") addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.0-M1") addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M5") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M6") diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 6c42422d0..9c675e652 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -39,7 +39,7 @@ object FilesInfo { def empty[F <: FileInfo]: FilesInfo[F] = FilesInfo(Set.empty[F]) implicit def format[F <: FileInfo: JsonFormat]: JsonFormat[FilesInfo[F]] = - project(_.files, (fs: Set[F]) => FilesInfo(fs)) + projectFormat(_.files, (fs: Set[F]) => FilesInfo(fs)) def full: FileInfo.Style = 
FileInfo.full def hash: FileInfo.Style = FileInfo.hash @@ -52,7 +52,7 @@ object FileInfo { type F <: FileInfo implicit def format: JsonFormat[F] - implicit def formats: JsonFormat[FilesInfo[F]] = project(_.files, (fs: Set[F]) => FilesInfo(fs)) + implicit def formats: JsonFormat[FilesInfo[F]] = projectFormat(_.files, (fs: Set[F]) => FilesInfo(fs)) def apply(file: File): F def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files map apply) From 4ff793645db578d82f345873755e923ca71509b1 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 1 Jul 2017 03:10:38 -0400 Subject: [PATCH 674/823] io 1.0.0-M12 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 8f380f99c..2c52bb55b 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -6,7 +6,7 @@ object Dependencies { val scala211 = "2.11.11" val scala212 = "2.12.2" - private val ioVersion = "1.0.0-M10" + private val ioVersion = "1.0.0-M12" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From 4e01a359177f6fb844e9b392909d3f69ea597905 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 1 Jul 2017 07:47:19 -0400 Subject: [PATCH 675/823] Contraband update --- build.sbt | 6 +++--- .../src/main/scala/sbt/internal/util/ObjectEvent.scala | 2 +- .../main/scala/sbt/internal/util/codec/JValueFormats.scala | 2 +- project/Dependencies.scala | 6 +++--- project/plugins.sbt | 2 +- util-cache/src/main/scala/sbt/util/CacheStore.scala | 2 +- 6 files changed, 10 insertions(+), 10 deletions(-) diff --git a/build.sbt b/build.sbt index 1b77cc349..5f53fdb2c 100644 --- a/build.sbt +++ b/build.sbt @@ -83,7 +83,7 @@ lazy val utilCollection = (project in internalPath / "util-collection"). 
crossScalaVersions := Seq(scala210, scala211, scala212), Util.keywordsSettings, name := "Util Collection", - libraryDependencies ++= Seq(sjsonnew) + libraryDependencies ++= Seq(sjsonnew.value) ) lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). @@ -112,7 +112,7 @@ lazy val utilLogging = (project in internalPath / "util-logging"). commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", - libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson, scalaReflect.value), + libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala", contrabandFormatsForType in generateContrabands in Compile := { tpe => val old = (contrabandFormatsForType in generateContrabands in Compile).value @@ -144,7 +144,7 @@ lazy val utilCache = (project in file("util-cache")). settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnewScalaJson, scalaReflect.value) + libraryDependencies ++= Seq(sjsonnewScalaJson.value, scalaReflect.value) ). 
configure(addSbtIO) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala index c75e09a1e..674e74673 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala @@ -5,7 +5,7 @@ package util import sbt.util.Level import sjsonnew.JsonFormat import sjsonnew.support.scalajson.unsafe.Converter -import scala.json.ast.unsafe.JValue +import scalajson.ast.unsafe.JValue final class ObjectEvent[A]( val level: Level.Value, diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala index c3ce6299b..e800bcff0 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala @@ -7,7 +7,7 @@ package internal package util.codec import sjsonnew.{ JsonWriter => JW, JsonReader => JR, JsonFormat => JF, _ } -import scala.json.ast.unsafe._ +import scalajson.ast.unsafe._ trait JValueFormats { self: sjsonnew.BasicJsonProtocol => implicit val JNullFormat: JF[JNull.type] = new JF[JNull.type] { diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 2c52bb55b..79c777367 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -1,5 +1,6 @@ import sbt._ import Keys._ +import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala210 = "2.10.6" @@ -43,9 +44,8 @@ object Dependencies { val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnewVersion = "0.8.0-M1" - val sjsonnew = "com.eed3si9n" %% "sjson-new-core" % sjsonnewVersion - val sjsonnewScalaJson = "com.eed3si9n" %% "sjson-new-scalajson" % 
sjsonnewVersion + val sjsonnew = Def.setting { "com.eed3si9n" %% "sjson-new-core" % contrabandSjsonNewVersion.value } + val sjsonnewScalaJson = Def.setting { "com.eed3si9n" %% "sjson-new-scalajson" % contrabandSjsonNewVersion.value } def log4jVersion = "2.8.1" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion diff --git a/project/plugins.sbt b/project/plugins.sbt index 1af048fe4..396603956 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,4 @@ addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.4.0") addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.0-M1") addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M6") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M7") diff --git a/util-cache/src/main/scala/sbt/util/CacheStore.scala b/util-cache/src/main/scala/sbt/util/CacheStore.scala index 073a52281..9ccac0d76 100644 --- a/util-cache/src/main/scala/sbt/util/CacheStore.scala +++ b/util-cache/src/main/scala/sbt/util/CacheStore.scala @@ -5,7 +5,7 @@ import sbt.io.syntax.fileToRichFile import sbt.io.{ IO, Using } import sjsonnew.{ IsoString, JsonReader, JsonWriter, SupportConverter } import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } -import scala.json.ast.unsafe.JValue +import scalajson.ast.unsafe.JValue /** A `CacheStore` is used by the caching infrastructure to persist cached information. 
*/ abstract class CacheStore extends Input with Output { From 2d777a85ee3e0faeb0d1eb100391922a226cd564 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 1 Jul 2017 07:47:39 -0400 Subject: [PATCH 676/823] clean up warnings --- .../src/main/scala/sbt/internal/util/BufferedLogger.scala | 4 ++-- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 2 +- .../src/main/scala/sbt/internal/util/ManagedLogger.scala | 2 +- .../src/main/scala/sbt/internal/scripted/ScriptedTests.scala | 2 +- 4 files changed, 5 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index a8399e2b6..be24152c1 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -57,7 +57,7 @@ class BufferedAppender private[BufferedAppender] (name: String, delegate: Append result } catch { case e: Throwable => stopQuietly(); throw e } } - def stopQuietly() = synchronized { try { stopBuffer() } catch { case e: Exception => () } } + def stopQuietly() = synchronized { try { stopBuffer() } catch { case _: Exception => () } } /** * Flushes the buffer to the delegate logger. This method calls logAll on the delegate @@ -104,7 +104,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { result } catch { case e: Throwable => stopQuietly(); throw e } } - def stopQuietly() = synchronized { try { stop() } catch { case e: Exception => () } } + def stopQuietly() = synchronized { try { stop() } catch { case _: Exception => () } } /** * Flushes the buffer to the delegate logger. 
This method calls logAll on the delegate diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index d730df112..e85f70fa8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -162,7 +162,7 @@ object ConsoleAppender { terminal.restore // #460 terminal.isAnsiSupported } catch { - case e: Exception => !isWindows + case _: Exception => !isWindows // sbt 0.13 drops JLine 1.0 from the launcher and uses 2.x as a normal dependency // when 0.13 is used with a 0.12 launcher or earlier, the JLine classes from the launcher get loaded diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index c66ceb4d2..794107111 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -1,7 +1,7 @@ package sbt.internal.util import sbt.util._ -import org.apache.logging.log4j.{ Logger => XLogger, Level => XLevel } +import org.apache.logging.log4j.{ Logger => XLogger } import org.apache.logging.log4j.message.ObjectMessage import sjsonnew.JsonFormat import scala.reflect.runtime.universe.TypeTag diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index 81a04721a..fb1ba9eca 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -4,7 +4,7 @@ package scripted import java.io.File import sbt.util.{ Logger, LogExchange, Level } -import sbt.internal.util.{ ManagedLogger, ConsoleOut, MainAppender, 
ConsoleAppender, BufferedAppender } +import sbt.internal.util.{ ManagedLogger, ConsoleAppender, BufferedAppender } import sbt.io.IO.wrapNull import sbt.io.{ DirectoryFilter, HiddenFileFilter } import sbt.io.syntax._ From b912a5812535fd321a918a9d3a609eb576b65b07 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 1 Jul 2017 08:00:19 -0400 Subject: [PATCH 677/823] Fix tests --- util-cache/src/test/scala/CacheSpec.scala | 2 +- util-cache/src/test/scala/FileInfoSpec.scala | 2 +- util-cache/src/test/scala/HListFormatSpec.scala | 2 +- util-cache/src/test/scala/SingletonCacheSpec.scala | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/util-cache/src/test/scala/CacheSpec.scala b/util-cache/src/test/scala/CacheSpec.scala index 109fd1247..bce7b9af1 100644 --- a/util-cache/src/test/scala/CacheSpec.scala +++ b/util-cache/src/test/scala/CacheSpec.scala @@ -8,7 +8,7 @@ import CacheImplicits._ import sjsonnew.IsoString import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } -import scala.json.ast.unsafe.JValue +import scalajson.ast.unsafe.JValue import sbt.internal.util.UnitSpec class CacheSpec extends UnitSpec { diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index cae8e15b1..813e85371 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -1,6 +1,6 @@ package sbt.util -import scala.json.ast.unsafe._ +import scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import sbt.internal.util.UnitSpec diff --git a/util-cache/src/test/scala/HListFormatSpec.scala b/util-cache/src/test/scala/HListFormatSpec.scala index e2d5d38fa..a0922d02f 100644 --- a/util-cache/src/test/scala/HListFormatSpec.scala +++ b/util-cache/src/test/scala/HListFormatSpec.scala @@ -1,6 +1,6 @@ package sbt.util -import scala.json.ast.unsafe._ +import scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import CacheImplicits._ 
import sbt.internal.util.{ UnitSpec, HNil } diff --git a/util-cache/src/test/scala/SingletonCacheSpec.scala b/util-cache/src/test/scala/SingletonCacheSpec.scala index 5956746de..15265f312 100644 --- a/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -8,7 +8,7 @@ import CacheImplicits._ import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } -import scala.json.ast.unsafe.JValue +import scalajson.ast.unsafe.JValue import sbt.internal.util.UnitSpec class SingletonCacheSpec extends UnitSpec { From f8d67d68374922a8660357e095651e05db501b6c Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 21 Jun 2017 14:41:23 +0100 Subject: [PATCH 678/823] Move HListFormats to collection to drop cache->collection dep Looks like the reason that util-cache depended on util-collection was to define the sjson-new formats (HListFormats) for util-collection's HList. Given that util-collection already depends on sjsonnew, HListFormats can also be defined in util-collection. All that was left then was (a) HListFormatSpec requires sjsonnewScalaJson, so that was added in test scope, and (b) HListFormats had to be dropped from sbt.util.CacheImplicits - HListFormats will have to be imported and/or mixed-in where required downstream. For importing convenience I defined a companion object. 
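Editor's note on the commit message above: the "companion object for importing convenience" refers to the trait-plus-companion pattern (`trait HListFormats { ... }; object HListFormats extends HListFormats`), which lets downstream code either mix the formats in or simply `import HListFormats._`. The sketch below illustrates that pattern with a minimal, self-contained stand-in: the `HList`/`HNil`/`:+:` shapes mirror `sbt.internal.util`, but the `Show` typeclass is a hypothetical substitute for the real sjson-new `JsonFormat` instances, used only so the example runs without sbt's libraries.

```scala
// Minimal stand-in for sbt.internal.util.HList (illustrative only).
sealed trait HList
final case class HCons[H, T <: HList](head: H, tail: T) extends HList {
  def :+:[G](g: G): HCons[G, HCons[H, T]] = HCons(g, this)
}
sealed trait HNil extends HList {
  def :+:[G](g: G): HCons[G, HNil] = HCons(g, HNil)
}
case object HNil extends HNil

// Hypothetical typeclass standing in for sjson-new's JsonFormat.
trait Show[A] { def show(a: A): String }

// The pattern from the commit: instances live in a trait so they can be
// mixed in, and the companion object makes them importable as well.
trait HListShow {
  implicit val hnilShow: Show[HNil] = _ => "HNil"
  implicit def hconsShow[H, T <: HList](implicit st: Show[T]): Show[HCons[H, T]] =
    l => s"${l.head} :+: ${st.show(l.tail)}"
}
object HListShow extends HListShow

// Downstream usage via import rather than mix-in:
import HListShow._
def show[A](a: A)(implicit s: Show[A]): String = s.show(a)

// Same shape as the value used in HListFormatSpec.
val quux = 23 :+: "quux" :+: true :+: HNil
val rendered = show(quux)
println(rendered) // 23 :+: quux :+: true :+: HNil
```

The same two consumption styles apply to the real `HListFormats`: after this commit, `sbt.util.CacheImplicits` no longer mixes it in, so callers import or extend it themselves.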
--- build.sbt | 4 ++-- .../src/main/scala/sbt/internal/util/HListFormats.scala | 2 ++ .../src/test/scala/HListFormatSpec.scala | 9 +++++---- util-cache/src/main/scala/sbt/util/CacheImplicits.scala | 2 -- 4 files changed, 9 insertions(+), 8 deletions(-) rename {util-cache => internal/util-collection}/src/main/scala/sbt/internal/util/HListFormats.scala (98%) rename {util-cache => internal/util-collection}/src/test/scala/HListFormatSpec.scala (86%) diff --git a/build.sbt b/build.sbt index 5f53fdb2c..6ba8ecec3 100644 --- a/build.sbt +++ b/build.sbt @@ -83,7 +83,7 @@ lazy val utilCollection = (project in internalPath / "util-collection"). crossScalaVersions := Seq(scala210, scala211, scala212), Util.keywordsSettings, name := "Util Collection", - libraryDependencies ++= Seq(sjsonnew.value) + libraryDependencies ++= Seq(sjsonnew.value, sjsonnewScalaJson.value % Test) ) lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). @@ -140,7 +140,7 @@ lazy val utilLogic = (project in internalPath / "util-logic"). // Persisted caching based on sjson-new lazy val utilCache = (project in file("util-cache")). - dependsOn(utilCollection, utilTesting % Test). + dependsOn(utilTesting % Test). 
settings( commonSettings, name := "Util Cache", diff --git a/util-cache/src/main/scala/sbt/internal/util/HListFormats.scala b/internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala similarity index 98% rename from util-cache/src/main/scala/sbt/internal/util/HListFormats.scala rename to internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala index bf69b4db8..6abae921c 100644 --- a/util-cache/src/main/scala/sbt/internal/util/HListFormats.scala +++ b/internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala @@ -66,3 +66,5 @@ trait HListFormats { def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = hnil } } + +object HListFormats extends HListFormats diff --git a/util-cache/src/test/scala/HListFormatSpec.scala b/internal/util-collection/src/test/scala/HListFormatSpec.scala similarity index 86% rename from util-cache/src/test/scala/HListFormatSpec.scala rename to internal/util-collection/src/test/scala/HListFormatSpec.scala index a0922d02f..8f6e9a73b 100644 --- a/util-cache/src/test/scala/HListFormatSpec.scala +++ b/internal/util-collection/src/test/scala/HListFormatSpec.scala @@ -1,9 +1,10 @@ -package sbt.util +package sbt +package internal +package util import scalajson.ast.unsafe._ -import sjsonnew._, support.scalajson.unsafe._ -import CacheImplicits._ -import sbt.internal.util.{ UnitSpec, HNil } +import sjsonnew._, BasicJsonProtocol._, support.scalajson.unsafe._ +import HListFormats._ class HListFormatSpec extends UnitSpec { val quux = 23 :+: "quux" :+: true :+: HNil diff --git a/util-cache/src/main/scala/sbt/util/CacheImplicits.scala b/util-cache/src/main/scala/sbt/util/CacheImplicits.scala index 2eb1639cd..74cd51f68 100644 --- a/util-cache/src/main/scala/sbt/util/CacheImplicits.scala +++ b/util-cache/src/main/scala/sbt/util/CacheImplicits.scala @@ -1,9 +1,7 @@ package sbt.util import sjsonnew.BasicJsonProtocol -import sbt.internal.util.HListFormats object CacheImplicits extends CacheImplicits trait 
CacheImplicits extends BasicCacheImplicits with BasicJsonProtocol - with HListFormats From 84180ec4027773fc6b45895ee9a3a11f1a599517 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 23 Jun 2017 14:06:53 +0100 Subject: [PATCH 679/823] Move Showlines to logging to drop logging->collection dep --- build.sbt | 2 +- .../src/main/scala/sbt/util/ShowLines.scala | 0 2 files changed, 1 insertion(+), 1 deletion(-) rename internal/{util-collection => util-logging}/src/main/scala/sbt/util/ShowLines.scala (100%) diff --git a/build.sbt b/build.sbt index 6ba8ecec3..173fa8066 100644 --- a/build.sbt +++ b/build.sbt @@ -107,7 +107,7 @@ lazy val utilComplete = (project in internalPath / "util-complete"). // logging lazy val utilLogging = (project in internalPath / "util-logging"). enablePlugins(ContrabandPlugin, JsonCodecPlugin). - dependsOn(utilInterface, utilCollection, utilTesting % Test). + dependsOn(utilInterface, utilTesting % Test). settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), diff --git a/internal/util-collection/src/main/scala/sbt/util/ShowLines.scala b/internal/util-logging/src/main/scala/sbt/util/ShowLines.scala similarity index 100% rename from internal/util-collection/src/main/scala/sbt/util/ShowLines.scala rename to internal/util-logging/src/main/scala/sbt/util/ShowLines.scala From 48d82f95c2d167a989d9f15d2da5cfdc652e9244 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 23 Jun 2017 14:12:02 +0100 Subject: [PATCH 680/823] Breakout Position to drop lm->collection dep --- build.sbt | 9 +++++++-- .../src/main/scala/sbt/internal/util/Positions.scala | 0 2 files changed, 7 insertions(+), 2 deletions(-) rename internal/{util-collection => util-position}/src/main/scala/sbt/internal/util/Positions.scala (100%) diff --git a/build.sbt b/build.sbt index 173fa8066..8fea6680a 100644 --- a/build.sbt +++ b/build.sbt @@ -32,7 +32,7 @@ def commonSettings: Seq[Setting[_]] = Seq( lazy val utilRoot: Project = (project in file(".")). 
aggregate( - utilInterface, utilControl, utilCollection, utilApplyMacro, utilComplete, + utilInterface, utilControl, utilPosition, utilCollection, utilApplyMacro, utilComplete, utilLogging, utilRelation, utilLogic, utilCache, utilTracking, utilTesting, utilScripted ). @@ -76,8 +76,13 @@ lazy val utilControl = (project in internalPath / "util-control"). name := "Util Control" ) +val utilPosition = (project in file("internal") / "util-position").settings( + commonSettings, + name := "Util Position" +) + lazy val utilCollection = (project in internalPath / "util-collection"). - dependsOn(utilTesting % Test). + dependsOn(utilPosition, utilTesting % Test). settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala similarity index 100% rename from internal/util-collection/src/main/scala/sbt/internal/util/Positions.scala rename to internal/util-position/src/main/scala/sbt/internal/util/Positions.scala From 8d0463e6e8a224d145c835e1748832fefbe12448 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 6 Jul 2017 13:25:05 +0100 Subject: [PATCH 681/823] No sbt-doge in sbt 1, switch back to + --- build.sbt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 8fea6680a..488e6af39 100644 --- a/build.sbt +++ b/build.sbt @@ -190,8 +190,8 @@ lazy val utilScripted = (project in internalPath / "util-scripted"). 
def customCommands: Seq[Setting[_]] = Seq( commands += Command.command("release") { state => // "clean" :: - "so compile" :: - "so publishSigned" :: + "+compile" :: + "+publishSigned" :: "reload" :: state } From c76d2624f931a2b75453f39dcaaf750bf4282a44 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 6 Jul 2017 13:30:15 +0100 Subject: [PATCH 682/823] No need for util-collection on Scala 2.10 anymore We still publish parts of sbt for Scala 2.10 for compiler-bridge reasons in Zinc. However util-collection is now becoming an sbt-only module, so it won't need to continue to cross-build to Scala 2.10! Its new dependency util-position which is used by lm also isn't a dependency of the compiler-bridge, so also doesn't need to be cross-built to Scala 2.10. --- build.sbt | 1 - 1 file changed, 1 deletion(-) diff --git a/build.sbt b/build.sbt index 488e6af39..86b90c4a0 100644 --- a/build.sbt +++ b/build.sbt @@ -85,7 +85,6 @@ lazy val utilCollection = (project in internalPath / "util-collection"). dependsOn(utilPosition, utilTesting % Test). 
settings( commonSettings, - crossScalaVersions := Seq(scala210, scala211, scala212), Util.keywordsSettings, name := "Util Collection", libraryDependencies ++= Seq(sjsonnew.value, sjsonnewScalaJson.value % Test) From 927692223cacd11c1cd870d7841664e4b3b31fab Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 7 Jul 2017 12:06:08 +0100 Subject: [PATCH 683/823] Add caching to .travis.yml --- .travis.yml | 11 +++++++++-- 1 file changed, 9 insertions(+), 2 deletions(-) diff --git a/.travis.yml b/.travis.yml index d428e1dbc..b0bab8379 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,4 +1,5 @@ language: scala +jdk: oraclejdk8 scala: - 2.11.11 @@ -7,5 +8,11 @@ scala: script: - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" test -jdk: - - oraclejdk8 +cache: + directories: + - $HOME/.ivy2/cache + - $HOME/.sbt + +before_cache: + - find $HOME/.ivy2/cache -name "ivydata-*.properties" -print -delete + - find $HOME/.sbt -name "*.lock" -print -delete From 9d7f7bf0ec2468dd82787d236fcc058146dc2072 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 7 Jul 2017 14:41:32 +0100 Subject: [PATCH 684/823] Remove modules that have moved back to sbt/sbt --- build.sbt | 39 +- .../internal/util/appmacro/ContextUtil.scala | 261 ------ .../sbt/internal/util/appmacro/Convert.scala | 37 - .../sbt/internal/util/appmacro/Instance.scala | 211 ----- .../internal/util/appmacro/KListBuilder.scala | 64 -- .../internal/util/appmacro/MixedBuilder.scala | 17 - .../internal/util/appmacro/TupleBuilder.scala | 55 -- .../util/appmacro/TupleNBuilder.scala | 49 -- internal/util-collection/NOTICE | 3 - .../main/scala/sbt/internal/util/AList.scala | 212 ----- .../scala/sbt/internal/util/Attributes.scala | 210 ----- .../scala/sbt/internal/util/Classes.scala | 24 - .../main/scala/sbt/internal/util/Dag.scala | 127 --- .../main/scala/sbt/internal/util/HList.scala | 32 - .../sbt/internal/util/HListFormats.scala | 70 -- .../main/scala/sbt/internal/util/IDSet.scala | 45 - 
.../main/scala/sbt/internal/util/INode.scala | 178 ---- .../main/scala/sbt/internal/util/KList.scala | 53 -- .../main/scala/sbt/internal/util/PMap.scala | 108 --- .../main/scala/sbt/internal/util/Param.scala | 28 - .../scala/sbt/internal/util/Settings.scala | 615 ------------- .../main/scala/sbt/internal/util/Signal.scala | 86 -- .../sbt/internal/util/TypeFunctions.scala | 50 -- .../main/scala/sbt/internal/util/Types.scala | 12 - .../main/scala/sbt/internal/util/Util.scala | 41 - .../main/scala/sbt/util/OptJsonWriter.scala | 22 - .../src/main/scala/sbt/util/Show.scala | 12 - .../src/test/scala/DagSpecification.scala | 53 -- .../src/test/scala/HListFormatSpec.scala | 28 - .../src/test/scala/KeyTest.scala | 32 - .../src/test/scala/LiteralTest.scala | 15 - .../src/test/scala/PMapTest.scala | 18 - .../src/test/scala/SettingsExample.scala | 89 -- .../src/test/scala/SettingsTest.scala | 198 ----- internal/util-complete/NOTICE | 3 - .../scala/sbt/internal/util/LineReader.scala | 189 ---- .../internal/util/complete/Completions.scala | 137 --- .../internal/util/complete/EditDistance.scala | 41 - .../util/complete/ExampleSource.scala | 60 -- .../sbt/internal/util/complete/History.scala | 44 - .../util/complete/HistoryCommands.scala | 72 -- .../util/complete/JLineCompletion.scala | 157 ---- .../sbt/internal/util/complete/Parser.scala | 823 ------------------ .../sbt/internal/util/complete/Parsers.scala | 269 ------ .../internal/util/complete/ProcessError.scala | 30 - .../util/complete/TokenCompletions.scala | 38 - .../internal/util/complete/TypeString.scala | 80 -- .../internal/util/complete/UpperBound.scala | 48 - .../src/test/scala/ParserTest.scala | 149 ---- .../scala/sbt/complete/FileExamplesTest.scala | 96 -- .../sbt/complete/FixedSetExamplesTest.scala | 24 - .../sbt/complete/ParserWithExamplesTest.scala | 99 --- .../scala/sbt/internal/util/logic/Logic.scala | 336 ------- .../src/test/scala/sbt/logic/Test.scala | 118 --- project/Util.scala | 33 +- 55 files changed, 4 
insertions(+), 5936 deletions(-) delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala delete mode 100644 internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala delete mode 100644 internal/util-collection/NOTICE delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/AList.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/HList.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/INode.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/KList.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Param.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala delete mode 100644 
internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Types.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/internal/util/Util.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala delete mode 100644 internal/util-collection/src/main/scala/sbt/util/Show.scala delete mode 100644 internal/util-collection/src/test/scala/DagSpecification.scala delete mode 100644 internal/util-collection/src/test/scala/HListFormatSpec.scala delete mode 100644 internal/util-collection/src/test/scala/KeyTest.scala delete mode 100644 internal/util-collection/src/test/scala/LiteralTest.scala delete mode 100644 internal/util-collection/src/test/scala/PMapTest.scala delete mode 100644 internal/util-collection/src/test/scala/SettingsExample.scala delete mode 100644 internal/util-collection/src/test/scala/SettingsTest.scala delete mode 100644 internal/util-complete/NOTICE delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala delete mode 100644 
internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala delete mode 100644 internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala delete mode 100644 internal/util-complete/src/test/scala/ParserTest.scala delete mode 100644 internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala delete mode 100644 internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala delete mode 100644 internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala delete mode 100644 internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala delete mode 100644 internal/util-logic/src/test/scala/sbt/logic/Test.scala diff --git a/build.sbt b/build.sbt index 86b90c4a0..a0eb78fa3 100644 --- a/build.sbt +++ b/build.sbt @@ -32,8 +32,8 @@ def commonSettings: Seq[Setting[_]] = Seq( lazy val utilRoot: Project = (project in file(".")). aggregate( - utilInterface, utilControl, utilPosition, utilCollection, utilApplyMacro, utilComplete, - utilLogging, utilRelation, utilLogic, utilCache, utilTracking, utilTesting, + utilInterface, utilControl, utilPosition, + utilLogging, utilRelation, utilCache, utilTracking, utilTesting, utilScripted ). settings( @@ -81,33 +81,6 @@ val utilPosition = (project in file("internal") / "util-position").settings( name := "Util Position" ) -lazy val utilCollection = (project in internalPath / "util-collection"). - dependsOn(utilPosition, utilTesting % Test). - settings( - commonSettings, - Util.keywordsSettings, - name := "Util Collection", - libraryDependencies ++= Seq(sjsonnew.value, sjsonnewScalaJson.value % Test) - ) - -lazy val utilApplyMacro = (project in internalPath / "util-appmacro"). - dependsOn(utilCollection). 
- settings( - commonSettings, - name := "Util Apply Macro", - libraryDependencies += scalaCompiler.value - ) - -// Command line-related utilities. -lazy val utilComplete = (project in internalPath / "util-complete"). - dependsOn(utilCollection, utilControl, utilTesting % Test). - settings( - commonSettings, - name := "Util Completion", - libraryDependencies += jline - ). - configure(addSbtIO) - // logging lazy val utilLogging = (project in internalPath / "util-logging"). enablePlugins(ContrabandPlugin, JsonCodecPlugin). @@ -134,14 +107,6 @@ lazy val utilRelation = (project in internalPath / "util-relation"). name := "Util Relation" ) -// A logic with restricted negation as failure for a unique, stable model -lazy val utilLogic = (project in internalPath / "util-logic"). - dependsOn(utilCollection, utilRelation, utilTesting % Test). - settings( - commonSettings, - name := "Util Logic" - ) - // Persisted caching based on sjson-new lazy val utilCache = (project in file("util-cache")). dependsOn(utilTesting % Test). diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala deleted file mode 100644 index b9b968a23..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/ContextUtil.scala +++ /dev/null @@ -1,261 +0,0 @@ -package sbt.internal.util -package appmacro - -import scala.reflect._ -import macros._ -import ContextUtil.{ DynamicDependencyError, DynamicReferenceError } - -object ContextUtil { - final val DynamicDependencyError = "Illegal dynamic dependency" - final val DynamicReferenceError = "Illegal dynamic reference" - - /** - * Constructs an object with utility methods for operating in the provided macro context `c`. - * Callers should explicitly specify the type parameter as `c.type` in order to preserve the path dependent types. 
- */ - def apply[C <: blackbox.Context with Singleton](c: C): ContextUtil[C] = new ContextUtil(c) - - /** - * Helper for implementing a no-argument macro that is introduced via an implicit. - * This method removes the implicit conversion and evaluates the function `f` on the target of the conversion. - * - * Given `myImplicitConversion(someValue).extensionMethod`, where `extensionMethod` is a macro that uses this - * method, the result of this method is `f(<Expr for someValue>, <position of the application>)`. - */ - def selectMacroImpl[T: c.WeakTypeTag](c: blackbox.Context)(f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = - { - import c.universe._ - c.macroApplication match { - case s @ Select(Apply(_, t :: Nil), tp) => f(c.Expr[Any](t), s.pos) - case x => unexpectedTree(x) - } - } - - def unexpectedTree[C <: blackbox.Context](tree: C#Tree): Nothing = sys.error("Unexpected macro application tree (" + tree.getClass + "): " + tree) -} - -/** - * Utility methods for macros. Several methods assume that the context's universe is a full compiler - * (`scala.tools.nsc.Global`). - * This is not thread safe due to the underlying Context and related data structures not being thread safe. - * Use `ContextUtil[c.type](c)` to construct.
- */ -final class ContextUtil[C <: blackbox.Context](val ctx: C) { - import ctx.universe.{ Apply => ApplyTree, _ } - import internal.decorators._ - - val powerContext = ctx.asInstanceOf[reflect.macros.runtime.Context] - val global: powerContext.universe.type = powerContext.universe - def callsiteTyper: global.analyzer.Typer = powerContext.callsiteTyper - val initialOwner: Symbol = callsiteTyper.context.owner.asInstanceOf[ctx.universe.Symbol] - - lazy val alistType = ctx.typeOf[AList[KList]] - lazy val alist: Symbol = alistType.typeSymbol.companion - lazy val alistTC: Type = alistType.typeConstructor - - /** Modifiers for a local val.*/ - lazy val localModifiers = Modifiers(NoFlags) - - def getPos(sym: Symbol) = if (sym eq null) NoPosition else sym.pos - - /** - * Constructs a unique term name with the given prefix within this Context. - * (The current implementation uses Context.freshName, which increments an internal counter to ensure uniqueness.) - */ - def freshTermName(prefix: String) = TermName(ctx.freshName("$" + prefix)) - - /** - * Constructs a new, synthetic, local ValDef with Type `tpe`, a unique name, - * Position `pos`, an empty implementation (no rhs), and owned by `owner`. - */ - def freshValDef(tpe: Type, pos: Position, owner: Symbol): ValDef = - { - val SYNTHETIC = (1 << 21).toLong.asInstanceOf[FlagSet] - val sym = owner.newTermSymbol(freshTermName("q"), pos, SYNTHETIC) - setInfo(sym, tpe) - val vd = internal.valDef(sym, EmptyTree) - vd.setPos(pos) - vd - } - - lazy val parameterModifiers = Modifiers(Flag.PARAM) - - /** - * Collects all definitions in the tree for use in checkReferences. - * This excludes definitions in wrapped expressions because checkReferences won't allow nested dereferencing anyway. - */ - def collectDefs(tree: Tree, isWrapper: (String, Type, Tree) => Boolean): collection.Set[Symbol] = - { - val defs = new collection.mutable.HashSet[Symbol] - // adds the symbols for all non-Ident subtrees to `defs`.
- val process = new Traverser { - override def traverse(t: Tree) = t match { - case _: Ident => () - case ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) if isWrapper(nme.decodedName.toString, tpe.tpe, qual) => () - case tree => - if (tree.symbol ne null) defs += tree.symbol; - super.traverse(tree) - } - } - process.traverse(tree) - defs - } - - /** - * A reference is illegal if it is to an M instance defined within the scope of the macro call. - * As an approximation, disallow references to any local definitions `defs`. - */ - def illegalReference(defs: collection.Set[Symbol], sym: Symbol): Boolean = - sym != null && sym != NoSymbol && defs.contains(sym) - - /** - * A function that checks the provided tree for illegal references to M instances defined in the - * expression passed to the macro and for illegal dereferencing of M instances. - */ - def checkReferences(defs: collection.Set[Symbol], isWrapper: (String, Type, Tree) => Boolean): Tree => Unit = { - case s @ ApplyTree(TypeApply(Select(_, nme), tpe :: Nil), qual :: Nil) => - if (isWrapper(nme.decodedName.toString, tpe.tpe, qual)) ctx.error(s.pos, DynamicDependencyError) - case id @ Ident(name) if illegalReference(defs, id.symbol) => ctx.error(id.pos, DynamicReferenceError + ": " + name) - case _ => () - } - - /** Constructs a ValDef with a parameter modifier, a unique name, with the provided Type and with an empty rhs. */ - def freshMethodParameter(tpe: Type): ValDef = - ValDef(parameterModifiers, freshTermName("p"), TypeTree(tpe), EmptyTree) - - /** Constructs a ValDef with local modifiers and a unique name.
*/ - def localValDef(tpt: Tree, rhs: Tree): ValDef = - ValDef(localModifiers, freshTermName("q"), tpt, rhs) - - /** Constructs a tuple value of the right TupleN type from the provided inputs.*/ - def mkTuple(args: List[Tree]): Tree = - global.gen.mkTuple(args.asInstanceOf[List[global.Tree]]).asInstanceOf[ctx.universe.Tree] - - def setSymbol[_Tree](t: _Tree, sym: Symbol): Unit = { - t.asInstanceOf[global.Tree].setSymbol(sym.asInstanceOf[global.Symbol]) - () - } - def setInfo(sym: Symbol, tpe: Type): Unit = { - sym.asInstanceOf[global.Symbol].setInfo(tpe.asInstanceOf[global.Type]) - () - } - - /** Creates a new, synthetic type variable with the specified `owner`. */ - def newTypeVariable(owner: Symbol, prefix: String = "T0"): TypeSymbol = - owner.asInstanceOf[global.Symbol].newSyntheticTypeParam(prefix, 0L).asInstanceOf[ctx.universe.TypeSymbol] - - /** The type representing the type constructor `[X] X` */ - lazy val idTC: Type = - { - val tvar = newTypeVariable(NoSymbol) - internal.polyType(tvar :: Nil, refVar(tvar)) - } - /** A Type that references the given type variable. */ - def refVar(variable: TypeSymbol): Type = variable.toTypeConstructor - /** Constructs a new, synthetic type variable that is a type constructor. For example, in type Y[L[x]], L is such a type variable. */ - def newTCVariable(owner: Symbol): TypeSymbol = - { - val tc = newTypeVariable(owner) - val arg = newTypeVariable(tc, "x"); - tc.setInfo(internal.polyType(arg :: Nil, emptyTypeBounds)) - tc - } - /** >: Nothing <: Any */ - def emptyTypeBounds: TypeBounds = internal.typeBounds(definitions.NothingClass.toType, definitions.AnyClass.toType) - - /** Creates a new anonymous function symbol with Position `pos`. 
*/ - def functionSymbol(pos: Position): Symbol = - callsiteTyper.context.owner.newAnonymousFunctionValue(pos.asInstanceOf[global.Position]).asInstanceOf[ctx.universe.Symbol] - - def functionType(args: List[Type], result: Type): Type = - { - val tpe = global.definitions.functionType(args.asInstanceOf[List[global.Type]], result.asInstanceOf[global.Type]) - tpe.asInstanceOf[Type] - } - - /** Create a Tree that references the `val` represented by `vd`, copying attributes from `replaced`. */ - def refVal(replaced: Tree, vd: ValDef): Tree = - treeCopy.Ident(replaced, vd.name).setSymbol(vd.symbol) - - /** Creates a Function tree using `functionSym` as the Symbol and changing `initialOwner` to `functionSym` in `body`.*/ - def createFunction(params: List[ValDef], body: Tree, functionSym: Symbol): Tree = - { - changeOwner(body, initialOwner, functionSym) - val f = Function(params, body) - setSymbol(f, functionSym) - f - } - - def changeOwner(tree: Tree, prev: Symbol, next: Symbol): Unit = - new ChangeOwnerAndModuleClassTraverser(prev.asInstanceOf[global.Symbol], next.asInstanceOf[global.Symbol]).traverse(tree.asInstanceOf[global.Tree]) - - // Workaround copied from scala/async:can be removed once https://github.com/scala/scala/pull/3179 is merged. - private[this] class ChangeOwnerAndModuleClassTraverser(oldowner: global.Symbol, newowner: global.Symbol) extends global.ChangeOwnerTraverser(oldowner, newowner) { - override def traverse(tree: global.Tree): Unit = { - tree match { - case _: global.DefTree => change(tree.symbol.moduleClass) - case _ => - } - super.traverse(tree) - } - } - - /** Returns the Symbol that references the statically accessible singleton `i`. 
*/ - def singleton[T <: AnyRef with Singleton](i: T)(implicit it: ctx.TypeTag[i.type]): Symbol = - it.tpe match { - case SingleType(_, sym) if !sym.isFreeTerm && sym.isStatic => sym - case x => sys.error("Instance must be static (was " + x + ").") - } - - def select(t: Tree, name: String): Tree = Select(t, TermName(name)) - - /** Returns the symbol for the non-private method named `name` for the class/module `obj`. */ - def method(obj: Symbol, name: String): Symbol = { - val ts: Type = obj.typeSignature - val m: global.Symbol = ts.asInstanceOf[global.Type].nonPrivateMember(global.newTermName(name)) - m.asInstanceOf[Symbol] - } - - /** - * Returns a Type representing the type constructor `tcp.<name>`. For example, given - * `object Demo { type M[x] = List[x] }`, the call `extractTC(Demo, "M")` will return a type representing - * the type constructor `[x] List[x]`. - */ - def extractTC(tcp: AnyRef with Singleton, name: String)(implicit it: ctx.TypeTag[tcp.type]): ctx.Type = - { - val itTpe = it.tpe.asInstanceOf[global.Type] - val m = itTpe.nonPrivateMember(global.newTypeName(name)) - val tc = itTpe.memberInfo(m).asInstanceOf[ctx.universe.Type] - assert(tc != NoType && tc.takesTypeArgs, "Invalid type constructor: " + tc) - tc - } - - /** - * Substitutes wrappers in tree `t` with the result of `subWrapper`. - * A wrapper is a Tree of the form `f[T](v)` for which isWrapper(<name of f>, <type of T>, <tree of v>) returns true. - * Typically, `f` is a `Select` or `Ident`.
- * The wrapper is replaced with the result of `subWrapper(<name of f>, <type of T>, <tree of v>, <original tree>)` - */ - def transformWrappers(t: Tree, subWrapper: (String, Type, Tree, Tree) => Converted[ctx.type]): Tree = - { - // the main tree transformer that replaces calls to InputWrapper.wrap(x) with - // plain Idents that reference the actual input value - object appTransformer extends Transformer { - override def transform(tree: Tree): Tree = - tree match { - case ApplyTree(TypeApply(Select(_, nme), targ :: Nil), qual :: Nil) => - subWrapper(nme.decodedName.toString, targ.tpe, qual, tree) match { - case Converted.Success(t, finalTx) => - changeOwner(qual, currentOwner, initialOwner) // Fixes https://github.com/sbt/sbt/issues/1150 - finalTx(t) - case Converted.Failure(p, m) => ctx.abort(p, m) - case _: Converted.NotApplicable[_] => super.transform(tree) - } - case _ => super.transform(tree) - } - } - appTransformer.atOwner(initialOwner) { - appTransformer.transform(t) - } - } -} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala deleted file mode 100644 index 8accb85c6..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Convert.scala +++ /dev/null @@ -1,37 +0,0 @@ -package sbt.internal.util -package appmacro - -import scala.reflect._ -import macros._ -import Types.idFun - -abstract class Convert { - def apply[T: c.WeakTypeTag](c: blackbox.Context)(nme: String, in: c.Tree): Converted[c.type] - def asPredicate(c: blackbox.Context): (String, c.Type, c.Tree) => Boolean = - (n, tpe, tree) => { - val tag = c.WeakTypeTag(tpe) - apply(c)(n, tree)(tag).isSuccess - } -} -sealed trait Converted[C <: blackbox.Context with Singleton] { - def isSuccess: Boolean - def transform(f: C#Tree => C#Tree): Converted[C] -} -object Converted { - def NotApplicable[C <: blackbox.Context with Singleton] = new NotApplicable[C] - final case class Failure[C <: blackbox.Context with
Singleton](position: C#Position, message: String) extends Converted[C] { - def isSuccess = false - def transform(f: C#Tree => C#Tree): Converted[C] = new Failure(position, message) - } - final class NotApplicable[C <: blackbox.Context with Singleton] extends Converted[C] { - def isSuccess = false - def transform(f: C#Tree => C#Tree): Converted[C] = this - } - final case class Success[C <: blackbox.Context with Singleton](tree: C#Tree, finalTransform: C#Tree => C#Tree) extends Converted[C] { - def isSuccess = true - def transform(f: C#Tree => C#Tree): Converted[C] = Success(f(tree), finalTransform) - } - object Success { - def apply[C <: blackbox.Context with Singleton](tree: C#Tree): Success[C] = Success(tree, idFun) - } -} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala deleted file mode 100644 index a10fdfb18..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/Instance.scala +++ /dev/null @@ -1,211 +0,0 @@ -package sbt.internal.util -package appmacro - -import Classes.Applicative -import Types.Id - -/** - * The separate hierarchy from Applicative/Monad is for two reasons. - * - * 1. The type constructor is represented as an abstract type because a TypeTag cannot represent a type constructor directly. - * 2. The applicative interface is uncurried. 
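The `Converted` ADT removed above threads a tree rewrite through conversion results: `Success` applies the rewrite to its payload, while `Failure` and `NotApplicable` pass through unchanged so the caller can report an error or fall back to default traversal. A context-free sketch of that behavior (an assumed simplification with renamed stand-ins: the real type is parameterized on a macro `Context` and carries `C#Tree` values, not plain values):

```scala
// Context-free analogue of the deleted Converted ADT (illustrative only;
// Conv/ConvSuccess/ConvFailure/ConvNotApplicable are our own names).
sealed trait Conv[T] {
  def isSuccess: Boolean
  def transform(f: T => T): Conv[T]
}
// Success rewrites its payload when transformed.
final case class ConvSuccess[T](value: T) extends Conv[T] {
  def isSuccess = true
  def transform(f: T => T): Conv[T] = ConvSuccess(f(value))
}
// Failure ignores the transformation so the error propagates unchanged.
final case class ConvFailure[T](message: String) extends Conv[T] {
  def isSuccess = false
  def transform(f: T => T): Conv[T] = this
}
// NotApplicable also passes through, letting the caller fall back to
// default traversal (as transformWrappers does via super.transform).
final case class ConvNotApplicable[T]() extends Conv[T] {
  def isSuccess = false
  def transform(f: T => T): Conv[T] = this
}
```

This mirrors how `transformWrappers` pattern-matches on the result: only a `Success` triggers substitution; the other two cases abort or continue traversal.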
- */ -trait Instance { - type M[x] - def app[K[L[x]], Z](in: K[M], f: K[Id] => Z)(implicit a: AList[K]): M[Z] - def map[S, T](in: M[S], f: S => T): M[T] - def pure[T](t: () => T): M[T] -} - -trait MonadInstance extends Instance { - def flatten[T](in: M[M[T]]): M[T] -} - -import scala.reflect._ -import macros._ - -object Instance { - final val ApplyName = "app" - final val FlattenName = "flatten" - final val PureName = "pure" - final val MapName = "map" - final val InstanceTCName = "M" - - final class Input[U <: Universe with Singleton](val tpe: U#Type, val expr: U#Tree, val local: U#ValDef) - trait Transform[C <: blackbox.Context with Singleton, N[_]] { - def apply(in: C#Tree): C#Tree - } - def idTransform[C <: blackbox.Context with Singleton]: Transform[C, Id] = new Transform[C, Id] { - def apply(in: C#Tree): C#Tree = in - } - - /** - * Implementation of a macro that provides a direct syntax for applicative functors and monads. - * It is intended to be used in conjunction with another macro that conditions the inputs. - * - * This method processes the Tree `t` to find inputs of the form `wrap[T]( input )` - * This form is typically constructed by another macro that pretends to be able to get a value of type `T` - * from a value convertible to `M[T]`. This `wrap(input)` form has two main purposes. - * First, it identifies the inputs that should be transformed. - * Second, it allows the input trees to be wrapped for later conversion into the appropriate `M[T]` type by `convert`. - * This wrapping is necessary because applying the first macro must preserve the original type, - * but it is useful to delay conversion until the outer, second macro is called. The `wrap` method accomplishes this by - * allowing the original `Tree` and `Type` to be hidden behind the raw `T` type. This method will remove the call to `wrap` - * so that it is not actually called at runtime. - * - * Each `input` in each expression of the form `wrap[T]( input )` is transformed by `convert`. 
- * This transformation converts the input Tree to a Tree of type `M[T]`. - * The original wrapped expression `wrap(input)` is replaced by a reference to a new local `val $x: T`, where `$x` is a fresh name. - * These converted inputs are passed to `builder` as well as the list of these synthetic `ValDef`s. - * The `TupleBuilder` instance constructs a tuple (Tree) from the inputs and defines the right hand side of the vals - * that unpack the tuple containing the results of the inputs. - * - * The constructed tuple of inputs and the code that unpacks the results of the inputs are then passed to `i`, - * which is an implementation of `Instance` that is statically accessible. - * An Instance defines an applicative functor associated with a specific type constructor and, if it implements MonadInstance as well, a monad. - * Typically, it will be either a top-level module or a stable member of a top-level module (such as a val or a nested module). - * The `with Singleton` part of the type verifies some cases at macro compilation time, - * while the full check for static accessibility is done at macro expansion time. - * Note: Ideally, the types would verify that `i: MonadInstance` when `t.isRight`. - * With the various dependent types involved, this is not worth it. - * - * The `t` argument is the argument of the macro that will be transformed as described above. - * If the macro that calls this method is for a multi-input map (app followed by map), - * `t` should be the argument wrapped in Left. - * If this is for multi-input flatMap (app followed by flatMap), - * this should be the argument wrapped in Right.
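The statically accessible `Instance` the documentation describes can be illustrated with a concrete implementation. This sketch specializes the deleted trait to `Option` and omits the `AList`-based `app` method to stay self-contained; `SimpleInstance`, `SimpleMonadInstance`, and `OptionInstance` are our own names, not part of the original API:

```scala
// Reduced stand-ins for the deleted Instance/MonadInstance traits
// (the AList-based `app` method is omitted for brevity).
trait SimpleInstance {
  type M[x]
  def map[S, T](in: M[S], f: S => T): M[T]
  def pure[T](t: () => T): M[T]
}
trait SimpleMonadInstance extends SimpleInstance {
  def flatten[T](in: M[M[T]]): M[T]
}

// A statically accessible instance, as contImpl requires: a top-level module.
object OptionInstance extends SimpleMonadInstance {
  type M[x] = Option[x]
  def map[S, T](in: Option[S], f: S => T): Option[T] = in.map(f)
  def pure[T](t: () => T): Option[T] = Some(t())
  def flatten[T](in: Option[Option[T]]): Option[T] = in.flatten
}
```

Composing `pure` (or `map`) with `flatten` is the same shape `contImpl` generates for the flatMap (`Right`) case, where the body itself already has type `M[T]`.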
- */ - def contImpl[T, N[_]](c: blackbox.Context, i: Instance with Singleton, convert: Convert, builder: TupleBuilder)(t: Either[c.Expr[T], c.Expr[i.M[T]]], inner: Transform[c.type, N])( - implicit - tt: c.WeakTypeTag[T], nt: c.WeakTypeTag[N[T]], it: c.TypeTag[i.type] - ): c.Expr[i.M[N[T]]] = - { - import c.universe.{ Apply => ApplyTree, _ } - - val util = ContextUtil[c.type](c) - val mTC: Type = util.extractTC(i, InstanceTCName) - val mttpe: Type = appliedType(mTC, nt.tpe :: Nil).dealias - - // the tree for the macro argument - val (tree, treeType) = t match { - case Left(l) => (l.tree, nt.tpe.dealias) - case Right(r) => (r.tree, mttpe) - } - // the Symbol for the anonymous function passed to the appropriate Instance.map/flatMap/pure method - // this Symbol needs to be known up front so that it can be used as the owner of synthetic vals - val functionSym = util.functionSymbol(tree.pos) - - val instanceSym = util.singleton(i) - // A Tree that references the statically accessible Instance that provides the actual implementations of map, flatMap, ... - val instance = Ident(instanceSym) - - val isWrapper: (String, Type, Tree) => Boolean = convert.asPredicate(c) - - // Local definitions `defs` in the macro. This is used to ensure references are to M instances defined outside of the macro call. 
- // Also `refCount` is the number of references, which is used to create the private, synthetic method containing the body - val defs = util.collectDefs(tree, isWrapper) - val checkQual: Tree => Unit = util.checkReferences(defs, isWrapper) - - type In = Input[c.universe.type] - var inputs = List[In]() - - // transforms the original tree into calls to the Instance functions pure, map, ..., - // resulting in a value of type M[T] - def makeApp(body: Tree): Tree = - inputs match { - case Nil => pure(body) - case x :: Nil => single(body, x) - case xs => arbArity(body, xs) - } - - // no inputs, so construct M[T] via Instance.pure or pure+flatten - def pure(body: Tree): Tree = - { - val typeApplied = TypeApply(util.select(instance, PureName), TypeTree(treeType) :: Nil) - val f = util.createFunction(Nil, body, functionSym) - val p = ApplyTree(typeApplied, f :: Nil) - if (t.isLeft) p else flatten(p) - } - // m should have type M[M[T]] - // the returned Tree will have type M[T] - def flatten(m: Tree): Tree = - { - val typedFlatten = TypeApply(util.select(instance, FlattenName), TypeTree(tt.tpe) :: Nil) - ApplyTree(typedFlatten, m :: Nil) - } - - // calls Instance.map or flatmap directly, skipping the intermediate Instance.app that is unnecessary for a single input - def single(body: Tree, input: In): Tree = - { - val variable = input.local - val param = treeCopy.ValDef(variable, util.parameterModifiers, variable.name, variable.tpt, EmptyTree) - val typeApplied = TypeApply(util.select(instance, MapName), variable.tpt :: TypeTree(treeType) :: Nil) - val f = util.createFunction(param :: Nil, body, functionSym) - val mapped = ApplyTree(typeApplied, input.expr :: f :: Nil) - if (t.isLeft) mapped else flatten(mapped) - } - - // calls Instance.app to get the values for all inputs and then calls Instance.map or flatMap to evaluate the body - def arbArity(body: Tree, inputs: List[In]): Tree = - { - val result = builder.make(c)(mTC, inputs) - val param = 
util.freshMethodParameter(appliedType(result.representationC, util.idTC :: Nil)) - val bindings = result.extract(param) - val f = util.createFunction(param :: Nil, Block(bindings, body), functionSym) - val ttt = TypeTree(treeType) - val typedApp = TypeApply(util.select(instance, ApplyName), TypeTree(result.representationC) :: ttt :: Nil) - val app = ApplyTree(ApplyTree(typedApp, result.input :: f :: Nil), result.alistInstance :: Nil) - if (t.isLeft) app else flatten(app) - } - - // Called when transforming the tree to add an input. - // For `qual` of type M[A], and a `selection` qual.value, - // the call is addType(Type A, Tree qual) - // The result is a Tree representing a reference to - // the bound value of the input. - def addType(tpe: Type, qual: Tree, selection: Tree): Tree = - { - qual.foreach(checkQual) - val vd = util.freshValDef(tpe, qual.pos, functionSym) - inputs ::= new Input(tpe, qual, vd) - util.refVal(selection, vd) - } - def sub(name: String, tpe: Type, qual: Tree, replace: Tree): Converted[c.type] = - { - val tag = c.WeakTypeTag[T](tpe) - convert[T](c)(name, qual)(tag) transform { tree => - addType(tpe, tree, replace) - } - } - - // applies the transformation - val tx = util.transformWrappers(tree, (n, tpe, t, replace) => sub(n, tpe, t, replace)) - // resetting attributes must be: a) local b) done here and not wider or else there are obscure errors - val tr = makeApp(inner(tx)) - c.Expr[i.M[N[T]]](tr) - } - - import Types._ - - implicit def applicativeInstance[A[_]](implicit ap: Applicative[A]): Instance { type M[x] = A[x] } = new Instance { - type M[x] = A[x] - def app[K[L[x]], Z](in: K[A], f: K[Id] => Z)(implicit a: AList[K]) = a.apply[A, Z](in, f) - def map[S, T](in: A[S], f: S => T) = ap.map(f, in) - def pure[S](s: () => S): M[S] = ap.pure(s()) - } - - type AI[A[_]] = Instance { type M[x] = A[x] } - def compose[A[_], B[_]](implicit a: AI[A], b: AI[B]): Instance { type M[x] = A[B[x]] } = new Composed[A, B](a, b) - // made a public, named, 
unsealed class because of trouble with macros and inference when the Instance is not an object - class Composed[A[_], B[_]](a: AI[A], b: AI[B]) extends Instance { - type M[x] = A[B[x]] - def pure[S](s: () => S): A[B[S]] = a.pure(() => b.pure(s)) - def map[S, T](in: M[S], f: S => T): M[T] = a.map(in, (bv: B[S]) => b.map(bv, f)) - def app[K[L[x]], Z](in: K[M], f: K[Id] => Z)(implicit alist: AList[K]): A[B[Z]] = - { - val g: K[B] => B[Z] = in => b.app[K, Z](in, f) - type Split[L[x]] = K[(L ∙ B)#l] - a.app[Split, B[Z]](in, g)(AList.asplit(alist)) - } - } -} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala deleted file mode 100644 index 65b061e66..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/KListBuilder.scala +++ /dev/null @@ -1,64 +0,0 @@ -package sbt.internal.util -package appmacro - -import scala.reflect._ -import macros._ - -/** A `TupleBuilder` that uses a KList as the tuple representation.*/ -object KListBuilder extends TupleBuilder { - def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { - val ctx: c.type = c - val util = ContextUtil[c.type](c) - import c.universe.{ Apply => ApplyTree, _ } - import util._ - - val knilType = c.typeOf[KNil] - val knil = Ident(knilType.typeSymbol.companion) - val kconsTpe = c.typeOf[KCons[Int, KNil, List]] - val kcons = kconsTpe.typeSymbol.companion - val mTC: Type = mt.asInstanceOf[c.universe.Type] - val kconsTC: Type = kconsTpe.typeConstructor - - /** This is the L in the type function [L[x]] ... 
*/ - val tcVariable: TypeSymbol = newTCVariable(util.initialOwner) - - /** Instantiates KCons[h, t <: KList[L], L], where L is the type constructor variable */ - def kconsType(h: Type, t: Type): Type = - appliedType(kconsTC, h :: t :: refVar(tcVariable) :: Nil) - - def bindKList(prev: ValDef, revBindings: List[ValDef], params: List[ValDef]): List[ValDef] = - params match { - case (x @ ValDef(mods, name, tpt, _)) :: xs => - val rhs = select(Ident(prev.name), "head") - val head = treeCopy.ValDef(x, mods, name, tpt, rhs) - util.setSymbol(head, x.symbol) - val tail = localValDef(TypeTree(), select(Ident(prev.name), "tail")) - val base = head :: revBindings - bindKList(tail, if (xs.isEmpty) base else tail :: base, xs) - case Nil => revBindings.reverse - } - - private[this] def makeKList(revInputs: Inputs[c.universe.type], klist: Tree, klistType: Type): Tree = - revInputs match { - case in :: tail => - val next = ApplyTree(TypeApply(Ident(kcons), TypeTree(in.tpe) :: TypeTree(klistType) :: TypeTree(mTC) :: Nil), in.expr :: klist :: Nil) - makeKList(tail, next, appliedType(kconsTC, in.tpe :: klistType :: mTC :: Nil)) - case Nil => klist - } - - /** The input trees combined in a KList */ - val klist = makeKList(inputs.reverse, knil, knilType) - - /** - * The input types combined in a KList type. The main concern is tracking the heterogeneous types. - * The type constructor is tcVariable, so that it can be applied to [X] X or M later. - * When applied to `M`, this type gives the type of the `input` KList. 
- */ - val klistType: Type = (inputs :\ knilType)((in, klist) => kconsType(in.tpe, klist)) - - val representationC = internal.polyType(tcVariable :: Nil, klistType) - val input = klist - val alistInstance: ctx.universe.Tree = TypeApply(select(Ident(alist), "klist"), TypeTree(representationC) :: Nil) - def extract(param: ValDef) = bindKList(param, Nil, inputs.map(_.local)) - } -} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala deleted file mode 100644 index cd77f50ae..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/MixedBuilder.scala +++ /dev/null @@ -1,17 +0,0 @@ -package sbt.internal.util -package appmacro - -import scala.reflect._ -import macros._ - -/** - * A builder that uses `TupleN` as the representation for small numbers of inputs (up to `TupleNBuilder.MaxInputs`) - * and `KList` for larger numbers of inputs. This builder cannot handle fewer than 2 inputs. - */ -object MixedBuilder extends TupleBuilder { - def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = - { - val delegate = if (inputs.size > TupleNBuilder.MaxInputs) KListBuilder else TupleNBuilder - delegate.make(c)(mt, inputs) - } -} diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala deleted file mode 100644 index 1186f3549..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleBuilder.scala +++ /dev/null @@ -1,55 +0,0 @@ -package sbt.internal.util -package appmacro - -import scala.reflect._ -import macros._ - -/** - * A `TupleBuilder` abstracts the work of constructing a tuple data structure such as a `TupleN` or `KList` - * and extracting values from it. 
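The `KList` structure that `KListBuilder` assembles from the inputs can be pictured at the value level. A reduced sketch (our own `KL`, `KCons`, and `makeKL` names; the real `KCons[H, T <: KList[M], M[_]]` additionally tracks the type constructor `M` and the precise heterogeneous element types):

```scala
// Value-level analogue of the KList cells built by makeKList:
// nested KCons cells terminated by KNil.
sealed trait KL
case object KNil extends KL
final case class KCons[H](head: H, tail: KL) extends KL

// Mirrors the builder's right fold over the inputs
// (compare `(inputs :\ knilType)(...)` in the deleted code).
def makeKL(inputs: List[Any]): KL =
  inputs.foldRight(KNil: KL)((h, t) => KCons(h, t))
```

The right fold reflects why `makeKList` receives its inputs reversed: each new cell becomes the head of the list built so far.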
The `Instance` macro implementation will (roughly) traverse the tree of its argument - * and ultimately obtain a list of expressions with type `M[T]` for different types `T`. - * The macro constructs an `Input` value for each of these expressions that contains the `Type` for `T`, - * the `Tree` for the expression, and a `ValDef` that will hold the value for the input. - * - * `TupleBuilder.apply` is provided with the list of `Input`s and is expected to provide three values in the returned BuilderResult. - * First, it returns the constructed tuple data structure Tree in `input`. - * Next, it provides the type constructor `representationC` that, when applied to M, gives the type of tuple data structure. - * For example, a builder that constructs a `Tuple3` for inputs `M[Int]`, `M[Boolean]`, and `M[String]` - * would provide a Type representing `[L[x]] (L[Int], L[Boolean], L[String])`. The `input` method - * would return a value whose type is that type constructor applied to M, or `(M[Int], M[Boolean], M[String])`. - * - * Finally, the `extract` method provides a list of vals that extract information from the applied input. - * The type of the applied input is the type constructor applied to `Id` (`[X] X`). - * The returned list of ValDefs should be the ValDefs from `inputs`, but with non-empty right-hand sides. - */ -trait TupleBuilder { - /** A convenience alias for a list of inputs (associated with a Universe of type U). */ - type Inputs[U <: Universe with Singleton] = List[Instance.Input[U]] - - /** Constructs a one-time use Builder for Context `c` and type constructor `tcType`. 
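The shape of `representationC` described above can be made concrete with a small sketch (an illustration of the type involved, not the macro's actual output):

```scala
object ReprDemo {
  type Id[X] = X

  // The type constructor [L[x]] (L[Int], L[Boolean], L[String]) from the
  // Tuple3 example in the comment above.
  type Repr[L[_]] = (L[Int], L[Boolean], L[String])

  // Applied to M (here Option): the type of the constructed tuple of inputs.
  val asInputs: Repr[Option] = (Some(3), Some(true), Some("x"))

  // Applied to Id: the type the extraction step works with.
  val asValues: Repr[Id] = (3, true, "x")
}
```

The same `Repr` alias serves both roles, mirroring how a single `representationC` is applied to `M` for `input` and to `Id` for `extract`.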
*/ - def make(c: blackbox.Context)(tcType: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] -} - -trait BuilderResult[C <: blackbox.Context with Singleton] { - val ctx: C - import ctx.universe._ - - /** - * Represents the higher-order type constructor `[L[x]] ...` where `...` is the - * type of the data structure containing the added expressions, - * except that it is abstracted over the type constructor applied to each heterogeneous part of the type. - */ - def representationC: PolyType - - /** The instance of AList for the input. For a `representationC` of `[L[x]]`, this `Tree` should have a `Type` of `AList[L]`*/ - def alistInstance: Tree - - /** Returns the completed value containing all expressions added to the builder. */ - def input: Tree - - /* The list of definitions that extract values from a value of type `$representationC[Id]`. - * The returned value should be identical to the `ValDef`s provided to the `TupleBuilder.make` method but with - * non-empty right hand sides. Each `ValDef` may refer to `param` and previous `ValDef`s in the list.*/ - def extract(param: ValDef): List[ValDef] -} - diff --git a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala b/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala deleted file mode 100644 index 1c5430e4c..000000000 --- a/internal/util-appmacro/src/main/scala/sbt/internal/util/appmacro/TupleNBuilder.scala +++ /dev/null @@ -1,49 +0,0 @@ -package sbt.internal.util -package appmacro - -import scala.tools.nsc.Global -import scala.reflect._ -import macros._ - -/** - * A builder that uses a TupleN as the tuple representation. - * It is limited to tuples of size 2 to `MaxInputs`. - */ -object TupleNBuilder extends TupleBuilder { - /** The largest number of inputs that this builder can handle.
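The dispatch rule that `MixedBuilder` applies above is simple enough to restate as a sketch (names and return values here are illustrative, not sbt API):

```scala
// Mirrors MixedBuilder's choice: a TupleN representation up to MaxInputs
// inputs, KList beyond that.
object BuilderChoice {
  final val MaxInputs = 11 // same value as TupleNBuilder.MaxInputs

  def choose(arity: Int): String =
    if (arity > MaxInputs) "KListBuilder" else "TupleNBuilder"
}
```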
*/ - final val MaxInputs = 11 - final val TupleMethodName = "tuple" - - def make(c: blackbox.Context)(mt: c.Type, inputs: Inputs[c.universe.type]): BuilderResult[c.type] = new BuilderResult[c.type] { - val util = ContextUtil[c.type](c) - import c.universe._ - import util._ - - val global: Global = c.universe.asInstanceOf[Global] - - val ctx: c.type = c - val representationC: PolyType = { - val tcVariable: Symbol = newTCVariable(util.initialOwner) - val tupleTypeArgs = inputs.map(in => internal.typeRef(NoPrefix, tcVariable, in.tpe :: Nil).asInstanceOf[global.Type]) - val tuple = global.definitions.tupleType(tupleTypeArgs) - internal.polyType(tcVariable :: Nil, tuple.asInstanceOf[Type]) - } - - val input: Tree = mkTuple(inputs.map(_.expr)) - val alistInstance: Tree = { - val selectTree = select(Ident(alist), TupleMethodName + inputs.size.toString) - TypeApply(selectTree, inputs.map(in => TypeTree(in.tpe))) - } - def extract(param: ValDef): List[ValDef] = bindTuple(param, Nil, inputs.map(_.local), 1) - - def bindTuple(param: ValDef, revBindings: List[ValDef], params: List[ValDef], i: Int): List[ValDef] = - params match { - case (x @ ValDef(mods, name, tpt, _)) :: xs => - val rhs = select(Ident(param.name), "_" + i.toString) - val newVal = treeCopy.ValDef(x, mods, name, tpt, rhs) - util.setSymbol(newVal, x.symbol) - bindTuple(param, newVal :: revBindings, xs, i + 1) - case Nil => revBindings.reverse - } - } -} diff --git a/internal/util-collection/NOTICE b/internal/util-collection/NOTICE deleted file mode 100644 index 428020987..000000000 --- a/internal/util-collection/NOTICE +++ /dev/null @@ -1,3 +0,0 @@ -Simple Build Tool: Collection Component -Copyright 2010 Mark Harrah -Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/AList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/AList.scala deleted file mode 100644 index 3247e9a8a..000000000 --- 
a/internal/util-collection/src/main/scala/sbt/internal/util/AList.scala +++ /dev/null @@ -1,212 +0,0 @@ -package sbt.internal.util - -import Classes.Applicative -import Types._ - -/** - * An abstraction over a higher-order type constructor `K[x[y]]` with the purpose of abstracting - * over heterogeneous sequences like `KList` and `TupleN` with elements with a common type - * constructor as well as homogeneous sequences `Seq[M[T]]`. - */ -trait AList[K[L[x]]] { - def transform[M[_], N[_]](value: K[M], f: M ~> N): K[N] - def traverse[M[_], N[_], P[_]](value: K[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[K[P]] - def foldr[M[_], A](value: K[M], f: (M[_], A) => A, init: A): A - - def toList[M[_]](value: K[M]): List[M[_]] = foldr[M, List[M[_]]](value, _ :: _, Nil) - def apply[M[_], C](value: K[M], f: K[Id] => C)(implicit a: Applicative[M]): M[C] = - a.map(f, traverse[M, M, Id](value, idK[M])(a)) -} -object AList { - type Empty = AList[({ type l[L[x]] = Unit })#l] - /** AList for Unit, which represents a sequence that is always empty.*/ - val empty: Empty = new Empty { - def transform[M[_], N[_]](in: Unit, f: M ~> N) = () - def foldr[M[_], T](in: Unit, f: (M[_], T) => T, init: T) = init - override def apply[M[_], C](in: Unit, f: Unit => C)(implicit app: Applicative[M]): M[C] = app.pure(f(())) - def traverse[M[_], N[_], P[_]](in: Unit, f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Unit] = np.pure(()) - } - - type SeqList[T] = AList[({ type l[L[x]] = List[L[T]] })#l] - /** AList for a homogeneous sequence. 
*/ - def seq[T]: SeqList[T] = new SeqList[T] { - def transform[M[_], N[_]](s: List[M[T]], f: M ~> N) = s.map(f.fn[T]) - def foldr[M[_], A](s: List[M[T]], f: (M[_], A) => A, init: A): A = (init /: s.reverse)((t, m) => f(m, t)) - override def apply[M[_], C](s: List[M[T]], f: List[T] => C)(implicit ap: Applicative[M]): M[C] = - { - def loop[V](in: List[M[T]], g: List[T] => V): M[V] = - in match { - case Nil => ap.pure(g(Nil)) - case x :: xs => - val h = (ts: List[T]) => (t: T) => g(t :: ts) - ap.apply(loop(xs, h), x) - } - loop(s, f) - } - def traverse[M[_], N[_], P[_]](s: List[M[T]], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[List[P[T]]] = ??? - } - - /** AList for the arbitrary arity data structure KList. */ - def klist[KL[M[_]] <: KList[M] { type Transform[N[_]] = KL[N] }]: AList[KL] = new AList[KL] { - def transform[M[_], N[_]](k: KL[M], f: M ~> N) = k.transform(f) - def foldr[M[_], T](k: KL[M], f: (M[_], T) => T, init: T): T = k.foldr(f, init) - override def apply[M[_], C](k: KL[M], f: KL[Id] => C)(implicit app: Applicative[M]): M[C] = k.apply(f)(app) - def traverse[M[_], N[_], P[_]](k: KL[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KL[P]] = k.traverse[N, P](f)(np) - override def toList[M[_]](k: KL[M]) = k.toList - } - - /** AList for a single value. 
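The applicative `loop` in `seq.apply` above has the familiar shape of sequencing; specialized to the `Option` applicative it can be sketched as:

```scala
object SequenceDemo {
  // Same recursion shape as the loop in AList.seq.apply, specialized to
  // Option: combine List[Option[T]] into Option[List[T]], failing on any None.
  def sequence[T](in: List[Option[T]]): Option[List[T]] = in match {
    case Nil     => Some(Nil)
    case x :: xs => for (h <- x; t <- sequence(xs)) yield h :: t
  }
}
```

The generic version in `AList.seq` does the same thing through `Applicative.apply` and `pure` so it works for any `M[_]`, not just `Option`.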
*/ - type Single[A] = AList[({ type l[L[x]] = L[A] })#l] - def single[A]: Single[A] = new Single[A] { - def transform[M[_], N[_]](a: M[A], f: M ~> N) = f(a) - def foldr[M[_], T](a: M[A], f: (M[_], T) => T, init: T): T = f(a, init) - def traverse[M[_], N[_], P[_]](a: M[A], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[P[A]] = f(a) - } - - type ASplit[K[L[x]], B[x]] = AList[({ type l[L[x]] = K[(L ∙ B)#l] })#l] - /** AList that operates on the outer type constructor `A` of a composition `[x] A[B[x]]` for type constructors `A` and `B`*/ - def asplit[K[L[x]], B[x]](base: AList[K]): ASplit[K, B] = new ASplit[K, B] { - type Split[L[x]] = K[(L ∙ B)#l] - def transform[M[_], N[_]](value: Split[M], f: M ~> N): Split[N] = - base.transform[(M ∙ B)#l, (N ∙ B)#l](value, nestCon[M, N, B](f)) - - def traverse[M[_], N[_], P[_]](value: Split[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Split[P]] = - { - val g = nestCon[M, (N ∙ P)#l, B](f) - base.traverse[(M ∙ B)#l, N, (P ∙ B)#l](value, g)(np) - } - - def foldr[M[_], A](value: Split[M], f: (M[_], A) => A, init: A): A = - base.foldr[(M ∙ B)#l, A](value, f, init) - } - - // TODO: auto-generate - sealed trait T2K[A, B] { type l[L[x]] = (L[A], L[B]) } - type T2List[A, B] = AList[T2K[A, B]#l] - def tuple2[A, B]: T2List[A, B] = new T2List[A, B] { - type T2[M[_]] = (M[A], M[B]) - def transform[M[_], N[_]](t: T2[M], f: M ~> N): T2[N] = (f(t._1), f(t._2)) - def foldr[M[_], T](t: T2[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, init)) - def traverse[M[_], N[_], P[_]](t: T2[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T2[P]] = - { - val g = (Tuple2.apply[P[A], P[B]] _).curried - np.apply(np.map(g, f(t._1)), f(t._2)) - } - } - - sealed trait T3K[A, B, C] { type l[L[x]] = (L[A], L[B], L[C]) } - type T3List[A, B, C] = AList[T3K[A, B, C]#l] - def tuple3[A, B, C]: T3List[A, B, C] = new T3List[A, B, C] { - type T3[M[_]] = (M[A], M[B], M[C]) - def transform[M[_], N[_]](t: T3[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3)) - 
def foldr[M[_], T](t: T3[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, init))) - def traverse[M[_], N[_], P[_]](t: T3[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T3[P]] = - { - val g = (Tuple3.apply[P[A], P[B], P[C]] _).curried - np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)) - } - } - - sealed trait T4K[A, B, C, D] { type l[L[x]] = (L[A], L[B], L[C], L[D]) } - type T4List[A, B, C, D] = AList[T4K[A, B, C, D]#l] - def tuple4[A, B, C, D]: T4List[A, B, C, D] = new T4List[A, B, C, D] { - type T4[M[_]] = (M[A], M[B], M[C], M[D]) - def transform[M[_], N[_]](t: T4[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4)) - def foldr[M[_], T](t: T4[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, init)))) - def traverse[M[_], N[_], P[_]](t: T4[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T4[P]] = - { - val g = (Tuple4.apply[P[A], P[B], P[C], P[D]] _).curried - np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)) - } - } - - sealed trait T5K[A, B, C, D, E] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E]) } - type T5List[A, B, C, D, E] = AList[T5K[A, B, C, D, E]#l] - def tuple5[A, B, C, D, E]: T5List[A, B, C, D, E] = new T5List[A, B, C, D, E] { - type T5[M[_]] = (M[A], M[B], M[C], M[D], M[E]) - def transform[M[_], N[_]](t: T5[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5)) - def foldr[M[_], T](t: T5[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, init))))) - def traverse[M[_], N[_], P[_]](t: T5[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T5[P]] = - { - val g = (Tuple5.apply[P[A], P[B], P[C], P[D], P[E]] _).curried - np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)) - } - } - - sealed trait T6K[A, B, C, D, E, F] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F]) } - type T6List[A, B, C, D, E, F] = AList[T6K[A, B, C, D, E, F]#l] - def tuple6[A, B, C, D, E, F]: T6List[A, B, C, D, E, F] = new 
T6List[A, B, C, D, E, F] { - type T6[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F]) - def transform[M[_], N[_]](t: T6[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6)) - def foldr[M[_], T](t: T6[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, init)))))) - def traverse[M[_], N[_], P[_]](t: T6[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T6[P]] = - { - val g = (Tuple6.apply[P[A], P[B], P[C], P[D], P[E], P[F]] _).curried - np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)) - } - } - - sealed trait T7K[A, B, C, D, E, F, G] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G]) } - type T7List[A, B, C, D, E, F, G] = AList[T7K[A, B, C, D, E, F, G]#l] - def tuple7[A, B, C, D, E, F, G]: T7List[A, B, C, D, E, F, G] = new T7List[A, B, C, D, E, F, G] { - type T7[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G]) - def transform[M[_], N[_]](t: T7[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7)) - def foldr[M[_], T](t: T7[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, init))))))) - def traverse[M[_], N[_], P[_]](t: T7[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T7[P]] = - { - val g = (Tuple7.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G]] _).curried - np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)) - } - } - sealed trait T8K[A, B, C, D, E, F, G, H] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H]) } - type T8List[A, B, C, D, E, F, G, H] = AList[T8K[A, B, C, D, E, F, G, H]#l] - def tuple8[A, B, C, D, E, F, G, H]: T8List[A, B, C, D, E, F, G, H] = new T8List[A, B, C, D, E, F, G, H] { - type T8[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H]) - def transform[M[_], N[_]](t: T8[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), 
f(t._8)) - def foldr[M[_], T](t: T8[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, init)))))))) - def traverse[M[_], N[_], P[_]](t: T8[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T8[P]] = - { - val g = (Tuple8.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H]] _).curried - np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)) - } - } - - sealed trait T9K[A, B, C, D, E, F, G, H, I] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I]) } - type T9List[A, B, C, D, E, F, G, H, I] = AList[T9K[A, B, C, D, E, F, G, H, I]#l] - def tuple9[A, B, C, D, E, F, G, H, I]: T9List[A, B, C, D, E, F, G, H, I] = new T9List[A, B, C, D, E, F, G, H, I] { - type T9[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I]) - def transform[M[_], N[_]](t: T9[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9)) - def foldr[M[_], T](t: T9[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, init))))))))) - def traverse[M[_], N[_], P[_]](t: T9[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T9[P]] = - { - val g = (Tuple9.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H], P[I]] _).curried - np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)) - } - } - - sealed trait T10K[A, B, C, D, E, F, G, H, I, J] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J]) } - type T10List[A, B, C, D, E, F, G, H, I, J] = AList[T10K[A, B, C, D, E, F, G, H, I, J]#l] - def tuple10[A, B, C, D, E, F, G, H, I, J]: T10List[A, B, C, D, E, F, G, H, I, J] = new T10List[A, B, C, D, E, F, G, H, I, J] { - type T10[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J]) - def 
transform[M[_], N[_]](t: T10[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10)) - def foldr[M[_], T](t: T10[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, init)))))))))) - def traverse[M[_], N[_], P[_]](t: T10[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T10[P]] = - { - val g = (Tuple10.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H], P[I], P[J]] _).curried - np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)) - } - } - - sealed trait T11K[A, B, C, D, E, F, G, H, I, J, K] { type l[L[x]] = (L[A], L[B], L[C], L[D], L[E], L[F], L[G], L[H], L[I], L[J], L[K]) } - type T11List[A, B, C, D, E, F, G, H, I, J, K] = AList[T11K[A, B, C, D, E, F, G, H, I, J, K]#l] - def tuple11[A, B, C, D, E, F, G, H, I, J, K]: T11List[A, B, C, D, E, F, G, H, I, J, K] = new T11List[A, B, C, D, E, F, G, H, I, J, K] { - type T11[M[_]] = (M[A], M[B], M[C], M[D], M[E], M[F], M[G], M[H], M[I], M[J], M[K]) - def transform[M[_], N[_]](t: T11[M], f: M ~> N) = (f(t._1), f(t._2), f(t._3), f(t._4), f(t._5), f(t._6), f(t._7), f(t._8), f(t._9), f(t._10), f(t._11)) - def foldr[M[_], T](t: T11[M], f: (M[_], T) => T, init: T): T = f(t._1, f(t._2, f(t._3, f(t._4, f(t._5, f(t._6, f(t._7, f(t._8, f(t._9, f(t._10, f(t._11, init))))))))))) - def traverse[M[_], N[_], P[_]](t: T11[M], f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[T11[P]] = - { - val g = (Tuple11.apply[P[A], P[B], P[C], P[D], P[E], P[F], P[G], P[H], P[I], P[J], P[K]] _).curried - np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.apply(np.map(g, f(t._1)), f(t._2)), f(t._3)), f(t._4)), f(t._5)), f(t._6)), f(t._7)), f(t._8)), f(t._9)), f(t._10)), f(t._11)) - } - } -} diff --git 
a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala deleted file mode 100644 index 5cf6fb65b..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Attributes.scala +++ /dev/null @@ -1,210 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -import Types._ -import scala.reflect.Manifest -import sbt.util.OptJsonWriter - -// T must be invariant to work properly. -// Because it is sealed and the only instances go through AttributeKey.apply, -// a single AttributeKey instance cannot conform to AttributeKey[T] for different Ts - -/** - * A key in an [[AttributeMap]] that constrains its associated value to be of type `T`. - * The key is uniquely defined by its [[label]] and type `T`, represented at runtime by [[manifest]]. - */ -sealed trait AttributeKey[T] { - - /** The runtime evidence for `T` */ - def manifest: Manifest[T] - - /** The label is the identifier for the key and is camelCase by convention. */ - def label: String - - /** An optional, brief description of the key. */ - def description: Option[String] - - /** - * In environments that support delegation, looking up this key when it has no associated value will delegate to the values associated with these keys. - * The delegation proceeds in order the keys are returned here. - */ - def extend: Seq[AttributeKey[_]] - - /** - * Specifies whether this key is a local, anonymous key (`true`) or not (`false`). - * This is typically only used for programmatic, intermediate keys that should not be referenced outside of a specific scope. 
- */ - def isLocal: Boolean - - /** Identifies the relative importance of a key among other keys.*/ - def rank: Int - - def optJsonWriter: OptJsonWriter[T] -} -private[sbt] abstract class SharedAttributeKey[T] extends AttributeKey[T] { - override final def toString = label - override final def hashCode = label.hashCode - override final def equals(o: Any) = (this eq o.asInstanceOf[AnyRef]) || (o match { - case a: SharedAttributeKey[t] => a.label == this.label && a.manifest == this.manifest - case _ => false - }) - final def isLocal: Boolean = false -} -object AttributeKey { - def apply[T](name: String)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = - make(name, None, Nil, Int.MaxValue) - - def apply[T](name: String, rank: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = - make(name, None, Nil, rank) - - def apply[T](name: String, description: String)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = - apply(name, description, Nil) - - def apply[T](name: String, description: String, rank: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = - apply(name, description, Nil, rank) - - def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]])(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = - apply(name, description, extend, Int.MaxValue) - - def apply[T](name: String, description: String, extend: Seq[AttributeKey[_]], rank: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = - make(name, Some(description), extend, rank) - - private[this] def make[T](name: String, description0: Option[String], extend0: Seq[AttributeKey[_]], rank0: Int)(implicit mf: Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = new SharedAttributeKey[T] { - def manifest = mf - val label = Util.hyphenToCamel(name) - def description = description0 - def extend = extend0 - def rank = rank0 - def optJsonWriter = ojw - } - private[sbt] def local[T](implicit mf: 
Manifest[T], ojw: OptJsonWriter[T]): AttributeKey[T] = new AttributeKey[T] { - def manifest = mf - def label = LocalLabel - def description = None - def extend = Nil - override def toString = label - def isLocal: Boolean = true - def rank = Int.MaxValue - val optJsonWriter = ojw - } - private[sbt] final val LocalLabel = "$" + "local" -} - -/** - * An immutable map where a key is the tuple `(String,T)` for a fixed type `T` and can only be associated with values of type `T`. - * It is therefore possible for this map to contain mappings for keys with the same label but different types. - * Excluding this possibility is the responsibility of the client if desired. - */ -trait AttributeMap { - /** - * Gets the value of type `T` associated with the key `k`. - * If a key with the same label but different type is defined, this method will fail. - */ - def apply[T](k: AttributeKey[T]): T - - /** - * Gets the value of type `T` associated with the key `k` or `None` if no value is associated. - * If a key with the same label but a different type is defined, this method will return `None`. - */ - def get[T](k: AttributeKey[T]): Option[T] - - /** - * Returns this map without the mapping for `k`. - * This method will not remove a mapping for a key with the same label but a different type. - */ - def remove[T](k: AttributeKey[T]): AttributeMap - - /** - * Returns true if this map contains a mapping for `k`. - * If a key with the same label but a different type is defined in this map, this method will return `false`. - */ - def contains[T](k: AttributeKey[T]): Boolean - - /** - * Adds the mapping `k -> value` to this map, replacing any existing mapping for `k`. - * Any mappings for keys with the same label but different types are unaffected. - */ - def put[T](k: AttributeKey[T], value: T): AttributeMap - - /** All keys with defined mappings. There may be multiple keys with the same `label`, but different types. 
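The property documented here — mappings for keys with the same label but different types coexist independently — can be sketched with a simplified key identity of `(label, runtime class)`. This is an illustration only, not sbt's implementation, which keys on the full `Manifest`:

```scala
import scala.reflect.ClassTag

// Simplified sketch: key identity is (label, runtime class), so "size": Int
// and "size": String are distinct entries, as with AttributeKey.
final case class Key[T](label: String)(implicit val tag: ClassTag[T])

final class TypedMap private (backing: Map[(String, ClassTag[_]), Any]) {
  def put[T](k: Key[T], v: T): TypedMap =
    new TypedMap(backing.updated((k.label, k.tag), v))
  def get[T](k: Key[T]): Option[T] =
    backing.get((k.label, k.tag)).map(_.asInstanceOf[T])
}

object TypedMap {
  val empty: TypedMap = new TypedMap(Map.empty)
}
```

The single unchecked cast in `get` is safe because `put` is the only way to associate a value with a key, and it requires the value's type to match the key's.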
*/ - def keys: Iterable[AttributeKey[_]] - - /** Adds the mappings in `o` to this map, with mappings in `o` taking precedence over existing mappings.*/ - def ++(o: Iterable[AttributeEntry[_]]): AttributeMap - - /** Combines the mappings in `o` with the mappings in this map, with mappings in `o` taking precedence over existing mappings.*/ - def ++(o: AttributeMap): AttributeMap - - /** All mappings in this map. The [[AttributeEntry]] type preserves the typesafety of mappings, although the specific types are unknown.*/ - def entries: Iterable[AttributeEntry[_]] - - /** `true` if there are no mappings in this map, `false` if there are. */ - def isEmpty: Boolean -} -object AttributeMap { - /** An [[AttributeMap]] without any mappings. */ - val empty: AttributeMap = new BasicAttributeMap(Map.empty) - - /** Constructs an [[AttributeMap]] containing the given `entries`. */ - def apply(entries: Iterable[AttributeEntry[_]]): AttributeMap = empty ++ entries - - /** Constructs an [[AttributeMap]] containing the given `entries`.*/ - def apply(entries: AttributeEntry[_]*): AttributeMap = empty ++ entries - - /** Presents an `AttributeMap` as a natural transformation. 
*/ - implicit def toNatTrans(map: AttributeMap): AttributeKey ~> Id = new (AttributeKey ~> Id) { - def apply[T](key: AttributeKey[T]): T = map(key) - } -} -private class BasicAttributeMap(private val backing: Map[AttributeKey[_], Any]) extends AttributeMap { - def isEmpty: Boolean = backing.isEmpty - def apply[T](k: AttributeKey[T]) = backing(k).asInstanceOf[T] - def get[T](k: AttributeKey[T]) = backing.get(k).asInstanceOf[Option[T]] - def remove[T](k: AttributeKey[T]): AttributeMap = new BasicAttributeMap(backing - k) - def contains[T](k: AttributeKey[T]) = backing.contains(k) - def put[T](k: AttributeKey[T], value: T): AttributeMap = new BasicAttributeMap(backing.updated(k, value)) - def keys: Iterable[AttributeKey[_]] = backing.keys - def ++(o: Iterable[AttributeEntry[_]]): AttributeMap = - { - val newBacking = (backing /: o) { case (b, AttributeEntry(key, value)) => b.updated(key, value) } - new BasicAttributeMap(newBacking) - } - def ++(o: AttributeMap): AttributeMap = - o match { - case bam: BasicAttributeMap => new BasicAttributeMap(backing ++ bam.backing) - case _ => o ++ this - } - def entries: Iterable[AttributeEntry[_]] = - for ((k: AttributeKey[kt], v) <- backing) yield AttributeEntry(k, v.asInstanceOf[kt]) - override def toString = entries.mkString("(", ", ", ")") -} - -// type inference required less generality -/** A map entry where `key` is constrained to only be associated with a fixed value of type `T`. */ -final case class AttributeEntry[T](key: AttributeKey[T], value: T) { - override def toString = key.label + ": " + value -} - -/** Associates a `metadata` map with `data`. */ -final case class Attributed[D](data: D)(val metadata: AttributeMap) { - /** Retrieves the associated value of `key` from the metadata. */ - def get[T](key: AttributeKey[T]): Option[T] = metadata.get(key) - - /** Defines a mapping `key -> value` in the metadata. 
*/ - def put[T](key: AttributeKey[T], value: T): Attributed[D] = Attributed(data)(metadata.put(key, value)) - - /** Transforms the data by applying `f`. */ - def map[T](f: D => T): Attributed[T] = Attributed(f(data))(metadata) -} -object Attributed { - /** Extracts the underlying data from the sequence `in`. */ - def data[T](in: Seq[Attributed[T]]): Seq[T] = in.map(_.data) - - /** Associates empty metadata maps with each entry of `in`.*/ - def blankSeq[T](in: Seq[T]): Seq[Attributed[T]] = in map blank - - /** Associates an empty metadata map with `data`. */ - def blank[T](data: T): Attributed[T] = Attributed(data)(AttributeMap.empty) -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala deleted file mode 100644 index b44cb8606..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Classes.scala +++ /dev/null @@ -1,24 +0,0 @@ -package sbt.internal.util - -object Classes { - trait Applicative[M[_]] { - def apply[S, T](f: M[S => T], v: M[S]): M[T] - def pure[S](s: => S): M[S] - def map[S, T](f: S => T, v: M[S]): M[T] - } - trait Monad[M[_]] extends Applicative[M] { - def flatten[T](m: M[M[T]]): M[T] - } - implicit val optionMonad: Monad[Option] = new Monad[Option] { - def apply[S, T](f: Option[S => T], v: Option[S]) = (f, v) match { case (Some(fv), Some(vv)) => Some(fv(vv)); case _ => None } - def pure[S](s: => S) = Some(s) - def map[S, T](f: S => T, v: Option[S]) = v map f - def flatten[T](m: Option[Option[T]]): Option[T] = m.flatten - } - implicit val listMonad: Monad[List] = new Monad[List] { - def apply[S, T](f: List[S => T], v: List[S]) = for (fv <- f; vv <- v) yield fv(vv) - def pure[S](s: => S) = s :: Nil - def map[S, T](f: S => T, v: List[S]) = v map f - def flatten[T](m: List[List[T]]): List[T] = m.flatten - } -} \ No newline at end of file diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala 
b/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala deleted file mode 100644 index 1c9d93ea0..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Dag.scala +++ /dev/null @@ -1,127 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2009, 2010 David MacIver, Mark Harrah - */ -package sbt.internal.util - -trait Dag[Node <: Dag[Node]] { - self: Node => - - def dependencies: Iterable[Node] - def topologicalSort = Dag.topologicalSort(self)(_.dependencies) -} -object Dag { - import scala.collection.{ mutable, JavaConverters } - import JavaConverters.asScalaSetConverter - - def topologicalSort[T](root: T)(dependencies: T => Iterable[T]): List[T] = topologicalSort(root :: Nil)(dependencies) - - def topologicalSort[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = - { - val discovered = new mutable.HashSet[T] - val finished = (new java.util.LinkedHashSet[T]).asScala - - def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(node: T): Unit = { - if (!discovered(node)) { - discovered(node) = true; - try { visitAll(dependencies(node)); } catch { case c: Cyclic => throw node :: c } - finished += node - () - } else if (!finished(node)) - throw new Cyclic(node) - } - - visitAll(nodes) - - finished.toList - } - // doesn't check for cycles - def topologicalSortUnchecked[T](node: T)(dependencies: T => Iterable[T]): List[T] = topologicalSortUnchecked(node :: Nil)(dependencies) - - def topologicalSortUnchecked[T](nodes: Iterable[T])(dependencies: T => Iterable[T]): List[T] = - { - val discovered = new mutable.HashSet[T] - var finished: List[T] = Nil - - def visitAll(nodes: Iterable[T]) = nodes foreach visit - def visit(node: T): Unit = { - if (!discovered(node)) { - discovered(node) = true - visitAll(dependencies(node)) - finished ::= node - } - } - - visitAll(nodes); - finished; - } - final class Cyclic(val value: Any, val all: List[Any], val complete: Boolean) - extends Exception("Cyclic reference involving " + 
- (if (complete) all.mkString("\n ", "\n ", "") else value)) { - def this(value: Any) = this(value, value :: Nil, false) - override def toString = getMessage - def ::(a: Any): Cyclic = - if (complete) - this - else if (a == value) - new Cyclic(value, all, true) - else - new Cyclic(value, a :: all, false) - } - - /** A directed graph with edges labeled positive or negative. */ - private[sbt] trait DirectedSignedGraph[Node] { - /** - * Directed edge type that tracks the sign and target (head) vertex. - * The sign can be obtained via [[isNegative]] and the target vertex via [[head]]. - */ - type Arrow - /** List of initial nodes. */ - def nodes: List[Arrow] - /** Outgoing edges for `n`. */ - def dependencies(n: Node): List[Arrow] - /** `true` if the edge `a` is "negative", false if it is "positive". */ - def isNegative(a: Arrow): Boolean - /** The target of the directed edge `a`. */ - def head(a: Arrow): Node - } - - /** - * Traverses a directed graph defined by `graph` looking for a cycle that includes a "negative" edge. - * The directed edges are weighted by the caller as "positive" or "negative". - * If a cycle containing a "negative" edge is detected, its member edges are returned in order. - * Otherwise, the empty list is returned. - */ - private[sbt] def findNegativeCycle[Node](graph: DirectedSignedGraph[Node]): List[graph.Arrow] = - { - import graph._ - val finished = new mutable.HashSet[Node] - val visited = new mutable.HashSet[Node] - - def visit(edges: List[Arrow], stack: List[Arrow]): List[Arrow] = edges match { - case Nil => Nil - case edge :: tail => - val node = head(edge) - if (!visited(node)) { - visited += node - visit(dependencies(node), edge :: stack) match { - case Nil => - finished += node - visit(tail, stack) - case cycle => cycle - } - } else if (!finished(node)) { - // cycle. If a negative edge is involved, it is an error. 
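The depth-first traversal used by `Dag.topologicalSort` can be sketched in a few lines (this version keeps the dependencies-first ordering but omits the `Cyclic` check):

```scala
object TopoDemo {
  // Depth-first post-order in the spirit of Dag.topologicalSort: each node's
  // dependencies appear before the node itself. Does not detect cycles.
  def topoSort[T](roots: List[T])(deps: T => List[T]): List[T] = {
    val discovered = scala.collection.mutable.HashSet.empty[T]
    val finished   = scala.collection.mutable.ListBuffer.empty[T]
    def visit(n: T): Unit =
      if (discovered.add(n)) { deps(n).foreach(visit); finished += n }
    roots.foreach(visit)
    finished.toList
  }
}
```

The real implementation additionally distinguishes "discovered but not finished" nodes to throw `Cyclic` when a back edge is found, and `findNegativeCycle` refines that same traversal to report only cycles containing a negative edge.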
- val between = edge :: stack.takeWhile(f => head(f) != node) - if (between exists isNegative) - between - else - visit(tail, stack) - } else - visit(tail, stack) - } - - visit(graph.nodes, Nil) - } - -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/HList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/HList.scala deleted file mode 100644 index 37c19dfdc..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/HList.scala +++ /dev/null @@ -1,32 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -import Types._ - -/** - * A minimal heterogeneous list type. For background, see - * http://apocalisp.wordpress.com/2010/07/06/type-level-programming-in-scala-part-6a-heterogeneous-list basics/ - */ -sealed trait HList { - type Wrap[M[_]] <: HList -} -sealed trait HNil extends HList { - type Wrap[M[_]] = HNil - def :+:[G](g: G): G :+: HNil = HCons(g, this) - - override def toString = "HNil" -} -object HNil extends HNil -final case class HCons[H, T <: HList](head: H, tail: T) extends HList { - type Wrap[M[_]] = M[H] :+: T#Wrap[M] - def :+:[G](g: G): G :+: H :+: T = HCons(g, this) - - override def toString = head + " :+: " + tail.toString -} - -object HList { - // contains no type information: not even A - implicit def fromList[A](list: Traversable[A]): HList = ((HNil: HList) /: list)((hl, v) => HCons(v, hl)) -} \ No newline at end of file diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala b/internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala deleted file mode 100644 index 6abae921c..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/HListFormats.scala +++ /dev/null @@ -1,70 +0,0 @@ -package sbt -package internal -package util - -import sjsonnew._ -import Types.:+: - -trait HListFormats { - implicit val lnilFormat1: JsonFormat[HNil] = forHNil(HNil) - implicit val lnilFormat2: 
JsonFormat[HNil.type] = forHNil(HNil) - - private def forHNil[A <: HNil](hnil: A): JsonFormat[A] = new JsonFormat[A] { - def write[J](x: A, builder: Builder[J]): Unit = { - builder.beginArray() - builder.endArray() - } - - def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): A = jsOpt match { - case None => hnil - case Some(js) => unbuilder.beginArray(js); unbuilder.endArray(); hnil - } - } - - implicit def hconsFormat[H, T <: HList](implicit hf: JsonFormat[H], tf: HListJF[T]): JsonFormat[H :+: T] = - new JsonFormat[H :+: T] { - def write[J](hcons: H :+: T, builder: Builder[J]) = { - builder.beginArray() - hf.write(hcons.head, builder) - tf.write(hcons.tail, builder) - builder.endArray() - } - - def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { - case None => HCons(hf.read(None, unbuilder), tf.read(None, unbuilder)) - case Some(js) => - unbuilder.beginArray(js) - val hcons = HCons(hf.read(Some(unbuilder.nextElement), unbuilder), tf.read(Some(js), unbuilder)) - unbuilder.endArray() - hcons - } - } - - trait HListJF[A <: HList] { - def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): A - def write[J](obj: A, builder: Builder[J]): Unit - } - - implicit def hconsHListJF[H, T <: HList](implicit hf: JsonFormat[H], tf: HListJF[T]): HListJF[H :+: T] = - new HListJF[H :+: T] { - def write[J](hcons: H :+: T, builder: Builder[J]) = { - hf.write(hcons.head, builder) - tf.write(hcons.tail, builder) - } - - def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]) = jsOpt match { - case None => HCons(hf.read(None, unbuilder), tf.read(None, unbuilder)) - case Some(js) => HCons(hf.read(Some(unbuilder.nextElement), unbuilder), tf.read(Some(js), unbuilder)) - } - } - - implicit val lnilHListJF1: HListJF[HNil] = hnilHListJF(HNil) - implicit val lnilHListJF2: HListJF[HNil.type] = hnilHListJF(HNil) - - implicit def hnilHListJF[A <: HNil](hnil: A): HListJF[A] = new HListJF[A] { - def write[J](hcons: A, builder: Builder[J]) = () - def read[J](jsOpt: Option[J], 
unbuilder: Unbuilder[J]) = hnil - } -} - -object HListFormats extends HListFormats diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala b/internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala deleted file mode 100644 index d7a9f7c1a..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/IDSet.scala +++ /dev/null @@ -1,45 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -/** A mutable set interface that uses object identity to test for set membership.*/ -trait IDSet[T] { - def apply(t: T): Boolean - def contains(t: T): Boolean - def +=(t: T): Unit - def ++=(t: Iterable[T]): Unit - def -=(t: T): Boolean - def all: collection.Iterable[T] - def toList: List[T] - def isEmpty: Boolean - def foreach(f: T => Unit): Unit - def process[S](t: T)(ifSeen: S)(ifNew: => S): S -} - -object IDSet { - implicit def toTraversable[T]: IDSet[T] => Traversable[T] = _.all - def apply[T](values: T*): IDSet[T] = apply(values) - def apply[T](values: Iterable[T]): IDSet[T] = - { - val s = create[T] - s ++= values - s - } - def create[T]: IDSet[T] = new IDSet[T] { - private[this] val backing = new java.util.IdentityHashMap[T, AnyRef] - private[this] val Dummy: AnyRef = "" - - def apply(t: T) = contains(t) - def contains(t: T) = backing.containsKey(t) - def foreach(f: T => Unit) = all foreach f - def +=(t: T) = { backing.put(t, Dummy); () } - def ++=(t: Iterable[T]) = t foreach += - def -=(t: T) = if (backing.remove(t) eq null) false else true - def all = collection.JavaConversions.collectionAsScalaIterable(backing.keySet) - def toList = all.toList - def isEmpty = backing.isEmpty - def process[S](t: T)(ifSeen: S)(ifNew: => S) = if (contains(t)) ifSeen else { this += t; ifNew } - override def toString = backing.toString - } -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala b/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala 
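`IDSet` tests membership by object identity rather than equality, delegating to `java.util.IdentityHashMap`. The same idea can be sketched in Python by keying on `id()` — the class name `IdSet` is hypothetical, chosen for the sketch:

```python
class IdSet:
    """Set membership by object identity, not equality (cf. IdentityHashMap)."""
    def __init__(self):
        self._backing = {}            # id(obj) -> obj (keeps objects alive)

    def __contains__(self, obj):
        return id(obj) in self._backing

    def add(self, obj):
        self._backing[id(obj)] = obj

    def process(self, obj, if_seen, if_new):
        """Return if_seen for an already-seen object; otherwise add the
        object and call if_new() (mirrors IDSet.process)."""
        if obj in self:
            return if_seen
        self.add(obj)
        return if_new()
```

Storing the object itself as the dict value matters: it prevents the object from being garbage collected, which would let its `id()` be reused by an unrelated object.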
deleted file mode 100644 index 3c159ec11..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/INode.scala +++ /dev/null @@ -1,178 +0,0 @@ -package sbt.internal.util - -import java.lang.Runnable -import java.util.concurrent.{ atomic, Executor, LinkedBlockingQueue } -import atomic.{ AtomicBoolean, AtomicInteger } -import Types.{ ConstK, Id } - -object EvaluationState extends Enumeration { - val New, Blocked, Ready, Calling, Evaluated = Value -} - -abstract class EvaluateSettings[Scope] { - protected val init: Init[Scope] - import init._ - protected def executor: Executor - protected def compiledSettings: Seq[Compiled[_]] - - import EvaluationState.{ Value => EvaluationState, _ } - - private[this] val complete = new LinkedBlockingQueue[Option[Throwable]] - private[this] val static = PMap.empty[ScopedKey, INode] - private[this] val allScopes: Set[Scope] = compiledSettings.map(_.key.scope).toSet - private[this] def getStatic[T](key: ScopedKey[T]): INode[T] = static get key getOrElse sys.error("Illegal reference to key " + key) - - private[this] val transform: Initialize ~> INode = new (Initialize ~> INode) { - def apply[T](i: Initialize[T]): INode[T] = i match { - case k: Keyed[s, T] @unchecked => single(getStatic(k.scopedKey), k.transform) - case a: Apply[k, T] @unchecked => new MixedNode[k, T](a.alist.transform[Initialize, INode](a.inputs, transform), a.f, a.alist) - case b: Bind[s, T] @unchecked => new BindNode[s, T](transform(b.in), x => transform(b.f(x))) - case v: Value[T] @unchecked => constant(v.value) - case v: ValidationCapture[T] @unchecked => strictConstant(v.key) - case t: TransformCapture => strictConstant(t.f) - case o: Optional[s, T] @unchecked => o.a match { - case None => constant(() => o.f(None)) - case Some(i) => single[s, T](transform(i), x => o.f(Some(x))) - } - case x if x == StaticScopes => strictConstant(allScopes.asInstanceOf[T]) // can't convince scalac that StaticScopes => T == Set[Scope] - } - } - private[this] lazy val 
roots: Seq[INode[_]] = compiledSettings flatMap { cs => - (cs.settings map { s => - val t = transform(s.init) - static(s.key) = t - t - }): Seq[INode[_]] - } - private[this] var running = new AtomicInteger - private[this] var cancel = new AtomicBoolean(false) - - def run(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = - { - assert(running.get() == 0, "Already running") - startWork() - roots.foreach(_.registerIfNew()) - workComplete() - complete.take() foreach { ex => - cancel.set(true) - throw ex - } - getResults(delegates) - } - private[this] def getResults(implicit delegates: Scope => Seq[Scope]) = - (empty /: static.toTypedSeq) { - case (ss, static.TPair(key, node)) => - if (key.key.isLocal) ss else ss.set(key.scope, key.key, node.get) - } - private[this] val getValue = new (INode ~> Id) { def apply[T](node: INode[T]) = node.get } - - private[this] def submitEvaluate(node: INode[_]) = submit(node.evaluate()) - private[this] def submitCallComplete[T](node: BindNode[_, T], value: T) = submit(node.callComplete(value)) - private[this] def submit(work: => Unit): Unit = - { - startWork() - executor.execute(new Runnable { def run = if (!cancel.get()) run0(work) }) - } - private[this] def run0(work: => Unit): Unit = - { - try { work } catch { case e: Throwable => complete.put(Some(e)) } - workComplete() - } - - private[this] def startWork(): Unit = { running.incrementAndGet(); () } - private[this] def workComplete(): Unit = - if (running.decrementAndGet() == 0) - complete.put(None) - - private[this] sealed abstract class INode[T] { - private[this] var state: EvaluationState = New - private[this] var value: T = _ - private[this] val blocking = new collection.mutable.ListBuffer[INode[_]] - private[this] var blockedOn: Int = 0 - private[this] val calledBy = new collection.mutable.ListBuffer[BindNode[_, T]] - - override def toString = getClass.getName + " (state=" + state + ",blockedOn=" + blockedOn + ",calledBy=" + calledBy.size + ",blocking=" + blocking.size + 
"): " + - keyString - - private[this] def keyString = - (static.toSeq.flatMap { case (key, value) => if (value eq this) init.showFullKey.show(key) :: Nil else Nil }).headOption getOrElse "non-static" - - final def get: T = synchronized { - assert(value != null, toString + " not evaluated") - value - } - final def doneOrBlock(from: INode[_]): Boolean = synchronized { - val ready = state == Evaluated - if (!ready) blocking += from - registerIfNew() - ready - } - final def isDone: Boolean = synchronized { state == Evaluated } - final def isNew: Boolean = synchronized { state == New } - final def isCalling: Boolean = synchronized { state == Calling } - final def registerIfNew(): Unit = synchronized { if (state == New) register() } - private[this] def register(): Unit = { - assert(state == New, "Already registered and: " + toString) - val deps = dependsOn - blockedOn = deps.size - deps.count(_.doneOrBlock(this)) - if (blockedOn == 0) - schedule() - else - state = Blocked - } - - final def schedule(): Unit = synchronized { - assert(state == New || state == Blocked, "Invalid state for schedule() call: " + toString) - state = Ready - submitEvaluate(this) - } - final def unblocked(): Unit = synchronized { - assert(state == Blocked, "Invalid state for unblocked() call: " + toString) - blockedOn -= 1 - assert(blockedOn >= 0, "Negative blockedOn: " + blockedOn + " for " + toString) - if (blockedOn == 0) schedule() - } - final def evaluate(): Unit = synchronized { evaluate0() } - protected final def makeCall(source: BindNode[_, T], target: INode[T]): Unit = { - assert(state == Ready, "Invalid state for call to makeCall: " + toString) - state = Calling - target.call(source) - } - protected final def setValue(v: T): Unit = { - assert(state != Evaluated, "Already evaluated (trying to set value to " + v + "): " + toString) - if (v == null) sys.error("Setting value cannot be null: " + keyString) - value = v - state = Evaluated - blocking foreach { _.unblocked() } - blocking.clear() 
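`EvaluateSettings` tracks outstanding work with an `AtomicInteger` (`startWork`/`workComplete`) and signals overall completion by putting a token on a blocking queue when the count returns to zero. A sketch of that completion protocol in Python with a lock and a `Queue` (the `WorkTracker` name is illustrative):

```python
import threading
from queue import Queue

class WorkTracker:
    """Counts in-flight tasks; posts a completion token when the count
    drops back to zero (mirrors startWork/workComplete/submit)."""
    def __init__(self):
        self._lock = threading.Lock()
        self._running = 0
        self.complete = Queue()       # receives None on success, or an exception

    def start_work(self):
        with self._lock:
            self._running += 1

    def work_complete(self):
        with self._lock:
            self._running -= 1
            done = self._running == 0
        if done:
            self.complete.put(None)

    def submit(self, pool, work):
        self.start_work()             # count up BEFORE scheduling
        def run():
            try:
                work()
            except Exception as e:    # failures are reported on the queue
                self.complete.put(e)
            self.work_complete()
        pool.submit(run)
```

As in the original, the caller holds one "unit" of work while submitting the roots, so the counter cannot transiently hit zero before all tasks are scheduled.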
- calledBy foreach { node => submitCallComplete(node, value) } - calledBy.clear() - } - final def call(by: BindNode[_, T]): Unit = synchronized { - registerIfNew() - state match { - case Evaluated => submitCallComplete(by, value) - case _ => calledBy += by - } - () - } - protected def dependsOn: Seq[INode[_]] - protected def evaluate0(): Unit - } - - private[this] def strictConstant[T](v: T): INode[T] = constant(() => v) - private[this] def constant[T](f: () => T): INode[T] = new MixedNode[ConstK[Unit]#l, T]((), _ => f(), AList.empty) - private[this] def single[S, T](in: INode[S], f: S => T): INode[T] = new MixedNode[({ type l[L[x]] = L[S] })#l, T](in, f, AList.single[S]) - private[this] final class BindNode[S, T](in: INode[S], f: S => INode[T]) extends INode[T] { - protected def dependsOn = in :: Nil - protected def evaluate0(): Unit = makeCall(this, f(in.get)) - def callComplete(value: T): Unit = synchronized { - assert(isCalling, "Invalid state for callComplete(" + value + "): " + toString) - setValue(value) - } - } - private[this] final class MixedNode[K[L[x]], T](in: K[INode], f: K[Id] => T, alist: AList[K]) extends INode[T] { - protected def dependsOn = alist.toList(in) - protected def evaluate0(): Unit = setValue(f(alist.transform(in, getValue))) - } -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala b/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala deleted file mode 100644 index 3406f1b4b..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/KList.scala +++ /dev/null @@ -1,53 +0,0 @@ -package sbt.internal.util - -import Types._ -import Classes.Applicative - -/** Heterogeneous list with each element having type M[T] for some type T.*/ -sealed trait KList[+M[_]] { - type Transform[N[_]] <: KList[N] - - /** Apply the natural transformation `f` to each element. 
*/ - def transform[N[_]](f: M ~> N): Transform[N] - - /** Folds this list using a function that operates on the homogeneous type of the elements of this list. */ - def foldr[B](f: (M[_], B) => B, init: B): B = init // had trouble defining it in KNil - - /** Applies `f` to the elements of this list in the applicative functor defined by `ap`. */ - def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] - - /** Equivalent to `transform(f) . apply(x => x)`, this is the essence of the iterator at the level of natural transformations.*/ - def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] - - /** Discards the heterogeneous type information and constructs a plain List from this KList's elements. */ - def toList: List[M[_]] -} -final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KList[M] { - final type Transform[N[_]] = KCons[H, tail.Transform[N], N] - - def transform[N[_]](f: M ~> N) = KCons(f(head), tail.transform(f)) - def toList: List[M[_]] = head :: tail.toList - def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z] = - { - val g = (t: tail.Transform[Id]) => (h: H) => f(KCons[H, tail.Transform[Id], Id](h, t)) - ap.apply(tail.apply[N, H => Z](g), head) - } - def traverse[N[_], P[_]](f: M ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[Transform[P]] = - { - val tt: N[tail.Transform[P]] = tail.traverse[N, P](f) - val g = (t: tail.Transform[P]) => (h: P[H]) => KCons(h, t) - np.apply(np.map(g, tt), f(head)) - } - def :^:[A, N[x] >: M[x]](h: N[A]) = KCons(h, this) - override def foldr[B](f: (M[_], B) => B, init: B): B = f(head, tail.foldr(f, init)) -} -sealed abstract class KNil extends KList[Nothing] { - final type Transform[N[_]] = KNil - final def transform[N[_]](f: Nothing ~> N): Transform[N] = KNil - final def toList = Nil - final def apply[N[x], Z](f: KNil => Z)(implicit ap: Applicative[N]): N[Z] = ap.pure(f(KNil)) - final def traverse[N[_], 
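`KList.traverse` sequences an effect across the list in an applicative functor. A common instance is the Option applicative, where traversal succeeds only if every element does. That behaviour, specialised to homogeneous lists and sketched in Python with `None` standing in for the failure case (`traverse_option` is a hypothetical name):

```python
def traverse_option(xs, f):
    """Apply f to each element and collect the results, or return None
    as soon as any application fails (the Option-applicative instance
    of traverse, specialised to ordinary lists)."""
    out = []
    for x in xs:
        y = f(x)
        if y is None:                 # one failure fails the whole traversal
            return None
        out.append(y)
    return out
```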
P[_]](f: Nothing ~> (N ∙ P)#l)(implicit np: Applicative[N]): N[KNil] = np.pure(KNil) -} -case object KNil extends KNil { - def :^:[M[_], H](h: M[H]): KCons[H, KNil, M] = KCons(h, this) -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala b/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala deleted file mode 100644 index 989d657e2..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/PMap.scala +++ /dev/null @@ -1,108 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -import collection.mutable - -trait RMap[K[_], V[_]] { - def apply[T](k: K[T]): V[T] - def get[T](k: K[T]): Option[V[T]] - def contains[T](k: K[T]): Boolean - def toSeq: Seq[(K[_], V[_])] - def toTypedSeq: Seq[TPair[_]] = toSeq.map { case (k: K[t], v) => TPair[t](k, v.asInstanceOf[V[t]]) } - def keys: Iterable[K[_]] - def values: Iterable[V[_]] - def isEmpty: Boolean - - sealed case class TPair[T](key: K[T], value: V[T]) -} - -trait IMap[K[_], V[_]] extends (K ~> V) with RMap[K, V] { - def put[T](k: K[T], v: V[T]): IMap[K, V] - def remove[T](k: K[T]): IMap[K, V] - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): IMap[K, V] - def mapValues[V2[_]](f: V ~> V2): IMap[K, V2] - def mapSeparate[VL[_], VR[_]](f: V ~> ({ type l[T] = Either[VL[T], VR[T]] })#l): (IMap[K, VL], IMap[K, VR]) -} -trait PMap[K[_], V[_]] extends (K ~> V) with RMap[K, V] { - def update[T](k: K[T], v: V[T]): Unit - def remove[T](k: K[T]): Option[V[T]] - def getOrUpdate[T](k: K[T], make: => V[T]): V[T] - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] -} -object PMap { - implicit def toFunction[K[_], V[_]](map: PMap[K, V]): K[_] => V[_] = k => map(k) - def empty[K[_], V[_]]: PMap[K, V] = new DelegatingPMap[K, V](new mutable.HashMap) -} -object IMap { - /** - * Only suitable for K that is invariant in its type parameter. 
- * Option and List keys are not suitable, for example, - * because None <:< Option[String] and None <: Option[Int]. - */ - def empty[K[_], V[_]]: IMap[K, V] = new IMap0[K, V](Map.empty) - - private[this] class IMap0[K[_], V[_]](backing: Map[K[_], V[_]]) extends AbstractRMap[K, V] with IMap[K, V] { - def get[T](k: K[T]): Option[V[T]] = (backing get k).asInstanceOf[Option[V[T]]] - def put[T](k: K[T], v: V[T]) = new IMap0[K, V](backing.updated(k, v)) - def remove[T](k: K[T]) = new IMap0[K, V](backing - k) - - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]) = - put(k, f(this get k getOrElse init)) - - def mapValues[V2[_]](f: V ~> V2) = - new IMap0[K, V2](backing.mapValues(x => f(x))) - - def mapSeparate[VL[_], VR[_]](f: V ~> ({ type l[T] = Either[VL[T], VR[T]] })#l) = - { - val mapped = backing.iterator.map { - case (k, v) => f(v) match { - case Left(l) => Left((k, l)) - case Right(r) => Right((k, r)) - } - } - val (l, r) = Util.separateE[(K[_], VL[_]), (K[_], VR[_])](mapped.toList) - (new IMap0[K, VL](l.toMap), new IMap0[K, VR](r.toMap)) - } - - def toSeq = backing.toSeq - def keys = backing.keys - def values = backing.values - def isEmpty = backing.isEmpty - - override def toString = backing.toString - } -} - -abstract class AbstractRMap[K[_], V[_]] extends RMap[K, V] { - def apply[T](k: K[T]): V[T] = get(k).get - def contains[T](k: K[T]): Boolean = get(k).isDefined -} - -/** - * Only suitable for K that is invariant in its type parameter. - * Option and List keys are not suitable, for example, - * because None <:< Option[String] and None <: Option[Int]. 
- */ -class DelegatingPMap[K[_], V[_]](backing: mutable.Map[K[_], V[_]]) extends AbstractRMap[K, V] with PMap[K, V] { - def get[T](k: K[T]): Option[V[T]] = cast[T](backing.get(k)) - def update[T](k: K[T], v: V[T]): Unit = { backing(k) = v } - def remove[T](k: K[T]) = cast(backing.remove(k)) - def getOrUpdate[T](k: K[T], make: => V[T]) = cast[T](backing.getOrElseUpdate(k, make)) - def mapValue[T](k: K[T], init: V[T], f: V[T] => V[T]): V[T] = - { - val v = f(this get k getOrElse init) - update(k, v) - v - } - def toSeq = backing.toSeq - def keys = backing.keys - def values = backing.values - def isEmpty = backing.isEmpty - - private[this] def cast[T](v: V[_]): V[T] = v.asInstanceOf[V[T]] - private[this] def cast[T](o: Option[V[_]]): Option[V[T]] = o map cast[T] - - override def toString = backing.toString -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala deleted file mode 100644 index dbded9292..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Param.scala +++ /dev/null @@ -1,28 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -// Used to emulate ~> literals -trait Param[A[_], B[_]] { - type T - def in: A[T] - def ret(out: B[T]): Unit - def ret: B[T] -} - -object Param { - implicit def pToT[A[_], B[_]](p: Param[A, B] => Unit): A ~> B = new (A ~> B) { - def apply[s](a: A[s]): B[s] = { - val v: Param[A, B] { type T = s } = new Param[A, B] { - type T = s - def in = a - private var r: B[T] = _ - def ret(b: B[T]): Unit = { r = b } - def ret: B[T] = r - } - p(v) - v.ret - } - } -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala deleted file mode 100644 index afa02c150..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Settings.scala +++ /dev/null @@ -1,615 
+0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2011 Mark Harrah - */ -package sbt.internal.util - -import scala.language.existentials - -import Types._ -import sbt.util.Show - -sealed trait Settings[Scope] { - def data: Map[Scope, AttributeMap] - def keys(scope: Scope): Set[AttributeKey[_]] - def scopes: Set[Scope] - def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] - def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] - def get[T](scope: Scope, key: AttributeKey[T]): Option[T] - def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] - def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] -} - -private final class Settings0[Scope](val data: Map[Scope, AttributeMap], val delegates: Scope => Seq[Scope]) extends Settings[Scope] { - def scopes: Set[Scope] = data.keySet - def keys(scope: Scope) = data(scope).keys.toSet - def allKeys[T](f: (Scope, AttributeKey[_]) => T): Seq[T] = data.flatMap { case (scope, map) => map.keys.map(k => f(scope, k)) }.toSeq - - def get[T](scope: Scope, key: AttributeKey[T]): Option[T] = - delegates(scope).toStream.flatMap(sc => getDirect(sc, key)).headOption - def definingScope(scope: Scope, key: AttributeKey[_]): Option[Scope] = - delegates(scope).toStream.find(sc => getDirect(sc, key).isDefined) - - def getDirect[T](scope: Scope, key: AttributeKey[T]): Option[T] = - (data get scope).flatMap(_ get key) - - def set[T](scope: Scope, key: AttributeKey[T], value: T): Settings[Scope] = - { - val map = data getOrElse (scope, AttributeMap.empty) - val newData = data.updated(scope, map.put(key, value)) - new Settings0(newData, delegates) - } -} -// delegates should contain the input Scope as the first entry -// this trait is intended to be mixed into an object -trait Init[Scope] { - /** The Show instance used when a detailed String needs to be generated. 
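`Settings0.get` resolves a key by walking the scope's delegate chain and returning the first direct definition (`delegates(scope)` starts with the scope itself, per the comment above `Init`). A minimal sketch of that lookup in Python, with settings data as nested dicts (names are illustrative):

```python
def get(data, delegates, scope, key):
    """Look `key` up in `scope`, falling back along the delegate chain.
    data: scope -> {key: value}; delegates(scope) begins with scope itself."""
    for sc in delegates(scope):
        direct = data.get(sc, {}).get(key)
        if direct is not None:        # first direct definition wins
            return direct
    return None
```

The lazy `toStream`/`headOption` in the Scala version corresponds to returning from inside the loop here: later scopes in the chain are never consulted once a definition is found.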
It is typically used when no context is available.*/ - def showFullKey: Show[ScopedKey[_]] - - sealed case class ScopedKey[T](scope: Scope, key: AttributeKey[T]) extends KeyedInitialize[T] { - def scopedKey = this - } - - type SettingSeq[T] = Seq[Setting[T]] - type ScopedMap = IMap[ScopedKey, SettingSeq] - type CompiledMap = Map[ScopedKey[_], Compiled[_]] - type MapScoped = ScopedKey ~> ScopedKey - type ValidatedRef[T] = Either[Undefined, ScopedKey[T]] - type ValidatedInit[T] = Either[Seq[Undefined], Initialize[T]] - type ValidateRef = ScopedKey ~> ValidatedRef - type ScopeLocal = ScopedKey[_] => Seq[Setting[_]] - type MapConstant = ScopedKey ~> Option - - private[sbt] abstract class ValidateKeyRef { - def apply[T](key: ScopedKey[T], selfRefOk: Boolean): ValidatedRef[T] - } - - /** - * The result of this initialization is the composition of applied transformations. - * This can be useful when dealing with dynamic Initialize values. - */ - lazy val capturedTransformations: Initialize[Initialize ~> Initialize] = new TransformCapture(idK[Initialize]) - - def setting[T](key: ScopedKey[T], init: Initialize[T], pos: SourcePosition = NoPosition): Setting[T] = new Setting[T](key, init, pos) - def valueStrict[T](value: T): Initialize[T] = pure(() => value) - def value[T](value: => T): Initialize[T] = pure(value _) - def pure[T](value: () => T): Initialize[T] = new Value(value) - def optional[T, U](i: Initialize[T])(f: Option[T] => U): Initialize[U] = new Optional(Some(i), f) - def update[T](key: ScopedKey[T])(f: T => T): Setting[T] = setting[T](key, map(key)(f), NoPosition) - def bind[S, T](in: Initialize[S])(f: S => Initialize[T]): Initialize[T] = new Bind(f, in) - def map[S, T](in: Initialize[S])(f: S => T): Initialize[T] = new Apply[({ type l[L[x]] = L[S] })#l, T](f, in, AList.single[S]) - def app[K[L[x]], T](inputs: K[Initialize])(f: K[Id] => T)(implicit alist: AList[K]): Initialize[T] = new Apply[K, T](f, inputs, alist) - def uniform[S, T](inputs: 
Seq[Initialize[S]])(f: Seq[S] => T): Initialize[T] = - new Apply[({ type l[L[x]] = List[L[S]] })#l, T](f, inputs.toList, AList.seq[S]) - - /** - * The result of this initialization is the validated `key`. - * No dependency is introduced on `key`. If `selfRefOk` is true, validation will not fail if the key is referenced by a definition of `key`. - * That is, key := f(validated(key).value) is allowed only if `selfRefOk == true`. - */ - private[sbt] final def validated[T](key: ScopedKey[T], selfRefOk: Boolean): ValidationCapture[T] = new ValidationCapture(key, selfRefOk) - - /** - * Constructs a derived setting that will be automatically defined in every scope where one of its dependencies - * is explicitly defined and the where the scope matches `filter`. - * A setting initialized with dynamic dependencies is only allowed if `allowDynamic` is true. - * Only the static dependencies are tracked, however. Dependencies on previous values do not introduce a derived setting either. - */ - final def derive[T](s: Setting[T], allowDynamic: Boolean = false, filter: Scope => Boolean = const(true), trigger: AttributeKey[_] => Boolean = const(true), default: Boolean = false): Setting[T] = { - deriveAllowed(s, allowDynamic) foreach sys.error - val d = new DerivedSetting[T](s.key, s.init, s.pos, filter, trigger) - if (default) d.default() else d - } - - def deriveAllowed[T](s: Setting[T], allowDynamic: Boolean): Option[String] = s.init match { - case _: Bind[_, _] if !allowDynamic => Some("Cannot derive from dynamic dependencies.") - case _ => None - } - - // id is used for equality - private[sbt] final def defaultSetting[T](s: Setting[T]): Setting[T] = s.default() - private[sbt] def defaultSettings(ss: Seq[Setting[_]]): Seq[Setting[_]] = ss.map(s => defaultSetting(s)) - private[this] final val nextID = new java.util.concurrent.atomic.AtomicLong - private[this] final def nextDefaultID(): Long = nextID.incrementAndGet() - - def empty(implicit delegates: Scope => Seq[Scope]): 
Settings[Scope] = new Settings0(Map.empty, delegates) - def asTransform(s: Settings[Scope]): ScopedKey ~> Id = new (ScopedKey ~> Id) { - def apply[T](k: ScopedKey[T]): T = getValue(s, k) - } - def getValue[T](s: Settings[Scope], k: ScopedKey[T]) = s.get(k.scope, k.key) getOrElse (throw new InvalidReference(k)) - def asFunction[T](s: Settings[Scope]): ScopedKey[T] => T = k => getValue(s, k) - def mapScope(f: Scope => Scope): MapScoped = new MapScoped { - def apply[T](k: ScopedKey[T]): ScopedKey[T] = k.copy(scope = f(k.scope)) - } - private final class InvalidReference(val key: ScopedKey[_]) extends RuntimeException("Internal settings error: invalid reference to " + showFullKey.show(key)) - - private[this] def applyDefaults(ss: Seq[Setting[_]]): Seq[Setting[_]] = - { - val (defaults, others) = Util.separate[Setting[_], DefaultSetting[_], Setting[_]](ss) { case u: DefaultSetting[_] => Left(u); case s => Right(s) } - defaults.distinct ++ others - } - - def compiled(init: Seq[Setting[_]], actual: Boolean = true)(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): CompiledMap = - { - val initDefaults = applyDefaults(init) - // inject derived settings into scopes where their dependencies are directly defined - // and prepend per-scope settings - val derived = deriveAndLocal(initDefaults) - // group by Scope/Key, dropping dead initializations - val sMap: ScopedMap = grouped(derived) - // delegate references to undefined values according to 'delegates' - val dMap: ScopedMap = if (actual) delegate(sMap)(delegates, display) else sMap - // merge Seq[Setting[_]] into Compiled - compile(dMap) - } - def make(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal, display: Show[ScopedKey[_]]): Settings[Scope] = - { - val cMap = compiled(init)(delegates, scopeLocal, display) - // order the initializations. cyclic references are detected here. 
- val ordered: Seq[Compiled[_]] = sort(cMap) - // evaluation: apply the initializations. - try { applyInits(ordered) } - catch { case rru: RuntimeUndefined => throw Uninitialized(cMap.keys.toSeq, delegates, rru.undefined, true) } - } - def sort(cMap: CompiledMap): Seq[Compiled[_]] = - Dag.topologicalSort(cMap.values)(_.dependencies.map(cMap)) - - def compile(sMap: ScopedMap): CompiledMap = - sMap.toTypedSeq.map { - case sMap.TPair(k, ss) => - val deps = ss.flatMap(_.dependencies).toSet - (k, new Compiled(k, deps, ss)) - }.toMap - - def grouped(init: Seq[Setting[_]]): ScopedMap = - ((IMap.empty: ScopedMap) /: init)((m, s) => add(m, s)) - - def add[T](m: ScopedMap, s: Setting[T]): ScopedMap = - m.mapValue[T](s.key, Nil, ss => append(ss, s)) - - def append[T](ss: Seq[Setting[T]], s: Setting[T]): Seq[Setting[T]] = - if (s.definitive) s :: Nil else ss :+ s - - def addLocal(init: Seq[Setting[_]])(implicit scopeLocal: ScopeLocal): Seq[Setting[_]] = - init.flatMap(_.dependencies flatMap scopeLocal) ++ init - - def delegate(sMap: ScopedMap)(implicit delegates: Scope => Seq[Scope], display: Show[ScopedKey[_]]): ScopedMap = - { - def refMap(ref: Setting[_], isFirst: Boolean) = new ValidateKeyRef { - def apply[T](k: ScopedKey[T], selfRefOk: Boolean) = - delegateForKey(sMap, k, delegates(k.scope), ref, selfRefOk || !isFirst) - } - type ValidatedSettings[T] = Either[Seq[Undefined], SettingSeq[T]] - val f = new (SettingSeq ~> ValidatedSettings) { - def apply[T](ks: Seq[Setting[T]]) = { - val (undefs, valid) = Util.separate(ks.zipWithIndex) { case (s, i) => s validateKeyReferenced refMap(s, i == 0) } - if (undefs.isEmpty) Right(valid) else Left(undefs.flatten) - } - } - type Undefs[_] = Seq[Undefined] - val (undefineds, result) = sMap.mapSeparate[Undefs, SettingSeq](f) - if (undefineds.isEmpty) - result - else - throw Uninitialized(sMap.keys.toSeq, delegates, undefineds.values.flatten.toList, false) - } - private[this] def delegateForKey[T](sMap: ScopedMap, k: ScopedKey[T], 
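`grouped`/`append` implement the "dropping dead initializations" step: a definitive setting (a `:=`-style assignment) discards everything defined before it for that key, while a non-definitive one (`+=`-style) accumulates. A sketch of the fold in Python, modelling a setting as a `(value, definitive)` pair (names are illustrative):

```python
def append(settings, new):
    """A definitive setting replaces all earlier ones; otherwise append
    (mirrors Init.append: `if (s.definitive) s :: Nil else ss :+ s`)."""
    _, definitive = new
    return [new] if definitive else settings + [new]

def grouped(inits):
    """Fold settings into a per-key map, dropping dead initializations."""
    m = {}
    for key, setting in inits:
        m[key] = append(m.get(key, []), setting)
    return m
```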
scopes: Seq[Scope], ref: Setting[_], selfRefOk: Boolean): Either[Undefined, ScopedKey[T]] = - { - val skeys = scopes.iterator.map(x => ScopedKey(x, k.key)) - val definedAt = skeys.find(sk => (selfRefOk || ref.key != sk) && (sMap contains sk)) - definedAt.toRight(Undefined(ref, k)) - } - - private[this] def applyInits(ordered: Seq[Compiled[_]])(implicit delegates: Scope => Seq[Scope]): Settings[Scope] = - { - val x = java.util.concurrent.Executors.newFixedThreadPool(Runtime.getRuntime.availableProcessors) - try { - val eval: EvaluateSettings[Scope] = new EvaluateSettings[Scope] { - override val init: Init.this.type = Init.this - def compiledSettings = ordered - def executor = x - } - eval.run - } finally { x.shutdown() } - } - - def showUndefined(u: Undefined, validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope])(implicit display: Show[ScopedKey[_]]): String = - { - val guessed = guessIntendedScope(validKeys, delegates, u.referencedKey) - val derived = u.defining.isDerived - val refString = display.show(u.defining.key) - val sourceString = if (derived) "" else parenPosString(u.defining) - val guessedString = if (derived) "" else guessed.map(g => "\n Did you mean " + display.show(g) + " ?").toList.mkString - val derivedString = if (derived) ", which is a derived setting that needs this key to be defined in this scope." 
else "" - display.show(u.referencedKey) + " from " + refString + sourceString + derivedString + guessedString - } - private[this] def parenPosString(s: Setting[_]): String = - s.positionString match { case None => ""; case Some(s) => " (" + s + ")" } - - def guessIntendedScope(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], key: ScopedKey[_]): Option[ScopedKey[_]] = - { - val distances = validKeys.flatMap { validKey => refinedDistance(delegates, validKey, key).map(dist => (dist, validKey)) } - distances.sortBy(_._1).map(_._2).headOption - } - def refinedDistance(delegates: Scope => Seq[Scope], a: ScopedKey[_], b: ScopedKey[_]): Option[Int] = - if (a.key != b.key || a == b) None - else { - val dist = delegates(a.scope).indexOf(b.scope) - if (dist < 0) None else Some(dist) - } - - final class Uninitialized(val undefined: Seq[Undefined], override val toString: String) extends Exception(toString) - final class Undefined private[sbt] (val defining: Setting[_], val referencedKey: ScopedKey[_]) - final class RuntimeUndefined(val undefined: Seq[Undefined]) extends RuntimeException("References to undefined settings at runtime.") { - override def getMessage = - super.getMessage + undefined.map { u => - "\n" + u.defining + " referenced from " + u.referencedKey - }.mkString - } - - def Undefined(defining: Setting[_], referencedKey: ScopedKey[_]): Undefined = new Undefined(defining, referencedKey) - def Uninitialized(validKeys: Seq[ScopedKey[_]], delegates: Scope => Seq[Scope], keys: Seq[Undefined], runtime: Boolean)(implicit display: Show[ScopedKey[_]]): Uninitialized = - { - assert(keys.nonEmpty) - val suffix = if (keys.length > 1) "s" else "" - val prefix = if (runtime) "Runtime reference" else "Reference" - val keysString = keys.map(u => showUndefined(u, validKeys, delegates)).mkString("\n\n ", "\n\n ", "") - new Uninitialized(keys, prefix + suffix + " to undefined setting" + suffix + ": " + keysString + "\n ") - } - final class Compiled[T](val key: 
ScopedKey[T], val dependencies: Iterable[ScopedKey[_]], val settings: Seq[Setting[T]]) { - override def toString = showFullKey.show(key) - } - final class Flattened(val key: ScopedKey[_], val dependencies: Iterable[ScopedKey[_]]) - - def flattenLocals(compiled: CompiledMap): Map[ScopedKey[_], Flattened] = - { - val locals = compiled flatMap { case (key, comp) => if (key.key.isLocal) Seq[Compiled[_]](comp) else Nil } - val ordered = Dag.topologicalSort(locals)(_.dependencies.flatMap(dep => if (dep.key.isLocal) Seq[Compiled[_]](compiled(dep)) else Nil)) - def flatten(cmap: Map[ScopedKey[_], Flattened], key: ScopedKey[_], deps: Iterable[ScopedKey[_]]): Flattened = - new Flattened(key, deps.flatMap(dep => if (dep.key.isLocal) cmap(dep).dependencies else dep :: Nil)) - - val empty = Map.empty[ScopedKey[_], Flattened] - val flattenedLocals = (empty /: ordered) { (cmap, c) => cmap.updated(c.key, flatten(cmap, c.key, c.dependencies)) } - compiled flatMap { - case (key, comp) => - if (key.key.isLocal) - Nil - else - Seq[(ScopedKey[_], Flattened)]((key, flatten(flattenedLocals, key, comp.dependencies))) - } - } - - def definedAtString(settings: Seq[Setting[_]]): String = - { - val posDefined = settings.flatMap(_.positionString.toList) - if (posDefined.nonEmpty) { - val header = if (posDefined.size == settings.size) "defined at:" else - "some of the defining occurrences:" - header + (posDefined.distinct mkString ("\n\t", "\n\t", "\n")) - } else "" - } - - /** - * Intersects two scopes, returning the more specific one if they intersect, or None otherwise. 
- */ - private[sbt] def intersect(s1: Scope, s2: Scope)(implicit delegates: Scope => Seq[Scope]): Option[Scope] = - if (delegates(s1).contains(s2)) Some(s1) // s1 is more specific - else if (delegates(s2).contains(s1)) Some(s2) // s2 is more specific - else None - - private[this] def deriveAndLocal(init: Seq[Setting[_]])(implicit delegates: Scope => Seq[Scope], scopeLocal: ScopeLocal): Seq[Setting[_]] = - { - import collection.mutable - - final class Derived(val setting: DerivedSetting[_]) { - val dependencies = setting.dependencies.map(_.key) - def triggeredBy = dependencies.filter(setting.trigger) - val inScopes = new mutable.HashSet[Scope] - val outputs = new mutable.ListBuffer[Setting[_]] - } - final class Deriveds(val key: AttributeKey[_], val settings: mutable.ListBuffer[Derived]) { - def dependencies = settings.flatMap(_.dependencies) - // This is mainly for use in the cyclic reference error message - override def toString = s"Derived settings for ${key.label}, ${definedAtString(settings.map(_.setting))}" - } - - // separate `derived` settings from normal settings (`defs`) - val (derived, rawDefs) = Util.separate[Setting[_], Derived, Setting[_]](init) { case d: DerivedSetting[_] => Left(new Derived(d)); case s => Right(s) } - val defs = addLocal(rawDefs)(scopeLocal) - - // group derived settings by the key they define - val derivsByDef = new mutable.HashMap[AttributeKey[_], Deriveds] - for (s <- derived) { - val key = s.setting.key.key - derivsByDef.getOrElseUpdate(key, new Deriveds(key, new mutable.ListBuffer)).settings += s - } - - // index derived settings by triggering key. This maps a key to the list of settings potentially derived from it. - val derivedBy = new mutable.HashMap[AttributeKey[_], mutable.ListBuffer[Derived]] - for (s <- derived; d <- s.triggeredBy) - derivedBy.getOrElseUpdate(d, new mutable.ListBuffer) += s - - // Map a DerivedSetting[_] to the `Derived` struct wrapping it. 
Used to ultimately replace a DerivedSetting with - // the `Setting`s that were actually derived from it: `Derived.outputs` - val derivedToStruct: Map[DerivedSetting[_], Derived] = (derived map { s => s.setting -> s }).toMap - - // set of defined scoped keys, used to ensure a derived setting is only added if all dependencies are present - val defined = new mutable.HashSet[ScopedKey[_]] - def addDefs(ss: Seq[Setting[_]]): Unit = { for (s <- ss) defined += s.key } - addDefs(defs) - - // true iff the scoped key is in `defined`, taking delegation into account - def isDefined(key: AttributeKey[_], scope: Scope) = - delegates(scope).exists(s => defined.contains(ScopedKey(s, key))) - - // true iff all dependencies of derived setting `d` have a value (potentially via delegation) in `scope` - def allDepsDefined(d: Derived, scope: Scope, local: Set[AttributeKey[_]]): Boolean = - d.dependencies.forall(dep => local(dep) || isDefined(dep, scope)) - - // Returns the list of injectable derived settings and their local settings for `sk`. - // The settings are to be injected under `outputScope` = whichever scope is more specific of: - // * the dependency's (`sk`) scope - // * the DerivedSetting's scope in which it has been declared, `definingScope` - // provided that these two scopes intersect. - // A derived setting is injectable if: - // 1. it has not been previously injected into outputScope - // 2. it applies to outputScope (as determined by its `filter`) - // 3. all of its dependencies are defined for outputScope (allowing for delegation) - // This needs to handle local settings because a derived setting wouldn't be injected if it's local setting didn't exist yet. 
- val deriveFor = (sk: ScopedKey[_]) => { - val derivedForKey: List[Derived] = derivedBy.get(sk.key).toList.flatten - val scope = sk.scope - def localAndDerived(d: Derived): Seq[Setting[_]] = { - def definingScope = d.setting.key.scope - val outputScope = intersect(scope, definingScope) - outputScope collect { - case s if !d.inScopes.contains(s) && d.setting.filter(s) => - val local = d.dependencies.flatMap(dep => scopeLocal(ScopedKey(s, dep))) - if (allDepsDefined(d, s, local.map(_.key.key).toSet)) { - d.inScopes.add(s) - val out = local :+ d.setting.setScope(s) - d.outputs ++= out - out - } else - Nil - } getOrElse Nil - } - derivedForKey.flatMap(localAndDerived) - } - - val processed = new mutable.HashSet[ScopedKey[_]] - - // derives settings, transitively so that a derived setting can trigger another - def process(rem: List[Setting[_]]): Unit = rem match { - case s :: ss => - val sk = s.key - val ds = if (processed.add(sk)) deriveFor(sk) else Nil - addDefs(ds) - process(ds ::: ss) - case Nil => - } - process(defs.toList) - - // Take all the original defs and DerivedSettings along with locals, replace each DerivedSetting with the actual - // settings that were derived. 
- val allDefs = addLocal(init)(scopeLocal) - allDefs flatMap { case d: DerivedSetting[_] => (derivedToStruct get d map (_.outputs)).toStream.flatten; case s => Stream(s) } - } - - sealed trait Initialize[T] { - def dependencies: Seq[ScopedKey[_]] - def apply[S](g: T => S): Initialize[S] - - private[sbt] def mapReferenced(g: MapScoped): Initialize[T] - private[sbt] def mapConstant(g: MapConstant): Initialize[T] - private[sbt] def validateReferenced(g: ValidateRef): ValidatedInit[T] = - validateKeyReferenced(new ValidateKeyRef { def apply[B](key: ScopedKey[B], selfRefOk: Boolean) = g(key) }) - - private[sbt] def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] - - def evaluate(map: Settings[Scope]): T - def zip[S](o: Initialize[S]): Initialize[(T, S)] = zipTupled(o)(idFun) - def zipWith[S, U](o: Initialize[S])(f: (T, S) => U): Initialize[U] = zipTupled(o)(f.tupled) - private[this] def zipTupled[S, U](o: Initialize[S])(f: ((T, S)) => U): Initialize[U] = - new Apply[({ type l[L[x]] = (L[T], L[S]) })#l, U](f, (this, o), AList.tuple2[T, S]) - /** A fold on the static attributes of this and nested Initializes. 
*/ - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S - } - object Initialize { - implicit def joinInitialize[T](s: Seq[Initialize[T]]): JoinInitSeq[T] = new JoinInitSeq(s) - final class JoinInitSeq[T](s: Seq[Initialize[T]]) { - def joinWith[S](f: Seq[T] => S): Initialize[S] = uniform(s)(f) - def join: Initialize[Seq[T]] = uniform(s)(idFun) - } - def join[T](inits: Seq[Initialize[T]]): Initialize[Seq[T]] = uniform(inits)(idFun) - def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]] = - join(inits.asInstanceOf[Seq[Initialize[M[Any]]]]).asInstanceOf[Initialize[Seq[M[T] forSome { type T }]]] - } - object SettingsDefinition { - implicit def unwrapSettingsDefinition(d: SettingsDefinition): Seq[Setting[_]] = d.settings - implicit def wrapSettingsDefinition(ss: Seq[Setting[_]]): SettingsDefinition = new SettingList(ss) - } - sealed trait SettingsDefinition { - def settings: Seq[Setting[_]] - } - final class SettingList(val settings: Seq[Setting[_]]) extends SettingsDefinition - sealed class Setting[T] private[Init] (val key: ScopedKey[T], val init: Initialize[T], val pos: SourcePosition) extends SettingsDefinition { - def settings = this :: Nil - def definitive: Boolean = !init.dependencies.contains(key) - def dependencies: Seq[ScopedKey[_]] = remove(init.dependencies, key) - def mapReferenced(g: MapScoped): Setting[T] = make(key, init mapReferenced g, pos) - def validateReferenced(g: ValidateRef): Either[Seq[Undefined], Setting[T]] = (init validateReferenced g).right.map(newI => make(key, newI, pos)) - - private[sbt] def validateKeyReferenced(g: ValidateKeyRef): Either[Seq[Undefined], Setting[T]] = - (init validateKeyReferenced g).right.map(newI => make(key, newI, pos)) - - def mapKey(g: MapScoped): Setting[T] = make(g(key), init, pos) - def mapInit(f: (ScopedKey[T], T) => T): Setting[T] = make(key, init(t => f(key, t)), pos) - def mapConstant(g: MapConstant): Setting[T] = make(key, init mapConstant g, pos) 
- def withPos(pos: SourcePosition) = make(key, init, pos) - def positionString: Option[String] = pos match { - case pos: FilePosition => Some(pos.path + ":" + pos.startLine) - case NoPosition => None - } - private[sbt] def mapInitialize(f: Initialize[T] => Initialize[T]): Setting[T] = make(key, f(init), pos) - override def toString = "setting(" + key + ") at " + pos - - protected[this] def make[B](key: ScopedKey[B], init: Initialize[B], pos: SourcePosition): Setting[B] = new Setting[B](key, init, pos) - protected[sbt] def isDerived: Boolean = false - private[sbt] def setScope(s: Scope): Setting[T] = make(key.copy(scope = s), init.mapReferenced(mapScope(const(s))), pos) - /** Turn this setting into a `DefaultSetting` if it's not already, otherwise returns `this` */ - private[sbt] def default(id: => Long = nextDefaultID()): DefaultSetting[T] = DefaultSetting(key, init, pos, id) - } - private[Init] sealed class DerivedSetting[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, val filter: Scope => Boolean, val trigger: AttributeKey[_] => Boolean) extends Setting[T](sk, i, p) { - override def make[B](key: ScopedKey[B], init: Initialize[B], pos: SourcePosition): Setting[B] = new DerivedSetting[B](key, init, pos, filter, trigger) - protected[sbt] override def isDerived: Boolean = true - override def default(_id: => Long): DefaultSetting[T] = new DerivedSetting[T](sk, i, p, filter, trigger) with DefaultSetting[T] { val id = _id } - override def toString = "derived " + super.toString - } - // Only keep the first occurrence of this setting and move it to the front so that it has lower precedence than non-defaults. - // This is intended for internal sbt use only, where alternatives like Plugin.globalSettings are not available. 
- private[Init] sealed trait DefaultSetting[T] extends Setting[T] { - val id: Long - override def make[B](key: ScopedKey[B], init: Initialize[B], pos: SourcePosition): Setting[B] = super.make(key, init, pos) default id - override final def hashCode = id.hashCode - override final def equals(o: Any): Boolean = o match { case d: DefaultSetting[_] => d.id == id; case _ => false } - override def toString = s"default($id) " + super.toString - override def default(id: => Long) = this - } - - object DefaultSetting { - def apply[T](sk: ScopedKey[T], i: Initialize[T], p: SourcePosition, _id: Long) = new Setting[T](sk, i, p) with DefaultSetting[T] { val id = _id } - } - - private[this] def handleUndefined[T](vr: ValidatedInit[T]): Initialize[T] = vr match { - case Left(undefs) => throw new RuntimeUndefined(undefs) - case Right(x) => x - } - - private[this] lazy val getValidated = - new (ValidatedInit ~> Initialize) { def apply[T](v: ValidatedInit[T]) = handleUndefined[T](v) } - - // mainly for reducing generated class count - private[this] def validateKeyReferencedT(g: ValidateKeyRef) = - new (Initialize ~> ValidatedInit) { def apply[T](i: Initialize[T]) = i validateKeyReferenced g } - - private[this] def mapReferencedT(g: MapScoped) = - new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapReferenced g } - - private[this] def mapConstantT(g: MapConstant) = - new (Initialize ~> Initialize) { def apply[T](i: Initialize[T]) = i mapConstant g } - - private[this] def evaluateT(g: Settings[Scope]) = - new (Initialize ~> Id) { def apply[T](i: Initialize[T]) = i evaluate g } - - private[this] def deps(ls: Seq[Initialize[_]]): Seq[ScopedKey[_]] = ls.flatMap(_.dependencies) - - sealed trait Keyed[S, T] extends Initialize[T] { - def scopedKey: ScopedKey[S] - def transform: S => T - final def dependencies = scopedKey :: Nil - final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform) - final def evaluate(ss: Settings[Scope]): T = 
transform(getValue(ss, scopedKey)) - final def mapReferenced(g: MapScoped): Initialize[T] = new GetValue(g(scopedKey), transform) - private[sbt] final def validateKeyReferenced(g: ValidateKeyRef): ValidatedInit[T] = g(scopedKey, false) match { - case Left(un) => Left(un :: Nil) - case Right(nk) => Right(new GetValue(nk, transform)) - } - final def mapConstant(g: MapConstant): Initialize[T] = g(scopedKey) match { - case None => this - case Some(const) => new Value(() => transform(const)) - } - private[sbt] def processAttributes[B](init: B)(f: (B, AttributeMap) => B): B = init - } - private[this] final class GetValue[S, T](val scopedKey: ScopedKey[S], val transform: S => T) extends Keyed[S, T] - trait KeyedInitialize[T] extends Keyed[T, T] { - final val transform = idFun[T] - } - - private[sbt] final class TransformCapture(val f: Initialize ~> Initialize) extends Initialize[Initialize ~> Initialize] { - def dependencies = Nil - def apply[Z](g2: (Initialize ~> Initialize) => Z): Initialize[Z] = map(this)(g2) - def evaluate(ss: Settings[Scope]): Initialize ~> Initialize = f - def mapReferenced(g: MapScoped) = new TransformCapture(mapReferencedT(g) ∙ f) - def mapConstant(g: MapConstant) = new TransformCapture(mapConstantT(g) ∙ f) - def validateKeyReferenced(g: ValidateKeyRef) = Right(new TransformCapture(getValidated ∙ validateKeyReferencedT(g) ∙ f)) - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final class ValidationCapture[T](val key: ScopedKey[T], val selfRefOk: Boolean) extends Initialize[ScopedKey[T]] { - def dependencies = Nil - def apply[Z](g2: ScopedKey[T] => Z): Initialize[Z] = map(this)(g2) - def evaluate(ss: Settings[Scope]) = key - def mapReferenced(g: MapScoped) = new ValidationCapture(g(key), selfRefOk) - def mapConstant(g: MapConstant) = this - def validateKeyReferenced(g: ValidateKeyRef) = g(key, selfRefOk) match { - case Left(un) => Left(un :: Nil) - case Right(k) => Right(new 
ValidationCapture(k, selfRefOk)) - } - - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final class Bind[S, T](val f: S => Initialize[T], val in: Initialize[S]) extends Initialize[T] { - def dependencies = in.dependencies - def apply[Z](g: T => Z): Initialize[Z] = new Bind[S, Z](s => f(s)(g), in) - def evaluate(ss: Settings[Scope]): T = f(in evaluate ss) evaluate ss - def mapReferenced(g: MapScoped) = new Bind[S, T](s => f(s) mapReferenced g, in mapReferenced g) - def validateKeyReferenced(g: ValidateKeyRef) = (in validateKeyReferenced g).right.map { validIn => - new Bind[S, T](s => handleUndefined(f(s) validateKeyReferenced g), validIn) - } - def mapConstant(g: MapConstant) = new Bind[S, T](s => f(s) mapConstant g, in mapConstant g) - private[sbt] def processAttributes[B](init: B)(f: (B, AttributeMap) => B): B = in.processAttributes(init)(f) - } - private[sbt] final class Optional[S, T](val a: Option[Initialize[S]], val f: Option[S] => T) extends Initialize[T] { - def dependencies = deps(a.toList) - def apply[Z](g: T => Z): Initialize[Z] = new Optional[S, Z](a, g compose f) - def mapReferenced(g: MapScoped) = new Optional(a map mapReferencedT(g).fn, f) - def validateKeyReferenced(g: ValidateKeyRef) = a match { - case None => Right(this) - case Some(i) => Right(new Optional(i.validateKeyReferenced(g).right.toOption, f)) - } - def mapConstant(g: MapConstant): Initialize[T] = new Optional(a map mapConstantT(g).fn, f) - def evaluate(ss: Settings[Scope]): T = f(a.flatMap(i => trapBadRef(evaluateT(ss)(i)))) - // proper solution is for evaluate to be deprecated or for external use only and a new internal method returning Either be used - private[this] def trapBadRef[A](run: => A): Option[A] = try Some(run) catch { case e: InvalidReference => None } - private[sbt] def processAttributes[B](init: B)(f: (B, AttributeMap) => B): B = a match { - case None => init - case Some(i) => i.processAttributes(init)(f) - } - } - 
private[sbt] final class Value[T](val value: () => T) extends Initialize[T] { - def dependencies = Nil - def mapReferenced(g: MapScoped) = this - def validateKeyReferenced(g: ValidateKeyRef) = Right(this) - def apply[S](g: T => S) = new Value[S](() => g(value())) - def mapConstant(g: MapConstant) = this - def evaluate(map: Settings[Scope]): T = value() - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final object StaticScopes extends Initialize[Set[Scope]] { - def dependencies = Nil - def mapReferenced(g: MapScoped) = this - def validateKeyReferenced(g: ValidateKeyRef) = Right(this) - def apply[S](g: Set[Scope] => S) = map(this)(g) - def mapConstant(g: MapConstant) = this - def evaluate(map: Settings[Scope]) = map.scopes - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = init - } - private[sbt] final class Apply[K[L[x]], T](val f: K[Id] => T, val inputs: K[Initialize], val alist: AList[K]) extends Initialize[T] { - def dependencies = deps(alist.toList(inputs)) - def mapReferenced(g: MapScoped) = mapInputs(mapReferencedT(g)) - def apply[S](g: T => S) = new Apply(g compose f, inputs, alist) - def mapConstant(g: MapConstant) = mapInputs(mapConstantT(g)) - def mapInputs(g: Initialize ~> Initialize): Initialize[T] = new Apply(f, alist.transform(inputs, g), alist) - def evaluate(ss: Settings[Scope]) = f(alist.transform(inputs, evaluateT(ss))) - def validateKeyReferenced(g: ValidateKeyRef) = - { - val tx = alist.transform(inputs, validateKeyReferencedT(g)) - val undefs = alist.toList(tx).flatMap(_.left.toSeq.flatten) - val get = new (ValidatedInit ~> Initialize) { def apply[B](vr: ValidatedInit[B]) = vr.right.get } - if (undefs.isEmpty) Right(new Apply(f, alist.transform(tx, get), alist)) else Left(undefs) - } - - private[sbt] def processAttributes[S](init: S)(f: (S, AttributeMap) => S): S = - (init /: alist.toList(inputs)) { (v, i) => i.processAttributes(v)(f) } - } - private def 
remove[T](s: Seq[T], v: T) = s filterNot (_ == v) -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala deleted file mode 100644 index 0c9fac038..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Signal.scala +++ /dev/null @@ -1,86 +0,0 @@ -package sbt.internal.util - -object Signals { - val CONT = "CONT" - val INT = "INT" - def withHandler[T](handler: () => Unit, signal: String = INT)(action: () => T): T = - { - val result = - try { - val signals = new Signals0 - signals.withHandler(signal, handler, action) - } catch { case e: LinkageError => Right(action()) } - - result match { - case Left(e) => throw e - case Right(v) => v - } - } - - /** Helper interface so we can expose internals of signal-isms to others. */ - sealed trait Registration { - def remove(): Unit - } - /** - * Register a signal handler that can be removed later. - * NOTE: Does not stack with other signal handlers!!!! - */ - def register(handler: () => Unit, signal: String = INT): Registration = - // TODO - Maybe we can just ignore things if not is-supported. - if (supported(signal)) { - import sun.misc.{ Signal, SignalHandler } - val intSignal = new Signal(signal) - val newHandler = new SignalHandler { - def handle(sig: Signal): Unit = { handler() } - } - val oldHandler = Signal.handle(intSignal, newHandler) - object unregisterNewHandler extends Registration { - override def remove(): Unit = { - Signal.handle(intSignal, oldHandler) - () - } - } - unregisterNewHandler - } else { - // TODO - Maybe we should just throw an exception if we don't support signals... 
- object NullUnregisterNewHandler extends Registration { - override def remove(): Unit = () - } - NullUnregisterNewHandler - } - - def supported(signal: String): Boolean = - try { - val signals = new Signals0 - signals.supported(signal) - } catch { case e: LinkageError => false } -} - -// Must only be referenced using a -// try { } catch { case e: LinkageError => ... } -// block to -private final class Signals0 { - def supported(signal: String): Boolean = - { - import sun.misc.Signal - try { new Signal(signal); true } - catch { case e: IllegalArgumentException => false } - } - - // returns a LinkageError in `action` as Left(t) in order to avoid it being - // incorrectly swallowed as missing Signal/SignalHandler - def withHandler[T](signal: String, handler: () => Unit, action: () => T): Either[Throwable, T] = - { - import sun.misc.{ Signal, SignalHandler } - val intSignal = new Signal(signal) - val newHandler = new SignalHandler { - def handle(sig: Signal): Unit = { handler() } - } - - val oldHandler = Signal.handle(intSignal, newHandler) - - try Right(action()) - catch { case e: LinkageError => Left(e) } - finally { Signal.handle(intSignal, oldHandler); () } - } -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala b/internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala deleted file mode 100644 index b7aac6360..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/TypeFunctions.scala +++ /dev/null @@ -1,50 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -trait TypeFunctions { - type Id[X] = X - sealed trait Const[A] { type Apply[B] = A } - sealed trait ConstK[A] { type l[L[x]] = A } - sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] } - sealed trait ∙[A[_], B[_]] { type l[T] = A[B[T]] } - sealed trait P1of2[M[_, _], A] { type Apply[B] = M[A, B]; type Flip[B] = M[B, A] } - - final val left = new (Id ~> P1of2[Left, 
Nothing]#Flip) { def apply[T](t: T) = Left(t) } - final val right = new (Id ~> P1of2[Right, Nothing]#Apply) { def apply[T](t: T) = Right(t) } - final val some = new (Id ~> Some) { def apply[T](t: T) = Some(t) } - final def idFun[T] = (t: T) => t - final def const[A, B](b: B): A => B = _ => b - final def idK[M[_]]: M ~> M = new (M ~> M) { def apply[T](m: M[T]): M[T] = m } - - def nestCon[M[_], N[_], G[_]](f: M ~> N): (M ∙ G)#l ~> (N ∙ G)#l = - f.asInstanceOf[(M ∙ G)#l ~> (N ∙ G)#l] // implemented with a cast to avoid extra object+method call. castless version: - /* new ( (M ∙ G)#l ~> (N ∙ G)#l ) { - def apply[T](mg: M[G[T]]): N[G[T]] = f(mg) - }*/ - - implicit def toFn1[A, B](f: A => B): Fn1[A, B] = new Fn1[A, B] { - def ∙[C](g: C => A) = f compose g - } - - type Endo[T] = T => T - type ~>|[A[_], B[_]] = A ~> Compose[Option, B]#Apply -} -object TypeFunctions extends TypeFunctions - -trait ~>[-A[_], +B[_]] { outer => - def apply[T](a: A[T]): B[T] - // directly on ~> because of type inference limitations - final def ∙[C[_]](g: C ~> A): C ~> B = new (C ~> B) { def apply[T](c: C[T]) = outer.apply(g(c)) } - final def ∙[C, D](g: C => D)(implicit ev: D <:< A[D]): C => B[D] = i => apply(ev(g(i))) - final def fn[T] = (t: A[T]) => apply[T](t) -} -object ~> { - import TypeFunctions._ - val Id: Id ~> Id = new (Id ~> Id) { def apply[T](a: T): T = a } - implicit def tcIdEquals: (Id ~> Id) = Id -} -trait Fn1[A, B] { - def ∙[C](g: C => A): C => B -} \ No newline at end of file diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Types.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Types.scala deleted file mode 100644 index 9b6eb0733..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Types.scala +++ /dev/null @@ -1,12 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -object Types extends Types - -trait Types extends TypeFunctions { - val :^: = KCons - type :+:[H, T <: 
HList] = HCons[H, T] - val :+: = HCons -} diff --git a/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala b/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala deleted file mode 100644 index 75b8224c1..000000000 --- a/internal/util-collection/src/main/scala/sbt/internal/util/Util.scala +++ /dev/null @@ -1,41 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2011 Mark Harrah - */ -package sbt.internal.util - -import java.util.Locale - -object Util { - def makeList[T](size: Int, value: T): List[T] = List.fill(size)(value) - - def separateE[A, B](ps: Seq[Either[A, B]]): (Seq[A], Seq[B]) = - separate(ps)(Types.idFun) - - def separate[T, A, B](ps: Seq[T])(f: T => Either[A, B]): (Seq[A], Seq[B]) = - { - val (a, b) = ((Nil: Seq[A], Nil: Seq[B]) /: ps)((xs, y) => prependEither(xs, f(y))) - (a.reverse, b.reverse) - } - - def prependEither[A, B](acc: (Seq[A], Seq[B]), next: Either[A, B]): (Seq[A], Seq[B]) = - next match { - case Left(l) => (l +: acc._1, acc._2) - case Right(r) => (acc._1, r +: acc._2) - } - - def pairID[A, B] = (a: A, b: B) => (a, b) - - private[this] lazy val Hyphen = """-(\p{javaLowerCase})""".r - - def hasHyphen(s: String): Boolean = s.indexOf('-') >= 0 - - def hyphenToCamel(s: String): String = - if (hasHyphen(s)) Hyphen.replaceAllIn(s, _.group(1).toUpperCase(Locale.ENGLISH)) else s - - private[this] lazy val Camel = """(\p{javaLowerCase})(\p{javaUpperCase})""".r - - def camelToHyphen(s: String): String = - Camel.replaceAllIn(s, m => m.group(1) + "-" + m.group(2).toLowerCase(Locale.ENGLISH)) - - def quoteIfKeyword(s: String): String = if (ScalaKeywords.values(s)) '`' + s + '`' else s -} diff --git a/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala b/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala deleted file mode 100644 index e840bc689..000000000 --- a/internal/util-collection/src/main/scala/sbt/util/OptJsonWriter.scala +++ /dev/null @@ -1,22 +0,0 @@ -package sbt.util - -import 
sjsonnew.JsonWriter - -sealed trait OptJsonWriter[A] -final case class NoJsonWriter[A]() extends OptJsonWriter[A] -final case class SomeJsonWriter[A](value: JsonWriter[A]) extends OptJsonWriter[A] - -trait OptJsonWriter0 { - implicit def fallback[A]: NoJsonWriter[A] = NoJsonWriter() -} -object OptJsonWriter extends OptJsonWriter0 { - implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) - - trait StrictMode0 { - implicit def conflictingFallback1[A]: NoJsonWriter[A] = NoJsonWriter() - implicit def conflictingFallback2[A]: NoJsonWriter[A] = NoJsonWriter() - } - object StrictMode extends StrictMode0 { - implicit def lift[A](implicit z: JsonWriter[A]): SomeJsonWriter[A] = SomeJsonWriter(z) - } -} diff --git a/internal/util-collection/src/main/scala/sbt/util/Show.scala b/internal/util-collection/src/main/scala/sbt/util/Show.scala deleted file mode 100644 index 20ac0565d..000000000 --- a/internal/util-collection/src/main/scala/sbt/util/Show.scala +++ /dev/null @@ -1,12 +0,0 @@ -package sbt.util - -trait Show[A] { - def show(a: A): String -} -object Show { - def apply[A](f: A => String): Show[A] = new Show[A] { def show(a: A): String = f(a) } - - def fromToString[A]: Show[A] = new Show[A] { - def show(a: A): String = a.toString - } -} diff --git a/internal/util-collection/src/test/scala/DagSpecification.scala b/internal/util-collection/src/test/scala/DagSpecification.scala deleted file mode 100644 index 3b3614e39..000000000 --- a/internal/util-collection/src/test/scala/DagSpecification.scala +++ /dev/null @@ -1,53 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008 Mark Harrah */ - -package sbt.internal.util - -import org.scalacheck._ -import Prop._ - -import scala.collection.mutable.HashSet - -object DagSpecification extends Properties("Dag") { - property("No repeated nodes") = forAll { (dag: TestDag) => isSet(dag.topologicalSort) } - property("Sort contains node") = forAll { (dag: TestDag) => dag.topologicalSort.contains(dag) } - 
property("Dependencies precede node") = forAll { (dag: TestDag) => dependenciesPrecedeNodes(dag.topologicalSort) } - - implicit lazy val arbTestDag: Arbitrary[TestDag] = Arbitrary(Gen.sized(dagGen)) - private def dagGen(nodeCount: Int): Gen[TestDag] = - { - val nodes = new HashSet[TestDag] - def nonterminalGen(p: Gen.Parameters): Gen[TestDag] = - { - val seed = rng.Seed.random() - for { - i <- 0 until nodeCount - nextDeps <- Gen.someOf(nodes).apply(p, seed) - } nodes += new TestDag(i, nextDeps) - for (nextDeps <- Gen.someOf(nodes)) yield new TestDag(nodeCount, nextDeps) - } - Gen.parameterized(nonterminalGen) - } - - private def isSet[T](c: Seq[T]) = Set(c: _*).size == c.size - private def dependenciesPrecedeNodes(sort: List[TestDag]) = - { - val seen = new HashSet[TestDag] - def iterate(remaining: List[TestDag]): Boolean = - { - remaining match { - case Nil => true - case node :: tail => - if (node.dependencies.forall(seen.contains) && !seen.contains(node)) { - seen += node - iterate(tail) - } else - false - } - } - iterate(sort) - } -} -class TestDag(id: Int, val dependencies: Iterable[TestDag]) extends Dag[TestDag] { - override def toString = id + "->" + dependencies.mkString("[", ",", "]") -} diff --git a/internal/util-collection/src/test/scala/HListFormatSpec.scala b/internal/util-collection/src/test/scala/HListFormatSpec.scala deleted file mode 100644 index 8f6e9a73b..000000000 --- a/internal/util-collection/src/test/scala/HListFormatSpec.scala +++ /dev/null @@ -1,28 +0,0 @@ -package sbt -package internal -package util - -import scalajson.ast.unsafe._ -import sjsonnew._, BasicJsonProtocol._, support.scalajson.unsafe._ -import HListFormats._ - -class HListFormatSpec extends UnitSpec { - val quux = 23 :+: "quux" :+: true :+: HNil - - it should "round trip quux" in assertRoundTrip(quux) - it should "round trip hnil" in assertRoundTrip(HNil) - - it should "have a flat structure for quux" in assertJsonString(quux, """[23,"quux",true]""") - it should "have a flat 
structure for hnil" in assertJsonString(HNil, "[]") - - def assertRoundTrip[A: JsonWriter: JsonReader](x: A) = { - val jsonString: String = toJsonString(x) - val jValue: JValue = Parser.parseUnsafe(jsonString) - val y: A = Converter.fromJson[A](jValue).get - assert(x === y) - } - - def assertJsonString[A: JsonWriter](x: A, s: String) = assert(toJsonString(x) === s) - - def toJsonString[A: JsonWriter](x: A): String = CompactPrinter(Converter.toJson(x).get) -} diff --git a/internal/util-collection/src/test/scala/KeyTest.scala b/internal/util-collection/src/test/scala/KeyTest.scala deleted file mode 100644 index 461655b88..000000000 --- a/internal/util-collection/src/test/scala/KeyTest.scala +++ /dev/null @@ -1,32 +0,0 @@ -package sbt.internal.util - -import org.scalacheck._ -import Prop._ - -object KeyTest extends Properties("AttributeKey") { - property("equality") = { - compare(AttributeKey[Int]("test"), AttributeKey[Int]("test"), true) && - compare(AttributeKey[Int]("test"), AttributeKey[Int]("test", "description"), true) && - compare(AttributeKey[Int]("test", "a"), AttributeKey[Int]("test", "b"), true) && - compare(AttributeKey[Int]("test"), AttributeKey[Int]("tests"), false) && - compare(AttributeKey[Int]("test"), AttributeKey[Double]("test"), false) && - compare(AttributeKey[java.lang.Integer]("test"), AttributeKey[Int]("test"), false) && - compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, String]]("test"), true) && - compare(AttributeKey[Map[Int, String]]("test"), AttributeKey[Map[Int, _]]("test"), false) - } - - def compare(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = - ("a.label: " + a.label) |: - ("a.manifest: " + a.manifest) |: - ("b.label: " + b.label) |: - ("b.manifest: " + b.manifest) |: - ("expected equal? 
" + same) |: - compare0(a, b, same) - - def compare0(a: AttributeKey[_], b: AttributeKey[_], same: Boolean) = - if (same) { - ("equality" |: (a == b)) && - ("hash" |: (a.hashCode == b.hashCode)) - } else - ("equality" |: (a != b)) -} \ No newline at end of file diff --git a/internal/util-collection/src/test/scala/LiteralTest.scala b/internal/util-collection/src/test/scala/LiteralTest.scala deleted file mode 100644 index 9353a07bf..000000000 --- a/internal/util-collection/src/test/scala/LiteralTest.scala +++ /dev/null @@ -1,15 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -// compilation test -object LiteralTest { - def x[A[_], B[_]](f: A ~> B) = f - - import Param._ - val f = x { (p: Param[Option, List]) => p.ret(p.in.toList) } - - val a: List[Int] = f(Some(3)) - val b: List[String] = f(Some("aa")) -} diff --git a/internal/util-collection/src/test/scala/PMapTest.scala b/internal/util-collection/src/test/scala/PMapTest.scala deleted file mode 100644 index 9e1dde2c9..000000000 --- a/internal/util-collection/src/test/scala/PMapTest.scala +++ /dev/null @@ -1,18 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util - -import Types._ - -// compilation test -object PMapTest { - val mp = new DelegatingPMap[Some, Id](new collection.mutable.HashMap) - mp(Some("asdf")) = "a" - mp(Some(3)) = 9 - val x = Some(3) :^: Some("asdf") :^: KNil - val y = x.transform[Id](mp) - assert(y.head == 9) - assert(y.tail.head == "a") - assert(y.tail.tail == KNil) -} \ No newline at end of file diff --git a/internal/util-collection/src/test/scala/SettingsExample.scala b/internal/util-collection/src/test/scala/SettingsExample.scala deleted file mode 100644 index 20536a5ef..000000000 --- a/internal/util-collection/src/test/scala/SettingsExample.scala +++ /dev/null @@ -1,89 +0,0 @@ -package sbt.internal.util - -import sbt.util.Show - -/** Define our settings system */ - -// A basic scope indexed by 
an integer. -final case class Scope(nestIndex: Int, idAtIndex: Int = 0) - -// Extend the Init trait. -// (It is done this way because the Scope type parameter is used everywhere in Init. -// Lots of type constructors would become binary, which as you may know requires lots of type lambdas -// when you want a type function with only one parameter. -// That would be a general pain.) -case class SettingsExample() extends Init[Scope] { - // Provides a way of showing a Scope+AttributeKey[_] - val showFullKey: Show[ScopedKey[_]] = Show[ScopedKey[_]]((key: ScopedKey[_]) => - { - s"${key.scope.nestIndex}(${key.scope.idAtIndex})/${key.key.label}" - }) - - // A sample delegation function that delegates to a Scope with a lower index. - val delegates: Scope => Seq[Scope] = { - case s @ Scope(index, proj) => - s +: (if (index <= 0) Nil else { (if (proj > 0) List(Scope(index)) else Nil) ++: delegates(Scope(index - 1)) }) - } - - // Not using this feature in this example. - val scopeLocal: ScopeLocal = _ => Nil - - // These three functions + a scope (here, Scope) are sufficient for defining our settings system. -} - -/** Usage Example **/ - -case class SettingsUsage(val settingsExample: SettingsExample) { - import settingsExample._ - - // Define some keys - val a = AttributeKey[Int]("a") - val b = AttributeKey[Int]("b") - - // Scope these keys - val a3 = ScopedKey(Scope(3), a) - val a4 = ScopedKey(Scope(4), a) - val a5 = ScopedKey(Scope(5), a) - - val b4 = ScopedKey(Scope(4), b) - - // Define some settings - val mySettings: Seq[Setting[_]] = Seq( - setting(a3, value(3)), - setting(b4, map(a4)(_ * 3)), - update(a5)(_ + 1) - ) - - // "compiles" and applies the settings. - // This can be split into multiple steps to access intermediate results if desired. - // The 'inspect' command operates on the output of 'compile', for example. - val applied: Settings[Scope] = make(mySettings)(delegates, scopeLocal, showFullKey) - - // Show results. 
- /* for(i <- 0 to 5; k <- Seq(a, b)) { - println( k.label + i + " = " + applied.get( Scope(i), k) ) - }*/ - - /** - * Output: - * For the None results, we never defined the value and there was no value to delegate to. - * For a3, we explicitly defined it to be 3. - * a4 wasn't defined, so it delegates to a3 according to our delegates function. - * b4 gets the value for a4 (which delegates to a3, so it is 3) and multiplies by 3 - * a5 is defined as the previous value of a5 + 1 and - * since no previous value of a5 was defined, it delegates to a4, resulting in 3+1=4. - * b5 isn't defined explicitly, so it delegates to b4 and is therefore equal to 9 as well - * a0 = None - * b0 = None - * a1 = None - * b1 = None - * a2 = None - * b2 = None - * a3 = Some(3) - * b3 = None - * a4 = Some(3) - * b4 = Some(9) - * a5 = Some(4) - * b5 = Some(9) - */ -} diff --git a/internal/util-collection/src/test/scala/SettingsTest.scala b/internal/util-collection/src/test/scala/SettingsTest.scala deleted file mode 100644 index 3e0bf0c0f..000000000 --- a/internal/util-collection/src/test/scala/SettingsTest.scala +++ /dev/null @@ -1,198 +0,0 @@ -package sbt.internal.util - -import org.scalacheck._ -import Prop._ - -object SettingsTest extends Properties("settings") { - val settingsExample: SettingsExample = SettingsExample() - import settingsExample._ - val settingsUsage = SettingsUsage(settingsExample) - import settingsUsage._ - - import scala.reflect.Manifest - - final val ChainMax = 5000 - lazy val chainLengthGen = Gen.choose(1, ChainMax) - - property("Basic settings test") = secure(all(tests: _*)) - - property("Basic chain") = forAll(chainLengthGen) { (i: Int) => - val abs = math.abs(i) - singleIntTest(chain(abs, value(0)), abs) - } - property("Basic bind chain") = forAll(chainLengthGen) { (i: Int) => - val abs = math.abs(i) - singleIntTest(chainBind(value(abs)), 0) - } - - property("Allows references to completed settings") = forAllNoShrink(30) { allowedReference } - final def 
allowedReference(intermediate: Int): Prop = - { - val top = value(intermediate) - def iterate(init: Initialize[Int]): Initialize[Int] = - bind(init) { t => - if (t <= 0) - top - else - iterate(value(t - 1)) - } - evaluate(setting(chk, iterate(top)) :: Nil); true - } - - property("Derived setting chain depending on (prev derived, normal setting)") = forAllNoShrink(Gen.choose(1, 100).label("numSettings")) { derivedSettings } - final def derivedSettings(nr: Int): Prop = - { - val genScopedKeys = { - // We want to generate lists of keys that DO NOT include the "ch" key we use to check things. - val attrKeys = mkAttrKeys[Int](nr).filter(_.forall(_.label != "ch")) - attrKeys map (_ map (ak => ScopedKey(Scope(0), ak))) - }.label("scopedKeys").filter(_.nonEmpty) - forAll(genScopedKeys) { scopedKeys => - try { - // Note: It's unsafe to grab last IF you haven't verified the set can't be empty. - val last = scopedKeys.last - val derivedSettings: Seq[Setting[Int]] = ( - for { - List(scoped0, scoped1) <- chk :: scopedKeys sliding 2 - nextInit = if (scoped0 == chk) chk - else (scoped0 zipWith chk) { (p, _) => p + 1 } - } yield derive(setting(scoped1, nextInit)) - ).toSeq - - { - // Note: This causes a cyclic reference error, quite frequently. - checkKey(last, Some(nr - 1), evaluate(setting(chk, value(0)) +: derivedSettings)) :| "Not derived?" - } && { - checkKey(last, None, evaluate(derivedSettings)) :| "Should not be derived" - } - } catch { - case t: Throwable => - // TODO - For debugging only.
- t.printStackTrace(System.err) - throw t - } - } - } - - private def mkAttrKeys[T](nr: Int)(implicit mf: Manifest[T]): Gen[List[AttributeKey[T]]] = - { - import Gen._ - val nonEmptyAlphaStr = - nonEmptyListOf(alphaChar).map(_.mkString).suchThat(_.forall(_.isLetter)) - - (for { - list <- Gen.listOfN(nr, nonEmptyAlphaStr) suchThat (l => l.size == l.distinct.size) - item <- list - } yield AttributeKey[T](item)).label(s"mkAttrKeys($nr)") - } - - property("Derived setting(s) replace DerivedSetting in the Seq[Setting[_]]") = derivedKeepsPosition - final def derivedKeepsPosition: Prop = - { - val a: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("a")) - val b: ScopedKey[Int] = ScopedKey(Scope(0), AttributeKey[Int]("b")) - val prop1 = { - val settings: Seq[Setting[_]] = Seq( - setting(a, value(3)), - setting(b, value(6)), - derive(setting(b, a)), - setting(a, value(5)), - setting(b, value(8)) - ) - val ev = evaluate(settings) - checkKey(a, Some(5), ev) && checkKey(b, Some(8), ev) - } - val prop2 = { - val settings: Seq[Setting[Int]] = Seq( - setting(a, value(3)), - setting(b, value(6)), - derive(setting(b, a)), - setting(a, value(5)) - ) - val ev = evaluate(settings) - checkKey(a, Some(5), ev) && checkKey(b, Some(5), ev) - } - prop1 && prop2 - } - - property("DerivedSetting in ThisBuild scopes derived settings under projects thus allowing safe +=") = forAllNoShrink(Gen.choose(1, 100)) { derivedSettingsScope } - final def derivedSettingsScope(nrProjects: Int): Prop = - { - forAll(mkAttrKeys[Int](2)) { - case List(key, derivedKey) => - val projectKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), key) - val projectDerivedKeys = for { proj <- 1 to nrProjects } yield ScopedKey(Scope(1, proj), derivedKey) - val globalKey = ScopedKey(Scope(0), key) - val globalDerivedKey = ScopedKey(Scope(0), derivedKey) - // Each project defines an initial value, but the update is defined in globalKey. 
- // However, the derived Settings that come from this should be scoped in each project. - val settings: Seq[Setting[_]] = - derive(setting(globalDerivedKey, settingsExample.map(globalKey)(_ + 1))) +: projectKeys.map(pk => setting(pk, value(0))) - val ev = evaluate(settings) - // Also check that the key has no value at the "global" scope - val props = for { pk <- projectDerivedKeys } yield checkKey(pk, Some(1), ev) - checkKey(globalDerivedKey, None, ev) && Prop.all(props: _*) - } - } - - // Circular (dynamic) references currently loop infinitely. - // This is the expected behavior (detecting dynamic cycles is expensive), - // but it may be necessary to provide an option to detect them (with a performance hit) - // The commented-out property below would test that cycle detection. - // property("Catches circular references") = forAll(chainLengthGen) { checkCircularReferences _ } - final def checkCircularReferences(intermediate: Int): Prop = - { - val ccr = new CCR(intermediate) - try { evaluate(setting(chk, ccr.top) :: Nil); false } - catch { case e: java.lang.Exception => true } - } - - def tests = - for (i <- 0 to 5; k <- Seq(a, b)) yield { - val expected = expectedValues(2 * i + (if (k == a) 0 else 1)) - checkKey[Int](ScopedKey(Scope(i), k), expected, applied) - } - - lazy val expectedValues = None :: None :: None :: None :: None :: None :: Some(3) :: None :: Some(3) :: Some(9) :: Some(4) :: Some(9) :: Nil - - lazy val ch = AttributeKey[Int]("ch") - lazy val chk = ScopedKey(Scope(0), ch) - def chain(i: Int, prev: Initialize[Int]): Initialize[Int] = - if (i <= 0) prev else chain(i - 1, prev(_ + 1)) - - def chainBind(prev: Initialize[Int]): Initialize[Int] = - bind(prev) { v => - if (v <= 0) prev else chainBind(value(v - 1)) - } - def singleIntTest(i: Initialize[Int], expected: Int) = - { - val eval = evaluate(setting(chk, i) :: Nil) - checkKey(chk, Some(expected), eval) - } - - def checkKey[T](key: ScopedKey[T], expected: Option[T], settings: Settings[Scope]) = - { - val value =
settings.get(key.scope, key.key) - ("Key: " + key) |: - ("Value: " + value) |: - ("Expected: " + expected) |: - (value == expected) - } - - def evaluate(settings: Seq[Setting[_]]): Settings[Scope] = - try { make(settings)(delegates, scopeLocal, showFullKey) } - catch { case e: Throwable => e.printStackTrace; throw e } -} -// This setup is a workaround for module synchronization issues -final class CCR(intermediate: Int) { - import SettingsTest.settingsExample._ - lazy val top = iterate(value(intermediate), intermediate) - def iterate(init: Initialize[Int], i: Int): Initialize[Int] = - bind(init) { t => - if (t <= 0) - top - else - iterate(value(t - 1), t - 1) - } -} diff --git a/internal/util-complete/NOTICE b/internal/util-complete/NOTICE deleted file mode 100644 index a6f2c1de4..000000000 --- a/internal/util-complete/NOTICE +++ /dev/null @@ -1,3 +0,0 @@ -Simple Build Tool: Completion Component -Copyright 2010 Mark Harrah -Licensed under BSD-style license (see LICENSE) \ No newline at end of file diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala b/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala deleted file mode 100644 index b4d5d5f83..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/LineReader.scala +++ /dev/null @@ -1,189 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2009 Mark Harrah - */ -package sbt.internal.util - -import jline.console.ConsoleReader -import jline.console.history.{ FileHistory, MemoryHistory } -import java.io.{ File, InputStream, FileInputStream, FileDescriptor, FilterInputStream } -import complete.Parser -import scala.concurrent.duration.Duration -import scala.annotation.tailrec - -abstract class JLine extends LineReader { - protected[this] def handleCONT: Boolean - protected[this] def reader: ConsoleReader - protected[this] def injectThreadSleep: Boolean - protected[this] val in: InputStream = JLine.makeInputStream(injectThreadSleep) - def 
readLine(prompt: String, mask: Option[Char] = None) = JLine.withJLine { unsynchronizedReadLine(prompt, mask) } - - private[this] def unsynchronizedReadLine(prompt: String, mask: Option[Char]): Option[String] = - readLineWithHistory(prompt, mask) map { x => - x.trim - } - - private[this] def readLineWithHistory(prompt: String, mask: Option[Char]): Option[String] = - reader.getHistory match { - case fh: FileHistory => - try { readLineDirect(prompt, mask) } - finally { fh.flush() } - case _ => readLineDirect(prompt, mask) - } - - private[this] def readLineDirect(prompt: String, mask: Option[Char]): Option[String] = - if (handleCONT) - Signals.withHandler(() => resume(), signal = Signals.CONT)(() => readLineDirectRaw(prompt, mask)) - else - readLineDirectRaw(prompt, mask) - private[this] def readLineDirectRaw(prompt: String, mask: Option[Char]): Option[String] = - { - val newprompt = handleMultilinePrompt(prompt) - try { - mask match { - case Some(m) => Option(reader.readLine(newprompt, m)) - case None => Option(reader.readLine(newprompt)) - } - } catch { - case e: InterruptedException => Option("") - } - } - - private[this] def handleMultilinePrompt(prompt: String): String = { - val lines = """\r?\n""".r.split(prompt) - lines.length match { - case 0 | 1 => prompt - case _ => - // Workaround for regression jline/jline2#205 - reader.getOutput.write(lines.init.mkString("\n") + "\n") - lines.last - } - } - - private[this] def resume(): Unit = { - jline.TerminalFactory.reset - JLine.terminal.init - reader.drawLine() - reader.flush() - } -} -private[sbt] object JLine { - private[this] val TerminalProperty = "jline.terminal" - - fixTerminalProperty() - - // translate explicit class names to type in order to support - // older Scala, since it shaded classes but not the system property - private[sbt] def fixTerminalProperty(): Unit = { - val newValue = System.getProperty(TerminalProperty) match { - case "jline.UnixTerminal" => "unix" - case null if 
System.getProperty("sbt.cygwin") != null => "unix" - case "jline.WindowsTerminal" => "windows" - case "jline.AnsiWindowsTerminal" => "windows" - case "jline.UnsupportedTerminal" => "none" - case x => x - } - if (newValue != null) System.setProperty(TerminalProperty, newValue) - () - } - - protected[this] val originalIn = new FileInputStream(FileDescriptor.in) - private[sbt] def makeInputStream(injectThreadSleep: Boolean): InputStream = - if (injectThreadSleep) new InputStreamWrapper(originalIn, Duration("50 ms")) - else originalIn - - // When calling this, ensure that enableEcho has been or will be called. - // TerminalFactory.get will initialize the terminal to disable echo. - private def terminal = jline.TerminalFactory.get - private def withTerminal[T](f: jline.Terminal => T): T = - synchronized { - val t = terminal - t.synchronized { f(t) } - } - /** - * For accessing the JLine Terminal object. - * This ensures synchronized access as well as re-enabling echo after getting the Terminal. 
- */ - def usingTerminal[T](f: jline.Terminal => T): T = - withTerminal { t => - t.restore - f(t) - } - def createReader(): ConsoleReader = createReader(None, JLine.makeInputStream(true)) - def createReader(historyPath: Option[File], in: InputStream): ConsoleReader = - usingTerminal { t => - val cr = new ConsoleReader(in, System.out) - cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650 - cr.setBellEnabled(false) - val h = historyPath match { - case None => new MemoryHistory - case Some(file) => new FileHistory(file) - } - h.setMaxSize(MaxHistorySize) - cr.setHistory(h) - cr - } - def withJLine[T](action: => T): T = - withTerminal { t => - t.init - try { action } - finally { t.restore } - } - - def simple( - historyPath: Option[File], - handleCONT: Boolean = HandleCONT, - injectThreadSleep: Boolean = false - ): SimpleReader = new SimpleReader(historyPath, handleCONT, injectThreadSleep) - val MaxHistorySize = 500 - val HandleCONT = !java.lang.Boolean.getBoolean("sbt.disable.cont") && Signals.supported(Signals.CONT) -} - -private[sbt] class InputStreamWrapper(is: InputStream, val poll: Duration) extends FilterInputStream(is) { - @tailrec - final override def read(): Int = - if (is.available() != 0) is.read() - else { - Thread.sleep(poll.toMillis) - read() - } - - @tailrec - final override def read(b: Array[Byte]): Int = - if (is.available() != 0) is.read(b) - else { - Thread.sleep(poll.toMillis) - read(b) - } - - @tailrec - final override def read(b: Array[Byte], off: Int, len: Int): Int = - if (is.available() != 0) is.read(b, off, len) - else { - Thread.sleep(poll.toMillis) - read(b, off, len) - } -} - -trait LineReader { - def readLine(prompt: String, mask: Option[Char] = None): Option[String] -} -final class FullReader( - historyPath: Option[File], - complete: Parser[_], - val handleCONT: Boolean = JLine.HandleCONT, - val injectThreadSleep: Boolean = false -) extends JLine { - protected[this] val reader = - { - val cr = 
JLine.createReader(historyPath, in) - sbt.internal.util.complete.JLineCompletion.installCustomCompletor(cr, complete) - cr - } -} - -class SimpleReader private[sbt] (historyPath: Option[File], val handleCONT: Boolean, val injectThreadSleep: Boolean) extends JLine { - protected[this] val reader = JLine.createReader(historyPath, in) - -} -object SimpleReader extends SimpleReader(None, JLine.HandleCONT, false) - diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala deleted file mode 100644 index 47dbb3b4f..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Completions.scala +++ /dev/null @@ -1,137 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util -package complete - -/** - * Represents a set of completions. - * It exists instead of implicitly defined operations on top of Set[Completion] - * for laziness. - */ -sealed trait Completions { - def get: Set[Completion] - final def x(o: Completions): Completions = flatMap(_ x o) - final def ++(o: Completions): Completions = Completions(get ++ o.get) - final def +:(o: Completion): Completions = Completions(get + o) - final def filter(f: Completion => Boolean): Completions = Completions(get filter f) - final def filterS(f: String => Boolean): Completions = filter(c => f(c.append)) - override def toString = get.mkString("Completions(", ",", ")") - final def flatMap(f: Completion => Completions): Completions = Completions(get.flatMap(c => f(c).get)) - final def map(f: Completion => Completion): Completions = Completions(get map f) - override final def hashCode = get.hashCode - override final def equals(o: Any) = o match { case c: Completions => get == c.get; case _ => false } -} -object Completions { - /** Returns a lazy Completions instance using the provided Completion Set. 
*/ - def apply(cs: => Set[Completion]): Completions = new Completions { - lazy val get = cs - } - - /** Returns a strict Completions instance using the provided Completion Set. */ - def strict(cs: Set[Completion]): Completions = apply(cs) - - /** - * No suggested completions, not even the empty Completion. - * This typically represents invalid input. - */ - val nil: Completions = strict(Set.empty) - - /** - * Only includes an empty Suggestion. - * This typically represents valid input that either has no completions or accepts no further input. - */ - val empty: Completions = strict(Set.empty + Completion.empty) - - /** Returns a strict Completions instance containing only the provided Completion.*/ - def single(c: Completion): Completions = strict(Set.empty + c) -} - -/** - * Represents a completion. - * The abstract members `display` and `append` are best explained with an example. - * - * Assuming space-delimited tokens, processing this: - * am is are w - * could produce these Completions: - * Completion { display = "was"; append = "as" } - * Completion { display = "were"; append = "ere" } - * to suggest the tokens "was" and "were". 
- * - * In this way, two pieces of information are preserved: - * 1) what needs to be appended to the current input if a completion is selected - * 2) the full token being completed, which is useful for presenting a user with choices to select - */ -sealed trait Completion { - /** The proposed suffix to append to the existing input to complete the last token in the input.*/ - def append: String - /** The string to present to the user to represent the full token being suggested.*/ - def display: String - /** True if this Completion is suggesting the empty string.*/ - def isEmpty: Boolean - - /** Appends the completions in `o` with the completions in this Completion.*/ - def ++(o: Completion): Completion = Completion.concat(this, o) - final def x(o: Completions): Completions = if (Completion evaluatesRight this) o.map(this ++ _) else Completions.strict(Set.empty + this) - override final lazy val hashCode = Completion.hashCode(this) - override final def equals(o: Any) = o match { case c: Completion => Completion.equal(this, c); case _ => false } -} -final class DisplayOnly(val display: String) extends Completion { - def isEmpty = display.isEmpty - def append = "" - override def toString = "{" + display + "}" -} -final class Token(val display: String, val append: String) extends Completion { - def isEmpty = display.isEmpty && append.isEmpty - override final def toString = "[" + display + "]++" + append -} -final class Suggestion(val append: String) extends Completion { - def isEmpty = append.isEmpty - def display = append - override def toString = append -} -object Completion { - def concat(a: Completion, b: Completion): Completion = - (a, b) match { - case (as: Suggestion, bs: Suggestion) => suggestion(as.append + bs.append) - case (at: Token, _) if at.append.isEmpty => b - case _ if a.isEmpty => b - case _ => a - } - def evaluatesRight(a: Completion): Boolean = - a match { - case _: Suggestion => true - case at: Token if at.append.isEmpty => true - case _ => 
a.isEmpty - } - - def equal(a: Completion, b: Completion): Boolean = - (a, b) match { - case (as: Suggestion, bs: Suggestion) => as.append == bs.append - case (ad: DisplayOnly, bd: DisplayOnly) => ad.display == bd.display - case (at: Token, bt: Token) => at.display == bt.display && at.append == bt.append - case _ => false - } - - def hashCode(a: Completion): Int = - a match { - case as: Suggestion => (0, as.append).hashCode - case ad: DisplayOnly => (1, ad.display).hashCode - case at: Token => (2, at.display, at.append).hashCode - } - - val empty: Completion = suggestion("") - def single(c: Char): Completion = suggestion(c.toString) - - // TODO: make strict in 0.13.0 to match DisplayOnly - def displayOnly(value: => String): Completion = new DisplayOnly(value) - - // TODO: make strict in 0.13.0 to match Token - def token(prepend: => String, append: => String): Completion = new Token(prepend + append, append) - - /** @since 0.12.1 */ - def tokenDisplay(append: String, display: String): Completion = new Token(display, append) - - // TODO: make strict in 0.13.0 to match Suggestion - def suggestion(value: => String): Completion = new Suggestion(value) -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala deleted file mode 100644 index 79f488554..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/EditDistance.scala +++ /dev/null @@ -1,41 +0,0 @@ -package sbt.internal.util -package complete - -import java.lang.Character.{ toLowerCase => lower } - -/** @author Paul Phillips*/ -object EditDistance { - /** - * Translated from the java version at - * http://www.merriampark.com/ld.htm - * which is declared to be public domain. 
- */ - def levenshtein(s: String, t: String, insertCost: Int = 1, deleteCost: Int = 1, subCost: Int = 1, transposeCost: Int = 1, matchCost: Int = 0, caseCost: Int = 1, transpositions: Boolean = false): Int = { - val n = s.length - val m = t.length - if (n == 0) return m - if (m == 0) return n - - val d = Array.ofDim[Int](n + 1, m + 1) - 0 to n foreach (x => d(x)(0) = x) - 0 to m foreach (x => d(0)(x) = x) - - for (i <- 1 to n; s_i = s(i - 1); j <- 1 to m) { - val t_j = t(j - 1) - val cost = if (s_i == t_j) matchCost else if (lower(s_i) == lower(t_j)) caseCost else subCost - - val c1 = d(i - 1)(j) + deleteCost - val c2 = d(i)(j - 1) + insertCost - val c3 = d(i - 1)(j - 1) + cost - - d(i)(j) = c1 min c2 min c3 - - if (transpositions) { - if (i > 1 && j > 1 && s(i - 1) == t(j - 2) && s(i - 2) == t(j - 1)) - d(i)(j) = d(i)(j) min (d(i - 2)(j - 2) + transposeCost) - } - } - - d(n)(m) - } -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala deleted file mode 100644 index 6539554a6..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/ExampleSource.scala +++ /dev/null @@ -1,60 +0,0 @@ -package sbt.internal.util -package complete - -import java.io.File -import sbt.io.IO - -/** - * These sources of examples are used in parsers for user input completion. An example of such a source is the - * [[sbt.complete.FileExamples]] class, which provides a list of suggested files to the user as they press the - * TAB key in the console. - */ -trait ExampleSource { - /** - * @return a (possibly lazy) list of completion example strings. These strings are continuations of user's input. The - * user's input is incremented with calls to [[withAddedPrefix]]. - */ - def apply(): Iterable[String] - - /** - * @param addedPrefix a string that was just typed in by the user.
- * @return a new source of only those examples that start with the string typed by the user so far (with addition of - * the just added prefix). - */ - def withAddedPrefix(addedPrefix: String): ExampleSource -} - -/** - * A convenience example source that wraps any collection of strings into a source of examples. - * @param examples the examples that will be displayed to the user when they press the TAB key. - */ -sealed case class FixedSetExamples(examples: Iterable[String]) extends ExampleSource { - override def withAddedPrefix(addedPrefix: String): ExampleSource = FixedSetExamples(examplesWithRemovedPrefix(addedPrefix)) - - override def apply(): Iterable[String] = examples - - private def examplesWithRemovedPrefix(prefix: String) = examples.collect { - case example if example startsWith prefix => example substring prefix.length - } -} - -/** - * Provides path completion examples based on files in the base directory. - * @param base the directory within which this class will search for completion examples. - * @param prefix the part of the path already written by the user. 
- */ -class FileExamples(base: File, prefix: String = "") extends ExampleSource { - override def apply(): Stream[String] = files(base).map(_ substring prefix.length) - - override def withAddedPrefix(addedPrefix: String): FileExamples = new FileExamples(base, prefix + addedPrefix) - - protected def files(directory: File): Stream[String] = { - val childPaths = IO.listFiles(directory).toStream - val prefixedDirectChildPaths = childPaths map { IO.relativize(base, _).get } filter { _ startsWith prefix } - val dirsToRecurseInto = childPaths filter { _.isDirectory } map { IO.relativize(base, _).get } filter { dirStartsWithPrefix } - prefixedDirectChildPaths append dirsToRecurseInto.flatMap(dir => files(new File(base, dir))) - } - - private def dirStartsWithPrefix(relativizedPath: String): Boolean = - (relativizedPath startsWith prefix) || (prefix startsWith relativizedPath) -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala deleted file mode 100644 index d5a96836e..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/History.scala +++ /dev/null @@ -1,44 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util -package complete - -import History.number -import java.io.File - -final class History private (val lines: IndexedSeq[String], val path: Option[File], error: String => Unit) { - private def reversed = lines.reverse - - def all: Seq[String] = lines - def size = lines.length - def !! 
: Option[String] = !-(1) - def apply(i: Int): Option[String] = if (0 <= i && i < size) Some(lines(i)) else { error("Invalid history index: " + i); None } - def !(i: Int): Option[String] = apply(i) - - def !(s: String): Option[String] = - number(s) match { - case Some(n) => if (n < 0) !-(-n) else apply(n) - case None => nonEmpty(s) { reversed.find(_.startsWith(s)) } - } - def !-(n: Int): Option[String] = apply(size - n - 1) - - def !?(s: String): Option[String] = nonEmpty(s) { reversed.drop(1).find(_.contains(s)) } - - private def nonEmpty[T](s: String)(act: => Option[T]): Option[T] = - if (s.isEmpty) - sys.error("No action specified to history command") - else - act - - def list(historySize: Int, show: Int): Seq[String] = - lines.toList.drop(scala.math.max(0, lines.size - historySize)).zipWithIndex.map { case (line, number) => " " + number + " " + line }.takeRight(show max 1) -} - -object History { - def apply(lines: Seq[String], path: Option[File], error: String => Unit): History = new History(lines.toIndexedSeq, path, error) - - def number(s: String): Option[Int] = - try { Some(s.toInt) } - catch { case e: NumberFormatException => None } -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala deleted file mode 100644 index f18f1619f..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/HistoryCommands.scala +++ /dev/null @@ -1,72 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2010 Mark Harrah - */ -package sbt.internal.util -package complete - -import sbt.io.IO - -object HistoryCommands { - val Start = "!" - // second characters - val Contains = "?" - val Last = "!"
- val ListCommands = ":" - - def ContainsFull = h(Contains) - def LastFull = h(Last) - def ListFull = h(ListCommands) - - def ListN = ListFull + "n" - def ContainsString = ContainsFull + "string" - def StartsWithString = Start + "string" - def Previous = Start + "-n" - def Nth = Start + "n" - - private def h(s: String) = Start + s - def plainCommands = Seq(ListFull, Start, LastFull, ContainsFull) - - def descriptions = Seq( - LastFull -> "Execute the last command again", - ListFull -> "Show all previous commands", - ListN -> "Show the last n commands", - Nth -> ("Execute the command with index n, as shown by the " + ListFull + " command"), - Previous -> "Execute the nth command before this one", - StartsWithString -> "Execute the most recent command starting with 'string'", - ContainsString -> "Execute the most recent command containing 'string'" - ) - def helpString = "History commands:\n " + (descriptions.map { case (c, d) => c + " " + d }).mkString("\n ") - def printHelp(): Unit = - println(helpString) - def printHistory(history: complete.History, historySize: Int, show: Int): Unit = - history.list(historySize, show).foreach(println) - - import DefaultParsers._ - - val MaxLines = 500 - lazy val num = token(NatBasic, "") - lazy val last = Last ^^^ { execute(_.!!) } - lazy val list = ListCommands ~> (num ?? Int.MaxValue) map { show => (h: History) => { printHistory(h, MaxLines, show); Some(Nil) } - } - lazy val execStr = flag('?') ~ token(any.+.string, "") map { - case (contains, str) => - execute(h => if (contains) h !? str else h ! str) - } - lazy val execInt = flag('-') ~ num map { - case (neg, value) => - execute(h => if (neg) h !- value else h ! 
value) - } - lazy val help = success((h: History) => { printHelp(); Some(Nil) }) - - def execute(f: History => Option[String]): History => Option[List[String]] = (h: History) => - { - val command = f(h).filterNot(_.startsWith(Start)) - val lines = h.lines.toArray - command.foreach(lines(lines.length - 1) = _) - h.path foreach { h => IO.writeLines(h, lines) } - Some(command.toList) - } - - val actionParser: Parser[complete.History => Option[List[String]]] = - Start ~> (help | last | execInt | list | execStr) // execStr must come last -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala deleted file mode 100644 index 0b8b50502..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/JLineCompletion.scala +++ /dev/null @@ -1,157 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2011 Mark Harrah - */ -package sbt.internal.util -package complete - -import jline.console.ConsoleReader -import jline.console.completer.{ Completer, CompletionHandler } -import scala.annotation.tailrec -import collection.JavaConversions - -object JLineCompletion { - def installCustomCompletor(reader: ConsoleReader, parser: Parser[_]): Unit = - installCustomCompletor(reader)(parserAsCompletor(parser)) - def installCustomCompletor(reader: ConsoleReader)(complete: (String, Int) => (Seq[String], Seq[String])): Unit = - installCustomCompletor(customCompletor(complete), reader) - def installCustomCompletor(complete: (ConsoleReader, Int) => Boolean, reader: ConsoleReader): Unit = - { - reader.removeCompleter(DummyCompletor) - reader.addCompleter(DummyCompletor) - reader.setCompletionHandler(new CustomHandler(complete)) - } - - private[this] final class CustomHandler(completeImpl: (ConsoleReader, Int) => Boolean) extends CompletionHandler { - private[this] var previous: Option[(String, Int)] = None - private[this] var level: Int = 1 - 
override def complete(reader: ConsoleReader, candidates: java.util.List[CharSequence], position: Int) = { - val current = Some(bufferSnapshot(reader)) - level = if (current == previous) level + 1 else 1 - previous = current - try completeImpl(reader, level) - catch { - case e: Exception => - reader.print("\nException occurred while determining completions.") - e.printStackTrace() - false - } - } - } - - // always provides dummy completions so that the custom completion handler gets called - // (ConsoleReader doesn't call the handler if there aren't any completions) - // the custom handler will then throw away the candidates and call the custom function - private[this] final object DummyCompletor extends Completer { - override def complete(buffer: String, cursor: Int, candidates: java.util.List[CharSequence]): Int = - { - candidates.asInstanceOf[java.util.List[String]] add "dummy" - 0 - } - } - - def parserAsCompletor(p: Parser[_]): (String, Int) => (Seq[String], Seq[String]) = - (str, level) => convertCompletions(Parser.completions(p, str, level)) - - def convertCompletions(c: Completions): (Seq[String], Seq[String]) = - { - val cs = c.get - if (cs.isEmpty) - (Nil, "{invalid input}" :: Nil) - else - convertCompletions(cs) - } - def convertCompletions(cs: Set[Completion]): (Seq[String], Seq[String]) = - { - val (insert, display) = - ((Set.empty[String], Set.empty[String]) /: cs) { - case (t @ (insert, display), comp) => - if (comp.isEmpty) t else (insert + comp.append, appendNonEmpty(display, comp.display)) - } - (insert.toSeq, display.toSeq.sorted) - } - def appendNonEmpty(set: Set[String], add: String) = if (add.trim.isEmpty) set else set + add - - def customCompletor(f: (String, Int) => (Seq[String], Seq[String])): (ConsoleReader, Int) => Boolean = - (reader, level) => { - val success = complete(beforeCursor(reader), reader => f(reader, level), reader) - reader.flush() - success - } - - def bufferSnapshot(reader: ConsoleReader): (String, Int) = - { - val b = 
reader.getCursorBuffer - (b.buffer.toString, b.cursor) - } - def beforeCursor(reader: ConsoleReader): String = - { - val b = reader.getCursorBuffer - b.buffer.substring(0, b.cursor) - } - - // returns false if there was nothing to insert and nothing to display - def complete(beforeCursor: String, completions: String => (Seq[String], Seq[String]), reader: ConsoleReader): Boolean = - { - val (insert, display) = completions(beforeCursor) - val common = commonPrefix(insert) - if (common.isEmpty) - if (display.isEmpty) - () - else - showCompletions(display, reader) - else - appendCompletion(common, reader) - - !(common.isEmpty && display.isEmpty) - } - - def appendCompletion(common: String, reader: ConsoleReader): Unit = { - reader.getCursorBuffer.write(common) - reader.redrawLine() - } - - /** - * `display` is assumed to be the exact strings requested to be displayed. - * In particular, duplicates should have been removed already. - */ - def showCompletions(display: Seq[String], reader: ConsoleReader): Unit = { - printCompletions(display, reader) - reader.drawLine() - } - def printCompletions(cs: Seq[String], reader: ConsoleReader): Unit = { - val print = shouldPrint(cs, reader) - reader.println() - if (print) printLinesAndColumns(cs, reader) - } - def printLinesAndColumns(cs: Seq[String], reader: ConsoleReader): Unit = { - val (lines, columns) = cs partition hasNewline - for (line <- lines) { - reader.print(line) - if (line.charAt(line.length - 1) != '\n') - reader.println() - } - reader.printColumns(JavaConversions.seqAsJavaList(columns.map(_.trim))) - } - def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0 - def shouldPrint(cs: Seq[String], reader: ConsoleReader): Boolean = - { - val size = cs.size - (size <= reader.getAutoprintThreshold) || - confirm("Display all %d possibilities? 
(y or n) ".format(size), 'y', 'n', reader) - } - def confirm(prompt: String, trueC: Char, falseC: Char, reader: ConsoleReader): Boolean = - { - reader.println() - reader.print(prompt) - reader.flush() - reader.readCharacter(trueC, falseC) == trueC - } - - def commonPrefix(s: Seq[String]): String = if (s.isEmpty) "" else s reduceLeft commonPrefix - def commonPrefix(a: String, b: String): String = - { - val len = scala.math.min(a.length, b.length) - @tailrec def loop(i: Int): Int = if (i >= len) len else if (a(i) != b(i)) i else loop(i + 1) - a.substring(0, loop(0)) - } -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala deleted file mode 100644 index 003862c5e..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parser.scala +++ /dev/null @@ -1,823 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2010, 2011 Mark Harrah - */ -package sbt.internal.util -package complete - -import Parser._ -import sbt.internal.util.Types.{ left, right, some } -import sbt.internal.util.Util.{ makeList, separate } - -/** - * A String parser that provides semi-automatic tab completion. - * A successful parse results in a value of type `T`. - * The methods in this trait are what must be implemented to define a new Parser implementation, but are not typically useful for common usage. - * Instead, most useful methods for combining smaller parsers into larger parsers are implicitly added by the [[RichParser]] type. - */ -sealed trait Parser[+T] { - def derive(i: Char): Parser[T] - def resultEmpty: Result[T] - def result: Option[T] - def completions(level: Int): Completions - def failure: Option[Failure] - def isTokenStart = false - def ifValid[S](p: => Parser[S]): Parser[S] - def valid: Boolean -} -sealed trait RichParser[A] { - /** Apply the original Parser and then apply `next` (in order). The result of both is provides as a pair. 
*/ - def ~[B](next: Parser[B]): Parser[(A, B)] - - /** Apply the original Parser one or more times and provide the non-empty sequence of results.*/ - def + : Parser[Seq[A]] - - /** Apply the original Parser zero or more times and provide the (potentially empty) sequence of results.*/ - def * : Parser[Seq[A]] - - /** Apply the original Parser zero or one time, returning None if it was applied zero times or the result wrapped in Some if it was applied once.*/ - def ? : Parser[Option[A]] - - /** Apply either the original Parser or `b`.*/ - def |[B >: A](b: Parser[B]): Parser[B] - - /** Apply either the original Parser or `b`.*/ - def ||[B](b: Parser[B]): Parser[Either[A, B]] - - /** Apply the original Parser to the input and then apply `f` to the result.*/ - def map[B](f: A => B): Parser[B] - - /** - * Returns the original parser. This is useful for converting literals to Parsers. - * For example, `'c'.id` or `"asdf".id` - */ - def id: Parser[A] - - /** Apply the original Parser, but provide `value` as the result if it succeeds. */ - def ^^^[B](value: B): Parser[B] - - /** Apply the original Parser, but provide `alt` as the result if it fails.*/ - def ??[B >: A](alt: B): Parser[B] - - /** - * Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of `next`. - * (The arrow points in the direction of the retained result.) - */ - def <~[B](b: Parser[B]): Parser[A] - - /** - * Produces a Parser that applies the original Parser and then applies `next` (in order), discarding the result of the original parser. - * (The arrow points in the direction of the retained result.) - */ - def ~>[B](b: Parser[B]): Parser[B] - - /** Uses the specified message if the original Parser fails.*/ - def !!!(msg: String): Parser[A] - - /** - * If an exception is thrown by the original Parser, - * capture it and fail locally instead of allowing the exception to propagate up and terminate parsing. 
- */ - def failOnException: Parser[A] - - /** - * Apply the original parser, but only succeed if `o` also succeeds. - * Note that `o` does not need to consume the same amount of input to satisfy this condition. - */ - def &(o: Parser[_]): Parser[A] - - /** Explicitly defines the completions for the original Parser.*/ - def examples(s: String*): Parser[A] - - /** Explicitly defines the completions for the original Parser.*/ - def examples(s: Set[String], check: Boolean = false): Parser[A] - - /** - * @param exampleSource the source of examples when displaying completions to the user. - * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can - * prevent lengthy pauses and avoids bad interactive user experience. - * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the - * given parser). Invalid examples will be filtered out and only valid suggestions will - * be displayed. - * @return a new parser with a new source of completions. - */ - def examples(exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] - - /** - * @param exampleSource the source of examples when displaying completions to the user. - * @return a new parser with a new source of completions. It displays at most 25 completion examples and does not - * remove invalid examples. - */ - def examples(exampleSource: ExampleSource): Parser[A] = examples(exampleSource, maxNumberOfExamples = 25, removeInvalidExamples = false) - - /** Converts a Parser returning a Char sequence to a Parser returning a String.*/ - def string(implicit ev: A <:< Seq[Char]): Parser[String] - - /** - * Produces a Parser that filters the original parser. - * If 'f' is not true when applied to the output of the original parser, the Parser returned by this method fails. 
- * The failure message is constructed by applying `msg` to the String that was successfully parsed by the original parser. - */ - def filter(f: A => Boolean, msg: String => String): Parser[A] - - /** Applies the original parser, applies `f` to the result to get the next parser, and applies that parser and uses its result for the overall result. */ - def flatMap[B](f: A => Parser[B]): Parser[B] -} - -/** Contains Parser implementation helper methods not typically needed for using parsers. */ -object Parser extends ParserMain { - sealed abstract class Result[+T] { - def isFailure: Boolean - def isValid: Boolean - def errors: Seq[String] - def or[B >: T](b: => Result[B]): Result[B] - def either[B](b: => Result[B]): Result[Either[T, B]] - def map[B](f: T => B): Result[B] - def flatMap[B](f: T => Result[B]): Result[B] - def &&(b: => Result[_]): Result[T] - def filter(f: T => Boolean, msg: => String): Result[T] - def seq[B](b: => Result[B]): Result[(T, B)] = app(b)((m, n) => (m, n)) - def app[B, C](b: => Result[B])(f: (T, B) => C): Result[C] - def toEither: Either[() => Seq[String], T] - } - final case class Value[+T](value: T) extends Result[T] { - def isFailure = false - def isValid: Boolean = true - def errors = Nil - def app[B, C](b: => Result[B])(f: (T, B) => C): Result[C] = b match { - case fail: Failure => fail - case Value(bv) => Value(f(value, bv)) - } - def &&(b: => Result[_]): Result[T] = b match { case f: Failure => f; case _ => this } - def or[B >: T](b: => Result[B]): Result[B] = this - def either[B](b: => Result[B]): Result[Either[T, B]] = Value(Left(value)) - def map[B](f: T => B): Result[B] = Value(f(value)) - def flatMap[B](f: T => Result[B]): Result[B] = f(value) - def filter(f: T => Boolean, msg: => String): Result[T] = if (f(value)) this else mkFailure(msg) - def toEither = Right(value) - } - final class Failure private[sbt] (mkErrors: => Seq[String], val definitive: Boolean) extends Result[Nothing] { - lazy val errors: Seq[String] = mkErrors - def 
isFailure = true - def isValid = false - def map[B](f: Nothing => B) = this - def flatMap[B](f: Nothing => Result[B]) = this - def or[B](b: => Result[B]): Result[B] = b match { - case v: Value[B] => v - case f: Failure => if (definitive) this else this ++ f - } - def either[B](b: => Result[B]): Result[Either[Nothing, B]] = b match { - case Value(v) => Value(Right(v)) - case f: Failure => if (definitive) this else this ++ f - } - def filter(f: Nothing => Boolean, msg: => String) = this - def app[B, C](b: => Result[B])(f: (Nothing, B) => C): Result[C] = this - def &&(b: => Result[_]) = this - def toEither = Left(() => errors) - - private[sbt] def ++(f: Failure) = mkFailures(errors ++ f.errors) - } - def mkFailures(errors: => Seq[String], definitive: Boolean = false): Failure = new Failure(errors.distinct, definitive) - def mkFailure(error: => String, definitive: Boolean = false): Failure = new Failure(error :: Nil, definitive) - - def tuple[A, B](a: Option[A], b: Option[B]): Option[(A, B)] = - (a, b) match { case (Some(av), Some(bv)) => Some((av, bv)); case _ => None } - - def mapParser[A, B](a: Parser[A], f: A => B): Parser[B] = - a.ifValid { - a.result match { - case Some(av) => success(f(av)) - case None => new MapParser(a, f) - } - } - - def bindParser[A, B](a: Parser[A], f: A => Parser[B]): Parser[B] = - a.ifValid { - a.result match { - case Some(av) => f(av) - case None => new BindParser(a, f) - } - } - - def filterParser[T](a: Parser[T], f: T => Boolean, seen: String, msg: String => String): Parser[T] = - a.ifValid { - a.result match { - case Some(av) if f(av) => success(av) - case _ => new Filter(a, f, seen, msg) - } - } - - def seqParser[A, B](a: Parser[A], b: Parser[B]): Parser[(A, B)] = - a.ifValid { - b.ifValid { - (a.result, b.result) match { - case (Some(av), Some(bv)) => success((av, bv)) - case (Some(av), None) => b map { bv => (av, bv) } - case (None, Some(bv)) => a map { av => (av, bv) } - case (None, None) => new SeqParser(a, b) - } - } - } - - def 
choiceParser[A, B](a: Parser[A], b: Parser[B]): Parser[Either[A, B]] = - if (a.valid) - if (b.valid) new HetParser(a, b) else a.map(left.fn) - else - b.map(right.fn) - - def opt[T](a: Parser[T]): Parser[Option[T]] = - if (a.valid) new Optional(a) else success(None) - - def onFailure[T](delegate: Parser[T], msg: String): Parser[T] = - if (delegate.valid) new OnFailure(delegate, msg) else failure(msg) - def trapAndFail[T](delegate: Parser[T]): Parser[T] = - delegate.ifValid(new TrapAndFail(delegate)) - - def zeroOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 0, Infinite) - def oneOrMore[T](p: Parser[T]): Parser[Seq[T]] = repeat(p, 1, Infinite) - - def repeat[T](p: Parser[T], min: Int = 0, max: UpperBound = Infinite): Parser[Seq[T]] = - repeat(None, p, min, max, Nil) - private[complete] def repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, revAcc: List[T]): Parser[Seq[T]] = - { - assume(min >= 0, "Minimum must be greater than or equal to zero (was " + min + ")") - assume(max >= min, "Minimum must be less than or equal to maximum (min: " + min + ", max: " + max + ")") - - def checkRepeated(invalidButOptional: => Parser[Seq[T]]): Parser[Seq[T]] = - repeated match { - case i: Invalid if min == 0 => invalidButOptional - case i: Invalid => i - case _ => - repeated.result match { - case Some(value) => success(revAcc reverse_::: value :: Nil) // revAcc should be Nil here - case None => if (max.isZero) success(revAcc.reverse) else new Repeat(partial, repeated, min, max, revAcc) - } - } - - partial match { - case Some(part) => - part.ifValid { - part.result match { - case Some(value) => repeat(None, repeated, min, max, value :: revAcc) - case None => checkRepeated(part.map(lv => (lv :: revAcc).reverse)) - } - } - case None => checkRepeated(success(Nil)) - } - } - - def and[T](a: Parser[T], b: Parser[_]): Parser[T] = a.ifValid(b.ifValid(new And(a, b))) -} -trait ParserMain { - /** Provides combinators for Parsers.*/ - implicit def 
richParser[A](a: Parser[A]): RichParser[A] = new RichParser[A] { - def ~[B](b: Parser[B]) = seqParser(a, b) - def ||[B](b: Parser[B]) = choiceParser(a, b) - def |[B >: A](b: Parser[B]) = homParser(a, b) - def ? = opt(a) - def * = zeroOrMore(a) - def + = oneOrMore(a) - def map[B](f: A => B) = mapParser(a, f) - def id = a - - def ^^^[B](value: B): Parser[B] = a map { _ => value } - def ??[B >: A](alt: B): Parser[B] = a.? map { _ getOrElse alt } - def <~[B](b: Parser[B]): Parser[A] = (a ~ b) map { case av ~ _ => av } - def ~>[B](b: Parser[B]): Parser[B] = (a ~ b) map { case _ ~ bv => bv } - def !!!(msg: String): Parser[A] = onFailure(a, msg) - def failOnException: Parser[A] = trapAndFail(a) - - def unary_- = not(a, "Unexpected: " + a) - def &(o: Parser[_]) = and(a, o) - def -(o: Parser[_]) = and(a, not(o, "Unexpected: " + o)) - def examples(s: String*): Parser[A] = examples(s.toSet) - def examples(s: Set[String], check: Boolean = false): Parser[A] = examples(new FixedSetExamples(s), s.size, check) - def examples(s: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = Parser.examples(a, s, maxNumberOfExamples, removeInvalidExamples) - def filter(f: A => Boolean, msg: String => String): Parser[A] = filterParser(a, f, "", msg) - def string(implicit ev: A <:< Seq[Char]): Parser[String] = map(_.mkString) - def flatMap[B](f: A => Parser[B]) = bindParser(a, f) - } - - implicit def literalRichCharParser(c: Char): RichParser[Char] = richParser(c) - implicit def literalRichStringParser(s: String): RichParser[String] = richParser(s) - - /** - * Construct a parser that is valid, but has no valid result. This is used as a way - * to provide a definitive Failure when a parser doesn't match empty input. For example, - * in `softFailure(...) | p`, if `p` doesn't match the empty sequence, the failure will come - * from the Parser constructed by the `softFailure` method. 
- */ - private[sbt] def softFailure(msg: => String, definitive: Boolean = false): Parser[Nothing] = - SoftInvalid(mkFailures(msg :: Nil, definitive)) - - /** - * Defines a parser that always fails on any input with messages `msgs`. - * If `definitive` is `true`, any failures by later alternatives are discarded. - */ - def invalid(msgs: => Seq[String], definitive: Boolean = false): Parser[Nothing] = Invalid(mkFailures(msgs, definitive)) - - /** - * Defines a parser that always fails on any input with message `msg`. - * If `definitive` is `true`, any failures by later alternatives are discarded. - */ - def failure(msg: => String, definitive: Boolean = false): Parser[Nothing] = invalid(msg :: Nil, definitive) - - /** Defines a parser that always succeeds on empty input with the result `value`.*/ - def success[T](value: T): Parser[T] = new ValidParser[T] { - override def result = Some(value) - def resultEmpty = Value(value) - def derive(c: Char) = Parser.failure("Expected end of input.") - def completions(level: Int) = Completions.empty - override def toString = "success(" + value + ")" - } - - /** Presents a Char range as a Parser. A single Char is parsed only if it is in the given range.*/ - implicit def range(r: collection.immutable.NumericRange[Char]): Parser[Char] = - charClass(r contains _).examples(r.map(_.toString): _*) - - /** Defines a Parser that parses a single character only if it is contained in `legal`.*/ - def chars(legal: String): Parser[Char] = - { - val set = legal.toSet - charClass(set, "character in '" + legal + "'") examples (set.map(_.toString)) - } - - /** - * Defines a Parser that parses a single character only if the predicate `f` returns true for that character. - * If this parser fails, `label` is used as the failure message. - */ - def charClass(f: Char => Boolean, label: String = ""): Parser[Char] = new CharacterClass(f, label) - - /** Presents a single Char `ch` as a Parser that only parses that exact character. 
*/ - implicit def literal(ch: Char): Parser[Char] = new ValidParser[Char] { - def result = None - def resultEmpty = mkFailure("Expected '" + ch + "'") - def derive(c: Char) = if (c == ch) success(ch) else new Invalid(resultEmpty) - def completions(level: Int) = Completions.single(Completion.suggestion(ch.toString)) - override def toString = "'" + ch + "'" - } - /** Presents a literal String `s` as a Parser that only parses that exact text and provides it as the result.*/ - implicit def literal(s: String): Parser[String] = stringLiteral(s, 0) - - /** See [[unapply]]. */ - object ~ { - /** Convenience for destructuring a tuple that mirrors the `~` combinator.*/ - def unapply[A, B](t: (A, B)): Some[(A, B)] = Some(t) - } - - /** Parses input `str` using `parser`. If successful, the result is provided wrapped in `Right`. If unsuccessful, an error message is provided in `Left`.*/ - def parse[T](str: String, parser: Parser[T]): Either[String, T] = - Parser.result(parser, str).left.map { failures => - val (msgs, pos) = failures() - ProcessError(str, msgs, pos) - } - - /** - * Convenience method to use when developing a parser. - * `parser` is applied to the input `str`. - * If `completions` is true, the available completions for the input are displayed. - * Otherwise, the result of parsing is printed using the result's `toString` method. - * If parsing fails, the error message is displayed. - * - * See also [[sampleParse]] and [[sampleCompletions]]. - */ - def sample(str: String, parser: Parser[_], completions: Boolean = false): Unit = - if (completions) sampleCompletions(str, parser) else sampleParse(str, parser) - - /** - * Convenience method to use when developing a parser. - * `parser` is applied to the input `str` and the result of parsing is printed using the result's `toString` method. - * If parsing fails, the error message is displayed. 
- */ - def sampleParse(str: String, parser: Parser[_]): Unit = - parse(str, parser) match { - case Left(msg) => println(msg) - case Right(v) => println(v) - } - - /** - * Convenience method to use when developing a parser. - * `parser` is applied to the input `str` and the available completions are displayed on separate lines. - * If parsing fails, the error message is displayed. - */ - def sampleCompletions(str: String, parser: Parser[_], level: Int = 1): Unit = - Parser.completions(parser, str, level).get foreach println - - // intended to be temporary pending proper error feedback - def result[T](p: Parser[T], s: String): Either[() => (Seq[String], Int), T] = - { - def loop(i: Int, a: Parser[T]): Either[() => (Seq[String], Int), T] = - a match { - case Invalid(f) => Left(() => (f.errors, i)) - case _ => - val ci = i + 1 - if (ci >= s.length) - a.resultEmpty.toEither.left.map { msgs0 => () => - val msgs = msgs0() - val nonEmpty = if (msgs.isEmpty) "Unexpected end of input" :: Nil else msgs - (nonEmpty, ci) - } - else - loop(ci, a derive s(ci)) - } - loop(-1, p) - } - - /** Applies parser `p` to input `s`. */ - def apply[T](p: Parser[T])(s: String): Parser[T] = - (p /: s)(derive1) - - /** Applies parser `p` to a single character of input. */ - def derive1[T](p: Parser[T], c: Char): Parser[T] = - if (p.valid) p.derive(c) else p - - /** - * Applies parser `p` to input `s` and returns the completions at verbosity `level`. - * The interpretation of `level` is up to parser definitions, but 0 is the default by convention, - * with increasing positive numbers corresponding to increasing verbosity. Typically no more than - * a few levels are defined. 
- */ - def completions(p: Parser[_], s: String, level: Int): Completions = - // The x Completions.empty removes any trailing token completions where append.isEmpty - apply(p)(s).completions(level) x Completions.empty - - def examples[A](a: Parser[A], completions: Set[String], check: Boolean = false): Parser[A] = - examples(a, new FixedSetExamples(completions), completions.size, check) - - /** - * @param a the parser to decorate with a source of examples. All validation and parsing is delegated to this parser, - * only [[Parser.completions]] is modified. - * @param completions the source of examples when displaying completions to the user. - * @param maxNumberOfExamples limits the number of examples that the source of examples should return. This can - * prevent lengthy pauses and avoids bad interactive user experience. - * @param removeInvalidExamples indicates whether completion examples should be checked for validity (against the given parser). An - * exception is thrown if the example source contains no valid completion suggestions. - * @tparam A the type of values that are returned by the parser. - * @return - */ - def examples[A](a: Parser[A], completions: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean): Parser[A] = - if (a.valid) { - a.result match { - case Some(av) => success(av) - case None => - new ParserWithExamples(a, completions, maxNumberOfExamples, removeInvalidExamples) - } - } else a - - def matched(t: Parser[_], seen: Vector[Char] = Vector.empty, partial: Boolean = false): Parser[String] = - t match { - case i: Invalid => if (partial && seen.nonEmpty) success(seen.mkString) else i - case _ => - if (t.result.isEmpty) - new MatchedString(t, seen, partial) - else - success(seen.mkString) - } - - /** - * Establishes delegate parser `t` as a single token of tab completion. 
- * When tab completion of part of this token is requested, the completions provided by the delegate `t` or a later derivative are appended to - * the prefix String already seen by this parser. - */ - def token[T](t: Parser[T]): Parser[T] = token(t, TokenCompletions.default) - - /** - * Establishes delegate parser `t` as a single token of tab completion. - * When tab completion of part of this token is requested, no completions are returned if `hide` returns true for the current tab completion level. - * Otherwise, the completions provided by the delegate `t` or a later derivative are appended to the prefix String already seen by this parser. - */ - def token[T](t: Parser[T], hide: Int => Boolean): Parser[T] = token(t, TokenCompletions.default.hideWhen(hide)) - - /** - * Establishes delegate parser `t` as a single token of tab completion. - * When tab completion of part of this token is requested, `description` is displayed for suggestions and no completions are ever performed. - */ - def token[T](t: Parser[T], description: String): Parser[T] = token(t, TokenCompletions.displayOnly(description)) - - /** - * Establishes delegate parser `t` as a single token of tab completion. - * When tab completion of part of this token is requested, `display` is used as the printed suggestion, but the completions from the delegate - * parser `t` are used to complete if unambiguous. 
- */ - def tokenDisplay[T](t: Parser[T], display: String): Parser[T] = - token(t, TokenCompletions.overrideDisplay(display)) - - def token[T](t: Parser[T], complete: TokenCompletions): Parser[T] = - mkToken(t, "", complete) - - private[sbt] def mkToken[T](t: Parser[T], seen: String, complete: TokenCompletions): Parser[T] = - if (t.valid && !t.isTokenStart) - if (t.result.isEmpty) new TokenStart(t, seen, complete) else t - else - t - - def homParser[A](a: Parser[A], b: Parser[A]): Parser[A] = (a, b) match { - case (Invalid(af), Invalid(bf)) => Invalid(af ++ bf) - case (Invalid(_), bv) => bv - case (av, Invalid(_)) => av - case (av, bv) => new HomParser(a, b) - } - - def not(p: Parser[_], failMessage: String): Parser[Unit] = p.result match { - case None => new Not(p, failMessage) - case Some(_) => failure(failMessage) - } - - def oneOf[T](p: Seq[Parser[T]]): Parser[T] = p.reduceLeft(_ | _) - def seq[T](p: Seq[Parser[T]]): Parser[Seq[T]] = seq0(p, Nil) - def seq0[T](p: Seq[Parser[T]], errors: => Seq[String]): Parser[Seq[T]] = - { - val (newErrors, valid) = separate(p) { case Invalid(f) => Left(f.errors _); case ok => Right(ok) } - def combinedErrors = errors ++ newErrors.flatMap(_()) - if (valid.isEmpty) invalid(combinedErrors) else new ParserSeq(valid, combinedErrors) - } - - def stringLiteral(s: String, start: Int): Parser[String] = - { - val len = s.length - if (len == 0) sys.error("String literal cannot be empty") else if (start >= len) success(s) else new StringLiteral(s, start) - } -} -sealed trait ValidParser[T] extends Parser[T] { - final def valid = true - final def failure = None - final def ifValid[S](p: => Parser[S]): Parser[S] = p -} -private final case class Invalid(fail: Failure) extends Parser[Nothing] { - def failure = Some(fail) - def result = None - def resultEmpty = fail - def derive(c: Char) = sys.error("Invalid.") - def completions(level: Int) = Completions.nil - override def toString = fail.errors.mkString("; ") - def valid = false - def 
ifValid[S](p: => Parser[S]): Parser[S] = this -} - -private final case class SoftInvalid(fail: Failure) extends ValidParser[Nothing] { - def result = None - def resultEmpty = fail - def derive(c: Char) = Invalid(fail) - def completions(level: Int) = Completions.nil - override def toString = fail.errors.mkString("; ") -} - -private final class TrapAndFail[A](a: Parser[A]) extends ValidParser[A] { - def result = try { a.result } catch { case e: Exception => None } - def resultEmpty = try { a.resultEmpty } catch { case e: Exception => fail(e) } - def derive(c: Char) = try { trapAndFail(a derive c) } catch { case e: Exception => Invalid(fail(e)) } - def completions(level: Int) = try { a.completions(level) } catch { case e: Exception => Completions.nil } - override def toString = "trap(" + a + ")" - override def isTokenStart = a.isTokenStart - private[this] def fail(e: Exception): Failure = mkFailure(e.toString) -} - -private final class OnFailure[A](a: Parser[A], message: String) extends ValidParser[A] { - def result = a.result - def resultEmpty = a.resultEmpty match { case f: Failure => mkFailure(message); case v: Value[A] => v } - def derive(c: Char) = onFailure(a derive c, message) - def completions(level: Int) = a.completions(level) - override def toString = "(" + a + " !!! 
\"" + message + "\" )" - override def isTokenStart = a.isTokenStart -} -private final class SeqParser[A, B](a: Parser[A], b: Parser[B]) extends ValidParser[(A, B)] { - lazy val result = tuple(a.result, b.result) - lazy val resultEmpty = a.resultEmpty seq b.resultEmpty - def derive(c: Char) = - { - val common = a.derive(c) ~ b - a.resultEmpty match { - case Value(av) => common | b.derive(c).map(br => (av, br)) - case _: Failure => common - } - } - def completions(level: Int) = a.completions(level) x b.completions(level) - override def toString = "(" + a + " ~ " + b + ")" -} - -private final class HomParser[A](a: Parser[A], b: Parser[A]) extends ValidParser[A] { - lazy val result = tuple(a.result, b.result) map (_._1) - def derive(c: Char) = (a derive c) | (b derive c) - lazy val resultEmpty = a.resultEmpty or b.resultEmpty - def completions(level: Int) = a.completions(level) ++ b.completions(level) - override def toString = "(" + a + " | " + b + ")" -} -private final class HetParser[A, B](a: Parser[A], b: Parser[B]) extends ValidParser[Either[A, B]] { - lazy val result = tuple(a.result, b.result) map { case (a, b) => Left(a) } - def derive(c: Char) = (a derive c) || (b derive c) - lazy val resultEmpty = a.resultEmpty either b.resultEmpty - def completions(level: Int) = a.completions(level) ++ b.completions(level) - override def toString = "(" + a + " || " + b + ")" -} -private final class ParserSeq[T](a: Seq[Parser[T]], errors: => Seq[String]) extends ValidParser[Seq[T]] { - assert(a.nonEmpty) - lazy val resultEmpty: Result[Seq[T]] = - { - val res = a.map(_.resultEmpty) - val (failures, values) = separate(res)(_.toEither) - // if(failures.isEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) - if (values.nonEmpty) Value(values) else mkFailures(failures.flatMap(_()) ++ errors) - } - def result = { - val success = a.flatMap(_.result) - if (success.length == a.length) Some(success) else None - } - def completions(level: Int) = 
a.map(_.completions(level)).reduceLeft(_ ++ _) - def derive(c: Char) = seq0(a.map(_ derive c), errors) - override def toString = "seq(" + a + ")" -} - -private final class BindParser[A, B](a: Parser[A], f: A => Parser[B]) extends ValidParser[B] { - lazy val result = a.result flatMap { av => f(av).result } - lazy val resultEmpty = a.resultEmpty flatMap { av => f(av).resultEmpty } - def completions(level: Int) = - a.completions(level) flatMap { c => - apply(a)(c.append).resultEmpty match { - case _: Failure => Completions.strict(Set.empty + c) - case Value(av) => c x f(av).completions(level) - } - } - - def derive(c: Char) = - { - val common = a derive c flatMap f - a.resultEmpty match { - case Value(av) => common | derive1(f(av), c) - case _: Failure => common - } - } - override def isTokenStart = a.isTokenStart - override def toString = "bind(" + a + ")" -} -private final class MapParser[A, B](a: Parser[A], f: A => B) extends ValidParser[B] { - lazy val result = a.result map f - lazy val resultEmpty = a.resultEmpty map f - def derive(c: Char) = (a derive c) map f - def completions(level: Int) = a.completions(level) - override def isTokenStart = a.isTokenStart - override def toString = "map(" + a + ")" -} -private final class Filter[T](p: Parser[T], f: T => Boolean, seen: String, msg: String => String) extends ValidParser[T] { - def filterResult(r: Result[T]) = r.filter(f, msg(seen)) - lazy val result = p.result filter f - lazy val resultEmpty = filterResult(p.resultEmpty) - def derive(c: Char) = filterParser(p derive c, f, seen + c, msg) - def completions(level: Int) = p.completions(level) filterS { s => filterResult(apply(p)(s).resultEmpty).isValid } - override def toString = "filter(" + p + ")" - override def isTokenStart = p.isTokenStart -} -private final class MatchedString(delegate: Parser[_], seenV: Vector[Char], partial: Boolean) extends ValidParser[String] { - lazy val seen = seenV.mkString - def derive(c: Char) = matched(delegate derive c, seenV :+ c, 
partial) - def completions(level: Int) = delegate.completions(level) - def result = if (delegate.result.isDefined) Some(seen) else None - def resultEmpty = delegate.resultEmpty match { case f: Failure if !partial => f; case _ => Value(seen) } - override def isTokenStart = delegate.isTokenStart - override def toString = "matched(" + partial + ", " + seen + ", " + delegate + ")" -} -private final class TokenStart[T](delegate: Parser[T], seen: String, complete: TokenCompletions) extends ValidParser[T] { - def derive(c: Char) = mkToken(delegate derive c, seen + c, complete) - def completions(level: Int) = complete match { - case dc: TokenCompletions.Delegating => dc.completions(seen, level, delegate.completions(level)) - case fc: TokenCompletions.Fixed => fc.completions(seen, level) - } - def result = delegate.result - def resultEmpty = delegate.resultEmpty - override def isTokenStart = true - override def toString = "token('" + complete + ", " + delegate + ")" -} -private final class And[T](a: Parser[T], b: Parser[_]) extends ValidParser[T] { - lazy val result = tuple(a.result, b.result) map { _._1 } - def derive(c: Char) = (a derive c) & (b derive c) - def completions(level: Int) = a.completions(level).filterS(s => apply(b)(s).resultEmpty.isValid) - lazy val resultEmpty = a.resultEmpty && b.resultEmpty - override def toString = "(%s) && (%s)".format(a, b) -} - -private final class Not(delegate: Parser[_], failMessage: String) extends ValidParser[Unit] { - def derive(c: Char) = if (delegate.valid) not(delegate derive c, failMessage) else this - def completions(level: Int) = Completions.empty - def result = None - lazy val resultEmpty = delegate.resultEmpty match { - case f: Failure => Value(()) - case v: Value[_] => mkFailure(failMessage) - } - override def toString = " -(%s)".format(delegate) -} - -/** - * This class wraps an existing parser (the delegate), and replaces the delegate's completions with examples from - * the given example source. 
- * - * This class asks the example source for a limited number of examples (to prevent lengthy and expensive - * computations and large amounts of allocated data). It then passes these examples on to the UI. - * - * @param delegate the parser to decorate with completion examples (i.e., completion of user input). - * @param exampleSource the source from which this class will take examples (potentially filtering them with the delegate - * parser), and pass them to the UI. - * @param maxNumberOfExamples the maximum number of completions to read from the example source and pass to the UI. This - * limit prevents lengthy example generation and allocation of large amounts of memory. - * @param removeInvalidExamples indicates whether to remove examples that are deemed invalid by the delegate parser. - * @tparam T the type of value produced by the parser. - */ -private final class ParserWithExamples[T](delegate: Parser[T], exampleSource: ExampleSource, maxNumberOfExamples: Int, removeInvalidExamples: Boolean) extends ValidParser[T] { - def derive(c: Char) = - examples(delegate derive c, exampleSource.withAddedPrefix(c.toString), maxNumberOfExamples, removeInvalidExamples) - - def result = delegate.result - - lazy val resultEmpty = delegate.resultEmpty - - def completions(level: Int) = { - if (exampleSource().isEmpty) - if (resultEmpty.isValid) Completions.nil else Completions.empty - else { - val examplesBasedOnTheResult = filteredExamples.take(maxNumberOfExamples).toSet - Completions(examplesBasedOnTheResult.map(ex => Completion.suggestion(ex))) - } - } - - override def toString = "examples(" + delegate + ", " + exampleSource().take(2).toList + ")" - - private def filteredExamples: Iterable[String] = { - if (removeInvalidExamples) - exampleSource().filter(isExampleValid) - else - exampleSource() - } - - private def isExampleValid(example: String): Boolean = { - apply(delegate)(example).resultEmpty.isValid - } -} -private final class StringLiteral(str: String, start: Int) 
extends ValidParser[String] { - assert(0 <= start && start < str.length) - def failMsg = "Expected '" + str + "'" - def resultEmpty = mkFailure(failMsg) - def result = None - def derive(c: Char) = if (str.charAt(start) == c) stringLiteral(str, start + 1) else new Invalid(resultEmpty) - def completions(level: Int) = Completions.single(Completion.suggestion(str.substring(start))) - override def toString = '"' + str + '"' -} -private final class CharacterClass(f: Char => Boolean, label: String) extends ValidParser[Char] { - def result = None - def resultEmpty = mkFailure("Expected " + label) - def derive(c: Char) = if (f(c)) success(c) else Invalid(resultEmpty) - def completions(level: Int) = Completions.empty - override def toString = "class(" + label + ")" -} -private final class Optional[T](delegate: Parser[T]) extends ValidParser[Option[T]] { - def result = delegate.result map some.fn - def resultEmpty = Value(None) - def derive(c: Char) = (delegate derive c).map(some.fn) - def completions(level: Int) = Completion.empty +: delegate.completions(level) - override def toString = delegate.toString + "?" 
-} -private final class Repeat[T](partial: Option[Parser[T]], repeated: Parser[T], min: Int, max: UpperBound, accumulatedReverse: List[T]) extends ValidParser[Seq[T]] { - assume(0 <= min, "Minimum occurrences must be non-negative") - assume(max >= min, "Maximum occurrences must be at least the minimum occurrences") - - def derive(c: Char) = - partial match { - case Some(part) => - val partD = repeat(Some(part derive c), repeated, min, max, accumulatedReverse) - part.resultEmpty match { - case Value(pv) => partD | repeatDerive(c, pv :: accumulatedReverse) - case _: Failure => partD - } - case None => repeatDerive(c, accumulatedReverse) - } - - def repeatDerive(c: Char, accRev: List[T]): Parser[Seq[T]] = repeat(Some(repeated derive c), repeated, scala.math.max(0, min - 1), max.decrement, accRev) - - def completions(level: Int) = - { - def pow(comp: Completions, exp: Completions, n: Int): Completions = - if (n == 1) comp else pow(comp x exp, exp, n - 1) - - val repC = repeated.completions(level) - val fin = if (min == 0) Completion.empty +: repC else pow(repC, repC, min) - partial match { - case Some(p) => p.completions(level) x fin - case None => fin - } - } - def result = None - lazy val resultEmpty: Result[Seq[T]] = - { - val partialAccumulatedOption = - partial match { - case None => Value(accumulatedReverse) - case Some(partialPattern) => partialPattern.resultEmpty.map(_ :: accumulatedReverse) - } - (partialAccumulatedOption app repeatedParseEmpty)(_ reverse_::: _) - } - private def repeatedParseEmpty: Result[List[T]] = - { - if (min == 0) - Value(Nil) - else - // forced determinism - for (value <- repeated.resultEmpty) yield makeList(min, value) - } - override def toString = "repeat(" + min + "," + max + "," + partial + "," + repeated + ")" -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala deleted file mode 100644 index 9463d1acb..000000000 
--- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/Parsers.scala +++ /dev/null @@ -1,269 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2011 Mark Harrah - */ -package sbt.internal.util -package complete - -import Parser._ -import java.io.File -import java.net.URI -import java.lang.Character.{ getType, MATH_SYMBOL, OTHER_SYMBOL, DASH_PUNCTUATION, OTHER_PUNCTUATION, MODIFIER_SYMBOL, CURRENCY_SYMBOL } - -/** Provides standard implementations of commonly useful [[Parser]]s. */ -trait Parsers { - /** Matches the end of input, providing no useful result on success. */ - lazy val EOF = not(any, "Expected EOF") - - /** Parses any single character and provides that character as the result. */ - lazy val any: Parser[Char] = charClass(_ => true, "any character") - - /** Set that contains each digit in a String representation.*/ - lazy val DigitSet = Set("0", "1", "2", "3", "4", "5", "6", "7", "8", "9") - - /** Parses any single digit and provides that digit as a Char as the result.*/ - lazy val Digit = charClass(_.isDigit, "digit") examples DigitSet - - /** Set containing Chars for hexadecimal digits 0-9 and A-F (but not a-f). */ - lazy val HexDigitSet = Set('0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'A', 'B', 'C', 'D', 'E', 'F') - - /** Parses a single hexadecimal digit (0-9, a-f, A-F). */ - lazy val HexDigit = charClass(c => HexDigitSet(c.toUpper), "hex digit") examples HexDigitSet.map(_.toString) - - /** Parses a single letter, according to Char.isLetter, into a Char. */ - lazy val Letter = charClass(_.isLetter, "letter") - - /** Parses the first Char in an sbt identifier, which must be a [[Letter]].*/ - def IDStart = Letter - - /** Parses an identifier Char other than the first character. This includes letters, digits, dash `-`, and underscore `_`.*/ - lazy val IDChar = charClass(isIDChar, "ID character") - - /** Parses an identifier String, which must start with [[IDStart]] and contain zero or more [[IDChar]]s after that. 
*/ - lazy val ID = identifier(IDStart, IDChar) - - /** Parses a single operator Char, as allowed by [[isOpChar]]. */ - lazy val OpChar = charClass(isOpChar, "symbol") - - /** Parses a non-empty operator String, which consists only of characters allowed by [[OpChar]]. */ - lazy val Op = OpChar.+.string - - /** Parses either an operator String defined by [[Op]] or a non-symbolic identifier defined by [[ID]]. */ - lazy val OpOrID = ID | Op - - /** Parses a single, non-symbolic Scala identifier Char. Valid characters are letters, digits, and the underscore character `_`. */ - lazy val ScalaIDChar = charClass(isScalaIDChar, "Scala identifier character") - - /** Parses a non-symbolic Scala-like identifier. The identifier must start with [[IDStart]] and contain zero or more [[ScalaIDChar]]s after that.*/ - lazy val ScalaID = identifier(IDStart, ScalaIDChar) - - /** Parses a String that starts with `start` and is followed by zero or more characters parsed by `rep`.*/ - def identifier(start: Parser[Char], rep: Parser[Char]): Parser[String] = - start ~ rep.* map { case x ~ xs => (x +: xs).mkString } - - def opOrIDSpaced(s: String): Parser[Char] = - if (DefaultParsers.matches(ID, s)) - OpChar | SpaceClass - else if (DefaultParsers.matches(Op, s)) - IDChar | SpaceClass - else - any - - /** Returns true if `c` an operator character. */ - def isOpChar(c: Char) = !isDelimiter(c) && isOpType(getType(c)) - def isOpType(cat: Int) = cat match { case MATH_SYMBOL | OTHER_SYMBOL | DASH_PUNCTUATION | OTHER_PUNCTUATION | MODIFIER_SYMBOL | CURRENCY_SYMBOL => true; case _ => false } - /** Returns true if `c` is a dash `-`, a letter, digit, or an underscore `_`. */ - def isIDChar(c: Char) = isScalaIDChar(c) || c == '-' - - /** Returns true if `c` is a letter, digit, or an underscore `_`. */ - def isScalaIDChar(c: Char) = c.isLetterOrDigit || c == '_' - - def isDelimiter(c: Char) = c match { case '`' | '\'' | '\"' | /*';' | */ ',' | '.' 
=> true; case _ => false } - - /** Matches a single character that is not a whitespace character. */ - lazy val NotSpaceClass = charClass(!_.isWhitespace, "non-whitespace character") - - /** Matches a single whitespace character, as determined by Char.isWhitespace.*/ - lazy val SpaceClass = charClass(_.isWhitespace, "whitespace character") - - /** Matches a non-empty String consisting of non-whitespace characters. */ - lazy val NotSpace = NotSpaceClass.+.string - - /** Matches a possibly empty String consisting of non-whitespace characters. */ - lazy val OptNotSpace = NotSpaceClass.*.string - - /** - * Matches a non-empty String consisting of whitespace characters. - * The suggested tab completion is a single, constant space character. - */ - lazy val Space = SpaceClass.+.examples(" ") - - /** - * Matches a possibly empty String consisting of whitespace characters. - * The suggested tab completion is a single, constant space character. - */ - lazy val OptSpace = SpaceClass.*.examples(" ") - - /** Parses a non-empty String that contains only valid URI characters, as defined by [[URIChar]].*/ - lazy val URIClass = URIChar.+.string !!! "Invalid URI" - - /** Triple-quotes, as used for verbatim quoting.*/ - lazy val VerbatimDQuotes = "\"\"\"" - - /** Double quote character. */ - lazy val DQuoteChar = '\"' - - /** Backslash character. */ - lazy val BackslashChar = '\\' - - /** Matches a single double quote. */ - lazy val DQuoteClass = charClass(_ == DQuoteChar, "double-quote character") - - /** Matches any character except a double quote or whitespace. */ - lazy val NotDQuoteSpaceClass = - charClass({ c: Char => (c != DQuoteChar) && !c.isWhitespace }, "non-double-quote-space character") - - /** Matches any character except a double quote or backslash. */ - lazy val NotDQuoteBackslashClass = - charClass({ c: Char => (c != DQuoteChar) && (c != BackslashChar) }, "non-double-quote-backslash character") - - /** Matches a single character that is valid somewhere in a URI. 
*/ - lazy val URIChar = charClass(alphanum) | chars("_-!.~'()*,;:$&+=?/[]@%#") - - /** Returns true if `c` is an ASCII letter or digit. */ - def alphanum(c: Char) = ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z') || ('0' <= c && c <= '9') - - /** - * @param base the directory used for completion proposals (when the user presses the TAB key). Only paths under this - * directory will be proposed. - * @return the file that was parsed from the input string. The returned path may or may not exist. - */ - def fileParser(base: File): Parser[File] = - OptSpace ~> StringBasic - .examples(new FileExamples(base)) - .map(new File(_)) - - /** Parses a port number. Currently, this accepts any integer and presents a tab completion suggestion of ``. */ - lazy val Port = token(IntBasic, "") - - /** Parses a signed integer. */ - lazy val IntBasic = mapOrFail('-'.? ~ Digit.+)(Function.tupled(toInt)) - - /** Parses an unsigned integer. */ - lazy val NatBasic = mapOrFail(Digit.+)(_.mkString.toInt) - - private[this] def toInt(neg: Option[Char], digits: Seq[Char]): Int = - (neg.toSeq ++ digits).mkString.toInt - - /** Parses the lower-case values `true` and `false` into their respective Boolean values. */ - lazy val Bool = ("true" ^^^ true) | ("false" ^^^ false) - - /** - * Parses a potentially quoted String value. The value may be verbatim quoted ([[StringVerbatim]]), - * quoted with interpreted escapes ([[StringEscapable]]), or unquoted ([[NotQuoted]]). - */ - lazy val StringBasic = StringVerbatim | StringEscapable | NotQuoted - - /** - * Parses a verbatim quoted String value, discarding the quotes in the result. This kind of quoted text starts with triple quotes `"""` - * and ends at the next triple quotes and may contain any character in between. 
- */ - lazy val StringVerbatim: Parser[String] = VerbatimDQuotes ~> - any.+.string.filter(!_.contains(VerbatimDQuotes), _ => "Invalid verbatim string") <~ - VerbatimDQuotes - - /** - * Parses a string value, interpreting escapes and discarding the surrounding quotes in the result. - * See [[EscapeSequence]] for supported escapes. - */ - lazy val StringEscapable: Parser[String] = - (DQuoteChar ~> (NotDQuoteBackslashClass | EscapeSequence).+.string <~ DQuoteChar | - (DQuoteChar ~ DQuoteChar) ^^^ "") - - /** - * Parses a single escape sequence into the represented Char. - * Escapes start with a backslash and are followed by `u` for a [[UnicodeEscape]] or by `b`, `t`, `n`, `f`, `r`, `"`, `'`, `\` for standard escapes. - */ - lazy val EscapeSequence: Parser[Char] = - BackslashChar ~> ('b' ^^^ '\b' | 't' ^^^ '\t' | 'n' ^^^ '\n' | 'f' ^^^ '\f' | 'r' ^^^ '\r' | - '\"' ^^^ '\"' | '\'' ^^^ '\'' | '\\' ^^^ '\\' | UnicodeEscape) - - /** - * Parses a single unicode escape sequence into the represented Char. - * A unicode escape begins with a backslash, followed by a `u` and 4 hexadecimal digits representing the unicode value. - */ - lazy val UnicodeEscape: Parser[Char] = - ("u" ~> repeat(HexDigit, 4, 4)) map { seq => Integer.parseInt(seq.mkString, 16).toChar } - - /** Parses an unquoted, non-empty String value that cannot start with a double quote and cannot contain whitespace.*/ - lazy val NotQuoted = (NotDQuoteSpaceClass ~ OptNotSpace) map { case (c, s) => c.toString + s } - - /** - * Applies `rep` zero or more times, separated by `sep`. - * The result is the (possibly empty) sequence of results from the multiple `rep` applications. The `sep` results are discarded. - */ - def repsep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = - rep1sep(rep, sep) ?? Nil - - /** - * Applies `rep` one or more times, separated by `sep`. - * The result is the non-empty sequence of results from the multiple `rep` applications. The `sep` results are discarded. 
- */ - def rep1sep[T](rep: Parser[T], sep: Parser[_]): Parser[Seq[T]] = - (rep ~ (sep ~> rep).*).map { case (x ~ xs) => x +: xs } - - /** Wraps the result of `p` in `Some`.*/ - def some[T](p: Parser[T]): Parser[Option[T]] = p map { v => Some(v) } - - /** - * Applies `f` to the result of `p`, transforming any exception when evaluating - * `f` into a parse failure with the exception `toString` as the message. - */ - def mapOrFail[S, T](p: Parser[S])(f: S => T): Parser[T] = - p flatMap { s => try { success(f(s)) } catch { case e: Exception => failure(e.toString) } } - - /** - * Parses a space-delimited, possibly empty sequence of arguments. - * The arguments may use quotes and escapes according to [[StringBasic]]. - */ - def spaceDelimited(display: String): Parser[Seq[String]] = (token(Space) ~> token(StringBasic, display)).* <~ SpaceClass.* - - /** Applies `p` and uses `true` as the result if it succeeds and turns failure into a result of `false`. */ - def flag[T](p: Parser[T]): Parser[Boolean] = (p ^^^ true) ?? false - - /** - * Defines a sequence parser where the parser used for each part depends on the previously parsed values. - * `p` is applied to the (possibly empty) sequence of already parsed values to obtain the next parser to use. - * The parsers obtained in this way are separated by `sep`, whose result is discarded and only the sequence - * of values from the parsers returned by `p` is used for the result. - */ - def repeatDep[A](p: Seq[A] => Parser[A], sep: Parser[Any]): Parser[Seq[A]] = - { - def loop(acc: Seq[A]): Parser[Seq[A]] = { - val next = (sep ~> p(acc)) flatMap { result => loop(acc :+ result) } - next ?? acc - } - p(Vector()) flatMap { first => loop(Seq(first)) } - } - - /** Applies String.trim to the result of `p`. */ - def trimmed(p: Parser[String]) = p map { _.trim } - - /** Parses a URI that is valid according to the single argument java.net.URI constructor. 
*/ - lazy val basicUri = mapOrFail(URIClass)(uri => new URI(uri)) - - /** Parses a URI that is valid according to the single argument java.net.URI constructor, using `ex` as tab completion examples. */ - def Uri(ex: Set[URI]) = basicUri examples (ex.map(_.toString)) -} - -/** Provides standard [[Parser]] implementations. */ -object Parsers extends Parsers - -/** Provides common [[Parser]] implementations and helper methods.*/ -object DefaultParsers extends Parsers with ParserMain { - /** Applies parser `p` to input `s` and returns `true` if the parse was successful. */ - def matches(p: Parser[_], s: String): Boolean = - apply(p)(s).resultEmpty.isValid - - /** Returns `true` if `s` parses successfully according to [[ID]].*/ - def validID(s: String): Boolean = matches(ID, s) -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala deleted file mode 100644 index 6d74ed2d2..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/ProcessError.scala +++ /dev/null @@ -1,30 +0,0 @@ -package sbt.internal.util -package complete - -object ProcessError { - def apply(command: String, msgs: Seq[String], index: Int): String = - { - val (line, modIndex) = extractLine(command, index) - val point = pointerSpace(command, modIndex) - msgs.mkString("\n") + "\n" + line + "\n" + point + "^" - } - def extractLine(s: String, i: Int): (String, Int) = - { - val notNewline = (c: Char) => c != '\n' && c != '\r' - val left = takeRightWhile(s.substring(0, i))(notNewline) - val right = s substring i takeWhile notNewline - (left + right, left.length) - } - def takeRightWhile(s: String)(pred: Char => Boolean): String = - { - def loop(i: Int): String = - if (i < 0) - s - else if (pred(s(i))) - loop(i - 1) - else - s.substring(i + 1) - loop(s.length - 1) - } - def pointerSpace(s: String, i: Int): String = (s take i) map { case '\t' => '\t'; case _ 
=> ' ' } mkString "" -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala deleted file mode 100644 index 0d0b2980e..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/TokenCompletions.scala +++ /dev/null @@ -1,38 +0,0 @@ -package sbt.internal.util -package complete - -import Completion.{ token => ctoken, tokenDisplay } - -sealed trait TokenCompletions { - def hideWhen(f: Int => Boolean): TokenCompletions -} -object TokenCompletions { - private[sbt] abstract class Delegating extends TokenCompletions { outer => - def completions(seen: String, level: Int, delegate: Completions): Completions - final def hideWhen(hide: Int => Boolean): TokenCompletions = new Delegating { - def completions(seen: String, level: Int, delegate: Completions): Completions = - if (hide(level)) Completions.nil else outer.completions(seen, level, delegate) - } - } - private[sbt] abstract class Fixed extends TokenCompletions { outer => - def completions(seen: String, level: Int): Completions - final def hideWhen(hide: Int => Boolean): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = - if (hide(level)) Completions.nil else outer.completions(seen, level) - } - } - - val default: TokenCompletions = mapDelegateCompletions((seen, level, c) => ctoken(seen, c.append)) - - def displayOnly(msg: String): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = Completions.single(Completion.displayOnly(msg)) - } - def overrideDisplay(msg: String): TokenCompletions = mapDelegateCompletions((seen, level, c) => tokenDisplay(display = msg, append = c.append)) - - def fixed(f: (String, Int) => Completions): TokenCompletions = new Fixed { - def completions(seen: String, level: Int) = f(seen, level) - } - def mapDelegateCompletions(f: (String, Int, Completion) => Completion): TokenCompletions = new 
Delegating { - def completions(seen: String, level: Int, delegate: Completions) = Completions(delegate.get.map(c => f(seen, level, c))) - } -} diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala deleted file mode 100644 index e96dbad4f..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/TypeString.scala +++ /dev/null @@ -1,80 +0,0 @@ -package sbt.internal.util -package complete - -import DefaultParsers._ -import TypeString._ - -/** - * Basic representation of types parsed from Manifest.toString. - * This can only represent the structure of parameterized types. - * All other types are represented by a TypeString with an empty `args`. - */ -private[sbt] final class TypeString(val base: String, val args: List[TypeString]) { - override def toString = - if (base.startsWith(FunctionName)) - args.dropRight(1).mkString("(", ",", ")") + " => " + args.last - else if (base.startsWith(TupleName)) - args.mkString("(", ",", ")") - else - cleanupTypeName(base) + (if (args.isEmpty) "" else args.mkString("[", ",", "]")) -} - -private[sbt] object TypeString { - /** Makes the string representation of a type as returned by Manifest.toString more readable.*/ - def cleanup(typeString: String): String = - parse(typeString, typeStringParser) match { - case Right(ts) => ts.toString - case Left(err) => typeString - } - - /** - * Makes a fully qualified type name provided by Manifest.toString more readable. - * The argument should be just a name (like scala.Tuple2) and not a full type (like scala.Tuple2[Int,Boolean]) - */ - def cleanupTypeName(base: String): String = - dropPrefix(base).replace('$', '.') - - /** - * Removes prefixes from a fully qualified type name that are unnecessary in the presence of standard imports for an sbt setting. - * This does not use the compiler and is therefore a conservative approximation. 
- */ - def dropPrefix(base: String): String = - if (base.startsWith(SbtPrefix)) - base.substring(SbtPrefix.length) - else if (base.startsWith(CollectionPrefix)) { - val simple = base.substring(CollectionPrefix.length) - if (ShortenCollection(simple)) simple else base - } else if (base.startsWith(ScalaPrefix)) - base.substring(ScalaPrefix.length) - else if (base.startsWith(JavaPrefix)) - base.substring(JavaPrefix.length) - else - TypeMap.getOrElse(base, base) - - final val CollectionPrefix = "scala.collection." - final val FunctionName = "scala.Function" - final val TupleName = "scala.Tuple" - final val SbtPrefix = "sbt." - final val ScalaPrefix = "scala." - final val JavaPrefix = "java.lang." - /* scala.collection.X -> X */ - val ShortenCollection = Set("Seq", "List", "Set", "Map", "Iterable") - val TypeMap = Map( - "java.io.File" -> "File", - "java.net.URL" -> "URL", - "java.net.URI" -> "URI" - ) - - /** - * A Parser that extracts basic structure from the string representation of a type from Manifest.toString. - * This is rudimentary and essentially only decomposes the string into names and arguments for parameterized types. - */ - lazy val typeStringParser: Parser[TypeString] = - { - def isFullScalaIDChar(c: Char) = isScalaIDChar(c) || c == '.' || c == '$' - lazy val fullScalaID = identifier(IDStart, charClass(isFullScalaIDChar, "Scala identifier character")) - lazy val tpe: Parser[TypeString] = - for (id <- fullScalaID; args <- ('[' ~> rep1sep(tpe, ',') <~ ']').?) 
yield new TypeString(id, args.toList.flatten) - tpe - } -} \ No newline at end of file diff --git a/internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala b/internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala deleted file mode 100644 index 6b600f9ed..000000000 --- a/internal/util-complete/src/main/scala/sbt/internal/util/complete/UpperBound.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008,2010 Mark Harrah - */ -package sbt.internal.util -package complete - -sealed trait UpperBound { - /** True if and only if the given value meets this bound.*/ - def >=(min: Int): Boolean - /** True if and only if this bound is one.*/ - def isOne: Boolean - /** True if and only if this bound is zero.*/ - def isZero: Boolean - /** - * If this bound is zero or Infinite, `decrement` returns this bound. - * Otherwise, this bound is finite and greater than zero and `decrement` returns the bound that is one less than this bound. - */ - def decrement: UpperBound - /** True if and only if this is unbounded.*/ - def isInfinite: Boolean -} -/** Represents unbounded. */ -case object Infinite extends UpperBound { - /** All finite numbers meet this bound. */ - def >=(min: Int) = true - def isOne = false - def isZero = false - def decrement = this - def isInfinite = true - override def toString = "Infinity" -} -/** - * Represents a finite upper bound. The maximum allowed value is 'value', inclusive. - * It must be non-negative. 
- */ -final case class Finite(value: Int) extends UpperBound { - assume(value >= 0, "Maximum occurrences must be non-negative.") - - def >=(min: Int) = value >= min - def isOne = value == 1 - def isZero = value == 0 - def decrement = Finite(scala.math.max(0, value - 1)) - def isInfinite = false - override def toString = value.toString -} -object UpperBound { - implicit def intToFinite(i: Int): Finite = Finite(i) -} diff --git a/internal/util-complete/src/test/scala/ParserTest.scala b/internal/util-complete/src/test/scala/ParserTest.scala deleted file mode 100644 index 1db99b513..000000000 --- a/internal/util-complete/src/test/scala/ParserTest.scala +++ /dev/null @@ -1,149 +0,0 @@ -package sbt.internal.util -package complete - -object JLineTest { - import DefaultParsers._ - - val one = "blue" | "green" | "black" - val two = token("color" ~> Space) ~> token(one) - val three = token("color" ~> Space) ~> token(ID.examples("blue", "green", "black")) - val four = token("color" ~> Space) ~> token(ID, "") - - val num = token(NatBasic) - val five = (num ~ token("+" | "-") ~ num) <~ token('=') flatMap { - case a ~ "+" ~ b => token((a + b).toString) - case a ~ "-" ~ b => token((a - b).toString) - } - - val parsers = Map("1" -> one, "2" -> two, "3" -> three, "4" -> four, "5" -> five) - def main(args: Array[String]): Unit = { - import jline.TerminalFactory - import jline.console.ConsoleReader - val reader = new ConsoleReader() - TerminalFactory.get.init - - val parser = parsers(args(0)) - JLineCompletion.installCustomCompletor(reader, parser) - def loop(): Unit = { - val line = reader.readLine("> ") - if (line ne null) { - println("Result: " + apply(parser)(line).resultEmpty) - loop() - } - } - loop() - } -} - -import Parser._ -import org.scalacheck._ - -object ParserTest extends Properties("Completing Parser") { - import Parsers._ - import DefaultParsers.matches - - val nested = (token("a1") ~ token("b2")) ~ "c3" - val nestedDisplay = (token("a1", "") ~ token("b2", "")) ~ "c3" - 
- val spacePort = token(Space) ~> Port - - def p[T](f: T): T = { println(f); f } - - def checkSingle(in: String, expect: Completion)(expectDisplay: Completion = expect) = - (("token '" + in + "'") |: checkOne(in, nested, expect)) && - (("display '" + in + "'") |: checkOne(in, nestedDisplay, expectDisplay)) - - def checkOne(in: String, parser: Parser[_], expect: Completion): Prop = - completions(parser, in, 1) == Completions.single(expect) - - def checkAll(in: String, parser: Parser[_], expect: Completions): Prop = - { - val cs = completions(parser, in, 1) - ("completions: " + cs) |: ("Expected: " + expect) |: (cs == expect: Prop) - } - - def checkInvalid(in: String) = - (("token '" + in + "'") |: checkInv(in, nested)) && - (("display '" + in + "'") |: checkInv(in, nestedDisplay)) - - def checkInv(in: String, parser: Parser[_]): Prop = - { - val cs = completions(parser, in, 1) - ("completions: " + cs) |: (cs == Completions.nil: Prop) - } - - property("nested tokens a") = checkSingle("", Completion.token("", "a1"))(Completion.displayOnly("")) - property("nested tokens a1") = checkSingle("a", Completion.token("a", "1"))(Completion.displayOnly("")) - property("nested tokens a inv") = checkInvalid("b") - property("nested tokens b") = checkSingle("a1", Completion.token("", "b2"))(Completion.displayOnly("")) - property("nested tokens b2") = checkSingle("a1b", Completion.token("b", "2"))(Completion.displayOnly("")) - property("nested tokens b inv") = checkInvalid("a1a") - property("nested tokens c") = checkSingle("a1b2", Completion.suggestion("c3"))() - property("nested tokens c3") = checkSingle("a1b2c", Completion.suggestion("3"))() - property("nested tokens c inv") = checkInvalid("a1b2a") - - property("suggest space") = checkOne("", spacePort, Completion.token("", " ")) - property("suggest port") = checkOne(" ", spacePort, Completion.displayOnly("")) - property("no suggest at end") = checkOne("asdf", "asdf", Completion.suggestion("")) - property("no suggest at token 
end") = checkOne("asdf", token("asdf"), Completion.suggestion("")) - property("empty suggest for examples") = checkOne("asdf", any.+.examples("asdf", "qwer"), Completion.suggestion("")) - property("empty suggest for examples token") = checkOne("asdf", token(any.+.examples("asdf", "qwer")), Completion.suggestion("")) - - val colors = Set("blue", "green", "red") - val base = (seen: Seq[String]) => token(ID examples (colors -- seen)) - val sep = token(Space) - val repeat = repeatDep(base, sep) - def completionStrings(ss: Set[String]): Completions = Completions(ss.map { s => Completion.token("", s) }) - - property("repeatDep no suggestions for bad input") = checkInv(".", repeat) - property("repeatDep suggest all") = checkAll("", repeat, completionStrings(colors)) - property("repeatDep suggest remaining two") = { - val first = colors.toSeq.head - checkAll(first + " ", repeat, completionStrings(colors - first)) - } - property("repeatDep suggest remaining one") = { - val take = colors.toSeq.take(2) - checkAll(take.mkString("", " ", " "), repeat, completionStrings(colors -- take)) - } - property("repeatDep requires at least one token") = !matches(repeat, "") - property("repeatDep accepts one token") = matches(repeat, colors.toSeq.head) - property("repeatDep accepts two tokens") = matches(repeat, colors.toSeq.take(2).mkString(" ")) -} -object ParserExample { - val ws = charClass(_.isWhitespace).+ - val notws = charClass(!_.isWhitespace).+ - - val name = token("test") - val options = (ws ~> token("quick" | "failed" | "new")).* - val exampleSet = Set("am", "is", "are", "was", "were") - val include = (ws ~> token(examples(notws.string, new FixedSetExamples(exampleSet), exampleSet.size, false))).* - - val t = name ~ options ~ include - - // Get completions for some different inputs - println(completions(t, "te", 1)) - println(completions(t, "test ", 1)) - println(completions(t, "test w", 1)) - - // Get the parsed result for different inputs - println(apply(t)("te").resultEmpty) 
- println(apply(t)("test").resultEmpty) - println(apply(t)("test w").resultEmpty) - println(apply(t)("test was were").resultEmpty) - - def run(n: Int): Unit = { - val a = 'a'.id - val aq = a.? - val aqn = repeat(aq, min = n, max = n) - val an = repeat(a, min = n, max = n) - val ann = aqn ~ an - - def r = apply(ann)("a" * (n * 2)).resultEmpty - println(r.isValid) - } - def run2(n: Int): Unit = { - val ab = "ab".?.* - val r = apply(ab)("a" * n).resultEmpty - println(r) - } -} diff --git a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala deleted file mode 100644 index 9cb416840..000000000 --- a/internal/util-complete/src/test/scala/sbt/complete/FileExamplesTest.scala +++ /dev/null @@ -1,96 +0,0 @@ -package sbt.internal.util -package complete - -import java.io.File -import sbt.io.IO._ - -class FileExamplesTest extends UnitSpec { - - "listing all files in an absolute base directory" should - "produce the entire base directory's contents" in { - val _ = new DirectoryStructure { - fileExamples().toList should contain theSameElementsAs (allRelativizedPaths) - } - } - - "listing files with a prefix that matches none" should - "produce an empty list" in { - val _ = new DirectoryStructure(withCompletionPrefix = "z") { - fileExamples().toList shouldBe empty - } - } - - "listing single-character prefixed files" should - "produce matching paths only" in { - val _ = new DirectoryStructure(withCompletionPrefix = "f") { - fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly) - } - } - - "listing directory-prefixed files" should - "produce matching paths only" in { - val _ = new DirectoryStructure(withCompletionPrefix = "far") { - fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly) - } - } - - it should "produce sub-dir contents only when appending a file separator to the directory" in { - val _ = new DirectoryStructure(withCompletionPrefix 
= "far" + File.separator) { - fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly) - } - } - - "listing files with a sub-path prefix" should - "produce matching paths only" in { - val _ = new DirectoryStructure(withCompletionPrefix = "far" + File.separator + "ba") { - fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly) - } - } - - "completing a full path" should - "produce a list with an empty string" in { - val _ = new DirectoryStructure(withCompletionPrefix = "bazaar") { - fileExamples().toList shouldEqual List("") - } - } - - // TODO: Remove DelayedInit - https://github.com/scala/scala/releases/tag/v2.11.0-RC1 - class DirectoryStructure(withCompletionPrefix: String = "") extends DelayedInit { - var fileExamples: FileExamples = _ - var baseDir: File = _ - var childFiles: List[File] = _ - var childDirectories: List[File] = _ - var nestedFiles: List[File] = _ - var nestedDirectories: List[File] = _ - - def allRelativizedPaths: List[String] = - (childFiles ++ childDirectories ++ nestedFiles ++ nestedDirectories).map(relativize(baseDir, _).get) - - def prefixedPathsOnly: List[String] = - allRelativizedPaths.filter(_ startsWith withCompletionPrefix).map(_ substring withCompletionPrefix.length) - - override def delayedInit(testBody: => Unit): Unit = { - withTemporaryDirectory { - tempDir => - createSampleDirStructure(tempDir) - fileExamples = new FileExamples(baseDir, withCompletionPrefix) - testBody - } - } - - private def createSampleDirStructure(tempDir: File): Unit = { - childFiles = toChildFiles(tempDir, List("foo", "bar", "bazaar")) - childDirectories = toChildFiles(tempDir, List("moo", "far")) - nestedFiles = toChildFiles(childDirectories(1), List("farfile1", "barfile2")) - nestedDirectories = toChildFiles(childDirectories(1), List("fardir1", "bardir2")) - - (childDirectories ++ nestedDirectories).map(_.mkdirs()) - (childFiles ++ nestedFiles).map(_.createNewFile()) - - baseDir = tempDir - } - - private def 
toChildFiles(baseDir: File, files: List[String]): List[File] = files.map(new File(baseDir, _)) - } - -} diff --git a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala deleted file mode 100644 index b043497db..000000000 --- a/internal/util-complete/src/test/scala/sbt/complete/FixedSetExamplesTest.scala +++ /dev/null @@ -1,24 +0,0 @@ -package sbt.internal.util -package complete - -class FixedSetExamplesTest extends UnitSpec { - - "adding a prefix" should "produce a smaller set of examples with the prefix removed" in { - val _ = new Examples { - fixedSetExamples.withAddedPrefix("f")() should contain theSameElementsAs (List("oo", "ool", "u")) - fixedSetExamples.withAddedPrefix("fo")() should contain theSameElementsAs (List("o", "ol")) - fixedSetExamples.withAddedPrefix("b")() should contain theSameElementsAs (List("ar")) - } - } - - "without a prefix" should "produce the original set" in { - val _ = new Examples { - fixedSetExamples() shouldBe exampleSet - } - } - - trait Examples { - val exampleSet = List("foo", "bar", "fool", "fu") - val fixedSetExamples = FixedSetExamples(exampleSet) - } -} diff --git a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala b/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala deleted file mode 100644 index 17891be4f..000000000 --- a/internal/util-complete/src/test/scala/sbt/complete/ParserWithExamplesTest.scala +++ /dev/null @@ -1,99 +0,0 @@ -package sbt.internal.util -package complete - -import Completion._ - -class ParserWithExamplesTest extends UnitSpec { - - "listing a limited number of completions" should - "grab only the needed number of elements from the iterable source of examples" in { - val _ = new ParserWithLazyExamples { - parserWithExamples.completions(0) - examples.size shouldEqual maxNumberOfExamples - } - } - - "listing only valid completions" should - "use 
the delegate parser to remove invalid examples" in { - val _ = new ParserWithValidExamples { - val validCompletions = Completions(Set( - suggestion("blue"), - suggestion("red") - )) - parserWithExamples.completions(0) shouldEqual validCompletions - } - } - - "listing valid completions in a derived parser" should - "produce only valid examples that start with the character of the derivation" in { - val _ = new ParserWithValidExamples { - val derivedCompletions = Completions(Set( - suggestion("lue") - )) - parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions - } - } - - "listing valid and invalid completions" should - "produce the entire source of examples" in { - val _ = new parserWithAllExamples { - val completions = Completions(examples.map(suggestion(_)).toSet) - parserWithExamples.completions(0) shouldEqual completions - } - } - - "listing valid and invalid completions in a derived parser" should - "produce only examples that start with the character of the derivation" in { - val _ = new parserWithAllExamples { - val derivedCompletions = Completions(Set( - suggestion("lue"), - suggestion("lock") - )) - parserWithExamples.derive('b').completions(0) shouldEqual derivedCompletions - } - } - - class ParserWithLazyExamples extends ParserExample(GrowableSourceOfExamples(), maxNumberOfExamples = 5, removeInvalidExamples = false) - - class ParserWithValidExamples extends ParserExample(removeInvalidExamples = true) - - class parserWithAllExamples extends ParserExample(removeInvalidExamples = false) - - case class ParserExample( - examples: Iterable[String] = Set("blue", "yellow", "greeen", "block", "red"), - maxNumberOfExamples: Int = 25, - removeInvalidExamples: Boolean - ) { - - import DefaultParsers._ - - val colorParser = "blue" | "green" | "black" | "red" - val parserWithExamples: Parser[String] = new ParserWithExamples[String]( - colorParser, - FixedSetExamples(examples), - maxNumberOfExamples, - removeInvalidExamples - ) - } - - case class 
GrowableSourceOfExamples() extends Iterable[String] { - private var numberOfIteratedElements: Int = 0 - - override def iterator: Iterator[String] = { - new Iterator[String] { - var currentElement = 0 - - override def next(): String = { - currentElement += 1 - numberOfIteratedElements = Math.max(currentElement, numberOfIteratedElements) - numberOfIteratedElements.toString - } - - override def hasNext: Boolean = true - } - } - - override def size: Int = numberOfIteratedElements - } - -} diff --git a/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala b/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala deleted file mode 100644 index 0e15fadf2..000000000 --- a/internal/util-logic/src/main/scala/sbt/internal/util/logic/Logic.scala +++ /dev/null @@ -1,336 +0,0 @@ -package sbt.internal.util -package logic - -import scala.annotation.tailrec -import Formula.{ And, True } - -/* -Defines a propositional logic with negation as failure and only allows stratified rule sets (negation must be acyclic) in order to have a unique minimal model. - -For example, this is not allowed: - + p :- not q - + q :- not p -but this is: - + p :- q - + q :- p -as is this: - + p :- q - + q := not r - - - Some useful links: - + https://en.wikipedia.org/wiki/Nonmonotonic_logic - + https://en.wikipedia.org/wiki/Negation_as_failure - + https://en.wikipedia.org/wiki/Propositional_logic - + https://en.wikipedia.org/wiki/Stable_model_semantics - + http://www.w3.org/2005/rules/wg/wiki/negation -*/ - -/** Disjunction (or) of the list of clauses. */ -final case class Clauses(clauses: List[Clause]) { - assert(clauses.nonEmpty, "At least one clause is required.") -} - -/** When the `body` Formula succeeds, atoms in `head` are true. */ -final case class Clause(body: Formula, head: Set[Atom]) - -/** A literal is an [[Atom]] or its [[negation|Negated]]. */ -sealed abstract class Literal extends Formula { - /** The underlying (positive) atom. 
*/ - def atom: Atom - /** Negates this literal.*/ - def unary_! : Literal -} -/** A variable with name `label`. */ -final case class Atom(label: String) extends Literal { - def atom = this - def unary_! : Negated = Negated(this) -} -/** - * A negated atom, in the sense of negation as failure, not logical negation. - * That is, it is true if `atom` is not known/defined. - */ -final case class Negated(atom: Atom) extends Literal { - def unary_! : Atom = atom -} - -/** - * A formula consists of variables, negation, and conjunction (and). - * (Disjunction is not currently included- it is modeled at the level of a sequence of clauses. - * This is less convenient when defining clauses, but is not less powerful.) - */ -sealed abstract class Formula { - /** Constructs a clause that proves `atoms` when this formula is true. */ - def proves(atom: Atom, atoms: Atom*): Clause = Clause(this, (atom +: atoms).toSet) - - /** Constructs a formula that is true iff this formula and `f` are both true.*/ - def &&(f: Formula): Formula = (this, f) match { - case (True, x) => x - case (x, True) => x - case (And(as), And(bs)) => And(as ++ bs) - case (And(as), b: Literal) => And(as + b) - case (a: Literal, And(bs)) => And(bs + a) - case (a: Literal, b: Literal) => And(Set(a, b)) - } -} - -object Formula { - /** A conjunction of literals. */ - final case class And(literals: Set[Literal]) extends Formula { - assert(literals.nonEmpty, "'And' requires at least one literal.") - } - final case object True extends Formula -} - -object Logic { - def reduceAll(clauses: List[Clause], initialFacts: Set[Literal]): Either[LogicException, Matched] = - reduce(Clauses(clauses), initialFacts) - - /** - * Computes the variables in the unique stable model for the program represented by `clauses` and `initialFacts`. - * `clause` may not have any negative feedback (that is, negation is acyclic) - * and `initialFacts` cannot be in the head of any clauses in `clause`. 
- * These restrictions ensure that the logic program has a unique minimal model. - */ - def reduce(clauses: Clauses, initialFacts: Set[Literal]): Either[LogicException, Matched] = - { - val (posSeq, negSeq) = separate(initialFacts.toSeq) - val (pos, neg) = (posSeq.toSet, negSeq.toSet) - - val problem = - checkContradictions(pos, neg) orElse - checkOverlap(clauses, pos) orElse - checkAcyclic(clauses) - - problem.toLeft( - reduce0(clauses, initialFacts, Matched.empty) - ) - } - - /** - * Verifies `initialFacts` are not in the head of any `clauses`. - * This avoids the situation where an atom is proved but no clauses prove it. - * This isn't necessarily a problem, but the main sbt use cases expects - * a proven atom to have at least one clause satisfied. - */ - private[this] def checkOverlap(clauses: Clauses, initialFacts: Set[Atom]): Option[InitialOverlap] = { - val as = atoms(clauses) - val initialOverlap = initialFacts.filter(as.inHead) - if (initialOverlap.nonEmpty) Some(new InitialOverlap(initialOverlap)) else None - } - - private[this] def checkContradictions(pos: Set[Atom], neg: Set[Atom]): Option[InitialContradictions] = { - val contradictions = pos intersect neg - if (contradictions.nonEmpty) Some(new InitialContradictions(contradictions)) else None - } - - private[this] def checkAcyclic(clauses: Clauses): Option[CyclicNegation] = { - val deps = dependencyMap(clauses) - val cycle = Dag.findNegativeCycle(graph(deps)) - if (cycle.nonEmpty) Some(new CyclicNegation(cycle)) else None - } - private[this] def graph(deps: Map[Atom, Set[Literal]]) = new Dag.DirectedSignedGraph[Atom] { - type Arrow = Literal - def nodes = deps.keys.toList - def dependencies(a: Atom) = deps.getOrElse(a, Set.empty).toList - def isNegative(b: Literal) = b match { - case Negated(_) => true - case Atom(_) => false - } - def head(b: Literal) = b.atom - } - - private[this] def dependencyMap(clauses: Clauses): Map[Atom, Set[Literal]] = - (Map.empty[Atom, Set[Literal]] /: clauses.clauses) { - 
case (m, Clause(formula, heads)) => - val deps = literals(formula) - (m /: heads) { (n, head) => n.updated(head, n.getOrElse(head, Set.empty) ++ deps) } - } - - sealed abstract class LogicException(override val toString: String) - final class InitialContradictions(val literals: Set[Atom]) extends LogicException("Initial facts cannot be both true and false:\n\t" + literals.mkString("\n\t")) - final class InitialOverlap(val literals: Set[Atom]) extends LogicException("Initial positive facts cannot be implied by any clauses:\n\t" + literals.mkString("\n\t")) - final class CyclicNegation(val cycle: List[Literal]) extends LogicException("Negation may not be involved in a cycle:\n\t" + cycle.mkString("\n\t")) - - /** Tracks proven atoms in the reverse order they were proved. */ - final class Matched private (val provenSet: Set[Atom], reverseOrdered: List[Atom]) { - def add(atoms: Set[Atom]): Matched = add(atoms.toList) - def add(atoms: List[Atom]): Matched = { - val newOnly = atoms.filterNot(provenSet) - new Matched(provenSet ++ newOnly, newOnly ::: reverseOrdered) - } - def ordered: List[Atom] = reverseOrdered.reverse - override def toString = ordered.map(_.label).mkString("Matched(", ",", ")") - } - object Matched { - val empty = new Matched(Set.empty, Nil) - } - - /** Separates a sequence of literals into `(pos, neg)` atom sequences. */ - private[this] def separate(lits: Seq[Literal]): (Seq[Atom], Seq[Atom]) = Util.separate(lits) { - case a: Atom => Left(a) - case Negated(n) => Right(n) - } - - /** - * Finds clauses that have no body and thus prove their head. - * Returns `(proven atoms, remaining unproven clauses)`.
- */ - private[this] def findProven(c: Clauses): (Set[Atom], List[Clause]) = - { - val (proven, unproven) = c.clauses.partition(_.body == True) - (proven.flatMap(_.head).toSet, unproven) - } - private[this] def keepPositive(lits: Set[Literal]): Set[Atom] = - lits.collect { case a: Atom => a }.toSet - - // precondition: factsToProcess contains no contradictions - @tailrec - private[this] def reduce0(clauses: Clauses, factsToProcess: Set[Literal], state: Matched): Matched = - applyAll(clauses, factsToProcess) match { - case None => // all of the remaining clauses failed on the new facts - state - case Some(applied) => - val (proven, unprovenClauses) = findProven(applied) - val processedFacts = state add keepPositive(factsToProcess) - val newlyProven = proven -- processedFacts.provenSet - val newState = processedFacts add newlyProven - if (unprovenClauses.isEmpty) - newState // no remaining clauses, done. - else { - val unproven = Clauses(unprovenClauses) - val nextFacts: Set[Literal] = if (newlyProven.nonEmpty) newlyProven.toSet else inferFailure(unproven) - reduce0(unproven, nextFacts, newState) - } - } - - /** - * Finds negated atoms under the negation as failure rule and returns them. - * This should be called only after there are no more known atoms to be substituted. - */ - private[this] def inferFailure(clauses: Clauses): Set[Literal] = - { - /* At this point, there is at least one clause and one of the following is the case as the result of the acyclic negation rule: - i. there is at least one variable that occurs in a clause body but not in the head of a clause - ii. there is at least one variable that occurs in the head of a clause and does not transitively depend on a negated variable - In either case, each such variable x cannot be proven true and therefore proves 'not x' (negation as failure, !x in the code). 
- */ - val allAtoms = atoms(clauses) - val newFacts: Set[Literal] = negated(allAtoms.triviallyFalse) - if (newFacts.nonEmpty) - newFacts - else { - val possiblyTrue = hasNegatedDependency(clauses.clauses, Relation.empty, Relation.empty) - val newlyFalse: Set[Literal] = negated(allAtoms.inHead -- possiblyTrue) - if (newlyFalse.nonEmpty) - newlyFalse - else // should never happen due to the acyclic negation rule - sys.error(s"No progress:\n\tclauses: $clauses\n\tpossibly true: $possiblyTrue") - } - } - - private[this] def negated(atoms: Set[Atom]): Set[Literal] = atoms.map(a => Negated(a)) - - /** - * Computes the set of atoms in `clauses` that directly or transitively take a negated atom as input. - * For example, for the following clauses, this method would return `List(a, d)` : - * a :- b, not c - * d :- a - */ - @tailrec - def hasNegatedDependency(clauses: Seq[Clause], posDeps: Relation[Atom, Atom], negDeps: Relation[Atom, Atom]): List[Atom] = - clauses match { - case Seq() => - // because cycles between positive literals are allowed, this isn't strictly a topological sort - Dag.topologicalSortUnchecked(negDeps._1s)(posDeps.reverse) - case Clause(formula, head) +: tail => - // collect direct positive and negative literals and track them in separate graphs - val (pos, neg) = directDeps(formula) - val (newPos, newNeg) = ((posDeps, negDeps) /: head) { - case ((pdeps, ndeps), d) => - (pdeps + (d, pos), ndeps + (d, neg)) - } - hasNegatedDependency(tail, newPos, newNeg) - } - - /** Computes the `(positive, negative)` literals in `formula`. */ - private[this] def directDeps(formula: Formula): (Seq[Atom], Seq[Atom]) = - Util.separate(literals(formula).toSeq) { - case Negated(a) => Right(a) - case a: Atom => Left(a) - } - private[this] def literals(formula: Formula): Set[Literal] = formula match { - case And(lits) => lits - case l: Literal => Set(l) - case True => Set.empty - } - - /** Computes the atoms in the heads and bodies of the clauses in `clause`. 
*/ - def atoms(cs: Clauses): Atoms = cs.clauses.map(c => Atoms(c.head, atoms(c.body))).reduce(_ ++ _) - - /** Computes the set of all atoms in `formula`. */ - def atoms(formula: Formula): Set[Atom] = formula match { - case And(lits) => lits.map(_.atom) - case Negated(lit) => Set(lit) - case a: Atom => Set(a) - case True => Set() - } - - /** Represents the set of atoms in the heads of clauses and in the bodies (formulas) of clauses. */ - final case class Atoms(inHead: Set[Atom], inFormula: Set[Atom]) { - /** Concatenates this with `as`. */ - def ++(as: Atoms): Atoms = Atoms(inHead ++ as.inHead, inFormula ++ as.inFormula) - /** Atoms that cannot be true because they do not occur in a head. */ - def triviallyFalse: Set[Atom] = inFormula -- inHead - } - - /** - * Applies known facts to `clause`s, deriving a new, possibly empty list of clauses. - * 1. If a fact is in the body of a clause, the derived clause has that fact removed from the body. - * 2. If the negation of a fact is in a body of a clause, that clause fails and is removed. - * 3. If a fact or its negation is in the head of a clause, the derived clause has that fact (or its negation) removed from the head. - * 4. If a head is empty, the clause proves nothing and is removed. - * - * NOTE: empty bodies do not cause a clause to succeed yet. - * All known facts must be applied before this can be done in order to avoid inconsistencies. 
- * Precondition: no contradictions in `facts` - * Postcondition: no atom in `facts` is present in the result - * Postcondition: No clauses have an empty head - */ - def applyAll(cs: Clauses, facts: Set[Literal]): Option[Clauses] = - { - val newClauses = - if (facts.isEmpty) - cs.clauses.filter(_.head.nonEmpty) // still need to drop clauses with an empty head - else - cs.clauses.map(c => applyAll(c, facts)).flatMap(_.toList) - if (newClauses.isEmpty) None else Some(Clauses(newClauses)) - } - - def applyAll(c: Clause, facts: Set[Literal]): Option[Clause] = - { - val atoms = facts.map(_.atom) - val newHead = c.head -- atoms // 3. - if (newHead.isEmpty) // 4. empty head - None - else - substitute(c.body, facts).map(f => Clause(f, newHead)) // 1, 2 - } - - /** Derives the formula that results from substituting `facts` into `formula`. */ - @tailrec - def substitute(formula: Formula, facts: Set[Literal]): Option[Formula] = formula match { - case And(lits) => - def negated(lits: Set[Literal]): Set[Literal] = lits.map(a => !a) - if (lits.exists(negated(facts))) // 2. - None - else { - val newLits = lits -- facts - val newF = if (newLits.isEmpty) True else And(newLits) - Some(newF) // 1. 
- } - case True => Some(True) - case lit: Literal => // define in terms of And - substitute(And(Set(lit)), facts) - } -} diff --git a/internal/util-logic/src/test/scala/sbt/logic/Test.scala b/internal/util-logic/src/test/scala/sbt/logic/Test.scala deleted file mode 100644 index 91ded0e69..000000000 --- a/internal/util-logic/src/test/scala/sbt/logic/Test.scala +++ /dev/null @@ -1,118 +0,0 @@ -package sbt.internal.util -package logic - -import org.scalacheck._ -import Prop.secure -import Logic.{ LogicException, Matched } - -object LogicTest extends Properties("Logic") { - import TestClauses._ - - property("Handles trivial resolution.") = secure(expect(trivial, Set(A))) - property("Handles less trivial resolution.") = secure(expect(lessTrivial, Set(B, A, D))) - property("Handles cycles without negation") = secure(expect(cycles, Set(F, A, B))) - property("Handles basic exclusion.") = secure(expect(excludedPos, Set())) - property("Handles exclusion of head proved by negation.") = secure(expect(excludedNeg, Set())) - // TODO: actually check ordering, probably as part of a check that dependencies are satisifed - property("Properly orders results.") = secure(expect(ordering, Set(B, A, C, E, F))) - property("Detects cyclic negation") = secure( - Logic.reduceAll(badClauses, Set()) match { - case Right(res) => false - case Left(err: Logic.CyclicNegation) => true - case Left(err) => sys.error(s"Expected cyclic error, got: $err") - } - ) - - def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = result match { - case Left(err) => false - case Right(res) => - val actual = res.provenSet - if (actual != expected) - sys.error(s"Expected to prove $expected, but actually proved $actual") - else - true - } -} - -object TestClauses { - - val A = Atom("A") - val B = Atom("B") - val C = Atom("C") - val D = Atom("D") - val E = Atom("E") - val F = Atom("F") - val G = Atom("G") - - val clauses = - A.proves(B) :: - A.proves(F) :: - B.proves(F) :: - F.proves(A) :: - 
(!C).proves(F) :: - D.proves(C) :: - C.proves(D) :: - Nil - - val cycles = Logic.reduceAll(clauses, Set()) - - val badClauses = - A.proves(D) :: - clauses - - val excludedNeg = { - val cs = - (!A).proves(B) :: - Nil - val init = - (!A) :: - (!B) :: - Nil - Logic.reduceAll(cs, init.toSet) - } - - val excludedPos = { - val cs = - A.proves(B) :: - Nil - val init = - A :: - (!B) :: - Nil - Logic.reduceAll(cs, init.toSet) - } - - val trivial = { - val cs = - Formula.True.proves(A) :: - Nil - Logic.reduceAll(cs, Set.empty) - } - - val lessTrivial = { - val cs = - Formula.True.proves(A) :: - Formula.True.proves(B) :: - (A && B && (!C)).proves(D) :: - Nil - Logic.reduceAll(cs, Set()) - } - - val ordering = { - val cs = - E.proves(F) :: - (C && !D).proves(E) :: - (A && B).proves(C) :: - Nil - Logic.reduceAll(cs, Set(A, B)) - } - - def all(): Unit = { - println(s"Cycles: $cycles") - println(s"xNeg: $excludedNeg") - println(s"xPos: $excludedPos") - println(s"trivial: $trivial") - println(s"lessTrivial: $lessTrivial") - println(s"ordering: $ordering") - } -} diff --git a/project/Util.scala b/project/Util.scala index adb7cef5b..f163f6dd0 100644 --- a/project/Util.scala +++ b/project/Util.scala @@ -1,40 +1,11 @@ -import sbt._ -import Keys._ +import sbt._, Keys._ object Util { - lazy val scalaKeywords = TaskKey[Set[String]]("scala-keywords") - lazy val generateKeywords = TaskKey[File]("generateKeywords") - - lazy val javaOnlySettings = Seq[Setting[_]]( + val javaOnlySettings = Seq[Setting[_]]( crossPaths := false, compileOrder := CompileOrder.JavaThenScala, unmanagedSourceDirectories in Compile := Seq((javaSource in Compile).value), crossScalaVersions := Seq(Dependencies.scala211), autoScalaLibrary := false ) - - def getScalaKeywords: Set[String] = - { - val g = new scala.tools.nsc.Global(new scala.tools.nsc.Settings) - g.nme.keywords.map(_.toString) - } - def writeScalaKeywords(base: File, keywords: Set[String]): File = - { - val init = keywords.map(tn => '"' + tn + 
'"').mkString("Set(", ", ", ")") - val ObjectName = "ScalaKeywords" - val PackageName = "sbt.internal.util" - val keywordsSrc = - """package %s -object %s { - val values = %s -}""".format(PackageName, ObjectName, init) - val out = base / PackageName.replace('.', '/') / (ObjectName + ".scala") - IO.write(out, keywordsSrc) - out - } - def keywordsSettings: Seq[Setting[_]] = inConfig(Compile)(Seq( - scalaKeywords := getScalaKeywords, - generateKeywords := writeScalaKeywords(sourceManaged.value, scalaKeywords.value), - sourceGenerators += (generateKeywords map (x => Seq(x))).taskValue - )) } From f55a509fdd05c7a1cce7f0e3461f92f3ac080e37 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Fri, 7 Jul 2017 11:25:45 +0200 Subject: [PATCH 685/823] Cleanup `ConsoleAppender` --- .../sbt/internal/util/ConsoleAppender.scala | 523 ++++++++++-------- .../scala/sbt/internal/util/ConsoleOut.scala | 2 +- .../scala/sbt/internal/util/EscHelpers.scala | 91 +++ .../scala/sbt/internal/util/MainLogging.scala | 10 +- .../scala/sbt/internal/util/MultiLogger.scala | 2 +- .../util-logging/src/test/scala/Escapes.scala | 2 +- 6 files changed, 397 insertions(+), 233 deletions(-) create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index e85f70fa8..4af11dc84 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -1,5 +1,6 @@ package sbt.internal.util +import scala.compat.Platform.EOL import sbt.util._ import java.io.{ PrintStream, PrintWriter } import java.util.Locale @@ -14,146 +15,220 @@ import ConsoleAppender._ object ConsoleLogger { // These are provided so other modules do not break immediately. 
- @deprecated("Use ConsoleAppender.", "0.13.x") - final val ESC = ConsoleAppender.ESC - @deprecated("Use ConsoleAppender.", "0.13.x") - private[sbt] def isEscapeTerminator(c: Char): Boolean = ConsoleAppender.isEscapeTerminator(c) - @deprecated("Use ConsoleAppender.", "0.13.x") - def hasEscapeSequence(s: String): Boolean = ConsoleAppender.hasEscapeSequence(s) - @deprecated("Use ConsoleAppender.", "0.13.x") - def removeEscapeSequences(s: String): String = ConsoleAppender.removeEscapeSequences(s) - @deprecated("Use ConsoleAppender.", "0.13.x") - val formatEnabled = ConsoleAppender.formatEnabled + @deprecated("Use EscHelpers.", "0.13.x") + final val ESC = EscHelpers.ESC + @deprecated("Use EscHelpers.", "0.13.x") + private[sbt] def isEscapeTerminator(c: Char): Boolean = EscHelpers.isEscapeTerminator(c) + @deprecated("Use EscHelpers.", "0.13.x") + def hasEscapeSequence(s: String): Boolean = EscHelpers.hasEscapeSequence(s) + @deprecated("Use EscHelpers.", "0.13.x") + def removeEscapeSequences(s: String): String = EscHelpers.removeEscapeSequences(s) + @deprecated("Use ConsoleAppenders.formatEnabledInEnv", "0.13.x") + val formatEnabled = ConsoleAppender.formatEnabledInEnv @deprecated("Use ConsoleAppender.", "0.13.x") val noSuppressedMessage = ConsoleAppender.noSuppressedMessage + /** + * A new `ConsoleLogger` that logs to `out`. + * + * @param out Where to log the messages. + * @return A new `ConsoleLogger` that logs to `out`. + */ def apply(out: PrintStream): ConsoleLogger = apply(ConsoleOut.printStreamOut(out)) + + /** + * A new `ConsoleLogger` that logs to `out`. + * + * @param out Where to log the messages. + * @return A new `ConsoleLogger` that logs to `out`. 
+ */ def apply(out: PrintWriter): ConsoleLogger = apply(ConsoleOut.printWriterOut(out)) - def apply(out: ConsoleOut = ConsoleOut.systemOut, ansiCodesSupported: Boolean = ConsoleAppender.formatEnabled, - useColor: Boolean = ConsoleAppender.formatEnabled, suppressedMessage: SuppressedTraceContext => Option[String] = ConsoleAppender.noSuppressedMessage): ConsoleLogger = - new ConsoleLogger(out, ansiCodesSupported, useColor, suppressedMessage) + + /** + * A new `ConsoleLogger` that logs to `out`. + * + * @param out Where to log the messages. + * @param ansiCodesSupported `true` if `out` supports ansi codes, `false` otherwise. + * @param useFormat `true` to show formatting, `false` to remove it from messages. + * @param suppressedMessage How to show suppressed stack traces. + * @return A new `ConsoleLogger` that logs to `out`. + */ + def apply(out: ConsoleOut = ConsoleOut.systemOut, + ansiCodesSupported: Boolean = ConsoleAppender.formatEnabledInEnv, + useFormat: Boolean = ConsoleAppender.formatEnabledInEnv, + suppressedMessage: SuppressedTraceContext => Option[String] = ConsoleAppender.noSuppressedMessage): ConsoleLogger = + new ConsoleLogger(out, ansiCodesSupported, useFormat, suppressedMessage) } /** * A logger that logs to the console. On supported systems, the level labels are * colored.
*/ -class ConsoleLogger private[ConsoleLogger] (val out: ConsoleOut, override val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger { - private[sbt] val appender = ConsoleAppender(generateName, out, ansiCodesSupported, useColor, suppressedMessage) +class ConsoleLogger private[ConsoleLogger] (out: ConsoleOut, + override val ansiCodesSupported: Boolean, + useFormat: Boolean, + suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger { + + private[sbt] val appender: ConsoleAppender = + ConsoleAppender(generateName(), out, ansiCodesSupported, useFormat, suppressedMessage) override def control(event: ControlEvent.Value, message: => String): Unit = appender.control(event, message) + override def log(level: Level.Value, message: => String): Unit = - { - if (atLevel(level)) { - appender.appendLog(level, message) - } + if (atLevel(level)) { + appender.appendLog(level, message) } override def success(message: => String): Unit = - { - if (successEnabled) { - appender.success(message) - } + if (successEnabled) { + appender.success(message) } + override def trace(t: => Throwable): Unit = appender.trace(t, getTrace) - override def logAll(events: Seq[LogEvent]) = out.lockObject.synchronized { events.foreach(log) } + override def logAll(events: Seq[LogEvent]) = + out.lockObject.synchronized { events.foreach(log) } } object ConsoleAppender { - /** Escape character, used to introduce an escape sequence. */ - final val ESC = '\u001B' - /** - * An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). - * It is the final character in an escape sequence. - * - * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes - */ - private[sbt] def isEscapeTerminator(c: Char): Boolean = - c >= '@' && c <= '~' + /** Hide stack trace altogether. 
*/ + val noSuppressedMessage = (_: SuppressedTraceContext) => None - /** - * Test if the character AFTER an ESC is the ANSI CSI. - * - * see: http://en.wikipedia.org/wiki/ANSI_escape_code - * - * The CSI (control sequence instruction) codes start with ESC + '['. This is for testing the second character. - * - * There is an additional CSI (one character) that we could test for, but is not frequnetly used, and we don't - * check for it. - * - * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes - */ - private def isCSI(c: Char): Boolean = c == '[' - - /** - * Tests whether or not a character needs to immediately terminate the ANSI sequence. - * - * c.f. http://en.wikipedia.org/wiki/ANSI_escape_code#Sequence_elements - */ - private def isAnsiTwoCharacterTerminator(c: Char): Boolean = - (c >= '@') && (c <= '_') - - /** - * Returns true if the string contains the ESC character. - * - * TODO - this should handle raw CSI (not used much) - */ - def hasEscapeSequence(s: String): Boolean = - s.indexOf(ESC) >= 0 - - /** - * Returns the string `s` with escape sequences removed. - * An escape sequence starts with the ESC character (decimal value 27) and ends with an escape terminator. - * @see isEscapeTerminator - */ - def removeEscapeSequences(s: String): String = - if (s.isEmpty || !hasEscapeSequence(s)) - s - else { - val sb = new java.lang.StringBuilder - nextESC(s, 0, sb) - sb.toString - } - private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder): Unit = { - val escIndex = s.indexOf(ESC, start) - if (escIndex < 0) { - sb.append(s, start, s.length) - () - } else { - sb.append(s, start, escIndex) - val next: Int = - // If it's a CSI we skip past it and then look for a terminator. - if (isCSI(s.charAt(escIndex + 1))) skipESC(s, escIndex + 2) - else if (isAnsiTwoCharacterTerminator(s.charAt(escIndex + 1))) escIndex + 2 - else { - // There could be non-ANSI character sequences we should make sure we handle here. 
- skipESC(s, escIndex + 1) - } - nextESC(s, next, sb) - } + /** Indicates whether formatting has been disabled in environment variables. */ + val formatEnabledInEnv: Boolean = { + import java.lang.Boolean.{ getBoolean, parseBoolean } + val value = System.getProperty("sbt.log.format") + if (value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) } - /** Skips the escape sequence starting at `i-1`. `i` should be positioned at the character after the ESC that starts the sequence. */ - private[this] def skipESC(s: String, i: Int): Int = { - if (i >= s.length) { - i - } else if (isEscapeTerminator(s.charAt(i))) { - i + 1 - } else { - skipESC(s, i + 1) - } + private[this] val generateId: AtomicInteger = new AtomicInteger + + /** + * A new `ConsoleAppender` that writes to standard output. + * + * @return A new `ConsoleAppender` that writes to standard output. + */ + def apply(): ConsoleAppender = apply(ConsoleOut.systemOut) + + /** + * A new `ConsoleAppender` that appends log message to `out`. + * + * @param out Where to write messages. + * @return A new `ConsoleAppender`. + */ + def apply(out: PrintStream): ConsoleAppender = apply(ConsoleOut.printStreamOut(out)) + + /** + * A new `ConsoleAppender` that appends log messages to `out`. + * + * @param out Where to write messages. + * @return A new `ConsoleAppender`. + */ + def apply(out: PrintWriter): ConsoleAppender = apply(ConsoleOut.printWriterOut(out)) + + /** + * A new `ConsoleAppender` that writes to `out`. + * + * @param out Where to write messages. + * @return A new `ConsoleAppender that writes to `out`. + */ + def apply(out: ConsoleOut): ConsoleAppender = apply(generateName(), out) + + /** + * A new `ConsoleAppender` identified by `name`, and that writes to standard output. + * + * @param name An identifier for the `ConsoleAppender`. + * @return A new `ConsoleAppender` that writes to standard output. 
+ */ + def apply(name: String): ConsoleAppender = apply(name, ConsoleOut.systemOut) + + /** + * A new `ConsoleAppender` identified by `name`, and that writes to `out`. + * + * @param name An identifier for the `ConsoleAppender`. + * @param out Where to write messages. + * @return A new `ConsoleAppender` that writes to `out`. + */ + def apply(name: String, out: ConsoleOut): ConsoleAppender = apply(name, out, formatEnabledInEnv) + + /** + * A new `ConsoleAppender` identified by `name`, and that writes to `out`. + * + * @param name An identifier for the `ConsoleAppender`. + * @param out Where to write messages. + * @param suppressedMessage How to handle stack traces. + * @return A new `ConsoleAppender` that writes to `out`. + */ + def apply(name: String, out: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = + apply(name, out, formatEnabledInEnv, formatEnabledInEnv, suppressedMessage) + + /** + * A new `ConsoleAppender` identified by `name`, and that writes to `out`. + * + * @param name An identifier for the `ConsoleAppender`. + * @param out Where to write messages. + * @param useFormat `true` to enable format (color, bold, etc.), `false` to remove formatting. + * @return A new `ConsoleAppender` that writes to `out`. + */ + def apply(name: String, out: ConsoleOut, useFormat: Boolean): ConsoleAppender = + apply(name, out, formatEnabledInEnv, useFormat, noSuppressedMessage) + + /** + * A new `ConsoleAppender` identified by `name`, and that writes to `out`. + * + * @param name An identifier for the `ConsoleAppender`. + * @param out Where to write messages. + * @param ansiCodesSupported `true` if the output stream supports ansi codes, `false` otherwise. + * @param useFormat `true` to enable format (color, bold, etc.), `false` to remove + * formatting. + * @return A new `ConsoleAppender` that writes to `out`. 
+ */ + def apply(name: String, + out: ConsoleOut, + ansiCodesSupported: Boolean, + useFormat: Boolean, + suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = { + val appender = new ConsoleAppender(name, out, ansiCodesSupported, useFormat, suppressedMessage) + appender.start + appender } - val formatEnabled: Boolean = - { - import java.lang.Boolean.{ getBoolean, parseBoolean } - val value = System.getProperty("sbt.log.format") - if (value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) + /** + * Converts the Log4J `level` to the corresponding sbt level. + * + * @param level A level, as represented by Log4J. + * @return The corresponding level in sbt's world. + */ + def toLevel(level: XLevel): Level.Value = + level match { + case XLevel.OFF => Level.Debug + case XLevel.FATAL => Level.Error + case XLevel.ERROR => Level.Error + case XLevel.WARN => Level.Warn + case XLevel.INFO => Level.Info + case XLevel.DEBUG => Level.Debug + case _ => Level.Debug } + + /** + * Converts the sbt `level` to the corresponding Log4J level. + * + * @param level A level, as represented by sbt. + * @return The corresponding level in Log4J's world. + */ + def toXLevel(level: Level.Value): XLevel = + level match { + case Level.Error => XLevel.ERROR + case Level.Warn => XLevel.WARN + case Level.Info => XLevel.INFO + case Level.Debug => XLevel.DEBUG + } + + private[sbt] def generateName(): String = "out-" + generateId.incrementAndGet + private[this] def jline1to2CompatMsg = "Found class jline.Terminal, but interface was expected" private[this] def ansiSupported = @@ -172,57 +247,9 @@ object ConsoleAppender { throw new IncompatibleClassChangeError("JLine incompatibility detected. 
Check that the sbt launcher is version 0.13.x or later.") } - val noSuppressedMessage = (_: SuppressedTraceContext) => None - private[this] def os = System.getProperty("os.name") private[this] def isWindows = os.toLowerCase(Locale.ENGLISH).indexOf("windows") >= 0 - def apply(out: PrintStream): ConsoleAppender = apply(ConsoleOut.printStreamOut(out)) - def apply(out: PrintWriter): ConsoleAppender = apply(ConsoleOut.printWriterOut(out)) - def apply(): ConsoleAppender = apply(ConsoleOut.systemOut) - def apply(name: String): ConsoleAppender = apply(name, ConsoleOut.systemOut) - def apply(out: ConsoleOut): ConsoleAppender = apply(generateName, out) - def apply(name: String, out: ConsoleOut): ConsoleAppender = apply(name, out, formatEnabled) - - def apply(name: String, out: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = - apply(name, out, formatEnabled, formatEnabled, suppressedMessage) - - def apply(name: String, out: ConsoleOut, useColor: Boolean): ConsoleAppender = - apply(name, out, formatEnabled, useColor, noSuppressedMessage) - - def apply(name: String, out: ConsoleOut, ansiCodesSupported: Boolean, - useColor: Boolean, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = - { - val appender = new ConsoleAppender(name, out, ansiCodesSupported, useColor, suppressedMessage) - appender.start - appender - } - - def generateName: String = "out-" + generateId.incrementAndGet - - private val generateId: AtomicInteger = new AtomicInteger - - private[this] val EscapeSequence = (27.toChar + "[^@-~]*[@-~]").r - def stripEscapeSequences(s: String): String = - EscapeSequence.pattern.matcher(s).replaceAll("") - - def toLevel(level: XLevel): Level.Value = - level match { - case XLevel.OFF => Level.Debug - case XLevel.FATAL => Level.Error - case XLevel.ERROR => Level.Error - case XLevel.WARN => Level.Warn - case XLevel.INFO => Level.Info - case XLevel.DEBUG => Level.Debug - case _ => Level.Debug - } - def 
toXLevel(level: Level.Value): XLevel = - level match { - case Level.Error => XLevel.ERROR - case Level.Warn => XLevel.WARN - case Level.Info => XLevel.INFO - case Level.Debug => XLevel.DEBUG - } } // See http://stackoverflow.com/questions/24205093/how-to-create-a-custom-appender-in-log4j2 @@ -237,35 +264,135 @@ object ConsoleAppender { * This logger is not thread-safe. */ class ConsoleAppender private[ConsoleAppender] ( - val name: String, - val out: ConsoleOut, - val ansiCodesSupported: Boolean, - val useColor: Boolean, - val suppressedMessage: SuppressedTraceContext => Option[String] + name: String, + out: ConsoleOut, + ansiCodesSupported: Boolean, + useFormat: Boolean, + suppressedMessage: SuppressedTraceContext => Option[String] ) extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { import scala.Console.{ BLUE, GREEN, RED, RESET, YELLOW } - def append(event: XLogEvent): Unit = - { - val level = ConsoleAppender.toLevel(event.getLevel) - val message = event.getMessage - // val str = messageToString(message) - appendMessage(level, message) + private final val SUCCESS_LABEL_COLOR = GREEN + private final val SUCCESS_MESSAGE_COLOR = RESET + private final val NO_COLOR = RESET + + override def append(event: XLogEvent): Unit = { + val level = ConsoleAppender.toLevel(event.getLevel) + val message = event.getMessage + appendMessage(level, message) + } + + // TODO: + // success is called by ConsoleLogger. + // This should turn into an event. + private[sbt] def success(message: => String): Unit = { + appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) + } + + /** + * Logs the stack trace of `t`, possibly shortening it. + * + * The `traceLevel` parameter configures how the stack trace will be shortened. + * See `StackTrace.trimmed`. + * + * @param t The `Throwable` whose stack trace to log. + * @param traceLevel How to shorten the stack trace. 
+ */ + def trace(t: => Throwable, traceLevel: Int): Unit = + out.lockObject.synchronized { + if (traceLevel >= 0) + write(StackTrace.trimmed(t, traceLevel)) + if (traceLevel <= 2) + for (msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useFormat))) + appendLog(NO_COLOR, "trace", NO_COLOR, msg) } - def appendMessage(level: Level.Value, msg: Message): Unit = + /** + * Logs a `ControlEvent` to the log. + * + * @param event The kind of `ControlEvent`. + * @param message The message to log. + */ + def control(event: ControlEvent.Value, message: => String): Unit = + appendLog(labelColor(Level.Info), Level.Info.toString, BLUE, message) + + /** + * Appends the message `message` to the log at level `level`. + * + * @param level The importance level of the message. + * @param message The message to log. + */ + def appendLog(level: Level.Value, message: => String): Unit = { + appendLog(labelColor(level), level.toString, NO_COLOR, message) + } + + /** + * Formats `msg` with `format`, wrapped between `RESET`s. + * + * @param format The format to use + * @param msg The message to format + * @return The formatted message. + */ + private def formatted(format: String, msg: String): String = + s"${RESET}${format}${msg}${RESET}" + + /** + * Selects the right color for the label given `level`. + * + * @param level The label to consider to select the color. + * @return The color to use to color the label. + */ + private def labelColor(level: Level.Value): String = + level match { + case Level.Error => RED + case Level.Warn => YELLOW + case _ => NO_COLOR + } + + /** + * Appends a full message to the log. Each line is prefixed with `[$label]`, written in + * `labelColor` if formatting is enabled. The lines of the messages are colored with + * `messageColor` if formatting is enabled. + * + * @param labelColor The color to use to format the label. + * @param label The label to prefix each line with. The label is shown between square + * brackets.
+ * @param messageColor The color to use to format the message. + * @param message The message to write. + */ + private def appendLog(labelColor: String, label: String, messageColor: String, message: String): Unit = + out.lockObject.synchronized { + message.lines.foreach { line => + val labeledLine = s"[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" + writeLine(labeledLine) + } + } + + private def write(msg: String): Unit = { + val cleanedMsg = + if (!useFormat) EscHelpers.removeEscapeSequences(msg) + else msg + out.println(cleanedMsg) + } + + private def writeLine(line: String): Unit = + write(line + EOL) + + private def appendMessage(level: Level.Value, msg: Message): Unit = msg match { case o: ObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } case o: ReusableObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } case _ => appendLog(level, msg.getFormattedMessage) } - def objectToLines(o: AnyRef): Vector[String] = + + private def objectToLines(o: AnyRef): Vector[String] = o match { case x: StringEvent => Vector(x.message) case x: ObjectEvent[_] => objectEventToLines(x) case _ => Vector(o.toString) } - def objectEventToLines(oe: ObjectEvent[_]): Vector[String] = + + private def objectEventToLines(oe: ObjectEvent[_]): Vector[String] = { val contentType = oe.contentType LogExchange.stringCodec[AnyRef](contentType) match { @@ -273,61 +400,7 @@ class ConsoleAppender private[ConsoleAppender] ( case _ => Vector(oe.message.toString) } } - def messageColor(level: Level.Value) = RESET - def labelColor(level: Level.Value) = - level match { - case Level.Error => RED - case Level.Warn => YELLOW - case _ => RESET - } - // success is called by ConsoleLogger. - // This should turn into an event. 
- private[sbt] def success(message: => String): Unit = { - appendLog(successLabelColor, Level.SuccessLabel, successMessageColor, message) - } - private[sbt] def successLabelColor = GREEN - private[sbt] def successMessageColor = RESET - - def trace(t: => Throwable, traceLevel: Int): Unit = - out.lockObject.synchronized { - if (traceLevel >= 0) - out.print(StackTrace.trimmed(t, traceLevel)) - if (traceLevel <= 2) - for (msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useColor))) - printLabeledLine(labelColor(Level.Error), "trace", messageColor(Level.Error), msg) - } - - def control(event: ControlEvent.Value, message: => String): Unit = - appendLog(labelColor(Level.Info), Level.Info.toString, BLUE, message) - - def appendLog(level: Level.Value, message: => String): Unit = { - appendLog(labelColor(level), level.toString, messageColor(level), message) - } - private def reset(): Unit = setColor(RESET) - - private def setColor(color: String): Unit = { - if (ansiCodesSupported && useColor) - out.lockObject.synchronized { out.print(color) } - } - private def appendLog(labelColor: String, label: String, messageColor: String, message: String): Unit = - out.lockObject.synchronized { - for (line <- message.split("""\n""")) - printLabeledLine(labelColor, label, messageColor, line) - } - private def printLabeledLine(labelColor: String, label: String, messageColor: String, line: String): Unit = - { - reset() - out.print("[") - setColor(labelColor) - out.print(label) - reset() - out.print("] ") - setColor(messageColor) - out.print(line) - reset() - out.println() - } } -final class SuppressedTraceContext(val traceLevel: Int, val useColor: Boolean) +final class SuppressedTraceContext(val traceLevel: Int, val useFormat: Boolean) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index 72fa01594..b9834d7e8 100644 --- 
a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -33,7 +33,7 @@ object ConsoleOut { def println(s: String): Unit = synchronized { current.append(s); println() } def println(): Unit = synchronized { val s = current.toString - if (ConsoleAppender.formatEnabled && last.exists(lmsg => f(s, lmsg))) + if (ConsoleAppender.formatEnabledInEnv && last.exists(lmsg => f(s, lmsg))) lockObject.print(OverwriteLine) lockObject.println(s) last = Some(s) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala b/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala new file mode 100644 index 000000000..ad394994c --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala @@ -0,0 +1,91 @@ +package sbt.internal.util + +object EscHelpers { + + /** Escape character, used to introduce an escape sequence. */ + final val ESC = '\u001B' + + /** + * An escape terminator is a character in the range `@` (decimal value 64) to `~` (decimal value 126). + * It is the final character in an escape sequence. + * + * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes + */ + private[sbt] def isEscapeTerminator(c: Char): Boolean = + c >= '@' && c <= '~' + + /** + * Test if the character AFTER an ESC is the ANSI CSI. + * + * see: http://en.wikipedia.org/wiki/ANSI_escape_code + * + * The CSI (control sequence introducer) codes start with ESC + '['. This is for testing the second character. + * + * There is an additional CSI (one character) that we could test for, but is not frequently used, and we don't + * check for it. + * + * cf. http://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes + */ + private def isCSI(c: Char): Boolean = c == '[' + + /** + * Tests whether or not a character needs to immediately terminate the ANSI sequence. + * + * c.f.
http://en.wikipedia.org/wiki/ANSI_escape_code#Sequence_elements + */ + private def isAnsiTwoCharacterTerminator(c: Char): Boolean = + (c >= '@') && (c <= '_') + + /** + * Returns true if the string contains the ESC character. + * + * TODO - this should handle raw CSI (not used much) + */ + def hasEscapeSequence(s: String): Boolean = + s.indexOf(ESC) >= 0 + + /** + * Returns the string `s` with escape sequences removed. + * An escape sequence starts with the ESC character (decimal value 27) and ends with an escape terminator. + * @see isEscapeTerminator + */ + def removeEscapeSequences(s: String): String = + if (s.isEmpty || !hasEscapeSequence(s)) + s + else { + val sb = new java.lang.StringBuilder + nextESC(s, 0, sb) + sb.toString + } + + private[this] def nextESC(s: String, start: Int, sb: java.lang.StringBuilder): Unit = { + val escIndex = s.indexOf(ESC, start) + if (escIndex < 0) { + sb.append(s, start, s.length) + () + } else { + sb.append(s, start, escIndex) + val next: Int = + // If it's a CSI we skip past it and then look for a terminator. + if (isCSI(s.charAt(escIndex + 1))) skipESC(s, escIndex + 2) + else if (isAnsiTwoCharacterTerminator(s.charAt(escIndex + 1))) escIndex + 2 + else { + // There could be non-ANSI character sequences we should make sure we handle here. + skipESC(s, escIndex + 1) + } + nextESC(s, next, sb) + } + } + + /** Skips the escape sequence starting at `i-1`. `i` should be positioned at the character after the ESC that starts the sequence. 
*/ + private[this] def skipESC(s: String, i: Int): Int = { + if (i >= s.length) { + i + } else if (isEscapeTerminator(s.charAt(i))) { + i + 1 + } else { + skipESC(s, i + 1) + } + } + +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index 37dac9b70..dd08ba0bb 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -48,15 +48,15 @@ object MainAppender { def defaultScreen(name: String, console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): Appender = ConsoleAppender(name, console, suppressedMessage = suppressedMessage) - def defaultBacked: PrintWriter => Appender = defaultBacked(generateGlobalBackingName, ConsoleAppender.formatEnabled) - def defaultBacked(loggerName: String): PrintWriter => Appender = defaultBacked(loggerName, ConsoleAppender.formatEnabled) - def defaultBacked(useColor: Boolean): PrintWriter => Appender = defaultBacked(generateGlobalBackingName, useColor) - def defaultBacked(loggerName: String, useColor: Boolean): PrintWriter => Appender = + def defaultBacked: PrintWriter => Appender = defaultBacked(generateGlobalBackingName, ConsoleAppender.formatEnabledInEnv) + def defaultBacked(loggerName: String): PrintWriter => Appender = defaultBacked(loggerName, ConsoleAppender.formatEnabledInEnv) + def defaultBacked(useFormat: Boolean): PrintWriter => Appender = defaultBacked(generateGlobalBackingName, useFormat) + def defaultBacked(loggerName: String, useFormat: Boolean): PrintWriter => Appender = to => { ConsoleAppender( ConsoleAppender.generateName, ConsoleOut.printWriterOut(to), - useColor = useColor + useFormat = useFormat ) } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index 
c5f7d1103..c72d094af 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -41,7 +41,7 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { private[this] def removeEscapes(event: LogEvent): LogEvent = { - import ConsoleAppender.{ removeEscapeSequences => rm } + import EscHelpers.{ removeEscapeSequences => rm } event match { case s: Success => new Success(rm(s.msg)) case l: Log => new Log(l.level, rm(l.msg)) diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala index a226e4d3b..0ae24a6e4 100644 --- a/internal/util-logging/src/test/scala/Escapes.scala +++ b/internal/util-logging/src/test/scala/Escapes.scala @@ -4,7 +4,7 @@ import org.scalacheck._ import Prop._ import Gen.{ listOf, oneOf } -import ConsoleAppender.{ ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences } +import EscHelpers.{ ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequences } object Escapes extends Properties("Escapes") { property("genTerminator only generates terminators") = From 19b3e47972ddc6558dc9010edf9dce761d96f98e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 13 Jul 2017 20:20:49 -0400 Subject: [PATCH 686/823] Fix casting error during initialization While running scripted, you see ``` ERROR StatusLogger Unable to create custom ContextSelector. Falling back to default. 
java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.async.AsyncLoggerContextSelector to org.apache.logging.log4j.core.selector.ContextSelector at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:201) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOfProperty(LoaderUtil.java:226) at org.apache.logging.log4j.core.impl.Log4jContextFactory.createContextSelector(Log4jContextFactory.java:97) at org.apache.logging.log4j.core.impl.Log4jContextFactory.<init>(Log4jContextFactory.java:58) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at java.lang.Class.newInstance(Class.java:442) at org.apache.logging.log4j.LogManager.<clinit>(LogManager.java:94) at org.apache.logging.log4j.spi.ThreadContextMapFactory.createThreadContextMap(ThreadContextMapFactory.java:73) at org.apache.logging.log4j.ThreadContext.init(ThreadContext.java:223) at org.apache.logging.log4j.ThreadContext.<clinit>(ThreadContext.java:202) at org.apache.logging.log4j.core.impl.ContextDataInjectorFactory.createDefaultInjector(ContextDataInjectorFactory.java:83) at org.apache.logging.log4j.core.impl.ContextDataInjectorFactory.createInjector(ContextDataInjectorFactory.java:67) at org.apache.logging.log4j.core.lookup.ContextMapLookup.<init>(ContextMapLookup.java:34) at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:117) at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:125) at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:46) at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:650) at
org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:487) at sbt.internal.util.ConsoleAppender.<init>(ConsoleAppender.scala:245) ``` This aims to work around the casting error during PatternLayout.createDefaultLayout(), which was originally used for ConsoleAppender. The stack trace shows that it has trouble initializing the default DefaultConfiguration. Since we currently do not use Layout inside ConsoleAppender, the actual pattern is not relevant. --- .../sbt/internal/util/ConsoleAppender.scala | 3 +-- .../src/main/scala/sbt/util/LogExchange.scala | 20 ++++++++++++++++--- 2 files changed, 18 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index e85f70fa8..75ca7146e 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -8,7 +8,6 @@ import org.apache.logging.log4j.{ Level => XLevel } import org.apache.logging.log4j.message.{ Message, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } import org.apache.logging.log4j.core.appender.AbstractAppender -import org.apache.logging.log4j.core.layout.PatternLayout import ConsoleAppender._ @@ -242,7 +241,7 @@ class ConsoleAppender private[ConsoleAppender] ( val ansiCodesSupported: Boolean, val useColor: Boolean, val suppressedMessage: SuppressedTraceContext => Option[String] -) extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { +) extends AbstractAppender(name, null, LogExchange.dummyLayout, true) { import scala.Console.{ BLUE, GREEN, RED, RESET, YELLOW } def append(event: XLogEvent): Unit = diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index 1dc17804d..150814bc5
100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -5,6 +5,7 @@ import org.apache.logging.log4j.{ LogManager => XLogManager, Level => XLevel } import org.apache.logging.log4j.core._ import org.apache.logging.log4j.core.appender.AsyncAppender import org.apache.logging.log4j.core.config.{ AppenderRef, LoggerConfig } +import org.apache.logging.log4j.core.layout.PatternLayout import scala.collection.JavaConverters._ import scala.collection.concurrent import sjsonnew.JsonFormat @@ -47,6 +48,22 @@ sealed abstract class LogExchange { val config = ctx.getConfiguration config.getLoggerConfig(loggerName) } + + // This is a dummy layout to avoid casting error during PatternLayout.createDefaultLayout() + // that was originally used for ConsoleAppender. + // The stacktrace shows it's having issue initializing default DefaultConfiguration. + // Since we currently do not use Layout inside ConsoleAppender, the actual pattern is not relevant. 
+ private[sbt] lazy val dummyLayout: PatternLayout = { + val _ = context + val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } + val config = ctx.getConfiguration + val lo = PatternLayout.newBuilder + .withConfiguration(config) + .withPattern(PatternLayout.SIMPLE_CONVERSION_PATTERN) + .build + lo + } + def jsonCodec[A](tag: String): Option[JsonFormat[A]] = jsonCodecs.get(tag) map { _.asInstanceOf[JsonFormat[A]] } def hasJsonCodec(tag: String): Boolean = @@ -63,9 +80,6 @@ sealed abstract class LogExchange { private[sbt] def buildAsyncStdout: AsyncAppender = { val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } val config = ctx.getConfiguration - // val layout = PatternLayout.newBuilder - // .withPattern(PatternLayout.SIMPLE_CONVERSION_PATTERN) - // .build val appender = ConsoleAppender("Stdout") // CustomConsoleAppenderImpl.createAppender("Stdout", layout, null, null) appender.start From 18a73db57d8699d487918ed0ad441db3d3848854 Mon Sep 17 00:00:00 2001 From: jvican Date: Thu, 22 Jun 2017 20:46:23 +0200 Subject: [PATCH 687/823] Remove unnecessary F0, F1 and Maybe `F0`, `F1` and `Maybe` have become redundant since Java 8 introduced `Supplier`, `Function` and `Optional` in the standard library, so they are no longer necessary. This change is required to update some of Zinc's and sbt's APIs. They are not widely used, so the changes will be small. 
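The point of moving from `F0` to `java.util.function.Supplier` is that the JDK interface gives the same by-name message semantics without a custom type. A rough illustration (the `SupplierDemo` class and its methods are hypothetical, not from the sbt codebase) of how a `Supplier`-based logging method, like the ones on the new `xsbti.Logger`, defers message construction until the level is actually enabled:

```java
import java.util.function.Supplier;

public class SupplierDemo {
    private boolean debugEnabled = false;
    int evaluations = 0;

    // Mirrors the shape of xsbti.Logger#debug(Supplier<String>):
    // the Supplier is only invoked when the level is active.
    void debug(Supplier<String> msg) {
        if (debugEnabled) System.out.println(msg.get());
    }

    // Stands in for an expensive message (e.g. rendering a classpath).
    String expensiveMessage() {
        evaluations++;
        return "classpath: ...";
    }

    int run() {
        debug(this::expensiveMessage); // level disabled: supplier never invoked
        debugEnabled = true;
        debug(this::expensiveMessage); // level enabled: evaluated and printed
        return evaluations;            // only one evaluation happened
    }

    public static void main(String[] args) {
        System.out.println("evaluations=" + new SupplierDemo().run());
        // prints "classpath: ..." followed by "evaluations=1"
    }
}
```

Scala's by-name parameters (`msg: => String`) convert to this shape trivially, which is what `InterfaceUtil.toSupplier` in the diff below does.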
--- .../src/main/java/xsbti/F0.java | 9 ---- .../src/main/java/xsbti/F1.java | 6 --- .../src/main/java/xsbti/Logger.java | 15 +++--- .../src/main/java/xsbti/Maybe.java | 48 ------------------- .../main/scala/sbt/util/InterfaceUtil.scala | 33 +++++++------ .../src/main/scala/sbt/util/Logger.scala | 42 ++++++++-------- 6 files changed, 45 insertions(+), 108 deletions(-) delete mode 100644 internal/util-interface/src/main/java/xsbti/F0.java delete mode 100644 internal/util-interface/src/main/java/xsbti/F1.java delete mode 100644 internal/util-interface/src/main/java/xsbti/Maybe.java diff --git a/internal/util-interface/src/main/java/xsbti/F0.java b/internal/util-interface/src/main/java/xsbti/F0.java deleted file mode 100644 index b0091b186..000000000 --- a/internal/util-interface/src/main/java/xsbti/F0.java +++ /dev/null @@ -1,9 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package xsbti; - -public interface F0 -{ - T apply(); -} diff --git a/internal/util-interface/src/main/java/xsbti/F1.java b/internal/util-interface/src/main/java/xsbti/F1.java deleted file mode 100644 index 8797e9196..000000000 --- a/internal/util-interface/src/main/java/xsbti/F1.java +++ /dev/null @@ -1,6 +0,0 @@ -package xsbti; - -public interface F1 -{ - R apply(A1 a1); -} diff --git a/internal/util-interface/src/main/java/xsbti/Logger.java b/internal/util-interface/src/main/java/xsbti/Logger.java index 60dfab7b5..1e9539fe7 100644 --- a/internal/util-interface/src/main/java/xsbti/Logger.java +++ b/internal/util-interface/src/main/java/xsbti/Logger.java @@ -3,11 +3,12 @@ */ package xsbti; -public interface Logger -{ - void error(F0 msg); - void warn(F0 msg); - void info(F0 msg); - void debug(F0 msg); - void trace(F0 exception); +import java.util.function.Supplier; + +public interface Logger { + void error(Supplier msg); + void warn(Supplier msg); + void info(Supplier msg); + void debug(Supplier msg); + void trace(Supplier exception); } diff --git 
a/internal/util-interface/src/main/java/xsbti/Maybe.java b/internal/util-interface/src/main/java/xsbti/Maybe.java deleted file mode 100644 index 0c5ea23b8..000000000 --- a/internal/util-interface/src/main/java/xsbti/Maybe.java +++ /dev/null @@ -1,48 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2008, 2009, 2010 Mark Harrah - */ -package xsbti; - -/** Intended as a lightweight carrier for scala.Option. */ -public abstract class Maybe { - private Maybe() {} - - @SuppressWarnings("unchecked") - public static Maybe nothing() { return (Maybe) Nothing.INSTANCE; } - public static Maybe just(final s v) { return new Just(v); } - - public static final class Just extends Maybe { - private final s v; - - public Just(final s v) { this.v = v; } - - public s value() { return v; } - - public boolean isDefined() { return true; } - public s get() { return v; } - public int hashCode() { return 17 + (v == null ? 0 : v.hashCode()); } - public String toString() { return "Maybe.just(" + v + ")"; } - public boolean equals(Object o) { - if (this == o) return true; - if (o == null || !(o instanceof Just)) return false; - final Just that = (Just) o; - return v == null ? 
that.v == null : v.equals(that.v); - } - } - - public static final class Nothing extends Maybe { - public static final Nothing INSTANCE = new Nothing(); - private Nothing() { } - - public boolean isDefined() { return false; } - public Object get() { throw new UnsupportedOperationException("nothing.get"); } - - public int hashCode() { return 1; } - public String toString() { return "Maybe.nothing()"; } - public boolean equals(Object o) { return this == o || o != null && o instanceof Nothing; } - } - - public final boolean isEmpty() { return !isDefined(); } - public abstract boolean isDefined(); - public abstract t get(); -} diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index 328ba8bc7..63e4213cb 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -1,22 +1,29 @@ package sbt.util -import xsbti.{ Maybe, F0, F1, T2, Position, Problem, Severity } +import xsbti.{ Position, Problem, Severity, T2 } import java.io.File import java.util.Optional +import java.util.function.Supplier object InterfaceUtil { - def f0[A](a: => A): F0[A] = new ConcreteF0[A](a) - def f1[A1, R](f: A1 => R): F1[A1, R] = new ConcreteF1(f) + def toSupplier[A](a: => A): Supplier[A] = new Supplier[A] { + override def get: A = a + } + + import java.util.function.{ Function => JavaFunction } + def toJavaFunction[A1, R](f: A1 => R): JavaFunction[A1, R] = new JavaFunction[A1, R] { + override def apply(t: A1): R = f(t) + } + def t2[A1, A2](x: (A1, A2)): T2[A1, A2] = new ConcreteT2(x._1, x._2) - def m2o[A](m: Maybe[A]): Option[A] = - if (m.isDefined) Some(m.get) - else None + def toOption[A](m: Optional[A]): Option[A] = + if (m.isPresent) Some(m.get) else None - def o2m[A](o: Option[A]): Maybe[A] = + def toOptional[A](o: Option[A]): Optional[A] = o match { - case Some(v) => Maybe.just(v) - case None => 
Maybe.nothing() + case Some(v) => Optional.of(v) + case None => Optional.empty() } def jo2o[A](o: Optional[A]): Option[A] = @@ -36,14 +43,6 @@ object InterfaceUtil { def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = new ConcreteProblem(cat, pos, msg, sev) - private final class ConcreteF0[A](a: => A) extends F0[A] { - def apply: A = a - } - - private final class ConcreteF1[A1, R](f: A1 => R) extends F1[A1, R] { - def apply(a1: A1): R = f(a1) - } - private final class ConcreteT2[A1, A2](a1: A1, a2: A2) extends T2[A1, A2] { val get1: A1 = a1 val get2: A2 = a2 diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala index fd5b34a60..abcad3428 100644 --- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -3,13 +3,14 @@ */ package sbt.util -import xsbti.{ Logger => xLogger, F0 } -import xsbti.{ Maybe, Position, Problem, Severity } +import xsbti.{ Logger => xLogger } +import xsbti.{ Position, Problem, Severity } + import sys.process.ProcessLogger import sbt.internal.util.{ BufferedLogger, FullLogger } - import java.io.File import java.util.Optional +import java.util.function.Supplier /** * This is intended to be the simplest logging interface for use by code that wants to log. 
@@ -32,12 +33,12 @@ abstract class Logger extends xLogger { def success(message: => String): Unit def log(level: Level.Value, message: => String): Unit - def debug(msg: F0[String]): Unit = log(Level.Debug, msg) - def warn(msg: F0[String]): Unit = log(Level.Warn, msg) - def info(msg: F0[String]): Unit = log(Level.Info, msg) - def error(msg: F0[String]): Unit = log(Level.Error, msg) - def trace(msg: F0[Throwable]): Unit = trace(msg.apply) - def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) + def debug(msg: Supplier[String]): Unit = log(Level.Debug, msg) + def warn(msg: Supplier[String]): Unit = log(Level.Warn, msg) + def info(msg: Supplier[String]): Unit = log(Level.Info, msg) + def error(msg: Supplier[String]): Unit = log(Level.Error, msg) + def trace(msg: Supplier[Throwable]): Unit = trace(msg.get()) + def log(level: Level.Value, msg: Supplier[String]): Unit = log(level, msg.get) } object Logger { @@ -67,17 +68,18 @@ object Logger { case _ => wrapXLogger(lg) } private[this] def wrapXLogger(lg: xLogger): Logger = new Logger { - override def debug(msg: F0[String]): Unit = lg.debug(msg) - override def warn(msg: F0[String]): Unit = lg.warn(msg) - override def info(msg: F0[String]): Unit = lg.info(msg) - override def error(msg: F0[String]): Unit = lg.error(msg) - override def trace(msg: F0[Throwable]): Unit = lg.trace(msg) - override def log(level: Level.Value, msg: F0[String]): Unit = lg.log(level, msg) - def trace(t: => Throwable): Unit = trace(f0(t)) - def success(s: => String): Unit = info(f0(s)) + import InterfaceUtil.toSupplier + override def debug(msg: Supplier[String]): Unit = lg.debug(msg) + override def warn(msg: Supplier[String]): Unit = lg.warn(msg) + override def info(msg: Supplier[String]): Unit = lg.info(msg) + override def error(msg: Supplier[String]): Unit = lg.error(msg) + override def trace(msg: Supplier[Throwable]): Unit = lg.trace(msg) + override def log(level: Level.Value, msg: Supplier[String]): Unit = lg.log(level, msg) + 
def trace(t: => Throwable): Unit = trace(toSupplier(t)) + def success(s: => String): Unit = info(toSupplier(s)) def log(level: Level.Value, msg: => String): Unit = { - val fmsg = f0(msg) + val fmsg = toSupplier(msg) level match { case Level.Debug => lg.debug(fmsg) case Level.Info => lg.info(fmsg) @@ -86,9 +88,7 @@ } } } - def f0[A](a: => A): F0[A] = InterfaceUtil.f0[A](a) - def m2o[A](m: Maybe[A]): Option[A] = InterfaceUtil.m2o(m) - def o2m[A](o: Option[A]): Maybe[A] = InterfaceUtil.o2m(o) + def jo2o[A](o: Optional[A]): Option[A] = InterfaceUtil.jo2o(o) def o2jo[A](o: Option[A]): Optional[A] = InterfaceUtil.o2jo(o) def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], From b0b9dc5e0f6a13bbba11fbf43b98575e4f73f009 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 14 Jul 2017 08:53:47 -0400 Subject: [PATCH 688/823] switch to using sjson-new-murmurhash The input validation for caching currently relies on having a stack of `scala.math.Equiv`, which is questionable since it can fall back to universal equality. This is likely related to the intermittent caching behavior we are seeing in https://github.com/sbt/sbt/issues/3226 --- build.sbt | 2 +- project/Dependencies.scala | 1 + .../scala/sbt/util/BasicCacheImplicits.scala | 42 +-------------- .../src/main/scala/sbt/util/Cache.scala | 8 --- .../main/scala/sbt/util/SeparatedCache.scala | 52 +++++++++---------- .../src/main/scala/sbt/util/Tracked.scala | 25 ++++++--- 6 files changed, 45 insertions(+), 85 deletions(-) diff --git a/build.sbt b/build.sbt index a0eb78fa3..3f8885f73 100644 --- a/build.sbt +++ b/build.sbt @@ -113,7 +113,7 @@ lazy val utilCache = (project in file("util-cache")). settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnewScalaJson.value, scalaReflect.value) + libraryDependencies ++= Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value) ). 
configure(addSbtIO) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 79c777367..9afeed255 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -46,6 +46,7 @@ object Dependencies { val sjsonnew = Def.setting { "com.eed3si9n" %% "sjson-new-core" % contrabandSjsonNewVersion.value } val sjsonnewScalaJson = Def.setting { "com.eed3si9n" %% "sjson-new-scalajson" % contrabandSjsonNewVersion.value } + val sjsonnewMurmurhash = Def.setting { "com.eed3si9n" %% "sjson-new-murmurhash" % contrabandSjsonNewVersion.value } def log4jVersion = "2.8.1" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion diff --git a/util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala b/util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala index 92e69bddb..1e2c74da8 100644 --- a/util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala +++ b/util-cache/src/main/scala/sbt/util/BasicCacheImplicits.scala @@ -1,56 +1,18 @@ package sbt.util -import java.net.{ URI, URL } - import sjsonnew.{ BasicJsonProtocol, JsonFormat } trait BasicCacheImplicits { self: BasicJsonProtocol => - implicit def basicCache[I: JsonFormat: Equiv, O: JsonFormat]: Cache[I, O] = + implicit def basicCache[I: JsonFormat, O: JsonFormat]: Cache[I, O] = new BasicCache[I, O]() - def defaultEquiv[T]: Equiv[T] = - new Equiv[T] { def equiv(a: T, b: T) = a == b } - - def wrapEquiv[S, T](f: S => T)(implicit eqT: Equiv[T]): Equiv[S] = - new Equiv[S] { - def equiv(a: S, b: S) = - eqT.equiv(f(a), f(b)) - } - - implicit def optEquiv[T](implicit t: Equiv[T]): Equiv[Option[T]] = - new Equiv[Option[T]] { - def equiv(a: Option[T], b: Option[T]) = - (a, b) match { - case (None, None) => true - case (Some(va), Some(vb)) => t.equiv(va, vb) - case _ => false - } - } - implicit def urlEquiv(implicit uriEq: Equiv[URI]): Equiv[URL] = wrapEquiv[URL, URI](_.toURI)(uriEq) - implicit def uriEquiv: Equiv[URI] = defaultEquiv - implicit def stringSetEquiv: Equiv[Set[String]] = 
defaultEquiv - implicit def stringMapEquiv: Equiv[Map[String, String]] = defaultEquiv - - implicit def arrEquiv[T](implicit t: Equiv[T]): Equiv[Array[T]] = - wrapEquiv((x: Array[T]) => x: Seq[T])(seqEquiv[T](t)) - - implicit def seqEquiv[T](implicit t: Equiv[T]): Equiv[Seq[T]] = - new Equiv[Seq[T]] { - def equiv(a: Seq[T], b: Seq[T]) = - a.length == b.length && - ((a, b).zipped forall t.equiv) - } - def wrapIn[I, J](implicit f: I => J, g: J => I, jCache: SingletonCache[J]): SingletonCache[I] = new SingletonCache[I] { override def read(from: Input): I = g(jCache.read(from)) override def write(to: Output, value: I) = jCache.write(to, f(value)) - override def equiv: Equiv[I] = wrapEquiv(f)(jCache.equiv) } def singleton[T](t: T): SingletonCache[T] = - SingletonCache.basicSingletonCache(asSingleton(t), trueEquiv) - - def trueEquiv[T] = new Equiv[T] { def equiv(a: T, b: T) = true } + SingletonCache.basicSingletonCache(asSingleton(t)) } diff --git a/util-cache/src/main/scala/sbt/util/Cache.scala b/util-cache/src/main/scala/sbt/util/Cache.scala index 3b85a8dc2..78d694b27 100644 --- a/util-cache/src/main/scala/sbt/util/Cache.scala +++ b/util-cache/src/main/scala/sbt/util/Cache.scala @@ -73,13 +73,5 @@ object Cache { println(label + ".write: " + value) cache.write(to, value) } - - override def equiv: Equiv[I] = new Equiv[I] { - def equiv(a: I, b: I) = { - val equ = cache.equiv.equiv(a, b) - println(label + ".equiv(" + a + ", " + b + "): " + equ) - equ - } - } } } diff --git a/util-cache/src/main/scala/sbt/util/SeparatedCache.scala b/util-cache/src/main/scala/sbt/util/SeparatedCache.scala index c8772e034..0dc8dfceb 100644 --- a/util-cache/src/main/scala/sbt/util/SeparatedCache.scala +++ b/util-cache/src/main/scala/sbt/util/SeparatedCache.scala @@ -6,58 +6,54 @@ package sbt.util import scala.util.Try import sjsonnew.JsonFormat +import sjsonnew.support.murmurhash.Hasher import CacheImplicits._ /** * A cache that stores a single value. 
*/ -trait SingletonCache[T] { +trait SingletonCache[A] { /** Reads the cache from the backing `from`. */ - def read(from: Input): T + def read(from: Input): A /** Writes `value` to the backing `to`. */ - def write(to: Output, value: T): Unit - - /** Equivalence for elements of type `T`. */ - def equiv: Equiv[T] + def write(to: Output, value: A): Unit } object SingletonCache { - implicit def basicSingletonCache[T: JsonFormat: Equiv]: SingletonCache[T] = - new SingletonCache[T] { - override def read(from: Input): T = from.read[T] - override def write(to: Output, value: T) = to.write(value) - override def equiv: Equiv[T] = implicitly + implicit def basicSingletonCache[A: JsonFormat]: SingletonCache[A] = + new SingletonCache[A] { + override def read(from: Input): A = from.read[A] + override def write(to: Output, value: A) = to.write(value) } /** A lazy `SingletonCache` */ - def lzy[T: JsonFormat: Equiv](mkCache: => SingletonCache[T]): SingletonCache[T] = - new SingletonCache[T] { + def lzy[A: JsonFormat](mkCache: => SingletonCache[A]): SingletonCache[A] = + new SingletonCache[A] { lazy val cache = mkCache - override def read(from: Input): T = cache.read(from) - override def write(to: Output, value: T) = cache.write(to, value) - override def equiv = cache.equiv + override def read(from: Input): A = cache.read(from) + override def write(to: Output, value: A) = cache.write(to, value) } } /** * Simple key-value cache. 
*/ -class BasicCache[I: JsonFormat: Equiv, O: JsonFormat] extends Cache[I, O] { - private val singletonCache: SingletonCache[(I, O)] = implicitly - val equiv: Equiv[I] = implicitly - override def apply(store: CacheStore)(key: I): CacheResult[O] = +class BasicCache[I: JsonFormat, O: JsonFormat] extends Cache[I, O] { + private val singletonCache: SingletonCache[(Long, O)] = implicitly + val jsonFormat: JsonFormat[I] = implicitly + override def apply(store: CacheStore)(key: I): CacheResult[O] = { + val keyHash: Long = Hasher.hashUnsafe[I](key).toLong Try { - val (previousKey, previousValue) = singletonCache.read(store) - if (equiv.equiv(key, previousKey)) - Hit(previousValue) - else - Miss(update(store)(key)) - } getOrElse Miss(update(store)(key)) + val (previousKeyHash, previousValue) = singletonCache.read(store) + if (keyHash == previousKeyHash) Hit(previousValue) + else Miss(update(store)(keyHash)) + } getOrElse Miss(update(store)(keyHash)) + } - private def update(store: CacheStore)(key: I) = (value: O) => { - singletonCache.write(store, (key, value)) + private def update(store: CacheStore)(keyHash: Long) = (value: O) => { + singletonCache.write(store, (keyHash, value)) } } diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index dcd5658e9..66e84e8fb 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -10,11 +10,9 @@ import sbt.io.IO import sbt.io.syntax._ import sjsonnew.JsonFormat +import sjsonnew.support.murmurhash.Hasher object Tracked { - - import CacheImplicits.LongJsonFormat - /** * Creates a tracker that provides the last time it was evaluated. * If the function throws an exception. @@ -33,7 +31,10 @@ object Tracked { * If 'useStartTime' is false, the recorded time is when the evaluated function completes. * In both cases, the timestamp is not updated if the function throws an exception. 
*/ - def tstamp(store: CacheStore, useStartTime: Boolean): Timestamp = new Timestamp(store, useStartTime) + def tstamp(store: CacheStore, useStartTime: Boolean): Timestamp = { + import CacheImplicits.LongJsonFormat + new Timestamp(store, useStartTime) + } /** * Creates a tracker that provides the last time it was evaluated. @@ -74,7 +75,10 @@ object Tracked { * recent invocation. */ def inputChanged[I: JsonFormat: SingletonCache, O](store: CacheStore)(f: (Boolean, I) => O): I => O = { in => - val cache: SingletonCache[I] = implicitly + val cache: SingletonCache[Long] = { + import CacheImplicits.LongJsonFormat + implicitly + } val help = new CacheHelp(cache) val changed = help.changed(store, in) val result = f(changed, in) @@ -90,15 +94,20 @@ object Tracked { def inputChanged[I: JsonFormat: SingletonCache, O](cacheFile: File)(f: (Boolean, I) => O): I => O = inputChanged(CacheStore(cacheFile))(f) - private final class CacheHelp[I: JsonFormat](val sc: SingletonCache[I]) { + private final class CacheHelp[I: JsonFormat](val sc: SingletonCache[Long]) { + import CacheImplicits.implicitHashWriter def save(store: CacheStore, value: I): Unit = { store.write(value) } def changed(store: CacheStore, value: I): Boolean = Try { store.read[I] } match { - case Success(prev) => !sc.equiv.equiv(value, prev) - case Failure(_) => true + case Success(prev) => + Hasher.hash(value) match { + case Success(keyHash) => keyHash.toLong != prev + case Failure(_) => true + } + case Failure(_) => true } } From 8b5210f84d68bf2921102c6a85af8ab02beca829 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 14 Jul 2017 15:56:06 -0400 Subject: [PATCH 689/823] Bump to latest Contraband --- .../contraband-scala/sbt/internal/util/AbstractEntry.scala | 2 +- .../main/contraband-scala/sbt/internal/util/StringEvent.scala | 2 +- .../main/contraband-scala/sbt/internal/util/TraceEvent.scala | 2 +- .../src/main/scala/sbt/internal/util/ObjectEvent.scala | 2 +- 
.../main/scala/sbt/internal/util/codec/JValueFormats.scala | 2 +- project/plugins.sbt | 4 +++- util-cache/src/main/scala/sbt/util/CacheStore.scala | 2 +- util-cache/src/test/scala/CacheSpec.scala | 2 +- util-cache/src/test/scala/FileInfoSpec.scala | 2 +- util-cache/src/test/scala/SingletonCacheSpec.scala | 2 +- 10 files changed, 12 insertions(+), 10 deletions(-) diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala index 5f8b37c07..09948f70f 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/AbstractEntry.scala @@ -16,7 +16,7 @@ abstract class AbstractEntry( case _ => false } override def hashCode: Int = { - 37 * (37 * (37 * (17 + "AbstractEntry".##) + channelName.##) + execId.##) + 37 * (37 * (37 * (17 + "sbt.internal.util.AbstractEntry".##) + channelName.##) + execId.##) } override def toString: String = { "AbstractEntry(" + channelName + ", " + execId + ")" diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala index 71763458c..83e697ec6 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala @@ -17,7 +17,7 @@ final class StringEvent private ( case _ => false } override def hashCode: Int = { - 37 * (37 * (37 * (37 * (37 * (17 + "StringEvent".##) + level.##) + message.##) + channelName.##) + execId.##) + 37 * (37 * (37 * (37 * (37 * (17 + "sbt.internal.util.StringEvent".##) + level.##) + message.##) + channelName.##) + execId.##) } override def toString: String = { "StringEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" diff --git 
a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala index 85312aff4..7775220fc 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala @@ -17,7 +17,7 @@ final class TraceEvent private ( case _ => false } override def hashCode: Int = { - 37 * (37 * (37 * (37 * (37 * (17 + "TraceEvent".##) + level.##) + message.##) + channelName.##) + execId.##) + 37 * (37 * (37 * (37 * (37 * (17 + "sbt.internal.util.TraceEvent".##) + level.##) + message.##) + channelName.##) + execId.##) } override def toString: String = { "TraceEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala index 674e74673..f8f288c21 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala @@ -5,7 +5,7 @@ package util import sbt.util.Level import sjsonnew.JsonFormat import sjsonnew.support.scalajson.unsafe.Converter -import scalajson.ast.unsafe.JValue +import sjsonnew.shaded.scalajson.ast.unsafe.JValue final class ObjectEvent[A]( val level: Level.Value, diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala index e800bcff0..4e11f26e8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala @@ -7,7 +7,7 @@ package internal package util.codec import sjsonnew.{ JsonWriter => JW, JsonReader => JR, JsonFormat => JF, _ } -import 
scalajson.ast.unsafe._ +import sjsonnew.shaded.scalajson.ast.unsafe._ trait JValueFormats { self: sjsonnew.BasicJsonProtocol => implicit val JNullFormat: JF[JNull.type] = new JF[JNull.type] { diff --git a/project/plugins.sbt b/project/plugins.sbt index 396603956..9e3900c5b 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,6 @@ addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.4.0") addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.0-M1") addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M7") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M9") + +resolvers += Resolver.sonatypeRepo("public") diff --git a/util-cache/src/main/scala/sbt/util/CacheStore.scala b/util-cache/src/main/scala/sbt/util/CacheStore.scala index 9ccac0d76..e29999471 100644 --- a/util-cache/src/main/scala/sbt/util/CacheStore.scala +++ b/util-cache/src/main/scala/sbt/util/CacheStore.scala @@ -5,7 +5,7 @@ import sbt.io.syntax.fileToRichFile import sbt.io.{ IO, Using } import sjsonnew.{ IsoString, JsonReader, JsonWriter, SupportConverter } import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } -import scalajson.ast.unsafe.JValue +import sjsonnew.shaded.scalajson.ast.unsafe.JValue /** A `CacheStore` is used by the caching infrastructure to persist cached information. 
*/ abstract class CacheStore extends Input with Output { diff --git a/util-cache/src/test/scala/CacheSpec.scala b/util-cache/src/test/scala/CacheSpec.scala index bce7b9af1..00481d227 100644 --- a/util-cache/src/test/scala/CacheSpec.scala +++ b/util-cache/src/test/scala/CacheSpec.scala @@ -8,7 +8,7 @@ import CacheImplicits._ import sjsonnew.IsoString import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } -import scalajson.ast.unsafe.JValue +import sjsonnew.shaded.scalajson.ast.unsafe.JValue import sbt.internal.util.UnitSpec class CacheSpec extends UnitSpec { diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index 813e85371..debd427c7 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -1,6 +1,6 @@ package sbt.util -import scalajson.ast.unsafe._ +import sjsonnew.shaded.scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import sbt.internal.util.UnitSpec diff --git a/util-cache/src/test/scala/SingletonCacheSpec.scala b/util-cache/src/test/scala/SingletonCacheSpec.scala index 15265f312..9bfab82f8 100644 --- a/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -8,7 +8,7 @@ import CacheImplicits._ import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilder } import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } -import scalajson.ast.unsafe.JValue +import sjsonnew.shaded.scalajson.ast.unsafe.JValue import sbt.internal.util.UnitSpec class SingletonCacheSpec extends UnitSpec { From c1a12d5ee7746db96bf54374f4e1f496f62010ac Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 15 Jul 2017 13:24:26 -0400 Subject: [PATCH 690/823] bump IO --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 9afeed255..076545c47 
100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { val scala211 = "2.11.11" val scala212 = "2.12.2" - private val ioVersion = "1.0.0-M12" + private val ioVersion = "1.0.0-M13" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From 24d97aa10410e7d571a8a5aef8a74ba059f7f7dd Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 16 Jul 2017 18:48:28 -0400 Subject: [PATCH 691/823] Fixes Tracked.inputChanged Tracked.inputChanged stores and reads hash correctly. Fixes #96 --- .../src/main/scala/sbt/util/Tracked.scala | 14 +++++++++----- .../src/test/scala/sbt/util/TrackedSpec.scala | 6 +++--- 2 files changed, 12 insertions(+), 8 deletions(-) diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index 66e84e8fb..823e9bde9 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -96,16 +96,20 @@ object Tracked { private final class CacheHelp[I: JsonFormat](val sc: SingletonCache[Long]) { import CacheImplicits.implicitHashWriter + import CacheImplicits.LongJsonFormat def save(store: CacheStore, value: I): Unit = { - store.write(value) + Hasher.hash(value) match { + case Success(keyHash) => store.write[Long](keyHash.toLong) + case Failure(e) => () + } } def changed(store: CacheStore, value: I): Boolean = - Try { store.read[I] } match { - case Success(prev) => + Try { store.read[Long] } match { + case Success(prev: Long) => Hasher.hash(value) match { - case Success(keyHash) => keyHash.toLong != prev - case Failure(_) => true + case Success(keyHash: Int) => keyHash.toLong != prev + case Failure(_) => true } case Failure(_) => true } diff --git a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index 90513cad6..4fa480ca7 100644 --- a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala +++ 
b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -51,10 +51,10 @@ class TrackedSpec extends UnitSpec { "inputChanged" should "detect that the input has not changed" in { withStore { store => - val input0 = 0 + val input0 = "foo" val res0 = - Tracked.inputChanged[Int, Int](store) { + Tracked.inputChanged[String, String](store) { case (true, in) => assert(in === input0) in @@ -64,7 +64,7 @@ class TrackedSpec extends UnitSpec { assert(res0 === input0) val res1 = - Tracked.inputChanged[Int, Int](store) { + Tracked.inputChanged[String, String](store) { case (true, in) => fail() case (false, in) => From f653800cb30a7c5b0c675ec759da20673f6b1546 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 16 Jul 2017 19:28:02 -0400 Subject: [PATCH 692/823] Add back outputChanged Fixes #79 --- .../src/main/scala/sbt/util/Tracked.scala | 68 +++++++++++++++++++ .../src/test/scala/sbt/util/TrackedSpec.scala | 29 ++++++++ 2 files changed, 97 insertions(+) diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index 823e9bde9..83ac2df0e 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -70,9 +70,66 @@ object Tracked { def lastOutput[I, O: JsonFormat](cacheFile: File)(f: (I, Option[O]) => O): I => O = lastOutput(CacheStore(cacheFile))(f) + /** + * Creates a tracker that indicates whether the output returned from `p` has changed or not. 
+ * + * {{{ + * val cachedTask = inputChanged(cache / "inputs") { (inChanged, in: Inputs) => + * Tracked.outputChanged(cache / "output") { (outChanged, outputs: FilesInfo[PlainFileInfo]) => + * if (inChanged || outChanged) { + * doSomething(label, sources, classpath, outputDirectory, options, log) + * } + * } + * } + * cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) + * }}} + */ + def outputChanged[A1: JsonFormat, A2](store: CacheStore)(f: (Boolean, A1) => A2): (() => A1) => A2 = p => { + val cache: SingletonCache[Long] = { + import CacheImplicits.LongJsonFormat + implicitly + } + val initial = p() + val help = new CacheHelp(cache) + val changed = help.changed(store, initial) + val result = f(changed, initial) + if (changed) { + help.save(store, initial) + } + result + } + + /** + * Creates a tracker that indicates whether the output returned from `p` has changed or not. + * + * {{{ + * val cachedTask = inputChanged(cache / "inputs") { (inChanged, in: Inputs) => + * Tracked.outputChanged(cache / "output") { (outChanged, outputs: FilesInfo[PlainFileInfo]) => + * if (inChanged || outChanged) { + * doSomething(label, sources, classpath, outputDirectory, options, log) + * } + * } + * } + * cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) + * }}} + */ + def outputChanged[A1: JsonFormat, A2](cacheFile: File)(f: (Boolean, A1) => A2): (() => A1) => A2 = + outputChanged[A1, A2](CacheStore(cacheFile))(f) + /** * Creates a tracker that indicates whether the arguments given to f have changed since the most * recent invocation. 
+ * + * {{{ + * val cachedTask = inputChanged(cache / "inputs") { (inChanged, in: Inputs) => + * Tracked.outputChanged(cache / "output") { (outChanged, outputs: FilesInfo[PlainFileInfo]) => + * if (inChanged || outChanged) { + * doSomething(label, sources, classpath, outputDirectory, options, log) + * } + * } + * } + * cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) + * }}} */ def inputChanged[I: JsonFormat: SingletonCache, O](store: CacheStore)(f: (Boolean, I) => O): I => O = { in => val cache: SingletonCache[Long] = { @@ -90,6 +147,17 @@ object Tracked { /** * Creates a tracker that indicates whether the arguments given to f have changed since the most * recent invocation. + * + * {{{ + * val cachedTask = inputChanged(cache / "inputs") { (inChanged, in: Inputs) => + * Tracked.outputChanged(cache / "output") { (outChanged, outputs: FilesInfo[PlainFileInfo]) => + * if (inChanged || outChanged) { + * doSomething(label, sources, classpath, outputDirectory, options, log) + * } + * } + * } + * cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) + * }}} */ def inputChanged[I: JsonFormat: SingletonCache, O](cacheFile: File)(f: (Boolean, I) => O): I => O = inputChanged(CacheStore(cacheFile))(f) diff --git a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index 4fa480ca7..e0cf74558 100644 --- a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala +++ b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -104,6 +104,35 @@ class TrackedSpec extends UnitSpec { } } + "outputChanged" should "detect that the output has not changed" in { + withStore { store => + val input0: String = "foo" + val p0: () => String = () => input0 + + val res0 = + Tracked.outputChanged[String, String](store) { + case (true, in) => + assert(in === input0) + in + case (false, in) => + fail() + }(implicitly)(p0) + assert(res0 === input0) + + val res1 = + Tracked.outputChanged[String, 
String](store) { + case (true, in) => + fail() + case (false, in) => + assert(in === input0) + in + }(implicitly)(p0) + assert(res1 === input0) + + } + } + + "tstamp tracker" should "have a timestamp of 0 on first invocation" in { withStore { store => Tracked.tstamp(store) { last => From 28ab7ac79de4883f72cd5f00224971a7cd4e02c9 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 17 Jul 2017 10:05:58 +0200 Subject: [PATCH 693/823] Fix logger printing too many newlines --- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index c6261a9e6..6e329c259 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -371,7 +371,7 @@ class ConsoleAppender private[ConsoleAppender] ( val cleanedMsg = if (!useFormat) EscHelpers.removeEscapeSequences(msg) else msg - out.println(cleanedMsg) + out.print(cleanedMsg) } private def writeLine(line: String): Unit = From e744985b6b167cf9bdf3ed8c951e2f1f8e8dbe53 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 17 Jul 2017 10:42:39 +0100 Subject: [PATCH 694/823] Add a RESET at the front of the appendLog line --- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index c6261a9e6..28bb8b152 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -362,7 +362,7 @@ class ConsoleAppender private[ConsoleAppender] ( private def appendLog(labelColor: 
String, label: String, messageColor: String, message: String): Unit = out.lockObject.synchronized { message.lines.foreach { line => - val labeledLine = s"[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" + val labeledLine = s"$RESET[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" writeLine(labeledLine) } } From ddcc9091956656a665393d52c80eb51cfee18500 Mon Sep 17 00:00:00 2001 From: Martin Duhem Date: Mon, 17 Jul 2017 14:36:03 +0200 Subject: [PATCH 695/823] Re-fix ConsoleAppender It turns out that calling `print` on `ConsoleOut` doesn't flush the output. Remove `writeLine` and let `write` directly use `println`. --- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 7 ++----- 1 file changed, 2 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index cc98f2269..422e420fb 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -363,7 +363,7 @@ class ConsoleAppender private[ConsoleAppender] ( out.lockObject.synchronized { message.lines.foreach { line => val labeledLine = s"$RESET[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" - writeLine(labeledLine) + write(labeledLine) } } @@ -371,12 +371,9 @@ class ConsoleAppender private[ConsoleAppender] ( val cleanedMsg = if (!useFormat) EscHelpers.removeEscapeSequences(msg) else msg - out.print(cleanedMsg) + out.println(cleanedMsg) } - private def writeLine(line: String): Unit = - write(line + EOL) - private def appendMessage(level: Level.Value, msg: Message): Unit = msg match { case o: ObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } From a38d3d6c694da2ab9bf070cd3c2fd5122b79fcc6 Mon Sep 17 00:00:00 2001 From: Olli Helenius Date: Tue, 18 Jul 2017 19:57:57 +0300 Subject: 
[PATCH 696/823] Delegate ansiCodesSupported to ConsoleAppender Fixes sbt/sbt#3336. --- .../src/main/scala/sbt/internal/util/ManagedLogger.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 794107111..a0b69929c 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -54,4 +54,5 @@ class ManagedLogger( new ObjectMessage(entry) ) } + override def ansiCodesSupported = ConsoleAppender.formatEnabledInEnv } From 855638243ad8508ff23656c2d676fa0d2a15fdb6 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 19 Jul 2017 21:36:11 -0400 Subject: [PATCH 697/823] sbt 1.0.0-RC2 --- project/build.properties | 2 +- project/plugins.sbt | 4 ---- 2 files changed, 1 insertion(+), 5 deletions(-) diff --git a/project/build.properties b/project/build.properties index cd66fd542..c9e698a14 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.0.0-M6 +sbt.version=1.0.0-RC2 diff --git a/project/plugins.sbt b/project/plugins.sbt index 9e3900c5b..1547f7984 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,6 +1,2 @@ -addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.4.0") -addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.0-M1") addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M9") - -resolvers += Resolver.sonatypeRepo("public") From da9084b2d87ca9972f9ba4975ad2e7a63a7c3eb7 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 20 Jul 2017 17:09:06 +0100 Subject: [PATCH 698/823] Add, configure & enable MiMa --- .travis.yml | 2 +- build.sbt | 32 +++++++++++++++++++++++--------- project/plugins.sbt | 1 + 3 files changed, 25 insertions(+), 10 deletions(-) diff --git a/.travis.yml b/.travis.yml index 
b0bab8379..f0d1274c1 100644 --- a/.travis.yml +++ b/.travis.yml @@ -6,7 +6,7 @@ scala: - 2.12.2 script: - - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" test + - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues test cache: directories: diff --git a/build.sbt b/build.sbt index 3f8885f73..116b9ca31 100644 --- a/build.sbt +++ b/build.sbt @@ -25,11 +25,16 @@ def commonSettings: Seq[Setting[_]] = Seq( }, scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", - // mimaPreviousArtifacts := Set(), // Some(organization.value %% moduleName.value % "1.0.0"), publishArtifact in Compile := true, publishArtifact in Test := false ) +val mimaSettings = Def settings ( + mimaPreviousArtifacts := Set(organization.value % moduleName.value % "1.0.0-M28" + cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) + ) +) + lazy val utilRoot: Project = (project in file(".")). aggregate( utilInterface, utilControl, utilPosition, @@ -67,18 +72,21 @@ lazy val utilInterface = (project in internalPath / "util-interface"). commonSettings, javaOnlySettings, name := "Util Interface", - exportJars := true + exportJars := true, + mimaSettings, ) lazy val utilControl = (project in internalPath / "util-control"). settings( commonSettings, - name := "Util Control" + name := "Util Control", + mimaSettings, ) val utilPosition = (project in file("internal") / "util-position").settings( commonSettings, - name := "Util Position" + name := "Util Position", + mimaSettings, ) // logging @@ -97,6 +105,7 @@ lazy val utilLogging = (project in internalPath / "util-logging"). if (name == "Throwable") Nil else old(tpe) }, + mimaSettings, ) // Relation @@ -104,7 +113,8 @@ lazy val utilRelation = (project in internalPath / "util-relation"). dependsOn(utilTesting % Test). 
settings( commonSettings, - name := "Util Relation" + name := "Util Relation", + mimaSettings, ) // Persisted caching based on sjson-new @@ -113,7 +123,8 @@ lazy val utilCache = (project in file("util-cache")). settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value) + libraryDependencies ++= Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value), + mimaSettings, ). configure(addSbtIO) @@ -122,7 +133,8 @@ lazy val utilTracking = (project in file("util-tracking")). dependsOn(utilCache, utilTesting % Test). settings( commonSettings, - name := "Util Tracking" + name := "Util Tracking", + mimaSettings, ). configure(addSbtIO) @@ -132,7 +144,8 @@ lazy val utilTesting = (project in internalPath / "util-testing"). commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Testing", - libraryDependencies ++= Seq(scalaCheck, scalatest) + libraryDependencies ++= Seq(scalaCheck, scalatest), + mimaSettings, ). configure(addSbtIO) @@ -147,7 +160,8 @@ lazy val utilScripted = (project in internalPath / "util-scripted"). case sv if sv startsWith "2.12" => Seq(parserCombinator211) case _ => Seq() } - } + }, + mimaSettings, ). 
configure(addSbtIO) diff --git a/project/plugins.sbt b/project/plugins.sbt index 1547f7984..1e3e62f5d 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,2 +1,3 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M9") +addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.14") From f6370063f4343471632063a1fba791ce0b2c07cf Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 24 Jul 2017 23:20:05 -0400 Subject: [PATCH 699/823] Deprecate ansiCodesSupported from the logger --- .../src/main/scala/sbt/internal/util/BufferedLogger.scala | 2 ++ .../src/main/scala/sbt/internal/util/FullLogger.scala | 2 ++ .../src/main/scala/sbt/internal/util/ManagedLogger.scala | 2 ++ .../src/main/scala/sbt/internal/util/MultiLogger.scala | 2 ++ internal/util-logging/src/main/scala/sbt/util/Logger.scala | 1 + 5 files changed, 9 insertions(+) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index be24152c1..93686c334 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -116,7 +116,9 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { /** Plays buffered events and disables buffering. 
*/ def stop(): Unit = synchronized { play(); clear() } + @deprecated("No longer used.", "1.0.0") override def ansiCodesSupported = delegate.ansiCodesSupported + override def setLevel(newLevel: Level.Value): Unit = synchronized { super.setLevel(newLevel) if (recording) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala index 1493e2d0f..c3ad40442 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala @@ -7,7 +7,9 @@ import sbt.util._ /** Promotes the simple Logger interface to the full AbstractLogger interface. */ class FullLogger(delegate: Logger) extends BasicLogger { + @deprecated("No longer used.", "1.0.0") override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported + def trace(t: => Throwable): Unit = { if (traceEnabled) delegate.trace(t) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index a0b69929c..5f284a77a 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -54,5 +54,7 @@ class ManagedLogger( new ObjectMessage(entry) ) } + + @deprecated("No longer used.", "1.0.0") override def ansiCodesSupported = ConsoleAppender.formatEnabledInEnv } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index c72d094af..c43d346f4 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -9,7 +9,9 @@ import sbt.util._ // note that setting the logging level on this logger has no effect on its behavior, only 
// on the behavior of the delegates. class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { + @deprecated("No longer used.", "1.0.0") override lazy val ansiCodesSupported = delegates exists supported + private[this] lazy val allSupportCodes = delegates forall supported private[this] def supported = (_: AbstractLogger).ansiCodesSupported diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala index abcad3428..0bcea3b78 100644 --- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -27,6 +27,7 @@ abstract class Logger extends xLogger { // sys.process.ProcessLogger final def out(message: => String): Unit = log(Level.Info, message) + @deprecated("No longer used.", "1.0.0") def ansiCodesSupported: Boolean = false def trace(t: => Throwable): Unit From d796084ff437b46f630af4abd394b7389392808e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 24 Jul 2017 23:54:25 -0400 Subject: [PATCH 700/823] Filter out color in ConsoleAppender only Fixes sbt/sbt#3348 Ref #101 The new logger, based on log4j, separates the concerns of the log producer (Logger) and the handlers that take action (Appender, e.g. for displaying on the console). As such, filtering of color should be performed only in the ConsoleAppender.
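The producer/consumer split this commit message describes can be sketched roughly as follows. All names here are hypothetical stand-ins for illustration, not sbt's actual classes: the producer emits events untouched, and each appender decides at render time whether to strip ANSI escape sequences.

```scala
// Hypothetical sketch of the Logger/Appender separation (illustrative only).
object ColorFilterSketch {
  final case class LogEvent(level: String, message: String)

  // Stand-in for EscHelpers.removeEscapeSequences: drop ANSI SGR color codes.
  def removeEscapeSequences(s: String): String =
    s.replaceAll("\u001b\\[[0-9;]*m", "")

  // Producer side: emits the event as-is, with no knowledge of terminals.
  def log(level: String, message: String): LogEvent = LogEvent(level, message)

  // Consumer side: each appender filters according to its own capabilities.
  def render(event: LogEvent, useFormat: Boolean): String = {
    val msg =
      if (useFormat) event.message
      else removeEscapeSequences(event.message)
    s"[${event.level}] $msg"
  }

  def main(args: Array[String]): Unit = {
    val e = log("info", "\u001b[32mdone\u001b[0m")
    println(render(e, useFormat = false)) // plain text for a dumb console
  }
}
```

The point of the design is that the same `LogEvent` can be fanned out to several appenders, each with a different `useFormat`, without the producer ever rewriting the message.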
--- .../sbt/internal/util/ConsoleAppender.scala | 20 ++++++++++------- .../scala/sbt/internal/util/MultiLogger.scala | 22 +++---------------- 2 files changed, 15 insertions(+), 27 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 422e420fb..d6764463a 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -1,6 +1,5 @@ package sbt.internal.util -import scala.compat.Platform.EOL import sbt.util._ import java.io.{ PrintStream, PrintWriter } import java.util.Locale @@ -269,11 +268,16 @@ class ConsoleAppender private[ConsoleAppender] ( useFormat: Boolean, suppressedMessage: SuppressedTraceContext => Option[String] ) extends AbstractAppender(name, null, LogExchange.dummyLayout, true) { - import scala.Console.{ BLUE, GREEN, RED, RESET, YELLOW } + import scala.Console.{ BLUE, GREEN, RED, YELLOW } - private final val SUCCESS_LABEL_COLOR = GREEN - private final val SUCCESS_MESSAGE_COLOR = RESET - private final val NO_COLOR = RESET + private val reset: String = { + if (ansiCodesSupported && useFormat) scala.Console.RESET + else "" + } + + private val SUCCESS_LABEL_COLOR = GREEN + private val SUCCESS_MESSAGE_COLOR = reset + private val NO_COLOR = reset override def append(event: XLogEvent): Unit = { val level = ConsoleAppender.toLevel(event.getLevel) @@ -333,7 +337,7 @@ class ConsoleAppender private[ConsoleAppender] ( * @return The formatted message. */ private def formatted(format: String, msg: String): String = - s"${RESET}${format}${msg}${RESET}" + s"$reset${format}${msg}$reset" /** * Select the right color for the label given `level`. 
@@ -362,14 +366,14 @@ class ConsoleAppender private[ConsoleAppender] ( private def appendLog(labelColor: String, label: String, messageColor: String, message: String): Unit = out.lockObject.synchronized { message.lines.foreach { line => - val labeledLine = s"$RESET[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" + val labeledLine = s"$reset[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" write(labeledLine) } } private def write(msg: String): Unit = { val cleanedMsg = - if (!useFormat) EscHelpers.removeEscapeSequences(msg) + if (!useFormat || !ansiCodesSupported) EscHelpers.removeEscapeSequences(msg) else msg out.println(cleanedMsg) } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index c43d346f4..ef82fa10f 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -11,8 +11,6 @@ import sbt.util._ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { @deprecated("No longer used.", "1.0.0") override lazy val ansiCodesSupported = delegates exists supported - - private[this] lazy val allSupportCodes = delegates forall supported private[this] def supported = (_: AbstractLogger).ansiCodesSupported override def setLevel(newLevel: Level.Value): Unit = { @@ -33,22 +31,8 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { def logAll(events: Seq[LogEvent]): Unit = delegates.foreach(_.logAll(events)) def control(event: ControlEvent.Value, message: => String): Unit = delegates.foreach(_.control(event, message)) private[this] def dispatch(event: LogEvent): Unit = { - val plainEvent = if (allSupportCodes) event else removeEscapes(event) - for (d <- delegates) - if (d.ansiCodesSupported) - d.log(event) - else - d.log(plainEvent) - } - - private[this] def removeEscapes(event: 
LogEvent): LogEvent = - { - import EscHelpers.{ removeEscapeSequences => rm } - event match { - case s: Success => new Success(rm(s.msg)) - case l: Log => new Log(l.level, rm(l.msg)) - case ce: ControlEvent => new ControlEvent(ce.event, rm(ce.msg)) - case _: Trace | _: SetLevel | _: SetTrace | _: SetSuccess => event - } + for (d <- delegates) { + d.log(event) } + } } From 5183f7ef81677093c2d50848a44555d639eff781 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 25 Jul 2017 16:16:21 -0400 Subject: [PATCH 701/823] Use event logging to send success Fixes sbt/sbt#3213 --- .../sbt/internal/util/SuccessEvent.scala | 32 +++++++++++++ .../internal/util/codec/JsonProtocol.scala | 1 + .../util/codec/SuccessEventFormats.scala | 27 +++++++++++ .../src/main/contraband/logging.contra | 4 ++ .../sbt/internal/util/ConsoleAppender.scala | 47 ++++++++++--------- .../sbt/internal/util/ManagedLogger.scala | 8 +++- .../util/codec/SuccessEventShowLines.scala | 14 ++++++ 7 files changed, 109 insertions(+), 24 deletions(-) create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala create mode 100644 internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala new file mode 100644 index 000000000..9fdcc8e09 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala @@ -0,0 +1,32 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +final class SuccessEvent private ( + val message: String) extends Serializable { + + + + override def equals(o: Any): Boolean = o match { + case x: SuccessEvent => (this.message == x.message) + case _ => false + } + override def hashCode: Int = { + 37 * (37 * (17 + "sbt.internal.util.SuccessEvent".##) + message.##) + } + override def toString: String = { + "SuccessEvent(" + message + ")" + } + protected[this] def copy(message: String = message): SuccessEvent = { + new SuccessEvent(message) + } + def withMessage(message: String): SuccessEvent = { + copy(message = message) + } +} +object SuccessEvent { + + def apply(message: String): SuccessEvent = new SuccessEvent(message) +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala index 4696c9612..a94906dda 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -8,4 +8,5 @@ trait JsonProtocol extends sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats with sbt.internal.util.codec.TraceEventFormats with sbt.internal.util.codec.AbstractEntryFormats + with sbt.internal.util.codec.SuccessEventFormats object JsonProtocol extends JsonProtocol \ No newline at end of file diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala new file mode 100644 index 000000000..19621d7c1 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala @@ -0,0 +1,27 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } +trait SuccessEventFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val SuccessEventFormat: JsonFormat[sbt.internal.util.SuccessEvent] = new JsonFormat[sbt.internal.util.SuccessEvent] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.SuccessEvent = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val message = unbuilder.readField[String]("message") + unbuilder.endObject() + sbt.internal.util.SuccessEvent(message) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: sbt.internal.util.SuccessEvent, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("message", obj.message) + builder.endObject() + } +} +} diff --git a/internal/util-logging/src/main/contraband/logging.contra b/internal/util-logging/src/main/contraband/logging.contra index 5cd31c230..19b019c66 100644 --- a/internal/util-logging/src/main/contraband/logging.contra +++ b/internal/util-logging/src/main/contraband/logging.contra @@ -21,3 +21,7 @@ type TraceEvent implements sbt.internal.util.AbstractEntry { channelName: String execId: String } + +type SuccessEvent { + message: String! +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index d6764463a..14cfe6ab2 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -285,13 +285,6 @@ class ConsoleAppender private[ConsoleAppender] ( appendMessage(level, message) } - // TODO: - // success is called by ConsoleLogger. - // This should turn into an event. 
- private[sbt] def success(message: => String): Unit = { - appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) - } - /** * Logs the stack trace of `t`, possibly shortening it. * @@ -371,6 +364,11 @@ class ConsoleAppender private[ConsoleAppender] ( } } + // success is called by ConsoleLogger. + private[sbt] def success(message: => String): Unit = { + appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) + } + private def write(msg: String): Unit = { val cleanedMsg = if (!useFormat || !ansiCodesSupported) EscHelpers.removeEscapeSequences(msg) @@ -380,27 +378,30 @@ class ConsoleAppender private[ConsoleAppender] ( private def appendMessage(level: Level.Value, msg: Message): Unit = msg match { - case o: ObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } - case o: ReusableObjectMessage => objectToLines(o.getParameter) foreach { appendLog(level, _) } + case o: ObjectMessage => appendMessageContent(level, o.getParameter) + case o: ReusableObjectMessage => appendMessageContent(level, o.getParameter) case _ => appendLog(level, msg.getFormattedMessage) } - private def objectToLines(o: AnyRef): Vector[String] = + private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = { + def appendEvent(oe: ObjectEvent[_]): Unit = + { + val contentType = oe.contentType + LogExchange.stringCodec[AnyRef](contentType) match { + case Some(codec) if contentType == "sbt.internal.util.SuccessEvent" => + codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { success(_) } + case Some(codec) => + codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { appendLog(level, _) } + case _ => appendLog(level, oe.message.toString) + } + } + o match { - case x: StringEvent => Vector(x.message) - case x: ObjectEvent[_] => objectEventToLines(x) - case _ => Vector(o.toString) + case x: StringEvent => Vector(x.message) foreach { appendLog(level, _) } + case x: ObjectEvent[_] => 
appendEvent(x) + case _ => Vector(o.toString) foreach { appendLog(level, _) } } - - private def objectEventToLines(oe: ObjectEvent[_]): Vector[String] = - { - val contentType = oe.contentType - LogExchange.stringCodec[AnyRef](contentType) match { - case Some(codec) => codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector - case _ => Vector(oe.message.toString) - } - } - + } } final class SuppressedTraceContext(val traceLevel: Int, val useFormat: Boolean) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 5f284a77a..5a9215ba8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -7,6 +7,7 @@ import sjsonnew.JsonFormat import scala.reflect.runtime.universe.TypeTag import sbt.internal.util.codec.ThrowableShowLines._ import sbt.internal.util.codec.TraceEventShowLines._ +import sbt.internal.util.codec.SuccessEventShowLines._ import sbt.internal.util.codec.JsonProtocol._ /** @@ -27,7 +28,11 @@ class ManagedLogger( new ObjectMessage(StringEvent(level.toString, message, channelName, execId)) ) } - override def success(message: => String): Unit = xlogger.info(message) + + // send special event for success since it's not a real log level + override def success(message: => String): Unit = { + infoEvent[SuccessEvent](SuccessEvent(message)) + } def registerStringCodec[A: ShowLines: TypeTag]: Unit = { @@ -38,6 +43,7 @@ class ManagedLogger( } registerStringCodec[Throwable] registerStringCodec[TraceEvent] + registerStringCodec[SuccessEvent] final def debugEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Debug, event) final def infoEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Info, event) final def warnEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Warn, event) diff --git 
a/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala new file mode 100644 index 000000000..e3b338719 --- /dev/null +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala @@ -0,0 +1,14 @@ +package sbt +package internal.util.codec + +import sbt.util.ShowLines +import sbt.internal.util.SuccessEvent + +trait SuccessEventShowLines { + implicit val sbtSuccessEventShowLines: ShowLines[SuccessEvent] = + ShowLines[SuccessEvent]( (e: SuccessEvent) => { + Vector(e.message) + }) +} + +object SuccessEventShowLines extends SuccessEventShowLines From 930489eba3d9b8b660ef8e1da50c92591f504825 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 26 Jul 2017 00:03:07 -0400 Subject: [PATCH 702/823] Fix ConsoleAppender to show full stack trace This is a modification of the crash log event logging that was added in sbt/util#85. Instead of using the hardcoded 0 as the default value, this introduces `setTrace(..)` to `ConsoleAppender` like `BasicLogger`. Also the default value is set to `Int.MaxValue`, which will display the full stack trace.
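The mutable trace level this commit introduces can be sketched as below. This is a simplified, hypothetical stand-in rather than the actual `ConsoleAppender` code: the level defaults to `Int.MaxValue` (full stack trace) and is read and written under the instance's own lock.

```scala
// Simplified sketch of the setTrace/getTrace idea (illustrative only).
class TraceLevelSketch {
  // Int.MaxValue means "no limit": show the full stack trace by default.
  private var traceEnabledVar: Int = Int.MaxValue

  def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level }
  def getTrace: Int = synchronized { traceEnabledVar }

  // Render at most getTrace lines of the throwable (header line included).
  def show(t: Throwable): List[String] = {
    val lines = t.toString :: t.getStackTrace.toList.map(e => s"\tat $e")
    if (getTrace == Int.MaxValue) lines else lines.take(getTrace)
  }
}
```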
Fixes sbt/sbt#3343 --- .../sbt/internal/util/ConsoleAppender.scala | 27 ++++++++++++++++++- .../scala/sbt/internal/util/StackTrace.scala | 13 ++++----- 2 files changed, 33 insertions(+), 7 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 14cfe6ab2..ef392e785 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -279,6 +279,15 @@ class ConsoleAppender private[ConsoleAppender] ( private val SUCCESS_MESSAGE_COLOR = reset private val NO_COLOR = reset + private var traceEnabledVar: Int = Int.MaxValue + + def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level } + + /** + * Returns the maximum number of stack trace lines to print. + */ + def getTrace: Int = synchronized { traceEnabledVar } + override def append(event: XLogEvent): Unit = { val level = ConsoleAppender.toLevel(event.getLevel) val message = event.getMessage @@ -383,11 +392,27 @@ class ConsoleAppender private[ConsoleAppender] ( case _ => appendLog(level, msg.getFormattedMessage) } + private def appendTraceEvent(te: TraceEvent): Unit = { + val traceLevel = getTrace + val throwableShowLines: ShowLines[Throwable] = + ShowLines[Throwable]( (t: Throwable) => { + List(StackTrace.trimmed(t, traceLevel)) + }) + val codec: ShowLines[TraceEvent] = + ShowLines[TraceEvent]( (t: TraceEvent) => { + throwableShowLines.showLines(t.message) + }) + codec.showLines(te).toVector foreach { appendLog(Level.Error, _) } + } + private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = { def appendEvent(oe: ObjectEvent[_]): Unit = { val contentType = oe.contentType - LogExchange.stringCodec[AnyRef](contentType) match { + if (contentType == "sbt.internal.util.TraceEvent") { + appendTraceEvent(oe.message.asInstanceOf[TraceEvent]) + } + else 
LogExchange.stringCodec[AnyRef](contentType) match { case Some(codec) if contentType == "sbt.internal.util.SuccessEvent" => codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { success(_) } case Some(codec) => diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala index e636d914c..20821eefb 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala @@ -9,12 +9,13 @@ object StackTrace { * Return a printable representation of the stack trace associated * with t. Information about t and its Throwable causes is included. * The number of lines to be included for each Throwable is configured - * via d which should be greater than or equal to zero. If d is zero, - * then all elements are included up to (but not including) the first - * element that comes from sbt. If d is greater than zero, then up to - * that many lines are included, where the line for the Throwable is - * counted plus one line for each stack element. Less lines will be - * included if there are not enough stack elements. + * via d which should be greater than or equal to 0. + * + * - If d is 0, then all elements are included up to (but not including) + * the first element that comes from sbt. + * - If d is greater than 0, then up to that many lines are included, + * where the line for the Throwable is counted plus one line for each stack element. + * Fewer lines will be included if there are not enough stack elements. 
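[Editorial note] The documented contract for the `d` parameter above can be exercised as follows; this REPL-style snippet assumes sbt's util-logging module is on the classpath, and the values chosen are illustrative only:

```scala
// Illustrates the documented semantics of StackTrace.trimmed(t, d).
import sbt.internal.util.StackTrace

val boom = new RuntimeException("boom")

// d = 0: include every frame up to (but not including) the first
// frame that originates from sbt itself.
val full: String = StackTrace.trimmed(boom, 0)

// d = 3: at most three lines total -- one for the Throwable itself,
// then up to two stack frames (fewer if the stack is shorter).
val short: String = StackTrace.trimmed(boom, 3)
```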
*/ def trimmed(t: Throwable, d: Int): String = { require(d >= 0) From 96b9d27c73f79099d6570191a894821a283efa82 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 26 Jul 2017 10:52:35 +0100 Subject: [PATCH 703/823] Upgrade to mima 0.1.15 & exclude a false positive --- build.sbt | 7 ++++++- project/plugins.sbt | 2 +- 2 files changed, 7 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 116b9ca31..c50a1d7b1 100644 --- a/build.sbt +++ b/build.sbt @@ -1,6 +1,6 @@ import Dependencies._ import Util._ -// import com.typesafe.tools.mima.core._, ProblemFilters._ +import com.typesafe.tools.mima.core._, ProblemFilters._ def baseVersion = "1.0.0-SNAPSHOT" def internalPath = file("internal") @@ -106,6 +106,11 @@ lazy val utilLogging = (project in internalPath / "util-logging"). else old(tpe) }, mimaSettings, + mimaBinaryIssueFilters ++= Seq( + // abstract method SuccessEventFormat()sjsonnew.JsonFormat in trait sbt.internal.util.codec.SuccessEventFormats is inherited by class JsonProtocol in current version. 
+ // I think this is a false positive: https://github.com/typesafehub/migration-manager/issues/187 + ProblemFilters.exclude[InheritedNewAbstractMethodProblem]("sbt.internal.util.codec.SuccessEventFormats.SuccessEventFormat") + ) ) // Relation diff --git a/project/plugins.sbt b/project/plugins.sbt index 1e3e62f5d..3411b8c51 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,3 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M9") -addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.14") +addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.15") From 5f5a70cca7e0a1004788e0eb95313b98e770b449 Mon Sep 17 00:00:00 2001 From: Dale Wijnand <344610+dwijnand@users.noreply.github.com> Date: Fri, 28 Jul 2017 07:34:48 +0100 Subject: [PATCH 704/823] Remove undeclared log4j-slf4j-impl dependency --- project/Dependencies.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 076545c47..35e0850b9 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -51,6 +51,5 @@ object Dependencies { def log4jVersion = "2.8.1" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion - val log4jSlf4jImpl = "org.apache.logging.log4j" % "log4j-slf4j-impl" % log4jVersion val disruptor = "com.lmax" % "disruptor" % "3.3.6" } From 867cd3fa8f4d92e6671cddc4ce26bda609ab61b7 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 28 Jul 2017 11:21:22 -0400 Subject: [PATCH 705/823] Contraband 0.3.0 --- project/plugins.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/plugins.sbt b/project/plugins.sbt index 3411b8c51..1a287c7b4 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,3 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0-M9") 
+addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0") addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.15") From 5fe8128906dba0908c9c05f83a6ba083a8d218eb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 28 Jul 2017 11:22:52 -0400 Subject: [PATCH 706/823] Bump Scala to 2.12.3 and IO to 1.0.0-RC3 --- .travis.yml | 2 +- project/Dependencies.scala | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/.travis.yml b/.travis.yml index f0d1274c1..b859331fe 100644 --- a/.travis.yml +++ b/.travis.yml @@ -3,7 +3,7 @@ jdk: oraclejdk8 scala: - 2.11.11 - - 2.12.2 + - 2.12.3 script: - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues test diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 35e0850b9..7006bee30 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,9 +5,9 @@ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala210 = "2.10.6" val scala211 = "2.11.11" - val scala212 = "2.12.2" + val scala212 = "2.12.3" - private val ioVersion = "1.0.0-M13" + private val ioVersion = "1.0.0-RC3" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From 2a7226dffbe1354ffbca9d6b0e7d4b6cf84e7717 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 31 Jul 2017 11:35:52 +0100 Subject: [PATCH 707/823] Bump mimaPreviousArtifacts to 1.0.0-RC3 --- build.sbt | 9 ++------- 1 file changed, 2 insertions(+), 7 deletions(-) diff --git a/build.sbt b/build.sbt index c50a1d7b1..4015f18d7 100644 --- a/build.sbt +++ b/build.sbt @@ -1,6 +1,6 @@ import Dependencies._ import Util._ -import com.typesafe.tools.mima.core._, ProblemFilters._ +//import com.typesafe.tools.mima.core._, ProblemFilters._ def baseVersion = "1.0.0-SNAPSHOT" def internalPath = file("internal") @@ -30,7 +30,7 @@ def commonSettings: Seq[Setting[_]] = Seq( ) val mimaSettings = Def settings ( - mimaPreviousArtifacts := Set(organization.value % moduleName.value % 
"1.0.0-M28" + mimaPreviousArtifacts := Set(organization.value % moduleName.value % "1.0.0-RC3" cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) ) ) @@ -106,11 +106,6 @@ lazy val utilLogging = (project in internalPath / "util-logging"). else old(tpe) }, mimaSettings, - mimaBinaryIssueFilters ++= Seq( - // abstract method SuccessEventFormat()sjsonnew.JsonFormat in trait sbt.internal.util.codec.SuccessEventFormats is inherited by class JsonProtocol in current version. - // I think this is a false positive: https://github.com/typesafehub/migration-manager/issues/187 - ProblemFilters.exclude[InheritedNewAbstractMethodProblem]("sbt.internal.util.codec.SuccessEventFormats.SuccessEventFormat") - ) ) // Relation From bb2b4fc9611686f04d0907248fbcdb5c710fd6d6 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 31 Jul 2017 14:54:28 +0100 Subject: [PATCH 708/823] Upgrade to sbt 1.0.0-RC3 --- project/build.properties | 2 +- project/plugins.sbt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/project/build.properties b/project/build.properties index c9e698a14..12c38d389 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.0.0-RC2 +sbt.version=1.0.0-RC3 diff --git a/project/plugins.sbt b/project/plugins.sbt index 1a287c7b4..493521644 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,3 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0") -addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.15") +addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.17") From ad4afa0a187bc6ba9715681dfc869b1113dcfa42 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 8 Aug 2017 15:31:58 +0100 Subject: [PATCH 709/823] Deprecated Changed Fixes #114 --- util-tracking/src/main/scala/sbt/util/Tracked.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala 
b/util-tracking/src/main/scala/sbt/util/Tracked.scala index 83ac2df0e..2cca19325 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -207,6 +207,7 @@ class Timestamp(val store: CacheStore, useStartTime: Boolean)(implicit format: J Try { store.read[Long] } getOrElse 0 } +@deprecated("Use Tracked.inputChanged and Tracked.outputChanged instead", "1.0.1") class Changed[O: Equiv: JsonFormat](val store: CacheStore) extends Tracked { def clean() = store.delete() def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => From b8ac05aa7fe9de89fcfec3388aeade3be3a1687c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 9 Aug 2017 19:14:10 -0400 Subject: [PATCH 710/823] IO 1.0.0 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 7006bee30..73e3a8304 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { val scala211 = "2.11.11" val scala212 = "2.12.3" - private val ioVersion = "1.0.0-RC3" + private val ioVersion = "1.0.0" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From d31b9c509384ecaf37f2ffaec8731250e3c60532 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 10 Aug 2017 11:24:29 +0100 Subject: [PATCH 711/823] Add, configure & enable Scalafmt --- .scalafmt.conf | 10 + .travis.yml | 2 +- build.sbt | 149 ++++++------ .../sbt/internal/util/ErrorHandling.scala | 17 +- .../scala/sbt/internal/util/ExitHook.scala | 10 +- .../sbt/internal/util/BufferedLogger.scala | 65 ++++-- .../sbt/internal/util/ConsoleAppender.scala | 106 +++++---- .../scala/sbt/internal/util/ConsoleOut.scala | 4 +- .../sbt/internal/util/GlobalLogging.scala | 62 +++-- .../sbt/internal/util/LoggerWriter.scala | 10 +- .../scala/sbt/internal/util/MainLogging.scala | 102 ++++++--- .../sbt/internal/util/ManagedLogger.scala | 59 +++-- 
.../scala/sbt/internal/util/MultiLogger.scala | 9 +- .../scala/sbt/internal/util/ObjectEvent.scala | 32 +-- .../scala/sbt/internal/util/StackTrace.scala | 3 +- .../sbt/internal/util/StringTypeTag.scala | 15 +- .../internal/util/codec/JValueFormats.scala | 8 +- .../internal/util/codec/PositionFormats.scala | 1 - .../internal/util/codec/ProblemFormats.scala | 1 - .../internal/util/codec/SeverityFormats.scala | 1 - .../util/codec/SuccessEventShowLines.scala | 2 +- .../util/codec/ThrowableShowLines.scala | 4 +- .../main/scala/sbt/util/AbtractLogger.scala | 1 + .../main/scala/sbt/util/InterfaceUtil.scala | 46 ++-- .../src/main/scala/sbt/util/Level.scala | 2 + .../src/main/scala/sbt/util/LogEvent.scala | 2 +- .../src/main/scala/sbt/util/LogExchange.scala | 12 +- .../src/main/scala/sbt/util/Logger.scala | 47 ++-- .../util-logging/src/test/scala/Escapes.scala | 85 ++++--- .../src/test/scala/LogWriterTest.scala | 58 +++-- .../src/test/scala/ManagedLoggerSpec.scala | 21 +- .../src/test/scala/TestLogger.scala | 13 +- .../scala/sbt/internal/util/Relation.scala | 69 ++++-- .../src/test/scala/RelationTest.scala | 51 ++--- .../internal/scripted/CommentHandler.scala | 2 +- .../sbt/internal/scripted/FileCommands.scala | 33 ++- .../internal/scripted/FilteredLoader.scala | 15 +- .../internal/scripted/HandlersProvider.scala | 2 +- .../sbt/internal/scripted/ScriptRunner.scala | 19 +- .../sbt/internal/scripted/ScriptedTests.scala | 216 ++++++++++-------- .../internal/scripted/StatementHandler.scala | 7 +- .../internal/scripted/TestScriptParser.scala | 74 +++--- project/Dependencies.scala | 12 +- project/plugins.sbt | 3 +- .../src/main/scala/sbt/util/Cache.scala | 3 +- .../main/scala/sbt/util/CacheImplicits.scala | 3 +- .../src/main/scala/sbt/util/CacheStore.scala | 28 ++- .../src/main/scala/sbt/util/FileInfo.scala | 13 +- .../src/main/scala/sbt/util/Input.scala | 4 +- .../src/main/scala/sbt/util/Output.scala | 3 +- .../main/scala/sbt/util/SeparatedCache.scala | 2 + 
.../main/scala/sbt/util/StampedFormat.scala | 15 +- util-cache/src/test/scala/CacheSpec.scala | 9 +- .../src/test/scala/SingletonCacheSpec.scala | 9 +- .../main/scala/sbt/util/ChangeReport.scala | 28 ++- .../main/scala/sbt/util/FileFunction.scala | 45 ++-- .../src/main/scala/sbt/util/Tracked.scala | 148 +++++++----- .../src/test/scala/sbt/util/TrackedSpec.scala | 5 +- 58 files changed, 1075 insertions(+), 702 deletions(-) create mode 100644 .scalafmt.conf diff --git a/.scalafmt.conf b/.scalafmt.conf new file mode 100644 index 000000000..e4ab36511 --- /dev/null +++ b/.scalafmt.conf @@ -0,0 +1,10 @@ +maxColumn = 100 +project.git = true +project.excludeFilters = [ /sbt-test/, /input_sources/, /contraband-scala/ ] + +# http://docs.scala-lang.org/style/scaladoc.html recommends the JavaDoc style. +# scala/scala is written that way too https://github.com/scala/scala/blob/v2.12.2/src/library/scala/Predef.scala +docstrings = JavaDoc + +# This also seems more idiomatic to include whitespace in import x.{ yyy } +spaces.inImportCurlyBraces = true diff --git a/.travis.yml b/.travis.yml index b859331fe..1af85e971 100644 --- a/.travis.yml +++ b/.travis.yml @@ -6,7 +6,7 @@ scala: - 2.12.3 script: - - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues test + - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues scalafmt::test test:scalafmt::test sbt:scalafmt::test test cache: directories: diff --git a/build.sbt b/build.sbt index 4015f18d7..1c300d5ac 100644 --- a/build.sbt +++ b/build.sbt @@ -3,7 +3,7 @@ import Util._ //import com.typesafe.tools.mima.core._, ProblemFilters._ def baseVersion = "1.0.0-SNAPSHOT" -def internalPath = file("internal") +def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( scalaVersion := scala212, @@ -18,42 +18,53 @@ def commonSettings: Seq[Setting[_]] = Seq( scalacOptions := { val old = scalacOptions.value 
scalaVersion.value match { - case sv if sv.startsWith("2.10") => old diff List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") + case sv if sv.startsWith("2.10") => + old diff List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") case sv if sv.startsWith("2.11") => old ++ List("-Ywarn-unused", "-Ywarn-unused-import") case _ => old ++ List("-Ywarn-unused", "-Ywarn-unused-import", "-YdisableFlatCpCaching") } }, scalacOptions in console in Compile -= "-Ywarn-unused-import", - scalacOptions in console in Test -= "-Ywarn-unused-import", + scalacOptions in console in Test -= "-Ywarn-unused-import", publishArtifact in Compile := true, publishArtifact in Test := false ) val mimaSettings = Def settings ( - mimaPreviousArtifacts := Set(organization.value % moduleName.value % "1.0.0-RC3" - cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) + mimaPreviousArtifacts := Set( + organization.value % moduleName.value % "1.0.0-RC3" + cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) ) ) -lazy val utilRoot: Project = (project in file(".")). - aggregate( - utilInterface, utilControl, utilPosition, - utilLogging, utilRelation, utilCache, utilTracking, utilTesting, +lazy val utilRoot: Project = (project in file(".")) + .aggregate( + utilInterface, + utilControl, + utilPosition, + utilLogging, + utilRelation, + utilCache, + utilTracking, + utilTesting, utilScripted - ). 
- settings( - inThisBuild(Seq( - git.baseVersion := baseVersion, - version := { - val v = version.value - if (v contains "SNAPSHOT") git.baseVersion.value - else v - }, - bintrayPackage := "util", - homepage := Some(url("https://github.com/sbt/util")), - description := "Util module for sbt", - scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")) - )), + ) + .settings( + inThisBuild( + Seq( + git.baseVersion := baseVersion, + version := { + val v = version.value + if (v contains "SNAPSHOT") git.baseVersion.value + else v + }, + bintrayPackage := "util", + homepage := Some(url("https://github.com/sbt/util")), + description := "Util module for sbt", + scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), + scalafmtOnCompile := true, + scalafmtVersion := "1.1.0", + )), commonSettings, name := "Util Root", publish := {}, @@ -67,21 +78,19 @@ lazy val utilRoot: Project = (project in file(".")). // defines Java structures used across Scala versions, such as the API structures and relationships extracted by // the analysis compiler phases and passed back to sbt. The API structures are defined in a simple // format from which Java sources are generated by the datatype generator Projproject -lazy val utilInterface = (project in internalPath / "util-interface"). - settings( - commonSettings, - javaOnlySettings, - name := "Util Interface", - exportJars := true, - mimaSettings, - ) +lazy val utilInterface = (project in internalPath / "util-interface").settings( + commonSettings, + javaOnlySettings, + name := "Util Interface", + exportJars := true, + mimaSettings, +) -lazy val utilControl = (project in internalPath / "util-control"). 
- settings( - commonSettings, - name := "Util Control", - mimaSettings, - ) +lazy val utilControl = (project in internalPath / "util-control").settings( + commonSettings, + name := "Util Control", + mimaSettings, +) val utilPosition = (project in file("internal") / "util-position").settings( commonSettings, @@ -90,14 +99,15 @@ val utilPosition = (project in file("internal") / "util-position").settings( ) // logging -lazy val utilLogging = (project in internalPath / "util-logging"). - enablePlugins(ContrabandPlugin, JsonCodecPlugin). - dependsOn(utilInterface, utilTesting % Test). - settings( +lazy val utilLogging = (project in internalPath / "util-logging") + .enablePlugins(ContrabandPlugin, JsonCodecPlugin) + .dependsOn(utilInterface, utilTesting % Test) + .settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", - libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), + libraryDependencies ++= + Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala", contrabandFormatsForType in generateContrabands in Compile := { tpe => val old = (contrabandFormatsForType in generateContrabands in Compile).value @@ -109,68 +119,69 @@ lazy val utilLogging = (project in internalPath / "util-logging"). ) // Relation -lazy val utilRelation = (project in internalPath / "util-relation"). - dependsOn(utilTesting % Test). - settings( +lazy val utilRelation = (project in internalPath / "util-relation") + .dependsOn(utilTesting % Test) + .settings( commonSettings, name := "Util Relation", mimaSettings, ) // Persisted caching based on sjson-new -lazy val utilCache = (project in file("util-cache")). - dependsOn(utilTesting % Test). 
- settings( +lazy val utilCache = (project in file("util-cache")) + .dependsOn(utilTesting % Test) + .settings( commonSettings, name := "Util Cache", - libraryDependencies ++= Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value), + libraryDependencies ++= + Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value), mimaSettings, - ). - configure(addSbtIO) + ) + .configure(addSbtIO) // Builds on cache to provide caching for filesystem-related operations -lazy val utilTracking = (project in file("util-tracking")). - dependsOn(utilCache, utilTesting % Test). - settings( +lazy val utilTracking = (project in file("util-tracking")) + .dependsOn(utilCache, utilTesting % Test) + .settings( commonSettings, name := "Util Tracking", mimaSettings, - ). - configure(addSbtIO) + ) + .configure(addSbtIO) // Internal utility for testing -lazy val utilTesting = (project in internalPath / "util-testing"). - settings( +lazy val utilTesting = (project in internalPath / "util-testing") + .settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Testing", libraryDependencies ++= Seq(scalaCheck, scalatest), mimaSettings, - ). - configure(addSbtIO) + ) + .configure(addSbtIO) -lazy val utilScripted = (project in internalPath / "util-scripted"). - dependsOn(utilLogging, utilInterface). - settings( +lazy val utilScripted = (project in internalPath / "util-scripted") + .dependsOn(utilLogging, utilInterface) + .settings( commonSettings, name := "Util Scripted", libraryDependencies ++= { scalaVersion.value match { case sv if sv startsWith "2.11" => Seq(parserCombinator211) case sv if sv startsWith "2.12" => Seq(parserCombinator211) - case _ => Seq() + case _ => Seq() } }, mimaSettings, - ). 
- configure(addSbtIO) + ) + .configure(addSbtIO) def customCommands: Seq[Setting[_]] = Seq( commands += Command.command("release") { state => // "clean" :: "+compile" :: - "+publishSigned" :: - "reload" :: - state + "+publishSigned" :: + "reload" :: + state } ) diff --git a/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala b/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala index ae0d5443e..f9b101453 100644 --- a/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala +++ b/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala @@ -7,23 +7,20 @@ import java.io.IOException object ErrorHandling { def translate[T](msg: => String)(f: => T) = - try { f } - catch { + try { f } catch { case e: IOException => throw new TranslatedIOException(msg + e.toString, e) case e: Exception => throw new TranslatedException(msg + e.toString, e) } def wideConvert[T](f: => T): Either[Throwable, T] = - try { Right(f) } - catch { + try { Right(f) } catch { case ex @ (_: Exception | _: StackOverflowError) => Left(ex) case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err case x: Throwable => Left(x) } def convert[T](f: => T): Either[Exception, T] = - try { Right(f) } - catch { case e: Exception => Left(e) } + try { Right(f) } catch { case e: Exception => Left(e) } def reducedToString(e: Throwable): String = if (e.getClass == classOf[RuntimeException]) { @@ -32,7 +29,11 @@ object ErrorHandling { } else e.toString } -sealed class TranslatedException private[sbt] (msg: String, cause: Throwable) extends RuntimeException(msg, cause) { + +sealed class TranslatedException private[sbt] (msg: String, cause: Throwable) + extends RuntimeException(msg, cause) { override def toString = msg } -final class TranslatedIOException private[sbt] (msg: String, cause: IOException) extends TranslatedException(msg, cause) + +final class TranslatedIOException private[sbt] (msg: String, cause: IOException) + extends 
TranslatedException(msg, cause) diff --git a/internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala b/internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala index 823c64b01..09d25aa3e 100644 --- a/internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala +++ b/internal/util-control/src/main/scala/sbt/internal/util/ExitHook.scala @@ -5,16 +5,20 @@ package sbt.internal.util /** Defines a function to call as sbt exits.*/ trait ExitHook { + /** Subclasses should implement this method, which is called when this hook is executed. */ def runBeforeExiting(): Unit + } + object ExitHook { def apply(f: => Unit): ExitHook = new ExitHook { def runBeforeExiting() = f } } object ExitHooks { + /** Calls each registered exit hook, trapping any exceptions so that each hook is given a chance to run. */ def runExitHooks(exitHooks: Seq[ExitHook]): Seq[Throwable] = - exitHooks.flatMap(hook => - ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption) -} \ No newline at end of file + exitHooks.flatMap(hook => ErrorHandling.wideConvert(hook.runBeforeExiting()).left.toOption) + +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index 93686c334..b19fc2509 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -13,41 +13,43 @@ import java.util.concurrent.atomic.AtomicInteger object BufferedAppender { def generateName: String = "buffered-" + generateId.incrementAndGet + private val generateId: AtomicInteger = new AtomicInteger + def apply(delegate: Appender): BufferedAppender = apply(generateName, delegate) - def apply(name: String, delegate: Appender): BufferedAppender = - { - val appender = new BufferedAppender(name, delegate) - appender.start - appender - } + + def apply(name: String, delegate: Appender): 
BufferedAppender = { + val appender = new BufferedAppender(name, delegate) + appender.start + appender + } } /** - * Am appender that can buffer the logging done on it and then can flush the buffer + * An appender that can buffer the logging done on it and then can flush the buffer * to the delegate appender provided in the constructor. Use 'record()' to * start buffering and then 'play' to flush the buffer to the backing appender. * The logging level set at the time a message is originally logged is used, not * the level at the time 'play' is called. */ -class BufferedAppender private[BufferedAppender] (name: String, delegate: Appender) extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { +class BufferedAppender private[BufferedAppender] (name: String, delegate: Appender) + extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { + private[this] val buffer = new ListBuffer[XLogEvent] private[this] var recording = false - def append(event: XLogEvent): Unit = - { - if (recording) { - buffer += event.toImmutable - } else delegate.append(event) - } + def append(event: XLogEvent): Unit = { + if (recording) { + buffer += event.toImmutable + } else delegate.append(event) + } /** Enables buffering. */ def record() = synchronized { recording = true } def buffer[T](f: => T): T = { record() - try { f } - finally { stopQuietly() } + try { f } finally { stopQuietly() } } def bufferQuietly[T](f: => T): T = { record() @@ -70,10 +72,13 @@ class BufferedAppender private[BufferedAppender] (name: String, delegate: Append } buffer.clear() } + /** Clears buffered events and disables buffering. */ def clearBuffer(): Unit = synchronized { buffer.clear(); recording = false } + /** Plays buffered events and disables buffering. 
   */
  def stopBuffer(): Unit = synchronized { play(); clearBuffer() }
+
 }

 /**
@@ -93,8 +98,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger {
   def record() = synchronized { recording = true }
   def buffer[T](f: => T): T = {
     record()
-    try { f }
-    finally { stopQuietly() }
+    try { f } finally { stopQuietly() }
   }
   def bufferQuietly[T](f: => T): T = {
     record()
@@ -111,8 +115,10 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger {
   * so that the messages are written consecutively. The buffer is cleared in the process.
   */
  def play(): Unit = synchronized { delegate.logAll(buffer.toList); buffer.clear() }
+
  /** Clears buffered events and disables buffering. */
  def clear(): Unit = synchronized { buffer.clear(); recording = false }
+
  /** Plays buffered events and disables buffering. */
  def stop(): Unit = synchronized { play(); clear() }
@@ -127,6 +133,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger {
     delegate.setLevel(newLevel)
     ()
   }
+
   override def setSuccessEnabled(flag: Boolean): Unit = synchronized {
     super.setSuccessEnabled(flag)
     if (recording)
@@ -135,6 +142,7 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger {
     delegate.setSuccessEnabled(flag)
     ()
   }
+
   override def setTrace(level: Int): Unit = synchronized {
     super.setTrace(level)
     if (recording)
@@ -144,12 +152,14 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger {
     ()
   }

-  def trace(t: => Throwable): Unit =
-    doBufferableIf(traceEnabled, new Trace(t), _.trace(t))
+  def trace(t: => Throwable): Unit = doBufferableIf(traceEnabled, new Trace(t), _.trace(t))
+
   def success(message: => String): Unit =
     doBufferable(Level.Info, new Success(message), _.success(message))
+
   def log(level: Level.Value, message: => String): Unit =
     doBufferable(level, new Log(level, message), _.log(level, message))
+
   def logAll(events: Seq[LogEvent]): Unit = synchronized {
     if (recording)
       buffer ++= events
@@ -157,11 +167,22 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger {
     delegate.logAll(events)
     ()
   }
+
   def control(event: ControlEvent.Value, message: => String): Unit =
     doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message))
-  private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit =
+
+  private def doBufferable(
+      level: Level.Value,
+      appendIfBuffered: => LogEvent,
+      doUnbuffered: AbstractLogger => Unit
+  ): Unit =
     doBufferableIf(atLevel(level), appendIfBuffered, doUnbuffered)
-  private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: AbstractLogger => Unit): Unit = synchronized {
+
+  private def doBufferableIf(
+      condition: => Boolean,
+      appendIfBuffered: => LogEvent,
+      doUnbuffered: AbstractLogger => Unit
+  ): Unit = synchronized {
     if (condition) {
       if (recording)
         buffer += appendIfBuffered
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala
index ef392e785..0685c69f0 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala
@@ -51,10 +51,13 @@ object ConsoleLogger {
   * @param suppressedMessage How to show suppressed stack traces.
   * @return A new `ConsoleLogger` that logs to `out`.
   */
-  def apply(out: ConsoleOut = ConsoleOut.systemOut,
-            ansiCodesSupported: Boolean = ConsoleAppender.formatEnabledInEnv,
-            useFormat: Boolean = ConsoleAppender.formatEnabledInEnv,
-            suppressedMessage: SuppressedTraceContext => Option[String] = ConsoleAppender.noSuppressedMessage): ConsoleLogger =
+  def apply(
+      out: ConsoleOut = ConsoleOut.systemOut,
+      ansiCodesSupported: Boolean = ConsoleAppender.formatEnabledInEnv,
+      useFormat: Boolean = ConsoleAppender.formatEnabledInEnv,
+      suppressedMessage: SuppressedTraceContext => Option[String] =
+        ConsoleAppender.noSuppressedMessage
+  ): ConsoleLogger =
     new ConsoleLogger(out, ansiCodesSupported, useFormat, suppressedMessage)
 }

@@ -62,10 +65,12 @@ object ConsoleLogger {
  * A logger that logs to the console. On supported systems, the level labels are
  * colored.
  */
-class ConsoleLogger private[ConsoleLogger] (out: ConsoleOut,
-                                            override val ansiCodesSupported: Boolean,
-                                            useFormat: Boolean,
-                                            suppressedMessage: SuppressedTraceContext => Option[String]) extends BasicLogger {
+class ConsoleLogger private[ConsoleLogger] (
+    out: ConsoleOut,
+    override val ansiCodesSupported: Boolean,
+    useFormat: Boolean,
+    suppressedMessage: SuppressedTraceContext => Option[String]
+) extends BasicLogger {

   private[sbt] val appender: ConsoleAppender =
     ConsoleAppender(generateName(), out, ansiCodesSupported, useFormat, suppressedMessage)
@@ -160,7 +165,11 @@ object ConsoleAppender {
   * @param suppressedMessage How to handle stack traces.
   * @return A new `ConsoleAppender` that writes to `out`.
   */
-  def apply(name: String, out: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender =
+  def apply(
+      name: String,
+      out: ConsoleOut,
+      suppressedMessage: SuppressedTraceContext => Option[String]
+  ): ConsoleAppender =
     apply(name, out, formatEnabledInEnv, formatEnabledInEnv, suppressedMessage)

   /**
@@ -184,14 +193,16 @@ object ConsoleAppender {
   * formatting.
   * @return A new `ConsoleAppender` that writes to `out`.
   */
-  def apply(name: String,
-            out: ConsoleOut,
-            ansiCodesSupported: Boolean,
-            useFormat: Boolean,
-            suppressedMessage: SuppressedTraceContext => Option[String]): ConsoleAppender = {
-    val appender = new ConsoleAppender(name, out, ansiCodesSupported, useFormat, suppressedMessage)
-    appender.start
-    appender
+  def apply(
+      name: String,
+      out: ConsoleOut,
+      ansiCodesSupported: Boolean,
+      useFormat: Boolean,
+      suppressedMessage: SuppressedTraceContext => Option[String]
+  ): ConsoleAppender = {
+    val appender = new ConsoleAppender(name, out, ansiCodesSupported, useFormat, suppressedMessage)
+    appender.start
+    appender
   }

   /**
@@ -242,7 +253,9 @@ object ConsoleAppender {
       // this results in a linkage error as detected below. The detection is likely jvm specific, but the priority
       // is avoiding mistakenly identifying something as a launcher incompatibility when it is not
       case e: IncompatibleClassChangeError if e.getMessage == jline1to2CompatMsg =>
-        throw new IncompatibleClassChangeError("JLine incompatibility detected. Check that the sbt launcher is version 0.13.x or later.")
+        throw new IncompatibleClassChangeError(
+          "JLine incompatibility detected. Check that the sbt launcher is version 0.13.x or later."
+        )
     }

   private[this] def os = System.getProperty("os.name")
@@ -262,11 +275,11 @@ object ConsoleAppender {
  * This logger is not thread-safe.
  */
 class ConsoleAppender private[ConsoleAppender] (
-    name: String,
-    out: ConsoleOut,
-    ansiCodesSupported: Boolean,
-    useFormat: Boolean,
-    suppressedMessage: SuppressedTraceContext => Option[String]
+  name: String,
+  out: ConsoleOut,
+  ansiCodesSupported: Boolean,
+  useFormat: Boolean,
+  suppressedMessage: SuppressedTraceContext => Option[String]
 ) extends AbstractAppender(name, null, LogExchange.dummyLayout, true) {

   import scala.Console.{ BLUE, GREEN, RED, YELLOW }
@@ -275,12 +288,12 @@ class ConsoleAppender private[ConsoleAppender] (
     else ""
   }

-  private val SUCCESS_LABEL_COLOR = GREEN
+  private val SUCCESS_LABEL_COLOR = GREEN
   private val SUCCESS_MESSAGE_COLOR = reset
-  private val NO_COLOR = reset
+  private val NO_COLOR = reset

   private var traceEnabledVar: Int = Int.MaxValue
-
+
   def setTrace(level: Int): Unit = synchronized { traceEnabledVar = level }

   /**
@@ -307,9 +320,11 @@ class ConsoleAppender private[ConsoleAppender] (
     out.lockObject.synchronized {
       if (traceLevel >= 0)
         write(StackTrace.trimmed(t, traceLevel))
-      if (traceLevel <= 2)
-        for (msg <- suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useFormat)))
+      if (traceLevel <= 2) {
+        val ctx = new SuppressedTraceContext(traceLevel, ansiCodesSupported && useFormat)
+        for (msg <- suppressedMessage(ctx))
           appendLog(NO_COLOR, "trace", NO_COLOR, msg)
+      }
     }

   /**
@@ -365,10 +380,16 @@ class ConsoleAppender private[ConsoleAppender] (
   * @param messageColor The color to use to format the message.
   * @param message The message to write.
   */
-  private def appendLog(labelColor: String, label: String, messageColor: String, message: String): Unit =
+  private def appendLog(
+      labelColor: String,
+      label: String,
+      messageColor: String,
+      message: String
+  ): Unit =
     out.lockObject.synchronized {
       message.lines.foreach { line =>
-        val labeledLine = s"$reset[${formatted(labelColor, label)}] ${formatted(messageColor, line)}"
+        val labeledLine =
+          s"$reset[${formatted(labelColor, label)}] ${formatted(messageColor, line)}"
         write(labeledLine)
       }
     }
@@ -395,31 +416,30 @@ class ConsoleAppender private[ConsoleAppender] (
   private def appendTraceEvent(te: TraceEvent): Unit = {
     val traceLevel = getTrace
     val throwableShowLines: ShowLines[Throwable] =
-      ShowLines[Throwable]( (t: Throwable) => {
+      ShowLines[Throwable]((t: Throwable) => {
         List(StackTrace.trimmed(t, traceLevel))
       })
     val codec: ShowLines[TraceEvent] =
-      ShowLines[TraceEvent]( (t: TraceEvent) => {
+      ShowLines[TraceEvent]((t: TraceEvent) => {
         throwableShowLines.showLines(t.message)
       })
     codec.showLines(te).toVector foreach { appendLog(Level.Error, _) }
   }

-  private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = {
-    def appendEvent(oe: ObjectEvent[_]): Unit =
-      {
-        val contentType = oe.contentType
-        if (contentType == "sbt.internal.util.TraceEvent") {
-          appendTraceEvent(oe.message.asInstanceOf[TraceEvent])
-        }
-        else LogExchange.stringCodec[AnyRef](contentType) match {
+  private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = {
+    def appendEvent(oe: ObjectEvent[_]): Unit = {
+      val contentType = oe.contentType
+      if (contentType == "sbt.internal.util.TraceEvent") {
+        appendTraceEvent(oe.message.asInstanceOf[TraceEvent])
+      } else
+        LogExchange.stringCodec[AnyRef](contentType) match {
           case Some(codec) if contentType == "sbt.internal.util.SuccessEvent" =>
             codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { success(_) }
           case Some(codec) =>
-            codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { appendLog(level, _) }
-          case _ => appendLog(level, oe.message.toString)
+            codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach (appendLog(level, _))
+          case _ => appendLog(level, oe.message.toString)
         }
-      }
+    }

     o match {
       case x: StringEvent => Vector(x.message) foreach { appendLog(level, _) }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala
index b9834d7e8..37af255cb 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala
@@ -12,8 +12,8 @@ sealed trait ConsoleOut {
 object ConsoleOut {
   def systemOut: ConsoleOut = printStreamOut(System.out)

-  def overwriteContaining(s: String): (String, String) => Boolean = (cur, prev) =>
-    cur.contains(s) && prev.contains(s)
+  def overwriteContaining(s: String): (String, String) => Boolean =
+    (cur, prev) => cur.contains(s) && prev.contains(s)

   /** Move to beginning of previous line and clear the line. */
   private[this] final val OverwriteLine = "\u001B[A\r\u001B[2K"
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala
index 1dcf9d9f9..7249bdd27 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/GlobalLogging.scala
@@ -16,10 +16,21 @@ import org.apache.logging.log4j.core.Appender
  * `backing` tracks the files that persist the global logging.
  * `newLogger` creates a new global logging configuration from a sink and backing configuration.
  */
-final case class GlobalLogging(full: ManagedLogger, console: ConsoleOut, backed: Appender,
-  backing: GlobalLogBacking, newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging)
+final case class GlobalLogging(
+    full: ManagedLogger,
+    console: ConsoleOut,
+    backed: Appender,
+    backing: GlobalLogBacking,
+    newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging
+)

-final case class GlobalLogging1(full: Logger, console: ConsoleOut, backed: AbstractLogger, backing: GlobalLogBacking, newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging1)
+final case class GlobalLogging1(
+    full: Logger,
+    console: ConsoleOut,
+    backed: AbstractLogger,
+    backing: GlobalLogBacking,
+    newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging1
+)

 /**
  * Tracks the files that persist the global logging.
@@ -27,6 +38,7 @@ final case class GlobalLogging1(full: Logger, console: ConsoleOut, backed: Abstr
  * `newBackingFile` creates a new temporary location for the next backing file.
  */
 final case class GlobalLogBacking(file: File, last: Option[File], newBackingFile: () => File) {
+
   /** Shifts the current backing file to `last` and sets the current backing to `newFile`. */
   def shift(newFile: File) = GlobalLogBacking(newFile, Some(file), newBackingFile)

@@ -38,32 +50,38 @@ final case class GlobalLogBacking(file: File, last: Option[File], newBackingFile
   * Otherwise, no changes are made.
   */
  def unshift = GlobalLogBacking(last getOrElse file, None, newBackingFile)
+
 }
+
 object GlobalLogBacking {
-  def apply(newBackingFile: => File): GlobalLogBacking = GlobalLogBacking(newBackingFile, None, newBackingFile _)
+  def apply(newBackingFile: => File): GlobalLogBacking =
+    GlobalLogBacking(newBackingFile, None, newBackingFile _)
 }

 object GlobalLogging {
   import java.util.concurrent.atomic.AtomicInteger
-  private def generateName: String =
-    "GlobalLogging" + generateId.incrementAndGet
+
+  private def generateName: String = "GlobalLogging" + generateId.incrementAndGet
   private val generateId: AtomicInteger = new AtomicInteger

-  def initial1(newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging1, newBackingFile: => File, console: ConsoleOut): GlobalLogging1 =
-    {
-      val log = ConsoleLogger(console)
-      GlobalLogging1(log, console, log, GlobalLogBacking(newBackingFile), newLogger)
-    }
+  def initial1(
+      newLogger: (PrintWriter, GlobalLogBacking) => GlobalLogging1,
+      newBackingFile: => File,
+      console: ConsoleOut
+  ): GlobalLogging1 = {
+    val log = ConsoleLogger(console)
+    GlobalLogging1(log, console, log, GlobalLogBacking(newBackingFile), newLogger)
+  }

-  def initial(newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging, newBackingFile: => File, console: ConsoleOut): GlobalLogging =
-    {
-      val loggerName = generateName
-      val log = LogExchange.logger(loggerName)
-      val appender = ConsoleAppender(ConsoleAppender.generateName, console)
-      LogExchange.bindLoggerAppenders(
-        loggerName, List(appender -> Level.Info)
-      )
-      GlobalLogging(log, console, appender, GlobalLogBacking(newBackingFile), newAppender)
-    }
+  def initial(
+      newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging,
+      newBackingFile: => File,
+      console: ConsoleOut
+  ): GlobalLogging = {
+    val loggerName = generateName
+    val log = LogExchange.logger(loggerName)
+    val appender = ConsoleAppender(ConsoleAppender.generateName, console)
+    LogExchange.bindLoggerAppenders(loggerName, List(appender -> Level.Info))
+    GlobalLogging(log, console, appender, GlobalLogBacking(newBackingFile), newAppender)
+  }
 }
-
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala b/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala
index 7b440c200..91ff8bc48 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/LoggerWriter.scala
@@ -9,7 +9,11 @@ import sbt.util._
  * Provides a `java.io.Writer` interface to a `Logger`. Content is line-buffered and logged at `level`.
  * A line is delimited by `nl`, which is by default the platform line separator.
  */
-class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: String = System.getProperty("line.separator")) extends java.io.Writer {
+class LoggerWriter(
+    delegate: Logger,
+    unbufferedLevel: Option[Level.Value],
+    nl: String = System.getProperty("line.separator")
+) extends java.io.Writer {
   def this(delegate: Logger, level: Level.Value) = this(delegate, Some(level))
   def this(delegate: Logger) = this(delegate, None)

@@ -17,6 +21,7 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S
   private[this] val lines = new collection.mutable.ListBuffer[String]

   override def close() = flush()
+
   override def flush(): Unit =
     synchronized {
       if (buffer.nonEmpty) {
@@ -24,12 +29,14 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S
         buffer.clear()
       }
     }
+
   def flushLines(level: Level.Value): Unit =
     synchronized {
       for (line <- lines)
         delegate.log(level, line)
       lines.clear()
     }
+
   override def write(content: Array[Char], offset: Int, length: Int): Unit =
     synchronized {
       buffer.appendAll(content, offset, length)
@@ -44,6 +51,7 @@ class LoggerWriter(delegate: Logger, unbufferedLevel: Option[Level.Value], nl: S
       process()
     }
   }
+
   private[this] def log(s: String): Unit =
     unbufferedLevel match {
       case None => lines += s; ()
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala
index dd08ba0bb..2a47587b0 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala
@@ -10,47 +10,78 @@ object MainAppender {
     "GlobalBacking" + generateId.incrementAndGet
   private val generateId: AtomicInteger = new AtomicInteger

-  def multiLogger(log: ManagedLogger, config: MainAppenderConfig): ManagedLogger =
-    {
-      import config._
-      // TODO
-      // console setTrace screenTrace
-      // backed setTrace backingTrace
-      // multi: Logger
+  def multiLogger(log: ManagedLogger, config: MainAppenderConfig): ManagedLogger = {
+    import config._
+    // TODO
+    // console setTrace screenTrace
+    // backed setTrace backingTrace
+    // multi: Logger

-      // val log = LogExchange.logger(loggerName)
-      LogExchange.unbindLoggerAppenders(log.name)
-      LogExchange.bindLoggerAppenders(
-        log.name,
-        (consoleOpt.toList map { _ -> screenLevel }) :::
-          List(backed -> backingLevel) :::
-          (extra map { x => (x -> Level.Info) })
-      )
-      log
-    }
+    // val log = LogExchange.logger(loggerName)
+    LogExchange.unbindLoggerAppenders(log.name)
+    LogExchange.bindLoggerAppenders(
+      log.name,
+      (consoleOpt.toList map { _ -> screenLevel }) :::
+        List(backed -> backingLevel) :::
+        (extra map { x =>
+          (x -> Level.Info)
+        })
+    )
+    log
+  }

-  def globalDefault(console: ConsoleOut): (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging =
-    {
-      lazy val newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging = (log, writer, backing) => {
+  def globalDefault(
+      console: ConsoleOut
+  ): (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging = {
+    lazy val newAppender: (ManagedLogger, PrintWriter, GlobalLogBacking) => GlobalLogging =
+      (log, writer, backing) => {
         val backed: Appender = defaultBacked(generateGlobalBackingName)(writer)
         val full = multiLogger(log, defaultMultiConfig(Option(console), backed, Nil))
         GlobalLogging(full, console, backed, backing, newAppender)
       }
-      newAppender
-    }
+    newAppender
+  }

-  def defaultMultiConfig(consoleOpt: Option[ConsoleOut], backing: Appender, extra: List[Appender]): MainAppenderConfig =
-    MainAppenderConfig(consoleOpt map { defaultScreen(_, ConsoleAppender.noSuppressedMessage) }, backing, extra,
-      Level.Info, Level.Debug, -1, Int.MaxValue)
-  def defaultScreen(console: ConsoleOut): Appender = ConsoleAppender(ConsoleAppender.generateName, console)
-  def defaultScreen(console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): Appender =
+  def defaultMultiConfig(
+      consoleOpt: Option[ConsoleOut],
+      backing: Appender,
+      extra: List[Appender]
+  ): MainAppenderConfig =
+    MainAppenderConfig(
+      consoleOpt map { defaultScreen(_, ConsoleAppender.noSuppressedMessage) },
+      backing,
+      extra,
+      Level.Info,
+      Level.Debug,
+      -1,
+      Int.MaxValue
+    )
+
+  def defaultScreen(console: ConsoleOut): Appender =
+    ConsoleAppender(ConsoleAppender.generateName, console)
+
+  def defaultScreen(
+      console: ConsoleOut,
+      suppressedMessage: SuppressedTraceContext => Option[String]
+  ): Appender =
     ConsoleAppender(ConsoleAppender.generateName, console, suppressedMessage = suppressedMessage)
-  def defaultScreen(name: String, console: ConsoleOut, suppressedMessage: SuppressedTraceContext => Option[String]): Appender =
+
+  def defaultScreen(
+      name: String,
+      console: ConsoleOut,
+      suppressedMessage: SuppressedTraceContext => Option[String]
+  ): Appender =
     ConsoleAppender(name, console, suppressedMessage = suppressedMessage)
-  def defaultBacked: PrintWriter => Appender = defaultBacked(generateGlobalBackingName, ConsoleAppender.formatEnabledInEnv)
-  def defaultBacked(loggerName: String): PrintWriter => Appender = defaultBacked(loggerName, ConsoleAppender.formatEnabledInEnv)
-  def defaultBacked(useFormat: Boolean): PrintWriter => Appender = defaultBacked(generateGlobalBackingName, useFormat)
+
+  def defaultBacked: PrintWriter => Appender =
+    defaultBacked(generateGlobalBackingName, ConsoleAppender.formatEnabledInEnv)
+
+  def defaultBacked(loggerName: String): PrintWriter => Appender =
+    defaultBacked(loggerName, ConsoleAppender.formatEnabledInEnv)
+
+  def defaultBacked(useFormat: Boolean): PrintWriter => Appender =
+    defaultBacked(generateGlobalBackingName, useFormat)
+
   def defaultBacked(loggerName: String, useFormat: Boolean): PrintWriter => Appender =
     to => {
       ConsoleAppender(
@@ -61,7 +92,12 @@ object MainAppender {
 }

 final case class MainAppenderConfig(
-  consoleOpt: Option[Appender], backed: Appender, extra: List[Appender],
-  screenLevel: Level.Value, backingLevel: Level.Value, screenTrace: Int, backingTrace: Int
+    consoleOpt: Option[Appender],
+    backed: Appender,
+    extra: List[Appender],
+    screenLevel: Level.Value,
+    backingLevel: Level.Value,
+    screenTrace: Int,
+    backingTrace: Int
 )
 }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala
index 5a9215ba8..0fa390c2e 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala
@@ -14,33 +14,31 @@ import sbt.internal.util.codec.JsonProtocol._
  * Delegates log events to the associated LogExchange.
  */
 class ManagedLogger(
-  val name: String,
-  val channelName: Option[String],
-  val execId: Option[String],
-  xlogger: XLogger
+    val name: String,
+    val channelName: Option[String],
+    val execId: Option[String],
+    xlogger: XLogger
 ) extends Logger {
   override def trace(t: => Throwable): Unit =
     logEvent(Level.Error, TraceEvent("Error", t, channelName, execId))
-  override def log(level: Level.Value, message: => String): Unit =
-    {
-      xlogger.log(
-        ConsoleAppender.toXLevel(level),
-        new ObjectMessage(StringEvent(level.toString, message, channelName, execId))
-      )
-    }
-
+  override def log(level: Level.Value, message: => String): Unit = {
+    xlogger.log(
+      ConsoleAppender.toXLevel(level),
+      new ObjectMessage(StringEvent(level.toString, message, channelName, execId))
+    )
+  }
+
   // send special event for success since it's not a real log level
   override def success(message: => String): Unit = {
     infoEvent[SuccessEvent](SuccessEvent(message))
   }
-  def registerStringCodec[A: ShowLines: TypeTag]: Unit =
-    {
-      val tag = StringTypeTag[A]
-      val ev = implicitly[ShowLines[A]]
-      // println(s"registerStringCodec ${tag.key}")
-      val _ = LogExchange.getOrElseUpdateStringCodec(tag.key, ev)
-    }
+  def registerStringCodec[A: ShowLines: TypeTag]: Unit = {
+    val tag = StringTypeTag[A]
+    val ev = implicitly[ShowLines[A]]
+    // println(s"registerStringCodec ${tag.key}")
+    val _ = LogExchange.getOrElseUpdateStringCodec(tag.key, ev)
+  }
   registerStringCodec[Throwable]
   registerStringCodec[TraceEvent]
   registerStringCodec[SuccessEvent]
@@ -48,18 +46,17 @@ class ManagedLogger(
   final def infoEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Info, event)
   final def warnEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Warn, event)
   final def errorEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Error, event)
-  def logEvent[A: JsonFormat: TypeTag](level: Level.Value, event: => A): Unit =
-    {
-      val v: A = event
-      val tag = StringTypeTag[A]
-      LogExchange.getOrElseUpdateJsonCodec(tag.key, implicitly[JsonFormat[A]])
-      // println("logEvent " + tag.key)
-      val entry: ObjectEvent[A] = ObjectEvent(level, v, channelName, execId, tag.key)
-      xlogger.log(
-        ConsoleAppender.toXLevel(level),
-        new ObjectMessage(entry)
-      )
-    }
+  def logEvent[A: JsonFormat: TypeTag](level: Level.Value, event: => A): Unit = {
+    val v: A = event
+    val tag = StringTypeTag[A]
+    LogExchange.getOrElseUpdateJsonCodec(tag.key, implicitly[JsonFormat[A]])
+    // println("logEvent " + tag.key)
+    val entry: ObjectEvent[A] = ObjectEvent(level, v, channelName, execId, tag.key)
+    xlogger.log(
+      ConsoleAppender.toXLevel(level),
+      new ObjectMessage(entry)
+    )
+  }

   @deprecated("No longer used.", "1.0.0")
   override def ansiCodesSupported = ConsoleAppender.formatEnabledInEnv
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala
index ef82fa10f..2d12a1b2f 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala
@@ -1,4 +1,3 @@
-
 /* sbt -- Simple Build Tool
  * Copyright 2008, 2009, 2010 Mark Harrah
  */
@@ -17,19 +16,25 @@ class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger {
     super.setLevel(newLevel)
     dispatch(new SetLevel(newLevel))
   }
+
   override def setTrace(level: Int): Unit = {
     super.setTrace(level)
     dispatch(new SetTrace(level))
   }
+
   override def setSuccessEnabled(flag: Boolean): Unit = {
     super.setSuccessEnabled(flag)
     dispatch(new SetSuccess(flag))
   }
+
   def trace(t: => Throwable): Unit = dispatch(new Trace(t))
   def log(level: Level.Value, message: => String): Unit = dispatch(new Log(level, message))
   def success(message: => String): Unit = dispatch(new Success(message))
   def logAll(events: Seq[LogEvent]): Unit = delegates.foreach(_.logAll(events))
-  def control(event: ControlEvent.Value, message: => String): Unit = delegates.foreach(_.control(event, message))
+
+  def control(event: ControlEvent.Value, message: => String): Unit =
+    delegates.foreach(_.control(event, message))
+
   private[this] def dispatch(event: LogEvent): Unit = {
     for (d <- delegates) {
       d.log(event)
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala
index f8f288c21..c2c92437d 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/ObjectEvent.scala
@@ -8,12 +8,12 @@ import sjsonnew.support.scalajson.unsafe.Converter
 import sjsonnew.shaded.scalajson.ast.unsafe.JValue

 final class ObjectEvent[A](
-  val level: Level.Value,
-  val message: A,
-  val channelName: Option[String],
-  val execId: Option[String],
-  val contentType: String,
-  val json: JValue
+    val level: Level.Value,
+    val message: A,
+    val channelName: Option[String],
+    val execId: Option[String],
+    val contentType: String,
+    val json: JValue
 ) extends Serializable {
   override def toString: String =
     s"ObjectEvent($level, $message, $channelName, $execId, $contentType, $json)"
@@ -21,12 +21,18 @@ final class ObjectEvent[A](

 object ObjectEvent {
   def apply[A: JsonFormat](
-    level: Level.Value,
-    message: A,
-    channelName: Option[String],
-    execId: Option[String],
-    contentType: String
+      level: Level.Value,
+      message: A,
+      channelName: Option[String],
+      execId: Option[String],
+      contentType: String
   ): ObjectEvent[A] =
-    new ObjectEvent(level, message, channelName, execId, contentType,
-      Converter.toJsonUnsafe(message))
+    new ObjectEvent(
+      level,
+      message,
+      channelName,
+      execId,
+      contentType,
+      Converter.toJsonUnsafe(message)
+    )
 }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala
index 20821eefb..66468e2d5 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala
@@ -5,6 +5,7 @@ package sbt.internal.util

 object StackTrace {
   def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt")
+
   /**
    * Return a printable representation of the stack trace associated
    * with t. Information about t and its Throwable causes is included.
@@ -59,6 +60,6 @@ object StackTrace {
       appendStackTrace(c, false)
     }
     b.toString()
-  }
+  }

 }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala
index 5b90d9a12..00f30d7d2 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala
@@ -8,14 +8,13 @@ final case class StringTypeTag[A](key: String) {
 }

 object StringTypeTag {
-  def apply[A: TypeTag]: StringTypeTag[A] =
-    {
-      val tag = implicitly[TypeTag[A]]
-      val tpe = tag.tpe
-      val k = typeToString(tpe)
-      // println(tpe.getClass.toString + " " + k)
-      StringTypeTag[A](k)
-    }
+  def apply[A: TypeTag]: StringTypeTag[A] = {
+    val tag = implicitly[TypeTag[A]]
+    val tpe = tag.tpe
+    val k = typeToString(tpe)
+    // println(tpe.getClass.toString + " " + k)
+    StringTypeTag[A](k)
+  }

   def typeToString(tpe: Type): String =
     tpe match {
       case TypeRef(_, sym, args) =>
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala
index 4e11f26e8..c0c79f7d1 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala
@@ -17,7 +17,10 @@ trait JValueFormats { self: sjsonnew.BasicJsonProtocol =>
   implicit val JBooleanFormat: JF[JBoolean] = projectFormat(_.get, (x: Boolean) => JBoolean(x))
   implicit val JStringFormat: JF[JString] = projectFormat(_.value, (x: String) => JString(x))
-  implicit val JNumberFormat: JF[JNumber] = projectFormat(x => BigDecimal(x.value), (x: BigDecimal) => JNumber(x.toString))
+
+  implicit val JNumberFormat: JF[JNumber] =
+    projectFormat(x => BigDecimal(x.value), (x: BigDecimal) => JNumber(x.toString))
+
   implicit val JArrayFormat: JF[JArray] = projectFormat[JArray, Array[JValue]](_.value, JArray(_))

   implicit lazy val JObjectJsonWriter: JW[JObject] = new JW[JObject] {
@@ -43,5 +46,6 @@ trait JValueFormats { self: sjsonnew.BasicJsonProtocol =>
     def read[J](j: Option[J], u: Unbuilder[J]) = ??? // Is this even possible? with no Manifest[J]?
   }

-  implicit lazy val JValueFormat: JF[JValue] = jsonFormat[JValue](JValueJsonReader, JValueJsonWriter)
+  implicit lazy val JValueFormat: JF[JValue] =
+    jsonFormat[JValue](JValueJsonReader, JValueJsonWriter)
 }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala
index d6ddf8049..e43ff03bf 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala
@@ -1,7 +1,6 @@
 /**
  * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]].
  */
-
 package sbt.internal.util.codec
 import _root_.sjsonnew.{ deserializationError, Builder, JsonFormat, Unbuilder }
 import xsbti.Position
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala
index cbb5f0010..9820289da 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala
@@ -1,7 +1,6 @@
 /**
  * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]].
  */
-
 package sbt.internal.util.codec
 import xsbti.{ Problem, Severity, Position }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/SeverityFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SeverityFormats.scala
index 7548a2ff1..d572a146f 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/SeverityFormats.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SeverityFormats.scala
@@ -1,7 +1,6 @@
 /**
  * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]].
  */
-
 package sbt.internal.util.codec
 import _root_.sjsonnew.{ deserializationError, Builder, JsonFormat, Unbuilder }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala
index e3b338719..99cd31539 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/SuccessEventShowLines.scala
@@ -6,7 +6,7 @@ import sbt.internal.util.SuccessEvent
 trait SuccessEventShowLines {
   implicit val sbtSuccessEventShowLines: ShowLines[SuccessEvent] =
-    ShowLines[SuccessEvent]( (e: SuccessEvent) => {
+    ShowLines[SuccessEvent]((e: SuccessEvent) => {
       Vector(e.message)
     })
 }
diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala
index 13abbdf8a..ace0b78fb 100644
--- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala
+++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ThrowableShowLines.scala
@@ -6,7 +6,7 @@ import sbt.internal.util.{ StackTrace, TraceEvent }
 trait ThrowableShowLines {
   implicit val sbtThrowableShowLines: ShowLines[Throwable] =
-    ShowLines[Throwable]( (t: Throwable) => {
+    ShowLines[Throwable]((t: Throwable) => {
       // 0 means enabled with default behavior. See StackTrace.scala.
       val traceLevel = 0
       List(StackTrace.trimmed(t, traceLevel))
@@ -17,7 +17,7 @@ object ThrowableShowLines extends ThrowableShowLines

 trait TraceEventShowLines {
   implicit val sbtTraceEventShowLines: ShowLines[TraceEvent] =
-    ShowLines[TraceEvent]( (t: TraceEvent) => {
+    ShowLines[TraceEvent]((t: TraceEvent) => {
       ThrowableShowLines.sbtThrowableShowLines.showLines(t.message)
     })
 }
diff --git a/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala b/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala
index 51b7f08b5..253238038 100644
--- a/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala
@@ -13,6 +13,7 @@ abstract class AbstractLogger extends Logger {
   def control(event: ControlEvent.Value, message: => String): Unit
   def logAll(events: Seq[LogEvent]): Unit
+
   /** Defined in terms of other methods in Logger and should not be called from them. */
   final def log(event: LogEvent): Unit = {
     event match {
diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala
index 63e4213cb..dc956ecbf 100644
--- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala
@@ -36,8 +36,15 @@ object InterfaceUtil {
       case None => Optional.empty[A]()
     }

-  def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer],
-    pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position =
+  def position(
+      line0: Option[Integer],
+      content: String,
+      offset0: Option[Integer],
+      pointer0: Option[Integer],
+      pointerSpace0: Option[String],
+      sourcePath0: Option[String],
+      sourceFile0: Option[File]
+  ): Position =
     new ConcretePosition(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0)

   def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem =
@@ -53,23 +60,22 @@ object InterfaceUtil {
         this.get2 == o.get2
       case _ => false
     }
-    override def hashCode: Int =
-      {
-        var hash = 1
-        hash = hash * 31 + this.get1.##
-        hash = hash * 31 + this.get2.##
-        hash
-      }
+    override def hashCode: Int = {
+      var hash = 1
+      hash = hash * 31 + this.get1.##
+      hash = hash * 31 + this.get2.##
+      hash
+    }
   }

   private final class ConcretePosition(
-    line0: Option[Integer],
-    content: String,
-    offset0: Option[Integer],
-    pointer0: Option[Integer],
-    pointerSpace0: Option[String],
-    sourcePath0: Option[String],
-    sourceFile0: Option[File]
+      line0: Option[Integer],
+      content: String,
+      offset0: Option[Integer],
+      pointer0: Option[Integer],
+      pointerSpace0: Option[String],
+      sourcePath0: Option[String],
+      sourceFile0: Option[File]
   ) extends Position {
     val line = o2jo(line0)
     val lineContent = content
@@ -81,10 +87,10 @@ object InterfaceUtil {
   }

   private final class ConcreteProblem(
-    cat: String,
-    pos: Position,
-    msg: String,
-    sev: Severity
+      cat: String,
+      pos: Position,
+      msg: String,
+      sev: Severity
   ) extends Problem {
     val category = cat
     val position = pos
diff --git a/internal/util-logging/src/main/scala/sbt/util/Level.scala b/internal/util-logging/src/main/scala/sbt/util/Level.scala
index 2f319cffd..fdc83178b 100644
--- a/internal/util-logging/src/main/scala/sbt/util/Level.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/Level.scala
@@ -12,6 +12,7 @@ object Level extends Enumeration {
   val Info = Value(2, "info")
   val Warn = Value(3, "warn")
   val Error = Value(4, "error")
+
   /**
    * Defines the label to use for success messages.
    * Because the label for levels is defined in this module, the success label is also defined here.
@@ -23,6 +24,7 @@ object Level extends Enumeration {

   /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. */
   def apply(s: String) = values.find(s == _.toString)
+
   /** Same as apply, defined for use in pattern matching. */
   private[sbt] def unapply(s: String) = apply(s)
 }
diff --git a/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala b/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala
index bfc962891..c6ab6eecb 100644
--- a/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/LogEvent.scala
@@ -14,4 +14,4 @@ final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends

 object ControlEvent extends Enumeration {
   val Start, Header, Finish = Value
-}
\ No newline at end of file
+}
diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala
index 150814bc5..ba3114643 100644
--- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala
@@ -24,8 +24,16 @@ sealed abstract class LogExchange {
     val _ = context
     val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x }
     val config = ctx.getConfiguration
-    val loggerConfig = LoggerConfig.createLogger(false, XLevel.DEBUG, name,
-      "true", Array[AppenderRef](), null, config, null)
+    val loggerConfig = LoggerConfig.createLogger(
+      false,
+      XLevel.DEBUG,
+      name,
+      "true",
+      Array[AppenderRef](),
+      null,
+      config,
+      null
+    )
     config.addLogger(name, loggerConfig)
     ctx.updateLoggers
     val logger = ctx.getLogger(name)
diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala
index 0bcea3b78..75d7a439d 100644
--- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala
+++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala
@@ -22,6 +22,7 @@ abstract class Logger extends xLogger {
   final def info(message: => String): Unit = log(Level.Info, message)
   final def warn(message: => String): Unit = log(Level.Warn, message)
   final def error(message: => String): Unit = log(Level.Error, message)
+
   // Added by 
sys.process.ProcessLogger final def err(message: => String): Unit = log(Level.Error, message) // sys.process.ProcessLogger @@ -62,12 +63,16 @@ object Logger { def log(level: Level.Value, message: => String): Unit = () } - implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = new BufferedLogger(log) with ProcessLogger + implicit def absLog2PLog(log: AbstractLogger): ProcessLogger = + new BufferedLogger(log) with ProcessLogger + implicit def log2PLog(log: Logger): ProcessLogger = absLog2PLog(new FullLogger(log)) + implicit def xlog2Log(lg: xLogger): Logger = lg match { case l: Logger => l case _ => wrapXLogger(lg) } + private[this] def wrapXLogger(lg: xLogger): Logger = new Logger { import InterfaceUtil.toSupplier override def debug(msg: Supplier[String]): Unit = lg.debug(msg) @@ -78,23 +83,39 @@ object Logger { override def log(level: Level.Value, msg: Supplier[String]): Unit = lg.log(level, msg) def trace(t: => Throwable): Unit = trace(toSupplier(t)) def success(s: => String): Unit = info(toSupplier(s)) - def log(level: Level.Value, msg: => String): Unit = - { - val fmsg = toSupplier(msg) - level match { - case Level.Debug => lg.debug(fmsg) - case Level.Info => lg.info(fmsg) - case Level.Warn => lg.warn(fmsg) - case Level.Error => lg.error(fmsg) - } + def log(level: Level.Value, msg: => String): Unit = { + val fmsg = toSupplier(msg) + level match { + case Level.Debug => lg.debug(fmsg) + case Level.Info => lg.info(fmsg) + case Level.Warn => lg.warn(fmsg) + case Level.Error => lg.error(fmsg) } + } } def jo2o[A](o: Optional[A]): Option[A] = InterfaceUtil.jo2o(o) def o2jo[A](o: Option[A]): Optional[A] = InterfaceUtil.o2jo(o) - def position(line0: Option[Integer], content: String, offset0: Option[Integer], pointer0: Option[Integer], - pointerSpace0: Option[String], sourcePath0: Option[String], sourceFile0: Option[File]): Position = - InterfaceUtil.position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0) + + def position( + line0: 
Option[Integer], + content: String, + offset0: Option[Integer], + pointer0: Option[Integer], + pointerSpace0: Option[String], + sourcePath0: Option[String], + sourceFile0: Option[File] + ): Position = + InterfaceUtil.position( + line0, + content, + offset0, + pointer0, + pointerSpace0, + sourcePath0, + sourceFile0 + ) + def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = InterfaceUtil.problem(cat, pos, msg, sev) } diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala index 0ae24a6e4..9db109d7f 100644 --- a/internal/util-logging/src/test/scala/Escapes.scala +++ b/internal/util-logging/src/test/scala/Escapes.scala @@ -8,26 +8,25 @@ import EscHelpers.{ ESC, hasEscapeSequence, isEscapeTerminator, removeEscapeSequ object Escapes extends Properties("Escapes") { property("genTerminator only generates terminators") = - forAllNoShrink(genTerminator) { (c: Char) => isEscapeTerminator(c) } + forAllNoShrink(genTerminator)((c: Char) => isEscapeTerminator(c)) property("genWithoutTerminator only generates terminators") = forAllNoShrink(genWithoutTerminator) { (s: String) => - s.forall { c => !isEscapeTerminator(c) } + s.forall(c => !isEscapeTerminator(c)) } - property("hasEscapeSequence is false when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: String) => - !hasEscapeSequence(s) - } + property("hasEscapeSequence is false when no escape character is present") = + forAllNoShrink(genWithoutEscape)((s: String) => !hasEscapeSequence(s)) - property("hasEscapeSequence is true when escape character is present") = forAllNoShrink(genWithRandomEscapes) { (s: String) => - hasEscapeSequence(s) - } + property("hasEscapeSequence is true when escape character is present") = + forAllNoShrink(genWithRandomEscapes)((s: String) => hasEscapeSequence(s)) - property("removeEscapeSequences is the identity when no escape character is present") = forAllNoShrink(genWithoutEscape) { (s: 
String) => - val removed: String = removeEscapeSequences(s) - ("Escape sequence removed: '" + removed + "'") |: - (removed == s) - } + property("removeEscapeSequences is the identity when no escape character is present") = + forAllNoShrink(genWithoutEscape) { (s: String) => + val removed: String = removeEscapeSequences(s) + ("Escape sequence removed: '" + removed + "'") |: + (removed == s) + } property("No escape characters remain after removeEscapeSequences") = forAll { (s: String) => val removed: String = removeEscapeSequences(s) @@ -36,22 +35,26 @@ object Escapes extends Properties("Escapes") { } property("removeEscapeSequences returns string without escape sequences") = - forAllNoShrink(genWithoutEscape, genEscapePairs) { (start: String, escapes: List[EscapeAndNot]) => - val withEscapes: String = start + (escapes.map { ean => ean.escape.makeString + ean.notEscape }).mkString("") - val removed: String = removeEscapeSequences(withEscapes) - val original = start + escapes.map(_.notEscape).mkString("") - val diffCharString = diffIndex(original, removed) - ("Input string : '" + withEscapes + "'") |: - ("Expected : '" + original + "'") |: - ("Escapes removed : '" + removed + "'") |: - (diffCharString) |: - (original == removed) + forAllNoShrink(genWithoutEscape, genEscapePairs) { + (start: String, escapes: List[EscapeAndNot]) => + val withEscapes: String = + start + escapes.map(ean => ean.escape.makeString + ean.notEscape).mkString("") + val removed: String = removeEscapeSequences(withEscapes) + val original = start + escapes.map(_.notEscape).mkString("") + val diffCharString = diffIndex(original, removed) + ("Input string : '" + withEscapes + "'") |: + ("Expected : '" + original + "'") |: + ("Escapes removed : '" + removed + "'") |: + (diffCharString) |: + (original == removed) } def diffIndex(expect: String, original: String): String = { var i = 0; while (i < expect.length && i < original.length) { - if (expect.charAt(i) != original.charAt(i)) return ("Differing 
character, idx: " + i + ", char: " + original.charAt(i) + ", expected: " + expect.charAt(i)) + if (expect.charAt(i) != original.charAt(i)) + return ("Differing character, idx: " + i + ", char: " + original.charAt(i) + + ", expected: " + expect.charAt(i)) i += 1 } if (expect.length != original.length) return s"Strings are different lengths!" @@ -59,13 +62,21 @@ object Escapes extends Properties("Escapes") { } final case class EscapeAndNot(escape: EscapeSequence, notEscape: String) { - override def toString = s"EscapeAntNot(escape = [$escape], notEscape = [${notEscape.map(_.toInt)}])" + override def toString = + s"EscapeAntNot(escape = [$escape], notEscape = [${notEscape.map(_.toInt)}])" } + // 2.10.5 warns on "implicit numeric widening" but it looks like a bug: https://issues.scala-lang.org/browse/SI-8450 final case class EscapeSequence(content: String, terminator: Char) { if (!content.isEmpty) { - assert(content.tail.forall(c => !isEscapeTerminator(c)), "Escape sequence content contains an escape terminator: '" + content + "'") - assert((content.head == '[') || !isEscapeTerminator(content.head), "Escape sequence content contains an escape terminator: '" + content.headOption + "'") + assert( + content.tail.forall(c => !isEscapeTerminator(c)), + "Escape sequence content contains an escape terminator: '" + content + "'" + ) + assert( + (content.head == '[') || !isEscapeTerminator(content.head), + "Escape sequence content contains an escape terminator: '" + content.headOption + "'" + ) } assert(isEscapeTerminator(terminator)) def makeString: String = ESC + content + terminator @@ -74,14 +85,20 @@ object Escapes extends Properties("Escapes") { if (content.isEmpty) s"ESC (${terminator.toInt})" else s"ESC ($content) (${terminator.toInt})" } + private[this] def noEscape(s: String): String = s.replace(ESC, ' ') - lazy val genEscapeSequence: Gen[EscapeSequence] = oneOf(genKnownSequence, genTwoCharacterSequence, genArbitraryEscapeSequence) - lazy val genEscapePair: 
Gen[EscapeAndNot] = for (esc <- genEscapeSequence; not <- genWithoutEscape) yield EscapeAndNot(esc, not) + lazy val genEscapeSequence: Gen[EscapeSequence] = + oneOf(genKnownSequence, genTwoCharacterSequence, genArbitraryEscapeSequence) + + lazy val genEscapePair: Gen[EscapeAndNot] = + for (esc <- genEscapeSequence; not <- genWithoutEscape) yield EscapeAndNot(esc, not) + lazy val genEscapePairs: Gen[List[EscapeAndNot]] = listOf(genEscapePair) lazy val genArbitraryEscapeSequence: Gen[EscapeSequence] = - for (content <- genWithoutTerminator if !content.isEmpty; term <- genTerminator) yield new EscapeSequence("[" + content, term) + for (content <- genWithoutTerminator if !content.isEmpty; term <- genTerminator) + yield new EscapeSequence("[" + content, term) lazy val genKnownSequence: Gen[EscapeSequence] = oneOf((misc ++ setGraphicsMode ++ setMode ++ resetMode).map(toEscapeSequence)) @@ -91,14 +108,15 @@ object Escapes extends Properties("Escapes") { lazy val misc = Seq("14;23H", "5;3f", "2A", "94B", "19C", "85D", "s", "u", "2J", "K") lazy val setGraphicsMode: Seq[String] = - for (txt <- 0 to 8; fg <- 30 to 37; bg <- 40 to 47) yield txt.toString + ";" + fg.toString + ";" + bg.toString + "m" + for (txt <- 0 to 8; fg <- 30 to 37; bg <- 40 to 47) + yield txt.toString + ";" + fg.toString + ";" + bg.toString + "m" lazy val resetMode = setModeLike('I') lazy val setMode = setModeLike('h') def setModeLike(term: Char): Seq[String] = (0 to 19).map(i => "=" + i.toString + term) lazy val genWithoutTerminator = - genRawString.map(_.filter { c => !isEscapeTerminator(c) && (c != '[') }) + genRawString.map(_.filter(c => !isEscapeTerminator(c) && (c != '['))) lazy val genTwoCharacterSequence = // 91 == [ which is the CSI escape sequence. 
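The `Escapes` properties above exercise `EscHelpers.removeEscapeSequences` on generated `ESC + content + terminator` sequences. A simplified remover sketching the same idea, under the assumption (for illustration only) that any ASCII letter terminates a sequence; sbt's `isEscapeTerminator` is the authoritative predicate:

```scala
// Simplified escape-sequence removal (sketch): drop everything from ESC up to
// and including the first terminator. Assumption: an ASCII letter terminates
// the sequence, which covers common CSI finals such as 'm', 'H', and 'K'.
val ESC = '\u001B'

def stripEscapes(s: String): String = {
  val sb = new StringBuilder(s.length)
  var i = 0
  while (i < s.length) {
    if (s.charAt(i) == ESC) {
      i += 1                                               // skip ESC itself
      while (i < s.length && !s.charAt(i).isLetter) i += 1 // skip parameters
      if (i < s.length) i += 1                             // skip terminator
    } else {
      sb.append(s.charAt(i))
      i += 1
    }
  }
  sb.toString
}
```

For example, `stripEscapes("\u001B[31mred\u001B[0m")` yields `"red"`, and on escape-free input the function is the identity, matching the two properties tested above.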
@@ -108,7 +126,8 @@ object Escapes extends Properties("Escapes") { lazy val genWithoutEscape: Gen[String] = genRawString.map(noEscape) def genWithRandomEscapes: Gen[String] = - for (ls <- listOf(genRawString); end <- genRawString) yield ls.mkString("", ESC.toString, ESC.toString + end) + for (ls <- listOf(genRawString); end <- genRawString) + yield ls.mkString("", ESC.toString, ESC.toString + end) private def genRawString = Arbitrary.arbString.arbitrary } diff --git a/internal/util-logging/src/test/scala/LogWriterTest.scala b/internal/util-logging/src/test/scala/LogWriterTest.scala index f00663b4b..7c9b29e68 100644 --- a/internal/util-logging/src/test/scala/LogWriterTest.scala +++ b/internal/util-logging/src/test/scala/LogWriterTest.scala @@ -36,6 +36,7 @@ object LogWriterTest extends Properties("Log Writer") { case l: Log => "Log('" + Escape(l.msg) + "', " + l.level + ")" case _ => "Not Log" } + /** * Writes the given lines to the Writer. `lines` is taken to be a list of lines, which are * represented as separately written segments (ToLog instances). 
ToLog.`byCharacter` @@ -46,7 +47,7 @@ object LogWriterTest extends Properties("Log Writer") { val content = section.content val normalized = Escape.newline(content, newLine) if (section.byCharacter) - normalized.foreach { c => writer.write(c.toInt) } + normalized.foreach(c => writer.write(c.toInt)) else writer.write(normalized) } @@ -56,6 +57,7 @@ object LogWriterTest extends Properties("Log Writer") { /** Converts the given lines in segments to lines as Strings for checking the results of the test.*/ def toLines(lines: List[List[ToLog]]): List[String] = lines.map(_.map(_.contentOnly).mkString) + /** Checks that the expected `lines` were recorded as `events` at level `Lvl`.*/ def check(lines: List[String], events: List[LogEvent], Lvl: Level.Value): Boolean = (lines zip events) forall { @@ -64,10 +66,10 @@ object LogWriterTest extends Properties("Log Writer") { } /* The following are implicit generators to build up a write sequence. - * ToLog represents a written segment. NewLine represents one of the possible - * newline separators. A List[ToLog] represents a full line and always includes a - * final ToLog with a trailing '\n'. Newline characters are otherwise not present in - * the `content` of a ToLog instance.*/ + * ToLog represents a written segment. NewLine represents one of the possible + * newline separators. A List[ToLog] represents a full line and always includes a + * final ToLog with a trailing '\n'. 
Newline characters are otherwise not present in + * the `content` of a ToLog instance.*/ implicit lazy val arbOut: Arbitrary[Output] = Arbitrary(genOutput) implicit lazy val arbLog: Arbitrary[ToLog] = Arbitrary(genLog) @@ -76,7 +78,8 @@ object LogWriterTest extends Properties("Log Writer") { implicit lazy val arbLevel: Arbitrary[Level.Value] = Arbitrary(genLevel) implicit def genLine(implicit logG: Gen[ToLog]): Gen[List[ToLog]] = - for (l <- listOf[ToLog](MaxSegments); last <- logG) yield (addNewline(last) :: l.filter(!_.content.isEmpty)).reverse + for (l <- listOf[ToLog](MaxSegments); last <- logG) + yield (addNewline(last) :: l.filter(!_.content.isEmpty)).reverse implicit def genLog(implicit content: Arbitrary[String], byChar: Arbitrary[Boolean]): Gen[ToLog] = for (c <- content.arbitrary; by <- byChar.arbitrary) yield { @@ -98,7 +101,7 @@ object LogWriterTest extends Properties("Log Writer") { new ToLog(l.content + "\n", l.byCharacter) // \n will be replaced by a random line terminator for all lines def listOf[T](max: Int)(implicit content: Arbitrary[T]): Gen[List[T]] = - Gen.choose(0, max) flatMap { sz => listOfN(sz, content.arbitrary) } + Gen.choose(0, max) flatMap (sz => listOfN(sz, content.arbitrary)) } /* Helper classes*/ @@ -107,35 +110,43 @@ final class Output(val lines: List[List[ToLog]], val level: Level.Value) { override def toString = "Level: " + level + "\n" + lines.map(_.mkString).mkString("\n") } + final class NewLine(val str: String) { override def toString = Escape(str) } + final class ToLog(val content: String, val byCharacter: Boolean) { def contentOnly = Escape.newline(content, "") - override def toString = if (content.isEmpty) "" else "ToLog('" + Escape(contentOnly) + "', " + byCharacter + ")" + + override def toString = + if (content.isEmpty) "" else "ToLog('" + Escape(contentOnly) + "', " + byCharacter + ")" } + /** Defines some utility methods for escaping unprintable characters.*/ object Escape { + /** Escapes characters with code less 
than 20 by printing them as unicode escapes.*/ - def apply(s: String): String = - { - val builder = new StringBuilder(s.length) - for (c <- s) { - val char = c.toInt - def escaped = pad(char.toHexString.toUpperCase, 4, '0') - if (c < 20) builder.append("\\u").append(escaped) else builder.append(c) - } - builder.toString - } - def pad(s: String, minLength: Int, extra: Char) = - { - val diff = minLength - s.length - if (diff <= 0) s else List.fill(diff)(extra).mkString("", "", s) + def apply(s: String): String = { + val builder = new StringBuilder(s.length) + for (c <- s) { + val char = c.toInt + def escaped = pad(char.toHexString.toUpperCase, 4, '0') + if (c < 20) builder.append("\\u").append(escaped) else builder.append(c) } + builder.toString + } + + def pad(s: String, minLength: Int, extra: Char) = { + val diff = minLength - s.length + if (diff <= 0) s else List.fill(diff)(extra).mkString("", "", s) + } + /** Replaces a \n character at the end of a string `s` with `nl`.*/ def newline(s: String, nl: String): String = if (s.endsWith("\n")) s.substring(0, s.length - 1) + nl else s + } + /** Records logging events for later retrieval.*/ final class RecordingLogger extends BasicLogger { private var events: List[LogEvent] = Nil @@ -147,6 +158,7 @@ final class RecordingLogger extends BasicLogger { def log(level: Level.Value, message: => String): Unit = { events ::= new Log(level, message) } def success(message: => String): Unit = { events ::= new Success(message) } def logAll(es: Seq[LogEvent]): Unit = { events :::= es.toList } - def control(event: ControlEvent.Value, message: => String): Unit = { events ::= new ControlEvent(event, message) } + def control(event: ControlEvent.Value, message: => String): Unit = + events ::= new ControlEvent(event, message) } diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala index 80e42a64d..a0cf1e569 100644 --- 
a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala +++ b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -24,7 +24,8 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { import sjsonnew.BasicJsonProtocol._ val log = LogExchange.logger("foo") LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) - implicit val intShow: ShowLines[Int] = ShowLines({ (x: Int) => Vector(s"String representation of $x") }) + implicit val intShow: ShowLines[Int] = + ShowLines((x: Int) => Vector(s"String representation of $x")) log.registerStringCodec[Int] log.infoEvent(1) } @@ -33,7 +34,8 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { import sjsonnew.BasicJsonProtocol._ val log = LogExchange.logger("foo") LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) - implicit val intArrayShow: ShowLines[Array[Int]] = ShowLines({ (x: Array[Int]) => Vector(s"String representation of ${x.mkString}") }) + implicit val intArrayShow: ShowLines[Array[Int]] = + ShowLines((x: Array[Int]) => Vector(s"String representation of ${x.mkString}")) log.registerStringCodec[Array[Int]] log.infoEvent(Array(1, 2, 3)) } @@ -42,7 +44,8 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { import sjsonnew.BasicJsonProtocol._ val log = LogExchange.logger("foo") LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) - implicit val intVectorShow: ShowLines[Vector[Vector[Int]]] = ShowLines({ (xss: Vector[Vector[Int]]) => Vector(s"String representation of $xss") }) + implicit val intVectorShow: ShowLines[Vector[Vector[Int]]] = + ShowLines((xss: Vector[Vector[Int]]) => Vector(s"String representation of $xss")) log.registerStringCodec[Vector[Vector[Int]]] log.infoEvent(Vector(Vector(1, 2, 3))) } @@ -51,7 +54,9 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { // this is passed into State normally val global0 = initialGlobalLogging val full = global0.full - (1 to 3).toList 
foreach { x => full.info(s"test$x") } + (1 to 3).toList foreach { x => + full.info(s"test$x") + } } // This is done in Mainloop.scala @@ -62,7 +67,7 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { val out = new PrintWriter(writer) val g = global0.newAppender(global0.full, out, logBacking0) val full = g.full - (1 to 3).toList foreach { x => full.info(s"newAppender $x") } + (1 to 3).toList foreach (x => full.info(s"newAppender $x")) assert(logBacking0.file.exists) g } @@ -71,7 +76,7 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { val out = new PrintWriter(writer) val g = global1.newAppender(global1.full, out, logBacking1) val full = g.full - (1 to 3).toList foreach { x => full.info(s"newAppender $x") } + (1 to 3).toList foreach (x => full.info(s"newAppender $x")) // println(logBacking.file) // print("Press enter to continue. ") // System.console.readLine @@ -81,6 +86,8 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { val console = ConsoleOut.systemOut def initialGlobalLogging: GlobalLogging = GlobalLogging.initial( - MainAppender.globalDefault(console), File.createTempFile("sbt", ".log"), console + MainAppender.globalDefault(console), + File.createTempFile("sbt", ".log"), + console ) } diff --git a/internal/util-logging/src/test/scala/TestLogger.scala b/internal/util-logging/src/test/scala/TestLogger.scala index b9ddda148..a7554f3a5 100644 --- a/internal/util-logging/src/test/scala/TestLogger.scala +++ b/internal/util-logging/src/test/scala/TestLogger.scala @@ -3,10 +3,9 @@ package sbt.internal.util import sbt.util._ object TestLogger { - def apply[T](f: Logger => T): T = - { - val log = new BufferedLogger(ConsoleLogger()) - log.setLevel(Level.Debug) - log.bufferQuietly(f(log)) - } -} \ No newline at end of file + def apply[T](f: Logger => T): T = { + val log = new BufferedLogger(ConsoleLogger()) + log.setLevel(Level.Debug) + log.bufferQuietly(f(log)) + } +} diff --git 
a/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala index 788f39362..d107b3bc0 100644 --- a/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala +++ b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala @@ -6,6 +6,7 @@ package sbt.internal.util import Relation._ object Relation { + /** Constructs a new immutable, finite relation that is initially empty. */ def empty[A, B]: Relation[A, B] = make(Map.empty, Map.empty) @@ -13,17 +14,18 @@ object Relation { * Constructs a [[Relation]] from underlying `forward` and `reverse` representations, without checking that they are consistent. * This is a low-level constructor and the alternatives [[empty]] and [[reconstruct]] should be preferred. */ - def make[A, B](forward: Map[A, Set[B]], reverse: Map[B, Set[A]]): Relation[A, B] = new MRelation(forward, reverse) + def make[A, B](forward: Map[A, Set[B]], reverse: Map[B, Set[A]]): Relation[A, B] = + new MRelation(forward, reverse) /** Constructs a relation such that for every entry `_1 -> _2s` in `forward` and every `_2` in `_2s`, `(_1, _2)` is in the relation. 
*/ - def reconstruct[A, B](forward: Map[A, Set[B]]): Relation[A, B] = - { - val reversePairs = for ((a, bs) <- forward.view; b <- bs.view) yield (b, a) - val reverse = (Map.empty[B, Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } - make(forward filter { case (a, bs) => bs.nonEmpty }, reverse) - } + def reconstruct[A, B](forward: Map[A, Set[B]]): Relation[A, B] = { + val reversePairs = for ((a, bs) <- forward.view; b <- bs.view) yield (b, a) + val reverse = (Map.empty[B, Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } + make(forward filter { case (a, bs) => bs.nonEmpty }, reverse) + } - def merge[A, B](rels: Traversable[Relation[A, B]]): Relation[A, B] = (Relation.empty[A, B] /: rels)(_ ++ _) + def merge[A, B](rels: Traversable[Relation[A, B]]): Relation[A, B] = + (Relation.empty[A, B] /: rels)(_ ++ _) private[sbt] def remove[X, Y](map: M[X, Y], from: X, to: Y): M[X, Y] = map.get(from) match { @@ -34,46 +36,61 @@ object Relation { } private[sbt] def combine[X, Y](a: M[X, Y], b: M[X, Y]): M[X, Y] = - (a /: b) { (map, mapping) => add(map, mapping._1, mapping._2) } + (a /: b)((map, mapping) => add(map, mapping._1, mapping._2)) private[sbt] def add[X, Y](map: M[X, Y], from: X, to: Traversable[Y]): M[X, Y] = map.updated(from, get(map, from) ++ to) private[sbt] def get[X, Y](map: M[X, Y], t: X): Set[Y] = map.getOrElse(t, Set.empty[Y]) - private[sbt]type M[X, Y] = Map[X, Set[Y]] + private[sbt] type M[X, Y] = Map[X, Set[Y]] } /** Binary relation between A and B. It is a set of pairs (_1, _2) for _1 in A, _2 in B. */ trait Relation[A, B] { + /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ def forward(_1: A): Set[B] + /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ def reverse(_2: B): Set[A] + /** Includes `pair` in the relation. */ def +(pair: (A, B)): Relation[A, B] + /** Includes `(a, b)` in the relation. 
*/ def +(a: A, b: B): Relation[A, B] + /** Includes in the relation `(a, b)` for all `b` in `bs`. */ def +(a: A, bs: Traversable[B]): Relation[A, B] + /** Returns the union of the relation `r` with this relation. */ def ++(r: Relation[A, B]): Relation[A, B] + /** Includes the given pairs in this relation. */ def ++(rs: Traversable[(A, B)]): Relation[A, B] + /** Removes all elements `(_1, _2)` for all `_1` in `_1s` from this relation. */ def --(_1s: Traversable[A]): Relation[A, B] + /** Removes all `pairs` from this relation. */ def --(pairs: TraversableOnce[(A, B)]): Relation[A, B] + /** Removes all `relations` from this relation. */ def --(relations: Relation[A, B]): Relation[A, B] + /** Removes all pairs `(_1, _2)` from this relation. */ def -(_1: A): Relation[A, B] + /** Removes `pair` from this relation. */ def -(pair: (A, B)): Relation[A, B] + /** Returns the set of all `_1`s such that `(_1, _2)` is in this relation. */ def _1s: collection.Set[A] + /** Returns the set of all `_2`s such that `(_1, _2)` is in this relation. */ def _2s: collection.Set[B] + /** Returns the number of pairs in this relation */ def size: Int @@ -110,10 +127,12 @@ trait Relation[A, B] { * The value associated with a given `_2` is the set of all `_1`s such that `(_1, _2)` is in this relation. */ def reverseMap: Map[B, Set[A]] + } // Note that we assume without checking that fwd and rev are consistent. 
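The comment closing the hunk above states the key invariant of `MRelation`: the `fwd` and `rev` maps must mirror each other, which is why every mutating operation in the class updates both. A minimal sketch of the pattern (the `Rel` type and helper names here are hypothetical, not sbt's `Relation`):

```scala
// Sketch of the forward/reverse invariant: every insertion updates both maps
// in lockstep, so that forward(a) and reverse(b) are each a single map lookup.
type M[X, Y] = Map[X, Set[Y]]

def addTo[X, Y](m: M[X, Y], k: X, v: Y): M[X, Y] =
  m.updated(k, m.getOrElse(k, Set.empty[Y]) + v)

final case class Rel[A, B](fwd: M[A, B], rev: M[B, A]) {
  def add(a: A, b: B): Rel[A, B] = Rel(addTo(fwd, a, b), addTo(rev, b, a))
  def forward(a: A): Set[B] = fwd.getOrElse(a, Set.empty)
  def reverse(b: B): Set[A] = rev.getOrElse(b, Set.empty)
}

object Rel {
  def empty[A, B]: Rel[A, B] = Rel(Map.empty, Map.empty)
}
```

Keeping both directions materialized doubles the memory cost but makes reverse lookup as cheap as forward lookup, which is the trade-off sbt's `Relation` makes.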
-private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) extends Relation[A, B] { +private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) + extends Relation[A, B] { def forwardMap = fwd def reverseMap = rev @@ -125,25 +144,30 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ex def size = (fwd.valuesIterator map (_.size)).sum - def all: Traversable[(A, B)] = fwd.iterator.flatMap { case (a, bs) => bs.iterator.map(b => (a, b)) }.toTraversable + def all: Traversable[(A, B)] = + fwd.iterator.flatMap { case (a, bs) => bs.iterator.map(b => (a, b)) }.toTraversable def +(pair: (A, B)) = this + (pair._1, Set(pair._2)) def +(from: A, to: B) = this + (from, to :: Nil) - def +(from: A, to: Traversable[B]) = if (to.isEmpty) this else - new MRelation(add(fwd, from, to), (rev /: to) { (map, t) => add(map, t, from :: Nil) }) + def +(from: A, to: Traversable[B]) = + if (to.isEmpty) this + else new MRelation(add(fwd, from, to), (rev /: to)((map, t) => add(map, t, from :: Nil))) def ++(rs: Traversable[(A, B)]) = ((this: Relation[A, B]) /: rs) { _ + _ } - def ++(other: Relation[A, B]) = new MRelation[A, B](combine(fwd, other.forwardMap), combine(rev, other.reverseMap)) + def ++(other: Relation[A, B]) = + new MRelation[A, B](combine(fwd, other.forwardMap), combine(rev, other.reverseMap)) def --(ts: Traversable[A]): Relation[A, B] = ((this: Relation[A, B]) /: ts) { _ - _ } - def --(pairs: TraversableOnce[(A, B)]): Relation[A, B] = ((this: Relation[A, B]) /: pairs) { _ - _ } + def --(pairs: TraversableOnce[(A, B)]): Relation[A, B] = ((this: Relation[A, B]) /: pairs)(_ - _) def --(relations: Relation[A, B]): Relation[A, B] = --(relations.all) + def -(pair: (A, B)): Relation[A, B] = new MRelation(remove(fwd, pair._1, pair._2), remove(rev, pair._2, pair._1)) + def -(t: A): Relation[A, B] = fwd.get(t) match { case Some(rs) => - val upRev = (rev /: rs) { (map, r) => remove(map, r, t) } + val upRev = (rev /: rs)((map, 
r) => remove(map, r, t)) new MRelation(fwd - t, upRev) case None => this } @@ -155,18 +179,21 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) ex (Relation.empty[A, B] ++ y, Relation.empty[A, B] ++ n) } - def groupBy[K](discriminator: ((A, B)) => K): Map[K, Relation[A, B]] = all.groupBy(discriminator) mapValues { Relation.empty[A, B] ++ _ } + def groupBy[K](discriminator: ((A, B)) => K): Map[K, Relation[A, B]] = + all.groupBy(discriminator) mapValues { Relation.empty[A, B] ++ _ } def contains(a: A, b: B): Boolean = forward(a)(b) override def equals(other: Any) = other match { // We assume that the forward and reverse maps are consistent, so we only use the forward map // for equality. Note that key -> Empty is semantically the same as key not existing. - case o: MRelation[A, B] => forwardMap.filterNot(_._2.isEmpty) == o.forwardMap.filterNot(_._2.isEmpty) - case _ => false + case o: MRelation[A, B] => + forwardMap.filterNot(_._2.isEmpty) == o.forwardMap.filterNot(_._2.isEmpty) + case _ => false } override def hashCode = fwd.filterNot(_._2.isEmpty).hashCode() - override def toString = all.map { case (a, b) => a + " -> " + b }.mkString("Relation [", ", ", "]") + override def toString = + all.map { case (a, b) => a + " -> " + b }.mkString("Relation [", ", ", "]") } diff --git a/internal/util-relation/src/test/scala/RelationTest.scala b/internal/util-relation/src/test/scala/RelationTest.scala index 31f68e0c3..47aacacdd 100644 --- a/internal/util-relation/src/test/scala/RelationTest.scala +++ b/internal/util-relation/src/test/scala/RelationTest.scala @@ -11,21 +11,20 @@ object RelationTest extends Properties("Relation") { val r = Relation.empty[Int, Double] ++ pairs check(r, pairs) } - def check(r: Relation[Int, Double], pairs: Seq[(Int, Double)]) = - { - val _1s = pairs.map(_._1).toSet - val _2s = pairs.map(_._2).toSet + def check(r: Relation[Int, Double], pairs: Seq[(Int, Double)]) = { + val _1s = pairs.map(_._1).toSet + val _2s = 
pairs.map(_._2).toSet - r._1s == _1s && r.forwardMap.keySet == _1s && - r._2s == _2s && r.reverseMap.keySet == _2s && - pairs.forall { - case (a, b) => - (r.forward(a) contains b) && - (r.reverse(b) contains a) && - (r.forwardMap(a) contains b) && - (r.reverseMap(b) contains a) - } + r._1s == _1s && r.forwardMap.keySet == _1s && + r._2s == _2s && r.reverseMap.keySet == _2s && + pairs.forall { + case (a, b) => + (r.forward(a) contains b) && + (r.reverse(b) contains a) && + (r.forwardMap(a) contains b) && + (r.reverseMap(b) contains a) } + } property("Does not contain removed entries") = forAll { (pairs: List[(Int, Double, Boolean)]) => val add = pairs.map { case (a, b, c) => (a, b) } @@ -39,17 +38,17 @@ object RelationTest extends Properties("Relation") { all(removeCoarse) { rem => ("_1s does not contain removed" |: (!r._1s.contains(rem))) && - ("Forward does not contain removed" |: r.forward(rem).isEmpty) && - ("Forward map does not contain removed" |: !r.forwardMap.contains(rem)) && - ("Removed is not a value in reverse map" |: !r.reverseMap.values.toSet.contains(rem)) + ("Forward does not contain removed" |: r.forward(rem).isEmpty) && + ("Forward map does not contain removed" |: !r.forwardMap.contains(rem)) && + ("Removed is not a value in reverse map" |: !r.reverseMap.values.toSet.contains(rem)) } && - all(removeFine) { - case (a, b) => - ("Forward does not contain removed" |: (!r.forward(a).contains(b))) && - ("Reverse does not contain removed" |: (!r.reverse(b).contains(a))) && - ("Forward map does not contain removed" |: (notIn(r.forwardMap, a, b))) && - ("Reverse map does not contain removed" |: (notIn(r.reverseMap, b, a))) - } + all(removeFine) { + case (a, b) => + ("Forward does not contain removed" |: (!r.forward(a).contains(b))) && + ("Reverse does not contain removed" |: (!r.reverse(b).contains(a))) && + ("Forward map does not contain removed" |: (notIn(r.forwardMap, a, b))) && + ("Reverse map does not contain removed" |: (notIn(r.reverseMap, b, a))) + 
} } property("Groups correctly") = forAll { (entries: List[(Int, Double)], randomInt: Int) => @@ -75,10 +74,10 @@ object RelationTest extends Properties("Relation") { object EmptyRelationTest extends Properties("Empty relation") { lazy val e = Relation.empty[Int, Double] - property("Forward empty") = forAll { (i: Int) => e.forward(i).isEmpty } - property("Reverse empty") = forAll { (i: Double) => e.reverse(i).isEmpty } + property("Forward empty") = forAll((i: Int) => e.forward(i).isEmpty) + property("Reverse empty") = forAll((i: Double) => e.reverse(i).isEmpty) property("Forward map empty") = e.forwardMap.isEmpty property("Reverse map empty") = e.reverseMap.isEmpty property("_1 empty") = e._1s.isEmpty property("_2 empty") = e._2s.isEmpty -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala index 370ae0005..373ae1334 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/CommentHandler.scala @@ -7,4 +7,4 @@ package scripted object CommentHandler extends BasicStatementHandler { def apply(command: String, args: List[String]) = () -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 65b5af423..3b5daaef7 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -58,18 +58,20 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { val lines1 = IO.readLines(fromString(file1)) val lines2 = IO.readLines(fromString(file2)) if (lines1 != lines2) - scriptError("File contents are different:\n" + lines1.mkString("\n") + 
"\nAnd:\n" + lines2.mkString("\n")) + scriptError( + "File contents are different:\n" + lines1.mkString("\n") + + "\nAnd:\n" + lines2.mkString("\n") + ) } - def newer(a: String, b: String): Unit = - { - val pathA = fromString(a) - val pathB = fromString(b) - val isNewer = pathA.exists && (!pathB.exists || pathA.lastModified > pathB.lastModified) - if (!isNewer) { - scriptError(s"$pathA is not newer than $pathB") - } + def newer(a: String, b: String): Unit = { + val pathA = fromString(a) + val pathB = fromString(b) + val isNewer = pathA.exists && (!pathB.exists || pathA.lastModified > pathB.lastModified) + if (!isNewer) { + scriptError(s"$pathA is not newer than $pathB") } + } def exists(paths: List[String]): Unit = { val notPresent = fromStrings(paths).filter(!_.exists) if (notPresent.nonEmpty) @@ -127,9 +129,16 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { IO.copy(mapped.init pair map) () } + def wrongArguments(args: List[String]): Unit = - scriptError("Command '" + commandName + "' does not accept arguments (found '" + spaced(args) + "').") + scriptError( + "Command '" + commandName + "' does not accept arguments (found '" + spaced(args) + "')." + ) + def wrongArguments(requiredArgs: String, args: List[String]): Unit = - scriptError("Wrong number of arguments to " + commandName + " command. " + requiredArgs + " required, found: '" + spaced(args) + "'.") + scriptError( + "Wrong number of arguments to " + commandName + " command. " + + requiredArgs + " required, found: '" + spaced(args) + "'." 
+ ) } -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala index cb2c3100d..6eccab312 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FilteredLoader.scala @@ -7,13 +7,12 @@ package scripted final class FilteredLoader(parent: ClassLoader) extends ClassLoader(parent) { @throws(classOf[ClassNotFoundException]) - override final def loadClass(className: String, resolve: Boolean): Class[_] = - { - if (className.startsWith("java.") || className.startsWith("javax.")) - super.loadClass(className, resolve) - else - throw new ClassNotFoundException(className) - } + override final def loadClass(className: String, resolve: Boolean): Class[_] = { + if (className.startsWith("java.") || className.startsWith("javax.")) + super.loadClass(className, resolve) + else + throw new ClassNotFoundException(className) + } override def getResources(name: String) = null override def getResource(name: String) = null -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala index 3dcb4ef6d..a0d6a3636 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/HandlersProvider.scala @@ -2,4 +2,4 @@ package sbt.internal.scripted trait HandlersProvider { def getHandlers(config: ScriptConfig): Map[Char, StatementHandler] -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala index f43b54f39..a15458dc4 100644 --- 
a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala @@ -6,7 +6,7 @@ package internal package scripted final class TestException(statement: Statement, msg: String, exception: Throwable) - extends RuntimeException(statement.linePrefix + " " + msg, exception) + extends RuntimeException(statement.linePrefix + " " + msg, exception) class ScriptRunner { import scala.collection.mutable.HashMap @@ -15,14 +15,16 @@ class ScriptRunner { def processStatement(handler: StatementHandler, statement: Statement): Unit = { val state = states(handler).asInstanceOf[handler.State] val nextState = - try { Right(handler(statement.command, statement.arguments, state)) } - catch { case e: Exception => Left(e) } + try { Right(handler(statement.command, statement.arguments, state)) } catch { + case e: Exception => Left(e) + } nextState match { case Left(err) => if (statement.successExpected) { err match { - case t: TestFailed => throw new TestException(statement, "Command failed: " + t.getMessage, null) - case _ => throw new TestException(statement, "Command failed", err) + case t: TestFailed => + throw new TestException(statement, "Command failed: " + t.getMessage, null) + case _ => throw new TestException(statement, "Command failed", err) } } else () @@ -36,13 +38,12 @@ class ScriptRunner { val handlers = Set() ++ statements.map(_._1) try { - handlers.foreach { handler => states(handler) = handler.initialState } + handlers.foreach(handler => states(handler) = handler.initialState) statements foreach (Function.tupled(processStatement)) } finally { for (handler <- handlers; state <- states.get(handler)) { - try { handler.finish(state.asInstanceOf[handler.State]) } - catch { case e: Exception => () } + try { handler.finish(state.asInstanceOf[handler.State]) } catch { case e: Exception => () } } } } -} \ No newline at end of file +} diff --git 
a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index fb1ba9eca..e88d4bb16 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -12,7 +12,12 @@ import sbt.internal.io.Resources import java.util.concurrent.atomic.AtomicInteger object ScriptedRunnerImpl { - def run(resourceBaseDirectory: File, bufferLog: Boolean, tests: Array[String], handlersProvider: HandlersProvider): Unit = { + def run( + resourceBaseDirectory: File, + bufferLog: Boolean, + tests: Array[String], + handlersProvider: HandlersProvider + ): Unit = { val runner = new ScriptedTests(resourceBaseDirectory, bufferLog, handlersProvider) val logger = newLogger val allTests = get(tests, resourceBaseDirectory, logger) flatMap { @@ -36,15 +41,18 @@ object ScriptedRunnerImpl { ScriptedTest(group, name) } private[sbt] val generateId: AtomicInteger = new AtomicInteger - private[sbt] def newLogger: ManagedLogger = - { - val loggerName = "scripted-" + generateId.incrementAndGet - val x = LogExchange.logger(loggerName) - x - } + private[sbt] def newLogger: ManagedLogger = { + val loggerName = "scripted-" + generateId.incrementAndGet + val x = LogExchange.logger(loggerName) + x + } } -final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handlersProvider: HandlersProvider) { +final class ScriptedTests( + resourceBaseDirectory: File, + bufferLog: Boolean, + handlersProvider: HandlersProvider +) { private val testResources = new Resources(resourceBaseDirectory) private val consoleAppender: ConsoleAppender = ConsoleAppender() @@ -53,85 +61,96 @@ final class ScriptedTests(resourceBaseDirectory: File, bufferLog: Boolean, handl def scriptedTest(group: String, name: String, log: xsbti.Logger): Seq[() => Option[String]] = scriptedTest(group, name, 
Logger.xlog2Log(log)) + def scriptedTest(group: String, name: String, log: ManagedLogger): Seq[() => Option[String]] = - scriptedTest(group, name, { _ => () }, log) - def scriptedTest(group: String, name: String, prescripted: File => Unit, log: ManagedLogger): Seq[() => Option[String]] = { + scriptedTest(group, name, (_ => ()), log) + + def scriptedTest( + group: String, + name: String, + prescripted: File => Unit, + log: ManagedLogger + ): Seq[() => Option[String]] = { for (groupDir <- (resourceBaseDirectory * group).get; nme <- (groupDir * name).get) yield { val g = groupDir.getName val n = nme.getName val str = s"$g / $n" - () => { - println("Running " + str) - testResources.readWriteResourceDirectory(g, n) { testDirectory => - val disabled = new File(testDirectory, "disabled").isFile - if (disabled) { - log.info("D " + str + " [DISABLED]") - None - } else { - try { scriptedTest(str, testDirectory, prescripted, log); None } - catch { case _: TestException | _: PendingTestSuccessException => Some(str) } + () => + { + println("Running " + str) + testResources.readWriteResourceDirectory(g, n) { testDirectory => + val disabled = new File(testDirectory, "disabled").isFile + if (disabled) { + log.info("D " + str + " [DISABLED]") + None + } else { + try { scriptedTest(str, testDirectory, prescripted, log); None } catch { + case _: TestException | _: PendingTestSuccessException => Some(str) + } + } } } - } } } - private def scriptedTest(label: String, testDirectory: File, prescripted: File => Unit, log: ManagedLogger): Unit = - { - val buffered = BufferedAppender(consoleAppender) - LogExchange.unbindLoggerAppenders(log.name) - LogExchange.bindLoggerAppenders(log.name, (buffered -> Level.Debug) :: Nil) - if (bufferLog) { - buffered.record() - } - def createParser() = - { - // val fileHandler = new FileCommands(testDirectory) - // // val sbtHandler = new SbtHandler(testDirectory, launcher, buffered, launchOpts) - // new TestScriptParser(Map('$' -> fileHandler, /* '>' -> 
sbtHandler, */ '#' -> CommentHandler)) - val scriptConfig = new ScriptConfig(label, testDirectory, log) - new TestScriptParser(handlersProvider getHandlers scriptConfig) - } - val (file, pending) = { - val normal = new File(testDirectory, ScriptFilename) - val pending = new File(testDirectory, PendingScriptFilename) - if (pending.isFile) (pending, true) else (normal, false) - } - val pendingString = if (pending) " [PENDING]" else "" - - def runTest() = - { - val run = new ScriptRunner - val parser = createParser() - run(parser.parse(file)) - } - def testFailed(): Unit = { - if (pending) buffered.clearBuffer() else buffered.stopBuffer() - log.error("x " + label + pendingString) - } - - try { - prescripted(testDirectory) - runTest() - log.info("+ " + label + pendingString) - if (pending) throw new PendingTestSuccessException(label) - } catch { - case e: TestException => - testFailed() - e.getCause match { - case null | _: java.net.SocketException => log.error(" " + e.getMessage) - case _ => if (!pending) e.printStackTrace - } - if (!pending) throw e - case e: PendingTestSuccessException => - testFailed() - log.error(" Mark as passing to remove this failure.") - throw e - case e: Exception => - testFailed() - if (!pending) throw e - } finally { buffered.clearBuffer() } + private def scriptedTest( + label: String, + testDirectory: File, + prescripted: File => Unit, + log: ManagedLogger + ): Unit = { + val buffered = BufferedAppender(consoleAppender) + LogExchange.unbindLoggerAppenders(log.name) + LogExchange.bindLoggerAppenders(log.name, (buffered -> Level.Debug) :: Nil) + if (bufferLog) { + buffered.record() } + def createParser() = { + // val fileHandler = new FileCommands(testDirectory) + // // val sbtHandler = new SbtHandler(testDirectory, launcher, buffered, launchOpts) + // new TestScriptParser(Map('$' -> fileHandler, /* '>' -> sbtHandler, */ '#' -> CommentHandler)) + val scriptConfig = new ScriptConfig(label, testDirectory, log) + new 
TestScriptParser(handlersProvider getHandlers scriptConfig) + } + val (file, pending) = { + val normal = new File(testDirectory, ScriptFilename) + val pending = new File(testDirectory, PendingScriptFilename) + if (pending.isFile) (pending, true) else (normal, false) + } + val pendingString = if (pending) " [PENDING]" else "" + + def runTest() = { + val run = new ScriptRunner + val parser = createParser() + run(parser.parse(file)) + } + def testFailed(): Unit = { + if (pending) buffered.clearBuffer() else buffered.stopBuffer() + log.error("x " + label + pendingString) + } + + try { + prescripted(testDirectory) + runTest() + log.info("+ " + label + pendingString) + if (pending) throw new PendingTestSuccessException(label) + } catch { + case e: TestException => + testFailed() + e.getCause match { + case null | _: java.net.SocketException => log.error(" " + e.getMessage) + case _ => if (!pending) e.printStackTrace + } + if (!pending) throw e + case e: PendingTestSuccessException => + testFailed() + log.error(" Mark as passing to remove this failure.") + throw e + case e: Exception => + testFailed() + if (!pending) throw e + } finally { buffered.clearBuffer() } + } } // object ScriptedTests extends ScriptedRunner { @@ -148,31 +167,30 @@ object ListTests { import ListTests._ final class ListTests(baseDirectory: File, accept: ScriptedTest => Boolean, log: Logger) { def filter = DirectoryFilter -- HiddenFileFilter - def listTests: Seq[ScriptedTest] = - { - list(baseDirectory, filter) flatMap { group => - val groupName = group.getName - listTests(group).map(ScriptedTest(groupName, _)) - } - } - private[this] def listTests(group: File): Seq[String] = - { + def listTests: Seq[ScriptedTest] = { + list(baseDirectory, filter) flatMap { group => val groupName = group.getName - val allTests = list(group, filter).sortBy(_.getName) - if (allTests.isEmpty) { - log.warn("No tests in test group " + groupName) - Seq.empty - } else { - val (included, skipped) = 
allTests.toList.partition(test => accept(ScriptedTest(groupName, test.getName))) - if (included.isEmpty) - log.warn("Test group " + groupName + " skipped.") - else if (skipped.nonEmpty) { - log.warn("Tests skipped in group " + group.getName + ":") - skipped.foreach(testName => log.warn(" " + testName.getName)) - } - Seq(included.map(_.getName): _*) - } + listTests(group).map(ScriptedTest(groupName, _)) } + } + private[this] def listTests(group: File): Seq[String] = { + val groupName = group.getName + val allTests = list(group, filter).sortBy(_.getName) + if (allTests.isEmpty) { + log.warn("No tests in test group " + groupName) + Seq.empty + } else { + val (included, skipped) = + allTests.toList.partition(test => accept(ScriptedTest(groupName, test.getName))) + if (included.isEmpty) + log.warn("Test group " + groupName + " skipped.") + else if (skipped.nonEmpty) { + log.warn("Tests skipped in group " + group.getName + ":") + skipped.foreach(testName => log.warn(" " + testName.getName)) + } + Seq(included.map(_.getName): _*) + } + } } class PendingTestSuccessException(label: String) extends Exception { diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala index 15b7189ce..ac752a2cc 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/StatementHandler.scala @@ -15,7 +15,10 @@ trait StatementHandler { trait BasicStatementHandler extends StatementHandler { final type State = Unit final def initialState = () - final def apply(command: String, arguments: List[String], state: Unit): Unit = apply(command, arguments) + + final def apply(command: String, arguments: List[String], state: Unit): Unit = + apply(command, arguments) + def apply(command: String, arguments: List[String]): Unit def finish(state: Unit) = () } @@ -23,4 +26,4 @@ trait 
BasicStatementHandler extends StatementHandler { /** Use when a stack trace is not useful */ final class TestFailed(msg: String) extends RuntimeException(msg) { override def fillInStackTrace = this -} \ No newline at end of file +} diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala index 74f96eb81..ca0826621 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/TestScriptParser.scala @@ -19,8 +19,13 @@ successChar ::= '+' | '-' word ::= [^ \[\]]+ comment ::= '#' \S* nl nl ::= '\r' \'n' | '\n' | '\r' | eof -*/ -final case class Statement(command: String, arguments: List[String], successExpected: Boolean, line: Int) { + */ +final case class Statement( + command: String, + arguments: List[String], + successExpected: Boolean, + line: Int +) { def linePrefix = "{line " + line + "} " } @@ -41,43 +46,50 @@ class TestScriptParser(handlers: Map[Char, StatementHandler]) extends RegexParse if (handlers.keys.exists(key => key == '+' || key == '-')) sys.error("Start characters cannot be '+' or '-'") - def parse(scriptFile: File): List[(StatementHandler, Statement)] = parse(read(scriptFile), Some(scriptFile.getAbsolutePath)) + def parse(scriptFile: File): List[(StatementHandler, Statement)] = + parse(read(scriptFile), Some(scriptFile.getAbsolutePath)) def parse(script: String): List[(StatementHandler, Statement)] = parse(script, None) - private def parse(script: String, label: Option[String]): List[(StatementHandler, Statement)] = - { - parseAll(statements, script) match { - case Success(result, next) => result - case err: NoSuccess => - { - val labelString = label.map("'" + _ + "' ").getOrElse("") - sys.error("Could not parse test script, " + labelString + err.toString) - } + private def parse(script: String, label: Option[String]): 
List[(StatementHandler, Statement)] = { + parseAll(statements, script) match { + case Success(result, next) => result + case err: NoSuccess => { + val labelString = label.map("'" + _ + "' ").getOrElse("") + sys.error("Could not parse test script, " + labelString + err.toString) } } + } lazy val statements = rep1(space ~> statement <~ newline) - def statement: Parser[(StatementHandler, Statement)] = - { - trait PositionalStatement extends Positional { - def tuple: (StatementHandler, Statement) - } - positioned { - val command = (word | err("expected command")) - val arguments = rep(space ~> (word | failure("expected argument"))) - (successParser ~ (space ~> startCharacterParser <~ space) ~! command ~! arguments) ^^ - { - case successExpected ~ start ~ command ~ arguments => - new PositionalStatement { - def tuple = (handlers(start), new Statement(command, arguments, successExpected, pos.line)) - } - } - } ^^ (_.tuple) + + def statement: Parser[(StatementHandler, Statement)] = { + trait PositionalStatement extends Positional { + def tuple: (StatementHandler, Statement) } + positioned { + val command = (word | err("expected command")) + val arguments = rep(space ~> (word | failure("expected argument"))) + (successParser ~ (space ~> startCharacterParser <~ space) ~! command ~! 
arguments) ^^ { + case successExpected ~ start ~ command ~ arguments => + new PositionalStatement { + def tuple = + (handlers(start), new Statement(command, arguments, successExpected, pos.line)) + } + } + } ^^ (_.tuple) + } + def successParser: Parser[Boolean] = ('+' ^^^ true) | ('-' ^^^ false) | success(true) def space: Parser[String] = """[ \t]*""".r - lazy val word: Parser[String] = ("\'" ~> "[^'\n\r]*".r <~ "\'") | ("\"" ~> "[^\"\n\r]*".r <~ "\"") | WordRegex - def startCharacterParser: Parser[Char] = elem("start character", handlers.contains _) | - ((newline | err("expected start character " + handlers.keys.mkString("(", "", ")"))) ~> failure("end of input")) + + lazy val word: Parser[String] = + ("\'" ~> "[^'\n\r]*".r <~ "\'") | ("\"" ~> "[^\"\n\r]*".r <~ "\"") | WordRegex + + def startCharacterParser: Parser[Char] = + elem("start character", handlers.contains _) | + ( + (newline | err("expected start character " + handlers.keys.mkString("(", "", ")"))) + ~> failure("end of input") + ) def newline = """\s*([\n\r]|$)""".r } diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 73e3a8304..6cc640ddf 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -44,9 +44,15 @@ object Dependencies { val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" - val sjsonnew = Def.setting { "com.eed3si9n" %% "sjson-new-core" % contrabandSjsonNewVersion.value } - val sjsonnewScalaJson = Def.setting { "com.eed3si9n" %% "sjson-new-scalajson" % contrabandSjsonNewVersion.value } - val sjsonnewMurmurhash = Def.setting { "com.eed3si9n" %% "sjson-new-murmurhash" % contrabandSjsonNewVersion.value } + val sjsonnew = Def.setting { + "com.eed3si9n" %% "sjson-new-core" % contrabandSjsonNewVersion.value + } + val sjsonnewScalaJson = Def.setting { + "com.eed3si9n" %% "sjson-new-scalajson" % contrabandSjsonNewVersion.value + } + val sjsonnewMurmurhash = 
Def.setting { + "com.eed3si9n" %% "sjson-new-murmurhash" % contrabandSjsonNewVersion.value + } def log4jVersion = "2.8.1" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion diff --git a/project/plugins.sbt b/project/plugins.sbt index 493521644..aa8c8024a 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,4 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0") -addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.17") +addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.17") +addSbtPlugin("com.lucidchart" % "sbt-scalafmt" % "1.10") diff --git a/util-cache/src/main/scala/sbt/util/Cache.scala b/util-cache/src/main/scala/sbt/util/Cache.scala index 78d694b27..8e4a4a7de 100644 --- a/util-cache/src/main/scala/sbt/util/Cache.scala +++ b/util-cache/src/main/scala/sbt/util/Cache.scala @@ -21,6 +21,7 @@ case class Miss[O](update: O => Unit) extends CacheResult[O] * A simple cache with keys of type `I` and values of type `O` */ trait Cache[I, O] { + /** * Queries the cache backed with store `store` for key `key`. 
*/ @@ -59,7 +60,7 @@ object Cache { val result = default(key) update(result) result - } + } def debug[I](label: String, cache: SingletonCache[I]): SingletonCache[I] = new SingletonCache[I] { diff --git a/util-cache/src/main/scala/sbt/util/CacheImplicits.scala b/util-cache/src/main/scala/sbt/util/CacheImplicits.scala index 74cd51f68..b54a6b68c 100644 --- a/util-cache/src/main/scala/sbt/util/CacheImplicits.scala +++ b/util-cache/src/main/scala/sbt/util/CacheImplicits.scala @@ -3,5 +3,4 @@ package sbt.util import sjsonnew.BasicJsonProtocol object CacheImplicits extends CacheImplicits -trait CacheImplicits extends BasicCacheImplicits - with BasicJsonProtocol +trait CacheImplicits extends BasicCacheImplicits with BasicJsonProtocol diff --git a/util-cache/src/main/scala/sbt/util/CacheStore.scala b/util-cache/src/main/scala/sbt/util/CacheStore.scala index e29999471..054e07b6e 100644 --- a/util-cache/src/main/scala/sbt/util/CacheStore.scala +++ b/util-cache/src/main/scala/sbt/util/CacheStore.scala @@ -9,12 +9,15 @@ import sjsonnew.shaded.scalajson.ast.unsafe.JValue /** A `CacheStore` is used by the caching infrastructure to persist cached information. */ abstract class CacheStore extends Input with Output { + /** Delete the persisted information. */ def delete(): Unit + } object CacheStore { - implicit lazy val jvalueIsoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) + implicit lazy val jvalueIsoString: IsoString[JValue] = + IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) /** Returns file-based CacheStore using standard JSON converter. */ def apply(cacheFile: File): CacheStore = file(cacheFile) @@ -25,6 +28,7 @@ object CacheStore { /** Factory that can make new stores. */ abstract class CacheStoreFactory { + /** Create a new store. 
*/ def make(identifier: String): CacheStore @@ -36,7 +40,8 @@ abstract class CacheStoreFactory { } object CacheStoreFactory { - implicit lazy val jvalueIsoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) + implicit lazy val jvalueIsoString: IsoString[JValue] = + IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) /** Returns directory-based CacheStoreFactory using standard JSON converter. */ def apply(base: File): CacheStoreFactory = directory(base) @@ -46,29 +51,38 @@ object CacheStoreFactory { } /** A factory that creates new stores persisted in `base`. */ -class DirectoryStoreFactory[J: IsoString](base: File, converter: SupportConverter[J]) extends CacheStoreFactory { +class DirectoryStoreFactory[J: IsoString](base: File, converter: SupportConverter[J]) + extends CacheStoreFactory { IO.createDirectory(base) def make(identifier: String): CacheStore = new FileBasedStore(base / identifier, converter) - def sub(identifier: String): CacheStoreFactory = new DirectoryStoreFactory(base / identifier, converter) + def sub(identifier: String): CacheStoreFactory = + new DirectoryStoreFactory(base / identifier, converter) } /** A `CacheStore` that persists information in `file`. */ class FileBasedStore[J: IsoString](file: File, converter: SupportConverter[J]) extends CacheStore { IO.touch(file, setModified = false) - def read[T: JsonReader]() = Using.fileInputStream(file)(stream => new PlainInput(stream, converter).read()) + def read[T: JsonReader]() = + Using.fileInputStream(file)(stream => new PlainInput(stream, converter).read()) def write[T: JsonWriter](value: T) = - Using.fileOutputStream(append = false)(file)(stream => new PlainOutput(stream, converter).write(value)) + Using.fileOutputStream(append = false)(file) { stream => + new PlainOutput(stream, converter).write(value) + } def delete() = IO.delete(file) def close() = () } /** A store that reads from `inputStream` and writes to `outputStream`. 
*/ -class StreamBasedStore[J: IsoString](inputStream: InputStream, outputStream: OutputStream, converter: SupportConverter[J]) extends CacheStore { +class StreamBasedStore[J: IsoString]( + inputStream: InputStream, + outputStream: OutputStream, + converter: SupportConverter[J] +) extends CacheStore { def read[T: JsonReader]() = new PlainInput(inputStream, converter).read() def write[T: JsonWriter](value: T) = new PlainOutput(outputStream, converter).write(value) def delete() = () diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 9c675e652..6fcb3d619 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -32,7 +32,8 @@ object HashModifiedFileInfo { private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo -private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) extends HashModifiedFileInfo +private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) + extends HashModifiedFileInfo final case class FilesInfo[F <: FileInfo] private (files: Set[F]) object FilesInfo { @@ -52,7 +53,8 @@ object FileInfo { type F <: FileInfo implicit def format: JsonFormat[F] - implicit def formats: JsonFormat[FilesInfo[F]] = projectFormat(_.files, (fs: Set[F]) => FilesInfo(fs)) + implicit def formats: JsonFormat[FilesInfo[F]] = + projectFormat(_.files, (fs: Set[F]) => FilesInfo(fs)) def apply(file: File): F def apply(files: Set[File]): FilesInfo[F] = FilesInfo(files map apply) @@ -113,7 +115,9 @@ object FileInfo { implicit def apply(file: File): HashFileInfo = FileHash(file.getAbsoluteFile, computeHash(file)) - private def computeHash(file: File): List[Byte] = try Hash(file).toList 
catch { case NonFatal(_) => Nil } + private def computeHash(file: File): List[Byte] = + try Hash(file).toList + catch { case NonFatal(_) => Nil } } object lastModified extends Style { @@ -140,7 +144,8 @@ object FileInfo { } } - implicit def apply(file: File): ModifiedFileInfo = FileModified(file.getAbsoluteFile, file.lastModified) + implicit def apply(file: File): ModifiedFileInfo = + FileModified(file.getAbsoluteFile, file.lastModified) } object exists extends Style { diff --git a/util-cache/src/main/scala/sbt/util/Input.scala b/util-cache/src/main/scala/sbt/util/Input.scala index 646660e59..ee89c30b3 100644 --- a/util-cache/src/main/scala/sbt/util/Input.scala +++ b/util-cache/src/main/scala/sbt/util/Input.scala @@ -7,7 +7,9 @@ import sbt.io.{ IO, Using } trait Input extends Closeable { def read[T: JsonReader](): T - def read[T: JsonReader](default: => T): T = try read[T]() catch { case NonFatal(_) => default } + def read[T: JsonReader](default: => T): T = + try read[T]() + catch { case NonFatal(_) => default } } class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J]) extends Input { diff --git a/util-cache/src/main/scala/sbt/util/Output.scala b/util-cache/src/main/scala/sbt/util/Output.scala index cf4b27f12..db18f71a8 100644 --- a/util-cache/src/main/scala/sbt/util/Output.scala +++ b/util-cache/src/main/scala/sbt/util/Output.scala @@ -8,7 +8,8 @@ trait Output extends Closeable { def write[T: JsonWriter](value: T): Unit } -class PlainOutput[J: IsoString](output: OutputStream, converter: SupportConverter[J]) extends Output { +class PlainOutput[J: IsoString](output: OutputStream, converter: SupportConverter[J]) + extends Output { val isoFormat: IsoString[J] = implicitly def write[T: JsonWriter](value: T) = { diff --git a/util-cache/src/main/scala/sbt/util/SeparatedCache.scala b/util-cache/src/main/scala/sbt/util/SeparatedCache.scala index 0dc8dfceb..34a25345f 100644 --- a/util-cache/src/main/scala/sbt/util/SeparatedCache.scala +++ 
b/util-cache/src/main/scala/sbt/util/SeparatedCache.scala @@ -14,11 +14,13 @@ import CacheImplicits._ * A cache that stores a single value. */ trait SingletonCache[A] { + /** Reads the cache from the backing `from`. */ def read(from: Input): A /** Writes `value` to the backing `to`. */ def write(to: Output, value: A): Unit + } object SingletonCache { diff --git a/util-cache/src/main/scala/sbt/util/StampedFormat.scala b/util-cache/src/main/scala/sbt/util/StampedFormat.scala index c78186d1c..dddcefb42 100644 --- a/util-cache/src/main/scala/sbt/util/StampedFormat.scala +++ b/util-cache/src/main/scala/sbt/util/StampedFormat.scala @@ -10,13 +10,17 @@ object StampedFormat extends BasicJsonProtocol { withStamp(stamp(format))(format) } - def withStamp[T, S](stamp: S)(format: JsonFormat[T])(implicit formatStamp: JsonFormat[S], equivStamp: Equiv[S]): JsonFormat[T] = + def withStamp[T, S](stamp: S)(format: JsonFormat[T])( + implicit formatStamp: JsonFormat[S], + equivStamp: Equiv[S] + ): JsonFormat[T] = new JsonFormat[T] { override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): T = jsOpt match { case Some(js) => val stampedLength = unbuilder.beginArray(js) - if (stampedLength != 2) sys.error(s"Expected JsArray of size 2, found JsArray of size $stampedLength.") + if (stampedLength != 2) + sys.error(s"Expected JsArray of size 2, found JsArray of size $stampedLength.") val readStamp = unbuilder.nextElement val readValue = unbuilder.nextElement val actualStamp = formatStamp.read(Some(readStamp), unbuilder) @@ -34,7 +38,10 @@ object StampedFormat extends BasicJsonProtocol { builder.endArray() } } - private def stamp[T](format: JsonFormat[T])(implicit mf: Manifest[JsonFormat[T]]): Int = typeHash(mf) + + private def stamp[T](format: JsonFormat[T])(implicit mf: Manifest[JsonFormat[T]]): Int = + typeHash(mf) + private def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode -} \ No newline at end of file +} diff --git a/util-cache/src/test/scala/CacheSpec.scala 
b/util-cache/src/test/scala/CacheSpec.scala index 00481d227..468c647cd 100644 --- a/util-cache/src/test/scala/CacheSpec.scala +++ b/util-cache/src/test/scala/CacheSpec.scala @@ -13,7 +13,8 @@ import sbt.internal.util.UnitSpec class CacheSpec extends UnitSpec { - implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) + implicit val isoString: IsoString[JValue] = + IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) "A cache" should "NOT throw an exception if read without being written previously" in { testCache[String, Int] { @@ -68,10 +69,12 @@ class CacheSpec extends UnitSpec { } } - private def testCache[K, V](f: (Cache[K, V], CacheStore) => Unit)(implicit cache: Cache[K, V]): Unit = + private def testCache[K, V](f: (Cache[K, V], CacheStore) => Unit)( + implicit cache: Cache[K, V] + ): Unit = IO.withTemporaryDirectory { tmp => val store = new FileBasedStore(tmp / "cache-store", Converter) f(cache, store) } -} \ No newline at end of file +} diff --git a/util-cache/src/test/scala/SingletonCacheSpec.scala b/util-cache/src/test/scala/SingletonCacheSpec.scala index 9bfab82f8..84e91b627 100644 --- a/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -42,7 +42,8 @@ class SingletonCacheSpec extends UnitSpec { } } - implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) + implicit val isoString: IsoString[JValue] = + IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) "A singleton cache" should "throw an exception if read without being written previously" in { testCache[Int] { @@ -83,10 +84,12 @@ class SingletonCacheSpec extends UnitSpec { } } - private def testCache[T](f: (SingletonCache[T], CacheStore) => Unit)(implicit cache: SingletonCache[T]): Unit = + private def testCache[T](f: (SingletonCache[T], CacheStore) => Unit)( + implicit cache: SingletonCache[T] + ): Unit = IO.withTemporaryDirectory { tmp => val store 
= new FileBasedStore(tmp / "cache-store", Converter) f(cache, store) } -} \ No newline at end of file +} diff --git a/util-tracking/src/main/scala/sbt/util/ChangeReport.scala b/util-tracking/src/main/scala/sbt/util/ChangeReport.scala index af8154729..d971e4241 100644 --- a/util-tracking/src/main/scala/sbt/util/ChangeReport.scala +++ b/util-tracking/src/main/scala/sbt/util/ChangeReport.scala @@ -10,28 +10,36 @@ object ChangeReport { override def modified = files override def markAllModified = this } + def unmodified[T](files: Set[T]): ChangeReport[T] = new EmptyChangeReport[T] { override def checked = files override def unmodified = files } } + /** The result of comparing some current set of objects against a previous set of objects.*/ trait ChangeReport[T] { + /** The set of all of the objects in the current set.*/ def checked: Set[T] + /** All of the objects that are in the same state in the current and reference sets.*/ def unmodified: Set[T] + /** * All checked objects that are not in the same state as the reference. This includes objects that are in both * sets but have changed and files that are only in one set. */ def modified: Set[T] // all changes, including added + /** All objects that are only in the current set.*/ def added: Set[T] + /** All objects only in the previous set*/ def removed: Set[T] def +++(other: ChangeReport[T]): ChangeReport[T] = new CompoundChangeReport(this, other) + /** * Generate a new report with this report's unmodified set included in the new report's modified set. The new report's * unmodified set is empty. The new report's added, removed, and checked sets are the same as in this report. 
@@ -45,14 +53,16 @@ trait ChangeReport[T] { def removed = ChangeReport.this.removed override def markAllModified = this } - override def toString = - { - val labels = List("Checked", "Modified", "Unmodified", "Added", "Removed") - val sets = List(checked, modified, unmodified, added, removed) - val keyValues = labels.zip(sets).map { case (label, set) => label + ": " + set.mkString(", ") } - keyValues.mkString("Change report:\n\t", "\n\t", "") - } + + override def toString = { + val labels = List("Checked", "Modified", "Unmodified", "Added", "Removed") + val sets = List(checked, modified, unmodified, added, removed) + val keyValues = labels.zip(sets).map { case (label, set) => label + ": " + set.mkString(", ") } + keyValues.mkString("Change report:\n\t", "\n\t", "") + } + } + class EmptyChangeReport[T] extends ChangeReport[T] { def checked = Set.empty[T] def unmodified = Set.empty[T] @@ -61,7 +71,9 @@ class EmptyChangeReport[T] extends ChangeReport[T] { def removed = Set.empty[T] override def toString = "No changes" } -private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) extends ChangeReport[T] { + +private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) + extends ChangeReport[T] { lazy val checked = a.checked ++ b.checked lazy val unmodified = a.unmodified ++ b.unmodified lazy val modified = a.modified ++ b.modified diff --git a/util-tracking/src/main/scala/sbt/util/FileFunction.scala b/util-tracking/src/main/scala/sbt/util/FileFunction.scala index cdc1cc4e8..0aa58ca4c 100644 --- a/util-tracking/src/main/scala/sbt/util/FileFunction.scala +++ b/util-tracking/src/main/scala/sbt/util/FileFunction.scala @@ -43,7 +43,9 @@ object FileFunction { * @param inStyle The strategy by which to detect state change in the input files from the previous run * @param action The work function, which receives a list of input files and returns a list of output files */ - def cached(cacheBaseDirectory: File, inStyle: FileInfo.Style)(action: 
Set[File] => Set[File]): Set[File] => Set[File] = + def cached(cacheBaseDirectory: File, inStyle: FileInfo.Style)( + action: Set[File] => Set[File] + ): Set[File] => Set[File] = cached(cacheBaseDirectory, inStyle = inStyle, outStyle = defaultOutStyle)(action) /** @@ -64,8 +66,12 @@ object FileFunction { * @param outStyle The strategy by which to detect state change in the output files from the previous run * @param action The work function, which receives a list of input files and returns a list of output files */ - def cached(cacheBaseDirectory: File, inStyle: FileInfo.Style, outStyle: FileInfo.Style)(action: Set[File] => Set[File]): Set[File] => Set[File] = - cached(CacheStoreFactory(cacheBaseDirectory), inStyle, outStyle)((in, out) => action(in.checked)) + def cached(cacheBaseDirectory: File, inStyle: FileInfo.Style, outStyle: FileInfo.Style)( + action: Set[File] => Set[File] + ): Set[File] => Set[File] = + cached(CacheStoreFactory(cacheBaseDirectory), inStyle, outStyle)( + (in, out) => action(in.checked) + ) /** * Generic change-detection helper used to help build / artifact generation / @@ -103,7 +109,9 @@ object FileFunction { * @param inStyle The strategy by which to detect state change in the input files from the previous run * @param action The work function, which receives a list of input files and returns a list of output files */ - def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = + def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style)( + action: UpdateFunction + ): Set[File] => Set[File] = cached(storeFactory, inStyle = inStyle, outStyle = defaultOutStyle)(action) /** @@ -124,20 +132,21 @@ object FileFunction { * @param outStyle The strategy by which to detect state change in the output files from the previous run * @param action The work function, which receives a list of input files and returns a list of output files */ - def cached(storeFactory: CacheStoreFactory, 
inStyle: FileInfo.Style, outStyle: FileInfo.Style)(action: UpdateFunction): Set[File] => Set[File] = - { - lazy val inCache = Difference.inputs(storeFactory.make("in-cache"), inStyle) - lazy val outCache = Difference.outputs(storeFactory.make("out-cache"), outStyle) - inputs => - { - inCache(inputs) { inReport => - outCache { outReport => - if (inReport.modified.isEmpty && outReport.modified.isEmpty) - outReport.checked - else - action(inReport, outReport) - } + def cached(storeFactory: CacheStoreFactory, inStyle: FileInfo.Style, outStyle: FileInfo.Style)( + action: UpdateFunction + ): Set[File] => Set[File] = { + lazy val inCache = Difference.inputs(storeFactory.make("in-cache"), inStyle) + lazy val outCache = Difference.outputs(storeFactory.make("out-cache"), outStyle) + inputs => + { + inCache(inputs) { inReport => + outCache { outReport => + if (inReport.modified.isEmpty && outReport.modified.isEmpty) + outReport.checked + else + action(inReport, outReport) } } - } + } + } } diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index 2cca19325..cfc1d089f 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -13,6 +13,7 @@ import sjsonnew.JsonFormat import sjsonnew.support.murmurhash.Hasher object Tracked { + /** * Creates a tracker that provides the last time it was evaluated. * If the function throws an exception. @@ -42,21 +43,24 @@ object Tracked { * If 'useStartTime' is false, the recorded time is when the evaluated function completes. * In both cases, the timestamp is not updated if the function throws an exception. 
*/ - def tstamp(cacheFile: File, useStartTime: Boolean): Timestamp = tstamp(CacheStore(cacheFile), useStartTime) + def tstamp(cacheFile: File, useStartTime: Boolean): Timestamp = + tstamp(CacheStore(cacheFile), useStartTime) /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ def diffInputs(store: CacheStore, style: FileInfo.Style): Difference = Difference.inputs(store, style) /** Creates a tracker that provides the difference between a set of input files for successive invocations.*/ - def diffInputs(cacheFile: File, style: FileInfo.Style): Difference = diffInputs(CacheStore(cacheFile), style) + def diffInputs(cacheFile: File, style: FileInfo.Style): Difference = + diffInputs(CacheStore(cacheFile), style) /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ def diffOutputs(store: CacheStore, style: FileInfo.Style): Difference = Difference.outputs(store, style) /** Creates a tracker that provides the difference between a set of output files for successive invocations.*/ - def diffOutputs(cacheFile: File, style: FileInfo.Style): Difference = diffOutputs(CacheStore(cacheFile), style) + def diffOutputs(cacheFile: File, style: FileInfo.Style): Difference = + diffOutputs(CacheStore(cacheFile), style) /** Creates a tracker that provides the output of the most recent invocation of the function */ def lastOutput[I, O: JsonFormat](store: CacheStore)(f: (I, Option[O]) => O): I => O = { in => @@ -84,7 +88,9 @@ object Tracked { * cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) * }}} */ - def outputChanged[A1: JsonFormat, A2](store: CacheStore)(f: (Boolean, A1) => A2): (() => A1) => A2 = p => { + def outputChanged[A1: JsonFormat, A2](store: CacheStore)( + f: (Boolean, A1) => A2 + ): (() => A1) => A2 = p => { val cache: SingletonCache[Long] = { import CacheImplicits.LongJsonFormat implicitly @@ -131,7 +137,9 @@ object Tracked { * 
cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) * }}} */ - def inputChanged[I: JsonFormat: SingletonCache, O](store: CacheStore)(f: (Boolean, I) => O): I => O = { in => + def inputChanged[I: JsonFormat: SingletonCache, O](store: CacheStore)( + f: (Boolean, I) => O + ): I => O = { in => val cache: SingletonCache[Long] = { import CacheImplicits.LongJsonFormat implicitly @@ -159,7 +167,9 @@ object Tracked { * cachedDoc(inputs)(() => exists(outputDirectory.allPaths.get.toSet)) * }}} */ - def inputChanged[I: JsonFormat: SingletonCache, O](cacheFile: File)(f: (Boolean, I) => O): I => O = + def inputChanged[I: JsonFormat: SingletonCache, O](cacheFile: File)( + f: (Boolean, I) => O + ): I => O = inputChanged(CacheStore(cacheFile))(f) private final class CacheHelp[I: JsonFormat](val sc: SingletonCache[Long]) { @@ -186,23 +196,29 @@ object Tracked { } trait Tracked { + /** Cleans outputs and clears the cache.*/ def clean(): Unit + } -class Timestamp(val store: CacheStore, useStartTime: Boolean)(implicit format: JsonFormat[Long]) extends Tracked { + +class Timestamp(val store: CacheStore, useStartTime: Boolean)(implicit format: JsonFormat[Long]) + extends Tracked { def clean() = store.delete() + /** * Reads the previous timestamp, evaluates the provided function, * and then updates the timestamp if the function completes normally. 
*/ - def apply[T](f: Long => T): T = - { - val start = now() - val result = f(readTimestamp) - store.write(if (useStartTime) start else now()) - result - } + def apply[T](f: Long => T): T = { + val start = now() + val result = f(readTimestamp) + store.write(if (useStartTime) start else now()) + result + } + private def now() = System.currentTimeMillis + def readTimestamp: Long = Try { store.read[Long] } getOrElse 0 } @@ -210,24 +226,30 @@ class Timestamp(val store: CacheStore, useStartTime: Boolean)(implicit format: J @deprecated("Use Tracked.inputChanged and Tracked.outputChanged instead", "1.0.1") class Changed[O: Equiv: JsonFormat](val store: CacheStore) extends Tracked { def clean() = store.delete() - def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => - { - if (uptodate(value)) - ifUnchanged(value) - else { - update(value) - ifChanged(value) - } - } - def update(value: O): Unit = store.write(value) //Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): O => O2 = value => { + if (uptodate(value)) + ifUnchanged(value) + else { + update(value) + ifChanged(value) + } + } + + def update(value: O): Unit = + store.write(value) //Using.fileOutputStream(false)(cacheFile)(stream => format.writes(stream, value)) + def uptodate(value: O): Boolean = { val equiv: Equiv[O] = implicitly equiv.equiv(value, store.read[O]) } } + object Difference { - def constructor(defineClean: Boolean, filesAreOutputs: Boolean): (CacheStore, FileInfo.Style) => Difference = + def constructor( + defineClean: Boolean, + filesAreOutputs: Boolean + ): (CacheStore, FileInfo.Style) => Difference = (store, style) => new Difference(store, style, defineClean, filesAreOutputs) /** @@ -236,55 +258,63 @@ object Difference { * before and after running the function. 
*/ val outputs = constructor(true, true) + /** * Provides a constructor for a Difference that does nothing on a call to 'clean' and saves the * hash/last modified time of the files as they were prior to running the function. */ val inputs = constructor(false, false) + } -class Difference(val store: CacheStore, val style: FileInfo.Style, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked { - def clean() = - { - if (defineClean) IO.delete(raw(cachedFilesInfo)) else () - clearCache() - } + +class Difference( + val store: CacheStore, + val style: FileInfo.Style, + val defineClean: Boolean, + val filesAreOutputs: Boolean +) extends Tracked { + def clean() = { + if (defineClean) IO.delete(raw(cachedFilesInfo)) else () + clearCache() + } + private def clearCache() = store.delete() private def cachedFilesInfo = store.read(default = FilesInfo.empty[style.F])(style.formats).files private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) - def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = - { - val lastFilesInfo = cachedFilesInfo - apply(files, lastFilesInfo)(f)(_ => files) - } + def apply[T](files: Set[File])(f: ChangeReport[File] => T): T = { + val lastFilesInfo = cachedFilesInfo + apply(files, lastFilesInfo)(f)(_ => files) + } - def apply[T](f: ChangeReport[File] => T)(implicit toFiles: T => Set[File]): T = - { - val lastFilesInfo = cachedFilesInfo - apply(raw(lastFilesInfo), lastFilesInfo)(f)(toFiles) - } + def apply[T](f: ChangeReport[File] => T)(implicit toFiles: T => Set[File]): T = { + val lastFilesInfo = cachedFilesInfo + apply(raw(lastFilesInfo), lastFilesInfo)(f)(toFiles) + } private def abs(files: Set[File]) = files.map(_.getAbsoluteFile) - private[this] def apply[T](files: Set[File], lastFilesInfo: Set[style.F])(f: ChangeReport[File] => T)(extractFiles: T => Set[File]): T = - { - val lastFiles = raw(lastFilesInfo) - val currentFiles = abs(files) - val currentFilesInfo = style(currentFiles) - val report = new 
ChangeReport[File] { - lazy val checked = currentFiles - lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. - lazy val added = checked -- lastFiles // all files included now but not previously. This is independent of whether the files exist. - lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added - lazy val unmodified = checked -- modified - } + private[this] def apply[T](files: Set[File], lastFilesInfo: Set[style.F])( + f: ChangeReport[File] => T + )(extractFiles: T => Set[File]): T = { + val lastFiles = raw(lastFilesInfo) + val currentFiles = abs(files) + val currentFilesInfo = style(currentFiles) - val result = f(report) - val info = if (filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo - - store.write(info)(style.formats) - - result + val report = new ChangeReport[File] { + lazy val checked = currentFiles + lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. + lazy val added = checked -- lastFiles // all files included now but not previously. This is independent of whether the files exist. 
+ lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added + lazy val unmodified = checked -- modified } + + val result = f(report) + val info = if (filesAreOutputs) style(abs(extractFiles(result))) else currentFilesInfo + + store.write(info)(style.formats) + + result + } } diff --git a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index e0cf74558..a2015d76f 100644 --- a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala +++ b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -11,7 +11,6 @@ import sbt.internal.util.UnitSpec class TrackedSpec extends UnitSpec { "lastOutput" should "store the last output" in { withStore { store => - val value = 5 val otherValue = 10 @@ -132,7 +131,6 @@ class TrackedSpec extends UnitSpec { } } - "tstamp tracker" should "have a timestamp of 0 on first invocation" in { withStore { store => Tracked.tstamp(store) { last => @@ -143,7 +141,6 @@ class TrackedSpec extends UnitSpec { it should "provide the last time a function has been evaluated" in { withStore { store => - Tracked.tstamp(store) { last => assert(last === 0) } @@ -161,4 +158,4 @@ class TrackedSpec extends UnitSpec { f(store) } -} \ No newline at end of file +} From 0d4efe51e3f647278f5a782084fe7cd00a41016b Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 14 Aug 2017 15:25:36 +0100 Subject: [PATCH 712/823] Scalafmt 1.2.0 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 1c300d5ac..75189bece 100644 --- a/build.sbt +++ b/build.sbt @@ -63,7 +63,7 @@ lazy val utilRoot: Project = (project in file(".")) description := "Util module for sbt", scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), scalafmtOnCompile := true, - scalafmtVersion := "1.1.0", + scalafmtVersion := "1.2.0", )), commonSettings, name := "Util Root", From f74e3e66f1336cb68e582cb91f9d38c397f85a37 Mon Sep 17 00:00:00 
2001 From: Eugene Yokota Date: Thu, 24 Aug 2017 17:51:50 -0400 Subject: [PATCH 713/823] Trying to reproduce sbt/util#119 --- .../src/test/scala/ManagedLoggerSpec.scala | 24 +++++++++++++++++++ 1 file changed, 24 insertions(+) diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala index a0cf1e569..40601f1c4 100644 --- a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala +++ b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -50,6 +50,30 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { log.infoEvent(Vector(Vector(1, 2, 3))) } + it should "be thread safe" in { + import java.util.concurrent.{ Executors, TimeUnit } + import sjsonnew.BasicJsonProtocol._ + val pool = Executors.newFixedThreadPool(100) + + for { + i <- 1 to 10000 + } { + pool.submit(new Runnable { + def run(): Unit = { + val stringTypeTag = StringTypeTag[List[Int]] + val log = LogExchange.logger(s"foo$i") + LogExchange.bindLoggerAppenders(s"foo$i", List(LogExchange.asyncStdout -> Level.Info)) + if (i % 100 == 0) { + log.info(s"foo$i test $stringTypeTag") + } + Thread.sleep(1) + } + }) + } + pool.shutdown + pool.awaitTermination(30, TimeUnit.SECONDS) + } + "global logging" should "log immediately after initialization" in { // this is passed into State normally val global0 = initialGlobalLogging From b2be0f766ac58454a0b095c03610342c4e9704e2 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 25 Aug 2017 10:39:40 -0400 Subject: [PATCH 714/823] Attempt to solve the logger NPE issue Fixes sbt/util#119 1. perform the string codec registration once. 2. add retries. 
--- .../sbt/internal/util/ManagedLogger.scala | 12 ++------ .../sbt/internal/util/StringTypeTag.scala | 30 ++++++++++++++----- .../src/main/scala/sbt/util/LogExchange.scala | 19 ++++++++++++ .../src/test/scala/ManagedLoggerSpec.scala | 9 ++++-- 4 files changed, 51 insertions(+), 19 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 0fa390c2e..d4cef3fe0 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -5,9 +5,6 @@ import org.apache.logging.log4j.{ Logger => XLogger } import org.apache.logging.log4j.message.ObjectMessage import sjsonnew.JsonFormat import scala.reflect.runtime.universe.TypeTag -import sbt.internal.util.codec.ThrowableShowLines._ -import sbt.internal.util.codec.TraceEventShowLines._ -import sbt.internal.util.codec.SuccessEventShowLines._ import sbt.internal.util.codec.JsonProtocol._ /** @@ -34,14 +31,9 @@ class ManagedLogger( } def registerStringCodec[A: ShowLines: TypeTag]: Unit = { - val tag = StringTypeTag[A] - val ev = implicitly[ShowLines[A]] - // println(s"registerStringCodec ${tag.key}") - val _ = LogExchange.getOrElseUpdateStringCodec(tag.key, ev) + LogExchange.registerStringCodec[A] } - registerStringCodec[Throwable] - registerStringCodec[TraceEvent] - registerStringCodec[SuccessEvent] + final def debugEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Debug, event) final def infoEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Info, event) final def warnEvent[A: JsonFormat: TypeTag](event: => A): Unit = logEvent(Level.Warn, event) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala index 00f30d7d2..aa635c975 100644 --- 
a/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StringTypeTag.scala @@ -8,13 +8,29 @@ final case class StringTypeTag[A](key: String) { } object StringTypeTag { - def apply[A: TypeTag]: StringTypeTag[A] = { - val tag = implicitly[TypeTag[A]] - val tpe = tag.tpe - val k = typeToString(tpe) - // println(tpe.getClass.toString + " " + k) - StringTypeTag[A](k) - } + def apply[A: TypeTag]: StringTypeTag[A] = + synchronized { + def doApply: StringTypeTag[A] = { + val tag = implicitly[TypeTag[A]] + val tpe = tag.tpe + val k = typeToString(tpe) + // println(tpe.getClass.toString + " " + k) + StringTypeTag[A](k) + } + def retry(n: Int): StringTypeTag[A] = + try { + doApply + } catch { + case e: NullPointerException => + if (n < 1) throw new RuntimeException("NPE in StringTypeTag", e) + else { + Thread.sleep(1) + retry(n - 1) + } + } + retry(3) + } + def typeToString(tpe: Type): String = tpe match { case TypeRef(_, sym, args) => diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index ba3114643..0c35c98ee 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -8,6 +8,7 @@ import org.apache.logging.log4j.core.config.{ AppenderRef, LoggerConfig } import org.apache.logging.log4j.core.layout.PatternLayout import scala.collection.JavaConverters._ import scala.collection.concurrent +import scala.reflect.runtime.universe.TypeTag import sjsonnew.JsonFormat // http://logging.apache.org/log4j/2.x/manual/customconfig.html @@ -15,6 +16,7 @@ import sjsonnew.JsonFormat sealed abstract class LogExchange { private[sbt] lazy val context: LoggerContext = init() + private[sbt] lazy val builtInStringCodecs: Unit = initStringCodecs() private[sbt] lazy val asyncStdout: AsyncAppender = buildAsyncStdout private[sbt] val 
jsonCodecs: concurrent.Map[String, JsonFormat[_]] = concurrent.TrieMap() private[sbt] val stringCodecs: concurrent.Map[String, ShowLines[_]] = concurrent.TrieMap() @@ -22,6 +24,7 @@ sealed abstract class LogExchange { def logger(name: String): ManagedLogger = logger(name, None, None) def logger(name: String, channelName: Option[String], execId: Option[String]): ManagedLogger = { val _ = context + val codecs = builtInStringCodecs val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } val config = ctx.getConfiguration val loggerConfig = LoggerConfig.createLogger( @@ -57,6 +60,16 @@ sealed abstract class LogExchange { config.getLoggerConfig(loggerName) } + private[sbt] def initStringCodecs(): Unit = { + import sbt.internal.util.codec.ThrowableShowLines._ + import sbt.internal.util.codec.TraceEventShowLines._ + import sbt.internal.util.codec.SuccessEventShowLines._ + + registerStringCodec[Throwable] + registerStringCodec[TraceEvent] + registerStringCodec[SuccessEvent] + } + // This is a dummy layout to avoid casting error during PatternLayout.createDefaultLayout() // that was originally used for ConsoleAppender. // The stacktrace shows it's having issue initializing default DefaultConfiguration. 
@@ -85,6 +98,12 @@ sealed abstract class LogExchange { def getOrElseUpdateStringCodec[A](tag: String, v: ShowLines[A]): ShowLines[A] = stringCodecs.getOrElseUpdate(tag, v).asInstanceOf[ShowLines[A]] + def registerStringCodec[A: ShowLines: TypeTag]: Unit = { + val tag = StringTypeTag[A] + val ev = implicitly[ShowLines[A]] + val _ = getOrElseUpdateStringCodec(tag.key, ev) + } + private[sbt] def buildAsyncStdout: AsyncAppender = { val ctx = XLogManager.getContext(false) match { case x: LoggerContext => x } val config = ctx.getConfiguration diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala index 40601f1c4..8e9c905e1 100644 --- a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala +++ b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -20,6 +20,13 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { log.infoEvent(1) } + it should "support logging Throwable out of the box" in { + import sbt.internal.util.codec.JsonProtocol._ + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + log.infoEvent(SuccessEvent("yes")) + } + it should "allow registering Show[Int]" in { import sjsonnew.BasicJsonProtocol._ val log = LogExchange.logger("foo") @@ -52,9 +59,7 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { it should "be thread safe" in { import java.util.concurrent.{ Executors, TimeUnit } - import sjsonnew.BasicJsonProtocol._ val pool = Executors.newFixedThreadPool(100) - for { i <- 1 to 10000 } { From 83433d40e60c872395afa5d07c7e5a35c321be6b Mon Sep 17 00:00:00 2001 From: "tom.walford" Date: Fri, 25 Aug 2017 20:47:36 +0100 Subject: [PATCH 715/823] Cleaned up the deprecation messages to point to the correct classes. 
--- .../scala/sbt/internal/util/ConsoleAppender.scala | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 0685c69f0..48b81e786 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -13,17 +13,17 @@ import ConsoleAppender._ object ConsoleLogger { // These are provided so other modules do not break immediately. - @deprecated("Use EscHelpers.", "0.13.x") + @deprecated("Use EscHelpers.ESC instead", "0.13.x") final val ESC = EscHelpers.ESC - @deprecated("Use EscHelpers.", "0.13.x") + @deprecated("Use EscHelpers.isEscapeTerminator instead", "0.13.x") private[sbt] def isEscapeTerminator(c: Char): Boolean = EscHelpers.isEscapeTerminator(c) - @deprecated("Use EscHelpers.", "0.13.x") + @deprecated("Use EscHelpers.hasEscapeSequence instead", "0.13.x") def hasEscapeSequence(s: String): Boolean = EscHelpers.hasEscapeSequence(s) - @deprecated("Use EscHelpers.", "0.13.x") + @deprecated("Use EscHelpers.removeEscapeSequences instead", "0.13.x") def removeEscapeSequences(s: String): String = EscHelpers.removeEscapeSequences(s) - @deprecated("Use ConsoleAppenders.formatEnabledInEnv", "0.13.x") + @deprecated("Use ConsoleAppender.formatEnabledInEnv instead", "0.13.x") val formatEnabled = ConsoleAppender.formatEnabledInEnv - @deprecated("Use ConsoleAppender.", "0.13.x") + @deprecated("Use ConsoleAppender.noSuppressedMessage instead", "0.13.x") val noSuppressedMessage = ConsoleAppender.noSuppressedMessage /** From a088169568809317367b5414d8032ae38c9a71e6 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 25 Aug 2017 17:14:41 -0400 Subject: [PATCH 716/823] Don't format *.sbt files This adds update on load, which interacts badly with +compile. 
--- .travis.yml | 2 +- build.sbt | 1 + 2 files changed, 2 insertions(+), 1 deletion(-) diff --git a/.travis.yml b/.travis.yml index 1af85e971..3287be692 100644 --- a/.travis.yml +++ b/.travis.yml @@ -6,7 +6,7 @@ scala: - 2.12.3 script: - - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues scalafmt::test test:scalafmt::test sbt:scalafmt::test test + - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues scalafmt::test test:scalafmt::test test cache: directories: diff --git a/build.sbt b/build.sbt index 75189bece..8ddde2f78 100644 --- a/build.sbt +++ b/build.sbt @@ -63,6 +63,7 @@ lazy val utilRoot: Project = (project in file(".")) description := "Util module for sbt", scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), scalafmtOnCompile := true, + scalafmtOnCompile in Sbt := false, scalafmtVersion := "1.2.0", )), commonSettings, From 32412e4625659547a1761b7274eebc5ce7f1056a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 26 Aug 2017 13:13:04 -0400 Subject: [PATCH 717/823] Use sbt 1.0.0 --- build.sbt | 2 +- project/build.properties | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 75189bece..24db64b6d 100644 --- a/build.sbt +++ b/build.sbt @@ -32,7 +32,7 @@ def commonSettings: Seq[Setting[_]] = Seq( val mimaSettings = Def settings ( mimaPreviousArtifacts := Set( - organization.value % moduleName.value % "1.0.0-RC3" + organization.value % moduleName.value % "1.0.0" cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) ) ) diff --git a/project/build.properties b/project/build.properties index 12c38d389..94005e587 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.0.0-RC3 +sbt.version=1.0.0 From ddb6a13febf38814ef2e8858fda2b182747ebb8c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 24 Sep 
2017 06:36:18 -0400 Subject: [PATCH 718/823] Provide JValue pass-through --- .../main/scala/sbt/internal/util/codec/JValueFormats.scala | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala index c0c79f7d1..5d8d58146 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/JValueFormats.scala @@ -42,8 +42,13 @@ trait JValueFormats { self: sjsonnew.BasicJsonProtocol => } } + // This passes through JValue, or returns JNull instead of blowing up with unimplemented. implicit lazy val JValueJsonReader: JR[JValue] = new JR[JValue] { - def read[J](j: Option[J], u: Unbuilder[J]) = ??? // Is this even possible? with no Manifest[J]? + def read[J](j: Option[J], u: Unbuilder[J]) = j match { + case Some(x: JValue) => x + case Some(x) => sys.error(s"Unknown AST $x") + case _ => JNull + } } implicit lazy val JValueFormat: JF[JValue] = From 6fca557dc535a17131ad02cafe2fabbd444989e8 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 24 Sep 2017 06:43:27 -0400 Subject: [PATCH 719/823] 1.0.2-SNAPSHOT --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 72a33fadd..41a967b3a 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,7 @@ import Dependencies._ import Util._ //import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion = "1.0.0-SNAPSHOT" +def baseVersion = "1.0.2-SNAPSHOT" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( From c0c186580a199657478d6844606c24f4254d58f0 Mon Sep 17 00:00:00 2001 From: Leonard Ehrenfried Date: Tue, 7 Nov 2017 09:25:10 +0100 Subject: [PATCH 720/823] Upgrade scala and dependency versions --- project/Dependencies.scala | 6 +++--- project/plugins.sbt | 2 +- 2 files changed, 4 
insertions(+), 4 deletions(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 6cc640ddf..bce9bf8cb 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -4,10 +4,10 @@ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala210 = "2.10.6" - val scala211 = "2.11.11" - val scala212 = "2.12.3" + val scala211 = "2.11.12" + val scala212 = "2.12.4" - private val ioVersion = "1.0.0" + private val ioVersion = "1.0.2" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion diff --git a/project/plugins.sbt b/project/plugins.sbt index aa8c8024a..81ebe9957 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,4 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0") addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.17") -addSbtPlugin("com.lucidchart" % "sbt-scalafmt" % "1.10") +addSbtPlugin("com.lucidchart" % "sbt-scalafmt" % "1.14") From 48f9cf3be60190c07d5c8be1ac3a7aa7c259715b Mon Sep 17 00:00:00 2001 From: Leonard Ehrenfried Date: Wed, 8 Nov 2017 12:30:51 +0100 Subject: [PATCH 721/823] Disable calculation of log4j caller location information --- .../util-logging/src/main/scala/sbt/util/LogExchange.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index 0c35c98ee..b5bb1cb56 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -31,7 +31,9 @@ sealed abstract class LogExchange { false, XLevel.DEBUG, name, - "true", + // disable the calculation of caller location as it is very expensive + // https://issues.apache.org/jira/browse/LOG4J2-153 + "false", Array[AppenderRef](), null, config, From 3cc56ad0bad6c23d458346bf16058a3ac98d9bdf Mon Sep 17 00:00:00 2001 From: Leonard 
Ehrenfried Date: Wed, 8 Nov 2017 13:43:03 +0100 Subject: [PATCH 722/823] Add performance test --- .../src/test/scala/ManagedLoggerSpec.scala | 12 ++++++++++++ 1 file changed, 12 insertions(+) diff --git a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala index 8e9c905e1..56742b836 100644 --- a/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala +++ b/internal/util-logging/src/test/scala/ManagedLoggerSpec.scala @@ -20,6 +20,18 @@ class ManagedLoggerSpec extends FlatSpec with Matchers { log.infoEvent(1) } + it should "validate performance improvement of disabling location calculation for async loggers" in { + val log = LogExchange.logger("foo") + LogExchange.bindLoggerAppenders("foo", List(LogExchange.asyncStdout -> Level.Info)) + val before = System.currentTimeMillis() + 1 to 10000 foreach { _ => + log.debug("test") + } + val after = System.currentTimeMillis() + + log.info(s"Performance test took: ${after - before}ms") + } + it should "support logging Throwable out of the box" in { import sbt.internal.util.codec.JsonProtocol._ val log = LogExchange.logger("foo") From c1966b688decfdff6476cb16201ed3895585d47b Mon Sep 17 00:00:00 2001 From: Leonard Ehrenfried Date: Tue, 7 Nov 2017 09:56:34 +0100 Subject: [PATCH 723/823] Also upgrade versions in .travis.yml --- .travis.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.travis.yml b/.travis.yml index 3287be692..ce5390a3a 100644 --- a/.travis.yml +++ b/.travis.yml @@ -2,8 +2,8 @@ language: scala jdk: oraclejdk8 scala: - - 2.11.11 - - 2.12.3 + - 2.11.12 + - 2.12.4 script: - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues scalafmt::test test:scalafmt::test test From ff054d8ef5f294abfd1af6cf84f74e78e2309536 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 28 Nov 2017 23:12:01 -0500 Subject: [PATCH 724/823] IO 1.1.1 --- project/Dependencies.scala | 4 ++-- 1 
file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index bce9bf8cb..439ff5650 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -3,11 +3,11 @@ import Keys._ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { - val scala210 = "2.10.6" + val scala210 = "2.10.7" val scala211 = "2.11.12" val scala212 = "2.12.4" - private val ioVersion = "1.0.2" + private val ioVersion = "1.1.1" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From ccf149e8bf87cd2cbf70313db023ec70ba4a618e Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Thu, 30 Nov 2017 04:12:13 +0100 Subject: [PATCH 725/823] Convert lastModified() calls to sbt.io.Milli.getModifiedTime() --- .../src/main/scala/sbt/internal/scripted/FileCommands.scala | 4 +++- util-cache/src/main/scala/sbt/util/FileInfo.scala | 5 +++-- util-cache/src/test/scala/FileInfoSpec.scala | 3 ++- 3 files changed, 8 insertions(+), 4 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 3b5daaef7..8849f4987 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -9,6 +9,7 @@ import java.io.File import sbt.io.{ IO, Path } import sbt.io.syntax._ import Path._ +import sbt.io.Milli.getModifiedTime class FileCommands(baseDirectory: File) extends BasicStatementHandler { lazy val commands = commandMap @@ -67,7 +68,8 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { def newer(a: String, b: String): Unit = { val pathA = fromString(a) val pathB = fromString(b) - val isNewer = pathA.exists && (!pathB.exists || pathA.lastModified > pathB.lastModified) + val isNewer = pathA.exists && (!pathB.exists || getModifiedTime(pathA) > getModifiedTime( + pathB)) if 
(!isNewer) { scriptError(s"$pathA is not newer than $pathB") } diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 6fcb3d619..90a65b40b 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -8,6 +8,7 @@ import scala.util.control.NonFatal import sbt.io.Hash import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } import CacheImplicits._ +import sbt.io.Milli.getModifiedTime sealed trait FileInfo { def file: File } sealed trait HashFileInfo extends FileInfo { def hash: List[Byte] } @@ -88,7 +89,7 @@ object FileInfo { } implicit def apply(file: File): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, Hash(file).toList, file.lastModified) + FileHashModified(file.getAbsoluteFile, Hash(file).toList, getModifiedTime(file)) } object hash extends Style { @@ -145,7 +146,7 @@ object FileInfo { } implicit def apply(file: File): ModifiedFileInfo = - FileModified(file.getAbsoluteFile, file.lastModified) + FileModified(file.getAbsoluteFile, getModifiedTime(file)) } object exists extends Style { diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index debd427c7..a10d4b6bc 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -3,10 +3,11 @@ package sbt.util import sjsonnew.shaded.scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import sbt.internal.util.UnitSpec +import sbt.io.Milli.getModifiedTime class FileInfoSpec extends UnitSpec { val file = new java.io.File(".").getAbsoluteFile - val fileInfo: ModifiedFileInfo = FileModified(file, file.lastModified()) + val fileInfo: ModifiedFileInfo = FileModified(file, getModifiedTime(file)) val filesInfo = FilesInfo(Set(fileInfo)) it should "round trip" in assertRoundTrip(filesInfo) From d03dfb39817be24a84556cabfc4affc4ac257bb2 Mon Sep 17 00:00:00 
2001 From: Antonio Cunei Date: Mon, 4 Dec 2017 21:22:07 +0100 Subject: [PATCH 726/823] Moved Milli._ to IO. --- .../src/main/scala/sbt/internal/scripted/FileCommands.scala | 5 ++--- util-cache/src/main/scala/sbt/util/FileInfo.scala | 2 +- util-cache/src/test/scala/FileInfoSpec.scala | 2 +- 3 files changed, 4 insertions(+), 5 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 8849f4987..95257d288 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -9,7 +9,7 @@ import java.io.File import sbt.io.{ IO, Path } import sbt.io.syntax._ import Path._ -import sbt.io.Milli.getModifiedTime +import sbt.io.IO.getModifiedTime class FileCommands(baseDirectory: File) extends BasicStatementHandler { lazy val commands = commandMap @@ -68,8 +68,7 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { def newer(a: String, b: String): Unit = { val pathA = fromString(a) val pathB = fromString(b) - val isNewer = pathA.exists && (!pathB.exists || getModifiedTime(pathA) > getModifiedTime( - pathB)) + val isNewer = pathA.exists && (!pathB.exists || getModifiedTime(pathA) > getModifiedTime(pathB)) if (!isNewer) { scriptError(s"$pathA is not newer than $pathB") } diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 90a65b40b..218110f20 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -8,7 +8,7 @@ import scala.util.control.NonFatal import sbt.io.Hash import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } import CacheImplicits._ -import sbt.io.Milli.getModifiedTime +import sbt.io.IO.getModifiedTime sealed trait FileInfo { def file: File } sealed trait 
HashFileInfo extends FileInfo { def hash: List[Byte] } diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index a10d4b6bc..936441fbb 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -3,7 +3,7 @@ package sbt.util import sjsonnew.shaded.scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import sbt.internal.util.UnitSpec -import sbt.io.Milli.getModifiedTime +import sbt.io.IO.getModifiedTime class FileInfoSpec extends UnitSpec { val file = new java.io.File(".").getAbsoluteFile From cd4346c5d7aa0c04893bc7ae85c240e1a6926c62 Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Fri, 8 Dec 2017 23:50:40 +0100 Subject: [PATCH 727/823] Allow FileInfo for non-existent files with the new timestamps FileInfo is used to wrap information like last modified time on files that may or may not exist. Arguably, that does not make much sense: the non-existent files should not lead to modification file information, hashes, and a persistent serialized version of the resulting meaningless information. However, considering that the FileInfo information is serialized and saved, it is necessary to preserve compatibility at this stage. Therefore the modification time is explicitly set to zero for those files that do not exist when each FileInfo is built. 
--- util-cache/src/main/scala/sbt/util/FileInfo.scala | 14 ++++++++++++-- 1 file changed, 12 insertions(+), 2 deletions(-) diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 218110f20..4981831f4 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -4,6 +4,7 @@ package sbt.util import java.io.File +import java.io.FileNotFoundException import scala.util.control.NonFatal import sbt.io.Hash import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } @@ -50,6 +51,15 @@ object FilesInfo { } object FileInfo { + + def getModifiedTimeOrZero(file: File) = { // returns 0L if file does not exist + try { + getModifiedTime(file) + } catch { + case _: FileNotFoundException => 0L + } + } + sealed trait Style { type F <: FileInfo @@ -89,7 +99,7 @@ object FileInfo { } implicit def apply(file: File): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, Hash(file).toList, getModifiedTime(file)) + FileHashModified(file.getAbsoluteFile, Hash(file).toList, getModifiedTimeOrZero(file)) } object hash extends Style { @@ -146,7 +156,7 @@ object FileInfo { } implicit def apply(file: File): ModifiedFileInfo = - FileModified(file.getAbsoluteFile, getModifiedTime(file)) + FileModified(file.getAbsoluteFile, getModifiedTimeOrZero(file)) } object exists extends Style { From 13a8d5336944ac12ed5dc39232effeedaea32ab7 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 13 Dec 2017 15:47:15 +0000 Subject: [PATCH 728/823] Use IO.getModified over importing the method .. and make getModifiedTimeOrZero private. 
--- .../scala/sbt/internal/scripted/FileCommands.scala | 3 ++- util-cache/src/main/scala/sbt/util/FileInfo.scala | 14 +++++--------- 2 files changed, 7 insertions(+), 10 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 95257d288..549352ca1 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -68,7 +68,8 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { def newer(a: String, b: String): Unit = { val pathA = fromString(a) val pathB = fromString(b) - val isNewer = pathA.exists && (!pathB.exists || getModifiedTime(pathA) > getModifiedTime(pathB)) + val isNewer = pathA.exists && + (!pathB.exists || IO.getModifiedTime(pathA) > IO.getModifiedTime(pathB)) if (!isNewer) { scriptError(s"$pathA is not newer than $pathB") } diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 4981831f4..2a26e193d 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -6,10 +6,9 @@ package sbt.util import java.io.File import java.io.FileNotFoundException import scala.util.control.NonFatal -import sbt.io.Hash +import sbt.io.{ Hash, IO } import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } import CacheImplicits._ -import sbt.io.IO.getModifiedTime sealed trait FileInfo { def file: File } sealed trait HashFileInfo extends FileInfo { def hash: List[Byte] } @@ -52,13 +51,10 @@ object FilesInfo { object FileInfo { - def getModifiedTimeOrZero(file: File) = { // returns 0L if file does not exist - try { - getModifiedTime(file) - } catch { - case _: FileNotFoundException => 0L - } - } + // returns 0L if file does not exist + private def getModifiedTimeOrZero(file: 
File) = + try IO.getModifiedTime(file) + catch { case _: FileNotFoundException => 0L } sealed trait Style { type F <: FileInfo From d2338ff28760232e827e7f8cbca07af44c01ccf2 Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Fri, 15 Dec 2017 17:23:39 +0100 Subject: [PATCH 729/823] Removed a couple more direct imports of getModifiedTime() --- .../src/main/scala/sbt/internal/scripted/FileCommands.scala | 2 +- util-cache/src/test/scala/FileInfoSpec.scala | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 549352ca1..3a1dcc1fe 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -9,7 +9,7 @@ import java.io.File import sbt.io.{ IO, Path } import sbt.io.syntax._ import Path._ -import sbt.io.IO.getModifiedTime +import sbt.io.IO class FileCommands(baseDirectory: File) extends BasicStatementHandler { lazy val commands = commandMap diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index 936441fbb..62e941254 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -3,11 +3,11 @@ package sbt.util import sjsonnew.shaded.scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ import sbt.internal.util.UnitSpec -import sbt.io.IO.getModifiedTime +import sbt.io.IO class FileInfoSpec extends UnitSpec { val file = new java.io.File(".").getAbsoluteFile - val fileInfo: ModifiedFileInfo = FileModified(file, getModifiedTime(file)) + val fileInfo: ModifiedFileInfo = FileModified(file, IO.getModifiedTime(file)) val filesInfo = FilesInfo(Set(fileInfo)) it should "round trip" in assertRoundTrip(filesInfo) From 3d9eab1bf8866010b8643261e7f2df3bab1102ab Mon Sep 17 
00:00:00 2001 From: eugene yokota Date: Fri, 15 Dec 2017 12:51:29 -0500 Subject: [PATCH 730/823] IO 1.1.2 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 439ff5650..453f20649 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { val scala211 = "2.11.12" val scala212 = "2.12.4" - private val ioVersion = "1.1.1" + private val ioVersion = "1.1.2" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From e835ce068959779984c80517d4b145947a361ccb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 15 Dec 2017 13:15:34 -0500 Subject: [PATCH 731/823] bump plugins --- .../src/main/scala/sbt/internal/scripted/FileCommands.scala | 2 +- project/plugins.sbt | 6 ++---- 2 files changed, 3 insertions(+), 5 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 3a1dcc1fe..331d34e93 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -69,7 +69,7 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { val pathA = fromString(a) val pathB = fromString(b) val isNewer = pathA.exists && - (!pathB.exists || IO.getModifiedTime(pathA) > IO.getModifiedTime(pathB)) + (!pathB.exists || IO.getModifiedTime(pathA) > IO.getModifiedTime(pathB)) if (!isNewer) { scriptError(s"$pathA is not newer than $pathB") } diff --git a/project/plugins.sbt b/project/plugins.sbt index 81ebe9957..a36d654a3 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,2 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.3") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.0") -addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.1.17") 
-addSbtPlugin("com.lucidchart" % "sbt-scalafmt" % "1.14") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.4") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.2") From 2ee0a1e19aee1875b956b39acef3af13fda2a28f Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 18 Dec 2017 16:01:01 +0000 Subject: [PATCH 732/823] Add 1.1.1 to mimaPreviousArtifacts, & backfill --- build.sbt | 7 +++++-- 1 file changed, 5 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 41a967b3a..c421a25c0 100644 --- a/build.sbt +++ b/build.sbt @@ -32,9 +32,12 @@ def commonSettings: Seq[Setting[_]] = Seq( val mimaSettings = Def settings ( mimaPreviousArtifacts := Set( - organization.value % moduleName.value % "1.0.0" + "1.0.0", "1.0.1", "1.0.2", "1.0.3", + "1.1.0", "1.1.1", + ) map (version => + organization.value %% moduleName.value % version cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) - ) + ), ) lazy val utilRoot: Project = (project in file(".")) From 28bcc6c602aee31a1208a16bbba9ab7a04a9fbb6 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 19 Dec 2017 13:22:48 +0000 Subject: [PATCH 733/823] Upgrade to sbt-houserules 0.3.5 --- build.sbt | 2 -- project/plugins.sbt | 2 +- 2 files changed, 1 insertion(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index c421a25c0..412d7b65f 100644 --- a/build.sbt +++ b/build.sbt @@ -65,9 +65,7 @@ lazy val utilRoot: Project = (project in file(".")) homepage := Some(url("https://github.com/sbt/util")), description := "Util module for sbt", scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), - scalafmtOnCompile := true, scalafmtOnCompile in Sbt := false, - scalafmtVersion := "1.2.0", )), commonSettings, name := "Util Root", diff --git a/project/plugins.sbt b/project/plugins.sbt index a36d654a3..b85b1f525 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,2 +1,2 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.4") 
+addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.5") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.2") From 5ed25cbc56020784227c683117f1cf18ea6145c2 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 21 Dec 2017 17:09:30 +0000 Subject: [PATCH 734/823] Make EscHelpers.removeEscapeSequences handle partial escape sequences Fixes #67 --- .../src/main/scala/sbt/internal/util/EscHelpers.scala | 3 ++- internal/util-logging/src/test/scala/Escapes.scala | 10 ++++++++++ 2 files changed, 12 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala b/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala index ad394994c..26ad513a8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/EscHelpers.scala @@ -66,8 +66,9 @@ object EscHelpers { } else { sb.append(s, start, escIndex) val next: Int = + if (escIndex + 1 >= s.length) skipESC(s, escIndex + 1) // If it's a CSI we skip past it and then look for a terminator. - if (isCSI(s.charAt(escIndex + 1))) skipESC(s, escIndex + 2) + else if (isCSI(s.charAt(escIndex + 1))) skipESC(s, escIndex + 2) else if (isAnsiTwoCharacterTerminator(s.charAt(escIndex + 1))) escIndex + 2 else { // There could be non-ANSI character sequences we should make sure we handle here. 
diff --git a/internal/util-logging/src/test/scala/Escapes.scala b/internal/util-logging/src/test/scala/Escapes.scala index 9db109d7f..2464286bb 100644 --- a/internal/util-logging/src/test/scala/Escapes.scala +++ b/internal/util-logging/src/test/scala/Escapes.scala @@ -34,6 +34,16 @@ object Escapes extends Properties("Escapes") { !hasEscapeSequence(removed) } + private[this] final val ecs = ESC.toString + private val partialEscapeSequences = + Gen.oneOf(Gen const ecs, Gen const ecs ++ "[", Gen.choose('@', '_').map(ecs :+ _)) + + property("removeEscapeSequences handles partial escape sequences") = + forAll(partialEscapeSequences) { s => + val removed: String = removeEscapeSequences(s) + s"Escape sequence removed: '$removed'" |: !hasEscapeSequence(removed) + } + property("removeEscapeSequences returns string without escape sequences") = forAllNoShrink(genWithoutEscape, genEscapePairs) { (start: String, escapes: List[EscapeAndNot]) => From 8ba68eedfd9433c9cc5e5e36fc6db81e95352d2a Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Mon, 18 Dec 2017 17:15:15 +0100 Subject: [PATCH 735/823] Revert *ModifiedTime() calls to *lastModified*() calls There are just too many instances in which sbt's code relies on the `lastModified`/`setLastModified` semantics, so instead of moving to `get`/`setModifiedTime`, we use new IO calls that offer the new timestamp precision, but retain the old semantics. 
--- .../main/scala/sbt/internal/scripted/FileCommands.scala | 2 +- util-cache/src/main/scala/sbt/util/FileInfo.scala | 9 ++------- util-cache/src/test/scala/FileInfoSpec.scala | 2 +- 3 files changed, 4 insertions(+), 9 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index 331d34e93..b038ceb67 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -69,7 +69,7 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { val pathA = fromString(a) val pathB = fromString(b) val isNewer = pathA.exists && - (!pathB.exists || IO.getModifiedTime(pathA) > IO.getModifiedTime(pathB)) + (!pathB.exists || IO.lastModified(pathA) > IO.lastModified(pathB)) if (!isNewer) { scriptError(s"$pathA is not newer than $pathB") } diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 2a26e193d..1dc3cb186 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -51,11 +51,6 @@ object FilesInfo { object FileInfo { - // returns 0L if file does not exist - private def getModifiedTimeOrZero(file: File) = - try IO.getModifiedTime(file) - catch { case _: FileNotFoundException => 0L } - sealed trait Style { type F <: FileInfo @@ -95,7 +90,7 @@ object FileInfo { } implicit def apply(file: File): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, Hash(file).toList, getModifiedTimeOrZero(file)) + FileHashModified(file.getAbsoluteFile, Hash(file).toList, IO.lastModified(file)) } object hash extends Style { @@ -152,7 +147,7 @@ object FileInfo { } implicit def apply(file: File): ModifiedFileInfo = - FileModified(file.getAbsoluteFile, getModifiedTimeOrZero(file)) + 
FileModified(file.getAbsoluteFile, IO.lastModified(file)) } object exists extends Style { diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index 62e941254..ea60ead82 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -7,7 +7,7 @@ import sbt.io.IO class FileInfoSpec extends UnitSpec { val file = new java.io.File(".").getAbsoluteFile - val fileInfo: ModifiedFileInfo = FileModified(file, IO.getModifiedTime(file)) + val fileInfo: ModifiedFileInfo = FileModified(file, IO.lastModified(file)) val filesInfo = FilesInfo(Set(fileInfo)) it should "round trip" in assertRoundTrip(filesInfo) From 0a1bd5a3b25f95d25a5fef18844cb0acf61a425b Mon Sep 17 00:00:00 2001 From: Antonio Cunei Date: Fri, 22 Dec 2017 00:03:11 +0100 Subject: [PATCH 736/823] Change modifiedTime definitions --- .../src/main/scala/sbt/internal/scripted/FileCommands.scala | 2 +- util-cache/src/main/scala/sbt/util/FileInfo.scala | 4 ++-- util-cache/src/test/scala/FileInfoSpec.scala | 2 +- 3 files changed, 4 insertions(+), 4 deletions(-) diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala index b038ceb67..6aefa4a7a 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/FileCommands.scala @@ -69,7 +69,7 @@ class FileCommands(baseDirectory: File) extends BasicStatementHandler { val pathA = fromString(a) val pathB = fromString(b) val isNewer = pathA.exists && - (!pathB.exists || IO.lastModified(pathA) > IO.lastModified(pathB)) + (!pathB.exists || IO.getModifiedTimeOrZero(pathA) > IO.getModifiedTimeOrZero(pathB)) if (!isNewer) { scriptError(s"$pathA is not newer than $pathB") } diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala 
b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 1dc3cb186..bef3d6bfd 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -90,7 +90,7 @@ object FileInfo { } implicit def apply(file: File): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, Hash(file).toList, IO.lastModified(file)) + FileHashModified(file.getAbsoluteFile, Hash(file).toList, IO.getModifiedTimeOrZero(file)) } object hash extends Style { @@ -147,7 +147,7 @@ object FileInfo { } implicit def apply(file: File): ModifiedFileInfo = - FileModified(file.getAbsoluteFile, IO.lastModified(file)) + FileModified(file.getAbsoluteFile, IO.getModifiedTimeOrZero(file)) } object exists extends Style { diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index ea60ead82..7b0f3e035 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -7,7 +7,7 @@ import sbt.io.IO class FileInfoSpec extends UnitSpec { val file = new java.io.File(".").getAbsoluteFile - val fileInfo: ModifiedFileInfo = FileModified(file, IO.lastModified(file)) + val fileInfo: ModifiedFileInfo = FileModified(file, IO.getModifiedTimeOrZero(file)) val filesInfo = FilesInfo(Set(fileInfo)) it should "round trip" in assertRoundTrip(filesInfo) From 2765e07add355a1d63eb63c15da4b74fa126146c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 21 Dec 2017 22:43:38 -0500 Subject: [PATCH 737/823] sbt 1.0.4 --- project/build.properties | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index 94005e587..394cb75cf 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.0.0 +sbt.version=1.0.4 From de54721fc3c3beca8aa097d7ce3e0c78c8366077 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 21 Dec 2017 22:43:53 -0500 Subject: [PATCH 738/823] IO 1.1.3 --- 
project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 453f20649..97e4f992b 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { val scala211 = "2.11.12" val scala212 = "2.12.4" - private val ioVersion = "1.1.2" + private val ioVersion = "1.1.3" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From 966f3ebaadbfcb74f66a60689d5d6a8869004c13 Mon Sep 17 00:00:00 2001 From: xuwei-k <6b656e6a69@gmail.com> Date: Sat, 23 Dec 2017 20:07:14 +0900 Subject: [PATCH 739/823] add Java 9 test --- .travis.yml | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/.travis.yml b/.travis.yml index ce5390a3a..796ceaa60 100644 --- a/.travis.yml +++ b/.travis.yml @@ -5,6 +5,11 @@ scala: - 2.11.12 - 2.12.4 +matrix: + include: + - scala: 2.12.4 + jdk: oraclejdk9 + script: - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues scalafmt::test test:scalafmt::test test From 8e717bda30ea811ccf937eea8b4453b69d60ddc3 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 10 Jan 2018 15:42:04 +0000 Subject: [PATCH 740/823] Add version 1.1.2 to mimaPreviousArtifacts --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 412d7b65f..3d0e5da37 100644 --- a/build.sbt +++ b/build.sbt @@ -33,7 +33,7 @@ def commonSettings: Seq[Setting[_]] = Seq( val mimaSettings = Def settings ( mimaPreviousArtifacts := Set( "1.0.0", "1.0.1", "1.0.2", "1.0.3", - "1.1.0", "1.1.1", + "1.1.0", "1.1.1", "1.1.2", ) map (version => organization.value %% moduleName.value % version cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) From 8e8e0747a8b73f7ac32334abd5af733cb93713ac Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 10 Jan 2018 16:04:57 +0000 Subject: [PATCH 741/823] Upgrade to sbt 1.1.0 --- project/build.properties | 2 +- 
1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index 394cb75cf..8b697bbb9 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.0.4 +sbt.version=1.1.0 From a21d7dec94a210311b5728477a6f4da498156d4d Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 11 Jan 2018 12:39:11 +0000 Subject: [PATCH 742/823] Add, configure & enforce sbt-whitesource --- .travis.yml | 16 ++++++++++++---- build.sbt | 9 +++++++++ project/plugins.sbt | 5 +++-- 3 files changed, 24 insertions(+), 6 deletions(-) diff --git a/.travis.yml b/.travis.yml index 796ceaa60..3b529232f 100644 --- a/.travis.yml +++ b/.travis.yml @@ -10,8 +10,16 @@ matrix: - scala: 2.12.4 jdk: oraclejdk9 -script: - - sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION" mimaReportBinaryIssues scalafmt::test test:scalafmt::test test +script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M + ++$TRAVIS_SCALA_VERSION + mimaReportBinaryIssues + scalafmt::test test:scalafmt::test + whitesourceCheckPolicies + test + +env: + global: + - secure: JzxepvrNQIem+7MS8pBfBkcWDgt/oNKOreI3GJMJDN9P7lxCmrW0UVhpSftscjRzz9gXGQleqZ8t/I0hqysY9nO/DlxDQil6FKpsqrEKALdIsez8TjtbOlV69enDl6SBCXpg1B/rTQ/dL9mpV3WMvNkmDOhcNmbNyfO9Uk8wAAEvGQNKyE02s0gjZf6IgfOHXInBB2o3+uQFiWCABFHDWInN4t0QZVEhF/3P3iDKEfauWGwugf/YKLrwUUzNyN+J1i1goYEWZvviP+KCNbPlEsVN60In8F0t+jYuBJb0ePNcl3waT/4xBKQRidB4XRbhOXrZIATdpHLnzKzk2TPf3GxijNEscKYGdq3v6nWd128rfHGYz528pRSZ8bNOdQJotB/bJTmIEOnk5P9zU0z4z2cawMF6EyBJka7kXnC9Vz6TpifvyXDpzfmRzAkBrD6PC+diGPbyy5+4zvhpZuv31MRjMckohyNb76pR9qq70yDlomn+nVNoZ1fpp7dCqwjIxm9h2UjCWzXWY4xSByI8/CaPibq6Ma7RWHQE+4NGG2CCLQrqN4NB+BFsH3R0l5Js9khvDuEUYJkgSmJMFluXranWRV+pp/YMxk1IT4rOEPOc/hIqlQTrxasp/QxeyAfRk9OPzoz9L2kR0RH4ch3KuaARUv03WFNarfQ/ISz3P/s= cache: directories: @@ -19,5 +27,5 @@ cache: - $HOME/.sbt before_cache: - - find $HOME/.ivy2/cache -name "ivydata-*.properties" -print -delete - - find $HOME/.sbt -name 
"*.lock" -print -delete + - find $HOME/.ivy2/cache -name "ivydata-*.properties" -delete + - find $HOME/.sbt -name "*.lock" -delete diff --git a/build.sbt b/build.sbt index 412d7b65f..5945c0429 100644 --- a/build.sbt +++ b/build.sbt @@ -187,3 +187,12 @@ def customCommands: Seq[Setting[_]] = Seq( state } ) + +inThisBuild(Seq( + whitesourceProduct := "Lightbend Reactive Platform", + whitesourceAggregateProjectName := "sbt-util-master", + whitesourceAggregateProjectToken := "b9b11b2f43d34c44b28d8922624eef07a3f1b20d95ad45a5b5d973513ab173f4", + whitesourceIgnoredScopes += "scalafmt", + whitesourceFailOnError := sys.env.contains("WHITESOURCE_PASSWORD"), // fail if pwd is present + whitesourceForceCheckAllDependencies := true, +)) diff --git a/project/plugins.sbt b/project/plugins.sbt index b85b1f525..7f3392433 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,2 +1,3 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.5") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.2") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.5") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.2") +addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") From ba5b60300de6a215f534c4438cdfe3a49cc94ba7 Mon Sep 17 00:00:00 2001 From: Seth Tisue Date: Tue, 30 Jan 2018 16:19:09 -0800 Subject: [PATCH 743/823] fix typo in AbstractLogger.scala filename --- .../scala/sbt/util/{AbtractLogger.scala => AbstractLogger.scala} | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename internal/util-logging/src/main/scala/sbt/util/{AbtractLogger.scala => AbstractLogger.scala} (100%) diff --git a/internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala b/internal/util-logging/src/main/scala/sbt/util/AbstractLogger.scala similarity index 100% rename from internal/util-logging/src/main/scala/sbt/util/AbtractLogger.scala rename to internal/util-logging/src/main/scala/sbt/util/AbstractLogger.scala From ada2a8aafaad65ca78d1141ae8be86003c26ab6f Mon Sep 17 
00:00:00 2001 From: Dale Wijnand Date: Mon, 29 Jan 2018 11:07:13 +0000 Subject: [PATCH 744/823] Give SourcePosition a macro instance creator --- build.sbt | 14 ++++-- .../scala/sbt/internal/util/Positions.scala | 46 +++++++++++++++++++ .../internal/util/SourcePositionSpec.scala | 18 ++++++++ 3 files changed, 73 insertions(+), 5 deletions(-) create mode 100644 internal/util-position/src/test/scala/sbt/internal/util/SourcePositionSpec.scala diff --git a/build.sbt b/build.sbt index 5945c0429..cc43f2c35 100644 --- a/build.sbt +++ b/build.sbt @@ -94,11 +94,15 @@ lazy val utilControl = (project in internalPath / "util-control").settings( mimaSettings, ) -val utilPosition = (project in file("internal") / "util-position").settings( - commonSettings, - name := "Util Position", - mimaSettings, -) +val utilPosition = (project in file("internal") / "util-position") + .dependsOn(utilTesting % Test) + .settings( + commonSettings, + name := "Util Position", + scalacOptions += "-language:experimental.macros", + libraryDependencies += scalaReflect.value, + mimaSettings, + ) // logging lazy val utilLogging = (project in internalPath / "util-logging") diff --git a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala index a11ae9c24..ca3626db1 100644 --- a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala +++ b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala @@ -18,3 +18,49 @@ final case class LineRange(start: Int, end: Int) { final case class RangePosition(path: String, range: LineRange) extends FilePosition { def startLine = range.start } + +object SourcePosition { + + /** Creates a SourcePosition by using the enclosing position of the invocation of this method. 
+ * @see [[scala.reflect.macros.Enclosures#enclosingPosition]] + * @return SourcePosition + */ + def fromEnclosing(): SourcePosition = macro SourcePositionMacro.fromEnclosingImpl + +} + +import scala.annotation.tailrec +import scala.reflect.macros.blackbox +import scala.reflect.internal.util.UndefinedPosition + +final class SourcePositionMacro(val c: blackbox.Context) { + import c.universe.{ NoPosition => _, _ } + + def fromEnclosingImpl(): Expr[SourcePosition] = { + val pos = c.enclosingPosition + if (!pos.isInstanceOf[UndefinedPosition] && pos.line >= 0 && pos.source != null) { + val f = pos.source.file + val name = constant[String](ownerSource(f.path, f.name)) + val line = constant[Int](pos.line) + reify { LinePosition(name.splice, line.splice) } + } else + reify { NoPosition } + } + + private[this] def ownerSource(path: String, name: String): String = { + @tailrec def inEmptyPackage(s: Symbol): Boolean = + s != NoSymbol && ( + s.owner == c.mirror.EmptyPackage + || s.owner == c.mirror.EmptyPackageClass + || inEmptyPackage(s.owner) + ) + + c.internal.enclosingOwner match { + case ec if !ec.isStatic => name + case ec if inEmptyPackage(ec) => path + case ec => s"(${ec.fullName}) $name" + } + } + + private[this] def constant[T: WeakTypeTag](t: T): Expr[T] = c.Expr[T](Literal(Constant(t))) +} diff --git a/internal/util-position/src/test/scala/sbt/internal/util/SourcePositionSpec.scala b/internal/util-position/src/test/scala/sbt/internal/util/SourcePositionSpec.scala new file mode 100644 index 000000000..6fa955171 --- /dev/null +++ b/internal/util-position/src/test/scala/sbt/internal/util/SourcePositionSpec.scala @@ -0,0 +1,18 @@ +package sbt.internal.util + +import org.scalatest._ + +class SourcePositionSpec extends FlatSpec { + "SourcePosition()" should "return a sane SourcePosition" in { + val filename = "SourcePositionSpec.scala" + val lineNumber = 9 + SourcePosition.fromEnclosing() match { + case LinePosition(path, startLine) => assert(path === filename && 
startLine === lineNumber) + case RangePosition(path, range) => assert(path === filename && inRange(range, lineNumber)) + case NoPosition => fail("No source position found") + } + } + + private def inRange(range: LineRange, lineNo: Int) = + range.start until range.end contains lineNo +} From f593fc6c7471074c5ef8a729f272dc244b4191c0 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 14 Feb 2018 10:44:16 +0000 Subject: [PATCH 745/823] Update version/sbt.version --- build.sbt | 3 +-- project/build.properties | 2 +- 2 files changed, 2 insertions(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index 3d0e5da37..c5aaf8c18 100644 --- a/build.sbt +++ b/build.sbt @@ -2,7 +2,6 @@ import Dependencies._ import Util._ //import com.typesafe.tools.mima.core._, ProblemFilters._ -def baseVersion = "1.0.2-SNAPSHOT" def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -55,7 +54,7 @@ lazy val utilRoot: Project = (project in file(".")) .settings( inThisBuild( Seq( - git.baseVersion := baseVersion, + git.baseVersion := "1.1.2", version := { val v = version.value if (v contains "SNAPSHOT") git.baseVersion.value diff --git a/project/build.properties b/project/build.properties index 394cb75cf..31334bbd3 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.0.4 +sbt.version=1.1.1 From b68071a488e102aead631f26e82ef5df41d5fba3 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Sat, 17 Feb 2018 14:45:18 +1000 Subject: [PATCH 746/823] Cache evidence params for hot method --- .../src/main/scala/sbt/internal/util/ManagedLogger.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index d4cef3fe0..3add4cd04 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ 
b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -25,9 +25,11 @@ class ManagedLogger( ) } + private val SuccessEventTag = scala.reflect.runtime.universe.typeTag[SuccessEvent] // send special event for success since it's not a real log level override def success(message: => String): Unit = { - infoEvent[SuccessEvent](SuccessEvent(message)) + infoEvent[SuccessEvent](SuccessEvent(message))(implicitly[JsonFormat[SuccessEvent]], + SuccessEventTag) } def registerStringCodec[A: ShowLines: TypeTag]: Unit = { From 0ebb7a5662f2bcc6599010f5a81ed0a540581fd8 Mon Sep 17 00:00:00 2001 From: Johannes Rudolph Date: Mon, 19 Feb 2018 09:19:31 +0100 Subject: [PATCH 747/823] In initStringCodecs avoid reflect universe initialization This showed up in profiling. It's known that TypeTags are expensive. Even more so if the reflect universe is accessed during startup when the class loading and JIT compiler are busy enough with other stuff. --- .../src/main/scala/sbt/util/LogExchange.scala | 16 +++++++++++++--- 1 file changed, 13 insertions(+), 3 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index b5bb1cb56..7879eb80a 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -67,9 +67,15 @@ sealed abstract class LogExchange { import sbt.internal.util.codec.TraceEventShowLines._ import sbt.internal.util.codec.SuccessEventShowLines._ - registerStringCodec[Throwable] - registerStringCodec[TraceEvent] - registerStringCodec[SuccessEvent] + // Register these StringCodecs manually, because this method will be called at the very startup of sbt + // and we'll try not to initialize the universe in StringTypeTag.apply + // If these classes are moved around, both the fully qualified names and the strings need to be adapted. 
+ // A better long-term solution could be to make StringTypeTag.apply a macro. + registerStringCodecByStringTypeTag[_root_.scala.Throwable](StringTypeTag("scala.Throwable")) + registerStringCodecByStringTypeTag[_root_.sbt.internal.util.TraceEvent]( + StringTypeTag("sbt.internal.util.TraceEvent")) + registerStringCodecByStringTypeTag[_root_.sbt.internal.util.SuccessEvent]( + StringTypeTag("sbt.internal.util.SuccessEvent")) } // This is a dummy layout to avoid casting error during PatternLayout.createDefaultLayout() @@ -102,6 +108,10 @@ sealed abstract class LogExchange { def registerStringCodec[A: ShowLines: TypeTag]: Unit = { val tag = StringTypeTag[A] + registerStringCodecByStringTypeTag(tag) + } + + private[sbt] def registerStringCodecByStringTypeTag[A: ShowLines](tag: StringTypeTag[A]): Unit = { val ev = implicitly[ShowLines[A]] val _ = getOrElseUpdateStringCodec(tag.key, ev) } From d9b130d5199308440531cb74d79b318c3b5737b1 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Mon, 19 Feb 2018 15:49:40 +1000 Subject: [PATCH 748/823] Optimize ConsoleAppender.appendLog --- .../sbt/internal/util/ConsoleAppender.scala | 17 ++++++++++++----- .../scala/sbt/internal/util/ConsoleOut.scala | 2 +- 2 files changed, 13 insertions(+), 6 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 48b81e786..b0f68e88a 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -353,8 +353,10 @@ class ConsoleAppender private[ConsoleAppender] ( * @param msg The message to format * @return The formatted message. 
*/ - private def formatted(format: String, msg: String): String = - s"$reset${format}${msg}$reset" + private def formatted(format: String, msg: String): String = { + val builder = new java.lang.StringBuilder(reset.length * 2 + format.length + msg.length) + builder.append(reset).append(format).append(msg).append(reset).toString + } /** * Select the right color for the label given `level`. @@ -388,9 +390,14 @@ class ConsoleAppender private[ConsoleAppender] ( ): Unit = out.lockObject.synchronized { message.lines.foreach { line => - val labeledLine = - s"$reset[${formatted(labelColor, label)}] ${formatted(messageColor, line)}" - write(labeledLine) + val builder = new java.lang.StringBuilder( + labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3) + def fmted(a: String, b: String) = builder.append(reset).append(a).append(b).append(reset) + builder.append(reset).append('[') + fmted(labelColor, label) + builder.append("] ") + fmted(messageColor, line) + write(builder.toString) } } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index 37af255cb..717be2cfd 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -37,7 +37,7 @@ object ConsoleOut { lockObject.print(OverwriteLine) lockObject.println(s) last = Some(s) - current = new java.lang.StringBuffer + current.setLength(0) } } From 44a2f1d92cda8426aa431cd5ba44c5a2ce00c594 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 23 Feb 2018 18:16:30 +0000 Subject: [PATCH 749/823] Kill utilTesting No need for a 1-line, 1-class jar --- build.sbt | 28 ++++++------------- .../scala/sbt/internal/util/UnitSpec.scala | 5 ---- project/Dependencies.scala | 5 ++-- util-cache/src/test/scala/CacheSpec.scala | 4 +-- util-cache/src/test/scala/FileInfoSpec.scala | 4 +-- 
.../src/test/scala/SingletonCacheSpec.scala | 4 +-- .../src/test/scala/sbt/util/TrackedSpec.scala | 4 +-- 7 files changed, 19 insertions(+), 35 deletions(-) delete mode 100644 internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala diff --git a/build.sbt b/build.sbt index 9b0fbf89c..6cfe1075c 100644 --- a/build.sbt +++ b/build.sbt @@ -48,7 +48,6 @@ lazy val utilRoot: Project = (project in file(".")) utilRelation, utilCache, utilTracking, - utilTesting, utilScripted ) .settings( @@ -94,25 +93,24 @@ lazy val utilControl = (project in internalPath / "util-control").settings( ) val utilPosition = (project in file("internal") / "util-position") - .dependsOn(utilTesting % Test) .settings( commonSettings, name := "Util Position", scalacOptions += "-language:experimental.macros", - libraryDependencies += scalaReflect.value, + libraryDependencies ++= Seq(scalaReflect.value, scalaTest), mimaSettings, ) -// logging lazy val utilLogging = (project in internalPath / "util-logging") .enablePlugins(ContrabandPlugin, JsonCodecPlugin) - .dependsOn(utilInterface, utilTesting % Test) + .dependsOn(utilInterface) .settings( commonSettings, crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), + libraryDependencies ++= Seq(scalaCheck, scalaTest), sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala", contrabandFormatsForType in generateContrabands in Compile := { tpe => val old = (contrabandFormatsForType in generateContrabands in Compile).value @@ -122,45 +120,35 @@ lazy val utilLogging = (project in internalPath / "util-logging") }, mimaSettings, ) + .configure(addSbtIO) -// Relation lazy val utilRelation = (project in internalPath / "util-relation") - .dependsOn(utilTesting % Test) .settings( commonSettings, name := "Util Relation", + libraryDependencies ++= Seq(scalaCheck), 
mimaSettings, ) // Persisted caching based on sjson-new lazy val utilCache = (project in file("util-cache")) - .dependsOn(utilTesting % Test) .settings( commonSettings, name := "Util Cache", libraryDependencies ++= Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value), + libraryDependencies ++= Seq(scalaTest), mimaSettings, ) .configure(addSbtIO) // Builds on cache to provide caching for filesystem-related operations lazy val utilTracking = (project in file("util-tracking")) - .dependsOn(utilCache, utilTesting % Test) + .dependsOn(utilCache) .settings( commonSettings, name := "Util Tracking", - mimaSettings, - ) - .configure(addSbtIO) - -// Internal utility for testing -lazy val utilTesting = (project in internalPath / "util-testing") - .settings( - commonSettings, - crossScalaVersions := Seq(scala210, scala211, scala212), - name := "Util Testing", - libraryDependencies ++= Seq(scalaCheck, scalatest), + libraryDependencies ++= Seq(scalaTest), mimaSettings, ) .configure(addSbtIO) diff --git a/internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala b/internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala deleted file mode 100644 index 99ad43c2d..000000000 --- a/internal/util-testing/src/main/scala/sbt/internal/util/UnitSpec.scala +++ /dev/null @@ -1,5 +0,0 @@ -package sbt.internal.util - -import org.scalatest._ - -abstract class UnitSpec extends FlatSpec with Matchers diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 97e4f992b..6d4cbfa9e 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -40,8 +40,9 @@ object Dependencies { val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } - val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.4" - val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" + val scalaCheck = "org.scalacheck" %% 
"scalacheck" % "1.13.4" % Test + val scalaTest = "org.scalatest" %% "scalatest" % "3.0.1" % Test + val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" val sjsonnew = Def.setting { diff --git a/util-cache/src/test/scala/CacheSpec.scala b/util-cache/src/test/scala/CacheSpec.scala index 468c647cd..ad51d7ff5 100644 --- a/util-cache/src/test/scala/CacheSpec.scala +++ b/util-cache/src/test/scala/CacheSpec.scala @@ -9,9 +9,9 @@ import sjsonnew.IsoString import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser } import sjsonnew.shaded.scalajson.ast.unsafe.JValue -import sbt.internal.util.UnitSpec +import org.scalatest.FlatSpec -class CacheSpec extends UnitSpec { +class CacheSpec extends FlatSpec { implicit val isoString: IsoString[JValue] = IsoString.iso(CompactPrinter.apply, Parser.parseUnsafe) diff --git a/util-cache/src/test/scala/FileInfoSpec.scala b/util-cache/src/test/scala/FileInfoSpec.scala index 7b0f3e035..d8e36386c 100644 --- a/util-cache/src/test/scala/FileInfoSpec.scala +++ b/util-cache/src/test/scala/FileInfoSpec.scala @@ -2,10 +2,10 @@ package sbt.util import sjsonnew.shaded.scalajson.ast.unsafe._ import sjsonnew._, support.scalajson.unsafe._ -import sbt.internal.util.UnitSpec +import org.scalatest.FlatSpec import sbt.io.IO -class FileInfoSpec extends UnitSpec { +class FileInfoSpec extends FlatSpec { val file = new java.io.File(".").getAbsoluteFile val fileInfo: ModifiedFileInfo = FileModified(file, IO.getModifiedTimeOrZero(file)) val filesInfo = FilesInfo(Set(fileInfo)) diff --git a/util-cache/src/test/scala/SingletonCacheSpec.scala b/util-cache/src/test/scala/SingletonCacheSpec.scala index 84e91b627..d22e10f2d 100644 --- a/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -9,9 +9,9 @@ import sjsonnew.{ Builder, deserializationError, IsoString, JsonFormat, Unbuilde import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter, Parser 
} import sjsonnew.shaded.scalajson.ast.unsafe.JValue -import sbt.internal.util.UnitSpec +import org.scalatest.FlatSpec -class SingletonCacheSpec extends UnitSpec { +class SingletonCacheSpec extends FlatSpec { case class ComplexType(val x: Int, y: String, z: List[Int]) object ComplexType { diff --git a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index a2015d76f..7db275d1a 100644 --- a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala +++ b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -6,9 +6,9 @@ import sbt.io.syntax._ import CacheImplicits._ import sjsonnew.IsoString -import sbt.internal.util.UnitSpec +import org.scalatest.FlatSpec -class TrackedSpec extends UnitSpec { +class TrackedSpec extends FlatSpec { "lastOutput" should "store the last output" in { withStore { store => val value = 5 From 770977a0bbb6f47b29d6f2d77597e750ad1dd585 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 26 Mar 2018 21:13:51 -0400 Subject: [PATCH 750/823] sbt 1.1.2 --- project/build.properties | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index 31334bbd3..05313438a 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.1.1 +sbt.version=1.1.2 From 4791b38adfc0bf3408129a0d9b35d2106a2059f2 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Mon, 26 Mar 2018 21:19:59 -0400 Subject: [PATCH 751/823] bump to 1.1.4-SNAPSHOT --- build.sbt | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index c5aaf8c18..6c557ee2d 100644 --- a/build.sbt +++ b/build.sbt @@ -32,7 +32,7 @@ def commonSettings: Seq[Setting[_]] = Seq( val mimaSettings = Def settings ( mimaPreviousArtifacts := Set( "1.0.0", "1.0.1", "1.0.2", "1.0.3", - "1.1.0", "1.1.1", "1.1.2", + "1.1.0", "1.1.1", "1.1.2", "1.1.3" ) map (version => organization.value %% moduleName.value % version cross 
(if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) @@ -54,10 +54,10 @@ lazy val utilRoot: Project = (project in file(".")) .settings( inThisBuild( Seq( - git.baseVersion := "1.1.2", + git.baseVersion := "1.1.4", version := { val v = version.value - if (v contains "SNAPSHOT") git.baseVersion.value + if (v contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" else v }, bintrayPackage := "util", From 029952895b858f47684df3346e63dfac22a84b40 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 5 Apr 2018 09:43:22 +0100 Subject: [PATCH 752/823] Enforce invariant in StringTypeTag optimisation Or, put differently, "Add a test for sbt/util#153". --- .../src/main/scala/sbt/util/LogExchange.scala | 19 ++++++++++--------- .../src/test/scala/LogExchangeSpec.scala | 16 ++++++++++++++++ 2 files changed, 26 insertions(+), 9 deletions(-) create mode 100644 internal/util-logging/src/test/scala/LogExchangeSpec.scala diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index 7879eb80a..2341a4395 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -62,20 +62,21 @@ sealed abstract class LogExchange { config.getLoggerConfig(loggerName) } + // Construct these StringTypeTags manually, because they're used at the very startup of sbt + // and we'll try not to initialize the universe by using the StringTypeTag.apply that requires a TypeTag + // A better long-term solution could be to make StringTypeTag.apply a macro. 
+ lazy val stringTypeTagThrowable = StringTypeTag[Throwable]("scala.Throwable") + lazy val stringTypeTagTraceEvent = StringTypeTag[TraceEvent]("sbt.internal.util.TraceEvent") + lazy val stringTypeTagSuccessEvent = StringTypeTag[SuccessEvent]("sbt.internal.util.SuccessEvent") + private[sbt] def initStringCodecs(): Unit = { import sbt.internal.util.codec.ThrowableShowLines._ import sbt.internal.util.codec.TraceEventShowLines._ import sbt.internal.util.codec.SuccessEventShowLines._ - // Register these StringCodecs manually, because this method will be called at the very startup of sbt - // and we'll try not to initialize the universe in StringTypeTag.apply - // If these classes are moved around, both the fully qualified names and the strings need to be adapted. - // A better long-term solution could be to make StringTypeTag.apply a macro. - registerStringCodecByStringTypeTag[_root_.scala.Throwable](StringTypeTag("scala.Throwable")) - registerStringCodecByStringTypeTag[_root_.sbt.internal.util.TraceEvent]( - StringTypeTag("sbt.internal.util.TraceEvent")) - registerStringCodecByStringTypeTag[_root_.sbt.internal.util.SuccessEvent]( - StringTypeTag("sbt.internal.util.SuccessEvent")) + registerStringCodecByStringTypeTag(stringTypeTagThrowable) + registerStringCodecByStringTypeTag(stringTypeTagTraceEvent) + registerStringCodecByStringTypeTag(stringTypeTagSuccessEvent) } // This is a dummy layout to avoid casting error during PatternLayout.createDefaultLayout() diff --git a/internal/util-logging/src/test/scala/LogExchangeSpec.scala b/internal/util-logging/src/test/scala/LogExchangeSpec.scala new file mode 100644 index 000000000..b29512296 --- /dev/null +++ b/internal/util-logging/src/test/scala/LogExchangeSpec.scala @@ -0,0 +1,16 @@ +package sbt.util + +import sbt.internal.util._ + +import org.scalatest._ + +class LogExchangeSpec extends FlatSpec with Matchers { + import LogExchange._ + + checkTypeTag("stringTypeTagThrowable", stringTypeTagThrowable, 
StringTypeTag[Throwable]) + checkTypeTag("stringTypeTagTraceEvent", stringTypeTagTraceEvent, StringTypeTag[TraceEvent]) + checkTypeTag("stringTypeTagSuccessEvent", stringTypeTagSuccessEvent, StringTypeTag[SuccessEvent]) + + private def checkTypeTag[A, B](name: String, actual: A, expected: B): Unit = + s"LogExchange.$name" should s"match real StringTypeTag[$expected]" in assert(actual == expected) +} From f2d3cfea3f0436c5a3c18cf1f40b3db0adee4937 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Sun, 8 Apr 2018 14:37:51 +1000 Subject: [PATCH 753/823] Upgrade to latest sbt-houserules --- project/plugins.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/plugins.sbt b/project/plugins.sbt index b85b1f525..8bd11098e 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,2 +1,2 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.5") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.6") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.2") From d2e59fa165a2174c48f7c6431bd3d13047015c1a Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 28 Apr 2018 00:17:03 -0400 Subject: [PATCH 754/823] Scala 2.12.6 and other dependencies sbt 1.1.4 Scala 2.12.6 ScalaCheck 1.14.0 ScalaTest 3.0.5 Contraband 0.4.0 --- build.sbt | 7 ++++++- .../contraband-scala/sbt/internal/util/StringEvent.scala | 2 +- .../contraband-scala/sbt/internal/util/SuccessEvent.scala | 2 +- .../contraband-scala/sbt/internal/util/TraceEvent.scala | 2 +- project/Dependencies.scala | 6 +++--- project/build.properties | 2 +- project/plugins.sbt | 2 +- 7 files changed, 14 insertions(+), 9 deletions(-) diff --git a/build.sbt b/build.sbt index 6c557ee2d..311e96eec 100644 --- a/build.sbt +++ b/build.sbt @@ -1,6 +1,6 @@ import Dependencies._ import Util._ -//import com.typesafe.tools.mima.core._, ProblemFilters._ +import com.typesafe.tools.mima.core._, ProblemFilters._ def internalPath = file("internal") @@ -117,6 +117,11 @@ lazy val utilLogging = (project in 
internalPath / "util-logging") else old(tpe) }, mimaSettings, + mimaBinaryIssueFilters ++= Seq( + exclude[DirectMissingMethodProblem]("sbt.internal.util.SuccessEvent.copy*"), + exclude[DirectMissingMethodProblem]("sbt.internal.util.TraceEvent.copy*"), + exclude[DirectMissingMethodProblem]("sbt.internal.util.StringEvent.copy*"), + ), ) // Relation diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala index 83e697ec6..ef4ff4b9b 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/StringEvent.scala @@ -22,7 +22,7 @@ final class StringEvent private ( override def toString: String = { "StringEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" } - protected[this] def copy(level: String = level, message: String = message, channelName: Option[String] = channelName, execId: Option[String] = execId): StringEvent = { + private[this] def copy(level: String = level, message: String = message, channelName: Option[String] = channelName, execId: Option[String] = execId): StringEvent = { new StringEvent(level, message, channelName, execId) } def withLevel(level: String): StringEvent = { diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala index 9fdcc8e09..6d00a7eb3 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/SuccessEvent.scala @@ -19,7 +19,7 @@ final class SuccessEvent private ( override def toString: String = { "SuccessEvent(" + message + ")" } - protected[this] def copy(message: String = message): SuccessEvent = { + private[this] def copy(message: String = 
message): SuccessEvent = { new SuccessEvent(message) } def withMessage(message: String): SuccessEvent = { diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala index 7775220fc..afc7d522e 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/TraceEvent.scala @@ -22,7 +22,7 @@ final class TraceEvent private ( override def toString: String = { "TraceEvent(" + level + ", " + message + ", " + channelName + ", " + execId + ")" } - protected[this] def copy(level: String = level, message: Throwable = message, channelName: Option[String] = channelName, execId: Option[String] = execId): TraceEvent = { + private[this] def copy(level: String = level, message: Throwable = message, channelName: Option[String] = channelName, execId: Option[String] = execId): TraceEvent = { new TraceEvent(level, message, channelName, execId) } def withLevel(level: String): TraceEvent = { diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 97e4f992b..02246553a 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,7 +5,7 @@ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala210 = "2.10.7" val scala211 = "2.11.12" - val scala212 = "2.12.4" + val scala212 = "2.12.6" private val ioVersion = "1.1.3" @@ -40,8 +40,8 @@ object Dependencies { val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } - val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.4" - val scalatest = "org.scalatest" %% "scalatest" % "3.0.1" + val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.14.0" + val scalatest = "org.scalatest" %% "scalatest" % "3.0.5" val parserCombinator211 = 
"org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" val sjsonnew = Def.setting { diff --git a/project/build.properties b/project/build.properties index 05313438a..64cf32f7f 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.1.2 +sbt.version=1.1.4 diff --git a/project/plugins.sbt b/project/plugins.sbt index 8bd11098e..d5696181d 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,2 +1,2 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.6") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.3.2") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.0") From 5ca377cd5972dd8a7293b71e8eee5a09904d01ac Mon Sep 17 00:00:00 2001 From: xuwei-k <6b656e6a69@gmail.com> Date: Tue, 1 May 2018 20:18:36 +0900 Subject: [PATCH 755/823] use foldLeft instead of /: https://github.com/scala/scala/blob/1c56f0af6d3d59b7d2a8dbcf64077b0c1fe90f07/src/library/scala/collection/IterableOnce.scala#L465 --- .../scala/sbt/internal/util/Relation.scala | 19 +++++++++++-------- 1 file changed, 11 insertions(+), 8 deletions(-) diff --git a/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala index d107b3bc0..6a1abd726 100644 --- a/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala +++ b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala @@ -20,12 +20,14 @@ object Relation { /** Constructs a relation such that for every entry `_1 -> _2s` in `forward` and every `_2` in `_2s`, `(_1, _2)` is in the relation. 
*/ def reconstruct[A, B](forward: Map[A, Set[B]]): Relation[A, B] = { val reversePairs = for ((a, bs) <- forward.view; b <- bs.view) yield (b, a) - val reverse = (Map.empty[B, Set[A]] /: reversePairs) { case (m, (b, a)) => add(m, b, a :: Nil) } + val reverse = reversePairs.foldLeft(Map.empty[B, Set[A]]) { + case (m, (b, a)) => add(m, b, a :: Nil) + } make(forward filter { case (a, bs) => bs.nonEmpty }, reverse) } def merge[A, B](rels: Traversable[Relation[A, B]]): Relation[A, B] = - (Relation.empty[A, B] /: rels)(_ ++ _) + rels.foldLeft(Relation.empty[A, B])(_ ++ _) private[sbt] def remove[X, Y](map: M[X, Y], from: X, to: Y): M[X, Y] = map.get(from) match { @@ -36,7 +38,7 @@ object Relation { } private[sbt] def combine[X, Y](a: M[X, Y], b: M[X, Y]): M[X, Y] = - (a /: b)((map, mapping) => add(map, mapping._1, mapping._2)) + b.foldLeft(a)((map, mapping) => add(map, mapping._1, mapping._2)) private[sbt] def add[X, Y](map: M[X, Y], from: X, to: Traversable[Y]): M[X, Y] = map.updated(from, get(map, from) ++ to) @@ -151,14 +153,15 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) def +(from: A, to: B) = this + (from, to :: Nil) def +(from: A, to: Traversable[B]) = if (to.isEmpty) this - else new MRelation(add(fwd, from, to), (rev /: to)((map, t) => add(map, t, from :: Nil))) + else new MRelation(add(fwd, from, to), to.foldLeft(rev)((map, t) => add(map, t, from :: Nil))) - def ++(rs: Traversable[(A, B)]) = ((this: Relation[A, B]) /: rs) { _ + _ } + def ++(rs: Traversable[(A, B)]) = rs.foldLeft(this: Relation[A, B]) { _ + _ } def ++(other: Relation[A, B]) = new MRelation[A, B](combine(fwd, other.forwardMap), combine(rev, other.reverseMap)) - def --(ts: Traversable[A]): Relation[A, B] = ((this: Relation[A, B]) /: ts) { _ - _ } - def --(pairs: TraversableOnce[(A, B)]): Relation[A, B] = ((this: Relation[A, B]) /: pairs)(_ - _) + def --(ts: Traversable[A]): Relation[A, B] = ts.foldLeft(this: Relation[A, B]) { _ - _ } + def --(pairs: 
TraversableOnce[(A, B)]): Relation[A, B] = + pairs.foldLeft(this: Relation[A, B])(_ - _) def --(relations: Relation[A, B]): Relation[A, B] = --(relations.all) def -(pair: (A, B)): Relation[A, B] = @@ -167,7 +170,7 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) def -(t: A): Relation[A, B] = fwd.get(t) match { case Some(rs) => - val upRev = (rev /: rs)((map, r) => remove(map, r, t)) + val upRev = rs.foldLeft(rev)((map, r) => remove(map, r, t)) new MRelation(fwd - t, upRev) case None => this } From e97451d8127c26235a34b3c668bfe31626e0dc73 Mon Sep 17 00:00:00 2001 From: xuwei-k <6b656e6a69@gmail.com> Date: Mon, 11 Jun 2018 13:11:51 +0900 Subject: [PATCH 756/823] fix adapted argument warning https://travis-ci.org/sbt/util/jobs/373445819#L517 ``` [warn] /home/travis/build/sbt/util/util-cache/src/main/scala/sbt/util/Input.scala:19:23: No automatic adaptation here: use explicit parentheses. [warn] signature: Using.apply[R](src: Source)(f: T => R): R [warn] given arguments: input, IO.utf8 [warn] after adaptation: Using((input, IO.utf8): (java.io.InputStream, java.nio.charset.Charset)) [warn] Using.streamReader(input, IO.utf8) { reader => [warn] ^ ``` --- util-cache/src/main/scala/sbt/util/Input.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util-cache/src/main/scala/sbt/util/Input.scala b/util-cache/src/main/scala/sbt/util/Input.scala index ee89c30b3..9dcdd5949 100644 --- a/util-cache/src/main/scala/sbt/util/Input.scala +++ b/util-cache/src/main/scala/sbt/util/Input.scala @@ -16,7 +16,7 @@ class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J val isoFormat: IsoString[J] = implicitly private def readFully(): String = { - Using.streamReader(input, IO.utf8) { reader => + Using.streamReader((input, IO.utf8)) { reader => val builder = new StringBuilder() val bufferSize = 1024 val buffer = new Array[Char](bufferSize) From 2b8d71ebe5a9fb7dc32a8dab1172d3eece5f2c2c Mon Sep 17 00:00:00 2001 From: 
Eugene Yokota Date: Thu, 14 Jun 2018 02:01:40 -0400 Subject: [PATCH 757/823] sbt-houserules 0.3.7 --- project/build.properties | 2 +- project/plugins.sbt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/project/build.properties b/project/build.properties index 64cf32f7f..d6e35076c 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.1.4 +sbt.version=1.1.6 diff --git a/project/plugins.sbt b/project/plugins.sbt index f32aab9fb..e1c769a77 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,3 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.6") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.7") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.0") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") From 434e294f2849426888a49be4fa2d1b88931af0e8 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 27 Jun 2018 06:36:47 -0400 Subject: [PATCH 758/823] Fixes the stacktrace trimming Ref https://github.com/sbt/sbt/issues/4121 Ref https://github.com/sbt/sbt/pull/4232 --- .../scala/sbt/internal/util/ConsoleAppender.scala | 2 +- .../main/scala/sbt/internal/util/MainLogging.scala | 11 ++++++++--- .../src/main/scala/sbt/internal/util/StackTrace.scala | 2 +- 3 files changed, 10 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index b0f68e88a..3e1d07da5 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -421,7 +421,7 @@ class ConsoleAppender private[ConsoleAppender] ( } private def appendTraceEvent(te: TraceEvent): Unit = { - val traceLevel = getTrace + val traceLevel = if (getTrace < 0) Int.MaxValue else getTrace val throwableShowLines: ShowLines[Throwable] = ShowLines[Throwable]((t: Throwable) => { 
List(StackTrace.trimmed(t, traceLevel)) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala index 2a47587b0..0ddc357e6 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MainLogging.scala @@ -13,15 +13,20 @@ object MainAppender { def multiLogger(log: ManagedLogger, config: MainAppenderConfig): ManagedLogger = { import config._ // TODO - // console setTrace screenTrace // backed setTrace backingTrace // multi: Logger - // val log = LogExchange.logger(loggerName) LogExchange.unbindLoggerAppenders(log.name) LogExchange.bindLoggerAppenders( log.name, - (consoleOpt.toList map { _ -> screenLevel }) ::: + (consoleOpt.toList map { appender => + appender match { + case a: ConsoleAppender => + a.setTrace(screenTrace) + case _ => () + } + appender -> screenLevel + }) ::: List(backed -> backingLevel) ::: (extra map { x => (x -> Level.Info) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala index 66468e2d5..58888e8be 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala @@ -4,7 +4,7 @@ package sbt.internal.util object StackTrace { - def isSbtClass(name: String) = name.startsWith("sbt") || name.startsWith("xsbt") + def isSbtClass(name: String) = name.startsWith("sbt.") || name.startsWith("xsbt.") /** * Return a printable representation of the stack trace associated From 141d9357cc9fd7810d0c6681f871a00d74267f78 Mon Sep 17 00:00:00 2001 From: "Brian P. 
Holt" Date: Fri, 27 Jul 2018 11:31:58 -0500 Subject: [PATCH 759/823] invoke output value function again, after executing output effect fixes #168 --- .../src/main/scala/sbt/util/Tracked.scala | 2 +- .../src/test/scala/sbt/util/TrackedSpec.scala | 39 ++++++++++++------- 2 files changed, 26 insertions(+), 15 deletions(-) diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index cfc1d089f..e0e5c77e7 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -100,7 +100,7 @@ object Tracked { val changed = help.changed(store, initial) val result = f(changed, initial) if (changed) { - help.save(store, initial) + help.save(store, p()) } result } diff --git a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index 7db275d1a..f2ef1bfb0 100644 --- a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala +++ b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -1,12 +1,11 @@ package sbt.util +import org.scalatest.FlatSpec import sbt.io.IO import sbt.io.syntax._ +import sbt.util.CacheImplicits._ -import CacheImplicits._ - -import sjsonnew.IsoString -import org.scalatest.FlatSpec +import scala.concurrent.Promise class TrackedSpec extends FlatSpec { "lastOutput" should "store the last output" in { @@ -105,29 +104,41 @@ class TrackedSpec extends FlatSpec { "outputChanged" should "detect that the output has not changed" in { withStore { store => - val input0: String = "foo" - val p0: () => String = () => input0 + val beforeCompletion: String = "before-completion" + val afterCompletion: String = "after-completion" + val sideEffectCompleted = Promise[Unit] + val p0: () => String = () => { + if (sideEffectCompleted.isCompleted) { + afterCompletion + } else { + sideEffectCompleted.success(()) + beforeCompletion + } + } + val firstExpectedResult = "first-result" + val 
secondExpectedResult = "second-result" val res0 = Tracked.outputChanged[String, String](store) { case (true, in) => - assert(in === input0) - in - case (false, in) => + assert(in === beforeCompletion) + firstExpectedResult + case (false, _) => fail() }(implicitly)(p0) - assert(res0 === input0) + assert(res0 === firstExpectedResult) val res1 = Tracked.outputChanged[String, String](store) { - case (true, in) => + case (true, _) => fail() case (false, in) => - assert(in === input0) - in + assert(in === afterCompletion) + secondExpectedResult }(implicitly)(p0) - assert(res1 === input0) + assert(res1 === secondExpectedResult) + () } } From 6f2b78b6a8ecf9f750814367613b0c1d3d4d096b Mon Sep 17 00:00:00 2001 From: "Brian P. Holt" Date: Fri, 27 Jul 2018 11:35:26 -0500 Subject: [PATCH 760/823] clean up compiler warnings in util-tracking --- .../src/main/scala/sbt/util/Tracked.scala | 2 +- .../src/test/scala/sbt/util/TrackedSpec.scala | 22 +++++++++++++------ 2 files changed, 16 insertions(+), 8 deletions(-) diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index e0e5c77e7..c6286a832 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -178,7 +178,7 @@ object Tracked { def save(store: CacheStore, value: I): Unit = { Hasher.hash(value) match { case Success(keyHash) => store.write[Long](keyHash.toLong) - case Failure(e) => () + case Failure(_) => () } } diff --git a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala index f2ef1bfb0..8f88cc53a 100644 --- a/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala +++ b/util-tracking/src/test/scala/sbt/util/TrackedSpec.scala @@ -18,14 +18,14 @@ class TrackedSpec extends FlatSpec { case (in, None) => assert(in === value) in - case (in, Some(_)) => + case (_, Some(_)) => fail() }(implicitly)(value) assert(res0 === value) val res1 = 
Tracked.lastOutput[Int, Int](store) { - case (in, None) => + case (_, None) => fail() case (in, Some(read)) => assert(in === otherValue) @@ -36,7 +36,7 @@ class TrackedSpec extends FlatSpec { val res2 = Tracked.lastOutput[Int, Int](store) { - case (in, None) => + case (_, None) => fail() case (in, Some(read)) => assert(in === otherValue) @@ -44,6 +44,8 @@ class TrackedSpec extends FlatSpec { read }(implicitly)(otherValue) assert(res2 === value) + + () } } @@ -56,14 +58,14 @@ class TrackedSpec extends FlatSpec { case (true, in) => assert(in === input0) in - case (false, in) => + case (false, _) => fail() }(implicitly, implicitly)(input0) assert(res0 === input0) val res1 = Tracked.inputChanged[String, String](store) { - case (true, in) => + case (true, _) => fail() case (false, in) => assert(in === input0) @@ -71,6 +73,7 @@ class TrackedSpec extends FlatSpec { }(implicitly, implicitly)(input0) assert(res1 === input0) + () } } @@ -84,7 +87,7 @@ class TrackedSpec extends FlatSpec { case (true, in) => assert(in === input0) in - case (false, in) => + case (false, _) => fail() }(implicitly, implicitly)(input0) assert(res0 === input0) @@ -94,11 +97,12 @@ class TrackedSpec extends FlatSpec { case (true, in) => assert(in === input1) in - case (false, in) => + case (false, _) => fail() }(implicitly, implicitly)(input1) assert(res1 === input1) + () } } @@ -147,6 +151,8 @@ class TrackedSpec extends FlatSpec { Tracked.tstamp(store) { last => assert(last === 0) } + + () } } @@ -160,6 +166,8 @@ class TrackedSpec extends FlatSpec { val difference = System.currentTimeMillis - last assert(difference < 1000) } + + () } } From 184c5bdaef5eab49b65bb834f992c215492f9f9b Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 29 Jul 2018 15:36:27 -0400 Subject: [PATCH 761/823] IO 1.2.0 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index d66b8ef69..806cbe65b 100644 --- 
a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { val scala211 = "2.11.12" val scala212 = "2.12.6" - private val ioVersion = "1.1.3" + private val ioVersion = "1.2.0" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion From 81a1317e53eb4c9845d5c185b2f64e1e2ab4388d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 1 Aug 2018 00:16:15 -0400 Subject: [PATCH 762/823] 1.2.1-SNAPSHOT --- build.sbt | 5 +++-- project/build.properties | 2 +- 2 files changed, 4 insertions(+), 3 deletions(-) diff --git a/build.sbt b/build.sbt index f9726ff0c..1896a4191 100644 --- a/build.sbt +++ b/build.sbt @@ -32,7 +32,8 @@ def commonSettings: Seq[Setting[_]] = Seq( val mimaSettings = Def settings ( mimaPreviousArtifacts := Set( "1.0.0", "1.0.1", "1.0.2", "1.0.3", - "1.1.0", "1.1.1", "1.1.2", "1.1.3" + "1.1.0", "1.1.1", "1.1.2", "1.1.3", + "1.2.0", ) map (version => organization.value %% moduleName.value % version cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) @@ -53,7 +54,7 @@ lazy val utilRoot: Project = (project in file(".")) .settings( inThisBuild( Seq( - git.baseVersion := "1.2.0", + git.baseVersion := "1.2.1", version := { val v = version.value if (v contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" diff --git a/project/build.properties b/project/build.properties index d6e35076c..f59579fd6 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.1.6 +sbt.version=1.2.0 From b1d02bee3034d70ee7aba138317a0b15fdeb4b35 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 2 Aug 2018 07:52:13 +0100 Subject: [PATCH 763/823] Make checkTypeTag lenient on "scala." 
type prefix --- .../src/test/scala/LogExchangeSpec.scala | 15 +++++++++++++-- 1 file changed, 13 insertions(+), 2 deletions(-) diff --git a/internal/util-logging/src/test/scala/LogExchangeSpec.scala b/internal/util-logging/src/test/scala/LogExchangeSpec.scala index b29512296..8a0706be9 100644 --- a/internal/util-logging/src/test/scala/LogExchangeSpec.scala +++ b/internal/util-logging/src/test/scala/LogExchangeSpec.scala @@ -11,6 +11,17 @@ class LogExchangeSpec extends FlatSpec with Matchers { checkTypeTag("stringTypeTagTraceEvent", stringTypeTagTraceEvent, StringTypeTag[TraceEvent]) checkTypeTag("stringTypeTagSuccessEvent", stringTypeTagSuccessEvent, StringTypeTag[SuccessEvent]) - private def checkTypeTag[A, B](name: String, actual: A, expected: B): Unit = - s"LogExchange.$name" should s"match real StringTypeTag[$expected]" in assert(actual == expected) + private def checkTypeTag[A](name: String, inc: StringTypeTag[A], exp: StringTypeTag[A]): Unit = + s"LogExchange.$name" should s"match real StringTypeTag[$exp]" in { + val StringTypeTag(incomingString) = inc + val StringTypeTag(expectedString) = exp + if ((incomingString startsWith "scala.") || (expectedString startsWith "scala.")) { + // > historically [Scala] has been inconsistent whether `scala.` is included, or not + // > would it be hard to make the test accept either result? 
+ // https://github.com/scala/community-builds/pull/758#issuecomment-409760633 + assert((incomingString stripPrefix "scala.") == (expectedString stripPrefix "scala.")) + } else { + assert(incomingString == expectedString) + } + } } From f457696a999667b162e85e3d373549b13893120a Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Mon, 13 Aug 2018 01:03:08 +0900 Subject: [PATCH 764/823] Upgrade to sbt 1.2.1 --- project/build.properties | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index f59579fd6..5620cc502 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.2.0 +sbt.version=1.2.1 From 78834527dffb89408142047d9a3a8bb5b56caa95 Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Mon, 13 Aug 2018 01:03:22 +0900 Subject: [PATCH 765/823] xsbti.Position: add startOffset and endOffset A position now has a start, an end, and a point (the existing `offset`), just like it does in the Scala compiler. This information is especially useful for displaying squiggly lines in an IDE. 
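The rationale in the commit message above, that recovering a line/column pair from a bare character offset forces a read of the source text, can be illustrated with a small sketch. `lineColOf` is a hypothetical helper, not part of sbt; it shows the scan a tool would have to perform if `Position` carried only offsets (1-based lines, 0-based columns, as is common in LSP):

```scala
// Hypothetical helper (not sbt API): derive (line, column) from a character
// offset by scanning the source text. Lines are 1-based, columns 0-based.
def lineColOf(source: String, offset: Int): (Int, Int) = {
  val upTo = source.substring(0, offset)      // requires the full source text
  val line = upTo.count(_ == '\n') + 1        // newlines seen so far
  val col  = offset - (upTo.lastIndexOf('\n') + 1) // distance from line start
  (line, col)
}

val src = "val x = 1\nval y =\n"
assert(lineColOf(src, 0) == (1, 0))   // start of file
assert(lineColOf(src, 10) == (2, 0))  // first character after the newline
```

Carrying `startLine`/`startColumn` on the `Position` itself lets consumers such as language servers skip this scan entirely.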
This commit and the next one are required for https://github.com/sbt/zinc/pull/571 --- build.sbt | 2 ++ .../src/main/java/xsbti/Position.java | 4 +++ .../main/scala/sbt/util/InterfaceUtil.scala | 30 +++++++++++++++++-- 3 files changed, 34 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 1896a4191..7a2d78ce6 100644 --- a/build.sbt +++ b/build.sbt @@ -124,6 +124,8 @@ lazy val utilLogging = (project in internalPath / "util-logging") exclude[DirectMissingMethodProblem]("sbt.internal.util.SuccessEvent.copy*"), exclude[DirectMissingMethodProblem]("sbt.internal.util.TraceEvent.copy*"), exclude[DirectMissingMethodProblem]("sbt.internal.util.StringEvent.copy*"), + // Private final class constructor changed + exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcretePosition.this"), ), ) .configure(addSbtIO) diff --git a/internal/util-interface/src/main/java/xsbti/Position.java b/internal/util-interface/src/main/java/xsbti/Position.java index be0239046..0f27e295e 100644 --- a/internal/util-interface/src/main/java/xsbti/Position.java +++ b/internal/util-interface/src/main/java/xsbti/Position.java @@ -18,4 +18,8 @@ public interface Position Optional sourcePath(); Optional sourceFile(); + + // Default values to avoid breaking binary compatibility + default Optional startOffset() { return Optional.empty(); } + default Optional endOffset() { return Optional.empty(); } } diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index dc956ecbf..a2d705600 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -36,6 +36,7 @@ object InterfaceUtil { case None => Optional.empty[A]() } + // Overload to preserve binary compatibility def position( line0: Option[Integer], content: String, @@ -45,7 +46,28 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: 
Option[File] ): Position = - new ConcretePosition(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0) + position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0, None, None) + + def position( + line0: Option[Integer], + content: String, + offset0: Option[Integer], + pointer0: Option[Integer], + pointerSpace0: Option[String], + sourcePath0: Option[String], + sourceFile0: Option[File], + startOffset0: Option[Integer], + endOffset0: Option[Integer] + ): Position = + new ConcretePosition(line0, + content, + offset0, + pointer0, + pointerSpace0, + sourcePath0, + sourceFile0, + startOffset0, + endOffset0) def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = new ConcreteProblem(cat, pos, msg, sev) @@ -75,7 +97,9 @@ object InterfaceUtil { pointer0: Option[Integer], pointerSpace0: Option[String], sourcePath0: Option[String], - sourceFile0: Option[File] + sourceFile0: Option[File], + startOffset0: Option[Integer], + endOffset0: Option[Integer] ) extends Position { val line = o2jo(line0) val lineContent = content @@ -84,6 +108,8 @@ object InterfaceUtil { val pointerSpace = o2jo(pointerSpace0) val sourcePath = o2jo(sourcePath0) val sourceFile = o2jo(sourceFile0) + override val startOffset = o2jo(startOffset0) + override val endOffset = o2jo(endOffset0) } private final class ConcreteProblem( From 5e3a102606a034dd6b4c4ac4b5ebed6028850f1d Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 14 Aug 2018 02:00:41 +0900 Subject: [PATCH 766/823] xsbti.Position: Also add {start,end}{Line,Column} Positions in the Language Server Protocol and Build Server Protocol are line/column-based instead of offset-based, so this is more convenient. Computing the line/column from the offset is possible but requires reading the source file. 
--- .../src/main/java/xsbti/Position.java | 4 ++++ .../main/scala/sbt/util/InterfaceUtil.scala | 24 +++++++++++++++---- 2 files changed, 24 insertions(+), 4 deletions(-) diff --git a/internal/util-interface/src/main/java/xsbti/Position.java b/internal/util-interface/src/main/java/xsbti/Position.java index 0f27e295e..c23c53b22 100644 --- a/internal/util-interface/src/main/java/xsbti/Position.java +++ b/internal/util-interface/src/main/java/xsbti/Position.java @@ -22,4 +22,8 @@ public interface Position // Default values to avoid breaking binary compatibility default Optional startOffset() { return Optional.empty(); } default Optional endOffset() { return Optional.empty(); } + default Optional startLine() { return Optional.empty(); } + default Optional startColumn() { return Optional.empty(); } + default Optional endLine() { return Optional.empty(); } + default Optional endColumn() { return Optional.empty(); } } diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index a2d705600..b11a3e54c 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -46,7 +46,7 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: Option[File] ): Position = - position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0, None, None) + position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0, None, None, None, None, None, None) def position( line0: Option[Integer], @@ -57,7 +57,11 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: Option[File], startOffset0: Option[Integer], - endOffset0: Option[Integer] + endOffset0: Option[Integer], + startLine0: Option[Integer], + startColumn0: Option[Integer], + endLine0: Option[Integer], + endColumn0: Option[Integer] ): Position = new ConcretePosition(line0, content, @@ -67,7 +71,11 
@@ object InterfaceUtil { sourcePath0, sourceFile0, startOffset0, - endOffset0) + endOffset0, + startLine0, + startColumn0, + endLine0, + endColumn0) def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = new ConcreteProblem(cat, pos, msg, sev) @@ -99,7 +107,11 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: Option[File], startOffset0: Option[Integer], - endOffset0: Option[Integer] + endOffset0: Option[Integer], + startLine0: Option[Integer], + startColumn0: Option[Integer], + endLine0: Option[Integer], + endColumn0: Option[Integer] ) extends Position { val line = o2jo(line0) val lineContent = content @@ -110,6 +122,10 @@ object InterfaceUtil { val sourceFile = o2jo(sourceFile0) override val startOffset = o2jo(startOffset0) override val endOffset = o2jo(endOffset0) + override val startLine = o2jo(startLine0) + override val startColumn = o2jo(startColumn0) + override val endLine = o2jo(endLine0) + override val endColumn = o2jo(endColumn0) } private final class ConcreteProblem( From 494f384c49ac4fa49b907dfde64048af9a400986 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 16 Aug 2018 16:36:41 -0400 Subject: [PATCH 767/823] Formatting --- .../src/main/scala/sbt/util/InterfaceUtil.scala | 14 +++++++++++++- 1 file changed, 13 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index b11a3e54c..5a85c9b67 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -46,7 +46,19 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: Option[File] ): Position = - position(line0, content, offset0, pointer0, pointerSpace0, sourcePath0, sourceFile0, None, None, None, None, None, None) + position(line0, + content, + offset0, + pointer0, + pointerSpace0, + sourcePath0, + sourceFile0, + None, + None, + 
None, + None, + None, + None) def position( line0: Option[Integer], From a90675635fb0c2c51870bc066686e8afb49d0606 Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 28 Aug 2018 01:18:00 +0900 Subject: [PATCH 768/823] 1.2.2-SNAPSHOT --- build.sbt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 7a2d78ce6..e8f39ffea 100644 --- a/build.sbt +++ b/build.sbt @@ -33,7 +33,7 @@ val mimaSettings = Def settings ( mimaPreviousArtifacts := Set( "1.0.0", "1.0.1", "1.0.2", "1.0.3", "1.1.0", "1.1.1", "1.1.2", "1.1.3", - "1.2.0", + "1.2.0", "1.2.1" ) map (version => organization.value %% moduleName.value % version cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled) @@ -54,7 +54,7 @@ lazy val utilRoot: Project = (project in file(".")) .settings( inThisBuild( Seq( - git.baseVersion := "1.2.1", + git.baseVersion := "1.2.2", version := { val v = version.value if (v contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" From e905b44a3353e351f48e19b92af2416629ef5b64 Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 28 Aug 2018 01:14:24 +0900 Subject: [PATCH 769/823] Follow-up to the fields added in #173 It turns out that there is more boilerplate to fill that I missed. Also add deprecation notices. 
--- .../src/main/contraband/interface.contra.txt | 6 ++++++ .../internal/util/codec/PositionFormats.scala | 21 +++++++++++++++++++ .../main/scala/sbt/util/InterfaceUtil.scala | 2 +- .../src/main/scala/sbt/util/Logger.scala | 1 + 4 files changed, 29 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/contraband/interface.contra.txt b/internal/util-logging/src/main/contraband/interface.contra.txt index 795e6a4c3..a42eb09cb 100644 --- a/internal/util-logging/src/main/contraband/interface.contra.txt +++ b/internal/util-logging/src/main/contraband/interface.contra.txt @@ -16,6 +16,12 @@ type Position { pointerSpace: String sourcePath: String sourceFile: java.io.File + startOffset: Int + endOffset: Int + startLine: Int + startColumn: Int + endLine: Int + endColumn: Int } type Problem { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala index e43ff03bf..7fe9fd6ce 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/PositionFormats.scala @@ -19,6 +19,13 @@ trait PositionFormats { self: sjsonnew.BasicJsonProtocol => val pointerSpace0 = unbuilder.readField[Optional[String]]("pointerSpace") val sourcePath0 = unbuilder.readField[Optional[String]]("sourcePath") val sourceFile0 = unbuilder.readField[Optional[java.io.File]]("sourceFile") + val startOffset0 = unbuilder.readField[Optional[java.lang.Integer]]("startOffset") + val endOffset0 = unbuilder.readField[Optional[java.lang.Integer]]("endOffset") + val startLine0 = unbuilder.readField[Optional[java.lang.Integer]]("startLine") + val startColumn0 = unbuilder.readField[Optional[java.lang.Integer]]("startColumn") + val endLine0 = unbuilder.readField[Optional[java.lang.Integer]]("endLine") + val endColumn0 = unbuilder.readField[Optional[java.lang.Integer]]("endColumn") + 
unbuilder.endObject() new Position() { override val line = line0 @@ -28,6 +35,13 @@ trait PositionFormats { self: sjsonnew.BasicJsonProtocol => override val pointerSpace = pointerSpace0 override val sourcePath = sourcePath0 override val sourceFile = sourceFile0 + override val startOffset = startOffset0 + override val endOffset = endOffset0 + override val startLine = startLine0 + override val startColumn = startColumn0 + override val endLine = endLine0 + override val endColumn = endColumn0 + } case None => deserializationError("Expected JsObject but found None") @@ -42,6 +56,13 @@ trait PositionFormats { self: sjsonnew.BasicJsonProtocol => builder.addField("pointerSpace", obj.pointerSpace) builder.addField("sourcePath", obj.sourcePath) builder.addField("sourceFile", obj.sourceFile) + builder.addField("startOffset", obj.startOffset) + builder.addField("endOffset", obj.endOffset) + builder.addField("startLine", obj.startLine) + builder.addField("startColumn", obj.startColumn) + builder.addField("endLine", obj.endLine) + builder.addField("endColumn", obj.endColumn) + builder.endObject() } } diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index 5a85c9b67..a1ae332d8 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -36,7 +36,7 @@ object InterfaceUtil { case None => Optional.empty[A]() } - // Overload to preserve binary compatibility + @deprecated("Use the overload of this method with more arguments", "1.2.2") def position( line0: Option[Integer], content: String, diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala index 75d7a439d..37a043cfa 100644 --- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -97,6 +97,7 @@ 
object Logger { def jo2o[A](o: Optional[A]): Option[A] = InterfaceUtil.jo2o(o) def o2jo[A](o: Option[A]): Optional[A] = InterfaceUtil.o2jo(o) + @deprecated("Use InterfaceUtil.position", "1.2.2") def position( line0: Option[Integer], content: String, From 15522a0cbe1c0c89fa36c699f0442a3703deb4aa Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 28 Aug 2018 02:03:47 +0900 Subject: [PATCH 770/823] Add Problem#rendered to customize how problems are shown Dotty has its own logic for displaying problems with the proper file path, position, and caret, but if we store this information in Problem#message we end up with duplicated information in the output since Zinc will prepend/append similar things (see sbt.internal.inc.ProblemStringFormats). So far, we worked around this in Dotty by using an empty position in the sbt bridge reporter, but this means that crucial semantic information that could be used by a Build Server Protocol implementation and other tools is lost. This commit allows us to avoid this by adding an optional `rendered` field to `Problem`: when this field is set, its value controls what the user sees, otherwise we fall back to the default behavior (the logic to do this will be added to Zinc after this PR is merged and a new release of sbt-util is made). 
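The intended consumption of `rendered` — which this commit message says will land in Zinc later — can be sketched roughly like this (a hypothetical display helper over a minimal Problem shape, not actual Zinc code):

```scala
import java.util.Optional

// Hypothetical minimal Problem shape mirroring the patch: `rendered`
// defaults to empty so existing implementations stay source compatible.
trait MiniProblem {
  def message: String
  def rendered: Optional[String] = Optional.empty[String]()
}

object Display {
  // If `rendered` is present it is shown verbatim (the compiler already
  // formatted the path, position, and caret); otherwise fall back to the
  // default "position: message" style of rendering.
  def show(p: MiniProblem, positionText: String): String =
    if (p.rendered.isPresent) p.rendered.get
    else s"$positionText: ${p.message}"
}
```

Because the field is optional with an empty default, tools that never set it keep the old behavior, while a reporter like Dotty's can carry both the semantic position fields and its own pre-rendered text.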
--- build.sbt | 3 ++- .../src/main/java/xsbti/Problem.java | 12 ++++++++++- .../src/main/contraband/interface.contra.txt | 1 + .../internal/util/codec/ProblemFormats.scala | 21 +++++++++++++------ .../main/scala/sbt/util/InterfaceUtil.scala | 14 +++++++++++-- .../src/main/scala/sbt/util/Logger.scala | 1 + 6 files changed, 42 insertions(+), 10 deletions(-) diff --git a/build.sbt b/build.sbt index e8f39ffea..9a07324dd 100644 --- a/build.sbt +++ b/build.sbt @@ -124,8 +124,9 @@ lazy val utilLogging = (project in internalPath / "util-logging") exclude[DirectMissingMethodProblem]("sbt.internal.util.SuccessEvent.copy*"), exclude[DirectMissingMethodProblem]("sbt.internal.util.TraceEvent.copy*"), exclude[DirectMissingMethodProblem]("sbt.internal.util.StringEvent.copy*"), - // Private final class constructor changed + // Private final class constructors changed exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcretePosition.this"), + exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcreteProblem.this"), ), ) .configure(addSbtIO) diff --git a/internal/util-interface/src/main/java/xsbti/Problem.java b/internal/util-interface/src/main/java/xsbti/Problem.java index db7f67b22..db61f2bde 100644 --- a/internal/util-interface/src/main/java/xsbti/Problem.java +++ b/internal/util-interface/src/main/java/xsbti/Problem.java @@ -3,10 +3,20 @@ */ package xsbti; +import java.util.Optional; + public interface Problem { String category(); Severity severity(); String message(); Position position(); -} \ No newline at end of file + + // Default value to avoid breaking binary compatibility + /** + * If present, the string shown to the user when displaying this Problem. + * Otherwise, the Problem will be shown in an implementation-defined way + * based on the values of its other fields. 
+ */ + default Optional rendered() { return Optional.empty(); } +} diff --git a/internal/util-logging/src/main/contraband/interface.contra.txt b/internal/util-logging/src/main/contraband/interface.contra.txt index a42eb09cb..3b5ed4986 100644 --- a/internal/util-logging/src/main/contraband/interface.contra.txt +++ b/internal/util-logging/src/main/contraband/interface.contra.txt @@ -29,4 +29,5 @@ type Problem { severity: Severity! message: String! position: Position! + rendered: String } diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala index 9820289da..fb7583a5c 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/codec/ProblemFormats.scala @@ -4,8 +4,8 @@ package sbt.internal.util.codec import xsbti.{ Problem, Severity, Position } -import sbt.util.InterfaceUtil.problem import _root_.sjsonnew.{ deserializationError, Builder, JsonFormat, Unbuilder } +import java.util.Optional trait ProblemFormats { self: SeverityFormats with PositionFormats with sjsonnew.BasicJsonProtocol => implicit lazy val ProblemFormat: JsonFormat[Problem] = new JsonFormat[Problem] { @@ -13,12 +13,20 @@ trait ProblemFormats { self: SeverityFormats with PositionFormats with sjsonnew. 
jsOpt match { case Some(js) => unbuilder.beginObject(js) - val category = unbuilder.readField[String]("category") - val severity = unbuilder.readField[Severity]("severity") - val message = unbuilder.readField[String]("message") - val position = unbuilder.readField[Position]("position") + val category0 = unbuilder.readField[String]("category") + val severity0 = unbuilder.readField[Severity]("severity") + val message0 = unbuilder.readField[String]("message") + val position0 = unbuilder.readField[Position]("position") + val rendered0 = unbuilder.readField[Optional[String]]("rendered") + unbuilder.endObject() - problem(category, position, message, severity) + new Problem { + override val category = category0 + override val position = position0 + override val message = message0 + override val severity = severity0 + override val rendered = rendered0 + } case None => deserializationError("Expected JsObject but found None") } @@ -29,6 +37,7 @@ trait ProblemFormats { self: SeverityFormats with PositionFormats with sjsonnew. 
builder.addField("severity", obj.severity) builder.addField("message", obj.message) builder.addField("position", obj.position) + builder.addField("rendered", obj.rendered) builder.endObject() } } diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index a1ae332d8..1de667f5b 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -89,8 +89,16 @@ object InterfaceUtil { endLine0, endColumn0) + @deprecated("Use the overload of this method with more arguments", "1.2.2") def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = - new ConcreteProblem(cat, pos, msg, sev) + problem(cat, pos, msg, sev, None) + + def problem(cat: String, + pos: Position, + msg: String, + sev: Severity, + rendered: Option[String]): Problem = + new ConcreteProblem(cat, pos, msg, sev, rendered) private final class ConcreteT2[A1, A2](a1: A1, a2: A2) extends T2[A1, A2] { val get1: A1 = a1 @@ -144,12 +152,14 @@ object InterfaceUtil { cat: String, pos: Position, msg: String, - sev: Severity + sev: Severity, + rendered0: Option[String] ) extends Problem { val category = cat val position = pos val message = msg val severity = sev + override val rendered = o2jo(rendered0) override def toString = s"[$severity] $pos: $message" } } diff --git a/internal/util-logging/src/main/scala/sbt/util/Logger.scala b/internal/util-logging/src/main/scala/sbt/util/Logger.scala index 37a043cfa..3e543b5ce 100644 --- a/internal/util-logging/src/main/scala/sbt/util/Logger.scala +++ b/internal/util-logging/src/main/scala/sbt/util/Logger.scala @@ -117,6 +117,7 @@ object Logger { sourceFile0 ) + @deprecated("Use InterfaceUtil.problem", "1.2.2") def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = InterfaceUtil.problem(cat, pos, msg, sev) } From 7254a258d525b02e06704480b1d041df7ff272b1 Mon 
Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 19 Sep 2018 22:01:18 -0400 Subject: [PATCH 771/823] SIP-18 import Adds SIP-18 import for unidoc purpose. --- .../src/main/scala/sbt/internal/util/Positions.scala | 2 ++ project/build.properties | 2 +- 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala index ca3626db1..a991d2c03 100644 --- a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala +++ b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala @@ -1,5 +1,7 @@ package sbt.internal.util +import scala.language.experimental.macros + sealed trait SourcePosition sealed trait FilePosition extends SourcePosition { diff --git a/project/build.properties b/project/build.properties index 5620cc502..0cd8b0798 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.2.1 +sbt.version=1.2.3 From 9beff331526fce1654002d318a0f54eebc850d8e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 19 Sep 2018 22:01:18 -0400 Subject: [PATCH 772/823] SIP-18 import Adds SIP-18 import for unidoc purpose. 
--- .../src/main/scala/sbt/internal/util/Positions.scala | 2 ++ project/build.properties | 2 +- 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala index ca3626db1..a991d2c03 100644 --- a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala +++ b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala @@ -1,5 +1,7 @@ package sbt.internal.util +import scala.language.experimental.macros + sealed trait SourcePosition sealed trait FilePosition extends SourcePosition { diff --git a/project/build.properties b/project/build.properties index 5620cc502..0cd8b0798 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.2.1 +sbt.version=1.2.3 From cf0467609da1ee6ca59c76f13ca072653c7e51cb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 19 Sep 2018 22:46:38 -0400 Subject: [PATCH 773/823] -Xfatal-warnings --- build.sbt | 13 +++---------- .../main/scala/sbt/internal/util/FilterLogger.scala | 3 ++- .../main/scala/sbt/internal/util/FullLogger.scala | 3 ++- .../main/scala/sbt/internal/util/MultiLogger.scala | 3 ++- .../main/scala/sbt/internal/util/Positions.scala | 1 - project/Dependencies.scala | 4 +++- project/plugins.sbt | 4 ++-- util-cache/src/main/scala/sbt/util/FileInfo.scala | 1 - util-cache/src/test/scala/CacheSpec.scala | 4 ++-- util-cache/src/test/scala/SingletonCacheSpec.scala | 4 ++-- 10 files changed, 18 insertions(+), 22 deletions(-) diff --git a/build.sbt b/build.sbt index 9a07324dd..dd6230b83 100644 --- a/build.sbt +++ b/build.sbt @@ -14,15 +14,6 @@ def commonSettings: Seq[Setting[_]] = Seq( testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"), crossScalaVersions := Seq(scala211, scala212), - scalacOptions := { - val old = scalacOptions.value - 
scalaVersion.value match { - case sv if sv.startsWith("2.10") => - old diff List("-Xfuture", "-Ywarn-unused", "-Ywarn-unused-import") - case sv if sv.startsWith("2.11") => old ++ List("-Ywarn-unused", "-Ywarn-unused-import") - case _ => old ++ List("-Ywarn-unused", "-Ywarn-unused-import", "-YdisableFlatCpCaching") - } - }, scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", publishArtifact in Compile := true, @@ -110,8 +101,10 @@ lazy val utilLogging = (project in internalPath / "util-logging") crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", libraryDependencies ++= - Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), + Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value, + compilerPlugin(silencerPlugin), silencerLib), libraryDependencies ++= Seq(scalaCheck, scalaTest), + Compile / scalacOptions += "-Ywarn-unused:-locals,-explicits,-privates", sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala", contrabandFormatsForType in generateContrabands in Compile := { tpe => val old = (contrabandFormatsForType in generateContrabands in Compile).value diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala index d52901c7b..03bc1e862 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FilterLogger.scala @@ -4,13 +4,14 @@ package sbt.internal.util import sbt.util._ +import com.github.ghik.silencer.silent /** * A filter logger is used to delegate messages but not the logging level to another logger. This means * that messages are logged at the higher of the two levels set by this logger and its delegate. 
*/ class FilterLogger(delegate: AbstractLogger) extends BasicLogger { - override lazy val ansiCodesSupported = delegate.ansiCodesSupported + @silent override lazy val ansiCodesSupported = delegate.ansiCodesSupported def trace(t: => Throwable): Unit = { if (traceEnabled) delegate.trace(t) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala index c3ad40442..60478d3b9 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/FullLogger.scala @@ -4,11 +4,12 @@ package sbt.internal.util import sbt.util._ +import com.github.ghik.silencer.silent /** Promotes the simple Logger interface to the full AbstractLogger interface. */ class FullLogger(delegate: Logger) extends BasicLogger { @deprecated("No longer used.", "1.0.0") - override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported + @silent override val ansiCodesSupported: Boolean = delegate.ansiCodesSupported def trace(t: => Throwable): Unit = { if (traceEnabled) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala index 2d12a1b2f..a3eb9948e 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/MultiLogger.scala @@ -4,13 +4,14 @@ package sbt.internal.util import sbt.util._ +import com.github.ghik.silencer.silent // note that setting the logging level on this logger has no effect on its behavior, only // on the behavior of the delegates. 
class MultiLogger(delegates: List[AbstractLogger]) extends BasicLogger { @deprecated("No longer used.", "1.0.0") override lazy val ansiCodesSupported = delegates exists supported - private[this] def supported = (_: AbstractLogger).ansiCodesSupported + @silent private[this] def supported = (_: AbstractLogger).ansiCodesSupported override def setLevel(newLevel: Level.Value): Unit = { super.setLevel(newLevel) diff --git a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala index a991d2c03..47c5bc16b 100644 --- a/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala +++ b/internal/util-position/src/main/scala/sbt/internal/util/Positions.scala @@ -24,7 +24,6 @@ final case class RangePosition(path: String, range: LineRange) extends FilePosit object SourcePosition { /** Creates a SourcePosition by using the enclosing position of the invocation of this method. - * @see [[scala.reflect.macros.Enclosures#enclosingPosition]] * @return SourcePosition */ def fromEnclosing(): SourcePosition = macro SourcePositionMacro.fromEnclosingImpl diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 806cbe65b..6d51c107f 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { val scala211 = "2.11.12" val scala212 = "2.12.6" - private val ioVersion = "1.2.0" + private val ioVersion = "1.2.1" private val sbtIO = "org.scala-sbt" %% "io" % ioVersion @@ -58,4 +58,6 @@ object Dependencies { val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion val disruptor = "com.lmax" % "disruptor" % "3.3.6" + val silencerPlugin = "com.github.ghik" %% "silencer-plugin" % "1.2" + val silencerLib = "com.github.ghik" %% "silencer-lib" % "1.2" % Provided } diff --git a/project/plugins.sbt b/project/plugins.sbt index e1c769a77..c064ea157 100644 
--- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,3 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.7") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.0") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.8") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index bef3d6bfd..1fc605a3d 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -4,7 +4,6 @@ package sbt.util import java.io.File -import java.io.FileNotFoundException import scala.util.control.NonFatal import sbt.io.{ Hash, IO } import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } diff --git a/util-cache/src/test/scala/CacheSpec.scala b/util-cache/src/test/scala/CacheSpec.scala index ad51d7ff5..138e6218b 100644 --- a/util-cache/src/test/scala/CacheSpec.scala +++ b/util-cache/src/test/scala/CacheSpec.scala @@ -46,7 +46,7 @@ class CacheSpec extends FlatSpec { } cache(store)("someKey") match { - case Hit(read) => assert(read === value) + case Hit(read) => assert(read === value); () case Miss(_) => fail } } @@ -63,7 +63,7 @@ class CacheSpec extends FlatSpec { } cache(store)(key) match { - case Hit(read) => assert(read === value) + case Hit(read) => assert(read === value); () case Miss(_) => fail } } diff --git a/util-cache/src/test/scala/SingletonCacheSpec.scala b/util-cache/src/test/scala/SingletonCacheSpec.scala index d22e10f2d..5c812f8d1 100644 --- a/util-cache/src/test/scala/SingletonCacheSpec.scala +++ b/util-cache/src/test/scala/SingletonCacheSpec.scala @@ -69,7 +69,7 @@ class SingletonCacheSpec extends FlatSpec { cache.write(store, value) val read = cache.read(store) - assert(read === value) + assert(read === value); () } } @@ -80,7 +80,7 @@ class SingletonCacheSpec extends FlatSpec { 
cache.write(store, value) val read = cache.read(store) - assert(read === value) + assert(read === value); () } } From dee4ccaa682d0a58a5683b9d25d771c6e342b76b Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 20 Sep 2018 00:36:32 -0400 Subject: [PATCH 774/823] only for 2.12 --- build.sbt | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index dd6230b83..2d5976cea 100644 --- a/build.sbt +++ b/build.sbt @@ -104,7 +104,10 @@ lazy val utilLogging = (project in internalPath / "util-logging") Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value, compilerPlugin(silencerPlugin), silencerLib), libraryDependencies ++= Seq(scalaCheck, scalaTest), - Compile / scalacOptions += "-Ywarn-unused:-locals,-explicits,-privates", + Compile / scalacOptions ++= (scalaVersion.value match { + case v if v.startsWith("2.12.") => List("-Ywarn-unused:-locals,-explicits,-privates") + case _ => List() + }), sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala", contrabandFormatsForType in generateContrabands in Compile := { tpe => val old = (contrabandFormatsForType in generateContrabands in Compile).value From 9f876009c8d2972de1472cb5d36837a9d0f8c50e Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 22 Sep 2018 00:35:20 -0400 Subject: [PATCH 775/823] Fix 2.10 build Fixes #179 --- .travis.yml | 14 ++++++++------ build.sbt | 7 +++++-- .../scala/com/github/ghik/silencer/silent.scala | 10 ++++++++++ .../scala/sbt/internal/util/BufferedLogger.scala | 1 + 4 files changed, 24 insertions(+), 8 deletions(-) create mode 100644 internal/util-logging/src/main/scala/com/github/ghik/silencer/silent.scala diff --git a/.travis.yml b/.travis.yml index 3b529232f..cd42eff8c 100644 --- a/.travis.yml +++ b/.travis.yml @@ -3,12 +3,7 @@ jdk: oraclejdk8 scala: - 2.11.12 - - 2.12.4 - -matrix: - include: - - scala: 2.12.4 - jdk: oraclejdk9 + - 2.12.6 script: sbt 
-Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M ++$TRAVIS_SCALA_VERSION @@ -17,6 +12,13 @@ script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M whitesourceCheckPolicies test +matrix: + include: + - scala: 2.12.6 + jdk: oraclejdk9 + - scala: 2.10.6 + script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION compile" + env: global: - secure: JzxepvrNQIem+7MS8pBfBkcWDgt/oNKOreI3GJMJDN9P7lxCmrW0UVhpSftscjRzz9gXGQleqZ8t/I0hqysY9nO/DlxDQil6FKpsqrEKALdIsez8TjtbOlV69enDl6SBCXpg1B/rTQ/dL9mpV3WMvNkmDOhcNmbNyfO9Uk8wAAEvGQNKyE02s0gjZf6IgfOHXInBB2o3+uQFiWCABFHDWInN4t0QZVEhF/3P3iDKEfauWGwugf/YKLrwUUzNyN+J1i1goYEWZvviP+KCNbPlEsVN60In8F0t+jYuBJb0ePNcl3waT/4xBKQRidB4XRbhOXrZIATdpHLnzKzk2TPf3GxijNEscKYGdq3v6nWd128rfHGYz528pRSZ8bNOdQJotB/bJTmIEOnk5P9zU0z4z2cawMF6EyBJka7kXnC9Vz6TpifvyXDpzfmRzAkBrD6PC+diGPbyy5+4zvhpZuv31MRjMckohyNb76pR9qq70yDlomn+nVNoZ1fpp7dCqwjIxm9h2UjCWzXWY4xSByI8/CaPibq6Ma7RWHQE+4NGG2CCLQrqN4NB+BFsH3R0l5Js9khvDuEUYJkgSmJMFluXranWRV+pp/YMxk1IT4rOEPOc/hIqlQTrxasp/QxeyAfRk9OPzoz9L2kR0RH4ch3KuaARUv03WFNarfQ/ISz3P/s= diff --git a/build.sbt b/build.sbt index 2d5976cea..58c1dcdc6 100644 --- a/build.sbt +++ b/build.sbt @@ -101,9 +101,12 @@ lazy val utilLogging = (project in internalPath / "util-logging") crossScalaVersions := Seq(scala210, scala211, scala212), name := "Util Logging", libraryDependencies ++= - Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value, - compilerPlugin(silencerPlugin), silencerLib), + Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), libraryDependencies ++= Seq(scalaCheck, scalaTest), + libraryDependencies ++= (scalaVersion.value match { + case v if v.startsWith("2.12.") => List(compilerPlugin(silencerPlugin)) + case _ => List() + }), Compile / scalacOptions ++= (scalaVersion.value match { case v if v.startsWith("2.12.") => List("-Ywarn-unused:-locals,-explicits,-privates") case _ => List() diff --git 
a/internal/util-logging/src/main/scala/com/github/ghik/silencer/silent.scala b/internal/util-logging/src/main/scala/com/github/ghik/silencer/silent.scala new file mode 100644 index 000000000..505099dfc --- /dev/null +++ b/internal/util-logging/src/main/scala/com/github/ghik/silencer/silent.scala @@ -0,0 +1,10 @@ +package com.github.ghik.silencer + +import scala.annotation.Annotation + +/** + * When the silencer compiler plugin is enabled, this annotation suppresses all warnings emitted by scalac for some portion + * of source code. It can be applied on any definition (`class`, `def`, `val`, `var`, etc.) or on an arbitrary expression, + * e.g. `{123; 456}: @silent` + */ +class silent extends Annotation diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index b19fc2509..f1948b2f8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -43,6 +43,7 @@ class BufferedAppender private[BufferedAppender] (name: String, delegate: Append if (recording) { buffer += event.toImmutable } else delegate.append(event) + () } /** Enables buffering. 
*/ From e121d969c4d353e9d118447895c89a6df255b500 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 22 Sep 2018 00:55:00 -0400 Subject: [PATCH 776/823] openjdk11 --- .travis.yml | 2 +- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 3 ++- 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/.travis.yml b/.travis.yml index cd42eff8c..31976b641 100644 --- a/.travis.yml +++ b/.travis.yml @@ -15,7 +15,7 @@ script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M matrix: include: - scala: 2.12.6 - jdk: oraclejdk9 + jdk: openjdk11 - scala: 2.10.6 script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION compile" diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 3e1d07da5..d5f6f7919 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -8,6 +8,7 @@ import org.apache.logging.log4j.{ Level => XLevel } import org.apache.logging.log4j.message.{ Message, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } import org.apache.logging.log4j.core.appender.AbstractAppender +import scala.collection.immutable.StringOps import ConsoleAppender._ @@ -389,7 +390,7 @@ class ConsoleAppender private[ConsoleAppender] ( message: String ): Unit = out.lockObject.synchronized { - message.lines.foreach { line => + new StringOps(message).lines.foreach { line => val builder = new java.lang.StringBuilder( labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3) def fmted(a: String, b: String) = builder.append(reset).append(a).append(b).append(reset) From ab674321dce1439f80593ba58186346c2934c1a8 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 28 Sep 2018 02:27:45 -0400 Subject: [PATCH 777/823] Scala 
2.12.7 --- .travis.yml | 4 ++-- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 3 +-- project/Dependencies.scala | 2 +- 3 files changed, 4 insertions(+), 5 deletions(-) diff --git a/.travis.yml b/.travis.yml index 31976b641..8436d27e9 100644 --- a/.travis.yml +++ b/.travis.yml @@ -3,7 +3,7 @@ jdk: oraclejdk8 scala: - 2.11.12 - - 2.12.6 + - 2.12.7 script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M ++$TRAVIS_SCALA_VERSION @@ -14,7 +14,7 @@ script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M matrix: include: - - scala: 2.12.6 + - scala: 2.12.7 jdk: openjdk11 - scala: 2.10.6 script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION compile" diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index d5f6f7919..584fd9dd1 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -8,7 +8,6 @@ import org.apache.logging.log4j.{ Level => XLevel } import org.apache.logging.log4j.message.{ Message, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } import org.apache.logging.log4j.core.appender.AbstractAppender -import scala.collection.immutable.StringOps import ConsoleAppender._ @@ -390,7 +389,7 @@ class ConsoleAppender private[ConsoleAppender] ( message: String ): Unit = out.lockObject.synchronized { - new StringOps(message).lines.foreach { line => + message.linesIterator.foreach { line => val builder = new java.lang.StringBuilder( labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3) def fmted(a: String, b: String) = builder.append(reset).append(a).append(b).append(reset) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 6d51c107f..3437b7f0f 100644 --- 
a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,7 +5,7 @@ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala210 = "2.10.7" val scala211 = "2.11.12" - val scala212 = "2.12.6" + val scala212 = "2.12.7" private val ioVersion = "1.2.1" From 5ea9ee159c22ce3e6f7b80edf971cae804c9d00c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 28 Sep 2018 02:28:12 -0400 Subject: [PATCH 778/823] 1.3.0 --- build.sbt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/build.sbt b/build.sbt index 58c1dcdc6..28b92091e 100644 --- a/build.sbt +++ b/build.sbt @@ -45,7 +45,7 @@ lazy val utilRoot: Project = (project in file(".")) .settings( inThisBuild( Seq( - git.baseVersion := "1.2.2", + git.baseVersion := "1.3.0", version := { val v = version.value if (v contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" From 53c9b848580376544543b0732848005b37d97fd7 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 30 Sep 2018 20:59:19 -0400 Subject: [PATCH 779/823] add sbt.color flag This implements a new sbt.color flag that takes an always/auto/never/true/false value as a replacement for the current sbt.log.format=true/false flag. When neither flag is set, the default behavior is to enable color when the terminal supports ANSI and a stdout console is detected (as opposed to redirects). 
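A rough sketch of the flag resolution described in this commit message (assumed semantics for illustration; the real logic lives in `ConsoleAppender`, where `Auto` additionally consults ANSI and console detection):

```scala
sealed trait LogOption
object LogOption {
  case object Always extends LogOption
  case object Never extends LogOption
  case object Auto extends LogOption

  // always/true force color on, never/false force it off, and anything
  // else (including an unset or unrecognized value) defers to detection.
  def fromString(value: String): LogOption =
    value.toLowerCase match {
      case "always" | "true" => Always
      case "never" | "false" => Never
      case _                 => Auto
    }
}
```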
Fixes https://github.com/sbt/sbt/issues/4284 --- .../sbt/internal/util/LogOption.scala | 15 +++++++ .../internal/util/codec/JsonProtocol.scala | 1 + .../util/codec/LogOptionFormats.scala | 31 +++++++++++++ .../src/main/contraband/logging.contra | 7 +++ .../sbt/internal/util/ConsoleAppender.scala | 44 +++++++++++++++++-- 5 files changed, 94 insertions(+), 4 deletions(-) create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/LogOption.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/LogOption.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/LogOption.scala new file mode 100644 index 000000000..769da7982 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/LogOption.scala @@ -0,0 +1,15 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +/** value for logging options like color */ +sealed abstract class LogOption extends Serializable +object LogOption { + + + case object Always extends LogOption + case object Never extends LogOption + case object Auto extends LogOption +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala index a94906dda..15e4d9cb2 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -9,4 +9,5 @@ trait JsonProtocol extends sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.TraceEventFormats with sbt.internal.util.codec.AbstractEntryFormats with sbt.internal.util.codec.SuccessEventFormats + with sbt.internal.util.codec.LogOptionFormats object JsonProtocol extends JsonProtocol \ No newline at end of file diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala new file mode 100644 index 000000000..e52700c19 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala @@ -0,0 +1,31 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } +trait LogOptionFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val LogOptionFormat: JsonFormat[sbt.internal.util.LogOption] = new JsonFormat[sbt.internal.util.LogOption] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.LogOption = { + jsOpt match { + case Some(js) => + unbuilder.readString(js) match { + case "Always" => sbt.internal.util.LogOption.Always + case "Never" => sbt.internal.util.LogOption.Never + case "Auto" => sbt.internal.util.LogOption.Auto + } + case None => + deserializationError("Expected JsString but found None") + } + } + override def write[J](obj: sbt.internal.util.LogOption, builder: Builder[J]): Unit = { + val str = obj match { + case sbt.internal.util.LogOption.Always => "Always" + case sbt.internal.util.LogOption.Never => "Never" + case sbt.internal.util.LogOption.Auto => "Auto" + } + builder.writeString(str) + } +} +} diff --git a/internal/util-logging/src/main/contraband/logging.contra b/internal/util-logging/src/main/contraband/logging.contra index 19b019c66..73d0b1a56 100644 --- a/internal/util-logging/src/main/contraband/logging.contra +++ b/internal/util-logging/src/main/contraband/logging.contra @@ -25,3 +25,10 @@ type TraceEvent implements sbt.internal.util.AbstractEntry { type SuccessEvent { message: String! } + +## value for logging options like color +enum LogOption { + Always + Never + Auto +} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index d5f6f7919..f070a1252 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -101,13 +101,49 @@ object ConsoleAppender { /** Hide stack trace altogether. 
*/ val noSuppressedMessage = (_: SuppressedTraceContext) => None - /** Indicates whether formatting has been disabled in environment variables. */ + /** + * Indicates whether formatting has been disabled in environment variables. + * 1. -Dsbt.log.noformat=true means no formatting. + * 2. -Dsbt.color=always/auto/never/true/false + * 3. -Dsbt.colour=always/auto/never/true/false + * 4. -Dsbt.log.format=always/auto/never/true/false + */ val formatEnabledInEnv: Boolean = { - import java.lang.Boolean.{ getBoolean, parseBoolean } - val value = System.getProperty("sbt.log.format") - if (value eq null) (ansiSupported && !getBoolean("sbt.log.noformat")) else parseBoolean(value) + def useColorDefault: Boolean = { + // This approximates that both stdin and stdio are connected, + // so by default color will be turned off for pipes and redirects. + val hasConsole = Option(java.lang.System.console).isDefined + ansiSupported && hasConsole + } + sys.props.get("sbt.log.noformat") match { + case Some(_) => !java.lang.Boolean.getBoolean("sbt.log.noformat") + case _ => + sys.props + .get("sbt.color") + .orElse(sys.props.get("sbt.colour")) + .orElse(sys.props.get("sbt.log.format")) + .flatMap({ s => + parseLogOption(s) match { + case LogOption.Always => Some(true) + case LogOption.Never => Some(false) + case _ => None + } + }) + .getOrElse(useColorDefault) + } } + private[sbt] def parseLogOption(s: String): LogOption = + s.toLowerCase match { + case "always" => LogOption.Always + case "auto" => LogOption.Auto + case "never" => LogOption.Never + case "true" => LogOption.Always + case "false" => LogOption.Never + case "default" => LogOption.Auto + case _ => LogOption.Auto + } + private[this] val generateId: AtomicInteger = new AtomicInteger /** From 9bb244314dc8e0c371dd39a1fd178ab7fc33937d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 30 Sep 2018 21:43:39 -0400 Subject: [PATCH 780/823] implement sbt.progress This implements a logger that grows upward instead of toward the bottom.
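The "grow upward" rendering rests on a handful of ANSI control sequences: scroll the buffer up, erase the bottom line, print the new message there, and park the cursor at column 1 for the next update. A minimal sketch, assuming an ANSI-capable terminal (the constant names mirror the diff, but this snippet is illustrative rather than the sbt code itself):

```scala
import java.io.PrintStream

// ANSI CSI sequences used to redraw the bottom line in place.
val ScrollUp       = "\u001B[S"     // scroll the screen buffer up one line
val DeleteLine     = "\u001B[2K"    // erase the entire current line
val CursorLeft1000 = "\u001B[1000D" // move the cursor (up to) 1000 columns left

// Previous output scrolls upward while the bottom line is rewritten in
// place, so the log appears to grow up instead of down.
def writeInPlace(out: PrintStream, msg: String): Unit = {
  out.print(s"$ScrollUp$DeleteLine$msg$CursorLeft1000")
  out.flush()
}
```

Because nothing here emits a newline, consecutive calls overwrite the same physical row; a plain `println` path remains necessary for terminals without ANSI support.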
--- build.sbt | 1 + .../sbt/internal/util/ConsoleAppender.scala | 29 ++++++++++++++++--- .../scala/sbt/internal/util/ConsoleOut.scala | 18 ++++++++++-- 3 files changed, 41 insertions(+), 7 deletions(-) diff --git a/build.sbt b/build.sbt index 58c1dcdc6..84fac6e31 100644 --- a/build.sbt +++ b/build.sbt @@ -126,6 +126,7 @@ lazy val utilLogging = (project in internalPath / "util-logging") // Private final class constructors changed exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcretePosition.this"), exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcreteProblem.this"), + exclude[ReversedMissingMethodProblem]("sbt.internal.util.ConsoleOut.flush"), ), ) .configure(addSbtIO) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index f070a1252..79e210b62 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -133,6 +133,21 @@ object ConsoleAppender { } } + /** + * Indicates whether the super shell is enabled. 
+ */ + lazy val showProgress: Boolean = + formatEnabledInEnv && sys.props + .get("sbt.progress") + .flatMap({ s => + parseLogOption(s) match { + case LogOption.Always => Some(true) + case LogOption.Never => Some(false) + case _ => None + } + }) + .getOrElse(true) + private[sbt] def parseLogOption(s: String): LogOption = s.toLowerCase match { case "always" => LogOption.Always @@ -443,11 +458,17 @@ class ConsoleAppender private[ConsoleAppender] ( appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) } + private final val ScrollUp = "\u001B[S" + private final val DeleteLine = "\u001B[2K" + private final val CursorLeft1000 = "\u001B[1000D" private def write(msg: String): Unit = { - val cleanedMsg = - if (!useFormat || !ansiCodesSupported) EscHelpers.removeEscapeSequences(msg) - else msg - out.println(cleanedMsg) + if (!useFormat || !ansiCodesSupported) out.println(EscHelpers.removeEscapeSequences(msg)) + else { + if (ConsoleAppender.showProgress) { + out.print(s"$ScrollUp$DeleteLine$msg${CursorLeft1000}") + out.flush() + } else out.println(msg) + } } private def appendMessage(level: Level.Value, msg: Message): Unit = diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala index 717be2cfd..7edefebd7 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleOut.scala @@ -7,6 +7,7 @@ sealed trait ConsoleOut { def print(s: String): Unit def println(s: String): Unit def println(): Unit + def flush(): Unit } object ConsoleOut { @@ -39,6 +40,14 @@ object ConsoleOut { last = Some(s) current.setLength(0) } + def flush(): Unit = synchronized { + val s = current.toString + if (ConsoleAppender.formatEnabledInEnv && last.exists(lmsg => f(s, lmsg))) + lockObject.print(OverwriteLine) + lockObject.print(s) + last = Some(s) + current.setLength(0) + } } def 
printStreamOut(out: PrintStream): ConsoleOut = new ConsoleOut { @@ -46,17 +55,20 @@ object ConsoleOut { def print(s: String) = out.print(s) def println(s: String) = out.println(s) def println() = out.println() + def flush() = out.flush() } def printWriterOut(out: PrintWriter): ConsoleOut = new ConsoleOut { val lockObject = out def print(s: String) = out.print(s) - def println(s: String) = { out.println(s); out.flush() } - def println() = { out.println(); out.flush() } + def println(s: String) = { out.println(s); flush() } + def println() = { out.println(); flush() } + def flush() = { out.flush() } } def bufferedWriterOut(out: BufferedWriter): ConsoleOut = new ConsoleOut { val lockObject = out def print(s: String) = out.write(s) def println(s: String) = { out.write(s); println() } - def println() = { out.newLine(); out.flush() } + def println() = { out.newLine(); flush() } + def flush() = { out.flush() } } } From 458675239c5d01f83ae00166861fce82d7b5a6db Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 2 Oct 2018 08:17:08 -0400 Subject: [PATCH 781/823] Add mima exclusion for JsonProtocol.LogOptionFormat According to Travis CI only Scala 2.11 seems to be affected. --- build.sbt | 2 ++ 1 file changed, 2 insertions(+) diff --git a/build.sbt b/build.sbt index 84fac6e31..6267773b1 100644 --- a/build.sbt +++ b/build.sbt @@ -127,6 +127,8 @@ lazy val utilLogging = (project in internalPath / "util-logging") exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcretePosition.this"), exclude[DirectMissingMethodProblem]("sbt.util.InterfaceUtil#ConcreteProblem.this"), exclude[ReversedMissingMethodProblem]("sbt.internal.util.ConsoleOut.flush"), + // This affects Scala 2.11 only it seems, so it's ok? 
+ exclude[InheritedNewAbstractMethodProblem]("sbt.internal.util.codec.JsonProtocol.LogOptionFormat"), ), ) .configure(addSbtIO) From efe04c1cdecc5e43521c9b12565426d52f518b3d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 2 Oct 2018 08:51:17 -0400 Subject: [PATCH 782/823] Cleaning up code --- .../sbt/internal/util/ConsoleAppender.scala | 26 +++++++++---------- 1 file changed, 13 insertions(+), 13 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 79e210b62..fedb80f9f 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -150,13 +150,12 @@ object ConsoleAppender { private[sbt] def parseLogOption(s: String): LogOption = s.toLowerCase match { - case "always" => LogOption.Always - case "auto" => LogOption.Auto - case "never" => LogOption.Never - case "true" => LogOption.Always - case "false" => LogOption.Never - case "default" => LogOption.Auto - case _ => LogOption.Auto + case "always" => LogOption.Always + case "auto" => LogOption.Auto + case "never" => LogOption.Never + case "true" => LogOption.Always + case "false" => LogOption.Never + case _ => LogOption.Auto } private[this] val generateId: AtomicInteger = new AtomicInteger @@ -462,12 +461,13 @@ class ConsoleAppender private[ConsoleAppender] ( private final val DeleteLine = "\u001B[2K" private final val CursorLeft1000 = "\u001B[1000D" private def write(msg: String): Unit = { - if (!useFormat || !ansiCodesSupported) out.println(EscHelpers.removeEscapeSequences(msg)) - else { - if (ConsoleAppender.showProgress) { - out.print(s"$ScrollUp$DeleteLine$msg${CursorLeft1000}") - out.flush() - } else out.println(msg) + if (!useFormat || !ansiCodesSupported) { + out.println(EscHelpers.removeEscapeSequences(msg)) + } else if (ConsoleAppender.showProgress) { + 
out.print(s"$ScrollUp$DeleteLine$msg${CursorLeft1000}") + out.flush() + } else { + out.println(msg) } } From 65e2980e9dc93d6600367c5763eb33485abd3608 Mon Sep 17 00:00:00 2001 From: Jason Zaugg Date: Tue, 9 Oct 2018 15:44:06 +1000 Subject: [PATCH 783/823] Avoid temporary string in JSON reading --- util-cache/src/main/scala/sbt/util/Input.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/util-cache/src/main/scala/sbt/util/Input.scala b/util-cache/src/main/scala/sbt/util/Input.scala index 9dcdd5949..6f1d895e8 100644 --- a/util-cache/src/main/scala/sbt/util/Input.scala +++ b/util-cache/src/main/scala/sbt/util/Input.scala @@ -22,7 +22,7 @@ class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J val buffer = new Array[Char](bufferSize) var read = 0 while ({ read = reader.read(buffer, 0, bufferSize); read != -1 }) { - builder.append(String.valueOf(buffer.take(read))) + builder.appendAll(buffer, 0, read) } builder.toString() } From 9f202397e47e1604b0cd89fe7e264ccaf80942c6 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Fri, 16 Nov 2018 23:03:25 -0800 Subject: [PATCH 784/823] expose ANSI control sequences --- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index d42c3adff..28e6c657c 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -96,6 +96,9 @@ class ConsoleLogger private[ConsoleLogger] ( } object ConsoleAppender { + private[sbt] final val ScrollUp = "\u001B[S" + private[sbt] final val DeleteLine = "\u001B[2K" + private[sbt] final val CursorLeft1000 = "\u001B[1000D" /** Hide stack trace altogether. 
*/ val noSuppressedMessage = (_: SuppressedTraceContext) => None @@ -456,9 +459,6 @@ class ConsoleAppender private[ConsoleAppender] ( appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) } - private final val ScrollUp = "\u001B[S" - private final val DeleteLine = "\u001B[2K" - private final val CursorLeft1000 = "\u001B[1000D" private def write(msg: String): Unit = { if (!useFormat || !ansiCodesSupported) { out.println(EscHelpers.removeEscapeSequences(msg)) From fac92b66cbc20e89363babc499677b27a2003672 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 25 Nov 2018 23:22:22 -0500 Subject: [PATCH 785/823] bump JLine and log4j 2 --- project/Dependencies.scala | 6 +++--- project/plugins.sbt | 1 + 2 files changed, 4 insertions(+), 3 deletions(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 3437b7f0f..d820356a3 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -35,7 +35,7 @@ object Dependencies { def addSbtIO(p: Project): Project = addSbtModule(p, sbtIoPath, "io", sbtIO) - val jline = "jline" % "jline" % "2.14.4" + val jline = "jline" % "jline" % "2.14.6" val scalaCompiler = Def.setting { "org.scala-lang" % "scala-compiler" % scalaVersion.value } val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } @@ -54,10 +54,10 @@ object Dependencies { "com.eed3si9n" %% "sjson-new-murmurhash" % contrabandSjsonNewVersion.value } - def log4jVersion = "2.8.1" + def log4jVersion = "2.11.1" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion - val disruptor = "com.lmax" % "disruptor" % "3.3.6" + val disruptor = "com.lmax" % "disruptor" % "3.4.2" val silencerPlugin = "com.github.ghik" %% "silencer-plugin" % "1.2" val silencerLib = "com.github.ghik" %% "silencer-lib" % "1.2" % Provided } diff --git a/project/plugins.sbt b/project/plugins.sbt index c064ea157..20afa9ff8 100644 
--- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,3 +1,4 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.8") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") +addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.3.4") From 5b198b20be271e27768547ad944c362fab572c4e Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Mon, 4 Feb 2019 12:57:45 -0800 Subject: [PATCH 786/823] Add file FileInfo factory applys without io It may be the case that the file property is already known and we can avoid performing additional io by just passing in the value directly. --- util-cache/src/main/scala/sbt/util/FileInfo.scala | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 1fc605a3d..f823f4924 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -90,6 +90,8 @@ object FileInfo { implicit def apply(file: File): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, Hash(file).toList, IO.getModifiedTimeOrZero(file)) + def apply(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = + FileHashModified(file.getAbsoluteFile, hash, lastModified) } object hash extends Style { @@ -115,6 +117,8 @@ object FileInfo { } implicit def apply(file: File): HashFileInfo = FileHash(file.getAbsoluteFile, computeHash(file)) + def apply(file: File, bytes: List[Byte]): HashFileInfo = + FileHash(file.getAbsoluteFile, bytes) private def computeHash(file: File): List[Byte] = try Hash(file).toList @@ -147,6 +151,8 @@ object FileInfo { implicit def apply(file: File): ModifiedFileInfo = FileModified(file.getAbsoluteFile, IO.getModifiedTimeOrZero(file)) + def apply(file: File, lastModified: Long): ModifiedFileInfo = + FileModified(file.getAbsoluteFile, lastModified) } object exists extends Style { @@ -175,5 +181,9 @@ 
object FileInfo { val abs = file.getAbsoluteFile PlainFile(abs, abs.exists) } + def apply(file: File, exists: Boolean): PlainFileInfo = { + val abs = file.getAbsoluteFile + PlainFile(abs, exists) + } } } From 3a6aa577479f94f112a99e93c3d9ed252687fb77 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 23 Feb 2019 14:21:07 -0500 Subject: [PATCH 787/823] log4j 2.11.2 --- .../src/main/scala/sbt/internal/util/BufferedLogger.scala | 2 +- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 2 +- .../util-logging/src/main/scala/sbt/util/LogExchange.scala | 3 ++- project/Dependencies.scala | 4 ++-- 4 files changed, 6 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index f1948b2f8..dc22049c0 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -34,7 +34,7 @@ object BufferedAppender { * the level at the time 'play' is called. 
*/ class BufferedAppender private[BufferedAppender] (name: String, delegate: Appender) - extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true) { + extends AbstractAppender(name, null, PatternLayout.createDefaultLayout(), true, Array.empty) { private[this] val buffer = new ListBuffer[XLogEvent] private[this] var recording = false diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 28e6c657c..e18add319 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -333,7 +333,7 @@ class ConsoleAppender private[ConsoleAppender] ( ansiCodesSupported: Boolean, useFormat: Boolean, suppressedMessage: SuppressedTraceContext => Option[String] -) extends AbstractAppender(name, null, LogExchange.dummyLayout, true) { +) extends AbstractAppender(name, null, LogExchange.dummyLayout, true, Array.empty) { import scala.Console.{ BLUE, GREEN, RED, YELLOW } private val reset: String = { diff --git a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala index 2341a4395..98f7353ce 100644 --- a/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala +++ b/internal/util-logging/src/main/scala/sbt/util/LogExchange.scala @@ -124,7 +124,8 @@ sealed abstract class LogExchange { // CustomConsoleAppenderImpl.createAppender("Stdout", layout, null, null) appender.start config.addAppender(appender) - val asyncAppender: AsyncAppender = (AsyncAppender.newBuilder(): AsyncAppender.Builder) + val asyncAppender: AsyncAppender = AsyncAppender + .newBuilder() .setName("AsyncStdout") .setAppenderRefs(Array(AppenderRef.createAppenderRef("Stdout", XLevel.DEBUG, null))) .setBlocking(false) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 
d820356a3..74d510024 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -5,7 +5,7 @@ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala210 = "2.10.7" val scala211 = "2.11.12" - val scala212 = "2.12.7" + val scala212 = "2.12.8" private val ioVersion = "1.2.1" @@ -54,7 +54,7 @@ object Dependencies { "com.eed3si9n" %% "sjson-new-murmurhash" % contrabandSjsonNewVersion.value } - def log4jVersion = "2.11.1" + def log4jVersion = "2.11.2" val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion val disruptor = "com.lmax" % "disruptor" % "3.4.2" From 8c85744d67e0b5da328b8c9c031df0f96e0a3a6c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 7 Mar 2019 16:39:47 -0500 Subject: [PATCH 788/823] Use IO.Newline for stack trace --- .../scala/sbt/internal/util/StackTrace.scala | 33 +++++++++++++------ 1 file changed, 23 insertions(+), 10 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala index 58888e8be..37cc42400 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala @@ -3,6 +3,9 @@ */ package sbt.internal.util +import sbt.io.IO +import scala.collection.mutable.ListBuffer + object StackTrace { def isSbtClass(name: String) = name.startsWith("sbt.") || name.startsWith("xsbt.") @@ -18,9 +21,9 @@ object StackTrace { * where the line for the Throwable is counted plus one line for each stack element. * Less lines will be included if there are not enough stack elements. 
*/ - def trimmed(t: Throwable, d: Int): String = { + def trimmedLines(t: Throwable, d: Int): List[String] = { require(d >= 0) - val b = new StringBuilder() + val b = new ListBuffer[String]() def appendStackTrace(t: Throwable, first: Boolean): Unit = { @@ -33,16 +36,12 @@ object StackTrace { } def appendElement(e: StackTraceElement): Unit = { - b.append("\tat ") - b.append(e) - b.append('\n') + b.append("\tat " + e) () } - if (!first) - b.append("Caused by: ") - b.append(t) - b.append('\n') + if (!first) b.append("Caused by: " + t.toString) + else b.append(t.toString) val els = t.getStackTrace() var i = 0 @@ -59,7 +58,21 @@ object StackTrace { c = c.getCause() appendStackTrace(c, false) } - b.toString() + b.toList } + /** + * Return a printable representation of the stack trace associated + * with t. Information about t and its Throwable causes is included. + * The number of lines to be included for each Throwable is configured + * via d which should be greater than or equal to 0. + * + * - If d is 0, then all elements are included up to (but not including) + * the first element that comes from sbt. + * - If d is greater than 0, then up to that many lines are included, + * where the line for the Throwable is counted plus one line for each stack element. + * Less lines will be included if there are not enough stack elements. + */ + def trimmed(t: Throwable, d: Int): String = + trimmedLines(t, d).mkString(IO.Newline) } From 8215026bc3c45ff70d4d9c0be7c98bf5d2b85dc2 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 7 Mar 2019 16:42:10 -0500 Subject: [PATCH 789/823] Account for log line longer than the terminal width widthHolder will hold on to the terminal width if supplied by sbt. This avoids adding dependencies to JLine. 
--- .../sbt/internal/util/ConsoleAppender.scala | 19 ++++++++++++++++++- 1 file changed, 18 insertions(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index e18add319..50f6fb507 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -97,8 +97,14 @@ class ConsoleLogger private[ConsoleLogger] ( object ConsoleAppender { private[sbt] final val ScrollUp = "\u001B[S" + private[sbt] def cursorUp(n: Int): String = s"\u001B[${n}A" + private[sbt] def cursorDown(n: Int): String = s"\u001B[${n}B" + private[sbt] def scrollUp(n: Int): String = s"\u001B[${n}S" private[sbt] final val DeleteLine = "\u001B[2K" private[sbt] final val CursorLeft1000 = "\u001B[1000D" + private[this] val widthHolder: AtomicInteger = new AtomicInteger + private[sbt] def terminalWidth = widthHolder.get + private[sbt] def setTerminalWidth(n: Int): Unit = widthHolder.set(n) /** Hide stack trace altogether. 
*/ val noSuppressedMessage = (_: SuppressedTraceContext) => None @@ -463,7 +469,18 @@ class ConsoleAppender private[ConsoleAppender] ( if (!useFormat || !ansiCodesSupported) { out.println(EscHelpers.removeEscapeSequences(msg)) } else if (ConsoleAppender.showProgress) { - out.print(s"$ScrollUp$DeleteLine$msg${CursorLeft1000}") + val textLength = msg.length - 5 + val scrollNum = + if (ConsoleAppender.terminalWidth == 0) 1 + else (textLength / ConsoleAppender.terminalWidth) + 1 + if (scrollNum > 1) { + out.print(s"${cursorDown(1)}$DeleteLine" * (scrollNum - 1) + s"${cursorUp(scrollNum - 1)}") + } + out.print( + s"$ScrollUp$DeleteLine$msg${CursorLeft1000}" + ( + if (scrollNum <= 1) "" + else scrollUp(scrollNum - 1) + )) out.flush() } else { out.println(msg) From d496a5dff559a14d689a7138069164e764cec1ea Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 7 Mar 2019 16:49:54 -0500 Subject: [PATCH 790/823] Make showProgress configurable --- .../sbt/internal/util/ConsoleAppender.scala | 20 ++++--------------- 1 file changed, 4 insertions(+), 16 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 50f6fb507..741bafc4f 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -3,7 +3,7 @@ package sbt.internal.util import sbt.util._ import java.io.{ PrintStream, PrintWriter } import java.util.Locale -import java.util.concurrent.atomic.AtomicInteger +import java.util.concurrent.atomic.{ AtomicBoolean, AtomicInteger } import org.apache.logging.log4j.{ Level => XLevel } import org.apache.logging.log4j.message.{ Message, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } @@ -105,6 +105,9 @@ object ConsoleAppender { private[this] val widthHolder: AtomicInteger = new 
AtomicInteger private[sbt] def terminalWidth = widthHolder.get private[sbt] def setTerminalWidth(n: Int): Unit = widthHolder.set(n) + private[this] val showProgressHolder: AtomicBoolean = new AtomicBoolean(false) + def setShowProgress(b: Boolean): Unit = showProgressHolder.set(b) + def showProgress: Boolean = showProgressHolder.get /** Hide stack trace altogether. */ val noSuppressedMessage = (_: SuppressedTraceContext) => None @@ -141,21 +144,6 @@ object ConsoleAppender { } } - /** - * Indicates whether the super shell is enabled. - */ - lazy val showProgress: Boolean = - formatEnabledInEnv && sys.props - .get("sbt.progress") - .flatMap({ s => - parseLogOption(s) match { - case LogOption.Always => Some(true) - case LogOption.Never => Some(false) - case _ => None - } - }) - .getOrElse(true) - private[sbt] def parseLogOption(s: String): LogOption = s.toLowerCase match { case "always" => LogOption.Always From e4612c858c3411e9b4c667a6f1a00daef2a6d462 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 7 Mar 2019 18:19:12 -0500 Subject: [PATCH 791/823] switch to official sbt-scalafmt --- .travis.yml | 2 +- build.sbt | 7 +------ project/plugins.sbt | 3 ++- 3 files changed, 4 insertions(+), 8 deletions(-) diff --git a/.travis.yml b/.travis.yml index 8436d27e9..3233c1a4c 100644 --- a/.travis.yml +++ b/.travis.yml @@ -8,7 +8,7 @@ scala: script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M ++$TRAVIS_SCALA_VERSION mimaReportBinaryIssues - scalafmt::test test:scalafmt::test + scalafmtCheck whitesourceCheckPolicies test diff --git a/build.sbt b/build.sbt index 8b95d524e..1c0b3106f 100644 --- a/build.sbt +++ b/build.sbt @@ -55,15 +55,10 @@ lazy val utilRoot: Project = (project in file(".")) homepage := Some(url("https://github.com/sbt/util")), description := "Util module for sbt", scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), - scalafmtOnCompile in Sbt := false, )), commonSettings, name := "Util Root", - publish := 
{}, - publishLocal := {}, - publishArtifact in Compile := false, - publishArtifact in Test := false, - publishArtifact := false, + publish / skip := true, customCommands ) diff --git a/project/plugins.sbt b/project/plugins.sbt index 20afa9ff8..8e37e2155 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,5 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.8") +addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") +addSbtPlugin("com.geirsson" % "sbt-scalafmt" % "1.5.1") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.3.4") From 7431dbdf1a16cfa46f8000ba1d1ea4ed814879d9 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Thu, 4 Apr 2019 00:53:44 -0400 Subject: [PATCH 792/823] throw error on deserialization error --- .../sbt/internal/util/EmptyCacheError.scala | 5 +++++ util-cache/src/main/scala/sbt/util/Input.scala | 7 ++++++- .../src/main/scala/sbt/util/Tracked.scala | 16 +++++++++++++--- 3 files changed, 24 insertions(+), 4 deletions(-) create mode 100644 util-cache/src/main/scala/sbt/internal/util/EmptyCacheError.scala diff --git a/util-cache/src/main/scala/sbt/internal/util/EmptyCacheError.scala b/util-cache/src/main/scala/sbt/internal/util/EmptyCacheError.scala new file mode 100644 index 000000000..5d8b97f20 --- /dev/null +++ b/util-cache/src/main/scala/sbt/internal/util/EmptyCacheError.scala @@ -0,0 +1,5 @@ +package sbt +package internal +package util + +class EmptyCacheError extends RuntimeException diff --git a/util-cache/src/main/scala/sbt/util/Input.scala b/util-cache/src/main/scala/sbt/util/Input.scala index 6f1d895e8..2a011ed63 100644 --- a/util-cache/src/main/scala/sbt/util/Input.scala +++ b/util-cache/src/main/scala/sbt/util/Input.scala @@ -4,6 +4,7 @@ import java.io.{ Closeable, InputStream } import scala.util.control.NonFatal import sjsonnew.{ IsoString, JsonReader, SupportConverter } import sbt.io.{ 
IO, Using } +import sbt.internal.util.EmptyCacheError trait Input extends Closeable { def read[T: JsonReader](): T @@ -28,7 +29,11 @@ class PlainInput[J: IsoString](input: InputStream, converter: SupportConverter[J } } - def read[T: JsonReader]() = converter.fromJson(isoFormat.from(readFully())).get + def read[T: JsonReader](): T = { + val str = readFully() + if (str == "") throw new EmptyCacheError() + else converter.fromJson(isoFormat.from(str)).get + } def close() = input.close() } diff --git a/util-tracking/src/main/scala/sbt/util/Tracked.scala b/util-tracking/src/main/scala/sbt/util/Tracked.scala index c6286a832..238101e07 100644 --- a/util-tracking/src/main/scala/sbt/util/Tracked.scala +++ b/util-tracking/src/main/scala/sbt/util/Tracked.scala @@ -8,6 +8,7 @@ import scala.util.{ Failure, Try, Success } import java.io.File import sbt.io.IO import sbt.io.syntax._ +import sbt.internal.util.EmptyCacheError import sjsonnew.JsonFormat import sjsonnew.support.murmurhash.Hasher @@ -178,7 +179,9 @@ object Tracked { def save(store: CacheStore, value: I): Unit = { Hasher.hash(value) match { case Success(keyHash) => store.write[Long](keyHash.toLong) - case Failure(_) => () + case Failure(e) => + if (isStrictMode) throw e + else () } } @@ -187,12 +190,19 @@ object Tracked { case Success(prev: Long) => Hasher.hash(value) match { case Success(keyHash: Int) => keyHash.toLong != prev - case Failure(_) => true + case Failure(e) => + if (isStrictMode) throw e + else true } - case Failure(_) => true + case Failure(_: EmptyCacheError) => true + case Failure(e) => + if (isStrictMode) throw e + else true } } + private[sbt] def isStrictMode: Boolean = + java.lang.Boolean.getBoolean("sbt.strict") } trait Tracked { From 2ac7501c7ce2ad5f536af67100704f3395c5ae33 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 20 Apr 2019 23:20:19 -0400 Subject: [PATCH 793/823] sbt-scalafmt 2.0.0 --- .scalafmt.conf | 5 ++++- .travis.yml | 2 +- project/build.properties | 2 +- project/plugins.sbt | 
2 +- 4 files changed, 7 insertions(+), 4 deletions(-) diff --git a/.scalafmt.conf b/.scalafmt.conf index e4ab36511..5b87db3b7 100644 --- a/.scalafmt.conf +++ b/.scalafmt.conf @@ -1,6 +1,7 @@ +version = 2.0.0-RC5 maxColumn = 100 project.git = true -project.excludeFilters = [ /sbt-test/, /input_sources/, /contraband-scala/ ] +project.excludeFilters = [ "\\Wsbt-test\\W", "\\Winput_sources\\W", "\\Wcontraband-scala\\W" ] # http://docs.scala-lang.org/style/scaladoc.html recommends the JavaDoc style. # scala/scala is written that way too https://github.com/scala/scala/blob/v2.12.2/src/library/scala/Predef.scala @@ -8,3 +9,5 @@ docstrings = JavaDoc # This also seems more idiomatic to include whitespace in import x.{ yyy } spaces.inImportCurlyBraces = true + +trailingCommas = preserve diff --git a/.travis.yml b/.travis.yml index 3233c1a4c..d6dc44b30 100644 --- a/.travis.yml +++ b/.travis.yml @@ -8,7 +8,7 @@ scala: script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M ++$TRAVIS_SCALA_VERSION mimaReportBinaryIssues - scalafmtCheck + scalafmtCheckAll whitesourceCheckPolicies test diff --git a/project/build.properties b/project/build.properties index 0cd8b0798..c0bab0494 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.2.3 +sbt.version=1.2.8 diff --git a/project/plugins.sbt b/project/plugins.sbt index 8e37e2155..70d7318b3 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,5 +1,5 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") -addSbtPlugin("com.geirsson" % "sbt-scalafmt" % "1.5.1") +addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.0.0") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.3.4") From 98ec0075f4c4b9367e1acfc2682d48d9d032d34b Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 20 Apr 2019 23:23:13 -0400 Subject: [PATCH 794/823] apply formatting --- 
.../sbt/internal/util/ErrorHandling.scala | 12 +++- .../sbt/internal/util/BufferedLogger.scala | 24 +++++-- .../sbt/internal/util/ConsoleAppender.scala | 6 +- .../sbt/internal/util/ManagedLogger.scala | 6 +- .../main/scala/sbt/util/InterfaceUtil.scala | 68 ++++++++++--------- .../sbt/internal/scripted/ScriptRunner.scala | 8 ++- .../sbt/internal/scripted/ScriptedTests.scala | 29 ++++---- .../src/main/scala/sbt/util/Cache.scala | 2 +- .../main/scala/sbt/util/FileFunction.scala | 17 +++-- 9 files changed, 105 insertions(+), 67 deletions(-) diff --git a/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala b/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala index f9b101453..e0e90a6d7 100644 --- a/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala +++ b/internal/util-control/src/main/scala/sbt/internal/util/ErrorHandling.scala @@ -7,20 +7,26 @@ import java.io.IOException object ErrorHandling { def translate[T](msg: => String)(f: => T) = - try { f } catch { + try { + f + } catch { case e: IOException => throw new TranslatedIOException(msg + e.toString, e) case e: Exception => throw new TranslatedException(msg + e.toString, e) } def wideConvert[T](f: => T): Either[Throwable, T] = - try { Right(f) } catch { + try { + Right(f) + } catch { case ex @ (_: Exception | _: StackOverflowError) => Left(ex) case err @ (_: ThreadDeath | _: VirtualMachineError) => throw err case x: Throwable => Left(x) } def convert[T](f: => T): Either[Exception, T] = - try { Right(f) } catch { case e: Exception => Left(e) } + try { + Right(f) + } catch { case e: Exception => Left(e) } def reducedToString(e: Throwable): String = if (e.getClass == classOf[RuntimeException]) { diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala index dc22049c0..ae5cb789a 100644 --- 
a/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/BufferedLogger.scala @@ -50,7 +50,11 @@ class BufferedAppender private[BufferedAppender] (name: String, delegate: Append def record() = synchronized { recording = true } def buffer[T](f: => T): T = { record() - try { f } finally { stopQuietly() } + try { + f + } finally { + stopQuietly() + } } def bufferQuietly[T](f: => T): T = { record() @@ -60,7 +64,11 @@ class BufferedAppender private[BufferedAppender] (name: String, delegate: Append result } catch { case e: Throwable => stopQuietly(); throw e } } - def stopQuietly() = synchronized { try { stopBuffer() } catch { case _: Exception => () } } + def stopQuietly() = synchronized { + try { + stopBuffer() + } catch { case _: Exception => () } + } /** * Flushes the buffer to the delegate logger. This method calls logAll on the delegate @@ -99,7 +107,11 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { def record() = synchronized { recording = true } def buffer[T](f: => T): T = { record() - try { f } finally { stopQuietly() } + try { + f + } finally { + stopQuietly() + } } def bufferQuietly[T](f: => T): T = { record() @@ -109,7 +121,11 @@ class BufferedLogger(delegate: AbstractLogger) extends BasicLogger { result } catch { case e: Throwable => stopQuietly(); throw e } } - def stopQuietly() = synchronized { try { stop() } catch { case _: Exception => () } } + def stopQuietly() = synchronized { + try { + stop() + } catch { case _: Exception => () } + } /** * Flushes the buffer to the delegate logger. 
This method calls logAll on the delegate diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 741bafc4f..6cd21e55c 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -438,7 +438,8 @@ class ConsoleAppender private[ConsoleAppender] ( out.lockObject.synchronized { message.linesIterator.foreach { line => val builder = new java.lang.StringBuilder( - labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3) + labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3 + ) def fmted(a: String, b: String) = builder.append(reset).append(a).append(b).append(reset) builder.append(reset).append('[') fmted(labelColor, label) @@ -468,7 +469,8 @@ class ConsoleAppender private[ConsoleAppender] ( s"$ScrollUp$DeleteLine$msg${CursorLeft1000}" + ( if (scrollNum <= 1) "" else scrollUp(scrollNum - 1) - )) + ) + ) out.flush() } else { out.println(msg) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index 3add4cd04..b883c43d1 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -28,8 +28,10 @@ class ManagedLogger( private val SuccessEventTag = scala.reflect.runtime.universe.typeTag[SuccessEvent] // send special event for success since it's not a real log level override def success(message: => String): Unit = { - infoEvent[SuccessEvent](SuccessEvent(message))(implicitly[JsonFormat[SuccessEvent]], - SuccessEventTag) + infoEvent[SuccessEvent](SuccessEvent(message))( + implicitly[JsonFormat[SuccessEvent]], + SuccessEventTag + ) } def 
registerStringCodec[A: ShowLines: TypeTag]: Unit = { diff --git a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala index 1de667f5b..0b1f5bf8c 100644 --- a/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala +++ b/internal/util-logging/src/main/scala/sbt/util/InterfaceUtil.scala @@ -46,19 +46,21 @@ object InterfaceUtil { sourcePath0: Option[String], sourceFile0: Option[File] ): Position = - position(line0, - content, - offset0, - pointer0, - pointerSpace0, - sourcePath0, - sourceFile0, - None, - None, - None, - None, - None, - None) + position( + line0, + content, + offset0, + pointer0, + pointerSpace0, + sourcePath0, + sourceFile0, + None, + None, + None, + None, + None, + None + ) def position( line0: Option[Integer], @@ -75,29 +77,33 @@ object InterfaceUtil { endLine0: Option[Integer], endColumn0: Option[Integer] ): Position = - new ConcretePosition(line0, - content, - offset0, - pointer0, - pointerSpace0, - sourcePath0, - sourceFile0, - startOffset0, - endOffset0, - startLine0, - startColumn0, - endLine0, - endColumn0) + new ConcretePosition( + line0, + content, + offset0, + pointer0, + pointerSpace0, + sourcePath0, + sourceFile0, + startOffset0, + endOffset0, + startLine0, + startColumn0, + endLine0, + endColumn0 + ) @deprecated("Use the overload of this method with more arguments", "1.2.2") def problem(cat: String, pos: Position, msg: String, sev: Severity): Problem = problem(cat, pos, msg, sev, None) - def problem(cat: String, - pos: Position, - msg: String, - sev: Severity, - rendered: Option[String]): Problem = + def problem( + cat: String, + pos: Position, + msg: String, + sev: Severity, + rendered: Option[String] + ): Problem = new ConcreteProblem(cat, pos, msg, sev, rendered) private final class ConcreteT2[A1, A2](a1: A1, a2: A2) extends T2[A1, A2] { diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala 
b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala index a15458dc4..e92de6e39 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptRunner.scala @@ -15,7 +15,9 @@ class ScriptRunner { def processStatement(handler: StatementHandler, statement: Statement): Unit = { val state = states(handler).asInstanceOf[handler.State] val nextState = - try { Right(handler(statement.command, statement.arguments, state)) } catch { + try { + Right(handler(statement.command, statement.arguments, state)) + } catch { case e: Exception => Left(e) } nextState match { @@ -42,7 +44,9 @@ class ScriptRunner { statements foreach (Function.tupled(processStatement)) } finally { for (handler <- handlers; state <- states.get(handler)) { - try { handler.finish(state.asInstanceOf[handler.State]) } catch { case e: Exception => () } + try { + handler.finish(state.asInstanceOf[handler.State]) + } catch { case e: Exception => () } } } } diff --git a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala index e88d4bb16..eec88b150 100644 --- a/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala +++ b/internal/util-scripted/src/main/scala/sbt/internal/scripted/ScriptedTests.scala @@ -75,21 +75,22 @@ final class ScriptedTests( val g = groupDir.getName val n = nme.getName val str = s"$g / $n" - () => - { - println("Running " + str) - testResources.readWriteResourceDirectory(g, n) { testDirectory => - val disabled = new File(testDirectory, "disabled").isFile - if (disabled) { - log.info("D " + str + " [DISABLED]") - None - } else { - try { scriptedTest(str, testDirectory, prescripted, log); None } catch { - case _: TestException | _: PendingTestSuccessException => Some(str) - } + () => { + println("Running " + str) + 
testResources.readWriteResourceDirectory(g, n) { testDirectory => + val disabled = new File(testDirectory, "disabled").isFile + if (disabled) { + log.info("D " + str + " [DISABLED]") + None + } else { + try { + scriptedTest(str, testDirectory, prescripted, log); None + } catch { + case _: TestException | _: PendingTestSuccessException => Some(str) } } } + } } } @@ -149,7 +150,9 @@ final class ScriptedTests( case e: Exception => testFailed() if (!pending) throw e - } finally { buffered.clearBuffer() } + } finally { + buffered.clearBuffer() + } } } diff --git a/util-cache/src/main/scala/sbt/util/Cache.scala b/util-cache/src/main/scala/sbt/util/Cache.scala index 8e4a4a7de..d0b03033d 100644 --- a/util-cache/src/main/scala/sbt/util/Cache.scala +++ b/util-cache/src/main/scala/sbt/util/Cache.scala @@ -60,7 +60,7 @@ object Cache { val result = default(key) update(result) result - } + } def debug[I](label: String, cache: SingletonCache[I]): SingletonCache[I] = new SingletonCache[I] { diff --git a/util-tracking/src/main/scala/sbt/util/FileFunction.scala b/util-tracking/src/main/scala/sbt/util/FileFunction.scala index 0aa58ca4c..1a7e033ef 100644 --- a/util-tracking/src/main/scala/sbt/util/FileFunction.scala +++ b/util-tracking/src/main/scala/sbt/util/FileFunction.scala @@ -137,16 +137,15 @@ object FileFunction { ): Set[File] => Set[File] = { lazy val inCache = Difference.inputs(storeFactory.make("in-cache"), inStyle) lazy val outCache = Difference.outputs(storeFactory.make("out-cache"), outStyle) - inputs => - { - inCache(inputs) { inReport => - outCache { outReport => - if (inReport.modified.isEmpty && outReport.modified.isEmpty) - outReport.checked - else - action(inReport, outReport) - } + inputs => { + inCache(inputs) { inReport => + outCache { outReport => + if (inReport.modified.isEmpty && outReport.modified.isEmpty) + outReport.checked + else + action(inReport, outReport) } } + } } } From e28e052b5bd7ced9e55ccc5665fb44e4009f3ce0 Mon Sep 17 00:00:00 2001 From: Eugene 
Yokota Date: Thu, 4 Apr 2019 22:27:33 -0400 Subject: [PATCH 795/823] move super shell rendering to ConsoleAppender Ref https://github.com/sbt/sbt/issues/4583 This moves the super shell rendering to ConsoleAppender with several improvements. Instead of scrolling up, supershell is now changed to normal scrolling down, with more traditional cursor position. Before printing out the logs, last known progress reports are wiped out. In addition, there's now 5 lines of blank lines to accomodate for `println(...)` by tasks. --- build.sbt | 2 + .../sbt/internal/util/ProgressEvent.scala | 59 +++++++++++++++ .../sbt/internal/util/ProgressItem.scala | 41 +++++++++++ .../util/codec/AbstractEntryFormats.scala | 4 +- .../internal/util/codec/JsonProtocol.scala | 2 + .../util/codec/ProgressEventFormats.scala | 35 +++++++++ .../util/codec/ProgressItemFormats.scala | 29 ++++++++ .../util/codec/TaskProgressFormats.scala | 29 ++++++++ .../src/main/contraband/logging.contra | 17 +++++ .../sbt/internal/util/ConsoleAppender.scala | 73 +++++++++++++------ 10 files changed, 266 insertions(+), 25 deletions(-) create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressEvent.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressItem.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala create mode 100644 internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TaskProgressFormats.scala diff --git a/build.sbt b/build.sbt index 1c0b3106f..e1f77c2b9 100644 --- a/build.sbt +++ b/build.sbt @@ -124,6 +124,8 @@ lazy val utilLogging = (project in internalPath / "util-logging") exclude[ReversedMissingMethodProblem]("sbt.internal.util.ConsoleOut.flush"), // This affects Scala 2.11 only it seems, so it's ok? 
exclude[InheritedNewAbstractMethodProblem]("sbt.internal.util.codec.JsonProtocol.LogOptionFormat"), + exclude[InheritedNewAbstractMethodProblem]("sbt.internal.util.codec.JsonProtocol.ProgressItemFormat"), + exclude[InheritedNewAbstractMethodProblem]("sbt.internal.util.codec.JsonProtocol.ProgressEventFormat"), ), ) .configure(addSbtIO) diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressEvent.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressEvent.scala new file mode 100644 index 000000000..9c0c09368 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressEvent.scala @@ -0,0 +1,59 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. + */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +/** used by super shell */ +final class ProgressEvent private ( + val level: String, + val items: Vector[sbt.internal.util.ProgressItem], + val lastTaskCount: Option[Int], + channelName: Option[String], + execId: Option[String]) extends sbt.internal.util.AbstractEntry(channelName, execId) with Serializable { + + + + override def equals(o: Any): Boolean = o match { + case x: ProgressEvent => (this.level == x.level) && (this.items == x.items) && (this.lastTaskCount == x.lastTaskCount) && (this.channelName == x.channelName) && (this.execId == x.execId) + case _ => false + } + override def hashCode: Int = { + 37 * (37 * (37 * (37 * (37 * (37 * (17 + "sbt.internal.util.ProgressEvent".##) + level.##) + items.##) + lastTaskCount.##) + channelName.##) + execId.##) + } + override def toString: String = { + "ProgressEvent(" + level + ", " + items + ", " + lastTaskCount + ", " + channelName + ", " + execId + ")" + } + private[this] def copy(level: String = level, items: Vector[sbt.internal.util.ProgressItem] = items, lastTaskCount: Option[Int] = lastTaskCount, channelName: Option[String] = channelName, execId: Option[String] = execId): 
ProgressEvent = { + new ProgressEvent(level, items, lastTaskCount, channelName, execId) + } + def withLevel(level: String): ProgressEvent = { + copy(level = level) + } + def withItems(items: Vector[sbt.internal.util.ProgressItem]): ProgressEvent = { + copy(items = items) + } + def withLastTaskCount(lastTaskCount: Option[Int]): ProgressEvent = { + copy(lastTaskCount = lastTaskCount) + } + def withLastTaskCount(lastTaskCount: Int): ProgressEvent = { + copy(lastTaskCount = Option(lastTaskCount)) + } + def withChannelName(channelName: Option[String]): ProgressEvent = { + copy(channelName = channelName) + } + def withChannelName(channelName: String): ProgressEvent = { + copy(channelName = Option(channelName)) + } + def withExecId(execId: Option[String]): ProgressEvent = { + copy(execId = execId) + } + def withExecId(execId: String): ProgressEvent = { + copy(execId = Option(execId)) + } +} +object ProgressEvent { + + def apply(level: String, items: Vector[sbt.internal.util.ProgressItem], lastTaskCount: Option[Int], channelName: Option[String], execId: Option[String]): ProgressEvent = new ProgressEvent(level, items, lastTaskCount, channelName, execId) + def apply(level: String, items: Vector[sbt.internal.util.ProgressItem], lastTaskCount: Int, channelName: String, execId: String): ProgressEvent = new ProgressEvent(level, items, Option(lastTaskCount), Option(channelName), Option(execId)) +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressItem.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressItem.scala new file mode 100644 index 000000000..f7b30dc6f --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/ProgressItem.scala @@ -0,0 +1,41 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util +/** + * used by super shell + * @param name name of a task + * @param elapsedMicros current elapsed time in micro seconds + */ +final class ProgressItem private ( + val name: String, + val elapsedMicros: Long) extends Serializable { + + + + override def equals(o: Any): Boolean = o match { + case x: ProgressItem => (this.name == x.name) && (this.elapsedMicros == x.elapsedMicros) + case _ => false + } + override def hashCode: Int = { + 37 * (37 * (37 * (17 + "sbt.internal.util.ProgressItem".##) + name.##) + elapsedMicros.##) + } + override def toString: String = { + "ProgressItem(" + name + ", " + elapsedMicros + ")" + } + private[this] def copy(name: String = name, elapsedMicros: Long = elapsedMicros): ProgressItem = { + new ProgressItem(name, elapsedMicros) + } + def withName(name: String): ProgressItem = { + copy(name = name) + } + def withElapsedMicros(elapsedMicros: Long): ProgressItem = { + copy(elapsedMicros = elapsedMicros) + } +} +object ProgressItem { + + def apply(name: String, elapsedMicros: Long): ProgressItem = new ProgressItem(name, elapsedMicros) +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala index 55784f9ac..37b4cfc91 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/AbstractEntryFormats.scala @@ -6,6 +6,6 @@ package sbt.internal.util.codec import _root_.sjsonnew.JsonFormat -trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats with sbt.internal.util.codec.TraceEventFormats => -implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat2[sbt.internal.util.AbstractEntry, 
sbt.internal.util.StringEvent, sbt.internal.util.TraceEvent]("type") +trait AbstractEntryFormats { self: sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats with sbt.internal.util.codec.TraceEventFormats with sbt.internal.util.codec.ProgressItemFormats with sbt.internal.util.codec.ProgressEventFormats => +implicit lazy val AbstractEntryFormat: JsonFormat[sbt.internal.util.AbstractEntry] = flatUnionFormat3[sbt.internal.util.AbstractEntry, sbt.internal.util.StringEvent, sbt.internal.util.TraceEvent, sbt.internal.util.ProgressEvent]("type") } diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala index 15e4d9cb2..54bc48141 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/JsonProtocol.scala @@ -7,6 +7,8 @@ package sbt.internal.util.codec trait JsonProtocol extends sjsonnew.BasicJsonProtocol with sbt.internal.util.codec.StringEventFormats with sbt.internal.util.codec.TraceEventFormats + with sbt.internal.util.codec.ProgressItemFormats + with sbt.internal.util.codec.ProgressEventFormats with sbt.internal.util.codec.AbstractEntryFormats with sbt.internal.util.codec.SuccessEventFormats with sbt.internal.util.codec.LogOptionFormats diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala new file mode 100644 index 000000000..4d836c02d --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala @@ -0,0 +1,35 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } +trait ProgressEventFormats { self: sbt.internal.util.codec.ProgressItemFormats with sjsonnew.BasicJsonProtocol => +implicit lazy val ProgressEventFormat: JsonFormat[sbt.internal.util.ProgressEvent] = new JsonFormat[sbt.internal.util.ProgressEvent] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ProgressEvent = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val level = unbuilder.readField[String]("level") + val items = unbuilder.readField[Vector[sbt.internal.util.ProgressItem]]("items") + val lastTaskCount = unbuilder.readField[Option[Int]]("lastTaskCount") + val channelName = unbuilder.readField[Option[String]]("channelName") + val execId = unbuilder.readField[Option[String]]("execId") + unbuilder.endObject() + sbt.internal.util.ProgressEvent(level, items, lastTaskCount, channelName, execId) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: sbt.internal.util.ProgressEvent, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("level", obj.level) + builder.addField("items", obj.items) + builder.addField("lastTaskCount", obj.lastTaskCount) + builder.addField("channelName", obj.channelName) + builder.addField("execId", obj.execId) + builder.endObject() + } +} +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala new file mode 100644 index 000000000..3aac75e91 --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala @@ -0,0 +1,29 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } +trait ProgressItemFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val ProgressItemFormat: JsonFormat[sbt.internal.util.ProgressItem] = new JsonFormat[sbt.internal.util.ProgressItem] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ProgressItem = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val name = unbuilder.readField[String]("name") + val elapsedMicros = unbuilder.readField[Long]("elapsedMicros") + unbuilder.endObject() + sbt.internal.util.ProgressItem(name, elapsedMicros) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: sbt.internal.util.ProgressItem, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("name", obj.name) + builder.addField("elapsedMicros", obj.elapsedMicros) + builder.endObject() + } +} +} diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TaskProgressFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TaskProgressFormats.scala new file mode 100644 index 000000000..fa79adffc --- /dev/null +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TaskProgressFormats.scala @@ -0,0 +1,29 @@ +/** + * This code is generated using [[http://www.scala-sbt.org/contraband/ sbt-contraband]]. 
+ */ + +// DO NOT EDIT MANUALLY +package sbt.internal.util.codec +import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } +trait TaskProgressFormats { self: sjsonnew.BasicJsonProtocol => +implicit lazy val TaskProgressFormat: JsonFormat[sbt.internal.util.TaskProgress] = new JsonFormat[sbt.internal.util.TaskProgress] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.TaskProgress = { + jsOpt match { + case Some(js) => + unbuilder.beginObject(js) + val name = unbuilder.readField[String]("name") + val elapsedMicros = unbuilder.readField[Option[Long]]("elapsedMicros") + unbuilder.endObject() + sbt.internal.util.TaskProgress(name, elapsedMicros) + case None => + deserializationError("Expected JsObject but found None") + } + } + override def write[J](obj: sbt.internal.util.TaskProgress, builder: Builder[J]): Unit = { + builder.beginObject() + builder.addField("name", obj.name) + builder.addField("elapsedMicros", obj.elapsedMicros) + builder.endObject() + } +} +} diff --git a/internal/util-logging/src/main/contraband/logging.contra b/internal/util-logging/src/main/contraband/logging.contra index 73d0b1a56..34fa75c24 100644 --- a/internal/util-logging/src/main/contraband/logging.contra +++ b/internal/util-logging/src/main/contraband/logging.contra @@ -22,6 +22,23 @@ type TraceEvent implements sbt.internal.util.AbstractEntry { execId: String } +## used by super shell +type ProgressEvent implements sbt.internal.util.AbstractEntry { + level: String! + items: [sbt.internal.util.ProgressItem] + lastTaskCount: Int + channelName: String + execId: String +} + +## used by super shell +type ProgressItem { + ## name of a task + name: String! + ## current elapsed time in micro seconds + elapsedMicros: Long! +} + type SuccessEvent { message: String! 
} diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 6cd21e55c..bc30045c8 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -96,18 +96,19 @@ class ConsoleLogger private[ConsoleLogger] ( } object ConsoleAppender { - private[sbt] final val ScrollUp = "\u001B[S" private[sbt] def cursorUp(n: Int): String = s"\u001B[${n}A" private[sbt] def cursorDown(n: Int): String = s"\u001B[${n}B" private[sbt] def scrollUp(n: Int): String = s"\u001B[${n}S" private[sbt] final val DeleteLine = "\u001B[2K" private[sbt] final val CursorLeft1000 = "\u001B[1000D" + private[sbt] final val CursorDown1 = cursorDown(1) private[this] val widthHolder: AtomicInteger = new AtomicInteger private[sbt] def terminalWidth = widthHolder.get private[sbt] def setTerminalWidth(n: Int): Unit = widthHolder.set(n) private[this] val showProgressHolder: AtomicBoolean = new AtomicBoolean(false) def setShowProgress(b: Boolean): Unit = showProgressHolder.set(b) def showProgress: Boolean = showProgressHolder.get + private[sbt] val lastTaskCount = new AtomicInteger(0) /** Hide stack trace altogether. */ val noSuppressedMessage = (_: SuppressedTraceContext) => None @@ -454,23 +455,18 @@ class ConsoleAppender private[ConsoleAppender] ( appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) } + // leave some blank lines for tasks that might use println(...) 
+ private val blankZone = 5 private def write(msg: String): Unit = { if (!useFormat || !ansiCodesSupported) { out.println(EscHelpers.removeEscapeSequences(msg)) } else if (ConsoleAppender.showProgress) { - val textLength = msg.length - 5 - val scrollNum = - if (ConsoleAppender.terminalWidth == 0) 1 - else (textLength / ConsoleAppender.terminalWidth) + 1 - if (scrollNum > 1) { - out.print(s"${cursorDown(1)}$DeleteLine" * (scrollNum - 1) + s"${cursorUp(scrollNum - 1)}") + val clearNum = lastTaskCount.get + blankZone + if (clearNum > 1) { + deleteConsoleLines(clearNum) + out.print(s"${cursorUp(clearNum)}") } - out.print( - s"$ScrollUp$DeleteLine$msg${CursorLeft1000}" + ( - if (scrollNum <= 1) "" - else scrollUp(scrollNum - 1) - ) - ) + out.println(msg) out.flush() } else { out.println(msg) @@ -497,19 +493,50 @@ class ConsoleAppender private[ConsoleAppender] ( codec.showLines(te).toVector foreach { appendLog(Level.Error, _) } } + private def appendProgressEvent(pe: ProgressEvent): Unit = + if (ConsoleAppender.showProgress) { + out.lockObject.synchronized { + deleteConsoleLines(blankZone) + val currentTasksCount = pe.items.size + val ltc = pe.lastTaskCount.getOrElse(0) + val sorted = pe.items.sortBy(_.name).sortBy(x => -x.elapsedMicros) + sorted foreach { item => + val elapsed = item.elapsedMicros / 1000000L + out.println(s"$DeleteLine | => ${item.name} ${elapsed}s") + } + if (ltc > currentTasksCount) deleteConsoleLines(ltc - currentTasksCount) + else () + out.print(cursorUp(math.max(currentTasksCount, ltc) + blankZone)) + out.flush() + lastTaskCount.set(ltc) + } + } else () + + private def deleteConsoleLines(n: Int): Unit = { + (1 to n) foreach { _ => + out.println(DeleteLine) + } + } + private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = { def appendEvent(oe: ObjectEvent[_]): Unit = { val contentType = oe.contentType - if (contentType == "sbt.internal.util.TraceEvent") { - appendTraceEvent(oe.message.asInstanceOf[TraceEvent]) - } else - 
LogExchange.stringCodec[AnyRef](contentType) match { - case Some(codec) if contentType == "sbt.internal.util.SuccessEvent" => - codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { success(_) } - case Some(codec) => - codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach (appendLog(level, _)) - case _ => appendLog(level, oe.message.toString) - } + contentType match { + case "sbt.internal.util.TraceEvent" => appendTraceEvent(oe.message.asInstanceOf[TraceEvent]) + case "sbt.internal.util.ProgressEvent" => + appendProgressEvent(oe.message.asInstanceOf[ProgressEvent]) + case _ => + LogExchange.stringCodec[AnyRef](contentType) match { + case Some(codec) if contentType == "sbt.internal.util.SuccessEvent" => + codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach { success(_) } + case Some(codec) => + codec.showLines(oe.message.asInstanceOf[AnyRef]).toVector foreach (appendLog( + level, + _ + )) + case _ => appendLog(level, oe.message.toString) + } + } } o match { From 74fb5cde9a71b4513d60962a08753064b19037fb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 4 May 2019 01:17:44 -0400 Subject: [PATCH 796/823] Bump sbt-whitesource --- project/build.properties | 2 +- project/plugins.sbt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/project/build.properties b/project/build.properties index c0bab0494..c59667ce9 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.2.8 +sbt.version=1.3.0-M3 diff --git a/project/plugins.sbt b/project/plugins.sbt index 70d7318b3..b8ef994f4 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,5 +1,5 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.0.0") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") -addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.9") +addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.14") addSbtPlugin("com.timushev.sbt" % 
"sbt-updates" % "0.3.4") From 96c91bb2a13d6c4ca13738da44dd9aea6e60b77d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 4 May 2019 02:23:55 -0400 Subject: [PATCH 797/823] Drop 2.10 cross building --- .travis.yml | 2 -- build.sbt | 2 +- project/Dependencies.scala | 1 - 3 files changed, 1 insertion(+), 4 deletions(-) diff --git a/.travis.yml b/.travis.yml index d6dc44b30..391862182 100644 --- a/.travis.yml +++ b/.travis.yml @@ -16,8 +16,6 @@ matrix: include: - scala: 2.12.7 jdk: openjdk11 - - scala: 2.10.6 - script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M "++$TRAVIS_SCALA_VERSION compile" env: global: diff --git a/build.sbt b/build.sbt index e1f77c2b9..2a1f47f2a 100644 --- a/build.sbt +++ b/build.sbt @@ -93,7 +93,7 @@ lazy val utilLogging = (project in internalPath / "util-logging") .dependsOn(utilInterface) .settings( commonSettings, - crossScalaVersions := Seq(scala210, scala211, scala212), + crossScalaVersions := Seq(scala211, scala212), name := "Util Logging", libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 74d510024..957313074 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -3,7 +3,6 @@ import Keys._ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { - val scala210 = "2.10.7" val scala211 = "2.11.12" val scala212 = "2.12.8" From dec2ba2d0727ab64a6a1fb4d36a5d8c59bc8864b Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 11 May 2019 15:45:55 -0400 Subject: [PATCH 798/823] IO 1.3.0-M10, and nightly version Fixes #199 --- .travis.yml | 1 - build.sbt | 29 ++++++++++++++--------------- project/Dependencies.scala | 8 ++++---- project/Util.scala | 2 +- 4 files changed, 19 insertions(+), 21 deletions(-) diff --git a/.travis.yml b/.travis.yml index 391862182..67ee93476 100644 --- a/.travis.yml +++ b/.travis.yml @@ -2,7 +2,6 @@ language: scala jdk: 
oraclejdk8 scala: - - 2.11.12 - 2.12.7 script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M diff --git a/build.sbt b/build.sbt index 2a1f47f2a..2d0abe0ea 100644 --- a/build.sbt +++ b/build.sbt @@ -2,6 +2,17 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ +ThisBuild / git.baseVersion := "1.3.0" +ThisBuild / version := { + val old = (ThisBuild / version).value + nightlyVersion match { + case Some(v) => v + case _ => + if (old contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" + else old + } +} + def internalPath = file("internal") def commonSettings: Seq[Setting[_]] = Seq( @@ -13,7 +24,7 @@ def commonSettings: Seq[Setting[_]] = Seq( // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"), - crossScalaVersions := Seq(scala211, scala212), + crossScalaVersions := Seq(scala212), scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", publishArtifact in Compile := true, @@ -45,12 +56,6 @@ lazy val utilRoot: Project = (project in file(".")) .settings( inThisBuild( Seq( - git.baseVersion := "1.3.0", - version := { - val v = version.value - if (v contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" - else v - }, bintrayPackage := "util", homepage := Some(url("https://github.com/sbt/util")), description := "Util module for sbt", @@ -93,7 +98,7 @@ lazy val utilLogging = (project in internalPath / "util-logging") .dependsOn(utilInterface) .settings( commonSettings, - crossScalaVersions := Seq(scala211, scala212), + crossScalaVersions := Seq(scala212), name := "Util Logging", libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), @@ -166,13 +171,7 @@ lazy val utilScripted = (project in internalPath / "util-scripted") .settings( commonSettings, 
name := "Util Scripted", - libraryDependencies ++= { - scalaVersion.value match { - case sv if sv startsWith "2.11" => Seq(parserCombinator211) - case sv if sv startsWith "2.12" => Seq(parserCombinator211) - case _ => Seq() - } - }, + libraryDependencies += parserCombinator, mimaSettings, ) .configure(addSbtIO) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 957313074..263c665bd 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -3,11 +3,11 @@ import Keys._ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { - val scala211 = "2.11.12" val scala212 = "2.12.8" - private val ioVersion = "1.2.1" + def nightlyVersion: Option[String] = sys.props.get("sbt.build.version") + private val ioVersion = nightlyVersion.getOrElse("1.3.0-M10") private val sbtIO = "org.scala-sbt" %% "io" % ioVersion def getSbtModulePath(key: String, name: String) = { @@ -40,8 +40,8 @@ object Dependencies { val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.14.0" % Test - val scalaTest = "org.scalatest" %% "scalatest" % "3.0.5" % Test - val parserCombinator211 = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4" + val scalaTest = "org.scalatest" %% "scalatest" % "3.0.6-SNAP5" % Test + val parserCombinator = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2" val sjsonnew = Def.setting { "com.eed3si9n" %% "sjson-new-core" % contrabandSjsonNewVersion.value diff --git a/project/Util.scala b/project/Util.scala index f163f6dd0..de5ed8dd9 100644 --- a/project/Util.scala +++ b/project/Util.scala @@ -5,7 +5,7 @@ object Util { crossPaths := false, compileOrder := CompileOrder.JavaThenScala, unmanagedSourceDirectories in Compile := Seq((javaSource in Compile).value), - crossScalaVersions := Seq(Dependencies.scala211), + crossScalaVersions := Seq(Dependencies.scala212), autoScalaLibrary := false ) } From 
5db20c200784f40413c20ecdef60d63c921da849 Mon Sep 17 00:00:00 2001 From: kenji yoshida <6b656e6a69@gmail.com> Date: Tue, 25 Jun 2019 11:17:26 +0900 Subject: [PATCH 799/823] Update dependencies (#202) --- .travis.yml | 4 ++-- project/Dependencies.scala | 8 ++++---- project/plugins.sbt | 4 ++-- 3 files changed, 8 insertions(+), 8 deletions(-) diff --git a/.travis.yml b/.travis.yml index 67ee93476..f8f3120f8 100644 --- a/.travis.yml +++ b/.travis.yml @@ -2,7 +2,7 @@ language: scala jdk: oraclejdk8 scala: - - 2.12.7 + - 2.12.8 script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M ++$TRAVIS_SCALA_VERSION @@ -13,7 +13,7 @@ script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M matrix: include: - - scala: 2.12.7 + - scala: 2.12.8 jdk: openjdk11 env: diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 263c665bd..771e7e403 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -7,7 +7,7 @@ object Dependencies { def nightlyVersion: Option[String] = sys.props.get("sbt.build.version") - private val ioVersion = nightlyVersion.getOrElse("1.3.0-M10") + private val ioVersion = nightlyVersion.getOrElse("1.3.0-M11") private val sbtIO = "org.scala-sbt" %% "io" % ioVersion def getSbtModulePath(key: String, name: String) = { @@ -40,7 +40,7 @@ object Dependencies { val scalaReflect = Def.setting { "org.scala-lang" % "scala-reflect" % scalaVersion.value } val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.14.0" % Test - val scalaTest = "org.scalatest" %% "scalatest" % "3.0.6-SNAP5" % Test + val scalaTest = "org.scalatest" %% "scalatest" % "3.0.8" % Test val parserCombinator = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2" val sjsonnew = Def.setting { @@ -57,6 +57,6 @@ object Dependencies { val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion val disruptor = "com.lmax" % "disruptor" % "3.4.2" - val silencerPlugin 
= "com.github.ghik" %% "silencer-plugin" % "1.2" - val silencerLib = "com.github.ghik" %% "silencer-lib" % "1.2" % Provided + val silencerPlugin = "com.github.ghik" %% "silencer-plugin" % "1.4.1" + val silencerLib = "com.github.ghik" %% "silencer-lib" % "1.4.1" % Provided } diff --git a/project/plugins.sbt b/project/plugins.sbt index b8ef994f4..c6e8c962e 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,5 +1,5 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.0.0") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") -addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.14") -addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.3.4") +addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.16") +addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.4.1") From 9140fe3d2af45e31ab95dd6465fb794d6f01ee06 Mon Sep 17 00:00:00 2001 From: xuwei-k <6b656e6a69@gmail.com> Date: Fri, 12 Jul 2019 15:02:30 +0900 Subject: [PATCH 800/823] use openjdk instead of oraclejdk. 
fix travis matrix setting --- .travis.yml | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/.travis.yml b/.travis.yml index f8f3120f8..834114411 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,5 +1,4 @@ language: scala -jdk: oraclejdk8 scala: - 2.12.8 @@ -13,8 +12,8 @@ script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M matrix: include: - - scala: 2.12.8 - jdk: openjdk11 + - jdk: openjdk8 + - jdk: openjdk11 env: global: From 7de45416fad1e94ace8106675205da155772d65d Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 14 Jul 2019 12:47:43 -0400 Subject: [PATCH 801/823] Scala 2.13.0 --- .travis.yml | 47 +++++++++++++------ .../util/codec/LogOptionFormats.scala | 8 ++-- .../util/codec/ProgressEventFormats.scala | 8 ++-- .../util/codec/ProgressItemFormats.scala | 8 ++-- .../util/codec/StringEventFormats.scala | 8 ++-- .../util/codec/SuccessEventFormats.scala | 8 ++-- .../util/codec/TraceEventFormats.scala | 8 ++-- .../scala/sbt/internal/util/Relation.scala | 2 +- project/Dependencies.scala | 3 +- project/build.properties | 2 +- project/plugins.sbt | 2 +- 11 files changed, 61 insertions(+), 43 deletions(-) diff --git a/.travis.yml b/.travis.yml index 834114411..ccad81d19 100644 --- a/.travis.yml +++ b/.travis.yml @@ -2,28 +2,45 @@ language: scala scala: - 2.12.8 + - 2.13.0 -script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M - ++$TRAVIS_SCALA_VERSION - mimaReportBinaryIssues - scalafmtCheckAll - whitesourceCheckPolicies - test +env: + global: ADOPTOPENJDK=11 matrix: include: - - jdk: openjdk8 - - jdk: openjdk11 + - scala: 2.12.8 + env: + - ADOPTOPENJDK=8 + - secure: 
JzxepvrNQIem+7MS8pBfBkcWDgt/oNKOreI3GJMJDN9P7lxCmrW0UVhpSftscjRzz9gXGQleqZ8t/I0hqysY9nO/DlxDQil6FKpsqrEKALdIsez8TjtbOlV69enDl6SBCXpg1B/rTQ/dL9mpV3WMvNkmDOhcNmbNyfO9Uk8wAAEvGQNKyE02s0gjZf6IgfOHXInBB2o3+uQFiWCABFHDWInN4t0QZVEhF/3P3iDKEfauWGwugf/YKLrwUUzNyN+J1i1goYEWZvviP+KCNbPlEsVN60In8F0t+jYuBJb0ePNcl3waT/4xBKQRidB4XRbhOXrZIATdpHLnzKzk2TPf3GxijNEscKYGdq3v6nWd128rfHGYz528pRSZ8bNOdQJotB/bJTmIEOnk5P9zU0z4z2cawMF6EyBJka7kXnC9Vz6TpifvyXDpzfmRzAkBrD6PC+diGPbyy5+4zvhpZuv31MRjMckohyNb76pR9qq70yDlomn+nVNoZ1fpp7dCqwjIxm9h2UjCWzXWY4xSByI8/CaPibq6Ma7RWHQE+4NGG2CCLQrqN4NB+BFsH3R0l5Js9khvDuEUYJkgSmJMFluXranWRV+pp/YMxk1IT4rOEPOc/hIqlQTrxasp/QxeyAfRk9OPzoz9L2kR0RH4ch3KuaARUv03WFNarfQ/ISz3P/s= + script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M + ++$TRAVIS_SCALA_VERSION! + mimaReportBinaryIssues + scalafmtCheckAll + whitesourceCheckPolicies + test -env: - global: - - secure: JzxepvrNQIem+7MS8pBfBkcWDgt/oNKOreI3GJMJDN9P7lxCmrW0UVhpSftscjRzz9gXGQleqZ8t/I0hqysY9nO/DlxDQil6FKpsqrEKALdIsez8TjtbOlV69enDl6SBCXpg1B/rTQ/dL9mpV3WMvNkmDOhcNmbNyfO9Uk8wAAEvGQNKyE02s0gjZf6IgfOHXInBB2o3+uQFiWCABFHDWInN4t0QZVEhF/3P3iDKEfauWGwugf/YKLrwUUzNyN+J1i1goYEWZvviP+KCNbPlEsVN60In8F0t+jYuBJb0ePNcl3waT/4xBKQRidB4XRbhOXrZIATdpHLnzKzk2TPf3GxijNEscKYGdq3v6nWd128rfHGYz528pRSZ8bNOdQJotB/bJTmIEOnk5P9zU0z4z2cawMF6EyBJka7kXnC9Vz6TpifvyXDpzfmRzAkBrD6PC+diGPbyy5+4zvhpZuv31MRjMckohyNb76pR9qq70yDlomn+nVNoZ1fpp7dCqwjIxm9h2UjCWzXWY4xSByI8/CaPibq6Ma7RWHQE+4NGG2CCLQrqN4NB+BFsH3R0l5Js9khvDuEUYJkgSmJMFluXranWRV+pp/YMxk1IT4rOEPOc/hIqlQTrxasp/QxeyAfRk9OPzoz9L2kR0RH4ch3KuaARUv03WFNarfQ/ISz3P/s= +before_install: + # adding $HOME/.sdkman to cache would create an empty directory, which interferes with the initial installation + - "[[ -d $HOME/.sdkman/bin ]] || rm -rf $HOME/.sdkman/" + - curl -sL https://get.sdkman.io | bash + - echo sdkman_auto_answer=true > "$HOME/.sdkman/etc/config" + - source "$HOME/.sdkman/bin/sdkman-init.sh" -cache: - directories: - - $HOME/.ivy2/cache - - $HOME/.sbt +install: + - sdk install 
java $(sdk list java | grep -o "$ADOPTOPENJDK\.[0-9\.]*hs-adpt" | head -1) + - unset JAVA_HOME + - java -Xmx32m -version + - javac -J-Xmx32m -version + +script: sbt -Dfile.encoding=UTF8 -J-XX:ReservedCodeCacheSize=256M ++$TRAVIS_SCALA_VERSION! test before_cache: - find $HOME/.ivy2/cache -name "ivydata-*.properties" -delete - find $HOME/.sbt -name "*.lock" -delete + +cache: + directories: + - $HOME/.coursier + - $HOME/.ivy2/cache + - $HOME/.sbt diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala index e52700c19..f5e851f68 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/LogOptionFormats.scala @@ -7,10 +7,10 @@ package sbt.internal.util.codec import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait LogOptionFormats { self: sjsonnew.BasicJsonProtocol => implicit lazy val LogOptionFormat: JsonFormat[sbt.internal.util.LogOption] = new JsonFormat[sbt.internal.util.LogOption] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.LogOption = { - jsOpt match { - case Some(js) => - unbuilder.readString(js) match { + override def read[J](__jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.LogOption = { + __jsOpt match { + case Some(__js) => + unbuilder.readString(__js) match { case "Always" => sbt.internal.util.LogOption.Always case "Never" => sbt.internal.util.LogOption.Never case "Auto" => sbt.internal.util.LogOption.Auto diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala index 4d836c02d..6478f743a 100644 --- 
a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressEventFormats.scala @@ -7,10 +7,10 @@ package sbt.internal.util.codec import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait ProgressEventFormats { self: sbt.internal.util.codec.ProgressItemFormats with sjsonnew.BasicJsonProtocol => implicit lazy val ProgressEventFormat: JsonFormat[sbt.internal.util.ProgressEvent] = new JsonFormat[sbt.internal.util.ProgressEvent] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ProgressEvent = { - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) + override def read[J](__jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ProgressEvent = { + __jsOpt match { + case Some(__js) => + unbuilder.beginObject(__js) val level = unbuilder.readField[String]("level") val items = unbuilder.readField[Vector[sbt.internal.util.ProgressItem]]("items") val lastTaskCount = unbuilder.readField[Option[Int]]("lastTaskCount") diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala index 3aac75e91..261ab93d4 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/ProgressItemFormats.scala @@ -7,10 +7,10 @@ package sbt.internal.util.codec import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait ProgressItemFormats { self: sjsonnew.BasicJsonProtocol => implicit lazy val ProgressItemFormat: JsonFormat[sbt.internal.util.ProgressItem] = new JsonFormat[sbt.internal.util.ProgressItem] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): 
sbt.internal.util.ProgressItem = { - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) + override def read[J](__jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.ProgressItem = { + __jsOpt match { + case Some(__js) => + unbuilder.beginObject(__js) val name = unbuilder.readField[String]("name") val elapsedMicros = unbuilder.readField[Long]("elapsedMicros") unbuilder.endObject() diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala index 2d142f6ec..8b8ef3fe6 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/StringEventFormats.scala @@ -7,10 +7,10 @@ package sbt.internal.util.codec import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait StringEventFormats { self: sjsonnew.BasicJsonProtocol => implicit lazy val StringEventFormat: JsonFormat[sbt.internal.util.StringEvent] = new JsonFormat[sbt.internal.util.StringEvent] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.StringEvent = { - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) + override def read[J](__jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.StringEvent = { + __jsOpt match { + case Some(__js) => + unbuilder.beginObject(__js) val level = unbuilder.readField[String]("level") val message = unbuilder.readField[String]("message") val channelName = unbuilder.readField[Option[String]]("channelName") diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala index 19621d7c1..8c556ba4e 100644 --- 
a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/SuccessEventFormats.scala @@ -7,10 +7,10 @@ package sbt.internal.util.codec import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait SuccessEventFormats { self: sjsonnew.BasicJsonProtocol => implicit lazy val SuccessEventFormat: JsonFormat[sbt.internal.util.SuccessEvent] = new JsonFormat[sbt.internal.util.SuccessEvent] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.SuccessEvent = { - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) + override def read[J](__jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.SuccessEvent = { + __jsOpt match { + case Some(__js) => + unbuilder.beginObject(__js) val message = unbuilder.readField[String]("message") unbuilder.endObject() sbt.internal.util.SuccessEvent(message) diff --git a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala index 379196a9b..babad8d58 100644 --- a/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala +++ b/internal/util-logging/src/main/contraband-scala/sbt/internal/util/codec/TraceEventFormats.scala @@ -7,10 +7,10 @@ package sbt.internal.util.codec import _root_.sjsonnew.{ Unbuilder, Builder, JsonFormat, deserializationError } trait TraceEventFormats { self: sjsonnew.BasicJsonProtocol => implicit lazy val TraceEventFormat: JsonFormat[sbt.internal.util.TraceEvent] = new JsonFormat[sbt.internal.util.TraceEvent] { - override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): sbt.internal.util.TraceEvent = { - jsOpt match { - case Some(js) => - unbuilder.beginObject(js) + override def read[J](__jsOpt: Option[J], unbuilder: Unbuilder[J]): 
sbt.internal.util.TraceEvent = { + __jsOpt match { + case Some(__js) => + unbuilder.beginObject(__js) val level = unbuilder.readField[String]("level") val message = unbuilder.readField[Throwable]("message") val channelName = unbuilder.readField[Option[String]]("channelName") diff --git a/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala index 6a1abd726..61d5acde2 100644 --- a/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala +++ b/internal/util-relation/src/main/scala/sbt/internal/util/Relation.scala @@ -183,7 +183,7 @@ private final class MRelation[A, B](fwd: Map[A, Set[B]], rev: Map[B, Set[A]]) } def groupBy[K](discriminator: ((A, B)) => K): Map[K, Relation[A, B]] = - all.groupBy(discriminator) mapValues { Relation.empty[A, B] ++ _ } + (all.groupBy(discriminator) mapValues { Relation.empty[A, B] ++ _ }).toMap def contains(a: A, b: B): Boolean = forward(a)(b) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 771e7e403..d8d241042 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -4,10 +4,11 @@ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { val scala212 = "2.12.8" + val scala213 = "2.13.0" def nightlyVersion: Option[String] = sys.props.get("sbt.build.version") - private val ioVersion = nightlyVersion.getOrElse("1.3.0-M11") + private val ioVersion = nightlyVersion.getOrElse("1.3.0-M12") private val sbtIO = "org.scala-sbt" %% "io" % ioVersion def getSbtModulePath(key: String, name: String) = { diff --git a/project/build.properties b/project/build.properties index c59667ce9..c9f29468d 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.3.0-M3 +sbt.version=1.3.0-RC2 diff --git a/project/plugins.sbt b/project/plugins.sbt index c6e8c962e..7c0f95696 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,5 +1,5 @@ 
addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.0.0") -addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.1") +addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.4") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.16") addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.4.1") From 50b2ea6f83073d5ccd1be17dea4833533b8f7620 Mon Sep 17 00:00:00 2001 From: James Roper Date: Mon, 12 Aug 2019 15:16:48 +1000 Subject: [PATCH 802/823] Use byte arrays instead of lists of bytes in FileInfo Fixes #206 --- build.sbt | 13 +++++ .../src/main/scala/sbt/util/FileInfo.scala | 58 ++++++++++++++----- 2 files changed, 56 insertions(+), 15 deletions(-) diff --git a/build.sbt b/build.sbt index 2d0abe0ea..08d4af281 100644 --- a/build.sbt +++ b/build.sbt @@ -152,6 +152,19 @@ lazy val utilCache = (project in file("util-cache")) Seq(sjsonnewScalaJson.value, sjsonnewMurmurhash.value, scalaReflect.value), libraryDependencies ++= Seq(scalaTest), mimaSettings, + mimaBinaryIssueFilters ++= Seq( + // These are private case classes that have changed + exclude[IncompatibleMethTypeProblem]("sbt.util.FileHashModified.apply"), + exclude[IncompatibleResultTypeProblem]("sbt.util.FileHashModified.copy$default$2"), + exclude[IncompatibleMethTypeProblem]("sbt.util.FileHashModified.copy"), + exclude[IncompatibleMethTypeProblem]("sbt.util.FileHashModified.this"), + exclude[IncompatibleMethTypeProblem]("sbt.util.FileHash.apply"), + exclude[IncompatibleResultTypeProblem]("sbt.util.FileHash.copy$default$2"), + exclude[IncompatibleMethTypeProblem]("sbt.util.FileHash.copy"), + exclude[IncompatibleMethTypeProblem]("sbt.util.FileHash.this"), + // Added a method to a sealed trait, technically not a problem for Scala + exclude[ReversedMissingMethodProblem]("sbt.util.HashFileInfo.hashArray"), + ) ) .configure(addSbtIO) diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index 
f823f4924..b31640b02 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -4,13 +4,18 @@ package sbt.util import java.io.File + import scala.util.control.NonFatal import sbt.io.{ Hash, IO } -import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError } -import CacheImplicits._ +import sjsonnew.{ Builder, DeserializationException, JsonFormat, Unbuilder, deserializationError } +import CacheImplicits.{ arrayFormat => _, _ } sealed trait FileInfo { def file: File } -sealed trait HashFileInfo extends FileInfo { def hash: List[Byte] } +sealed trait HashFileInfo extends FileInfo { + @deprecated("Use hashArray instead", "1.3.0") + def hash: List[Byte] = hashArray.toList + private[util] def hashArray: Array[Byte] +} sealed trait ModifiedFileInfo extends FileInfo { def lastModified: Long } sealed trait PlainFileInfo extends FileInfo { def exists: Boolean } @@ -31,8 +36,8 @@ object HashModifiedFileInfo { private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo -private final case class FileHash(file: File, hash: List[Byte]) extends HashFileInfo -private final case class FileHashModified(file: File, hash: List[Byte], lastModified: Long) +private final case class FileHash(file: File, hashArray: Array[Byte]) extends HashFileInfo +private final case class FileHashModified(file: File, hashArray: Array[Byte], lastModified: Long) extends HashModifiedFileInfo final case class FilesInfo[F <: FileInfo] private (files: Set[F]) @@ -50,6 +55,29 @@ object FilesInfo { object FileInfo { + /** + * Stores byte arrays as hex encoded strings, but falls back to reading an array of integers, + * which is how it used to be stored, if that fails. 
+ */ + implicit val byteArrayFormat: JsonFormat[Array[Byte]] = new JsonFormat[Array[Byte]] { + override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Array[Byte] = { + jsOpt match { + case Some(js) => + try { + Hash.fromHex(unbuilder.readString(js)) + } catch { + case _: DeserializationException => + CacheImplicits.arrayFormat[Byte].read(jsOpt, unbuilder) + } + case None => Array.empty + } + } + + override def write[J](obj: Array[Byte], builder: Builder[J]): Unit = { + builder.writeString(Hash.toHex(obj)) + } + } + sealed trait Style { type F <: FileInfo @@ -71,7 +99,7 @@ object FileInfo { def write[J](obj: HashModifiedFileInfo, builder: Builder[J]) = { builder.beginObject() builder.addField("file", obj.file) - builder.addField("hash", obj.hash) + builder.addField("hash", obj.hashArray) builder.addField("lastModified", obj.lastModified) builder.endObject() } @@ -80,7 +108,7 @@ object FileInfo { case Some(js) => unbuilder.beginObject(js) val file = unbuilder.readField[File]("file") - val hash = unbuilder.readField[List[Byte]]("hash") + val hash = unbuilder.readField[Array[Byte]]("hash") val lastModified = unbuilder.readField[Long]("lastModified") unbuilder.endObject() FileHashModified(file, hash, lastModified) @@ -89,8 +117,8 @@ object FileInfo { } implicit def apply(file: File): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, Hash(file).toList, IO.getModifiedTimeOrZero(file)) - def apply(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = + FileHashModified(file.getAbsoluteFile, Hash(file), IO.getModifiedTimeOrZero(file)) + def apply(file: File, hash: Array[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) } @@ -101,7 +129,7 @@ object FileInfo { def write[J](obj: HashFileInfo, builder: Builder[J]) = { builder.beginObject() builder.addField("file", obj.file) - builder.addField("hash", obj.hash) + builder.addField("hash", obj.hashArray) builder.endObject() } @@ 
-109,7 +137,7 @@ object FileInfo { case Some(js) => unbuilder.beginObject(js) val file = unbuilder.readField[File]("file") - val hash = unbuilder.readField[List[Byte]]("hash") + val hash = unbuilder.readField[Array[Byte]]("hash") unbuilder.endObject() FileHash(file, hash) case None => deserializationError("Expected JsObject but found None") @@ -117,12 +145,12 @@ object FileInfo { } implicit def apply(file: File): HashFileInfo = FileHash(file.getAbsoluteFile, computeHash(file)) - def apply(file: File, bytes: List[Byte]): HashFileInfo = + def apply(file: File, bytes: Array[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, bytes) - private def computeHash(file: File): List[Byte] = - try Hash(file).toList - catch { case NonFatal(_) => Nil } + private def computeHash(file: File): Array[Byte] = + try Hash(file) + catch { case NonFatal(_) => Array.empty } } object lastModified extends Style { From 5fcb100d6ffdf57ec1ed4efa536ae6885b485388 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sun, 18 Aug 2019 23:20:43 -0400 Subject: [PATCH 803/823] publish for 2.13 Fixes #201 https://repo1.maven.org/maven2/org/scala-sbt/util-cache_2.13/ --- build.sbt | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/build.sbt b/build.sbt index 08d4af281..c807ce0fe 100644 --- a/build.sbt +++ b/build.sbt @@ -24,7 +24,7 @@ def commonSettings: Seq[Setting[_]] = Seq( // concurrentRestrictions in Global += Util.testExclusiveRestriction, testOptions += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"), javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"), - crossScalaVersions := Seq(scala212), + crossScalaVersions := Seq(scala212, scala213), scalacOptions in console in Compile -= "-Ywarn-unused-import", scalacOptions in console in Test -= "-Ywarn-unused-import", publishArtifact in Compile := true, @@ -98,7 +98,6 @@ lazy val utilLogging = (project in internalPath / "util-logging") .dependsOn(utilInterface) .settings( commonSettings, - crossScalaVersions := 
Seq(scala212), name := "Util Logging", libraryDependencies ++= Seq(jline, log4jApi, log4jCore, disruptor, sjsonnewScalaJson.value, scalaReflect.value), From 39fdf70c2188e97547028331bfaabbd34a56b216 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 20 Aug 2019 12:35:13 -0400 Subject: [PATCH 804/823] reimplement stacktrace suppression Ref https://github.com/sbt/sbt/issues/4964 --- .../sbt/internal/util/ConsoleAppender.scala | 27 ++++++++++++------- .../scala/sbt/internal/util/StackTrace.scala | 3 +++ 2 files changed, 20 insertions(+), 10 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index bc30045c8..2b381540a 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -481,16 +481,23 @@ class ConsoleAppender private[ConsoleAppender] ( } private def appendTraceEvent(te: TraceEvent): Unit = { - val traceLevel = if (getTrace < 0) Int.MaxValue else getTrace - val throwableShowLines: ShowLines[Throwable] = - ShowLines[Throwable]((t: Throwable) => { - List(StackTrace.trimmed(t, traceLevel)) - }) - val codec: ShowLines[TraceEvent] = - ShowLines[TraceEvent]((t: TraceEvent) => { - throwableShowLines.showLines(t.message) - }) - codec.showLines(te).toVector foreach { appendLog(Level.Error, _) } + val traceLevel = getTrace + if (traceLevel >= 0) { + val throwableShowLines: ShowLines[Throwable] = + ShowLines[Throwable]((t: Throwable) => { + List(StackTrace.trimmed(t, traceLevel)) + }) + val codec: ShowLines[TraceEvent] = + ShowLines[TraceEvent]((t: TraceEvent) => { + throwableShowLines.showLines(t.message) + }) + codec.showLines(te).toVector foreach { appendLog(Level.Error, _) } + } + if (traceLevel <= 2) { + suppressedMessage(new SuppressedTraceContext(traceLevel, ansiCodesSupported && useFormat)) foreach { + 
appendLog(Level.Error, _) + } + } } private def appendProgressEvent(pe: ProgressEvent): Unit = diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala index 37cc42400..5ff7086b7 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/StackTrace.scala @@ -20,6 +20,9 @@ object StackTrace { * - If d is greater than 0, then up to that many lines are included, * where the line for the Throwable is counted plus one line for each stack element. * Less lines will be included if there are not enough stack elements. + * + * See also ConsoleAppender where d <= 2 is treated specially by + * printing a prepared statement. */ def trimmedLines(t: Throwable, d: Int): List[String] = { require(d >= 0) From caecc7e6ae8e860a0095dd5abc63aa3d0e8a8754 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Wed, 21 Aug 2019 00:03:53 -0700 Subject: [PATCH 805/823] Bump sbt version to 1.3.0-RC4 --- project/build.properties | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index c9f29468d..2bdd560f2 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.3.0-RC2 +sbt.version=1.3.0-RC4 From 7f112052bfe8de2b8b53399449af11ca51c5395e Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Wed, 21 Aug 2019 00:02:43 -0700 Subject: [PATCH 806/823] Unbreak binary compatibility We discovered in the community build that 1.3.0-RC4 breaks the lucidchart scalafmt plugins. We can unbreak binary compatibility by adding alternative classes. 
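The binary-compatibility pattern this patch applies can be sketched as follows: keep the old case class with its original field type (`List[Byte]`) behind a deprecation, and add a new class carrying the `Array[Byte]` representation. This is a hedged, simplified illustration with hypothetical names (`HashInfo`, `OldHash`, `ArrayHash`); the real sealed traits live in `sbt.util.FileInfo`.

```scala
// Sketch of the compat shim: the old shape survives for already-compiled
// plugins, while new code constructs the Array-backed representation.
sealed trait HashInfo {
  def hash: List[Byte]       // original public member, kept for binary compat
  def hashArray: Array[Byte] // new member added to the sealed trait
}

@deprecated("kept for plugin compatibility", "1.3.0")
final case class OldHash(hash: List[Byte]) extends HashInfo {
  override val hashArray: Array[Byte] = hash.toArray
}

final case class ArrayHash(hashArray: Array[Byte]) extends HashInfo {
  override def hash: List[Byte] = hashArray.toList
}
```

Note that a case class with an `Array` field inherits reference-based equality from the generated `equals`, which is why a later patch in this series overrides `equals`/`hashCode` with `java.util.Arrays` helpers.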
--- build.sbt | 9 ----- .../src/main/scala/sbt/util/FileInfo.scala | 35 ++++++++++++++----- 2 files changed, 26 insertions(+), 18 deletions(-) diff --git a/build.sbt b/build.sbt index c807ce0fe..64621fdcd 100644 --- a/build.sbt +++ b/build.sbt @@ -152,15 +152,6 @@ lazy val utilCache = (project in file("util-cache")) libraryDependencies ++= Seq(scalaTest), mimaSettings, mimaBinaryIssueFilters ++= Seq( - // These are private case classes that have changed - exclude[IncompatibleMethTypeProblem]("sbt.util.FileHashModified.apply"), - exclude[IncompatibleResultTypeProblem]("sbt.util.FileHashModified.copy$default$2"), - exclude[IncompatibleMethTypeProblem]("sbt.util.FileHashModified.copy"), - exclude[IncompatibleMethTypeProblem]("sbt.util.FileHashModified.this"), - exclude[IncompatibleMethTypeProblem]("sbt.util.FileHash.apply"), - exclude[IncompatibleResultTypeProblem]("sbt.util.FileHash.copy$default$2"), - exclude[IncompatibleMethTypeProblem]("sbt.util.FileHash.copy"), - exclude[IncompatibleMethTypeProblem]("sbt.util.FileHash.this"), // Added a method to a sealed trait, technically not a problem for Scala exclude[ReversedMissingMethodProblem]("sbt.util.HashFileInfo.hashArray"), ) diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index b31640b02..d994d77cf 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -36,9 +36,25 @@ object HashModifiedFileInfo { private final case class PlainFile(file: File, exists: Boolean) extends PlainFileInfo private final case class FileModified(file: File, lastModified: Long) extends ModifiedFileInfo -private final case class FileHash(file: File, hashArray: Array[Byte]) extends HashFileInfo -private final case class FileHashModified(file: File, hashArray: Array[Byte], lastModified: Long) - extends HashModifiedFileInfo +@deprecated("Kept for plugin compat, but will be removed in sbt 2.0", "1.3.0") +private 
final case class FileHash(file: File, override val hash: List[Byte]) extends HashFileInfo { + override val hashArray: Array[Byte] = hash.toArray +} +private final case class FileHashArrayRepr(file: File, override val hashArray: Array[Byte]) + extends HashFileInfo +@deprecated("Kept for plugin compat, but will be removed in sbt 2.0", "1.3.0") +private final case class FileHashModified( + file: File, + override val hash: List[Byte], + lastModified: Long +) extends HashModifiedFileInfo { + override val hashArray: Array[Byte] = hash.toArray +} +private final case class FileHashModifiedArrayRepr( + file: File, + override val hashArray: Array[Byte], + lastModified: Long +) extends HashModifiedFileInfo final case class FilesInfo[F <: FileInfo] private (files: Set[F]) object FilesInfo { @@ -111,15 +127,15 @@ object FileInfo { val hash = unbuilder.readField[Array[Byte]]("hash") val lastModified = unbuilder.readField[Long]("lastModified") unbuilder.endObject() - FileHashModified(file, hash, lastModified) + FileHashModifiedArrayRepr(file, hash, lastModified) case None => deserializationError("Expected JsObject but found None") } } implicit def apply(file: File): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, Hash(file), IO.getModifiedTimeOrZero(file)) + FileHashModifiedArrayRepr(file.getAbsoluteFile, Hash(file), IO.getModifiedTimeOrZero(file)) def apply(file: File, hash: Array[Byte], lastModified: Long): HashModifiedFileInfo = - FileHashModified(file.getAbsoluteFile, hash, lastModified) + FileHashModifiedArrayRepr(file.getAbsoluteFile, hash, lastModified) } object hash extends Style { @@ -139,14 +155,15 @@ object FileInfo { val file = unbuilder.readField[File]("file") val hash = unbuilder.readField[Array[Byte]]("hash") unbuilder.endObject() - FileHash(file, hash) + FileHashArrayRepr(file, hash) case None => deserializationError("Expected JsObject but found None") } } - implicit def apply(file: File): HashFileInfo = FileHash(file.getAbsoluteFile, 
computeHash(file)) + implicit def apply(file: File): HashFileInfo = + FileHashArrayRepr(file.getAbsoluteFile, computeHash(file)) def apply(file: File, bytes: Array[Byte]): HashFileInfo = - FileHash(file.getAbsoluteFile, bytes) + FileHashArrayRepr(file.getAbsoluteFile, bytes) private def computeHash(file: File): Array[Byte] = try Hash(file) From 9611f737ec4d75a580dc34df79a6af9d19249883 Mon Sep 17 00:00:00 2001 From: xuwei-k <6b656e6a69@gmail.com> Date: Tue, 27 Aug 2019 15:56:59 +0900 Subject: [PATCH 807/823] Update dependencies --- .scalafmt.conf | 2 +- project/Dependencies.scala | 4 ++-- project/plugins.sbt | 4 ++-- 3 files changed, 5 insertions(+), 5 deletions(-) diff --git a/.scalafmt.conf b/.scalafmt.conf index 5b87db3b7..fb2e6f0f9 100644 --- a/.scalafmt.conf +++ b/.scalafmt.conf @@ -1,4 +1,4 @@ -version = 2.0.0-RC5 +version = 2.0.1 maxColumn = 100 project.git = true project.excludeFilters = [ "\\Wsbt-test\\W", "\\Winput_sources\\W", "\\Wcontraband-scala\\W" ] diff --git a/project/Dependencies.scala b/project/Dependencies.scala index d8d241042..8f1dd55fd 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -58,6 +58,6 @@ object Dependencies { val log4jApi = "org.apache.logging.log4j" % "log4j-api" % log4jVersion val log4jCore = "org.apache.logging.log4j" % "log4j-core" % log4jVersion val disruptor = "com.lmax" % "disruptor" % "3.4.2" - val silencerPlugin = "com.github.ghik" %% "silencer-plugin" % "1.4.1" - val silencerLib = "com.github.ghik" %% "silencer-lib" % "1.4.1" % Provided + val silencerPlugin = "com.github.ghik" %% "silencer-plugin" % "1.4.2" + val silencerLib = "com.github.ghik" %% "silencer-lib" % "1.4.2" % Provided } diff --git a/project/plugins.sbt b/project/plugins.sbt index 7c0f95696..4458f9e31 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,5 +1,5 @@ addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") -addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.0.0") +addSbtPlugin("org.scalameta" % 
"sbt-scalafmt" % "2.0.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.4") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.16") -addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.4.1") +addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.4.2") From c9e07b6010f318575fd61cf9d27c177ac9c4152b Mon Sep 17 00:00:00 2001 From: Alexey Vakhrenev Date: Thu, 29 Aug 2019 15:39:21 +0300 Subject: [PATCH 808/823] fix FileHash equality --- util-cache/src/main/scala/sbt/util/FileInfo.scala | 9 ++++++++- 1 file changed, 8 insertions(+), 1 deletion(-) diff --git a/util-cache/src/main/scala/sbt/util/FileInfo.scala b/util-cache/src/main/scala/sbt/util/FileInfo.scala index d994d77cf..95785e963 100644 --- a/util-cache/src/main/scala/sbt/util/FileInfo.scala +++ b/util-cache/src/main/scala/sbt/util/FileInfo.scala @@ -41,7 +41,14 @@ private final case class FileHash(file: File, override val hash: List[Byte]) ext override val hashArray: Array[Byte] = hash.toArray } private final case class FileHashArrayRepr(file: File, override val hashArray: Array[Byte]) - extends HashFileInfo + extends HashFileInfo { + override def hashCode(): Int = (file, java.util.Arrays.hashCode(hashArray)).hashCode() + override def equals(obj: Any): Boolean = obj match { + case that: FileHashArrayRepr => + this.file == that.file && java.util.Arrays.equals(this.hashArray, that.hashArray) + case _ => false + } +} @deprecated("Kept for plugin compat, but will be removed in sbt 2.0", "1.3.0") private final case class FileHashModified( file: File, From d9fe5540f508e020ea201dd0f3dabb0270faa23e Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Sat, 31 Aug 2019 19:26:41 -0700 Subject: [PATCH 809/823] Interlace log lines with task progress With the current supershell implementation, the progress display flickers when there is heavy console logging during task evaluation. 
This is because the console appender clears out the task progress and it isn't restored until the next periodic super shell report (which runs every 100ms by default). To remove the flickering, I reworked the implementation to interlace the log lines with progress reports. In order to ensure that the log lines remained contiguous, I had to apply padding at the bottom of the supershell region whenever the new report contained fewer lines than the old report. The report shifts down as new log lines are appended. This isn't optimal, but I think removing the flickering while preserving contiguous log lines is worth it. --- .../sbt/internal/util/ConsoleAppender.scala | 107 +++++++++++++----- 1 file changed, 77 insertions(+), 30 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 2b381540a..6a48084a5 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -3,7 +3,7 @@ package sbt.internal.util import sbt.util._ import java.io.{ PrintStream, PrintWriter } import java.util.Locale -import java.util.concurrent.atomic.{ AtomicBoolean, AtomicInteger } +import java.util.concurrent.atomic.{ AtomicBoolean, AtomicInteger, AtomicReference } import org.apache.logging.log4j.{ Level => XLevel } import org.apache.logging.log4j.message.{ Message, ObjectMessage, ReusableObjectMessage } import org.apache.logging.log4j.core.{ LogEvent => XLogEvent } @@ -108,7 +108,6 @@ object ConsoleAppender { private[this] val showProgressHolder: AtomicBoolean = new AtomicBoolean(false) def setShowProgress(b: Boolean): Unit = showProgressHolder.set(b) def showProgress: Boolean = showProgressHolder.get - private[sbt] val lastTaskCount = new AtomicInteger(0) /** Hide stack trace altogether. 
*/ val noSuppressedMessage = (_: SuppressedTraceContext) => None @@ -311,6 +310,80 @@ object ConsoleAppender { } +/** + * Caches the task progress lines so that they can be reprinted whenever the default + * appender logs additional messages. This makes it so that the progress lines are + * more stable with less appearance of flickering. + */ +private object SuperShellLogger { + private val progressLines = new AtomicReference[Seq[String]](Nil) + // leave some blank lines for tasks that might use println(...) + private val blankZone = 5 + + /** + * Splits a log message into individual lines and interlaces each line with + * the task progress report to reduce the appearance of flickering. It is assumed + * that this method is only called while holding the out.lockObject. + */ + private[util] def writeMsg(out: ConsoleOut, msg: String): Unit = { + val progress = progressLines.get + msg.linesIterator.foreach { l => + out.println(s"$DeleteLine$l") + if (progress.length > 0) { + deleteConsoleLines(out, blankZone) + progress.foreach(out.println) + out.print(cursorUp(blankZone + progress.length)) + } + } + out.flush() + } + + /** + * Receives a new task report and replaces the old one. In the event that the new + * report has fewer lines than the previous report, the report is shifted up by + * the difference in the previous and current number of lines. As new log lines + * are added by the regular console appender, the report will shift down until the + * padding is filled. This allows the regular log lines to be viewable as a continuous + * line stream with no holes. The unfortunate part is that the super shell output + * location does shift around a little bit. There are at least two ways to address this + * + * 1) Add an invariant that the super shell region never shrinks, but may grow, during + * task evaluation. 
We'd have to add some kind of special message to send to the + * appender to let it know that task evaluation is completely done so that it can + * reset the progress area size. + * + * 2) Refactor ConsoleAppender so that it knows the log level. Right now, we print lines + * to the console appender that ultimately get discarded because they are at the + * debug level. As a result, it's impossible to know whether or not the line + * out.println(s"$DeleteLine$l") actually prints anything to the console. If we + * could guarantee that every line in a message passed into writeMsg was actually + * written to the screen, then we could apply the padding at the top and reduce + * the padding whenever a new line is logged until it is no longer necessary to + * apply padding to the progress region to keep the lines stream contiguous. + * + */ + private[util] def update(out: ConsoleOut, pe: ProgressEvent): Unit = { + val sorted = pe.items.sortBy(x => x.elapsedMicros) + val info = sorted map { item => + val elapsed = item.elapsedMicros / 1000000L + s"$DeleteLine | => ${item.name} ${elapsed}s" + } + + deleteConsoleLines(out, blankZone) + info.foreach(i => out.println(i)) + + val previousLines = progressLines.getAndSet(info) + val padding = math.max(0, previousLines.length - info.length) + deleteConsoleLines(out, padding) + out.print(cursorUp(blankZone + info.length + padding)) + out.flush() + } + private def deleteConsoleLines(out: ConsoleOut, n: Int): Unit = { + (1 to n) foreach { _ => + out.println(DeleteLine) + } + } +} // See http://stackoverflow.com/questions/24205093/how-to-create-a-custom-appender-in-log4j2 // for custom appender using Java. // http://logging.apache.org/log4j/2.x/manual/customconfig.html @@ -455,19 +528,11 @@ class ConsoleAppender private[ConsoleAppender] ( appendLog(SUCCESS_LABEL_COLOR, Level.SuccessLabel, SUCCESS_MESSAGE_COLOR, message) } - // leave some blank lines for tasks that might use println(...) 
- private val blankZone = 5 private def write(msg: String): Unit = { if (!useFormat || !ansiCodesSupported) { out.println(EscHelpers.removeEscapeSequences(msg)) } else if (ConsoleAppender.showProgress) { - val clearNum = lastTaskCount.get + blankZone - if (clearNum > 1) { - deleteConsoleLines(clearNum) - out.print(s"${cursorUp(clearNum)}") - } - out.println(msg) - out.flush() + SuperShellLogger.writeMsg(out, msg) } else { out.println(msg) } @@ -503,27 +568,9 @@ class ConsoleAppender private[ConsoleAppender] ( private def appendProgressEvent(pe: ProgressEvent): Unit = if (ConsoleAppender.showProgress) { out.lockObject.synchronized { - deleteConsoleLines(blankZone) - val currentTasksCount = pe.items.size - val ltc = pe.lastTaskCount.getOrElse(0) - val sorted = pe.items.sortBy(_.name).sortBy(x => -x.elapsedMicros) - sorted foreach { item => - val elapsed = item.elapsedMicros / 1000000L - out.println(s"$DeleteLine | => ${item.name} ${elapsed}s") - } - if (ltc > currentTasksCount) deleteConsoleLines(ltc - currentTasksCount) - else () - out.print(cursorUp(math.max(currentTasksCount, ltc) + blankZone)) - out.flush() - lastTaskCount.set(ltc) + SuperShellLogger.update(out, pe) } - } else () - - private def deleteConsoleLines(n: Int): Unit = { - (1 to n) foreach { _ => - out.println(DeleteLine) } - } private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = { def appendEvent(oe: ObjectEvent[_]): Unit = { From a5666e97b6e2395c7404d21c85f3efc58778df19 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Sun, 1 Sep 2019 21:54:15 -0700 Subject: [PATCH 810/823] Manage progress padding With this commit, I improved the padding management so that padding is now added above the progress report. Whenever a line is logged at the info or greater level, we can reduce the padding level by one since that line has effectively filled in the padding. 
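The padding bookkeeping described in the commit message above reduces to two pure transitions, sketched here with simplified, hypothetical names (the real state lives in the mutable `ConsoleAppender` internals shown in the diff below):

```scala
object ProgressPadding {
  // When a new progress report replaces the old one, add padding above it so
  // the total region (report + padding) never shrinks and the console log
  // lines stay contiguous.
  def onNewReport(prevLines: Int, prevPadding: Int, newLines: Int): Int =
    math.max(0, prevLines + prevPadding - newLines)

  // Each line logged at the info or greater level fills in one padding line,
  // so the padding can be reduced by one (never below zero).
  def onLogLine(padding: Int): Int = math.max(0, padding - 1)
}
```

For example, if a 6-line report shrinks to 4 lines, two padding lines are added; two subsequent info-level log lines then consume that padding.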
--- .../sbt/internal/util/ConsoleAppender.scala | 48 ++++++++----------- 1 file changed, 21 insertions(+), 27 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 6a48084a5..8a46acc35 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -319,6 +319,8 @@ private object SuperShellLogger { private val progressLines = new AtomicReference[Seq[String]](Nil) // leave some blank lines for tasks that might use println(...) private val blankZone = 5 + private val padding = new AtomicInteger(0) + private val ascii = "[^\\p{ASCII}]".r.pattern /** * Splits a log message into individual lines and interlaces each line with @@ -330,9 +332,15 @@ private object SuperShellLogger { msg.linesIterator.foreach { l => out.println(s"$DeleteLine$l") if (progress.length > 0) { - deleteConsoleLines(out, blankZone) + val stripped = ascii.matcher(l).replaceAll("") + val isDebugLine = + l.startsWith("[debug]") || l.startsWith(s"${scala.Console.RESET}[debug]") + // As long as the line isn't a debug line, we can assume it was printed to + // the console and reduce the top padding. + val pad = if (padding.get > 0 && !isDebugLine) padding.decrementAndGet else padding.get + deleteConsoleLines(out, blankZone + pad) progress.foreach(out.println) - out.print(cursorUp(blankZone + progress.length)) + out.print(cursorUp(blankZone + progress.length + padding.get)) } } out.flush() @@ -340,27 +348,10 @@ private object SuperShellLogger { /** * Receives a new task report and replaces the old one. In the event that the new - * report has fewer lines than the previous report, the report is shifted up by - * the difference in the previous and current number of lines. 
As new log lines - * are added by the regular console appender, the report will shift down until the - * padding is filled. This allows the regular log lines to be viewable as a continuous - * line stream with no holes. The unfortunate part is that the super shell output - * location does shift around a little bit. There are at least two ways to address this - * - * 1) Add an invariant that the super shell region never shrinks, but may grow, during - * task evaluation. We'd have to add some kind of special message to send to the - * appender to let it know that task evaluation is completely done so that it can - * reset the progress area size. - * - * 2) Refactor ConsoleAppender so that it knows the log level. Right now, we print lines - * to the console appender that ultimately get discarded because they are at the - * debug level. As a result, it's impossible to know whether or not the line - * out.println(s"$DeleteLine$l") actually prints anything to the console. If we - * could guarantee that every line in a message passed into writeMsg was actually - * written to the screen, then we could apply the padding at the top and reduce - * the padding whenever a new line is logged until it is no longer necessary to - * apply padding to the progress region to keep the lines stream contiguous. - * + * report has fewer lines than the previous report, padding lines are added on top + * so that the console log lines remain contiguous. When a console line is printed + * at the info or greater level, we can decrement the padding because the console + * line will have filled in the blank line. 
*/ private[util] def update(out: ConsoleOut, pe: ProgressEvent): Unit = { val sorted = pe.items.sortBy(x => x.elapsedMicros) @@ -369,13 +360,16 @@ private object SuperShellLogger { s"$DeleteLine | => ${item.name} ${elapsed}s" } + val previousLines = progressLines.getAndSet(info) + val prevPadding = padding.get + val newPadding = math.max(0, previousLines.length + prevPadding - info.length) + padding.set(newPadding) + + deleteConsoleLines(out, newPadding) deleteConsoleLines(out, blankZone) info.foreach(i => out.println(i)) - val previousLines = progressLines.getAndSet(info) - val padding = math.max(0, previousLines.length - info.length) - deleteConsoleLines(out, padding) - out.print(cursorUp(blankZone + info.length + padding)) + out.print(cursorUp(blankZone + info.length + newPadding)) out.flush() } private def deleteConsoleLines(out: ConsoleOut, n: Int): Unit = { From 635316902d3d46da31fb4afa607a683877b16e92 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Mon, 2 Sep 2019 11:07:48 -0700 Subject: [PATCH 811/823] Allow supershell to work in no color mode Supershell actually works quite well in no color mode. On the sbt side, we still want to disable supershell automatically if the output is not a terminal or no color is set, but this commit allows the user to force supershell through -Dsbt.supershell or the useSuperShell setting even when no color is set. 
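The resulting write path can be sketched as a pure decision function. This is a hedged simplification with hypothetical names (`WritePath`, `route`); the regex-based stripper merely stands in for `EscHelpers.removeEscapeSequences`:

```scala
object WritePath {
  sealed trait Route
  case object Interlaced extends Route // routed through the supershell interlacer
  case object Plain extends Route      // printed directly

  // Strip ANSI escape sequences when formatting is off or unsupported, but
  // still interlace with the progress region whenever supershell is active.
  def route(
      useFormat: Boolean,
      ansiSupported: Boolean,
      showProgress: Boolean,
      msg: String
  ): (Route, String) = {
    val toWrite = if (!useFormat || !ansiSupported) stripAnsi(msg) else msg
    (if (showProgress) Interlaced else Plain, toWrite)
  }

  private def stripAnsi(s: String): String =
    s.replaceAll("\u001b\\[[0-9;]*m", "")
}
```

The key change is that the no-color branch no longer bypasses the progress path: stripping and routing are now independent decisions.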
--- .../main/scala/sbt/internal/util/ConsoleAppender.scala | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 8a46acc35..4be18ffb3 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -523,12 +523,12 @@ class ConsoleAppender private[ConsoleAppender] ( } private def write(msg: String): Unit = { - if (!useFormat || !ansiCodesSupported) { - out.println(EscHelpers.removeEscapeSequences(msg)) - } else if (ConsoleAppender.showProgress) { - SuperShellLogger.writeMsg(out, msg) + val toWrite = + if (!useFormat || !ansiCodesSupported) EscHelpers.removeEscapeSequences(msg) else msg + if (ConsoleAppender.showProgress) { + SuperShellLogger.writeMsg(out, toWrite) } else { - out.println(msg) + out.println(toWrite) } } From 1aef82aedbfd6f5dc299620576660e0c65d8083c Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Tue, 3 Sep 2019 16:00:50 -0400 Subject: [PATCH 812/823] IO 1.3.0 --- project/Dependencies.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 8f1dd55fd..603b6396d 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -8,7 +8,7 @@ object Dependencies { def nightlyVersion: Option[String] = sys.props.get("sbt.build.version") - private val ioVersion = nightlyVersion.getOrElse("1.3.0-M12") + private val ioVersion = nightlyVersion.getOrElse("1.3.0") private val sbtIO = "org.scala-sbt" %% "io" % ioVersion def getSbtModulePath(key: String, name: String) = { From f20b2750454dc1fe63a6e3ffb8cb23d4d8ec3757 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Tue, 3 Sep 2019 00:26:40 -0700 Subject: [PATCH 813/823] Fix supershell position bug The previous implementation of 
supershell log line interlacing with regular line interlacing relied on state in a global object. A somewhat better approach is for each appender to hold a reference to a state object. Every time tasks run, new appenders are created, so the state should always reflect the current progress state. --- .../sbt/internal/util/ConsoleAppender.scala | 147 +++++++++--------- 1 file changed, 73 insertions(+), 74 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 4be18ffb3..eb6ea11c7 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -310,74 +310,6 @@ object ConsoleAppender { } -/** - * Caches the task progress lines so that they can be reprinted whenever the default - * appender logs additional messages. This makes it so that the progress lines are - * more stable with less appearance of flickering. - */ -private object SuperShellLogger { - private val progressLines = new AtomicReference[Seq[String]](Nil) - // leave some blank lines for tasks that might use println(...) - private val blankZone = 5 - private val padding = new AtomicInteger(0) - private val ascii = "[^\\p{ASCII}]".r.pattern - - /** - * Splits a log message into individual lines and interlaces each line with - * the task progress report to reduce the appearance of flickering. It is assumed - * that this method is only called while holding the out.lockObject. 
- */ - private[util] def writeMsg(out: ConsoleOut, msg: String): Unit = { - val progress = progressLines.get - msg.linesIterator.foreach { l => - out.println(s"$DeleteLine$l") - if (progress.length > 0) { - val stripped = ascii.matcher(l).replaceAll("") - val isDebugLine = - l.startsWith("[debug]") || l.startsWith(s"${scala.Console.RESET}[debug]") - // As long as the line isn't a debug line, we can assume it was printed to - // the console and reduce the top padding. - val pad = if (padding.get > 0 && !isDebugLine) padding.decrementAndGet else padding.get - deleteConsoleLines(out, blankZone + pad) - progress.foreach(out.println) - out.print(cursorUp(blankZone + progress.length + padding.get)) - } - } - out.flush() - } - - /** - * Receives a new task report and replaces the old one. In the event that the new - * report has fewer lines than the previous report, padding lines are added on top - * so that the console log lines remain contiguous. When a console line is printed - * at the info or greater level, we can decrement the padding because the console - * line will have filled in the blank line. 
- */ - private[util] def update(out: ConsoleOut, pe: ProgressEvent): Unit = { - val sorted = pe.items.sortBy(x => x.elapsedMicros) - val info = sorted map { item => - val elapsed = item.elapsedMicros / 1000000L - s"$DeleteLine | => ${item.name} ${elapsed}s" - } - - val previousLines = progressLines.getAndSet(info) - val prevPadding = padding.get - val newPadding = math.max(0, previousLines.length + prevPadding - info.length) - padding.set(newPadding) - - deleteConsoleLines(out, newPadding) - deleteConsoleLines(out, blankZone) - info.foreach(i => out.println(i)) - - out.print(cursorUp(blankZone + info.length + newPadding)) - out.flush() - } - private def deleteConsoleLines(out: ConsoleOut, n: Int): Unit = { - (1 to n) foreach { _ => - out.println(DeleteLine) - } - } -} // See http://stackoverflow.com/questions/24205093/how-to-create-a-custom-appender-in-log4j2 // for custom appender using Java. // http://logging.apache.org/log4j/2.x/manual/customconfig.html @@ -398,6 +330,64 @@ class ConsoleAppender private[ConsoleAppender] ( ) extends AbstractAppender(name, null, LogExchange.dummyLayout, true, Array.empty) { import scala.Console.{ BLUE, GREEN, RED, YELLOW } + private val progressState: AtomicReference[ProgressState] = new AtomicReference(null) + private[sbt] def setProgressState(state: ProgressState) = progressState.set(state) + + /** + * Splits a log message into individual lines and interlaces each line with + * the task progress report to reduce the appearance of flickering. It is assumed + * that this method is only called while holding the out.lockObject. 
+ */ + private def supershellInterlaceMsg(msg: String): Unit = { + val state = progressState.get + import state._ + val progress = progressLines.get + msg.linesIterator.foreach { l => + out.println(s"$DeleteLine$l") + if (progress.length > 0) { + val pad = if (padding.get > 0) padding.decrementAndGet() else 0 + deleteConsoleLines(blankZone + pad) + progress.foreach(out.println) + out.print(cursorUp(blankZone + progress.length + padding.get)) + } + } + out.flush() + } + + /** + * Receives a new task report and replaces the old one. In the event that the new + * report has fewer lines than the previous report, padding lines are added on top + * so that the console log lines remain contiguous. When a console line is printed + * at the info or greater level, we can decrement the padding because the console + * line will have filled in the blank line. + */ + private def updateProgressState(pe: ProgressEvent): Unit = { + val state = progressState.get + import state._ + val sorted = pe.items.sortBy(x => x.elapsedMicros) + val info = sorted map { item => + val elapsed = item.elapsedMicros / 1000000L + s"$DeleteLine | => ${item.name} ${elapsed}s" + } + + val previousLines = progressLines.getAndSet(info) + val prevPadding = padding.get + val newPadding = math.max(0, previousLines.length + prevPadding - info.length) + padding.set(newPadding) + + deleteConsoleLines(newPadding) + deleteConsoleLines(blankZone) + info.foreach(i => out.println(i)) + + out.print(cursorUp(blankZone + info.length + newPadding)) + out.flush() + } + private def deleteConsoleLines(n: Int): Unit = { + (1 to n) foreach { _ => + out.println(DeleteLine) + } + } + private val reset: String = { if (ansiCodesSupported && useFormat) scala.Console.RESET else "" @@ -525,8 +515,8 @@ class ConsoleAppender private[ConsoleAppender] ( private def write(msg: String): Unit = { val toWrite = if (!useFormat || !ansiCodesSupported) EscHelpers.removeEscapeSequences(msg) else msg - if (ConsoleAppender.showProgress) { - 
SuperShellLogger.writeMsg(out, toWrite) + if (progressState.get != null) { + supershellInterlaceMsg(toWrite) } else { out.println(toWrite) } @@ -560,10 +550,8 @@ class ConsoleAppender private[ConsoleAppender] ( } private def appendProgressEvent(pe: ProgressEvent): Unit = - if (ConsoleAppender.showProgress) { - out.lockObject.synchronized { - SuperShellLogger.update(out, pe) - } + if (progressState.get != null) { + out.lockObject.synchronized(updateProgressState(pe)) } private def appendMessageContent(level: Level.Value, o: AnyRef): Unit = { @@ -596,3 +584,14 @@ class ConsoleAppender private[ConsoleAppender] ( } final class SuppressedTraceContext(val traceLevel: Int, val useFormat: Boolean) +private[sbt] final class ProgressState( + val progressLines: AtomicReference[Seq[String]], + val padding: AtomicInteger, + val blankZone: Int +) { + def this(blankZone: Int) = this(new AtomicReference(Nil), new AtomicInteger(0), blankZone) + def reset(): Unit = { + progressLines.set(Nil) + padding.set(0) + } +} From b3165b5c8cd79b46f17fa7c8d570b831af7ba1f4 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Wed, 18 Sep 2019 14:12:23 -0700 Subject: [PATCH 814/823] Load SuccessEventTag lazily It takes about a second to load scala.reflect.runtime.universe. If we lazy load here, we can load scala.reflect.runtime.universe in the background to speed up the sbt startup time. See 0ebb7a5662f2bcc6599010f5a81ed0a540581fd8.
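The lazy-loading idea in this commit can be sketched in plain Scala: a `lazy val` runs its initializer only on first access, so an expensive computation stays off the startup path. This is an illustrative sketch (the expensive `typeTag` call is simulated by a counter), not sbt's actual code.

```scala
// Minimal sketch of the lazy-initialization idea above (not sbt's actual code):
// a `lazy val` defers its initializer until first access, so expensive setup
// stays off the hot startup path and runs at most once.
object LazyInitDemo {
  var initCount = 0

  class Logger {
    // Stands in for the expensive scala.reflect.runtime.universe.typeTag call.
    private lazy val successEventTag: String = {
      initCount += 1
      "TypeTag[SuccessEvent]"
    }
    def success(msg: String): String = s"[$successEventTag] $msg"
  }

  def main(args: Array[String]): Unit = {
    val log = new Logger
    assert(initCount == 0) // constructing the logger does no work
    log.success("compiled")
    log.success("packaged")
    assert(initCount == 1) // initializer ran exactly once
    println(initCount)
  }
}
```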
--- .../src/main/scala/sbt/internal/util/ManagedLogger.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala index b883c43d1..a0725c1d1 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ManagedLogger.scala @@ -25,7 +25,7 @@ class ManagedLogger( ) } - private val SuccessEventTag = scala.reflect.runtime.universe.typeTag[SuccessEvent] + private lazy val SuccessEventTag = scala.reflect.runtime.universe.typeTag[SuccessEvent] // send special event for success since it's not a real log level override def success(message: => String): Unit = { infoEvent[SuccessEvent](SuccessEvent(message))( From 9c445896cf0499830e5469e233cf754f7d510e1a Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Wed, 18 Sep 2019 15:34:18 -0700 Subject: [PATCH 815/823] Bump sbt version to 1.3.0 --- project/build.properties | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index 2bdd560f2..080a737ed 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.3.0-RC4 +sbt.version=1.3.0 From 7597cdb19b8ed0b5771312224ddc322654f9788c Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Tue, 24 Sep 2019 10:46:48 -0700 Subject: [PATCH 816/823] Take terminal width into account in supershell Sometimes if the progress lines are wider than the terminal width, the supershell blank zone can expand indefinitely because we do not move the cursor far enough up to properly re-fill the blank zone.
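The width fix in this patch boils down to counting how many extra physical terminal rows a logical progress line occupies when it wraps: a line of length n on a terminal of width w spills onto (n - 1) / w extra rows. A hedged sketch of that arithmetic follows; the names here are illustrative, not sbt's.

```scala
// Sketch of the wrapped-line arithmetic from the patch above: a logical line of
// length n on a terminal of width w occupies (n - 1) / w extra physical rows
// (0 extra rows when it fits). Names are illustrative, not sbt's exact code.
object TerminalLinesDemo {
  def extraLines(width: Int)(line: String): Int =
    if (width > 0) (line.length - 1) / width else 0 // zero-width guard, as in a later patch

  // Total physical rows consumed: one per logical line, plus wrapped rows.
  def totalRows(width: Int, lines: Seq[String]): Int =
    lines.foldLeft(lines.length)(_ + extraLines(width)(_))

  def main(args: Array[String]): Unit = {
    val progress = Seq("a" * 10, "b" * 80, "c" * 81)
    assert(extraLines(80)(progress(0)) == 0) // fits on one row
    assert(extraLines(80)(progress(1)) == 0) // exactly 80 chars is still one row
    assert(extraLines(80)(progress(2)) == 1) // 81 chars wraps once
    println(totalRows(80, progress)) // 3 logical lines + 1 wrapped row = 4
  }
}
```

Moving the cursor up by this total, rather than by the logical line count, is what keeps the blank zone from growing.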
--- .../scala/sbt/internal/util/ConsoleAppender.scala | 15 ++++++++++++--- 1 file changed, 12 insertions(+), 3 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index eb6ea11c7..132280160 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -346,9 +346,11 @@ class ConsoleAppender private[ConsoleAppender] ( out.println(s"$DeleteLine$l") if (progress.length > 0) { val pad = if (padding.get > 0) padding.decrementAndGet() else 0 + val width = ConsoleAppender.terminalWidth + val len: Int = progress.foldLeft(progress.length)(_ + terminalLines(width)(_)) deleteConsoleLines(blankZone + pad) progress.foreach(out.println) - out.print(cursorUp(blankZone + progress.length + padding.get)) + out.print(cursorUp(blankZone + len + padding.get)) } } out.flush() @@ -370,18 +372,25 @@ class ConsoleAppender private[ConsoleAppender] ( s"$DeleteLine | => ${item.name} ${elapsed}s" } + val width = ConsoleAppender.terminalWidth + val extra: Int = info.foldLeft(0)(_ + terminalLines(width)(_)) val previousLines = progressLines.getAndSet(info) + val prevExtra = previousLines.foldLeft(0)(_ + terminalLines(width)(_)) + val prevPadding = padding.get - val newPadding = math.max(0, previousLines.length + prevPadding - info.length) + val newPadding = + math.max(0, previousLines.length + prevExtra + prevPadding - info.length - extra) padding.set(newPadding) deleteConsoleLines(newPadding) deleteConsoleLines(blankZone) info.foreach(i => out.println(i)) - out.print(cursorUp(blankZone + info.length + newPadding)) + out.print(cursorUp(blankZone + info.length + newPadding + extra)) out.flush() } + private def terminalLines(width: Int): String => Int = + (progressLine: String) => (progressLine.length - 1) / width private def deleteConsoleLines(n: Int): Unit 
= { (1 to n) foreach { _ => out.println(DeleteLine) From 5cfab4c9a98c2511b0bae07c153af319c4f4e69d Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Tue, 24 Sep 2019 16:33:40 -0700 Subject: [PATCH 817/823] Cleanup implementation of progress report It was a bit cleaner to consolidate `extra` and (previousLines|info).length into prevLength and currentLength. --- .../main/scala/sbt/internal/util/ConsoleAppender.scala | 9 ++++----- 1 file changed, 4 insertions(+), 5 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 132280160..994b9bd8e 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -373,20 +373,19 @@ class ConsoleAppender private[ConsoleAppender] ( } val width = ConsoleAppender.terminalWidth - val extra: Int = info.foldLeft(0)(_ + terminalLines(width)(_)) + val currentLength = info.foldLeft(info.length)(_ + terminalLines(width)(_)) val previousLines = progressLines.getAndSet(info) - val prevExtra = previousLines.foldLeft(0)(_ + terminalLines(width)(_)) + val prevLength = previousLines.foldLeft(previousLines.length)(_ + terminalLines(width)(_)) val prevPadding = padding.get - val newPadding = - math.max(0, previousLines.length + prevExtra + prevPadding - info.length - extra) + val newPadding = math.max(0, prevLength + prevPadding - currentLength) padding.set(newPadding) deleteConsoleLines(newPadding) deleteConsoleLines(blankZone) info.foreach(i => out.println(i)) - out.print(cursorUp(blankZone + info.length + newPadding + extra)) + out.print(cursorUp(blankZone + currentLength + newPadding)) out.flush() } private def terminalLines(width: Int): String => Int = From 9c2dd05b6a48ad44bcf407251567efc991e73928 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Tue, 24 Sep 2019 16:35:03 -0700 Subject: [PATCH 818/823] 
Avoid possible divide by zero On the off chance that in some configurations the terminal width is set to zero, avoid an exception by returning 0 for terminal lines. It is likely that supershell will not work well if terminal width is zero, but that's better than a potential crash (I think the crash would be in the progress background thread, so I'm not sure how bad it would be, but still it's good to avoid). --- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index 994b9bd8e..c5d8cbea9 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -389,7 +389,7 @@ class ConsoleAppender private[ConsoleAppender] ( private def terminalLines(width: Int): String => Int = - (progressLine: String) => (progressLine.length - 1) / width + (progressLine: String) => if (width > 0) (progressLine.length - 1) / width else 0 private def deleteConsoleLines(n: Int): Unit = { (1 to n) foreach { _ => out.println(DeleteLine) From 43f25520a0c5cde0fff8e57afe8aee6997201977 Mon Sep 17 00:00:00 2001 From: Ethan Atkins Date: Sun, 6 Oct 2019 17:51:20 -0700 Subject: [PATCH 819/823] Don't include DeleteLine in progress length I incorrectly included the DeleteLine in the progress line length and this could cause certain progress lines to be incorrectly reported as multi-line when they actually fit on a single terminal line.
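The bug fixed in this commit is easy to reproduce in isolation: an ANSI escape embedded in the progress string adds invisible characters, so a naive length check overcounts and a line that fits on one row is treated as wrapped. This sketch (illustrative names, not sbt's code) shows the overcount; the patch's fix is to keep the escape out of the measured string and print it separately.

```scala
// Sketch of the bug fixed above: embedding the ANSI DeleteLine escape in the
// progress string inflates its length, so a short line looks wrapped. Keeping
// escapes out of the measured string (as the patch does) gives the right count.
object VisibleLengthDemo {
  val DeleteLine = "\u001b[2K" // ESC [ 2 K -- 4 invisible characters

  def extraLines(width: Int)(line: String): Int =
    if (width > 0) (line.length - 1) / width else 0

  def main(args: Array[String]): Unit = {
    val visible = " | => compile 3s" + " " * 62 // 78 visible chars, fits in 80 columns
    val withEscape = DeleteLine + visible       // 82 chars counting the escape

    assert(extraLines(80)(visible) == 0)    // correctly a single row
    assert(extraLines(80)(withEscape) == 1) // escape makes it look wrapped
    println(extraLines(80)(withEscape))
  }
}
```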
--- .../scala/sbt/internal/util/ConsoleAppender.scala | 11 ++++++++--- 1 file changed, 8 insertions(+), 3 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index c5d8cbea9..d205eb3bf 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -349,13 +349,18 @@ class ConsoleAppender private[ConsoleAppender] ( val width = ConsoleAppender.terminalWidth val len: Int = progress.foldLeft(progress.length)(_ + terminalLines(width)(_)) deleteConsoleLines(blankZone + pad) - progress.foreach(out.println) + progress.foreach(printProgressLine) out.print(cursorUp(blankZone + len + padding.get)) } } out.flush() } + private def printProgressLine(line: String): Unit = { + out.print(DeleteLine) + out.println(line) + } + /** * Receives a new task report and replaces the old one. 
In the event that the new * report has fewer lines than the previous report, padding lines are added on top @@ -369,7 +374,7 @@ class ConsoleAppender private[ConsoleAppender] ( val sorted = pe.items.sortBy(x => x.elapsedMicros) val info = sorted map { item => val elapsed = item.elapsedMicros / 1000000L - s"$DeleteLine | => ${item.name} ${elapsed}s" + s" | => ${item.name} ${elapsed}s" } val width = ConsoleAppender.terminalWidth @@ -383,7 +388,7 @@ class ConsoleAppender private[ConsoleAppender] ( deleteConsoleLines(newPadding) deleteConsoleLines(blankZone) - info.foreach(i => out.println(i)) + info.foreach(printProgressLine) out.print(cursorUp(blankZone + currentLength + newPadding)) out.flush() From 8c2aef75e4002f1c42f26c99ca99be0e4937b227 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 19 Oct 2019 19:14:16 -0400 Subject: [PATCH 820/823] in-source sbt-houserules --- build.sbt | 25 ++++++++------- project/HouseRulesPlugin.scala | 58 ++++++++++++++++++++++++++++++++++ project/plugins.sbt | 5 ++- 3 files changed, 76 insertions(+), 12 deletions(-) create mode 100644 project/HouseRulesPlugin.scala diff --git a/build.sbt b/build.sbt index 64621fdcd..7f938aee9 100644 --- a/build.sbt +++ b/build.sbt @@ -2,16 +2,25 @@ import Dependencies._ import Util._ import com.typesafe.tools.mima.core._, ProblemFilters._ -ThisBuild / git.baseVersion := "1.3.0" ThisBuild / version := { val old = (ThisBuild / version).value nightlyVersion match { case Some(v) => v - case _ => - if (old contains "SNAPSHOT") git.baseVersion.value + "-SNAPSHOT" - else old + case _ => old } } +ThisBuild / organization := "org.scala-sbt" +ThisBuild / bintrayPackage := "util" +ThisBuild / homepage := Some(url("https://github.com/sbt/util")) +ThisBuild / description := "Util modules for sbt" +ThisBuild / scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")) +ThisBuild / licenses := List(("Apache-2.0", url("https://www.apache.org/licenses/LICENSE-2.0"))) +ThisBuild / 
scalafmtOnCompile := true +ThisBuild / developers := List( + Developer("harrah", "Mark Harrah", "@harrah", url("https://github.com/harrah")), + Developer("eed3si9n", "Eugene Yokota", "@eed3si9n", url("http://eed3si9n.com/")), + Developer("dwijnand", "Dale Wijnand", "@dwijnand", url("https://github.com/dwijnand")), +) def internalPath = file("internal") @@ -54,16 +63,10 @@ lazy val utilRoot: Project = (project in file(".")) utilScripted ) .settings( - inThisBuild( - Seq( - bintrayPackage := "util", - homepage := Some(url("https://github.com/sbt/util")), - description := "Util module for sbt", - scmInfo := Some(ScmInfo(url("https://github.com/sbt/util"), "git@github.com:sbt/util.git")), - )), commonSettings, name := "Util Root", publish / skip := true, + mimaPreviousArtifacts := Set.empty, customCommands ) diff --git a/project/HouseRulesPlugin.scala b/project/HouseRulesPlugin.scala new file mode 100644 index 000000000..8c8958c4f --- /dev/null +++ b/project/HouseRulesPlugin.scala @@ -0,0 +1,58 @@ +import sbt._ +import Keys._ +import bintray.BintrayPlugin +import bintray.BintrayPlugin.autoImport._ + +object HouseRulesPlugin extends AutoPlugin { + override def requires = plugins.JvmPlugin && BintrayPlugin + override def trigger = allRequirements + + override def buildSettings: Seq[Def.Setting[_]] = baseBuildSettings + override def projectSettings: Seq[Def.Setting[_]] = baseSettings + + lazy val baseBuildSettings: Seq[Def.Setting[_]] = Seq( + bintrayOrganization := Some("sbt"), + bintrayRepository := "maven-releases", + ) + + lazy val baseSettings: Seq[Def.Setting[_]] = Seq( + bintrayPackage := (ThisBuild / bintrayPackage).value, + bintrayRepository := (ThisBuild / bintrayRepository).value, + scalacOptions ++= Seq("-encoding", "utf8"), + scalacOptions ++= Seq("-deprecation", "-feature", "-unchecked", "-Xlint"), + scalacOptions += "-language:higherKinds", + scalacOptions += "-language:implicitConversions", + scalacOptions ++= "-Xfuture".ifScala213OrMinus.value.toList, + 
scalacOptions += "-Xlint", + scalacOptions ++= "-Xfatal-warnings" + .ifScala(v => { + sys.props.get("sbt.build.fatal") match { + case Some(_) => java.lang.Boolean.getBoolean("sbt.build.fatal") + case _ => v == 12 + } + }) + .value + .toList, + scalacOptions ++= "-Yinline-warnings".ifScala211OrMinus.value.toList, + scalacOptions ++= "-Yno-adapted-args".ifScala212OrMinus.value.toList, + scalacOptions += "-Ywarn-dead-code", + scalacOptions += "-Ywarn-numeric-widen", + scalacOptions += "-Ywarn-value-discard", + scalacOptions ++= "-Ywarn-unused-import".ifScala(v => 11 <= v && v <= 12).value.toList + ) ++ Seq(Compile, Test).flatMap( + c => scalacOptions in (c, console) --= Seq("-Ywarn-unused-import", "-Xlint") + ) + + private def scalaPartV = Def setting (CrossVersion partialVersion scalaVersion.value) + + private implicit final class AnyWithIfScala[A](val __x: A) { + def ifScala(p: Long => Boolean) = + Def setting (scalaPartV.value collect { case (2, y) if p(y) => __x }) + def ifScalaLte(v: Long) = ifScala(_ <= v) + def ifScalaGte(v: Long) = ifScala(_ >= v) + def ifScala211OrMinus = ifScalaLte(11) + def ifScala211OrPlus = ifScalaGte(11) + def ifScala212OrMinus = ifScalaLte(12) + def ifScala213OrMinus = ifScalaLte(13) + } +} diff --git a/project/plugins.sbt b/project/plugins.sbt index 4458f9e31..1b9ab3b3f 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -1,4 +1,7 @@ -addSbtPlugin("org.scala-sbt" % "sbt-houserules" % "0.3.9") +addSbtPlugin("com.dwijnand" % "sbt-dynver" % "4.0.0") +addSbtPlugin("org.foundweekends" % "sbt-bintray" % "0.5.5") +addSbtPlugin("com.jsuereth" % "sbt-pgp" % "2.0.0") +addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.6.1") addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.0.3") addSbtPlugin("org.scala-sbt" % "sbt-contraband" % "0.4.4") addSbtPlugin("com.lightbend" % "sbt-whitesource" % "0.1.16") From 1e3e726d0bc62edfb5d9819b75102e95454defbb Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 19 Oct 2019 19:14:29 -0400 Subject: 
[PATCH 821/823] sbt 1.3.3 --- project/build.properties | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/build.properties b/project/build.properties index 080a737ed..6adcdc753 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.3.0 +sbt.version=1.3.3 From 6c9120ea62e225794697043af5306c99819600f5 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Sat, 19 Oct 2019 19:21:03 -0400 Subject: [PATCH 822/823] Bump Scala versions --- .travis.yml | 6 +++--- project/Dependencies.scala | 4 ++-- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/.travis.yml b/.travis.yml index ccad81d19..9643e7537 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,15 +1,15 @@ language: scala scala: - - 2.12.8 - - 2.13.0 + - 2.12.10 + - 2.13.1 env: global: ADOPTOPENJDK=11 matrix: include: - - scala: 2.12.8 + - scala: 2.12.10 env: - ADOPTOPENJDK=8 - secure: JzxepvrNQIem+7MS8pBfBkcWDgt/oNKOreI3GJMJDN9P7lxCmrW0UVhpSftscjRzz9gXGQleqZ8t/I0hqysY9nO/DlxDQil6FKpsqrEKALdIsez8TjtbOlV69enDl6SBCXpg1B/rTQ/dL9mpV3WMvNkmDOhcNmbNyfO9Uk8wAAEvGQNKyE02s0gjZf6IgfOHXInBB2o3+uQFiWCABFHDWInN4t0QZVEhF/3P3iDKEfauWGwugf/YKLrwUUzNyN+J1i1goYEWZvviP+KCNbPlEsVN60In8F0t+jYuBJb0ePNcl3waT/4xBKQRidB4XRbhOXrZIATdpHLnzKzk2TPf3GxijNEscKYGdq3v6nWd128rfHGYz528pRSZ8bNOdQJotB/bJTmIEOnk5P9zU0z4z2cawMF6EyBJka7kXnC9Vz6TpifvyXDpzfmRzAkBrD6PC+diGPbyy5+4zvhpZuv31MRjMckohyNb76pR9qq70yDlomn+nVNoZ1fpp7dCqwjIxm9h2UjCWzXWY4xSByI8/CaPibq6Ma7RWHQE+4NGG2CCLQrqN4NB+BFsH3R0l5Js9khvDuEUYJkgSmJMFluXranWRV+pp/YMxk1IT4rOEPOc/hIqlQTrxasp/QxeyAfRk9OPzoz9L2kR0RH4ch3KuaARUv03WFNarfQ/ISz3P/s= diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 603b6396d..a3737908b 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -3,8 +3,8 @@ import Keys._ import sbt.contraband.ContrabandPlugin.autoImport._ object Dependencies { - val scala212 = "2.12.8" - val scala213 = "2.13.0" + val scala212 = "2.12.10" + val scala213 = "2.13.1" def nightlyVersion: 
Option[String] = sys.props.get("sbt.build.version") From 365791006303aefaa3a627f8d1845b74774b8c20 Mon Sep 17 00:00:00 2001 From: "Diego E. Alonso-Blas" Date: Tue, 22 Oct 2019 23:30:09 +0200 Subject: [PATCH 823/823] ConsoleAppender: reuse/recycle StringBuilder storage. A StringBuilder is a mutable data structure to create a String. When the String is created, the new String does not share any storage with the StringBuilder. Thus, we can keep a same StringBuilder, and reuse its internal storage between different iterations. --- .../src/main/scala/sbt/internal/util/ConsoleAppender.scala | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala index d205eb3bf..d0bd1a309 100644 --- a/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala +++ b/internal/util-logging/src/main/scala/sbt/internal/util/ConsoleAppender.scala @@ -2,6 +2,7 @@ package sbt.internal.util import sbt.util._ import java.io.{ PrintStream, PrintWriter } +import java.lang.StringBuilder import java.util.Locale import java.util.concurrent.atomic.{ AtomicBoolean, AtomicInteger, AtomicReference } import org.apache.logging.log4j.{ Level => XLevel } @@ -507,10 +508,10 @@ class ConsoleAppender private[ConsoleAppender] ( message: String ): Unit = out.lockObject.synchronized { + val builder: StringBuilder = new StringBuilder(labelColor.length + label.length + messageColor.length + reset.length * 3) message.linesIterator.foreach { line => - val builder = new java.lang.StringBuilder( - labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3 - ) + builder.ensureCapacity(labelColor.length + label.length + messageColor.length + line.length + reset.length * 3 + 3) + builder.setLength(0) def fmted(a: String, b: String) = builder.append(reset).append(a).append(b).append(reset) 
builder.append(reset).append('[') fmted(labelColor, label)
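The StringBuilder-recycling idea in the last patch can be sketched on its own: `setLength(0)` clears the builder's contents but keeps its backing array, and `toString` copies out a fresh String, so one builder can be safely reused across loop iterations. This is an illustrative sketch under those assumptions, not sbt's exact code.

```scala
// Sketch of the StringBuilder-recycling idea from the patch above: setLength(0)
// resets contents while keeping the allocated capacity, so repeated formatting
// avoids a fresh allocation per line. toString copies, so reuse is safe.
object BuilderReuseDemo {
  def formatLines(label: String, message: String): List[String] = {
    val builder = new java.lang.StringBuilder(label.length + 16)
    message.linesIterator.map { line =>
      builder.setLength(0) // reset contents, keep backing array
      builder.append('[').append(label).append("] ").append(line)
      builder.toString     // new String; shares nothing with the builder
    }.toList
  }

  def main(args: Array[String]): Unit = {
    val out = formatLines("info", "first\nsecond")
    assert(out == List("[info] first", "[info] second"))
    println(out.mkString(","))
  }
}
```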
zO=|dn|9;Z;f{)q15EanOlxIh~HUF4rFCVpUX16m=+HsVXmeEys(fwx2rbb;xFz?b` zUX(un39A(hED4Akr8Txe_SNUJ!vY7vC5II2*j@A&EZ2u_DYz z)_cOBR4#i$Wg6&Z+UB@*Nxk>xr)|&72iN3yo>@3C;&;HsWF7PznF>n6gr~z9t^f2z zK-_C-RWx1C9c>Xwatx*3MNk{WE0g6{#k%h#wkF$>KPSE?ojWyDg0VcrE zaI@Udm!b(-{XqUvdF<-#X_3-Me=aRJJKEq>iOid$*W+Mq$g|ELqdo&Gal)+sn=o2C zX^~25RZVNU419?UYJR~2UP!9^#(Igq*ROh^cfQRpHoOQA>nc(m9HLjLYNMg{M9C0T z6b0gprLUUpW@=M1b1OUg+yvQ@b>3Uv+URO-oYj-;P zIeu&#Yyo3vqj#Qx_0k3BYYn6u(^>#Re@i&Bo`o1^b8;j81t1{deyfN7j?m#Ug=7`9 zr&A^HgLpNT9i;!Y0eG0v@=_3I9-p`y5#(V%UKO^ln-L+>VXt!?=Al7?)hJXuUbHi&HlsxZdDo6Oh-?7XqtKn$#7m3W@t;+|O64;IPb z>ZM8;ltgp)eu;-5SI24*JsiNKGh=j=jmd2{G6Go&FR}F1MeqBr1+Iif{fd*o_gj2t zdSI^R6$U7K9aJ&B{j%n-2Tc$-;SWMG;01$W$-?CX)-|EymQv;v%JXvcnZU<-=O~i? z^Za@96!QnF@DX_v&ZWL#b0FhXq3Vrph`oJ?AWl%wbwJQJYSBlcoPHe$GTaMddV#83 z^Wf$zvwUqDDME4dJ7xuV2R8~S4kV+l9bQR97H(5#bZPY7#lIs z0$@ZXl6SX+fW=%)g%lC@Hr%$k{pn2l&}j97>6B{0h!Ory-@jkH&x>>V|4MWzPGgI$5a0?&PVK!Jw!{tKIn2s}M{N3Ry( zmw1QzWh`ljJo?>fZ#V}`&N<*0X5yXpI-ncYiw;wT0QnBW`+~0DCOD^8(5`PgyDLto zWrQ>PehoOr@08q*-ivNV2dG8Oe&}sx?XI0u9bEkvD2=1{%-ZMYvZeB8)O%)0DKWvk z-Q5i}yY6Nf|j5?1ec~TENzvAV*35PboOTEavfB%hEXV#OW_;K`oi~K0yrzg1Q ztv2Ok2jY%r{*nEi?|Y^n5R)e?Q+OW3FGmB~1K*B<%j4{xrLkBZn%;1C=^VZy`y*(F zCuMf;LL=?**7YKMl*CWmP>lTqso@)XakRctq|TsiJ{c)ilz6=(WZ4FNK2=W1hN6)O z4wJnz8ON=(c1wK5itwt1$X@$=b8@@9%phO>Sj!joW*nbUU;e^{S{e2B<|z4mJc%|= zuW}RuExMSkT7bra7hJM8@TbE_3|-uMN30q8->MuR*-Tf<1HPT-ZTja{J`%-h{M*1>=?Q7BTh zE+*5+(#^=Q>tL^K{p4lP9fc??V}$kR@l@>s%28{5IJ5^TRPFC)p0f`}#3Cu}H;Y@zVP z)TVCBl9d*m%{y|W7)NNZ%9)uKgY!8kMl?CtBf(N4a9VaTal+8Y^0%XR8*m6;q*ihU z(fO`7R5e+fqENd6Eno*Bf28%gwx?yJ4W?mv-N1I&Y%a4sM1k&3lg5e=L{;MsKAHGz z1J1Fsk?aWI*G6#M0s!ZG&ES5Sqpl9YQ-c^Y_Omv{g2iYncL8`sk_xYrthDL+DhH;K zO<$!0Y(@q9QR_ry2|%8XOE>L9-w(KM1GS7GCT5jC-S{H76SV9M#00lLu>|K~yN6Vr ze8!Qtb{pbiutfP=5uy@OII-KRJOut9l5;Xkv9Ycg3+WS9l5fHj7D5751fNDMCesK>2HTkM+f%F9_StD zRa#NHeFiFrCwHf(CQ=TFG4Igfy(D1YR}m@2C)DQr2+Zh#2Tbk8>cTY`61rBHHus{m z#EgIK2{%O#JHBKZI0-+okJ~yA$$}g|nysoQG76+-=*|?ihO|gkS1yZ$Jj7q$0WFHE{@uxu^5Ru3NIX|Qh4K8zZF^^6>+xQ9bL1G5U 
z{z?~VAWwkh>RT=CKNh@X`L>ol;z=ObMOZ@2!hktQ>h(r3Z5TdM5%zoy&Y{xqI35bY zNYLjLaeWgW&IPN9q`B45bA84esGn2(md;yI14LtLN~=m;hwK(|+A>UEOSpt&Vi*kP zh&SMgS6GtIdFkwi zou?H9hVQJkLTkjSpf=|q;XRYCXZs-W_z7>T8lkN7(&JT0nDf$bR%C?={Rg@99GFVY{%sAd1qY9j}=K14zuiMumQd!`t{8~?cI6o z$Db}Nng>9KNXrtgKYz?e^6u23WS5nxz0LQp2(kK2Qq_vDNjS<;zzd&EHfqJFPwtwa zD_%*JLm89Z9K`Zn7TL-tKkO^PQ`{gsmYpyyG!D=AL%Nj%PmHNg<);?J2&%t(s0ybF z4q8?1N&UeL!yEWLC*ssufY>gKLvmcg=b6K&8Zl98I(luj)uYt>BJ8n z$c2vj`JO+ciVRHbyHguFTNoMOE68jkCAOKA;Yd7Pml`TE-2nALXx4tcLrm@4*SP(O znZ}4|#Nxgp*!aWeSFj&}E_m}mImw!ar_4F}4WpcTG!_cbT7^*P;#aU$8pw>JWKI`A$uoEy@1%Z)Wg3)&nNUZ?1( z>gH@WLDOw^8x5QTI2LMhSx$T$MdZ!(n#~U8j@)IQMZJ3v<>3Z5Hnk?}FI(iquAR%>qYC%AXSeSdmApRxa;DPO%= zg?SijR2lhAnHkqj@F^zv$|1U)=prviAO0zj`_G1rTEUG?sC*i9GK0bEewwTPyz_9e ztA3oTevrq0;^rGXO{_-s3N>rI3f;OLk7Ug;^>J;HC>o26wV$r#>w(Rl5*5zQn`YJ1 zslUe-vE73zxA-U}HP_S!rN0HrSr561%qDNtUBQ^y=BIxm5K$#IhO_MmQ=7zC)vo^O% zdsRYoQF z$TrSr8^PY?{JIs81^zW!S`#{Gdx}5k%*=Y_J>)uE|I>r>b(byZ*Ij-$l6MsaW~BQ3 zy{LWVp+gMIUbEgYz!hLq>eJ2GbV?96XVRCnV>{1@C_CqDpZIM=+EE8CGOev#c2Q@0@}IJ zKgkvR2;@gCi{3NWBG(7wNP0<_@G`1R9 zBAg9F8}1i%0i|Nj@-!hd;bMe{6I75Pd8RC5Wy(j7j)AW4HLxCxUR!b#LtC=I=srJ- zv2Z-mSm-#HdA7YlXU8KKh5yIx=_Sz4=N60X#WC?@hBEL;Li5o95QYtE)1<)O?g+RI zDQl)$tD{z@#UEx@ChEthGUc9fOk)tGuB57RZ049Rz3@lF5F%?ovvzx9%dR2!gYfK3 z67jZej{-6xqMF|Ui}Pn$`P8{807zT@VJODCGn&v*QK6X|l+sLJ`iFL^L?W8xQj-oUfbP8o}Y#1Y{F_6B%i(^_Z^W!{n--1awWtfG5a`6QGfTGyE{1Kj^ zbRgkU*3^KABga8D^BsT$URlVpUm;O^1j1ie9R^j{pL1n6dfrrbD0K^sI16n%zuZDc zi}DJzDoxmtYqnyPQ9XgqnoD0QX=483hyJ2dVTg1xhMd|GsVFAI`B7w@SsA2{k$m;l zthUzJw6JjGi=jld$(`xMo~Ion-2AKap6Cg=-7`}(G(>f2v7E@l4O~jEc~4VfjiW4# zx;@*K+rzn6=dJ|mr7|eJ3LL_zmIzz6CK-icH`4!M?5(2W+`4V;Bsf%XcX#*T?(Xgo z++7o-f(LhZcX#*T?(PH)4mp*q^-0#*|NhSfbwi^ys_Gr1k2zszPQ`~wo3^DRzUjvbFKekhW07?=isn#XopPxmr zZPw`F4l{w$7K-5mx&CN0H6|FS#1 z$G)0md#VVeIq)YlDLRNZsusv^I@>R7j!hM1X0JLL){`s)Hx9$q4xW@8so|DPF>%DY2H>Efb(pDB+MO0ooDh^=G;G>(Em&iT zz9xZlDBPuUk^0%}z(_it?UgZtRe+^rhp-ZUPNr5^fPKujqa=%CSnu6lG*5eG@l+j^ 
z$QFTfy*;w$T>OI-l2ni+QW{?TT&un#fNi&>7+oq`)W&fp3m>I8@8vtt2S$a~@BOYN zkzV~pJVc~=ce`vFDNp{&2yx{xiiI49kDw1Vy(i4ZCbljKud|I`#Fu+?XVU&s5o4(K zER0FHR_EKz{N}HuPON%a)il^oL41Wxm_DBueCw)wA6*f(yaR6Vs-m0x1-nwx>v?8p zC~TRu;94zW+|6-AMCI*mOivh`Z6kk%bU8UkWN!W-k+7MV+JDFn%8f3`MOv-GWor9e zwL>Vn9~5A4F&=e*)z1S+~d%SQOr?+`Oa^(QK6lV6;3rtvC+Sk>0ZThCGpX8b^^~!Us+W1|;Dm3>nx}N0X z-|gh^Rijw9_x0{T%ISs86umEVgxzEbi%sI~Imk1-TVfyX__WR0v~yEAaa@aX%8s2l zPCBN}_FWj?Yi?fELhPp#Zfq}n^X=ipq2+i=d5s+#DeZZ~t~4s`4Af27^oLO)aU>*9 zHfNrK>Of|Ai?T@&dn9=mM&uRZS5LGQDfH?C9l@`{{6 z~=*GGlqQwSw>uBAmMX%HvZIJgN+izN-sK^Fm4Jj2! zQAf+O?c1m8q6gc8KF}iWy9{=m&(p_kM5ydcDYg%tLB5mbDdWdPJ~cxudR5fXWzxaD zoasZ-ATjx}^A3k+1srK!OvU`Z{qU#~z|i&TC!B=$7}Z5@#}1;EjMX514Q)*KhXM1q z8^GFI91!>J@$B6VgG6N@A+oNDpfdaMzIIN1SC7>&v_2b*YTg5tvEYaXhF95VaQhn3 zCCDY^#WWpK4%?W4uas6(43t)yU}%(j7HRnUJ3{u2%HdKN$n%r?hv(;?#tN=KzR|z$ zqW*hAGq<*Iat4_Lg*<+xDuRO74UW-@NR0a)guN6T91JDo^Q3>8n3h&(qGWUw%A2A-vIS zGgFXu?Jd6Ww{U>8sW5NJ@R}>fOz56`5j$LB0&W>acme&AU`TJER#TT~?dClMcQJEN z3K-j|fKDOpqOmjpT@SBx=O5^@-55CaY~SM|*or}aT5Jog-DUyZZaL+PVLnw#zYGUd zFFw|UV0w?_+@H0LDEwT0IXn}J*^AS_KZL$T6K9lIXF7JViZvuvv{p}%27-uS!q59x4TSZ= zfJAQ2@y;{w{4GwoVv&1RkC{>s;!NZIfiF#5OmzK36Zk53dy8Syqs@e?JWz<0M?b+x z2h0|!3FvktK5^DjaX=;k6yA5HAi`mnR z35y?lZ|MGB9J|zO07&ra0W|KOiNOVn?Bi7#L~vr=s(jnWjIIl;l0P0)Yv;!A*w@+Xs-jxkRcd$G>7z1x>?$~XsX;ht(9h-(GYn> zEL6-n{K0O(Je7_rCzUoP6PE|7C!B?%a}-D3+ImaQOJ097xt67ke+`M%#(Wh?1uNff z64)^y3}oSiyDxR(l)v+-@t8~xOtBTRC}xEYg>mU-E&#D5C=v;uZRRe~{I`+P zd_7iI6BM`R5B1+I`JV;+C(-2Z0+#qsy5+x?RgkF1h)IlksQO=A+hhe48cg-k0n*qK zt2KRJP1na_D%fl^pA~&4<*{X--@a1rIITe1pG@DRJzcFHWL&3ty*|G@;B?_rzw;bz zy=uKGKa=>pT5JJjScT(EH;9~6vQ&XL4^bmlOu{&I!H3BDak+x0^;ln`(?++*_&RaE zi<&!Og(g#9XT-KhaFi$v25%+_DG9z^?Qq3b-e8I{nd8vCUg~&;WhHr_?9=|}=Nqt^ z3gqR)Y%~bvND>-q+{*`}1t*@n2DC*ErHc{Onc$rF>$tvkV4wG@)nUX9OrIaxjvShe z9~!2KkxX*U)TnpHXCoRk=`JynES7i|U@N`zUw#f-7R9 z!X}M76<8*btjsB~d?5zlF(??d=S)6#==xGT=~2q_x$PBB-s-&9_P$zBM)$2j>-?cJ z-|ng`cv{Z!&TU`cUR$} zTalcjl&c9nrnXxN4{w9qG486MMRLh1s{L}Sx3!EFnAFp;sOnzfPR<*X#&!(Hd+6j| 
zQj`Z#c-r^92NaQTU=AYK`KumGMN;uV&LsA|@aerqoah7+q?)*wH?^oQQLpBfz(6SX zh#kLQgF{U9*t#xAV#xp3#nhkH(tq21RUBRZPAa!i7zGUvsJxAp)fSH`D_Z3UhdC~W z_Ee#8VM5uIi`%}~Ut(gukQU!7+Uiiz|C%1!Vsi+YqFC<6(?CR7Z{K#mH=F}+yOY32 zC)Ly=OQGY#2<_up`7R$_8@#&e#`& zYXyouDjJ0Gnh*S}Z9!Z#jI4nE5eM;^cf9qRg~?<;Ut#9kqfeOEt1b_ueAizp8TFbr zPqE;JO_Qs^W5r*UHWd8i6ge>KkGV>fYbXL_4=acL0$SO*76)M9T&lDweScdjNBVM0 zf4P0(|4{ya?#&SXq5OX#BE)Q&)Ex~$salSInJZmlHk_sf5C=QO;gFEw`lV>lo9##APKmbdjYxdgnR^-ea=D#LiRU)SArj%pD#BFHVO8c5(!295tdHR&V24_-JVYF zzGl9mIZtmPn_2Y<|GTAB@a0wx9EzG;>9?Zh%T+q+ zNIDD*u+T+kFJfj>fw`wVt^HO-Ky{lJQA5)(`jG`<5D?V)Ota#WTb>=hf~8{LV~>b_ zJt$XBGY~7l3B*blw!_0C9#WM>GeUXyu zSutuF@@T);l=;9cYRVDwH&J*1!r|zsV76aq0qz^ENPL)Z(kR>z7$es<2NpR#LEZgs|cZ{$vIY1>3_ef z6S_`Z8J-(t9-?+-GW`tXCKxjO$(Uk7_7zDh5IkB2%}w~yi-=8^$+8Jy=l5k3!2ok% zbkOtl<`3KMKc6w8|KS#~;LH5CA}IVpwtCLOGw$*Y~qP`_o{k-hmb>rq=l^ zuzKHupe_`)7j63j+Ys-EPF~#KX%LZ+ih7+%Et!)9-fY%YOKaB{3TXhiSJ)$*t$OD} z$;KwOn^kjo zAD&ldg6lI0;`aa6-xTp{?l{mJ=mGuw$29-XxAs7~2 z(jX;50c1p^uh6p?DN@k@3E9J^0Zk&36~SudDZgX5V=!n^zE59;#%$jZ0EutRFGb~^ zbW8K!Wc7yhYtTpT+gz7B6nv=}w+xUg!e|}YX9}9NsNxR_oCB=jkCR`a;$jF!NRg5# z#uD4JF9h?qjL^g2(G>`3Q$pJ*i(V8IIaGhp=<5~;a-iS$#dNjfzO}m9TkjyjzX@j$Uk_ZI!;w+_X|%umhOeeh zhOpNW-3gQBVh_W;!ZU0;ljx9nly*fL2uU7MoxkvLU9_Ro;Qh(sAmoYcs93T~e@*7^ z?LlaW@7U#__Idh8`~2rp_|FiZf8F-}zaSN+k-s1AApP0#MXmZ4G`9*mP%gykVNWIq zQlT#bnMri@_ zpCriHPY=CFIT2hX8l~2F^6EHpWg^N=j3R{4JSC^}S-UMdhoPwr%D+(+)_Cu?HhzB+ z@qK7)&7h|r1N3S9rvm=@Nud5wX?I>POh;&hTTD=lv!E9u=hgl zl0DkF!C5LIBVih0c{CMF?4(jc8`_#*;22h(#_I^OFw*thr^57ZYFJ*-(~rI!$`t-| zat%{NE?_yUt2Z%l8vp5*aqu(kG{b4=Yp2g0FVYWXV05#_>avbop_2zj#D_v1E_E%p zD|nLJy84a|l{Bh2tO&OW+a^0mgx%J&N?3Mx&9)DTQZ+9;&ea&D>I#v1l#4f(M|vT^Wotr?Q3}coy&&~R(|%U zE{}zyQSKmXf@8b2C2NKX z#1susaz~$Py=YCh>)R-kUpaFF82VPq-+eHirFFzNxv_F7@O-GWq#&o(-RlqjC`N1f zgCaL#YT{?@aFJGHk>_HLYv(b%7=L?N= zzsAdguA3)0=A5cR`02ryPbY5pO(z$gIE8H=v`8*z1n9O=Mbm`L zMIB6=jyIHa&sj=-iZMf7jFN@yNC=3Lisyay7K@f*ghl)qT(X%hRRvKwkG;7ibphRV 
zi@iA|b%EJ6j%}b$=KgMT7~8;&Y!R%g5z@T?nFsDd=RKny%q~a;tflo9S~wJ>P8UrNcS%C5kjap=bjzu*=ZK4NQ$9tT}Sa&(oTz^0L}_$B7f8aq66@0jvRmHkOCfV&5G3eozin(&c)M?=N65 z(p^LP1+csrRVTTnzcVv)2r11m%^|4PyF0t1ERMI<=q!feewK}_=gFx^3L*DKrQyC& ziz_Fdkdn~V&IYI(un%mwZVm~CulR;Xc^U>W(z%^4t@6G-!>{A+Sg7*U-}zASh;o;O zsaWfKaSFeeYu&3Bv-A##s`_Hco}|eB)md{n{Bqh$h+##_@I*JDGUBZ_LN&qJ{7x<9|p^M|?M)qtZ+*y^aJ4GEqgWC}I@Ws@?Vo8O^+Oqn} zF|{)DW$vVx&kO6#E2bCVrEHQw_#DGCvvM#Sp|5GsWeo@M)1V*#+GvOZnGjOYAm~CV z208!}wJ%sc+#mCup+9CKIC}C}6yY?V1HQ;R5wKtwN!Cvi&_>e~^q(dC&EGN6g^(jk z3hG>jU?EZ=ej}`q%E1zmLR|1Rg70GtRztjwF^6EH4Ti|n0B3ragV$F>x-1wDZPZU; z8~P9eb_DJyO`I+xnrNT~(c89i`jr8$t9)q-o}u>m18}gNL*t!HbT4CYwq@_ z$3*r*_2He7Us697aYXw^`~{94NV@Hoz}xpa>XW{)>p7~&4|u;;wGp*2aZRAK7Y*e9 zR=$5;nEdk+>%XJ!nf?Rdicr**eg}${Tv)|h_(~gp2z6rsP2V9JiU9FBYpnMc%67$I zq;6y!`)0ut^t;@g6Lg|8K2}Ei@#cedFW|$yw-*$n&{u`8?5(WqZZuZVY~^3*%|^k$ zHfldfH62eZJojYpnP44%sSl}jP;91^3KaaQcj31tSf@pvcBVAUlp5@<>ntl>!;>0* zJ(fL>0T;__gP+ejtS)a_f#Jhn^g$Xqc3sPLoU=#MBU4DX>9zu~6r;utAL)klqK`)gDMO4ehOaWVQcQ88Kr_W>NTeAoq zzwH6{&cg^dpt)e|kICVmdsj?X0Ot|g%hLwh`)X4Tc?1b+g$n=xw zC5u9)m~+RvMN+={-J&`%*#=QWxda5-1XyZyo_DVC_cDBZ{fz6|9r)jokyXJ+3#e7kKrIe+{TW~m8^elAdxhfhqB#np1LuFNrjMqwUclHB`&APe#X zM4qL?L`$U?nF6;+JL>946L4W0DnZfC<%dJaQbfoxjIvmys!dSp4q{ZRaZCZ)fH9+_ zQwTEP>NXvu>`JNnZG)$pX~aBqwH*EuaH-Cv^Gs>PVs0VzGrAQ3C0kp#&M@>KCxyiE zK^HX&*0lVnT1g^fPKeT?;3TU!Ba*f%sc#L<+Gs(W zaUK<|9mpAooQH7Q@9$-QBw*;?R>+WAGXeTbXtuDXU>sS~l>%btX|H3WQUbLqWZn@F zCR-arO&o0rrOTlA`sdgYoMkr=RBUg86re z{IRNbQCg7;04+PUEvkbXe)uj>lOG54iHBYGn_CN68bzFIug=H)BLji2J0t*`6ry*)Jw=r)IZnM4_^p}`zOQ5U0L4&hnf0|faTNxO70 z)-}6yXW>C6f>Sq7V?_8?lJz*Phg$5xcka0+KG{6g%^_t>XQyKNn;Bq=0*#a#d@3G^ z4B4bFU6rcG8Dn)!IE+)7=_*HaB?d8jZwgXBxHcS<%f!8pQ!q@DrHA*L=0j;02+8G^ z*P+3>=OUL_R>x#oJ#DQ|SwRUc6jjbw+fKVq;il~VFjU|>iIX;DLT^HI`;CD5h;tLTHkbnJFck$M~5@ zIGGp+g&YNE#wpXZfGOZUIp2-v=p&MJ5V4DTKlUx(?{)}JMrRcQYK3F}>yi@fk9PQ* zdHDBAO2bMS#}w69&vqqmCBczSRD@IlnI6WGDk`aA6*lReP*%R4fuwp1D=F&|;g4m{ z;vcu&=UlIE`Rk0S!qYFI4na<-<9iC+kGK=9tudyx3b=WYvB_%(!(JKJ8MRAe9yR>0 z4=)IBmMlCl)VBhSt3 
z{KCzuJn@lYR99}T6Sb~fvjo$ouuE-NSNoorzTe7?5OUPlEjW7Rn^liHQnu;R4c()( zgT9x0>Um-GE_~1>XQ+wYlCz!dEj==j;Mt0JXgC8TB3MFY{PX}`Kgso@)HAu?Rcygx z!{xJwu`c#V>EKHtTUfI7WFC# zeySj?5iW{c_!b#<=l7}~u#h|~tE4g?kiVaAI6_cWVmSeI_ ztn^1|Z=cr=ga15P@d8eQavN#3z<1_H6-Qkc%Uj~Y|Gh_dk$0Owqk6<9d2Id!;VB*$ zM<2?x!}lrEr8D6A@!_Ju!m1S?Kh)QnXMv+x%^h@T`b3~+?x4+)Aw#uNHXrZjBOg0) zydi%@7dzKBNDJ(xo`7FCRsAd-k6!-~#zc(z#~hXK?8kB%>{unLx23C^fwc+sie*OM zLOASWx8m4cV@5;ml=S3^t~Rw)^c3Sobh@-z7aA!etQ<)36JuX^T4iqTZyfj1JQ@M2OM)y?Qi>>&ZwZd%H2@uFSRw495UZw~me~0wIJH`+7g)<(FJ!$3 zoMHo!GUNj2kvhmet}*ubp&_xwE8C`hBeHG&?)f2^#CoRNg(2#x_sV@%GS3#<_2jRB zOD+_@1o!rk)?kjLd#AqPTV#45m?)z89yAkvw|)-^Cx`uMhZv@xo(cG~(AiZKLpzQ+ z=*GR;Im!J;j>(2dSO0YI*7>RQ&+S;4V_gc68KYet0NkOjYyj?fm!kjH(zC3Lt$`J#jU`Q1V^q2%F`E6YUYAFxDyfYFo4VLXX0xN^ z3-&L?-*~tZFJ)m<@VDn=C9WUM?&@fU+^DDNMWHIDtA)M}`nz)2qq40bFlX<$uMO!5 zhtt5dBiJ9Ut8ICPrV=Rz$FE2M%aJ>YG>O$LXs~Leyn(x1)zsR< zpH!zUC5-w?Y{-1nB*Ta_$<>T$K75R4#fjmp*3y<7qPAcrg)wqzh{;mjW+SEPJGLQf zR|^@8S|yw}pgC%gS_P6ymLXB?sDDhT(R*4=T=?DQ`I||(2!Q;BpzgP)OZcxT2 zdi7L4sgC*{t{qncbw2YDKn(Trqdf>AWFz}}u&?(XJhSKex1^pARnjqoPNn+)$V2+) z{w?7jP4{0mKT~^aL*xHSMEW}*{;xzN1+9{XiWWFnrOtfp*ky83^H`MxC^irkEFJOqB~d@Sc$ZBTClfRf!Ah|G!=m^>^C&ld`UJqVohh_&#;v^Tb~Pe}Ntau4Lf*yE1gqlpOC{M4da5BJ`D93fGkoFGfKf*N z&!!sY?#>I$KH6p|`F&5ec?**a3LizhWh+T`6JKgM=P{XEn2;1i-*V)$KGuo0GwHQR zJljB=Gzn+Al&yW#Xs!})pkqYn^($xDC5CatW3o&zzV*bt!>lQ+=Q^msazP_gL5&m4 zP7o6G9jrJFiNpv#GX`aFJ4U{jx){p~Epb^udrdIGFew;hbL1%);6{_`&kX){Nvj)V zjRrhmy{^$O`iUYz1S7{$Yhr4i7=Vxsuzw(BO*$Qdt`chv%nJn@!{{V(LVi~rd3OVs3s(NrP4)>si->`5qzvFk`G_)tMC}9 z4R(l8UHs9S8h|RiMz%_pX`&r*C4uUf^=Qfi#goS-cK|wI2of}Mj@dzy4f^ir?qA6G zmltPj$Z?k!r2S%loR<7^x+DJMQT*q0CuaNCFc6_?qYQGK`qIBm2;9rJ5F}GrMviZb z!_+E`L8W6|JQS*eA-wmOc7$1hX>DqC$k{o!2im=C-+&Cx#@WgrXMLxzeg^pUU#_m% zyjx1T6u$k=`?Ttmaj@zn_xgCSRQYC_k|i)yY>oCBt$YA$*DKU)X+BL=&5BxhqGQ}% z(80AhzD2KrO*nIck#yN+DJF($k9h>_2MwF5@(;_Bs=H5#XJf?K0ixw4rLJpv&@Y;@ z3#Xz$=~}+dYFXgAX|7NelIYBX){FhuR5BNNg;bv1hT{x-NwMpeep#T_^fDz~{ z4rnZ401DkHuzarUFqE&;v_xEdHI9*3xJLjwx2em1O0+yxEMKo7cQ3_K6PX@)qG%J# 
z6uNWs$S^oRoGJW^Jq!bq{*yp0COysCp(1=ws2~A{sxpcrkE~M%uJds2keq(KBTdTp zv^G2?7M$xPQN04;xI`L2?wKu_4oUIzXp0sJl5y>8Q#BrzVjFf3K+hPORpe%igafS^ zn}4^34u0SGH5b4_r>$3cA(m~6ahn+4qF3!0!OV{gAR|5O(U~{XZLAm0&|=13ZP8zl zu8X!$KX#ljj|*?Jy{6{C8kiWoJ|$%f;t)y#bHJ^N$g#2XR3$y3U93mGg|vlLwQD%K z?FiFOxKM+ZZ2Q)A)z=O3ea)U$QlZr{F~fTdxWt}Yy$5!)H1d5C#Zqpfjro~T^a&m- zGz~)!m0R9jrH477POGUkd$$O&FpgehI<{8|(e0A&$BM1=b9kjU_a5{J;gOZk zmUvCPn5#DJPpyOu#_u5E@I-h- z%yCO`yul|OWiIEfi|j*@XT+6HBgXX$P4si_BJ#XN!M%q^@{DX%X!D2Z^58!4bY!Kq zOu6lGh^Bq3RfuwlqNBU^iV~nb%+GiO-8+oZ<_jpkL-GXUcZLBOWmzLMDG-GL4h#aH zSHs>!51kEn2?K%|zxoL}a$lidQ#84Ry7``L^GrMA6CJ=sUIS_}u3LC!ocV`ZAV2Woa*lq|Z7wKmUAH4#zB+iZ^VfzI>3L2IGN`*0fC2#jb5!%s z4)f<><=-6!Wb6AkbHYdI?^_s)O=W6WGVBFe{%9+*$=AAHfZ&=sN<_UZvu+U#Z-RFiMxS|aYSWkPMHy<=lc#Mz!h9wtejbyPt>n)(O{W|fNo`LBx z;H{!tFp^*ca{JXkZI?HfGog+(K$j2c;v3bcUu5lD^s&|&wniQVPGA){=MG1$UTq** z`1YVgvi9DH-H7ymu!n$E0cy@%_Q_SQA+K?$Nso-+Y7vyxSb03{t<7gw2}f!|U=>lv zNaPO}ulT1X4WqWRj;(h$+P{wHDVhlRvB-2T3DUyb3 z`Jh#;@sQ%Lx|#=xXW@p4PO(5%95nAcJH+L}7|F0ely26lhG@T90Q*`#P31OMR7#m$ zW`M^mmshajI#MaRg3x~(O~<9gKBznLy)RFz#AO0yjSfVb7z>Bpb(biQR8jgi_|A!E zScULoz!udD#0NswW2!u+_i`L8CZ-mN1Ox&{JI@X#j34}&41!!?Br%?nXnaL)Ip67I zxo|mpk4(B75CQpnQIV1P1bi!J2SLeyq;kCuLG!DKfuqr+9$G!jAerm0PqJhwP=eA} zqw(pD;Jf)*T}<&y=14#?M+}m=$ZH9-&fF^xR;blNHpv4o)HqQ=WapR-ut&6((H$iG z-uvIi6KBV?2`-S#G5xPH$MQeP+@F?HVLMx=|C=>o0fM_>`X;=qW|9yTtN}XiZ=DyT5o(L~)t@K?W$(0^~@!|UbiL!8mW|ImN z&xRt0+{v@jgl1Y~ycnGK2T=-0;!1(0%kaI`BAC!2a=naVfj5-GexnV_QRwDRsyc6i zxUtn5)sAA%oD`dy`X=|I9;ei>(4J*D^_Pv@*Gs%n`ZaSkSji)d_-1NMgo@(NBx~^L z7HL**tTH63zo*jF{5-&noJE7()*ft~yJM~N(LU=4vmrfTmDwx^rIJA*PNuh8ljE4(5oBZm+)BCQgKYY$cnzvwBrSjD#+VZ#7=#FHTB3;crIP;iJtfYSWy1 zvDlbilChIafqw=$s(cZMu7jfAO-yDb9=vt0HzNv}@`=iQcoKy&!G9qR4`#`A1|m%C zTQz^Eq_V-l0HK;bc;yA8v_Z9AHH*cGo4>$trBGun2>5GH$oBkR>zLb9iQj$~t{rg~ z$jXT3MLAHET`Q14EYhmfn;QJp+QNzchwmkh3BhNc5v(W(C}q zb@lzE?2t3!u6AY$kuk92%(+AMC+>IFu5-1NpSTJ}ap~T{hc8*?B1+M(P|Sy4ZsZj$ zAQc3vS){TtE|UB^1?mDxGq$lZP*jgziCxcrn_vp~MyVV@Ce;2vOsM~~0Q_k}{mV-x 
z>fmDdH)7p1LKfa%01%}A^0s>;g2SF}1UUjHHzXtk?&3|n475q#(5Ej>TW@*d9B^G1 z3v90RuG{Yp9^O^;1#bcyOM?6P;{7N26N^=x&}vP3jPU3JD6iC`32<3_t0+Fmx6P2D zI5aOmOM_F$#HNjwOjg-y7(;&H!RS}3Q4&HbP*<#xypUP8cB|A_u7cSCn)Xg%?x2u= zTpr8M5N&V9Y^0%b^ z+cN)eVf`OqPNHi;T77o;DFa$@CiNjIF|RuF)wW=8h8W5j{puJ0wT9HBc^i=W_ZsqG z{pH~VWR@3=!;G^mU8xpXvW15H5oFk!(YmK))@u^1@`c8)`R8dwb)u;TOG*!mkDFoe zH$IH@nRUTRx)HgAd$Y!J;Mw9j3J^=Sk}1$l+b)EQCTH3G1dV;tmp^U^#Ee#Sq-&|X#j43S#7)_c_G&0s4 zL&<)_oIF7=r(ydof-<&WFeg|L%;|336-($G7Z?OX+nY0$Q=76>Z7L^d)OG@1!cl*O z{o_;3vL&o$cPIKFm$1+b2<9ZQ6Z#A0G#!4)0l3juuKIkU(iD3t`y1xOd6F#dn@j^J z76?lFlBZc}*g`+}9=Mx{f#Z;P6B^0`3)>(tmEa?rUZCqvmqI6hny!&U*YC8NgF7MP zsb0U$nNv%%1d)8{kPKRX?u*qap^^7E@q}X#aevuDia=Z9yHOTF6h=*xGaHtgq~_=Q zG(Fg?ykSry2-d|_j*L9|qdMxzzpog9Q{HLT%{yGV*MRC1{{TP;q_M}08-x85CV^sG z8u=eSfN#YxNPeFl`h+xxO%J8Q{WhMfYryc8rs6hR)>wIBQRQ&xq2+p!G}B}h+(QJ! z5qf6L0Q)|PTkv87JMv6Ov^;!H|2}A3sp%`!q~C9eJpzw>6#+@?pERd`)+5m$68l#@ z{_@?a*r^#>|I2b4p`z=wB7(|`!yR2-6q3&Vb{leez~&`-zG<)a;S;$__vWlcIi?mL zb7;HEtRg9iG+rWqx6``2q5D?0Y1VKTU{W3b;R6rjYCLLT!<@{Zary1{C8r5qFLl44 zcOV9L=%P`SRe}}Az3kd1U?(-w)|uW7J0+VboNpG8n4=sIwJcn2U~X2bu;$x^Gfph5 zQeMj1XXQCMi@QL+x)@`7JPzCIZ00WBAm=q-M!rQ((*m;Vw63hG)!RgK)Q{qQ8`Zi% z&6^XqsnBwX*6(TZht6@JT#sz z{0hfB-6T=q64VgVU?(uV*}Cc+xab&64SdMeShBvdopULR<7c9mL#wBv3umC`U! 
zXeh6+jF6|#(*cGd7bVtki;m^`!TP(m`#LU1}XEUEP zY5cUPxlDe7ym*JW3?lz4Xm7OuUco!G@$Paxb1~1E-T{hss)-Mmi?djh{dJeL@ zNe6i5i`7aDCFnCU(LUNs&+-#KJ|Ll`W}vw|j@A`yI?<9m@&8)(p~X=ib%Er0@eg_a z=N{*uWSZXxOQ4s(wv%)tbm6-d5Q8Qt?=0gJFmMiQ2arXGFMEWI-_EokIf8Qd&=0!b z(MpBKK(i!gvK&b7zc#JEkzpbQUpmr3GplYC>kGBwaDdPzcdoeuB$`^3G@BlTlY~bW z>_bZGI}^+13S&O6v$1x^Mlil|rFz=wGOyTvXcQwM6!9w%lb9s%MVJjUWLglfB4Uv9 z2GHMjk`oiPcGF@tbkelOh6rO)wPK*vauC!i-VrZhX<%X9Pnj%YEn;D52*`**elF{) z5qU}LYjgJbnO8Pzk_AuCE(XaCX$uK$U}gXwBO@>b*sp!VgaIZ20R={ZL-G4~u)B$C zG7R(&*ZuL9{`oEbdHDN3gefpH|H~s0|2LN}VH(mX&|l%=9W3~vXmL5Q(3^f(t-{>g zvy0=vngm9RX-Q?M7g@edIA%UhzNcK=3G&ZW7LYE?OW!hh54aDw6Q3VnUZ6g`(_ru% zv)2W-Y*%)-@E&XNgjB)V`U7tuu?BeCp5+EhT7fobvn|DqE65d5zp-2c)A3OW2}`Q> zXu`~>0G7TVYNOA6RZ$gzYZl3RNYR&CjTy2)UK70`qvQ(^hdM#4e_fb{d&OL~pMdsdB zwfb>=oI1f!BE8#H*DATjUv$^R#39je8gAYK`@yZvqSgDUXk0s|06C#(XZ&j%n`Gmy z71XMc0tt4Ocu&=8=`DgQEZY;RXJx;{4$_OyrsC5`2 zh6kl-S|hA{VUaNx%$ivpcv+XL18{Li{DhxF4D;FK3bB^4&H{c>ehWdAU$FPojP{taXA0 zoufbQ75#HV>;J()^zT{x-wq;Bb;zR#qP`AH)6c3((LpGxIDgjrq=be9TckoiCzfmD zPB5}azckK%Hvg$}7|L64JK%DOz?7#}8~Uh0Wae}X#QN=U>To&&k>Y*d4Qi1 zqE*{y`3r7<=FP(T$rAf8mcV%#3oabMXA#G*hJtF(^MXE=s6IW*( zz&Ls*nZ^)zBY-YrjIorit)pZw>VV4@`=&2PQ_QZ{4Wx~YnA3de-T8F<_4$gm66;9g z`wkCHMCcJ^Kt*7)NHvEJ!v>caBf1#wP8@5YSp(+&%fypc-X@s1SOU>*UqSN}YutwingmD%X~qlVyy4#Hf^%ySRFo>cVoZIBO$rji)LiTps|0 zu$+gBRw{AuP^f?zHn=&&+j!p_vvEqWe#A;4uY$m5ih1<8MD^6O*A@nEG-J&PC^$&} zK~`Cf@Rh+sxELc3^;?@%Nfxb0%Srwg8dj^Qanj+23G0w9 zlKCQWXa}stA`!N&bwN56%rW=qbI!7yFi}iCJ3*f2#sK~LWM(L20eR*;p`>yh%kmec z4J5gAqN@n-6i+STiX8k0fnMaZq{TB9nV6!gQ4XTp8CJGoCS(c*M4~@L78{Zp0a)8*wPw)c0 z(vZ!^&RT38R$6I-Q{1v)jTkFzz08=%Cj~Ywseo!{v#;5HD<25+w?Zwh)zb&~?2ZOH z7^U&C2*TS;7k@mTypZTQkx9TB9VRk;5(GLRFm$d|TU8t_rfAiMyK48KIcCFRQ1@!I zAWu`9-lFRCM*$#u0S>uKb$rZYg*1k4`7)KkL-+3Z`8k1BfSLE-v2~{#e&OOf&lfGz zTl@V>Br!iGeL(2vh!9d|^oPNsl2OuH6|VipsQp001@RjfKUL%|P6Lm1jzy*(XHovsiVV=14GIufbf|3UI|Dy3==# z?RY&P5sTdWsU*p#@_-=`BqP0Ux{1uD5O71oL)-Yhc^J;6hWi^=z@4k~9c;VM#`r3U zcsGZ=BSOqNLdY3WtUBVe0YrYt8A#a~cTrW+f?m-{ 
zavdYZbs$?^5iJZu4EEgloBAX8j9i(YU@$VQbl)7fWM@Xq@y2|E(s+fh;P8qJE#%xW z_I7cO{ig(cz(CvMP4X8AV@85@UV59YwFNQ( zo4M|8;4PF@K?YR*`-bByU_K5{rn(dI09Jw=igogr0Ujs?Mt?~xl%$+_4}x#xz1~Mc zmy~md*(t`D?@RqvJhB~!WL4c{nF4MJiO(H#Vu|``lDf;nNo|$TF-{Cr!xhljod9FT zm|Xe@!p2Kob`TGSdgP;-{yHY^4p$No6*Z(0T&TEPr(ms@g@ZJp;0Oo-DN4{AQ`rlQ=8zK|@%plNhfq*`Wo;}9j` zG|f09j|nV9WebJAaA)!{hzF z>pQ*Pp7*^jzvo`(zR$VOIrkk3&f@gCUP)fLf9)o}@sEUz#O}-Y1-&0`1~h26ce$5( z--&aT`^E7-U?I{X)`supx1+6ZIL}dcpX~{1#^X2BV$!KqqxtEC*Xym<&537dB!C{GWp6; zE!t@%BF_$7$rPl%s`}13fI6v|athADkQ#2@~9YVJjOELmEk%E?%p$pp`M;qQZUV zj&zI)p?1QzMR%KTEz8d=@-a7KsOirPylbEyjo|9gW0%rWVILe!N-|4!yKZ}XV{?&6 zh=iRYHsH#a!STFLxB+YRH8+0^-5RoI3otRR`tEYXBXF|1sld=#%)oz42D?9DO1Jxm zcHAWbtuRBIX6KWgIC|$BoYER2-z{GuylJR@N-}}AglIDR{t=BwLZ&VE!x=vuEVyNCB$l`@(%<}iuoC(BXUeLSxwq4a&DZ>*f-LlN_tl^4=6;-F zI{(?xfdy3bWkCoi+EEW7av$gDHD`MgoV-Ht%k0IM!{G}zBZ((^GTy4pBs02KM(ai< zK2U63^b1%HRt;#({(7SGxmwGjfvfNhxen~GTZ9(rHPhF_t4!ZdG)pQ5lsO;gzbbEt z`7T{oq$h)xR93&x^Ed`+OJ-e-NZf}LXOI6T8NVz26MHRfiGZg%lBg25`S9Q{x!_w8*M}ZZ1u}o(MgO#XTdqAf!g_S#! 
zCPa?Ul$A(HgSw?}M_4dB#y6`7_%0rS)+}SmHnZ%&{4OIM7ykcz^G+Q>kkwS^H6XtI zAx!DXNd!6d?MIKq2RJEmgGUKeF(0r}jubVVRrxx@I{iN52F`k7RSgyBN1eZ+)b!oi zH`JQ4Xm*3YbM3ag*PpdbA|d=rgncWXK#Z;SMKi@V;gQL(#sG_VheiJ+OkCUK7v;`S zBVWHbvWiK0P7B|QqHm>4KJVfkJnjdqN%T+#pNJ=zB>&X1O~#D!1%-S&pMk8Cy+M6)t|J$(KAY-$=yx+9qZcjlUpMOXP@Jnu)4wCseGRL( zx>5g7Y~SieEmlELZjx!_8uvvw_jr^`#S@c=&m?aQ)FlXcn1e&q3QlW1!A;^jJb50G z)ERLKa?SCkc{&{vuT3w%N+x5R(-#S-jaG9)T=~W{FWAqML)(svh>^P&eJ!!@;gV^T z;dq~JB5ytku}J|v-BFhJs`rC6o)_7=wlA*@JZ_!wQ6iit{Mh-dFXBFh>#3WTtQk%P z3O|JAgj>(;_~xHBt*fkA?+5)R&pvoKV_I}HizY&{?o628)4qq|8KbReJqG@aT1ECVKmZ&*(-C7oETPRG-=~ zFlw*}>$Z+_miT3`j%mM6MW{b5@@WJEBfH^7LSQ$kXP|0Y@#|@SeW~l=H`ngkL^%uC zbn*9!jp07S7@MZd6fF;(aSV=>OW=C%->-`+Ir`~Jq-UZgZ&M!8@Mg-`&j(|?lhSo{ zF{G4w*vmaX;%bxPe5cjpINKEe^gN)Q?`6TYkJx?BY9j1N`-B zN*d1Jzh9t8_RxSHFRCv=@rNDs_?U|{^lVbA$z~_8`id)PPdOPHCMMpiTYB^aN0Mf$ zlz{Fr>Q5gK#HOSWH$8Dq8`Ltz`TqQ?tI(3Iaru<|X&RGP5B%s|@o88$xn1O?8!%sK z_-WSjbX-W}zjA^~t=#T3HU=}c@sCe@SfRu2zN;gq9XNh>#lX#^pt~O+N1!;x`6mrT znQ5f_*WTAwL`ZQ6ytewA%ga-<)}Rt6z8!ElbMOq-I-$&slDzISinzs=)`Y1dwU4}M zth(0&5=~`EV@Ap}-Dka)eES__A}%*mkJ>KsxO!ga$!t9nb*y4)cDURj#clG#C70`) zKW9omqO;Z!bvX~ybruiIRxsFAD3_0Lg;mgUR1nFOE4h6us7fFuXXj!4i2dgKJsExl z*V?B;;d5kI@}@IJH|96hpWwbRZ5vCNi}HN& zcI*`I%U1%!EK!8KZ<|YVhhL=|;`WC2=MS=>cdJJIz~C#LwKsYD&5A_xQk9P3+4vBl z5Q3h*%<%bC zeAx(Xadjx<$+e=-p<$1`5pC;5-9<|qL@)BLk8tEoWUcpRmQ4QmNf6}v`<`jjnwLf6}Y&+$n}s5T6O(gWpfNG=K&K%qURs^-+_um zuGFg+hDF1hKjxh(BBjLW6@A{uOMMB0M_x~Pw6jh^A?OF!#sWiz?Zbz6>!j%qIY(4A zKE|2Nd3)_W55AAAqpksq-{DnTa-H{2so3Vf9~!8+60aQ3m!hj+KyiaoeUVeaIFc&g zu+vr>?`^5N7k6H!ZDly-U`cqJHeKO7Z(M2WJGPvdTgRoWupiEyjC=CUXD0t#P$xOY zB@D8|v%{%=9VC@(gq2^taoT;&x_u>2A{1K1xZ?@+zZY75^vOm?yoqW!wcr=tcKdBR z--Yxx`m5Jx?sYWa2ebGIlJfRm;2^IgFq*@6rfoPwcWkgEMn=i!h_wJ$%iK&ZBIu01 z+p-_-?;0+~IeD(?N-DH+qIQ>N~&2)(zWtTD7kmE`8#qj0PISFCmDhJuek&NSU@cK(}J!I&2d z8(l^PUog8gjsMti@DwJiPX8UebgH_xwAf2>%S6N@0>NWxr8ou;bG>!pi-Wy z#1F$UDoPJFECYAm%DtnX>AvU~1>U2ffg=8I>EX&?2m0r&xA;~CR{1*lrdFo%7p|bG 
zDw-;O70&;3_bn(r+_#}OL^RU=%-G|iXpW%HTH=VtLf*+R!+y)4uJGYWg50_Cj>SrO zyLS#EVLvlVPjoJie8Z}bwc#d@9;&5PB4gD1+`^Gwni|4#ed~tzb+)%;wjOzzUuK8o zm^nR+yb9zxS0dhwOa|GBfSYe%=^8C%`sKXK0-bNRc75c{hdG=|S6bPxE;)QA zpJcbYm_E*c<;JQ~F;+NaYGBR0sZTRfg28X`?xe;UBC`gM+pI34DXg7$ab|>NTLv~t zz|HPSM`PC1UYEy=>7?%^m0WHh)~@-21Nnqc zgCpmfPX-L9D2J8hO2~;*WctV{7H3GQWKAS^q>}g%#L4_lO-Umu67AQO(Xj z1QxLmPR6$`-Zi`~c2@1Ph--M(Ans>hLsto7&1%YZGUXN;JbA`m6SnOW`Hbf+21~k4 zcP?l#(5rkke)a?FsM+yRuJ-Acs17cjS|fy4I5R!H=bhie)rYqBmlFCPaxz@VbsYCKmTYVb zhaM6BD(_`Yt#dcDO;*l`wz*{91nadIta+9_?)rVRM1@nT@x`G&oBSdF3)b_^aeGY|_7suxh)JySd&Ve(_elEk>=8a@R|d|d7?vzAKIKm?ZrL=F zhTlw;YN?{XuG_wwPJM)XA*{1Vh=e$m!tyR{1=W&kh>If*v%ym4BTlJYiUf-GvClkW z(PVdTXJCyGat93iZhSdnncIEQljzV7Dc7%zMpI=@5+UC5yz@Q0^WGf(Lw9~T9qDxx zJbfv)gT9|KkAdYrf3t6{xlcT|uz;V$l*2jqxa!H{w!Lzt{+QAYb@4XGucazk%J(ML z+G}^XYs)h|sHkaEllMCZ3UI|`JuB@0AaLW)>*x3{+k2UfX5KKK9_{GqbBG_udZ|)O zRe!OFxa(wsi>M)2N!Z)&lC5Lx%1^xb6JQQG|7Z64|CFUWzjFv0rg{TZlqbL40kU+s zRG}ec#KkQaBMaM)I`Fb%b0ragouEQy-zR7%ID@_QRJ`WzQ}ODolO|Spv~r!xZVR0r zo14GZsDus=A?zD%F7bclzS{9fbx1CzG)}wXx+TK*Mu2IVc!1q$O=I$FrB?{ei>2NzsN3kBnaCC5d9DO$6 z^j=x5y(mw?TGq+6x6d*8bq)m%5#lT({`$E0#ZENh1`Y;e!roZ^)#u$t*O>HMPL)#% zDxOCe3oCW0sp|};XJ+yRJQ22M;W+jh7+DF z`gBDbfCJK&LCmf(LD5C44+A;OB1^ zmmA9(a%txD@rMEtD_rt*hc)q1PnI}G$l+)s3abTVT*X-gY!R1q#~Ortf3h$!JwDM+ zFcEl{f}yI3%KiG`+6HYQV?SpH?(+$OD$S89R|q!bsoan73mGu4$5EO+J@oN(H%Gg@ z<>Ik%S;;bcsY*qjQ#=+AmzBzbZ`>)Z2&Y)kYr@m=mtriTYZsZp55u7T<^S_nBi*oG z>R1P7^b~VM5=#WFzxaZn1omUk!I(-}GmEE=p>G^dhbhLczFL+DQlk`CKGl=boF-p2 zA`)@zm*115Z}e#ERn(L#lLWVBl6%>!bQFyZh}hHC{y@|vOlT?1$yUHw`5Vk&hny;6OVTxb$Tj7eB?7)SzIM!I|@4;+i zXQn6gc}n=1DN@ub(LbvTW9GNsubn}7$K4gISeP`<=vXZ?4Dnri@`tJZYI7Viv4}ON8Nca1Vy-+-*Q@h|4K!;r!#sgJ1NZy^dHB0jp(r)+pMS% z$>7aM_GaKb+k*E=Q1FrS&^%MhB8gqT*yqK^SIw@RUa8bFersJjHrG-4>ev$QsZ<*I zJSA!|t31*c9Zg=7vQPOn4tPNeq;| z7iop-@p5u#F091=#5L{03&N3_p=n;p)S2nNf%7nGvDg2R-LoY~+M$avo!4QgLW`y^HT?ZJyd}_#RCRdd0a! 
zoE9WnbrUW75A*t=E6K;rD^8kIK9{EnU^3FXgHcX~UN`3P7UQF_l%LzmB|`17DJkml z`Qslpz9Xi_UVP`^OL2K9SqthMDg5OCclImGpL(m9D^yrHYpRj5_T z@?!A2Evvv8%`*a+J9zlV; zq~y?8KaL@r;rsO7bo}*1&l)llPdm>QNxoz`43%EQS4U?akClg@EV$14_qU}BXxvpM zk-5%U*56J(uce69%IY8ONCgiWl4uKUv@$DO31rPy$k$6}4=GbVyHD!XZ!!PfS?(j* z^;WtK-(=e_>X-kx|G7)`3{zSdvB7niqlIIBY{G>loj+vDa9L=FiE|ez9C_ob5DE%B=>) zV~110wv_(Q3^0kfl<%h(mj!0l?>#xbNU<<&{rBYflHBJrTz;3z&Ke_n5pPV%*Uy`? zD~&`iYssDMx0Ef8vmlX4%m^jmM5wn{WKC>*NHD)mqy@yAS?kV$4M?XGiIeLoD?s8CiSK4#=cA# za8Iwqt3c(+biBAI2|2ic#87YZ!Yc6WwdJjoW9NSBMD@!$*!nW+4obAEUzo4eSJVtY zx(+I0WyZ3Fl$MGq^O@38*wErANK?+!CDcYJQaK%YjlVWA;pBwW1`C_Rck=i`tZ7m#tMLraJ;sy)eYqKJl&Oql-eYtha@OK$l^ zubTskxONrNR>?mi@498#Ex+@16_GuoiQ8Flk9Bh^#~?h59sR$b93x+?{8!lcZ)P${ zO;&|Ul3+eLY2<+}eIZSrH@0tqe^^2NYiB~4hN#*|8gJ=0Y0tj{6r-Z)X z6=lzpHZSm3lDvK3RWUtV-YBCf6f7}nr!5+pTa8(Agtq9~xl8@!(Ym>OpQrig(9!F$ zM{N1AbZn?UvtybtX?uweCiUZ>Yh?c@J#ps_r(cH(8afg7$!10D$!?3M042LTAk~vZ{-wgHKutzdiu`NF-i7wswV6Yf{Xf<50Ug8)6Tfl%6(is zQ@WLMd0jU1xv06$af#Aa{x?h#2+}axRc$i^M*CM${4>EEY*y*5<5Q{kIl4qH& zwvOYbexD|PZ9`JOWh$B%dSG9*%jwHUGjuU6KIYRV6 zDSFW7j2Y(O%&aJ>XY$SD3jvkc9ae_~Wemc5-sE_49DPojn8-6Yw}!2#S6Wx4DwyCO z!4~(+&+4dL;)&Pmp^;hF#`SU6;x%3M8{BA#nYaxYyxmT-*O-yM{>TulaLnRp!S%vr zJi=j`UtejRXdiWCd#0l4Uu%okAE9fxoHPHufIp|I{InElbz!XS!VqS;LU_J*1Un$7lG-f{1nVi^m>;W~-uh7&K zziaO=XSGbqx260p?m8skG^YJVRK@j;%J<|@=7g*-!OavowbrQ)3g0m=-A*B|9#S1o zmKXAEdM-_$LAE&agW!^aOnUl!qsW@Pz*>sT*XvJ`zteL%9zmRGl7FNei7)7iAOB_G ztG@+##wv+ipv!d9qYd5>GnL4-txC)9vZ$j)KzttBy5fzFL5xO>h7JBe$0J2v81vBE zE2u9EF;t2fX~>DI3bRSeOR&oc%S%g$tEirnlh|y79y9HtKyhC93x8@8*i$TnYXDH3 z-yMeFqqV7{p@XrZl_5KdE5{cu{JSp(2ZgpH&m(^sNC8k6Kov5mCrI4xf^Kh4l>b)T z_}ge?=hf6+Q017DHZ-GV(anrUw02jg2cMvbFFr-3;(IOwhhfMCo^MJ3~)4ypi*+cK?o5ujjoWPqx+1A|=2wv^) zYJ0ECvj6M7xp($}dx4Wil24WbG#nrkC^@w|YA?cp|2|dRDr9JAWVPKJWG|SCG*f99 zKzIp$kj=Z12M)GllhB9#!iCI8y$NSl4AArw{D5wzu#p5Bo9;5a^=3++V?PL_A^Sz@ zATgjJHK3t$FbL2T1+>@N1wpAHXc_~>9OOq=84V;jfJQ{Y#GygLZn_M|{wp{M8!n{& z9dOA(lm>`no7w<>&vSY&yJA9cv~5cFvkOg_doSxkHUPV#00>!^t5R@~Z6nz>!R?MB 
zaEc%WL%m;>Qk10Iq6i?AFVY)b(L`(ycTJP+tutUnMFvf+M{hr=9TxVvp=;q*Zg zOGvpWs1BI*1iA$l0UKdTaA1_W-FbQf6pYk2ZSS{mK5vf02c!o8kpk7ghO`=rZ`nC1 z{Flsy<3UorJrbX(0;Ea=M1nj+vNYj%aPN44=RO-^l8gsP)7wXyxHcSWTL3#OA4Hn^ zJicZkU~mxp08b4Ysrqm{q)r-87`+Nw(TQ7`TJJYXh~s0`+-9SI+x?y^)&LHBfB~Rh za6!PO_1@7jfXW8&gRGi%V>s~N38>;^;(A385DJw7qXC$^qre1? zw=K_qN&5q|Bi7%z2pLWT&<+TsupwZE0!`V}+0@>_^dIPcQ9|>Z>}Q$7V-NtOj-sJ~ zwHj#@rZzBH$pHgHXT1 z-J$0IM?*OWA^(cRHmK^yx&M10RZ}Is+cW{HQ3V1;c5J*Z@b7`%kvh;TIT_kGS~$9) z&AmWO7b930jfXGAe}&O3K@5;2z;mK$WB5pxf37LIF~pX1E9;8XtCrkjw$~+JdT;$g__QxS?Gd zw~TPV>>!G8Jip2K1B%K3MaXQbJcL7RQ?wIJ{*x|it~oOq0E!Jjp_Ie!U`vLB?yP80 zt(G7gp&Q6Jd=-EMf>{Q=(%v04DR7{jE!_dLHQnq*MG3&+0654y0M%4D4m^}Dy<^{Zh?E`Fd&)IprTRz`66jFvkfh^z2TG9gE&7-}*0^v|sC69)@-* z^*!J}3>Y;7Ob%#h4`0AB;r6rj?&aWBs1L`~1Ztn!S_+5U%Rl5(QEnw6%`b#C$~FP| zL;;=1c0K~^YM1vN&4bW>DInS=CJScmWF3K@`bs}bG1 z>3Ia?x+}nK(EXG_T6i>`@s$V68zEqYv@qVlEb4=U>}WyJ-WF5}bVRW_fM}|(5lOQF zXGgrMdxcdp0Ehca6KaxJf5?4%4|0;(phpS#2R6(=%50Y}1%>^;578iDNu>n6{(_!H zB8Lg@HTdtN5H>eTYlUo#OIpO99n4Zuz#Ou*Cj5qjZrj)aD*(tA#$96{x&oV@2P7f~ z93m`q7<+&1Nfr7FHJ?i&SAB+n+B)#}=KcB(?+5y~tV4}eh*2h$oF+82B{#!e&!9}W!7Teg`^jRS=l;A8fLnXzMVz#ZE> zI7ezQs^kkiDjHyB&l0$b01maa9S8q9kS<7rYr|B@eLz_I<7(w`IM`kT`xjRaT*vvX zU9#>26t)6zU}v!Lg%}BVJNt@S57;wC4jKT36`;_-FwoF2K*PJ;`GXVyxUmkpY0nFs zIr0Gxi69(ho4mifUlaitVio6|>!1{v`~(0G8S9}C9BU_sEMaKnfFd6aO~_GatO!Mb z6Ab7=UgmR%z_GxD+}ZV_APX`+&0Y%v2Qm;SdH2Sb*C0*3>y!Rk^nNixteq37yI=;$ z!P^(teRSX$d$kCr8tObKBzsG-ERP1j-!OP_a~jEr1@%#YLVOlff zPzoeGWfLU4e+Ov$dEN3}A7XShf!Fl`UY8#UhW6pR$2N{W}^!ZtcAn4c_gT z775MAZa*57ulvu~4{xH2%%>?h7uz*tq~N+=$B@Y2U1*WlOS%roP>!Z%AicS-2QQpe z2oT=fj~IznGjKkg^g+2Nhv4C@r;u0qJzgl_cK{F6E+j#CYZc@qRe?7gbZeQoUxbkL!dqg{?D7mg zF$BUPw!PH$3d#NezX>mYkL;cbK;Nxhm2L}hhf7qa6p+;5mCTXNmoOX!9$2y+82;fE zl+PnE8Xt)Q3OW2k3DW{Hg`;edZBgM127fm%3ykjcO+KO(Opa2~?p{%ub78y-Xyyx1Zol0JST zq5R!fqTW_Q)W8eWAy2H%BseuF!t8D(AvlYW-NDOvAv=RlkKmvvoxzSmP&xyMcX%}< zWSR-n;qWMEMxC64K;cznkkh9uS^ovPy=p`)3 z!=E)EALgyUgadEW{ohBT@Vns1^X6(D9BeO@U`c)Oof!DVPGomx(g+9rw-o%Ra4fJ> T2Ui8apF`k+(rhbOMWg)>*U46h 
literal 0 HcmV?d00001 diff --git a/util/control/ErrorHandling.scala b/util/control/ErrorHandling.scala new file mode 100644 index 000000000..fd8b6e69c --- /dev/null +++ b/util/control/ErrorHandling.scala @@ -0,0 +1,18 @@ +package xsbt + +object ErrorHandling +{ + def translate[T](msg: => String)(f: => T) = + try { f } + catch { case e => throw new TranslatedException(msg + e.toString, e) } + def wideConvert[T](f: => T): Either[Throwable, T] = + try { Right(f) } + catch { case e => Left(e) } // TODO: restrict type of e + def convert[T](f: => T): Either[Exception, T] = + try { Right(f) } + catch { case e: Exception => Left(e) } +} +final class TranslatedException private[xsbt](msg: String, cause: Throwable) extends RuntimeException(msg, cause) +{ + override def toString = msg +} \ No newline at end of file diff --git a/util/log/Logger.scala b/util/log/Logger.scala new file mode 100644 index 000000000..b5203acb8 --- /dev/null +++ b/util/log/Logger.scala @@ -0,0 +1,71 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ + package xsbt + +abstract class Logger extends NotNull +{ + def getLevel: Level.Value + def setLevel(newLevel: Level.Value) + def enableTrace(flag: Boolean) + def traceEnabled: Boolean + + def atLevel(level: Level.Value) = level.id >= getLevel.id + def trace(t: => Throwable): Unit + final def debug(message: => String): Unit = log(Level.Debug, message) + final def info(message: => String): Unit = log(Level.Info, message) + final def warn(message: => String): Unit = log(Level.Warn, message) + final def error(message: => String): Unit = log(Level.Error, message) + def success(message: => String): Unit + def log(level: Level.Value, message: => String): Unit + def control(event: ControlEvent.Value, message: => String): Unit + + def logAll(events: Seq[LogEvent]): Unit + /** Defined in terms of other methods in Logger and should not be called from them. 
*/ + final def log(event: LogEvent) + { + event match + { + case s: Success => success(s.msg) + case l: Log => log(l.level, l.msg) + case t: Trace => trace(t.exception) + case setL: SetLevel => setLevel(setL.newLevel) + case setT: SetTrace => enableTrace(setT.enabled) + case c: ControlEvent => control(c.event, c.msg) + } + } +} + +sealed trait LogEvent extends NotNull +final class Success(val msg: String) extends LogEvent +final class Log(val level: Level.Value, val msg: String) extends LogEvent +final class Trace(val exception: Throwable) extends LogEvent +final class SetLevel(val newLevel: Level.Value) extends LogEvent +final class SetTrace(val enabled: Boolean) extends LogEvent +final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends LogEvent + +object ControlEvent extends Enumeration +{ + val Start, Header, Finish = Value +} + +/** An enumeration defining the levels available for logging. A level includes all of the levels +* with id larger than its own id. For example, Warn (id=3) includes Error (id=4).*/ +object Level extends Enumeration with NotNull +{ + val Debug = Value(1, "debug") + val Info = Value(2, "info") + val Warn = Value(3, "warn") + val Error = Value(4, "error") + /** Defines the label to use for success messages. A success message is logged at the info level but + * uses this label. Because the label for levels is defined in this module, the success + * label is also defined here. */ + val SuccessLabel = "success" + + // added because elements was renamed to iterator in 2.8.0 nightly + def levels = Debug :: Info :: Warn :: Error :: Nil + /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. */ + def apply(s: String) = levels.find(s == _.toString) + /** Same as apply, defined for use in pattern matching. 
*/ + private[xsbt] def unapply(s: String) = apply(s) +} \ No newline at end of file From 3c9cc8a94465c384ad28f4b7c5489eb2b9e8dd69 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 16 Aug 2009 20:33:46 -0400 Subject: [PATCH 004/823] Change TaskRunner to throw an exception instead of using Either --- cache/Cache.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/cache/Cache.scala b/cache/Cache.scala index b016c1654..ae46845c9 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -9,7 +9,7 @@ trait Cache[I,O] } trait SBinaryFormats extends CollectionTypes with JavaFormats with NotNull { - //TODO: add basic types minus FileFormat + //TODO: add basic types from SBinary minus FileFormat } object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits { From 1864c12f74c892cea8097477cde01f2b56899c7c Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 17 Aug 2009 10:51:43 -0400 Subject: [PATCH 005/823] Setting up compiler support and several related additions to util/io * Added the top-level interface project for communicating across scala versions within a jvm. * Added plugin project containing analysis compiler plugin * Added component compiler to build xsbt components against required version of Scala on the fly * Added interface to compiler that runs in the same version of Scala * Added frontend that compiles against a given version of Scala with or without analysis. 
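The Level enumeration above orders levels by id, and Logger.atLevel accepts a message when its level id is at least the configured one, so Warn includes Error but not Info. A minimal stand-alone sketch of that semantics (names mirror the patch, but this demo object is not the actual sbt source; `forName` stands in for the patch's `Level.apply`):

```scala
// Illustrative stand-in for the Level enumeration in util/log/Logger.scala.
object LevelDemo {
  object Level extends Enumeration {
    val Debug = Value(1, "debug")
    val Info = Value(2, "info")
    val Warn = Value(3, "warn")
    val Error = Value(4, "error")
    def levels = Debug :: Info :: Warn :: Error :: Nil
    // Like Level.apply in the patch: look up a level by its label, or None.
    def forName(s: String): Option[Value] = levels.find(s == _.toString)
  }
  // Mirrors Logger.atLevel: a message passes if its id is >= the configured id.
  def atLevel(configured: Level.Value, message: Level.Value): Boolean =
    message.id >= configured.id

  def main(args: Array[String]): Unit = {
    assert(Level.forName("warn") == Some(Level.Warn))
    assert(Level.forName("verbose").isEmpty)
    // A logger set to Warn logs Error but not Info.
    assert(atLevel(Level.Warn, Level.Error))
    assert(!atLevel(Level.Warn, Level.Info))
  }
}
```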
--- cache/src/test/scala/CacheTest.scala | 3 +- .../src/main/java/xsbti/AnalysisCallback.java | 35 +++++++++++++++++++ .../java/xsbti/AnalysisCallbackContainer.java | 12 +++++++ interface/src/main/java/xsbti/F0.java | 9 +++++ interface/src/main/java/xsbti/Logger.java | 13 +++++++ 5 files changed, 71 insertions(+), 1 deletion(-) create mode 100644 interface/src/main/java/xsbti/AnalysisCallback.java create mode 100644 interface/src/main/java/xsbti/AnalysisCallbackContainer.java create mode 100644 interface/src/main/java/xsbti/F0.java create mode 100644 interface/src/main/java/xsbti/Logger.java diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index d4d767c01..f0cf919aa 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -16,6 +16,7 @@ object CacheTest// extends Properties("Cache test") val cTask = (createTask :: cached :: TNil) map { case (file :: len :: HNil) => println("File: " + file + " length: " + len); len :: file :: HNil } val cachedC = Cache(cTask, new File("/tmp/c-cache")) - TaskRunner(cachedC).left.foreach(_.foreach(f => f.exception.printStackTrace)) + try { TaskRunner(cachedC) } + catch { case TasksFailed(failures) => failures.foreach(_.exception.printStackTrace) } } } \ No newline at end of file diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java new file mode 100644 index 000000000..2ceedd4a6 --- /dev/null +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -0,0 +1,35 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ +package xsbti; + +import java.io.File; + +public interface AnalysisCallback +{ + /** The names of classes that the analyzer should find subclasses of.*/ + public String[] superclassNames(); + /** Called when the given superclass could not be found on the classpath by the compiler.*/ + public void superclassNotFound(String superclassName); + /** Called
before the source at the given location is processed. */ + public void beginSource(File source); + /** Called when a subclass of one of the classes given in superclassNames is + * discovered.*/ + public void foundSubclass(File source, String subclassName, String superclassName, boolean isModule); + /** Called to indicate that the source file source depends on the source file + * dependsOn.*/ + public void sourceDependency(File dependsOn, File source); + /** Called to indicate that the source file source depends on the jar + * jar.*/ + public void jarDependency(File jar, File source); + /** Called to indicate that the source file source depends on the class file + * clazz.*/ + public void classDependency(File clazz, File source); + /** Called to indicate that the source file source produces a class file at + * module.*/ + public void generatedClass(File source, File module); + /** Called after the source at the given location has been processed. */ + public void endSource(File sourcePath); + /** Called when a module with a public 'main' method with the right signature is found.*/ + public void foundApplication(File source, String className); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/AnalysisCallbackContainer.java b/interface/src/main/java/xsbti/AnalysisCallbackContainer.java new file mode 100644 index 000000000..3d0641ed7 --- /dev/null +++ b/interface/src/main/java/xsbti/AnalysisCallbackContainer.java @@ -0,0 +1,12 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package xsbti; + +/** Provides access to an AnalysisCallback. This is used by the plugin to +* get the callback to use. The scalac Global instance it is passed must +* implement this interface.
*/ +public interface AnalysisCallbackContainer +{ + public AnalysisCallback analysisCallback(); +} \ No newline at end of file diff --git a/interface/src/main/java/xsbti/F0.java b/interface/src/main/java/xsbti/F0.java new file mode 100644 index 000000000..90e713b6b --- /dev/null +++ b/interface/src/main/java/xsbti/F0.java @@ -0,0 +1,9 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package xsbti; + +public interface F0<T> +{ + public T apply(); +} diff --git a/interface/src/main/java/xsbti/Logger.java b/interface/src/main/java/xsbti/Logger.java new file mode 100644 index 000000000..3b676650d --- /dev/null +++ b/interface/src/main/java/xsbti/Logger.java @@ -0,0 +1,13 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package xsbti; + +public interface Logger +{ + public void error(F0<String> msg); + public void warn(F0<String> msg); + public void info(F0<String> msg); + public void debug(F0<String> msg); + public void trace(F0<Throwable> exception); +} From 165d1df52c2e11062e6a039315b2cc8b5f6ff17d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Aug 2009 00:51:08 -0400 Subject: [PATCH 006/823] Setup interface project for testing --- interface/src/main/java/xsbti/Versions.java | 11 ++++++++ interface/src/test/scala/F0.scala | 6 +++++ interface/src/test/scala/TestCallback.scala | 28 +++++++++++++++++++ interface/src/test/scala/TestLogger.scala | 25 ++++++++++++++++++ 4 files changed, 70 insertions(+) create mode 100644 interface/src/main/java/xsbti/Versions.java create mode 100644 interface/src/test/scala/F0.scala create mode 100644 interface/src/test/scala/TestCallback.scala create mode 100644 interface/src/test/scala/TestLogger.scala diff --git a/interface/src/main/java/xsbti/Versions.java b/interface/src/main/java/xsbti/Versions.java new file mode 100644 index 000000000..8576aeaf5 --- /dev/null +++ b/interface/src/main/java/xsbti/Versions.java @@ -0,0 +1,11 @@ +/* sbt -- Simple Build Tool + * Copyright 2009 Mark Harrah + */ +package xsbti; + +public interface
Versions +{ + public static final String Sbt = "0.7"; + public static final int Interface = 1; + public static final int BootInterface = 1; +} diff --git a/interface/src/test/scala/F0.scala b/interface/src/test/scala/F0.scala new file mode 100644 index 000000000..d71458e68 --- /dev/null +++ b/interface/src/test/scala/F0.scala @@ -0,0 +1,6 @@ +package xsbti + +object f0 +{ + def apply[T](s: => T) = new F0[T] { def apply = s } +} \ No newline at end of file diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala new file mode 100644 index 000000000..7783a0df5 --- /dev/null +++ b/interface/src/test/scala/TestCallback.scala @@ -0,0 +1,28 @@ +package xsbti + +import java.io.File +import scala.collection.mutable.ArrayBuffer + +class TestCallback(val superclassNames: Array[String]) extends AnalysisCallback +{ + val invalidSuperclasses = new ArrayBuffer[String] + val beganSources = new ArrayBuffer[File] + val endedSources = new ArrayBuffer[File] + val foundSubclasses = new ArrayBuffer[(File, String, String, Boolean)] + val sourceDependencies = new ArrayBuffer[(File, File)] + val jarDependencies = new ArrayBuffer[(File, File)] + val classDependencies = new ArrayBuffer[(File, File)] + val products = new ArrayBuffer[(File, File)] + val applications = new ArrayBuffer[(File, String)] + + def superclassNotFound(superclassName: String) { invalidSuperclasses += superclassName } + def beginSource(source: File) { beganSources += source } + def foundSubclass(source: File, subclassName: String, superclassName: String, isModule: Boolean): Unit = + foundSubclasses += ((source, subclassName, superclassName, isModule)) + def sourceDependency(dependsOn: File, source: File) { sourceDependencies += ((dependsOn, source)) } + def jarDependency(jar: File, source: File) { jarDependencies += ((jar, source)) } + def classDependency(clazz: File, source: File) { classDependencies += ((clazz, source)) } + def generatedClass(source: File, module: File) { 
products += ((source, module)) } + def endSource(source: File) { endedSources += source } + def foundApplication(source: File, className: String) { applications += ((source, className)) } +} \ No newline at end of file diff --git a/interface/src/test/scala/TestLogger.scala b/interface/src/test/scala/TestLogger.scala new file mode 100644 index 000000000..f78a0f533 --- /dev/null +++ b/interface/src/test/scala/TestLogger.scala @@ -0,0 +1,25 @@ +package xsbti + +class TestLogger extends Logger +{ + private val buffer = new scala.collection.mutable.ArrayBuffer[F0[Unit]] + def info(msg: F0[String]) = buffer("[info] ", msg) + def warn(msg: F0[String]) = buffer("[warn] ", msg) + def debug(msg: F0[String]) = buffer("[debug] ", msg) + def error(msg: F0[String]) = buffer("[error] ", msg) + def verbose(msg: F0[String]) = buffer("[verbose] ", msg) + def show() { buffer.foreach(_()) } + def clear() { buffer.clear() } + def trace(t: F0[Throwable]) { buffer += f0(t().printStackTrace) } + private def buffer(s: String, msg: F0[String]) { buffer += f0(println(s + msg())) } +} +object TestLogger +{ + def apply[T](f: Logger => T): T = + { + val log = new TestLogger + try { f(log) } + catch { case e: Exception => log.show(); throw e } + finally { log.clear() } + } +} \ No newline at end of file From 4b824fbe84a54086393adda6a3b7dc46c48504cb Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 18 Aug 2009 23:25:34 -0400 Subject: [PATCH 007/823] Tests and fixes for analysis plugin and the task scheduler. 
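TestLogger above buffers its messages as F0 thunks, so a message string is only built when `show()` runs (and never built at all if the buffer is just cleared). A minimal stand-alone sketch of that deferred-evaluation pattern; `BufferedLog` and its members are hypothetical names, not part of sbt:

```scala
// Sketch of the F0 thunk pattern used by xsbti.Logger and TestLogger:
// messages are stored as no-argument functions and only evaluated on demand.
object F0Demo {
  trait F0[T] { def apply(): T }
  // Like the test helper `f0` in the patch: lift a by-name value into an F0.
  def f0[T](s: => T): F0[T] = new F0[T] { def apply() = s }

  class BufferedLog {
    private val buffer = new scala.collection.mutable.ArrayBuffer[F0[String]]
    def info(msg: F0[String]): Unit = buffer += msg
    // Forces every buffered thunk and returns the rendered messages.
    def show(): Seq[String] = buffer.map(m => m()).toSeq
  }

  def main(args: Array[String]): Unit = {
    val log = new BufferedLog
    var built = 0
    log.info(f0 { built += 1; "expensive message " + built })
    // Nothing is evaluated until show() forces the thunks.
    assert(built == 0)
    assert(log.show() == Seq("expensive message 1"))
    assert(built == 1)
  }
}
```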
--- interface/src/main/java/xsbti/AnalysisCallback.java | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 2ceedd4a6..70870965c 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -17,7 +17,9 @@ public interface AnalysisCallback * discovered.*/ public void foundSubclass(File source, String subclassName, String superclassName, boolean isModule); /** Called to indicate that the source file source depends on the source file - * dependsOn.*/ + * dependsOn. Note that only source files included in the current compilation will + * be passed to this method. Dependencies on classes generated by sources not in the current compilation will + * be passed as class dependencies to the classDependency method.*/ public void sourceDependency(File dependsOn, File source); /** Called to indicate that the source file source depends on the jar * jar.*/ From 31b6464101f6fdc08a090b03406e4e2af09bd994 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 20 Aug 2009 00:02:06 -0400 Subject: [PATCH 008/823] Tests and fixes for component manager and cache interface.
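The doc change above states the routing rule for dependencies: only sources in the current compilation are reported through `sourceDependency`; anything else is reported against the generated class file via `classDependency`. A stand-alone sketch of that rule; the `Recorder` class and its method shape are hypothetical, not the sbt callback itself:

```scala
// Hypothetical recorder illustrating the dependency-routing rule documented
// in the patch above: source dependency if the target is being compiled now,
// class dependency otherwise.
object CallbackDemo {
  import java.io.File
  import scala.collection.mutable.ArrayBuffer

  class Recorder(currentSources: Set[File]) {
    val sourceDependencies = new ArrayBuffer[(File, File)] // (dependsOn, source)
    val classDependencies = new ArrayBuffer[(File, File)]  // (classFile, source)
    def dependency(dependsOn: File, classFile: File, source: File): Unit =
      if (currentSources(dependsOn)) sourceDependencies += ((dependsOn, source))
      else classDependencies += ((classFile, source))
  }

  def main(args: Array[String]): Unit = {
    val a = new File("A.scala")
    val c = new File("C.scala")
    val b = new File("B.scala")
    val bClass = new File("B.class")
    // A.scala and C.scala are in the current compilation; B was compiled earlier.
    val recorder = new Recorder(Set(a, c))
    recorder.dependency(c, new File("C.class"), a) // recorded as source dependency
    recorder.dependency(b, bClass, a)              // falls back to class dependency
    assert(recorder.sourceDependencies.toList == List((c, a)))
    assert(recorder.classDependencies.toList == List((bClass, a)))
  }
}
```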
--- interface/src/test/scala/TestLogger.scala | 15 ++++++++++++++- 1 file changed, 14 insertions(+), 1 deletion(-) diff --git a/interface/src/test/scala/TestLogger.scala b/interface/src/test/scala/TestLogger.scala index f78a0f533..c8df588f6 100644 --- a/interface/src/test/scala/TestLogger.scala +++ b/interface/src/test/scala/TestLogger.scala @@ -8,10 +8,16 @@ class TestLogger extends Logger def debug(msg: F0[String]) = buffer("[debug] ", msg) def error(msg: F0[String]) = buffer("[error] ", msg) def verbose(msg: F0[String]) = buffer("[verbose] ", msg) + def info(msg: => String) = buffer("[info] ", msg) + def warn(msg: => String) = buffer("[warn] ", msg) + def debug(msg: => String) = buffer("[debug] ", msg) + def error(msg: => String) = buffer("[error] ", msg) + def verbose(msg: => String) = buffer("[verbose] ", msg) def show() { buffer.foreach(_()) } def clear() { buffer.clear() } def trace(t: F0[Throwable]) { buffer += f0(t().printStackTrace) } - private def buffer(s: String, msg: F0[String]) { buffer += f0(println(s + msg())) } + private def buffer(s: String, msg: F0[String]) { buffer(s, msg()) } + private def buffer(s: String, msg: => String) { buffer += f0(println(s + msg)) } } object TestLogger { @@ -22,4 +28,11 @@ object TestLogger catch { case e: Exception => log.show(); throw e } finally { log.clear() } } + def apply[L <: TestLogger, T](newLogger: => L)(f: L => T): T = + { + val log = newLogger + try { f(log) } + catch { case e: Exception => log.show(); throw e } + finally { log.clear() } + } } \ No newline at end of file From 11148ce7bdf75e50fc9f2812bc50176a27cf7843 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Wed, 26 Aug 2009 08:38:20 -0400 Subject: [PATCH 009/823] Composable dependency tracking on top of Tasks. 
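The TestLogger diff above adds by-name `String` overloads that delegate to the F0-based methods, so test code can pass plain expressions while the logger keeps a single implementation. A stand-alone sketch of that delegation pattern (the `Log` class and its names are illustrative, not the sbt code):

```scala
// Sketch of the overload pattern in the TestLogger diff: a by-name String
// convenience method delegates to the F0-based primitive.
object OverloadDemo {
  trait F0[T] { def apply(): T }
  def f0[T](s: => T): F0[T] = new F0[T] { def apply() = s }

  class Log {
    val lines = new scala.collection.mutable.ArrayBuffer[String]
    // F0-based primitive, as in the xsbti interface.
    def info(msg: F0[String]): Unit = lines += ("[info] " + msg())
    // Convenience by-name overload delegating to the primitive; the argument
    // is still not evaluated until the primitive calls msg().
    def info(msg: => String): Unit = info(f0(msg))
  }

  def main(args: Array[String]): Unit = {
    val log = new Log
    log.info("plain message")       // by-name overload
    log.info(f0("thunked message")) // F0 overload
    assert(log.lines.toList == List("[info] plain message", "[info] thunked message"))
  }
}
```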
--- cache/ChangeReport.scala | 25 +++++ cache/DependencyTracking.scala | 154 ++++++++++++++++++++++++++++ cache/TrackingFormat.scala | 52 ++++++++++ cache/src/test/scala/Tracking.scala | 45 ++++++++ 4 files changed, 276 insertions(+) create mode 100644 cache/ChangeReport.scala create mode 100644 cache/DependencyTracking.scala create mode 100644 cache/TrackingFormat.scala create mode 100644 cache/src/test/scala/Tracking.scala diff --git a/cache/ChangeReport.scala b/cache/ChangeReport.scala new file mode 100644 index 000000000..baa078034 --- /dev/null +++ b/cache/ChangeReport.scala @@ -0,0 +1,25 @@ +package xsbt + +trait ChangeReport[T] extends NotNull +{ + def allInputs: Set[T] + def unmodified: Set[T] + def modified: Set[T] // all changes, including added + def added: Set[T] + def removed: Set[T] + def +++(other: ChangeReport[T]): ChangeReport[T] = new CompoundChangeReport(this, other) +} +trait InvalidationReport[T] extends NotNull +{ + def valid: Set[T] + def invalid: Set[T] + def invalidProducts: Set[T] +} +private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) extends ChangeReport[T] +{ + lazy val allInputs = a.allInputs ++ b.allInputs + lazy val unmodified = a.unmodified ++ b.unmodified + lazy val modified = a.modified ++ b.modified + lazy val added = a.added ++ b.added + lazy val removed = a.removed ++ b.removed +} \ No newline at end of file diff --git a/cache/DependencyTracking.scala b/cache/DependencyTracking.scala new file mode 100644 index 000000000..931efba1c --- /dev/null +++ b/cache/DependencyTracking.scala @@ -0,0 +1,154 @@ +package xsbt + +import java.io.File +import sbinary.{Format, Operations} + +object DependencyTracking +{ + def trackBasic[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cacheDirectory: File) + (f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[T]): Task[T] = + { + changed(filesTask, style, new File(cacheDirectory, "files")) { sourceChanges => + 
invalidate(sourceChanges, cacheDirectory) { (report, tracking) => + f(sourceChanges, report, tracking) + } + } + } + def changed[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cache: File)(f: ChangeReport[File] => Task[T]): Task[T] = + filesTask bind { files => + val lastFilesInfo = Operations.fromFile(cache)(style.format).files + val lastFiles = lastFilesInfo.map(_.file) + val currentFiles = files.map(_.getAbsoluteFile) + val currentFilesInfo = style(files) + + val report = new ChangeReport[File] + { + lazy val allInputs = currentFiles + lazy val removed = lastFiles -- allInputs + lazy val added = allInputs -- lastFiles + lazy val modified = (lastFilesInfo -- currentFilesInfo.files).map(_.file) + lazy val unmodified = allInputs -- modified + } + + f(report) map { result => + Operations.toFile(currentFilesInfo)(cache)(style.format) + result + } + } + def invalidate[R](changes: ChangeReport[File], cacheDirectory: File)(f: (InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = + { + val pruneAndF = (report: InvalidationReport[File], tracking: UpdateTracking[File]) => { + report.invalidProducts.foreach(_.delete) + f(report, tracking) + } + invalidate(Task(changes), cacheDirectory, true)(pruneAndF)(sbinary.DefaultProtocol.FileFormat) + } + def invalidate[T,R](changesTask: Task[ChangeReport[T]], cacheDirectory: File, translateProducts: Boolean) + (f: (InvalidationReport[T], UpdateTracking[T]) => Task[R])(implicit format: Format[T]): Task[R] = + { + changesTask bind { changes => + val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) + val tracker = trackFormat.read + def invalidatedBy(file: T) = tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file) + + import scala.collection.mutable.HashSet + val invalidated = new HashSet[T] + val invalidatedProducts = new HashSet[T] + def invalidate(files: Iterable[T]): Unit = + for(file <- files if !invalidated(file)) + { + 
invalidated += file + if(!tracker.sources(file).isEmpty) invalidatedProducts += file + invalidate(invalidatedBy(file)) + } + + invalidate(changes.modified) + tracker.removeAll(invalidated) + + val report = new InvalidationReport[T] + { + val invalid = Set(invalidated.toSeq : _*) + val invalidProducts = Set(invalidatedProducts.toSeq : _*) + val valid = changes.unmodified -- invalid + } + + f(report, tracker) map { result => + trackFormat.write(tracker) + result + } + } + } + + import scala.collection.mutable.{Set, HashMap, MultiMap} + private[xsbt] type DependencyMap[T] = HashMap[T, Set[T]] with MultiMap[T, T] + private[xsbt] def newMap[T]: DependencyMap[T] = new HashMap[T, Set[T]] with MultiMap[T, T] +} + +trait UpdateTracking[T] extends NotNull +{ + def dependency(source: T, dependsOn: T): Unit + def use(source: T, uses: T): Unit + def product(source: T, output: T): Unit +} +import scala.collection.Set +trait ReadTracking[T] extends NotNull +{ + def dependsOn(file: T): Set[T] + def products(file: T): Set[T] + def sources(file: T): Set[T] + def usedBy(file: T): Set[T] +} +import DependencyTracking.{DependencyMap => DMap, newMap} +private final class DefaultTracking[T](translateProducts: Boolean)(val reverseDependencies: DMap[T], val reverseUses: DMap[T], val sourceMap: DMap[T]) extends DependencyTracking[T](translateProducts) +{ + val productMap: DMap[T] = forward(sourceMap) // map from a source to its products. Keep in sync with sourceMap +} +// if translateProducts is true, dependencies on a product are translated to dependencies on a source +private abstract class DependencyTracking[T](translateProducts: Boolean) extends ReadTracking[T] with UpdateTracking[T] +{ + val reverseDependencies: DMap[T] // map from a file to the files that depend on it + val reverseUses: DMap[T] // map from a file to the files that use it + val sourceMap: DMap[T] // map from a product to its sources. 
Keep in sync with productMap + val productMap: DMap[T] // map from a source to its products. Keep in sync with sourceMap + + final def dependsOn(file: T): Set[T] = get(reverseDependencies, file) + final def products(file: T): Set[T] = get(productMap, file) + final def sources(file: T): Set[T] = get(sourceMap, file) + final def usedBy(file: T): Set[T] = get(reverseUses, file) + + private def get(map: DMap[T], value: T): Set[T] = map.getOrElse(value, Set.empty[T]) + + final def dependency(sourceFile: T, dependsOn: T) + { + val actualDependencies = + if(!translateProducts) + Seq(dependsOn) + else + sourceMap.getOrElse(dependsOn, Seq(dependsOn)) + actualDependencies.foreach { actualDependency => reverseDependencies.add(actualDependency, sourceFile) } + } + final def product(sourceFile: T, product: T) + { + productMap.add(sourceFile, product) + sourceMap.add(product, sourceFile) + } + final def use(sourceFile: T, usesFile: T) { reverseUses.add(usesFile, sourceFile) } + + final def removeAll(files: Iterable[T]) + { + def remove(a: DMap[T], b: DMap[T], file: T): Unit = + for(x <- a.removeKey(file)) b --= x + def removeAll(a: DMap[T], b: DMap[T]): Unit = + files.foreach { file => remove(a, b, file); remove(b, a, file) } + + removeAll(forward(reverseDependencies), reverseDependencies) + removeAll(productMap, sourceMap) + removeAll(forward(reverseUses), reverseUses) + } + protected final def forward(map: DMap[T]): DMap[T] = + { + val f = newMap[T] + for( (key, values) <- map; value <- values) f.add(value, key) + f + } +} \ No newline at end of file diff --git a/cache/TrackingFormat.scala b/cache/TrackingFormat.scala new file mode 100644 index 000000000..1b6c463a0 --- /dev/null +++ b/cache/TrackingFormat.scala @@ -0,0 +1,52 @@ +package xsbt + +import java.io.File +import scala.collection.mutable.{HashMap, Map, MultiMap, Set} +import sbinary.{DefaultProtocol, Format, Operations} +import DefaultProtocol._ +import TrackingFormat._ +import DependencyTracking.{DependencyMap => 
DMap, newMap} + +private class TrackingFormat[T](directory: File, translateProducts: Boolean)(implicit tFormat: Format[T]) extends NotNull +{ + + val indexFile = new File(directory, "index") + val dependencyFile = new File(directory, "dependencies") + def read(): DependencyTracking[T] = + { + val indexMap = Operations.fromFile[Map[Int,T]](indexFile) + val indexedFormat = wrap[T,Int](ignore => error("Read-only"), indexMap.apply) + Operations.fromFile(dependencyFile)(trackingFormat(translateProducts)(indexedFormat)) + } + def write(tracking: DependencyTracking[T]) + { + val index = new IndexMap[T] + val indexedFormat = wrap[T,Int](t => index(t), ignore => error("Write-only")) + + Operations.toFile(tracking)(dependencyFile)(trackingFormat(translateProducts)(indexedFormat)) + Operations.toFile(index.indices)(indexFile) + } +} +private object TrackingFormat +{ + implicit def mutableMapFormat[S, T](implicit binS : Format[S], binT : Format[T]) : Format[Map[S, T]] = + viaArray( (x : Array[(S, T)]) => Map(x :_*)); + implicit def depMapFormat[T](implicit bin: Format[T]) : Format[DMap[T]] = + { + viaArray { (x : Array[(T, Set[T])]) => + val map = newMap[T] + map ++= x + map + } + } + def trackingFormat[T](translateProducts: Boolean)(implicit tFormat: Format[T]): Format[DependencyTracking[T]] = + asProduct3((a: DMap[T],b: DMap[T],c: DMap[T]) => new DefaultTracking(translateProducts)(a,b,c) : DependencyTracking[T])(dt => Some(dt.reverseDependencies, dt.reverseUses, dt.sourceMap)) +} + +private final class IndexMap[T] extends NotNull +{ + private[this] var lastIndex = 0 + private[this] val map = new HashMap[T, Int] + def indices = map.toArray.map( (_: (T,Int)).swap ) + def apply(t: T) = map.getOrElseUpdate(t, { lastIndex += 1; lastIndex }) +} \ No newline at end of file diff --git a/cache/src/test/scala/Tracking.scala b/cache/src/test/scala/Tracking.scala new file mode 100644 index 000000000..ff952556b --- /dev/null +++ b/cache/src/test/scala/Tracking.scala @@ -0,0 +1,45 @@ 
+package xsbt + +import java.io.File + +trait examples +{ + def classpathTask: Task[Set[File]] + def sourcesTask: Task[Set[File]] + import DependencyTracking._ + lazy val compile = + changed(classpathTask, FilesInfo.lastModified, new File("cache/compile/classpath/")) { classpathChanges => + changed(sourcesTask, FilesInfo.hash, new File("cache/compile/sources/")) { sourceChanges => + invalidate(classpathChanges +++ sourceChanges, new File("cache/compile/dependencies/'")) { (report, tracking) => + val recompileSources = report.invalid ** sourceChanges.allInputs + val classpath = classpathChanges.allInputs + Task() + } + } + } + + trait sync + { + def sources: Task[Set[File]] = Task(Set.empty[File]) + def mapper: Task[FileMapper] = outputDirectory map(FileMapper.basic) + def outputDirectory: Task[File] = Task(new File("test")) + + import Task._ + lazy val task = syncTask + def syncTask = + (sources, mapper) bind { (srcs,mp) => + DependencyTracking.trackBasic(sources, FilesInfo.hash, new File("cache/sync/")) { (sourceChanges, report, tracking) => + Task + { + for(src <- report.invalid ** sourceChanges.allInputs) yield + { + val target = mp(src) + FileUtilities.copyFile(src, target) + tracking.product(src, target) + target + } + } + } + } + } +} \ No newline at end of file From 129bc048c4d8d9a926d917e6e188aea3e5834e5a Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 29 Aug 2009 10:19:00 -0400 Subject: [PATCH 010/823] tuple caches, stamped caches, Path API, another type of change detection, and copying/archiving based on (source,target) tuples --- cache/Cache.scala | 33 +++++++++++++++++++-- cache/CacheIO.scala | 26 +++++++++++++++++ cache/ChangeReport.scala | 34 ++++++++++++++++++++++ cache/DependencyTracking.scala | 32 +++++++++++++++----- cache/FileInfo.scala | 12 +++++--- cache/TrackingFormat.scala | 18 +++++++----- cache/src/test/scala/Tracking.scala | 45 ----------------------------- 7 files changed, 133 insertions(+), 67 deletions(-) create mode 100644 
cache/CacheIO.scala delete mode 100644 cache/src/test/scala/Tracking.scala diff --git a/cache/Cache.scala b/cache/Cache.scala index ae46845c9..90677f29a 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -1,6 +1,6 @@ package xsbt -import sbinary.{CollectionTypes, Format, JavaFormats} +import sbinary.{CollectionTypes, Format, JavaFormats, Operations} import java.io.File trait Cache[I,O] @@ -11,7 +11,7 @@ trait SBinaryFormats extends CollectionTypes with JavaFormats with NotNull { //TODO: add basic types from SBinary minus FileFormat } -object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits +object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits with TupleCacheImplicits { def cache[I,O](implicit c: Cache[I,O]): Cache[I,O] = c def outputCache[O](implicit c: OutputCache[O]): OutputCache[O] = c @@ -32,7 +32,7 @@ object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImpl cache(file)(in) match { case Left(value) => Value(value) - case Right(store) => NewTask(m.map { out => store(out); out }) + case Right(store) => m.map { out => store(out); out } } } trait BasicCacheImplicits extends NotNull @@ -55,4 +55,31 @@ trait HListCacheImplicits extends HLists implicit def hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[HCons[H,T]] = new HConsOutputCache(headCache, tailCache) implicit lazy val hNilOutputCache: OutputCache[HNil] = new HNilOutputCache +} +trait TupleCacheImplicits extends HLists +{ + import Cache._ + implicit def tuple2HList[A,B](t: (A,B)): A :: B :: HNil = t._1 :: t._2 :: HNil + implicit def hListTuple2[A,B](t: A :: B :: HNil): (A,B) = t match { case a :: b :: HNil => (a,b) } + + implicit def tuple2InputCache[A,B](implicit aCache: InputCache[A], bCache: InputCache[B]): InputCache[(A,B)] = + wrapInputCache[(A,B), A :: B :: HNil] + implicit def tuple2OutputCache[A,B](implicit aCache: OutputCache[A], bCache: 
OutputCache[B]): OutputCache[(A,B)] = + wrapOutputCache[(A,B), A :: B :: HNil] + + implicit def tuple3HList[A,B,C](t: (A,B,C)): A :: B :: C :: HNil = t._1 :: t._2 :: t._3 :: HNil + implicit def hListTuple3[A,B,C](t: A :: B :: C :: HNil): (A,B,C) = t match { case a :: b :: c :: HNil => (a,b,c) } + + implicit def tuple3InputCache[A,B,C](implicit aCache: InputCache[A], bCache: InputCache[B], cCache: InputCache[C]): InputCache[(A,B,C)] = + wrapInputCache[(A,B,C), A :: B :: C :: HNil] + implicit def tuple3OutputCache[A,B,C](implicit aCache: OutputCache[A], bCache: OutputCache[B], cCache: OutputCache[C]): OutputCache[(A,B,C)] = + wrapOutputCache[(A,B,C), A :: B :: C :: HNil] + + implicit def tuple4HList[A,B,C,D](t: (A,B,C,D)): A :: B :: C :: D :: HNil = t._1 :: t._2 :: t._3 :: t._4 :: HNil + implicit def hListTuple4[A,B,C,D](t: A :: B :: C :: D :: HNil): (A,B,C,D) = t match { case a :: b :: c :: d:: HNil => (a,b,c,d) } + + implicit def tuple4InputCache[A,B,C,D](implicit aCache: InputCache[A], bCache: InputCache[B], cCache: InputCache[C], dCache: InputCache[D]): InputCache[(A,B,C,D)] = + wrapInputCache[(A,B,C,D), A :: B :: C :: D :: HNil] + implicit def tuple4OutputCache[A,B,C,D](implicit aCache: OutputCache[A], bCache: OutputCache[B], cCache: OutputCache[C], dCache: OutputCache[D]): OutputCache[(A,B,C,D)] = + wrapOutputCache[(A,B,C,D), A :: B :: C :: D :: HNil] } \ No newline at end of file diff --git a/cache/CacheIO.scala b/cache/CacheIO.scala new file mode 100644 index 000000000..ba4cc0edc --- /dev/null +++ b/cache/CacheIO.scala @@ -0,0 +1,26 @@ +package xsbt + +import java.io.File +import sbinary.{DefaultProtocol, Format, Operations} +import scala.reflect.Manifest + +object CacheIO +{ + def fromFile[T](format: Format[T])(file: File)(implicit mf: Manifest[Format[T]]): T = + fromFile(file)(format, mf) + def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): T = + Operations.fromFile(file)(stampedFormat(format)) + def toFile[T](format: 
Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = + toFile(value)(file)(format, mf) + def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = + Operations.toFile(value)(file)(stampedFormat(format)) + def stampedFormat[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Format[T] = + { + import DefaultProtocol._ + withStamp(stamp(format))(format) + } + def stamp[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Int = typeHash(mf) + def typeHash[T](implicit mf: Manifest[T]) = mf.toString.hashCode + def manifest[T](implicit mf: Manifest[T]): Manifest[T] = mf + def objManifest[T](t: T)(implicit mf: Manifest[T]): Manifest[T] = mf +} \ No newline at end of file diff --git a/cache/ChangeReport.scala b/cache/ChangeReport.scala index baa078034..a16b875dc 100644 --- a/cache/ChangeReport.scala +++ b/cache/ChangeReport.scala @@ -1,5 +1,21 @@ package xsbt +object ChangeReport +{ + def modified[T](files: Set[T]) = + new EmptyChangeReport[T] + { + override def allInputs = files + override def modified = files + override def markAllModified = this + } + def unmodified[T](files: Set[T]) = + new EmptyChangeReport[T] + { + override def allInputs = files + override def unmodified = files + } +} trait ChangeReport[T] extends NotNull { def allInputs: Set[T] @@ -8,6 +24,24 @@ trait ChangeReport[T] extends NotNull def added: Set[T] def removed: Set[T] def +++(other: ChangeReport[T]): ChangeReport[T] = new CompoundChangeReport(this, other) + def markAllModified: ChangeReport[T] = + new ChangeReport[T] + { + def allInputs = ChangeReport.this.allInputs + def unmodified = Set.empty[T] + def modified = ChangeReport.this.allInputs + def added = ChangeReport.this.added + def removed = ChangeReport.this.removed + override def markAllModified = this + } +} +class EmptyChangeReport[T] extends ChangeReport[T] +{ + def allInputs = Set.empty[T] + def unmodified = Set.empty[T] + def modified = Set.empty[T] + def added = 
Set.empty[T] + def removed = Set.empty[T] } trait InvalidationReport[T] extends NotNull { diff --git a/cache/DependencyTracking.scala b/cache/DependencyTracking.scala index 931efba1c..d7f889dc3 100644 --- a/cache/DependencyTracking.scala +++ b/cache/DependencyTracking.scala @@ -1,12 +1,14 @@ package xsbt import java.io.File -import sbinary.{Format, Operations} +import CacheIO.{fromFile, toFile} +import sbinary.Format +import scala.reflect.Manifest object DependencyTracking { def trackBasic[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cacheDirectory: File) - (f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[T]): Task[T] = + (f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[T])(implicit mf: Manifest[F]): Task[T] = { changed(filesTask, style, new File(cacheDirectory, "files")) { sourceChanges => invalidate(sourceChanges, cacheDirectory) { (report, tracking) => @@ -14,9 +16,24 @@ object DependencyTracking } } } - def changed[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cache: File)(f: ChangeReport[File] => Task[T]): Task[T] = + def changed[O,O2](task: Task[O], file: File)(ifChanged: O => O2, ifUnchanged: O => O2)(implicit input: InputCache[O]): Task[O2] { type Input = O } = + task map { value => + val cache = OpenResource.fileInputStream(file)(input.uptodate(value)) + if(cache.uptodate) + ifUnchanged(value) + else + { + OpenResource.fileOutputStream(false)(file)(cache.update) + ifChanged(value) + } + } + def changed[T, F <: FileInfo](files: Set[File], style: FilesInfo.Style[F], cache: File) + (f: ChangeReport[File] => Task[T])(implicit mf: Manifest[F]): Task[T] = + changed(Task(files), style, cache)(f) + def changed[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cache: File) + (f: ChangeReport[File] => Task[T])(implicit mf: Manifest[F]): Task[T] = filesTask bind { files => - val lastFilesInfo = 
Operations.fromFile(cache)(style.format).files + val lastFilesInfo = fromFile(style.formats)(cache).files val lastFiles = lastFilesInfo.map(_.file) val currentFiles = files.map(_.getAbsoluteFile) val currentFilesInfo = style(files) @@ -31,7 +48,7 @@ object DependencyTracking } f(report) map { result => - Operations.toFile(currentFilesInfo)(cache)(style.format) + toFile(style.formats)(currentFilesInfo)(cache) result } } @@ -41,10 +58,11 @@ object DependencyTracking report.invalidProducts.foreach(_.delete) f(report, tracking) } - invalidate(Task(changes), cacheDirectory, true)(pruneAndF)(sbinary.DefaultProtocol.FileFormat) + implicit val format = sbinary.DefaultProtocol.FileFormat + invalidate(Task(changes), cacheDirectory, true)(pruneAndF) } def invalidate[T,R](changesTask: Task[ChangeReport[T]], cacheDirectory: File, translateProducts: Boolean) - (f: (InvalidationReport[T], UpdateTracking[T]) => Task[R])(implicit format: Format[T]): Task[R] = + (f: (InvalidationReport[T], UpdateTracking[T]) => Task[R])(implicit format: Format[T], mf: Manifest[T]): Task[R] = { changesTask bind { changes => val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index 8c835fe95..0271ad0ec 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -60,13 +60,17 @@ object FilesInfo { sealed trait Style[F <: FileInfo] extends NotNull { - implicit def apply(files: Iterable[File]): FilesInfo[F] - implicit val format: Format[FilesInfo[F]] + implicit def apply(files: Set[File]): FilesInfo[F] + implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) + implicit val formats: Format[FilesInfo[F]] + import Cache._ + implicit def infosInputCache: InputCache[Set[File]] = wrapInputCache[Set[File],FilesInfo[F]] + implicit def infosOutputCache: OutputCache[Set[File]] = wrapOutputCache[Set[File],FilesInfo[F]] } private final class BasicStyle[F <: FileInfo](fileStyle: FileInfo.Style[F])(implicit 
infoFormat: Format[F]) extends Style[F] { - implicit def apply(files: Iterable[File]) = FilesInfo( (Set() ++ files.map(_.getAbsoluteFile)).map(fileStyle.apply) ) - implicit val format: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) + implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo( files.map(_.getAbsoluteFile).map(fileStyle.apply) ) + implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) } lazy val full: Style[HashModifiedFileInfo] = new BasicStyle(FileInfo.full)(FileInfo.full.format) lazy val hash: Style[HashFileInfo] = new BasicStyle(FileInfo.hash)(FileInfo.hash.format) diff --git a/cache/TrackingFormat.scala b/cache/TrackingFormat.scala index 1b6c463a0..166c22df2 100644 --- a/cache/TrackingFormat.scala +++ b/cache/TrackingFormat.scala @@ -2,29 +2,31 @@ package xsbt import java.io.File import scala.collection.mutable.{HashMap, Map, MultiMap, Set} -import sbinary.{DefaultProtocol, Format, Operations} +import scala.reflect.Manifest +import sbinary.{DefaultProtocol, Format} import DefaultProtocol._ import TrackingFormat._ +import CacheIO.{fromFile, toFile} import DependencyTracking.{DependencyMap => DMap, newMap} -private class TrackingFormat[T](directory: File, translateProducts: Boolean)(implicit tFormat: Format[T]) extends NotNull +private class TrackingFormat[T](directory: File, translateProducts: Boolean)(implicit tFormat: Format[T], mf: Manifest[T]) extends NotNull { - val indexFile = new File(directory, "index") val dependencyFile = new File(directory, "dependencies") def read(): DependencyTracking[T] = { - val indexMap = Operations.fromFile[Map[Int,T]](indexFile) + val indexMap = CacheIO.fromFile[Map[Int,T]](indexFile) val indexedFormat = wrap[T,Int](ignore => error("Read-only"), indexMap.apply) - Operations.fromFile(dependencyFile)(trackingFormat(translateProducts)(indexedFormat)) + val trackFormat = trackingFormat(translateProducts)(indexedFormat) + 
fromFile(trackFormat)(dependencyFile) } def write(tracking: DependencyTracking[T]) { val index = new IndexMap[T] val indexedFormat = wrap[T,Int](t => index(t), ignore => error("Write-only")) - - Operations.toFile(tracking)(dependencyFile)(trackingFormat(translateProducts)(indexedFormat)) - Operations.toFile(index.indices)(indexFile) + val trackFormat = trackingFormat(translateProducts)(indexedFormat) + toFile(trackFormat)(tracking)(dependencyFile) + toFile(index.indices)(indexFile) } } private object TrackingFormat diff --git a/cache/src/test/scala/Tracking.scala b/cache/src/test/scala/Tracking.scala deleted file mode 100644 index ff952556b..000000000 --- a/cache/src/test/scala/Tracking.scala +++ /dev/null @@ -1,45 +0,0 @@ -package xsbt - -import java.io.File - -trait examples -{ - def classpathTask: Task[Set[File]] - def sourcesTask: Task[Set[File]] - import DependencyTracking._ - lazy val compile = - changed(classpathTask, FilesInfo.lastModified, new File("cache/compile/classpath/")) { classpathChanges => - changed(sourcesTask, FilesInfo.hash, new File("cache/compile/sources/")) { sourceChanges => - invalidate(classpathChanges +++ sourceChanges, new File("cache/compile/dependencies/'")) { (report, tracking) => - val recompileSources = report.invalid ** sourceChanges.allInputs - val classpath = classpathChanges.allInputs - Task() - } - } - } - - trait sync - { - def sources: Task[Set[File]] = Task(Set.empty[File]) - def mapper: Task[FileMapper] = outputDirectory map(FileMapper.basic) - def outputDirectory: Task[File] = Task(new File("test")) - - import Task._ - lazy val task = syncTask - def syncTask = - (sources, mapper) bind { (srcs,mp) => - DependencyTracking.trackBasic(sources, FilesInfo.hash, new File("cache/sync/")) { (sourceChanges, report, tracking) => - Task - { - for(src <- report.invalid ** sourceChanges.allInputs) yield - { - val target = mp(src) - FileUtilities.copyFile(src, target) - tracking.product(src, target) - target - } - } - } - } - } -} \ No 
newline at end of file From 7f3e21537b0155a68930ea1cae5f2b6d9ca8524d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 Aug 2009 11:10:37 -0400 Subject: [PATCH 011/823] Compile task with dependency tracking. Checkpoint: compiles successfully. --- cache/DependencyTracking.scala | 139 ++++++++++++++++++++++++--------- cache/TrackingFormat.scala | 8 +- 2 files changed, 108 insertions(+), 39 deletions(-) diff --git a/cache/DependencyTracking.scala b/cache/DependencyTracking.scala index d7f889dc3..bd14ef998 100644 --- a/cache/DependencyTracking.scala +++ b/cache/DependencyTracking.scala @@ -5,18 +5,17 @@ import CacheIO.{fromFile, toFile} import sbinary.Format import scala.reflect.Manifest -object DependencyTracking +trait Tracked extends NotNull { - def trackBasic[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cacheDirectory: File) - (f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[T])(implicit mf: Manifest[F]): Task[T] = - { - changed(filesTask, style, new File(cacheDirectory, "files")) { sourceChanges => - invalidate(sourceChanges, cacheDirectory) { (report, tracking) => - f(sourceChanges, report, tracking) - } - } - } - def changed[O,O2](task: Task[O], file: File)(ifChanged: O => O2, ifUnchanged: O => O2)(implicit input: InputCache[O]): Task[O2] { type Input = O } = + def clear: Task[Unit] + def clean: Task[Unit] +} + +class Changed[O](val task: Task[O], val file: File)(implicit input: InputCache[O]) extends Tracked +{ + def clean = Task.empty + def clear = Clean(file) + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] { type Input = O } = task map { value => val cache = OpenResource.fileInputStream(file)(input.uptodate(value)) if(cache.uptodate) @@ -27,14 +26,24 @@ object DependencyTracking ifChanged(value) } } - def changed[T, F <: FileInfo](files: Set[File], style: FilesInfo.Style[F], cache: File) - (f: ChangeReport[File] => Task[T])(implicit mf: Manifest[F]): Task[T] = - 
changed(Task(files), style, cache)(f) - def changed[T, F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cache: File) - (f: ChangeReport[File] => Task[T])(implicit mf: Manifest[F]): Task[T] = +} +class Difference[F <: FileInfo](val filesTask: Task[Set[File]], val style: FilesInfo.Style[F], val cache: File, val shouldClean: Boolean)(implicit mf: Manifest[F]) extends Tracked +{ + def this(filesTask: Task[Set[File]], style: FilesInfo.Style[F], cache: File)(implicit mf: Manifest[F]) = this(filesTask, style, cache, false) + def this(files: Set[File], style: FilesInfo.Style[F], cache: File, shouldClean: Boolean)(implicit mf: Manifest[F]) = this(Task(files), style, cache) + def this(files: Set[File], style: FilesInfo.Style[F], cache: File)(implicit mf: Manifest[F]) = this(Task(files), style, cache, false) + + val clear = Clean(cache) + val clean = if(shouldClean) cleanTask else Task.empty + def cleanTask = Clean(Task(raw(cachedFilesInfo))) + + private def cachedFilesInfo = fromFile(style.formats)(cache).files + private def raw(fs: Set[F]): Set[File] = fs.map(_.file) + + def apply[T](f: ChangeReport[File] => Task[T]): Task[T] = filesTask bind { files => - val lastFilesInfo = fromFile(style.formats)(cache).files - val lastFiles = lastFilesInfo.map(_.file) + val lastFilesInfo = cachedFilesInfo + val lastFiles = raw(lastFilesInfo) val currentFiles = files.map(_.getAbsoluteFile) val currentFilesInfo = style(files) @@ -43,7 +52,7 @@ object DependencyTracking lazy val allInputs = currentFiles lazy val removed = lastFiles -- allInputs lazy val added = allInputs -- lastFiles - lazy val modified = (lastFilesInfo -- currentFilesInfo.files).map(_.file) + lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) lazy val unmodified = allInputs -- modified } @@ -52,20 +61,32 @@ object DependencyTracking result } } - def invalidate[R](changes: ChangeReport[File], cacheDirectory: File)(f: (InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = 
+} +object InvalidateFiles +{ + def apply(cacheDirectory: File): Invalidate[File] = apply(cacheDirectory, true) + def apply(cacheDirectory: File, translateProducts: Boolean): Invalidate[File] = { - val pruneAndF = (report: InvalidationReport[File], tracking: UpdateTracking[File]) => { - report.invalidProducts.foreach(_.delete) - f(report, tracking) - } - implicit val format = sbinary.DefaultProtocol.FileFormat - invalidate(Task(changes), cacheDirectory, true)(pruneAndF) + import sbinary.DefaultProtocol.FileFormat + new Invalidate[File](cacheDirectory, translateProducts, FileUtilities.delete) } - def invalidate[T,R](changesTask: Task[ChangeReport[T]], cacheDirectory: File, translateProducts: Boolean) - (f: (InvalidationReport[T], UpdateTracking[T]) => Task[R])(implicit format: Format[T], mf: Manifest[T]): Task[R] = +} +class Invalidate[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit) + (implicit format: Format[T], mf: Manifest[T]) extends Tracked +{ + def this(cacheDirectory: File, translateProducts: Boolean)(implicit format: Format[T], mf: Manifest[T]) = + this(cacheDirectory, translateProducts, x => ()) + + private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) + private def cleanAll(fs: Set[T]) = fs.foreach(cleanT) + + def clear = Clean(cacheDirectory) + def clean = Task(cleanAll(trackFormat.read.allProducts)) + def apply[R](changes: ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = + apply(Task(changes))(f) + def apply[R](changesTask: Task[ChangeReport[T]])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = { changesTask bind { changes => - val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) val tracker = trackFormat.read def invalidatedBy(file: T) = tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file) @@ -89,17 +110,36 @@ object DependencyTracking val invalidProducts = 
Set(invalidatedProducts.toSeq : _*) val valid = changes.unmodified -- invalid } - + cleanAll(report.invalidProducts) + f(report, tracker) map { result => trackFormat.write(tracker) result } } } - - import scala.collection.mutable.{Set, HashMap, MultiMap} - private[xsbt] type DependencyMap[T] = HashMap[T, Set[T]] with MultiMap[T, T] - private[xsbt] def newMap[T]: DependencyMap[T] = new HashMap[T, Set[T]] with MultiMap[T, T] +} +class BasicTracked[F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cacheDirectory: File)(implicit mf: Manifest[F]) extends Tracked +{ + private val changed = new Difference(filesTask, style, new File(cacheDirectory, "files")) + private val invalidation = InvalidateFiles(cacheDirectory) + val clean = invalidation.clean + val clear = Clean(cacheDirectory) + + def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = + changed { sourceChanges => + invalidation(sourceChanges) { (report, tracking) => + f(sourceChanges, report, tracking) + } + } +} +private object DependencyTracking +{ + import scala.collection.mutable.{Set, HashMap, Map, MultiMap} + type DependencyMap[T] = HashMap[T, Set[T]] with MultiMap[T, T] + def newMap[T]: DependencyMap[T] = new HashMap[T, Set[T]] with MultiMap[T, T] + type TagMap[T] = Map[T, Array[Byte]] + def newTagMap[T] = new HashMap[T, Array[Byte]] } trait UpdateTracking[T] extends NotNull @@ -107,6 +147,14 @@ trait UpdateTracking[T] extends NotNull def dependency(source: T, dependsOn: T): Unit def use(source: T, uses: T): Unit def product(source: T, output: T): Unit + def tag(source: T, t: Array[Byte]): Unit + def read: ReadTracking[T] +} +object Clean +{ + def apply(src: Task[Set[File]]): Task[Unit] = src map FileUtilities.delete + def apply(srcs: File*): Task[Unit] = Task(FileUtilities.delete(srcs)) + def apply(srcs: Set[File]): Task[Unit] = Task(FileUtilities.delete(srcs)) } import scala.collection.Set trait ReadTracking[T] extends NotNull @@ 
-115,9 +163,15 @@ trait ReadTracking[T] extends NotNull def products(file: T): Set[T] def sources(file: T): Set[T] def usedBy(file: T): Set[T] + def allProducts: Set[T] + def allSources: Set[T] + def allUsed: Set[T] + def allTags: Seq[(T,Array[Byte])] } -import DependencyTracking.{DependencyMap => DMap, newMap} -private final class DefaultTracking[T](translateProducts: Boolean)(val reverseDependencies: DMap[T], val reverseUses: DMap[T], val sourceMap: DMap[T]) extends DependencyTracking[T](translateProducts) +import DependencyTracking.{DependencyMap => DMap, newMap, TagMap} +private final class DefaultTracking[T](translateProducts: Boolean) + (val reverseDependencies: DMap[T], val reverseUses: DMap[T], val sourceMap: DMap[T], val tagMap: TagMap[T]) + extends DependencyTracking[T](translateProducts) { val productMap: DMap[T] = forward(sourceMap) // map from a source to its products. Keep in sync with sourceMap } @@ -128,11 +182,20 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends val reverseUses: DMap[T] // map from a file to the files that use it val sourceMap: DMap[T] // map from a product to its sources. Keep in sync with productMap val productMap: DMap[T] // map from a source to its products. 
Keep in sync with sourceMap + val tagMap: TagMap[T] + + def read = this final def dependsOn(file: T): Set[T] = get(reverseDependencies, file) final def products(file: T): Set[T] = get(productMap, file) final def sources(file: T): Set[T] = get(sourceMap, file) final def usedBy(file: T): Set[T] = get(reverseUses, file) + final def tag(file: T): Array[Byte] = tagMap.getOrElse(file, new Array[Byte](0)) + + final def allProducts = Set() ++ sourceMap.keys + final def allSources = Set() ++ productMap.keys + final def allUsed = Set() ++ reverseUses.keys + final def allTags = tagMap.toSeq private def get(map: DMap[T], value: T): Set[T] = map.getOrElse(value, Set.empty[T]) @@ -151,6 +214,7 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends sourceMap.add(product, sourceFile) } final def use(sourceFile: T, usesFile: T) { reverseUses.add(usesFile, sourceFile) } + final def tag(sourceFile: T, t: Array[Byte]) { tagMap(sourceFile) = t } final def removeAll(files: Iterable[T]) { @@ -162,6 +226,7 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends removeAll(forward(reverseDependencies), reverseDependencies) removeAll(productMap, sourceMap) removeAll(forward(reverseUses), reverseUses) + tagMap --= files } protected final def forward(map: DMap[T]): DMap[T] = { @@ -169,4 +234,4 @@ private abstract class DependencyTracking[T](translateProducts: Boolean) extends for( (key, values) <- map; value <- values) f.add(value, key) f } -} \ No newline at end of file +} diff --git a/cache/TrackingFormat.scala b/cache/TrackingFormat.scala index 166c22df2..d773d13a6 100644 --- a/cache/TrackingFormat.scala +++ b/cache/TrackingFormat.scala @@ -7,7 +7,7 @@ import sbinary.{DefaultProtocol, Format} import DefaultProtocol._ import TrackingFormat._ import CacheIO.{fromFile, toFile} -import DependencyTracking.{DependencyMap => DMap, newMap} +import DependencyTracking.{DependencyMap => DMap, newMap, TagMap} private class 
TrackingFormat[T](directory: File, translateProducts: Boolean)(implicit tFormat: Format[T], mf: Manifest[T]) extends NotNull { @@ -42,7 +42,11 @@ private object TrackingFormat } } def trackingFormat[T](translateProducts: Boolean)(implicit tFormat: Format[T]): Format[DependencyTracking[T]] = - asProduct3((a: DMap[T],b: DMap[T],c: DMap[T]) => new DefaultTracking(translateProducts)(a,b,c) : DependencyTracking[T])(dt => Some(dt.reverseDependencies, dt.reverseUses, dt.sourceMap)) + { + implicit val arrayFormat = sbinary.Operations.format[Array[Byte]] + asProduct4((a: DMap[T],b: DMap[T],c: DMap[T], d:TagMap[T]) => new DefaultTracking(translateProducts)(a,b,c,d) : DependencyTracking[T] + )(dt => Some(dt.reverseDependencies, dt.reverseUses, dt.sourceMap, dt.tagMap)) + } } private final class IndexMap[T] extends NotNull From 68d50ae56bd55df3aad75f036e74ca80d2c30c25 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 Aug 2009 13:01:02 -0400 Subject: [PATCH 012/823] Moved type parameter for FileInfo(s).Style to abstract type --- cache/Cache.scala | 1 + cache/DependencyTracking.scala | 139 -------------------------------- cache/FileInfo.scala | 34 +++++--- cache/Tracked.scala | 141 +++++++++++++++++++++++++++++++++ 4 files changed, 167 insertions(+), 148 deletions(-) create mode 100644 cache/Tracked.scala diff --git a/cache/Cache.scala b/cache/Cache.scala index 90677f29a..d6f15290a 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -2,6 +2,7 @@ package xsbt import sbinary.{CollectionTypes, Format, JavaFormats, Operations} import java.io.File +import scala.reflect.Manifest trait Cache[I,O] { diff --git a/cache/DependencyTracking.scala b/cache/DependencyTracking.scala index bd14ef998..107c88fc8 100644 --- a/cache/DependencyTracking.scala +++ b/cache/DependencyTracking.scala @@ -1,138 +1,5 @@ package xsbt -import java.io.File -import CacheIO.{fromFile, toFile} -import sbinary.Format -import scala.reflect.Manifest - -trait Tracked extends NotNull -{ - def clear: 
Task[Unit] - def clean: Task[Unit] -} - -class Changed[O](val task: Task[O], val file: File)(implicit input: InputCache[O]) extends Tracked -{ - def clean = Task.empty - def clear = Clean(file) - def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] { type Input = O } = - task map { value => - val cache = OpenResource.fileInputStream(file)(input.uptodate(value)) - if(cache.uptodate) - ifUnchanged(value) - else - { - OpenResource.fileOutputStream(false)(file)(cache.update) - ifChanged(value) - } - } -} -class Difference[F <: FileInfo](val filesTask: Task[Set[File]], val style: FilesInfo.Style[F], val cache: File, val shouldClean: Boolean)(implicit mf: Manifest[F]) extends Tracked -{ - def this(filesTask: Task[Set[File]], style: FilesInfo.Style[F], cache: File)(implicit mf: Manifest[F]) = this(filesTask, style, cache, false) - def this(files: Set[File], style: FilesInfo.Style[F], cache: File, shouldClean: Boolean)(implicit mf: Manifest[F]) = this(Task(files), style, cache) - def this(files: Set[File], style: FilesInfo.Style[F], cache: File)(implicit mf: Manifest[F]) = this(Task(files), style, cache, false) - - val clear = Clean(cache) - val clean = if(shouldClean) cleanTask else Task.empty - def cleanTask = Clean(Task(raw(cachedFilesInfo))) - - private def cachedFilesInfo = fromFile(style.formats)(cache).files - private def raw(fs: Set[F]): Set[File] = fs.map(_.file) - - def apply[T](f: ChangeReport[File] => Task[T]): Task[T] = - filesTask bind { files => - val lastFilesInfo = cachedFilesInfo - val lastFiles = raw(lastFilesInfo) - val currentFiles = files.map(_.getAbsoluteFile) - val currentFilesInfo = style(files) - - val report = new ChangeReport[File] - { - lazy val allInputs = currentFiles - lazy val removed = lastFiles -- allInputs - lazy val added = allInputs -- lastFiles - lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) - lazy val unmodified = allInputs -- modified - } - - f(report) map { result => - 
toFile(style.formats)(currentFilesInfo)(cache) - result - } - } -} -object InvalidateFiles -{ - def apply(cacheDirectory: File): Invalidate[File] = apply(cacheDirectory, true) - def apply(cacheDirectory: File, translateProducts: Boolean): Invalidate[File] = - { - import sbinary.DefaultProtocol.FileFormat - new Invalidate[File](cacheDirectory, translateProducts, FileUtilities.delete) - } -} -class Invalidate[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit) - (implicit format: Format[T], mf: Manifest[T]) extends Tracked -{ - def this(cacheDirectory: File, translateProducts: Boolean)(implicit format: Format[T], mf: Manifest[T]) = - this(cacheDirectory, translateProducts, x => ()) - - private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) - private def cleanAll(fs: Set[T]) = fs.foreach(cleanT) - - def clear = Clean(cacheDirectory) - def clean = Task(cleanAll(trackFormat.read.allProducts)) - def apply[R](changes: ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = - apply(Task(changes))(f) - def apply[R](changesTask: Task[ChangeReport[T]])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = - { - changesTask bind { changes => - val tracker = trackFormat.read - def invalidatedBy(file: T) = tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file) - - import scala.collection.mutable.HashSet - val invalidated = new HashSet[T] - val invalidatedProducts = new HashSet[T] - def invalidate(files: Iterable[T]): Unit = - for(file <- files if !invalidated(file)) - { - invalidated += file - if(!tracker.sources(file).isEmpty) invalidatedProducts += file - invalidate(invalidatedBy(file)) - } - - invalidate(changes.modified) - tracker.removeAll(invalidated) - - val report = new InvalidationReport[T] - { - val invalid = Set(invalidated.toSeq : _*) - val invalidProducts = Set(invalidatedProducts.toSeq : _*) - val valid = 
changes.unmodified -- invalid - } - cleanAll(report.invalidProducts) - - f(report, tracker) map { result => - trackFormat.write(tracker) - result - } - } - } -} -class BasicTracked[F <: FileInfo](filesTask: Task[Set[File]], style: FilesInfo.Style[F], cacheDirectory: File)(implicit mf: Manifest[F]) extends Tracked -{ - private val changed = new Difference(filesTask, style, new File(cacheDirectory, "files")) - private val invalidation = InvalidateFiles(cacheDirectory) - val clean = invalidation.clean - val clear = Clean(cacheDirectory) - - def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = - changed { sourceChanges => - invalidation(sourceChanges) { (report, tracking) => - f(sourceChanges, report, tracking) - } - } -} private object DependencyTracking { import scala.collection.mutable.{Set, HashMap, Map, MultiMap} @@ -150,12 +17,6 @@ trait UpdateTracking[T] extends NotNull def tag(source: T, t: Array[Byte]): Unit def read: ReadTracking[T] } -object Clean -{ - def apply(src: Task[Set[File]]): Task[Unit] = src map FileUtilities.delete - def apply(srcs: File*): Task[Unit] = Task(FileUtilities.delete(srcs)) - def apply(srcs: Set[File]): Task[Unit] = Task(FileUtilities.delete(srcs)) -} import scala.collection.Set trait ReadTracking[T] extends NotNull { diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index 0271ad0ec..435201049 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -4,6 +4,7 @@ import java.io.{File, IOException} import sbinary.{DefaultProtocol, Format} import DefaultProtocol._ import Function.tupled +import scala.reflect.Manifest sealed trait FileInfo extends NotNull { @@ -25,30 +26,39 @@ private final case class FileHashModified(file: File, hash: List[Byte], lastModi object FileInfo { - sealed trait Style[F <: FileInfo] extends NotNull + sealed trait Style extends NotNull { + type F <: FileInfo implicit def apply(file: File): F implicit def unapply(info: F): File = info.file 
implicit val format: Format[F] + /*val manifest: Manifest[F] + def formatManifest: Manifest[Format[F]] = CacheIO.manifest[Format[F]]*/ import Cache._ implicit def infoInputCache: InputCache[File] = wrapInputCache[File,F] implicit def infoOutputCache: OutputCache[File] = wrapOutputCache[File,F] } - object full extends Style[HashModifiedFileInfo] + object full extends Style { + type F = HashModifiedFileInfo + //val manifest: Manifest[F] = CacheIO.manifest[HashModifiedFileInfo] implicit def apply(file: File): HashModifiedFileInfo = make(file, Hash(file).toList, file.lastModified) def make(file: File, hash: List[Byte], lastModified: Long): HashModifiedFileInfo = FileHashModified(file.getAbsoluteFile, hash, lastModified) implicit val format: Format[HashModifiedFileInfo] = wrap(f => (f.file, f.hash, f.lastModified), tupled(make _)) } - object hash extends Style[HashFileInfo] + object hash extends Style { + type F = HashFileInfo + //val manifest: Manifest[F] = CacheIO.manifest[HashFileInfo] implicit def apply(file: File): HashFileInfo = make(file, computeHash(file).toList) def make(file: File, hash: List[Byte]): HashFileInfo = FileHash(file.getAbsoluteFile, hash) implicit val format: Format[HashFileInfo] = wrap(f => (f.file, f.hash), tupled(make _)) private def computeHash(file: File) = try { Hash(file) } catch { case e: Exception => Nil } } - object lastModified extends Style[ModifiedFileInfo] + object lastModified extends Style { + type F = ModifiedFileInfo + //val manifest: Manifest[F] = CacheIO.manifest[ModifiedFileInfo] implicit def apply(file: File): ModifiedFileInfo = make(file, file.lastModified) def make(file: File, lastModified: Long): ModifiedFileInfo = FileModified(file.getAbsoluteFile, lastModified) implicit val format: Format[ModifiedFileInfo] = wrap(f => (f.file, f.lastModified), tupled(make _)) @@ -58,21 +68,27 @@ object FileInfo final case class FilesInfo[F <: FileInfo] private(files: Set[F]) extends NotNull object FilesInfo { - sealed trait Style[F <: 
FileInfo] extends NotNull + sealed trait Style extends NotNull { + val fileStyle: FileInfo.Style + type F = fileStyle.F + //def manifest: Manifest[F] = fileStyle.manifest implicit def apply(files: Set[File]): FilesInfo[F] implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) implicit val formats: Format[FilesInfo[F]] + val manifest: Manifest[Format[FilesInfo[F]]] import Cache._ implicit def infosInputCache: InputCache[Set[File]] = wrapInputCache[Set[File],FilesInfo[F]] implicit def infosOutputCache: OutputCache[Set[File]] = wrapOutputCache[Set[File],FilesInfo[F]] } - private final class BasicStyle[F <: FileInfo](fileStyle: FileInfo.Style[F])(implicit infoFormat: Format[F]) extends Style[F] + private final class BasicStyle[FI <: FileInfo](val fileStyle: FileInfo.Style { type F = FI }) + (implicit val manifest: Manifest[Format[FilesInfo[FI]]]) extends Style { + private implicit val infoFormat: Format[FI] = fileStyle.format implicit def apply(files: Set[File]): FilesInfo[F] = FilesInfo( files.map(_.getAbsoluteFile).map(fileStyle.apply) ) implicit val formats: Format[FilesInfo[F]] = wrap(_.files, (fs: Set[F]) => new FilesInfo(fs)) } - lazy val full: Style[HashModifiedFileInfo] = new BasicStyle(FileInfo.full)(FileInfo.full.format) - lazy val hash: Style[HashFileInfo] = new BasicStyle(FileInfo.hash)(FileInfo.hash.format) - lazy val lastModified: Style[ModifiedFileInfo] = new BasicStyle(FileInfo.lastModified)(FileInfo.lastModified.format) + lazy val full: Style = new BasicStyle(FileInfo.full) + lazy val hash: Style = new BasicStyle(FileInfo.hash) + lazy val lastModified: Style = new BasicStyle(FileInfo.lastModified) } \ No newline at end of file diff --git a/cache/Tracked.scala b/cache/Tracked.scala new file mode 100644 index 000000000..530e292fc --- /dev/null +++ b/cache/Tracked.scala @@ -0,0 +1,141 @@ +package xsbt + +import java.io.File +import CacheIO.{fromFile, toFile} +import sbinary.Format +import scala.reflect.Manifest + +trait Tracked 
extends NotNull +{ + def clear: Task[Unit] + def clean: Task[Unit] +} +object Clean +{ + def apply(src: Task[Set[File]]): Task[Unit] = src map FileUtilities.delete + def apply(srcs: File*): Task[Unit] = Task(FileUtilities.delete(srcs)) + def apply(srcs: Set[File]): Task[Unit] = Task(FileUtilities.delete(srcs)) +} + +class Changed[O](val task: Task[O], val file: File)(implicit input: InputCache[O]) extends Tracked +{ + def clean = Task.empty + def clear = Clean(file) + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] { type Input = O } = + task map { value => + val cache = OpenResource.fileInputStream(file)(input.uptodate(value)) + if(cache.uptodate) + ifUnchanged(value) + else + { + OpenResource.fileOutputStream(false)(file)(cache.update) + ifChanged(value) + } + } +} +class Difference(val filesTask: Task[Set[File]], val style: FilesInfo.Style, val cache: File, val shouldClean: Boolean) extends Tracked +{ + def this(filesTask: Task[Set[File]], style: FilesInfo.Style, cache: File) = this(filesTask, style, cache, false) + def this(files: Set[File], style: FilesInfo.Style, cache: File, shouldClean: Boolean) = this(Task(files), style, cache) + def this(files: Set[File], style: FilesInfo.Style, cache: File) = this(Task(files), style, cache, false) + + val clear = Clean(cache) + val clean = if(shouldClean) cleanTask else Task.empty + def cleanTask = Clean(Task(raw(cachedFilesInfo))) + + private def cachedFilesInfo = fromFile(style.formats)(cache)(style.manifest).files + private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) + + def apply[T](f: ChangeReport[File] => Task[T]): Task[T] = + filesTask bind { files => + val lastFilesInfo = cachedFilesInfo + val lastFiles = raw(lastFilesInfo) + val currentFiles = files.map(_.getAbsoluteFile) + val currentFilesInfo = style(files) + + val report = new ChangeReport[File] + { + lazy val allInputs = currentFiles + lazy val removed = lastFiles -- allInputs + lazy val added = allInputs -- lastFiles + lazy val 
modified = raw(lastFilesInfo -- currentFilesInfo.files) + lazy val unmodified = allInputs -- modified + } + + f(report) map { result => + toFile(style.formats)(currentFilesInfo)(cache)(style.manifest) + result + } + } +} +object InvalidateFiles +{ + def apply(cacheDirectory: File): Invalidate[File] = apply(cacheDirectory, true) + def apply(cacheDirectory: File, translateProducts: Boolean): Invalidate[File] = + { + import sbinary.DefaultProtocol.FileFormat + new Invalidate[File](cacheDirectory, translateProducts, FileUtilities.delete) + } +} +class Invalidate[T](val cacheDirectory: File, val translateProducts: Boolean, cleanT: T => Unit) + (implicit format: Format[T], mf: Manifest[T]) extends Tracked +{ + def this(cacheDirectory: File, translateProducts: Boolean)(implicit format: Format[T], mf: Manifest[T]) = + this(cacheDirectory, translateProducts, x => ()) + + private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) + private def cleanAll(fs: Set[T]) = fs.foreach(cleanT) + + def clear = Clean(cacheDirectory) + def clean = Task(cleanAll(trackFormat.read.allProducts)) + def apply[R](changes: ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = + apply(Task(changes))(f) + def apply[R](changesTask: Task[ChangeReport[T]])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = + { + changesTask bind { changes => + val tracker = trackFormat.read + def invalidatedBy(file: T) = tracker.products(file) ++ tracker.sources(file) ++ tracker.usedBy(file) ++ tracker.dependsOn(file) + + import scala.collection.mutable.HashSet + val invalidated = new HashSet[T] + val invalidatedProducts = new HashSet[T] + def invalidate(files: Iterable[T]): Unit = + for(file <- files if !invalidated(file)) + { + invalidated += file + if(!tracker.sources(file).isEmpty) invalidatedProducts += file + invalidate(invalidatedBy(file)) + } + + invalidate(changes.modified) + tracker.removeAll(invalidated) + + val report = new 
InvalidationReport[T] + { + val invalid = Set(invalidated.toSeq : _*) + val invalidProducts = Set(invalidatedProducts.toSeq : _*) + val valid = changes.unmodified -- invalid + } + cleanAll(report.invalidProducts) + + f(report, tracker) map { result => + trackFormat.write(tracker) + result + } + } + } +} +class BasicTracked(filesTask: Task[Set[File]], style: FilesInfo.Style, cacheDirectory: File) extends Tracked +{ + private val changed = new Difference(filesTask, style, new File(cacheDirectory, "files")) + private val invalidation = InvalidateFiles(cacheDirectory) + val clean = invalidation.clean + val clear = Clean(cacheDirectory) + + def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = + changed { sourceChanges => + invalidation(sourceChanges) { (report, tracking) => + f(sourceChanges, report, tracking) + } + } +} \ No newline at end of file From e69bdb856050b00d225ea85a4cc872f135f18830 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 Aug 2009 21:47:33 -0400 Subject: [PATCH 013/823] Removed tuple caches. Removing these 16 implicits brought compile time for Cache subproject down to 7s from 17s. 
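The tuple caches removed by this patch were built on implicit tuple-to-HList bijections: with those conversions in scope, the generic HCons/HNil cache instances covered tuples, at the cost of 16 extra implicits the compiler had to consider on every search. A minimal, self-contained sketch of the underlying bijection, using a simplified HList in place of sbt's (all names here are illustrative, not sbt API):

```scala
// Simplified HList, standing in for the one in sbt's cache module.
sealed trait HList
final case class HCons[H, T <: HList](head: H, tail: T) extends HList
case object HNil extends HList

object TupleHListSketch {
  // The deleted implicits were essentially bijections like these, one pair
  // per tuple arity, so a cache derived for A :: B :: HNil applied to (A, B).
  def tuple2HList[A, B](t: (A, B)): HCons[A, HCons[B, HNil.type]] =
    HCons(t._1, HCons(t._2, HNil))
  def hListTuple2[A, B](l: HCons[A, HCons[B, HNil.type]]): (A, B) =
    (l.head, l.tail.head)
}
```

Because the two functions are mutual inverses, a single `wrapInputCache[(A,B), A :: B :: HNil]` could reuse the HList cache for tuples; dropping the implicits trades that convenience for the reported compile-time win (17s down to 7s).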
--- cache/Cache.scala | 31 ++----------------------------- 1 file changed, 2 insertions(+), 29 deletions(-) diff --git a/cache/Cache.scala b/cache/Cache.scala index d6f15290a..011d35211 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -1,6 +1,6 @@ package xsbt -import sbinary.{CollectionTypes, Format, JavaFormats, Operations} +import sbinary.{CollectionTypes, Format, JavaFormats} import java.io.File import scala.reflect.Manifest @@ -12,7 +12,7 @@ trait SBinaryFormats extends CollectionTypes with JavaFormats with NotNull { //TODO: add basic types from SBinary minus FileFormat } -object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits with TupleCacheImplicits +object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImplicits { def cache[I,O](implicit c: Cache[I,O]): Cache[I,O] = c def outputCache[O](implicit c: OutputCache[O]): OutputCache[O] = c @@ -56,31 +56,4 @@ trait HListCacheImplicits extends HLists implicit def hConsOutputCache[H,T<:HList](implicit headCache: OutputCache[H], tailCache: OutputCache[T]): OutputCache[HCons[H,T]] = new HConsOutputCache(headCache, tailCache) implicit lazy val hNilOutputCache: OutputCache[HNil] = new HNilOutputCache -} -trait TupleCacheImplicits extends HLists -{ - import Cache._ - implicit def tuple2HList[A,B](t: (A,B)): A :: B :: HNil = t._1 :: t._2 :: HNil - implicit def hListTuple2[A,B](t: A :: B :: HNil): (A,B) = t match { case a :: b :: HNil => (a,b) } - - implicit def tuple2InputCache[A,B](implicit aCache: InputCache[A], bCache: InputCache[B]): InputCache[(A,B)] = - wrapInputCache[(A,B), A :: B :: HNil] - implicit def tuple2OutputCache[A,B](implicit aCache: OutputCache[A], bCache: OutputCache[B]): OutputCache[(A,B)] = - wrapOutputCache[(A,B), A :: B :: HNil] - - implicit def tuple3HList[A,B,C](t: (A,B,C)): A :: B :: C :: HNil = t._1 :: t._2 :: t._3 :: HNil - implicit def hListTuple3[A,B,C](t: A :: B :: C :: HNil): (A,B,C) = t match { case a :: b :: c :: HNil => 
(a,b,c) } - - implicit def tuple3InputCache[A,B,C](implicit aCache: InputCache[A], bCache: InputCache[B], cCache: InputCache[C]): InputCache[(A,B,C)] = - wrapInputCache[(A,B,C), A :: B :: C :: HNil] - implicit def tuple3OutputCache[A,B,C](implicit aCache: OutputCache[A], bCache: OutputCache[B], cCache: OutputCache[C]): OutputCache[(A,B,C)] = - wrapOutputCache[(A,B,C), A :: B :: C :: HNil] - - implicit def tuple4HList[A,B,C,D](t: (A,B,C,D)): A :: B :: C :: D :: HNil = t._1 :: t._2 :: t._3 :: t._4 :: HNil - implicit def hListTuple4[A,B,C,D](t: A :: B :: C :: D :: HNil): (A,B,C,D) = t match { case a :: b :: c :: d:: HNil => (a,b,c,d) } - - implicit def tuple4InputCache[A,B,C,D](implicit aCache: InputCache[A], bCache: InputCache[B], cCache: InputCache[C], dCache: InputCache[D]): InputCache[(A,B,C,D)] = - wrapInputCache[(A,B,C,D), A :: B :: C :: D :: HNil] - implicit def tuple4OutputCache[A,B,C,D](implicit aCache: OutputCache[A], bCache: OutputCache[B], cCache: OutputCache[C], dCache: OutputCache[D]): OutputCache[(A,B,C,D)] = - wrapOutputCache[(A,B,C,D), A :: B :: C :: D :: HNil] } \ No newline at end of file From aa8dfc5a513dd0b718d06e05d06db1138d4e145f Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 30 Aug 2009 21:53:38 -0400 Subject: [PATCH 014/823] General improvement of tasks/caches/tracking: - Specify behavior of ChangeReport and give it a toString implementation. - Cache initialization. - Specify cleaning behavior on TaskDefinition and Tracked instances. - Sync task implementation handles output changes. 
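The ChangeReport behavior this patch pins down can be sketched without the Task machinery. Assuming file stamps are reduced to a Map from path to modification time (a simplification of FilesInfo; all names below are illustrative), Difference derives the report sets as follows: added and removed compare membership only, while modified also folds in stale stamps and everything present in only one set:

```scala
object ChangeReportSketch {
  final case class Report(checked: Set[String], added: Set[String], removed: Set[String],
                          modified: Set[String], unmodified: Set[String])

  def changeReport(last: Map[String, Long], current: Map[String, Long]): Report = {
    val checked = current.keySet          // everything in the current set
    val lastFiles = last.keySet
    val added = checked -- lastFiles      // present now, absent before
    val removed = lastFiles -- checked    // present before, absent now
    // previous entries whose info is not reproduced now: changed stamps plus
    // removals (this mirrors lastFilesInfo -- currentFilesInfo.files)
    val stale = lastFiles.filter(f => current.get(f) != Some(last(f)))
    val modified = stale ++ added
    Report(checked, added, removed, modified, checked -- modified)
  }
}
```

For `last = Map(a->1, b->1, c->1)` and `current = Map(a->1, b->2, d->1)` this yields added = {d}, removed = {c}, modified = {b, c, d}, unmodified = {a}, matching the documented rule that modified contains objects changed in both sets as well as objects present in only one.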
--- cache/CacheIO.scala | 14 ++++--- cache/ChangeReport.scala | 31 ++++++++++++---- cache/DependencyTracking.scala | 7 +++- cache/FileInfo.scala | 3 +- cache/Tracked.scala | 67 ++++++++++++++++++++-------------- cache/TrackingFormat.scala | 14 +++---- 6 files changed, 86 insertions(+), 50 deletions(-) diff --git a/cache/CacheIO.scala b/cache/CacheIO.scala index ba4cc0edc..05768bab9 100644 --- a/cache/CacheIO.scala +++ b/cache/CacheIO.scala @@ -1,19 +1,23 @@ package xsbt -import java.io.File +import java.io.{File, FileNotFoundException} import sbinary.{DefaultProtocol, Format, Operations} import scala.reflect.Manifest object CacheIO { - def fromFile[T](format: Format[T])(file: File)(implicit mf: Manifest[Format[T]]): T = - fromFile(file)(format, mf) - def fromFile[T](file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): T = - Operations.fromFile(file)(stampedFormat(format)) + def fromFile[T](format: Format[T], default: => T)(file: File)(implicit mf: Manifest[Format[T]]): T = + fromFile(file, default)(format, mf) + def fromFile[T](file: File, default: => T)(implicit format: Format[T], mf: Manifest[Format[T]]): T = + try { Operations.fromFile(file)(stampedFormat(format)) } + catch { case e: FileNotFoundException => default } def toFile[T](format: Format[T])(value: T)(file: File)(implicit mf: Manifest[Format[T]]): Unit = toFile(value)(file)(format, mf) def toFile[T](value: T)(file: File)(implicit format: Format[T], mf: Manifest[Format[T]]): Unit = + { + FileUtilities.createDirectory(file.getParentFile) Operations.toFile(value)(file)(stampedFormat(format)) + } def stampedFormat[T](format: Format[T])(implicit mf: Manifest[Format[T]]): Format[T] = { import DefaultProtocol._ diff --git a/cache/ChangeReport.scala b/cache/ChangeReport.scala index a16b875dc..41f99ca1a 100644 --- a/cache/ChangeReport.scala +++ b/cache/ChangeReport.scala @@ -5,43 +5,60 @@ object ChangeReport def modified[T](files: Set[T]) = new EmptyChangeReport[T] { - override def allInputs = files 
+ override def checked = files override def modified = files override def markAllModified = this } def unmodified[T](files: Set[T]) = new EmptyChangeReport[T] { - override def allInputs = files + override def checked = files override def unmodified = files } } +/** The result of comparing some current set of objects against a previous set of objects.*/ trait ChangeReport[T] extends NotNull { - def allInputs: Set[T] + /** The set of all of the objects in the current set.*/ + def checked: Set[T] + /** All of the objects that are in the same state in the current and reference sets.*/ def unmodified: Set[T] + /** All checked objects that are not in the same state as the reference. This includes objects that are in both + * sets but have changed and objects that are only in one set.*/ def modified: Set[T] // all changes, including added + /** All objects that are only in the current set.*/ def added: Set[T] + /** All objects that are only in the previous set.*/ def removed: Set[T] def +++(other: ChangeReport[T]): ChangeReport[T] = new CompoundChangeReport(this, other) + /** Generate a new report with this report's unmodified set included in the new report's modified set. The new report's + * unmodified set is empty. The new report's added, removed, and checked sets are the same as in this report. 
*/ def markAllModified: ChangeReport[T] = new ChangeReport[T] { - def allInputs = ChangeReport.this.allInputs + def checked = ChangeReport.this.checked def unmodified = Set.empty[T] - def modified = ChangeReport.this.allInputs + def modified = ChangeReport.this.checked def added = ChangeReport.this.added def removed = ChangeReport.this.removed override def markAllModified = this } + override def toString = + { + val labels = List("Checked", "Modified", "Unmodified", "Added", "Removed") + val sets = List(checked, modified, unmodified, added, removed) + val keyValues = labels.zip(sets).map{ case (label, set) => label + ": " + set.mkString(", ") } + keyValues.mkString("Change report:\n\t", "\n\t", "") + } } class EmptyChangeReport[T] extends ChangeReport[T] { - def allInputs = Set.empty[T] + def checked = Set.empty[T] def unmodified = Set.empty[T] def modified = Set.empty[T] def added = Set.empty[T] def removed = Set.empty[T] + override def toString = "No changes" } trait InvalidationReport[T] extends NotNull { @@ -51,7 +68,7 @@ trait InvalidationReport[T] extends NotNull } private class CompoundChangeReport[T](a: ChangeReport[T], b: ChangeReport[T]) extends ChangeReport[T] { - lazy val allInputs = a.allInputs ++ b.allInputs + lazy val checked = a.checked ++ b.checked lazy val unmodified = a.unmodified ++ b.unmodified lazy val modified = a.modified ++ b.modified lazy val added = a.added ++ b.added diff --git a/cache/DependencyTracking.scala b/cache/DependencyTracking.scala index 107c88fc8..5d61f020e 100644 --- a/cache/DependencyTracking.scala +++ b/cache/DependencyTracking.scala @@ -29,7 +29,12 @@ trait ReadTracking[T] extends NotNull def allUsed: Set[T] def allTags: Seq[(T,Array[Byte])] } -import DependencyTracking.{DependencyMap => DMap, newMap, TagMap} +import DependencyTracking.{DependencyMap => DMap, newMap, newTagMap, TagMap} +private object DefaultTracking +{ + def apply[T](translateProducts: Boolean): DependencyTracking[T] = + new 
DefaultTracking(translateProducts)(newMap, newMap, newMap, newTagMap) +} private final class DefaultTracking[T](translateProducts: Boolean) (val reverseDependencies: DMap[T], val reverseUses: DMap[T], val sourceMap: DMap[T], val tagMap: TagMap[T]) extends DependencyTracking[T](translateProducts) diff --git a/cache/FileInfo.scala b/cache/FileInfo.scala index 435201049..66c8c496b 100644 --- a/cache/FileInfo.scala +++ b/cache/FileInfo.scala @@ -68,7 +68,7 @@ object FileInfo final case class FilesInfo[F <: FileInfo] private(files: Set[F]) extends NotNull object FilesInfo { - sealed trait Style extends NotNull + sealed abstract class Style extends NotNull { val fileStyle: FileInfo.Style type F = fileStyle.F @@ -77,6 +77,7 @@ object FilesInfo implicit def unapply(info: FilesInfo[F]): Set[File] = info.files.map(_.file) implicit val formats: Format[FilesInfo[F]] val manifest: Manifest[Format[FilesInfo[F]]] + def empty: FilesInfo[F] = new FilesInfo(Set.empty) import Cache._ implicit def infosInputCache: InputCache[Set[File]] = wrapInputCache[Set[File],FilesInfo[F]] implicit def infosOutputCache: OutputCache[Set[File]] = wrapOutputCache[Set[File],FilesInfo[F]] diff --git a/cache/Tracked.scala b/cache/Tracked.scala index 530e292fc..375d24137 100644 --- a/cache/Tracked.scala +++ b/cache/Tracked.scala @@ -4,11 +4,14 @@ import java.io.File import CacheIO.{fromFile, toFile} import sbinary.Format import scala.reflect.Manifest +import Task.{iterableToBuilder, iterableToForkBuilder} trait Tracked extends NotNull { - def clear: Task[Unit] + /** Cleans outputs. This operation might require information from the cache, so it should be called first if clear is also called.*/ def clean: Task[Unit] + /** Clears the cache. 
If also cleaning, 'clean' should be called first as it might require information from the cache.*/ + def clear: Task[Unit] } object Clean { @@ -17,33 +20,38 @@ object Clean def apply(srcs: Set[File]): Task[Unit] = Task(FileUtilities.delete(srcs)) } -class Changed[O](val task: Task[O], val file: File)(implicit input: InputCache[O]) extends Tracked +class Changed[O](val task: Task[O], val cacheFile: File)(implicit input: InputCache[O]) extends Tracked { - def clean = Task.empty - def clear = Clean(file) + val clean = Clean(cacheFile) + def clear = Task.empty def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] { type Input = O } = task map { value => - val cache = OpenResource.fileInputStream(file)(input.uptodate(value)) + val cache = OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) if(cache.uptodate) ifUnchanged(value) else { - OpenResource.fileOutputStream(false)(file)(cache.update) + OpenResource.fileOutputStream(false)(cacheFile)(cache.update) ifChanged(value) } } } -class Difference(val filesTask: Task[Set[File]], val style: FilesInfo.Style, val cache: File, val shouldClean: Boolean) extends Tracked +object Difference { - def this(filesTask: Task[Set[File]], style: FilesInfo.Style, cache: File) = this(filesTask, style, cache, false) - def this(files: Set[File], style: FilesInfo.Style, cache: File, shouldClean: Boolean) = this(Task(files), style, cache) - def this(files: Set[File], style: FilesInfo.Style, cache: File) = this(Task(files), style, cache, false) - + sealed class Constructor private[Difference](defineClean: Boolean, filesAreOutputs: Boolean) extends NotNull + { + def apply(filesTask: Task[Set[File]], style: FilesInfo.Style, cache: File): Difference = new Difference(filesTask, style, cache, defineClean, filesAreOutputs) + def apply(files: Set[File], style: FilesInfo.Style, cache: File): Difference = apply(Task(files), style, cache) + } + object outputs extends Constructor(true, true) + object inputs extends Constructor(false, 
false) +} +class Difference(val filesTask: Task[Set[File]], val style: FilesInfo.Style, val cache: File, val defineClean: Boolean, val filesAreOutputs: Boolean) extends Tracked +{ + val clean = if(defineClean) Clean(Task(raw(cachedFilesInfo))) else Task.empty val clear = Clean(cache) - val clean = if(shouldClean) cleanTask else Task.empty - def cleanTask = Clean(Task(raw(cachedFilesInfo))) - private def cachedFilesInfo = fromFile(style.formats)(cache)(style.manifest).files + private def cachedFilesInfo = fromFile(style.formats, style.empty)(cache)(style.manifest).files private def raw(fs: Set[style.F]): Set[File] = fs.map(_.file) def apply[T](f: ChangeReport[File] => Task[T]): Task[T] = @@ -51,19 +59,20 @@ class Difference(val filesTask: Task[Set[File]], val style: FilesInfo.Style, val val lastFilesInfo = cachedFilesInfo val lastFiles = raw(lastFilesInfo) val currentFiles = files.map(_.getAbsoluteFile) - val currentFilesInfo = style(files) + val currentFilesInfo = style(currentFiles) val report = new ChangeReport[File] { - lazy val allInputs = currentFiles - lazy val removed = lastFiles -- allInputs - lazy val added = allInputs -- lastFiles - lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) - lazy val unmodified = allInputs -- modified + lazy val checked = currentFiles + lazy val removed = lastFiles -- checked // all files that were included previously but not this time. This is independent of whether the files exist. + lazy val added = checked -- lastFiles // all files included now but not previously. This is independent of whether the files exist. 
+ lazy val modified = raw(lastFilesInfo -- currentFilesInfo.files) ++ added + lazy val unmodified = checked -- modified } f(report) map { result => - toFile(style.formats)(currentFilesInfo)(cache)(style.manifest) + val info = if(filesAreOutputs) style(currentFiles) else currentFilesInfo + toFile(style.formats)(info)(cache)(style.manifest) result } } @@ -85,9 +94,10 @@ class Invalidate[T](val cacheDirectory: File, val translateProducts: Boolean, cl private val trackFormat = new TrackingFormat[T](cacheDirectory, translateProducts) private def cleanAll(fs: Set[T]) = fs.foreach(cleanT) - - def clear = Clean(cacheDirectory) - def clean = Task(cleanAll(trackFormat.read.allProducts)) + + val clean = Task(cleanAll(trackFormat.read.allProducts)) + val clear = Clean(cacheDirectory) + def apply[R](changes: ChangeReport[T])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = apply(Task(changes))(f) def apply[R](changesTask: Task[ChangeReport[T]])(f: (InvalidationReport[T], UpdateTracking[T]) => Task[R]): Task[R] = @@ -127,10 +137,11 @@ class Invalidate[T](val cacheDirectory: File, val translateProducts: Boolean, cl } class BasicTracked(filesTask: Task[Set[File]], style: FilesInfo.Style, cacheDirectory: File) extends Tracked { - private val changed = new Difference(filesTask, style, new File(cacheDirectory, "files")) - private val invalidation = InvalidateFiles(cacheDirectory) - val clean = invalidation.clean - val clear = Clean(cacheDirectory) + private val changed = Difference.inputs(filesTask, style, new File(cacheDirectory, "files")) + private val invalidation = InvalidateFiles(new File(cacheDirectory, "invalidation")) + private def onTracked(f: Tracked => Task[Unit]) = Seq(invalidation, changed).forkTasks(f).joinIgnore + val clear = onTracked(_.clear) + val clean = onTracked(_.clean) def apply[R](f: (ChangeReport[File], InvalidationReport[File], UpdateTracking[File]) => Task[R]): Task[R] = changed { sourceChanges => diff --git a/cache/TrackingFormat.scala 
b/cache/TrackingFormat.scala index d773d13a6..c1bb3da56 100644 --- a/cache/TrackingFormat.scala +++ b/cache/TrackingFormat.scala @@ -15,10 +15,10 @@ private class TrackingFormat[T](directory: File, translateProducts: Boolean)(imp val dependencyFile = new File(directory, "dependencies") def read(): DependencyTracking[T] = { - val indexMap = CacheIO.fromFile[Map[Int,T]](indexFile) - val indexedFormat = wrap[T,Int](ignore => error("Read-only"), indexMap.apply) + val indexMap = CacheIO.fromFile[Map[Int,T]](indexFile, new HashMap[Int,T]) + val indexedFormat = wrap[T,Int](ignore => error("Read-only"), i => indexMap.getOrElse(i, error("Index " + i + " not found"))) val trackFormat = trackingFormat(translateProducts)(indexedFormat) - fromFile(trackFormat)(dependencyFile) + fromFile(trackFormat, DefaultTracking[T](translateProducts))(dependencyFile) } def write(tracking: DependencyTracking[T]) { @@ -42,17 +42,15 @@ private object TrackingFormat } } def trackingFormat[T](translateProducts: Boolean)(implicit tFormat: Format[T]): Format[DependencyTracking[T]] = - { - implicit val arrayFormat = sbinary.Operations.format[Array[Byte]] asProduct4((a: DMap[T],b: DMap[T],c: DMap[T], d:TagMap[T]) => new DefaultTracking(translateProducts)(a,b,c,d) : DependencyTracking[T] )(dt => Some(dt.reverseDependencies, dt.reverseUses, dt.sourceMap, dt.tagMap)) - } } private final class IndexMap[T] extends NotNull { private[this] var lastIndex = 0 private[this] val map = new HashMap[T, Int] - def indices = map.toArray.map( (_: (T,Int)).swap ) - def apply(t: T) = map.getOrElseUpdate(t, { lastIndex += 1; lastIndex }) + private[this] def nextIndex = { lastIndex += 1; lastIndex } + def indices = HashMap(map.map( (_: (T,Int)).swap ).toSeq : _*) + def apply(t: T) = map.getOrElseUpdate(t, nextIndex) } \ No newline at end of file From 7abdc45e936c1d1ad5ddff379ae65150baa04979 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 31 Aug 2009 10:41:59 -0400 Subject: [PATCH 015/823] Helper CacheResult 
subclass --- cache/SeparatedCache.scala | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/cache/SeparatedCache.scala b/cache/SeparatedCache.scala index f9b212f4a..91ecda713 100644 --- a/cache/SeparatedCache.scala +++ b/cache/SeparatedCache.scala @@ -9,6 +9,11 @@ trait CacheResult def uptodate: Boolean def update(stream: OutputStream): Unit } +class ForceResult[I](inCache: InputCache[I])(in: I) extends CacheResult +{ + def uptodate = false + def update(stream: OutputStream) = inCache.force(in)(stream) +} trait InputCache[I] extends NotNull { def uptodate(in: I)(cacheStream: InputStream): CacheResult From aaba9b7ca7a9bdce163ed47c9f5f430975175432 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 31 Aug 2009 10:43:41 -0400 Subject: [PATCH 016/823] Correct cache initialization in Changed tracker. --- cache/Tracked.scala | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/cache/Tracked.scala b/cache/Tracked.scala index 375d24137..7eba581a0 100644 --- a/cache/Tracked.scala +++ b/cache/Tracked.scala @@ -1,6 +1,6 @@ package xsbt -import java.io.File +import java.io.{File,IOException} import CacheIO.{fromFile, toFile} import sbinary.Format import scala.reflect.Manifest @@ -26,7 +26,9 @@ class Changed[O](val task: Task[O], val cacheFile: File)(implicit input: InputCa def clear = Task.empty def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] { type Input = O } = task map { value => - val cache = OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) + val cache = + try { OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) } + catch { case _: IOException => new ForceResult(input)(value) } if(cache.uptodate) ifUnchanged(value) else From 140e2cbcb611aa0a8dc796ec10cd42fc68f696f7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 31 Aug 2009 10:45:32 -0400 Subject: [PATCH 017/823] Tracking subproject. 
--- cache/{ => tracking}/ChangeReport.scala | 0 cache/{ => tracking}/DependencyTracking.scala | 0 cache/{ => tracking}/Tracked.scala | 0 cache/{ => tracking}/TrackingFormat.scala | 0 4 files changed, 0 insertions(+), 0 deletions(-) rename cache/{ => tracking}/ChangeReport.scala (100%) rename cache/{ => tracking}/DependencyTracking.scala (100%) rename cache/{ => tracking}/Tracked.scala (100%) rename cache/{ => tracking}/TrackingFormat.scala (100%) diff --git a/cache/ChangeReport.scala b/cache/tracking/ChangeReport.scala similarity index 100% rename from cache/ChangeReport.scala rename to cache/tracking/ChangeReport.scala diff --git a/cache/DependencyTracking.scala b/cache/tracking/DependencyTracking.scala similarity index 100% rename from cache/DependencyTracking.scala rename to cache/tracking/DependencyTracking.scala diff --git a/cache/Tracked.scala b/cache/tracking/Tracked.scala similarity index 100% rename from cache/Tracked.scala rename to cache/tracking/Tracked.scala diff --git a/cache/TrackingFormat.scala b/cache/tracking/TrackingFormat.scala similarity index 100% rename from cache/TrackingFormat.scala rename to cache/tracking/TrackingFormat.scala From b094fc3cc856ba7b76941d04ab7fa2dd269e10c4 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Thu, 3 Sep 2009 23:40:47 -0400 Subject: [PATCH 018/823] Mostly working cross-compile task. Analyzer plugin is now a proper internal phase to get around bootstrapping issues. Correctly handle source tags. 
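The toBytes/fromBytes pair added to CacheIO in this patch treats an empty byte array as a cache miss and falls back to the caller-supplied default. A stripped-down sketch of that contract, substituting a UTF-8 String payload for sbinary's stamped Format machinery (illustrative only, not the sbt API):

```scala
import java.nio.charset.StandardCharsets.UTF_8

object CacheIOSketch {
  def toBytes(value: String): Array[Byte] = value.getBytes(UTF_8)

  // `default` is by-name, as in the patch, so it is only evaluated on a miss.
  def fromBytes(default: => String)(bytes: Array[Byte]): String =
    if (bytes.isEmpty) default      // empty/missing cache entry -> default
    else new String(bytes, UTF_8)
}
```

A round trip returns the stored value, while an empty array yields the default instead of failing, so callers can read entries that may never have been written.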
--- cache/CacheIO.scala | 9 +++++++++ .../main/java/xsbti/AnalysisCallbackContainer.java | 12 ------------ 2 files changed, 9 insertions(+), 12 deletions(-) delete mode 100644 interface/src/main/java/xsbti/AnalysisCallbackContainer.java diff --git a/cache/CacheIO.scala b/cache/CacheIO.scala index 05768bab9..e5c643c6a 100644 --- a/cache/CacheIO.scala +++ b/cache/CacheIO.scala @@ -6,6 +6,15 @@ import scala.reflect.Manifest object CacheIO { + def toBytes[T](format: Format[T])(value: T)(implicit mf: Manifest[Format[T]]): Array[Byte] = + toBytes[T](value)(format, mf) + def toBytes[T](value: T)(implicit format: Format[T], mf: Manifest[Format[T]]): Array[Byte] = + Operations.toByteArray(value)(stampedFormat(format)) + def fromBytes[T](format: Format[T], default: => T)(bytes: Array[Byte])(implicit mf: Manifest[Format[T]]): T = + fromBytes(default)(bytes)(format, mf) + def fromBytes[T](default: => T)(bytes: Array[Byte])(implicit format: Format[T], mf: Manifest[Format[T]]): T = + if(bytes.isEmpty) default else Operations.fromByteArray(bytes)(stampedFormat(format)) + def fromFile[T](format: Format[T], default: => T)(file: File)(implicit mf: Manifest[Format[T]]): T = fromFile(file, default)(format, mf) def fromFile[T](file: File, default: => T)(implicit format: Format[T], mf: Manifest[Format[T]]): T = diff --git a/interface/src/main/java/xsbti/AnalysisCallbackContainer.java b/interface/src/main/java/xsbti/AnalysisCallbackContainer.java deleted file mode 100644 index 3d0641ed7..000000000 --- a/interface/src/main/java/xsbti/AnalysisCallbackContainer.java +++ /dev/null @@ -1,12 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package xsbti; - -/** Provides access to an AnalysisCallback. This is used by the plugin to -* get the callback to use. The scalac Global instance it is passed must -* implement this interface. 
*/ -public interface AnalysisCallbackContainer -{ - public AnalysisCallback analysisCallback(); -} \ No newline at end of file From 3aba701b00120043e286a7acba055eb65730edda Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 5 Sep 2009 12:19:34 -0400 Subject: [PATCH 019/823] Filling in logging and making cross-compile work. --- .../src/main/java/xsbti/CompileFailed.java | 6 ++ interface/src/test/scala/TestLogger.scala | 38 -------- util/log/BasicLogger.scala | 15 ++++ util/log/BufferedLogger.scala | 86 +++++++++++++++++++ util/log/ConsoleLogger.scala | 76 ++++++++++++++++ util/log/Level.scala | 25 ++++++ util/log/LogEvent.scala | 17 ++++ util/log/Logger.scala | 48 +++-------- 8 files changed, 236 insertions(+), 75 deletions(-) create mode 100644 interface/src/main/java/xsbti/CompileFailed.java delete mode 100644 interface/src/test/scala/TestLogger.scala create mode 100644 util/log/BasicLogger.scala create mode 100644 util/log/BufferedLogger.scala create mode 100644 util/log/ConsoleLogger.scala create mode 100644 util/log/Level.scala create mode 100644 util/log/LogEvent.scala diff --git a/interface/src/main/java/xsbti/CompileFailed.java b/interface/src/main/java/xsbti/CompileFailed.java new file mode 100644 index 000000000..bb5b2a93a --- /dev/null +++ b/interface/src/main/java/xsbti/CompileFailed.java @@ -0,0 +1,6 @@ +package xsbti; + +public abstract class CompileFailed extends RuntimeException +{ + public abstract String[] arguments(); +} \ No newline at end of file diff --git a/interface/src/test/scala/TestLogger.scala b/interface/src/test/scala/TestLogger.scala deleted file mode 100644 index c8df588f6..000000000 --- a/interface/src/test/scala/TestLogger.scala +++ /dev/null @@ -1,38 +0,0 @@ -package xsbti - -class TestLogger extends Logger -{ - private val buffer = new scala.collection.mutable.ArrayBuffer[F0[Unit]] - def info(msg: F0[String]) = buffer("[info] ", msg) - def warn(msg: F0[String]) = buffer("[warn] ", msg) - def debug(msg: F0[String]) = 
buffer("[debug] ", msg) - def error(msg: F0[String]) = buffer("[error] ", msg) - def verbose(msg: F0[String]) = buffer("[verbose] ", msg) - def info(msg: => String) = buffer("[info] ", msg) - def warn(msg: => String) = buffer("[warn] ", msg) - def debug(msg: => String) = buffer("[debug] ", msg) - def error(msg: => String) = buffer("[error] ", msg) - def verbose(msg: => String) = buffer("[verbose] ", msg) - def show() { buffer.foreach(_()) } - def clear() { buffer.clear() } - def trace(t: F0[Throwable]) { buffer += f0(t().printStackTrace) } - private def buffer(s: String, msg: F0[String]) { buffer(s, msg()) } - private def buffer(s: String, msg: => String) { buffer += f0(println(s + msg)) } -} -object TestLogger -{ - def apply[T](f: Logger => T): T = - { - val log = new TestLogger - try { f(log) } - catch { case e: Exception => log.show(); throw e } - finally { log.clear() } - } - def apply[L <: TestLogger, T](newLogger: => L)(f: L => T): T = - { - val log = newLogger - try { f(log) } - catch { case e: Exception => log.show(); throw e } - finally { log.clear() } - } -} \ No newline at end of file diff --git a/util/log/BasicLogger.scala b/util/log/BasicLogger.scala new file mode 100644 index 000000000..23607c8ed --- /dev/null +++ b/util/log/BasicLogger.scala @@ -0,0 +1,15 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ + package xsbt + +/** Implements the level-setting methods of Logger.*/ +abstract class BasicLogger extends Logger +{ + private var traceEnabledVar = true + private var level: Level.Value = Level.Info + def getLevel = level + def setLevel(newLevel: Level.Value) { level = newLevel } + def enableTrace(flag: Boolean) { traceEnabledVar = flag } + def traceEnabled = traceEnabledVar +} diff --git a/util/log/BufferedLogger.scala b/util/log/BufferedLogger.scala new file mode 100644 index 000000000..7e8348f2d --- /dev/null +++ b/util/log/BufferedLogger.scala @@ -0,0 +1,86 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark 
Harrah + */ + package xsbt + + import scala.collection.mutable.ListBuffer + +/** A logger that can buffer the logging done on it and then can flush the buffer +* to the delegate logger provided in the constructor. Use 'record' to +* start buffering and then 'play' to flush the buffer to the backing logger. +* The logging level set at the time a message is originally logged is used, not +* the level at the time 'play' is called. +* +* This class assumes that it is the only client of the delegate logger. +* */ +class BufferedLogger(delegate: Logger) extends Logger +{ + private[this] val buffer = new ListBuffer[LogEvent] + private[this] var recording = false + + /** Enables buffering. */ + def record() = { recording = true } + def buffer[T](f: => T): T = + { + record() + try { f } + finally { stopQuietly() } + } + def bufferQuietly[T](f: => T): T = + { + record() + try + { + val result = f + clear() + result + } + catch { case e => stopQuietly(); throw e } + } + private def stopQuietly() = try { stop() } catch { case e: Exception => () } + + /** Flushes the buffer to the delegate logger. This method calls logAll on the delegate + * so that the messages are written consecutively. The buffer is cleared in the process. */ + def play() { delegate.logAll(buffer.readOnly); buffer.clear() } + /** Clears buffered events and disables buffering. */ + def clear(): Unit = { buffer.clear(); recording = false } + /** Plays buffered events and disables buffering. 
*/ + def stop() { play(); clear() } + + def setLevel(newLevel: Level.Value) + { + buffer += new SetLevel(newLevel) + delegate.setLevel(newLevel) + } + def getLevel = delegate.getLevel + def traceEnabled = delegate.traceEnabled + def enableTrace(flag: Boolean) + { + buffer += new SetTrace(flag) + delegate.enableTrace(flag) + } + + def trace(t: => Throwable): Unit = + doBufferableIf(traceEnabled, new Trace(t), _.trace(t)) + def success(message: => String): Unit = + doBufferable(Level.Info, new Success(message), _.success(message)) + def log(level: Level.Value, message: => String): Unit = + doBufferable(level, new Log(level, message), _.log(level, message)) + def logAll(events: Seq[LogEvent]): Unit = + if(recording) + buffer ++= events + else + delegate.logAll(events) + def control(event: ControlEvent.Value, message: => String): Unit = + doBufferable(Level.Info, new ControlEvent(event, message), _.control(event, message)) + private def doBufferable(level: Level.Value, appendIfBuffered: => LogEvent, doUnbuffered: Logger => Unit): Unit = + doBufferableIf(atLevel(level), appendIfBuffered, doUnbuffered) + private def doBufferableIf(condition: => Boolean, appendIfBuffered: => LogEvent, doUnbuffered: Logger => Unit): Unit = + if(condition) + { + if(recording) + buffer += appendIfBuffered + else + doUnbuffered(delegate) + } +} \ No newline at end of file diff --git a/util/log/ConsoleLogger.scala b/util/log/ConsoleLogger.scala new file mode 100644 index 000000000..a7217f026 --- /dev/null +++ b/util/log/ConsoleLogger.scala @@ -0,0 +1,76 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ + package xsbt + +object ConsoleLogger +{ + private val formatEnabled = ansiSupported && !formatExplicitlyDisabled + + private[this] def formatExplicitlyDisabled = java.lang.Boolean.getBoolean("sbt.log.noformat") + private[this] def ansiSupported = + try { jline.Terminal.getTerminal.isANSISupported } + catch { case e: Exception => !isWindows } + + private[this] def os = 
System.getProperty("os.name") + private[this] def isWindows = os.toLowerCase.indexOf("windows") >= 0 +} + +/** A logger that logs to the console. On supported systems, the level labels are +* colored. */ +class ConsoleLogger extends BasicLogger +{ + import ConsoleLogger.formatEnabled + def messageColor(level: Level.Value) = Console.RESET + def labelColor(level: Level.Value) = + level match + { + case Level.Error => Console.RED + case Level.Warn => Console.YELLOW + case _ => Console.RESET + } + def successLabelColor = Console.GREEN + def successMessageColor = Console.RESET + override def success(message: => String) + { + if(atLevel(Level.Info)) + log(successLabelColor, Level.SuccessLabel, successMessageColor, message) + } + def trace(t: => Throwable): Unit = + System.out.synchronized + { + if(traceEnabled) + t.printStackTrace + } + def log(level: Level.Value, message: => String) + { + if(atLevel(level)) + log(labelColor(level), level.toString, messageColor(level), message) + } + private def setColor(color: String) + { + if(formatEnabled) + System.out.synchronized { System.out.print(color) } + } + private def log(labelColor: String, label: String, messageColor: String, message: String): Unit = + System.out.synchronized + { + for(line <- message.split("""\n""")) + { + setColor(Console.RESET) + System.out.print('[') + setColor(labelColor) + System.out.print(label) + setColor(Console.RESET) + System.out.print("] ") + setColor(messageColor) + System.out.print(line) + setColor(Console.RESET) + System.out.println() + } + } + + def logAll(events: Seq[LogEvent]) = System.out.synchronized { events.foreach(log) } + def control(event: ControlEvent.Value, message: => String) + { log(labelColor(Level.Info), Level.Info.toString, Console.BLUE, message) } +} \ No newline at end of file diff --git a/util/log/Level.scala b/util/log/Level.scala new file mode 100644 index 000000000..86abc257d --- /dev/null +++ b/util/log/Level.scala @@ -0,0 +1,25 @@ +/* sbt -- Simple Build Tool + * 
Copyright 2008, 2009 Mark Harrah + */ + package xsbt + +/** An enumeration defining the levels available for logging. A level includes all of the levels +* with id larger than its own id. For example, Warn (id=3) includes Error (id=4).*/ +object Level extends Enumeration with NotNull +{ + val Debug = Value(1, "debug") + val Info = Value(2, "info") + val Warn = Value(3, "warn") + val Error = Value(4, "error") + /** Defines the label to use for success messages. A success message is logged at the info level but + * uses this label. Because the label for levels is defined in this module, the success + * label is also defined here. */ + val SuccessLabel = "success" + + // added because elements was renamed to iterator in 2.8.0 nightly + def levels = Debug :: Info :: Warn :: Error :: Nil + /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. */ + def apply(s: String) = levels.find(s == _.toString) + /** Same as apply, defined for use in pattern matching. 
*/ + private[xsbt] def unapply(s: String) = apply(s) +} \ No newline at end of file diff --git a/util/log/LogEvent.scala b/util/log/LogEvent.scala new file mode 100644 index 000000000..19a5b24db --- /dev/null +++ b/util/log/LogEvent.scala @@ -0,0 +1,17 @@ +/* sbt -- Simple Build Tool + * Copyright 2008, 2009 Mark Harrah + */ + package xsbt + +sealed trait LogEvent extends NotNull +final class Success(val msg: String) extends LogEvent +final class Log(val level: Level.Value, val msg: String) extends LogEvent +final class Trace(val exception: Throwable) extends LogEvent +final class SetLevel(val newLevel: Level.Value) extends LogEvent +final class SetTrace(val enabled: Boolean) extends LogEvent +final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends LogEvent + +object ControlEvent extends Enumeration +{ + val Start, Header, Finish = Value +} \ No newline at end of file diff --git a/util/log/Logger.scala b/util/log/Logger.scala index b5203acb8..153596f6f 100644 --- a/util/log/Logger.scala +++ b/util/log/Logger.scala @@ -3,13 +3,14 @@ */ package xsbt -abstract class Logger extends NotNull + import xsbti.{Logger => xLogger, F0} +abstract class Logger extends xLogger with NotNull { def getLevel: Level.Value def setLevel(newLevel: Level.Value) def enableTrace(flag: Boolean) def traceEnabled: Boolean - + def atLevel(level: Level.Value) = level.id >= getLevel.id def trace(t: => Throwable): Unit final def debug(message: => String): Unit = log(Level.Debug, message) @@ -19,7 +20,7 @@ abstract class Logger extends NotNull def success(message: => String): Unit def log(level: Level.Value, message: => String): Unit def control(event: ControlEvent.Value, message: => String): Unit - + def logAll(events: Seq[LogEvent]): Unit /** Defined in terms of other methods in Logger and should not be called from them. 
*/ final def log(event: LogEvent) @@ -34,38 +35,11 @@ abstract class Logger extends NotNull case c: ControlEvent => control(c.event, c.msg) } } + + def debug(msg: F0[String]): Unit = log(Level.Debug, msg) + def warn(msg: F0[String]): Unit = log(Level.Warn, msg) + def info(msg: F0[String]): Unit = log(Level.Info, msg) + def error(msg: F0[String]): Unit = log(Level.Error, msg) + def trace(msg: F0[Throwable]) = trace(msg.apply) + def log(level: Level.Value, msg: F0[String]): Unit = log(level, msg.apply) } - -sealed trait LogEvent extends NotNull -final class Success(val msg: String) extends LogEvent -final class Log(val level: Level.Value, val msg: String) extends LogEvent -final class Trace(val exception: Throwable) extends LogEvent -final class SetLevel(val newLevel: Level.Value) extends LogEvent -final class SetTrace(val enabled: Boolean) extends LogEvent -final class ControlEvent(val event: ControlEvent.Value, val msg: String) extends LogEvent - -object ControlEvent extends Enumeration -{ - val Start, Header, Finish = Value -} - -/** An enumeration defining the levels available for logging. A level includes all of the levels -* with id larger than its own id. For example, Warn (id=3) includes Error (id=4).*/ -object Level extends Enumeration with NotNull -{ - val Debug = Value(1, "debug") - val Info = Value(2, "info") - val Warn = Value(3, "warn") - val Error = Value(4, "error") - /** Defines the label to use for success messages. A success message is logged at the info level but - * uses this label. Because the label for levels is defined in this module, the success - * label is also defined here. */ - val SuccessLabel = "success" - - // added because elements was renamed to iterator in 2.8.0 nightly - def levels = Debug :: Info :: Warn :: Error :: Nil - /** Returns the level with the given name wrapped in Some, or None if no level exists for that name. 
*/ - def apply(s: String) = levels.find(s == _.toString) - /** Same as apply, defined for use in pattern matching. */ - private[xsbt] def unapply(s: String) = apply(s) -} \ No newline at end of file From 0fb8ff7bb44206480b508047235ab409877ca739 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 27 Sep 2009 14:39:26 -0400 Subject: [PATCH 020/823] Turned sbt launcher into a general Scala application launcher as described in launch.specification --- interface/src/main/java/xsbti/Versions.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/interface/src/main/java/xsbti/Versions.java b/interface/src/main/java/xsbti/Versions.java index 8576aeaf5..122499c04 100644 --- a/interface/src/main/java/xsbti/Versions.java +++ b/interface/src/main/java/xsbti/Versions.java @@ -5,7 +5,7 @@ package xsbti; public interface Versions { - public static final String Sbt = "0.7"; + public static final String Sbt = "0.7.0_13"; // keep in sync with LauncherProject in the XSbt project definition; public static final int Interface = 1; public static final int BootInterface = 1; } From 218a10c83a9db5f8a40be38d6557e77494854d92 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 9 Nov 2009 09:34:52 -0500 Subject: [PATCH 021/823] New scripted test framework --- LICENSE | 25 +++++++++++++++++++++++++ 1 file changed, 25 insertions(+) create mode 100644 LICENSE diff --git a/LICENSE b/LICENSE new file mode 100644 index 000000000..49fe1ee66 --- /dev/null +++ b/LICENSE @@ -0,0 +1,25 @@ +Copyright (c) 2008, 2009 Mark Harrah +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions +are met: +1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. +2. 
Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. +3. The name of the author may not be used to endorse or promote products + derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR +IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES +OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. +IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, +INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT +NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, +DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY +THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF +THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
+ From 13d75b680a1f55a97f4f049fa4d2366c128ce7fd Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 9 Nov 2009 22:02:53 -0500 Subject: [PATCH 022/823] Fix issue where compiler interface was poorly cached --- interface/src/main/java/xsbti/Versions.java | 11 ----------- 1 file changed, 11 deletions(-) delete mode 100644 interface/src/main/java/xsbti/Versions.java diff --git a/interface/src/main/java/xsbti/Versions.java b/interface/src/main/java/xsbti/Versions.java deleted file mode 100644 index 122499c04..000000000 --- a/interface/src/main/java/xsbti/Versions.java +++ /dev/null @@ -1,11 +0,0 @@ -/* sbt -- Simple Build Tool - * Copyright 2009 Mark Harrah - */ -package xsbti; - -public interface Versions -{ - public static final String Sbt = "0.7.0_13"; // keep in sync with LauncherProject in the XSbt project definition; - public static final int Interface = 1; - public static final int BootInterface = 1; -} From ec85abb0b96df73efb54513fd6a966ad56f638bf Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Mon, 16 Nov 2009 08:46:47 -0500 Subject: [PATCH 023/823] Source API extractor --- interface/definition | 110 ++++++++++++++++++ .../src/main/java/xsbti/AnalysisCallback.java | 2 + 2 files changed, 112 insertions(+) create mode 100644 interface/definition diff --git a/interface/definition b/interface/definition new file mode 100644 index 000000000..33cb55aec --- /dev/null +++ b/interface/definition @@ -0,0 +1,110 @@ +Type + SimpleType + Projection + prefix : SimpleType + id : String + ParameterRef + id: Int + Singleton + path: Path + EmptyType + Parameterized + baseType : SimpleType + typeArguments: SimpleType* + Annotated + baseType : SimpleType + annotations : Annotation* + Structure + parents : Type* + declarations: Definition* + Existential + baseType : Type + clause: TypeParameter* + +Source + packages : Package* + definitions: Definition* + +Package + name: String + +Definition + name: String + access: Access + modifiers: Modifiers + FieldLike + tpe : Type + Val + 
Var + ParameterizedDefinition + typeParameters: TypeParameter* + Def + valueParameters: ParameterList* + returnType: Type + ClassLike + definitionType: DefinitionType + selfType: Type + structure: Structure + TypeMember + TypeAlias + tpe: Type + TypeDeclaration + upperBound: Type + lowerBound: Type + +Access + Public + Qualified + qualifier: Qualifier + Protected + Private + Pkg + +Qualifier + Unqualified + ThisQualifier + IdQualifier + value: String + +Modifiers + isAbstract: Boolean + isDeferred: Boolean + isOverride: Boolean + isFinal: Boolean + isSealed: Boolean + isImplicit: Boolean + isLazy: Boolean + +ParameterList + parameters: MethodParameter* + isImplicit: Boolean +MethodParameter + name: String + tpe: Type + hasDefault: Boolean + modifier: ParameterModifier + +TypeParameter + id: Int + typeParameters : TypeParameter* + variance: Variance + lowerBound: Type + upperBound: Type + +Annotation + base: SimpleType + arguments: String* + +enum Variance : Contravariant, Covariant, Invariant +enum ParameterModifier : Repeated, Plain, ByName +enum DefinitionType : Trait, ClassDef, Module, PackageModule + +Path + components: PathComponent* + +PathComponent + Super + qualifier: Path + This + Id + id: String \ No newline at end of file diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 70870965c..4c7bb8689 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -34,4 +34,6 @@ public interface AnalysisCallback public void endSource(File sourcePath); /** Called when a module with a public 'main' method with the right signature is found.*/ public void foundApplication(File source, String className); + + public void api(File sourceFile, xsbti.api.Source source); } \ No newline at end of file From d568fcef6f5144314e90aa04436e9a2c6012e5d7 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sat, 21 Nov 2009 16:14:09 -0500 Subject: 
[PATCH 024/823] Reorder API definition file --- interface/definition | 46 ++++++++++++++++++++++---------------------- 1 file changed, 23 insertions(+), 23 deletions(-) diff --git a/interface/definition b/interface/definition index 33cb55aec..1ce6fe02a 100644 --- a/interface/definition +++ b/interface/definition @@ -1,26 +1,3 @@ -Type - SimpleType - Projection - prefix : SimpleType - id : String - ParameterRef - id: Int - Singleton - path: Path - EmptyType - Parameterized - baseType : SimpleType - typeArguments: SimpleType* - Annotated - baseType : SimpleType - annotations : Annotation* - Structure - parents : Type* - declarations: Definition* - Existential - baseType : Type - clause: TypeParameter* - Source packages : Package* definitions: Definition* @@ -52,6 +29,29 @@ Definition upperBound: Type lowerBound: Type +Type + SimpleType + Projection + prefix : SimpleType + id : String + ParameterRef + id: Int + Singleton + path: Path + EmptyType + Parameterized + baseType : SimpleType + typeArguments: SimpleType* + Annotated + baseType : SimpleType + annotations : Annotation* + Structure + parents : Type* + declarations: Definition* + Existential + baseType : Type + clause: TypeParameter* + Access Public Qualified From fd2c309f9d2ef6f5dbf9bc946b5aeb5693b9e416 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 22 Nov 2009 22:54:17 -0500 Subject: [PATCH 025/823] Type member support, linearization instead of parents and add inherited members for structure --- interface/definition | 4 +++- interface/src/main/java/xsbti/AnalysisCallback.java | 2 +- 2 files changed, 4 insertions(+), 2 deletions(-) diff --git a/interface/definition b/interface/definition index 1ce6fe02a..5269e50f7 100644 --- a/interface/definition +++ b/interface/definition @@ -26,8 +26,8 @@ Definition TypeAlias tpe: Type TypeDeclaration - upperBound: Type lowerBound: Type + upperBound: Type Type SimpleType @@ -48,6 +48,7 @@ Type Structure parents : Type* declarations: Definition* + inherited: Definition* 
Existential baseType : Type clause: TypeParameter* @@ -74,6 +75,7 @@ Modifiers isSealed: Boolean isImplicit: Boolean isLazy: Boolean + isSynthetic: Boolean ParameterList parameters: MethodParameter* diff --git a/interface/src/main/java/xsbti/AnalysisCallback.java b/interface/src/main/java/xsbti/AnalysisCallback.java index 4c7bb8689..2372a408b 100644 --- a/interface/src/main/java/xsbti/AnalysisCallback.java +++ b/interface/src/main/java/xsbti/AnalysisCallback.java @@ -34,6 +34,6 @@ public interface AnalysisCallback public void endSource(File sourcePath); /** Called when a module with a public 'main' method with the right signature is found.*/ public void foundApplication(File source, String className); - + /** Called when the public API of a source file is extracted. */ public void api(File sourceFile, xsbti.api.Source source); } \ No newline at end of file From d9ba74b24e75e05649451b4e42fe7868bd63d567 Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Tue, 24 Nov 2009 23:01:05 -0500 Subject: [PATCH 026/823] Annotations on definitions and implicit parameters in 2.7 --- interface/definition | 1 + 1 file changed, 1 insertion(+) diff --git a/interface/definition b/interface/definition index 5269e50f7..f2696c863 100644 --- a/interface/definition +++ b/interface/definition @@ -9,6 +9,7 @@ Definition name: String access: Access modifiers: Modifiers + annotations: Annotation* FieldLike tpe : Type Val From 4bd6f9627b7706b07ff8184e6912056d63015e0d Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Sun, 29 Nov 2009 22:12:36 -0500 Subject: [PATCH 027/823] Fix tests --- interface/src/test/scala/TestCallback.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/interface/src/test/scala/TestCallback.scala b/interface/src/test/scala/TestCallback.scala index 7783a0df5..23978f920 100644 --- a/interface/src/test/scala/TestCallback.scala +++ b/interface/src/test/scala/TestCallback.scala @@ -25,4 +25,5 @@ class TestCallback(val superclassNames: Array[String]) extends AnalysisCallback
def generatedClass(source: File, module: File) { products += ((source, module)) } def endSource(source: File) { endedSources += source } def foundApplication(source: File, className: String) { applications += ((source, className)) } + def api(source: File, sourceAPI: xsbti.api.Source) = () } \ No newline at end of file From 604a5413c96fe830efbbb9b7193727d54ade7d4b Mon Sep 17 00:00:00 2001 From: Mark Harrah Date: Fri, 11 Dec 2009 18:56:09 -0500 Subject: [PATCH 028/823] Cleaning up tasks and caching --- cache/Cache.scala | 29 ++++++++++++++----------- cache/HListCache.scala | 8 +++---- cache/src/test/scala/CacheTest.scala | 18 ++++++++++------ cache/tracking/Tracked.scala | 2 +- util/collection/HLists.scala | 30 +++++++++++++++++++------- util/collection/lib/metascala-0.1.jar | Bin 128005 -> 0 bytes 6 files changed, 56 insertions(+), 31 deletions(-) delete mode 100644 util/collection/lib/metascala-0.1.jar diff --git a/cache/Cache.scala b/cache/Cache.scala index 011d35211..2f35d2c96 100644 --- a/cache/Cache.scala +++ b/cache/Cache.scala @@ -23,18 +23,23 @@ object Cache extends BasicCacheImplicits with SBinaryFormats with HListCacheImpl def wrapOutputCache[O,DO](implicit convert: O => DO, reverse: DO => O, base: OutputCache[DO]): OutputCache[O] = new WrappedOutputCache[O,DO](convert, reverse, base) - /* Note: Task[O] { type Input = I } is written out because ITask[I,O] did not work (type could not be inferred properly) with a task - * with an HList input.*/ - def apply[I,O](task: Task[O] { type Input = I }, file: File)(implicit cache: Cache[I,O]): Task[O] { type Input = I } = - task match { case m: M[I,O,_] => - new M[I,O,Result[O]](None)(m.dependencies)(m.extract)(computeWithCache(m, cache, file)) - } - private def computeWithCache[I,O](m: M[I,O,_], cache: Cache[I,O], file: File)(in: I): Result[O] = - cache(file)(in) match - { - case Left(value) => Value(value) - case Right(store) => m.map { out => store(out); out } - } + def apply[I,O](file: File)(f: I => 
Task[O])(implicit cache: Cache[I,O]): I => Task[O] = + in => + cache(file)(in) match + { + case Left(value) => Task(value) + case Right(store) => f(in) map { out => store(out); out } + } + def cached[I,O](file: File)(f: I => O)(implicit cache: Cache[I,O]): I => O = + in => + cache(file)(in) match + { + case Left(value) => value + case Right(store) => + val out = f(in) + store(out) + out + } } trait BasicCacheImplicits extends NotNull { diff --git a/cache/HListCache.scala b/cache/HListCache.scala index eb3affd13..90f00f47d 100644 --- a/cache/HListCache.scala +++ b/cache/HListCache.scala @@ -1,18 +1,18 @@ package xsbt import java.io.{InputStream,OutputStream} -import metascala.HLists.{HCons,HList,HNil} +import HLists._ class HNilInputCache extends NoInputCache[HNil] class HConsInputCache[H,T <: HList](val headCache: InputCache[H], val tailCache: InputCache[T]) extends InputCache[HCons[H,T]] { def uptodate(in: HCons[H,T])(cacheStream: InputStream) = { - lazy val headResult = headCache.uptodate(in.head)(cacheStream) - lazy val tailResult = tailCache.uptodate(in.tail)(cacheStream) + val headResult = headCache.uptodate(in.head)(cacheStream) + val tailResult = tailCache.uptodate(in.tail)(cacheStream) new CacheResult { - lazy val uptodate = headResult.uptodate && tailResult.uptodate + val uptodate = headResult.uptodate && tailResult.uptodate def update(outputStream: OutputStream) = { headResult.update(outputStream) diff --git a/cache/src/test/scala/CacheTest.scala b/cache/src/test/scala/CacheTest.scala index f0cf919aa..7bba6ec79 100644 --- a/cache/src/test/scala/CacheTest.scala +++ b/cache/src/test/scala/CacheTest.scala @@ -4,19 +4,25 @@ import java.io.File object CacheTest// extends Properties("Cache test") { + val lengthCache = new File("/tmp/length-cache") + val cCache = new File("/tmp/c-cache") + import Task._ import Cache._ import FileInfo.hash._ - def checkFormattable(file: File) + def test { val createTask = Task { new File("test") } - val lengthTask = createTask 
map { f => println("File length: " + f.length); f.length } - val cached = Cache(lengthTask, new File("/tmp/length-cache")) - val cTask = (createTask :: cached :: TNil) map { case (file :: len :: HNil) => println("File: " + file + " length: " + len); len :: file :: HNil } - val cachedC = Cache(cTask, new File("/tmp/c-cache")) + val length = (f: File) => { println("File length: " + f.length); f.length } + val cachedLength = cached(lengthCache) ( length ) - try { TaskRunner(cachedC) } + val lengthTask = createTask map cachedLength + + val c = (file: File, len: Long) => { println("File: " + file + ", length: " + len); len :: file :: HNil } + val cTask = (createTask :: lengthTask :: TNil) map cached(cCache) { case (file :: len :: HNil) => c(file, len) } + + try { TaskRunner(cTask) } catch { case TasksFailed(failures) => failures.foreach(_.exception.printStackTrace) } } } \ No newline at end of file diff --git a/cache/tracking/Tracked.scala b/cache/tracking/Tracked.scala index 7eba581a0..72be00ac6 100644 --- a/cache/tracking/Tracked.scala +++ b/cache/tracking/Tracked.scala @@ -24,7 +24,7 @@ class Changed[O](val task: Task[O], val cacheFile: File)(implicit input: InputCa { val clean = Clean(cacheFile) def clear = Task.empty - def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] { type Input = O } = + def apply[O2](ifChanged: O => O2, ifUnchanged: O => O2): Task[O2] = task map { value => val cache = try { OpenResource.fileInputStream(cacheFile)(input.uptodate(value)) } diff --git a/util/collection/HLists.scala b/util/collection/HLists.scala index 4d4a00caa..f376ee4fd 100644 --- a/util/collection/HLists.scala +++ b/util/collection/HLists.scala @@ -1,15 +1,29 @@ package xsbt -import metascala.HLists.{HCons => metaHCons, HList => metaHList, HNil => metaHNil} +// stripped down version of http://trac.assembla.com/metascala/browser/src/metascala/HLists.scala +// Copyright (c) 2009, Jesper Nordenberg +// new BSD license, see licenses/MetaScala object HLists extends 
HLists -// add an extractor to metascala.HLists and define aliases to the HList classes in the xsbt namespace -trait HLists extends NotNull +trait HLists { object :: { def unapply[H,T<:HList](list: HCons[H,T]) = Some((list.head,list.tail)) } - final val HNil = metaHNil - final type ::[H, T <: HList] = metaHCons[H, T] - final type HNil = metaHNil - final type HList = metaHList - final type HCons[H, T <: HList] = metaHCons[H, T] + type ::[H, T <: HList] = HCons[H, T] +} + +object HNil extends HNil +sealed trait HList { + type Head + type Tail <: HList +} +sealed class HNil extends HList { + type Head = Nothing + type Tail = HNil + def ::[T](v : T) = HCons(v, this) +} + +final case class HCons[H, T <: HList](head : H, tail : T) extends HList { + type Head = H + type Tail = T + def ::[T](v : T) = HCons(v, this) } \ No newline at end of file diff --git a/util/collection/lib/metascala-0.1.jar b/util/collection/lib/metascala-0.1.jar deleted file mode 100644 index ea1d3a62b2590fa6231ee203c443348c971ee86a..0000000000000000000000000000000000000000 GIT binary patch (binary jar contents omitted)
z0zSTO#bLXxLZIGMYP%WE@3ACbpj(O?VLK>r#vUoYi^54QHZ?lg6%J#R$S;irGi88OOYJly$3`MZ(kba{ugnG_=z=xayePGp;93xdr~Z0zbcvu~2j zVh}yA>InUsSw6Reigm@?3_ou#66|!Y6RAixQt7>e3|&Q>;m)`|L8zUUb7&$sJ01=v zFg}%=YG_S0Cf9S*n@&v&B6`ythy)#)-Iz5NV$9Z&2%}sd`!`7ii=cBJm2WeHW$i0K z@Pa7}umXqhf8ZifiNUQQpH%1a!@Qa=njtdMAmOsDiSdpQXWX);^ezl0(al&QAd>9s9~=+)SwwG9(2E)RAAuCkzrlF_Bg`&g>hh;G90+Uw8@nS#O-lu|5`L`? z9Ui8F0z>U%H^ORM0>Fi<(pdBrVd@eIcg=Ag$x%uXTzz`XWsQtRj-7s`8gWhv$}SF& zr*!AL=Q-ff@qc-Jg%JEugT0Wr?m*LNpq(X@$(T`npyxmfG-Yz>LG`+$8Mrcv@do;* z!;;;0tYrciYnDBQ_OSDlikZ4BfhrN}VoBo&0M94(qhsofAVzkpiZyDy3gxGw8I^UE zW{-Ikuf_Zqj+sOa09&WD+B8kLWq2+vG$su8j(v=~_%O?$_I}^7j@}gpzDYP);y7;m zY{3r74gy4AtgwhG@B9a?E+d@kirl#H!^H5ZnJ-k$Qa{Kf1)x>&hd(|Jmo8Vi0_#NU z0G13m{6^Rab(zzvK?_K%(e4hJ{;ROk11RFEB)rnyx8=p-4&={13d;#n3HM&9c{NE* zzG9g(sfZ)Vb(YYjzj61hEaec~`nF}ITXIT!}X2-XzJjb#2yTrndg;bDnLg2K6dO;I2qg|q3Zrck2 zMz!Yeu#|!WeXzLLS#Q#pZQ`c8YA)Ml;h~nxfXyu!@Ixpy<7y1pw@6$V+=i{VAD+V8 zaVeOnogE|dS|G}m>#RW)8`liHZ582(uVas^STpHyJwvO{a_g}6I6e!=yH92oc$|xI z3ui0z{(P@UrEQgY^+(|rp_=7Wb`S~JqczuouK?0pir3EP)Cw$7x1qf9-PoY81%Z`w zOx|M2XG|l`nMC`AjJwMB{WhdE2BDpJ(nwI#qru%evOFp~s${8;LQCO6SXXo=Um~`s zb9Tu!I~6#ntXkm0@$=be8u14Be}RQ5AJxn#Z_Z?{3?@vw2s`F0eFgjZK&nMP`AKE+ zQ=RHB%2?a22g;RGYs?WGTo?%AtX`>J{p^0r_KTF5;B&GCS#)U`-@w$3-ZsQScaiz# zS+?k1-gR@IBlyO4$u}~vW2&>gal0L-Gh-gCwZginrB8>Mg4JVI`IgdbDI0^gcH*Kf zG2v)MI^Ff3r#KtlqLN0kDdcG*I=%>#k-XBWO+KN(-xg8T-q(c}Swwqnk`SUSN>Js= zuG33BQL>e@F^<^(d3H0taPX^x`T(E5LGb?sK|$ljzX|yNI=lZyPyZ7SjZw4xJyiHH zfTf1~D6$kHS6o43x3`V9^(AK0rb;*zj=v;%2$FGvU4`vv{eC9;7RdSv*8Mnige9Ol zB3v?6`jXH3iue{N(0jGEZVUds_g3WYobPGPIp<)_S^o9);>i9T3l(dKG|)EHQ?J4i z+EZxuSp7x3t8z&dSPjVD4%#`L^VDI~$;dOWN2qgKohVOKJFh@qaFJWAINF)=)oZTj z3WAo=R@0zgCA;rp&8XJm0?P2ie1G5;xzAD}{uZBq$B<6n^xJ?$&jZn{%&1p6x#sX+Hje!N=LE2f5Z^(iKOO#2a?AX5)bUlfj zSSTE{$liP%UkF$*xad2{+;T2UL%5(V>pbk`K9|sBDNgkz=6LP}w}v5?es9*W_w4+l4-DyfKb-@=IYl); z|2piLdOBC8AVcGuE5T8cwZ3XDc8$%&brWqHRN5we-kAj7J~he348!Run zCbY2G>VR=uB_q8%9*w^wk_+VX#erEWqRBc(F5i$>>`N*h42h{x<^(MB@99nZ2DToA z?a+8P_w8I5S@>2yIC>-4ME>=T|KdW<|e 
z_=fHNg>qjEE!om)G#;oGHaEehY?DwA(^*(u1bw>?&KKgahqvET26MwmDiWwG z0RG1*>8$cNpy zYhhjkwkyzHFM2){8-2&l&CGni%QNZmm){$tLo7OUfxzc!8_%=WL}H$F1;BT_*dhXG z^FVstK9q|Z-ku}Cy8Egc&}lg-Pto**v0%kSve{<&D*G}>sA7=_+bAtb!V2&l)}O~p zrX6Wygtry|mPEYh>f_0C4|L){8L()*)#Qa8$9x^tPT*38o@0_5L|KkVC2z8#5|?fl z$)M$9e~VOZ0-!?TLji@WI+W~|cqp!^aH#Sp6qO@cr|ZrOKf#(leNUaA`TT$n)7ZAj zWH4%TyTh4}C;`EOKTj*6GPlHjC>fwaA*B@SmSeKm}Cr_Yg#ShQ=%9DKi#%c`*!-neUVZ9Fp++-TE2q zIOH!SJf?4l@k%~*{Rt^~7qEgFw);~CSQFA+F}M4yo2+}^oP*AbvCX&62ShE=EMc~|kVBA`;U>%3$U4ujj&B-Ww}bLP~Hc5SW( z;-sA>MvcGTCl{Up2g4;S^7g5j4h!}79Qnu4`j)4ut!fL>L8E&NE$r@LGSp6StME@l zFMQo5lGv&!g7zqIKx~SYZ|piXi=;B*Rs(Z)cdg`xqJbe`I0c*|4jW=s+Ga&c&7?mv z)N5oo6?jR5%tMG>CgulFWT1?((;UHSQ1!hfYFKrZr}HHULsXMKEVIjCImNO+@>;Y} zbRJfy+leW!S9wXUnHO|>Zud9JGU%FN`xg3EkXvUc$?eO;d3;ZGNC~;J$JLIZikmFY zeg{|aG3b-Y#FiXTP{o93!#UL4+9wUK`CEF_uNIbx#sequ&`&prJUv|o9MRHzt8_Pf zeTu?au_LCo>Hi|Q7KTqa=X0_4#@nPWM$Y+CBN^XY$XIT&* z#-_nQr(FrtEE}?;;ZIPhgP^_3)v*|DBQ2qq9y*m2RQGvO?Jklq8yQ@Co zKr3Wjy<>qZc92ScOW?d0XJgfEHBo@aR( z7Ls;`Odk^39ORlEV-RmUx5N5Ullw@yI?+^x(9_6BB-v5GXH^Geyk@t8ky*3~|74K{ zJYjZ=!|rvpjvTHlEIT6q`Z)h-UE`LZ@UxiL{fsx+hd0cDx(8I@x&x0FN*KENhA&&a z#vw;s=faMt=m9sbEVp|Wk_)skVq4&okcWH(aRfVGulMz_*Cd}K@+h(8@pq~qB)b%5 z@I1W4^N-Z?Eho4mu=B$`VXrat$liz9D?t1821wqVlA~pAAGG>~-^X6|U6?~6w*-Yz zKV+fv#&|M5gs>K`cLjaq!3^H!u=8rI4TBHi_gR9H)c69kmqSqYKs$l|kud5lyyEJ| zAT1RPb`<$*fbesi!@;-oH)F-}B4RPz5BLn9wAUEW#ZR#xaPG(OoACO6r0yWymaMx7 zSmdp?!>TVfM{`woZ9ZE7IQa4Jc+GF{1s}!8!5+IIG}L}#S;Ds z|GV_lMw#Rwf=a*Me=Pmfe`_qI>`Y8OOig|pEjW|?{`uQAT+GJQ_Rs$E|J~S*c)Bxr zA#x!({%g>}f3Ua9&HB^cF0N7rO7xe>hqTuH!u@*IS{9aYLygg?UL-bjEq2miL3Jao zd_Q?3z4mT@WpyhjSRM#0JW>ceh))oZOsq`cQWPIepl`GS(xbo%gMtH-V$lDfGpe5> zX=Q@`pYNd0zc8AALTdk47|p+&En`&A6y^o-{p^e}tU|%?w&3BkP4Gs)rNi0nz_Wao zMQNeCN}~RmMGok2ArN|_5ZJ~x@fI3v$O8o47-Z);WSR_b~Q7k@fDoNX8TKy;DFe`&g$d}q5IdB97XsG zYZ&dhg-Rc zq=J@@AX5VN>zL0s2Z9Q_FC^MgR{3ZxPCVGK+Jo31!|GB|jyd}7Yf$Xm&L{;d<|MNy zvUbe-PMHYlJ4wAhHZv1Z2btVCI92^D%$@omF8yd`8$f9{*0GRS{72%`DORR5!$Y4? 
zi<!f&WGnZdhxgN6u!F@@L zm{W}Kys$V#?s_Ha%Pq9aLvrm9)0~|GKs20OF4T%vObRnbd*gRpyppF(;P1(C(!-xm&axD1HvT*@gU&e2fr`xDnYMAJ%; zL9JZ1O5HTZ^{eSZboQL2Ur(8?<@UV$Ug#}TQL9Y?2#k9b+&g<_wwrGiA?LzS2loLtf(>TOBaf-l!pkEm2ktIy3UmTIPy9>(V6vGBDGa5ScrhE`B4 za00e|2!!rp7u4T?{}rl%+>TGRW$N+s56^bwZZ`8RP!IBVF7!VS&fjOyKWShpu0}3S zhQ|L+hn|oF(V-Ebwy>;u?qF`;F#vcxph0Y4qswG?cjFHzV#m2h@z~lLGvK<+QVaA$TP=l zb<2-OOZ68P8e;5;uC;ocT5)s-vF0)HFe0Pl^9EI6srKpIST=4rp}LI%dB+8QSHqTp zz{Dg_T^RgZZTM%a^%3;jfBEG7HD(KeSZ;PE|1AOcRsJh1Ax~xLw*fLoy}CBEu58?QsV!P3e3Qu=>O4^J3`xy*}pPqnobOgG^_Ut>MU@jiI{gX)I% z6+wFP#MMU0F(XLG*OnX5qjR{TK5rDh2;}i^csdSlD_}O6Ouek?1>hSsrJXf9+|yQz zn>g#Z%{vmjaScS6NvCW%4E-7?XhajE_%?l7{7xS zwv?b|qkS&q*y2cK^IQE5{#*U+=+!KlSmoC7_&F~(bHF1d$NDMuz%DgrY(>h;EI)~K z$l+1I$F%!Zf3l0epNd3(WQhy)K0@jZ?Su&Z(LF(@=Air2P>h)r0YP|m4 z^p2*v`|A}VDcrAC6pweUuQjmWYOQ8s?69J2r z+x)Q`*Ev(6ytV5TfTNt|nu>-hK$spTh?~CTLB5e!{}g{UpFYLeMmflwLQS=*z538b z#!bk~NR@O%#Hv#zPxZO(8@>V>`KY>~hcqkcQ|K=NAg5X1ky&gd*_qQNSTa9D4kUCE z0S@?*Wm>#PbzJD$Co%=_UUF@7h!LfrUyw4y&p2i%3724V-jDq1-HE7N$Rn8ADk3QA z5K`C4L%7&VfT5=NV=>7hq}sZ)w^%&i!})IutQ?roPJ!pVc1a5LRiU1PP6CR*STZ2a z=W&Z}_~5<?bdnJd!d6XKDc7JtCYUT>fvENOF1a?UBUAoXppQ~N ztNUSw`vbSD_*vA{L%x5)Zd=D{p$WvmAArxg^VI$NAArwy`*-yqz$a3Co$ki8Y8Ypa zClaKaVXycvz$bs_lx5tt%f=eWpJJSD*D{%!Y*Sq*`y<)cuHsx`Sz|)W$=Rj6rc}*! 
zu{sl@%v^n*-8**MY3)Fut`9|*jRq1=l`VU->f*woq3}E$W5gG1=>ft{+vjyF;3UNq z&<}+J2Kn@{;EQ2lsD4XR2BywZ35Q%tZ*hna4w{$9`(2U&a;zpR+!U|xu_@N&rc$k< zZNs9~%)rqoT|NX;8sql^#Ja<@$qij4S|C+0;)*gEJxQcc0OH~TBz@ef+?NjrGis%a zfGXLM`s|@jiHRN9F8~X+4#;D>cf= z)qp&>pgYw2<#*c+Tnl1A;LloXh`Wg7E+2Gi2|n-AO|hB3jPfA5$?1J`1;yqfuIh*_ z=1de9PJstNw>K;wm%9{$PpX^Bjqou&L=4)p$iW++mEVuAHr0z(LQZ$T&B32fxMl~- zbVPd~@3nX37(V4EymD5yl(U-VyISS$ud!#kCMJ}nT~rOYSNpHEWjcORWq#DNto^G2pqq*vE z!R^ctWbA&kgr(e{rm95QFa50;+|5pgl|@Y6xUkrlvN-%V?o-~su&KXoeHvgVIU%C$ zZE@(HhezyH(k5}?Yob*M!hQNoj$xq_%Zx=66|RnrVes?Kk^2In^KbFp@;nep18(l> zPiVKpORZMmN5twOEeK@D;~ZQ;u!8P4*|&(-FlD>L3#+`3ZI8z032TD%-yPVU8OV0c|r$;V| zWTWyA2Cbndgbc^q;0eftuJ7jJZv)nnNHe`+$3zMf%ZQj0w50^Xe8Tfgej$%0zcKIY zzjr1Ktd~R5+Ns^zV_BcD&0^XSy_jwl*`dlt)3s!=6Y*b(xYtHs-vOLL7W!k=bc^#D zxL;!-Jg_>@mh9;=WGTDQpL}!hu$^NReqw5Nh-1(n#!462@&A3>i7pOZ{f2!8|JM8e zb6bh|w+i#eh5lzj@m1H+K>vK{f zzf>43HG-NApOOMYWd7Rs2zFq*zI$%%a6=(nI%;^rRAg}_=lL{olTQC?%EaWP5s z_OjdqW&nQ9#N_l-yS$>7Z+pLb=?|v0R^TZ0rR0%(p={w5%wV)|uTUD`((Gi{zVNea! 
z)+w>ap*^swpOfA)ZJTimY?hmRTKsafck5QkqxxsR{kNjhaUHg6JNT;mSzkp%V0YGy z1Kwa9WUx_aph$EhS#59mD>Pq^$j-7LmrxJB7p&w778rdZcsZ)Wt-GWfDZTLA27cys zl0WoUl$Wo2oMKH{D1+N8s)M!yW_(b9F5(4%QHhIyWzh+3ya zbd{B3{F3g|VLyEG$WI%~HxJG>s80`=dfL&2r8KzS6AaRVmZ{T>S397j-$1vq4WBUz zwt|soHl6yZ>%U>dEXG&$+W7{bwY@E0T-b0hC&h0vx^jtv`iPu$z$-92!l=viIOKwF za;9w2BYS5<)WafeD&dk-*V?WJYKB@wfEC<5W{#G#t15dZjRc+67s_)nN2^kn6Nk%< zr<&exL{HabyQ?|Wl=kwIhtH`)KT}dQ(Dc_KF|#RXj18sc?8aBJC3kI-aB)w1vxWq+ z*?v6Mm3@X(Y5E5K{exC?x!OG)KWovbyISi=#}XGgqku)RXlI zxBi(?d{G8+c;6gxa2Q)q+zRwn1RF`iLg2`U%|g{1ndHdQcyfEGllx_iwo(S(?R9-A zX)kThoMd?Mw=}kf8kbjg#4Du&J-b;Y%W1eXfjgc{diaYcjMaRnY12st{E43Q{Sr*&IyjhfjWr^nCi z+3V#J_!%&MRYsp^cMNA6)h)aS*m`nq@LaY&JWcJ{Z{A<;Uj&K zB_90PQT&DhBw-ptU0rG9ttLH`%KhqNZu|2_M#U-EFg&v%GjA z7`G@6eJb36kB)Hr173Acnvts$1&KN1NlezbTFJgg_mIR1sWOS(tRLn zooOAGoIXR9&x~7IiF0DSlqU>1kxSvXH}!Tzhoa9$$;6WjsaSL@MeQNRoYG|syJ{yk8Z}%bW2KT##L_g8fPv`RxCo;JfQn-l16XI@VOwz_I%jAS1dG`fXlEUX zvD~4&)?Znx5P^{ZV0I8c3%7=_zd85jrQU8N#^a2d53!kV$H|m$Q2PlbK6^F4Ro&3Q6>CsWa=&%ZilW%Zct|@6 z*IVwePPq(`1rM;!!VEpg+N@nOblh15a)jFoUAS`mJ- z=Oc`z-6orrv$Nd)Lhkqrj);=e5CLm|5#h@)hEFlL-?;J7x>bb~>4q@gcyZ8lT>yFC zd~pWjWhT?dk;k2o+SIXQe9e|IT()KrwVYk!oeS}2{p&Yx__Bydi~Pux+6Y{YG5?tm z%8XpRckXqGsf2FdmRQ}UH#0=oJwj9#<*qqG z1i5*ydh6|jF22Ay4F!0XT>3&{jWE&8r}a-b`S z_127GRv_=-kf0~`FrLCQVh=NsluSZOFBBm!tTmS z*_C@NbN$})qCmgL_ZcBKoIBX=X)<*1T- z=!f2{-DwY$>xLL;tSC%F=3Du;{+ME%JF_>$>B*$)`W{INnDXi^jZ9j2-cm03xd>8b z$9dFlk2#F}dWnO2#(n{e>s~smL)Lg;69C zt%2YObRiB$OQmW?K4Duo)o0gfSP?`Kj9BUcW*iRF3R@}}n(rZ#+#E*fc_Wr=omdEz z^^0x=>dzEdJx18EV!Wim3rvZlNumR8KpEIF^+fglE;5sDh=E@9(-~?{o-&r%dOQ~V zDC4&R6b$*}HZ2ou>BwYF^08~FlTP&3`W>}OFvhg zC|ikEtI(QPuhKr&6$J9H0?_0a5KLGmN41m@1;%TOpjv)(qy!Pg$nSDL7xefW>$GV> zsBjkSvP~t<(**ZL>ZI?|`VBJj*4zN`p%6{AKMKSxu=4#hLz!W+DQth{;;3hy2+@k{ zLr`TRHIL%e$9i0vCVA#iaph9s++)FlNjY@-G89usC9dvWDq$jR(krZ&5OH~nc!}>}K2a^P zz$=pYD=H$R-YsntYous1_oT{Sd zF7M1F=d-*lr_6sk98<9rH!N6Mm8f2Am73bj?_3j$<%o;%7In?PpHJHSWA=xJd|Cf( z*zx^ubdP^RZU3+xu(Jed)0JEe?OZHfJpWqB`u=7A`(cHFPhLlmiZMS(Z6zPWXPKmw 
zoI>zA`UDJcytV|K4Qz$yiQWWBSh}cXxXsKskV*Qu8QM!)6m=?52|}CIeKgiV+K$ zM3CO1)+z2tw;{T)SHSRB#a z)f{mjZ3H&ps5+=`E;>Of7D&5A^@k@HQ`Z`mpVE4{c(-MpBSh6PZ?6o0T!kSnT zT8Ax9y}cxai9zySDaU&3y^i80-Br}z!qx5M2scbNxNhubk2OvWkbH&q}M zE~D*4PAu8ao?~siy4Q}`_VVn#8AjkUrawz*7^DI;nj<3tiJx0PCKzWoUw)AX*2xrc zaU@wQMh#p7b8wpYZmC64B@nWBNloNtbY47=#j%#<8~YagfB{fmw358bLz?G26!L6a z+}X?o=`t*9y(aJd<|JkV!s4%BcFKYWw13OFJW zX~)bJj1%g{Xle;O)&D(8;$?q(H35NDs(%Bk{`taE{_QIIr^5^=hne-iNw5ENR9GH~ zFc>2!hq<6&^70tepD@|ZNh!mC)cafTEc~4OPX+kX6rp8OFs|GknGN1k?o;jzFOMgn zs99V+mQnVSvkadRD+f?iSPSl&&aG^LAyeTJ{p$569JKFi44|m61_kK>wekI$l}Wzy zY#D##5{v^V#`MDm^g%+?^X){F039B7GgcYGO}&B0hgpF@#U(La*(~##cH$m zfuEELJM1n3t9f2Xan}rHw1KcVbW-w%x87@PvkHH)l3X;2@?oYG+Kt(Ak>L)Ybws6` z^6I@jdfO3BKmA%NGtd%xpB3Q+$0d5Wz1rFopr5ZsW1s3QY=T{><+^&g%J6us;afn6 z(b3O@y0SD#=*3&19ZT#Cz5Y{oN)3Z9_akW>4?>bquiZRN_Yb*|gH;I%`-paF&BAIP zENbUO!;2G`H8VwTHlb6XAy@SsvKt(`j}+UyFED|yH1UW{Cw2t0GQVU^b+E=4Kpn7@H0K6+s<{ZkPn?^8@C#wZy8Z`_9+~j!V6uo zf(*XOWE+wdtB-8jFflz;`r;<=4&Rsl<7!~$F^Uu#h?-YG<`C!B*Cr`5pd2=(9VSns z&pC80CR8{TEQy*16WEZl!CCPZ)c|yYDajn%cZ?P#;S8;i02?&cW?47AOg?V>h4=eN zpe0JZ?*<)-v%h&=|FiS_A4oOMWFXpx%YRAcW7Gh@vpCn&ASrO5CFh0d!7ODKe7<9s z>4%ELhvFh4a)_YxI3()!yM*FoO|<5}g?!T2^}Wh#$7o+^TIfF1eueiBXPD0ZOc7s5 zN0j*4yZ(a9`@(x_QXiDWi7AN04&7$MtsYZzBsY~?Iwh3UQr%>wtzwM5!j3(Kxvu6m z$!yhZhubCwJlgn{Sk6$6>B_7i037!kn6Vkj4Z)a z@@lhD>&?;u!Y@-M?x-&^$my`{7O-~O`jC8uJJH9wTOADKDPR~_MbEAEt-lwWvDFtH z`|352>yj%y(@I#Qt&fS;VF_Ad4qhcjdv;EsSVSG~!sHw-QfwXo={Y_mzUbiWchm(b z9h@Z^r@5ucjg+>iud@4n#FmzvpY^DuAF2K6n*-oj?R1@HR@IjDNb#z&YF2A7_}1A+ z4=kJi`OGObxdW2_2;z%0Yca#Buyyc=4j z*Ew`}A!L-*>pl$IuH|{I93AIeoXiSwU|a^udJB+^-!(4fd@|n7KKJ&oz`|G)lR90= zRgYWMCS~Hjp58=YeR*+r`c_5$Q@jT~MX1l`>TzI2?g-2I?DEr{Ls0^?_kEHw)9?6A zrTvAIgpIM}OXhWYJ~i?g)v@(Cmw}%t$(LL|`^xIk+)B@MmD)ekJ7-)KzgA&8!s1pd zs)$<9fM|K8K?on*G5917GYI;bOuR}u!Z!vmwDJy6_$ZZ+^qF#`iAXcNLa_W?`dED9 zQ1CT{qq|FZsTd4#9X92Qb0S2=ROYnF7*xmvr`ueP-)k(jybT_z?;}U^^?K~TufZ3F z;}>MQ9KZ?P=MLSEcYtRmOSpzlrs|?XiCWKs^$Cloy19g9S*?@sczJJ6xSPE>B}sTf 
z7N>NLXLl_o>@ild&VGgj7j&wM1k7wb-!>utz^TJnM zyv(YQhi}A^@<;%8B`{oNKKx7+>_mXbHCKD;q;=@fA&Xr{dW*h6=^PpSzS6kAg8Je68}WZY4>qlyF8%!L{7&) z|1rBJL1~QMgStSD|A^22@0Tj;e+_g0-!Ij`GPyr4RlfgUF4c}k(25!=jhRfOk4a?C zq6rt2zuB;8=l(-iSJXtfGeB9Y9}{>)aUX+J_HUPJS)mo$Fpk-jFsz(+9_?>_0s;j! z{cGS4<_Yz;NO>XhSk)NAMvcw_wXkS+#AP(C7|^9!P3APu8luxm4>GAmm@vow#d?Sx z;|f^*+TJZ%p}IkLFxbbW*>K#v-2%8?tobF*N97i3NYYpp!FTJS-h?l*kI=r`mQ_Ke zDJUdP=#EVh9I#%j<};odJKYi|yMgAKQY#5?_bZx^0(R99Ib;f*%|bH64mjza5uIXK zJtSHbKOvCvDU#1yHsi<9YQp{`WI=e4pLD11*Z_nmN~=3}3CA8$YYw#cuH}kXht3zN z+GKXgx06FOvv+D22LzXOU=l`N--DZ$A0QmMmQ2zLea=p-uGEmx5HX5`dRGB zq*?N4nIp_-5nCA5git7PmSbe&Thz^S6+M4UMqPdOKYRXt>@wf_$o~qfRrj2gRgQA3kRJXiK8y9~^ycgZlyiMH z?eG5!Q}CU>vxB$_4m-oouTl{rVr2EC1XxKVFbrdcwid#3qJWt>ti*ZkmHFjtgwZQzU$i(HFr;L4VL|q=cE_v6Z;2QwqB15Px{=nrR$QE&a>3vpl@9qx z3`n$yS)^80Wa!JhSB2>mxoUfbVO&SsTh~A*mmq{(LhCHen5MlV4vG@z0%O<6(hgy2 zG~~$giM3B$HuU0rc&s0S>IWrz(KVSADLE_eIWY0ez${&q1v$`3k5t8F6Kg8E@}9q; z#0FGd)|Re`C}kL_TT>to^GR6>E3(5y-l2?RO1935nMT3R zRdjioPy&py5})0JEi0mBCoh6wk&fo~U~@(Rs;Jbd_T)>!N)?G7tg!wNwSv*k`WCBB zqfC#vsH!F9p!uVT22Xj;c0hjkxSMxlzf}nlnR9z2SoPp#FL=!B3yt->S6_VX3I?H4 zk&DGwbkJn@b*D70WX@_ z7(FLLi=1CeLAXs(qC~&4+3r7z*~-es;fz-}S#;}b9>0M1b559vl`imQq&qzOZaoX^ z)tPIm^372rNp!Up5v4ZPEl8%Nrv`!S@{cVm!D*4y&NEy7D5 zAhv`a>E>-6znL~T(vJlMKn|BP!F|5V-IoDD!4CCuWkfR;TU;lRqpGh9djEWW9xPh;fp4(K^$CS zK|r?3X`>B_;PA&B7dc71ae$%CoeAyG6ySiRCg*FHIBTKZ;gPEe;1t(%%GtdfU)MMx z@Vb2)F0!3PYx;b9=9D~KuTy zrc!VA2zfi&kIREAD1g_`*0HkGQqha3LLVR>qlZt#WFyzY?Ae>M?yyE~2Z1-Pl8(H6 z7h3WxHeg3l=tVlwMM+24HfH*b^aU^?y@^=)q_^xk=g(yc!EclM<|TrDvJHKkgXw9U z9R&zG%G?&0+R1J1yT}|$y}S}obgg`JSJfAiyP<2I3}EaEQXX-K$F>^9wd(C^Cq~j^ z`wUG2IgQY|?#;o;P!ROF?HB9w$oeOg^r4sKx|0^D-gPa9Pk|RpeXFQ!eak1CY#qs5 zqdyEB4twQQrfrY8bh9;VF^&MHG}30A;IFHtnSq-JZ_ql=+xx31X0m-`8n!X{dRf#4sOI;@Z zinj~!MI6xrJ|{nq)a*N2EpoBrMrz3~dVq`k@?e)2F2oVfMofyH5}fZ9YCwSOn#!;@ zS_|}f{s7+{KZN!EP0CkSFfspAn1B!q6g9el4Mp{mL{5o4@orBp`O_xD-+m)7i%es3E zUp7DX8JUh3d{@@1uEb1>kw;7UnPC46>37!2HR6%yr9R zQi?YTPJ8nbkkS8@y|~j3x`j;U39++Gq(`jBZRN+*HbJLMg-);7X&S 
z34bn!$XVEepR5-v+bea`m%O7tNN02XIGquo6$LK(zr*orEtFR|A<+i`y)~SWV6P#`Q7=~$^AXiJ2;QYkISd>t6QsP z!K_r;C5u~ALB@xO*~E2b+Ql7xC^F`5IKMuf`8wg2XMET^V!QXynZF-w+RJ-Yn*QWc zk>lUsK8mOjRQM6-hXa9af_-y&y-G0dAh$|WDlX!V534~HLL*#?P;g{5T9Nf#%>i@S zgbLdz{G>{K_jjZKE|SL7ZFzWf+<_8}RO=Me;$HPek*lg&i_MA1i1NuAsQ3qkFawVo zRG%x94&i{z45Xp3G*R9L+z70TJ^3oib)-$bOlT<=l8eRT9bT&#YKYRS_uW)h=%4!m zYe~$J)X@7Zx*h3kB6lP)>~gfp?6wK&2}@{I%AN&Ts3Zdhla)H{tClJgev6AI8wDEJ z!@kiO-(E(|YtjrQvqKMA(pWK}ZbrvOclZZF(mXmW_cyPK&#tRTgIfllq&0KTd9ccY z679~cVb+WkNi*^!1{9B7pTfN1*!`5)7SRSAAj?g@O{ShXK#g~A8Uw+WtJ`en%fpBU zmglOzmnM5ZV^kW+8C0u?EI(7L(4-EnX6Bb>mB%BrB!rk3BHp7-8dBwiV2PF<@R8Z5 z%;`}SXV*Ad)c_y0T+%``#=w7GzKc2jpqDTF1`BHUbXRuC z)!=mxIvO|FIbd>}5F~%fI^dJhzmbG-NCHdXOz^lQmjy?*#LBv9gwsSgtb}#};cL{= zY2*o<3M4|hgr0q=1&->kSmu~N-hm*LVMpYMboWR_1>W$IMw7XsOZ?2Sls`^}$Nu5; z5UAPv`5z&a|Mg_B{2f(PM0rIjLL7V#mA3_j@qQ*sSw)NvN))wGUgD3Fv8~zPC`Bnp zaQo>ofN}I;>)|xcX)Ot|xR7AsXixX<)URVx@6Ut5)@c-9A-~H07^`-G5V>Kuu|OltRiVB0bSy z2rSW09*^10wqB{m*5z8CwCQ0rMnh?IOA(BZq(v)|e;eg$t7UdNG4|#7HROh;F4JJU)g}b}EySux) zySoQ>cL?qp+}%lVcXziC2-Yh*=iYPn?yqlu-Btaot5{V~{95n($e42uG5liqKm##x;O5|-_-1A5mg#mJ#ShQvGWSI#&9zY zGYs_dgyu&HGRx=})s>uSNiM+@$0&*6V1iewO|W$~OZ)v&oho|@p<|W-b5pGY-<+gM z1@4GGH=Z;m(@#;qcLd!)5q{NL?d_Gvz|G@B!dO;82~uj=-sIP0*0I{3KS1r6%XqQ! 
zr2A{vtN-dMK7-Ckz+;MeW;g|x7 zJ_+l`#fO+9HoZKVcTnJtP}ACT$;!{3;Y)u7=k$);S8t!ro6Wq&XG^j#e-;=1;rhD( zI`T6*2V44609k4&2O!yPlWphP8Afi34-(8|eWse#wHR_HPwxVd`N1KN;HY@%UpF0{ zG={rCvZEqJAV+xX7mzsQs`YWo=Y|SNn6nlojULL?`Dh|U+z}uR9wc1Z6tm0S_&cg8 zLOXIv+Tge+ggi#g&k+ z;1X`v;x->;d65Q@<(x&X7VIGoWZsOhm|J>D)#kl)txlS3a!3I>N&tQ3j7ORG*Xqw4 zSs@{_7*Vf=BY|n7I-^$Nj-xCac<{ooNdOpC@;ck4OmKkAS$aQUz(|MVSo%!m-j@{a zO2@IV=*lv^cu~x9)WLE4{EFK*|53qZxg3+;ZR2Sp;1zat7p%yEPPqJ~_iX@8xS5xS zx$wl3$U;OHl1lrRx|nc;;q6g#DAUSYKSMHdtoEr@QLT{qJ9eHS?JPz`2+I3_^-SU0Y>!q&JhopQ3W#tB0*2F9YVp3QVhhy zU5pv!TqYO_A}_QP+Cjfm#j%624KB9YpMMp!YwYZlfk1U%74W70dx-zf+y5UVyx*g~ zKb7o)hORoYBvwsD&1DVTN9;rj0!SkPfsk;>vT(4#@XvXI?rv;w!_xvJ`X$I(IR)#)tIA^_So4&9_c3eR^rmIi1bibvOe2wvIeE{rD>?JMQ-Q z-*44}r)62966ka2(UlhmT!#D0c0B2H>%-1fn34v=i`W=WkMT9WX%!dkVY^*l&p9!v zOQk(r8H$+SVH2>(E4dUI7SVql%%PtMVYGUZjP~{$j5j2}eR`1)oQDwii+*z)J|R1c z578N0t&eo`;B~rUcBCKqUOsQhDc46YCdSj1BAxn8)FLBs!u(~_h1&k@z3hcQ*@Nmc z;^s|Zp0@QdbLEdCR9CfXNq#F=Yuo20t5?w|u1dAFIIeXq?)S5%H~L7fLbdva1Mv;J zx-|bbBmbd_Ni3VixPH6pF8;#Bsjp?sRtpVkvu1p^ z`nkGnuxW|f4#cPU8rOx`77}V~(TZnT`*u#P>9X+r4fRintSxEv(8Vr%rG;77oeHeB z(Hc77p;=gBZy`F9RFh_j&9)zrcvi5U)WwZ4vbh6}EkKkK!Z1SQDi(DxNh)jFk{9Q^ zY6>YO%8dGmBb3F+GS;y4JYR>U1k)*aZr25wCEhT~Wcl0(0##~3Ai4Ekr^p7)=LqpA&sqL7c&QxD`(uz=0FWzm zUP-R^iHOE64kK6%@p0<&@=xA&wW&@x84aFyHj{nrk=9T6(7lkseX-$riosZKayM5A zHAc4)Wg4!-Cj-!I6~t=sk}O?_$T5sgE*_5&7=s6&P&Sr~62Th~pM8t!pp-c+@%h5a zzjO#@zKS>>*(LTqiQ(MuAgE=Rfwv)$z)sG(V#?Ug#VK$vlFcD_Q-!O>xE_o9bG#6r zl1$D5;-mP!wbKc~fw&@fuFKGc5jw;P)__I4k?60=9W~3$u5F1l@&)OTrmNDi2LS*` z?SK;amt@_qB7{+Q$*`)%!<11fF%+i6s>TWRGm-f;N6g7Fs>b8PEL{0X+=E>LLg#iY zZ0{bPtau8;zO)J}kBFIo-RBZU);EKPo&MQx4PsSUCS0fD$rI7SU+_)7{4l*9qR6WB zNd7V8=^z*xXH~@Efv4Q8aVOH(AhVK`X^zNKnao@5am-Z&;48DQM)P4Ssi(8VD}bQt zxWt>Nuo>m`Dyt9oZp^Dk{AnR+T6j4V`E7W;&>YLRT;5-_0^KjX`eq)$w z(Zr66FfpHV!sWJ_dne>3+pIUn&-5hv+G>CT!(r2G22m5(LV1IzynHA>QQ+@}&nY=e48#(+M z+K&yTlV9vI%tpaKBDbM zUUci-*05BMQW5%(tE$3op`C-IgCp9~gSZQZH~N~xCzao2(uI1&UiiCMzUijQzURmnw~cGE%WG&KAtLuLYPiCYFF7J)*>t$VQa(~d 
z@L%9?h21<=h?wQvV+p-^$Pze9nMdM5`REhG<>aZtt4`T+w|_pO#NA@#nq%oFr-7&? zg$SADK0Vg!ep-m2ud zw8+|sz!!G11O4kQR>Al!p?LZr4Q*O2>_svbrapb73DaLBV1kQX1Yc{-x@gIn*#IPC z>?Uwu1#<$e{mF8tG{(NTXl}0tHx_VaJU5rh?hOZa@b_l0&3O~{Z>39tZhds?haapG zA2;j~k48Ii104dEJs(gGcKH}M#amk6hbh+r~{g&;61lt@TW zXz!`&VBcS`F(Rr!*(KupZe;Wp^iz~`(_NGB+YdL0eSLfVz?ZWNyV0J_%K+dn4Ey?Z z8FatyiD_zfze$6zG;i6S3%q_UzmF7@cwHq}2O%CuV86%TCL<1Im<$caa6?H*ipAOi@T(uOt5?dZQ2R#&z3Y)4WoSgR z$w9N!+P1VuH`Mam&H)&YzGn_)YjM9WBF|SI`RLjer#g!RrRQzm3s3x{P*!T~+Q>FgOr0aJ+ zAG}_3EZCR71j}Ehqfjq^12KEF|HMoG1i*i<3IFMY^A8w2CO04e@SUS&32r`rR-#b) z9uP?k<|s%&B`R7_#tt{fKn0{{NR^=C_Z~UyfbI4|Tq{D73(~-fMB+|$n)8qNdHZ{Q zstvOR?>d0>(f;zS+kjz`2D8k7z1B0Ac^|HrdI1p#gRPlOV!NzCP-#yh4|`7_-xJ-1 zw8}@qIyaB(TKqOoptLQg0YwQjjG;Wj-J~6K>Y%x`8Iiji6vjb@d{ip-A#7%9=X~%W zu%ANPDpFe6tYJaHrMM!~O*F!xpJS`^AxD&V_b1U>q`YcEEv+u70T?xK)j^=&5hF(= z_vEM(nB%Hu;KuQ%$;b&ZngK<(K6mh%fEBfusp!x%=y#5DA%&G8|2Yc{6N;Da#afyA z9!9`H@NVr~zC}gWR~~DY0LAoBST5WlCK<9u>*WGjjZ(?t+Fy1#YQqC!k$&mwLD~qyVlHerXVrKfmmG~5pvWiS)O0W*oiXx9dpG1q zlI5q$l`j)gG!WANl;03xSm8Y!#us93>WrkL-Y~*NadGHvr_R&BYUX_4dO_-GD^$0j ze=%HGIZQJ6LL+pP*fY(plB+yz+w8bRdnq@D^T~Cvw6A4Y561RB3wJ_(~xE_DcRaZNvK8D*;a1WKCTx>`fT|R4A@0Yj&#w2)r;KA?hU|i%Ging`~6r zmJ+j-R)fNYA_WCaDhO9m4WwcQ#Ksj55WiS`EsE8;cwCJ0Fgp zSG~Wy{=n`f%wa0hei&rjFXxO0M74MGvkeL`3~p)xtY!5pbL40SSZ&WFjF&T87FNAl zzBXZNX)m4b$6ICaO1C11eMn-RW7^;nu0aV@;s(HsJYcp}v&l+#=KwlpEu(5iNZ6BW zAcBX6>%Xe)k(o;9Z;*72BodSg!{CxFij{!yzOE! 
zZzm{33bJfhP0k@l?wRE0(1l#Ss+~B>htF7C2kdB;bIOarXwTP1oaRUuY?F#(?Q{It}2I( z)m@j~qq}6bd9=*V*YqV0Yo?ap#CF=BQOTQK*QuCak7E_`z&l(CuRTF*_7Z_8tT_w8 z_`Xeass-lXo}|Xm)oG|X%lYsvWiW0kkg|-4LRp0B&OZVLhmmFiMD&agGAl{!ZY!y7lu;@ z;$?&8lV-~`v$^&n*O_arB>Y20oe$ChK80@qY==;b&}^aF1L+G0XgV*?_1yVf8fo_4XFo0~*3%DhiRJ#k*Cg>` zF}MtHJ%;^{RKvedfdA96;?Jt|RgzW)Cf#42_>NziflBKMfPjBWaRDSrG+g}tzi2+E^ zLJfYa^>RDh86Weo>fM=?A^0p}Xcnrc=6&*2Ix`0V2#7Us$?x{uG+{SDkV^FzeO zI4a;izQ(59x!A@G={+Fq_JQE-a5w;Dl6cr(%$R`lwW|x2!o#9wk?PBcbw^Dc54nwl zX?;|4TbC9|5g8Vi$3w+2cIq#h#kzH%o>=I~eUNkS=6q;SDoXz+J!9;!4> zK>TOssJ>(WYtPb?gNlM-LP05dFAR-rr^;q+N!!jfbkn#2)>A1!;x>Id|GV7>K|*So z=AK7%NS~1HqgcZkYURAN&wdyuCxy0RX7;W9*C?oFdh;d>5hq7Zq_qkMf}s^9h8+uB zqPX;Z14!&bXy8=@%&>q5h81<|T0OGoRo8Q7#S5zz1{|3B@V)|>llZe`{62wFS>50g zz%AQgM*DVaYzu~AWCiTa!^k2z#;~+H@^CPr^%$5WR{$blAMUQd+QWN3o3E2ktlaLH z14~e^%-cIwyYBkX1HPigr87oGyJo4f-z|$vPNv{o8~n-Cl**&-+5sVi!Cw5qDO zgklE1HE7mMSRt@6#5sasufi=&xhUl4G=6mL7sgq_3Zl8tXT*ok3YenOWU zbPB^tdlD44F5qZ|t&K8ow*cj2_flF@SI}Q1SK`t+2?vxxERv{2Bho<^R8B zU-ECZZH$UFFv}6cZ}NARBX)Guzp@-BV4OMVA(^rfHtGlEOXl}zPF9Q0Ab(}OeAe%^ zFnT$UW9lx&HL-yul%4wFc{Szz_V)aQEdag-a2m4X=GtPBnaopJlTB(#)owSl0t8uy znzo9DIuL0ZV`#MSBUv=sX!vw&EFZ|JuAv&u@Y?E2XJx9&U!7k%Zln-ULK+JUBZOj- z@%WxhG#7V-+M23v%mcgPX`{hZGG%}pkv0fbiA5Bh;c&pGLI7g*3QG?|71_70L+DrX zu{YvBQI1IrlMGwiHH)sm4C;ycYI$2}SC6@Qmz?iqm{7<7YI3+=y8&!dA5aDvNR5%^ zfKiSOGe#N|`oE(b`4pG|#)68|s%knbMI|Mip=E-l2>I}u31gqPQOAws#Bf@`s7eKX zwA=xnU`SaA)-%g)0;dxV4Cn?%SHPxs_=In4HWw=vWb(BpW^<`gV2sju;CmT#7*A6V zn{lBX1YDciM@EnalIhhD3$Ny&>L^^3$jv%xV8#*yS)7>aTSpi!LiSmr@3}W?b-Nw& z?n?4r&hyJDQn7kB_|gM;paZ{K1B7dI9f6)DKh?hTlcwAW zQaw9BM|x1X@DZa^`^{02e&3yJmjEN5D(s3*suffVvJkJ{FjnqI6Rx6U)7rGe$hoBMN zT|A6=*4rY%BGd3i>m}%M7O@6rmTo>~L>9~37o}zwYrDF}c(^0Y5g3L!D|sQ&Fx@U( z*gN7!_~vJM1KD!Y;~c&3`OUcWo3X?UEHE)Q=hjsIk}kWrZkCa0E`mi!ar1ofNW><;XCN>n*yp4mUa`rjG2e+)|mO-%kHx-k+MTquAbQW-Or4u3oe)9Q8&Nd#G=0|q}Wi6x04 zJ}Lr#)DP-G;rOMsKC+_V8@PB``lMy$Mcp&U*2Y&wDHU8fOX8(#WbXB&s2WBz*(P6tZy3&U;WUsA< 
zCI!Ya>hoy?Mnp+VRLBNjzs~AvLIJIh3*X1!GcA0cw&AbEG3B*MQg*{~yeCiErtZGX z&FS~}zkn%>AcMDE$ni?+EDQJ5nq`tBN5OLnpT-Q-ng%Og7@cZ#UvRl*8=5mN9jA8% zY%=~J>zZi3=pJ}27q`d&65;)l2=(Q*Ep$T{oPQ>MV*{)qel01?x#sFbA03Sgo+`vH z(e$Exq5gTR5?EmLsZz77b-0#u+uqZ5G}|xcIOITj*+7HRS{D_m*~ErO9OW&A*N_6^ zJA$O$qyy#3K@Kj%)bURwCu3Uh{y85Wyl!(2I_xMa7ut*hH1undb)~wPENC}}ER~Kf z!93G3*)|*{>p|I#WS>p=EE`4?;&C^F!rMgF{iKLsVyj1cn7XC*MJ3129ZXUjX3KCZ zoe8?$v|=^%dYuh}917(5GM#tv!$+);JC;g0Z6r%FC8==loVno^RkFSjYFJL(S|Uvc z9R1wzTF6DBF+q|XhVf?M^Rw^aB=eZb8i8uueUv+hG2+ETk~@GnL4A^Cb}r`ohNAeA z@a0-Dzf>k!E@hW*DW~XF>fO7)N^9lFvht<2!J$oH6j5(ci0^mHL~kvS&C;jL(xncV z7b~~wvie6pO&bAJGJ>d<2i|b*;XENYz64wYLrc&YNp zf+%mHz(Ye-iuMzI*)d2LK`|9ka+Fq|8g_K^jxU9AhY9yuqju068UFRB2>4-h|Kyu- zJHnzhB-sXLZ@2EJ$*0NUp7)nWD1xXKz)t;!rRq?uB@}p<0@IqDcb)D5nmJ^3)~pK~ zCP>6S-hjgOel!neC)71mK=)z3i38Vo>G-5Q9Osh{u zc5D?_32lSe!tO$rux~r*`7n89;I?qpz5j->pcoWmK(Kx3t^DgjI%thD7Rh?3d>vW9%5iZhAp7 zr)$GR{DVIJejErq4yIRtv!RgBYiZ68AkdL1FHn& zz&Sy}S&C4N(wOaG;(b4Z7MpbFNdo%5ES+SCb|xj+lrX1U-(Lzc<%Ag% zi_!4&DYpba+8D zFh8-yi=f+ZQJ07p;dxo529a*0n!xm`_n1#=VuyWT%>%$ zEgnjcHKWQj4##W6Li9*ON;yyH473~{4c|@`C`K*o=Aa&o{(ojMr~l$l5Zafh*u>Iu zqComfMZMB%kd{H_v4WKGxh|WDVSvH!wEKbso1Q{quRGe?qnF{N_;5aYA6u^h2-nK) z+PBMsRoey4iCj5~*m1)&xX_~9uQgp%F9iR@ZG)ZQGXq-P##{lFt1D{$X)==!On>6G z`wyL=>-M+oiVZZ^HPJ)1Q9N6de#dQF)BDY0PwxQn+JJ;QE@>wbvK~ulHjN`urKJy= zPD(AZ+{^|BK`2UH%+fvHkDv(gphE;n=0+wh2xRhjoSOm@@ypKWHlid8je6KZpzZW(boW7ui9x`7XHYmp!Hv&x?6eN z%`)RX(RfyP%isMYx3Pr!99xhq3I5r zb3_}B8JBJ~ej2BSJhV?~C&G5+lTcx*9zp8VRuS~LsIe73qG=9-q3~UPqd+>H{SG^A zwGzzNW9PLJt!gLb{#pPO=*Q@7py?GvlO2}#k3>{D)p;%1e$rhmG@SuJ-;s_kfH8$J zvhD-j-lts~TnpAv%!3{ZCAvVM=N0Y3d@zmrS~5AhIWooUtomjLrkv~4DsLh}gx5I{ zrt(LrVU~eW-yhPfnuC|A%*^!>IN>n30EJd!5`)Xns^bibD>KHz-h`wc~ z$5Seggh0b{2J9brrs=jT^}q1!_fOUA{Kv}9yqjbS2P*^fgE=-52Ue@jQr$+Aq{V`k z*iO9y*pT!Jp=q#U0uvT!^$y7fG`xA1wusnp`C^L`e^Oll>f+tziw z(5*+YLeE9#R`=29X9{Y}>}V)ADFY!-ajq?*3_|6<+~&1v-cSQE<8@u5#n``KSpG7Qs`c>7_;jQXQ11*`=7Z_hQ#d^8iZP-;X=;|Y z{%v^9#e6lfS#W#x>dRK}eRwj>yHOaxs=k?F@1zEFzoPk-tE>)i7m2?{oETBVu&R{9 
z1*R-Poj_6UC|9I6*CM%<3)pIACX730LZDtyGs2OSwMwZn3%x#L@`)q5fEv}#tjZg= zL-;~YbHg%t{Y)~=Sj5u`Grk$Yh#ZnfjOOi&inDR}obw6>5sO%Glny#1jRA4q(ojP# z7Fb|q34{e>yM?Y)c-(yp3)x!^hTJ#Q;P?g<5n{JDEP^DcNqM|aJ4END7Mdk;BsKcNGw zCLjiA*Bu-Q(T`xd?`mP&5#Nz-T2lRA98yd5L#ngx$%cJLqgrn6`crT`m0s zXlN(k%Qwp9G?)$)-0x(~Ho9x~FtdJO+^yni^n1uOM8EzTH`roey`uw>tlR&nt|I;0 zQS+aD#J`+x{~faw6E|ViFMtvKdgkAzv8C@G>} z9mOYrFU2JWhKP8@;2&l``Q^Fg{pr)RXcPj;mS~AbvK3k5h}ooej{n7?%~o}y7Sj9( z%bp&+o+F(!C0FSt98e+aQ|7tw9-ZbyOiVjLlsTSSb^E4!1?iVa7d;*|K2%0F0_-Ik zb6{Io(rWjAhb=8qkP<-PS|a?ItH7NQyu_^p#iA4H9D<bgN>n*$gV&y9xfJvpdQwrDwV*kUQIX zyY+0?Pucms{d&Ot9GV8~IoYzyc3ZD}5iD=UX0xYlmKk8fK(bCX_QVF%xtPV}`+w&p zDBh_*xK_XT#b>V;<7=48!^DT-x#nQ?wDc^#eorC+ZS+Uq?TAm`v*so<<%BECYFoIF z=n$e))*teXA(bwy@qRofu=0<-+l~e6K|J)yh4Wl)O5|TmOi~~dv&MPqI1$LibkI5S zT~h=yG0{|H`#A>2Jt+b#>0Le%r^hGslN1keH);VI4uBM8g*szZV+`P!)MABnkGIhB zkG{Kx;I|ZwyOsJcDVp(bDH>!&M^nL zim-J<1#c8Vk^Wi~)n^Fq5a@1-$P>Z=33!z;mowOwAygU&uQ*I_o|au!+EO&XV1Ykc zv|~z*q0C;LQlT2KjpAgSF!mM^B~DX9Ogf+z4aiC(Jnf0+hguV%xWz+qqqN{h`aYq~ zk&k|Y-gcftkm0D_HRtK$ojc+iQ@XH5frJjI#H{yN@ zJ$79m2O_z9&AE&UshC?7l1fL7{7|e%Csm{{kAN4^zQZP~p>X8L z*=W;|H~+xP0;c2dF9E^}OH8YvQeh1eDPh2*^4-?IyO`J$amYF@A8xkqxPx`;B$+9r zRK+}hZo-e1tPlbu7Qe<=3LYv8{UyxV6K#YmL&PMS4Wt_1iY+x>#y3}z!{(6$vhlr> zTJf+;Pb2T`$vh%+;;aBJ0Q%DK26ptOu0B(o@NDgi?67Dn#R^=5u8Qpu`0wX&6*D9~ z3_K%*|8_?F6T30|?Lhrswg)8boK2lvfDMBGIxKuufF?u=D7Tk%V>?w zb_NJ(QnpQj)6lzkr^x+9^fH-Tr}no&ldUn696k@?oSSK>Lu9606TaU)e{MRr@ca9} zfyWOs1!*~*+Jzj^WHDPj6{y}>9a*E!v0=7vLh@!Cawt|gP2IyQ!v^aiuwHsGZ(7M= z05+0RYda}J^B*s#2SJ3*7d>U0;ZSJ%%ug3gFHEqO6LDzf?!?N5dDn=E^ZCFH4( zXAlI<;Z@E}!)r3^Cz5R=Q|8ZavA%jhKQM~yMtp+R<;XMQF93vLwL7~i-q47f+(+6j zT0#E`|4avU41m}hLQ5u0!5Jh(edUm0WL#Rh_zBp%F{)G~_DF=pW5j{G9yiAoj0G^Q zLqX;FV^?&-sthRNC{wbGP_AxP0FM$Fqc!j}o#@WFIe@R>=A5dS9%qqtAF$iiVFtpj z-u9$5%n;lhY8YJ>zEgES)@T#j=YauA37IXRR3MHSdjVu2Pf9j}QcV)0VrfyLAY?fk zVp<#bC~}||L!@C~aC$fD)<(d!c&6m9)43)cbSlzeD~oxy+fI^@9dDvoUQudN_urq^ z0BSEYsQnCq5mYRV$@?tZ 
z%9y>DtziZ8PoH&YSj|$^5CS_iF(Ku0#I*;i4PtxWw7ykM**jqW7;QMu%t6i&P2X=D$5^&XteDGSho^{N+6F zFpI}BGF~OA5Ia1^cOXK(@oYIFOm0gV7Da9r`ppWm^wk^H++ra>k26FD%4%P=lkFh=h0z_xPhiE6h~fVX zKJ3ou*U+(RMrlo=W0f`eh50Ufg_AkUk9o@vXgy>A8HX8|A=E}eJ!d6n$WdQb?q!)r zBF!0*PiDN>KYqS%dS{5Z|#^A@K#>bDL zt3`;^yO6;Ife4s9x6Gjrw!%{>q_XNi*zd!yxZTKXc`0Sp>!s>Y{J7G~5)U(EG*jI( z_c(kpK`>b>CB1;IAcqs^PVYf$J=M=RQ?^zyNB{zIR4&NgA@9exv#jvi0J+~ zX)9e!9^rE9Yz*&HZxvJFQ9bRA6_oDZpHPci397G))ZT@(Qg%5dzYm+M>rvFxZL1)Z z5=LIH^MdiDeq9O`-PN8Q+)7>O5Ad9ejiMr&)!W0s%%FdG)EQlP5uw+rt6^w}p z6A6*E6YQo(&@s7)6xeBAg-N?NnFMYeb-nj(T;A!ZwyoXKxN}^>QH@+QbO=v6cpudK zs_(B^SIvb*cNfx2kQ$cFPhv{czPvt}Qst_sQ~f1-8>Jnvn_5)x&AA{2qo+Mak=DQO z5@r|XOlh&zy#zyRJRLwkA%maltW-|2jVA0&(pAlkC>xPJd_kbpDs28?FHd2-^2U=? ze>$anss2_<9VYWLW;=zwqLw;iv89w_U|bD?ty2vV{ipF6$H5t50^x?k_?=eeCWCLs z(7Fu&USAm(x#y;=D^Vz+cUrc)dv z=rqiS+p8fSv|0`9Uf&A*X*XBpuIOX>-#hx4=W{|Va7WYrO<(xWG1&htc4z$$6_qZ~ z6A*=0+7gseNRZ+&0H`kvQvy|?`S>C*{TsW7s*@)-1zRS`WpW+c-+t7Ozv;VK!9Va| zhmw)xE}L7~INxYE+I0T8`S$De72*TzSB(A0Ik!rmkqT$ak;gXthrF-qpIDez7<_EBT|NQkDO=!0_kOEw^pQsb*|-`9k}O zm1L{XA9p6{G-3+ghRSWKP@urjEIez%>=k*yrya+&)Vy7mglzkc^e-Kiox{|5EfBpo z{#{3PNRzy6Mep`+^j`ZHdannfcOcFD4|>o4-cbT{X9_L9@ctUHeh=7zyZnYRzyf-Q z5lRO>5#L2&yurY-!H{Y}GpCWSIL-bent>UhR+VY%Wt-;nP?@87hu<~Gfastyh~vTM z-31Un9Ty#`%b}#y5Rel?`-wsJ+Pe{&;y;2NJJJ$Y(Mh?f-cO{u^_0p)k@3*&*lP>k zB+lrfi2=P)>)%?cutOhJpk;`0(nqD5EjDNA_^I;aCE~#6BNd*ddh3}H$SN1>8ud|K zj!)Ja=wGl^rc?61CnwrEyGxU#D5E9LFeP3mSIuY@@I^l+U4H>{?FtLVg=i2Q_C877=K?YOFc8Cc{K4?{@eQRCe#Nk?v_K4>_r*s#n*pnz zeOjFVQaxSF=hV zdw8s;g>&Q(vqp4@jNYhpcpoTprsKy_`qx^woECgh-_ zto_$*rw!$!GKBd~7R+-S9xot-1PR$HgcfMH<(&mqPXxWg6fc3sfOlL^L&#<`I}vKt zq*vV~Z(I4$Ul>Mffk-V4+tjp(w>q%8I)K-_i1$!i{lGVGe<-`T#%^!st|$r8v(;&D zzWLJg{?pg_+^1n@2mP`G|AUYT`^TfEO^oKIb8sV6Sek<}mHZ@a?5pbNQ_n5WmmMXEauy9L;0_ejz=s&> zPz>j$dvip%aL5=c`HtCbY>Bevil#fQhNxl95YmslF~doplR82ZKbG(y%;t%OX<3h^zg>r#F5YJ5Cy3O1!mJwRBn8u`5G16G-6qDNdTy|k zXTi8=c=Wefw%cI=|18Uja<$;tS!dB<7xvr$D6*%WB5>`j132xpJeNv_7R@xm$W9Mq 
zMusMYcpG8Sf@pk|r8|aM|Q->O4txQO-Ne8MnF%Wq6V&kCDv-WfQ~87Vewro<}ibhq7sK zGmLDj*?3}`64*4eN2k-WF>*LTGa;+y91otLR{OAJeIQ=>aBJctZ0vUVoy{TXw%jh^ z%4?~j1tS)sERn&{K+_ds2eB^vuRM2dv_>jW#~`*=%7+|CnV7JL1w!kXuQY3_`t7@xtkIr_mAgFwL#XNSs%7{JH)N3xG(`UKVl4JsaN%N4H=|uOsFi@_pN>!|PMEmZ?Lr zZdat*1XJs9jh=L+7-r{Kjh=XAK9fh3t}k_f-^w&z3mUhrH}`=b1RSMYHKmaZ9(EVA zQKKZ+JYH1M$bdy7u$ow|i|A?eFdxut-poLF5E|&0fwaEn!>tLUKHj7H)tbszQwWb! z2NSm!zOv_xY}&4gfo*lrRp{wQzn_eU!i|dL_Z_Ao_*xswM<&dd<@VWP|L4r;Rbc}E zvuac)?opg_S1!V?%tYG>2kzM|?q;A-m`^@S7lJZmae1G)%d&i7Yp$3hf6Q)u8m1_> zS#Gfw95iELXIjrKv20C{w1Q6tHBD$s2VwBg^tK)@+*~+L38JQkf{4vKzdYkls5j-n zixC$P#cbxlU6`ezSrwx^dNlcI8;|Dm>)I7R`5rs#6nfT2qyXiGQuMIk9E0wvfuLC! zfgtfcLR`GhUy)ip?{D)E#i1wLvRgAgaZWN7gt>YIbMf)LGIjZ3etH^;z1^DGnYlSj z+xbDQ60d(=_Uem^hu`Dpx%5?Ynx~QZ4*%>L?8n*ZULw7euEN1#?9(S%c5d-^Q#6*!bDLLnE*{>)LvQjpMTPr|&R2IX9^r{A&*HYac_q!mOASqh_zS?BJQokQ zmYSC_Ce^4EcbdOk*im^(b;!|eEUz8gSI6x1UubTS%KjtL4KJzva7?#8@A~&0FI(xT zm!u34*pZsARbL=toa1OiMmYANlaof=XT3v!3zVoWNgybmcYl;rD>z9nN(?dpTa0yt z?vX~4?6@c#1(`xB?0A5n$SykR%A%pF?!c6kdDolUHcl+7t}$GE1I;w&3dCs&j1bid z%}l64%mrjA`bMU9q)Uc8X-pDK4=1kB*9O$u3tSCj{_smsro8~{yYGhb-l-jxzD=}5 zYzNgsF_J4O?Gz85wby7_!0X`$Yf=+-nb6-_MPL!_Jd$F~Sy|rPf z4#^T*MYK6f)p6zcv8;@4$vftYm_epIxinnbX;DJ(ac-#h7E-tfR~kLK)=IUMo$+xK zv}-j-eC&_f9u%q%67S>h$^wZNeF^KQ7t03}Hh2L| z8<^=C0V@!n6v)3sPZ6aDHMLP)`+(&VeROwl_+VXH(F=p9L;ul!cbi}lt-Kc&Qjh4P z{r>h`?P3{77=n#N?*)#H(S%Y!7Q#BsNBiwp%kn$Gia)B)eMy*a+?GRb7_FbJ@0CXpAAG%~OHRq#I)sV_IN1(e@FEs~8K$GtDKwqVz zT%Io$8h+>!jtQ^MxRVDh-_a^>UYsrB{T!Y|{ZGys)vo6R z$VUQXH36(}!mvy}-wa!W7=1FCs|NTQ-47VfO92qI>k$VvfOT3ccyr6IE{*C155lnz z-sulhoW#20UTN?Hh84U)2NK|+q7!PVhfdjvFXNZ@iAZ`Z;DzS1}&2r6Ej?-|RK=7R?yRu_s_SBlKd9nEZK;f{3n zc}^YF7$z;hIAh0x3@zMVtbV6lsXVJ6wZg5}3adgJON3i-5mPINJlFM+=iSCv&^Iur z0`(q+s^ZR5jdl+us$J>4H24LoT?sfRB(J?RP-C9Nn)JkjKOg;Rz($TQp>Jbd|5)D* zf!obBC9ufUo$=)>jH`kU>bscB6N!Gvt@&{k?>lN;(plW-ClQ2I_&CqNG)^h#LwE;q zI6or6sD?@+t4pTAiOT7?1UKS+*n#_8)l2U8oHV!M`+>9}368*gdyn%4ypqg3vEh38 z=EFBt1QJ^lTFo#g&~FWjY)0PP-CMCv#!pNZLTyk7;K{fYM}_Ain4+F=-@ucz6l+9Q 
zLh7+iQA{BZ(2{K_S_-N}+h9#0zBwz>iLAubqaJut>`_dKKe2p+QQ(Wb6-t)RTaUff zOh(W5K{yzq$Q8j9e&YFtOMx$lDgMOqEt?`&FoQz>0qJ)+HW@_JE(QDHL$C0E_8$Mq zY9;(zIriUjBLydW6IWxG|1v&T`WLx1NVhE8I4h{@b9=;UAogwtQ_R>GD5wUpWD|Vp zww3ne)MTSW|J$Ib2n@-Wk8c!vZN+lKM6zC&yuY)Jk2+s|z3!2JhOPie4V*ovdf0LY z5?W%c@qTxa2^BGRx%J&!2>YcMK&XEe-V2y89iTXm+8(v;`O)RJ1q~gXyk3X1))a&e zV9wk$3BU5P1<}n#8cr}u-+&rW(E(A<6EIExx>n;cchwmDzG)+Abz*#1sOcv2v(~0D zHU!mSoLWLy6;Yf>)G$h9_1rC69-yt|o6PVHh#?)wPp#iuoJ=s#=(kx(*HAxcRw?E8 zC2SZQ=t{nH5)Uo%dRb<6)51hl?D(=5Xw8=>&JnkUE+IN;ytBl@hJKRG9W5ma!%jql zwb87UXQL(ZlkOw4oBJXM?T+a*mxW9Zc4){khb7ErL~sCEZ@_8xb);@6{%Ng*(OLCt zIYoYz0d}%AsmZ?k`7QU%4M?54QG}fV(9@i^;!392t%7w4^{qw(T?v?Yjs5=Z%T zIAkA3ehzh%)zCxiy<#18^#x2R{ZnZgiKVp5w>bEG0E0Ugm5iP7eyw1sizF(928G|y ztmUyo#=eb(N4i;S;KArCrx`-hrLYOS6lj$%At~EsY$XXZF?m&3@;+@C4~C%bzz@h% zQ?O#VI!}dPp;@37bJO$8ET~+OgjYzH@v;R6gI{pYOv?;Y?zO@V*)1azQuu)lleY6dqZenLwP zhWEj?x&9=v%d~8;2ZL=4xQ(F41RDkq5L_Pqd3j!?pVjDVnMG!q#rFFriA^$ViOhDE z%yyN`)~o2}M#)d#E!#x8evAtsW#IjNYyAAY#d^Cz|7DpRk}#$* z1Zit~etq2GcKcBz5y{`j&Wnh%!`Y34F(I$Wg(kz!s-UQ%xyGF-L(ZUevtI=tXtFbw z;U=ZCp-ep8vI`X))=GJU!`=OEaur}D6)5l2_#7eCskckW$4^N}hwpQ%eA^f6u~)Nc z9g!9BD?-Go>g_SbdThtVn@5QPgBB$k6`Bnd^3qYRx=GvvIoS-|3TV22HCcCiRNbnG z`mu7(6|u&T6@BJdee^MUYh#KL4_nfVOx6s15b6*r-2;e`27>9rOLBs$i2H2nhzeX0 z;d#~diRv$)QiO(N^QUB0HN-331BvE8Nf3gp40}?BK_z#dI*Apkch``7(P%%=sXr{I ztfr&RmUZ_T)c0FbS3A)bt9s&%>f_tAbltksb*~;VLs;hYJ#2~(2h=WF)V1b8T2W