Friday, February 06, 2015

Coqdoc requires the LaTeX package 'preprint'...

For TeXStudio to convert coqdoc's LaTeX output to PDF, you'll need to install the 'preprint' package, which is required by the '\usepackage{coqdoc}' command.
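For orientation, here is a minimal sketch of the kind of wrapper document coqdoc produces (the exact preamble varies by coqdoc version, and the body placeholder is hypothetical); if the 'preprint' bundle is missing, compilation fails at the \usepackage{coqdoc} line:

```latex
\documentclass{article}
\usepackage{coqdoc}  % fails here if the 'preprint' bundle is not installed
\begin{document}
% ... coqdoc-generated body would appear here ...
\end{document}
```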

Wednesday, February 04, 2015

Emacs + Proof General + Coq + Windows (for beginners)

I eventually managed to get this working. Here are some quick pointers if you're in a hurry.

  1. Use the release version of Coq, not the beta (8.4 as of this post). The beta (8.5) caused a couple of errors on my system. First, it spat out Unicode character codes in the middle of command-line output, making the results unreadable. Second, running Proof General in Emacs then gave an error (something like) 'file specification error replace-regexp-in-string'. I'm no Emacs expert, so I left it at that.
  2. Use the Windows Emacs version from the Emacs site (24.4.1 as of this post).
  3. Untar Proof General to a directory of your choice.
  4. Add these two lines to your .emacs:
    1. (setq coq-prog-name "<PATH-TO-COQ>\\bin\\coqtop.exe")
    2. (load-file "<PATH-TO-PG>\\generic\\proof-site.el")
Note, if you don't know where to find your '.emacs' configuration file or it doesn't exist, just follow these hints.

Assuming it works, when you create any file in Emacs whose name ends with the extension '.v', a distinct menu bar with icons will appear under the Emacs menus.

Sunday, February 02, 2014

Google Auto Awesome...

I have to say, it's a pretty amazing service being provided by Google.
The creations are only visible to my family, so I won't show any here.
But I am amazed at the smarts!

Thursday, May 31, 2012

Test of Scala code formatting on Blogger

The author of this blog (most of which goes over my head!) asked about Scala code formatting. So I took his code and applied the formatter mentioned a few posts back. Here's the result:
object Compositional {

  type FunctionOfList[T1] = List[T1] => Option[Double]
  def convertFunctionOfListToFunctionInf[T1](f: FunctionOfList[T1], support: Set[Integer]): (Stream[T1] => Option[Double]) = {
    (xs: Stream[T1]) =>
      {
        val variablesThatMatter = support.toList.map(i => xs(i))
        val fx = f(variablesThatMatter)
        fx
      }
  }

  class FunctionInf[T1](val f: Stream[T1] => Option[Double])(val support: Set[Int]) {
    /* A FunctionInf is a real-valued function with an infinite number of arguments (represented as a stream)
       that only depends on finitely many of them (called the support, in a slight abuse of language)
     */

    def apply(x: Stream[T1]): Option[Double] = this.f(x)
    type doubleOperator = (Double, Double) => Double
    type optionOperator = (Option[Double], Option[Double]) => Option[Double]

    // Need to define an extended field of reals/undefined ...
    def divide(a: Option[Double], b: Option[Double]): Option[Double] = (a, b) match {
      case (Some(0.0), Some(0.0)) => None // 0/0 is undefined
      case (Some(x: Double), Some(0.0)) => None
      case (Some(x: Double), Some(y: Double)) => Some(x / y)
      case _ => None
    }
    def add(a: Option[Double], b: Option[Double]): Option[Double] = (a, b) match {
      case (Some(x: Double), Some(y: Double)) => Some(x + y)
      case _ => None
    }
    def multiply(a: Option[Double], b: Option[Double]): Option[Double] = (a, b) match {
      case (Some(x: Double), Some(y: Double)) => Some(x * y)
      case _ => None
    }
    def subtract(a: Option[Double], b: Option[Double]): Option[Double] = (a, b) match {
      case (Some(x: Double), Some(y: Double)) => Some(x - y)
      case _ => None
    }
    // ... so as to define binary operations on FunctionInf
    def opply(that: FunctionInf[T1], op: optionOperator): FunctionInf[T1] = {
      val supportUnion = this.support.union(that.support)
      val f = (x: Stream[T1]) => op(this.apply(x), that.apply(x))
      new FunctionInf[T1](f)(supportUnion)
    }
    def +(that: FunctionInf[T1]) = opply(that, add)
    def -(that: FunctionInf[T1]) = opply(that, subtract)
    def *(that: FunctionInf[T1]) = opply(that, multiply)
    def /(that: FunctionInf[T1]) = opply(that, divide)

    // Margins 
    def project(J: Set[Int])(implicit values: List[T1]): FunctionInf[T1] = {
      // J,K is the same notation as page 626 of Jirousek: J is a subset of K
      val K = this.support
      if (J.isEmpty)
        new FunctionInf((x: Stream[T1]) => Some(1.0))(Set()) // we define projection onto the empty set this way, as per page 636 of Jirousek
      else {
        val M = (K.diff(J)).toList // M is the set of variables we integrate out
        val marginal = (xs: Stream[T1]) => {
          val ss = subspace(xs, M, values)
          val fx: List[Option[Double]] = ss.map(x => this.apply(x)) // Evaluate f at all the points in the subspace
          val theMarginalAtXs:Option[Double] = fx.foldLeft(Some(0.0): Option[Double])((c, d) => add(c, d))
          theMarginalAtXs
        }
        new FunctionInf(marginal)(J) // the marginal depends only on the coordinates in J
      }
    }

    // Right composition
    def |>(that: FunctionInf[T1])(implicit values: List[T1]): FunctionInf[T1] = {
      val supportIntersection = this.support & that.support
      val denominator = that.project(supportIntersection)(values)
      val numerator = this * that
      val theRightComposition = numerator / denominator
      theRightComposition
    }

    // Left composition
    def <|(that: FunctionInf[T1])(implicit values: List[T1]): FunctionInf[T1] = {
      val supportIntersection = this.support & that.support
      val denominator = this.project(supportIntersection)(values)
      val numerator = this * that
      val theRightComposition = numerator / denominator
      theRightComposition
    }
  }

  // Helper: converts a "sparse" stream representation (i.e. a list of the coordinates we care about) into one of many commensurate streams
  def sparseToStream[T1](l: List[(Int, T1)]): Stream[T1] = {
    if (l.isEmpty)
      Stream.empty[T1]
    else {
      val nextPos = l.head._1
      val nextVal = l.head._2
      val hd = (1 to nextPos).map(_ => nextVal).toStream
      hd #::: sparseToStream(l.tail)
    }
  }
  
  // Helper: all the perturbations of a point when the coordinates in a set M are allowed to spin over all possible values ...
  def subspace[T1](xs: Stream[T1], M: List[Int], values: List[T1]): List[Stream[T1]] = {
    if (M.isEmpty)
      List(xs)
    else {
      val subsub = subspace[T1](xs, M.tail, values)
      val listing = for (b <- values; x <- subsub) yield ((x.take(M.head) :+ b) #::: x.drop(M.head + 1)) // replace the m'th coordinate
      listing
    }
  }

}

object CompositionTest extends App {

  // Define Pi as on page 629 of Jirousek - ostensibly depends on the 1st and 2nd coordinates
  def pi(x: Stream[Boolean]): Option[Double] = x match {
    case (a: Boolean) #:: false #:: (rest: Stream[Boolean]) => Some(0.5)
    case _ => Some(0.0)
  }
  val Pi = new Compositional.FunctionInf[Boolean](pi)(Set(0,1))

  // Define Kappa as on page 629 of Jirousek - ostensibly depends on the 2nd and 3rd coordinates
  def kappa(x: Stream[Boolean]): Option[Double] = x match {
    case (a: Boolean) #:: true #:: (rest: Stream[Boolean]) => Some(0.5)
    case _ => Some(0.0)
  }
  val Kappa = new Compositional.FunctionInf[Boolean](kappa)(Set(1,2))

  // Define Nu - ostensibly depends on the 2nd and 3rd coordinates
  val Nu = new Compositional.FunctionInf[Boolean]((x: Stream[Boolean]) => Some(0.25))(Set(1, 2))

  implicit val booleanValues: List[Boolean] = List(false, true)

  // Point-wise operations
  val PiTimesNu = Pi * Nu
  val PiOnKappa = Pi / Kappa
  
  // Projection on to X1 and X2
  val Pi_x1 = Pi.project(Set(0))
  val Pi_x2 = Pi.project(Set(1))
  val K_x2 = Kappa.project(Set(1))
  val K_x3 = Kappa.project(Set(2))

  // Composition 
  val PiNu = Pi |> Nu
  val NuPi = Nu |> Pi

  val tf = List(false, true)
  val X3: List[Stream[Boolean]] = for (b0 <- tf; b1 <- tf; b2 <- tf) yield (b0 #:: b1 #:: b2 #:: Stream.empty[Boolean]) 

  // Check composition
  println("Compare Table 3. on page 630")
  println("x1   x2   x3         PiNu(x)     NuPi(x)")
  for (x <- X3) {
    println(x(0) + "-" + x(1) + "-" + x(2) +"  "+ PiNu(x)+"      " + NuPi(x))
  }
}
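The Option[Double] arithmetic above amounts to an "extended reals" convention in which division by zero is undefined and undefinedness propagates. A standalone sketch of that convention, duplicated here so it can be run in isolation:

```scala
// Division on Option[Double], mirroring FunctionInf's divide above:
// x/0 (including 0/0) is undefined (None), and None propagates.
def divide(a: Option[Double], b: Option[Double]): Option[Double] = (a, b) match {
  case (Some(_), Some(0.0)) => None        // division by zero is undefined
  case (Some(x), Some(y))   => Some(x / y)
  case _                    => None        // undefined propagates
}

println(divide(Some(1.0), Some(2.0))) // Some(0.5)
println(divide(Some(1.0), Some(0.0))) // None
```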

Wednesday, May 30, 2012

Scala and Avro

It's possible to use Avro in a Protobuf-esque manner. This means you can feed the Avro message schema file (.avsc) into the Avro compiler and generate Java wrappers for your messages. The easiest way to do this is via the sbt plugin for Avro. Here's how.
  1. Include the Avro plugin in your sbt configuration by adding the following lines to your project/plugins.sbt file (manually create the folder and file if they don't exist):
    resolvers += "cavorite" at "http://files.cavorite.com/maven/"
    
    addSbtPlugin("com.cavorite" % "sbt-avro" % "0.1")
    
  2. Restart sbt, or type 'reload'. The plugin should now be ready to use; this is a good time to read more about its default settings.
  3. Next, create a directory 'src/main/avro' to hold your first Avro schema file (.avsc).
  4. As a test, create a file DummyAvroClass.avsc in that folder with some experimental contents. E.g.:
    {
        "namespace": "dummy.avro",
        "type": "record",
        "name": "DummyAvroClass",
        "fields": [
            {"name": "header", "type":
                {
                  "namespace": "dummy.avro",
                  "type": "record",
                  "name": "HeaderAvro",
                  "fields": [
                      {"name": "something", "type": "string"},
                      {"name": "somethingElse", "type": "string"}
                  ]
                }
            },
            {"name": "anotherField", "type": "string"}
        ]
    }
    NOTE: as demonstrated above, avsc files require nested definitions. In this post, Doug Cutting gives a nice summary of avsc limitations and more powerful alternatives.
  5. Add the following line to your build.sbt. This library is needed to compile the Java classes that Avro generates:
    libraryDependencies += "org.apache.avro" % "avro" % "1.6.3"
    
    Then run the sbt commands 'reload' and 'update'.
  6. (Optional) If you're using IntelliJ and you have the sbt IntelliJ plugin installed, now's the time to run 'gen-idea'.
  7. Now you can compile, run or test your project. Of course, you haven't yet made use of Avro - we'll save that for the next post.
  8. (Optional) As an example of how you can change the Avro plugin's default settings, let's alter the folder into which the Java classes are generated by adding the following lines to your build.sbt (I sometimes do this so my IDE picks the sources up, which is great for debugging and learning purposes):
    seq( sbtavro.SbtAvro.avroSettings : _*)
    
    // E.g. put the generated source where IntelliJ can see it: 'src/main/java' instead of 'target/src_managed'.
    javaSource in sbtavro.SbtAvro.avroConfig <<= (sourceDirectory in Compile)(_ / "java")
    

Tuesday, May 29, 2012

Akka and Camel and AMQP

So now we have Akka+Camel - let's introduce an AMQP component (so we can talk to an AMQP broker i.e. RabbitMQ).

Add the following line to your build.sbt:
libraryDependencies += "com.bluelock" % "camel-spring-amqp" % "1.1.0"
From the sbt command console type 'reload' then 'update' and you will be ready to connect Akka to Rabbit via Camel.

It may help to know that you MUST provide a queue name when creating a Consumer. This means it doesn't support server-assigned queue names, so consumers have to generate their own unique queue names when implementing pub-sub - a bit of an inconvenience if you have many subscribers. It may also help to know that the above library uses the following defaults when it creates Exchanges:
boolean durable = false;
boolean exclusive = false;
boolean autodelete = true;
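Given the explicit-queue-name requirement, each subscriber can bake a unique name into its endpoint URI. A minimal sketch; the exchange name 'businessX' and routing key 'orders' are illustrative, and the URI shape follows the exchange:queue:routingKey convention shown elsewhere on this blog:

```scala
// Sketch: build a per-subscriber endpoint URI with an explicit, unique queue
// name, since camel-spring-amqp won't assign one for you.
// "businessX" (exchange) and "orders" (routing key) are hypothetical names.
object QueueNames {
  def subscriberUri(subscriberId: String): String =
    s"spring-amqp:businessX:subscriber-$subscriberId:orders?type=direct"
}

println(QueueNames.subscriberUri("42"))
```

In practice you'd generate subscriberId with something unique per process, e.g. a java.util.UUID.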

Book - 'Pragmatic Thinking and Learning'

Jonas Boner, during an interview by OTN, mentions an interesting book, Pragmatic Thinking and Learning (Introduction and Chapter 1 available). It's a light book but an interesting read.

Akka and Camel

I'm still trying to get my head around this library, but here's how I've pulled in Camel.
First a warning: Akka is currently at release 2.0.1 which does not provide Camel integration out of the box. Version 2.1 is scheduled to use Camel as its integration layer but this hasn't been released - so we're toying around with nightly builds.
First let's pull in the Akka goodness. Add the following lines to your build.sbt:
resolvers += "Typesafe Snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"

libraryDependencies += "com.typesafe.akka" % "akka-camel" % "2.1-20120529-002605"

If you're already running the sbt console window for this project then run the commands 'reload' then 'update'. Otherwise just start sbt (which reloads build.sbt at start) and type 'update' to refresh your local Ivy repository. The number at the end is the development build which is updated every night.

If you'd like the feeling of a little more stability then try the snapshot issued on 22 Mar 2012 by changing the last string to "2.1-SNAPSHOT".

The above dependency also pulls in the Akka Core with the same build. So that's all you need to get started.

Monday, May 28, 2012

Test Blogger code formatting


A snippet of Akka-Camel stuff - really just to test code formatting:
  
class Orders extends Actor with Producer with Oneway {
  // todo: untested
  def endpointUri = "spring-amqp:businessX:outgoingQ:outgoingQ?type=direct" // todo: I've had to manually create the Q - why?
}

Shame about the scroll bars!
Credit goes to Alex Gorbatchev  http://alexgorbatchev.com/SyntaxHighlighter/manual/demo/ .

But just look at those lucky WordPress users http://bcomposes.wordpress.com/2012/05/12/processing-json-in-scala-with-jerkson/ !!

 NOTE: the above method doesn't appear to work against the delicious new dynamic Blogger templates. That's because you have to manually paste stuff within the page template header - here are detailed instructions for setting up Blogger with SyntaxHighlighter.