Saturday 29 December 2012

Hermanizer Power Pedal - A DIY Fuzz Box

As a Christmas gift for my brother I thought I'd try to make a guitar effect pedal. The main feature would be a nice sounding distortion effect, classic rock fuzz. I also found an article on Instructables by Harrymatic discussing adding a timer IC to the circuit to create a sound slicing/chopping kind of effect. This construction is based on his ideas.

So what you need is an amplifier IC; I used an LM386, which can amplify a signal up to 200 times. To be able to control the amplification and the output signal, two potentiometers (resistors with variable resistance) act as gain and volume controls. The sound slicing effect is created by an NE555 timer circuit which flips the voltage on its output pin high and low in a repeating fashion. The frequency of this is determined by a third potentiometer which is glued to an "effect rate" knob.
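For reference, in the standard astable 555 hookup the output frequency is roughly f ≈ 1.44 / ((R1 + 2·R2) · C), so if the rate potentiometer forms part of the timing resistance (which is how it is typically wired, and an assumption about this particular circuit), turning it sweeps the chopping rate up and down.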

Circuit diagram of the effect pedal - click for larger picture
Depending on the state of the FX On/Off switches, pin 6 of the amplifier IC is fed by either a constant feed from the power source or a chopped up square wave from the timer IC.

So the first step to try this out was to connect everything on a breadboard.

If you buy your ICs with DIL8 sockets they will fit nicely on a standard breadboard. An important lesson from this setup is that you must ground both the input jack coming from your instrument and the output jack going to headphones/speakers/amplifiers, or else you will get very disturbing hisses and noises.

Once everything worked on the breadboard I transferred the circuits to a stripboard and soldered them in place. The soldering is not visible in this picture since it is on the backside of the stripboard.

For an enclosure I bought a metal case and drilled the necessary holes for the potentiometers, switches and the power LED.

I bought the parts I didn't already have at home from ELFA. It all amounted to somewhere around 200 SEK. The integrated circuits are about 8 SEK each, but the casing and the knobs turned out to be the most expensive parts.

After stuffing everything inside, the pedal works fine. When only using the distortion effect there is no audible background fuzz. However, when enabling the chopping effect there is a constant background noise, which is evident when no signal is coming from the instrument.
The final effect box
So how does the pedal sound? Well, this is the sound when applying quite a lot of distortion to an otherwise clean organ sound. I'm not a guitar player like my brother, but here's a few chords from my old Squier guitar lined directly into the effect pedal with plenty of distortion.

Here's a synth line which after a few seconds gets the chopper effect applied. At the end of the clip the rate of the effect is increased to maximum.



Saturday 6 October 2012

Functional programming in Java?

After more toying with Scala in my spare time while hacking Java during office hours, I find myself missing all these functional idioms in my Java programs. So with Java 8 coming closer I got interested in how the JDK team is planning to utilize the new Java closure syntax in, for example, the collections libraries to support a more functional style of programming. While I have it fresh in memory, let me tell you about some cool aspects of this!

Closures in Java 

Closures are anonymous functions, also known as lambda expressions. In languages where functions are first class members you can assign a lambda expression to a function variable. However, this is not entirely true for the Java implementation. In order not to mess up the type system too much, the engineers behind the implementation of the Lambda JSR 335 in OpenJDK have taken the approach of letting interfaces containing a single method become special in the language. Such an interface is called a functional interface. An example of such an interface is ActionListener
public interface ActionListener { 
    void actionPerformed(ActionEvent e);
}
or Runnable or Comparator.
So in JDK 7 or earlier syntax, this is how you would declare for example an ActionListener
ActionListener l = new ActionListener() { 
  public void actionPerformed(ActionEvent e) { 
    doWork(e.getModifiers());
  }
};

However, with the new Java 8 syntax, you'd write it as
ActionListener l = (ActionEvent e) -> doWork(e.getModifiers());
So you get rid of all the unnecessary boilerplate inherent in the anonymous class creation so common in Java. Here are some more pleasing examples of the syntax
Comparator<String> c = (s1, s2) -> s1.compareToIgnoreCase(s2);
FileFilter java = f -> f.getName().endsWith(".java");
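Any interface with a single abstract method can be used this way, including your own. Here's a minimal sketch (the TemperatureConverter interface and its names are made up for illustration):
public interface TemperatureConverter {
    double convert(double celsius);
}

// A lambda supplies the implementation of the single method
TemperatureConverter toFahrenheit = c -> c * 9.0 / 5.0 + 32.0;
double f = toFahrenheit.convert(21.0); // 69.8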

There's a lot of interesting things going on here, like how lexical scopes work, variable binding, type inference etcetera. Check out project lead Brian Goetz's report on how these things work.

Java Collection libraries 

So how will this be used in the collection libraries, which are probably the most used toolkit of the Java platform? First of all, the Collections team has decided not to rewrite the collections libraries from scratch. While mentioning that this might be a candidate for future JDK versions, version 8 will instead provide an evolutionary step forward by adding extension methods to existing interfaces (List, Set, Iterable) and retrofitting existing classes with new interfaces such as Stream.

A major shift will be from the imperative style of external iteration to the more functional style of internal iteration. For example, the recommended idiom in Java 5+ to change a property on all objects in a collection uses the enhanced for loop
for (Car c : cars) {
    c.setState(MOVING);
}
With closures you'll write this as
cars.forEach(c -> { c.setState(MOVING); });
What are the benefits of this idiom? You move the control flow into the library instead of the client code. In this way the library can potentially use laziness, parallelism and out-of-order execution to improve performance, which will be shown in later examples.

You can pipeline operations. In this example the filter operation uses a predicate to decide which objects in the collection to pipe to the final forEach clause.
cars.filter(c -> c.getWeight() > 2000)
      .forEach(c -> { c.setState(STOPPED); });
To store results from computations, use for example the map operation, which operates on each value in the piped collection
Set<Engine> smallEngines = cars.filter(c -> c.getMaxSpeed() < 100)
                .map(c -> c.getEngine())
                .into(new HashSet<>());
Or sum them up. This is the well-known functional idiom of map-reduce.
int sum = cars.filter(c -> c.getState() == MOVING)
                .map(c -> c.getWeight())
                .sum();

Note that these operations will not create temporary collections to pass on to the next operation. Instead they operate lazily and stream values between the control blocks. This implies good performance when, for example, searching for the first object that satisfies some condition. The upstream iterator in this example will not continue the iteration once getFirst() has found a match.
Car fastCar = cars.filter(c -> c.getSpeed() > 120).getFirst();
Once you get used to these constructs, a lot of boilerplate code can be removed. Here Brian Goetz shows an example of a method in java.lang.Class as it looks today
 for (Method m : enclosingInfo.getEnclosingClass().getDeclaredMethods()) {
     if (m.getName().equals(enclosingInfo.getName()) ) {
         Class<?>[] candidateParamClasses = m.getParameterTypes();
         if (candidateParamClasses.length == parameterClasses.length) {
             boolean matches = true;
             for(int i = 0; i < candidateParamClasses.length; i++) {
                 if (!candidateParamClasses[i].equals(parameterClasses[i])) {
                     matches = false;
                     break;
                 }
             }

             if (matches) { // finally, check return type
                 if (m.getReturnType().equals(returnType) )
                     return m;
             }
         }
     }
 }

 throw new InternalError("Enclosing method not found");

and how it could be rewritten without all the temporary variables, making it both more readable and less error prone.
Method matching =
  Arrays.asList(enclosingInfo.getEnclosingClass().getDeclaredMethods())
    .filter(m -> Objects.equals(m.getName(), enclosingInfo.getName()))
    .filter(m ->  Arrays.equals(m.getParameterTypes(), parameterClasses))
    .filter(m -> Objects.equals(m.getReturnType(), returnType))
    .getFirst();
if (matching == null)
    throw new InternalError("Enclosing method not found");
return matching;

There are a lot more cool features on the project site, but before ending you should see how easy parallel computation can become. By routing the pipeline through parallel(), the library will try to divide the stream of operations across all your cores.
int sum = cars.parallel()
                .filter(c -> c.getState() == MOVING)
                .map(c -> c.getWeight())
                .sum();
Via the new interface Splittable you can also very easily use the Fork/Join framework for divide-and-conquer tasks.
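For comparison, this is roughly what the same kind of summation looks like as a hand-rolled divide-and-conquer task with the plain Java 7 Fork/Join framework rather than the new Splittable integration (a sketch; Car, getWeight() and MOVING are assumed from the earlier examples):
import java.util.List;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

class WeightSum extends RecursiveTask<Integer> {
    private final List<Car> cars;

    WeightSum(List<Car> cars) { this.cars = cars; }

    @Override
    protected Integer compute() {
        if (cars.size() <= 100) {                 // small enough: sum sequentially
            int sum = 0;
            for (Car c : cars) {
                if (c.getState() == MOVING) {
                    sum += c.getWeight();
                }
            }
            return sum;
        }
        int mid = cars.size() / 2;                // otherwise split the work in two halves
        WeightSum left = new WeightSum(cars.subList(0, mid));
        WeightSum right = new WeightSum(cars.subList(mid, cars.size()));
        left.fork();                              // schedule the left half asynchronously
        return right.compute() + left.join();     // compute the right half, then join the left
    }
}

// Usage: int sum = new ForkJoinPool().invoke(new WeightSum(cars));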

Timeline

This looks awesome. But when can we use it? According to the milestone plan, the public review version should be available in January 2013. The JDK 8 timeline currently looks like this:

2012/7 Expert Group formation
2012/9 Early Draft Review
2013/1 Public Review
2013/6 Proposed Final Draft
2013/8 Final Release
http://openjdk.java.net/projects/jdk8/spec/

But you can download bleeding edge versions of the OpenJDK 8 binaries today and play around with the lambda language construct and the current implementations in the collections libraries, as well as a lot of the other libraries: http://jdk8.java.net/download.html

Alternative functional libraries

If you can't wait, there are libraries for functional programming in Java that work with JDK 5 or newer.

Guava

Google's Guava libraries have support for functional idioms. However, without language support the code easily gets cluttered with boilerplate. On the other hand, there is a lot of good stuff in Guava, like a richer set of collection constructs and easier use of immutable collection types. Here's an example that collects the lengths of all all-uppercase strings into a multiset:
Multiset<Integer> lengths = HashMultiset.create(
  FluentIterable.from(strings)
    .filter(new Predicate<String>() {
       public boolean apply(String string) {
         return CharMatcher.JAVA_UPPER_CASE.matchesAllOf(string);
       }
     })
    .transform(new Function<String, Integer>() {
       public Integer apply(String string) {
         return string.length();
       }
     }));

Lambdaj

Another interesting library is Lambdaj, which uses statically imported methods to give a nicer looking syntax. This is some typical Java code to sort a list of persons according to age
List<Person> sortedByAgePersons = new ArrayList<Person>(persons);
Collections.sort(sortedByAgePersons, new Comparator<Person>() {
        public int compare(Person p1, Person p2) {
           return Integer.valueOf(p1.getAge()).compareTo(p2.getAge());
        }
});
With Lambdaj, you could express this as
List<Person> sortedByAgePersons = sort(persons, on(Person.class).getAge());
Check out more features at http://code.google.com/p/lambdaj/wiki/LambdajFeatures.

FunctionalJava

FunctionalJava solves the syntax verbosity problem by using the Java 7 BGGA proposal syntax. This adds closures as part of a language dialect. However, it requires a pass with a precompiler to render compilable Java code.
This is an example of code that adds 42 to each element in the array.


final Array<Integer> a = array(1, 2, 3);
final Array<Integer> b = a.map({int i => i + 42});
arrayShow(intShow).println(b); // {43,44,45}

Noteworthy is that this solution also relies heavily on static imports. Check out more examples at http://functionaljava.org/examples/1.5/


My thoughts

In the end though, due to the lack of anonymous functions in Java today, the best choice for programming in a functional way is probably to stick to Scala, Clojure, Groovy or another of the JVM languages with inherent support for this style until Java 8 arrives. With what we have in Java today you can still use many of the functional concepts, like preferring immutable data, avoiding side effects and more.

The above mentioned alternatives are just a few of what's out there. But common among them is that you must either rely on preprocessing some functional dialect of Java to produce compilable Java code, or use a verbose syntax like Guava's. In my opinion, the effect these tradeoffs have on readability, maintainability and possibly portability just isn't worth it.

By the way, Brian Goetz appears 20 minutes into the JavaOne 2012 technical keynote to show off some Java 8 collections examples. http://medianetwork.oracle.com/video/player/1871712019001


Monday 3 September 2012

Debugging Knockout.js

Knockout.js is really powerful. A consequence of being awesome is unfortunately that its magic makes it hard to debug when it doesn't behave as you expect.

Here's a trick that might come in handy if a data binding does not behave properly. You can print the variables of the view model that is in the current scope by inserting this debugging tag in your HTML source.
<pre data-bind="text: JSON.stringify(ko.toJS($data), null, 3)"></pre>
This yields output like the following in the rendered page
{
   "myViewModelVar1": "someValue",
   "myViewModelVar2": "someOtherValue"
}
You know any other handy tricks?

Sunday 19 August 2012

Learning Scala - Photo collage creator

I finally took the time this summer to read a book on Scala. I bought Programming in Scala by Martin Odersky, the father of the language, which I think was a good choice. Regardless of whether I'll write a lot of Scala programs in the future, I learnt some new general programming techniques and got a well needed recap of the programming language fundamentals from school. After reading it and applying it to a couple of hobby projects I must say that I feel excited.

My first project to play around with the language is a photo collage creator where you supply the program a motif and a set of images to create a collage from. The algorithm tries to puzzle the images together to create a collage that best fits the motif.

The motif to create the collage from.
The motif is divided into segments and the image catalogue is searched for the best fitting images to puzzle together.

The final collage in low-res.
When printing the collage in high resolution, for example 16384 x 10922 pixels, the effect becomes quite cool as you move from viewing the collage at a distance to a near closeup.

Let me just show you an arbitrary Scala function from this program that demonstrates a few of the things I like compared to my daily workhorse Java.

  /**
   * Calculate the average brightness of a portion of an image.
   * 
   * @param img Image to analyse for average brightness.
   * @param startx Start x coordinate of image subset to analyze 
   * @param starty Start y coordinate of image subset to analyze
   * @param stopx Stop x coordinate of image subset to analyze
   * @param stopy Stop y coordinate of image subset to analyze
   * @return Average brightness of the subset of the image
   */
  def brightness(img: BufferedImage, startx: Int, starty: Int, stopx: Int, stopy: Int): Int = {

    @tailrec
    def estimateBrightness(x: Int, y: Int, maxx: Int, maxy: Int, aggr: Int): Int = {
      if (y == maxy)
        aggr
      else if (x == maxx)
        estimateBrightness(startx, y + 1, maxx, maxy, aggr)
      else
        estimateBrightness(x + 1, y, maxx, maxy, aggr + rgb2gray(img.getRGB(x, y)))
    }
   
    /*
     * Average the brightness of the number of evaluated pixels with 
     * the aggregate of their calculated values.
     */
    val aggregatedBrightness = estimateBrightness(startx, starty, stopx, stopy, 0)
    aggregatedBrightness / ((stopx - startx) * (stopy - starty))
  }



As you can see, Scala is statically typed, but the compiler tries to infer as much as possible. You can create a constant variable with the val keyword, and in this example the compiler figures out that the variable aggregatedBrightness must be of type Int (or a subclass) since it is the result of the function estimateBrightness(). You save yourself a lot of boilerplate declarations.

But what about the function estimateBrightness? It is declared inside the scope of the function brightness(). In Scala a function is on par with plain old objects and can be referenced via variables, passed as arguments to functions and, as a consequence, declared inside other functions. Why wouldn't it always be so?

Everything has a value: even a for loop or an if clause results in something that can be assigned to a variable or passed to a statement. This makes for concise and beautiful code.

Scala is basically a functional language, but with all the imperative concepts around to make it easy for imperative people like me to transition to a more functional style at a tempo that suits me. In this example I made my calculations in a functional style using recursion instead of loop constructs. I tried to write the program without any for or while loops at all, but my conclusion is that just because it is nice to make everything recursive and functional, it is not inherently more readable and understandable. I'll stick to my imperative guns when I need them for some time more.

An interesting annotation is @tailrec on the local function declaration. It forces the compiler to verify that this recursive function will be tail-call optimized, meaning that you can be sure that it will not create a new stack frame for each invocation in the recursive loop. Otherwise you would run out of stack after some 10 000 invocations, depending on your JVM startup flags.
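For comparison, the same calculation in my daily workhorse Java would be a plain nested loop - something like this sketch, assuming a java.awt.image.BufferedImage and the same rgb2gray helper as above:
// Average brightness of the rectangle [startx, stopx) x [starty, stopy)
static int brightness(BufferedImage img, int startx, int starty, int stopx, int stopy) {
    int aggr = 0;
    for (int y = starty; y < stopy; y++) {
        for (int x = startx; x < stopx; x++) {
            aggr += rgb2gray(img.getRGB(x, y));
        }
    }
    return aggr / ((stopx - startx) * (stopy - starty));
}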

To be able to write efficient and understandable functional programs, my impression is that the demands on the programmer are higher than when programming in plain old Java/C/C++. A challenge I'm gladly willing to continue with.

Instead of me trying to convince you that Scala is a great contribution to the Java VM family, I strongly recommend you read the book. You'll definitely become a better C# or Java programmer afterwards as well.



If you want to play around with the photo collage creator program and generate some collages of your own, clone the code from GitHub.

https://github.com/johannoren/PhotoCollage

I have used Eclipse with the Scala plugin, which makes the program run without any hassle. To configure it without any code changes, create a directory photos in the module and put one image in that directory as your motif, named motive.jpg. Put all the images that will be part of the puzzle in a subdirectory called inputphotos. Run PhotoCollage and monitor standard out until the program is finished. Run time depends mainly on the number of images in the inputphotos directory.

Saturday 4 August 2012

Raspberry Pi - how to get ssh and Tomcat running

Finally got my Raspberry Pi! The cheap $25/$35 board with a 700 MHz ARM CPU, GPU, 256 MB RAM, dual USB, ethernet and a bunch of general purpose IO pins. It looks awesome in its bare metal, and firing it up is no problem. I flashed an SD card with the Raspbian "wheezy" Linux distribution. To write the image to the SD card I used Win32DiskImager from a PC with an SD card slot.

After attaching a USB keyboard, network cable and HDMI, it comes to life with a micro USB charger as power supply. The LEDs flash and even X runs quite smoothly on this limited hardware.

However, after playing around I soon realised I would be much more comfortable working remotely from my ordinary desktop machine. So how to enable ssh?
A lot of tutorials talk about something as simple as this to enable the ssh daemon on boot.
sudo mv /boot/boot_enable_ssh.rc /boot/boot.rc

Sorry, I have no such files in my /boot directory. Furthermore, when trying to start the ssh daemon with
/etc/init.d/ssh start
it refuses to start. Clues in the startup log are

Could not load host key: /etc/ssh/ssh_host_rsa_key
Could not load host key: /etc/ssh/ssh_host_dsa_key

Why these are corrupt I don't know, but it's easy to regenerate them.

ssh-keygen -t rsa -f /etc/ssh/ssh_host_rsa_key
ssh-keygen -t dsa -f /etc/ssh/ssh_host_dsa_key

To make the ssh daemon start on every boot:

sudo update-rc.d ssh defaults

sudo reboot and everything seems ok.

Now, my mission for tonight was to get a web server running on the chipset. Here's what you need to do to make Tomcat 7 run.

Tomcat obviously needs Java:
sudo apt-get install openjdk-6-jdk

And you obviously need a potent editor
sudo apt-get install vim

Clean up your installs to save some space on the SD card
sudo apt-get clean

Now, download, unpack and install Tomcat
wget http://mirrors.axint.net/apache/tomcat/tomcat-7/v7.0.28/bin/apache-tomcat-7.0.28.tar.gz
tar xzf apache-tomcat-7.0.28.tar.gz
cd apache-tomcat-7.0.28/conf
vim tomcat-users.xml

Add a user to the authorization file, directly below <tomcat-users> add
<user username="system" password="raspberry" roles="manager-gui"/>

Now start Tomcat
cd ../bin
sudo sh startup.sh

Nice! From your PC (or via a browser on the Pi), browse to the Tomcat console.
http://192.168.1.90:8080/

(Figure out the IP address with, for example, ifconfig.)

It takes a short while for the server to warm up, but then you can log in via the Manager App. Now it's business as usual. Upload a war archive and you have a nifty web server running your web application, for $35!
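If you don't have a war archive lying around, even a minimal servlet is enough for a smoke test. A sketch using the standard Servlet 3.0 API that Tomcat 7 supports (class name and URL pattern are made up; package it as a war with your build tool of choice):
import java.io.IOException;

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/hello")
public class HelloPiServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Plain text response just to prove the Pi is serving requests
        resp.setContentType("text/plain");
        resp.getWriter().println("Hello from the Raspberry Pi!");
    }
}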

Saturday 24 March 2012

Inspect traffic from your iPhone

A quick guide to inspecting iPhone/iOS network traffic.

There are more and more reports of malware on our phones these days. What are the apps on your iPhone actually sending and receiving?


  1. Download a web proxy. I use Fiddler from http://www.fiddler2.com/ on my PC, but there is for example Paros, written in Java, if you want to run on all platforms. This tutorial however uses Fiddler.
  2. Install Fiddler and fire it up. Go to Tools -> Fiddler Options and the Connections tab. Select "Allow remote computers to connect".
  3. Also note the port number 8888, or change it to something that suits you.
  4. Restart Fiddler.
  5. Now start a command prompt and run ipconfig to find your IP address. Or on a Mac/Linux machine: ifconfig
  6. On your iOS device, go to your Wi-Fi settings and scroll down to the proxy settings. Choose manual settings and type in the proxy computer's IP address and port.
  7. Fire away!
Here's an example of stock information sent by the Stocks app on my iPhone. I've chosen the XML view in the response inspector to get a pretty format.





Compression

Many sites compress their HTTP responses, which Fiddler has support for. In the Inspector view in Fiddler, use the raw format tab. I almost always use it anyway, but if the response is gzipped there will be a hint at the top of the window letting you unzip it on the fly.

HTTPS

Another obstacle in monitoring traffic can be that the client app and the server communicate over SSL. You won't notice that in the protocol column, since Fiddler tells you it's plain HTTP, but in the Host column you'll see it says "Tunnel to". There is a way to get around at least some of the SSL problems by enabling "Decrypt HTTPS traffic" in the HTTPS tab in Fiddler Options.

What this really means is that Fiddler will act as a man in the middle and generate SSL server certificates on the fly, mimicking the real server. Obviously, your iPhone will not trust the root certificate Fiddler has used to create the fake certificates, so you will be prompted with "Unsecure certificate, possible attacker..." etcetera if you for example browse to https://www.google.com. Some apps/sites won't even work if they don't trust the certificate.

In Fiddler you can export the Fiddler root certificate to a cer file, and you could import that to your iPhone to trust it. It would end up under Options -> Profile as a trusted certificate. But I wouldn't recommend adding unknown certificates as trusted unless you know what you're doing.

Monday 2 January 2012

Create Spring REST service for Google App Engine in 15 minutes

Here's how to set up a REST service deployed in the Google App Engine cloud in 15 minutes. The use case in this example is a highscore backend service for my Android game Othello Legends.

Requirements:
We want to create a REST interface for these resources representing a highscore service.


GET http://myapp.appspot.com/api/highscores/
Fetch all applications backed by the highscore service since we want to reuse this for multiple games.

GET http://myapp.appspot.com/api/highscores/myappname
Fetch a sorted list of highscores for a particular application myappname.

POST http://myapp.appspot.com/api/highscores/myappname
Post a potential new highscore to the service. If it makes it to the highscore list it will be saved in the database. The data will be sent as query parameters.


Ingredients of the solution:
Google App Engine runs Java and Python. This example will use the Java infrastructure.
So what we'll do is create a standard Java J2EE web application built for deployment in App Engine, backed by a simple DAO to abstract the Google BigTable datastore. By using Spring REST together with Jackson we can communicate with JSON in a RESTful manner with minimum effort.

Sounds complicated? Not at all, here's how you do it!

Prerequisites: Eclipse with the Google Plugin for Eclipse and the App Engine SDK installed.
REST Implementation:

So to create an App Engine web app, click the New Web Application Project icon. Deselect Google Web Toolkit if you don't intend to use it.

Now, we're going to use Spring REST for the REST heavy weight lifting. Download Spring Framework 3 or later from http://www.springsource.org/download. While at it, download the Jackson JSON library from http://jackson.codehaus.org/. Put the downloaded jars in the /war/WEB-INF/lib/ folder and add them to the classpath of your web application.

Now, to bootstrap Spring to handle your incoming servlet requests you should edit the web.xml file of your web application found in war/WEB-INF/.




<servlet>
   <servlet-name>api</servlet-name>
   <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
   <load-on-startup>1</load-on-startup>
</servlet>

<servlet-mapping>
   <servlet-name>api</servlet-name>
   <url-pattern>/api/*</url-pattern>
</servlet-mapping>

<welcome-file-list>
   <welcome-file>index.html</welcome-file>
</welcome-file-list>



That will put Spring in charge of everything coming in under the path /api/*. Spring must know which packages to scan for Spring-annotated classes. We add a Spring configuration file for this, and also some Spring/Jackson config specifying how to convert our Java POJOs to JSON. Put this stuff in a file called api-servlet.xml in war/WEB-INF.
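Something along these lines should do the trick - a minimal sketch assuming Spring 3.x with the Jackson jars on the classpath, where the ContentNegotiatingViewResolver renders JSON as the default view (adjust the bean classes if your setup differs):
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-3.0.xsd">

  <!-- Scan for @Controller/@Service classes - adjust to your own package -->
  <context:component-scan base-package="se.noren.othello" />

  <!-- Let annotated controller methods read/write JSON with Jackson -->
  <bean class="org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter">
    <property name="messageConverters">
      <list>
        <bean class="org.springframework.http.converter.json.MappingJacksonHttpMessageConverter" />
      </list>
    </property>
  </bean>

  <!-- Render ModelAndView results as JSON by default -->
  <bean class="org.springframework.web.servlet.view.ContentNegotiatingViewResolver">
    <property name="defaultViews">
      <list>
        <bean class="org.springframework.web.servlet.view.json.MappingJacksonJsonView" />
      </list>
    </property>
  </bean>
</beans>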
Without going into detail, this config pretty much tells Spring to convert POJOs to JSON by default using Jackson for servlet responses. If you're not interested in the details, just grab it, but you must adjust the <context:component-scan base-package="se.noren.othello" /> to match your package names.

Now to the fun part: mapping Java code to the REST resources we want to expose. We need a controller class with annotations describing how our Java methods map to the exposed HTTP URIs. Create something similar to this:

import java.util.Date;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.validation.BindingResult;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.servlet.ModelAndView;

/**
 * Controller for Legends app family highscore services.
 */
@Controller
@RequestMapping("/highscores")
public class LegendsHighScoreController {
 private static final long serialVersionUID = 1L;

 @Autowired
 HighScoreService highScoresService;

 /**
  * @return Fetch all registered applications in the highscore database.
  */
 @RequestMapping(value = "/", method = RequestMethod.GET)
 public ModelAndView getAllApplications() {
  List<String> allApplications = highScoresService.getAllApplications();
  return new ModelAndView("highScoresView", BindingResult.MODEL_KEY_PREFIX + "applications", allApplications);
 }
 
 /**
  * Fetch all highscores for a particular application.
  * @param application Name of application
  * @return
  */
 @RequestMapping(value = "/{application}", method = RequestMethod.GET)
 public ModelAndView getAllHighScores(@PathVariable String application) {
  List<HighScore> allHighScores = highScoresService.getAllHighScores(application);
  return new ModelAndView("highScoresView", BindingResult.MODEL_KEY_PREFIX + "scores", allHighScores);
 }
 
 /**
  * Add a new highscore to the database if it makes it to the high score list.
  * @param application Name of application
  * @param owner Owner of the highscore
  * @param score Score as whole number
  * @param level Level of player reaching score.
  * @return The created score.
  */
 @RequestMapping(value = "/{application}", method = RequestMethod.POST)
 public ModelAndView addHighScores(@PathVariable String application,
                             @RequestParam String owner,
                             @RequestParam long score,
                             @RequestParam long level
                             ) {
  
  HighScore highScore = new HighScore(owner, score, application, new Date().getTime(), level);
  highScoresService.addHighScores(highScore);
  return new ModelAndView("highScoresView", BindingResult.MODEL_KEY_PREFIX + "scores", highScore);
 }
}


So what's the deal with all the annotations? They're pretty self-explanatory once you start matching the Java methods to the three HTTP REST URIs we wanted to create, but in short:

  • @Controller - The usual Spring annotation to tell Spring that this is a controller class that should be managed by the Spring container. All RESTful stuff is contained within this class.
  • @RequestMapping("/highscores") - This means that this controller class should accept REST calls under the path /highscores. Since we deployed the servlet under the servlet mapping /api in web.xml, all REST resources reside under http://host.com/api/highscores
  • @Autowired HighScoreService highScoresService - Our backing service class to do real business logic. Agnostic that we're using a RESTful front.
  • @RequestMapping(value = "/{application}", method = RequestMethod.GET) public ModelAndView getAllHighScores(@PathVariable String application) - A method annotated like this creates a REST resource /api/highscores/dynamicAppName, where the value of dynamicAppName is passed in via the path variable application. The request method specifies that this Java method will be called when the URI is requested via HTTP GET. All ordinary HTTP verbs are supported.
  • @RequestParam String owner - If you wish to pass query parameters like myvar1=foo&myvar2=bar you can use the request param annotation.
  • The Java class returned in the ModelAndView response will be automatically marshalled to JSON by Jackson, following the structure of the Java POJO (see the sketch below).
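For reference, the HighScore POJO itself could look something like this - a sketch matching the constructor used in the controller above (and in the DAO further down); Jackson marshals it via its getters:
public class HighScore {
    private final String owner;
    private final long score;
    private final String application;
    private final long date;
    private final long level;

    public HighScore(String owner, long score, String application, long date, long level) {
        this.owner = owner;
        this.score = score;
        this.application = application;
        this.date = date;
        this.level = level;
    }

    public String getOwner()       { return owner; }
    public long getScore()         { return score; }
    public String getApplication() { return application; }
    public long getDate()          { return date; }
    public long getLevel()         { return level; }
}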
Database
Google App Engine uses Google BigTable behind the scenes to store data. You can abstract this by using the standard JPA annotations on your POJOs. The similar JDO standard can be used as well; I've used JDO in previous projects and it works very well. For this simple server application we will however use the low-level datastore query API directly against the document database. Here's the code for the method that fetches all highscores for a particular Legends application. The datastore can filter and sort via API methods on the query.


import java.util.ArrayList;
import java.util.List;

import org.springframework.stereotype.Service;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.PreparedQuery;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.api.datastore.Query.FilterOperator;
import com.google.appengine.api.datastore.Query.SortDirection;

@Service
public class HighScoreServiceImpl implements HighScoreService {

 @Override
 public List<HighScore> getAllHighScores(String application) {
  ArrayList<HighScore> list = new ArrayList<HighScore>();
  DatastoreService datastore = DatastoreServiceFactory
    .getDatastoreService();

  // The Query interface assembles a query
  Query q = new Query("HighScore");
  q.addFilter("application", Query.FilterOperator.EQUAL, application);
  q.addFilter("score", FilterOperator.GREATER_THAN_OR_EQUAL, 0);
  q.addSort("score", SortDirection.DESCENDING);

  // PreparedQuery contains the methods for fetching query results
  // from the datastore
  PreparedQuery pq = datastore.prepare(q);

  for (Entity result : pq.asIterable()) {
   String owner = (String) result.getProperty("owner");
   Long date = (Long) result.getProperty("date");
   Long score = (Long) result.getProperty("score");
   Long level = (Long) result.getProperty("level");
   list.add(new HighScore(owner, score, application, date, level));
  }

  return list;
 }
}
That's pretty much it. Run the project locally by right-clicking it and choosing Run As -> Web Application. Once you are ready to go live, create a cloud application by going to https://appengine.google.com/ and choosing Create new application

Now in Eclipse, right click on your project and choose Google -> Deploy to Google App Engine.
You will be asked to supply the application id you created in the App Engine administration interface. Wait a few seconds and the application will be deployed in the cloud.
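Under the hood, the application id ends up in war/WEB-INF/appengine-web.xml. A minimal version looks something like this (myapp being a placeholder for your own id):
<?xml version="1.0" encoding="utf-8"?>
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <!-- Must match the application id created at appengine.google.com -->
  <application>myapp</application>
  <version>1</version>
</appengine-web-app>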