
Spock Quick-Tip: Grails Integration Tests

I'm currently working on a project with a nearly six-year-old Grails application. Over time, we moved from plain JUnit 4/GroovyTestCase unit and integration tests to Spock tests. Spock is a great library for writing both unit and integration tests. Normally we tend to write unit tests whenever possible, but there are cases (and code parts) where it is more practical to rely on a fully initialized application context and (Hibernate) data-store.

Integration Tests with Spock

The Spock Grails plugin comes with a distinct base class for Grails integration tests: IntegrationSpec.

IntegrationSpec initialises the application context, sets up an autowirer that autowires bean properties from the specification class, and creates a transactional boundary around every feature method in the specification.

All our Spock integration tests extend IntegrationSpec.

Mocking with Groovy's meta-classes

One thing I love about Spock is its great out-of-the-box support for mocking and stubbing. But there are times when you need to stub certain parts of the Grails artifact that is currently under test by your integration test.
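For illustration, here is a minimal, made-up unit-test-style example of Spock's built-in mocking; RegistrationService and MailService are hypothetical names and not part of the application discussed here:

class RegistrationServiceSpec extends Specification {

    void "sends a confirmation mail after registering a user"() {

        setup:
            def mailService = Mock(MailService)
            def registrationService = new RegistrationService(mailService: mailService)

        when:
            registrationService.register('erika@example.org')

        then:
            1 * mailService.sendConfirmation('erika@example.org')
    }
}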

We do that kind of stubbing with the help of the Groovy Meta-Object Protocol (MOP), that is, by altering the underlying meta-class. The next example shows how getCurrentUser is overridden, as we want to stub out the Spring Security part of the StatisticsService.

class StatisticsServiceIntegrationTest extends IntegrationSpec {

    StatisticsService statisticsService

    void "count login to private area"() {

        setup:
            def user = new User(statistics: new UserStatistics())
            statisticsService.metaClass.getCurrentUser = { -> user }

        when:
            statisticsService.countLoginPA()

        then:
            user.statistics.logins == 1

    }    
}

Altering classes at runtime is a nice feature, but it can also become confusing if you aren't aware of the side-effects it may cause. In integration tests, changes to the meta-class won't be reset automatically, so once you alter a meta-class (we are working with per-instance meta-class changes here; the same is even more true for global meta-class changes), those changes persist through the entire test run.

To solve that, we added a helper method that allows us to revoke meta-class changes between test runs:

public static void revokeMetaClassChanges(Class type, def instance = null)  {
    GroovySystem.metaClassRegistry.removeMetaClass(type)
    if (instance != null)  {
        instance.metaClass = null
    }
}

And applied it like this:

class StatisticsServiceIntegrationTest extends IntegrationSpec {

    StatisticsService statisticsService

    void "count login to private area"() {

        setup:
            def user = new User(statistics: new UserStatistics())
            statisticsService.metaClass.getCurrentUser = { -> user }

        when:
            statisticsService.countLoginPA()

        then:
            user.statistics.logins == 1

        cleanup:
            revokeMetaClassChanges(StatisticsService, statisticsService)

    }    
}

This resets the meta-class, so the service class is un-altered again when the next feature method is executed.

Be warned.

Meta-class overriding can become tricky. One thing we came across multiple times is that you can't replace a super-class method when it is called from another method of that same super class. Here is a simplified example:

class A {
    def a() {
        a2()
    }
    def a2() {
        println 'In class A'
    }
}

class B extends A{
    def b(){
        a()
    }
}

B b = new B();

b.metaClass.a2 = {
    println 'In class B'
}

b.b(); // still prints 'In class A'

If we wanted to stub the implementation of a2 from our test code, this wouldn't work: a and a2 are both implemented in class A, so the internal method call won't be intercepted by a per-instance change to instance b. This might seem obvious now, but we had a hard time tracking it down.

If you start to experience weird issues where tests fail when you run the entire test suite but are green when executed separately, it almost certainly has to do with meta-class rewriting that isn't undone between feature methods or even specifications. Just be aware of that.

@ConfineMetaClassChanges

Lately I became aware that our revokeMetaClassChanges is actually already "part" of Spock, in the form of the @ConfineMetaClassChanges extension annotation.

The code behind it works a bit differently, but the effect is the same; it can be applied to methods or classes to roll back meta-class changes declaratively:

@ConfineMetaClassChanges([StatisticsService])
class StatisticsServiceIntegrationTest extends IntegrationSpec {

    StatisticsService statisticsService

    void "count login to private area"() {

        setup:
            def user = new User(statistics: new UserStatistics())
            statisticsService.metaClass.getCurrentUser = { -> user }

        when:
            statisticsService.countLoginPA()

        then:
            user.statistics.logins == 1

    }    
}

Speaking of Spock extensions: it's definitely worth having a look at the chapter on Spock extensions in the documentation. There is lots of great stuff already available (and more coming in Spock 1.0).

Conclusion

Besides Spock's great mocking and stubbing capabilities, writing Grails integration tests often also involves meta-class changes. This article showed how to roll back these changes to avoid side-effects and explained the usage of @ConfineMetaClassChanges, a Spock extension annotation.

Grails - Tracking Principals

We use the Grails auto-timestamp feature in nearly all of our domain classes. It allows the definition of two special domain class properties, dateCreated and lastUpdated, and automatically sets the creation and modification date whenever a domain object is inserted or updated.
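As a small, made-up example, a domain class only has to declare the two properties to get this behaviour:

class Comment {

    String text

    Date dateCreated   // set by GORM when the object is inserted
    Date lastUpdated   // set by GORM whenever the object is updated
}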

In addition to dateCreated and lastUpdated, we wanted a way to define two further properties, userCreated and userUpdated, to store the principal who created, updated or deleted a domain object (deletion matters because we have audit log tables that track all table changes; when an entry is deleted and the principal was set beforehand, we can see who deleted it).

PersistenceEventListener

Grails provides the concept of GORM events, so we thought its implementation might be a good hint on how to implement our requirement for having userCreated and userUpdated. And indeed, we found DomainEventListener, a descendant class of AbstractPersistenceEventListener. It turns out that DomainEventListener is responsible for executing the GORM event hooks on domain object inserts, updates and deletes.

The event listener is registered with the application context, as the PersistenceEventListener interface (which is implemented by AbstractPersistenceEventListener) extends Spring's ApplicationListener and therefore actually uses the Spring event system.

In order to create a custom persistence listener, we just have to extend AbstractPersistenceEventListener and listen for the GORM events which are useful to us. Here is the implementation we ended up with:

@Log4j
class PrincipalPersistenceListener extends AbstractPersistenceEventListener {

    public static final String PROPERTY_PRINCIPAL_UPDATED = 'userUpdated'
    public static final String PROPERTY_PRINCIPAL_CREATED = 'userCreated'

    SpringSecurityService springSecurityService

    PrincipalPersistenceListener(Datastore datastore) {
        super(datastore)
    }

    @Override
    protected void onPersistenceEvent(AbstractPersistenceEvent event) {

        def entityObject = event.entityObject

        if (hasPrincipalProperty(entityObject)) {
            switch (event.eventType) {
                case EventType.PreInsert:
                    setPrincipalProperties(entityObject, true)
                    break

                case EventType.Validation:
                    setPrincipalProperties(entityObject, entityObject.id == null)
                    break

                case EventType.PreUpdate:
                    setPrincipalProperties(entityObject, false)
                    break

                case EventType.PreDelete:
                    setPrincipalProperties(entityObject, false)
                    break
            }
        }
    }

    protected boolean hasPrincipalProperty(def entityObject) {
        return entityObject.metaClass.hasProperty(entityObject, PROPERTY_PRINCIPAL_UPDATED) || entityObject.metaClass.hasProperty(entityObject, PROPERTY_PRINCIPAL_CREATED)
    }

    protected void setPrincipalProperties(def entityObject, boolean insert)  {
        def currentUser = springSecurityService.currentUser

        if (currentUser instanceof User) {
            def propertyUpdated = entityObject.metaClass.getMetaProperty(PROPERTY_PRINCIPAL_UPDATED)
            if (propertyUpdated != null)  {
                propertyUpdated.setProperty(entityObject, currentUser.uuid)
            }

            if (insert)  {
                def propertyCreated = entityObject.metaClass.getMetaProperty(PROPERTY_PRINCIPAL_CREATED)
                if (propertyCreated != null)  {
                    propertyCreated.setProperty(entityObject, currentUser.uuid)
                }
            }
        }
    }

    @Override
    boolean supportsEventType(Class<? extends ApplicationEvent> eventType) {
        return eventType.isAssignableFrom(PreInsertEvent) ||
                eventType.isAssignableFrom(PreUpdateEvent) ||
                eventType.isAssignableFrom(PreDeleteEvent) ||
                eventType.isAssignableFrom(ValidationEvent)
    }
}

As you can see in the code above, the implementation intercepts the PreInsert, Validation, PreUpdate and PreDelete events. If any of these event types is fired, the code checks the affected domain object for the existence of either the userCreated or userUpdated property. If one is available, it uses the springSecurityService to access the currently logged-in principal and stores its uuid property, as this is the unique identifier of our users in this application.

To register the PrincipalPersistenceListener and attach it to a Grails datastore, we need to add the following code to BootStrap.groovy:

def ctx = grailsApplication.mainContext
ctx.eventTriggeringInterceptor.datastores.each { key, datastore ->

    def listener = new PrincipalPersistenceListener(datastore)
    listener.springSecurityService = springSecurityService

    ctx.addApplicationListener(listener)
}

To make this work, the springSecurityService needs to be injected into BootStrap.groovy; the same is true for grailsApplication.
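In BootStrap.groovy that boils down to two conventional property declarations next to the init closure; here is a sketch:

class BootStrap {

    def grailsApplication
    def springSecurityService

    def init = { servletContext ->
        // the listener registration code shown above goes here
    }
}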

But that's all we have to do to support our new domain class properties userCreated and userUpdated. The last step is to add both properties to the domain class(es) we want to track, as in the sketch below.
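Picking up the made-up Comment class from before (and assuming the user's uuid is a String), a tracked domain class then looks like this:

class Comment {

    String text

    Date dateCreated
    Date lastUpdated

    String userCreated   // set by the PrincipalPersistenceListener on insert
    String userUpdated   // set by the PrincipalPersistenceListener on insert, update and delete
}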

Conclusion

Grails integrates with Spring's event mechanism and provides the AbstractPersistenceEventListener base class to listen for certain GORM events. Grails uses this mechanism internally, for example for the GORM event hooks, but it can of course be used by application logic too. This article showed how to introduce support for userCreated and userUpdated, which are similar to dateCreated and lastUpdated but store the principal who created, updated or deleted a domain object.

Google I/O App Insights

A while ago I came across the Google I/O app in one of the latest Android Developers blog posts. I thought it would be interesting to have a look at some of its internals, to gain some insight into how Android applications are developed at Google and which third-party libraries are actually used there.

The source code for the Google I/O app is available on GitHub.

build.gradle

My journey through the source code began with the settings.gradle file. It contains the information about the project's Gradle modules. This app consists of two modules: one for the wearable version and the other for the Android version.

This article will not cover the implementation of the wearable version; I will write a separate blog post for that.

I have to say that the Android version was of particular interest to me, so I went on to the Android module's build.gradle dependencies section, which holds all the external dependencies needed by the implementation:

dependencies {
    wearApp project(':Wearable')

    compile 'com.google.android.gms:play-services:5+' 
    compile 'com.android.support:support-v13:20.+'
    compile 'com.android.support:support-v4:20.+'
    compile 'com.google.android.apps.dashclock:dashclock-api:+'
    compile 'com.google.code.gson:gson:2.+'
    compile('com.google.api-client:google-api-client:1.+') {
        exclude group: 'xpp3', module: 'shared'
        exclude group: 'org.apache.httpcomponents', module: 'httpclient'
        exclude group: 'junit', module: 'junit'
        exclude group: 'com.google.android', module: 'android'
    }
    compile 'com.google.api-client:google-api-client-android:1.17.+'
    compile 'com.google.apis:google-api-services-plus:+'
    compile 'com.github.japgolly.android:svg-android:2.0.6'
    compile fileTree(dir: 'libs', include: '*.jar')
    compile files('../third_party/glide/library/libs/glide-3.2.0a.jar')
    compile files('../third_party/basic-http-client/libs/basic-http-client-android-0.88.jar')

    compile('com.google.maps.android:android-maps-utils:0.3+') {
        exclude group: "com.google.android.gms"
    }

    compile 'com.google.http-client:google-http-client-gson:+'
    compile 'com.google.apis:google-api-services-drive:+'
}

As you can see in the code snippet above, several external dependencies have been included.

Dashclock API

Let's start with the first dependency that gained my attention:

compile 'com.google.android.apps.dashclock:dashclock-api:+'

The Android DashClock project provides an alternative lock screen clock widget that can show additional status items. Showing additional information on the lock screen is done by implementing so-called DashClockExtension descendant classes, as described in the DashClockExtension documentation. Although this API looked pretty interesting, I couldn't find any use of it in the Google I/O application, and removing it from the dependencies also worked, so I guess its use might have been planned but never implemented.

GSON

Next up is Gson, Google's JSON library:

compile 'com.google.code.gson:gson:2.+'

The Google I/O app's main purpose is to give an overview of all the scheduled talks at Google I/O and to allow the user to give feedback about visited sessions. Gson is used to parse the JSON that comes from Google's web services and contains the entire conference data.

One particular piece of code that shows Gson usage is the ConferenceDataHandler. This handler is responsible for parsing most of the JSON data that holds information about the scheduled conference sessions, speakers, etc. Instead of parsing the JSON content directly into an object tree, it registers a "handler" for every JSON property in a map:

mHandlerForKey.put(DATA_KEY_ROOMS, mRoomsHandler = new RoomsHandler(mContext));
mHandlerForKey.put(DATA_KEY_BLOCKS, mBlocksHandler = new BlocksHandler(mContext));
mHandlerForKey.put(DATA_KEY_TAGS, mTagsHandler = new TagsHandler(mContext));
mHandlerForKey.put(DATA_KEY_SPEAKERS, mSpeakersHandler = new SpeakersHandler(mContext));
mHandlerForKey.put(DATA_KEY_SESSIONS, mSessionsHandler = new SessionsHandler(mContext));
mHandlerForKey.put(DATA_KEY_SEARCH_SUGGESTIONS, mSearchSuggestHandler = new SearchSuggestHandler(mContext));
mHandlerForKey.put(DATA_KEY_MAP, mMapPropertyHandler = new MapPropertyHandler(mContext));
mHandlerForKey.put(DATA_KEY_EXPERTS, mExpertsHandler = new ExpertsHandler(mContext));
mHandlerForKey.put(DATA_KEY_HASHTAGS, mHashtagsHandler = new HashtagsHandler(mContext));
mHandlerForKey.put(DATA_KEY_VIDEOS, mVideosHandler = new VideosHandler(mContext));
mHandlerForKey.put(DATA_KEY_PARTNERS, mPartnersHandler = new PartnersHandler(mContext));

With the registered handlers set up, it parses the JSON response body property by property in processDataBody:

private void processDataBody(String dataBody) throws IOException {
    JsonReader reader = new JsonReader(new StringReader(dataBody));
    JsonParser parser = new JsonParser();
    try {
        reader.setLenient(true); // To err is human

        // the whole file is a single JSON object
        reader.beginObject();

        while (reader.hasNext()) {
            // the key is "rooms", "speakers", "tracks", etc.
            String key = reader.nextName();
            if (mHandlerForKey.containsKey(key)) {
                // pass the value to the corresponding handler
                mHandlerForKey.get(key).process(parser.parse(reader));
            } else {
                LOGW(TAG, "Skipping unknown key in conference data json: " + key);
                reader.skipValue();
            }
        }
        reader.endObject();
    } finally {
        reader.close();
    }
}

When we have a look at one of the handler classes, say SessionsHandler, we see that it not only encapsulates the code for parsing the session JSON objects, but also code for building so-called "content provider operations". ContentProviderOperation is a class from the Android SDK that is used to build content provider actions such as inserting, updating or deleting entities stored by a content provider. The handler classes provide methods to create content provider operations based on the current state of an entity: if a session is new, needs to be updated or needs to be deleted, the handler's makeContentProviderOperations method will create the appropriate operation. Now let's have a look at how the actual JSON parsing is done in the SessionsHandler:

@Override
public void process(JsonElement element) {
    for (Session session : new Gson().fromJson(element, Session[].class)) {
        mSessions.put(session.id, session);
    }
}

The code is quite slick. It uses an array of Session model classes as the Gson target type, and Gson creates the instances and populates the available properties from the JSON values:

public class Session {
    public String id;
    public String url;
    public String description;
    public String title;
    public String[] tags;
    public String startTimestamp;
    public String youtubeUrl;
    public String[] speakers;
    public String endTimestamp;
    public String hashtag;
    public String subtype;
    public String room;
    public String captionsUrl;
    public String photoUrl;
    public boolean isLivestream;
    public String mainTag;
    public String color;
    public String relatedContent;
    public int groupingOrder;

    public String getImportHashCode() {
        StringBuilder sb = new StringBuilder();
        sb.append("id").append(id == null ? "" : id)
                .append("description").append(description == null ? "" : description)
                .append("title").append(title == null ? "" : title)
                .append("url").append(url == null ? "" : url)
                .append("startTimestamp").append(startTimestamp == null ? "" : startTimestamp)
                .append("endTimestamp").append(endTimestamp == null ? "" : endTimestamp)
                .append("youtubeUrl").append(youtubeUrl == null ? "" : youtubeUrl)
                .append("subtype").append(subtype == null ? "" : subtype)
                .append("room").append(room == null ? "" : room)
                .append("hashtag").append(hashtag == null ? "" : hashtag)
                .append("isLivestream").append(isLivestream ? "true" : "false")
                .append("mainTag").append(mainTag)
                .append("captionsUrl").append(captionsUrl)
                .append("photoUrl").append(photoUrl)
                .append("relatedContent").append(relatedContent)
                .append("color").append(color)
                .append("groupingOrder").append(groupingOrder);
        for (String tag : tags) {
            sb.append("tag").append(tag);
        }
        for (String speaker : speakers) {
            sb.append("speaker").append(speaker);
        }
        return HashUtils.computeWeakHash(sb.toString());
    }

    public String makeTagsList() {
        int i;
        if (tags.length == 0) return "";
        StringBuilder sb = new StringBuilder();
        sb.append(tags[0]);
        for (i = 1; i < tags.length; i++) {
            sb.append(",").append(tags[i]);
        }
        return sb.toString();
    }

    public boolean hasTag(String tag) {
        for (String myTag : tags) {
            if (myTag.equals(tag)) {
                return true;
            }
        }
        return false;
    }
}

What's interesting about this class (and the other model classes) is the getImportHashCode method. It is used to detect changes to already processed entities and plays a central role in the data sync logic implemented by the SyncAdapter.

google-api-client

Next up in our list of dependencies is the Google APIs Client Library and its Android extension. Both libraries are used in conjunction with the Google+ API from the next dependency

compile 'com.google.apis:google-api-services-plus:+'

to fetch the latest announcements via the AnnouncementsFetcher class. Once the announcements are fetched from the Google+ profile, they are stored by the content provider ScheduleProvider:

Plus plus = new Plus.Builder(httpTransport, jsonFactory, null)
        .setApplicationName(NetUtils.getUserAgent(mContext))
        .setGoogleClientRequestInitializer(
                new CommonGoogleClientRequestInitializer(Config.API_KEY))
        .build();

ActivityFeed activities;
try {
    activities = plus.activities().list(Config.ANNOUNCEMENTS_PLUS_ID, "public")
            .setMaxResults(100l)
            .execute();
    if (activities == null || activities.getItems() == null) {
        throw new IOException("Activities list was null.");
    }

} catch (IOException e) {
    LOGE(TAG, "Error fetching announcements", e);
    return batch;
}

// ...

StringBuilder sb = new StringBuilder();
for (Activity activity : activities.getItems()) {
    // ...

    // Insert announcement info
    batch.add(ContentProviderOperation
            .newInsert(ScheduleContract
                    .addCallerIsSyncAdapterParameter(Announcements.CONTENT_URI))
            .withValue(SyncColumns.UPDATED, System.currentTimeMillis())
            .withValue(Announcements.ANNOUNCEMENT_ID, activity.getId())
            .withValue(Announcements.ANNOUNCEMENT_DATE, activity.getUpdated().getValue())
            .withValue(Announcements.ANNOUNCEMENT_TITLE, activity.getTitle())
            .withValue(Announcements.ANNOUNCEMENT_ACTIVITY_JSON, activity.toPrettyString())
            .withValue(Announcements.ANNOUNCEMENT_URL, activity.getUrl())
            .build());
}

Again, the ContentProviderOperation builder methods are used to create the appropriate operations and return them to the caller.

Android SVG

Next up is a very interesting dependency: the Android SVG library:

compile 'com.github.japgolly.android:svg-android:2.0.6'

The svg-android project adds support for rendering Scalable Vector Graphics (SVG) files in an Android application. In the Google I/O application it is used to render the maps of the different floors of the Google I/O venue.

One place to have a look at SVG processing is, again, a handler class, the ConferenceDataHandler implementation:

private void processMapOverlayFiles(Collection<Tile> collection, boolean downloadAllowed) throws IOException, SVGParseException {
    boolean shouldClearCache = false;
    ArrayList<String> usedTiles = Lists.newArrayList();

    for (Tile tile : collection) {
        final String filename = tile.filename;
        final String url = tile.url;

        usedTiles.add(filename);

        if (!MapUtils.hasTile(mContext, filename)) {
            shouldClearCache = true;

            if (MapUtils.hasTileAsset(mContext, filename)) {

                MapUtils.copyTileAsset(mContext, filename);

            } else if (downloadAllowed && !TextUtils.isEmpty(url)) {
                try {
                    // download the file only if downloads are allowed and url is not empty
                    File tileFile = MapUtils.getTileFile(mContext, filename);
                    BasicHttpClient httpClient = new BasicHttpClient();
                    httpClient.setRequestLogger(mQuietLogger);
                    HttpResponse httpResponse = httpClient.get(url, null);
                    FileUtils.writeFile(httpResponse.getBody(), tileFile);

                    // ensure the file is valid SVG
                    InputStream is = new FileInputStream(tileFile);
                    SVG svg = new SVGBuilder().readFromInputStream(is).build();
                    is.close();
                } catch (IOException ex) {
                    LOGE(TAG, "FAILED downloading map overlay tile "+url+
                            ": " + ex.getMessage(), ex);
                } catch (SVGParseException ex) {
                    LOGE(TAG, "FAILED parsing map overlay tile "+url+
                            ": " + ex.getMessage(), ex);
                }
            } else {
                LOGD(TAG, "Skipping download of map overlay tile" +
                        " (since downloadsAllowed=false)");
            }
        }
    }

    if (shouldClearCache) {
        MapUtils.clearDiskCache(mContext);
    }

    MapUtils.removeUnusedTiles(mContext, usedTiles);
}

The code checks whether the SVG graphic is available in the APK's asset directory. If so, it copies the file to a custom directory. If not, it downloads the SVG and uses the svg-android library to validate that it is a valid SVG graphic.

The main place where the SVG graphics are later used is the MapFragment implementation. It uses a TileOverlay and registers multiple TileProvider implementations of type SVGTileProvider. The SVGTileProvider uses the previously shown SVGBuilder to draw the currently shown floor onto the map.

public SVGTileProvider(File file, float dpi) throws IOException {
    // ...

    SVG svg = new SVGBuilder().readFromInputStream(new FileInputStream(file)).build();
    mSvgPicture = svg.getPicture();

    // ...
}

// later on when drawing:

public byte[] getTileImageData(int x, int y, int zoom) {
    mStream.reset();

    Matrix matrix = new Matrix(mBaseMatrix);
    float scale = (float) (Math.pow(2, zoom) * mScale);
    matrix.postScale(scale, scale);
    matrix.postTranslate(-x * mDimension, -y * mDimension);

    mBitmap.eraseColor(Color.TRANSPARENT);
    Canvas c = new Canvas(mBitmap);
    c.setMatrix(matrix);

    // NOTE: Picture is not thread-safe.
    synchronized (mSvgPicture) {
        mSvgPicture.draw(c);
    }

    BufferedOutputStream stream = new BufferedOutputStream(mStream);
    mBitmap.compress(Bitmap.CompressFormat.PNG, 0, stream);
    try {
        stream.close();
    } catch (IOException e) {
        Log.e(TAG, "Error while closing tile byte stream.");
        e.printStackTrace();
    }
    return mStream.toByteArray();
}

As can be seen in the code above, the getTileImageData method applies some scaling and translation, but in the end it draws mSvgPicture onto a newly created Canvas and writes the result to the ByteArrayOutputStream. To improve performance when creating the tile graphics, the CachedTileProvider implementation uses a disk LRU cache to store results on disk.

I found it very refreshing to see the svg-android library in action. It's definitely an implementation option to keep in mind for future Android apps.

Glide

Another third party library in use is Glide:

compile files('../third_party/glide/library/libs/glide-3.2.0a.jar')

Glide is an image loading and caching library that comes with extensions for other commonly used libraries such as OkHttp and Volley. In the Google I/O application the Glide API is encapsulated in the ImageLoader class.

One interesting detail in this class is the VariableWidthImageLoader implementation:

// ...
private static final Pattern PATTERN = Pattern.compile("__w-((?:-?\\d+)+)__");
// ...

@Override
protected String getUrl(String model, int width, int height) {
    Matcher m = PATTERN.matcher(model);
    int bestBucket = 0;
    if (m.find()) {
        String[] found = m.group(1).split("-");
        for (String bucketStr : found) {
            bestBucket = Integer.parseInt(bucketStr);
            if (bestBucket >= width) {
                // the best bucket is the first immediately bigger than the requested width
                break;
            }
        }
        if (bestBucket > 0) {
            model = m.replaceFirst("w"+bestBucket);
            LOGD(TAG, "width="+width+", URL successfully replaced by "+model);
        }
    }
    return model;
}

The VariableWidthImageLoader is used by Glide to return a customized URL for a given width and height. The implementation above looks for a width indicator in the current URL (think of model as a URL to an image) that might look like __w-200-400-800__. If this indicator is present, it is replaced with w<desiredWidth>, where the chosen width is the first bucket that is at least as large as the requested width.

We used a similar pattern in our applications for image URLs (though with a width request parameter), but I wasn't aware that Glide provides such a nice API to inject this behaviour.
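Just to illustrate the width-request-parameter variant mentioned above, here is a hypothetical sketch (not code from the I/O app) of the same getUrl hook:

@Override
protected String getUrl(String model, int width, int height) {
    // append the requested width as a query parameter, e.g. .../image.jpg?width=320
    return Uri.parse(model).buildUpon()
            .appendQueryParameter("width", String.valueOf(width))
            .build()
            .toString();
}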

Basic HTTP Client

Of course, the Android basic-http-client library must not go unmentioned either. It is used to execute the actual HTTP requests, for example in the RemoteConferenceDataFetcher, which fetches the JSON content from Google's servers. In fact, it first fetches only a so-called manifest file and checks whether the data has changed based on that manifest. A detailed explanation of the actual synchronisation of the conference data can be found on the Android Developers blog.

Conclusion

This article had a look at some places in the Google I/O Android application and showed some of the third-party libraries in use. The application has been open-sourced on GitHub and is available under the Apache license.

REST Interfaces and Android

As I already mentioned in my last blog post, I recently discovered this very good collection of open source projects from Square Inc., a commercial payment service.

One of their released open source projects that raised my interest was Retrofit. Retrofit allows you to access REST web interfaces via type-safe Java types and annotations.

Configuring the RestAdapter

Before we can declare our interface and start making REST requests, we need to configure the so-called RestAdapter. It allows changing various aspects, from HTTP settings to adding custom converters for the content found in the HTTP responses of the accessed web service (in our case, the REST interface returns JSON).

I put this setup code into our Dagger application module:

@Provides
@Singleton
public AppConnector providesAppConnector() {

  OkHttpClient okHttpClient = new OkHttpClient();
  okHttpClient.setConnectTimeout(ApplicationConstants.HTTP_TIMEOUT, TimeUnit.MILLISECONDS);
  okHttpClient.setWriteTimeout(ApplicationConstants.HTTP_TIMEOUT, TimeUnit.MILLISECONDS);
  okHttpClient.setReadTimeout(ApplicationConstants.HTTP_TIMEOUT, TimeUnit.MILLISECONDS);

  RestAdapter restAdapter = new RestAdapter.Builder()
    .setEndpoint(application.getString(R.string.appEndPoint))
    .setLogLevel(RestAdapter.LogLevel.FULL)
    .setClient(new OkClient(okHttpClient))
    .setConverter(new GsonConverter(gson))
    .build();

    return restAdapter.create(AppConnector.class);
}

Usually the RestAdapter defaults to parsing JSON responses (utilizing Google's Gson library); in our case we needed to adapt the pre-defined Gson converter slightly, but don't let that distract you in this example. The endpoint is the base URL that is used for all REST requests. The HTTP client in use is OkHttpClient, another great project from Square and actually worth a blog post of its own. It supports falling back to multiple IPs if a single host isn't reachable for some reason, and also does HTTP response caching based on the given response headers, if so configured. That's it for our Retrofit configuration.

Declaring the Java REST interface

Once the RestAdapter is available, it can be used to instantiate proxy implementations of your Java REST interfaces. So let's first create a simple Java REST interface:

public interface AppConnector {

  @Headers("Cache-Control: max-age=14400")
  @GET("/connector/contents/app-id/{app-id}")
  Contents getContents(@Path("app-id") String appId);

}

The example above is taken from one of our production apps (with only a little change). With this interface, calling restAdapter.create(AppConnector.class) returns a REST client object (proxy) that implements AppConnector and does all the content parsing and conversion into Java objects for us. This works for plain Java types, collection types and custom Java classes used as return and/or parameter types.

The example above actually makes a synchronous request. In our application we use asynchronous requests, and going from synchronous to asynchronous requests only needs a little change:

public interface AppConnector {

  @Headers("Cache-Control: max-age=14400")
  @GET("/connector/contents/app-id/{app-id}")
  void getContents(@Path("app-id") String appId, Callback<Contents> callback);

}

For asynchronous requests, a second parameter called callback is introduced to our interface method. The Callback instance is invoked once the request has succeeded or failed. We can also go reactive: Retrofit integrates with RxJava and allows returning rx.Observable instances, which enables reactive programming in your Android code :-).
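As a sketch, the same endpoint declared with an rx.Observable return type (Contents being the model class used above) would look like this:

public interface AppConnector {

  @Headers("Cache-Control: max-age=14400")
  @GET("/connector/contents/app-id/{app-id}")
  Observable<Contents> getContents(@Path("app-id") String appId);

}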

The @Headers, @GET and @Path annotations are all pretty self-explanatory. As you can imagine, there are also @POST, @PUT, @DELETE and, for example, @Query, which can be used to add request parameters to the configured URL.
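To sketch how that looks, here is a hypothetical write endpoint and a query parameter added to the interface (the /feedback path and the Feedback class are made up):

public interface AppConnector {

  @GET("/connector/contents/app-id/{app-id}")
  Contents getContents(@Path("app-id") String appId, @Query("lang") String language);

  @POST("/connector/feedback/app-id/{app-id}")
  Response sendFeedback(@Path("app-id") String appId, @Body Feedback feedback);

}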

Conclusion

This article should serve as a small pointer to and introduction of Retrofit, a light-weight library for calling REST web services from Android code. Retrofit supports the creation of REST clients based on type-safe Java interfaces and types, and supports synchronous, asynchronous and reactive programming styles. More information on Retrofit can be found on its GitHub project page.

Android Dependency Injection

I recently came across a very good collection of open source projects maintained by the guys from Square, a commercial payment service.

One of their projects came into use in one of my current Android projects: Dagger, a light-weight dependency injection library that can be used in Java and Android projects. Dagger kind of introduced me to the world of dependency injection libraries for Android.

DI and Android

If you are like me, with a history (and present) in writing Java backend code, you will consider dependency injection an almost given pattern when you start writing applications. For Android applications, dependency injection is interesting not only for injecting things such as services or repositories, but also for injecting views.

Let's have a look at the more familiar pattern of injecting Java objects/beans like services into activities.

DI with Dagger

Dagger is a light-weight dependency injection library that comes with support for Java's JSR-330 annotations like @Inject or @Singleton. The object instances that need to be injected at runtime are created in a so-called module definition:

@Module
public class ApplicationModule {

    @Provides
    @Singleton
    public AppConnector providesAppConnector() {

        OkHttpClient okHttpClient = new OkHttpClient();
        okHttpClient.setConnectTimeout(ApplicationConstants.HTTP_TIMEOUT, TimeUnit.MILLISECONDS);
        okHttpClient.setWriteTimeout(ApplicationConstants.HTTP_TIMEOUT, TimeUnit.MILLISECONDS);
        okHttpClient.setReadTimeout(ApplicationConstants.HTTP_TIMEOUT, TimeUnit.MILLISECONDS);

        try {
            File cacheDir = new File(application.getCacheDir(), "http-cache");
            Cache cache = new Cache(cacheDir, 1024 * 1024 * 5l); // 5 MB HTTP Cache

            okHttpClient.setCache(cache);
        } catch (IOException e) {
            Log.e("ApplicationModule", "Could not create cache directory for HTTP client: " + e.getMessage(), e);
        }

        RestAdapter restAdapter = new RestAdapter.Builder()
                .setEndpoint(application.getString(R.string.appEndPoint))
                .setLogLevel(RestAdapter.LogLevel.FULL)
                .setClient(new OkClient(okHttpClient))
                .setConverter(new GsonConverter(gson))
                .build();

        return restAdapter.create(AppConnector.class);
    }
}

In the case above, the module definition provides a REST interface called AppConnector. To do so, it configures an OkHttpClient instance and the REST endpoint. At runtime, the ApplicationModule needs to be instantiated and the dependency object graph defined by the module needs to be bootstrapped. This can be done in a custom Application implementation:

public class Application extends android.app.Application {

    private ObjectGraph objectGraph;

    @Override
    public void onCreate() {
        super.onCreate();

        objectGraph = ObjectGraph.create(new ApplicationModule());
    }

    public ObjectGraph getObjectGraph() {
        return objectGraph;
    }
}

In order to actually trigger the dependency injection for an activity, ObjectGraph#inject() needs to be called. This is typically done inside a common activity base class:

public abstract class BaseActivity extends FragmentActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ((Application) getApplication()).getObjectGraph().inject(this);
    }
}

Once this initial setup is done, activities can be injected with the objects created in the module:

public class LoginActivity extends BaseActivity {

    @Inject
    AppConnector appConnector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        appConnector.doSomething(...);
    }
}

Of course, Dagger also comes with more advanced concepts and features. But with the features shown above, you can already start using dependency injection for services, repositories and other patterns.
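One such feature is constructor injection: a class whose constructor is annotated with @Inject can be obtained from the graph without a dedicated @Provides method, as long as its own dependencies are known to Dagger. A small, made-up example (SessionRepository is not from the project above):

public class SessionRepository {

    private final AppConnector appConnector;

    @Inject
    public SessionRepository(AppConnector appConnector) {
        this.appConnector = appConnector;
    }
}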

VDI with Butterknife

When developing for Android, glue code that assigns View instances to instance variables is frequently needed:

public class LoginActivity extends Activity {

    private EditText username;
    private EditText password;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setContentView(R.layout.login);

        username = (EditText) findViewById(R.id.username);
        password = (EditText) findViewById(R.id.password);

        // ...
    }

    // ...
}

View dependency injection is a way to get rid of the glue code that is needed to link instance variables with resources from the layout, and to speed up development. One library aimed at exactly this use case is Butterknife. It comes with annotations that can be used to inject views, but also with method annotations that are transformed into listener implementations at compile time.

But let's first have a look at view injection. It is actually pretty simple: the annotation to use is @InjectView, and it needs a single annotation parameter, the resource ID of the targeted view:

public class LoginActivity extends BaseActivity {

    @InjectView(R.id.username)
    EditText username;

    @InjectView(R.id.password)
    EditText password;

    // ...
}

No more code is needed for manually retrieving and casting the view instances in the onCreate activity callback. One little detail that is hidden here is the call to ButterKnife.inject(this), which actually causes Butterknife to inject all the annotated instance fields with the target views at runtime. I usually do the call to inject in my BaseActivity, which is shared across all activities in my project:

public abstract class BaseActivity extends FragmentActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ButterKnife.inject(this);
    }

    // ...
}

Among other nice Butterknife features, I really like the annotations for the various listeners, take @OnClick for example. This is a method-level annotation; the annotated methods are transformed/wrapped with an actual listener implementation at compile time:

public class LoginActivity extends Activity {

    @OnClick(R.id.btnLogin)
    public void login() {
        String user = username.getText().toString();
        // ...
    }
}

Or suppose the layout has a Spinner and we want to react when an item is selected:

@OnItemSelected(R.id.mobilePrefix)
public void mobilePrefixSelected(int position) {
    // ...
}

The method targeted by @OnItemSelected may declare any of the parameters found in the actual listener interface.
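For instance, the position-only variant above could also declare more of the parameters of AdapterView.OnItemSelectedListener (a sketch):

@OnItemSelected(R.id.mobilePrefix)
public void mobilePrefixSelected(AdapterView<?> parent, View view, int position, long id) {
    // ...
}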

Conclusion

I used Dagger and Butterknife in one of my latest Android projects and I won't go without them anymore. Especially Butterknife comes in very handy, as it allows getting rid of the glue code needed to obtain references to UI view elements.

It should be noted that there is also the AndroidAnnotations project, which comes with an even richer set of annotations to do basically everything with annotations. In my current projects our needs were completely satisfied by the Butterknife feature set, so I can't say anything about AndroidAnnotations in depth.