Feb 17 2014

Stream-based real-time architecture versus a Lambda architecture

This is less of an informative post than usual. I’ll repost a question I asked on Programmers Stack Exchange, hoping to trigger some interesting discussion.
I’m not an expert in real-time architectures; I’d just like to offer a couple of personal considerations and see what others suggest or point out.

Let’s imagine we’d like to design a real-time analytics system. Following Nathan Marz’s definition of the Lambda architecture, in order to serve the data we would need a batch processing layer (e.g. Hadoop) that continuously recomputes views from the dataset of all the data, and a so-called speed layer (e.g. Storm) that constantly processes only the most recent slice of the data (the events that arrived after the last full recomputation of the batch layer). You query the system by merging the results of the two together.
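To make the query-time merge concrete, here is a minimal sketch. The class and method names are mine, and real views would live in indexed stores rather than in-memory maps; this only illustrates how a batch view and a speed view combine at query time:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the Lambda architecture's query-time merge: the batch view
// covers everything up to the last full recomputation; the speed view
// covers only the events that arrived since. A query combines the two.
public class LambdaQuery {

    // Merge per-key counts from the batch view and the speed view.
    public static Map<String, Long> merge(Map<String, Long> batchView,
                                          Map<String, Long> speedView) {
        Map<String, Long> result = new HashMap<>(batchView);
        speedView.forEach((key, count) -> result.merge(key, count, Long::sum));
        return result;
    }

    public static void main(String[] args) {
        Map<String, Long> batch = Map.of("pageviews", 1_000_000L);
        Map<String, Long> speed = Map.of("pageviews", 42L);
        System.out.println(merge(batch, speed)); // {pageviews=1000042}
    }
}
```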

Oct 3 2013

Implement your 2-legged OAuth server with Java Restlet

In the previous post we looked into how to implement a simple Java client that accesses resources protected via 2-legged OAuth.
This time we’ll focus on the server-side code needed to develop a 2-legged OAuth server, so that your REST API’s resources can be protected with the same method.

In order to do so, we’ll use Restlet, a Java framework for creating RESTful web services, and again Scribe, the simple OAuth Java library, for the validation part. The code presented here has been tested and is ready to be used in your Restlet application.
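The core of the server side is small: recompute the signature and compare it with what the client sent. Here is a hedged, standard-library-only sketch of that check (the class name is mine; a real Restlet filter would also rebuild the signature base string from the incoming request and verify timestamp/nonce freshness):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

// Sketch of the validation at the heart of 2-legged OAuth: look up the
// consumer secret for the oauth_consumer_key the client sent, recompute
// the HMAC-SHA1 signature over the same base string, and compare it
// with the signature presented in the request.
public class SignatureValidator {

    public static boolean isValid(String baseString,
                                  String consumerSecret,
                                  String presentedSignature) throws Exception {
        // In OAuth 1.0 the signing key is "consumerSecret&tokenSecret";
        // in the 2-legged flow there is no token, so the token part is empty.
        SecretKeySpec key = new SecretKeySpec(
                (consumerSecret + "&").getBytes(StandardCharsets.UTF_8), "HmacSHA1");
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(key);
        byte[] expected = mac.doFinal(baseString.getBytes(StandardCharsets.UTF_8));
        byte[] presented = Base64.getDecoder().decode(presentedSignature);
        // Constant-time comparison, to avoid leaking information via timing.
        return MessageDigest.isEqual(expected, presented);
    }
}
```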

Aug 27 2013

2-legged OAuth Java client made easy

Some scenario

When you build up your RESTful web API, there comes a time when authorization comes into play. You need to provide means to restrict and control access to your protected resources.
One of the most popular approaches involves OAuth, an open authorization protocol (in its 1.0 version; a framework in its 2.0 counterpart) that puts simplicity and security among its main goals.
I will not spend time discussing the good and the bad of OAuth, or the large controversy around the development of version 2.0. I’d like instead to focus on how to secure your API without losing your mind in the potential complexities of OAuth.

A good and by now fairly well-recognized approach to this is the so-called 2-legged OAuth. For the details, I invite you to check the excellent article Designing a Secure REST (Web) API without OAuth. Don’t be confused by the title: the author describes the authorization methodology applied by Amazon for its AWS API, which is basically identical to 2-legged OAuth, the latter being only a bit more restrictive on some protocol aspects (e.g. the hash algorithm, the order of the parameters to be hashed).
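To give a flavor of what a library such as Scribe does for you on the client side, here is a hedged sketch of the signing step: build the signature base string (HTTP method, request URL, and the sorted parameters) and sign it with HMAC-SHA1 using the consumer secret. The class and method names are illustrative, and a real client must percent-encode exactly per the OAuth 1.0 spec (`URLEncoder` differs slightly, e.g. it encodes a space as `+`):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Illustrative client-side signer for a 2-legged OAuth request.
public class TwoLeggedSigner {

    public static String sign(String method, String url,
                              Map<String, String> params,
                              String consumerSecret) throws Exception {
        // Parameters are normalized in lexicographic order before signing.
        String normalized = new TreeMap<>(params).entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining("&"));
        String baseString = method.toUpperCase() + "&"
                + URLEncoder.encode(url, StandardCharsets.UTF_8) + "&"
                + URLEncoder.encode(normalized, StandardCharsets.UTF_8);
        Mac mac = Mac.getInstance("HmacSHA1");
        // No token secret in the 2-legged flow, hence the trailing "&".
        mac.init(new SecretKeySpec((consumerSecret + "&")
                .getBytes(StandardCharsets.UTF_8), "HmacSHA1"));
        return Base64.getEncoder().encodeToString(
                mac.doFinal(baseString.getBytes(StandardCharsets.UTF_8)));
    }
}
```

The server repeats exactly this computation with its copy of the consumer secret, which is what makes the scheme work without ever sending the secret over the wire.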

Mar 11 2013

Fast and easy integration tests for your Hibernate data layer with HSQLDB and DBUnit

Integration testing with Hibernate

I guess there is no need to introduce Hibernate (http://www.hibernate.org/), probably the most popular Java ORM framework. One main point of an ORM is that it decouples your data-layer logic from the underlying RDBMS, making it relatively easy to switch between different solutions. Moreover, the early adoption of a package like Hibernate helps maintain an Agile approach to your project, since schema changes are smooth to handle and your schema can be versioned, tagged and so on within the same versioning system used for your code.

A potentially big pain in developing your application’s data layer arises when it comes to integration testing. Individually, your classes (and DAOs) have been tested in full isolation, but there comes a time when you have to make sure your interaction with the real RDBMS works as expected; testing this part can be relatively slow, and additional issues must be taken into consideration.
Testing against a real RDBMS means that all your data manipulations are persisted. Trivial as this observation is, it means that without some proper treatment, running a test that modifies your data store (e.g. producing some insertions/updates/deletes) twice will likely lead to different results.
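The property that fixes this is deterministic re-seeding: before every test, wipe the tables and re-insert a known dataset (DBUnit calls this a CLEAN_INSERT). The following is not DBUnit itself, just a plain-Java illustration of that idea, with an in-memory list standing in for the HSQLDB tables:

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of the re-seed-before-each-test pattern: because the
// store is reset to a known state first, a test that mutates the data
// yields the same result no matter how many times it runs.
public class CleanInsertDemo {

    static final List<String> users = new ArrayList<>();

    // The analogue of DBUnit's CLEAN_INSERT against a real table.
    static void cleanInsert() {
        users.clear();
        users.add("alice");
        users.add("bob");
    }

    // A "test" that mutates the data store.
    static int testAddUser() {
        cleanInsert();       // re-seed before the test body
        users.add("carol");
        return users.size(); // always 3, however often it runs
    }

    public static void main(String[] args) {
        System.out.println(testAddUser()); // 3
        System.out.println(testAddUser()); // still 3
    }
}
```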

Jan 22 2013

Chef and automated service discovery

Chef — the short story

Chef is a cloud-oriented open-source integration framework. By describing in a platform-independent way how a specific component of your service architecture should be deployed (a cookbook), Chef takes care of automating the deployment and configuration of your infrastructure. More info about Chef can be found on the official website.

Services and interdependencies

A common scenario is working on an architecture composed of different services communicating with each other through APIs. Possibly, your system is fully replicated in two or more environments (acceptance, production, and so on) running on one or more cloud systems (AWS, Eucalyptus, etc.).
If that is the case, every service may need an endpoint specified in some configuration file for each of the services it needs access to.

Chef allows you to dynamically create the needed configuration files through a simple and powerful templating system. Simply put, a template describes how a configuration file (or whatever file your service needs to run) has to be structured, and the values to populate it can be injected from attributes that depend on your environment or on other conditions (for instance, the role given to the node).
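Chef itself renders ERB templates from node attributes; to show the injection step in the one language used throughout these posts, here is a plain-Java stand-in. The placeholder syntax mimics ERB, and the attribute names are invented for illustration:

```java
import java.util.Map;

// Miniature of the Chef templating idea: placeholders in a template are
// replaced with values that, in Chef, would come from node attributes
// (depending on the environment, or on the role given to the node).
public class ConfigTemplate {

    public static String render(String template, Map<String, String> attributes) {
        String result = template;
        for (Map.Entry<String, String> e : attributes.entrySet()) {
            result = result.replace("<%= " + e.getKey() + " %>", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        String template = "billing_endpoint=<%= billing_host %>:<%= billing_port %>";
        // In Chef these values would be node attributes, possibly found
        // by searching the environment for the node holding the billing role.
        Map<String, String> attrs = Map.of("billing_host", "10.0.0.12",
                                           "billing_port", "8080");
        System.out.println(render(template, attrs));
        // billing_endpoint=10.0.0.12:8080
    }
}
```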