6 May 13

Functional Testing Using JBehave, Jetty and Maven

JBehave
JBehave is a Java-based functional testing framework which allows tests written in human-readable text to be executed automatically.
Functionality is defined at the user story level and broken down into a number of scenarios. Each scenario should capture a unique piece of functionality or behaviour.
The format of a JBehave story is given below:

Narrative: In order to ….. As a ….. I want to …..

Scenario: …..

Given …..
When …..
Then …..

The Narrative defines the functionality we are trying to test, in user story form.
Following the narrative are a number of scenarios.
The Given clause defines a set of pre-conditions that set up the context of the test.
The When clause defines one or more actions to be performed to trigger the functionality that we are testing.
The Then clause is then used to verify that the state is as expected following the action.
A concrete example of a “happy path” scenario is given below:

Narrative:
In order to buy a product
As a customer
I want to register an account

Scenario: register an account

Given no account exists with email tester@test.com
And no account exists with username tester
When I create an account with email tester@test.com, username tester and date of birth 11/12/1981
Then I receive a successful response
And the account is created in the database

Each line of this scenario can then be mapped to a Java “Step” using annotations:

@Given("no account exists with email $email")
public void checkEmailDoesNotExist(String email) {
    userRepository.deleteByEmail(email);
}

@When("I create an account with email $email, username $user and date of birth $dob")
public void createAccount(String email, String user, Date dob) {
    Account account = new Account(email, user, dob);
    // the response is kept in an instance field so that later @Then steps can inspect it
    response = restTemplate.postForEntity(url, account, Account.class);
}

@Then("I receive a successful response")
public void checkSuccessfulResponse() {
    // response was captured by the @When step above
    assertThat(response.getStatusCode(), is(HttpStatus.OK));
}
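
The remaining clause, “And the account is created in the database”, is mapped in the same way. A minimal sketch, assuming the userRepository exposes a findByEmail lookup (an assumed method name):

@Then("the account is created in the database")
public void checkAccountCreatedInDatabase() {
    // findByEmail is an assumed repository method; adapt to your data access layer
    assertThat(userRepository.findByEmail("tester@test.com"), is(notNullValue()));
}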

Tests can be run in a number of ways. The easiest is through JBehave’s JUnit integration, which allows annotation-driven configuration. It also integrates nicely with Spring. For example:

@RunWith(SpringAnnotatedEmbedderRunner.class)
@Configure()
@UsingSpring
public class RegistrationTests {

}
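
On its own this class does not tell JBehave where the steps or stories live. Typically the step classes are defined as beans in a Spring context referenced from @UsingSpring, and the test method tells the embedder which story files to run. A fuller sketch along the lines of JBehave’s annotated-embedder examples (the steps-context.xml file name is an assumption):

@RunWith(SpringAnnotatedEmbedderRunner.class)
@Configure()
@UsingEmbedder()
@UsingSpring(resources = { "classpath:steps-context.xml" }) // assumed context defining the step beans
public class RegistrationTests extends InjectableEmbedder {

    @Test
    public void run() {
        // StoryFinder and CodeLocations come from org.jbehave.core.io;
        // find all .story files on the classpath relative to this class and run them
        injectedEmbedder().runStoriesAsPaths(
            new StoryFinder().findPaths(
                CodeLocations.codeLocationFromClass(getClass()).getFile(),
                Arrays.asList("**/*.story"), Arrays.asList("")));
    }
}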

I recommend installing the JBehave Eclipse plugin, which provides syntax highlighting and links textual steps to their Java step code.

Behaviour Driven Development

Behaviour-driven development is a specialised version of test-driven development which focuses on the behavioural specification of software units.
The tests are defined first. JBehave will mark these tests “Pending” until the implementation is complete. Preferably, these tests will be defined collaboratively between the developers, testers and business, in which case they form the acceptance criteria for the user story.
The developer then implements the functionality required so that the tests pass. At this stage the developer may discover alternative scenarios that form further acceptance criteria. For example:

Scenario: I must use a unique email

Given an account already exists with email tester@test.com
And no account exists with username tester
When I create an account with email tester@test.com, username tester and date of birth 11/12/1981
Then I receive a bad request error with message “Email must be unique”

Scenario: My Date of Birth Must be in the past

Given no account exists with email tester@test.com
And no account exists with username tester
When I create an account with email tester@test.com, username tester and date of birth 11/12/2081
Then I receive a bad request error with message “Date of birth must be in the past”

Again, these stories should be the product of collaboration between developers, testers and project stakeholders. In this way the stories not only provide acceptance criteria, forming a functional contract regarding what functionality the application will provide, but also provide regularly validated “Living Documentation” specifying exactly what the application does at any point in time.

Integrating into the build process using Maven and Jetty

Your functional tests could be run against your application at various levels:

  • Directly against the code
  • Against a test container running the code (for example using Spring MVC integration tests)
  • Against a deployed application at the HTTP level (below the GUI)
  • Over the GUI (using a tool such as Selenium)

I prefer the third option, as I want to test as much as possible without the fragility that GUI testing often brings. When testing REST services this level is particularly appropriate.
A lightweight container such as Jetty provides a simple way to expose your application as part of the build lifecycle as follows:

pre-integration-test: Start Jetty containing deployed application
integration-test: Run JBehave tests using the JBehave Maven plugin
post-integration-test: Shutdown Jetty

This lifecycle can easily be set up in the plugin section of your POM:

<plugins>
  <plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>maven-jetty-plugin</artifactId>
    <configuration>
      <!-- stopKey and stopPort are needed for the stop goal to contact the running server -->
      <stopKey>foo</stopKey>
      <stopPort>9999</stopPort>
    </configuration>
    <executions>
      <execution>
        <id>start-jetty</id>
        <phase>pre-integration-test</phase>
        <goals>
          <goal>run</goal>
        </goals>
        <configuration>
          <daemon>true</daemon>
        </configuration>
      </execution>
      <execution>
        <id>stop-jetty</id>
        <phase>post-integration-test</phase>
        <goals>
          <goal>stop</goal>
        </goals>
      </execution>
    </executions>
  </plugin>

  <plugin>
    <groupId>org.jbehave</groupId>
    <artifactId>jbehave-maven-plugin</artifactId>
    <executions>
      <execution>
        <id>unpack-view-resources</id>
        <phase>process-resources</phase>
        <goals>
          <goal>unpack-view-resources</goal>
        </goals>
      </execution>
      <execution>
        <id>embeddable-stories</id>
        <phase>integration-test</phase>
        <goals>
          <goal>run-stories-with-annotated-embedder</goal>
        </goals>
        <configuration>
          <includes>
            <include>**/*Stories.java</include>
          </includes>
        </configuration>
      </execution>
    </executions>
  </plugin>
</plugins>

JBehave as part of a continuous delivery pipeline

When using continuous delivery it is often preferable to run your JBehave tests directly against the application deployed in a test environment; this way you are testing the configuration and deployment process in addition to the code.
A typical deployment pipeline will consist of the following steps:

  • Build binaries
  • Deploy to test environment
  • Run acceptance tests against test environment
  • Promote to UAT
  • ….

As the tests now run directly against an app server we don’t strictly need to run Jetty any more; however, it is often worth keeping it in to run a restricted set of smoke tests in the build phase to validate the build. JBehave supports meta tags so that you can mark a restricted set of tests as smoke tests:

Scenario: Simple smoke test

Meta: @theme smoke

Given…..

When you instruct JBehave to run the tests, you can tell it to run only the smoke tests, or to exclude them:

@UsingEmbedder(metaFilters = {"+theme smoke"})

or

@UsingEmbedder(metaFilters = {"-theme smoke"})
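
The same filters can also be applied when running stories through the Maven plugin; a sketch, assuming the plugin’s metaFilters configuration parameter:

<configuration>
  <metaFilters>
    <metaFilter>+theme smoke</metaFilter>
  </metaFilters>
</configuration>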

If you configure your acceptance tests so that you can specify endpoints and database connection details externally, you can point your acceptance tests at the test environments.
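
One lightweight way to achieve this is to resolve the base URL from a system property that the CI job overrides per environment. A minimal sketch (the property name acceptance.test.baseUrl and the local default are assumptions):

public class TestEnvironment {

    // -Dacceptance.test.baseUrl=http://test-env:8080 points the suite at a deployed environment
    private static final String BASE_URL =
            System.getProperty("acceptance.test.baseUrl", "http://localhost:8080");

    public static String url(String path) {
        return BASE_URL + path;
    }
}

Step classes then build request URLs with TestEnvironment.url("/accounts") rather than hard-coding hosts.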

20 Mar 13

Quickly creating a simple Spring MVC application

This post will cover the simple steps required to create a Maven-based web application using Spring. The end result will be a very basic framework that does virtually nothing; the intent is to use it as a base for other posts, which will describe how we can turn this simple framework into something more functional, such as a set of dynamically created web pages or REST services.

Prerequisites

  • Java is installed
  • Maven is installed and configured correctly

Step 1: Create a project from a Maven archetype

First create a simple Maven project using the quickstart archetype as follows:

mvn archetype:generate -DgroupId=com.agitech -DartifactId=sampleapp -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
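
If the command succeeds you should have roughly the following skeleton (the quickstart archetype also generates a sample App class and test):

sampleapp/pom.xml
sampleapp/src/main/java/com/agitech/App.java
sampleapp/src/test/java/com/agitech/AppTest.java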

Step 2: Update POM

Go into the sampleapp directory and open the pom.xml file. First update the packaging element to the following:

<packaging>war</packaging>

By default Maven uses Java 5; to update the app to use a different version of Java add the following elements:

<properties>
  <java.version>1.6</java.version>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>${java.version}</source>
        <target>${java.version}</target>
      </configuration>
    </plugin>
  </plugins>
</build>

Step 3: Add a webapp directory

Create the following directories under sampleapp/src/main:

mkdir resources
mkdir webapp

Under the webapp directory create the directory WEB-INF:

mkdir WEB-INF
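
The source tree should now look like this:

sampleapp/src/main/java
sampleapp/src/main/resources
sampleapp/src/main/webapp/WEB-INF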

Step 4: Adding Spring

Open the pom and update the properties element defined earlier as follows:

<properties>
  <java.version>1.6</java.version>
  <spring.version>3.2.1.RELEASE</spring.version>
</properties>

Now add the following Spring dependencies under the <dependencies> element:

<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-core</artifactId>
  <version>${spring.version}</version>
</dependency>
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-context</artifactId>
  <version>${spring.version}</version>
</dependency>
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-web</artifactId>
  <version>${spring.version}</version>
</dependency>
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-webmvc</artifactId>
  <version>${spring.version}</version>
</dependency>

Step 5: Add web.xml

Create the file web.xml under sampleapp/src/main/webapp/WEB-INF:

<?xml version="1.0" encoding="UTF-8"?>
<web-app version="2.5" xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">

  <servlet>
    <servlet-name>sample</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <load-on-startup>1</load-on-startup>
  </servlet>

  <servlet-mapping>
    <servlet-name>sample</servlet-name>
    <url-pattern>/*</url-pattern>
  </servlet-mapping>
</web-app>
 

The web.xml creates the DispatcherServlet used as the entry point to Spring MVC. By default the DispatcherServlet will attempt to read the Spring config from /WEB-INF/{servlet-name}-servlet.xml, in this case /WEB-INF/sample-servlet.xml. To use a different name/location, add an init-param named contextConfigLocation to the servlet definition:

<servlet>
  <servlet-name>sample</servlet-name>
  <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
  <init-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>classpath:spring/web-application-config.xml</param-value>
  </init-param>
  <load-on-startup>1</load-on-startup>
</servlet>

Step 6: Add spring XML

Create the file sample-servlet.xml under sampleapp/src/main/webapp/WEB-INF:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:mvc="http://www.springframework.org/schema/mvc"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.2.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-3.2.xsd
           http://www.springframework.org/schema/mvc
           http://www.springframework.org/schema/mvc/spring-mvc-3.2.xsd">

  <context:component-scan base-package="com.agitech.sample.controllers" />
  <mvc:annotation-driven />
</beans>

This config provides a very basic setup which scans the package com.agitech.sample.controllers for controllers and enables them to be configured using annotations. If you want to allow these controllers to forward onto JSP views add the following bean:

<bean id="viewResolver" class="org.springframework.web.servlet.view.UrlBasedViewResolver">
  <property name="viewClass" value="org.springframework.web.servlet.view.JstlView"/>
  <property name="prefix" value="/WEB-INF/jsp/"/>
  <property name="suffix" value=".jsp"/>
</bean>
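
To see the component scan and view resolver working together, a controller along these lines could be dropped into the scanned package (the class name, mapping and view name are all illustrative):

package com.agitech.sample.controllers;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
public class HelloController {

    // requests to /hello return the logical view name "hello",
    // which the resolver above maps to /WEB-INF/jsp/hello.jsp
    @RequestMapping("/hello")
    public String hello() {
        return "hello";
    }
}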

Step 7: Test with Jetty

We can add the Jetty plugin to start the webapp in a container on port 8080. Obviously we have not yet added any functionality so there will be nothing to actually test. Add the following to the <plugins> element in your pom:

<plugin>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>maven-jetty-plugin</artifactId>
  <version>6.1.10</version>
  <configuration>
    <scanIntervalSeconds>10</scanIntervalSeconds>
    <stopKey>foo</stopKey>
    <stopPort>9999</stopPort>
  </configuration>
</plugin>

Now from the top level directory run the command:

mvn clean install jetty:run
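
Once Jetty reports it has started you can sanity-check the servlet mapping with a quick request; assuming you have not yet added any controllers, a 404 from the DispatcherServlet is the expected response:

curl http://localhost:8080/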

Now that we have our basic setup we can focus on adding functionality in subsequent posts.

12 Feb 13

Continuous Delivery Part 2: Implementing a Deployment Pipeline with Jenkins

In this post we will build a very basic deployment pipeline using the Jenkins Continuous Integration Server and the Build Pipeline Jenkins plugin.

For an overview on deployment pipelines see my previous post Building deployment pipelines.

This basic pipeline will consist of a number of jobs:

  1. Build the artifact
  2. Acceptance testing
  3. Deploy to Staging
  4. Deploy to live

Some of the gates between jobs will be automatically triggered and some will be done manually using the Build Pipeline View. Please note the purpose of this post is to illustrate the process of creating a pipeline; the details of building artifacts, performing acceptance testing, automated deployment, etc. will be saved for future posts.

Prerequisites

  • Install Jenkins
  • Install the Build Pipeline plugin
  • Install the Groovy plugin
  • Install the Parameterized Trigger plugin

Build Job

The first job in your pipeline is usually responsible for setting the version number, building the artifact and deploying the artifact to an artifact repo. The build task may also include running unit tests, reporting code coverage and running code analysis.

The first task is to create a new job as below:

[Screenshot: create_build]

After clicking OK you are forwarded to the job configuration page. Add an “Execute system Groovy script” pre-build step to create a release number and add it as a parameter of the job. In this example we use the Jenkins build number. You may want to use the version control revision number instead (assuming you are not using a distributed version control system such as Git, which uses hashes rather than revision numbers).

This is achieved with the following script:

import hudson.model.ParametersAction
import hudson.model.StringParameterValue

// the build currently executing on this executor thread
def currentBuild = Thread.currentThread().executable
String release = "1.0." + currentBuild.number

// expose the release number to this job (and downstream jobs) as RELEASE_NO
currentBuild.addAction(new ParametersAction(new StringParameterValue("RELEASE_NO", release)))

To set the release version of the artifact add an “Invoke top-level Maven targets” pre-build step with the command

versions:set -DnewVersion=$RELEASE_NO

Finally we add our “clean deploy” Maven task.

[Screenshot: configure_build]

Acceptance Test Job

The next job is responsible for deploying the artifact from the artifact repo into a test environment and then running a suite of automated acceptance tests against the deployment.

The first task is to create the job.

[Screenshot: create_acceptance]

I will save the details of automated deployment and automated acceptance testing for a later post.

We will now add a trigger from the Build job to the Acceptance Test job. From within the Build job configuration add a “Trigger parameterized build on other projects” post-build action. In “Projects to build” specify “Acceptance Test”. Finally click on “Add Parameters” and select “Current build parameters” to pass our release number to the next job.

[Screenshot: auto_trigger]

Deploy to Staging Job

Create a new free-style project job named “Deploy to Staging”. This job will be responsible for running the automated deployment and smoke test scripts against the staging environment. Again, I will not cover these details in this post. This job should be a manually triggered gate in the pipeline. To do this open the Acceptance Test job configuration and create a new “Build Pipeline Plugin -> Manually Execute Downstream Project” post-build action.

[Screenshot: manual_trigger]

You can repeat the process with the “Deploy to Live” Job.

Create the Pipeline

The final task is to create the pipeline. From the Jenkins homepage create a new view by clicking on the “+” tab next to “All”. Give the view a name and specify a “Build Pipeline View”.

[Screenshot: create_pipeline]

Clicking OK will take you to the pipeline configuration page. Make sure you specify the initial build as the “Build” job we created earlier.

[Screenshot: configure_pipeline]

Once the job is created go to the pipeline view and click the “Run” icon. This should run the “Build” and “Acceptance Test” jobs. To manually trigger the subsequent stages click the trigger button in the bottom right corner of the respective blue box.

[Screenshot: pipeline]

6 Feb 13

Continuous Delivery Part 1: The Deployment Pipeline

Most of the ideas in this article come from the excellent book “Continuous Delivery: Reliable Software Releases Through Build, Test, and Deployment Automation” by Jez Humble and Dave Farley.

Continuous Delivery defines a set of Patterns to implement a rapid, reliable and stress-free process of Software delivery. This is achieved by following a number of principles:

  • Every Check-in Leads to a Potential Release
    • This is very different to the Maven Snapshot-Release process of delivering Software
  • Create a Repeatable, Reliable Process for Releasing Software
  • Automate almost Everything
  • Keep Everything in Version Control
    • This includes code, test scripts, configuration, etc
  • If It Hurts, Do It More Frequently, and Bring the Pain Forward
    • Use the same release process and scripts for each environment
  • Build Quality In
    • Continuous Integration, Automated Functional Testing, Automated Deployment
  • Done Means Released
  • Everyone is Responsible for the Delivery Process
    • DevOps – Encourage greater Collaboration between everyone involved in Software Delivery
  • Continuous Improvement
    • Refine and evolve your delivery platform

The central pattern for achieving the above is the Deployment Pipeline. This pipeline models the steps from committing a change through building, testing, promoting and releasing it. The first step usually builds the module and creates the project artifacts; these artifacts then pass along the pipeline, each step providing more confidence that the release will be successful. Gates between steps can be automated or manually triggered depending on the desired workflow. If all gates in the process are automated, this is known as Continuous Deployment.

Implementing Pipelines

Pipelines can be modelled within the Jenkins CI Server using the Build Pipeline plug-in. The plug-in supports both manual and automatic steps.

[Screenshot: pipeline]

In the example above, each build is assigned a unique release number of <app version>.<build number>. The artifacts are created as part of the build task and, if successful, deployed to an artifact repository. The next step is automatically triggered to deploy the artifacts onto a test environment; if this succeeds, a set of automated functional tests is run to validate that the application functions correctly. The “Deploy to UAT” step is manually triggered by the appropriate user so that User Acceptance Testing can begin. Finally, when UAT testing is complete, automated deployment to the live environment can be triggered. Permissions could be added so that only certain groups of users can trigger specific steps.

As an alternative, “Go” by ThoughtWorks provides a more integrated solution for creating deployment pipelines, but the community edition is limited to 10 users.

The following Jenkins plug-ins are required to set up a pipeline such as the one above:

  • Build Pipeline
  • Groovy Builder – Required to set parameters for manual downstream jobs
  • Parameterized Trigger – Required to trigger the next step with the same parameters

And the following plug-ins could be useful when building your pipeline:

  • HTML Publisher – Useful for publishing reports such as the Living Documentation produced by functional tests
  • Sonar – Integration with Sonar code analytics
  • Join plugin – Useful when pipeline steps have forked for concurrent processing
  • Performance plugin – Useful for running and reporting on load tests
  • Clone Workspace – Copies workspace to be used in another job

Build Process

When using the Maven release plugin we generally build snapshots continuously until we are happy to create a release, at which point we perform the release. This requires 3 builds, 2 POM transformations and 3 SCM revisions. Versions are usually hard-coded directly into the POM.

When following Continuous Delivery every CI build leads to a potential release, meaning there is no concept of snapshots and we must provide a unique version number for each build. This can be achieved either by using the Maven command versions:set or by following the process defined in the article

http://www.axelfontaine.com/2011/01/maven-releases-on-steroids-adios.html.
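
In a Jenkins job the first approach boils down to a single extra step before the build, using the build number Jenkins exposes (the 1.0 prefix is just an example base version):

mvn versions:set -DnewVersion=1.0.$BUILD_NUMBER
mvn clean deploy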

Configuration management

For Continuous Delivery and Automated Deployment to succeed it is important that each environment is as similar to live as possible. This way we can build up confidence that a release will be successful, as we have run the same process on this pipeline several times before. Using a declarative configuration management tool such as Puppet can ease this concern. By storing the Puppet manifests/modules in version control along with the artifact we can always match up releases with configuration and test them together.

Artifact management

The artifact should be built only once and then used throughout the pipeline. An artifact repository such as Nexus/Artifactory should take care of this. It is often preferable to set up a number of repos, one for each environment (e.g. test, UAT, staging, live). This way we can grant permission to promote an artifact for an environment to one set of users (e.g. to signify UAT is complete) and then a different set can initiate the release. We can also regularly clear out repositories used earlier in the pipeline (e.g. remove test artifacts older than 2 weeks, UAT artifacts older than 2 months).

Automated Functional Testing

Functional testing tools such as JBehave, FitNesse and Cucumber allow tests written in human-readable form to be automated and run against a deployment. Tests should be organised by user story and are often written in a Given-When-Then format. When following the process of Behaviour Driven Development, tests are written before the functionality is implemented, either directly before coding the change or earlier in the process. If the tests themselves form the user requirements as defined by the “Specification by Example” process then the functional tests actually become acceptance tests.

Once the tests are run they can produce Living Documentation defining exactly what the Application can and can’t do for a particular build.

Automated Deployment

I won’t go into Automated Deployment in too much detail, but this generally requires some kind of orchestration server (possibly a Jenkins slave). The responsibilities of this server may include updating Puppet configs, deploying multiple artifacts to multiple servers in order, running database migrations (Liquibase is often useful for this task), running smoke tests to validate the deployment, and performing automated rollback if something goes wrong. Extra consideration is needed if zero-downtime deployments are required; for example, data migrations will need to be forward/backward compatible.

Tools such as ControlTier, Rundeck and Capistrano can be used to help with this process.