The Agile Samurai: Mini book review

The Pragmatic Programmers have published quite a few books over the years; my bookshelf contains nearly a dozen of them. The latest addition is The Agile Samurai: How Agile Masters Deliver Great Software.

It’s essentially a high-level overview of agile practices, with a focus on the why and the how. Unlike some other books on Agile, this one stays pretty neutral when it comes to methodologies. XP, Scrum and Kanban are all briefly mentioned, but the author manages to boil Agile down to its essentials: common sense, goal orientation and a willingness to improve.

The chapter on Agile Planning is a particular treat. It could easily have been called “the idiot-proof guide to agile planning”, because really, it’s that good. Concepts like velocity and burndown are illustrated with pretty graphs. Not only does the book explain how to apply agile planning, but by the end of the chapter you’ll also know why it’s a good idea. The phrase “Why does reality keep messing with my Gantt chart!?!” sums it up pretty nicely.

I have just one problem with the book: the Samurai theme could’ve been explored a bit better. For starters, this here Samurai is wearing his swords on the wrong hip. Second, his name, Master Sensei, is a bit silly; Ō-sensei would’ve been much more appropriate. But in all seriousness, there are many similarities between software development and martial arts in general, mostly when it comes to drive and focus, a bit less so when it comes to actual sword wielding. Still, none of this detracts from the book, so all is well.

All in all, a pretty good book. If you’re an Agile Veteran, you won’t need it, but maybe your pesky manager or team leader could benefit from it …

Automating Maven Releases

Automating Maven releases should be pretty straightforward in non-interactive mode. A bug in the release plugin made it impossible in my situation: every time I provided the release version(s) as command line arguments, the release plugin would choke with the following error message:

Error parsing version, cannot determine next version: Unable to parse the version string

The following shell script works around the problem by redirecting input to the Maven execution.

Note: I’m releasing a project with a parent and two child modules, which is why I have to specify three versions (+ 1 SCM tag). If you’re not using multiple modules, or are using more, you’ll have to adjust the script accordingly.



# The heredoc feeds release:prepare the answers it would otherwise prompt for:
# one release version per module (three here), then the SCM tag. The version
# numbers and tag below are placeholders; substitute your own.
mvn \
    release:prepare -P production &>> /tmp/build.log << EOS
1.0.0
1.0.0
1.0.0
myproject-1.0.0
EOS

mvn release:perform -P production &>> /tmp/build.log

This is an abridged version of our full release script. The full version asks the user to enter the release version once, then releases several versions using different profiles and creates a distribution set with all versions and a bunch of documentation. This works in my situation, but if your release procedure is more complicated then you can just expand on the script :-).

Test design – Equivalence Classes

During a recent job interview, I was asked to write some code (I know, shocking!). The idea was that several test cases had been defined, and that I was to implement a relatively simple class that would make the tests pass. The problem itself was pretty simple, so I won’t bore you with it.

What was shocking, however, was how poorly designed the tests were. Boundary cases were largely untested, and someone seemed to have spent an inordinate amount of time writing useless tests. When I brought this up during the interview, the person who wrote them was surprised that they weren’t very good, because he had gotten nearly 100% code coverage on the implementation he created.

While code coverage is all fine and dandy, it doesn’t actually say anything about the quality of your tests. Maybe his implementation would’ve worked perfectly, even with strange values and edge-cases. Maybe not. We’ll never know.

Equivalence Partitioning is one of the simplest test-design techniques. As the name pretty much implies, the idea is to partition the possible input values into equivalence classes. Sounds like a bunch of gibberish? Let me illustrate with a classic example: liquor laws.

As you can tell from the image, if you’re under 16, you’re not allowed any alcoholic beverages. Once you turn 16, you’re allowed to have beer and other non-spirits. Once you turn 18, you hit the jackpot and can drink whatever tickles your fancy.

The red, yellow and green areas are the three equivalence classes for this problem. Whether you’re newborn, 5, 11 or 15, it doesn’t matter: you’re not getting a drink. And once you’re 18 or older, your exact age stops mattering entirely.

Once you have this information, you can design a couple of test cases, starting with one for each class. The exact age you pick for each test doesn’t matter, as long as it falls inside the class you’re testing.

So that’s three easy tests. Then it’s time to apply a bit of Boundary Value Analysis. After all, it’s so very easy to create off-by-one errors.

Boundaries are the points where equivalence classes meet; in this case, 16 and 18. When you look at the boundaries you’ve found, you’ll want to examine your specifications very carefully again. Someone’s just turned 16 on this very day. Does that mean they can have a drink, or not? Once you have the answer, create a test case, then do the same for every other boundary.

With five test cases, one for each boundary and one for each equivalence class, you’ll have tested this quite thoroughly. Additional test cases can be added for invalid input: what happens if you pass a person with a negative age? What if the age is a million?
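To make this concrete, here’s what those five tests (plus the invalid-input one) might look like in plain Java. The LiquorLaw class and its method names are my own invention for this sketch, and I’m assuming the birthday itself counts:

```java
public class LiquorLaw {
    static final int BEER_AGE = 16;    // non-spirits allowed from this age
    static final int SPIRITS_AGE = 18; // anything goes from this age

    static boolean canDrinkBeer(int age) {
        if (age < 0) throw new IllegalArgumentException("age can't be negative");
        return age >= BEER_AGE;
    }

    static boolean canDrinkSpirits(int age) {
        if (age < 0) throw new IllegalArgumentException("age can't be negative");
        return age >= SPIRITS_AGE;
    }

    private static void check(boolean condition, String test) {
        if (!condition) throw new AssertionError(test);
    }

    public static void main(String[] args) {
        // One test per equivalence class; the exact ages are arbitrary.
        check(!canDrinkBeer(11), "red: no alcohol at all");
        check(canDrinkBeer(17) && !canDrinkSpirits(17), "yellow: beer only");
        check(canDrinkSpirits(42), "green: anything goes");
        // One test per boundary.
        check(canDrinkBeer(16) && !canDrinkBeer(15), "boundary at 16");
        check(canDrinkSpirits(18) && !canDrinkSpirits(17), "boundary at 18");
        // And the invalid input.
        boolean thrown = false;
        try {
            canDrinkBeer(-1);
        } catch (IllegalArgumentException expected) {
            thrown = true;
        }
        check(thrown, "negative ages are rejected");
        System.out.println("all tests pass");
    }
}
```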

OutOfMemoryError while running Maven Surefire tests

Imagine you have a project which works perfectly fine and well. All tests pass, each and every time. Then one day you commit a couple of new classes with related tests. Of course you ran all tests before committing, and everything worked just fine. Then, a minute or so later, you get a mail from Hudson (or whatever you’re using for CI) saying that there are test failures. “Maybe I forgot a file”, I thought, and checked the test results on Hudson. About a dozen tests were failing, unrelated to anything I had touched. Odd. OutOfMemoryErrors all over the place. Most odd. Hudson’s Tomcat has 1 GB, which should be plenty, and the same goes for each build’s MAVEN_OPTS.

Apparently, someone who wrote the Maven Surefire Plugin thought that it would be a GREAT idea to ignore things like MAVEN_OPTS and other memory settings. The plugin seems to start a new JVM instance to run the tests. Without any of the arguments you so carefully selected. No. Apparently you have to explicitly tell the Surefire plugin that maybe, just maybe, it would be a good idea to use the memory settings you already provided elsewhere.

Anyhoo, explicitly passing the memory settings to the forked test JVM fixed it:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <!-- Pass the heap limit on to the forked test JVM -->
        <argLine>-Xmx1024m</argLine>
    </configuration>
</plugin>

DRY, you say? Not so much, eh.

Maven 3 resource filtering weirdness

Maven 3 is all nice and fast(er) and shiny, so I decided to upgrade a Maven 2 project to Maven 3. It (cl)aims to be backwards-compatible, so my consternation was pretty great when my build failed straight away. That is to say, my tests failed: for some reason, my resources were no longer being filtered. Yup, ${property.key} placeholders weren’t being replaced by values.

This struck me as being somewhat odd, because it worked fine with 2.2.1. A bit of debugging led me to the cause of the problem:

<!-- @Transactional can now be used as well -->

… apparently, the @ symbol acts as a filtering delimiter of sorts in Maven 3 (@some.property@ gets filtered just like ${some.property}), and a stray, unpaired @ trips up the filtering.

Considering that blurb on their website doesn’t even qualify as English, I’m not sure if this is a feature or a bug. But whatever. Removing that comment fixed the problem. Whoever came up with that bright idea (especially in an age where @annotations are as rampant as the black plague in the 14th century) probably deserves a spanking.
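If removing the offending comment isn’t an option, the resources plugin can be told which delimiters to use. The following sketch should, as far as I can tell, drop the default @…@ delimiter so that stray @ signs are left alone (plugin version omitted; use whatever your build already resolves):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-resources-plugin</artifactId>
    <configuration>
        <!-- Only treat ${...} as a filtering delimiter, not @...@ -->
        <useDefaultDelimiters>false</useDefaultDelimiters>
        <delimiters>
            <delimiter>${*}</delimiter>
        </delimiters>
    </configuration>
</plugin>
```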

Two Spring/Hibernate gotchas that got me

Earlier this week I got caught out by two Spring/Hibernate features. One of my service layer methods threw a checked exception after a bit of relatively complex validation, just as I intended. What happened next, however, was not expected: my test failed. Well, it didn’t fail, it died with an error and a horrible stack trace of doom. Hibernate was kind enough to tell me that a batch update had failed. This struck me as slightly odd, considering an exception was thrown long before any saving was meant to occur.

I had, apparently, forgotten about the rather unintuitive automagic dirty checking and subsequent saving. In my rather humble opinion, it’s a feature that should be disabled by default, but that’s not the point here. I had been made aware of the feature in the past, but had somehow completely forgotten about it. Fine, no worries, I can live with it, I’ll just roll back my transaction.

However! My test still shouldn’t have failed, or so I thought. After all, I had Spring transaction support set up for all my service calls, and surely an exception would cause the transaction to roll back? Wrong. By default, transactions are only rolled back for unchecked exceptions, not for checked ones. I can’t for the life of me fathom the reasoning behind this decision, but there you have it. It’s well documented in the Spring transaction documentation, but again I had completely forgotten about it.

A simple bit of config fixed the problem:

<tx:method name="*" rollback-for="java.lang.Throwable"/>

So please, don’t be like me. Don’t get caught out by these features! Unless you’re looking for a couple of hours of entertainment, that is :-).
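If you’re using annotation-driven transactions rather than XML configuration, the same gotcha applies. The equivalent fix is the rollbackFor attribute; a minimal sketch, with a made-up service method (only the annotation matters here):

```java
import org.springframework.transaction.annotation.Transactional;

public class MyService {
    // By default Spring only rolls back for unchecked exceptions;
    // rollbackFor widens that to checked exceptions as well.
    @Transactional(rollbackFor = Exception.class)
    public void saveThings() throws Exception {
        // validation that may throw a checked exception,
        // followed by the usual persistence calls
    }
}
```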

Howto: soapUI integration tests with Maven

Running soapUI tests with Maven is surprisingly easy; all it requires is a few simple steps. This howto will walk you through deploying your web project to an embedded container and running the soapUI tests in the integration-test phase.

Cargo configuration

With the cargo plugin you can deploy your project to just about any container. For the sake of simplicity I’ll be using an embedded Jetty 6 container.

<!-- Deploy the project WAR to a built-in container during the integration test phase -->
<plugin>
    <groupId>org.codehaus.cargo</groupId>
    <artifactId>cargo-maven2-plugin</artifactId>
    <configuration>
        <container>
            <containerId>jetty6x</containerId>
            <type>embedded</type>
        </container>
        <configuration>
            <properties>
                <cargo.servlet.port>${my.project.port}</cargo.servlet.port>
            </properties>
        </configuration>
        <wait>false</wait> <!-- We want to deploy, run tests and exit, not wait -->
    </configuration>
    <executions>
        <!-- Start the container in the pre-integration-test phase -->
        <execution>
            <id>start-container</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>start</goal>
            </goals>
        </execution>
        <!-- Stop the container after integration tests are done -->
        <execution>
            <id>stop-container</id>
            <phase>post-integration-test</phase>
            <goals>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
</plugin>

soapUI project configuration

If you haven’t already created a soapUI test suite, now’s the time to do so. Once this is done, copy the test suite to your test resources folder (src/test/resources) and set up your project to filter test resources:

<testResources>
    <testResource>
        <directory>src/test/resources</directory>
        <filtering>true</filtering>
    </testResource>
</testResources>

With that out of the way, you can now edit the soapUI project file with your favourite XML editor. What you want to do is replace all endpoint references (and possibly WSDL locations) with property keys. So <con:endpoint>http://localhost:8080/MyProject/endpoint</con:endpoint> becomes <con:endpoint>${my.project.endpoint}</con:endpoint>.
Your webapp will be deployed to http://localhost:${my.project.port}/${project.artifactId}-${project.version}, so I suggest basing the property value on that.

<project | profile>
    <properties>
        <my.project.port>8080</my.project.port>
        <!-- Adjust the trailing path to match your service’s endpoint -->
        <my.project.endpoint>http://localhost:${my.project.port}/${project.artifactId}-${project.version}/endpoint</my.project.endpoint>
    </properties>
</project | profile>

soapUI plugin configuration

First, add the Eviware soapUI Maven repository to your list of plugin repositories:

<pluginRepositories>
    <pluginRepository>
        <id>eviwareRepository</id>
        <url>http://www.eviware.com/repository/maven2/</url>
    </pluginRepository>
</pluginRepositories>

Then, add the plugin to your build and let maven know when you want to execute it. Considering the container is starting up before the integration test phase, and is shutting down afterwards, running the tests as integration tests seems like the best option ;-).

<!-- Run soapUI tests during the integration-test phase. -->
<plugin>
    <groupId>eviware</groupId>
    <artifactId>maven-soapui-plugin</artifactId>
    <configuration>
        <!-- The filtered copy of your soapUI project; adjust the file name -->
        <projectFile>${project.build.testOutputDirectory}/soapui-project.xml</projectFile>
    </configuration>
    <executions>
        <execution>
            <phase>integration-test</phase>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>

All done!

Now, when you run mvn verify (or install, or …), your soapUI tests will automagically be executed and you’ll be informed of any failures.

Maven sucks?

Kent Spillner certainly seems to think so. I disagree with the conclusion, but I find myself agreeing with a lot of the points he makes. It’s true that maven builds aren’t always consistent across different platforms or even maven versions. It’s also true that your pom.xml file can grow rather large and complicated if you want to get maven to do interesting things. And it’s definitely very true that maven dependency management can be hair-pullingly-complicated and is essentially broken. Sure.

But saying that writing your own build manager is better? Advocating ant and rake? Seriously? Maven does a lot more than just build your project. It does reporting, site generation, eclipse project generation and pretty much anything else you can think of. The convenience of the thing is worth a lot. It’s definitely worth having to struggle with the POM every once in a while. And to be honest, the POM syntax isn’t much more horrible than ant’s or rake’s.

As for platform/version inconsistencies: if you can’t get the people on your team to use the same software, then your problems probably run a lot deeper than build management. Software has bugs, and this includes Maven. If you’ve hit a particular bug that causes your build to go kaboom, then fixing that bug sounds like the way to go.

If you have the time to write a build manager for every project you work on, be my guest. I for one don’t, and maven has actually served me pretty well so far.

Howto: PostgreSQL data source in JDeveloper/OC4J

Today I had to create a Postgres connection pool in JDeveloper’s embedded OC4J container. JDeveloper being the horrible piece of software that it is, and its documentation being rather lacking, this took a lot longer than it should have. The pretty GUI wizards aren’t able to pull it off either; these measly conjurers really aren’t worthy of the name.

The biggest hurdle was that the Postgres connection pool isn’t happy with just a JDBC URL; instead it expects a hostname, port number and database name. These are all in the JDBC URL, but never mind, that would’ve been too simple. After reading through the XSD for data-sources.xml, I realised there’s an option to provide custom properties to the factory. Quite simple really. A connection pool definition looks something like this:

<connection-pool name="myPool" disable-server-connection-pooling="false">
    <!-- The factory class shown is the Postgres pooling data source;
         adjust it if you use a different driver class -->
    <connection-factory factory-class="org.postgresql.ds.PGConnectionPoolDataSource"
            user="postgres" password="1234">
        <property name="serverName" value="localhost" />
        <property name="portNumber" value="5432" />
        <property name="databaseName" value="db" />
    </connection-factory>
</connection-pool>
<managed-data-source name="dataSource" jndi-name="jdbc/postgresDS" connection-pool-name="myPool" />

Once this is done, all that’s left to do is place the Postgres JDBC driver JAR in the j2ee/home/applib folder of your JDeveloper installation. If you don’t place it there, you’ll be treated to some very nice class-not-found errors.

That’s it. Not very hard at all!