Last updated : 09/11/2002; 12:44:39

Joe's Jelly

Joe Walnes rambling on and on about Software Development, Java and Extreme Programming.


joe@truemesh.com

:. 14 October 2002

  10:59:57 PM  

Ara is on another roll. N-way code synchronisation.

Many moons ago, TogetherSoft cracked the round-trip modelling/coding thing (strangely enough, with JavaDoc comments). The problem was that it was too tightly coupled to one particular tool and problem... it will be interesting to see how this unfolds when opened out further.


  10:54:38 PM  

Have you ever noticed how IT companies tend to have coherent naming schemes for their servers? It has always amused me. In the office all our machines are named after Muppets (Gonzo, Scooter, Beaker, Bunsen, Cookie, Grover etc), our servers are all named after Greek gods (Zeus, Bacchus - son of Zeus etc). At my university, the servers were all named after composers (Mozart, Liszt, Handel etc). What wacky naming schemes do you have? [rebelutionary]

The 'shrooms: mushroom, fungus, lichen, shiitake, button, closedcup.

The annoyances: bunion, herpes, ingrowntoenail, crabs, scabies, wart, theclap.

And remember kids, when trying to think of a new name, Google Sets is your friend.


  10:49:10 PM  

In response to Jon...

If you have a test-suite that takes 2 hours to run, what good is it? Especially if you're trying to do XP, you want to run your tests basically every time before doing a commit.

Before doing a commit? I run my full test-suite 2-5 times per minute! If it takes any longer than 5 seconds between writing some code and getting a test result (1 second to load Ant, 1 to compile, and 3 to run tests), I tend to stop practising test-first design and/or start working in bigger chunks than my brain heap can deal with (line at a time...).

Nowadays your test-suite is probably executing servlets, EJB session beans, Swing applications and so forth. This takes forever, and these are, strictly speaking, integration tests. But you don't want to maintain one set of unit-tests and one set of integration-tests.

Aside from acting as a design aid, unit-tests prove that all the parts of the engine are perfect. Just because they are doesn't mean the car can drive. Integration tests (well, acceptance tests) are needed as well to prove the system lives up to the functionality expected. Because you are testing different things (code versus functionality), you need to take a totally different approach to test design (does the spark plug ignite properly vs. can I do hand-brake turns).

Take a common test written in, for example, HtmlUnit: you basically pick up an HTML page generated by a servlet running in a servlet container. The servlets call some session beans that execute some entity beans. The entity beans are loaded to and from a database. So basically we've got a servlet container, a session bean container, an entity bean container and persistence manager, and a database. You're communicating with the servlet container over HTTP, you're doing some JNDI lookups, you're communicating with the database, you're running some transactions in JTA, and so on. This overhead is what's eating at least 90% of the time, probably more.

Yup. On top of that you have the problem of exponential test dependency. If you test all your layers using this stacked approach (i.e. to test layer D, I depend on C, B and A behaving in a particular way), your tests get more and more brittle. If you need to make a change to a lower layer, you can often break not only the tests for that layer but also the tests for the layers stacked on top of it, and the layers stacked on them, and the layers stacked on them.... Another test smell that leads to frustration, which leads to suffering and eventual sloppiness in tests.

So let's mock all of this! Let's do a mock servlet container that just calls the servlet implementations directly, returning the results "in-proc". (And for JSP... Seriously, there aren't people out there still using this crap? :-) You could probably get something working with Jasper.) ...

All this stuff is readily available. See www.mockobjects.com.

Okay, you still need integration-testing too...

Or skip the integration tests and go straight to acceptance (functional) testing.

But if you could have this switchable, you'd set a system property for the thing you need. If you run it 100% mocked you can execute those tests in zero time; if it doesn't run with mocks, it sure won't run in integration-testing mode. On your continuous-integration box you run the full integration tests.
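Jon's switchable idea can be as simple as a factory that checks a system property. The sketch below is illustrative only: the names (Widget, WidgetFactory, the test.mode property) are invented for the example, not from any real framework.

```java
// Illustrative sketch: pick an in-proc fake or the real implementation
// of a service based on a system property, so the same test suite can
// run fast in-proc or against the real infrastructure on the CI box.
interface Widget {
    String greet(String name);
}

class RealWidget implements Widget {
    public String greet(String name) {
        // Imagine remote calls, JNDI lookups and transactions here...
        return "Hello, " + name;
    }
}

class InProcWidget implements Widget {
    public String greet(String name) {
        // Same contract, zero infrastructure overhead.
        return "Hello, " + name;
    }
}

class WidgetFactory {
    public static Widget create() {
        // -Dtest.mode=mock switches the whole suite to in-proc mode.
        if ("mock".equals(System.getProperty("test.mode"))) {
            return new InProcWidget();
        }
        return new RealWidget();
    }
}
```

Code under test asks the factory for a Widget and never knows which flavour it got; only the JVM launch flags differ between the fast run and the integration run.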

A strategy to implement all this would be to start small. Implement the required interfaces and fill them out with stubs that throw exceptions. And feature by feature you add to the mock appserver. Except for an EJB-QL implementation, it actually seems to be almost trivial.

This is a common misunderstanding of mock objects.

Mocks are dumb. They are not lightweight reimplementations of existing APIs or containers. A mock has no understanding of EJB-QL.

But mocks also provide functionality not provided by the real API implementation - expectations - meaning your tests would be useless if run against the real environment. Mocks are much more than stubs that return dummy values.

Imagine J2EE The Next Generation had a new magical EnterpriseWidget API.

package javax.magic;
interface EnterpriseWidget {
  void performVoodoo(String spell);
}

If we wanted to test a custom bit of code (Cheese) that makes calls to this service, we could write a test:

public void testCheese() {
  Cheese c = new Cheese();
  MockEnterpriseWidget ew = new MockEnterpriseWidget();
  ew.expectPerformVoodoo("hocus pocus");
  ew.expectPerformVoodoo("abracadabra");
  c.setEnterpriseWidget(ew);
  c.doThings();
  ew.verify();
}

We can see here that the mocked version allows itself to be set up with expectations that assert that the performVoodoo() method is called correctly. Apart from the fact that we couldn't run this in a real environment (because the original interface doesn't contain an expectPerformVoodoo() method), we wouldn't WANT to run it there, because we rely on our mock implementation to tell us that the contract has been met. The mock does this through expectations (note the lack of assertEquals()).
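To make the expectation mechanism concrete, here is one way such a mock could be written by hand. This is a hypothetical sketch to match the imaginary EnterpriseWidget API above, not code from the MockObjects library:

```java
import java.util.ArrayList;
import java.util.List;

// The imaginary API from the example above.
interface EnterpriseWidget {
    void performVoodoo(String spell);
}

// Hand-rolled mock: records what the test expects and what the code
// under test actually does, and compares the two on verify().
class MockEnterpriseWidget implements EnterpriseWidget {
    private final List expected = new ArrayList();
    private final List actual = new ArrayList();

    // Called by the test to declare what SHOULD happen.
    public void expectPerformVoodoo(String spell) {
        expected.add(spell);
    }

    // Called by the code under test; we just record the call.
    public void performVoodoo(String spell) {
        actual.add(spell);
    }

    // Called by the test at the end; fails if reality diverged.
    public void verify() {
        if (!expected.equals(actual)) {
            throw new AssertionError(
                "expected " + expected + " but got " + actual);
        }
    }
}
```

The point is that verify() carries the assertion: the mock itself knows whether the contract was honoured, so the test body never compares values by hand.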

What if we apply this to something like J2EE. Or better yet, J2EE - The Next Generation. That is, the AOP stuff and so on... What if all this was created with test-first design? What if testability was something that was built into the platform? For example, that you could always run all your stuff in-proc, short-circuiting all remote calls but still behaving as close as possible to fully deployed. That you didn't need to have all these thousands of XML files lying around. That you could start simple, adding aspects as you went along, refactoring out some reusability here, some other niceties there. Always being able to run your unit-tests hassle-free, in no time, as a no-brainer.

If you avoid code that is tightly coupled to the container this is very easy. In fact most of the J2EE APIs (with the exception of EvilJB) allow you to do this anyway.

Mock implementations for most of J2EE and some other popular APIs are already available at www.mockobjects.com.

Of course you'll always want to create mocks for your custom code too, so take a look at (plug, plug) MockMaker. It's an Ant task that will scan your source tree for anything marked with the @mock doclet tag and create a mock implementation complete with the ability to set up expectations.

And another plug; .NET users may want to take a look at NMock which builds mocks dynamically at runtime.

Remember, when using mocks you're testing that your component interacts with its environment correctly, not that your environment works correctly. Design by contract is your friend.


  9:34:12 PM  

Mike has some mock object tips:

Tequila Mocking Bird. I spent about 2 hours today trying to slowly build a mock object test suite for JIRA. Actually not a test suite, just a few tests of increasing complexity. Some mental rules I've made for myself:

  1. Try as hard as you can to only test the method or class you are testing. Often this means breaking one method into two, changing method protection or at the extreme introducing new classes.
  2. If you can't test it via a mock - your architecture is probably too coupled! I found the 2 hours of testing very informative, I have already decoupled our architecture in a few places as a result - replacing static methods with interfaced factories and such like.
  3. Keep it simple, yet be flexible! I found I had to keep thinking of exactly what I was trying to test (and no more!) yet at the same time thinking whether I could simply change my code to make my testing life easier.

Rule 2 is especially interesting, because mock testing forces you to think of your code as an API - as you now have two 'clients' (your other code, and your tests). I think in the end it will lead to a much nicer architecture. Also - our mock tests run hundreds of times faster than our Cactus ones, which is great for rapid development. It certainly seems to encourage (or rather, not discourage?) you to write tests. More rules later - the light rail has reached Glebe! [rebelutionary]
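Mike's rule 2 in practice: replacing a static call with an interfaced factory is what lets a mock be slotted in at all. The names below (UserFinder, Notifier and friends) are invented for illustration, not from JIRA's actual code.

```java
// Before (hypothetical): code calls a static method directly,
// so a test cannot substitute a fake implementation:
//   String email = UserManager.getEmail("joe");
//
// After: depend on an interface, injected via the constructor.
interface UserFinder {
    String findEmail(String username);
}

class RealUserFinder implements UserFinder {
    public String findEmail(String username) {
        // Imagine a database lookup here.
        return username + "@example.com";
    }
}

class Notifier {
    private final UserFinder finder;

    Notifier(UserFinder finder) {
        this.finder = finder;
    }

    String addressFor(String username) {
        return finder.findEmail(username);
    }
}
```

Production code passes in a RealUserFinder; a test passes in a mock UserFinder and never touches the database.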

 


  9:30:56 PM  

Rickard has a really nice use of Jelly to migrate JISP database schemas.

