RE: Running tests from ant on a continuous integration server
- Subject: RE: Running tests from ant on a continuous integration server
- From: "Brook, James" <email@hidden>
- Date: Fri, 24 Oct 2008 17:00:11 +0200
- Thread-topic: Running tests from ant on a continuous integration server
>-----Original Message-----
>From: Guido Neitzer [mailto:email@hidden]
>Sent: Fri 24.10.2008 16:20
>To: WebObjects-Dev Mailing List
>Cc: Brook, James
>Subject: Re: Running tests from ant on a continuous integration server
>
>On 24.10.2008, at 04:37, Brook, James wrote:
>
>> I guess I am going way OT by taking this discussion further, but I
>> feel the need to respond anyway. We are using Scrum for software
>> development. This means that developers don't get the points for a
>> story until it is 'DONE'. We have a generic definition of done, which
>> includes the provision of unit tests, acceptance tests, database
>> scripts and UI approval. The end result is that the developers are
>> becoming intimately familiar with the real meaning of 'DONE' and have
>> realized the benefits of writing good tests. In a lot of cases the
>> product owner provides Selenese (and Java scripts) recorded with the
>> IDE. These are then integrated/merged with the developers' tests. So
>> far we are finding that the tests written by developers actually
>> provide superior coverage.
>
>The problem with that approach is that you normally can't do that with
>WO teams, as the teams are usually so small for a given project that
>this approach just doesn't work ... at least on all the projects I have
>worked on, the developers were part of the technical design team, part
>of the testing team, and did the implementation.
>
>This approach wouldn't work in my environment or any environment I
>have seen WO in so far.
>
>And my experience is also that developers who don't see the "big
>picture" are less productive and come back with inferior solutions.
One of the few things that Scrum is prescriptive about is that there are
only three roles on a team: the ScrumMaster, the Product Owner and the
'team'. The team is self-organising, collectively responsible, and trusted
to deliver features that are 'DONE' at the end of each iteration. So our
team consists of two people who can write WO code, a CSS/Javascript
person, a ScrumMaster and a Product Owner. The trick is to make sure that
we all share and agree on the same "definition of done" before we start.
After a couple of painful iterations with lots of loose ends, the planning
soon gets good enough that everyone knows just how much is expected.
Currently the thinking seems to be that the ideal Scrum team size is seven
or fewer, so I would argue that this approach is quite well suited to
small WO teams. The key is to blur the boundaries between the skills of
the different people on the team and to make everyone equally responsible
for quality. Note that there is no one called "tester" involved.
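
Coming back to the original subject, for anyone wiring this up: below is a
rough sketch of the kind of ant target that can run the developer JUnit
tests and the Selenium tests exported to JUnit in one pass on the CI
server. The target and property names (compile-tests, test.src,
build.classes, lib.dir, reports) are placeholders for illustration, not a
description of our actual build.

  <target name="test" depends="compile-tests">
    <mkdir dir="reports"/>
    <!-- haltonfailure="no" plus failureproperty lets every suite run
         before the build is finally marked as failed -->
    <junit printsummary="yes" haltonfailure="no"
           failureproperty="tests.failed">
      <classpath>
        <pathelement location="${build.classes}"/>
        <fileset dir="${lib.dir}" includes="*.jar"/>
      </classpath>
      <formatter type="xml"/>
      <batchtest todir="reports">
        <!-- developer unit tests and the exported Selenium tests sit
             side by side, so one fileset picks up both -->
        <fileset dir="${test.src}" includes="**/*Test.java"/>
      </batchtest>
    </junit>
    <fail if="tests.failed" message="Unit or acceptance tests failed."/>
  </target>

The XML formatter output in 'reports' is the standard JUnit report format
that most continuous integration servers can pick up to show per-build
test results.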
>
>cug