In many ways testing a Web service is no different than testing anything else. You still do all the same steps of the test method, but some of them can take on a bit of a different flavor. Here are a couple of tidbits I've picked up over the last couple of years testing Web services.
Determining coverage and oracles
When testing Web services, I find that schema coverage becomes important. You still probably care about all the same coverage you cared about with non-Web service testing (scenario coverage, requirements coverage, code coverage, and application data coverage), but now you get to add schema coverage to the list.
For me, schema coverage means not only performing schema validation against all the response XMLs that come back from the service you're testing; it also includes testing for the right maximum and minimum numbers of repeating elements, correct reference identification tag values, and other subtle nuances of the XML. This can sometimes create an oracle problem. I find that I'm often comparing against a schema along with a mapping document of some sort. A mapping document tells me what data should be stored in what element and when it should be there.
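To make that concrete, here's a rough sketch of the kind of check I mean, in Ruby with the Nokogiri gem (my choice here, not a requirement; the schema and file names are made up for illustration):

```ruby
require 'nokogiri'

# Validate the actual response against the schema first...
schema = Nokogiri::XML::Schema(File.read('claim-service.xsd'))
doc    = Nokogiri::XML(File.read('claim-001-result.xml'))
schema.validate(doc).each { |error| puts error.message }

# ...then assert the subtler rules the schema alone won't catch for this
# test case, like the exact number of repeating elements the mapping
# says should be there. (Namespace handling omitted for brevity.)
line_items = doc.xpath('//lineItem')
puts "FAIL: expected 3 lineItem elements, got #{line_items.size}" unless line_items.size == 3
```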
Determining the test procedures
One of the initial problems to be solved when performing Web service testing is management of all the data files (typically XML, but not always). There are tools that do this file management for you (I'll cover some of them briefly below), but you should have a game plan for files that need to be managed manually. When I'm testing I typically track three files: a request file, an expected response file, and an actual response file.
The naming convention I've picked up along the way is to put a "-rq.xml", "-rs.xml", or "-result.xml" on the end of all my files. That way, each base file name gives you a set of XMLs that paints the entire picture for that test case. Once you get a naming convention worked out, get the files into source control if you don't have a tool that keeps them all straight for you.
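As a hypothetical sketch of how the three files hang together (the testdata directory is my invention; any layout works):

```ruby
# For each request file, derive the other two names in the triple.
Dir.glob('testdata/*-rq.xml').each do |request|
  base     = request.sub(/-rq\.xml\z/, '')
  expected = "#{base}-rs.xml"
  actual   = "#{base}-result.xml"

  # Requests and expected responses live in source control; the actual
  # response is regenerated on every run, so only check for the first two.
  warn "Missing expected response for #{base}" unless File.exist?(expected)
end
```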
Once you have all these XMLs (sometimes hundreds, sometimes thousands), you get the joy of keeping them up to date. I've found that schema changes can happen quite often on a project, as can mapping changes. Whenever one of those changes occurs, you get the pleasure of updating all those XML files. And remember, if you have 100 test cases, you have 200 files -- assuming you don't bother to update your actual result files, since you'll be re-running the test cases.
The way I handle most schema changes is to have a Ruby script handy that I can use to make the updates for me programmatically. If you do it enough times, you eventually build a library of scripts for just about every type of change that comes your way. There are exceptions where you may need to make some changes manually, but I find those changes don't normally affect all the test cases, just a subset.
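Here's one example of the kind of script I'm describing -- a sketch, assuming the schema change was a simple element rename (the element names and the Nokogiri gem are my stand-ins, not anything prescribed):

```ruby
require 'nokogiri'

# Rename an element across every request and expected-response file
# after a schema change. Swap in whatever actually changed.
Dir.glob('testdata/*-{rq,rs}.xml').each do |path|
  doc = Nokogiri::XML(File.read(path))
  doc.xpath('//policyNumber').each { |node| node.name = 'policyId' }
  File.write(path, doc.to_xml)
end
```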
Operating the test system
There are a lot of great tools available for testing Web services. MindReef SoapScope is a fine commercial option, and SoapUI is a fine open source option. I've found them both to have all the basic features I've needed. SoapScope is a bit better about data trending and analysis. SoapUI has a few more technical features for test execution. I've also spent some time using IBM Rational for SOA Quality, which is an Eclipse-based tool focused on Web service testing. It's a latecomer, but has a nice feature set.
More often than not, I find that I use homegrown tools written in Ruby and Java to perform Web service testing. They give you more control over the interfaces and features, and a working knowledge of what's actually going on in the test tool you're using. There are drawbacks, though, like losing vendor support and documentation. Even the open source SoapUI team does a fantastic job with support -- they turned around a defect for me in 24 hours once. That's better than you'll see from MindReef or IBM, I'm sure.
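The core of a homegrown runner can be surprisingly small. A minimal sketch (the endpoint, headers, and directory layout are placeholders for whatever your service actually needs):

```ruby
require 'net/http'
require 'uri'

ENDPOINT = URI('http://localhost:8080/claimService')

# Post each request file to the service and save what comes back as the
# actual-result file, ready for comparison against the expected response.
Dir.glob('testdata/*-rq.xml').each do |request|
  response = Net::HTTP.post(ENDPOINT, File.read(request),
                            'Content-Type' => 'text/xml')
  File.write(request.sub('-rq.xml', '-result.xml'), response.body)
end
```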
Try a couple of tools before you settle. I've found that I switch between the different tools based on the team I'm working with and what we're trying to do. Once you've compared a couple of them, you'll figure out which features you really need and which are nice but optional.
Evaluating the test results
Evaluating the results for Web service tests can sometimes be really easy and sometimes painful. There are a couple of things you'll want to practice getting good at:
- writing XSLTs to transform your actual response to mask out values you don't care about (server dates for example);
- writing XPath queries to check for specific values in an XML document;
- learning all the command line options on your favorite diff tool;
- and ensuring you have at least one person on the team who knows the schema inside and out and can see the entire mapping document in their head when they look at the response files.
The better you are at the first three, the less important the last one is. I've also found that custom logging (both in your test execution tool and in the Web service under test) can help add visibility to the results. Depending on what you're testing, sometimes you can cut out the need for file comparison entirely.
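The XSLT item above does its masking declaratively; the same idea can be sketched directly in Ruby before the files ever reach a diff tool (the element names here are hypothetical):

```ruby
require 'nokogiri'

# Blank out values we don't care about, like server-generated dates,
# so the comparison only fails on differences that matter.
def masked(path)
  doc = Nokogiri::XML(File.read(path))
  doc.xpath('//serverDate | //transactionId').each { |n| n.content = 'MASKED' }
  doc.to_xml
end

expected = masked('claim-001-rs.xml')
actual   = masked('claim-001-result.xml')
puts(expected == actual ? 'PASS' : 'FAIL: responses differ after masking')
```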
Once you get up to speed with the basics of manually testing the Web service, performance testing is normally trivial. Some of the tools have the ability to generate load built in. If you're using a homegrown option, all you need to do is thread your requests and you have the same ability. Getting the usage model right can sometimes be a challenge, but it's doable.
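"Thread your requests" in its simplest form looks something like this sketch (the thread counts, endpoint, and pacing are placeholders; a real usage model would vary the data and timing):

```ruby
require 'net/http'
require 'uri'

ENDPOINT = URI('http://localhost:8080/claimService')
body     = File.read('testdata/claim-001-rq.xml')

# Ten threads, five requests each, with crude round-trip timing.
threads = Array.new(10) do
  Thread.new do
    5.times do
      started = Time.now
      Net::HTTP.post(ENDPOINT, body, 'Content-Type' => 'text/xml')
      puts format('%.3fs', Time.now - started)
    end
  end
end
threads.each(&:join)
```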
Other concerns that come into play can be authentication/authorization and encryption. I've not personally had to do a lot there, but I know that it's a problem for some. I imagine that if you're testing a payment gateway, as your question states, you'll need to do some investigation into what that means for your data and the tools you can use.