|Issue||Procedures for measuring SOAP request processing time|
At DEBUG log level, customers *can* roughly infer it from the PassMark logs.
The start message for a request looks like:

2006-12-05 13:52:13,174 DEBUG [[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)']  [PM-23dfbb:-7fe2-72ebb2d7] [org.codehaus.xfire.transport.DefaultEndpoint] - <Received message to /pmws_server/services/AuthService

The end message for the same request looks like:

2006-12-05 13:52:13,378 DEBUG [[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)']  [PM-23dfbb:-7fe2-72ebb2d7] [org.codehaus.xfire.handler.HandlerPipeline] - <Invoking handler org.codehaus.xfire.handler.OutMessageSender in phase send
So, in the example pair above, the elapsed time is 13:52:13,378 minus 13:52:13,174, which equals 204 msec.
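The subtraction above can be automated by parsing the leading timestamp of each log line. A minimal sketch (the `elapsed_millis` helper and the truncated log strings are illustrative, not part of the product):

```python
from datetime import datetime

# Timestamp format at the start of each DEBUG log line,
# e.g. "2006-12-05 13:52:13,174" (comma-separated milliseconds).
LOG_TS_FORMAT = "%Y-%m-%d %H:%M:%S,%f"

def elapsed_millis(start_line: str, end_line: str) -> int:
    """Return elapsed milliseconds between the leading timestamps of two log lines."""
    # The timestamp occupies the first 23 characters of each line.
    start = datetime.strptime(start_line[:23], LOG_TS_FORMAT)
    end = datetime.strptime(end_line[:23], LOG_TS_FORMAT)
    return int((end - start).total_seconds() * 1000)

print(elapsed_millis("2006-12-05 13:52:13,174 DEBUG ...",
                     "2006-12-05 13:52:13,378 DEBUG ..."))  # 204
```

Note that `%f` right-pads "174" to microseconds, so the comma-delimited millisecond field parses correctly.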
This is the duration from the time the XFire stack sees the request to the time it responds. It does not include network latency between the customer's application server and the Adaptive Authentication application server, nor the time the request spends waiting to be dispatched by the Adaptive Authentication application server before XFire is invoked. That dispatch wait is negligible in a well-behaved system, but on an underpowered application server requests can spend a considerable amount of time queued waiting for an available thread.
Running Adaptive Authentication with DEBUG-level logging imposes a noticeable overhead on the application server, and given the possible difference between server processing time and total roundtrip time, RSA recommends that customers who wish to monitor response times do so in their own application, measuring the total elapsed time between submitting a request and receiving the reply.
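The recommended client-side measurement amounts to wrapping the service call in a timer. A minimal sketch, in which `call_auth_service` is a hypothetical stand-in for the customer's actual SOAP client invocation:

```python
import time

def call_auth_service(payload):
    # Placeholder for the customer's real SOAP call to the
    # Adaptive Authentication endpoint; substitute the actual
    # client invocation here.
    return "<soap:Envelope>...</soap:Envelope>"

# Measure total roundtrip time as seen by the calling application,
# which includes network latency and server-side queueing.
start = time.perf_counter()
response = call_auth_service("<request/>")
roundtrip_ms = (time.perf_counter() - start) * 1000.0
print(f"roundtrip: {roundtrip_ms:.1f} ms")
```

`time.perf_counter()` is a monotonic clock, so the measurement is unaffected by system clock adjustments during the request.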
|Legacy Article ID||a34154|