I know that for the most accurate results you should run it while the server's not getting a lot of traffic, but this is where the odd results come in.
I run a sitemap crawler which hits the site fairly fast, keeping the CPU load around 0.89-1.24. When I run the test while the load is around 0.98 to 1.24, I get a 2,500+ result.
If I pause the sitemap crawler, wait for the load to drop to the 0.36-0.58 range, and then run the test, I lose 500 points, averaging 1,800+ to just barely over 2,000.
The CPU results vary from 0.23-0.30 depending on how hard the server's being hit; the CPU fluctuations are minimal and do improve slightly with less traffic (disk is pretty consistent around 0.47).
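For reference, the figures above are Unix-style load averages; a quick way to sample them alongside a test run is something like this (a minimal sketch, assuming a Unix-like host — the function name and sampling interval are just made up for illustration):

```python
import os
import time

def sample_load(samples=3, interval=1.0):
    """Collect the 1-minute load average a few times in a row.

    os.getloadavg() returns the 1-, 5-, and 15-minute load averages,
    the same kind of numbers quoted above (e.g. 0.89-1.24 under
    crawler traffic vs 0.36-0.58 when it's paused).
    """
    readings = []
    for _ in range(samples):
        one_min, five_min, fifteen_min = os.getloadavg()
        readings.append(one_min)
        time.sleep(interval)
    return readings

if __name__ == "__main__":
    print(sample_load())
```

Logging these next to each benchmark run makes it easier to line up a score with the load the box was actually under at that moment.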
However, the odd part is the database: under stress I get around 3.34, but when there's no traffic it's 4.0 or higher,
which doesn't make much sense to me, since I think the test bypasses cache — ignoring most cache tweaks and buffers — and measures pure DB performance when accessing the database directly (as opposed to the RAM cache).
I would expect a better rating when the DB is basically idle than when it's under stress. (PS: MySQL accounts for most of the CPU usage, although router.php tends to use more CPU than I'd expect.)
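To show why I keep coming back to caching: if cache warmth did matter, steady crawler traffic would keep the benchmark's working set resident in RAM, while an idle stretch would let it go cold. A toy LRU sketch of that effect (everything here — the class, the capacity, the page numbers — is made up for illustration; it is not the benchmark's or MySQL's actual internals):

```python
from collections import OrderedDict

class LRUCache:
    """Toy stand-in for a database buffer pool (hypothetical)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()

    def read(self, page):
        """Return True on a (fast) cache hit, False on a (slow) miss."""
        if page in self.pages:
            self.pages.move_to_end(page)  # mark page as recently used
            return True
        self.pages[page] = None
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)  # evict the coldest page
        return False

def bench_hits(cache, working_set):
    """Count how many benchmark reads are served from cache."""
    return sum(cache.read(p) for p in working_set)

# Warm case: traffic has just touched the benchmark's working set.
warm = LRUCache(capacity=10)
for p in range(10):
    warm.read(p)

# Cold case: idle long enough that unrelated pages displaced it.
cold = LRUCache(capacity=10)
for p in range(100, 110):
    cold.read(p)

print(bench_hits(warm, range(10)))  # → 10 (all hits)
print(bench_hits(cold, range(10)))  # → 0 (all misses)
```

If the test really bypassed all caching, warmth like this shouldn't change the score at all — which is exactly why the traffic-dependent DB numbers confuse me.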
So can anyone explain why I'd get better performance readings while my sitemap crawler's hitting it semi-fast? (I slowed it from 500-600k pages a day to about 300k, just for reference.)
It just confuses me why the results are better with more traffic than with next to none.
--
soaringeagle
head dreadhead at dreadlocks site
glider pilot student and member/volunteer coordinator with freedoms wings international soaring for people with disabilities
updated by @soaringeagle: 09/26/15 04:57:38AM
