A/A testing

A murb'ed feed, posted more than 2 years ago, filed in testing, statistics, research, experiment, lean, development, ux & numbers.

An article on A/B testing. A/A testing is, of course, testing without changing anything at all between the variations, yet his experiment showed A/B-testing providers reporting ‘significant’ performance differences between identical versions of the same page.

Of course that is just plain statistics. If you don’t set a hypothesis up front and simply let the numbers speak, a p-value below 0.05 (the typical significance threshold) is still quite within reach: at that threshold, roughly one out of every 20 ‘experiments’ will be a false positive.
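To see that claim in numbers, here is a minimal sketch (the conversion rate, sample sizes, and z-test are my own assumptions, not taken from the article) that simulates many A/A tests on two identical variants and counts how often a standard two-proportion z-test declares them ‘significantly’ different anyway:

```python
# Simulate A/A tests: both variants share the same true conversion rate,
# yet a naive significance test still flags ~5% of runs as "significant".
import math
import random

def z_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test p-value for the difference between two proportions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (p_a - p_b) / se
    # Two-sided tail probability under the standard normal distribution.
    return math.erfc(abs(z) / math.sqrt(2))

random.seed(42)
TRUE_RATE = 0.05   # identical conversion rate for both "variants" (assumed)
VISITORS = 10_000  # visitors per variant per experiment (assumed)
RUNS = 1_000

false_positives = 0
for _ in range(RUNS):
    conv_a = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    conv_b = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    if z_test_p_value(conv_a, VISITORS, conv_b, VISITORS) < 0.05:
        false_positives += 1

# Expect roughly 50 out of 1,000 identical-page experiments to "win".
print(f"{false_positives}/{RUNS} A/A tests reported p < 0.05")
```

Nothing differs between the two variants, so every ‘significant’ result the test reports is a false positive by construction, which is exactly what the A/B-testing providers in the article were showing.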

But the most important take-home message is: instead of trying to come up with variations, you could also simply have thought about the best way to put things / make your product better / …

(via Adactio)
