the startups.com platform

A/B Testing: Building a database of A/B testing examples
Jonny Longden, Conversion Optimisation | A/B Testing | Ecommerce answered:

There are already a couple of products on the market like this; goodui.org is the main one.

However, I personally don't believe that this kind of meta-analysis of experiments from different sites has much value, for a couple of reasons. I'm outlining them here because they might be food for thought if you want to create products for the experimentation community:

Whilst the theory of meta-analysis is technically sound, in practice it is fraught with issues arising from the complexity of web experiences:

  • Tests must be subjectively categorised and then analysed as 'patterns', even though there is no evidence that the categorised element was the cause of the result. For example, you might think a test won because you removed the voucher code box, but it could equally have won because of the accompanying change in page layout.
  • No two tests are alike in any controlled way. Meta-analysis comes mainly from sciences where experiments can be repeated under (relatively) identical, controlled conditions. A/B testing cannot really be compared to that kind of controlled experimentation: the moment you compare two different experiments, you lose whatever control you had over audiences and other conditions, because they will not be the same.
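The attribution problem in the first point can be made concrete with a toy simulation (my own illustration, not from the answer): a variant that bundles two changes only ever reveals their combined effect, and the numbers used below are assumptions, not real test data.

```python
import random

random.seed(42)

def run_test(p_a, p_b, n=20000):
    # Simulate one A/B test: n visitors per arm, Bernoulli conversions.
    conv_a = sum(random.random() < p_a for _ in range(n))
    conv_b = sum(random.random() < p_b for _ in range(n))
    return conv_a / n, conv_b / n

# Hypothetical effects: variant B bundles TWO changes at once.
BASE = 0.050           # assumed control conversion rate
LAYOUT_LIFT = 0.008    # assumed effect of the new page layout
VOUCHER_LIFT = 0.000   # assumed effect of removing the voucher box

rate_a, rate_b = run_test(BASE, BASE + LAYOUT_LIFT + VOUCHER_LIFT)
print(f"A: {rate_a:.3%}  B: {rate_b:.3%}")

# The test only observes the COMBINED uplift. Nothing in the data
# distinguishes the layout change from the voucher-box removal; that
# attribution is a label a human adds afterwards, which is exactly
# what makes 'pattern' categorisation across sites subjective.
```

Here the voucher-box removal contributes nothing, yet anyone categorising this as a "remove voucher box" win would record a false pattern.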

The greatest benefit to be had from experimentation is by integrating it with business and brand strategy. A good business strategy is aimed at carving out a unique and differentiated direction of travel for that brand. The opportunity with experimentation is to iteratively develop and innovate the business through learning and adaptation. This means that experimentation should aim at being meaningful within the context of the brand strategy.

Meta-analysis of experiments from different sites is, in many ways, the opposite of this, because it seeks to make the 'lowest common denominator' changes which would be beneficial to any business anywhere. Conversion uplift and revenue might be achieved, but what has been furthered in terms of strategy and differentiation? This is not to say that it isn't valuable if you simply want to do 'CRO', but my argument is that nobody should merely do CRO.
