Back on July 20, we were notified by Tolly that they were doing a comparison report on Veeam Backup & Replication v5. Tolly claims to provide “fair, accurate and well-documented testing.” But what if one of your competitors hires Tolly for the testing and determines the criteria? Can it really be fair and accurate? Tolly sent us the basic test criteria, and it was obvious to us that the sponsor had cherry-picked the criteria to show their own product in the best possible light and to ensure that our strongest features, such as our vPower functionality, would not be tested at all. In no way would these criteria constitute a comprehensive test of both products’ strengths and weaknesses.
We have repeatedly asked Tolly who is sponsoring the report, but they have declined to tell us. As of this writing, Veeam has no idea who is paying Tolly for this report (although we have our suspicions). So, in an effort to be truly fair and accurate, we proposed the following to Tolly:
- Veeam co-sponsors the tests and the report, and pays half the sponsorship fee;
- Veeam participates in determining the test criteria on an equal basis with Tolly and the other sponsor; or, alternatively, test criteria are determined by a mutually agreed-upon neutral third party;
- Both vendors’ representatives are present during the tests;
- The tests are conducted in September 2011, since Veeam’s key people who could participate in the project are on vacation or assigned to other projects until September 15; and
- The tests are conducted with Veeam Backup & Replication v6, which will be released in Q4 2011.
Tolly did not accept our proposal; we were told the process with the sponsoring vendor was too far along. They did tell us that we could sponsor our own report at a later time if we liked.
My question now is: how fair and accurate will this report be? According to Tolly’s Fair Testing Charter (excerpt below), we will be allowed to review the findings and provide feedback.
“Finally, we provide them [Veeam] with the opportunity to give us an official commentary on their results, the essence of which is included in any documents we might publish.”
It’s interesting that they say the “essence of which.” To me, it means that they will listen to our feedback but will edit it into an “essence” that puts their client in a better light. Likewise, “might publish” says to me that if Veeam were to come out ahead in the test, even with cherry-picked criteria, the report would never see the light of day.
It appears that Tolly has been doing this type of “fair and accurate” testing for quite some time; I just hope no one loses millions of dollars based on Tolly’s findings this time: http://www.brisbanetimes.com.au/business/compressed-into-nothingness-20100808-11qdh.html?from=age_ft (the Tolly report is available here)
Steve Duplessie from ESG has a great take on head-to-head testing and why ESG doesn’t do it: http://www.thebiggertruth.com/2010/03/the-politics-of-testing/
Vendors in the backup space have also had their share of problems with Tolly, including Quest: http://www.infoworld.com/d/data-management/just-when-you-thought-backup-wars-were-dead-654?page=0,0
It’s clear to us that one of our competitors intends to use this Tolly report as part of a competitive marketing campaign against us. Veeam did NOT participate in the testing, and from our vantage point, the criteria have been designed to avoid testing typical customer scenarios as well as the very features that make Veeam Backup & Replication v5 stand out as the best solution on the market. So please treat any report from Tolly as suspect.