Real-World Statistics on Downtime and Data Loss in 2022

As another installment in our series on the Data Protection Trends 2022 report, we’re covering the topic of downtime and data loss. These results offer a clear view of the trends currently shaping IT environments.

There isn’t much difference between high-priority and normal applications

Even though applications differ in many other ways, survey respondents reported minimal difference in expectations for downtime or data loss between ‘high-priority or critical’ applications and ‘normal’ applications. In 2021, organizations reported a downtime tolerance of “an hour or less” for 56% of their high-priority applications and 49% of their normal applications. Similarly, the frequency of protecting data (i.e., reducing potential data loss) was at comparable levels for high-priority and normal applications.

It is notable that “Tolerable Data Loss” showed the same trends as “Tolerable Downtime” as described above:

Chart: Tolerable Data Loss, as shown by protection frequency

Downtime from outages and the most common causes

The 2022 report examined the frequency and causes of outages in detail. Globally, 40% of servers (2 out of 5) experienced at least one outage over the past 12 months.

What’s the cause of these outages? The most common outages are still caused by infrastructure or software failures, while the most impactful outages in both 2020 and 2021 were caused by cyberattacks and user errors. For more details, please check out this 4-minute video:

We hope these topics have been helpful. This post is part of a multi-part series, so please stay tuned to the Veeam blog.

Please reach out to Dave (@BackupDave) or Jason (@JBuff), or email us at StrategicResearch@veeam.com, to tell us how this research aligns with your organization’s data protection goals for 2022. And don’t forget to download the full report.

Download the full Data Protection Trends 2022 report.
