A backup failure on Oct. 1 forced the Tokyo Stock Exchange to halt all trading for the entire day, leaving frustrated investors unable to capitalize on surging stock prices. Economists have suggested the outage creates uncertainty that may ultimately drive many foreign investors out of the world’s third-largest equity market.

Sadly, such backup failures are not all that uncommon, although most don't attract nearly the same global attention. Industry analysts say many businesses experience backup failure rates of up to 20 percent, and one recent survey found that more than half of organizations suffer backup failures multiple times a year due to a host of issues. Common reasons for failure include:

  • Human error. Research finds that nearly 30 percent of failures stem from human error, often the inadvertent deletion of database data or a mistake made at the command line.
  • Media failure. Although this was more common when tape was the primary backup medium, disk isn’t entirely reliable, either. A Carnegie Mellon University study found disk failure rates of up to 16 percent.
  • Hardware failure. Disk arrays, tape libraries and other backup hardware contain mechanical parts that wear out over time, and backup servers typically have a life expectancy of only about three to five years.
  • Software updates. Operating system, application and policy updates can create incompatibilities with backup software.
  • Malware. Ransomware and other types of malware commonly move laterally throughout the IT environment, infecting any backup systems attached to the network.

These issues are particularly vexing for small to midsized businesses (SMBs) with IT budget and staffing limitations. With data increasingly distributed across multiple cloud platforms, branch offices and remote users’ endpoint devices, backup processes have become so complicated and time-consuming that many SMBs back up their data only infrequently, if at all.

Cloud backup services have grown in response to these challenges, but they can also introduce complexity in some areas. For example, bandwidth consumption during cloud backups can sometimes impact application performance. Additionally, finding and recovering data in the cloud can be difficult and time-consuming.

An outsourced backup service can be a much better option. In this approach, backups are carried out by a qualified managed services provider (MSP) that is deeply familiar with the latest backup technologies, techniques and tools. This approach ensures optimal data protection while also allowing you to control costs and offload staffing and management burdens.

In addition, an MSP will test backups frequently to ensure they are working properly and readily available in the event of a system failure, cyberattack or some other business disruption. MSPs will also make sure there is a clear recovery plan for all your data, whether in the cloud or on-premises.

A managed backup approach can be especially useful for companies that have large numbers of remote workers with essential company data and applications on laptops, tablets and home computers. Lightweight software agents installed on these devices can interface with backup systems to ensure that remote data, files and applications are protected and recoverable.

Although the consequences of data loss are well understood, backup failures remain an all-too-common occurrence. SSD can help you eliminate much of the uncertainty surrounding data protection with our managed backup offering. We’d welcome the opportunity to demonstrate how we can help you protect your critical data, systems and applications.