The Importance Of Imaging Drives

Both cloning and imaging create an exact record of your drive or partition. This includes not only the files, but also the master boot record, allocation tables, and everything else needed to boot and successfully run an operating system.

This isn’t necessary for protecting your data; a simple file backup will handle that particular job just fine. However, should your hard drive crash or Windows become hopelessly corrupt, a clone or image backup can quickly get you back to work.

When you clone a drive, you copy everything on it onto another drive, so that the two are effectively identical. Normally, you would clone to an internal drive made external via a SATA/USB adapter or enclosure.

But imaging a drive is more like creating a great big .zip file (without the .zip extension). Image backup software copies everything on the drive into a single, compressed, but still very large file. You would probably save the image onto an external hard drive.
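The "great big compressed file" idea can be sketched in a few lines of Python. This is a minimal illustration of the stream-and-compress approach, not any particular vendor's tool, and the function names are our own invention:

```python
import gzip
import shutil

def image_drive(source_path, image_path, chunk_size=1024 * 1024):
    """Stream every byte of the source into one compressed image file."""
    with open(source_path, "rb") as src, gzip.open(image_path, "wb") as img:
        shutil.copyfileobj(src, img, chunk_size)

def restore_image(image_path, target_path, chunk_size=1024 * 1024):
    """Decompress the image back onto a target, byte for byte."""
    with gzip.open(image_path, "rb") as img, open(target_path, "wb") as dst:
        shutil.copyfileobj(img, dst, chunk_size)
```

Real imaging software works at the raw device level (boot record and all) and adds verification, but the core loop is exactly this: read everything, compress, write one file.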

What are the advantages of each?

Should your primary hard drive crash, a clone will get you up and running quickly. All you have to do is swap the drives.

On the other hand, if your drive crashes and you've backed it up to an image, you'd have to buy and install a new internal hard drive, boot from your backup program's emergency boot disc, and restore the drive's contents from the backup.

Why image?

An image backup provides greater versatility when backing up. You can save several images onto one sufficiently large external hard drive, making it easier and more economical to save multiple versions of the same disk or back up multiple computers.

Get advice and assistance from Andy and the team at R3. R3 Data Recovery is a real lab that deals with real disasters each and every day. If you have any sort of problem with a hard drive or any other data storage device, we are the people to contact. Call us today on 0800 999 3282 for immediate help and assistance.

Why Do We Back Up Data?

We back up data to prevent data loss. No matter how well you look after your hard drive or flash device, it will most likely fail sooner or later. Hard drives are not made to last forever: some last for years, while others fail within months. We've seen every outcome imaginable here at R3.

Drives gradually accumulate bad sectors over time, and this starts the drive's road to failure. Once a drive has accumulated a large number of bad sectors, it can no longer reallocate any more, and you will see it start to slow down and fail. No matter how well you look after your drive, this will eventually happen; you cannot prevent it.

It is similar with flash devices: bad blocks accumulate on the NAND chip, which is where the data is stored, and this again cannot be controlled. With many manufacturers trying to cut costs, it is happening more and more often, because the quality of many flash devices these days is getting worse and worse.

There are many more reasons we back up data. Power fluctuations, for example, can affect and potentially mechanically damage hard drives that are plugged in and powered on when something happens to the supply, and mechanical failures can also be caused by power fluctuations or by the drive being dropped. This is yet another reason why we back up data.

If the data is important to you, backups should be done periodically, and the backup should be kept in a safe place where it cannot be damaged by power fluctuations or anything else that could potentially harm the backup drive in any way.

How Often Should You Back Data Up?

This question really comes down to personal preference. We always recommend backing up your data as soon as anything changes; if that is not possible, a weekly backup is the next best thing. There is no definitive answer as to how often you should back data up; as previously mentioned, it all depends on personal preference and how often the data changes on the device you need to back up.

Something else to take into consideration is the amount of data. If you're backing up a flash device, it can be done daily, as flash devices are not usually larger than 32GB (they can obviously be larger, but not often). However, if you have to back up a full 50TB server, daily backups are impractical, purely because of the amount of data.
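One common way to square frequent backups with large data volumes is an incremental copy: only files changed since the last run are transferred. Here is a rough Python sketch of that idea; the function and parameter names are our own illustration, not any specific product's API:

```python
import os
import shutil

def incremental_backup(source_dir, backup_dir, last_backup_time):
    """Copy only the files modified since the previous backup run."""
    copied = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            # Skip anything the previous run already captured.
            if os.path.getmtime(src) > last_backup_time:
                rel = os.path.relpath(src, source_dir)
                dst = os.path.join(backup_dir, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied.append(rel)
    return sorted(copied)
```

A nightly incremental run like this can finish in minutes even on a server where a full copy would take days, which is why large installations mix occasional full backups with frequent incrementals.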

The more often you back up your data, the less likely you are to lose any of it; the less often you back up, the more prone you are to data loss. It all depends on how important the data is to you and how much of it there is.

What Is The Difference Between Physical Backup And Logical Backup?

Physical backup is when your data is backed up to a physical storage device like an external hard drive or a NAS box. With physical backups there is a cost, as you have to buy hard drives or flash devices to keep the data on.

Logical backup is storing data in the cloud. Services like Dropbox are becoming more and more popular, as they offer a free storage location for people to keep photos, precious family moments and so on, without the cost of buying hard drives or the worry that the drive might fail sooner or later, losing all the data on it and leaving you in need of data recovery services.

Here at R3 Data Recovery we see all types of storage devices small and large, in both size and storage capacity. For a FREE quotation on your storage device call us on 0800 999 3282 or alternatively email us at [email protected].

Netgear loses customer backups

Netgear has “dropped the ball” with its cloud management service, losing data stored locally on ReadyNAS devices’ shared folders worldwide – and customers have been complaining online about only being informed four weeks later.

This week, the San Jose-based networking business sent an email to customers, confirming that an “outage” affecting ReadyCLOUD, the free service for its network attached storage offering, caused the storage systems to disconnect from the cloud service and be marked as deleted at the end of March.

Compounding the issue, as part of a clean-up process, Netgear decided that when a ReadyCloud account is marked as closed, the NAS holding that account’s home folder should be deleted along with all of the data it was holding.

As one user complained: “In practice, accounts are generally deleted from the NAS admin screen by the user and a big warning flashes up to tell you that all data will be deleted. In this case, as the glitch was server side, no warning was presented and loads of people found that their home folders and data had mysteriously been deleted, by the looks of it, at the command of Netgear.”

A reader at The Register got in touch to say that the outage lost all of his photographs of a trip with his 18-month-old daughter to Disneyland, and complained that despite Netgear’s claims to have identified all users, the company had not yet contacted him.

Netgear was asked what the cause of the incident was, in response to which the company stated it was “a server outage”.

“There was no outside or malicious action that caused this issue,” said the spokesman. “It was caused by an internal server-side interruption. Should note that ReadyCLOUD is an enterprise VPN grade remote access solution and at no point has it ever been compromised.”

Netgear said it “cannot estimate at this point that any data loss has taken place given that we are actively working with those affected by the outage to help recover their data.”

“The affected number of users was between 40 and 50,” the spokesperson claimed, “of which mostly were consumers and not business. We encourage anyone who may think that they have been impacted by this outage to contact us for assistance as soon as possible.”

In response to our questions regarding the four-week delay, the spokesperson said: “We had immediately reached out to those registered users who appeared to have been affected by the outage.

“To err on the side of caution, Netgear then subsequently expanded our outreach to the larger community to ensure that no one who may have been exposed by the incident had been overlooked,” they added. “It should also be noted the importance of registering Netgear products. We encourage product registration for instances such as this when communication to our customers becomes necessary.

“We have already identified the root cause in our server software and applied a patch immediately after the incident occurred. We are currently working with each impacted user to recover as much of their data as possible using custom data recovery tools,” the spokesperson added.

GitLab backups fail

Source-code hub GitLab went into meltdown after experiencing data loss as a result of what it suddenly discovered were ineffectual backups.

On Tuesday evening, Pacific Time, the startup issued a sobering series of tweets we’ve listed below. Behind the scenes, a tired sysadmin, working late at night in the Netherlands, had accidentally deleted a directory on the wrong server during a frustrating database replication process: he wiped a folder containing 300GB of live production data that was due to be replicated.

Just 4.5GB remained by the time he canceled the rm -rf command. The last potentially viable backup was taken six hours beforehand.
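One defensive habit that would have limited the blast radius here is never deleting live data directly: move it aside first, and purge it only in a later, deliberate step once replication is confirmed. A hedged Python sketch of that "soft delete" idea (the paths and names are purely illustrative):

```python
import os
import shutil
import time

def soft_delete(path, trash_root):
    """Move a directory aside instead of removing it outright.

    The data stays on disk under trash_root until a separate,
    deliberate purge step runs, so a wrong-server mistake is
    recoverable instead of fatal.
    """
    os.makedirs(trash_root, exist_ok=True)
    # Timestamp suffix keeps repeated soft-deletes from colliding.
    dest = os.path.join(
        trash_root, f"{os.path.basename(path)}.{int(time.time())}"
    )
    shutil.move(path, dest)
    return dest
```

An `rm -rf` typed on the wrong host is unrecoverable the instant it runs; a rename is reversible for as long as the purge is deferred.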

That Google Doc mentioned in the last tweet notes: “This incident affected the database (including issues and merge requests) but not the git repos (repositories and wikis).”

So some solace there for users because not all is lost. But the document concludes with the following:
So in other words, out of 5 backup/replication techniques deployed none are working reliably or set up in the first place.

The world doesn’t contain enough faces and palms to even begin to offer a reaction to that sentence. Or, perhaps, to summarise the mistakes the startup candidly details as follows:

- LVM snapshots are by default only taken once every 24 hours. YP happened to run one manually about 6 hours prior to the outage
- Regular backups seem to also only be taken once per 24 hours, though YP has not yet been able to figure out where they are stored. According to JN these don’t appear to be working, producing files only a few bytes in size.
- SH: It looks like pg_dump may be failing because PostgreSQL 9.2 binaries are being run instead of 9.6 binaries. This happens because omnibus only uses Pg 9.6 if data/PG_VERSION is set to 9.6, but on workers this file does not exist. As a result it defaults to 9.2, failing silently. No SQL dumps were made as a result. Fog gem may have cleaned out older backups.
- Disk snapshots in Azure are enabled for the NFS server, but not for the DB servers.
- The synchronisation process removes webhooks once it has synchronised data to staging. Unless we can pull these from a regular backup from the past 24 hours they will be lost
- The replication procedure is super fragile, prone to error, relies on a handful of random shell scripts, and is badly documented
- Our backups to S3 apparently don’t work either: the bucket is empty
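Several of the failures above share one symptom: backup jobs that silently produced empty or tiny files. A trivial sanity check run after every backup job would have caught them. A sketch in Python (the threshold and messages are illustrative, not from GitLab's post-mortem):

```python
import os

def verify_backup(path, min_bytes=1024):
    """Fail loudly if a backup artifact is missing or implausibly small."""
    if not os.path.exists(path):
        return False, "backup file does not exist"
    size = os.path.getsize(path)
    if size < min_bytes:
        # A dump "only a few bytes in size" is a failure, not a backup.
        return False, f"backup is only {size} bytes; likely a silent failure"
    return True, f"backup looks plausible at {size} bytes"
```

Wiring a check like this into monitoring turns a silent failure into a paged alert; the only backups that matter are ones that have actually been verified, ideally by a periodic test restore.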

Making matters worse is the fact that GitLab last year decreed it had outgrown the cloud and would build and operate its own Ceph clusters. GitLab’s infrastructure lead Pablo Carranza said the decision to roll its own infrastructure “will make GitLab more efficient, consistent, and reliable as we will have more ownership of the entire infrastructure.”

At the time of writing, GitLab says it has no estimated restore time but is working to restore from a staging server that may be “without webhooks” but is “the only available snapshot.” That source is six hours old, so there will be some data loss.

Last year, GitLab, founded in 2014, scored US$20m of venture funding. Those investors may just be a little more ticked off than its users right now.

“On Tuesday, GitLab experienced an outage for one of its products, the online service,” a spokesperson for the San Francisco-based biz told The Register in an email, adding: “This outage did not affect our Enterprise customers.”

“We have been working around the clock to resume service on the affected product, and set up long-term measures to prevent this from happening again,” the spinner said. “We will continue to keep our community updated through Twitter, our blog and other channels.”

Meanwhile, the sysadmin who accidentally nuked the live data reckons “it’s best for him not to run anything with sudo any more today.”

Enable Workforce Mobility Backups

The office for today’s worker can be anywhere, and their computing device can take many forms – from smartphones and tablets to home computers and roaming laptops. How can you safeguard your corporate data from the risk of data loss while also empowering your mobile workforce with the information access and sharing they need to be productive from ANYWHERE? Use your backup solution. Here are 10 ways your backups can enable better employee productivity, in the office and on the road…

1. Become Centered. By backing data up to a central location, sophisticated backup solutions can provide end users with the ability to find and restore files from any backup, on any client they are authorized to access, without requiring administrator intervention. This can be offered through a web console, mobile app or natively in Windows Explorer.

2. Control Your Remotes. With today’s remote worker environment, advanced backup solutions will enable you to perform backups for users over HTTPS without impacting their day-to-day activities. Remote backup ensures that files created or edited on any device, including tablets and smartphones, are backed up and incorporated into the same enterprise backup and recovery infrastructure as files created on office computers. This expedites recovery, centralizes protection and improves operational efficiency.

3. Offer Self-Service. Self-service access gives power to your users and freedom to IT. It lets users directly browse, search and retrieve files and versions from a centralized repository – avoiding the burdens and costs of involving a help desk while driving productivity by avoiding rework and waiting for IT assistance. Sophisticated solutions will offer ubiquitous access and edit capabilities via personal employee data clouds, making file recovery instant and easy.

4. Get Synchronized. Through secure, automatic sync capabilities, advanced backup solutions can enable access to important data on multiple client systems without a manual recovery step. Files between multiple laptop and desktop systems can be automatically updated based on user policies so that the most current versions are always available, regardless of the computer being used. This further secures the enterprise by eliminating external and public file sharing services that expose corporate data to risk.

5. Share Nice. Collaboration has quickly become a significant business process. Using a smart backup strategy, you can enable secure file sharing by sending a link to a centrally protected file to a colleague, either internal or external, over instant message or via email. This enables them to easily view any shared file on the web without having to email it as an attachment, which is especially beneficial for sharing large files that simply can’t be emailed efficiently.

6. Create Personal Clouds. Using this centralized approach, your backup solution can essentially deliver personal data clouds so that your employees can access, edit and upload their files while protecting content on mobile devices. These personal data clouds enable users to search, sync and share their content securely, while being protected by their corporate data backup policies.

7. Encrypt It. While giving users the freedom to access data from anywhere on multiple devices, it’s also critical that your backup solution supports encryption. It should offer remote wipe capabilities, along with the ability to encrypt data on the client device, in transit and at the data center. This will protect corporate data if a client device is lost or compromised, when data is traveling over non-secure networks and while it’s stored.

8. Automate Your Routines. Even while you are delivering flexibility and access to users, you can remain in control by selecting a backup solution that will automate client backup routines. Policies and workflows should be customizable and deployable to multiple endpoints so that you can perform routine tasks such as auto-discovery of new desktops and laptops to automatically install backup agents for comprehensive protection.

9. Discover More. For many organizations, endpoint data can be a mystery of both risk and opportunity. By leveraging a sophisticated centralized backup solution to protect client information, you can enable enterprise-wide search and efficient discovery of client data throughout the organization. Not only can users easily search their own files for rapid, self-serve recovery; IT, HR and legal teams can also easily search employee data for rapid discovery of information related to corporate litigation, internal investigations, public information and audit requests.

10. Be Insightful. By using advanced backup technology to enable the mobile worker, you can also empower your organization with the data insights that support informed decision making and operational excellence. With robust, built-in reporting analytics, backup solutions can support your goals to deliver IT as a Service, infrastructure cost planning, insight into operations and simplified compliance audits.

Offsite Backup Storage

When you create backups of your data, you must store them somewhere they can’t be damaged and no one else can access them. For businesses, offsite backup is a well-known and popular way to back up files. It offers several advantages compared to other methods, such as CDs, DVDs, external hard drives, and even servers. One of the biggest advantages of offsite backup is the fact that the backups aren’t stored in your office or business.

Offsite backup companies store your data in state-of-the-art safes, to protect it against fire, flood, and even prying eyes. This can be extremely beneficial if unexpected things have a habit of occurring around your office.

Another great thing about offsite backups is the fact that they can be used as stores for your data. You won’t need to rely on online space, as you can easily go to the company that is storing your data and go through it anytime you wish. You can also use online space with most companies: you simply upload your data to their online storage area, then go back anytime you wish and view it. This is a very handy feature, similar to a hosting company.

Another benefit of offsite backup is the fact that your data will always be protected, and you won’t have to use CDs or DVDs to do it. CD and DVD storage is fine for individuals, but for most businesses there will be quite a few discs. This can get somewhat costly, but more importantly, it will take a lot of space to store the backup files. Discs can also become damaged or lost, unlike offsite backup storage.

Offsite storage is also great if your business is in a high-risk area. If Mother Nature has a habit of bashing your area with floods, fires, or hurricanes, you should look into offsite backup storage immediately. Offsite providers have ways to protect your information from harm, including anything Mother Nature can dish out. There is no need to worry about natural disasters, system failures, hard drive crashes, or data failure with offsite backup storage.

Even though you may not realize it, the data will be available anytime you need it. Online backup services are available anytime, day or night, and can be accessed anywhere you are. Most are easy to set up, and offer you very impressive security measures.

When it comes to offsite backup, you can store virtually any file you need to, such as text files, e-books, contact records, pictures, music, and anything else you can think of. The storage for online backups is virtually endless, capable of holding everything you need.

Offsite backup storage is ideal for any business or corporation. You can store your data with an online offsite backup, or choose to do it physically in an offsite safe. No matter which method of offsite backup you choose – your data will always be protected, and best of all – it will always be there anytime you need it.