Backup Software for Archive

Kayot

Well-Known Member
OP
Member
Joined
Jan 24, 2010
Messages
362
Trophies
0
Website
sites.google.com
XP
490
Country
United States
I'm getting ready to migrate from a 12x2TB (9 data, 3 parity) DrivePool/SnapRAID setup to a 4x6TB (3 data, 1 parity) FreeNAS ZFS box with an SSD for deduplication and 8 GB of RAM.

This will leave me with 12x2TB drives that I can use as a backup solution.

Is there any good software that would allow me to do this?
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,373
Country
United Kingdom
18TB effective does still just about put you in the big boy world of backups.

In the real world I would probably ask what the data is (database, never-you-mind binary blob, virtual machines, general file store*, actual disc partitions/PC clones...), what sort of downtime you can handle (pumping 18 TB over a 100 Mbps network port would take a little while after all, and even with gigabit it is not necessarily an overnight job -- maybe a weekend if everything goes smoothly; 10Gb gear is available but not exactly cheap), and, depending upon the data, what kind of restore you want (full partition map, data on the partitions, just-need-my-files...). Equally, what sort of backup are we looking at (hot, cold, offsite/onsite, what sort of restore timeframes, what sort of backup schedule...)? Has someone read a whitepaper/attended a conference and spoken nasty words like high availability (HA) or fault tolerance (FT)? Can you deal with a certain amount of loss in a given situation (changes since the last backup, someone wanting to rewind the clock even further, an error being duplicated into your backups...)?

*Story from a friend, but he had an image library of considerable size, with various thumbnails/scaled versions for each image -- it made for quite a few files in the end and caused some things some trouble. Personally I would have considered regenerating the thumbnails, but that might have been effort at some level.
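The downtime point above can be roughed out with back-of-envelope numbers (decimal units, theoretical line rate, zero protocol overhead, so real transfers will be slower):

```shell
# Rough transfer times for 18 TB at theoretical line rate.
bytes=18000000000000                    # 18 TB, decimal
h100=$(( bytes / 12500000 / 3600 ))     # 100 Mbps ~ 12.5 MB/s
h1g=$((  bytes / 125000000 / 3600 ))    # 1 Gbps   ~ 125 MB/s
h10g=$(( bytes / 1250000000 / 3600 ))   # 10 Gbps  ~ 1.25 GB/s
echo "100 Mbps: ${h100} h, gigabit: ${h1g} h, 10GbE: ${h10g} h"
```

That comes out at roughly 400 hours, 40 hours, and 4 hours respectively in the best case -- hence gigabit being a long-weekend job rather than an overnight one.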

If it is just files or effectively reduced to files then never underestimate good old FTP with a proper client and server. Similarly rsync is not to be sniffed at.
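As a concrete sketch of the rsync route (paths here are made up -- substitute your own mount points):

```shell
# Hypothetical mirror of the archive onto one of the retired 2 TB drives.
# -a keeps permissions/timestamps, -H preserves hard links,
# --delete mirrors removals, --dry-run previews without touching anything.
rsync -aH --delete --dry-run /mnt/tank/archive/ /mnt/backup-disk-01/archive/
# Drop --dry-run once the file list looks right.
```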
 

Kayot

Well-Known Member
OP
Member
Joined
Jan 24, 2010
Messages
362
Trophies
0
Website
sites.google.com
XP
490
Country
United States
The archive is just a collection of applications, movies, music, games (every system and type, such as BD, DVD, ROMs, etc.), and pictures. I have a gigabit network, but the data doesn't change all that often. Stuff will be moved around and sorted, so I wondered whether there is backup software that accounts for this. For instance, SnapRAID uses a file's ID rather than its path, so it won't lose and gain a file when one is moved to another directory.
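That move-aware idea can be shown with a toy content-hash check -- not a backup tool, just the principle SnapRAID applies: the same hash at a different path means a move, not a change (the directories below are throwaway temp paths):

```shell
# Toy demo: identify a move by content hash rather than by path.
dir=$(mktemp -d)
mkdir -p "$dir/a" "$dir/b"
echo "same bytes" > "$dir/a/movie.mkv"
before=$(sha256sum "$dir/a/movie.mkv" | cut -d' ' -f1)
mv "$dir/a/movie.mkv" "$dir/b/movie.mkv"        # the path changes...
after=$(sha256sum "$dir/b/movie.mkv" | cut -d' ' -f1)
[ "$before" = "$after" ] && echo "moved, not modified"   # ...the hash does not
```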

It isn't terribly complicated to use the drives individually. I would prefer having something keep track of the drive map instead of doing it manually or writing software to do it.

The purpose of the backup is in case something goes wonky and I lose the ZFS array. It will probably run bi-weekly and is only kept to make sure I don't lose everything.
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,373
Country
United Kingdom
Hmm, that might change things, and I cannot say I have really had to do it in quite that way before.

I have file duplicate checking software (it goes by hash, name and size, and is fairly smart about hashing only when it needs to), but it takes ages even on a small Windows install, and there is very little chance it is parity-aware.

Is it worth considering a CMS? Probably starting on the new ZFS setup and going back to the old drives from there. By nature they are change-aware, you could layer notes on top of them and sort things out that way. The only problem I see is space if you are going to keep something around to revert to later. I would agree that such a thing is probably a nightmare and/or Herculean task to get set up in the first place.
 

Kayot

Well-Known Member
OP
Member
Joined
Jan 24, 2010
Messages
362
Trophies
0
Website
sites.google.com
XP
490
Country
United States
I've been using SnapRAID's duplicate check function up till now. When the ZFS pool goes live it'll be running deduplication, which means duplicates aren't a problem.
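For reference, dedup in ZFS is a per-dataset property; the pool/dataset names below are assumptions:

```shell
# Enable deduplication on an assumed dataset "tank/archive":
zfs set dedup=on tank/archive
# Inspect dedup table (DDT) statistics for the pool:
zpool status -D tank
```

One thing to watch: the DDT wants to live in RAM/L2ARC and ZFS dedup is famously memory-hungry, so 8 GB plus the SSD may be on the tight side for 18 TB of data.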

What are my content management software choices in this case?
 
