Old 07-15-2017, 08:54 PM   #30324
DMcCunney
New York Editor
Posts: 6,384
Karma: 16540415
Join Date: Aug 2007
Device: PalmTX, Pocket eDGe, Alcatel Fierce 4, RCA Viking Pro 10, Nexus 7
Quote:
Originally Posted by fum
Probably. :-)
But the thinking behind it is sound, I think: what is the best way to preserve data so that it can still be read in 100 years? In a stupid .txt file, wrapped in some XML. It is based on the SIARD standard (Software Independent Archiving of Relational Databases). But when you have a table with 17 million rows and 44 columns, you get an .xml file with 17 * 44 million lines, plus some overhead. I have a file from another table like this, but with only 13 million rows. That one is 25 GB. So I think this one will be nearing 30 GB when SQL Server is finished writing it, in a couple of days. :-)
Judging by the US Library of Congress's description of SIARD (Software Independent Archiving of Relational Databases), I don't think what you are dealing with is quite what the spec has in mind...
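
For anyone wondering where the numbers come from: fum's figures imply that each cell ends up on its own line in the table XML, so the line count scales with rows times columns. Here's a rough back-of-the-envelope sketch in Python; the element names and the bytes-per-line figure are guesses for illustration, not the actual SIARD export tooling:

Code:
# Illustrative only: SIARD-style table XML, roughly one <cN> element per cell,
# one per line, so line count grows as rows x columns plus row wrappers.

def row_to_xml(row):
    """Serialize one row as SIARD-style XML, one line per cell."""
    lines = ["<row>"]
    for i, value in enumerate(row, start=1):
        lines.append(f"  <c{i}>{value}</c{i}>")
    lines.append("</row>")
    return "\n".join(lines)

rows = 17_000_000
cols = 44
total_lines = rows * cols + 2 * rows   # cells plus <row>/</row> wrappers
avg_bytes_per_line = 40                # guess: tag + short value + newline

print(f"~{total_lines:,} lines")                            # ~782,000,000
print(f"~{total_lines * avg_bytes_per_line / 1e9:.0f} GB")  # ~31 GB at that guess

At forty-odd bytes per line that lands right around the 30 GB fum is expecting, which gives you a sense of how much of the file is markup rather than data.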
______
Dennis