Quote:
Originally Posted by fum
Probably. :-)
But the thinking behind it is sound, I think: what is the best way to preserve data so that it can still be read in 100 years? In a stupid .txt file, wrapped in some XML. It is based on the SIARD standard (Software Independent Archiving of Relational Databases). But when you have a table with 17 million rows and 44 columns, you get an .xml file with roughly 17 million × 44 ≈ 750 million lines, plus some overhead. I have a file from another table that is like this, but with only 13 million rows. That one is 25 GB. So I think this one will be nearing 30 GB when SQL Server is finished writing it, in a couple of days. :-)
Judging by the US Library of Congress's description of SIARD (Software Independent Archiving of Relational Databases), I don't think what you are dealing with is quite what the spec has in mind...
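For scale, here is a quick back-of-the-envelope sketch in Python. It is purely illustrative: the row/column counts and the 25 GB reference figure come from the quoted post, and the assumption of one XML element per column plus a row wrapper per row is mine, not something stated in the thread.

Code:
# Rough estimate of the SIARD-style table XML size, using only the
# numbers quoted above (13 million rows of a similar table gave ~25 GB).

ROWS = 17_000_000       # rows in the table being exported
COLS = 44               # columns per row
REF_ROWS = 13_000_000   # rows in the reference export
REF_SIZE_GB = 25.0      # observed size of the reference export

# Assumption: one element per column plus an opening and closing
# row wrapper, so COLS + 2 lines of XML per row.
lines = ROWS * (COLS + 2)
print(f"Approximate XML lines: {lines:,}")           # ~782 million

# Linear extrapolation from the 13M-row / 25 GB reference file.
est_size_gb = REF_SIZE_GB * ROWS / REF_ROWS
print(f"Estimated file size: {est_size_gb:.1f} GB")  # ~32.7 GB

So the "nearing 30 GB" guess looks about right, if anything a little optimistic.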
______
Dennis