Jan-03-2017, 12:13 PM
(Jan-02-2017, 04:04 AM)Skaperen Wrote:
(Jan-01-2017, 05:52 PM)Farrou Wrote: I want to data mine some extremely large (50 gig) data sets and hope this will be easier to learn. It kind of reminds me of basic.
find the largest hard drive of the day.
find the largest multi-drive RAID controller (SAN/SCSI/SATA) of the day.
multiply. now refer to that size when saying "extremely large" on that day.
in 2016 that was at least 192TB.
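As a rough worked example of that "multiply" step (the drive size and bay count are my assumptions, not figures from the post): 2016-era commodity HDDs topped out around 8TB, and 24-bay enclosures were common, which lands right at the 192TB mentioned above.

```python
# Assumed 2016-era parts; adjust both numbers for "the day" in question.
largest_drive_tb = 8   # assumption: largest commodity HDD of 2016, in TB
bays = 24              # assumption: bays in a typical multi-drive enclosure

extremely_large_tb = largest_drive_tb * bays
print(extremely_large_tb)  # 192
```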
in 1994 i saw a 4TB DB spinning in one rack.
i can't even remember when 50GB was "extremely large".
i was a co-sysadmin of a mainframe with 220GB back in 1980.
imagine needing over a Petabyte for your next DB project.
there are probably a few DB admins that would laugh that off.
imagine what Facebook is adding on this year.
Disks and files aren't the same thing. In my previous project (digital videos for a TV group) they acquired 3 petabytes per year, but the individual videos were "manageable" at 60GB each. Even so, when you routinely handle files that big, you start having lots of interesting problems.
Unless noted otherwise, code in my posts should be understood as "coding suggestions", and its use may require more neurones than the two necessary for Ctrl-C/Ctrl-V.
Your one-stop place for all your GIMP needs: gimp-forum.net