I’m writing a program that wraps around dd to try to warn you if you are doing anything stupid. I have thus been giving the man page a good read. While doing this, I noticed that dd supports size suffixes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This has caused me to wonder what the largest storage operation you guys have done is. I’ve taken a couple of images of hard drives that were a single terabyte each, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.
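
For a rough idea of the kind of check I mean, here’s a minimal Python sketch (argument handling is deliberately simplified, the script name and helper names are made up, and the real dd grammar is richer than just of=...):

```python
#!/usr/bin/env python3
# Minimal sketch of one "are you doing something stupid?" check:
# refuse to run dd against a block device that is currently mounted.
# Everything here is simplified for illustration.

import os
import sys


def mounted_devices() -> set[str]:
    """Return the device paths currently listed in /proc/mounts."""
    devices = set()
    with open("/proc/mounts") as mounts:
        for line in mounts:
            devices.add(line.split()[0])
    return devices


def main(argv: list[str]) -> None:
    target = None
    for arg in argv:
        if arg.startswith("of="):
            target = arg[3:]

    # Exact-path match only; partitions of the same disk would need
    # extra handling in a real wrapper.
    if target and target in mounted_devices():
        sys.exit(f"refusing to run: {target} is mounted")

    # Otherwise hand the untouched argument list to the real dd.
    os.execvp("dd", ["dd", *argv])


if __name__ == "__main__":
    main(sys.argv[1:])
```

Invocation would look just like dd’s own, e.g. ./safedd if=image.img of=/dev/sdb bs=4M (the name is invented for the example).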

81 points

Not that big by today’s standards, but I once downloaded the Windows 98 beta CD from a friend over dialup, 33.6k at best. Took about a week as I recall.

31 points

I remember downloading the scene from American Pie where Shannon Elizabeth strips naked over our 33.6k link. It took like an hour, at an amazing resolution of like 240p, for a two-minute clip 😂

13 points

And then you busted after 15 seconds?

3 points

Totally worth it.

16 points

Yep, downloaded XP over a 33.6k modem, but I’m in NZ, so 33.6 was more advertising than reality. It took weeks.

1 point

In similar fashion, I downloaded Dude, Where’s My Car? over dialup, using what was at the time the latest tech: a file download system that would split the file into 2MB chunks and download them in order.

It took like 4 days.

59 points

I’m currently backing up my /dev folder to my unlimited cloud storage. The backup of the file /dev/random has been running for two weeks now.

13 points

That’s silly. You should compress it before uploading.

9 points

No wonder. That file is super slow to transfer for some reason. But wait till you get to /dev/urandom. That file has TBs to transfer at whatever pipe you can throw at it…

6 points

Cool, so I learned something new today: don’t run cat /dev/random.

1 point

Why not try /dev/urandom?

😹

2 points

Ya know, if not for the other person’s comment, I might have been gullible enough to try this…

5 points

I’m guessing this is a joke, right?

3 points

/dev/random and other “files” in /dev are not really files; they are interfaces which can be used to interact with virtual or hardware devices. /dev/random spits out cryptographically secure random data. Another example is /dev/zero, which spits out only zero bytes.

Both are infinite.

Not all “files” in /dev are infinite; for example, hard drives can (depending on which technology they use) be accessed under /dev/sda, /dev/sdb and so on.
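
To make the “infinite” part concrete, here’s a rough Python sketch: reads from these devices only finish because you bound them yourself, never because the device runs out of data.

```python
#!/usr/bin/env python3
# Illustration of why `cat /dev/random` never finishes: the device never
# signals end-of-file, so any read has to be bounded by the caller.

def read_bounded(path: str, nbytes: int) -> bytes:
    """Read exactly nbytes from an endless device file, then stop."""
    with open(path, "rb") as dev:
        return dev.read(nbytes)

print(read_bounded("/dev/urandom", 16).hex())  # 16 random bytes, then done
print(read_bounded("/dev/zero", 8))            # b'\x00\x00\x00\x00\x00\x00\x00\x00'
```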


I’m aware of that. I was quite sure the author was joking, with just the slightest bit of concern that they might actually make that mistake.

58 points

I obviously downloaded a car after seeing that obnoxious anti-piracy ad.

43 points

In grad school I worked with MRI data (hence the username). I had to upload ~500GB to our supercomputing cluster: somewhere around 100,000 MRI images, and I wrote 20 or so different machine learning algorithms to process them. All said and done, I ended up with about 2.5TB on the supercomputer. About 500MB of that ended up being useful and made it into my thesis.

Don’t stay in school, kids.

17 points

You should have said no to math; it’s a helluva drug.

0 points

golden 😂😂

37 points

Entire drive/array backups will probably be by far the largest file transfers anyone ever does. The biggest I’ve done was a measly 20TB over the internet, which took forever.

Outside of that, the largest single “file” I’ve copied was just over 1TB: a SQL backup of our main databases at work.

9 points

+1

From an order-of-magnitude perspective, the max is terabytes. No “normal” users are dealing with petabytes, and if you are dealing with petabytes, you’re not using some random poster’s program from Reddit.

For a concrete cap, I’d say 256 tebibytes…
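
A cap like that is cheap to add. A rough Python sketch, using 256 TiB purely because it’s the figure suggested above (helper names are made up):

```python
# Rough sketch of a hard "sanity cap" check for the wrapper.
# 256 TiB is used only because it was the figure suggested above.

TIB = 1024**4
SANITY_CAP = 256 * TIB

def sanity_check(total_bytes: int) -> None:
    """Refuse transfers that exceed the sanity cap."""
    if total_bytes > SANITY_CAP:
        raise SystemExit(
            f"{total_bytes / TIB:.1f} TiB requested, which is over the "
            f"{SANITY_CAP // TIB} TiB sanity cap"
        )

sanity_check(20 * TIB)   # the ~20TB backup mentioned above: fine
sanity_check(1024**5)    # a full pebibyte: exits with the message above
```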

2 points

brother?..

1 point

10TB is child’s play
