3 points

True, although once per hour would still be a lot of data.

For example, running a fast.com test uses about 1.5 GB of data for a single test, so around 1 TB per month if run hourly.
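
For reference, the arithmetic: 1.5 GB per test × 24 tests per day × 30 days = 1,080 GB, which is roughly 1 TB per month.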

1 point

Once every six hours would only be about 180 GB per month. A script that runs the test every six hours, but increases the frequency whenever the measured speed drops below a certain threshold, could work well (rough sketch below). I guess it all depends on how accurate you need the data to be.
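
Something like this, as a minimal Python sketch. It assumes the open source speedtest-cli tool is installed (rather than fast.com, which has no official CLI), and the threshold and interval values are just placeholders to tune for your connection:

```python
import json
import subprocess
import time

THRESHOLD_MBPS = 100        # below this, start testing more often (placeholder value)
NORMAL_INTERVAL = 6 * 3600  # every six hours
FAST_INTERVAL = 1 * 3600    # every hour while speeds look bad

def measure_download_mbps():
    """Run one speed test and return the download rate in Mbps."""
    result = subprocess.run(
        ["speedtest-cli", "--json"], capture_output=True, text=True, check=True
    )
    data = json.loads(result.stdout)
    return data["download"] / 1_000_000  # speedtest-cli reports bits per second

while True:
    mbps = measure_download_mbps()
    print(f"{time.strftime('%Y-%m-%d %H:%M')} download: {mbps:.1f} Mbps")
    # Drop to the shorter interval while the connection is underperforming,
    # then back off again once it recovers.
    interval = FAST_INTERVAL if mbps < THRESHOLD_MBPS else NORMAL_INTERVAL
    time.sleep(interval)
```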
