• MangoPenguin@lemmy.blahaj.zone · 3 months ago

    Fair warning that this would chew through a ton of bandwidth if you run it often, so only do it if you don’t have bandwidth caps.

      • MangoPenguin@lemmy.blahaj.zone · 3 months ago

        True, although once per hour would still be a lot of data.

        For example, a single fast.com test uses about 1.5GB of data for me, so running it hourly works out to around 1TB per month.
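For reference, the arithmetic behind that estimate (using the ~1.5 GB-per-test figure above; the 30-day month is an assumption):

```python
# Back-of-envelope check of the numbers above, assuming
# ~1.5 GB per test and a 30-day month.
gb_per_test = 1.5
tests_per_day = 24                         # one test per hour
monthly_gb = gb_per_test * tests_per_day * 30
print(monthly_gb)  # 1080.0 GB, i.e. roughly 1 TB
```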

        • Norah - She/They@lemmy.blahaj.zone · 3 months ago

          Once every 6 hours would only be 180GB per month. A script that tests every six hours, but increases the frequency if the measured speed drops below a certain threshold, could work well. I guess it all depends on how accurate you need the data to be.
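A minimal sketch of that adaptive scheduler in Python. Everything here is a placeholder assumption: `run_speed_test` stands in for whatever tool you actually use (e.g. the speedtest-cli package or a fast.com wrapper), and the 100 Mbps threshold and interval lengths are made-up example values.

```python
import time

# Hypothetical example values -- tune these to your connection.
THRESHOLD_MBPS = 100        # below this, test more often
NORMAL_INTERVAL = 6 * 3600  # every 6 hours when speed looks fine
FAST_INTERVAL = 3600        # every hour while speed is low

def next_interval(measured_mbps: float) -> int:
    """Return seconds to wait before the next test."""
    if measured_mbps < THRESHOLD_MBPS:
        return FAST_INTERVAL
    return NORMAL_INTERVAL

def main(run_speed_test):
    # run_speed_test is a stand-in: any callable returning Mbps.
    while True:
        speed = run_speed_test()
        time.sleep(next_interval(speed))
```

At four tests per day in the normal case this keeps the baseline usage around 180GB per month, ramping up only while the connection is actually degraded.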