I had to make a decision. If curl estimates the time left to transfer a file is 4398046511103 seconds, how do we display that in just 7 bytes?
I decided that if the time exceeds 99,999 years we show it as ">99999y", even though a 63-bit epoch *can* hold up to 292 billion years.
7 bytes because I need the entire curl progress meter to fit within 79 bytes.
What's a better thing to do on a Saturday? 😁
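(For readers curious what that clamping could look like in code: the following is a minimal C sketch of the idea only, not curl's actual progress-meter code; the function name time_left_7 and the 365-day year are my own simplifications.)

  #include <stdio.h>

  /* Sketch only: render a "time left" estimate (in seconds) into a
     fixed 7-character column, clamping anything beyond 99,999 years
     to the literal ">99999y" discussed above. */
  static void time_left_7(long long secs, char out[8])
  {
    const long long secs_per_year = 31536000LL;     /* 365 days */
    long long years = secs / secs_per_year;

    if (years > 99999)
      snprintf(out, 8, ">99999y");                  /* the cap: always 7 chars */
    else if (years > 0)
      snprintf(out, 8, "%6lldy", years);            /* up to 5 digits + 'y' */
    else if (secs >= 86400)
      snprintf(out, 8, "%6lldd", secs / 86400);     /* whole days below a year */
    else
      snprintf(out, 8, "%6llds", secs);             /* raw seconds below a day */
  }

Fed the 4398046511103-second estimate from the post (roughly 139,000 years), every branch produces exactly 7 characters and the first one wins, so the column shows ">99999y".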
@bagder At some point the answer is just “forever”, practically. Which, heh, is also 7 bytes
@bagder Ah, by 7 bytes you really meant 7 characters, as in those in ">99999y". But seriously, for all practical purposes anything greater than a few days is impractical anyway. I would seriously consider showing anything beyond 365 days in logarithmic steps of:
">1y"
">10y"
">100y"
">1000y"
For anything as slow as >99999y, just saying ">1000y" would be plenty good enough, I'd say.
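(A rough sketch of that bucketing idea, in the same hypothetical C as the sketch above; the cutoffs are the suggested steps, the function name is made up.)

  /* Sketch only: collapse anything of a year or more into coarse
     logarithmic buckets instead of printing an exact year count. */
  static const char *log_bucket(long long years)
  {
    if (years >= 1000) return ">1000y";
    if (years >= 100)  return ">100y";
    if (years >= 10)   return ">10y";
    if (years >= 1)    return ">1y";
    return NULL;   /* below a year: fall back to the normal time display */
  }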
@raulinbonn sure, but I figured I could just as well use the space I have while it works. I don't think it hurts.
@noisytoot kiloyears is not exactly a common unit; I think that would raise more questions than it answers!
@bagder Honestly anything over a year should be printed as "forever". But if you had to pick something then "292by" is easy enough to decode to 292 billion years.
@bagder @noisytoot Right, the correct SI/metric unit would be Ms (megaseconds). 1000 years ≈ 31,557 Ms. Years are also problematic because they don't all have the same duration.
@bagder something something precision versus accuracy 😊
@bagder Mondo will chime in to say how wrong you are. Trusty Mondo.
@bagder Honestly you could just leave it at ">1yr" and call it good. How many servers even stay up that long continuously?
@bagder Use the infinity symbol for anything over 100 years. The user would not be alive when the download completes. Besides, how big would the file have to be, and how slow the transfer, to ever reach those limits?
@raulinbonn
@bagder
I think seeing a very specific number of years, like "4715 years" or something, can add a bit of comic relief to an otherwise frustrating situation.
@bagder I would call it “huge”. That will avoid all and any misinterpretation.
@whynothugo @bagder @noisytoot why stop at mega? This transfer will take 238 exaseconds... or 7.547 × 10^12 years
@vrek @whynothugo @bagder @noisytoot
That depends on the velocity of the observer
@andrewfeeney @whynothugo @bagder @noisytoot well also what is the current size of and distance to any local gravity sources?
@bagder If you're curl-ing from the Mars Reconnaissance Orbiter (~10-15 minute latency) with moderate data loss at minimum rates (say, effectively 16 kb/s), you'll (eventually) get about 8800 TB after 4,398,046,511,103 seconds.
Even in space, you can't beat sneakernet.
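(That back-of-the-envelope figure checks out if 16 kb/s is read as 16 kilobits per second, i.e. roughly 2,000 bytes per second:)

  2,000 B/s × 4,398,046,511,103 s ≈ 8.8 × 10^15 B ≈ 8,800 TB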
@bagder When you write 7 or 79 "bytes", do you mean [ASCII] characters, or is there some technical storage limit?
Not meant to nitpick, I'm actually confused. Maybe it's because I'm not a C programmer, but Perl knows very well about byte length vs. string length🤔
Edit: missed the PR link before posting. The PR makes it all clear. My bad!
@bagder You could do a hard cut at 1 year and show “>1 year”. In reality this would never happen anyway. And having a full year of advance notice to plan what happens after that year of continuous, uninterrupted downloading is more than anyone would ever need.
@bagder adding to the "other possible solutions" list: use scientific notation … in all honesty, if someone is interested in a duration longer than 1y, they can work with "0.57E45"
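(For what it's worth, C's own %E conversion fits that kind of notation into the 7-character budget; a tiny standalone sketch, where the concrete value is my own example rather than anything from the thread.)

  #include <stdio.h>

  int main(void)
  {
    /* Sketch only: %.1E renders any plausible year count as "d.dE+xx",
       which is exactly 7 characters. */
    char buf[8];
    double years = 4398046511103.0 / 31536000.0;  /* ≈ 139,000 years */
    snprintf(buf, sizeof buf, "%.1E", years);     /* buf becomes "1.4E+05" */
    printf("%s\n", buf);
    return 0;
  }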