[The original post was deleted by its author; the question survives in the quoted text of the first reply.]
"Michael J. Baars" <[hidden email]> writes:
> Can someone please tell me which of these two queries gives the correct
> result and which one the incorrect?
>
> // 2.922 (&)
> with A1 as ( select make_interval (0, 0, 0, 0, 0, 0, ( extract ( epoch from interval '8 years' ) / 1000 ) ) as "00" )
> select ( extract ( hours from "00" ) + extract ( minutes from "00" ) / 60 + extract ( seconds from "00" ) / 3600 ) / 24 as dT from A1;
>
> // 2.88 (X)
> with A1 as ( select interval '8 years' / 1000 as "00" )
> select extract ( days from "00" ) + extract ( hours from "00" ) / 24 + extract ( minutes from "00" ) / 1440 + extract ( seconds from "00" ) / 86400 as dT from A1;

They're both "incorrect", for some value of "incorrect".  Quantities like
years, days, and seconds don't interconvert freely, which is why the
interval datatype tries to keep them separate.

In the first case, the main approximation is introduced when you do

select extract ( epoch from interval '8 years' );
 date_part
-----------
 252460800
(1 row)

If you do the math, you'll soon see that that corresponds to assuming
365.25 days (of 86400 seconds each) per year.  So that's already wrong;
no year contains fractional days.

In the second case, the trouble starts with

select interval '8 years' / 1000;
    ?column?
-----------------
 2 days 21:07:12
(1 row)

Internally, '8 years' is really 96 months, but to divide by 1000 we have
to down-convert that into the lesser units of days and seconds.  The
approximation that's used for that is that months have 30 days, so we
initially get 2.88 days, and then the 0.88 days part is converted to
76032 seconds.

So yeah, you can poke a lot of holes in these choices, but different
choices would just be differently inconsistent.  The Gregorian calendar
is not very rational.

Personally I stay away from applying interval multiplication/division to
anything except intervals expressed in seconds.  As soon as you get into
the larger units, you're forced to make unsupportable assumptions.

			regards, tom lane
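A minimal sketch of the contrast described above, for illustration: dividing a
month-based interval silently bakes in the 1-month = 30-days assumption, while
a seconds-only interval divides with no unit-conversion guesswork.  Results
assume a stock PostgreSQL install; output formatting may differ slightly.

-- Month-based interval: division forces the 30-day-month approximation.
select interval '1 month' / 3;            -- 10 days

-- Seconds-only intervals divide exactly, no assumptions needed.
select interval '3600 seconds' / 4;       -- 00:15:00
select make_interval(secs => 3600) / 4;   -- same thing, 00:15:00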
"Michael J. Baars" <[hidden email]> writes:
> So how do you compute the number of seconds in 8 years?

IMO, that's a meaningless computation, because the answer is not fixed.
Before you claim otherwise, think about the every-four-hundred-years
leap year exception in the Gregorian rules.  Besides, what if the
question is "how many seconds in 7 years"?  Then it definitely varies
depending on the number of leap days included.

What does make sense is timestamp subtraction, where the actual
endpoints of the interval are known.

			regards, tom lane
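A short sketch of that point, assuming stock PostgreSQL: with concrete
endpoints the subtraction is exact, and two different 7-year spans really do
disagree depending on how many leap days fall inside them.

-- Concrete endpoints make the answer exact: this 8-year span covers
-- two leap days (2016, 2020), so it comes out to 2922 days.
select timestamp '2021-01-01' - timestamp '2013-01-01';             -- 2922 days

-- Two different 7-year spans give different answers.
select timestamp '2020-01-01' - timestamp '2013-01-01' as span_a,   -- 2556 days
       timestamp '2021-01-01' - timestamp '2014-01-01' as span_b;   -- 2557 days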
On Mon, Feb 22, 2021 at 10:52:42AM -0500, Tom Lane wrote:
> "Michael J. Baars" <[hidden email]> writes: > > So how do you compute the number of seconds in 8 years? > > IMO, that's a meaningless computation, because the answer is not fixed. > Before you claim otherwise, think about the every-four-hundred-years > leap year exception in the Gregorian rules. Besides, what if the > question is "how many seconds in 7 years"? Then it definitely varies > depending on the number of leap days included. > > What does make sense is timestamp subtraction, where the actual > endpoints of the interval are known. True. I'm not sure whether this is a bug or an infelicity we document, but at least in some parts of the world, this calculation doesn't comport with the calendar in place at the time: SELECT to_timestamp('1753', 'YYYY') - to_timestamp('1752', 'YYYY'); ?column? ══════════ 366 days (1 row) I'd like to imagine nobody will ever go mucking with the calendar to the extent the British did that year, but one never knows. Best, David. -- David Fetter <david(at)fetter(dot)org> http://fetter.org/ Phone: +1 415 235 3778 Remember to vote! Consider donating to Postgres: http://www.postgresql.org/about/donate |
David Fetter <[hidden email]> writes:
> I'm not sure whether this is a bug or an infelicity we document, but
> at least in some parts of the world, this calculation doesn't comport
> with the calendar in place at the time:
> SELECT to_timestamp('1753', 'YYYY') - to_timestamp('1752', 'YYYY');

Yeah, Appendix B.6 mentions that.  What isn't documented, and maybe
should be, is the weird results you get from the tzdata info for years
before standardized time zones came into use.

regression=# show timezone;
     TimeZone
------------------
 America/New_York
(1 row)

regression=# select '2020-01-01 00:00'::timestamptz;
      timestamptz
------------------------
 2020-01-01 00:00:00-05
(1 row)

regression=# select '1800-01-01 00:00'::timestamptz;
         timestamptz
------------------------------
 1800-01-01 00:00:00-04:56:02
(1 row)

If you're wondering where the heck that came from, it corresponds to the
actual longitude of New York City, i.e. local mean solar time.

			regards, tom lane
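A sketch of one consequence for interval arithmetic, assuming the same
America/New_York setting and a current tzdata (the pre-1883 offsets can shift
between tzdata releases): subtracting timestamps that straddle the 1883 switch
from local mean time to standard time picks up the 3-minute-58-second
difference between the two offsets.

set timezone = 'America/New_York';

-- 1883-01-01 is still on local mean time (-04:56:02); by 1884-01-01 the zone
-- has moved to EST (-05:00), so the span is not a whole number of days:
select timestamptz '1884-01-01 00:00' - timestamptz '1883-01-01 00:00';
-- 365 days 00:03:58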