javascript - Why don't Python datetime and JS Date match?
I have code that returns the UTC offset for a given date:
>>> import datetime
>>> import pytz
>>> cet = pytz.timezone("Europe/Moscow")
>>> cet.localize(datetime.datetime(2000, 6, 1))
datetime.datetime(2000, 6, 1, 0, 0, tzinfo=<DstTzInfo 'Europe/Moscow' MSD+4:00:00 DST>)
>>> int(cet.localize(datetime.datetime(2000, 6, 1)).utcoffset().seconds/60)
240
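As an aside, `utcoffset().seconds` only happens to work for zones east of UTC: `timedelta.seconds` is always non-negative, so for a negative offset it wraps around instead of keeping the sign. A minimal sketch of the pitfall, using fixed-offset zones so it runs without pytz:

```python
from datetime import datetime, timedelta, timezone

# total_seconds() keeps the sign of the offset; .seconds does not.
east = datetime(2000, 6, 1, tzinfo=timezone(timedelta(hours=4)))   # UTC+4
west = datetime(2000, 6, 1, tzinfo=timezone(timedelta(hours=-5)))  # UTC-5

print(int(east.utcoffset().total_seconds() // 60))  # 240
print(int(west.utcoffset().total_seconds() // 60))  # -300
print(west.utcoffset().seconds // 60)               # 1140 -- wrapped around, wrong
```

Using `total_seconds()` (or floor division by `timedelta(minutes=1)`, as in the answer below) gives a signed result for any timezone.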
OK, and in JS I use this code (http://jsfiddle.net/nvn1fef0/):

new Date(2000, 5, 1).getTimezoneOffset(); // -180
Maybe I am doing something wrong? And how can I get a plus or minus sign before the offset (like in the JS result)?
On my system, both Python and JavaScript produce the same result (modulo sign):
>>> from datetime import datetime, timedelta
>>> import pytz
>>> tz = pytz.timezone('Europe/Moscow')
>>> dt = tz.localize(datetime(2000, 6, 1), is_dst=None)
>>> print(dt)
2000-06-01 00:00:00+04:00
>>> dt.utcoffset() // timedelta(minutes=1)
240
and

new Date(2000, 6, 1).getTimezoneOffset()

returns -240 (different sign, same value).
Python uses the definition: local time = UTC time + UTC offset. JavaScript uses a different definition: UTC offset = UTC time - local time. That is, both results are correct and have the correct sign under their respective definitions.
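The relationship between the two conventions can be sketched in a few lines of Python; this uses a fixed +04:00 offset (MSD, Moscow summer time in 2000) so it runs without pytz:

```python
from datetime import datetime, timedelta, timezone

# Fixed UTC+4 zone standing in for Europe/Moscow in summer 2000.
msd = timezone(timedelta(hours=4))
dt = datetime(2000, 6, 1, tzinfo=msd)

# Python convention: local time = UTC time + UTC offset
python_offset = dt.utcoffset() // timedelta(minutes=1)
print(python_offset)   # 240

# JavaScript convention: UTC offset = UTC time - local time,
# i.e. getTimezoneOffset() is the Python offset negated.
print(-python_offset)  # -240
```

So converting between the two is just a sign flip.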
For a portable JavaScript solution, use the Moment Timezone library, which provides access to the same tz database as the pytz Python module:

> var moscow = moment.tz("2000-06-01", "Europe/Moscow");
undefined
> moscow.format()
"2000-06-01T00:00:00+04:00"
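On the Python side, modern versions (3.9+) can produce the same result with the standard-library zoneinfo module instead of pytz; a minimal sketch, assuming the system tz database (or the tzdata package) is available:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# zoneinfo-aware datetimes take the tzinfo directly; no localize() needed.
dt = datetime(2000, 6, 1, tzinfo=ZoneInfo("Europe/Moscow"))
print(dt.isoformat())  # 2000-06-01T00:00:00+04:00
```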