Why does JavaScript evaluate a 2-digit year of 00 as 1900 instead of 2000?

kimberlyseal

New Member
I have an old web app where JavaScript is used to validate some dates. Users usually enter 2-digit years, and I recently discovered it was evaluating 00 as 1900 instead of 2000:

[code]
if (new Date(tb[0].value) > new Date(tb[1].value)) {
    alert('Starting date must come before the ending date');
    tb[0].focus();
    return false;
}
[/code]

Entering 1/1/99 in the first box and 1/1/00 in the second causes an error message saying the start date has to come before the end date, because 99 is evaluated as 1999 while 00 is evaluated as 1900.

Of course, users can get around this by entering 4-digit years, but I still want to know what can be done to get JavaScript to evaluate 2-digit years correctly.

So my question is: how can I get JavaScript to evaluate 00 as 2000 and not 1900?
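The only workaround I've come up with so far is to normalize the year myself before building the Date, along these lines (parseDateInput and the cutoff at 50 are just my own guesses, not anything built into JavaScript):

[code]
// Sketch of a possible workaround: parse M/D/Y myself and expand
// 2-digit years before constructing the Date.
function parseDateInput(value) {
    var parts = value.split('/');            // assumes M/D/Y input
    var month = parseInt(parts[0], 10) - 1;  // Date months are 0-based
    var day   = parseInt(parts[1], 10);
    var year  = parseInt(parts[2], 10);
    if (parts[2].length <= 2) {
        // Treat 00-49 as 20xx and 50-99 as 19xx (arbitrary cutoff).
        year += (year < 50) ? 2000 : 1900;
    }
    return new Date(year, month, day);       // 4-digit year avoids the 1900 mapping
}

if (parseDateInput(tb[0].value) > parseDateInput(tb[1].value)) {
    alert('Starting date must come before the ending date');
    tb[0].focus();
    return false;
}
[/code]

Is there a cleaner way to do this, or a way to change how new Date() itself interprets 2-digit years?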
 