This article is a good example. TS can't fix the underlying APIs, standard library etc.
Best language would be the entire ecosystem, including standard library
I've been playing around with rust in my free time and like it. I think it's a good FP middle ground. Gleam also looks interesting. But to your point I imagine there aren't many jobs paying for rust and practically none for Gleam.
Hell you might even like ActionScript ;P
They were all written by the same guy, Anders Hejlsberg:
https://en.wikipedia.org/wiki/Anders_Hejlsberg
https://news.ycombinator.com/item?id=19568681
"My favorite is always the billion dollar mistake of having null in the language. And since JavaScript has both null and undefined, it's the two billion dollar mistake." -Anders Hejlsberg
"It is by far the most problematic part of language design. And it's a single value that -- ha ha ha ha -- that if only that wasn't there, imagine all the problems we wouldn't have, right? If type systems were designed that way. And some type systems are, and some type systems are getting there, but boy, trying to retrofit that on top of a type system that has null in the first place is quite an undertaking." -Anders Hejlsberg
> "My favorite is always the billion dollar mistake of having null in the language. And since JavaScript has both null and undefined, it's the two billion dollar mistake."
> -Anders Hejlsberg
Why can't all-1s be null? E.g. a small int goes from the range 0-255 to the range 0-254, but we get a really useful property with no out-of-band Nullable overhead. With signed ints it even leads to symmetric ranges in the negative and positive directions.
https://python-history.blogspot.com/2010/08/why-pythons-inte...
https://forth-standard.org/standard/diff?utm_source=chatgpt....
https://atariwiki.org/wiki/Wiki.jsp?page=Converting+FIG-Fort...
>4. For various reasons the definition of all divide functions' general effect is that quotients are floored instead of rounded toward zero. This should cause no problems for most pre-existing application software. The new divide functions are marginally slower than the old (a few machine cycles under most circumstances). The side-effects of the redefinition for floored divide can be counter-intuitive under some circumstances. For example, in FIG-Forth the operation

    -40 360 MOD

>would return the obvious answer (-40) on the stack, while 83-Standard Forth will return the answer 320!

>5. The true flag returned by all logical operations has been changed from the value 1 (in FIG-Forth) to the value -1 (in Forth-83, all bits set). If your code used the 0 or 1 returned by a comparison in an arithmetic operation, you will need to interpolate the operator ABS after the logical operator. This is a particularly difficult problem to look for in your source code. However, we feel that this mutation in the 83-Standard was beneficial as it allows the returned true/false value to be used as a mask for AND.
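The floored-versus-truncated distinction the quote describes is easy to reproduce in JavaScript, whose `%` operator truncates toward zero (FIG-Forth style), while a floored mod behaves like Forth-83:

```javascript
// JS's % truncates toward zero; a floored modulus always takes the
// sign of the divisor, which is what Forth-83 standardized.
const truncMod = (a, n) => a % n;
const floorMod = (a, n) => ((a % n) + n) % n;

truncMod(-40, 360); // -40  (FIG-Forth's answer)
floorMod(-40, 360); // 320  (Forth-83's answer)
```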
I always suspected that FORTH had inconsistencies in division across versions. That's why the lord told us to Go FORTH and Multiply instead.
The point is that TypeScript and C# are extremely similar for a good reason, not a coincidence, and that Anders Hejlsberg knows what the fuck he's doing and talking about, and has been implementing amazing groundbreaking well designed languages and IDEs for a very long time. Turbo Pascal was so great it flummoxed Bill Gates, so Microsoft sent a limo to recruit and hire Anders Hejlsberg from Borland, then he made Visual J++, Windows Foundation Classes, C#, and TypeScript.
https://en.wikipedia.org/wiki/Turbo_Pascal
>Scott MacGregor of Microsoft said that Bill Gates "couldn't understand why our stuff was so slow" compared to Turbo Pascal. "He would bring in poor Greg Whitten [programming director of Microsoft languages] and yell at him for half an hour" because their company was unable to defeat Kahn's small startup, MacGregor recalled.
https://news.ycombinator.com/item?id=8664370
>"According to the suit, Microsoft also offered Mr. Hejlsberg a $1.5 million signing bonus, a base salary of $150,000 to $200,000 and options for 75,000 shares of Microsoft stock. After Borland's counteroffer last October, Microsoft offered another $1.5 million bonus, the complaint says."
It’s a good language that scales quite well to the point where you can then extract specific parts to more performant languages.
99.9% of people won’t have that problem, so I think they should just use TS and solve problems.
Anyone else can be safely ignored and they can complain in the corner.
Python is a dynamic language as well, and in many ways worse than JS, but

    [] + {}

raises a TypeError. In JS,

    [] + {}

is the string "[object Object]" and

    {} + []

is 0. It's not about being smug; it's that in no way, shape or form does that make any sense.

Second, {} + [] isn't a type conversion issue, it's a parsing issue. That {} isn't an object, it's a code block. Assign {} to a variable to ensure it's an object, then do var + [] and you get the same result as the first one.

When using an actual object in both of these, the type conversion makes sense: "+" acts on primitives like strings and numbers, so it has to convert them first. You're getting obj.toString() + array.toString() in either case.

I'll admit the parsing issue here is odd, but most of the time people's complaints about JavaScript type coercion are that they simply never bothered to learn how it works.
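For the curious, the parsing asymmetry can be demonstrated directly; `eval` is used here only to force statement-position parsing:

```javascript
// At statement position, {} parses as an empty block, leaving +[]:
// Number("") === 0. Inside parentheses, {} is an object literal and
// + concatenates the string conversions of both operands.
eval("{} + []");   // 0
eval("[] + {}");   // "[object Object]"
eval("({} + [])"); // "[object Object]"
```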
Or are you arguing that ceteris paribus you'd rather not have the language throw an error or just propagate undefined?
Why is it still here today?
What breaking changes has JS ever had? Its an incredibly stable language.
If the platform can continuously shift, in an ecosystem that rots faster than my veggie garden, can we not have more sensible systems in place?
I’ll try again. What breaking changes does JavaScript make? What are you talking about?
We have had so many APIs go from being available in any context to "secure contexts" only. Like AppCache, or workers, or the clipboard.
Entire APIs like XMLHttpRequestProgressEvent have vanished entirely.
In fact, there are a lot of obsolete features. Enough that there's a list. [1] You can't use global or source with a RegExp anymore. You can't use the arity property on a function anymore. `Object.prototype.eval` is dead. `Object.getNotifier` is dead. `Date.prototype.toLocaleFormat()` is dead.
Old websites have been broken. Repeatedly. Or we wouldn't have, say, blogspam on why mixed content suddenly broke some Wordpress systems.
Yes, these are good changes. They're more secure. However, we're willing to break some sites in the name of security, but unwilling to fix ambiguous parsing for the programmer?
[0] https://news.ycombinator.com/item?id=41912354
[1] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
soo.... if the pesky people keep complaining, maybe it really doesn't make any sense? I, for the life of me, could never figure out why a bunch of mainstream languages a couple of decades ago decided that typing was no longer necessary and should be abolished. Is there any benefit to removing type checking (and relying on bizarre implicit coercion rules when ambiguity ensues)?
For today's lucky 10,000:
This is one of those cases where the problem isn't language design but boardroom politics - Sun wanted JS not to compete with Java so they had to try to make JS more like Java to present JS as a kind of "Java for amateurs".
Think hard about whether your use case really cares about local time, try and find ways to make instants appropriate. Then stick to UTC ISO 8601 strings / Unix timestamps and most of the complexity goes away, or at least gets contained to a small part of your software.
I know this isn't always possible (I once had to support a feature that required the user took a break covering two periods of 1-5am local, which was obviously fun around DST boundaries) - but in my experience at least the majority of the time you can find ways to minimise the surface area that cares.
If you're passing raw/unvalidated user input to the date parser you're holding it wrong.
TBH even when working with other languages I'd skew towards doing this, possibly because I've worked with JavaScript/TypeScript too much. It's a balance, but there's a certain comfort in making constraints really explicit in code you control over blindly trusting the standard library to keep its behaviour over the lifetime of the product.
[Small edits for clarity.]
Try that tactic with FUTURE dates.
Meet at 7pm still means meet at 7pm when timezones change, countries make changes to when their summer time starts, etc. Which happens all the time.
And it's actually a more subtle problem. You actually need the context of the timezone in some applications. If your application is showing dinner reservations, for example, you want to display the time in the local time of the restaurant, not the local time of the user. You want to know when it's booked THERE, not where you happen to be right now. I want to know my booking is at 7pm, not 1pm because I happen to be in America right now.
So using GMT/UTC is not a panacea for all the date problems you face.
It's only a solution for dates in the past. And even then you might argue that sometimes it's worth also storing the local time of the user/thing the event happened to, or at the very least the timezone they were in when it happened in a separate field.
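One way to sketch the "store the zone too" idea (the record shape and field names here are my own invention, not a standard): keep the unambiguous instant alongside the zone it is meaningful in, and format for display using that zone rather than the viewer's:

```javascript
// Hypothetical record shape: the instant plus the restaurant's zone.
const reservation = {
  utc: "2025-07-12T18:00:00Z",  // unambiguous point in time
  zone: "Europe/London",        // where the booking actually happens
};

// Render in the restaurant's local time, wherever the viewer is:
const local = new Date(reservation.utc).toLocaleTimeString("en-GB", {
  timeZone: reservation.zone,
  hour: "2-digit",
  minute: "2-digit",
});
// local === "19:00" (BST, UTC+1, in July), even for a viewer in America
```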
The frequency with which time zones are changed surprised me the first time I looked it up. For a single country it's probably quite a big deal that doesn't happen too often. But internationally there are several changes each year. I think it was like 4-6 changes per year in the past decades.
But - wouldn't this be just as horrible in Go or Rust or any other language? (Or god forbid, C?) Are there better timezone APIs in other languages than what you can find in NPM that make these problems any easier to deal with?
JS has libraries that make the problem easier too. Though you'll never find a magic library that makes all the complexity disappear, because at the end of the day, you do need to tell the computer whether you mean 7 pm on the wall clock of where the users happen to be, 7 pm on a specific date in a specific time zone, etc. and doing math between dates is inherently complex with e.g. daylight savings changes. You might end up creating a datetime that doesn't exist (which Elixir handles well).
Many European railway timetable and booking websites fall foul of this.
Oh absolutely. But the difference between a sane api and other APIs is that a sane one will fail in a sane way, ideally telling me I’m holding it wrong.
Especially: the key at every turn is to fail hard if there is anything even slightly wrong rather than do something potentially incorrect. Many JS apis seem designed to continue at any cost which is the fundamental problem. You never really want a NaN. You don’t want to coerce strings to [whatever].
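The Date constructor is a good illustration of "continue at any cost": invalid input never throws, it just produces a NaN that flows onward.

```javascript
// new Date() on garbage doesn't throw; it returns an "Invalid Date"
// whose numeric value is NaN, which then propagates silently.
const d = new Date("definitely not a date");
d.getTime();          // NaN
d.getTime() + 60000;  // still NaN: no error, just garbage downstream
```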
Exactly. I would have never thought about using the Date class in this way. So the behavior is pretty much wtf, and local time can get pretty complicated, but I wouldn't expect to get the right time when passing some vague strings.
There are so many valid and reasonable cases where this will bite you.
"Real-world data" from CSV files or XML files or whatnot (that you don't control) and sometimes they have errors in them, and it's useful to know when instead of importing wrong data.
You do something wrong by mistake and get wrong/confusing behaviour that you literally never want, and then you need to debug where that's coming from.
The user gives you a date, you have no idea what they entered and you want to know if that's correct.
But I do dream of an alternative timeline, in which the web evolved differently.
It can be a bit belt and suspenders doing validation of specific forms but shit happens. It's much better to catch stuff before it's persisted on the back end or on disk.
And there can also be man-in-the-middle attacks or whatever; the effort you put into validation still depends on the task at hand, and how critical an error would be.
But even for the most trivial tasks I would never think of passing some user strings to Date and expect to get a valid value.
You decide on the format that makes sense for the user, you validate the input against that format, and then you map it into a native date.
If the date constructor supports the format you want, it’s only coincidence that makes that final mapping step simpler.
So, the native date constructor having weird behavior on invalid strings doesn’t really change real world code.
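A minimal sketch of that approach (the helper name and the accepted format are my own choices): validate the exact shape first, then construct the date from numeric parts, and reject values like February 30th that Date would silently roll over.

```javascript
// Accept only strict YYYY-MM-DD; everything else throws.
function parseISODate(input) {
  const m = /^(\d{4})-(\d{2})-(\d{2})$/.exec(input);
  if (!m) throw new Error(`Invalid date: ${input}`);
  const y = Number(m[1]), mo = Number(m[2]), d = Number(m[3]);
  const date = new Date(Date.UTC(y, mo - 1, d));
  // Date.UTC(2025, 1, 30) silently becomes March 2nd; detect the rollover.
  if (date.getUTCFullYear() !== y ||
      date.getUTCMonth() !== mo - 1 ||
      date.getUTCDate() !== d) {
    throw new Error(`Invalid date: ${input}`);
  }
  return date;
}
```

The constructor only ever sees numbers you have already checked, so none of its string-parsing behavior matters.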
Burn it and start again.
Good news! The builtin Temporal API is on its way. It's widely regarded as a solid API, learning from some of the best time APIs in other languages.
In terms of parsing, it looks like Temporal only accepts RFC 9557 strings (an extension of ISO 8601 / RFC 3339). Anything else throws an exception.
Of course... people will keep reaching for Date, because there are going to be more outdated results on search engines for Date than for the Temporal API. But at least there's something positive coming!
You don't need to burn it, you just need to never rely on the constructor for parsing user input. It all works fine as long as you know the structure of the data before you pass it in.
Neither seem awesome.
Let’s say you started a date lib right now that would take random strings and do its best.
Suppose you identify the string is only integers. What logic needs to be applied to make the example I pointed out make any sort of common sense?
why would you do that? that's a bad idea, and no matter what implementation you choose you will end up with some silly compromises like javascript has in its date constructor. javascript's default choices are probably no better or worse than whatever choices you would make.
if you're writing a date parser, the first thing you should do is come up with a better definition than "take random strings and do your best" unless you're putting an LLM behind it.
A rather large disadvantage in a programming language for developing user interfaces
Usually I can overcome that urge. It's curious that these designers could not.
The junior developer experiences nothing but errors, and struggles to get something over the line
The intermediate developer adopts a "reduce errors at all costs" mentality and their parsers are too helpful: they make too many assumptions, and you end up with Date-class-style behaviour.
The senior developer knows the deadliness of this, and designs a robust class that behaves consistently and errors early on invalid input.
Implementations are ongoing and open source, so you can contribute if you want it to come faster! V8 is currently aiming to use https://github.com/boa-dev/temporal IIRC.
I've had to develop interfaces built entirely around date and time operations, and it was painful not so much because of the awful and broken date parsing in JS (although it definitely doesn't help), but rather because of the inherent complexity of how we represent calendar dates and times, and how we build abstractions around these representations. Definitely NOT looking forward to developing that booking system at work...
My son was pretty happy with his 11/28 without any experience with js Date. Just deduced it from previous answers. And I explained type coercion to him.
I now realize I may have put him off a career in IT.
Second, I opened this in Firefox with the console open to answer these questions, and found these divergences (to summarize, Firefox is strict):
Question 14:
new Date("12.1")
> "12.1" is interpreted as the date December 1st, and as before for dates with no year the default is 2001, because of course.

Firefox returns an Invalid Date object instead.
Question 16:
new Date("12.-1")
> The dash here is ignored, so this is interpreted the same as "12.1".

Again, Firefox returns an Invalid Date object instead.
Question 19:
new Date("maybe 1")
> "may" in "maybe" is parsed as the month May! And for some reason this expression cares about your local timezone, which happens to be BST for me right now.

Seems a broken record, but this is still an Invalid Date in Firefox.
Question 20:
new Date("fourth of may 2010")
> "fourth of" is ignored, this is just parsing "may 2010" and again local timezone is important.

Ibid in Firefox.
Question 21:
new Date("May 4 UTC")
> UTC is correctly parsed as a timezone.

No, Firefox is still not accepting this one.
Question 22:
new Date("May 4 UTC+1")
> You can add modifiers to timezones and it works as you would expect.

Nor this one.
Question 23:
new Date("May 4 UTC+1:59")
> It also supports minutes!

Firefox: not really.
Final note: It parses Question 24 as you expect in Firefox. Which honestly, it shouldn't!
We ended up with a bunch of Safari specific workarounds that weren't necessary on Chrome (it was mostly a webview use case so Safari and Chrome were the two we cared about at the time)
Amusingly to me, this was around the same time that Apple seemed to have DST problems more generally, such as their iOS alarm clock mishap https://www.theguardian.com/technology/blog/2010/nov/01/ipho...
    new Date("2025-07-12 12:30:45 BST")

is an invalid date, whereas all three of

    new Date("2025-07-12 12:30:45 GMT")
    new Date("2025-07-12 12:30:45 UTC")
    new Date("2025-07-12 12:30:45 Z")

are valid but in British Summer Time.

I coded security js code for a decade, right when the standard started to get the many updates.
our system was just a really tiny subset of things you could use that worked safely (and predictably) across browsers. even after the updates, the only things we incorporated were array.filter and structuredClone. everything else offered no real benefit and only added to the attack surface.
and then came typescript. the biggest missed opportunity in js history.
even today, good js is knowing you can only use 1% of the language, and even that with lot of care
If you're going to use fancy fonts at least make them webfonts.
I picked fonts that are present on a wide variety of systems without having to be downloaded on page load. I just forgot some CSS to make sure numbers looked distinct.
Fortunately someone sent me a PR with the right CSS.
I think my strategy for JavaScript going forward is to 'drop & run'.
Anyway, after the experience trying to automate something with Google Sheets and wasting 4 hours just to discover that months start at 0 (in the context of parsing string dates and creating Date objects)... yep, no more JS for me.
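For anyone who hasn't hit it yet: in the numeric Date constructor the month argument, and only the month, is zero-indexed.

```javascript
// Months are 0-11; days of the month are 1-31. Of course.
new Date(2025, 0, 15).getMonth();   // 0  -> January 15th, 2025
new Date(2025, 11, 15).getMonth();  // 11 -> December 15th, 2025
```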
Chrome is implementing it[2]
WebKit has an open PR[3]
I found no reference to Node working on it
and of course you can already use a polyfill (though it's quite heavy)
[1] https://deno.com/blog/v1.40?ref=computus.org#temporal-api
[2] https://chromestatus.com/feature/5668291307634688?ref=comput...
Realistically, you're going to use an actual date library, because even the good bits of Date are pretty bad.
But this is somewhat my point: what were the reasons that led to parsing those strings as dates? I guess the answer is usually legacy, but within that "legacy", why did someone need to make those decisions? My naive way of parsing dates would be to reject everything that doesn't have a well-defined format, keeping the format well defined and at least somewhat composable. But here someone had the need (I hope they had the need) to parse these seemingly random strings, and I wonder why.
And, "don't use the standard library" is not a solution. Well I mean it is, but it's not free - there's caveats. Which tools do you choose? How do you make sure all your engineers use the same ones? How do you make sure nobody touches the standard library? It's not an easy problem.
If that’s not relevant, I don’t know what is.
`getYear` is literally deprecated everywhere and is not part of the spec.
https://tc39.es/ecma262/multipage/numbers-and-dates.html#sec...
This is what we call a ‘foot gun’.
So it’s more like a foot dollar store water gun.
And, in fact, most C++ compilers today will actually warn you when you dereference a nullptr in a stupid, obvious way. Evidently, JS has not caught up to this incredibly sophisticated technology /s.
Which at least will tell you you’re using a deprecated API, but won’t overcome the dumb API naming choices JavaScript blindly imported from Java
getYear() returns 125 as it was standard for dates to be offset from 1900 (which led to the Y2K problem). This behaviour should be maintained forever. "Nothing is more important than backwards compatibility"
Or rather, that should be the mindset, so that we can achieve at least 90% backwards compatibility in practice.
It is just programmer education to know to add 1900 to years when using struct tm and also to use getFullYear() in JS.
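The contrast in current engines, for anyone who hasn't seen it:

```javascript
// getYear() still ships (ECMAScript Annex B) and still returns years
// since 1900, the same struct tm-style offset behind the Y2K scare.
const d = new Date(Date.UTC(2025, 6, 12)); // July 12th, 2025
d.getYear();     // 125
d.getFullYear(); // 2025
```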
Once that had been done, future versions of javascript had to maintain the same behaviour. This is where I'm using the backwards compatibility argument.
I suppose the offset method was really quite entrenched, so no one thought to special-case that function, instead special-casing getFullYear() to return the full "2025".
The first set of questions are logical and make some sense. But then there come questions asking you similar but with slightly different values.
And that’s when everything stops making sense.
At least on my pretty standard Windows 10 system. No idea which ones of the fonts Avenir, Montserrat, Corbel, URW Gothic, source-sans-pro, sans-serif is being used.
I'm not sure if it's caused by JavaScript quirkiness under the hood or a bug in the library, but it's mindblowing.
I am willing to (sarcastically) stake something of value on the premise that you are considered to be an interesting individual to interact with by your interlocutors and bystanders at socially coordinated gatherings of individuals convened for the explicit purpose of engaging in recreational, celebratory, or convivial activities, often characterized by the consumption of food and beverages, the enjoyment of music or other entertainments, and the collective participation in structured or unstructured merriment.
But the moral of this story should be: if you need a date from a user, use a date input; and if you need to parse a computer-readable date, always use UNIX timestamps or RFC 3339 (which are correctly and consistently supported).
Dates and times are insane, no matter the language.
Adding Temporal will only add to the chaos. Now there will be Date, moment, Luxon’s objects, and Temporal. See??? We fixed it!!!
"0" and "1" are for years; everything above is a month.
Does the text parsing only work for English or other languages too?
I think that after the 1970s "worse is better" languages vanish from the Earth, the last shadow of that thinking left might be JavaScript. Hopefully by then humans aren't writing it, but of course for "compatibility" it'll still be implemented by web browsers.
> They're not the same kind of thing at all, if you insist that we should be able to compare them then they're not equal, and since programmers are human and make mistakes probably comparing them is itself a mistake.
It is literally encoded in the spec.
You're saying that the design makes sense because there's a definition, and the definition is what makes it make sense.
There is value in having defined behaviors, but those behaviors can't be immune from criticism. That's letting the tail wag the dog. The purpose of a program is not to execute the rules of the programming language. It's to perform a real and useful task. If those real and useful tasks are complicated because synthetic and arbitrary behaviors of the language exist, then the language is wrong. The tool exists to do work. The work does not exist to provide academic examples for the language.
And, yes, it's possible for it to be impossible to determine a reasonable behavior, but that still doesn't mean we can't have reasonable behavior whenever possible.
The tasks are not complicated because of this, it literally is default behavior in mainstream languages. And no, they’re neither synthetic, nor arbitrary limitations. The rule is based on types, not on whatever one specific value might mean.
And if they were to define “exceptions”, where do you draw the line?
“F41S3” is this? False? No? What if I’m a l33t h4x0r? What about 0xF4153? Looks false enough to me.
You know what you do when you can't handle an exceptional case? You throw an exception! Emitting errors is not undesirable behavior! It just means the computer says, "I don't know what to do so I better stop." You're never going to design all possible exceptions away, and that's not a flaw.
The question you should be asking is: why does it ever make sense to silently compare a character string to anything other than a character string? Semantically, it's nonsense to compare different types. The only way you can do it is when the other type has a canonical string representation, never minding issues of culture or language.
This is why, for example, C# has String.IsNullOrWhitespace() and String.IsNullOrEmpty(). It's partly to cover common combinations, but also to idiomatically determine if a meaningful value is present, which is what `StringVal == True` and `if (StringVal)` are trying to express and failing at.
Because this is how C does it.
But hilariously it's not even true, you see C believes the empty string is also true, it's a string, only the absence of a string would be false in C - whereas Javascript is quite sure the empty string, just like an undefined or null value, is false.
No.
> Some comparisons just don't make sense.
The comparison makes enough sense to me to determine the result.
They should’ve just deprecated it right away, invent Time Machine, move to the future, grab Rust and make it the scripting language. Duh.
And then all of a sudden, code that is expecting to get an array or undefined gets handed a zero or an empty string, because someone called it with

    x && [x]

or

    x.length && x

And that's how you end up with a zero showing up on a webpage instead of a list of to-do items when the list is empty.

Programming languages aren't for the machine, they're for humans, and humans make mistakes, so we need to design the language with that in mind. "Truthiness" is a footgun: it increases the chance you'll write something you did not mean without even realising.
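The rendering bug in one line: `&&` returns its left operand when that operand is falsy, and 0 is a perfectly renderable value.

```javascript
const items = [];
// Intended: "render the list only if it's non-empty".
// Actual result: 0, which a JSX-style renderer happily prints on the page.
const out = items.length && "<list markup>";
```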
there are plenty of javascript examples that are actually weird though, especially when javascript DOES apply meaning to strings, e.g. when attempting implicit integer parsing.
Because "0" == false is true. In a logical world, a non-empty string being truthy is fine even if the value is "false". JavaScript isn't logical.
    true
Excuse me?
> In a logical world, a non-empty string being truthy is fine even if the value is "false". Javascript isn't logical.
You must hate our illogical world built on C, because it has the same behavior.
Should "0" also === 0? How about "{}" === {}? When does it stop?
Python does the same thing. I don’t like it there either, but at least it’s more consistent about it.
In other words:

    if ("false") { /* you will be here */ }