JavaScript is a fine language these days, but it wasn’t always this way. I think we’re all familiar with how bad of a language JavaScript was for a very long time.
There’s really no shortage of articles on the Internet railing against the language and its variety of strange edge cases and seemingly perplexing behavior.
However, ever since ECMAScript 6 (ES6) was released in 2015, much of the language was polished up, many of its wrongs were corrected, and each ES release has made working with the language acceptable and even enjoyable.
This article is not an attack on the language, the paradigms and patterns it encourages, or the unfavorable circumstances in which it was born. Rather, it is a brief guide to understanding the pain points of the language and effectively writing better JavaScript.
The Major Wrongs
Wrong #1: Problematic ASI Implementation
Now this notion has become somewhat controversial in recent years, but JavaScript has a mechanism called Automatic Semicolon Insertion (ASI) that automatically inserts semicolons into your code where it deems them necessary.
```javascript
// Statements can be terminated by ;
doStuff();

// ... but they don't have to be, as semicolons are automatically inserted
// wherever there's a newline, except in certain cases.
doStuff()
```
What this means is that in JavaScript, semicolons are technically optional. The problem is that the insertion rules are subtle, and there are cases in which the ASI mechanism will break your code.
You can choose to ignore those cases, but you’d still have to agree: it’s not “automatic” if you need to go out of your way to protect your code from an edge case. It’s a wrong.
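A quick sketch of the classic edge case (the function name is illustrative): ASI inserts a semicolon immediately after a bare return, so the object literal on the next line is never returned.

```javascript
// ASI inserts a semicolon right after `return`, so this function
// silently returns undefined instead of the object below it.
function getConfig() {
  return
  {
    debug: true
  };
}

console.log(getConfig()); // undefined
```

The fix is simply to start the object literal on the same line as the return.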
Wrong #2: One Singular Number Type
As opposed to many other languages, JavaScript has one singular numeric type, Number. This type is used for both integers and floats, which is counter-intuitive for many developers coming from a language that has clear and useful distinctions between integers and decimal numbers.
```javascript
0.1 + 0.2; // = 0.30000000000000004
5 / 2;     // = 2.5
18.5 % 7;  // = 4.5
```
It takes constant awareness on the programmer’s part to mentally process all numeric operations as float-based, and it also requires the use of the Math object for rounding or for finding the floor or ceiling of an operation, which is typically cumbersome.
Additionally, using only one type for numbers means integers can only be represented exactly up to 53 bits. For working with bigger integers you’ll probably need to grab a library.
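To make the 53-bit limit concrete, here is a short sketch of where the single Number type gives out:

```javascript
// Number is an IEEE 754 double, so integers are only exact up to
// Number.MAX_SAFE_INTEGER (2^53 - 1).
console.log(Number.MAX_SAFE_INTEGER);               // 9007199254740991
console.log(9007199254740992 === 9007199254740993); // true: precision is gone

// Integer-style operations go through the Math object:
console.log(Math.floor(5 / 2));  // 2
console.log(Math.trunc(-5 / 2)); // -2
```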
Wrong #3: Three Not-A-Real-Number Values: Infinity, -Infinity, NaN
Thanks to JavaScript’s “the show must go on” attitude, invalid or nonsensical calculations such as division by zero don’t cause an error or throw an exception. Instead, they yield one of three not-a-real-number values that are… actually of the Number type.
```javascript
Infinity;  // result of e.g. 1/0
-Infinity; // result of e.g. -1/0
NaN;       // result of e.g. 0/0, stands for 'Not a Number'

typeof Infinity;  // "number"
typeof -Infinity; // "number"
typeof NaN;       // "number"
```
This makes working with numbers a chore and tracking down bugs much harder in programs that are heavy on calculations. It also leads to the necessity of methods such as isFinite, isInteger, and isNaN for number checking.
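A brief sketch of why those checking methods exist at all, and of a subtlety between the global isNaN and Number.isNaN:

```javascript
// NaN is the only value in the language that is not equal to itself,
// which is why dedicated checks are needed.
console.log(NaN === NaN);           // false
console.log(Number.isNaN(NaN));     // true

// The global isNaN coerces its argument first, so it can mislead:
console.log(isNaN("hello"));        // true: "hello" coerces to NaN
console.log(Number.isNaN("hello")); // false: not actually the NaN value
```

Prefer Number.isNaN (and Number.isFinite, Number.isInteger), which do not coerce.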
Wrong #4: Two “null” Types
Personally, I don’t have much of a problem with the inclusion of these two types, but the fact that there are two different null or nil types adds to the growing list of factors that can cause developers to introduce bugs into their software.
```javascript
let bag;          // undefined
bag = {};
bag.candy;        // undefined
bag.candy = null; // null
```
Generally, undefined refers to an uninitialized variable or non-existent object property, and null to a deliberately empty variable or object property, but knowing when to check for one or the other, and why you are checking for that one in particular, can be a bother.
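One common coping pattern is worth sketching (variable names are illustrative): the loose comparison x == null matches both values at once.

```javascript
let box;          // never assigned, so undefined
let prize = null; // deliberately empty

console.log(box === undefined); // true
console.log(prize === null);    // true

// The loose check `== null` matches both undefined and null, which is
// one of the few defensible uses of abstract equality:
console.log(box == null);   // true
console.log(prize == null); // true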
Bonus wrong:
The typeof operator is broken when it comes to the null use case:

```javascript
typeof null; // "object"
```
Wrong #5: Broken Abstract Equality (==) Algorithm
There’s really not much else to be said here. The abstract equality algorithm behind the == operator is comically bad and breaks the mathematical concept of equality and equivalence relations.
It just shouldn’t be used. The strict equality operator === should be used in its stead, but having to be aware of this fact, and avoiding an entire operator in the language, is just in poor taste and trivially maddening.
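One concrete way == breaks an equivalence relation is that it isn’t transitive, which === avoids by never coercing:

```javascript
// Abstract equality is not transitive:
console.log(0 == "0");  // true
console.log(0 == "");   // true
console.log("0" == ""); // false, so a == b and a == c does not imply b == c

// Strict equality compares type and value without coercion:
console.log(0 === "0"); // false
console.log(0 === 0);   // true
```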
Wrong #6: Unmemorable Type Coercion
When dealing with dynamic variables and loose typing, you’ll eventually need to do some type coercion to complete an operation. The problem with JavaScript is that the implicit type coercion done by the engine is unmemorable, unconventional, and can lead to a large number of bugs in your production code.
For example, the + operator can lead to some unexpected outcomes when mixing strings and numbers:

```javascript
13 + !0;   // 14
"13" + !0; // '13true'
```
And not to mention that all of these are valid, but puzzling, language expressions:

```javascript
true + false
12 / "6"
"number" + 15 + 3
15 + 3 + "number"
[1] > null
"foo" + + "bar"
'true' == true
false == 'false'
null == ''
!!"false" == !!"true"
['x'] == 'x'
[] + null + 1
[1,2,3] == [1,2,3]
{}+[]+{}+[1]
!+[]+[]+![]
new Date(0) - 0
new Date(0) + 0
```
As a developer, you’ll want to stick to explicit type coercion via the wrapper class constructors Number, Boolean, and String, or some of the built-in parsing methods.
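A short sketch of what that explicit style looks like in practice:

```javascript
// Explicit coercion makes the intent obvious at the call site:
console.log(Number("13") + 1);     // 14
console.log(String(13) + 1);       // '131'
console.log(Boolean(0));           // false

// Built-in parsing methods for strings with trailing noise:
console.log(parseInt("42px", 10)); // 42
console.log(parseFloat("3.14"));   // 3.14
```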
The good news is that the falsy coercions are very easy and intuitive to remember: false, null, undefined, NaN, 0, and "" are falsy; everything else is truthy.
Wrong #7: The var Keyword and Function Scoping
Yes, yes, I know – this is no longer a concern after ES6. But it is a testament to bad language design that for more than a decade programmers had to accept the fact that declaring a variable could literally be a global hazard:
```javascript
// Variables are declared with the `var` keyword. JavaScript is dynamically
// typed, so you don't need to specify type. Assignment uses a single `=`
// character.
var someVar = 5;

// If you leave the var keyword off, you won't get an error...
someOtherVar = 10;

// ...but your variable will be created in the global scope, not in the scope
// you defined it in.
```
Basically, you could (well, you still can) introduce global variables into your program with var if you weren’t careful, and there was no true block scoping for variables, which made understanding something as simple as scoping puzzling.
Now you should completely desist from using var, always use let and const, which provide conventional block scoping, and include a 'use strict' directive at the top of your script to prevent unintended global variable declarations and other problematic run-time behavior.
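A minimal sketch of the scoping difference (function names are illustrative): var leaks out of the block it was declared in, while let stays confined to it.

```javascript
// var is function-scoped: the loop variable survives the loop.
function varScope() {
  for (var i = 0; i < 3; i++) {}
  return i; // i is still visible here
}

// let is block-scoped: the same pattern is a ReferenceError.
function letScope() {
  for (let j = 0; j < 3; j++) {}
  try {
    return j; // j does not exist in this scope
  } catch (e) {
    return e instanceof ReferenceError;
  }
}

console.log(varScope()); // 3
console.log(letScope()); // true
```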
The Minor Wrongs
Briefly, these are what I would consider some minor wrongs that are easy to live with but have a learning curve, or are just peculiarities of JavaScript’s unique language design, which enables its rich flexibility but also its complexity:
- Convoluted and poor inheritance model
- Complicated use of the this keyword
- Poor standard library
I think any programmer who has come to JavaScript from a language like Java or C# has had to wrestle with the language’s confusing class syntax and the convoluted object model used to implement classes.
Most of this was due to ill-devised conventions and syntax decisions made to accommodate enterprise developers. In reality, JavaScript is based on prototypes (which I consider great), not classes, and its syntax should reflect that; the mismatch definitely adds to the difficulty of becoming proficient in the language for most existing developers.
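To illustrate the mismatch, here is a minimal sketch (the Dog names are made up) of how class syntax is sugar over the prototype mechanism that actually underlies it:

```javascript
// The pre-ES6 way: a constructor function plus a prototype method.
function DogFn(name) { this.name = name; }
DogFn.prototype.bark = function () { return this.name + " says woof"; };

// The ES6 class syntax, which desugars to roughly the same thing.
class DogClass {
  constructor(name) { this.name = name; }
  bark() { return this.name + " says woof"; }
}

console.log(new DogFn("Rex").bark());    // "Rex says woof"
console.log(new DogClass("Rex").bark()); // "Rex says woof"
console.log(typeof DogClass);            // "function": classes are functions
```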
Additionally, the runtime and concurrency model in which JavaScript is executed is unusual among conventional programming languages, since all JS code is executed in an Event Loop Runtime with Event-Driven Non-Blocking Asynchronous I/O.
And if you don’t understand what that means and what implications that has for your code, well… See my point? Have fun!
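For a taste of what the event loop implies, a minimal sketch: even a zero-millisecond timer callback runs only after all currently executing synchronous code has finished.

```javascript
const order = [];

order.push("sync 1");
setTimeout(() => order.push("timer"), 0); // queued, not run yet
order.push("sync 2");

// The timer callback has not run by this point, despite the 0 ms delay:
console.log(order); // ["sync 1", "sync 2"]
```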
Writer’s Note: Some of these language examples were taken from the Learn X in Y Minutes JavaScript article. I recommend you visit the site for quick, scenic tours of your favorite programming languages.