Working With the Number Type
Explore the JavaScript number type, which uses a single format for integers and floating-point values. Understand the different literal formats, such as decimal, octal, and hexadecimal, and learn about special values such as NaN, Infinity, and the distinction between positive and negative zero. This lesson helps you grasp how JavaScript handles numeric data and the common pitfalls in number operations.
Unlike many languages, JavaScript does not distinguish between different kinds of numbers, such as integers and floating-point values of various sizes.
It has a single number type that represents both integers and floating-point values using the 64-bit IEEE 754 double-precision format.
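As a quick illustration (a minimal sketch you can run in any JavaScript console), integer-looking and fractional values report the same type:

```javascript
// Integers and floating-point values share the single "number" type.
console.log(typeof 42);   // "number"
console.log(typeof 3.14); // "number"

// Because both are stored as IEEE 754 doubles, 1 and 1.0 are the same value.
console.log(1 === 1.0);   // true
```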
You can check the Number.MIN_VALUE and Number.MAX_VALUE constants to see the limits of this representation:
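For example, a short sketch that prints both constants (the values in the comments are what current engines report for IEEE 754 doubles):

```javascript
// Smallest positive value representable (closest to zero, not the most negative number).
console.log(Number.MIN_VALUE); // 5e-324

// Largest finite value representable.
console.log(Number.MAX_VALUE); // 1.7976931348623157e+308
```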
Despite the single storage format, JavaScript provides several literal formats.
Integer literals
Integers can be written as decimal, octal, or hexadecimal literals. Decimal literals are by far the most common:
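For instance (the variable names below are purely illustrative):

```javascript
// Ordinary decimal integer literals.
let count = 42;
let year = 2024;
let negative = -15;

console.log(count, year, negative); // 42 2024 -15
```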
When an integer literal starts with 0, it is parsed as an octal literal, unless a digit outside the 0-7 range is detected; in that case, the leading zero is ignored and the number is parsed as a decimal.
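A short sketch of how these literals behave; note that legacy octal literals (a plain leading 0) are only accepted in non-strict mode, so this snippet must run as a non-strict script:

```javascript
// Legacy octal literal: leading 0 with digits 0-7 only (non-strict mode).
console.log(070);  // 56 (7 * 8 + 0)

// A digit outside 0-7 makes the leading zero meaningless: parsed as decimal.
console.log(079);  // 79

// Hexadecimal literal: 0x prefix with digits 0-9 and a-f.
console.log(0x1f); // 31 (1 * 16 + 15)
```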