
What does normalizing with hidden bit really mean?

Submitted by: @import:stackexchange-cs

Problem

I have a question about representing numbers in binary floating point. For example, if I have the number $$0.000011 \cdot 2^3$$ is its normalized form this?
$$1.1\cdot 2^{-2}$$
Generally speaking, when normalizing with a hidden bit, does it imply that the first digit of the mantissa should always be zero, or does the hidden bit just mean that a "1" should be placed to the left of the point?

Solution

Generally speaking, normalized means "put in scientific notation": the mantissa should never start with 0, and its leading digit should be less than the base. In binary the only digit satisfying both conditions is 1, so the mantissa of a normalized binary floating-point number always starts with 1. Since that leading bit is always 1, we don't need to store it. It is "hidden" in the sense that it always exists, but we don't actually store it, because we know its value is 1.
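To make the hidden bit concrete, here is a small sketch (not from the original answer; the helper name `decode_float32` and the use of IEEE 754 single precision are my assumptions) that unpacks a float32 and reconstructs its value by re-attaching the implied leading 1 to the 23 stored fraction bits:

```python
import struct

def decode_float32(x):
    """Unpack a float32 into sign, biased exponent, and stored fraction
    bits, then reconstruct the value using the hidden leading 1."""
    (bits,) = struct.unpack(">I", struct.pack(">f", x))
    sign = bits >> 31
    exponent = (bits >> 23) & 0xFF   # 8-bit biased exponent (bias 127)
    fraction = bits & 0x7FFFFF       # 23 stored mantissa bits
    # For a normalized number the significand is 1.fraction:
    # the leading 1 is implied, never stored.
    significand = 1 + fraction / 2**23
    value = (-1) ** sign * significand * 2 ** (exponent - 127)
    return sign, exponent, fraction, value

# 0.375 is 0.011 in binary, i.e. 1.1 x 2^-2 once normalized.
sign, exp, frac, value = decode_float32(0.375)
print(f"stored fraction bits: {frac:023b}")   # only ".1" is stored
print(f"reconstructed value:  {value}")
```

Note that the stored fraction bits are `10000000000000000000000`: only the `.1` after the point is kept, and the leading `1` is supplied by the decoder.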

So your normalized result ($1.1 \times 2^{-2}$) is correct: you've shifted the leading 1 to the left of the binary point, and adjusted the exponent by the five places you shifted ($3 - 5 = -2$).
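As a quick sanity check on the arithmetic (a sketch I'm adding, not part of the original answer), exact rational arithmetic confirms that the original and normalized forms denote the same number, $3/8$:

```python
from fractions import Fraction

# Original form: 0.000011 (binary) * 2^3
#   0.000011b = 2^-5 + 2^-6
original = (Fraction(1, 2**5) + Fraction(1, 2**6)) * 2**3

# Normalized form: 1.1 (binary) * 2^-2
#   1.1b = 1 + 2^-1
normalized = (1 + Fraction(1, 2)) * Fraction(1, 2**2)

print(original, normalized, original == normalized)
```

Both expressions evaluate to `3/8`, so normalization changed the representation but not the value.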

Context

StackExchange Computer Science Q#39998, answer score: 7
