cmd/internal/gc: use 512 bits (rather than 464) for multi-precision arithmetic
The original implementation used 16 int "words" but only 29 bits per
word, for a total of 16*29 = 464 bits of precision, while consuming
16*64 = 1024 bits of storage on a 64-bit machine. Switching to 512 bits
increases precision while still using (in the worst case) half the
amount of memory per mp value on a 64-bit machine.
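
For illustration, a small sketch restating the arithmetic above (the
constant names are made up for this example and are not the compiler's
declarations):

	package main

	import "fmt"

	func main() {
		const (
			oldWords       = 16  // int "words" per mp value in the old layout
			oldBitsPerWord = 29  // value bits actually used per word
			oldWordSize    = 64  // storage bits per word on a 64-bit machine
			newBits        = 512 // new precision, stored densely
		)
		fmt.Println(oldWords * oldBitsPerWord) // 464 bits of precision before
		fmt.Println(oldWords * oldWordSize)    // 1024 bits of storage before
		fmt.Println(newBits)                   // 512 bits of precision (and at most 512 bits of storage) now
	}
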
Also: When deciding whether a precise floating-point constant is an
integer, the number of least-significant mantissa bits permitted to be
incorrect was decreased from 29 to 16.
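
To illustrate the kind of tolerance involved, here is a conceptual
sketch (using math/big only for brevity; the function isNearInt and its
structure are hypothetical, not the compiler's code): a positive
constant is treated as an integer if its distance to the nearest
integer is confined to the tolBits least-significant bits of its
mantissa.

	package main

	import (
		"fmt"
		"math/big"
	)

	// isNearInt reports whether the positive constant f is an integer
	// when up to tolBits of its least-significant mantissa bits are
	// allowed to be incorrect, i.e. |f - round(f)| < 2**tolBits ulps.
	func isNearInt(f *big.Float, tolBits uint) bool {
		// Nearest integer (f is assumed positive, so truncating f+0.5 rounds).
		n, _ := new(big.Float).SetPrec(f.Prec() + 1).Add(f, big.NewFloat(0.5)).Int(nil)

		// |f - n|
		diff := new(big.Float).Sub(f, new(big.Float).SetInt(n))
		diff.Abs(diff)

		// One ulp of f is 2**(exp-prec), where f = mant * 2**exp with
		// 0.5 <= mant < 1; the tolerance is 2**tolBits ulps.
		exp := f.MantExp(nil)
		tol := new(big.Float).SetMantExp(big.NewFloat(1), exp-int(f.Prec())+int(tolBits))

		return diff.Cmp(tol) < 0
	}

	func main() {
		// A 512-bit constant that is "almost" 3: f = 3 + 2**-500.
		f := new(big.Float).SetPrec(512).SetFloat64(3)
		f.Add(f, new(big.Float).SetMantExp(big.NewFloat(1), -500))

		fmt.Println(isNearInt(f, 16)) // true: the error sits in the low bits
		fmt.Println(isNearInt(f, 0))  // false: no incorrect bits allowed
	}
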