From two different comments on here:
http://stackoverflow.com/questions/5689369/what-is-the-diffe...

Precision is the number of significant digits. Oracle guarantees the portability of numbers with precision ranging from 1 to 38.
Scale is the number of digits to the right (positive) or left (negative)
of the decimal point. The scale can range from -84 to 127.
Worth noting this isn't specifically an Oracle thing: most financial systems need to store currency amounts exactly, and this precision/scale convention is widely used to guarantee that.
And:
Precision 4, scale 2: 99.99
Precision 10, scale 0: 9999999999
Precision 8, scale 3: 99999.999
Precision 5, scale -3: 99999000
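The examples above follow directly from the definitions: the largest representable magnitude is `precision` nines shifted `scale` places around the decimal point. A minimal Python sketch (using the standard `decimal` module as an illustration; the function name is mine, not Oracle's):

```python
from decimal import Decimal

def max_value(precision, scale):
    # Largest magnitude: `precision` nines, with the decimal point
    # shifted `scale` places (negative scale shifts it left,
    # padding with zeros).
    return Decimal(10**precision - 1) * Decimal(10) ** -scale

print(max_value(4, 2))    # 99.99
print(max_value(10, 0))   # 9999999999
print(max_value(8, 3))    # 99999.999
print(max_value(5, -3))   # 99999000
```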
Typically when dealing with currencies, the scale is only used to represent the sub-units of the currency, e.g. cents and pence. But nothing restricts it from being used to accommodate larger numbers via negative scales.
A current list of all ISO 4217 codes and their currency properties can be found here: http://www.currency-iso.org/en/home/tables/table-a1.html

For example:
<CcyNtry>
<CtryNm>UNITED STATES OF AMERICA (THE)</CtryNm>
<CcyNm>US Dollar</CcyNm>
<Ccy>USD</Ccy>
<CcyNbr>840</CcyNbr>
<CcyMnrUnts>2</CcyMnrUnts>
</CcyNtry>
The CcyMnrUnts property denotes 2 decimal places for the sub-unit of the dollar.
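Reading that property out of the published table is straightforward; a sketch using Python's standard `xml.etree.ElementTree` against the snippet above (the variable names are mine):

```python
import xml.etree.ElementTree as ET

# One <CcyNtry> entry from the ISO 4217 table A.1 XML, as shown above.
snippet = """
<CcyNtry>
  <CtryNm>UNITED STATES OF AMERICA (THE)</CtryNm>
  <CcyNm>US Dollar</CcyNm>
  <Ccy>USD</Ccy>
  <CcyNbr>840</CcyNbr>
  <CcyMnrUnts>2</CcyMnrUnts>
</CcyNtry>
"""

entry = ET.fromstring(snippet)
code = entry.findtext("Ccy")                    # currency code
minor_units = int(entry.findtext("CcyMnrUnts")) # decimal places of the sub-unit
print(code, minor_units)  # USD 2
```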
So for the above example of 99999.999, you would store an integer amount of 99999999 with a scale of 3.
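The round trip between the stored integer and the decimal value is just a power-of-ten shift; a minimal sketch with Python's `Decimal` (the helper name is mine):

```python
from decimal import Decimal

def from_scaled(amount, scale):
    # Rebuild the decimal value from an integer amount by shifting
    # the decimal point `scale` places to the left.
    return Decimal(amount).scaleb(-scale)

print(from_scaled(99999999, 3))  # 99999.999
print(from_scaled(9999, 2))      # 99.99  (e.g. cents -> dollars)
```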