Alfredo writes:

>>> "F. Alfredo Rego" <[log in to unmask]> 11/08/97 06:57am >>>
"Robert A. Karlin" <[log in to unmask]> wrote:

>Cobol does not have a "Currency" type. As such, all significance is based
>on the underlying field definitions. To gain significance, you add
>decimals. Cobol can currently support 18 decimal places, I believe. Maybe
>I am dense, but I do not see this as a problem for the compiler, just for
>the developer, and there it is a matter of understanding the underlying
>currencies, and making the appropriate provisions for them.

Being a COBOL illiterate, I would like to ask a few silly questions. What do you mean, precisely, by "Cobol can currently support 18 decimal places"? Are these "hard" digits or "floating-point-like" digits? Is each digit represented by a nibble or a byte?

Bottom line: Do you get precise results when dealing with this "18-digit arithmetic," or does COBOL round things up (or truncate) before delivering the result? The whole issue hinges on the answer to THIS question.

I'm not a member of SigCOBOL, so I would appreciate copies of messages dealing with this topic.

Thanks a bunch (expressed in 18 digits?)

Alfredo
--------------------------------------------------------------------------------------
The 18 digits are as hard as diamond. There is no "sort-of-correct" numeric type in Cobol: 2 + 2 = 4, not 3.99999999999999999.

As for representation, like at Burger King, have it your way. You can have the digits as full bytes (PIC 9 DISPLAY, IMAGE type Z), in integer format (PIC 9 COMP, IMAGE type I), or as nibbles (PIC 9 COMP-3, IMAGE type P). All formats support from 1 to 18 digits, with any number of digits on either side of the decimal point: PIC V9(18) is valid, as is PIC 9(18), or anything in between. In the Cobol 2000 standard, this will jump to 31 digits.

The only time there is rounding or truncation is during multiplication or division, where the field with the least number of decimal digits determines the decimal accuracy.
Mike Berkowitz Guess? Inc.
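[Editor's note: for readers without a COBOL compiler handy, the behavior Mike describes can be sketched with Python's standard decimal module. This is a stand-in, not COBOL itself: fixed-point decimal arithmetic is exact for addition, and truncation only appears when a division result is forced into a field with fewer decimal places (the analogue of a PIC 9(4)V99 target).]

```python
# Illustrative stand-in for COBOL's exact decimal arithmetic,
# using Python's decimal module (NOT COBOL itself).
from decimal import Decimal, getcontext, ROUND_DOWN

# COBOL's zoned/packed decimal holds up to 18 "hard" digits; emulate that.
getcontext().prec = 18

# Exact addition: 2 + 2 is exactly 4, never 3.99999999999999999.
assert Decimal("2") + Decimal("2") == Decimal("4")

# Division is where truncation appears: quantizing to two decimal
# places mimics storing the result into a field like PIC 9(4)V99.
result = (Decimal("10") / Decimal("3")).quantize(
    Decimal("0.01"), rounding=ROUND_DOWN
)
print(result)  # 3.33 -- extra decimal digits are truncated, not rounded up
```

Unlike binary floating point (where 0.1 + 0.2 != 0.3), every value here is represented in base ten, which is why the "18 digits are as hard as diamond."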