Creates a decimal validation action.
The difference between decimal and digits is that decimal accepts floating point numbers and negative numbers, while digits accepts only the digits 0-9.
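Assuming this describes a Valibot-style pipeline action (the phrasing matches that library's decimal and digits actions), a minimal sketch of the difference in what each accepts:

```ts
import * as v from 'valibot';

// Sketch under the assumption of the Valibot API:
// decimal() allows an optional sign and a fractional part,
// digits() allows only the characters 0-9.
const DecimalSchema = v.pipe(v.string(), v.decimal());
const DigitsSchema = v.pipe(v.string(), v.digits());

v.safeParse(DecimalSchema, '-12.34').success; // true: sign and decimal point are accepted
v.safeParse(DigitsSchema, '-12.34').success;  // false: only the digits 0-9 are accepted
v.safeParse(DigitsSchema, '1234').success;    // true: digits only
```

The example inputs are illustrative; the exact strings each action accepts are defined by the library's own validation rules.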