JDK-8211449

The DecimalFormat spec is incorrect about the implicit negative subpattern


    • Type: Bug
    • Status: Open
    • Priority: P4
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: core-libs


      The DecimalFormat specification of the pattern syntax says:

       "The negative subpattern is optional; if absent, then the positive subpattern prefixed with the localized minus sign ('-' in most locales) is used as the negative subpattern."
      The use of the term "localized minus sign" seems incorrect: when a pattern is passed to the DecimalFormat constructor ("new DecimalFormat(pattern)"), the special characters in the pattern are always expected in non-localized (ASCII) form, e.g. "." (decimal separator), "-" (minus sign), etc. This description in the specification does not seem right. Also, in the implementation, if an explicit negative subpattern is absent, the positive subpattern prefixed with the non-localized minus sign is used as the negative subpattern:
       "negPrefixPattern = "'-" + posPrefixPattern;".
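      The behavior described above can be observed directly. A minimal sketch (the class name is mine; the symbols are pinned to Locale.US so the expected output is locale-independent):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class ImplicitNegative {
    public static void main(String[] args) {
        // Pin the symbols so the output does not depend on the default locale
        DecimalFormatSymbols us = DecimalFormatSymbols.getInstance(Locale.US);

        // Pattern with no explicit negative subpattern
        DecimalFormat df = new DecimalFormat("#,##0.00", us);

        // The negative prefix is the minus sign prefixed to the
        // positive subpattern
        System.out.println(df.getNegativePrefix()); // "-"
        System.out.println(df.format(-1234.5));     // "-1,234.50"
    }
}
```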

      The same is mentioned in LDML specification


      "A pattern contains a positive subpattern and may contain a negative subpattern, for example, "#,##0.00;(#,##0.00)". Each subpattern has a prefix, a numeric part, and a suffix. If there is no explicit negative subpattern, the implicit negative subpattern is the ASCII minus sign (-) prefixed to the positive subpattern. That is, "0.00" alone is equivalent to "0.00;-0.00"."
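      The LDML equivalence quoted above ("0.00" alone is equivalent to "0.00;-0.00") can likewise be checked with a small sketch (class name mine; Locale.US symbols assumed):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class ImplicitVsExplicit {
    public static void main(String[] args) {
        DecimalFormatSymbols us = DecimalFormatSymbols.getInstance(Locale.US);

        // Implicit negative subpattern ...
        DecimalFormat implicit = new DecimalFormat("0.00", us);
        // ... versus an explicit ASCII-minus negative subpattern
        DecimalFormat explicit = new DecimalFormat("0.00;-0.00", us);

        // Both produce the same result for a negative value
        System.out.println(implicit.format(-3.5)); // "-3.50"
        System.out.println(explicit.format(-3.5)); // "-3.50"

        // An explicit negative subpattern overrides the implicit one,
        // e.g. accounting-style parentheses
        DecimalFormat paren = new DecimalFormat("#,##0.00;(#,##0.00)", us);
        System.out.println(paren.format(-1234.5)); // "(1,234.50)"
    }
}
```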


              • Assignee:
                Naoto Sato (naoto)
                Nishit Jain (nishjain)