JDK-8016252 : More defensive HashSet.readObject

    Details

    • Type: Bug
    • Status: Closed
    • Priority: P3
    • Resolution: Fixed
    • Affects Version/s: 8
    • Fix Version/s: 8
    • Component/s: core-libs
    • Resolved In Build: b113
    • Verification: Verified

      Backports

        Description

        HashSet.readObject should validate its serial data, similar to what is done in HashMap.

        diff --git a/src/share/classes/java/util/HashSet.java b/src/share/classes/java/util/HashSet.java
        --- a/src/share/classes/java/util/HashSet.java
        +++ b/src/share/classes/java/util/HashSet.java
        @@ -24,6 +24,8 @@
          */

         package java.util;
        +
        +import java.io.InvalidObjectException;

         /**
          * This class implements the <tt>Set</tt> interface, backed by a hash table
        @@ -293,17 +295,20 @@ public class HashSet<E>
                 throws java.io.IOException, ClassNotFoundException {
                 // Read in any hidden serialization magic
                 s.defaultReadObject();
        -
        -        // Read in HashMap capacity and load factor and create backing HashMap
        -        int capacity = s.readInt();
        +        s.readInt(); // Read and ignore capacity
                 float loadFactor = s.readFloat();
        +        if (loadFactor <= 0 || Float.isNaN(loadFactor))
        +            throw new InvalidObjectException("Illegal load factor: " +
        +                                             loadFactor);
        +        int size = s.readInt();
        +        if (size < 0)
        +            throw new InvalidObjectException("Illegal element count: " + size);
        +        // Compute capacity by number of elements and desired load (if >= 0.25)
        +        int capacity = (int)Math.min(size * Math.min(1 / loadFactor, 4.0f),
        +                HashMap.MAXIMUM_CAPACITY);
                 map = (((HashSet<?>)this) instanceof LinkedHashSet ?
                        new LinkedHashMap<E,Object>(capacity, loadFactor) :
                        new HashMap<E,Object>(capacity, loadFactor));
        -
        -        // Read in size
        -        int size = s.readInt();
        -
                 // Read in all elements in the proper order.
                 for (int i=0; i<size; i++) {
                     @SuppressWarnings("unchecked")

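        For illustration only, here is a minimal standalone sketch of the proposed validation and capacity computation. It is not the JDK source: the class and method names are hypothetical, and MAXIMUM_CAPACITY stands in for the package-private HashMap.MAXIMUM_CAPACITY (1 << 30).

        import java.io.InvalidObjectException;

        public class ReadObjectValidationSketch {

            // Stand-in for the package-private HashMap.MAXIMUM_CAPACITY.
            static final int MAXIMUM_CAPACITY = 1 << 30;

            // Mirrors the checks in the patch: reject a non-positive or NaN load
            // factor and a negative element count, then size the table so it never
            // needs rehashing during load and ends up at least 25% full.
            static int validatedCapacity(float loadFactor, int size)
                    throws InvalidObjectException {
                if (loadFactor <= 0 || Float.isNaN(loadFactor))
                    throw new InvalidObjectException("Illegal load factor: " + loadFactor);
                if (size < 0)
                    throw new InvalidObjectException("Illegal element count: " + size);
                return (int) Math.min(size * Math.min(1 / loadFactor, 4.0f),
                                      MAXIMUM_CAPACITY);
            }

            public static void main(String[] args) throws InvalidObjectException {
                System.out.println(validatedCapacity(0.75f, 12)); // prints 16
                System.out.println(validatedCapacity(0.10f, 10)); // prints 40 (capped at 4 * size)
                try {
                    validatedCapacity(Float.NaN, 10);             // rejected, as in the patch
                } catch (InvalidObjectException expected) {
                    System.out.println("Rejected: " + expected.getMessage());
                }
            }
        }
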
          Issue Links

            Activity

            mduigou Michael Duigou added a comment -
            The capacity passed to the constructor is different than the field assignment as the field is the size * loadFactor value. In this case, I think the right thing to do is to pass size to the Map constructor as the "* loadFactor" is otherwise done twice.
            mduigou Michael Duigou added a comment -
            Great idea to fix this though!
            chegar Chris Hegarty added a comment -
            Thank you for your comments Mike, they are really appreciated.

            Can you please elaborate on "I think the right thing to do is to pass size to the Map constructor as the "* loadFactor" is otherwise done twice", as I don't see that this is possible. Since the two arg HashMap constructor is invoked, then I would have assumed that determining the initial capacity, from the size and loadFactor, was the right thing to do.

            BTW, this issue was raised to me by Doug, in a separate mail thread.
            mduigou Michael Duigou added a comment -
            Sorry, I was incorrect and temporarily conflated threshold and capacity. The initialCapacity you are requesting will ensure that the table never needs to be rehashed during load and ends up being at least 25% full. This is exactly what we want.
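            A small hypothetical snippet (not part of the fix) makes this concrete, assuming the usual HashMap behaviour of rounding the requested capacity up to the next power of two:

            import java.util.HashMap;

            public class NoRehashDemo {
                public static void main(String[] args) {
                    int size = 1000;
                    float loadFactor = 0.75f;
                    // Same formula as the patch: at most 4x the element count,
                    // clamped to the maximum table capacity (1 << 30).
                    int capacity = (int) Math.min(size * Math.min(1 / loadFactor, 4.0f),
                                                  1 << 30);
                    HashMap<Integer, Object> map = new HashMap<>(capacity, loadFactor);
                    for (int i = 0; i < size; i++)
                        map.put(i, Boolean.TRUE);
                    // The requested capacity (1333) rounds up to a 2048-slot table with a
                    // resize threshold of 2048 * 0.75 = 1536, so inserting 1000 entries
                    // never triggers a rehash and leaves the table about 49% full.
                    System.out.println("requested capacity = " + capacity
                                       + ", size = " + map.size());
                }
            }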
            hgupdate HG Updates added a comment -
            URL: http://hg.openjdk.java.net/jdk8/tl/jdk/rev/b86e6700266e
            User: bpb
            Date: 2013-10-09 18:50:18 +0000
            hgupdate HG Updates added a comment -
            URL: http://hg.openjdk.java.net/jdk8/jdk8/jdk/rev/b86e6700266e
            User: lana
            Date: 2013-10-22 17:23:08 +0000

              People

              • Assignee: bpb Brian Burkhalter
              • Reporter: chegar Chris Hegarty
              • Votes: 0
              • Watchers: 6
