peter-lawrey / HugeCollections-OLD

Huge Collections for Java using efficient off-heap storage

Error in HugeHashMap for specific configurations #5

Closed: sampath06 closed this issue 10 years ago

sampath06 commented 10 years ago

Modifying the existing test case in HugeHashMapTest to use different values of count makes it fail.

Sample code is given below:

    @Test
    public void testPut() throws ExecutionException, InterruptedException {

        int count = 50;
        HugeConfig config = HugeConfig.DEFAULT.clone()
                .setSegments(128)
                .setSmallEntrySize(72) // TODO 64 corrupts the values !!
                .setCapacity(count);

        final HugeHashMap<CharSequence, SampleValues> map =
                new HugeHashMap<CharSequence, SampleValues>(
                        config, CharSequence.class, SampleValues.class);
        long start = System.nanoTime();

        // Write count entries, keyed by users(user, i), with i stored in
        // all three value fields.
        final SampleValues value = new SampleValues();
        StringBuilder user = new StringBuilder();
        for (int i = 0; i < count; i++) {
            value.ee = i;
            value.gg = i;
            value.ii = i;
            map.put(users(user, i), value);
        }
        // Read every entry back and check each field still holds i;
        // this is the loop that fails for certain counts.
        for (int i = 0; i < count; i++) {
            assertNotNull(map.get(users(user, i), value));
            assertEquals(i, value.ee);
            assertEquals(i, value.gg, 0.0);
            assertEquals(i, value.ii);
        }

        long time = System.nanoTime() - start;
        System.out.printf("Put/get %,d K operations per second%n",
                (int) (count * 4 * 1e6 / time));
    }

With the count of 50 shown above, the assertion failure is:

java.lang.AssertionError: expected:<30> but was:<22>

With a count of 100, the assertion failure is:

java.lang.AssertionError: expected:<40> but was:<32>

With a count of 2000, the assertion failure is:

java.lang.AssertionError: expected:<1247> but was:<1239>

It seems to work for other counts such as 5000 and 10000.
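
A hypothetical harness (not part of the original test class) makes the pattern easy to check by running the same put/get cycle for each count; it reuses users(...) and SampleValues from HugeHashMapTest and only sketches the idea:

    // Hypothetical reproduction harness: the same put/get cycle as
    // testPut, run for several counts, reporting the first corrupted read.
    void checkCount(int count) {
        HugeConfig config = HugeConfig.DEFAULT.clone()
                .setSegments(128)
                .setSmallEntrySize(72)
                .setCapacity(count);
        HugeHashMap<CharSequence, SampleValues> map =
                new HugeHashMap<CharSequence, SampleValues>(
                        config, CharSequence.class, SampleValues.class);
        SampleValues value = new SampleValues();
        StringBuilder user = new StringBuilder();
        for (int i = 0; i < count; i++) {
            value.ee = i;
            value.gg = i;
            value.ii = i;
            map.put(users(user, i), value);
        }
        for (int i = 0; i < count; i++) {
            map.get(users(user, i), value);
            if (value.ee != i) // the first field the failing assert checks
                throw new AssertionError("count=" + count
                        + ": expected:<" + i + "> but was:<" + value.ee + ">");
        }
    }

    @Test
    public void reproduceAcrossCounts() {
        for (int count : new int[]{50, 100, 2000, 5000, 10000}) {
            try {
                checkCount(count);
                System.out.println(count + ": OK");
            } catch (AssertionError e) {
                System.out.println(e.getMessage());
            }
        }
    }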

peter-lawrey commented 10 years ago

The entry size should be a power of two regardless of what you give it, and the count will be a multiple of the number of segments. There should be checks to enforce this, but they might be broken.
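
If those checks are indeed broken, a caller can apply the rounding up front. A minimal sketch in plain Java; nextPowerOfTwo and roundToSegmentMultiple are illustrative helpers, not part of the HugeCollections API:

    // Illustrative helpers (not from the library) that pre-round config
    // values to the constraints described above.
    static int nextPowerOfTwo(int n) {
        // e.g. 72 -> 128, while 64 stays 64
        return n <= 1 ? 1 : Integer.highestOneBit(n - 1) << 1;
    }

    static int roundToSegmentMultiple(int capacity, int segments) {
        // e.g. a capacity of 50 with 128 segments -> 128
        return ((capacity + segments - 1) / segments) * segments;
    }

With these, setSmallEntrySize(nextPowerOfTwo(72)) would pass 128 rather than 72, and setCapacity(roundToSegmentMultiple(50, 128)) would pass 128 rather than 50, satisfying both constraints if the rounding really is what the missing checks were meant to enforce.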

sampath06 commented 10 years ago

I tried with these values and it still gives the same error:

    int count = 32;
    HugeConfig config = HugeConfig.DEFAULT.clone()
            .setSegments(16)
            .setSmallEntrySize(128) // TODO 64 corrupts the values !!
            .setCapacity(count);
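
For what it's worth, those values already satisfy both constraints from the previous comment, as a plain-Java sanity check confirms (run with -ea so the asserts fire):

    // Plain-Java check that this config meets the stated constraints:
    assert Integer.bitCount(128) == 1; // smallEntrySize is a power of two
    assert 32 % 16 == 0;               // capacity is a multiple of segments
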
peter-lawrey commented 10 years ago

This needs investigating.