Examples of readFullyScatterGather()


Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    // open file for reading
    FSDataInputStream fin = fs.open(fileName);

    // Do a first read so that the data is read into cache. Do not measure the time
    // for this read call.
    List<ByteBuffer> rlist = fin.readFullyScatterGather(0, 10000);

    // create all threads
    Putter[] all = new Putter[numThreads];
    for (int i = 0; i < numThreads; i++) {
      all[i] = new Putter(fin, fileName, fileSize, i, numIterationsPerThread);
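Unlike `readFully(long, byte[], int, int)`, the call above returns a `List<ByteBuffer>` rather than filling a caller-supplied array. A minimal sketch of flattening the returned buffers into one contiguous `byte[]` might look like the following; `ScatterGatherUtil` and `toByteArray` are hypothetical helper names, and only `java.nio` is assumed (not Hadoop itself):

```java
import java.nio.ByteBuffer;
import java.util.Arrays;
import java.util.List;

public class ScatterGatherUtil {
    // Hypothetical helper (not part of the Hadoop API): copy the
    // List<ByteBuffer> returned by readFullyScatterGather() into a
    // single contiguous byte[].
    public static byte[] toByteArray(List<ByteBuffer> buffers) {
        int total = 0;
        for (ByteBuffer b : buffers) {
            total += b.remaining();
        }
        byte[] out = new byte[total];
        int off = 0;
        for (ByteBuffer b : buffers) {
            int n = b.remaining();
            // duplicate() so the source buffer's position is untouched
            b.duplicate().get(out, off, n);
            off += n;
        }
        return out;
    }

    public static void main(String[] args) {
        // stand-in for a scatter-gather read result split across two buffers
        List<ByteBuffer> parts = Arrays.asList(
                ByteBuffer.wrap(new byte[]{1, 2}),
                ByteBuffer.wrap(new byte[]{3, 4, 5}));
        System.out.println(Arrays.toString(toByteArray(parts)));
    }
}
```

Copying defeats the zero-copy intent of a scatter-gather read, so a helper like this is only appropriate when downstream code genuinely needs a flat array.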

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    // test empty file open and read
    stm.close();
    FSDataInputStream in = fileSys.open(name);
    byte[] buffer = new byte[(int)(12*blockSize)];
    in.readFully(0, buffer, 0, 0);
    List<ByteBuffer> rlist = in.readFullyScatterGather(0, 0);

    IOException res = null;
    try { // read beyond the end of the file
      in.readFully(0, buffer, 0, 1);
      rlist = in.readFullyScatterGather(0, 1);

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    List<ByteBuffer> rlist = in.readFullyScatterGather(0, 0);

    IOException res = null;
    try { // read beyond the end of the file
      in.readFully(0, buffer, 0, 1);
      rlist = in.readFullyScatterGather(0, 1);
    } catch (IOException e) {
      // should throw an exception
      res = e;
    }
    assertTrue("Error reading beyond file boundary.", res != null);
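The `checkAndEraseData(...)` calls in these tests verify the scatter-gather result against an `expected` byte array at a given file offset. The real Hadoop test helper is not shown on this page; the sketch below is a guess at its contract (compare `nbytes` bytes spread across the buffer list against `expected[offset..offset+nbytes)`), using only `java.nio`:

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class CheckSketch {
    // Sketch of a checkAndEraseData-style verifier: walk the buffers in
    // order and compare each byte against expected[offset + i].
    static void checkAndEraseData(List<ByteBuffer> actual, int nbytes,
                                  int offset, byte[] expected, String msg) {
        int checked = 0;
        for (ByteBuffer b : actual) {
            ByteBuffer dup = b.duplicate(); // do not disturb the source position
            while (dup.hasRemaining() && checked < nbytes) {
                byte got = dup.get();
                if (got != expected[offset + checked]) {
                    throw new AssertionError(msg + ": mismatch at byte " + checked);
                }
                checked++;
            }
        }
        if (checked != nbytes) {
            throw new AssertionError(msg + ": expected " + nbytes
                    + " bytes, got " + checked);
        }
    }

    public static void main(String[] args) {
        byte[] expected = {10, 11, 12, 13, 14};
        // simulate a 3-byte read at file offset 1, split across two buffers
        List<ByteBuffer> read = new ArrayList<>();
        read.add(ByteBuffer.wrap(new byte[]{11, 12}));
        read.add(ByteBuffer.wrap(new byte[]{13}));
        checkAndEraseData(read, 3, 1, expected, "sketch test");
        System.out.println("ok");
    }
}
```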

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    } else {
      Random rand = new Random(seed);
      rand.nextBytes(expected);
    }
    // do a sanity check. pread the first 4K bytes
    List<ByteBuffer> rlist = stm.readFullyScatterGather(0, 4096);
    checkAndEraseData(rlist, 4096, 0, expected, "Read Sanity Test");

    // now do a pread for the first 8K bytes
    byte[] actual = new byte[8192];
    doPread(stm, 0L, actual, 0, 8192);

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    stm.readFully(actual);
    checkAndEraseData(actual, 0, expected, "Pread Test 2");

    // Now see if we can cross a single block boundary successfully
    // read 4K bytes from blockSize - 2K offset
    rlist = stm.readFullyScatterGather(blockSize - 2048, 4096);
    checkAndEraseData(rlist, 4096, (int)(blockSize-2048), expected, "Pread Test 3");

    // now see if we can cross two block boundaries successfully
    // read blockSize + 4K bytes from blockSize - 2K offset
    int size = (int)(blockSize+4096);

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    checkAndEraseData(rlist, 4096, (int)(blockSize-2048), expected, "Pread Test 3");

    // now see if we can cross two block boundaries successfully
    // read blockSize + 4K bytes from blockSize - 2K offset
    int size = (int)(blockSize+4096);
    rlist = stm.readFullyScatterGather(blockSize - 2048, size);
    checkAndEraseData(rlist, size, (int)(blockSize-2048), expected, "Pread Test 4");

    // now see if we can cross two block boundaries that are not cached
    // read blockSize + 4K bytes from 10*blockSize - 2K offset
    size = (int)(blockSize+4096);
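The boundary tests above deliberately start each read 2K before a block edge so the request spans one or two HDFS blocks. The arithmetic they rely on can be sketched as a small standalone helper (names are illustrative, not Hadoop API):

```java
public class BlockSpanSketch {
    // Which block indices does a positional read of `length` bytes at
    // `position` touch, given a fixed block size? A read crosses a block
    // boundary whenever first != last.
    static long[] blocksSpanned(long position, int length, long blockSize) {
        long first = position / blockSize;
        long last = (position + length - 1) / blockSize;
        return new long[]{first, last};
    }

    public static void main(String[] args) {
        long blockSize = 4096;
        // "read 4K bytes from blockSize - 2K offset", as in Pread Test 3:
        long[] span = blocksSpanned(blockSize - 2048, 4096, blockSize);
        System.out.println(span[0] + ".." + span[1]);
    }
}
```

With `blockSize + 4096` bytes from the same offset (Pread Test 4), the same formula yields blocks 0 through 2, i.e. two boundary crossings.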

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    checkAndEraseData(rlist, size, (int)(blockSize-2048), expected, "Pread Test 4");

    // now see if we can cross two block boundaries that are not cached
    // read blockSize + 4K bytes from 10*blockSize - 2K offset
    size = (int)(blockSize+4096);
    rlist = stm.readFullyScatterGather(10*blockSize - 2048, size);
    checkAndEraseData(rlist, size, (int)(10*blockSize-2048), expected, "Pread Test 5");

    // now check that even after all these preads, we can still read
    // bytes 8K-12K
    actual = new byte[4096];

Examples of org.apache.hadoop.fs.FSDataInputStream.readFullyScatterGather()

    stm.readFully(actual);
    checkAndEraseData(actual, 8192, expected, "Pread Test 6");

    // pread beyond the end of the file. It should return the last half block.
    size = blockSize/2;
    rlist = stm.readFullyScatterGather(11*blockSize+size, blockSize);
    checkAndEraseData(rlist, size, (int)(11*blockSize+size), expected, "Pread Test 7");

    IOException res = null;
    try { // normal read beyond the end of the file
      stm.readFully(11*blockSize+blockSize/2, actual, 0, blockSize);
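This snippet depends on the positional-read contract that a pread starting before EOF returns only the bytes that actually exist (here, the last half block), while a `readFully` past EOF throws. The clamping rule can be sketched in isolation; `bytesReturned` is an illustrative name, not a Hadoop method:

```java
public class EofClampSketch {
    // How many bytes a positional read can return: zero at/after EOF,
    // otherwise the request clamped to the bytes remaining in the file.
    static long bytesReturned(long fileSize, long position, long requested) {
        if (position >= fileSize) return 0;          // nothing to read
        return Math.min(requested, fileSize - position);
    }

    public static void main(String[] args) {
        long blockSize = 4096;
        long fileSize = 12 * blockSize;              // file spans 12 blocks
        // ask for a full block starting half a block before EOF,
        // mirroring the test above (11*blockSize + blockSize/2)
        System.out.println(
                bytesReturned(fileSize, 11 * blockSize + blockSize / 2, blockSize));
    }
}
```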

Copyright © 2018 www.massapi.com. All rights reserved.
All source code is the property of its respective owners. Java is a trademark of Sun Microsystems, Inc., now owned by Oracle, Inc. Contact coftware#gmail.com.