umd-dslam / FileScale

FileScale: Fast and Elastic Metadata Management for Distributed File Systems
https://dslam-umd.github.io/docs/filescale
Apache License 2.0

Optimize FS Open OP #88

Closed: gangliao closed this issue 5 years ago

gangliao commented 5 years ago

NameNode trace while running the open benchmark against a VoltDB cluster with 3 nodes:

2019-04-22 02:29:11,240 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,245 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,249 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,252 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-04-22 02:29:11,277 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388],16387)
2019-04-22 02:29:11,281 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,285 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16387,1555900150024)
2019-04-22 02:29:11,291 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16387,0)
2019-04-22 02:29:11,295 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-04-22 02:29:11,315 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,319 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,323 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,352 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,356 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,361 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,364 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-04-22 02:29:11,367 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,370 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,373 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,391 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16387],16386)
2019-04-22 02:29:11,394 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-04-22 02:29:11,400 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388],16387)
2019-04-22 02:29:11,403 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,411 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390],16388)
2019-04-22 02:29:11,414 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,418 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,424 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,428 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,446 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-04-22 02:29:11,453 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388],16387)
2019-04-22 02:29:11,456 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,463 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390],16388)
2019-04-22 02:29:11,466 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,470 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,476 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,479 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,488 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388],16387)
2019-04-22 02:29:11,491 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,504 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390],16388)
2019-04-22 02:29:11,508 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,511 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,516 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,520 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,528 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-04-22 02:29:11,529 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16387,16386,open)
2019-04-22 02:29:11,545 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-04-22 02:29:11,556 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16386)
2019-04-22 02:29:11,565 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16386,1555900150024)
2019-04-22 02:29:11,576 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16386,1555900151374)
2019-04-22 02:29:11,581 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16386,1555900151374)
2019-04-22 02:29:11,585 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16387,1555900150024)
2019-04-22 02:29:11,589 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16387,0)
2019-04-22 02:29:11,620 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,625 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,629 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,633 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-04-22 02:29:11,647 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388],16387)
2019-04-22 02:29:11,650 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,653 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16387,1555900150024)
2019-04-22 02:29:11,657 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16387,0)
2019-04-22 02:29:11,661 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-04-22 02:29:11,692 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,697 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,701 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,703 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-04-22 02:29:11,707 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,710 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,730 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16389,0)
2019-04-22 02:29:11,737 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-04-22 02:29:11,745 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16389,1555900150102)
2019-04-22 02:29:11,785 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-04-22 02:29:11,791 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-04-22 02:29:11,795 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-04-22 02:29:11,799 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-04-22 02:29:11,817 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-04-22 02:29:11,820 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,829 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16390,0)
2019-04-22 02:29:11,836 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-04-22 02:29:11,841 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16390,1555900150913)
2019-04-22 02:29:11,846 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode is already OFF
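The trace above shows that each open resolves the full path from the root inode (16385) again, issuing a separate DatabaseINode round-trip per path component, with several `header [GET]` calls repeated for the same inode. To quantify the redundancy, the log can be tallied per database method. The helper below is a sketch written for this purpose (not part of FileScale); it only assumes the `... INFO org.apache.hadoop.hdfs.db.<Class>: <method> ...` format visible in the excerpt:

```python
import re
from collections import Counter

# Tally database calls per (class, method) from a NameNode log excerpt.
# Assumes lines of the form:
#   <ts> INFO org.apache.hadoop.hdfs.db.<Class>: <method> ...
LINE_RE = re.compile(r"INFO org\.apache\.hadoop\.hdfs\.db\.(\w+): (\w+)")

def tally_db_calls(log_lines):
    counts = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m:
            counts[f"{m.group(1)}.{m.group(2)}"] += 1
    return counts

sample = [
    "2019-04-22 02:29:11,240 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)",
    "2019-04-22 02:29:11,245 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385",
    "2019-04-22 02:29:11,249 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)",
]
print(tally_db_calls(sample))
```

Running this over the full excerpt makes the hot spots obvious: `header [GET]` dominates, since path resolution re-fetches the header of every ancestor on every operation.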
Benchmark client invocation and output:

./bin/hadoop org.apache.hadoop.hdfs.server.namenode.NNThroughputBenchmark -fs hdfs://localhost:9000 -op open -threads 1 -files 2 -keepResults -logLevel INFO
2019-04-22 02:29:01,653 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-04-22 02:29:09,223 INFO namenode.NNThroughputBenchmark: Starting benchmark: open
2019-04-22 02:29:09,414 INFO namenode.NNThroughputBenchmark: Generate 2 intputs for create
2019-04-22 02:29:09,415 ERROR namenode.NNThroughputBenchmark: Log level = ERROR
2019-04-22 02:29:09,720 INFO namenode.NNThroughputBenchmark: Starting 2 create(s).
2019-04-22 02:29:11,209 INFO namenode.NNThroughputBenchmark: Created 2 files.
2019-04-22 02:29:11,216 INFO namenode.NNThroughputBenchmark: Generate 2 intputs for open
2019-04-22 02:29:11,663 ERROR namenode.NNThroughputBenchmark: Log level = INFO
2019-04-22 02:29:11,664 INFO namenode.NNThroughputBenchmark: Starting 2 open(s).
2019-04-22 02:29:11,844 INFO namenode.NNThroughputBenchmark: Memory Used: -20310944
2019-04-22 02:29:11,849 INFO namenode.NNThroughputBenchmark: 
2019-04-22 02:29:11,849 INFO namenode.NNThroughputBenchmark: --- open inputs ---
2019-04-22 02:29:11,851 INFO namenode.NNThroughputBenchmark: nrFiles = 2
2019-04-22 02:29:11,851 INFO namenode.NNThroughputBenchmark: nrThreads = 1
2019-04-22 02:29:11,852 INFO namenode.NNThroughputBenchmark: nrFilesPerDir = 4
2019-04-22 02:29:11,853 INFO namenode.NNThroughputBenchmark: --- open stats  ---
2019-04-22 02:29:11,853 INFO namenode.NNThroughputBenchmark: # operations: 2
2019-04-22 02:29:11,853 INFO namenode.NNThroughputBenchmark: Elapsed Time: 180
2019-04-22 02:29:11,854 INFO namenode.NNThroughputBenchmark:  Ops per sec: 11.11111111111111
2019-04-22 02:29:11,854 INFO namenode.NNThroughputBenchmark: Average Time: 89
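The summary stats above are internally consistent. As a hedged sketch (the exact formulas are inferred from the reported numbers, not taken from the NNThroughputBenchmark source): throughput appears to be ops divided by total elapsed wall time, and average time the mean per-operation latency:

```python
# Reproduce the arithmetic behind the "open stats" block above:
#   # operations: 2, Elapsed Time: 180 ms, Average Time: 89 ms.
n_ops = 2
elapsed_ms = 180            # wall time for the whole open phase
total_op_time_ms = 89 * 2   # sum of per-op latencies (avg 89 ms, assumption)

ops_per_sec = n_ops * 1000.0 / elapsed_ms   # 2 * 1000 / 180 = 11.11...
avg_time_ms = total_op_time_ms / n_ops      # 89.0

print(ops_per_sec)
print(avg_time_ms)
```

At ~89 ms per open, and roughly a dozen database round-trips visible in the trace for each one, per-call latency to VoltDB (a few ms each) accounts for most of the cost; cutting the redundant lookups is the obvious lever.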
gangliao commented 5 years ago

Optimization:

`INodeDirectory#cleanSubtree`

gangliao commented 5 years ago

NameNode startup and create trace after the optimization:
2019-05-02 12:59:04,724 INFO org.apache.hadoop.util.GSet: capacity      = 2^15 = 32768 entries
2019-05-02 12:59:04,919 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /home/gangl/hadoop/name/in_use.lock acquired by nodename 11071@localhost
2019-05-02 12:59:05,546 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /home/gangl/hadoop/name/current
2019-05-02 12:59:05,589 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: No edit log streams selected.
2019-05-02 12:59:05,591 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Planning to load image: FSImageFile(file=/home/gangl/hadoop/name/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
2019-05-02 12:59:06,478 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
2019-05-02 12:59:06,484 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 0 from /home/gangl/hadoop/name/current/fsimage_0000000000000000000
2019-05-02 12:59:06,545 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
2019-05-02 12:59:06,554 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 1
2019-05-02 12:59:07,312 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
2019-05-02 12:59:07,315 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 2561 msecs
2019-05-02 12:59:12,184 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to localhost:9000
2019-05-02 12:59:12,337 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
2019-05-02 12:59:12,530 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9000
2019-05-02 12:59:15,878 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemState, ReplicatedBlocksState and ECBlockGroupsState MBeans.
2019-05-02 12:59:15,892 INFO org.apache.hadoop.hdfs.server.common.Util: Assuming 'file' scheme for path /home/gangl/hadoop/name in configuration.
2019-05-02 12:59:16,390 INFO org.apache.hadoop.hdfs.server.namenode.LeaseManager: Number of blocks under construction: 0
2019-05-02 12:59:16,410 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getSize: (0)
2019-05-02 12:59:16,508 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: initializing replication queues
2019-05-02 12:59:16,512 INFO org.apache.hadoop.hdfs.StateChange: STATE* Leaving safe mode after 0 secs
2019-05-02 12:59:16,514 INFO org.apache.hadoop.hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
2019-05-02 12:59:16,519 INFO org.apache.hadoop.hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
2019-05-02 12:59:16,533 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getSize: (0)
2019-05-02 12:59:16,542 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getSize: (0)
2019-05-02 12:59:16,543 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Total number of blocks            = 0
2019-05-02 12:59:16,547 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of invalid blocks          = 0
2019-05-02 12:59:16,549 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of under-replicated blocks = 0
2019-05-02 12:59:16,551 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of  over-replicated blocks = 0
2019-05-02 12:59:16,553 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Number of blocks being written    = 0
2019-05-02 12:59:16,555 INFO org.apache.hadoop.hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 41 msec
2019-05-02 12:59:16,895 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getINodesNum [GET]: (1)
2019-05-02 12:59:16,914 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getSize: (0)
2019-05-02 12:59:16,948 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2019-05-02 12:59:16,952 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
2019-05-02 12:59:17,012 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: NameNode RPC up at: localhost/127.0.0.1:9000
2019-05-02 12:59:17,059 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Starting services required for active state
2019-05-02 12:59:17,061 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: Initializing quota with 4 thread(s)
2019-05-02 12:59:17,136 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16385)
2019-05-02 12:59:17,154 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: Quota initialization completed in 91 milliseconds
name space=1
storage space=0
storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0, PROVIDED=0
2019-05-02 12:59:17,288 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
2019-05-02 12:59:22,121 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(127.0.0.1:9866, datanodeUuid=946a32ac-0b48-4e9c-828b-41fc9a39541e, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-fe7c640d-3c8f-4f3b-bd1d-af07d8a5dc59;nsid=1603389028;c=1556801888712) storage 946a32ac-0b48-4e9c-828b-41fc9a39541e
2019-05-02 12:59:22,276 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/127.0.0.1:9866
2019-05-02 12:59:22,284 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockReportLeaseManager: Registered DN 946a32ac-0b48-4e9c-828b-41fc9a39541e (127.0.0.1:9866).
2019-05-02 12:59:23,546 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-38e7a90f-d312-4673-92b9-bfdc886e1cd0 for DN 127.0.0.1:9866
2019-05-02 12:59:24,187 INFO BlockStateChange: BLOCK* processReport 0x1d13e82e4111d100: Processing first storage report for DS-38e7a90f-d312-4673-92b9-bfdc886e1cd0 from datanode 946a32ac-0b48-4e9c-828b-41fc9a39541e
2019-05-02 12:59:24,383 INFO BlockStateChange: BLOCK* processReport 0x1d13e82e4111d100: from storage DS-38e7a90f-d312-4673-92b9-bfdc886e1cd0 node DatanodeRegistration(127.0.0.1:9866, datanodeUuid=946a32ac-0b48-4e9c-828b-41fc9a39541e, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-fe7c640d-3c8f-4f3b-bd1d-af07d8a5dc59;nsid=1603389028;c=1556801888712), blocks: 0, hasStaleStorage: false, processing time: 217 msecs, invalidatedBlocks: 0
2019-05-02 13:00:27,397 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode is already OFF
2019-05-02 13:00:27,864 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:27,879 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:27,911 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16386)
2019-05-02 13:00:27,913 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16386],16385)
2019-05-02 13:00:27,917 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:27,921 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16386)
2019-05-02 13:00:27,923 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16386,16385,nnThroughputBenchmark)
2019-05-02 13:00:27,936 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16386,1099511693805)
2019-05-02 13:00:27,944 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16385)
2019-05-02 13:00:27,954 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16387)
2019-05-02 13:00:27,957 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16387],16386)
2019-05-02 13:00:27,960 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:27,965 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16387)
2019-05-02 13:00:27,965 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16387,16386,create)
2019-05-02 13:00:27,978 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16387,1099511693805)
2019-05-02 13:00:27,984 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16386)
2019-05-02 13:00:27,992 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16388)
2019-05-02 13:00:27,995 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388],16387)
2019-05-02 13:00:27,997 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,001 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16388)
2019-05-02 13:00:28,002 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16388,16387,ThroughputBenchDir0)
2019-05-02 13:00:28,011 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16388,1099511693805)
2019-05-02 13:00:28,017 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16387)
2019-05-02 13:00:28,034 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16389)
2019-05-02 13:00:28,055 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [UPDATE]: (16389,281474976710672)
2019-05-02 13:00:28,062 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [UPDATE]: (16389,ThroughputBench0)
2019-05-02 13:00:28,082 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertUc [UPDATE]: (16389, create-client-0, 127.0.0.1)
2019-05-02 13:00:28,087 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16388)
2019-05-02 13:00:28,104 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16389,0)
2019-05-02 13:00:28,105 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16389,16388,ThroughputBench0)
2019-05-02 13:00:28,114 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16389,1099511693823)
2019-05-02 13:00:28,119 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16388)
2019-05-02 13:00:28,137 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16389, create-client-0)
2019-05-02 13:00:28,158 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16389,0)
2019-05-02 13:00:28,161 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16389,1556802028020)
2019-05-02 13:00:28,166 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16389,1556802028020)
2019-05-02 13:00:28,300 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,305 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:28,308 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,310 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,313 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,315 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:28,321 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:28,323 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16389,16388)
2019-05-02 13:00:28,325 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16389,ThroughputBench0)
2019-05-02 13:00:28,330 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16389,16388,ThroughputBench0)
2019-05-02 13:00:28,332 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:28,335 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16388,ThroughputBenchDir0)
2019-05-02 13:00:28,338 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16388,16387)
2019-05-02 13:00:28,341 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16388,16387,ThroughputBenchDir0)
2019-05-02 13:00:28,344 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,347 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-05-02 13:00:28,350 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:28,352 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16387,16386,create)
2019-05-02 13:00:28,355 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,357 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16386,nnThroughputBenchmark)
2019-05-02 13:00:28,360 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16386,16385)
2019-05-02 13:00:28,362 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16386,16385,nnThroughputBenchmark)
2019-05-02 13:00:28,365 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,367 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,370 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16389, create-client-0)
2019-05-02 13:00:28,379 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16389, create-client-0)
2019-05-02 13:00:28,396 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeUc [UPDATE]: (16389)
2019-05-02 13:00:28,403 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16389,1556802028381)
2019-05-02 13:00:28,408 INFO org.apache.hadoop.hdfs.StateChange: DIR* completeFile: /nnThroughputBenchmark/create/ThroughputBenchDir0/ThroughputBench0 is closed by create-client-0
2019-05-02 13:00:28,429 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,433 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:28,436 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,439 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,442 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,465 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16390)
2019-05-02 13:00:28,470 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [UPDATE]: (16390,281474976710672)
2019-05-02 13:00:28,475 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [UPDATE]: (16390,ThroughputBench1)
2019-05-02 13:00:28,482 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertUc [UPDATE]: (16390, create-client-0, 127.0.0.1)
2019-05-02 13:00:28,485 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389],16388)
2019-05-02 13:00:28,487 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:28,492 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16390,0)
2019-05-02 13:00:28,492 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16390,16388,ThroughputBench1)
2019-05-02 13:00:28,500 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16388,1099511693805)
2019-05-02 13:00:28,505 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16390,1099511693823)
2019-05-02 13:00:28,510 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16388)
2019-05-02 13:00:28,515 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16390, create-client-0)
2019-05-02 13:00:28,518 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16390,0)
2019-05-02 13:00:28,520 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16390,1556802028454)
2019-05-02 13:00:28,523 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16390,1556802028454)
2019-05-02 13:00:28,529 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,533 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:28,536 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,538 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,540 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,543 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:28,547 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:28,549 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16390,16388)
2019-05-02 13:00:28,551 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16390,ThroughputBench1)
2019-05-02 13:00:28,554 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16390,16388,ThroughputBench1)
2019-05-02 13:00:28,556 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:28,559 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16388,ThroughputBenchDir0)
2019-05-02 13:00:28,561 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16388,16387)
2019-05-02 13:00:28,564 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16388,16387,ThroughputBenchDir0)
2019-05-02 13:00:28,575 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,578 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-05-02 13:00:28,580 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:28,583 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16387,16386,create)
2019-05-02 13:00:28,585 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,587 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16386,nnThroughputBenchmark)
2019-05-02 13:00:28,589 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16386,16385)
2019-05-02 13:00:28,592 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16386,16385,nnThroughputBenchmark)
2019-05-02 13:00:28,594 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,596 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,598 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16390, create-client-0)
2019-05-02 13:00:28,607 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16390, create-client-0)
2019-05-02 13:00:28,617 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeUc [UPDATE]: (16390)
2019-05-02 13:00:28,622 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16390,1556802028608)
2019-05-02 13:00:28,626 INFO org.apache.hadoop.hdfs.StateChange: DIR* completeFile: /nnThroughputBenchmark/create/ThroughputBenchDir0/ThroughputBench1 is closed by create-client-0
2019-05-02 13:00:28,635 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,639 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:28,643 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,652 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,656 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,691 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16391)
2019-05-02 13:00:28,699 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [UPDATE]: (16391,281474976710672)
2019-05-02 13:00:28,705 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [UPDATE]: (16391,ThroughputBench2)
2019-05-02 13:00:28,714 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertUc [UPDATE]: (16391, create-client-0, 127.0.0.1)
2019-05-02 13:00:28,717 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390],16388)
2019-05-02 13:00:28,730 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:28,737 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:28,741 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16391,0)
2019-05-02 13:00:28,745 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16391,16388,ThroughputBench2)
2019-05-02 13:00:28,762 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16388,1099511693805)
2019-05-02 13:00:28,787 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16391,1099511693823)
2019-05-02 13:00:28,793 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16388)
2019-05-02 13:00:28,799 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16391, create-client-0)
2019-05-02 13:00:28,802 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16391,0)
2019-05-02 13:00:28,805 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16391,1556802028664)
2019-05-02 13:00:28,808 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16391,1556802028664)
2019-05-02 13:00:28,814 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,818 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:28,821 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,830 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,835 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,838 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:28,862 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:28,875 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16391,16388)
2019-05-02 13:00:28,880 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16391,ThroughputBench2)
2019-05-02 13:00:28,883 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16391,16388,ThroughputBench2)
2019-05-02 13:00:28,887 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:28,890 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16388,ThroughputBenchDir0)
2019-05-02 13:00:28,892 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16388,16387)
2019-05-02 13:00:28,915 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16388,16387,ThroughputBenchDir0)
2019-05-02 13:00:28,937 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:28,939 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-05-02 13:00:28,942 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:28,945 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16387,16386,create)
2019-05-02 13:00:28,947 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:28,950 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16386,nnThroughputBenchmark)
2019-05-02 13:00:28,953 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16386,16385)
2019-05-02 13:00:28,955 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16386,16385,nnThroughputBenchmark)
2019-05-02 13:00:28,958 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:28,960 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:28,963 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16391, create-client-0)
2019-05-02 13:00:28,972 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16391, create-client-0)
2019-05-02 13:00:28,982 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeUc [UPDATE]: (16391)
2019-05-02 13:00:28,989 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16391,1556802028973)
2019-05-02 13:00:29,006 INFO org.apache.hadoop.hdfs.StateChange: DIR* completeFile: /nnThroughputBenchmark/create/ThroughputBenchDir0/ThroughputBench2 is closed by create-client-0
2019-05-02 13:00:29,016 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,022 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,026 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,029 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,033 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,067 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16392)
2019-05-02 13:00:29,074 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [UPDATE]: (16392,281474976710672)
2019-05-02 13:00:29,079 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [UPDATE]: (16392,ThroughputBench3)
2019-05-02 13:00:29,086 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertUc [UPDATE]: (16392, create-client-0, 127.0.0.1)
2019-05-02 13:00:29,088 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390, 16391],16388)
2019-05-02 13:00:29,091 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,094 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,097 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,104 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16392,0)
2019-05-02 13:00:29,105 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16392,16388,ThroughputBench3)
2019-05-02 13:00:29,111 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16388,1099511693805)
2019-05-02 13:00:29,117 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16392,1099511693823)
2019-05-02 13:00:29,123 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16388)
2019-05-02 13:00:29,127 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16392, create-client-0)
2019-05-02 13:00:29,130 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16392,0)
2019-05-02 13:00:29,134 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16392,1556802029049)
2019-05-02 13:00:29,136 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16392,1556802029049)
2019-05-02 13:00:29,142 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,147 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,149 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,152 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,154 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,156 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,160 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,163 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16392,16388)
2019-05-02 13:00:29,165 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16392,ThroughputBench3)
2019-05-02 13:00:29,168 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16392,16388,ThroughputBench3)
2019-05-02 13:00:29,170 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,173 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16388,ThroughputBenchDir0)
2019-05-02 13:00:29,175 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16388,16387)
2019-05-02 13:00:29,178 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16388,16387,ThroughputBenchDir0)
2019-05-02 13:00:29,180 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,183 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-05-02 13:00:29,185 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:29,188 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16387,16386,create)
2019-05-02 13:00:29,190 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,193 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16386,nnThroughputBenchmark)
2019-05-02 13:00:29,195 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16386,16385)
2019-05-02 13:00:29,198 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16386,16385,nnThroughputBenchmark)
2019-05-02 13:00:29,200 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,202 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,204 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16392, create-client-0)
2019-05-02 13:00:29,214 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16392, create-client-0)
2019-05-02 13:00:29,222 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeUc [UPDATE]: (16392)
2019-05-02 13:00:29,234 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16392,1556802029215)
2019-05-02 13:00:29,239 INFO org.apache.hadoop.hdfs.StateChange: DIR* completeFile: /nnThroughputBenchmark/create/ThroughputBenchDir0/ThroughputBench3 is closed by create-client-0
2019-05-02 13:00:29,247 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,251 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,253 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,256 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,267 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-05-02 13:00:29,274 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16393)
2019-05-02 13:00:29,276 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:29,278 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,281 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,285 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16393)
2019-05-02 13:00:29,286 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16393,16387,ThroughputBenchDir1)
2019-05-02 13:00:29,294 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16393,1099511693805)
2019-05-02 13:00:29,299 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16387)
2019-05-02 13:00:29,315 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertInode: (16394)
2019-05-02 13:00:29,320 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [UPDATE]: (16394,281474976710672)
2019-05-02 13:00:29,325 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [UPDATE]: (16394,ThroughputBench4)
2019-05-02 13:00:29,331 INFO org.apache.hadoop.hdfs.db.DatabaseINode: insertUc [UPDATE]: (16394, create-client-0, 127.0.0.1)
2019-05-02 13:00:29,333 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([],16393)
2019-05-02 13:00:29,337 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16394,0)
2019-05-02 13:00:29,339 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16394,16393,ThroughputBench4)
2019-05-02 13:00:29,347 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [UPDATE]: (16394,1099511693823)
2019-05-02 13:00:29,351 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16393)
2019-05-02 13:00:29,355 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16394, create-client-0)
2019-05-02 13:00:29,358 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16394,0)
2019-05-02 13:00:29,362 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16394,1556802029302)
2019-05-02 13:00:29,364 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16394,1556802029302)
2019-05-02 13:00:29,370 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,374 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,378 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,382 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,386 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,388 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,392 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,394 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16394,16393)
2019-05-02 13:00:29,397 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16394,ThroughputBench4)
2019-05-02 13:00:29,400 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16394,16393,ThroughputBench4)
2019-05-02 13:00:29,402 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,404 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16393,ThroughputBenchDir1)
2019-05-02 13:00:29,406 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16393,16387)
2019-05-02 13:00:29,410 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16393,16387,ThroughputBenchDir1)
2019-05-02 13:00:29,415 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,418 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-05-02 13:00:29,426 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:29,428 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16387,16386,create)
2019-05-02 13:00:29,431 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,433 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16386,nnThroughputBenchmark)
2019-05-02 13:00:29,436 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16386,16385)
2019-05-02 13:00:29,438 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChild: (16386,16385,nnThroughputBenchmark)
2019-05-02 13:00:29,441 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,443 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,446 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16394, create-client-0)
2019-05-02 13:00:29,454 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getUcClientName [GET]: (16394, create-client-0)
2019-05-02 13:00:29,462 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeUc [UPDATE]: (16394)
2019-05-02 13:00:29,466 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16394,1556802029455)
2019-05-02 13:00:29,470 INFO org.apache.hadoop.hdfs.StateChange: DIR* completeFile: /nnThroughputBenchmark/create/ThroughputBenchDir1/ThroughputBench4 is closed by create-client-0
2019-05-02 13:00:29,474 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode is already OFF
2019-05-02 13:00:29,495 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,499 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,501 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,504 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,523 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:29,525 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,527 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,530 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16387,1556802029268)
2019-05-02 13:00:29,533 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16387,0)
2019-05-02 13:00:29,537 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-05-02 13:00:29,554 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,558 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,563 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,585 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,588 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,591 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,594 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,596 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,600 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,602 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,617 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16387],16386)
2019-05-02 13:00:29,619 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,624 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:29,626 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,630 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390, 16391, 16392],16388)
2019-05-02 13:00:29,635 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,637 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,641 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,643 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,647 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,650 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,654 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,657 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,661 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,665 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16394],16393)
2019-05-02 13:00:29,667 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,670 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,685 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16387,create)
2019-05-02 13:00:29,690 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:29,693 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,702 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390, 16391, 16392],16388)
2019-05-02 13:00:29,705 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,709 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,713 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,715 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,719 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,722 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,734 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,736 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,740 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,744 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16394],16393)
2019-05-02 13:00:29,746 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,749 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,754 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:29,757 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,761 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390, 16391, 16392],16388)
2019-05-02 13:00:29,763 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,766 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,770 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,772 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:29,776 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,778 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:29,783 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,786 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:29,790 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,793 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16394],16393)
2019-05-02 13:00:29,795 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,798 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:29,802 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:29,803 INFO org.apache.hadoop.hdfs.db.DatabaseINode: addChild: [OK] UPSERT (16387,16386,open)
2019-05-02 13:00:29,809 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-05-02 13:00:29,821 INFO org.apache.hadoop.hdfs.db.DatabaseINode: updateModificationTime [UPDATE]: (16386)
2019-05-02 13:00:29,825 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16386,1556802029268)
2019-05-02 13:00:29,831 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16386,1556802029603)
2019-05-02 13:00:29,834 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16386,1556802029603)
2019-05-02 13:00:29,837 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16387,1556802029268)
2019-05-02 13:00:29,839 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16387,0)
2019-05-02 13:00:29,875 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,879 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,882 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,887 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,900 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:29,903 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,906 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:29,908 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16387,1556802029268)
2019-05-02 13:00:29,911 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16387,0)
2019-05-02 13:00:29,913 INFO org.apache.hadoop.hdfs.db.DatabaseINode: permission [GET]: (16387,1099511693805)
2019-05-02 13:00:29,938 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:29,945 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:29,950 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:29,952 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:29,956 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:29,959 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,976 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16389,0)
2019-05-02 13:00:29,979 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:29,983 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16389,1556802028020)
2019-05-02 13:00:30,019 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:30,023 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:30,026 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:30,028 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:30,031 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:30,033 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:30,036 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16390,0)
2019-05-02 13:00:30,041 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:30,045 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16390,1556802028454)
2019-05-02 13:00:30,050 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:30,054 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:30,056 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:30,059 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:30,061 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:30,063 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:30,073 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16391,0)
2019-05-02 13:00:30,077 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:30,081 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16391,1556802028664)
2019-05-02 13:00:30,086 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:30,089 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:30,092 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:30,094 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:30,096 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:30,099 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:30,103 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16392,0)
2019-05-02 13:00:30,107 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:30,110 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16392,1556802029049)
2019-05-02 13:00:30,115 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:30,119 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:30,121 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:30,123 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:30,126 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:30,128 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:30,132 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16394,0)
2019-05-02 13:00:30,135 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:30,139 INFO org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16394,1556802029302)
2019-05-02 13:00:30,144 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode is already OFF
2019-05-02 13:00:30,166 INFO org.apache.hadoop.hdfs.db.DatabaseINode: name [GET]: (16385,null)
2019-05-02 13:00:30,169 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-02 13:00:30,172 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16386,0)
2019-05-02 13:00:30,174 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16387,0)
2019-05-02 13:00:30,176 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:30,178 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:30,180 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:30,226 INFO org.apache.hadoop.hdfs.db.DatabaseINode: parent [GET]: (16387,16386)
2019-05-02 13:00:30,229 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [GET]: (16386,1556802029603)
2019-05-02 13:00:30,233 INFO org.apache.hadoop.hdfs.db.DatabaseINode: modificationTime [UPDATE]: (16386,1556802030181)
2019-05-02 13:00:30,236 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16388, 16393],16387)
2019-05-02 13:00:30,238 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16388,0)
2019-05-02 13:00:30,241 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16393,0)
2019-05-02 13:00:30,243 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16389, 16390, 16391, 16392],16388)
2019-05-02 13:00:30,245 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:30,247 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:30,249 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:30,251 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:30,254 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16389,281474976710672)
2019-05-02 13:00:30,272 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: deleteViaBcId: (16389,DELETE FROM inode2block WHERE id = ?;)
2019-05-02 13:00:30,276 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16390,281474976710672)
2019-05-02 13:00:30,283 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: deleteViaBcId: (16390,DELETE FROM inode2block WHERE id = ?;)
2019-05-02 13:00:30,286 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16391,281474976710672)
2019-05-02 13:00:30,293 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: deleteViaBcId: (16391,DELETE FROM inode2block WHERE id = ?;)
2019-05-02 13:00:30,296 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16392,281474976710672)
2019-05-02 13:00:30,304 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: deleteViaBcId: (16392,DELETE FROM inode2block WHERE id = ?;)
2019-05-02 13:00:30,307 INFO org.apache.hadoop.hdfs.db.DatabaseINode: getChildrenIds: ([16394],16393)
2019-05-02 13:00:30,309 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:30,311 INFO org.apache.hadoop.hdfs.db.DatabaseINode: header [GET]: (16394,281474976710672)
2019-05-02 13:00:30,328 INFO org.apache.hadoop.hdfs.db.DatabaseINode2Block: deleteViaBcId: (16394,DELETE FROM inode2block WHERE id = ?;)
2019-05-02 13:00:30,360 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeChild Return: 1
2019-05-02 13:00:30,362 INFO org.apache.hadoop.hdfs.db.DatabaseINode: removeChild: 16387
2019-05-02 13:00:35,216 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Roll Edit Log from 127.0.0.1
2019-05-02 13:00:35,217 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Rolling edit logs
2019-05-02 13:00:35,218 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Ending log segment 1, 1
2019-05-02 13:00:35,220 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Number of transactions: 2 Total time for transactions(ms): 10 Number of transactions batched in Syncs: 0 Number of syncs: 2 SyncTimes(ms): 32 
2019-05-02 13:00:35,223 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Number of transactions: 2 Total time for transactions(ms): 10 Number of transactions batched in Syncs: 0 Number of syncs: 3 SyncTimes(ms): 34 
2019-05-02 13:00:35,239 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /home/gangl/hadoop/name/current/edits_inprogress_0000000000000000001 -> /home/gangl/hadoop/name/current/edits_0000000000000000001-0000000000000000002
2019-05-02 13:00:35,248 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 3
2019-05-02 13:00:35,942 INFO org.apache.hadoop.hdfs.server.namenode.TransferFsImage: Sending fileName: /home/gangl/hadoop/name/current/fsimage_0000000000000000000, fileSize: 98. Sent total: 98 bytes. Size of last segment intended to send: -1 bytes.
2019-05-02 13:00:36,097 INFO org.apache.hadoop.hdfs.server.namenode.TransferFsImage: Sending fileName: /home/gangl/hadoop/name/current/edits_0000000000000000001-0000000000000000002, fileSize: 42. Sent total: 42 bytes. Size of last segment intended to send: -1 bytes.
2019-05-02 13:00:37,003 INFO org.apache.hadoop.hdfs.server.common.Util: Combined time for file download and fsync to all disks took 0.01s. The file download took 0.00s at 0.00 KB/s. Synchronous (fsync) write to disk of /home/gangl/hadoop/name/current/fsimage.ckpt_0000000000000000002 took 0.00s.
2019-05-02 13:00:37,007 INFO org.apache.hadoop.hdfs.server.namenode.TransferFsImage: Downloaded file fsimage.ckpt_0000000000000000002 size 98 bytes.
2019-05-02 13:00:37,095 INFO org.apache.hadoop.hdfs.server.namenode.NNStorageRetentionManager: Going to retain 2 images with txid >= 0
2019-05-02 13:01:49,222 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
2019-05-02 13:01:49,248 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
************************************************************/
gangliao commented 5 years ago
2019-05-05 20:45:30,504 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-05-05 20:45:34,778 INFO namenode.NNThroughputBenchmark: Starting benchmark: open
2019-05-05 20:45:35,305 INFO namenode.NNThroughputBenchmark: Generate 100 intputs for create
2019-05-05 20:45:35,305 ERROR namenode.NNThroughputBenchmark: Log level = ERROR
2019-05-05 20:45:35,653 INFO namenode.NNThroughputBenchmark: Starting 100 create(s).
2019-05-05 20:45:53,497 INFO namenode.NNThroughputBenchmark: Created 100 files.
2019-05-05 20:45:53,501 INFO namenode.NNThroughputBenchmark: Generate 100 intputs for open
2019-05-05 20:45:56,247 ERROR namenode.NNThroughputBenchmark: Log level = INFO
2019-05-05 20:45:56,247 INFO namenode.NNThroughputBenchmark: Starting 100 open(s).
2019-05-05 20:45:57,966 INFO namenode.NNThroughputBenchmark: Memory Used: -20039616
2019-05-05 20:45:57,969 INFO namenode.NNThroughputBenchmark:
2019-05-05 20:45:57,970 INFO namenode.NNThroughputBenchmark: --- open inputs ---
2019-05-05 20:45:57,970 INFO namenode.NNThroughputBenchmark: nrFiles = 100
2019-05-05 20:45:57,970 INFO namenode.NNThroughputBenchmark: nrThreads = 10
2019-05-05 20:45:57,971 INFO namenode.NNThroughputBenchmark: nrFilesPerDir = 4
2019-05-05 20:45:57,971 INFO namenode.NNThroughputBenchmark: --- open stats  ---
2019-05-05 20:45:57,972 INFO namenode.NNThroughputBenchmark: # operations: 100
2019-05-05 20:45:57,972 INFO namenode.NNThroughputBenchmark: Elapsed Time: 1719
2019-05-05 20:45:57,972 INFO namenode.NNThroughputBenchmark:  Ops per sec: 58.17335660267597
2019-05-05 20:45:57,972 INFO namenode.NNThroughputBenchmark: Average Time: 163
gangliao commented 5 years ago

Open: average 40 ms

2019-05-13 18:38:36,871 DEBUG org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: a metric is reported: cmd: open user: gangl (auth:SIMPLE)
2019-05-13 18:38:36,872 DEBUG org.apache.hadoop.hdfs.server.namenode.top.TopAuditLogger: ------------------- logged event for top service: allowed=true ugi=gangl (auth:SIMPLE) ip=/127.0.0.1   cmd=open    src=/nnThroughputBenchmark/open/ThroughputBenchDir2/ThroughputBench8    dst=null    perm=null
2019-05-13 18:38:36,873 DEBUG org.apache.hadoop.ipc.Server: Served: getBlockLocations, queueTime= 2 procesingTime= 27
2019-05-13 18:38:36,875 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on default port 9000: responding to Call#34 Retry#0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:36514
2019-05-13 18:38:36,876 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on default port 9000: responding to Call#34 Retry#0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:36514 Wrote 40 bytes.
2019-05-13 18:38:36,880 DEBUG org.apache.hadoop.ipc.Server:  got #35
2019-05-13 18:38:36,881 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 5 on default port 9000: Call#35 Retry#0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations from 127.0.0.1:36514 for RpcKind RPC_PROTOCOL_BUFFER
2019-05-13 18:38:36,881 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:gangl (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2727)
2019-05-13 18:38:36,883 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16385) from Pool via borrowActiveObject
2019-05-13 18:38:36,887 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode: getChildIdsByPath: 16385
2019-05-13 18:38:36,888 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16386) from Pool via borrowActiveObject
2019-05-13 18:38:36,889 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16514) from Pool via borrowActiveObject
2019-05-13 18:38:36,890 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16525) from Pool via borrowActiveObject
2019-05-13 18:38:36,891 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeFileKeyedObjectPool: get INodeFile Object (16527) from Pool via borrowActiveObject
2019-05-13 18:38:36,895 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16527,0)
2019-05-13 18:38:36,900 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode2Block: getBlockIds: (16527, [])
2019-05-13 18:38:36,903 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16527,1557772169484)
gangliao commented 5 years ago

getNumBlocks and getBlockIds are too slow:

2019-05-13 18:38:36,895 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode2Block: getNumBlocks: (16527,0)
2019-05-13 18:38:36,900 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode2Block: getBlockIds: (16527, [])
2019-05-13 18:38:36,903 DEBUG org.apache.hadoop.hdfs.db.DatabaseINode: accessTime [GET]: (16527,1557772169484)

TODO solution: use stored procedures to fetch the block count and block IDs in one round trip
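The idea is to fold the back-to-back getNumBlocks and getBlockIds queries into a single server-side call that returns the whole block list, so the caller derives the count locally instead of issuing a second query. A minimal stand-in sketch of that single-round-trip shape, using an in-memory map in place of the real inode2block table (class and method names here are hypothetical):

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Stand-in for the inode2block mapping. The point of the sketch:
// one lookup returns the full block-id list, so both the count and
// the ids come back in a single round trip.
class INode2BlockSketch {
    private final Map<Long, List<Long>> inode2block = new ConcurrentHashMap<>();

    void put(long inodeId, List<Long> blockIds) {
        inode2block.put(inodeId, blockIds);
    }

    // Replaces the getNumBlocks + getBlockIds pair: the caller uses
    // getBlocks(id).size() for the count instead of a second query.
    List<Long> getBlocks(long inodeId) {
        return inode2block.getOrDefault(inodeId, List.of());
    }
}
```

With VoltDB, the equivalent would be a single stored procedure (a `VoltProcedure` subclass) whose `run()` queues one SELECT over `inode2block` and returns the block list in one invocation, rather than two separate ad-hoc queries.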

gangliao commented 5 years ago

Borrowing from the object pool is too slow (~1 ms per call):

2019-05-13 18:38:36,888 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16386) from Pool via borrowActiveObject
2019-05-13 18:38:36,889 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16514) from Pool via borrowActiveObject
2019-05-13 18:38:36,890 DEBUG org.apache.hadoop.hdfs.server.namenode.INodeDirectoryKeyedObjectPool: get INodeDirectory Object (16525) from Pool via borrowActiveObject

TODO solution: use a high-performance local cache: Caffeine https://github.com/ben-manes/caffeine
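The proposal is to replace the per-lookup `borrowActiveObject` round trip with a local read-through cache keyed by inode id, so repeated opens of hot directories hit memory. A minimal stdlib sketch of that pattern (the real proposal would use Caffeine's `LoadingCache`; names below are hypothetical):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal stand-in for a Caffeine-style loading cache keyed by inode id.
class INodeCacheSketch<V> {
    private final Map<Long, V> cache = new ConcurrentHashMap<>();
    private final Function<Long, V> loader; // invoked only on a miss

    INodeCacheSketch(Function<Long, V> loader) {
        this.loader = loader;
    }

    // On a hit this returns immediately from the local map, avoiding the
    // ~1 ms borrowActiveObject call; the loader runs once per missing key.
    V get(long inodeId) {
        return cache.computeIfAbsent(inodeId, loader);
    }
}
```

With Caffeine itself the same shape is roughly `Caffeine.newBuilder().maximumSize(10_000).build(id -> loadINode(id))`, which adds bounded size and eviction on top of this sketch.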

gangliao commented 5 years ago

getChildIdsByPath is too slow:

TODO solution: bring the child IDs into the local cache
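With the child IDs cached locally, path resolution becomes a pure in-memory walk from the root inode, with a database fallback only on a cache miss. A hypothetical sketch of that loop (the `(parentId, childName) -> childId` cache layout and all names are assumptions, not FileScale's actual structures; 16385 is the root inode id seen in the logs):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: resolve an absolute path to an inode id using a
// locally cached (parentId, childName) -> childId map, instead of one
// getChildIdsByPath round trip to the database per lookup.
class PathResolverSketch {
    static final long ROOT_ID = 16385L; // root inode id from the logs
    private final Map<String, Long> childCache = new HashMap<>();

    void cacheChild(long parentId, String name, long childId) {
        childCache.put(parentId + "/" + name, childId);
    }

    // Returns the inode id for an absolute path, or -1 if any component
    // is missing from the cache (caller would then fall back to the DB).
    long resolve(String path) {
        long current = ROOT_ID;
        for (String component : path.split("/")) {
            if (component.isEmpty()) continue; // leading "/" yields ""
            Long next = childCache.get(current + "/" + component);
            if (next == null) return -1;
            current = next;
        }
        return current;
    }
}
```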