## web interface
### port numbers
* name node: 50070
* job tracker: 50030
* data node: 50075
* task tracker: 50060
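Assuming the daemons listen on their default Hadoop 1.x ports listed above, the web UI URLs can be composed like this (the `localhost` host and the helper itself are illustrative, not part of Hadoop):

```python
# Default Hadoop 1.x web UI ports (from the list above).
DEFAULT_PORTS = {
    "namenode": 50070,
    "jobtracker": 50030,
    "datanode": 50075,
    "tasktracker": 50060,
}

def web_ui_url(daemon, host="localhost", path="/"):
    """Build the HTTP URL for a daemon's built-in web interface."""
    port = DEFAULT_PORTS[daemon]
    return f"http://{host}:{port}{path}"

print(web_ui_url("namenode"))                 # http://localhost:50070/
print(web_ui_url("jobtracker", path="/jmx"))  # http://localhost:50030/jmx
```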
### operations
* thread dump: /stacks
* log level: /logLevel
* log: /logs
* metrics: /metrics
* jmx: /jmx
* configuration: /conf
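The /jmx endpoint returns a JSON document with a top-level `beans` array. A minimal sketch of picking one bean out of a response (the sample payload below is trimmed and its values are made up; a real script would fetch the JSON from the NameNode's /jmx URL):

```python
import json

# Trimmed, hypothetical sample of a /jmx response body.
sample = """
{ "beans": [
    { "name": "Hadoop:service=NameNode,name=FSNamesystemState",
      "CapacityTotal": 1000, "CapacityUsed": 250 }
] }
"""

def find_bean(payload, name):
    """Return the first JMX bean whose 'name' attribute matches, else None."""
    for bean in json.loads(payload)["beans"]:
        if bean["name"] == name:
            return bean
    return None

bean = find_bean(sample, "Hadoop:service=NameNode,name=FSNamesystemState")
print(bean["CapacityUsed"])  # 250
```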
## command
### hdfs
* hdfs dfsadmin -safemode enter/leave/get
* hdfs dfsadmin -report
* hdfs dfsadmin -saveNamespace # save the current namespace into fsimage and reset the edits log (requires safemode)
* hdfs fsck (hdfs path)
* hdfs fsck (hdfs path) -list-corruptfileblocks
* hdfs fsck (hdfs path) -delete
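A small sketch of checking safemode from a script by parsing the output of `hdfs dfsadmin -safemode get`, which prints a line like `Safe mode is ON`; the helper function is illustrative:

```python
def safemode_is_on(output):
    """Parse the output of `hdfs dfsadmin -safemode get`."""
    return output.strip().endswith("ON")

# In a real script the output would come from the command itself, e.g.:
#   import subprocess
#   output = subprocess.check_output(
#       ["hdfs", "dfsadmin", "-safemode", "get"], text=True)
print(safemode_is_on("Safe mode is ON"))   # True
print(safemode_is_on("Safe mode is OFF"))  # False
```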
### job
* hadoop job -list
* hadoop job -kill (job id)
* hadoop job -list-active-trackers
* mapred job -kill-task (task attempt id) # killed attempts do not count against the retry limit
* mapred job -fail-task (task attempt id) # failed attempts count against the retry limit; the task is retried
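To kill jobs in bulk, the job ids can be scraped from the output of `hadoop job -list`. A sketch under the assumption that data rows begin with `job_` (the sample listing below is fabricated for illustration):

```python
def running_job_ids(listing):
    """Extract job ids from `hadoop job -list` output (row format assumed)."""
    return [line.split()[0] for line in listing.splitlines()
            if line.startswith("job_")]

sample = """1 jobs currently running
JobId State StartTime UserName Priority SchedulingInfo
job_201301010000_0001 1 1357000000000 hdfs NORMAL NA
"""
print(running_job_ids(sample))  # ['job_201301010000_0001']
```

Each returned id can then be passed to `hadoop job -kill (job id)`.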
## operation log
```
$ sudo chown -R hdfs:hadoop /var/lib/hadoop-hdfs/cache/hdfs
$ sudo -u hdfs hdfs namenode -format
$ sudo /etc/init.d/hadoop-hdfs-namenode start
$ sudo /etc/init.d/hadoop-hdfs-datanode start
$ sudo -u hdfs hdfs dfs -mkdir -p hdfs://localhost:8020/var/lib/hadoop-hdfs/cache/mapred/mapred/system
$ sudo -u hdfs hdfs dfs -chown -R mapred:hadoop hdfs://localhost:8020/var/lib/hadoop-hdfs/cache/mapred/mapred/system
$ sudo -u hdfs hdfs dfs -chown -R mapred hdfs://localhost:8020/var/lib/hadoop-hdfs/cache/mapred
$ sudo -u mapred hadoop jar /usr/lib/hadoop-0.20-mapreduce/hadoop-examples.jar pi 3 10000
```