1. Create a table in HBase
For example: create 'test','info'
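Note that the import commands in step 5 write to a table named test2 using the column families info and cf. Assuming that table was created the same way (the original post never shows it), the HBase shell session might look like this:

```
hbase(main):001:0> create 'test2', 'info', 'cf'
```

ImportTsv will fail if the target table or a referenced column family does not exist, so this has to happen before step 5.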
2. Configure the environment
In the Hadoop installation directory, find the hadoop-env.sh configuration file and add the following lines to it:
export HBASE_HOME=/usr/hbase
export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.12.jar:$HBASE_HOME/hbase-0.94.12-tests.jar:$HBASE_HOME/conf:${HBASE_HOME}/lib/zookeeper-3.4.5.jar:${HBASE_HOME}/lib/guava-11.0.2.jar
Strictly speaking, this hadoop-env.sh change can be skipped; in fact, once it is in place Hive will fail on startup, so if you use Hive you will need to set up the classpath some other way.
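One such alternative (my own suggestion, not something the original post spells out) is to export the classpath only in the shell session where you run the import job, so the environment Hive starts under is never touched:

```shell
# Session-only alternative to editing hadoop-env.sh (assumption: the same
# /usr/hbase paths as above). These exports disappear when the shell exits,
# so Hive's startup environment is left alone.
export HBASE_HOME=/usr/hbase
export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.12.jar:$HBASE_HOME/hbase-0.94.12-tests.jar:$HBASE_HOME/conf:$HBASE_HOME/lib/zookeeper-3.4.5.jar:$HBASE_HOME/lib/guava-11.0.2.jar
```

Run the step-5 command from this same shell so the exported classpath is picked up.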
Then copy the jars: copy HBase's hbase-0.94.12.jar into Hadoop's lib directory, and copy hbase-0.94.12-tests.jar into Hadoop's lib directory as well.
Also copy HBase's configuration file hbase-site.xml into Hadoop's conf directory.
3. Restart Hadoop
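The copy-and-restart steps above can be sketched as a few shell commands. Assumptions on my part: HBase lives in /usr/hbase, Hadoop in /usr/hadoop, and the Hadoop-1.x-era stop-all.sh/start-all.sh scripts are in use (which matches the HBase 0.94 era); adjust the paths to your own installation:

```shell
HBASE_HOME=/usr/hbase     # assumption: adjust to your install
HADOOP_HOME=/usr/hadoop   # assumption: adjust to your install

# Step 2: make HBase's classes and cluster settings visible to Hadoop.
cp "$HBASE_HOME/hbase-0.94.12.jar"       "$HADOOP_HOME/lib/"
cp "$HBASE_HOME/hbase-0.94.12-tests.jar" "$HADOOP_HOME/lib/"
cp "$HBASE_HOME/conf/hbase-site.xml"     "$HADOOP_HOME/conf/"

# Step 3: restart Hadoop so the new jars are picked up.
"$HADOOP_HOME/bin/stop-all.sh"
"$HADOOP_HOME/bin/start-all.sh"
```

This requires a real installation with those directories present, so treat it as a checklist rather than a script to paste blindly.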
4. Upload the files you need to HDFS. I used Eclipse to upload them, but you can also run: hadoop fs -put test3.dat /application/logAnalyse/test/
5. From the lib directory of your HBase installation, run one of the following commands:
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=info:userid,HBASE_ROW_KEY,info:netid test2 /application/logAnalyse/test/test3.dat
or:
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=HBASE_ROW_KEY,cf:c1,cf:c2 -Dimporttsv.separator=, test2 /application/logAnalyse/test/test3.txt
After that, run scan 'test2' in the HBase shell and you will see that the data is there.
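ImportTsv maps each input field to the -Dimporttsv.columns list by position, so it helps to see what an input file could look like. The rows below are made-up illustration data (not from the original post) matching the comma-separated variant, where the first field becomes the row key:

```shell
# Hypothetical input for the comma-separated command: field 1 -> row key
# (HBASE_ROW_KEY), field 2 -> cf:c1, field 3 -> cf:c2.
cat > test3.txt <<'EOF'
row1,alice,42
row2,bob,17
EOF

# The tab-separated variant instead maps fields as
# info:userid, HBASE_ROW_KEY, info:netid -- there the row key is the
# SECOND tab-delimited field of each line.
```

Upload the file as in step 4 (hadoop fs -put test3.txt /application/logAnalyse/test/) before running the import.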