

Getting the Hadoop source package and attaching it to Eclipse

2024-06-28 16:05:42
Reprinted from: http://blog.csdn.net/jiutianhe/article/details/39233609

If we want to do Hadoop development, studying the source code helps us greatly. Without understanding the internals it is a black box, and when a problem comes up we have no thread to pull on. So this article covers two things: I. how to get the source code, and II. how to attach the source in Eclipse.

I. How to get the source code

1. Download the Hadoop Maven source package

(1) From the official site. First download the source package hadoop-2.4.0-src.tar.gz from the official download page. If you are not sure how to navigate it, see the beginner guide introducing the Hadoop site, how to download the various 2.4 releases, and how to browse the Hadoop API.

(2) From a netdisk. It can also be downloaded from: http://pan.baidu.com/s/1kToPuGB

2. Fetch the source via Maven

There are two ways to do this: on the command line, or from within Eclipse. This article mainly covers the command line.

1. Extract the package. Extraction ran into the errors below; they can be ignored, so we press on.
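The unzip failures that follow all come from the Windows limit of 260 characters on the combined path and file name. As a minimal sketch of guarding against this before extracting (the helper name `check_long_paths` is mine, not from the article; paths follow the article's D:/hadoop2 layout):

```shell
# Flag archive entries whose extracted path would exceed the Windows
# 260-character limit. Reads entry paths on stdin, prints offenders.
check_long_paths() {
  root="$1"    # extraction root, e.g. D:/hadoop2/hadoop-2.4.0-src
  limit=260
  while IFS= read -r entry; do
    # full extracted path = root + "/" + entry
    if [ $(( ${#root} + 1 + ${#entry} )) -gt "$limit" ]; then
      printf '%s\n' "$entry"
    fi
  done
}

# Hypothetical usage: list the tarball and report overlong entries first.
# tar -tzf hadoop-2.4.0-src.tar.gz | check_long_paths "D:/hadoop2/hadoop-2.4.0-src"
```

When entries are flagged, the usual workaround is extracting to a shorter root (for example D:/h) so the combined length stays under the limit.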
1: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice/target/classes/org/apache/hadoop/yarn/server/applicationhistoryservice/ApplicationHistoryClientService$ApplicationHSClientProtocolHandler.class: The total length of path and file name cannot exceed 260 characters. The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
2: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice/target/classes/org/apache/hadoop/yarn/server/applicationhistoryservice/timeline/LeveldbTimelineStore$LockMap$CountingReentrantLock.class: The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
3: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice/target/test-classes/org/apache/hadoop/yarn/server/applicationhistoryservice/webapp/TestAHSWebApp$MockApplicationHistoryManagerImpl.class: The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
4: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/target/test-classes/org/apache/hadoop/yarn/server/resourcemanager/monitor/capacity/TestProportionalCapacityPreemptionPolicy$IsPreemptionRequestFor.class: The total length of path and file name cannot exceed 260 characters. The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
5: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/target/test-classes/org/apache/hadoop/yarn/server/resourcemanager/recovery/TestFSRMStateStore$TestFSRMStateStoreTester$TestFileSystemRMStore.class: The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
6: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/target/test-classes/org/apache/hadoop/yarn/server/resourcemanager/recovery/TestZKRMStateStore$TestZKRMStateStoreTester$TestZKRMStateStoreInternal.class: The total length of path and file name cannot exceed 260 characters. The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
7: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/target/test-classes/org/apache/hadoop/yarn/server/resourcemanager/recovery/TestZKRMStateStoreZKClientConnections$TestZKClient$TestForwardingWatcher.class: The total length of path and file name cannot exceed 260 characters. The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
8: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/target/test-classes/org/apache/hadoop/yarn/server/resourcemanager/recovery/TestZKRMStateStoreZKClientConnections$TestZKClient$TestZKRMStateStore.class: The total length of path and file name cannot exceed 260 characters. The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)
9: Cannot create file: D:/hadoop2/hadoop-2.4.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/target/test-classes/org/apache/hadoop/yarn/server/resourcemanager/rmapp/attempt/TestRMAppAttemptTransitions$TestApplicationAttemptEventDispatcher.class: The total length of path and file name cannot exceed 260 characters. The system cannot find the path specified. (D:/hadoop2/hadoop-2.4.0-src.zip)

2. Fetch the source with Maven

Note that Maven needs the JDK and protoc installed first; if they are missing, see the guide on installing Maven and protoc under Win7.

(1) Enter hadoop-2.4.0-src/hadoop-maven-plugins and run mvn install:

D:/hadoop2/hadoop-2.4.0-src/hadoop-maven-plugins>mvn install

It prints output like the following:

[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.apache.hadoop:hadoop-maven-plugins:maven-plugin:2.4.0
[WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-enforcer-plugin @ org.apache.hadoop:hadoop-project:2.4.0, D:/hadoop2/hadoop-2.4.0-src/hadoop-project/pom.xml, line 1015, column 15
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO]
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Maven Plugins 2.4.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-maven-plugins ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-plugin-plugin:3.0:descriptor (default-descriptor) @ hadoop-maven-plugins ---
[INFO] Using 'UTF-8' encoding to read mojo metadata.
[INFO] Applying mojo extractor for language: java-annotations
[INFO] Mojo extractor for language: java-annotations found 2 mojo descriptors.
[INFO] Applying mojo extractor for language: java
[INFO] Mojo extractor for language: java found 0 mojo descriptors.
[INFO] Applying mojo extractor for language: bsh
[INFO] Mojo extractor for language: bsh found 0 mojo descriptors.
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-maven-plugins ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-maven-plugins ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-plugin-plugin:3.0:descriptor (mojo-descriptor) @ hadoop-maven-plugins ---
[INFO] Using 'UTF-8' encoding to read mojo metadata.
[INFO] Applying mojo extractor for language: java-annotations
[INFO] Mojo extractor for language: java-annotations found 2 mojo descriptors.
[INFO] Applying mojo extractor for language: java
[INFO] Mojo extractor for language: java found 0 mojo descriptors.
[INFO] Applying mojo extractor for language: bsh
[INFO] Mojo extractor for language: bsh found 0 mojo descriptors.
[INFO]
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-maven-plugins ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-maven-plugins ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hadoop-maven-plugins ---
[INFO] No tests to run.
[INFO]
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-maven-plugins ---
[INFO] Building jar: D:/hadoop2/hadoop-2.4.0-src/hadoop-maven-plugins/target/hadoop-maven-plugins-2.4.0.jar
[INFO]
[INFO] --- maven-plugin-plugin:3.0:addPluginArtifactMetadata (default-addPluginArtifactMetadata) @ hadoop-maven-plugins ---
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hadoop-maven-plugins ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-maven-plugins ---
[INFO] Installing D:/hadoop2/hadoop-2.4.0-src/hadoop-maven-plugins/target/hadoop-maven-plugins-2.4.0.jar to C:/Users/hyj/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/2.4.0/hadoop-maven-plugins-2.4.0.jar
[INFO] Installing D:/hadoop2/hadoop-2.4.0-src/hadoop-maven-plugins/pom.xml to C:/Users/hyj/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/2.4.0/hadoop-maven-plugins-2.4.0.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.891 s
[INFO] Finished at: 2014-06-23T14:47:33+08:00
[INFO] Final Memory: 21M/347M
[INFO] ------------------------------------------------------------------------

(2) Run mvn eclipse:eclipse -DskipTests

Note that this time we run from hadoop_home; in my case that is D:/hadoop2/hadoop-2.4.0-src. Part of the output:

[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [  0.684 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [  0.720 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  0.276 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [  0.179 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [  0.121 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [  1.680 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [  1.802 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [  1.024 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  0.160 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [  1.061 s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [  0.489 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [  0.056 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [  2.770 s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [  0.965 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [  0.629 s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [  0.284 s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.061 s]
[INFO] hadoop-yarn ....................................... SUCCESS [  0.052 s]
[INFO] hadoop-yarn-api ................................... SUCCESS [  0.842 s]
[INFO] hadoop-yarn-common ................................ SUCCESS [  0.322 s]
[INFO] hadoop-yarn-server ................................ SUCCESS [  0.065 s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [  0.972 s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [  0.580 s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [  0.379 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [  0.281 s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [  0.378 s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [  0.534 s]
[INFO] hadoop-yarn-client ................................ SUCCESS [  0.307 s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [  0.050 s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [  0.202 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [  0.194 s]
[INFO] hadoop-yarn-site .................................. SUCCESS [  0.057 s]
[INFO] hadoop-yarn-project ............................... SUCCESS [  0.066 s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [  0.091 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [  1.321 s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [  0.786 s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [  0.456 s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [  0.508 s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [  0.834 s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [  0.541 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [  0.284 s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [  0.851 s]
[INFO] hadoop-mapreduce .................................. SUCCESS [  0.099 s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [  0.742 s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [  0.335 s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [  0.397 s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [  0.371 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [  0.230 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [  0.184 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [  0.217 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [  0.048 s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [  0.244 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [  0.590 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [  0.230 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [  0.650 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [  0.334 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [  0.042 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [  0.144 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 31.234 s
[INFO] Finished at: 2014-06-23T14:55:08+08:00
[INFO] Final Memory: 84M/759M
[INFO] ------------------------------------------------------------------------

At this point the source has been fetched, and you can see the source tree has grown noticeably larger.

3. Attaching the source in Eclipse

Suppose we use the following example, hadoop2.2mapreduce例子.rar (1.14 MB). It packs two files: MaxTemperature.zip, a MapReduce example, and mockito-core-1.8.5.jar, the jar that the example depends on. (The example targets MapReduce 2.2, but that does not affect attaching the source; it is only here to show how the attachment is done.) Unzip it and import the project into Eclipse (if importing projects is unfamiliar, see a beginner tutorial on importing Eclipse projects).
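Before importing, it can help to confirm that mvn eclipse:eclipse really produced Eclipse metadata for a module: the plugin writes a .project and a .classpath file into each module directory. A small sketch (the helper name is mine, not from the article):

```shell
# A module is importable into Eclipse once `mvn eclipse:eclipse`
# has written both .project and .classpath into its directory.
is_eclipse_project() {
  [ -f "$1/.project" ] && [ -f "$1/.classpath" ]
}

# Hypothetical usage against the article's layout:
# is_eclipse_project /d/hadoop2/hadoop-2.4.0-src/hadoop-common-project/hadoop-common && echo importable
```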
After the import there are many red markers; these are all unresolved references to missing jars. Let's clear these problems up.

I. Adding the jars

(1) Add mockito-core-1.8.5.jar.

(2) Add the jar files from the compiled Hadoop 2.4 package. They live under the share/hadoop folder of hadoop_home; on my machine that is D:/hadoop2/hadoop-2.4.0/share/hadoop. Add the jars found there, including the ones in each lib folder, to the build path. If you are not sure how to add jars, see the guide on Hadoop development setups and operations. (Note that what we add here comes from the compiled package; the compiled hadoop--642.4.0.tar.gz can be downloaded from http://pan.baidu.com/s/1c0vPjG0, password: xj6l.) For more packages, see the download collections of jars and installers for the Hadoop family, Storm, Spark, Linux, Flume, and so on.

II. Attaching the source

1. Once the jars are added, the errors are gone.

2. Source not found. When we want to see how a class or a method is implemented and use Open Call Hierarchy, the source file cannot be found.

3. Attach Source. Fill in the three fields of the dialog in order, select the compressed source archive, and click OK; with that the setup is done. Note: hadoop-2.2.0-src.zip here is the source we fetched through Maven above and then compressed ourselves; remember it must be a zip archive.

4. Verify the attachment. Repeat the operation above: Open Call Hierarchy now resolves, and double-clicking the main class (the entry in red) opens the source. Question: the attentive reader will notice that what opens is a .class file, not a .java file. Will it differ from the corresponding .java file? It will not; if you are curious, verify it for yourself.
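The jar-collection step above (every jar under share/hadoop, including the ones nested in each module's lib folder) can be scripted rather than clicked through. A sketch assuming the article's D:/hadoop2/hadoop-2.4.0 layout (the helper name is mine):

```shell
# List every jar shipped under a Hadoop distribution's share/hadoop tree,
# including the ones nested in each module's lib/ directory.
list_hadoop_jars() {
  find "$1" -type f -name '*.jar'
}

# Hypothetical usage: feed the result to the project's build path,
# or join it into a classpath string (';' is the Windows separator):
# list_hadoop_jars /d/hadoop2/hadoop-2.4.0/share/hadoop | tr '\n' ';'
```

On Unix-like systems the classpath separator would be ':' instead of ';'.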