[lpj@cpu-node0 .ssh]$ ssh-copy-id -i /home/lpj/.ssh/id_rsa.pub lpj@cpu-node3
/usr/bin/ssh-copy-id: INFO: Source of key(s) to be installed: "/home/lpj/.ssh/id_rsa.pub"
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
lpj@cpu-node3's password:

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'lpj@cpu-node3'"
and check to make sure that only the key(s) you wanted were added.
Test it; passwordless login now works:
[lpj@cpu-node0 .ssh]$ ssh cpu-node3
Last login: Tue Sep  8 13:51:48 2020 from cpu-node0
[lpj@cpu-node3 ~]$
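The `ssh-copy-id` step above has to be repeated for every slave node. A dry-run sketch that prints one command per host (the slave list is a placeholder; adjust it to your cluster, then pipe the output to `sh` to actually push the key):

```shell
# Hypothetical list of slave hostnames; adjust to your cluster.
SLAVES="cpu-node1 cpu-node2 cpu-node3"

# Print one ssh-copy-id command per slave (dry run);
# pipe the output to `sh` to actually push the key.
for host in $SLAVES; do
  echo "ssh-copy-id -i $HOME/.ssh/id_rsa.pub lpj@$host"
done
```

Each run still prompts once for that slave's password; after that, logins are key-based.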
[lpj@cpu-node0 ~]$ source ~/.bashrc
[lpj@cpu-node0 ~]$ java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (build 1.8.0_212-b04)
OpenJDK 64-Bit Server VM (build 25.212-b04, mixed mode)
The Java environment must be configured the same way on every Slave node as well.
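The `~/.bashrc` entries behind the `source` step above might look like the following. The JDK path here is an assumption (it varies by distribution); find yours with `readlink -f "$(which java)"`:

```shell
# Hypothetical JDK location; find yours with: readlink -f "$(which java)"
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export PATH=$JAVA_HOME/bin:$PATH
```

After appending these lines on each node, run `source ~/.bashrc` and verify with `java -version` as shown above.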
Then go into the hadoop directory and check the Hadoop version:
[lpj@cpu-node0 hadoop]$ ./bin/hadoop version
Hadoop 2.7.7
Subversion Unknown -r c1aad84bd27cd79c3d1a7dd58202a8c3ee1ed3ac
Compiled by stevel on 2018-07-18T22:47Z
Compiled with protoc 2.5.0
From source with checksum 792e15d20b12c74bd6f19a1fb886490
This command was run using /home/lpj/hadoop2.7/share/hadoop/common/hadoop-common-2.7.7.jar
With Hadoop's bin directory on the PATH, the same command works from any directory:

[lpj@cpu-node0 ~]$ hadoop version
Hadoop 2.7.7
Subversion Unknown -r c1aad84bd27cd79c3d1a7dd58202a8c3ee1ed3ac
Compiled by stevel on 2018-07-18T22:47Z
Compiled with protoc 2.5.0
From source with checksum 792e15d20b12c74bd6f19a1fb886490
This command was run using /home/lpj/hadoop2.7/share/hadoop/common/hadoop-common-2.7.7.jar
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://cpu-node0:9000</value>
  </property>
  <property>
    <name>fs.checkpoint.period</name>
    <value>3600</value>
    <description>The number of seconds between two periodic checkpoints.</description>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/home/lpj/hadoop2.7/tmp</value>
    <description>Abase for other temporary directories.</description>
  </property>
  <property>
    <name>fs.checkpoint.size</name>
    <value>67108864</value>
    <description>The size of the current edit log (in bytes) that triggers a periodic
    checkpoint even if the fs.checkpoint.period hasn't expired.</description>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
  </property>
</configuration>
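Every node must see the same core-site.xml. A dry-run sketch that prints one `scp` command per slave (the slave list is a placeholder and the path assumes the install location used in this post; pipe the output to `sh` to copy for real):

```shell
# Hypothetical slave list and install path; adjust to your cluster.
SLAVES="cpu-node1 cpu-node2 cpu-node3"
CONF=/home/lpj/hadoop2.7/etc/hadoop/core-site.xml

# Print one scp command per slave (dry run); pipe to `sh` to copy for real.
for host in $SLAVES; do
  echo "scp $CONF lpj@$host:$CONF"
done
```

This relies on the passwordless SSH set up earlier, so no prompts interrupt the loop.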
2020-09-16 15:38:10,678 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.net.BindException: Problem binding to [0.0.0.0:50010] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException
......
Caused by: java.net.BindException: Address already in use
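Before restarting the DataNode, find out what already holds the port, since the usual cause is a leftover DataNode process (`jps` will show it). A sketch, assuming the default data-transfer port 50010 and that `ss` from iproute2 is available:

```shell
# Default DataNode data-transfer port; it can be changed via
# dfs.datanode.address in hdfs-site.xml if the port cannot be freed.
PORT=50010

# Show the process currently bound to the port, if any.
ss -tlnp 2>/dev/null | grep ":$PORT " || echo "port $PORT is free"
```

If a stale process shows up, stop it (e.g. `hadoop-daemon.sh stop datanode`, or `kill` the PID) and then restart HDFS.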
[lpj@cpu-node0 ~]$ hadoop jar /home/lpj/hadoop2.7/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.7.jar grep /user/lpj/input /user/lpj/output 'dfs[a-z.]+'
20/09/16 16:47:54 INFO client.RMProxy: Connecting to ResourceManager at cpu-node0/192.168.232.100:8032
20/09/16 16:47:55 INFO input.FileInputFormat: Total input paths to process : 9
20/09/16 16:47:56 INFO mapreduce.JobSubmitter: number of splits:9
20/09/16 16:47:56 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1600243226377_0001
20/09/16 16:47:56 INFO impl.YarnClientImpl: Submitted application application_1600243226377_0001
20/09/16 16:47:56 INFO mapreduce.Job: The url to track the job: http://cpu-node0:8088/proxy/application_1600243226377_0001/
20/09/16 16:47:56 INFO mapreduce.Job: Running job: job_1600243226377_0001
20/09/16 16:48:05 INFO mapreduce.Job: Job job_1600243226377_0001 running in uber mode : false
20/09/16 16:48:05 INFO mapreduce.Job:  map 0% reduce 0%
20/09/16 16:48:09 INFO mapreduce.Job:  map 56% reduce 0%
20/09/16 16:48:10 INFO mapreduce.Job:  map 67% reduce 0%
20/09/16 16:48:13 INFO mapreduce.Job:  map 100% reduce 0%
20/09/16 16:48:14 INFO mapreduce.Job:  map 100% reduce 100%
20/09/16 16:48:15 INFO mapreduce.Job: Job job_1600243226377_0001 completed successfully
20/09/16 16:48:15 INFO mapreduce.Job: Counters: 49
	File System Counters
		FILE: Number of bytes read=285
		FILE: Number of bytes written=1235413
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=30693
		HDFS: Number of bytes written=419
		HDFS: Number of read operations=30
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters
		Launched map tasks=9
		Launched reduce tasks=1
		Data-local map tasks=9
		Total time spent by all maps in occupied slots (ms)=23940
		Total time spent by all reduces in occupied slots (ms)=2649
		Total time spent by all map tasks (ms)=23940
		Total time spent by all reduce tasks (ms)=2649
		Total vcore-milliseconds taken by all map tasks=23940
		Total vcore-milliseconds taken by all reduce tasks=2649
		Total megabyte-milliseconds taken by all map tasks=24514560
		Total megabyte-milliseconds taken by all reduce tasks=2712576
	Map-Reduce Framework
		Map input records=854
		Map output records=9
		Map output bytes=261
		Map output materialized bytes=333
		Input split bytes=1050
		Combine input records=9
		Combine output records=9
		Reduce input groups=9
		Reduce shuffle bytes=333
		Reduce input records=9
		Reduce output records=9
		Spilled Records=18
		Shuffled Maps =9
		Failed Shuffles=0
		Merged Map outputs=9
		GC time elapsed (ms)=791
		CPU time spent (ms)=6020
		Physical memory (bytes) snapshot=2622357504
		Virtual memory (bytes) snapshot=21854842880
		Total committed heap usage (bytes)=1958215680
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=29643
	File Output Format Counters
		Bytes Written=419
20/09/16 16:48:15 INFO client.RMProxy: Connecting to ResourceManager at cpu-node0/192.168.232.100:8032
20/09/16 16:48:15 INFO input.FileInputFormat: Total input paths to process : 1
20/09/16 16:48:15 INFO mapreduce.JobSubmitter: number of splits:1
20/09/16 16:48:15 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1600243226377_0002
20/09/16 16:48:15 INFO impl.YarnClientImpl: Submitted application application_1600243226377_0002
20/09/16 16:48:15 INFO mapreduce.Job: The url to track the job: http://cpu-node0:8088/proxy/application_1600243226377_0002/
20/09/16 16:48:15 INFO mapreduce.Job: Running job: job_1600243226377_0002
20/09/16 16:48:25 INFO mapreduce.Job: Job job_1600243226377_0002 running in uber mode : false
20/09/16 16:48:25 INFO mapreduce.Job:  map 0% reduce 0%
20/09/16 16:48:30 INFO mapreduce.Job:  map 100% reduce 0%
20/09/16 16:48:35 INFO mapreduce.Job:  map 100% reduce 100%
20/09/16 16:48:36 INFO mapreduce.Job: Job job_1600243226377_0002 completed successfully
20/09/16 16:48:36 INFO mapreduce.Job: Counters: 49
	File System Counters
		FILE: Number of bytes read=285
		FILE: Number of bytes written=246463
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=547
		HDFS: Number of bytes written=207
		HDFS: Number of read operations=7
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters
		Launched map tasks=1
		Launched reduce tasks=1
		Data-local map tasks=1
		Total time spent by all maps in occupied slots (ms)=2145
		Total time spent by all reduces in occupied slots (ms)=2290
		Total time spent by all map tasks (ms)=2145
		Total time spent by all reduce tasks (ms)=2290
		Total vcore-milliseconds taken by all map tasks=2145
		Total vcore-milliseconds taken by all reduce tasks=2290
		Total megabyte-milliseconds taken by all map tasks=2196480
		Total megabyte-milliseconds taken by all reduce tasks=2344960
	Map-Reduce Framework
		Map input records=9
		Map output records=9
		Map output bytes=261
		Map output materialized bytes=285
		Input split bytes=128
		Combine input records=0
		Combine output records=0
		Reduce input groups=1
		Reduce shuffle bytes=285
		Reduce input records=9
		Reduce output records=9
		Spilled Records=18
		Shuffled Maps =1
		Failed Shuffles=0
		Merged Map outputs=1
		GC time elapsed (ms)=144
		CPU time spent (ms)=1490
		Physical memory (bytes) snapshot=444129280
		Virtual memory (bytes) snapshot=4380962816
		Total committed heap usage (bytes)=346030080
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=419
	File Output Format Counters
		Bytes Written=207
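The grep example actually submits two chained jobs (the grep itself, then a sort of its match counts), which is why two applications appear in the log above. To inspect the result, commands like the following can be run on the master; they are printed here as a dry run so they can be piped to `sh` (`./output-local` is an arbitrary local destination, not from the original post):

```shell
# Commands to inspect the job result; printed as a dry run so they can be
# piped to `sh` on the master node (HDFS paths match the run above).
cat <<'EOF'
hdfs dfs -cat /user/lpj/output/*
hdfs dfs -get /user/lpj/output ./output-local
EOF
```

Each output line should be a count followed by a matched string such as `dfsadmin`.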