
$ hdfs dfsadmin -report

Live datanodes (1):

Name: 10.0.2.15:50010 (quickstart.cloudera)
Hostname: quickstart.cloudera
Decommission Status : Normal
Configured Capacity: 58531520512 (54.51 GB)
DFS Used: 869883019 (829.59 MB)
Non DFS Used: 10532074357 (9.81 GB)
DFS Remaining: 43881076545 (40.87 GB)
DFS Used%: 1.49%
DFS Remaining%: 74.97%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 6
Last contact: Mon Mar 08 17:59:41 PST 2021
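The Used%/Remaining% figures follow directly from the byte counts in the report. As a quick sanity check (a minimal sketch; the byte values below are copied verbatim from the report above):

```shell
# Recompute DFS Used% and DFS Remaining% from the raw byte counts
# printed by `hdfs dfsadmin -report`.
CONFIGURED=58531520512      # Configured Capacity
DFS_USED=869883019          # DFS Used
DFS_REMAINING=43881076545   # DFS Remaining

awk -v u="$DFS_USED" -v c="$CONFIGURED" \
    'BEGIN { printf "DFS Used%%: %.2f%%\n", 100 * u / c }'        # 1.49%
awk -v r="$DFS_REMAINING" -v c="$CONFIGURED" \
    'BEGIN { printf "DFS Remaining%%: %.2f%%\n", 100 * r / c }'   # 74.97%
```

Both results match the percentages in the report above.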

• Create, read, and delete lifecycle of a file in HDFS.


$ hdfs dfs -mkdir -p /user/prueba_folder

$ hdfs dfs -ls /user


Found 10 items
drwxr-xr-x - cloudera cloudera 0 2017-07-19 05:33 /user/cloudera
drwxr-xr-x - cloudera supergroup 0 2021-03-08 17:53 /user/hadoop
drwxr-xr-x - mapred hadoop 0 2017-07-19 05:34 /user/history
drwxrwxrwx - hive supergroup 0 2017-07-19 05:36 /user/hive
drwxrwxrwx - hue supergroup 0 2017-07-19 05:35 /user/hue
drwxrwxrwx - jenkins supergroup 0 2017-07-19 05:35 /user/jenkins
drwxrwxrwx - oozie supergroup 0 2017-07-19 05:35 /user/oozie
drwxr-xr-x - cloudera supergroup 0 2021-03-08 18:04 /user/prueba_folder
drwxrwxrwx - root supergroup 0 2017-07-19 05:35 /user/root
drwxr-xr-x - hdfs supergroup 0 2017-07-19 05:36 /user/spark

$ ls Desktop/
Eclipse.desktop Express.desktop Parcels.desktop texto.txt
Enterprise.desktop Kerberos.desktop prueba.txt~
$ hdfs dfs -put Desktop/texto.txt /user/prueba_folder

$ hdfs dfs -ls /user/prueba_folder


Found 1 items
-rw-r--r-- 1 cloudera supergroup 2198936 2021-03-08 18:07 /user/prueba_folder/texto.txt

$ hdfs dfs -cat /user/prueba_folder/texto.txt

$ hdfs dfs -rm /user/prueba_folder/texto.txt


Deleted /user/prueba_folder/texto.txt

$ hdfs dfs -ls /user/prueba_folder
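Note that on this VM the file was deleted immediately (the command printed Deleted ...). On clusters where the HDFS trash is enabled (fs.trash.interval > 0), -rm instead moves the file into the user's .Trash directory; adding -skipTrash forces an immediate, unrecoverable delete:

```shell
# Bypass the trash (where enabled) and delete the file immediately.
hdfs dfs -rm -skipTrash /user/prueba_folder/texto.txt
```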

• Launch a simple test job.


$ hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar wordcount \
    /user/prueba_folder /user/prueba_output
21/03/08 18:22:11 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
21/03/08 18:22:12 INFO input.FileInputFormat: Total input paths to process : 1
21/03/08 18:22:12 INFO mapreduce.JobSubmitter: number of splits:1
21/03/08 18:22:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1615253556497_0003
21/03/08 18:22:12 INFO impl.YarnClientImpl: Submitted application application_1615253556497_0003
21/03/08 18:22:12 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1615253556497_0003/
21/03/08 18:22:12 INFO mapreduce.Job: Running job: job_1615253556497_0003
21/03/08 18:22:19 INFO mapreduce.Job: Job job_1615253556497_0003 running in uber mode : false
21/03/08 18:22:19 INFO mapreduce.Job: map 0% reduce 0%
21/03/08 18:22:26 INFO mapreduce.Job: map 100% reduce 0%
21/03/08 18:22:34 INFO mapreduce.Job: map 100% reduce 100%
21/03/08 18:22:34 INFO mapreduce.Job: Job job_1615253556497_0003 completed successfully
21/03/08 18:22:34 INFO mapreduce.Job: Counters: 49
    File System Counters
        FILE: Number of bytes read=605516
        FILE: Number of bytes written=1461499
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=2199061
        HDFS: Number of bytes written=448901
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=4990
        Total time spent by all reduces in occupied slots (ms)=4857
        Total time spent by all map tasks (ms)=4990
        Total time spent by all reduce tasks (ms)=4857
        Total vcore-milliseconds taken by all map tasks=4990
        Total vcore-milliseconds taken by all reduce tasks=4857
        Total megabyte-milliseconds taken by all map tasks=5109760
        Total megabyte-milliseconds taken by all reduce tasks=4973568
    Map-Reduce Framework
        Map input records=37861
        Map output records=384260
        Map output bytes=3688608
        Map output materialized bytes=605516
        Input split bytes=125
        Combine input records=384260
        Combine output records=40059
        Reduce input groups=40059
        Reduce shuffle bytes=605516
        Reduce input records=40059
        Reduce output records=40059
        Spilled Records=80118
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=123
        CPU time spent (ms)=3030
        Physical memory (bytes) snapshot=362762240
        Virtual memory (bytes) snapshot=3015323648
        Total committed heap usage (bytes)=226365440
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=2198936
    File Output Format Counters
        Bytes Written=448901
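The counters are worth a second look: the combiner collapsed 384,260 map output records into 40,059 before the shuffle, and Spilled Records (80,118) counts records spilled on both the map and reduce sides of that shuffle. A quick consistency check over the counter values above (a minimal sketch; the numbers are taken verbatim from the job output):

```shell
# Cross-check the combiner's effect using the job counters above.
MAP_OUT=384260        # Map output records
COMBINE_OUT=40059     # Combine output records (= Reduce input records)
SPILLED=80118         # Spilled Records

awk -v m="$MAP_OUT" -v c="$COMBINE_OUT" \
    'BEGIN { printf "combiner kept 1 in %.1f records\n", m / c }'   # 1 in 9.6
# Spilled Records should count both sides of the shuffle:
test "$SPILLED" -eq $((2 * COMBINE_OUT)) && echo "spill count consistent"
```

To see the actual word counts, list /user/prueba_output and cat its part-r-* file(s).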
