Get list of directories

The GET service operation fetches the list of all directories in the root directory.

DFS Client API

The code given below fetches the list of all top-level directories in HDFS using the DFS client APIs.

The configuration settings consist of the NameNode URL and the replication count for files. A Hadoop FileSystem object is created with these settings, and FileSystem.listStatus(org.apache.hadoop.fs.Path) is used to fetch the status of all entries in the root directory.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DFSConfigKeys;

public class GetService {
	/*
	 * This program lists all the top-level directories in HDFS, non-recursively.
	 */
	public static void main(String[] args) throws IOException{
		Configuration conf = new Configuration(true);	//Create a configuration object to define hdfs properties
		conf.set(DFSConfigKeys.FS_DEFAULT_NAME_KEY, "hdfs://192.168.0.1:9000"); // URL for your namenode
		conf.set(DFSConfigKeys.DFS_REPLICATION_KEY, "3"); // Replication count for files you write
		
		FileSystem dfs = FileSystem.get(conf);	//Returns the configured filesystem implementation.
		
		FileStatus[] statusList = dfs.listStatus(new Path("/"));	//get directories from root directory
		for (FileStatus status : statusList) {
			if (status.isDirectory()) {	// skip files, keep only directories
				System.out.println(status.getPath().getName());	// display directory name
			}
		}
	}
}

Copyright 2017 QueryIO Corporation. All Rights Reserved.

QueryIO, "Big Data Intelligence" and the QueryIO Logo are trademarks of QueryIO Corporation. Apache, Hadoop and HDFS are trademarks of The Apache Software Foundation.