After three days of hectic effort installing, fixing errors, and changing configurations, I managed to run Hadoop 3.x.x on Windows 10 with Java 12 installed. A bunch of errors kept me from running it successfully, and I finally resolved them by going through the console output line by line. Here I would like to discuss those errors in detail rather than the installation steps, which are readily available on the internet.

Follow the link below to install & configure Hadoop 3.x on Windows: http://toppertips.com/hadoop-3-0-installation-on-windows/ . Just follow all the configuration steps in it as prescribed.

The main rule you need to follow is: whenever you run into an issue and Hadoop is not running properly, pay utmost attention to every console window you have (2 console windows when you run start-dfs.cmd & 2 console windows you get while running start-yarn.cmd).

ERROR#1] When the console shows that the NativeIO libraries are not loaded (e.g.:

FATAL nodemanager.NodeManager: Error starting NodeManager
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)

), it means hadoop.dll and/or winutils.exe, which are needed from Hadoop 3.x onwards (on the Windows platform), are not in the BIN directory of Hadoop. You can download the entire BIN directory needed for Hadoop 3.1.x from the GitHub link below: https://github.com/s911415/apache-hadoop-3.1.0-winutils . Replace it in your Hadoop directory and execute all the commands in the following order:

stop-dfs.cmd

stop-yarn.cmd

hdfs namenode -format

hdfs datanode -format

start-dfs.cmd

start-yarn.cmd
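If in doubt, you can quickly confirm that the native binaries actually landed in the right place from a command prompt. This is just a sanity check and assumes the HADOOP_HOME environment variable points to your Hadoop root, as most installation guides set it:

:: both files should be listed; if either is missing, the BIN replacement did not work
dir %HADOOP_HOME%\bin\winutils.exe
dir %HADOOP_HOME%\bin\hadoop.dll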

ERROR#2] If an error like “ERROR namenode.NameNode: Failed to start namenode. java.lang.IllegalArgumentException: No class configured for C” shows in the console, go into the hadoop-3.1.2\etc\hadoop path, open hdfs-site.xml and make sure the path is written without the root drive (example below).
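As a minimal sketch, assuming your data folders sit under C:\hadoop-3.1.2\data (use whatever folders you created during installation), the storage paths can be written without the C: prefix and with forward slashes:

<!-- illustrative paths; adjust to the directories you created during installation -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/hadoop-3.1.2/data/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/hadoop-3.1.2/data/datanode</value>
</property>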

Make sure forward slashes are used to separate the path components. If you use backslashes, the console will show you a Path not found exception.

ERROR#3] After the Hadoop portal is up and running (http://localhost:9870/ ), when you go to the “Browse the file system” menu, you may stumble upon the error message “Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error”.

This is because the javax.activation component was removed from Java 11 onwards and you are using some version >= Java 11. (Look closely at the console and you will see a similar error message there as well.) Go to https://jar-download.com/?search_box=javax.activation and download the activation jar file. Paste it into the Hadoop directory at “<hadoop root directory>\share\hadoop\common”. Close all Hadoop consoles and execute the commands in the order I quoted above. The issue is resolved!
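As a rough sketch, the copy step from a command prompt could look like the line below. The jar file name and download location are only assumptions for illustration; use whatever file you actually downloaded and your own Hadoop root:

:: copy the downloaded activation jar into Hadoop's common libraries folder
copy "%USERPROFILE%\Downloads\javax.activation-api-1.2.0.jar" "C:\hadoop-3.1.2\share\hadoop\common\"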

ERROR#4] After fixing the above error, I stumbled upon a permission error while trying to upload or create a directory in the Hadoop file system: “Permission denied: user=dr.who, access=WRITE, inode=”/”:Binukumar.S:supergroup:drwxr-xr-x”. This is simply because the user you are logged in with on the machine has no write permission by default. For quick results, just go into hdfs-site.xml (<hadoop root>\etc\hadoop) and add a property tag as below:
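A minimal sketch of the property, assuming you are fine with switching off HDFS permission checking on this setup:

<!-- disables HDFS permission checking; only for testing/staging -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>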

That will bypass the permission check for testing/staging. Not recommended for production, though.

ERROR#5] Even though folders can now be created in the file system, when you start uploading files you are again disappointed by the message “Couldn’t find datanode to write file. Forbidden”. This is because the datanode has not been created properly, or has not been created at all. The reason it is not created is that you may have formatted the namenode in between but forgot to format the datanode, so the clusterIDs got mismatched.

Again, if you look closely at the console (the one you get while running start-dfs.cmd), you can see a message related to this: “Failed to add storage directory [DISK]file:/C:/hadoop-3.1.2/data/datanode
java.io.IOException: Incompatible clusterIDs
“. Here is the solution: just go to the datanode folder you created at the time of installation and manually delete the folder and the files in it. Run all the commands again (see the sketch below) and you are good to go. Now you can upload files to HDFS successfully.
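As a rough sketch, assuming your datanode directory is C:\hadoop-3.1.2\data\datanode (taken from the console message above; substitute your own path), the cleanup from a command prompt would be:

:: stop Hadoop, wipe the stale datanode storage, then restart in the order listed under ERROR#1
stop-dfs.cmd
stop-yarn.cmd
rmdir /s /q C:\hadoop-3.1.2\data\datanode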
