
Set up eclipse plugin for hadoop 2.2.0 (Mac OS X 10.9.1)



  1. Go to the hadoop2x-eclipse-plugin repository on GitHub.
  2. Click "Download ZIP" on the right of that page.
  3. Open terminal
    • $ unzip hadoop2x-eclipse-plugin-master.zip
    • $ cd hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/
    • $ ant jar -Dversion=2.2.0 -Declipse.home={Eclipse-Home} -Dhadoop.home={Hadoop-Home} 
    • The built jar is now in the folder hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin
    • $ cd hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin
    • $ cp hadoop-eclipse-plugin-2.2.0.jar /Applications/{Eclipse-Home}/plugins/
  4. Open Eclipse:
    • Go to Window -> Open Perspective -> Other -> Map/Reduce
    • In the Map/Reduce Locations view, click "New Hadoop Location..."
    • Fill in:
      • Location Name -- localhost
      • Map/Reduce Master
        • Host -- localhost
        • Port -- 54311 (must match the value of mapred.job.tracker in mapred-site.xml)
      • DFS Master 
        • Check "Use M/R Master Host"
        • Port -- 8020 (must match the value of fs.default.name in core-site.xml)
      • User name -- your Hadoop user name
    • Then press the Finish button.
  5. File -> New -> Select a wizard -> Map/Reduce Project 
  6. Configure Hadoop Installation (select the Hadoop installation directory -> {Hadoop-Home})
  7. Then you can create your own Map/Reduce program and run it by choosing "Run on Hadoop". Have fun!
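The two ports in step 4 must agree with your Hadoop configuration files. With the property names mentioned above, the relevant entries would look roughly like this (host and ports are examples; check your own core-site.xml and mapred-site.xml):

```xml
<!-- core-site.xml: fs.default.name determines the DFS Master port (8020 here) -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:8020</value>
</property>

<!-- mapred-site.xml: mapred.job.tracker determines the Map/Reduce Master port (54311 here) -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:54311</value>
</property>
```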
My hadoop-eclipse-plugin-2.2.0.jar is here, or you may find one later on this website.
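The "own Map/Reduce program" in step 7 is usually the classic word count. Here is a dependency-free sketch of its map and reduce logic, written in plain Java with no Hadoop classes, so the class and method names are illustrative rather than Hadoop's actual Mapper/Reducer API:

```java
import java.util.*;

public class WordCountSketch {
    // "Map" phase: emit a (word, 1) pair for every token in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) out.add(new AbstractMap.SimpleEntry<>(token, 1));
        }
        return out;
    }

    // "Reduce" phase: sum the counts emitted for each word.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs)
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> emitted = new ArrayList<>();
        for (String line : new String[]{"hello hadoop", "hello eclipse"})
            emitted.addAll(map(line));
        System.out.println(reduce(emitted)); // {eclipse=1, hadoop=1, hello=2}
    }
}
```

In a real Hadoop job the framework performs the shuffle between the two phases and streams the pairs through your Mapper and Reducer classes; this sketch just collects everything in memory to show the data flow.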

Comments:

  1. Thanks a lot for the detailed steps for configuring the plug-in; they work perfectly.


    Can you please let me know how to provide an input file for programs when choosing the "Run on Hadoop" option, so that the program will work on the provided input file?

    1. Sure! Thanks for your comment.

      There are two steps:
      1) Copy your local file into the Hadoop Distributed File System (HDFS) with a command like:
      $ bin/hdfs dfs -copyFromLocal local_FS_filename target_on_HDFS
      e.g.
      $ bin/hdfs dfs -copyFromLocal local_input_file.data /user/username/hdfsdata

      2) In Eclipse, put "input_file output_dir" into "Run Configurations" -> "Arguments" -> "Program arguments",
      where input_file is the path of your input file on HDFS
      and output_dir is the output directory on HDFS.
