Friday, July 03, 2009

Hadoop on OpenSolaris Zone

Of course there is no point in running a cluster on a single machine. :-) But there is a point in running a Hadoop node inside a Solaris container (zone) in order to isolate it from everything else. Making it work takes nothing special:
  1. Set up a Solaris zone as you normally would.
  2. Make sure the storage for the Hadoop data is an LOFS mount pointing at a real device (see the zonecfg sketch right after this list).
  3. Point hadoop.tmp.dir at that storage.
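A minimal zonecfg sketch for step 2, assuming a zone named hadoop1, real storage at /tank/hadoop in the global zone, and a mount point of /export/hadoop inside the zone (all three names are made up, substitute your own). The dir property is the mount point seen inside the zone, special is the backing directory in the global zone:

    global# zonecfg -z hadoop1
    zonecfg:hadoop1> add fs
    zonecfg:hadoop1:fs> set dir=/export/hadoop
    zonecfg:hadoop1:fs> set special=/tank/hadoop
    zonecfg:hadoop1:fs> set type=lofs
    zonecfg:hadoop1:fs> end
    zonecfg:hadoop1> verify
    zonecfg:hadoop1> commit
    zonecfg:hadoop1> exit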
As of version 0.20, the HDFS directories default to subdirectories of hadoop.tmp.dir, so redirecting the temporary directory elsewhere drags everything else, including the actual data, along with it. It is therefore better to use at least the following layout (see the configuration sketch below):
  1. hadoop.tmp.dir -> $STORAGE/tmp
  2. dfs.name.dir -> $STORAGE/name
  3. dfs.data.dir -> $STORAGE/data
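In 0.20 these properties live in core-site.xml (hadoop.tmp.dir) and hdfs-site.xml (dfs.name.dir, dfs.data.dir). A minimal sketch, assuming $STORAGE is the LOFS mount point /export/hadoop from the zonecfg example above:

    <!-- core-site.xml -->
    <property>
      <name>hadoop.tmp.dir</name>
      <value>/export/hadoop/tmp</value>
    </property>

    <!-- hdfs-site.xml -->
    <property>
      <name>dfs.name.dir</name>
      <value>/export/hadoop/name</value>
    </property>
    <property>
      <name>dfs.data.dir</name>
      <value>/export/hadoop/data</value>
    </property>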
That's it, you've got the picture.
