Try the following approach:
First, configure WebHDFS and start HttpFS. For details, see:
Hadoop Web Programming -- REST API WebHDFS
Once that step succeeds, we move on to the second step: embedding shell calls in Java.
The code below shows how. It uses the Process and Runtime classes; the command's output is read through the getInputStream() method of Process:
package ark;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class ReadCmdLine {
    public static void main(String[] args) {
        Process process = null;
        List<String> processList = new ArrayList<String>();
        try {
            // Launch the external command; its standard output is exposed
            // through getInputStream() (stderr via getErrorStream(), if needed).
            process = Runtime.getRuntime().exec("ps aux");
            BufferedReader input = new BufferedReader(new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = input.readLine()) != null) {
                processList.add(line);
            }
            input.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        for (String line : processList) {
            System.out.println(line);
        }
    }
}
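To tie this back to WebHDFS: the same pattern of reading a command's output can capture the JSON that the WebHDFS REST interface returns. The sketch below shells out to curl for a LISTSTATUS call; the host, port (50070 is the usual Hadoop 2.x namenode WebHDFS port, HttpFS defaults to 14000), path, and user.name value are placeholders to replace with your own.

package ark;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ListStatusViaCurl {
    public static void main(String[] args) throws IOException {
        // Placeholder endpoint: adjust host, port, path and user.name for your cluster.
        String url = "http://namenode:50070/webhdfs/v1/tmp?op=LISTSTATUS&user.name=hdfs";
        // -s keeps curl quiet so only the JSON body reaches stdout.
        Process process = Runtime.getRuntime().exec(new String[] {"curl", "-s", url});
        BufferedReader input = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = input.readLine()) != null) {
            System.out.println(line); // the FileStatuses JSON returned by WebHDFS
        }
        input.close();
    }
}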
Next, to call a shell script and check whether it ran correctly: if the script terminates normally, Process's waitFor() method returns 0.
public static void callShell(String shellString) {
    try {
        // Run the shell command/script.
        Process process = Runtime.getRuntime().exec(shellString);
        // waitFor() blocks until the command finishes and returns its exit code;
        // 0 means normal termination.
        int exitValue = process.waitFor();
        if (0 != exitValue) {
            log.error("call shell failed. error code is :" + exitValue);
        }
    } catch (Throwable e) {
        log.error("call shell failed. " + e);
    }
}
// Note: 'log' is assumed to be a class-level logger (e.g. log4j or slf4j) declared elsewhere.
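As a usage sketch, the same callShell() pattern can drive WebHDFS write operations, for example creating a directory with MKDIRS. One detail worth noting: curl exits with 0 even when the server returns an HTTP error status, unless you pass -f/--fail, so that flag is what makes the waitFor() check meaningful. Host, port, path and user.name are again placeholders.

package ark;

public class WebHdfsMkdir {
    public static void main(String[] args) {
        // Placeholder endpoint: replace host, port, path and user.name with your own.
        String url = "http://namenode:50070/webhdfs/v1/user/test/newdir?op=MKDIRS&user.name=hdfs";
        // -f (--fail) makes curl return a non-zero exit code on an HTTP error response,
        // so the exit-code check inside callShell() actually reflects the outcome.
        callShell("curl -f -s -X PUT " + url);
    }

    // Same callShell() as above, with System.err standing in for the logger.
    public static void callShell(String shellString) {
        try {
            Process process = Runtime.getRuntime().exec(shellString);
            int exitValue = process.waitFor();
            if (0 != exitValue) {
                System.err.println("call shell failed. error code is :" + exitValue);
            }
        } catch (Throwable e) {
            System.err.println("call shell failed. " + e);
        }
    }
}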
Once these two steps are complete, you can operate WebHDFS from Java.