DATAVIEW is a big data workflow management system. It uses Dropbox as the data cloud and Amazon EC2 as the compute cloud. Current research focuses on the security and privacy aspects of DATAVIEW as well as performance and cost optimization for running workflows in clouds.
Per Dr. Lu's request, here is the code for the issue I am having running a simple example. GitHub does not allow .java file uploads here, so I have posted the raw text.
When I run the workflow on the local executor, I get a NullPointerException from each Task_Average_Column task during execution.
```java
import dataview.models.*;

public class Task_Average_Column extends Task {
    public static int instanceCount = 0;

    // declare all I/O ports of the task
    public Task_Average_Column() {
        super("Average column vector", "This task averages a column vector.");
        instanceCount++;
        // define input ports
        InputPort ins[] = new InputPort[1];
        ins[0] = new InputPort("in0", Port.DATAVIEW_MathVector, "Vector to be averaged");
        // define output ports
        OutputPort outs[] = new OutputPort[1];
        outs[0] = new OutputPort("out0", Port.DATAVIEW_double, "Result of averaging vector");
    }

    // this method performs the actual actions of the task
    @Override
    public void run() {
        System.out.println("RUN AVG COL");
        // step 1: read data from input ports
        DATAVIEW_MathVector inputVector = (DATAVIEW_MathVector) ins[0].read();
        // step 2: perform computation using data read from step 1
        // sum up all of the values in the vector
        double sum = 0;
        for (int i = 0; i < inputVector.length(); i++) {
            System.out.println("____________________________" + inputVector.get(i));
            sum += inputVector.get(i);
        }
        double average = sum / inputVector.length();
        // step 3: write data to output port
        outs[0].write(average);
    }
}
```
Please let me know if any additional information is required.
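One likely cause, judging from the code alone: in the constructor, `InputPort ins[] = new InputPort[1];` and `OutputPort outs[] = new OutputPort[1];` declare new *local* arrays rather than assigning the inherited `ins`/`outs` fields that `run()` later reads (the `ins[0].read()` call in `run()` only compiles if `Task` exposes such fields). The locals go out of scope when the constructor returns, so the fields are still null when the local executor invokes `run()`. The self-contained sketch below uses hypothetical `Base`/`LeakySub`/`FixedSub` classes, not DATAVIEW APIs, to reproduce the shadowing mechanism:

```java
// Hypothetical stand-ins for Task and its port arrays -- not DATAVIEW classes.
class Base {
    protected int[] ports;               // analogous to Task's inherited ins/outs

    int first() {
        return ports[0];                 // NPE if a subclass never assigned the field
    }
}

class LeakySub extends Base {
    LeakySub() {
        int[] ports = new int[] { 42 };  // local array SHADOWS the field; field stays null
    }
}

class FixedSub extends Base {
    FixedSub() {
        ports = new int[] { 42 };        // assigns the inherited field itself
    }
}

public class ShadowingDemo {
    public static void main(String[] args) {
        try {
            new LeakySub().first();      // throws NullPointerException
        } catch (NullPointerException e) {
            System.out.println("LeakySub: NullPointerException, field never assigned");
        }
        System.out.println("FixedSub: " + new FixedSub().first());
    }
}
```

If this is the problem, dropping the type declarations so the constructor assigns the fields directly (`ins = new InputPort[1];` and `outs = new OutputPort[1];`) should resolve the NullPointerException.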