Closed: flankerhqd closed this issue 2 years ago
The problem is that you have `target` as the entry point. The allocation site `new AHelper()` is in the constructor, which is not reachable from your entry point. So with SPARK this allocation site won't be found, and thus there is no edge. You have to set as entry point a method that first calls the constructor and then, somewhere later, the target.
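To illustrate the point (a minimal sketch with hypothetical class names, not the original test code): the entry point has to reach the allocation before the virtual call, otherwise SPARK has no allocation site from which to resolve the receiver.

```java
import java.util.ArrayList;
import java.util.List;

interface Handler {
    String handle();
}

class AHelper implements Handler {
    public String handle() { return "A"; }
}

class Demo {
    private final List<Handler> helpers = new ArrayList<>();

    Demo() {
        helpers.add(new AHelper()); // allocation site lives in the constructor
    }

    String target() {
        // If `target` alone is the entry point, SPARK never analyzes the
        // constructor, finds no allocation flowing into `helpers`, and so
        // adds no edge to AHelper.handle().
        return helpers.get(0).handle();
    }

    // A suitable entry point: it reaches the constructor first, then target.
    public static void main(String[] args) {
        Demo d = new Demo();
        System.out.println(d.target());
    }
}
```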
Nope, linghui, the test case driver creates the allocation site and invokes the constructor in `makeDummyClass` (https://github.com/soot-oss/soot/blob/19630a60fa9070be071e54e50d366279deb9a179/src/systemTest/java/soot/testing/framework/AbstractTestingFramework.java#L206). The class instance containing the `targetMethod` is created and its constructor called in the dummy method.
```java
Local argsParameter = jimp.newLocal("args", argsParamterType);
locals.add(argsParameter);
units.add(jimp.newIdentityStmt(argsParameter, jimp.newParameterRef(argsParamterType, 0)));

RefType testCaseType = RefType.v(sootTestMethod.getDeclaringClass());
Local allocatedTestObj = jimp.newLocal("dummyObj", testCaseType);
locals.add(allocatedTestObj);
units.add(jimp.newAssignStmt(allocatedTestObj, jimp.newNewExpr(testCaseType)));

SootMethod method;
try {
  method = testCaseType.getSootClass().getMethod("void <init>()");
} catch (RuntimeException ex) {
  method = testCaseType.getSootClass().getMethodByName("<init>");
}
List<Value> constructorArgs = Collections.nCopies(method.getParameterCount(), NullConstant.v());
units.add(jimp.newInvokeStmt(jimp.newSpecialInvokeExpr(allocatedTestObj, method.makeRef(), constructorArgs)));
```
I did not notice that you were using that test class. `this.helpers.add(new AHelper());` is not considered by SPARK, since SPARK only considers four kinds of statements when building up the Pointer Assignment Graph (see screenshot below). I don't know how this can be modeled in the PAG, as `helpers` holds pointers to a set of allocation sites. But you can write an analysis to detect such cases and add those edges to the call graph later. CHA and RTA would also include such edges in the call graph, but they are not as precise.
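For reference, the four statement kinds SPARK models in the PAG can be sketched in plain Java (a simplified illustration; the class names here are made up):

```java
// Simplified illustration of the four statement kinds SPARK turns into
// Pointer Assignment Graph edges (class names are hypothetical):
class Node {}

class Box {
    Node f;
}

class PagKinds {
    static boolean example() {
        Node a = new Node();   // 1. allocation:  a = new T()
        Node b = a;            // 2. assignment:  b = a
        Box box = new Box();
        box.f = b;             // 3. field store: box.f = b
        Node c = box.f;        // 4. field load:  c = box.f
        // every points-to fact chains back to the allocation site of `a`
        return c == a;
    }
}
```

A call like `helpers.add(new AHelper())` does not match any of these shapes directly; the store into the collection's backing array happens inside library code.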
Thanks for the reply. I'll see if SPARK can be improved to handle this.
@flankerhqd Hi, I talked with my advisor about this issue. He said SPARK should handle such cases if the library method `Collection.add()` is analyzed, because the relevant assignments are usually inside that method. But people often exclude these library methods; maybe you can check the excluded package list?
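For what it's worth, adjusting the exclude list looks roughly like this (a sketch against Soot's `Options` API; the exact package lists are placeholders, and whether these settings suffice for your setup is untested here):

```java
import java.util.Arrays;

import soot.options.Options;

// Sketch: keep java.util out of the excluded packages so the assignments
// inside Collection.add() are visible to SPARK.
public class SparkConfigSketch {
    public static void configure() {
        Options.v().set_whole_program(true);
        // Exclude list without java.util.* (remaining entries are examples).
        Options.v().set_exclude(Arrays.asList("sun.*", "com.sun.*"));
        // Load method bodies even for library classes.
        Options.v().set_no_bodies_for_excluded(false);
        Options.v().setPhaseOption("cg.spark", "on");
    }
}
```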
Hi linghui @linghuiluo:
Thanks for the reply.
It seems that even if `java.util.*` is removed from the exclude list, the result still does not contain the relevant edges. See this pull request with an added test case: https://github.com/soot-oss/soot/pull/1772.
@flankerhqd thanks for the test. I will take a look into it.
Please use https://github.com/soot-oss/soot/pull/1772 instead, thanks!
Hi:
It seems the current SPARK implementation does not detect collection element types. For example, for the following unit test code:
Using `target` as the entry point, the call graph is missing an edge to any implementation of `handle`. The test case driver is as follows:
It seems there is currently no handling for collection element types in SPARK. (If I'm missing something, please correct me.) Any ideas on the best way to implement this?
Thanks!