Fix for #103 #124

Merged: 9 commits merged on Dec 16, 2017.
Changes from 6 commits.
@@ -643,6 +643,24 @@ protected CGNode getEnclosingMethodNode() throws IOException, CoreException, NoEnclosingMethodNodeFoundException
else
return nodes.iterator().next(); // just return the first.
}

/**
* Adds all {@link CGNode}s in the call graph to a hash set.
*
* @return a {@link HashSet} of the {@link CGNode}s in the call graph
* @throws IOException
* @throws CoreException
*/
protected HashSet<CGNode> getCGNodesInGraph() throws IOException, CoreException {
Member: Why not just use the iterator?

HashSet<CGNode> cgNodes = new HashSet<>();

Iterator<CGNode> cgNodeIterator = this.getAnalysisEngine().getCallGraph().iterator();
Contributor Author: Finally, I found an easy way to get the CGNodes in the call graph. I had tried a more complex approach over the weekend and got an exception.

// Copy every node from the call graph into the set.
while (cgNodeIterator.hasNext()) {
cgNodes.add(cgNodeIterator.next());
}

return cgNodes;
Member: Call graphs can be massive. We don't want an extra step here of adding them to a set if it is not necessary.

}
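A minimal sketch of the reviewer's suggestion, assuming callers can reach the analysis engine the same way this method does; it consumes the call graph's iterator directly instead of first materializing the nodes into a HashSet:

// Sketch only (not the PR's code): walk the call graph iterator directly,
// avoiding the intermediate HashSet for potentially massive call graphs.
Iterator<CGNode> cgNodeIterator = this.getAnalysisEngine().getCallGraph().iterator();
while (cgNodeIterator.hasNext()) {
    CGNode cgNode = cgNodeIterator.next();
    // process cgNode here, e.g. iterate its call sites as start() does below
}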

/**
* Returns true iff all of the predecessors of all of the given {@link CGNode}s
@@ -22,7 +22,6 @@

import org.eclipse.core.runtime.CoreException;
import org.eclipse.core.runtime.NullProgressMonitor;
import org.eclipse.jdt.core.dom.MethodInvocation;

import com.google.common.collect.HashBasedTable;
import com.google.common.collect.Table;
@@ -281,114 +280,115 @@ public void start() throws IOException, CoreException, CallGraphBuilderCancelException
// the node of where the stream was declared: TODO: Can this be
// somehow rewritten to get blocks corresponding to terminal
// operations?
CGNode cgNode = this.getStream().getEnclosingMethodNode();

Contributor Author: I just added a for statement. It changes the formatting of the block inside the loop, so the red block in the diff is large.

Contributor Author: The loop is from line 283 to line 396.
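A rough before/after sketch of why the diff block is so large: the analysis body itself is unchanged, but wrapping it in the new outer loop re-indents every enclosed line (the method names are taken from the diff below):

// Before (sketch): the analysis visited only the stream's enclosing method node.
CGNode cgNode = this.getStream().getEnclosingMethodNode();
for (Iterator<CallSiteReference> callSites = cgNode.iterateCallSites(); callSites.hasNext();) {
    // ... existing analysis body ...
}

// After (sketch): the same body, now wrapped in a loop over every node in the
// call graph, so each enclosed line is re-indented and shows up as changed.
for (CGNode cgNode : this.getStream().getCGNodesInGraph()) {
    for (Iterator<CallSiteReference> callSites = cgNode.iterateCallSites(); callSites.hasNext();) {
        // ... existing analysis body ...
    }
}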

for (Iterator<CallSiteReference> callSites = cgNode.iterateCallSites(); callSites.hasNext();) {
CallSiteReference callSiteReference = callSites.next();
MethodReference calledMethod = callSiteReference.getDeclaredTarget();

// is it a terminal operation? TODO: Should this be
// cached somehow? Collection of all terminal operation
// invocations?
if (isTerminalOperation(calledMethod)) {
// get the basic block for the call.
ISSABasicBlock[] blocksForCall = cgNode.getIR().getBasicBlocksForCall(callSiteReference);

assert blocksForCall.length == 1 : "Expecting only a single basic block for the call: "
+ callSiteReference;

for (int i = 0; i < blocksForCall.length; i++) {
ISSABasicBlock block = blocksForCall[i];

BasicBlockInContext<IExplodedBasicBlock> blockInContext = getBasicBlockInContextForBlock(
block, cgNode, supergraph)
.orElseThrow(() -> new IllegalStateException(
"No basic block in context for block: " + block));

if (!terminalBlockToPossibleReceivers.containsKey(blockInContext)) {
// associate possible receivers with the
// blockInContext.
// search through each instruction in the
// block.
int processedInstructions = 0;
for (Iterator<SSAInstruction> it = block.iterator(); it.hasNext();) {
SSAInstruction instruction = it.next();

// if it's a phi instruction.
if (instruction instanceof SSAPhiInstruction)
// skip it. The pointer analysis
// below will handle it.
continue;

// Get the possible receivers. This
// number corresponds to the value
// number of the receiver of the method.
int valueNumberForReceiver = instruction.getUse(0);

// it should be represented by a pointer
// key.
PointerKey pointerKey = engine.getHeapGraph().getHeapModel()
.getPointerKeyForLocal(cgNode, valueNumberForReceiver);

// get the points to set for the
// receiver. This will give us all
// object instances that the receiver
// reference points to.
OrdinalSet<InstanceKey> pointsToSet = engine.getPointerAnalysis()
.getPointsToSet(pointerKey);
assert pointsToSet != null : "The points-to set (I think) should not be null for pointer: "
+ pointerKey;

OrdinalSet<InstanceKey> previousReceivers = terminalBlockToPossibleReceivers
.put(blockInContext, pointsToSet);
assert previousReceivers == null : "Reassociating a blockInContext: "
+ blockInContext + " with a new points-to set: " + pointsToSet
+ " that was originally: " + previousReceivers;

++processedInstructions;
for (CGNode cgNode : this.getStream().getCGNodesInGraph()) {
Member: You would use the iterator directly here.
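// Sketch only of the reviewer's suggestion (not the PR's code): iterate the
// call graph directly at this point instead of first collecting its nodes
// into a set, e.g.:
//
//     for (Iterator<CGNode> nodes = this.getStream().getAnalysisEngine()
//             .getCallGraph().iterator(); nodes.hasNext();) {
//         CGNode cgNode = nodes.next();
//         // ... same body as below ...
//     }
//
// Whether getAnalysisEngine() (and its call graph) is reachable from this
// caller is an assumption, not something shown in this PR.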

for (Iterator<CallSiteReference> callSites = cgNode.iterateCallSites(); callSites.hasNext();) {
CallSiteReference callSiteReference = callSites.next();
MethodReference calledMethod = callSiteReference.getDeclaredTarget();

// is it a terminal operation? TODO: Should this be
// cached somehow? Collection of all terminal operation
// invocations?
if (isTerminalOperation(calledMethod)) {
// get the basic block for the call.

ISSABasicBlock[] blocksForCall = cgNode.getIR().getBasicBlocksForCall(callSiteReference);

assert blocksForCall.length == 1 : "Expecting only a single basic block for the call: "
+ callSiteReference;

for (int i = 0; i < blocksForCall.length; i++) {
ISSABasicBlock block = blocksForCall[i];

BasicBlockInContext<IExplodedBasicBlock> blockInContext = getBasicBlockInContextForBlock(
block, cgNode, supergraph)
.orElseThrow(() -> new IllegalStateException(
"No basic block in context for block: " + block));

if (!terminalBlockToPossibleReceivers.containsKey(blockInContext)) {
// associate possible receivers with the
// blockInContext.
// search through each instruction in the
// block.
int processedInstructions = 0;
for (Iterator<SSAInstruction> it = block.iterator(); it.hasNext();) {
SSAInstruction instruction = it.next();

// if it's a phi instruction.
if (instruction instanceof SSAPhiInstruction)
// skip it. The pointer analysis
// below will handle it.
continue;

// Get the possible receivers. This
// number corresponds to the value
// number of the receiver of the method.
int valueNumberForReceiver = instruction.getUse(0);

// it should be represented by a pointer
// key.
PointerKey pointerKey = engine.getHeapGraph().getHeapModel()
.getPointerKeyForLocal(cgNode, valueNumberForReceiver);

// get the points to set for the
// receiver. This will give us all
// object instances that the receiver
// reference points to.
OrdinalSet<InstanceKey> pointsToSet = engine.getPointerAnalysis()
.getPointsToSet(pointerKey);
assert pointsToSet != null : "The points-to set (I think) should not be null for pointer: "
+ pointerKey;

OrdinalSet<InstanceKey> previousReceivers = terminalBlockToPossibleReceivers
.put(blockInContext, pointsToSet);
assert previousReceivers == null : "Reassociating a blockInContext: "
+ blockInContext + " with a new points-to set: " + pointsToSet
+ " that was originally: " + previousReceivers;

++processedInstructions;
}

assert processedInstructions == 1 : "Expecting to process one and only one instruction here.";
}

assert processedInstructions == 1 : "Expecting to process one and only one instruction here.";
}

IntSet intSet = instanceResult.getResult().getResult(blockInContext);
for (IntIterator it = intSet.intIterator(); it.hasNext();) {
int nextInt = it.next();

// retrieve the state set for this instance
// and block.
Map<TypestateRule, Set<IDFAState>> ruleToStates = instanceBlockStateTable
.get(instanceKey, blockInContext);

// if it doesn't yet exist.
if (ruleToStates == null) {
// allocate a new rule map.
ruleToStates = new HashMap<>();

// place it in the table.
instanceBlockStateTable.put(instanceKey, blockInContext, ruleToStates);
}

Set<IDFAState> stateSet = ruleToStates.get(rule);

// if it does not yet exist.
if (stateSet == null) {
// allocate a new set.
stateSet = new HashSet<>();

// place it in the map.
ruleToStates.put(rule, stateSet);
}

// get the facts.
Factoid factoid = instanceResult.getDomain().getMappedObject(nextInt);
if (factoid != DUMMY_ZERO) {
BaseFactoid baseFactoid = (BaseFactoid) factoid;
assert baseFactoid.instance.equals(
instanceKey) : "Sanity check that the fact instance should be the same as the instance being examined.";

// add the encountered state to the set.
stateSet.add(baseFactoid.state);
IntSet intSet = instanceResult.getResult().getResult(blockInContext);
for (IntIterator it = intSet.intIterator(); it.hasNext();) {
int nextInt = it.next();

// retrieve the state set for this instance
// and block.
Map<TypestateRule, Set<IDFAState>> ruleToStates = instanceBlockStateTable
.get(instanceKey, blockInContext);

// if it doesn't yet exist.
if (ruleToStates == null) {
// allocate a new rule map.
ruleToStates = new HashMap<>();

// place it in the table.
instanceBlockStateTable.put(instanceKey, blockInContext, ruleToStates);
}

Set<IDFAState> stateSet = ruleToStates.get(rule);

// if it does not yet exist.
if (stateSet == null) {
// allocate a new set.
stateSet = new HashSet<>();

// place it in the map.
ruleToStates.put(rule, stateSet);
}

// get the facts.
Factoid factoid = instanceResult.getDomain().getMappedObject(nextInt);
if (factoid != DUMMY_ZERO) {
BaseFactoid baseFactoid = (BaseFactoid) factoid;
assert baseFactoid.instance.equals(
instanceKey) : "Sanity check that the fact instance should be the same as the instance being examined.";

// add the encountered state to the set.
stateSet.add(baseFactoid.state);
}
}
}
}
@@ -0,0 +1,20 @@
package p;

import java.util.HashSet;
import java.util.stream.*;

import edu.cuny.hunter.streamrefactoring.annotations.*;

class A {

Stream<Object> m() {
Stream<Object> stream = new HashSet<>().stream().distinct();
return stream;
}

@EntryPoint
void n() {
Stream<Object> s = m();
s.count();
}
}
@@ -0,0 +1,20 @@
package p;

import java.util.HashSet;
import java.util.stream.*;

import edu.cuny.hunter.streamrefactoring.annotations.*;

class A {

Stream<Object> m() {
Stream<Object> stream = new HashSet<>().stream();
return stream;
}

@EntryPoint
void n() {
Stream<Object> s = m();
s.distinct().count();
}
}
@@ -0,0 +1,20 @@
package p;

import java.util.HashSet;
import java.util.stream.*;

import edu.cuny.hunter.streamrefactoring.annotations.*;

class A {

Stream<Object> m() {
Stream<Object> stream = new HashSet<>().stream().sorted();
return stream;
}

@EntryPoint
void n() {
Stream<Object> s = m();
s.distinct().count();
}
}
@@ -0,0 +1,20 @@
package p;

import java.util.HashSet;
import java.util.stream.*;

import edu.cuny.hunter.streamrefactoring.annotations.*;

class A {

Stream<Object> m() {
Stream<Object> stream = new HashSet<>().stream().parallel();
return stream;
}

@EntryPoint
void n() {
Stream<Object> s = m();
s.distinct().count();
}
}
@@ -12,6 +12,7 @@
import java.util.Arrays;
import java.util.Collections;
import java.util.EnumSet;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
@@ -454,9 +455,42 @@ public void testNonInternalAPI() throws Exception {
helper(new StreamAnalysisExpectedResult("new HashSet<>().parallelStream()",
Collections.singleton(ExecutionMode.PARALLEL), Collections.singleton(Ordering.UNORDERED), false, false,
false, null, null, null, RefactoringStatus.ERROR,
EnumSet.of(PreconditionFailure.NO_TERMINAL_OPERATIONS)));
EnumSet.of(PreconditionFailure.UNORDERED)));
}

public void testNonInternalAPI2() throws Exception {
helper(new StreamAnalysisExpectedResult("new HashSet<>().stream()",
Collections.singleton(ExecutionMode.SEQUENTIAL), Collections.singleton(Ordering.UNORDERED), false, true,
false, Collections.singleton(TransformationAction.CONVERT_TO_PARALLEL), PreconditionSuccess.P1,
Refactoring.CONVERT_SEQUENTIAL_STREAM_TO_PARALLEL, RefactoringStatus.OK, Collections.emptySet()));
}

public void testNonInternalAPI3() throws Exception {
helper(new StreamAnalysisExpectedResult("new HashSet<>().stream()",
Collections.singleton(ExecutionMode.SEQUENTIAL), Collections.singleton(Ordering.UNORDERED), false, true,
false, Collections.singleton(TransformationAction.CONVERT_TO_PARALLEL), PreconditionSuccess.P1,
Refactoring.CONVERT_SEQUENTIAL_STREAM_TO_PARALLEL, RefactoringStatus.OK, Collections.emptySet()));
}

public void testNonInternalAPI4() throws Exception {
HashSet<Ordering> orderings = new HashSet<>();
orderings.add(Ordering.UNORDERED);
orderings.add(Ordering.ORDERED);

helper(new StreamAnalysisExpectedResult("new HashSet<>().stream()",
Collections.singleton(ExecutionMode.SEQUENTIAL), orderings, false, true, false, null, null, null,
RefactoringStatus.ERROR, EnumSet.of(PreconditionFailure.INCONSISTENT_POSSIBLE_ORDERINGS)));
Member: Why inconsistent orderings?

Member: Is that supposed to be the case? Please investigate.

Member: @saledouble Any progress on this?

}

public void testNonInternalAPI5() throws Exception {
HashSet<ExecutionMode> executionModes = new HashSet<>();
executionModes.add(ExecutionMode.PARALLEL);
executionModes.add(ExecutionMode.SEQUENTIAL);
Member: @saledouble There's also a similar problem here. Do we have a bug?

helper(new StreamAnalysisExpectedResult("new HashSet<>().stream()", executionModes,
Collections.singleton(Ordering.UNORDERED), false, true, false, null, null, null,
RefactoringStatus.ERROR, EnumSet.of(PreconditionFailure.INCONSISTENT_POSSIBLE_EXECUTION_MODES)));
}

public void testCollectionFromParameter() throws Exception {
helper(new StreamAnalysisExpectedResult("h.parallelStream()", Collections.singleton(ExecutionMode.PARALLEL),
Collections.singleton(Ordering.UNORDERED), false, true, false, null, null, null,