Loop optimization for image processing
Problem
I have this piece of code that is running too slow. I was wondering if anyone can help me optimize it, as I can't seem to find any more shortcuts. I'm not sure if using List<> is going to help me, but I need complex operations such as Union and Overlap. Also, a List is desirable because I don't know how many unique partitions an image region will have before running.
u1.Length = 13254, which is the number of distinct elements in RevisedListMeanH; RevisedListMeanH.Count = 90000.
The purpose of the first piece of code is to group together pixels via horizontal comparison and vertical comparison. This runs for about 70 seconds.
The second portion combines both vertical and horizontal pixel blocks into a 2D block. This section runs for about 120 seconds. My goal is to have both of these loops complete in under 10 seconds.
These numbers are from 300x300 pixel region comparisons of a 4000x3000 image.
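(For the Union and Overlap operations mentioned above, HashSet<T> supports both directly and much faster than List<T>. A minimal sketch of what that looks like; the variable names and contents here are illustrative, not taken from the question's data:)

```csharp
using System;
using System.Collections.Generic;

class OverlapDemo
{
    public static void Main()
    {
        // Two hypothetical pixel-index groups (e.g. flattened x + y * width indices).
        var blockA = new HashSet<int> { 1, 2, 3, 4 };
        var blockB = new HashSet<int> { 3, 4, 5 };

        // Overlap test: true if the sets share at least one element.
        if (blockA.Overlaps(blockB))
        {
            // Union in place: blockA now holds every index from both blocks.
            blockA.UnionWith(blockB);
        }

        Console.WriteLine(blockA.Count); // 5 distinct indices after the union
    }
}
```

(Compared with List<T>, membership tests on a HashSet<T> are O(1) rather than O(n), which matters when each block is compared against thousands of others.)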
watch.Start();
// NOTE: parts of this listing were swallowed when it was pasted (unescaped
// '<' characters ate everything up to the next '>'). The unrecoverable loop
// conditions, bodies, and generic type arguments are marked below; the
// surviving fragments are reproduced as-is.
for (int s = 0; s < /* condition lost */; s++)
{
    /* body lost */
}
List</* element type lost */> ConnectedBlocksH = new List</* element type lost */>();
List</* element type lost */> ConnectedBlocksV = new List</* element type lost */>();
float[] RH = RevisedListMeanH.ToArray();
float[] RV = RevisedListMeanV.ToArray();
for (int a = 0; a < /* condition lost */; a++)
{
    /* body lost */
}
List<HashSet<int>> ListOfConnectedBlocks = new List<HashSet<int>>();
for (int a = 0; a < /* condition lost */; a++)
{
    HashSet<int> i = new HashSet<int>(ArrayOfConnectedBlocksH[a]);
    // trigger means it scanned and there was no overlap to add to the group
    while (true)
    {
        bool trigger = true;
        for (int b = 0; b < /* condition lost */; b++)
        {
            /* body lost; only a trailing "());" survived */
        }
    }
}
watch.Stop();
long asasasda = watch.ElapsedMilliseconds; // 122 seconds
Solution
First off, I'm gonna give you a basic idea...
Instead of running two separate loops over the two arrays, check their sizes first. Create a loop over the smaller of the two array sizes; in that loop you process both arrays, and then create another loop for the remainder of the unprocessed data.
This can be done for the two nested for loops you have in there. This way you could reduce the complexity from O(n+m) to O(n+(m-n)).
The pseudocode is this...
if(Arr1 > Arr2)
{
process(Arr1, Arr2);
}
else if (Arr2 > Arr1)
{
process(Arr2, Arr1);
}
else //They're the same size!
{
for(idx = 0; idx != count(any will do); ++idx)
{
//process Arr1
//process Arr2
}
}
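(The pseudocode above can be realized in C# roughly like this. `ProcessBoth` and the `process` callback are stand-ins for whatever per-element work the real loops do; they are assumptions for illustration, since the question's loop bodies were lost in formatting:)

```csharp
using System;

class PairedLoopDemo
{
    // Walk both arrays in a single pass up to the shorter length, then
    // finish the remainder of the longer one: max(n, m) iterations total
    // instead of n + m, as the answer describes.
    public static void ProcessBoth(float[] arr1, float[] arr2, Action<float> process)
    {
        int shared = Math.Min(arr1.Length, arr2.Length);
        for (int i = 0; i < shared; i++)
        {
            process(arr1[i]); // process an element of each array
            process(arr2[i]); // in the same iteration
        }
        float[] longer = arr1.Length >= arr2.Length ? arr1 : arr2;
        for (int i = shared; i < longer.Length; i++)
        {
            process(longer[i]); // leftover tail of the longer array
        }
    }

    public static void Main()
    {
        float sum = 0;
        ProcessBoth(new float[] { 1, 2, 3 }, new float[] { 4, 5 }, x => sum += x);
        Console.WriteLine(sum); // 15
    }
}
```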
Context
StackExchange Code Review Q#43969, answer score: 3