
Commit c6318ff

Author: Andrey Kamaev
Merge pull request opencv#567
2 parents b9b4200 + f40725b

File tree

6 files changed: +1483 -0 lines changed


modules/video/doc/motion_analysis_and_object_tracking.rst

+46
@@ -153,6 +153,50 @@ In case of point sets, the problem is formulated as follows: you need to find a
:ocv:func:`getPerspectiveTransform`,
:ocv:func:`findHomography`


findTransformECC
------------------------
Finds the geometric transform (warp) between two images in terms of the ECC criterion [EP08]_.

.. ocv:function:: double findTransformECC( InputArray templateImage, InputArray inputImage, InputOutputArray warpMatrix, int motionType=MOTION_AFFINE, TermCriteria criteria=TermCriteria(TermCriteria::COUNT+TermCriteria::EPS, 50, 0.001))

.. ocv:cfunction:: double cvFindTransformECC( const CvArr* templateImage, const CvArr* inputImage, CvMat* warpMatrix, const int motionType, const CvTermCriteria criteria)

    :param templateImage: single-channel template image; ``CV_8U`` or ``CV_32F`` array.

    :param inputImage: single-channel input image which should be warped with the final ``warpMatrix`` in order to provide an image similar to ``templateImage``; same type as ``templateImage``.

    :param warpMatrix: floating-point :math:`2\times 3` or :math:`3\times 3` mapping matrix (warp).

    :param motionType: parameter specifying the type of motion:

        * **MOTION_TRANSLATION** sets a translational motion model; ``warpMatrix`` is :math:`2\times 3` with the first :math:`2\times 2` part being the unity matrix and the remaining two parameters being estimated.

        * **MOTION_EUCLIDEAN** sets a Euclidean (rigid) transformation as motion model; three parameters are estimated; ``warpMatrix`` is :math:`2\times 3`.

        * **MOTION_AFFINE** sets an affine motion model (DEFAULT); six parameters are estimated; ``warpMatrix`` is :math:`2\times 3`.

        * **MOTION_HOMOGRAPHY** sets a homography as a motion model; eight parameters are estimated; ``warpMatrix`` is :math:`3\times 3`.

    :param criteria: parameter specifying the termination criteria of the ECC algorithm; ``criteria.epsilon`` defines the threshold of the increment in the correlation coefficient between two iterations (a negative ``criteria.epsilon`` makes ``criteria.maxcount`` the only termination criterion). Default values are shown in the declaration above.


The function estimates the optimum transformation (``warpMatrix``) with respect to the ECC criterion ([EP08]_), that is

.. math::

    \texttt{warpMatrix} = \arg\max_{W} \texttt{ECC}(\texttt{templateImage}(x,y), \texttt{inputImage}(x',y'))

where

.. math::

    \begin{bmatrix} x' \\ y' \end{bmatrix} = W \cdot \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}

(the equation holds with homogeneous coordinates for homography). The function returns the final enhanced correlation coefficient, that is, the correlation coefficient between the template image and the final warped input image. When a :math:`3\times 3` matrix is given with ``motionType`` = 0, 1 or 2, the third row is ignored.

Unlike :ocv:func:`findHomography` and :ocv:func:`estimateRigidTransform`, the function :ocv:func:`findTransformECC` implements an area-based alignment that builds on intensity similarities. In essence, the function updates the initial transformation that roughly aligns the images. If this information is missing, the identity warp (unity matrix) should be given as input. Note that if the images undergo strong displacements/rotations, an initial transformation that roughly aligns the images is necessary (e.g., a simple Euclidean/similarity transform that lets the images show approximately the same content). Use inverse warping in the second image to obtain an image close to the first one, i.e. use the flag ``WARP_INVERSE_MAP`` with :ocv:func:`warpAffine` or :ocv:func:`warpPerspective`. See also the OpenCV sample ``image_alignment.cpp`` that demonstrates the use of the function. Note that the function throws an exception if the algorithm does not converge.

.. seealso::

    :ocv:func:`estimateRigidTransform`,
    :ocv:func:`findHomography`


updateMotionHistory
@@ -726,3 +770,5 @@ Releases all inner buffers.
.. [Zach2007] C. Zach, T. Pock and H. Bischof. "A Duality Based Approach for Realtime TV-L1 Optical Flow", In Proceedings of Pattern Recognition (DAGM), Heidelberg, Germany, pp. 214-223, 2007

.. [Javier2012] Javier Sanchez, Enric Meinhardt-Llopis and Gabriele Facciolo. "TV-L1 Optical Flow Estimation".

.. [EP08] Evangelidis, G.D. and Psarakis E.Z. "Parametric Image Alignment using Enhanced Correlation Coefficient Maximization", IEEE Transactions on PAMI, vol. 32, no. 10, 2008
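
The workflow described in the documentation above (start from an identity warp, estimate the transform, then apply it with inverse warping) can be sketched in a few lines of C++. This snippet is not part of the commit; the image file names are placeholders, and it simply assumes the API added by this patch (findTransformECC, MOTION_EUCLIDEAN) together with the existing warpAffine and WARP_INVERSE_MAP:

#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/video/tracking.hpp>

using namespace cv;

int main()
{
    // Hypothetical input files: both images must be single-channel (CV_8U or CV_32F).
    Mat templateImage = imread("template.png", 0);
    Mat inputImage    = imread("input.png", 0);

    // Start from the identity warp when no rough alignment is known.
    Mat warpMatrix = Mat::eye(2, 3, CV_32F);

    // Estimate a Euclidean (rigid) warp; the return value is the final
    // enhanced correlation coefficient between template and warped input.
    double cc = findTransformECC(templateImage, inputImage, warpMatrix,
                                 MOTION_EUCLIDEAN,
                                 TermCriteria(TermCriteria::COUNT + TermCriteria::EPS, 50, 0.001));

    // Warp the input towards the template with the inverse map, as recommended above.
    Mat aligned;
    warpAffine(inputImage, aligned, warpMatrix, templateImage.size(),
               INTER_LINEAR + WARP_INVERSE_MAP);

    return cc > 0 ? 0 : 1;
}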

modules/video/include/opencv2/video/tracking.hpp

+33
@@ -218,6 +218,24 @@ CVAPI(const CvMat*) cvKalmanCorrect( CvKalman* kalman, const CvMat* measurement
#define cvKalmanUpdateByTime  cvKalmanPredict
#define cvKalmanUpdateByMeasurement cvKalmanCorrect


/****************************************************************************************\
*                              Image Alignment (ECC algorithm)                          *
\****************************************************************************************/
enum
{
    MOTION_TRANSLATION,
    MOTION_EUCLIDEAN,
    MOTION_AFFINE,
    MOTION_HOMOGRAPHY
};

/* Estimate the geometric transformation between 2 images (area-based alignment) */
CVAPI(double) cvFindTransformECC (const CvArr* templateImage, const CvArr* inputImage,
                                  CvMat* warpMatrix,
                                  const int motionType,
                                  const CvTermCriteria criteria);

#ifdef __cplusplus
}

@@ -323,6 +341,21 @@ CV_EXPORTS_W void calcOpticalFlowFarneback( InputArray prev, InputArray next,
CV_EXPORTS_W Mat estimateRigidTransform( InputArray src, InputArray dst,
                                         bool fullAffine);

//! estimates the best-fit Translation, Euclidean, Affine or Perspective Transformation
// with respect to Enhanced Correlation Coefficient criterion that maps one image to
// another (area-based alignment)
//
// see reference:
// Evangelidis, G.D., Psarakis, E.Z., Parametric Image Alignment using
// Enhanced Correlation Coefficient Maximization, PAMI, 30(8), 2008

CV_EXPORTS_W double findTransformECC(InputArray templateImage,
                                     InputArray inputImage,
                                     InputOutputArray warpMatrix,
                                     int motionType=MOTION_AFFINE,
                                     TermCriteria criteria=TermCriteria(TermCriteria::COUNT+TermCriteria::EPS, 50, 0.001));


//! computes dense optical flow using Simple Flow algorithm
CV_EXPORTS_W void calcOpticalFlowSF(InputArray from,
                                    InputArray to,
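
The header above also exposes the legacy C entry point cvFindTransformECC. A minimal, hedged sketch of how it might be called (not part of the commit; the helper function align_with_ecc and its arguments are illustrative only):

#include <opencv2/core/core_c.h>
#include <opencv2/video/tracking.hpp>

/* Assumed usage of the C API declared above: estimate an affine warp
   between two single-channel images and store it in a 2x3 CvMat. */
void align_with_ecc(const CvArr* templateImage, const CvArr* inputImage)
{
    CvMat* warpMatrix = cvCreateMat(2, 3, CV_32FC1);
    cvSetIdentity(warpMatrix);  /* start from the identity warp */

    cvFindTransformECC(templateImage, inputImage, warpMatrix,
                       MOTION_AFFINE,
                       cvTermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 50, 0.001));

    /* ... use warpMatrix, e.g. with cvWarpAffine and CV_WARP_INVERSE_MAP ... */
    cvReleaseMat(&warpMatrix);
}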

modules/video/perf/perf_ecc.cpp

+74
@@ -0,0 +1,74 @@
#include "perf_precomp.hpp"

using namespace std;
using namespace cv;
using namespace perf;
using std::tr1::make_tuple;
using std::tr1::get;

CV_ENUM(MotionType, MOTION_TRANSLATION, MOTION_EUCLIDEAN, MOTION_AFFINE, MOTION_HOMOGRAPHY)

typedef std::tr1::tuple<MotionType> MotionType_t;
typedef perf::TestBaseWithParam<MotionType_t> TransformationType;


PERF_TEST_P(TransformationType, findTransformECC, /*testing::ValuesIn(MotionType::all())*/
            testing::Values((int) MOTION_TRANSLATION, (int) MOTION_EUCLIDEAN,
                            (int) MOTION_AFFINE, (int) MOTION_HOMOGRAPHY)
            )
{

    Mat inputImage = imread(getDataPath("cv/shared/fruits.png"),0);
    Mat img;
    resize(inputImage, img, Size(216,216));
    Mat templateImage;

    int transform_type = get<0>(GetParam());

    Mat warpMat;
    Mat warpGround;

    double angle;
    switch (transform_type) {
    case MOTION_TRANSLATION:
        warpGround = (Mat_<float>(2,3) << 1.f, 0.f, 7.234f,
                      0.f, 1.f, 11.839f);

        warpAffine(img, templateImage, warpGround,
                   Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
        break;
    case MOTION_EUCLIDEAN:
        angle = CV_PI/30;

        warpGround = (Mat_<float>(2,3) << (float)cos(angle), (float)-sin(angle), 12.123f,
                      (float)sin(angle), (float)cos(angle), 14.789f);
        warpAffine(img, templateImage, warpGround,
                   Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
        break;
    case MOTION_AFFINE:
        warpGround = (Mat_<float>(2,3) << 0.98f, 0.03f, 15.523f,
                      -0.02f, 0.95f, 10.456f);
        warpAffine(img, templateImage, warpGround,
                   Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
        break;
    case MOTION_HOMOGRAPHY:
        warpGround = (Mat_<float>(3,3) << 0.98f, 0.03f, 15.523f,
                      -0.02f, 0.95f, 10.456f,
                      0.0002f, 0.0003f, 1.f);
        warpPerspective(img, templateImage, warpGround,
                        Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
        break;
    }

    TEST_CYCLE()
    {
        if (transform_type<3)
            warpMat = Mat::eye(2,3, CV_32F);
        else
            warpMat = Mat::eye(3,3, CV_32F);

        findTransformECC(templateImage, img, warpMat, transform_type,
                         TermCriteria(TermCriteria::COUNT+TermCriteria::EPS, 5, -1));
    }
    SANITY_CHECK(warpMat, 1e-3);
}
