OpenCV2 thin plate spline applyTransformation not working?

2024-03-30

I am implementing a thin plate spline transformer in Python with OpenCV2 and running into a problem. When I call warpImage, the image warps correctly, but when I use estimateTransformation with some manually entered points, those points are not mapped correctly. Instead, all of the points end up mapped to exactly the same location. Any help would be appreciated! I have attached my code below:

import cv2

splines = cv2.createThinPlateSplineShapeTransformer()
temp = splines.estimateTransformation(reference_coordinate_arr, image_marks_coordinates_arr, matches)
warpedimage = splines.warpImage(image)  # image warps fine
moved_barcodes = splines.applyTransformation(image_bar_coordinates_arr)[0]  # these coordinates all map to the same location
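
For reference, here is a minimal sketch of the calling convention the Python bindings expect: point arrays as float32 with shape (1, N, 2), one cv2.DMatch per correspondence, and applyTransformation returning a (cost, points) pair, so the transformed points are the second element of the result. The point values and variable names below are illustrative placeholders rather than data from the original post, and which argument order to use in estimateTransformation is exactly what the answer below discusses:

import cv2
import numpy as np

# Placeholder correspondences (illustrative values only)
source_pts = np.float32([[0, 0], [256, 0], [0, 256], [256, 256]]).reshape(1, -1, 2)
target_pts = np.float32([[0, 0], [256, 0], [0, 256], [230, 230]]).reshape(1, -1, 2)

# One DMatch per correspondence: point i in the first array matches point i in the second
matches = [cv2.DMatch(i, i, 0) for i in range(source_pts.shape[1])]

tps = cv2.createThinPlateSplineShapeTransformer()
tps.estimateTransformation(source_pts, target_pts, matches)  # argument order: see the answer below

# applyTransformation returns a (cost, points) tuple; the mapped points are the second element
cost, mapped_pts = tps.applyTransformation(source_pts)
print(mapped_pts)  # float32 array of shape (1, N, 2)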

Thank you very much for asking this question; I had been looking for spline warping for a long time and had never found the ThinPlateTransformation in OpenCV before.

For me it works in C++. I provided some sample points which, as far as I can tell, are probably not coplanar.

#include <opencv2/shape/shape_transformer.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/highgui.hpp>
#include <iostream>
#include <vector>
int main()
{

    cv::Mat img = cv::imread("C:/data/StackOverflow/Lenna.png");

    auto tps = cv::createThinPlateSplineShapeTransformer();
    std::vector<cv::Point2f> sourcePoints, targetPoints;
    sourcePoints.push_back(cv::Point2f(0, 0));
    targetPoints.push_back(cv::Point2f(0, 0));
    sourcePoints.push_back(cv::Point2f(0.5*img.cols, 0));
    targetPoints.push_back(cv::Point2f(0.5*img.cols, 0.25*img.rows));
    sourcePoints.push_back(cv::Point2f(img.cols, 0));
    targetPoints.push_back(cv::Point2f(img.cols, 0));
    sourcePoints.push_back(cv::Point2f(img.cols, 0.5*img.rows));
    targetPoints.push_back(cv::Point2f(0.75*img.cols, 0.5*img.rows));
    sourcePoints.push_back(cv::Point2f(img.cols, img.rows));
    targetPoints.push_back(cv::Point2f(img.cols, img.rows));
    sourcePoints.push_back(cv::Point2f(0.5*img.cols, img.rows));
    targetPoints.push_back(cv::Point2f(0.5*img.cols, 0.75*img.rows));
    sourcePoints.push_back(cv::Point2f(0, img.rows));
    targetPoints.push_back(cv::Point2f(0, img.rows));
    sourcePoints.push_back(cv::Point2f(0, 0.5*img.rows/2)); // note: accidentally halved the y value twice here (0.5 and /2)
    targetPoints.push_back(cv::Point2f(0.25*img.cols, 0.5*img.rows));

    std::vector<cv::DMatch> matches;
    for (unsigned int i = 0; i < sourcePoints.size(); i++)
        matches.push_back(cv::DMatch(i, i, 0));

    tps->estimateTransformation(targetPoints, sourcePoints, matches); // this gives the right warping from source to target, but the wrong point transformation
    //tps->estimateTransformation(sourcePoints, targetPoints, matches); // this gives the wrong warping but the right point transformation from source to target
    std::vector<cv::Point2f> transPoints;
    tps->applyTransformation(sourcePoints, transPoints);

    std::cout << "sourcePoints = " << std::endl << " " << sourcePoints << std::endl << std::endl;
    std::cout << "targetPoints = " << std::endl << " " << targetPoints << std::endl << std::endl;
    std::cout << "transPos = " << std::endl << " " << transPoints << std::endl << std::endl;

    cv::Mat dst;
    tps->warpImage(img, dst);

    cv::imshow("dst", dst);
    cv::waitKey(0);

}

This gives the following result:

sourcePoints =
 [0, 0;
 128, 0;
 256, 0;
 256, 256;
 256, 512;
 128, 512;
 0, 512;
 0, 128]

targetPoints =
 [0, 0;
 128, 128;
 256, 0;
 192, 256;
 256, 512;
 128, 384;
 0, 512;
 64, 256]

transPos =
 [0.0001950264, -5.7220459e-05;
 128, -27.710777;
 255.99991, -0.00023269653;
 337.67929, 279.34125;
 255.99979, 512;
 127.99988, 570.5177;
 -0.00029873848, 511.99994;
 -45.164845, -0.20605469]

So it is transforming the points, but in the wrong direction.

Switching source and target in the estimateTransformation call gives the correct point values (but then the image is warped the wrong way):

tps->estimateTransformation(sourcePoints, targetPoints, matches);


sourcePoints =
 [0, 0;
 128, 0;
 256, 0;
 256, 256;
 256, 512;
 128, 512;
 0, 512;
 0, 128]

targetPoints =
 [0, 0;
 128, 128;
 256, 0;
 192, 256;
 256, 512;
 128, 384;
 0, 512;
 64, 256]

transPos =
 [-4.7683716e-05, -0.00067138672;
 128.00008, 127.99954;
 256.00012, 0;
 192.00012, 256.00049;
 255.99988, 512.00049;
 127.9995, 383.99976;
 -0.00016021729, 512.00049;
 64.000031, 255.99982]

Input: (screenshot not reproduced)

Output: (screenshot not reproduced)

I just don't know why I have to switch the source and target points in the estimateTransformation call. It initially behaves in exactly the opposite way from what I expected...
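
One likely explanation (my own reading, not stated in the thread) is that warpImage performs backward warping, i.e. for every destination pixel it looks up where to sample in the source image, so it needs the mapping in the destination-to-source direction, while applyTransformation applies the estimated mapping directly. Based on the behaviour observed above, one workaround, sketched here in Python to mirror the original question (the point values and names are placeholders), is to keep two transformer instances, one per direction:

import cv2
import numpy as np

source_pts = np.float32([[0, 0], [256, 0], [0, 256], [256, 256]]).reshape(1, -1, 2)
target_pts = np.float32([[0, 0], [256, 0], [0, 256], [230, 230]]).reshape(1, -1, 2)
matches = [cv2.DMatch(i, i, 0) for i in range(source_pts.shape[1])]

# Transformer estimated with (target, source): use this one only for warping the image
tps_warp = cv2.createThinPlateSplineShapeTransformer()
tps_warp.estimateTransformation(target_pts, source_pts, matches)
# warped = tps_warp.warpImage(image)  # 'image' would be the input image to warp

# Transformer estimated with (source, target): use this one only for mapping points
tps_points = cv2.createThinPlateSplineShapeTransformer()
tps_points.estimateTransformation(source_pts, target_pts, matches)
cost, mapped_pts = tps_points.applyTransformation(source_pts)  # source points land on the target points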

The base source code was taken from: https://github.com/opencv/opencv/issues/7084
