How do I investigate a random Android native function call error? - java

First of all, apologies for the vague title; I ran into a problem and could not find any information about the error shown in the log.

I am developing an Android application that uses OpenCV for image processing and matching.

The main OpenCV code is written in C++ and exposed to Java through JNI functions.

The application flow is: first the camera image is acquired, then the OpenCV camera is opened, and each frame is matched against a reference image.

On every frame I call a native method named detectFeatures, which returns a double representing the percentage by which the two images match.

Sometimes the application works fine, and sometimes it crashes with this error in the log:

/data/app/com.grimg.coffretpfe-2/lib/arm/libnative-lib.so (_Z6toGrayN2cv3MatES0_+1577)
/data/app/com.grimg.coffretpfe-2/lib/arm/libnative-lib.so (Java_com_grimg_coffretpfe_Activities_CompareActivity_detectFeatures+100)

In the C++ code I have a function named toGray with this signature:

double toGray(Mat captured, Mat target)

The JNI method I call from Java is:

extern "C"
jdouble
JNICALL Java_com_grimg_coffretpfe_Activities_CompareActivity_detectFeatures(
        JNIEnv *env,
        jclass type, jlong addrRgba, jlong addrGray /* this */) {

    Mat &mRgb = *(Mat *) addrRgba;
    Mat &mGray = *(Mat *) addrGray;

    double conv = toGray(mRgb, mGray);

    return (jdouble) conv;
}
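For random crashes at this boundary, the first thing worth checking is whether the jlong addresses passed from Java still point at valid, initialized Mat objects when the native code dereferences them. A minimal sketch of that guard, using a hypothetical stand-in Mat type instead of OpenCV so it is self-contained (detectFeaturesChecked and the stand-in type are illustration names, not part of the question's code):

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for cv::Mat (hypothetical, for illustration only)
struct Mat {
    bool empty_ = true;
    bool empty() const { return empty_; }
};

// Hypothetical guarded version of the JNI bridge: reject null addresses and
// empty matrices before handing them to the matching code.
double detectFeaturesChecked(int64_t addrRgba, int64_t addrGray) {
    if (addrRgba == 0 || addrGray == 0) return -1.0;  // null address from Java
    Mat &mRgb = *reinterpret_cast<Mat *>(static_cast<intptr_t>(addrRgba));
    Mat &mGray = *reinterpret_cast<Mat *>(static_cast<intptr_t>(addrGray));
    if (mRgb.empty() || mGray.empty()) return -1.0;   // uninitialized Mat
    return 0.0;  // would call toGray(mRgb, mGray) here
}
```

Returning a sentinel such as -1.0 is only one option; the point is to fail visibly instead of dereferencing a stale or null address.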

I have searched a lot but could not find anything about this error.

Maybe you can help me figure this out.

Edit:

double toGray(Mat captured, Mat target) {
    std::vector<cv::KeyPoint> keypointsCaptured;
    std::vector<cv::KeyPoint> keypointsTarget;

    cv::Mat descriptorsCaptured;
    cv::Mat descriptorsTarget;
    std::vector<cv::DMatch> matches;
    std::vector<cv::DMatch> symMatches;

    // orb, matcher, deltaX, deltaY, angle and PI are defined elsewhere in the file
    orb = ORB::create();

    // Pre-process
    resize(captured, captured, Size(480, 360));
    medianBlur(captured, captured, 5);

    resize(target, target, Size(480, 360));
    medianBlur(target, target, 5);

    orb->detectAndCompute(captured, noArray(), keypointsCaptured, descriptorsCaptured);
    orb->detectAndCompute(target, noArray(), keypointsTarget, descriptorsTarget);

    // Match images based on k nearest neighbours
    std::vector<std::vector<cv::DMatch> > matches1;
    matcher.knnMatch(descriptorsCaptured, descriptorsTarget, matches1, 2);
    std::vector<std::vector<cv::DMatch> > matches2;
    matcher.knnMatch(descriptorsTarget, descriptorsCaptured, matches2, 2);

    // Ratio filter
    ratioTest(matches1);
    ratioTest(matches2);
    symmetryTest(matches1, matches2, symMatches);
    ransacTest(symMatches, keypointsCaptured, keypointsTarget, matches);
    const int symMatchCount = matches.size();

    Point2f point1;
    Point2f point2;
    float median;
    float meanBoy = 0;
    int count = 0;
    vector<float> angleList;
    vector<Point2f> point1List;
    vector<Point2f> point2List;

    // Compute the angle between each matched point pair
    for (int i = 0; i < matches.size(); i++) {
        point1 = keypointsCaptured[matches[i].queryIdx].pt;
        point2 = keypointsTarget[matches[i].trainIdx].pt;
        point1List.push_back(point1);
        point2List.push_back(point2);

        deltaY = ((360 - point2.y) - (360 - point1.y));
        deltaX = (point2.x + 480 - point1.x);

        angle = atan2(deltaY, deltaX) * 180 / PI;
        cout << "ORB Matching Results" << angle << endl;
        meanBoy += angle;

        angleList.push_back(angle);
    }

    // Median of the angles
    vector<float> angleLCopy(angleList);
    std::sort(angleLCopy.begin(), angleLCopy.end());
    size_t medianIndex = angleLCopy.size() / 2;
    nth_element(angleLCopy.begin(), angleLCopy.begin() + medianIndex, angleLCopy.end());
    median = angleLCopy[medianIndex];
    std::cout << "new Median method " << angleLCopy[medianIndex] << std::endl;

    // Drop matches whose angle deviates more than 5 degrees from the median
    count = 0;
    for (auto i = matches.begin(); i != matches.end();) {
        point1 = point1List.at(count);
        point2 = point2List.at(count);

        deltaY = ((360 - point2.y) - (360 - point1.y));
        deltaX = ((point2.x + 480) - point1.x);

        angle = atan2(deltaY, deltaX) * 180 / PI;
        cout << "Is it sorted? " << angleList.at(count) << endl;

        if (angleList.at(count) > (median + 5) || angleList.at(count) < (median - 5)) {
            matches.erase(i);
            count++;
        } else {
            cout << "Points A (" << point1.x << ", " << point1.y << ") B (" <<
                 point2.x + 480 << ", " << point2.y << ") Deltas of X" << deltaX << " Y " <<
                 deltaY << "  Angle " << angle << endl;
            cout << "aint going no where" << angleList.at(count) << endl;

            ++i;
            count++;
        }
    }

    return (static_cast<double>(matches.size()) / static_cast<double>(matches1.size()));
}
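The median step above can be checked in isolation. Note that std::nth_element alone is enough (the preceding std::sort is redundant), and that for an even number of elements indexing at size()/2 picks the upper median rather than the average of the two middle values. A self-contained sketch of that behavior:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Upper median via nth_element, mirroring the medianIndex computation above.
float medianUpper(std::vector<float> v) {
    size_t mid = v.size() / 2;
    std::nth_element(v.begin(), v.begin() + mid, v.end());
    return v[mid];  // for even sizes this is the upper of the two middle values
}
```

Whether the upper median or the averaged median is used barely matters for a ±5 degree filter window, but it is worth knowing which one the code computes.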

Edit 2:

cv::Mat ransacTest(
        const std::vector<cv::DMatch>& matches,
        const std::vector<cv::KeyPoint>& keypoints1,
        const std::vector<cv::KeyPoint>& keypoints2,
        std::vector<cv::DMatch>& outMatches) {
    // Convert keypoints into Point2f
    std::vector<cv::Point2f> points1, points2;
    cv::Mat fundemental;
    for (std::vector<cv::DMatch>::const_iterator it = matches.begin();
         it != matches.end(); ++it) {
        // Get the position of left keypoints
        float x = keypoints1[it->queryIdx].pt.x;
        float y = keypoints1[it->queryIdx].pt.y;
        points1.push_back(cv::Point2f(x, y));
        // Get the position of right keypoints
        x = keypoints2[it->trainIdx].pt.x;
        y = keypoints2[it->trainIdx].pt.y;
        points2.push_back(cv::Point2f(x, y));
    }
    // Compute F matrix using RANSAC
    std::vector<uchar> inliers(points1.size(), 0);
    if (points1.size() > 0 && points2.size() > 0) {
        // Assign to the outer 'fundemental' here; redeclaring it with
        // 'cv::Mat fundemental = ...' would shadow the variable being returned
        fundemental = cv::findFundamentalMat(
                cv::Mat(points1), cv::Mat(points2), // matching points
                inliers,       // match status (inlier or outlier)
                CV_FM_RANSAC,  // RANSAC method
                distance,      // distance to epipolar line (parameter defined elsewhere)
                confidence);   // confidence probability (parameter defined elsewhere)
        // extract the surviving (inlier) matches
        std::vector<uchar>::const_iterator itIn = inliers.begin();
        std::vector<cv::DMatch>::const_iterator itM = matches.begin();
        // for all matches
        for (; itIn != inliers.end(); ++itIn, ++itM) {
            if (*itIn) { // it is a valid match
                outMatches.push_back(*itM);
            }
        }
        if (refineF) {
            // The F matrix will be recomputed with all accepted matches
            // Convert keypoints into Point2f for final F computation
            points1.clear();
            points2.clear();
            for (std::vector<cv::DMatch>::const_iterator it = outMatches.begin();
                 it != outMatches.end(); ++it) {
                // Get the position of left keypoints
                float x = keypoints1[it->queryIdx].pt.x;
                float y = keypoints1[it->queryIdx].pt.y;
                points1.push_back(cv::Point2f(x, y));
                // Get the position of right keypoints
                x = keypoints2[it->trainIdx].pt.x;
                y = keypoints2[it->trainIdx].pt.y;
                points2.push_back(cv::Point2f(x, y));
            }
            // Compute 8-point F from all accepted matches
            if (points1.size() > 0 && points2.size() > 0) {
                fundemental = cv::findFundamentalMat(
                        cv::Mat(points1), cv::Mat(points2), // matches
                        CV_FM_8POINT); // 8-point method
            }
        }
    }
    return fundemental;
}
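ratioTest and symmetryTest are referenced but not shown in the question. For context, a common shape for the ratio filter (Lowe's ratio test) over knnMatch output looks roughly like the following; this is a hypothetical reconstruction using a stand-in DMatch type so it is self-contained, and the 0.8 threshold is an assumption, not taken from the question's code:

```cpp
#include <cassert>
#include <vector>

// Stand-in for cv::DMatch (only the distance field matters here)
struct DMatch {
    float distance;
};

// Hypothetical ratio filter: clear any k-NN pair whose best match is not
// clearly better than the second best; returns how many pairs were removed.
int ratioTest(std::vector<std::vector<DMatch>> &matches, float ratio = 0.8f) {
    int removed = 0;
    for (auto &m : matches) {
        if (m.size() < 2 || m[0].distance > ratio * m[1].distance) {
            m.clear();   // mark this pair as rejected
            ++removed;
        }
    }
    return removed;
}
```

Any crash inside these helpers would show up in the stack trace with their own mangled names, so the trace pointing at toGray suggests the problem is in toGray itself.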

Answer

I see a possible crash in the loop over the matches in your toGray function:

matches.erase(i);

This invalidates the iterator i. You should replace it with:

i = matches.erase(i);
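The difference matters because std::vector::erase invalidates the erased iterator but returns a valid iterator to the next element. A self-contained sketch of the corrected loop shape, filtering plain ints instead of DMatch objects:

```cpp
#include <cassert>
#include <vector>

// Remove elements while iterating: always take the iterator returned by erase.
void eraseGreaterThan(std::vector<int> &v, int limit) {
    for (auto it = v.begin(); it != v.end();) {
        if (*it > limit) {
            it = v.erase(it);  // erase returns an iterator to the next element
        } else {
            ++it;              // only advance when nothing was erased
        }
    }
}
```

Continuing the loop with the old iterator after erase, as the question's code does, is undefined behavior, which matches the symptom of a crash that only happens sometimes.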
