
Facing problems using VINS-Fusion on T265 realsense stereo fisheye camera #57

Open
shubhamwagh opened this issue Jul 11, 2019 · 26 comments


@shubhamwagh

shubhamwagh commented Jul 11, 2019

Hi!
I want to compare the T265's on-board visual odometry against VINS-Fusion, since I don't have ground truth for my experiments.
I created config files for my T265 RealSense stereo fisheye camera, modelled on the D435i ones, using the KANNALA_BRANDT camera model.

First I start my RealSense T265 camera launch.
Then, on running

rosrun vins vins_node /home/qbot/catkin_ws/src/VINS-Fusion/config/realsense_t265/realsense_stereo_imu_config.yaml 

I get NaN values:

[ WARN] [1562834830.147733340]: gyroscope bias initial calibration -nan -nan -nan
[ INFO] [1562834830.149798690]: Initialization finish!
time: 1562834582.463970, t: nan nan nan q: -nan -nan -nan -nan 
time: 1562834582.530332, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834582.597377, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834582.664094, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834582.730809, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834582.797328, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834582.864194, t: -nan -nan -nan q: -nan -nan -nan -nan 
wait for imu ... 
time: 1562834582.930874, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834582.997389, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834583.064506, t: -nan -nan -nan q: -nan -nan -nan -nan 
time: 1562834583.130749, t: -nan -nan -nan q: -nan -nan -nan -nan 
wait for imu ... 

May I know what is going wrong?

Following are my config files:
left.yaml

%YAML:1.0
---
model_type: KANNALA_BRANDT
camera_name: t265
image_width: 848
image_height: 800
distortion_parameters:
   k1: -0.00361037999391556 
   k2: 0.0395187214016914
   p1: -0.0362809598445892
   p2: 0.00565103720873594
projection_parameters:
   fx: 285.3223876953125
   fy: 286.35479736328125
   cx: 425.3219909667969
   cy: 397.4468994140625

right.yaml

%YAML:1.0
---
model_type: KANNALA_BRANDT
camera_name: t265
image_width: 848
image_height: 800
distortion_parameters:
   k1: -0.00486675789579749
   k2: 0.0450274795293808
   p1: -0.0427920483052731
   p2: 0.00793429743498564
projection_parameters:
   fx: 285.404388427734
   fy: 286.497985839844
   cx: 422.697601318359
   cy: 408.199005126953

realsense_stereo_imu_config.yaml

%YAML:1.0

#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam
imu: 1         
num_of_cam: 2  

imu_topic: "/camera/imu"
image0_topic: "/camera/fisheye1/image_raw"
image1_topic: "/camera/fisheye2/image_raw"
output_path: "/home/qbot/catkin_ws/output/"

cam0_calib: "left.yaml"
cam1_calib: "right.yaml"
image_width: 848
image_height: 800
   

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 1   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.

body_T_cam0: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [-0.999932, -0.000790528, 0.0116805, 0.0106992684304714,
          0.000781081, -0.999999, -0.000813372, -8.35757236927748e-06,    
          0.0116811, -0.000804193, 0.999931, -0.000124988204333931,
          0., 0., 0., 1.]
             
body_T_cam1: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [-0.999979, 0.00284343, 0.00574249, -0.0536321885883808,
         -0.00283284, -0.999994, 0.00185211, 0.000244596012635157,
         0.00574772, 0.00183581, 0.999982, 0.00027683938969858,
         0., 0., 0., 1. ]              

#Multiple thread support
multiple_thread: 1

#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features 
freq: 10                # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image 
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 0           # publish tracking image as topic
flow_back: 1            # perform forward and backward optical flow to improve feature tracking accuracy

#optimization parameters
max_solver_time: 0.04  # max solver iteration time (ms), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters       The more accurate parameters you provide, the better performance
acc_n: 0.1          # accelerometer measurement noise standard deviation. #0.2   0.04
gyr_n: 0.01         # gyroscope measurement noise standard deviation.     #0.05  0.004
acc_w: 0.001         # accelerometer bias random walk noise standard deviation.  #0.002
gyr_w: 0.0001       # gyroscope bias random walk noise standard deviation.     #4.0e-5
g_norm: 9.805         # gravity magnitude

#unsynchronization parameters
estimate_td: 0                      # online estimate time offset between camera and imu
td: 0.00                            # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#loop closure parameters
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/qbot/catkin_ws/output/pose_graph/" # save and load path
save_image: 0                   # save images in the pose graph for visualization purposes; you can disable this by setting 0 

rs_camera.launch

<launch>
  <arg name="serial_no"           default=""/>
  <arg name="json_file_path"      default=""/>
  <arg name="camera"              default="camera"/>
  <arg name="tf_prefix"           default="$(arg camera)"/>

  <arg name="fisheye_width"       default="848"/> 
  <arg name="fisheye_height"      default="800"/>
  <arg name="enable_fisheye1"     default="true"/>
  <arg name="enable_fisheye2"     default="true"/>

  <arg name="fisheye_fps"         default="30"/>

  <arg name="gyro_fps"            default="200"/>
  <arg name="accel_fps"           default="62"/>
  <arg name="enable_gyro"         default="true"/>
  <arg name="enable_accel"        default="true"/>

  <arg name="enable_sync"           default="true"/>

  <arg name="linear_accel_cov"      default="0.01"/>
  <arg name="initial_reset"         default="true"/>
  <arg name="unite_imu_method"      default="linear_interpolation"/>
  
  <group ns="$(arg camera)">
    <include file="$(find realsense2_camera)/launch/includes/nodelet.launch.xml">
      <arg name="tf_prefix"                value="$(arg tf_prefix)"/>
      <arg name="serial_no"                value="$(arg serial_no)"/>
      <arg name="json_file_path"           value="$(arg json_file_path)"/>

      <arg name="enable_sync"              value="$(arg enable_sync)"/>

      <arg name="fisheye_width"            value="$(arg fisheye_width)"/>
      <arg name="fisheye_height"           value="$(arg fisheye_height)"/>
      <arg name="enable_fisheye1"          value="$(arg enable_fisheye1)"/>
      <arg name="enable_fisheye2"          value="$(arg enable_fisheye2)"/>

      <arg name="fisheye_fps"              value="$(arg fisheye_fps)"/>
      <arg name="gyro_fps"                 value="$(arg gyro_fps)"/>
      <arg name="accel_fps"                value="$(arg accel_fps)"/>
      <arg name="enable_gyro"              value="$(arg enable_gyro)"/>
      <arg name="enable_accel"             value="$(arg enable_accel)"/>

      <arg name="linear_accel_cov"         value="$(arg linear_accel_cov)"/>
      <arg name="initial_reset"            value="$(arg initial_reset)"/>
      <arg name="unite_imu_method"         value="$(arg unite_imu_method)"/>
    </include>
  </group>
</launch>  

Any insight will be appreciated. Thanks.

@SainaRez

I'm having a similar problem and would like to know how I can get fisheye cameras to work with VINS-Fusion.

@ethanguo0327

Firstly, I found that VINS-Fusion doesn't read the fisheye mask, so I added some code to the program to do that, like what VINS-Mono does. The problem still exists in my experiment, but I think it's the first step of what we should do.

@ethanguo0327

Does VINS-Fusion support fisheye stereo? Any suggestions for using it? @pjrambo Thanks in advance.

@shubhamwagh
Author

shubhamwagh commented Sep 24, 2019

@ethanguo0327 In the paper, under the section Applications: Feedback control on an aerial robot, they use a fisheye lens, for which they use the KANNALA_BRANDT camera model, a model widely used for fisheye lenses. It would be great if the authors could provide more insight into this.

@RamSrivatsav

Hello,
I'm having the same issue with a MYNT EYE S2100 stereo camera (pinhole). Has this problem been solved? If not, please guide me on how I can overcome it. Looking forward to your reply.
Thanks.

@LuciousZhang

LuciousZhang commented Oct 29, 2019

@shubhamwagh
I encountered the same problem as yours. Luckily, I solved it in the end.
The key is to calibrate the VIO device yourself; DON'T USE ITS OWN FACTORY PARAMETERS.
I used imu_utils and kalibr to calibrate the T265. The following is my fisheye1.yaml.

%YAML:1.0
---
model_type: MEI
camera_name: camera
image_width: 848
image_height: 800
mirror_parameters:
   xi: 1.6943561
distortion_parameters:
   k1: -0.1075293
   k2: 0.6081762
   p1: 0.0029581
   p2: 0.0020715
projection_parameters:
   gamma1: 774.927
   gamma2: 773.762
   u0: 420.086
   v0: 402.516

Then run VINS-Fusion.
[screenshot: VINS-Fusion running on the T265]

If you can read some Chinese, see my article: Realsense T265 calibration and running VINS, using kalibr and imu_utils.

@shubhamwagh
Author

It's great that someone finally got VINS-Fusion working with the T265 camera. Thanks, @LuciousZhang, for the info.

@mzahana

mzahana commented Dec 7, 2019

Hello @shubhamwagh, were you able to use VINS-Fusion with the T265?

@mzahana

mzahana commented Dec 9, 2019

@LuciousZhang Can you please mention which camera model from Kalibr you used?

@LuciousZhang

@LuciousZhang Can you please mention which camera model from Kalibr you used?

@mzahana MEI

@mzahana

mzahana commented Dec 13, 2019

Thanks @LuciousZhang. However, I see no model named MEI in the Kalibr documentation (https://github.com/ethz-asl/kalibr/wiki/supported-models).

Is it one of the supported models?

@shubhamwagh
Author

shubhamwagh commented Dec 13, 2019

@mzahana This VINS-Fusion package does provide the MEI camera model. Check the code here.
For calibration with Kalibr, the omnidirectional camera model (omni) is available, which is the MEI model.

@mzahana

mzahana commented Dec 13, 2019

@shubhamwagh so you are suggesting using the calibrator in this package?

@shubhamwagh
Author

shubhamwagh commented Dec 13, 2019

@mzahana
Either use Kalibr to calibrate with the MEI-type camera model, which is actually the omnidirectional camera model (omni), get the calibration data, and then use VINS-Fusion as mentioned by @LuciousZhang (see the sketch below),

or use the VINS-Fusion calibration package to calibrate the cameras (check here) and then use the VINS-Fusion pipeline.
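
For anyone unsure about the Kalibr side, here is a rough sketch of the two invocations (the bag names, target file, and camchain file names below are placeholders for your own recording; the point is simply choosing the omni-radtan model, which corresponds to MEI):

# 1) Stereo intrinsics and cam-to-cam extrinsics with the omni (MEI-equivalent) model
kalibr_calibrate_cameras \
    --bag static_target.bag \
    --topics /camera/fisheye1/image_raw /camera/fisheye2/image_raw \
    --models omni-radtan omni-radtan \
    --target aprilgrid.yaml

# 2) Camera-IMU extrinsics and time offset, reusing the camchain from step 1
kalibr_calibrate_imu_camera \
    --bag dynamic_motion.bag \
    --cam camchain-static_target.yaml \
    --imu imu.yaml \
    --target aprilgrid.yaml

The xi and focal/principal-point values of the omni model should map onto the mirror_parameters and projection_parameters fields of the MEI yaml that @LuciousZhang posted above, since both implement the unified omnidirectional camera model.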

@mzahana

mzahana commented Dec 13, 2019

@shubhamwagh thanks for the explanation.

@mzahana

mzahana commented Dec 18, 2019

@LuciousZhang I followed the calibration steps using Kalibr and also tried to follow the steps you wrote in your blog. However, the estimator diverges very fast. My config file looks as follows:

%YAML:1.0

#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam
imu: 1         
num_of_cam: 2  

imu_topic: "/camera/imu"
image0_topic: "/camera/fisheye1/image_raw"
image1_topic: "/camera/fisheye2/image_raw"
output_path: "/home/dji/output/"

cam0_calib: "left.yaml"
cam1_calib: "right.yaml"
image_width: 848
image_height: 800
   

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 1   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.

# Left
body_T_cam0: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [1.0, 0.0, 0.0, 0.0,
           0.0, 1.0, 0.0,  0.0,
           0.0, 0.0, 1.0, 0.0,
           0, 0, 0, 1]

# Right
body_T_cam1: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [0.9999607139422159, 0.0022898268013124634, -0.008563134087404545, -0.06387522333501072,
           -0.002248740448562943, 0.999985929369192, 0.004804605086086959, 2.259229602903413e-05,
           0.008574015312202064, -0.00478516006610509, 0.9999517930903333, 0.00046170207522490083,
           0, 0, 0, 1]

#Multiple thread support
multiple_thread: 1

#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features 
freq: 10                # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image 
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 0           # publish tracking image as topic
flow_back: 1            # perform forward and backward optical flow to improve feature tracking accuracy

#optimization parameters
max_solver_time: 0.04  # max solver iteration time (ms), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters       The more accurate parameters you provide, the better performance
acc_n: 0.018536          # accelerometer measurement noise standard deviation. #0.2   0.04
gyr_n: 0.00202783         # gyroscope measurement noise standard deviation.     #0.05  0.004
acc_w: 0.00061931 #0.001         # accelerometer bias random walk noise standard deviation.  #0.002
gyr_w: 0.000014786 #0.0001       # gyroscope bias random walk noise standard deviation.     #4.0e-5
g_norm: 9.805         # gravity magnitude

#unsynchronization parameters
estimate_td: 1                      # online estimate time offset between camera and imu
td: 0.003                             # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#loop closure parameters
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/dji/output/pose_graph/" # save and load path
save_image: 0                   # save images in the pose graph for visualization purposes; you can disable this by setting 0 

Any ideas what could be wrong?

@LuciousZhang

LuciousZhang commented Dec 19, 2019

@mzahana
body_T_cam0 should not be an identity matrix; it is the extrinsic transform from cam0 to the IMU. An identity matrix would mean the IMU and cam0 are exactly coincident, which is impossible.
You need to check where the identity matrix came from. (If you followed my blog carefully, this wouldn't happen, HAHA.) For what a plausible body_T_cam0 looks like, see the factory extrinsic reproduced below.
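
For reference, the T265 factory extrinsic posted at the top of this thread (IMU to fisheye1) has a non-trivial rotation and roughly a 1 cm translation; whatever calibration you end up with, body_T_cam0 should have this general shape rather than being identity:

body_T_cam0: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [-0.999932, -0.000790528, 0.0116805, 0.0106992684304714,
          0.000781081, -0.999999, -0.000813372, -8.35757236927748e-06,
          0.0116811, -0.000804193, 0.999931, -0.000124988204333931,
          0., 0., 0., 1.]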

@mzahana

mzahana commented Dec 20, 2019

@LuciousZhang After fixing the transformation matrices, the estimate still diverges quickly. It initially follows the camera motion, but only for a brief amount of time, and then diverges with NaN values on the screen. Please see this issue. I created a separate issue because the problem may also affect devices other than the T265.

Cheers.

@hridaybavle

hridaybavle commented Apr 21, 2020

Hi all,

For those still facing the NaN problem when using a fisheye lens: I found that the algorithm produces NaNs when undistorting the points in feature_tracker.cpp. I have added a NaN check to remove the points with NaNs, and now the algorithm works with the T265 and Intel's default camera parameters for the T265.

This is the pull request.
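
For anyone who wants to patch this locally, the idea is simply to drop any feature whose undistorted coordinates came out as NaN before it reaches the estimator. Below is a minimal sketch, not the code from the pull request; the function name filterNanPoints and the exact set of parallel vectors are placeholders modelled on the feature tracker's std::vector<cv::Point2f> / std::vector<int> containers:

#include <cmath>
#include <vector>
#include <opencv2/core.hpp>

// Remove every point whose undistorted coordinates are NaN, together with
// the matching entries of the parallel pixel/id/track-count vectors, so
// that invalid undistortions never reach the estimator.
static void filterNanPoints(std::vector<cv::Point2f> &un_pts,
                            std::vector<cv::Point2f> &cur_pts,
                            std::vector<int> &ids,
                            std::vector<int> &track_cnt)
{
    size_t j = 0;
    for (size_t i = 0; i < un_pts.size(); ++i)
    {
        if (std::isnan(un_pts[i].x) || std::isnan(un_pts[i].y))
            continue;                 // skip this feature entirely
        un_pts[j] = un_pts[i];
        cur_pts[j] = cur_pts[i];
        ids[j] = ids[i];
        track_cnt[j] = track_cnt[i];
        ++j;
    }
    un_pts.resize(j);
    cur_pts.resize(j);
    ids.resize(j);
    track_cnt.resize(j);
}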

@greymaner

@hridaybavle I cloned the newest code, which includes "assert(id_pts.second[0].first == 0);", but I still get this problem with the T265 fisheye. Frustrating -.-

@hamsterasu

Hi guys, was it necessary to turn off the auto-exposure in T265 for it to work properly?

@CeccAnd

CeccAnd commented Oct 24, 2020

Hello, should auto-exposure be turned off? Through the RealSense Viewer or something different?

@shanekelly

shanekelly commented Apr 23, 2021

@shubhamwagh when using the KANNALA_BRANDT model, the parameters must be specified as follows:

%YAML:1.0
---

model_type: KANNALA_BRANDT
camera_name: <camera_name_val>
image_width: <image_width_val>
image_height: <image_height_val>
projection_parameters:
   k2: <k2_val>
   k3: <k3_val>
   k4: <k4_val>
   k5: <k5_val>
   mu: <mu_val>
   mv: <mv_val>
   u0: <u0_val>
   v0: <v0_val>

See here for reference.

Unfortunately, the VINS-Fusion code does not do any checking to make sure the user has specified the configuration parameters properly, so if they are not in this format, the distortion parameters will be saved as zeros, which results in NaN values.
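
One way to see why the missing keys silently turn into zeros: the calibration yaml is read with OpenCV's cv::FileStorage, and reading an absent key with operator>> simply leaves the default value untouched instead of raising an error. A standalone illustration of that OpenCV behaviour (not the actual VINS-Fusion/camodocal parser), assuming a left.yaml in the working directory:

#include <iostream>
#include <opencv2/core.hpp>

int main()
{
    cv::FileStorage fs("left.yaml", cv::FileStorage::READ);
    cv::FileNode proj = fs["projection_parameters"];

    // If k2..k5 / mu / mv / u0 / v0 are not listed under
    // projection_parameters (for example because they were put under
    // distortion_parameters instead), these reads keep the 0.0 defaults
    // without any warning, and the zeros later produce NaNs downstream.
    double k2 = 0.0, mu = 0.0;
    proj["k2"] >> k2;
    proj["mu"] >> mu;

    std::cout << "k2 = " << k2 << ", mu = " << mu << std::endl;
    return 0;
}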

@HViktorTsoi

You may try the fix in #168; adding the fisheye mask helps reduce the large errors at the fisheye edges and improves robustness, even with the KANNALA_BRANDT model.

@githubwys

I met the same problem and solved it.

  1. I used camera and IMU data that are synchronized in hardware.
     (I will check camera and IMU data without hardware synchronization later.)
  2. Check the parameters, especially the camera parameters.

@ahjcahkl

@hridaybavle I cloned the newest code, which includes "assert(id_pts.second[0].first == 0);", but I still get this problem with the T265 fisheye. Frustrating -.-

I have also encountered the same problem as you. Have you solved it by now?
