How can I find the following parameters in Mission Planner?
Condition delay with absolute time reference?
@3Denis wrote:
Hi all,
I'm wondering if there is an option to set a condition_delay using not a relative reference (e.g. seconds) but an absolute reference like UTC, or even better something like time since armed.
I'm running Rover/boat on a Pixhawk 2.1 and the idea is to send two vehicles on their missions as synced as possible. They start at the same time and their mission paths are similar, let's say parallel. Due to different conditions (wind, current, etc.) the progress of their missions will drift out of sync. To get back in sync, what is needed at some point is something like a condition_delay using an absolute time reference: a waiting point, to make sure all vehicles have reached the waypoint, and then start off again together.
Thanks for your ideas!
Denis
Posts: 1
Participants: 1
Can hoverboard wheels be used on Rover?
@Hijohn wrote:
Hello guys,
I want to build a Rover with brushless DC motors. I have a motor drive board and two hoverboard wheels, which I have connected to an Arduino to control my vehicle successfully, as shown in the picture. Now I want to replace the Arduino with a Pixhawk, because Mission Planner is powerful software.
Please give me some advice on whether these brushless motors can be used with a Pixhawk.
I have tried many times to connect the Pixhawk to the motor drive board, but I think I may not have figured out how the Pixhawk works. My motor drive board has 6 output pins, [5V]-[EL]-[M]-[ZF]-[VR]-[GND], which can be connected to an MCU (like an Arduino):
5V : power positive.
EL : brake. (5V = Lock, 0V = Unlock)
M : RPM output.
ZF : Invert. (5V = clockwise, 0V = anticlockwise)
GND : Ground.
Sorry for my bad English.
And thank you for any advice!
Posts: 1
Participants: 1
Helicopters. Hi, can someone help or point me in the right direction about which flight controller will work for a 450-size heli? I have seen a Pixhawk with GPS and I want to know whether that will work for helicopters and Mission Planner. Thanks
@Heliyrich wrote:
Hi, can someone help me with which flight controller will work for an electric 450- or 500-size helicopter? I did see a Pixhawk with GPS; will that work with Mission Planner? I really want to fly a heli using Mission Planner. Which Mission Planner would I need? I use a Spektrum DX9 G2 transmitter, black edition, 9 channels. Thanks, your help is greatly appreciated.
Posts: 1
Participants: 1
New Tuning Instructions Wiki Page
@Leonardthall wrote:
I have written a tuning guide for multirotors. I thought I would start this discussion group to get feedback and suggestions.
http://ardupilot.org/copter/docs/tuning-process-instructions.html
As usual we are always looking for help to develop the wiki pages. So videos of aircraft doing various parts of this process could be a great addition.
Feel free to ask any questions about this process. That way I can also ensure they are properly addressed in the page.
Posts: 2
Participants: 2
Set param error, please ensure your version is AC3.3+
@cutepeii wrote:
Hey guys, when I try to calibrate my ESCs, Mission Planner says "Set param error. Please ensure your version is AC3.3+". I'm sure this is a Pixhawk. I searched for similar topics, but it seems other people didn't solve this problem. Does anyone know how to solve it?
Posts: 1
Participants: 1
Is the new Raspberry Pi 4 compatible with ArduPilot?
@JonathanP wrote:
Hi,
I would like to use the new Raspberry Pi 4.
Is it compatible with ArduPilot?
Posts: 1
Participants: 1
VTOL -- Freeman vertical take-off and landing fixed wing + Pixhawk
Is it possible to log at > 50 Hz?
@jmatt1998 wrote:
I've been modifying the code to add different logs, and no matter what I try I have been unable to log anything at > 50 Hz, regardless of how often the function is called. I have tried both the "easy way" and the "hard way" outlined in the documentation, and I have tried calling my logging function at a high frequency from the scheduler as well as from within another function that updates at a high frequency. Nothing seems to work so far. I'm using a Pixhawk 1 board, if that makes a difference. Does this mean the code is failing to run at high frequencies, or is it updating at a high frequency and just failing to log fast enough?
Additionally, the logging frequency does not seem precise: e.g. when trying to record at 25 Hz, I get frequencies ranging from 24.5 to 25.5 Hz, which is undesirable.
Posts: 2
Participants: 2
Auto mission command status after completion of a mission?
@RogerR wrote:
I’m flying some fully automated missions. I throw the plane, it flies around and then lands. The last step in Mission Planner is the landing. And the last part of that is that it disarms the motors after a short delay.
All good.
However, I often want to fly again w/o power cycling the Pixhawk or using the GCS. I have done multiple launches w/o power cycling before when I landed manually after flying a mission that had an infinite loop circle pattern at the end. But I found that subsequent launches did NOT restart from the first command (it flew instead to the next waypoint in the mission from the PREVIOUS flight…which worked poorly since I didn't throw it in that direction).
Does an auto mission restart at step 1 after “disarming” and then “arming”?
Or does it restart after hitting the end of the command list?
Could I place a Do_Jump back to step 1 (the takeoff) as the last item in the flight plan (after the landing)?
RR
Posts: 2
Participants: 2
Test button in Extended Tuning page of Antenna Tracker not working
@wcfung1 wrote:
I am building an Antenna Tracker using Pixhawk as AT controller flashed with AT V1.00.
Pan and tilt servos are connected to outputs PWM1 and PWM2 of the Pixhawk and powered by 5V. After connecting to MP 1.3.61, the two servos do not move when I move either the yaw or pitch slider and then push the TEST buttons.
Please advise what has gone wrong. If I replace the Pixhawk with an APM 2.6 as the AT controller, the TEST buttons do work.
Any help is deeply appreciated.
Posts: 1
Participants: 1
GPS inject is not successful, only u-blox messages displayed in link status
@Gusti_Made wrote:
I have used the Here+ on my copter and I get only an RTK Float result. Yesterday I ran into a problem: the base status indicators for GPS, GLONASS and BeiDou are still red, and no RTCM messages are shown in the link status box, only u-blox messages. What should I do to fix this problem? I also tried to revert the base and rover GPS configuration using the u-center software, but that did not resolve the error.
I use a Pixhawk Cube 2.1, ArduCopter v3.6, Mission Planner 1.3.58, and a Here+ GPS.
Posts: 1
Participants: 1
Do all quad motors have to be exactly at the same level (height wise)?
@iSkyMaster wrote:
I have a theoretical question.
- What if I put the two front motors of a quad on an equal plane,
- the back two motors are, let's say, 3 inches higher than the front two motors,
- and the distance between the motors is exactly the same as on a normal quad?
Will there be any issues flying this quad while the front two motors and the rear two motors are not exactly level (not in the same plane)?
Please let me know.
Posts: 2
Participants: 2
Problem with PID tuning in acro mode
@lmy568 wrote:
Today I tuned my differential-thrust boat with Rover 3.3 firmware.
1. I set the cruise speed and throttle successfully (in manual mode, using the RC7 switch to learn cruise; everything is wonderful).
2. I wanted to tune the speed-to-throttle PID, so I switched from manual mode to acro mode. Everything is OK in manual mode, but in acro mode the boat drives off on its own and goes crazy even though I didn't do anything: the throttle is 0 and the steering stick is in the center. When I return to manual mode, it quiets down and sits motionless.
My suspicion: is it possible that in manual mode the left stick is steering and the right is throttle, but in acro mode this changes, so the left stick becomes throttle and the right becomes steering?
Posts: 1
Participants: 1
How to configure dual airspeed sensors for Pixhawk 1?
@ton999 wrote:
Hi all, I plan to use dual digital airspeed sensors for my VTOL (quadplane) using a Pixhawk 1. Normally I connect one to I2C. Can I just connect the second airspeed sensor to the same I2C bus? Since I also use dual GPS, is that a problem? Can anybody help me set up the parameters in Mission Planner?
Another question is how the Pixhawk 1 will treat the second airspeed sensor. Is there an option to choose the best one, or to mix the readings and use the average speed value? My reason for using dual airspeed sensors is reliability and redundancy; I believe both the GPS and the airspeed sensor are very critical components for the aircraft. Thank you. BR.
Posts: 1
Participants: 1
HELP with log analysis for a drone crash
@zabbie wrote:
Hi.
Today I tried to fly my hexa with a Pixhawk Cube, but before even taking off in Auto mode the drone flipped itself sideways and ultimately ended upside down. That was scary. I have been checking the logs to try to make sense of what actually went wrong.
The RCOUT values for roll and yaw seem pretty abnormal to me. The flight modes I used were LOITER, AUTO and RTL.
As soon as I switched to AUTO mode, the hex raised its throttle and, instead of gaining altitude, rolled to the right and ultimately flipped.
I used the motor emergency stop switch on channel 7 to stop the motors, but there is no record of channel 7 switching in RCIN C7.
Here is the log file: 00000056.BIN (255.8 KB)
Posts: 1
Participants: 1
I want to number the route
@QAQshangbuqi wrote:
I want to number the routes in the Grid tool in Mission Planner. How should I do it? Thank you very much.
Posts: 1
Participants: 1
Wrong RTC time when using u-blox F9P
@MartinSollie wrote:
Hi, I have been flying with the u-blox F9P GNSS receiver lately, using the 3.9.9 beta where support for it was added.
I noticed that even though autoconfig enables the NAV-TIMEGPS message used to get the GPS week, the date in the dataflash logs is wrong for most logs (once in a while after rebooting it is correct). The date corresponds to GPS week 0 but the correct time of week (today this gives logs dated 1980-1-11). After looking at the code, I find that
AP_GPS::update_instance will set the time of the RTC as long as we have 3D fix, even if a NAV-TIMEGPS message has not yet been received (autoconfig sets this up with a 1 Hz rate), such that the week number has not yet been set:

```
if (state[instance].status >= GPS_OK_FIX_3D) {
    const uint64_t now = time_epoch_usec(instance);
    AP::rtc().set_utc_usec(now, AP_RTC::SOURCE_GPS);
}
```

(here)
This will happen if a NAV-PVT message is received (at 5 Hz) and AP_GPS::update_instance (called at 10 Hz or greater) runs before the NAV-TIMEGPS message is received. Since the RTC will never allow the GPS to set its time a second time after getting a valid week, this date error persists until a reboot, where it is likely to happen again.
Posts: 1
Participants: 1
Integration of ArduPilot and VIO tracking camera (Part 5): Camera Position Offsets Compensation, Scale Calibration and Compass North Alignment (Beta)
@LuckyBird wrote:
Introduction
In continuation of our ongoing labs, we have demonstrated how to let ArduPilot make full use of the Intel Realsense T265, a new off-the-shelf VIO tracking camera that can provide accurate position feedback in GPS-denied environments, with and without the use of ROS.
This blog is a direct next-step from part 4. We will further enhance the performance of our non-ROS system by taking into account other factors of a robotic platform that are oftentimes ignored. Specifically, we will look at:
- Camera position offset compensation
- Scale calibration
- Compass north alignment (beta)
Let’s dive in.
1. Prerequisite
If you have just started with this series, check out part 1 for a detailed discussion of hardware requirements, then follow the installation process until librealsense is verified to be working. In summary, the basic requirements for this lab include:
- An onboard computer capable of polling pose data from the T265 over a USB2 port. A Raspberry Pi 3 Model B has proven to be sufficient for our labs.
- A working installation of librealsense and pyrealsense2.
- The newest version of t265_to_mavlink.py, verified to be working as established in part 4.
- This lab is about improving the performance of the system introduced in part 4, so it's best if you have completed some handheld / flight tests so you have a good sense of the before-after comparison throughout the process.
2. Camera Position Offset Compensation
What is the problem?
If the camera is positioned far away from the center of the vehicle (especially on large frames), the pose data from the tracking camera might not entirely reflect the actual movement of the vehicle. Most noticeably, when the frame does a pure rotation (no translation at all), the feedback becomes a combination of rotation and translation movement on a curve, which is reasonable since that's how the camera actually moves. For small frames, on the other hand, this effect is negligible.
To fix that, we need to take a step back and see how all the coordinate systems are related:
What we actually want is the pose data of the IMU body frame {B} (also called IMU frame or body frame). What we have from the tracking camera, after all of the transformations performed in part 4, is the pose data of the camera frame {C}. Up until now, we have treated the camera frame {C} and the IMU body frame {B} as identical, and that's where the problem lies.
General solution
Suppose the pose measurement provided by the tracking camera at any given time is Pc, and the camera-IMU relative transformation is H_BC (transforming {C} into {B}). Then the desired measurement in the IMU body frame, Pb, is obtained by applying H_BC to Pc, with H_BC being a homogeneous transformation matrix, complete with rotation and translation parts.
Solution in our case
Figuring out all of the elements of H_BC can be quite tricky and complicated. Thankfully, in our case, the rotation portion has been taken care of in part 4 of this series. That is to say, {C} and {B} are ensured to have their axes always point in the same directions, so no further rotation is required to align them. All we need to do now is find out the camera position offsets d_x, d_y, d_z.
- Measuring camera position offsets: x, y and z distance offsets (in meters) from the IMU or the center of rotation/gravity of the vehicle, defined the same as in ArduPilot’s wiki page Sensor Position Offset Compensation:
The sensor's position offsets are specified as 3 values (X, Y and Z) which are distances in meters from the IMU (which can be assumed to be in the middle of the flight controller board) or the vehicle's center of gravity.
- X : distance forward of the IMU or center of gravity. Positive values are towards the front of the vehicle, negative values are towards the back.
- Y : distance to the right of the IMU or center of gravity. Positive values are towards the right side of the vehicle, negative values are towards the left.
- Z : distance below the IMU or center of gravity. Positive values are lower, negative values are higher.
Modify the script: Within the script t265_to_mavlink.py:
- Change the offset values body_offset_x, body_offset_y and body_offset_z accordingly.
- Enable offset compensation: body_offset_enabled = 1.
Testing: Next time you run the script, notice the difference in pure rotation movements. Other than that, the system should behave the same.
```
# Navigate to and run the script
cd /path/to/the/script
python3 t265_to_mavlink.py
```
Note: As with the note on the Sensor Position Offset Compensation wiki page: in most vehicles, which have all their sensors (camera and IMU in this case) within 15 cm of each other, it is unlikely that providing the offsets will give a noticeable performance improvement.
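For readers who want to see the idea in code, here is a minimal Python sketch of the offset compensation step, assuming a 4x4 homogeneous pose matrix in the NED-aligned frame produced in part 4. The names body_offset_x/y/z and body_offset_enabled mirror the script settings mentioned above, but the function name, the example values and the exact handling are illustrative only, not the script's actual implementation.

```python
import numpy as np

# Example offsets of the camera relative to the IMU / center of gravity, in meters,
# using ArduPilot's convention (x forward, y right, z down). Values are placeholders.
body_offset_enabled = 1
body_offset_x, body_offset_y, body_offset_z = 0.10, 0.0, -0.05

# Fixed transform between IMU body frame {B} and camera frame {C}: the axes are
# already aligned (part 4), so only the translation part is populated.
H_body_camera = np.eye(4)
H_body_camera[0:3, 3] = [body_offset_x, body_offset_y, body_offset_z]
H_camera_body = np.linalg.inv(H_body_camera)

def compensate_camera_offset(H_aeroRef_aeroBody):
    """Re-express a 4x4 camera pose so it describes the IMU body frame instead."""
    if not body_offset_enabled:
        return H_aeroRef_aeroBody
    # With this correction, a pure rotation of the vehicle no longer shows up
    # as a translation of the reported position.
    return H_body_camera @ np.asarray(H_aeroRef_aeroBody, dtype=float) @ H_camera_body
```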
3. Scale calibration
What is the problem?
At longer distances, the output scale is in some cases reported to be off by 20-30% of the actual scale. Here are some GitHub issues related to this problem:
- https://github.com/IntelRealSense/librealsense/issues/4174
- https://github.com/IntelRealSense/librealsense/issues/4450
Solution
We need to find a scale factor that can up/downscale the output position to the true value. To achieve that, we will go with the simple solution of measuring the actual displacement distance of the camera, then dividing that by the estimated displacement received from the tracking camera.
Once the scale factor is determined, the position output from the camera will be re-scaled by this factor. The scale factor can then be saved in the script for subsequent runs.
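As a quick illustration of the arithmetic (a sketch only, not the script's actual code; the displacement numbers below are made up):

```python
# The calibration boils down to a single ratio between the real-world displacement
# and the displacement reported by the camera; every position sample is then
# multiplied by that ratio before being sent on.
actual_displacement = 2.0      # meters, measured in the real world (example value)
estimated_displacement = 1.6   # meters, reported by the T265 (example value)

scale_factor = actual_displacement / estimated_displacement   # -> 1.25

def rescale_position(x, y, z, scale=scale_factor):
    """Apply the calibrated scale factor to a raw T265 position sample."""
    return x * scale, y * scale, z * scale
```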
Procedure:
Note: In the steps below, I would recommend doing the tests in the same environment and on the same moving trajectory (for example, walks on a 2m x 2m square every time). If the environment changes to a new location or the frame configuration changes, it's best to first restore the scale factor to the default value (1.0) and calibrate again if the scale is incorrect.
Do at least a few tests to determine whether the scale is arbitrarily changing, or if it stays the same (however incorrect), between runs. The behavior will determine the action in the last step.
Run the script with scale calibration option enabled:
python3 t265_to_mavlink.py --scale_calib_enable true
Perform handheld tests and calculate the new scale based on actual displacement and position feedback data. Actual displacement can also come from other sensors' readings (a rangefinder, for example), provided they are accurate enough.
Input the new scale: By default, the scale factor is 1.0. The scale should be input as a floating-point number, e.g. 1.1 is a valid value. At any time, type the new scale into the terminal and finish by pressing Enter. The new scale will take effect immediately. Note that the scale should only be modified when the vehicle is not moving, otherwise the local position (quadcopter icon on the map) might diverge.
Observe the changes on Mission Planner's map. Below are some walking tests by @ppoirier on the same trajectory, with two different scale factors.
Flight tests: Once you have obtained a good scale, flight tests can be carried out like normal.
Save or discard the new scale value: if the original scale is incorrect but consistent between runs, modify the script and change the default scale_factor. If the scale changes arbitrarily between runs, you might have to perform this calibration again next time.
4. Compass north alignment (BETA)
Note 1: Of the new features introduced in this post, compass north alignment is verified to work with the internal compass, with and without optical flow. However, more evaluation is needed in various cases where multiple data sources are used at the same time (multiple compasses, GPS, optical flow etc.). Any beta testers are thus much appreciated.
Note 2: In outdoor testing, optical flow can really help to improve stability, especially against wind.
What is the problem?
If you wish to have the heading of the vehicle aligned with the real world's north, i.e. 0 degrees always means facing magnetic north, then this section is for you.
General solution
The main implementation idea comes from this post, adapted to our robotic application, workflow and other modules.
Solution in our case
Assumptions:
- Suppose that the Realsense T265 and the compass are rigidly attached to the frame, meaning the translation and rotation offsets won't change over time. Furthermore, the compass heading and the pose data (already transformed into the NED frame) are assumed to align with the vehicle's forward direction.
- Compass data is available through the MAVLink ATTITUDE message.
Procedure:
- Compass yaw data is updated in the background by listening to the MAVLink ATTITUDE message.
- In our main loop, we capture the latest raw pose from the T265 and run the other submodules, before multiplying by the latest compass transformation matrix, which only takes the rotation into account. The translation offsets do not contribute to our solution and will be ignored (see the sketch below).
- The final pose, with its rotation aligned to magnetic north, will be used.
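A minimal sketch of that multiplication step, assuming the pose is a 4x4 homogeneous matrix in the NED-aligned frame from part 4; the function name and the example heading value are assumptions for illustration, not the script's exact code:

```python
import math
import numpy as np

# Yaw (radians, NED convention) read from the MAVLink ATTITUDE message at startup,
# while the T265 pose still reports yaw = 0. The value here is only an example.
heading_north_yaw = math.radians(47.0)

def align_pose_to_north(H_aeroRef_aeroBody, yaw=heading_north_yaw):
    """Rotate a 4x4 NED pose about the Down axis so that 0 degrees faces magnetic north."""
    c, s = math.cos(yaw), math.sin(yaw)
    H_north = np.array([[c, -s, 0, 0],
                        [s,  c, 0, 0],
                        [0,  0, 1, 0],
                        [0,  0, 0, 1]], dtype=float)
    # Only the rotation is applied; translation offsets are ignored, as noted above.
    return H_north @ np.asarray(H_aeroRef_aeroBody, dtype=float)
```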
ArduPilot parameters:
```
# Enable compass and use at least one of them
COMPASS_ENABLE = 1
COMPASS_USE = 1
```
Optical flow can also be enabled. The use of other systems (multiple compasses, GPS etc.) requires more beta testing.
- Enable compass north alignment in the script:
```
# Navigate to the script
nano /path/to/t265_to_mavlink.py

# compass_enabled = 1 : Enable using yaw from compass to align north (zero degrees is facing north)

# Run the script:
python3 /path/to/t265_to_mavlink.py
```
Verify the new heading: the vehicle should now point toward its actual current heading. Other than that, everything should work the same.
(Same vehicle position and heading in the real world. Initialization without the compass on the left, with the compass on the right.)
Ground test, handheld test and flight test: This feature is still in beta since consistent performance across systems is still a question mark.
Ground test: Once the system starts running, leave the vehicle on the ground and see if the final heading drifts. It should stay aligned to north even when the vision position starts streaming. If it drifts, that's a bad sign for flight tests; check that the compass data is stable and not jumping around, then retry.
Handheld test: move the vehicle in a specific pattern (square, circle, etc.) and verify that the trajectory shown on the map follows it. For example, if you moved in a cross, you should see a cross and not a cloverleaf.
Flight test: If all is well, you can go ahead with flight tests. Note that things can still go wrong at any time, so proceed with caution and always be ready to regain control if the vehicle starts to move erratically.
Here are some outdoor tests by @ppoirier, with all of the offsets, scale and compass settings that are introduced in this blog.
5. Open issues and further development
Since this is a relatively new kind of device, it's not surprising that there are multiple open questions regarding the performance and features of the Intel Realsense T265. A good source of new information and updates is the librealsense and realsense-ros repos. Interesting info can be found in some of these issues:
- Potential for a more versatile version aimed at embedded platforms: https://github.com/IntelRealSense/librealsense/issues/4408
- T265 vs GPS: https://github.com/IntelRealSense/librealsense/issues/4412
- In-depth discussion about the inner working of tracking confidence and covariance: https://github.com/IntelRealSense/realsense-ros/issues/770
- Exposure control is now available through the Viewer app and in code: https://github.com/IntelRealSense/librealsense/pull/3992
- Plan to use the device as a pure odometry sensor? That might be possible in a future version by disabling loop closure: https://github.com/IntelRealSense/realsense-ros/issues/779
- It’s not possible to run multiple T265s at the same time (for now, at least): https://github.com/IntelRealSense/realsense-ros/issues/706
- The T265's firmware is bundled with librealsense, which means updating the firmware = updating librealsense to the latest version: https://github.com/IntelRealSense/librealsense/issues/3746
- Did you know there is an example that detects AprilTag poses using the T265's fisheye streams? This provides a simple way to track a target without much computing resource. Check it out: https://github.com/IntelRealSense/librealsense/tree/development/examples/pose-apriltag
All in all, check the repos regularly if you are developing applications using this device, or just want to learn and explore as much as possible.
6. Conclusion and next steps
In this blog, we have taken the next steps to improve and enhance the performance of our system, which incorporates a VIO tracking camera with ArduPilot using Python and without ROS. Specifically, we have added support for:
- camera position offset compensation,
- scale calibration and correction, with a step-by-step guide,
- a working (beta) method to align the heading of the system with the real world's north.
We have now completed the sequence of weekly labs for our ArduPilot GSoC 2019 project. But we still have even more exciting developments on the way, so stay tuned! Here are a few Github issues to give you a glimpse of what’s next: #11812, #10374 and #11671.
If you are interested in contributing, or just keen on discussing the new developments, feel free to join ArduPilot’s many Gitter channels: ArduPilot and VisionProjects.
Hope this helps. Happy flying!
Posts: 3
Participants: 2
If you flash new firmware onto a Pixhawk do you need to recalibrate the compass and accel?
@LukasMarkussen wrote:
I just flashed new firmware onto my Pixhawk and it remembered the compass offsets. Do I need to recalibrate everything assuming it worked before the flash?
Thanks,
Lukas Markussen
Posts: 1
Participants: 1