[[Image:laserhawk.png|thumb|c|Be afraid. Be very afraid.]]
=LaserHawk Project Description=
Lidar UAV for traversability map generation
[https://github.com/paulcox/laserhawk/blob/master/docs/PDC_LAAS_Report1.pdf Preliminary Report]


=High-level Goals=
[[Image:trav_map.png|thumb|c|A robot plans its trajectory]]


#Autonomous flights of Multiplex Mentor UAV with onboard payload including:
#*Hokuyo [http://www.hokuyo-aut.jp/02sensor/07scanner/utm_30lx.html UTM-30LX] LIDAR sensor
#*OMAP3 Gumstix [http://www.gumstix.com/store/catalog/index.php?cPath=27_33 Overo]
#*Xsens [http://www.xsens.com/en/general/mti MTi] and [http://www.xsens.com/en/general/mti-g MTi-G] IMU/AHRS
#*[http://paparazzi.enac.fr/ Paparazzi] autopilot
#Generate traversability map that can be transmitted to UGV
#*Cartesian coordinates with traversability probabilities/confidence (see the sketch below)
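As a rough illustration of what one transmitted map cell might look like, here is a minimal C sketch; the field names and sizes are placeholders, not a decided format:
 /* Hypothetical sketch of one traversability map cell as it might be sent
  * to the UGV: a local Cartesian position plus a traversability estimate.
  * Field names and sizes are placeholders, not a decided format. */
 #include <stdint.h>
 
 struct trav_cell {
   float   x_m;         /* east position in the local Cartesian frame [m]   */
   float   y_m;         /* north position in the local Cartesian frame [m]  */
   float   z_m;         /* estimated ground height [m]                      */
   uint8_t trav_prob;   /* traversability probability, 0..255 maps to 0..1  */
   uint8_t confidence;  /* confidence in the estimate,  0..255 maps to 0..1 */
 } __attribute__((packed));
Packing and quantizing the probabilities to one byte keeps each cell small, which matters for the low-rate UAV-to-UGV link discussed under Architecture.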
==Elrob competition details==
Questions:
: Distance from launch point to zone of interest: 1-2 km
: Size of zone of interest: 500 x 500 m
: Will the aircraft stay in standby while waiting for zone-of-interest coordinates from the ground robot, or return home?
: Will the standby waypoint need to follow the ground robot as it travels?
: How long must the aircraft stay on alert for acquisition requests? Hours?
: How are the terrain map/data delivered to the ground station/ground robot: XBee, WiFi, or something else? File transfer or socket? Raw data options?


=Current Tasks and Priorities=
{| class="wikitable" style="text-align:center;"
|+ TASK LIST
! Num !! Name !! Notes !! Priority !! Status
|- style="background:lemonchiffon; color:black"
|- style="background:lemonchiffon; color:black"
| 1 || Acquire ground Lidar/attitude data while we wait for the plane || Use PC then Gumstix. Hokuyo and MTi-G mounted on a boom; the biketest git directory has the initial data || x || Done
|- style="background:bisque; color:black"
|- style="background:bisque; color:black"
| 2 || Start flying the Mentor airframe || Maiden manual flight accomplished; autonomous flight now working || x || Done
|- style="background:bisque; color:black"
|- style="background:bisque; color:black"
| 3 || Treat real-time or recorded data from real or virtual sources || Some initial application code done; integrating it all together next || x || In progress
|- style="background:bisque; color:black"
|- style="background:bisque; color:black"
| 4 || Literature review || x || x || In progress
|- style="background:bisque; color:black"
|- style="background:bisque; color:black"
| x ||x || x || x || x
|- style="background:bisque; color:black"
| x ||x || x || x || x
|}
====Mentor1 build todo====
#Make Hokuyo power switchable via GPIO (from the Tiny and/or the Overo; see the sketch below)
#New MTi GPS antenna
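For the Overo side of the power switch, a minimal sketch using the Linux sysfs GPIO interface; GPIO number 146 is a placeholder, the real pin depends on how the switch ends up being wired:
 /* Minimal sketch: switch the Hokuyo supply from the Overo through the
  * Linux sysfs GPIO interface.  GPIO 146 is a placeholder; the real pin
  * depends on the final wiring. */
 #include <fcntl.h>
 #include <string.h>
 #include <unistd.h>
 
 /* Write a short string to a sysfs file; returns 0 on success. */
 static int sysfs_write(const char *path, const char *val)
 {
   int fd = open(path, O_WRONLY);
   if (fd < 0)
     return -1;
   int ok = (write(fd, val, strlen(val)) == (ssize_t)strlen(val));
   close(fd);
   return ok ? 0 : -1;
 }
 
 /* Drive the (hypothetical) Hokuyo power-switch pin high or low. */
 int hokuyo_power(int on)
 {
   sysfs_write("/sys/class/gpio/export", "146");   /* fails harmlessly if already exported */
   if (sysfs_write("/sys/class/gpio/gpio146/direction", "out") < 0)
     return -1;
   return sysfs_write("/sys/class/gpio/gpio146/value", on ? "1" : "0");
 }
Something like <tt>hokuyo_power(0)</tt> could then be called before landing or on low battery to protect the sensor.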


= Architecture =
#Communication
#*autopilot/groundstation - standard paparazzi Datalink/Telemetry serial modem-based com
#*gumstix/ground - wifi for debugging
#*UAV to UGV com : 115.2 kbps XBee Pro link (data reliability testing needed; see the framing sketch below)
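A transparent-mode XBee link is just a raw (and lossy) byte pipe, so the map data will need its own framing. A possible sketch, with sync bytes, length field and checksum chosen arbitrarily for illustration (not a decided protocol):
 /* Sketch of a possible UAV-to-UGV frame over the 115.2 kbps XBee link.
  * A transparent-mode XBee gives us a raw byte stream, so each frame gets
  * its own sync bytes, length and checksum.  All values are placeholders. */
 #include <stddef.h>
 #include <stdint.h>
 
 #define FRAME_SYNC0 0xA5
 #define FRAME_SYNC1 0x5A
 
 struct map_frame_header {
   uint8_t  sync0, sync1;  /* FRAME_SYNC0, FRAME_SYNC1                  */
   uint8_t  msg_id;        /* e.g. 0x01 = block of traversability cells */
   uint16_t payload_len;   /* number of payload bytes that follow       */
 } __attribute__((packed));
 
 /* Simple 8-bit additive checksum, appended after the payload. */
 static uint8_t frame_checksum(const uint8_t *buf, size_t len)
 {
   uint8_t sum = 0;
   for (size_t i = 0; i < len; i++)
     sum += buf[i];
   return sum;
 }
On the receiving side the UGV would resynchronize on the sync bytes and drop any frame whose checksum does not match, which is the kind of behavior the planned data-reliability testing should exercise.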


== Hardware ==


===Airborne===
*Airframe : Multiplex [[LAAS Mentor]]
[[Image:payload_hw_block_diagram.png|thumb|hw block diagram]]
[[Image:bikesciencepackage.jpg|thumb|sensors mounted on bike for initial testing]]
*Autopilot : [http://paparazzi.enac.fr paparazzi]
*Hokuyo : [http://www.hokuyo-aut.jp/02sensor/07scanner/download/index.html#top_urg_section UTM-30LX]


===Ground===
*Nothing special: laptops, modems, and antennae


== Software ==
Version control : [https://github.com/paulcox/laserhawk github]


===Airborne===
*Overo
**OS: OpenEmbedded-built Linux (use the RT-Linux patches?)
**Drivers: Hokuyo/Xsens drivers ([http://homepages.laas.fr/bvandepo/wiki/doku.php?id=gumstixd Bertrand's notes]; Overo [https://github.com/paulcox/laserhawk/tree/master/ipks ipkg files] built on borderouge)
**apps:
***mainloop for sensor data processing (a sketch follows this list)
***coordination with ground
***coordination with autopilot
*Autopilot
**paparazzi LPC2148-based firmware
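A minimal sketch of what the Overo acquisition mainloop could look like, blocking in select() on the two serial descriptors; the handler bodies are placeholders only:
 /* Sketch of the Overo acquisition mainloop: block in select() on the
  * Hokuyo and MTi serial descriptors and hand new bytes to per-sensor
  * handlers.  Handler bodies are placeholders for illustration. */
 #include <sys/select.h>
 #include <sys/types.h>
 #include <unistd.h>
 
 static void handle_mti(int fd)
 {
   unsigned char buf[512];
   ssize_t n = read(fd, buf, sizeof buf);
   (void)n;  /* placeholder: feed bytes to the MTi attitude/GPS parser */
 }
 
 static void handle_hokuyo(int fd)
 {
   unsigned char buf[4096];
   ssize_t n = read(fd, buf, sizeof buf);
   (void)n;  /* placeholder: feed bytes to the scan parser, timestamp the scan */
 }
 
 void mainloop(int hokuyo_fd, int mti_fd)
 {
   for (;;) {
     fd_set rfds;
     FD_ZERO(&rfds);
     FD_SET(hokuyo_fd, &rfds);
     FD_SET(mti_fd, &rfds);
     int maxfd = (hokuyo_fd > mti_fd ? hokuyo_fd : mti_fd) + 1;
 
     if (select(maxfd, &rfds, NULL, NULL, NULL) <= 0)
       continue;  /* interrupted or nothing ready: retry */
 
     if (FD_ISSET(mti_fd, &rfds))    handle_mti(mti_fd);      /* attitude first */
     if (FD_ISSET(hokuyo_fd, &rfds)) handle_hokuyo(hokuyo_fd);
   }
 }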

===Ground===
*Paparazzi ground package
*gdhe for data visualization (polyline representations of laser scan lines)
Git projects:
# hoku2gdhe - Demo app that uses the Hokuyo and Xsens MTi sensors and generates polylines in GDHE [[Image:hoku2gdhe.png|thumb|hoku2gdhe]]
# hokuyomti - App for acquiring Hokuyo data and storing it to a file (run in parallel with MTIHardTest)
# log2gdhe - App for visualising a stored log (real or virtual) in gdhe
# log2gdhe/mkvirtlog - Scripts for generating scan/attitude logs from a virtual terrain model [[Image:mkvirtsample.png|thumb|mkvirtlog image output. <p> View the [http://paparazzi.enac.fr/wiki_images/Terrain.gif animation] </p> gdhe [http://paparazzi.enac.fr/wiki_images/Virtual_surface_gdhe.png screenshot] ]]
# biketest/scripts - Scripts for generating images/animations from stored Hokuyo/MTi logs
[[Image:plotlogsample.png|thumb|Plotlogs image output. View the [http://paparazzi.enac.fr/wiki_images/Loganim0_50_1.gif animation] ]]
More info in the git [https://github.com/paulcox/laserhawk/blob/master/biketest/README README].
: The goal is to fuse all of these capabilities into one package that allows:
# displaying real or virtual data
# live data, recording, and/or playback
# displaying raw data in 2D plots (scan data along the scan plane, attitude, position, altitude, etc.)
# rectifying scan data using attitude and sending 3D points to gdhe or storing them (where? what format?) - a sketch follows this list
# later on, treating the data to generate a terrain model / traversability map
# not all of this needs to run on the Overo - just the acquisition and (at least part of) the data treatment
# more?
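A sketch of the rectification step (item 4), assuming for illustration only that the sensor axes coincide with the body axes and using standard ZYX Euler angles; the real code will also need the sensor mounting offsets:
 /* Sketch: rectify one laser return into a local Cartesian frame using the
  * MTi attitude (roll, pitch, yaw) and the UAV position.  Assumes, for
  * illustration, that the scan plane is the sensor x-y plane and that the
  * sensor axes are aligned with the body axes. */
 #include <math.h>
 
 struct vec3 { double x, y, z; };
 
 /* range [m] and beam angle [rad] in the scan plane -> point in body frame */
 static struct vec3 scan_to_body(double range, double beam_angle)
 {
   struct vec3 p = { range * cos(beam_angle), range * sin(beam_angle), 0.0 };
   return p;
 }
 
 /* Rotate a body-frame point into the local NED frame (ZYX Euler angles)
  * and translate by the UAV position. */
 static struct vec3 body_to_ned(struct vec3 p, double roll, double pitch,
                                double yaw, struct vec3 uav_pos)
 {
   double cr = cos(roll),  sr = sin(roll);
   double cp = cos(pitch), sp = sin(pitch);
   double cy = cos(yaw),   sy = sin(yaw);
   struct vec3 q;
   q.x = uav_pos.x + (cy*cp)*p.x + (cy*sp*sr - sy*cr)*p.y + (cy*sp*cr + sy*sr)*p.z;
   q.y = uav_pos.y + (sy*cp)*p.x + (sy*sp*sr + cy*cr)*p.y + (sy*sp*cr - cy*sr)*p.z;
   q.z = uav_pos.z + (-sp)*p.x   + (cp*sr)*p.y            + (cp*cr)*p.z;
   return q;
 }
The same routine works for both live and logged data, which is what should let the real and virtual pipelines share one code path.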


== Geometry ==
{| border="1" cellpadding="5" cellspacing="0" align="center"
!colspan=2|Laserhawk geometry
|-
|bgcolor=#aaaaff|<center>Overview</center>||bgcolor=#aaaaff|<center>Detail</center>
|-
|[[Image:geometry1.png|250px|center|scan line relative to heading and track]]
|[[Image:geometry2.png|250px|center|Elevation and bird's eye view of scan area]]
|-
|<center>Figure 1</center>
|<center>Figure 2</center>
|}
 
 
* nominal UAV flight velocity : 20-30 m/s
* nominal UAV flight height AGL : 30 m
* Lidar sensor resolution : 1080 points over the 270&deg; visible sector (1440 points over 360&deg;) at 40 Hz
 
* ground distance covered during one revolution of the scanner:
: <center><math>Dist_{per\_scan\_rev} = ground\_speed \times time_{per\_scan\_rev} = 20~\frac{m}{s} \times \frac{1}{40}~s = 0.5~m</math></center>
 
* For 90&deg; interest zone :
:* scan line advances down ground track :
: <center><math> Dist_{x}=  \frac{90}{360} \times Dist_{per\_scan\_rev} = \frac{1}{4} \times 0.5~m = 12.5~cm</math></center>
:* scan line extent along the sensor rotation (for a 90&deg; scan centered on nadir, this is twice the AGL height) :
: <center><math> Dist_{y}=  2 \times AGL = 2 \times 30~m = 60~m</math></center>
:* Resolution :
: <center><math>  \frac{ \frac{90}{360} \times 1440~pixels }{scan\_length} = \frac{360~pixels}{\sqrt{{Dist_x}^2+{Dist_y}^2}} \approx \frac{360~pixels}{Dist_y}= 6~ \frac{pixels}{m}</math>, i.e. roughly 17 cm between pixels</center>
:* Angle relative to track :
: <center><math> Angle_{scan\_to\_track} = \tan^{-1} \frac{Dist_x}{Dist_y} = \tan^{-1} \frac{0.125}{60} = 0.119^\circ</math> (negligible relative to crab angle)</center>
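A small self-contained check of the numbers above (same assumptions: 20 m/s ground speed, 40 Hz scanner, 30 m AGL, 90&deg; swath out of 1440 points per revolution):
 /* Quick check of the geometry numbers above (20 m/s ground speed, 40 Hz
  * scanner, 30 m AGL, 90-degree swath out of 1440 points per revolution). */
 #include <math.h>
 #include <stdio.h>
 
 int main(void)
 {
   const double ground_speed = 20.0;    /* m/s   */
   const double scan_rate    = 40.0;    /* rev/s */
   const double agl          = 30.0;    /* m     */
   const double swath_deg    = 90.0;
   const double pts_per_rev  = 1440.0;
 
   double dist_per_rev = ground_speed / scan_rate;        /* 0.5 m    */
   double dist_x  = (swath_deg / 360.0) * dist_per_rev;   /* 0.125 m  */
   double dist_y  = 2.0 * agl;                            /* 60 m     */
   double pts     = (swath_deg / 360.0) * pts_per_rev;    /* 360      */
   double spacing = hypot(dist_x, dist_y) / pts;          /* ~0.167 m */
   double angle   = atan2(dist_x, dist_y) * 180.0 / M_PI; /* ~0.12 deg */
 
   printf("along-track advance per swath: %.3f m\n", dist_x);
   printf("cross-track swath width:       %.1f m\n", dist_y);
   printf("point spacing along swath:     %.3f m\n", spacing);
   printf("scan-to-track angle:           %.3f deg\n", angle);
   return 0;
 }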
 
= Flight Logs =
Available on git repo : https://github.com/paulcox/laserhawk/tree/master/mentor
 
#May 11 - Manual flights; the propeller was small, so not much thrust margin. IR and servo config tuning.
#May 17 - Manual and Auto1/Auto2 flights with a bigger prop and an 80 A ESC. Used all three 5 Ah batteries, progressively increasing payload weight.
#May 19 - Airframe configured for carrot=3 and a reduced throttle increment. High winds during the flight. Lost the canopy in flight.
 
 
 
=Scratchpad=

MTi command-line application, for dumping attitude/GPS from the MTi-G to a file (PC and Overo):
 MTIHardTest /dev/ttyUSB0 -o 2 -d 6 -v >> MTIG.out

Record laser scans to disk (PC and Overo):
 hokuyomti /dev/ttyACM0

Scan and display in gdhe (PC only for now):
 hoku2gdhe

= Risks =  


{| class="wikitable" style="text-align:center;background:black; color:blue"
{| class="wikitable" style="text-align:center;background:black; color:blue"
|+ Risks and unanswered questions
! Num !! Name !! Notes !! Priority !! Status
|- style="background:lemonchiffon; color:black"
|- style="background:lemonchiffon; color:black"
| 1 || Lidar performance || At 30 m flight height, over low-reflectance surfaces (asphalt roads, grass, dirt) || x || Sunny-day testing against house exterior walls shows we can measure out to at least 16 m.
|- style="background:bisque; color:black"
|- style="background:bisque; color:black"
| 2 || Crashing the Lidar || Will build a box for protection || x || Done
|- style="background:bisque; color:black"
|- style="background:bisque; color:black"
| 3 || Radio interference at Elrob || Need robust behaviors when the com link is lost || x || x
|- style="background:bisque; color:black"
| x ||x || x || x || x
|}
[[Category:User_Projects]]
