Mission computer


Copilot: a Mission Computer

Copilot is a short name for a mission computer that can be used with Paparazzi. A mission computer is an additional computer flown on the airplane. The autopilot (also called the flight computer) takes care of maintaining stable flight, running control loops and sensor fusion algorithms, and communicating with the GCS. The autopilot typically runs an RTOS or bare metal on a small microcontroller (MCU), such as Lisa/M or Pixhawk, and has limited memory and computational power.

The mission computer, on the other hand, is typically a more powerful computer running Linux and with more memory. It takes care of more complex tasks, such as data logging (if the autopilot doesn't log data) and system and mission monitoring, and it can also change the flight plan and command the autopilot to fly somewhere else. The mission computer also has additional interfaces, such as Ethernet, so it can create a local network on the airplane. An example of mission computer hardware is the Gumstix Overo.

A payload computer carries a payload, such as a camera array, and is the most powerful of the three computers. An example of a payload computer is the Intel NUC.

The payload computer communicates with the mission computer using pprzlink messages over an Ivy bus on the local network. The mission computer is connected to the autopilot over a serial link, also using pprzlink. The autopilot then communicates with the GCS over a wireless telemetry link, again using pprzlink.
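
If you want to see what is actually flowing over any of these Ivy buses, you can listen with ivyprobe; the broadcast address below is only an example for the on-board network, substitute your own:

ivyprobe '(.*)' -b 10.0.0.255:2010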


System diagram showing the autopilot, mission computer and payload computer

So what is Copilot?

The system described above is very general and can indeed be modified as needed. It is hard to make a one-size-fits-all system, but it is good to have a basic template that can be used and extended as needed. And that is Copilot. It is loosely based on ISaAC, the Intelligent Safety and Airworthiness Co-Pilot module described in the paper "A Payload Verification and Management Framework for Small UAV-based Personal Remote Sensing Systems" by Cal Coopmans and Chris Coffin. Think of Copilot as a shortcut for the mission computer and the software that enables it.

Copilot consists of three parts:

  • an autopilot module that sends and receives messages to/from the mission computer
  • a sample Copilot application running on the mission computer itself, logging data and forwarding messages
  • a sample payload application running on the payload computer that logs messages and can request a change of a waypoint

Below we describe how to use Copilot and how to tweak it to your needs.

Starting Copilot

Since Copilot has three parts, we describe them in order. Note that you can use Copilot either in simulation on your localhost or deployed on real hardware (and optionally use HITL).

Paparazzi Autopilot

Get the latest version of Paparazzi, as described in Installation. Then select the Minion Lia airframe. You can use any other airframe too; just make sure it loads the copilot module.
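
As a rough sketch only (the firmware section and syntax depend on your airframe and Paparazzi version; check the Minion Lia airframe file for a working example), loading the module in an airframe file looks roughly like this:

    <!-- hypothetical airframe excerpt: load the copilot module -->
    <firmware name="fixedwing">
      <module name="copilot"/>
    </firmware>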

Simulation

Select the NPS target, build, and then hit Simulate.

Flight

Select the AP target, build and upload. Then start your USB-Serial@57600 session.

Now you have the autopilot running, and you should be able to control it from the GCS, send commands, etc.


Copilot: Mission Computer

We assume that the mission computer is running Linux of some sort and has the Paparazzi packages installed. In the case of embedded hardware, check Installation/BeagleBoneBlack or GettingTheGCSRunningonAGumstixBoard.

We use the ivy-cpp package, but if you prefer to write your own application in pure C, that is also possible (a minimal pure-C sketch is shown in the payload computer section below).

git clone git@github.com:podhrmic/ivy-cpp.git
cd ivy-cpp
make; sudo make install;

Then we need to copy messages.xml, messages.dtd and units.xml into /usr/share/pprzlink. Do:

cd ~/paparazzi/sw/ext/pprzlink/message_definitions
sudo mkdir /usr/share/pprzlink
sudo cp common/units.xml /usr/share/pprzlink/.
sudo cp v1.0/messages.dtd /usr/share/pprzlink/.
sudo ln -s `pwd`/v1.0/messages.xml /usr/share/pprzlink

(We want just a symlink to messages.xml because it might change in the future with new commits in your paparazzi directory.)

Simulation

All you need to do for simulation is execute the copilot binary on the same machine that is running your Paparazzi simulation. Change the DEBUG flag to 1 and run:

make testCopilot; ./testCopilot

You can specify the Ivy broadcast address (the default is 127.255.255.255) with -b (type -h to get help). You should see something like this:

$ ./testCopilot
Broadcasting on network 127.255.255.255, port 2010

Flight

If you are flying (and are connected to the autopilot via a serial port), we need to start 'link' to decode the serial protocol into Ivy messages and publish them on the bus.

# in paparazzi root dir
cd sw/ground_segment/tmtc
sudo ./link -d /dev/ttyUSB0 -s 921600 -b 10.0.0.255

where you substitute the serial device and the Ivy bus address accordingly.

Then compile and launch testCopilot (make sure the DEBUG flag is set to 0):

make testCopilot; ./testCopilot -b 127.255.255.255:2010


Payload Computer: camera control software

We are using a simple test program called testAggieCap (named after the AggieCap software used on http://aggieair.usu.edu/ airplanes). This program binds to messages of interest (such as GPS, ATTITUDE, etc.), periodically sends a status update about its (here virtual) cameras, and also requests a change of the PAYLOAD waypoint (so it can steer the airplane to go where the payload needs it to).
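
As an illustration only, here is a minimal sketch of what such a payload-side client could look like in plain C against the Ivy C library (libivy) instead of ivy-cpp. The program name, the ATTITUDE regular expression and the field order are assumptions; check your messages.xml and the testAggieCap sources for the real message formats.

/* payload_sketch.c - minimal payload-side Ivy client sketch (plain C, libivy) */
/* build (adjust to your setup), e.g.: gcc payload_sketch.c -o payload_sketch -livy -lpcre */
#include <stdio.h>
#include <Ivy/ivy.h>
#include <Ivy/ivyloop.h>

/* called for every ATTITUDE message seen on the bus;
   argv holds the captured groups (field order per your messages.xml) */
static void on_attitude(IvyClientPtr app, void *user_data, int argc, char *argv[])
{
  if (argc >= 3)
    printf("ATTITUDE: phi=%s psi=%s theta=%s\n", argv[0], argv[1], argv[2]);
  /* a waypoint change request would be sent with IvySendMsg(), using the
     appropriate datalink message from messages.xml */
}

int main(int argc, char *argv[])
{
  const char *bus = (argc > 1) ? argv[1] : "127.255.255.255:2010";

  IvyInit("payloadSketch", "payloadSketch READY", NULL, NULL, NULL, NULL);
  /* bind to ATTITUDE from any sender id and capture its three fields */
  IvyBindMsg(on_attitude, NULL, "^[^ ]* ATTITUDE ([^ ]*) ([^ ]*) ([^ ]*)");
  IvyStart(bus);
  IvyMainLoop();   /* blocks; message callbacks fire from here */
  return 0;
}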

The installation is similar to the mission computer. After you install Paparazzi, clone the ivy-cpp repo and run make; sudo make install;

Simulation

Change the DEBUG flag to 1 and

make testAggieCap
./testAggieCap

Flight

Change the DEBUG flag to 0 and

make testAggieCap
./testAggieCap

I recommend launching ivyprobe '(.*)' so you see what is going on. You should see something like this:

Payload computer running ivyprobe and testAggieCap


Putting it together

When you have all three parts of Copilot running (it doesn't matter whether it is a simulation or a real flight), you should see something like this on your GCS:

GCS showing payload and camera status
Messages showing relayed messages from the Payload computer


Customizing Copilot

Since Copilot is not meant to be a finished, full-featured system, but rather an architecture to build on, it is easy (and often necessary) to modify it for a particular purpose. The first and most important thing is probably changing which messages are being sent.

Pprzlink messages

If you want to send more messages from the autopilot to the mission computer, just add them to your telemetry file under the extra tag. An example of such a file is here.
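
As a sketch only (the process name, message choice and periods below are assumptions; compare with the linked example file), the forwarded messages are listed roughly like this:

    <!-- hypothetical telemetry excerpt: messages forwarded to the mission computer -->
    <process name="Extra">
      <mode name="default">
        <message name="ATTITUDE" period="0.5"/>
        <message name="GPS"      period="0.25"/>
      </mode>
    </process>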

If you want to send more messages from the mission computer to the autopilot, add the appropriate message handlers to the copilot.xml module.

If you want to add new messages to pprzlink, you can just edit your messages.xml file. If you want to forward new messages from the payload/mission computer to the autopilot, you have to define them in the 'datalink' class; see https://github.com/paparazzi/pprzlink/blob/master/message_definitions/v1.0/messages.xml#L1943.
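
For example, a new uplink message definition could look roughly like this (the name, id and fields below are made up for illustration; pick an id that does not clash with any existing datalink message):

    <!-- hypothetical message, added inside <msg_class name="datalink"> in messages.xml -->
    <message name="PAYLOAD_COMMAND" id="156">
      <field name="ac_id"   type="uint8"/>
      <field name="command" type="uint8"/>
    </message>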

Expanding functionality

If you want to expand functionality of testCopilot or testAggieCap you can just add your own callbacks and/or integrate the examples into your own application.

If you have some cool code you would like to share, don't hesitate to make a pull request :)


Mission Computer Use cases

Mission Planner

The mission planner is a process that runs on the mission computer and dynamically changes the mission as needed. It utilizes the Mission module, which has to be running on the autopilot. Mission control is handed over to the mission planner when mission_run() is executed from the flight plan.

For example, a flight plan block called Mission is created as follows:

    <block name="Mission" strip_button="Mission">
      <call fun="mission_run()"/>
      <deroute block="Standby"/>
    </block>

Use cases

Vision based navigation

Mission planning with machine vision

Assume we have a UAV with a camera connected to the mission computer, an image processing algorithm that detects targets in the acquired images, and a Mission Planner process that patrols a specified area and, if a target is detected in the image, guides the UAV to follow the target (using GOTO_WP messages). We also have an event logger to log the communication between processes. All processes communicate over the Ivy bus. A good practical example would be wildlife monitoring and tracking of poachers.

A vision-based mission can also involve autonomous area surveillance (for example after a natural disaster); the general framework would be the same.

Payload driven mission

A conceptually similar configuration occurs when there is a dedicated payload computer with a complex sensor (e.g. a camera array, a high-resolution lidar, etc.). The payload is connected (typically over Ethernet) to the Ivy bus and can issue messages for the mission computer (similar to the previous use case). A simple example:

  1. The payload requests a survey over area A
  2. The Mission Planner checks whether the area is in the allowed airspace (and no other aircraft are nearby)
  3. The Mission Planner checks whether the area can be surveyed with the current resources (remaining battery, flight speed, ...)
  4. The Mission Planner decides whether to perform the survey, and informs both the flight computer (GOTO_WP) and the payload computer
  5. (Optional) The Mission Planner can decide to survey only a part of the area

Obstacle avoidance sensors

Mission planning with Obstacle avoidance modules

Either of the previous cases can be augmented with on-board obstacle avoidance sensors, for example an ADS-B module. ADS-B receives position information about other aircraft and broadcasts its own position. Another obstacle avoidance sensor can be a short-range lidar or an ultrasonic sensor array. In general, we can aggregate the sensors in an Obstacle Avoidance process.

Other scenarios

Indeed, this list is non-exhaustive and a number of other scenarios (or combinations of the use cases above) are possible. If you are a curious reader, feel free to expand this list :-)