
Commit e8859a7

crowdin-bot authored and hamishwillee committed
New Crowdin translations - uk
1 parent 78e70d8 commit e8859a7


52 files changed: +1085 -311 lines

docs/uk/SUMMARY.md

Lines changed: 11 additions & 1 deletion
@@ -542,6 +542,7 @@
 - [Airspeed](msg_docs/Airspeed.md)
 - [AirspeedWind](msg_docs/AirspeedWind.md)
 - [AutotuneAttitudeControlStatus](msg_docs/AutotuneAttitudeControlStatus.md)
+- [BatteryInfo](msg_docs/BatteryInfo.md)
 - [ButtonEvent](msg_docs/ButtonEvent.md)
 - [CameraCapture](msg_docs/CameraCapture.md)
 - [CameraStatus](msg_docs/CameraStatus.md)
@@ -560,6 +561,7 @@
 - [DifferentialPressure](msg_docs/DifferentialPressure.md)
 - [DistanceSensor](msg_docs/DistanceSensor.md)
 - [DistanceSensorModeChangeRequest](msg_docs/DistanceSensorModeChangeRequest.md)
+- [DronecanNodeStatus](msg_docs/DronecanNodeStatus.md)
 - [Ekf2Timestamps](msg_docs/Ekf2Timestamps.md)
 - [EscReport](msg_docs/EscReport.md)
 - [EscStatus](msg_docs/EscStatus.md)
@@ -732,6 +734,10 @@
 - [Wind](msg_docs/Wind.md)
 - [YawEstimatorStatus](msg_docs/YawEstimatorStatus.md)
 - [AirspeedValidatedV0](msg_docs/AirspeedValidatedV0.md)
+- [ArmingCheckReplyV0](msg_docs/ArmingCheckReplyV0.md)
+- [BatteryStatusV0](msg_docs/BatteryStatusV0.md)
+- [EventV0](msg_docs/EventV0.md)
+- [HomePositionV0](msg_docs/HomePositionV0.md)
 - [VehicleAttitudeSetpointV0](msg_docs/VehicleAttitudeSetpointV0.md)
 - [VehicleStatusV0](msg_docs/VehicleStatusV0.md)
 - [MAVLink Messaging](mavlink/index.md)
@@ -796,6 +802,9 @@
 - [Camera Integration/Architecture](camera/camera_architecture.md)
 - [Computer Vision](advanced/computer_vision.md)
 - [Motion Capture (VICON, Optitrack, NOKOV)](tutorials/motion-capture.md)
+- [Neural Networks](advanced/neural_networks.md)
+- [Neural Network Module Utilities](advanced/nn_module_utilities.md)
+- [TensorFlow Lite Micro (TFLM)](advanced/tflm.md)
 - [Installing the Intel RealSense R200 Driver](advanced/realsense_intel_driver.md)
 - [Switching State Estimators](advanced/switching_state_estimators.md)
 - [Out-of-Tree Modules](advanced/out_of_tree_modules.md)
@@ -827,7 +836,8 @@
 - [Test MC_02 - Full Autonomous](test_cards/mc_02_full_autonomous.md)
 - [Test MC_03 - Auto Manual Mix](test_cards/mc_03_auto_manual_mix.md)
 - [Test MC_04 - Failsafe Testing](test_cards/mc_04_failsafe_testing.md)
-- [Test MC_05 - Indoor Flight (Manual Modes)](test_cards/mc_05_indoor_flight_manual_modes.md)
+- [Test MC_05 - Manual Modes (Inside)](test_cards/mc_05_indoor_flight_manual_modes.md)
+- [Test MC_06 - Optical Flow (Inside)](test_cards/mc_06_optical_flow.md)
 - [Unit Tests](test_and_ci/unit_tests.md)
 - [Fuzz Tests](test_and_ci/fuzz_tests.md)
 - [Continuous Integration](test_and_ci/continous_integration.md)

docs/uk/advanced/neural_networks.md

Lines changed: 115 additions & 0 deletions
@@ -0,0 +1,115 @@
# Neural Networks

<Badge type="tip" text="main (planned for: PX4 v1.17)" /> <Badge type="warning" text="Experimental" />

:::warning
This is an experimental module.
Use at your own risk.
:::

The Multicopter Neural Network (NN) module ([mc_nn_control](../modules/modules_controller.md#mc_nn_control)) is an example module that allows you to experiment with using a pre-trained neural network on PX4.
It might be used, for example, to experiment with controllers for non-traditional drone morphologies, computer vision tasks, and so on.

The module integrates a pre-trained neural network based on the [TensorFlow Lite Micro (TFLM)](../advanced/tflm.md) module.
The network is trained for the [X500 V2](../frames_multicopter/holybro_x500v2_pixhawk6c.md) multicopter frame.
While the controller is fairly robust and might work on other platforms, we recommend [Training your own Network](#training-your-own-network) if you use a different vehicle.
Note that after training the network you will need to update and rebuild PX4.

TFLM is a mature inference library intended for use on embedded devices.
It has support for several architectures, so there is a high likelihood that you can build it for the board you want to use.
If not, there are other possible NN frameworks, such as [Eigen](https://eigen.tuxfamily.org/index.php?title=Main_Page) and [Executorch](https://pytorch.org/executorch-overview).

This document explains how you can include the module in your PX4 build, and provides a broad overview of how it works.
The other documents in the section provide more information about the integration, allowing you to replace the NN with a version trained on different data, or even to replace the TFLM library altogether.

If you are looking for more resources to learn about the module, see [A Neural Network Mode for PX4 on Embedded Flight Controllers](https://ntnu-arl.github.io/px4-nns/), a website with links to a YouTube video and a workshop paper (a full master's thesis will be added later).

## Neural Network PX4 Firmware

The module has been tested on a number of configurations, which can be built locally using the following commands:

```sh
make px4_sitl_neural
```

```sh
make px4_fmu-v6c_neural
```

```sh
make mro_pixracerpro_neural
```

You can add the module to other board configurations by modifying their `default.px4board` configuration file to include these lines:

```sh
CONFIG_LIB_TFLM=y
CONFIG_MODULES_MC_NN_CONTROL=y
```

:::tip
The `mc_nn_control` module takes up roughly 50KB, and many `default.px4board` configurations are already close to filling all the flash on their boards.
To make room for the neural control module you can remove the include statements for other modules, such as FW, rover, VTOL, and UUV.
:::
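For example, a board that does not need fixed-wing, VTOL, rover, or UUV support might disable those vehicle-type modules. This is only a sketch: the symbol names below are illustrative (check your board's `default.px4board` for the exact symbols present), and a symbol can also be disabled by simply removing its line:

```sh
# Disable vehicle types that are not needed (symbol names are illustrative)
CONFIG_MODULES_FW_ATT_CONTROL=n
CONFIG_MODULES_FW_POS_CONTROL=n
CONFIG_MODULES_VTOL_ATT_CONTROL=n
CONFIG_MODULES_ROVER_POS_CONTROL=n
CONFIG_MODULES_UUV_ATT_CONTROL=n
```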
## Example Module Overview

The example module replaces the entire controller structure as well as the control allocator, as shown in the diagram below:

![neural_control](../../assets/advanced/neural_control.png)

In the [controller diagram](../flight_stack/controller_diagrams.md) you can see the [uORB message](../middleware/uorb.md) flow.
We hook into this flow by subscribing to messages at particular points, using our neural network to calculate outputs, and then publishing them into the next point in the flow.
We also need to stop the module that publishes the topic being replaced, which is covered in [Neural Network Module: System Integration](nn_module_utilities.md).

### Input

The input can be changed to whatever you want.
Set up the input you want to use during training and then provide the same input in PX4.
In the Neural Control module the input is an array of 15 numbers, consisting of these values in this order:

- [3] Local position error (goal position - current position)
- [6] The first two rows of the 3-dimensional rotation matrix
- [3] Linear velocity
- [3] Angular velocity

All the input values are collected from uORB topics and transformed into the correct representation in the `PopulateInputTensor()` function.
PX4 uses the NED frame representation, while the Aerial Gym Simulator, in which the NN was trained, uses the ENU representation.
Therefore two rotation matrices are created in the function and all the inputs are transformed from the NED representation to the ENU one, as sketched in the example below.

![ENU-NED](../../assets/advanced/ENU-NED.png)

ENU and NED are just rotation representations; the translational difference is only there so both can be seen in the same figure.
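To make the frame handling concrete, here is a minimal sketch of the NED-to-ENU conversion for three of the input blocks. It is illustrative only: the function name and index layout mirror the description above, not the exact `PopulateInputTensor()` implementation.

```cpp
#include <matrix/matrix/math.hpp>

using namespace matrix;

// Fixed NED -> ENU change of frame: swap the x/y axes and negate z.
static const float ned_to_enu_data[9] = {0.f, 1.f, 0.f,
					 1.f, 0.f, 0.f,
					 0.f, 0.f, -1.f};
static const Matrix3f R_NED_TO_ENU{ned_to_enu_data};

void populate_input_sketch(float *input, const Vector3f &pos_err_ned,
			   const Vector3f &vel_ned, const Vector3f &ang_vel_ned)
{
	const Vector3f pos_err_enu = R_NED_TO_ENU * pos_err_ned;
	const Vector3f vel_enu = R_NED_TO_ENU * vel_ned;
	const Vector3f ang_vel_enu = R_NED_TO_ENU * ang_vel_ned;

	// Indices follow the 15-value layout described above:
	// [0..2] position error, [3..8] rotation-matrix rows (omitted here),
	// [9..11] linear velocity, [12..14] angular velocity.
	pos_err_enu.copyTo(&input[0]);
	vel_enu.copyTo(&input[9]);
	ang_vel_enu.copyTo(&input[12]);
}
```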
### Output

The output consists of 4 values: the motor forces, one for each motor.
These are transformed in the `RescaleActions()` function.
This is done because PX4 expects normalized motor commands while the Aerial Gym Simulator uses physical values, so the output from the network needs to be normalized before it can be sent to the motors in PX4.

The commands are published to the [ActuatorMotors](../msg_docs/ActuatorMotors.md) topic.
The publishing is handled in the `PublishOutput(float* command_actions)` function.
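A minimal sketch of what such a publish step can look like, assuming a simple linear rescaling from physical motor force to the normalized `[0, 1]` range (the real `RescaleActions()` uses the thrust-coefficient parameter described below):

```cpp
#include <uORB/Publication.hpp>
#include <uORB/topics/actuator_motors.h>
#include <drivers/drv_hrt.h>
#include <mathlib/mathlib.h>
#include <cmath>

uORB::Publication<actuator_motors_s> _actuator_motors_pub{ORB_ID(actuator_motors)};

void publish_output_sketch(const float *command_actions, float max_force)
{
	actuator_motors_s motors{};
	motors.timestamp = hrt_absolute_time();

	for (int i = 0; i < 4; i++) {
		// Map a physical force in [0, max_force] N to a normalized command in [0, 1].
		motors.control[i] = math::constrain(command_actions[i] / max_force, 0.f, 1.f);
	}

	// Mark the remaining motor slots as unused.
	for (int i = 4; i < actuator_motors_s::NUM_CONTROLS; i++) {
		motors.control[i] = NAN;
	}

	_actuator_motors_pub.publish(motors);
}
```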
:::tip
If the neural control mode is too aggressive or unresponsive, the [MC_NN_THRST_COEF](../advanced_config/parameter_reference.md#MC_NN_THRST_COEF) parameter can be tuned.
Decrease it for more thrust.
:::

## Training your own Network

The network is currently trained for the [X500 V2](../frames_multicopter/holybro_x500v2_pixhawk6c.md).
The controller is somewhat robust, so it could work directly on other platforms, but performing system identification and training a new network is recommended.

Since the Aerial Gym Simulator is open source you can download it and train your own networks, as long as you have access to an NVIDIA GPU.
If you want to train a control network optimized for your platform you can follow the instructions in the [Aerial Gym Documentation](https://ntnu-arl.github.io/aerial_gym_simulator/9_sim2real/).

You should do one system identification flight for this and get an approximate inertia matrix for your platform.
On the `sys-id` flight you need ESC telemetry; you can read more about that in [DSHOT](../peripherals/dshot.md).

Then do the following steps:

- Do a hover flight.
- Read from the logs what RPM is required for the drone to hover.
- Use the weight of each motor, the length of the motor arms, and the total weight of the platform with battery to calculate an approximate inertia matrix for the platform (see the sketch below).
- Insert these values into the Aerial Gym configuration and train your network.
- Convert the network as explained in [TFLM](tflm.md).
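As a rough illustration of the inertia step, the sketch below approximates the diagonal inertia terms by treating each motor as a point mass at the end of its arm. All numbers are made-up example values, and a real estimate should also account for the central body and battery:

```cpp
#include <cmath>
#include <cstdio>

// Point-mass inertia approximation for a symmetric X-configuration quadrotor:
// each motor is a point mass m at arm length r from the center, with the arms
// at 45 degrees to the body axes. Ixx = Iyy = 4*m*(r*sin45)^2, Izz = 4*m*r^2.
int main()
{
	const double motor_mass = 0.060; // kg (example value)
	const double arm_length = 0.25;  // m (example value)

	const double d = arm_length * std::sin(M_PI / 4.0); // motor offset from each axis

	const double ixx = 4.0 * motor_mass * d * d;
	const double izz = 4.0 * motor_mass * arm_length * arm_length;

	std::printf("Ixx = Iyy ~ %.4f kg*m^2, Izz ~ %.4f kg*m^2\n", ixx, izz);
	return 0;
}
```
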
docs/uk/advanced/nn_module_utilities.md

Lines changed: 86 additions & 0 deletions
@@ -0,0 +1,86 @@
# Neural Network Module: System Integration

The neural control module ([mc_nn_control](../modules/modules_controller.md#mc_nn_control)) implements an end-to-end controller utilizing neural networks.

The parts of the module directly concerned with generating the code for the trained neural network and integrating it into the module are covered in [TensorFlow Lite Micro (TFLM)](../advanced/tflm.md).
This page covers the changes that were made to integrate the module into PX4, both within the module and in the larger system configuration.

:::tip
This topic should help you to shape the module to your own needs.

You will need some familiarity with PX4 development.
For more information see the developer [Getting Started](../dev_setup/getting_started.md).
:::

## Autostart

A line to autostart the [mc_nn_control](../modules/modules_controller.md#mc_nn_control) module has been added in the [`ROMFS/px4fmu_common/init.d/rc.mc_apps`](https://github.com/PX4/PX4-Autopilot/blob/main/ROMFS/px4fmu_common/init.d/rc.mc_apps) startup script.

It checks whether the module is included by looking for the parameter [MC_NN_EN](../advanced_config/parameter_reference.md#MC_NN_EN).
If this is set to `1` (the default value), the module will be started when booting PX4.
Similarly you could create other parameters in the [`mc_nn_control_params.c`](https://github.com/PX4/PX4-Autopilot/blob/main/src/modules/mc_nn_control/mc_nn_control_params.c) file for other startup script checks.
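The relevant startup logic looks roughly like the following NuttShell fragment (a sketch, not a verbatim copy of `rc.mc_apps`):

```sh
# Start the neural controller if the enable parameter is set
if param compare MC_NN_EN 1
then
    mc_nn_control start
fi
```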
## Custom Flight Mode

The module creates its own flight mode, "Neural Control", which lets you choose it from the flight mode menu in QGC and bind it to a switch on your RC controller.
This is done by using the [ROS 2 Interface Library](../ros2/px4_ros2_interface_lib.md) internally.
This involves several steps, visualized in the figure below:

:::info
The module does not actually use ROS 2; it just uses the API exposed through uORB topics.
:::

:::info
In some QGC versions the flight mode does not show up, so make sure to update to the newest version.
This only works for some flight controllers, so you might have to use an RC controller to switch to the correct external flight mode.
:::

![neural_mode_registration](../../assets/advanced/neural_mode_registration.png)
1. Publish a [RegisterExtComponentRequest](../msg_docs/RegisterExtComponentRequest.md).
   This specifies what you want to create; you can read more about this in the [Control Interface](../ros2/px4_ros2_control_interface.md).
   In this case we register an arming check and a mode.
2. Wait for a [RegisterExtComponentReply](../msg_docs/RegisterExtComponentReply.md).
   This will give feedback on whether the mode registration was successful, and what the mode and arming check IDs are for the new mode.
3. [Optional] With the mode ID, publish a [VehicleControlMode](../msg_docs/VehicleControlMode.md) message on the `config_control_setpoints` topic.
   Here you can configure what other modules run in parallel.
   The example controller replaces everything, so it turns off allocation.
   If you want to replace other parts you can enable or disable the modules accordingly.
4. [Optional] With the mode ID, publish a [ConfigOverrides](../msg_docs/ConfigOverrides.md) on the `config_overrides_request` topic.
   This is not done in the example module; it lets you defer failsafes or stop the vehicle from automatically disarming.
5. When the mode has been registered, an [ArmingCheckRequest](../msg_docs/ArmingCheckRequest.md) will be sent, asking if your mode has everything it needs to run.
   This message must be answered with an [ArmingCheckReply](../msg_docs/ArmingCheckReply.md) so the mode is not flagged as unresponsive.
   In this response it is possible to set what requirements the mode needs to run, like local position.
   If any of these requirements are set, the commander will stop you from switching to the mode if they are not fulfilled.
   It is also important to set `health_component_index` and `num_events` to 0 (unless you have a health component or events) to avoid a segmentation fault.
6. Listen to the [VehicleStatus](../msg_docs/VehicleStatus.md) topic (see the sketch after this list).
   If the `nav_state` equals the assigned `mode_id`, then the Neural Controller is activated.
7. When active, the module will run the controller and publish to [ActuatorMotors](../msg_docs/ActuatorMotors.md).
   If you want to replace a different part of the controller, you should find the appropriate topic to publish to.

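As a concrete illustration of step 6, a module can detect activation of its external mode with a check along these lines (a sketch using the uORB C++ API; `_mode_id` is assumed to hold the ID returned in the `RegisterExtComponentReply`):

```cpp
#include <uORB/Subscription.hpp>
#include <uORB/topics/vehicle_status.h>

uORB::Subscription _vehicle_status_sub{ORB_ID(vehicle_status)};
uint8_t _mode_id{0}; // assigned from the RegisterExtComponentReply

bool is_neural_mode_active()
{
	vehicle_status_s status;

	if (_vehicle_status_sub.copy(&status)) {
		// The commander reports the active mode in nav_state; external
		// modes are assigned IDs beyond the internal navigation states.
		return status.nav_state == _mode_id;
	}

	return false;
}
```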
To see how the requests are handled you can check out [src/modules/commander/ModeManagement.cpp](https://github.com/PX4/PX4-Autopilot/blob/main/src/modules/commander/ModeManagement.cpp).

## Logging

To add module-specific logging, a new topic has been added to [uORB](../middleware/uorb.md) called [NeuralControl](../msg_docs/NeuralControl.md).
The message definition is also added in `msg/CMakeLists.txt`, and to [`src/modules/logger/logged_topics.cpp`](https://github.com/PX4/PX4-Autopilot/blob/main/src/modules/logger/logged_topics.cpp) under the debug category.
For these messages to be saved in your logs you need to include `debug` in the [SDLOG_PROFILE](../advanced_config/parameter_reference.md#SDLOG_PROFILE) parameter.
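The registration in `logged_topics.cpp` is a one-liner in the debug profile; it looks roughly like this (a sketch modeled on the existing entries in that file, not an exact excerpt):

```cpp
// In src/modules/logger/logged_topics.cpp
void LoggedTopics::add_debug_topics()
{
	// ... existing debug topics ...
	add_topic("neural_control"); // log every published NeuralControl message
}
```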
## Timing

The module has two includes for measuring the inference times.
The first one is a driver that works on the actual flight controller units, while a second one, `chrono`, is used for SITL testing.
Which timing library is included and used is based on whether PX4 is built with NuttX or not.
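The selection can be sketched with a preprocessor switch like the following (illustrative; the module's actual guard and variable names may differ):

```cpp
#if defined(__PX4_NUTTX)
#include <drivers/drv_hrt.h>   // high-resolution timer driver on flight hardware
#else
#include <chrono>              // std::chrono for SITL builds
#endif

void time_inference_sketch()
{
#if defined(__PX4_NUTTX)
	const hrt_abstime start = hrt_absolute_time();
	// ... run inference ...
	const uint64_t elapsed_us = hrt_elapsed_time(&start);
	(void)elapsed_us;
#else
	const auto start = std::chrono::steady_clock::now();
	// ... run inference ...
	const auto elapsed = std::chrono::duration_cast<std::chrono::microseconds>(
		std::chrono::steady_clock::now() - start);
	(void)elapsed;
#endif
}
```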
## Changing the setpoint

The module uses the [TrajectorySetpoint](../msg_docs/TrajectorySetpoint.md) message's position fields to define its target.
To follow a trajectory, you can send updated setpoints.
For an example of how to do this in a PX4 module, see the [mc_nn_testing](https://github.com/SindreMHegre/PX4-Autopilot-public/tree/main/src/modules/mc_nn_testing) module in this fork.
Note that this is not included in upstream PX4.
To use it, copy the module folder from the linked repository into your workspace, and enable it by adding the following line to your `.px4board` file:

```sh
CONFIG_MODULES_MC_NN_TESTING=y
```
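Publishing a new position target from a module can be sketched like this (assuming NED local coordinates; only the position fields are set, with the rest left as NAN so downstream consumers ignore them):

```cpp
#include <uORB/Publication.hpp>
#include <uORB/topics/trajectory_setpoint.h>
#include <drivers/drv_hrt.h>
#include <cmath>

uORB::Publication<trajectory_setpoint_s> _trajectory_setpoint_pub{ORB_ID(trajectory_setpoint)};

void send_position_setpoint_sketch(float north_m, float east_m, float down_m)
{
	trajectory_setpoint_s sp{};
	sp.timestamp = hrt_absolute_time();

	sp.position[0] = north_m;
	sp.position[1] = east_m;
	sp.position[2] = down_m;

	// Leave velocity/acceleration/jerk and yaw unset (NAN) so only the
	// position target is commanded.
	for (int i = 0; i < 3; i++) {
		sp.velocity[i] = NAN;
		sp.acceleration[i] = NAN;
		sp.jerk[i] = NAN;
	}

	sp.yaw = NAN;
	sp.yawspeed = NAN;

	_trajectory_setpoint_pub.publish(sp);
}
```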

docs/uk/advanced/tflm.md

Lines changed: 77 additions & 0 deletions
@@ -0,0 +1,77 @@
# TensorFlow Lite Micro (TFLM)

The PX4 [Multicopter Neural Network](advanced/neural_networks.md) module ([mc_nn_control](../modules/modules_controller.md#mc_nn_control)) integrates a neural network that uses the [TensorFlow Lite Micro (TFLM)](https://github.com/tensorflow/tflite-micro) inference library.

This is a mature inference library intended for use on embedded devices, and is hence a suitable choice for PX4.

This guide explains how the TFLM library is integrated into the [mc_nn_control](../modules/modules_controller.md#mc_nn_control) module, and the changes you would have to make to use it for your own neural network.

:::tip
For more information, see the [TFLM guide](https://ai.google.dev/edge/litert/microcontrollers/get_started).
:::

## TFLM NN Formats

TFLM uses networks in its own [tflite format](https://ai.google.dev/edge/litert/models/convert).
However, since many microcontrollers do not have native filesystem support, a tflite file can be converted to a C++ source and header file.

This is what is done in `mc_nn_control`.
The tflite neural network is represented in code by the files [`control_net.cpp`](https://github.com/PX4/PX4-Autopilot/blob/main/src/modules/mc_nn_control/control_net.cpp) and [`control_net.hpp`](https://github.com/PX4/PX4-Autopilot/blob/main/src/modules/mc_nn_control/control_net.hpp).

### Getting a Network in tflite Format

There are many online resources for generating networks in the `.tflite` format.

For this example we trained the network in the open-source [Aerial Gym Simulator](https://ntnu-arl.github.io/aerial_gym_simulator/).
Aerial Gym includes a guide, and supports RL for both control and vision-based navigation tasks.

The project includes conversion code for `PyTorch -> TFLM` in the [resources/conversion](https://github.com/ntnu-arl/aerial_gym_simulator/tree/main/resources/conversion) folder.

### Updating `mc_nn_control` with your own NN

You can convert a `.tflite` network into a `.cc` file in the Ubuntu terminal with this command:

```sh
xxd -i converted_model.tflite > model_data.cc
```
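The generated file contains the model as a byte array plus its length. It looks roughly like this (contents abbreviated and the length is an example value; the array name is derived from the input filename):

```cpp
unsigned char converted_model_tflite[] = {
  0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33, /* ... "TFL3" header ... */
};
unsigned int converted_model_tflite_len = 10240;
```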
You will then have to modify `control_net.hpp` and `control_net.cpp` to include the data from `model_data.cc`:

- Take the size of the network from the bottom of the `.cc` file and replace the size in `control_net.hpp`.
- Take the data in the model array in the `.cc` file, and replace the corresponding array contents in `control_net.cpp`.

You are now ready to run your own network.

## Code Explanation

This section explains the code used to integrate the NN in `control_net.cpp`.

### Operations and Resolver

First we need to create the resolver and load the operators needed to run inference on the NN.
This is done at the top of `mc_nn_control.cpp`.
The number in `MicroMutableOpResolver<3>` represents how many operations you need to run the inference.

A full list of the operators can be found in the [micro_mutable_op_resolver.h](https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/micro_mutable_op_resolver.h) file.
There are quite a few supported operators, but you will not find the most advanced ones.
In the control example the network is fully connected, so we use `AddFullyConnected()`.
The activation function is ReLU, and we use `AddAdd()` for the bias on each neuron.
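Put together, the resolver setup looks roughly like this (a sketch; the three registered operators match the fully connected + ReLU + add structure described above):

```cpp
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"

// Three operators are registered, matching the template argument <3>.
static tflite::MicroMutableOpResolver<3> build_resolver_sketch()
{
	tflite::MicroMutableOpResolver<3> resolver;
	resolver.AddFullyConnected(); // dense layers
	resolver.AddRelu();           // activation
	resolver.AddAdd();            // bias addition
	return resolver;
}
```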
### Interpreter

In `InitializeNetwork()` we start by setting up the model that we loaded from the source and header file.
Next the interpreter is set up; this code is taken from the TFLM documentation and is thoroughly explained there.
The end state is that the `_control_interpreter` is set up to later run inference with the `Invoke()` member function.
The `_input_tensor` is also defined; it is fetched from `_control_interpreter->input(0)`.
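The setup follows the standard TFLM pattern and can be sketched as follows (the arena size is an illustrative value, and `control_net` is assumed to be the model byte array from `control_net.cpp`):

```cpp
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

#include "control_net.hpp" // the model byte array (name assumed)

constexpr int kArenaSize = 16 * 1024; // illustrative working-memory size
static uint8_t tensor_arena[kArenaSize];
static tflite::MicroMutableOpResolver<3> resolver;

static tflite::MicroInterpreter *interpreter = nullptr;
static TfLiteTensor *input_tensor = nullptr;

bool initialize_network_sketch()
{
	resolver.AddFullyConnected();
	resolver.AddRelu();
	resolver.AddAdd();

	const tflite::Model *model = tflite::GetModel(control_net);

	static tflite::MicroInterpreter static_interpreter(model, resolver, tensor_arena, kArenaSize);
	interpreter = &static_interpreter;

	// Allocate tensors from the arena; this must succeed before inference.
	if (interpreter->AllocateTensors() != kTfLiteOk) {
		return false;
	}

	input_tensor = interpreter->input(0);
	return true;
}
```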
### Inputs

The `_input_tensor` is filled in the `PopulateInputTensor()` function.
This works by accessing the tensor's `->data.f` member array and filling in the required inputs for your network.
The inputs used in the control network are covered in [Neural Networks](../advanced/neural_networks.md).
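Filling the tensor reduces to writing the float array directly (a sketch, with `inputs` assumed to hold the 15 values described in the Neural Networks page):

```cpp
void populate_input_tensor_sketch(TfLiteTensor *input_tensor, const float inputs[15])
{
	// data.f points at the tensor's float storage.
	for (int i = 0; i < 15; i++) {
		input_tensor->data.f[i] = inputs[i];
	}
}
```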
### Outputs

For the outputs the approach is fairly similar to the inputs.
After setting the correct inputs and calling the `Invoke()` function, the outputs can be found by getting `_control_interpreter->output(0)`.
From the output tensor you read the `->data.f` array.
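End to end, an inference pass can be sketched as (continuing the interpreter sketch above; four outputs are assumed, matching the four motor forces):

```cpp
bool run_inference_sketch(float motor_commands[4])
{
	if (interpreter->Invoke() != kTfLiteOk) {
		return false;
	}

	const TfLiteTensor *output_tensor = interpreter->output(0);

	// Four motor-force outputs, read from the tensor's float storage.
	for (int i = 0; i < 4; i++) {
		motor_commands[i] = output_tensor->data.f[i];
	}

	return true;
}
```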
