diff --git a/docs/_config.yml b/docs/_config.yml
index 0e073a3..b5ae306 100644
--- a/docs/_config.yml
+++ b/docs/_config.yml
@@ -15,9 +15,9 @@
 # in the templates via {{ site.myvariable }}.
 title: ROBOTICS LAB URJC
-email: poveslara@gmail.com
-description: "Bitácora para mi trabajo de fin de grado"
-github_username: larapoves
+email: b.villalba.2019@alumnos.urjc.es
+description: "Logbook for my university degree final project"
+github_username: bb6698
 minimal_mistakes_skin: dark
 search: true
 logo: ./logo.png
@@ -27,9 +27,9 @@ markdown: kramdown
 remote_theme: mmistakes/minimal-mistakes
 # Outputting
 permalink: /:categories/:title/
-paginate: 1 # amount of posts to show
+paginate: 5 # amount of posts to show
 paginate_path: /page:num/
-timezone: Europe/Madrid
+timezone: # https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
 include:
   - _pages
@@ -56,25 +56,19 @@ plugins:
   - jekyll-include-cache
 author:
-  name : "Lara Poves Martínez"
+  name : "Barbara Villalba Herreros"
   avatar : "/assets/images/bio-photo.jpeg"
-  bio : "Ingeniería robótica software en la URJC"
+  bio : "Robotics software engineering student at URJC"
   links:
     - label: "GitHub"
       icon: "fab fa-fw fa-github"
-      url: "https://github.com/larapoves"
-    - label: "LinkedIn"
-      icon: "fab fa-fw fa-linkedin"
-      url: "https://www.linkedin.com/in/lara-poves-mart%C3%ADnez-147ba62a7/"
-
+      url: "https://github.com/bb6698"
 footer:
   links:
-    - label: "LinkedIn"
-      icon: "fab fa-fw fa-linkedin"
-      url: "https://www.linkedin.com/in/lara-poves-mart%C3%ADnez-147ba62a7/"
+    - label: "GitHub"
       icon: "fab fa-fw fa-github"
-      url: "https://github.com/larapoves"
+      url: "https://github.com/bb6698"
 defaults:
   # _posts
diff --git a/docs/_pages/about.md b/docs/_pages/about.md
index 72997aa..9d55a87 100644
--- a/docs/_pages/about.md
+++ b/docs/_pages/about.md
@@ -3,4 +3,9 @@
 permalink: /about me/
 title: "About me"
 ---
-Hola, soy Lara y estoy terminando el grado de ingeniería robótica software en la Universidad Rey Juan Carlos.
\ No newline at end of file
+Hello, my name is Bárbara and I am currently finishing my degree in Robotics Engineering at URJC.
+
+I am passionate about programming, about everything related to robots, and about helping people through robotics.
+I am currently deepening my knowledge of ROS, ROS 2, C++, Python and C.
+
+I am a hard-working, persistent person who rarely gives up. I love taking on challenges and overcoming them, and I always try to keep moving forward so that I can be a competent engineer in the future.
diff --git a/docs/_posts/2022-03-20-Getting started-1.md b/docs/_posts/2022-03-20-Getting started-1.md
new file mode 100644
index 0000000..b622a78
--- /dev/null
+++ b/docs/_posts/2022-03-20-Getting started-1.md
@@ -0,0 +1,33 @@
+---
+title: "Week 1-10. Getting started"
+last_modified_at: 2023-03-20T19:43:00
+categories:
+  - Blog
+tags:
+  - ROS Noetic
+  - PX4
+  - Mavros
+  - Mavlink
+  - Gazebo
+  - openCV
+  - PyQt5
+---
+
+These weeks were spent creating the workspace, installing everything needed and starting to develop a simple teleoperator for a drone.
+
+## Weeks 1-8
+In the first two months I investigated how to install mavros and mavlink on my personal computer.
+
+I already had ROS (the Noetic distribution) and Gazebo installed, so I did not have to install them.
+
+I ran into problems trying to work with PX4 on my computer: I had both ROS and ROS 2 installed, and the two versions conflicted with each other.
+I opted for a simple solution: I created a virtual machine with Ubuntu 22.04, installed ROS Noetic, Gazebo, mavros and mavlink on it, and was then able to launch the PX4 package without any problem.
+
+## Week 8-10
+
+In the following weeks I started developing a simple teleoperator for the Iris drone. I tried some libraries (openCV and PyQt5). In the end I chose PyQt5 because I had worked with it before and it is simple.
+
+The teleoperator consists of commanding positions and velocities to the Iris drone through sliders; we will also command orientations and angular velocities.
+
+It is a first contact with a drone, since my TFG will focus on reinforcement learning with a drone.
diff --git a/docs/_posts/2022-03-21-Teleop-Drone.md b/docs/_posts/2022-03-21-Teleop-Drone.md
new file mode 100644
index 0000000..aec9f39
--- /dev/null
+++ b/docs/_posts/2022-03-21-Teleop-Drone.md
@@ -0,0 +1,145 @@
+---
+title: "Month 1-2. Teleop Drone"
+last_modified_at: 2023-03-20T13:05:00
+categories:
+  - Blog
+tags:
+  - ROS Noetic
+  - PX4
+  - Mavros
+  - Mavlink
+  - Gazebo
+  - openCV
+  - PyQt5
+---
+
+During these months I developed a simple teleoperator for the Iris drone.
+
+## Month 1
+In the first month I investigated how to load the Iris SDF model into the PX4 package.
+
+To load the Iris model in the 'mavros_posix_sitl.launch' launch file, it was enough to change the sdf field to the Iris model:
+
+But I found that PX4 was not able to load the model correctly: it could not find the folder where the model was located.
+After a week of research and reading the PX4 forum, I managed to load the model by passing it as a parameter to the launch file itself, that is, passing the destination path to the launch so that PX4 could find it:
+
+This was a possible solution to load the desired model.
+
+From here I developed the teleoperator for the drone, which consists of two nodes:
+
+- The interface, where we can see the image from the drone's camera and command different behaviors for the movement of the drone.
+
+- The teleoperator, which reads the data we receive from the interface and processes it correctly.
+
+The first approach was to do everything in the same node, but launching it together with PX4 while using the PyQt5 library caused conflicts, which led me to separate it into two nodes.
+
+## Month 2
+
+In the second month I focused on developing the previously mentioned nodes, and on writing a launch file that launches the Iris model with the world and both nodes.
+
+First we will talk about the interface node and afterwards about the teleoperator node.
+
+### Interface node
+
+To develop the interface with which we command the drone, we will use the PyQt5 library.
+
+The node consists of two classes: one for the image that we process from the camera, and another for the interface with the different buttons and sliders.
+
+#### Camera Image
+To process the image we subscribe to the topic '/iris/usb_cam/image_raw'. In order to use the image with the PyQt5 library, we first need to convert the ROS message to openCV format.
+After having it in openCV format, we must transform it into Qt format, that is, an image in Pixmap format.
+
+The camera image also shows the current FPS. This is done simply by measuring the time elapsed between one image and the next and computing the frequency (the inverse of the period). The calculation averages all the measurements, and we update the displayed FPS every second:
+
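+A minimal sketch of this conversion and FPS bookkeeping (assuming cv_bridge and PyQt5 are available; the names are illustrative, not the exact code of this node):
+
+```python
+import time
+import rospy
+import cv2
+from sensor_msgs.msg import Image
+from cv_bridge import CvBridge
+from PyQt5.QtGui import QImage, QPixmap
+
+bridge = CvBridge()
+times = []          # inter-frame periods used for the average
+last_stamp = None
+fps_text = "FPS: --"
+
+def image_cb(msg):
+    global last_stamp, fps_text
+    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")  # ROS -> openCV
+
+    now = time.time()
+    if last_stamp is not None:
+        times.append(now - last_stamp)      # period between two images
+    last_stamp = now
+    if times and sum(times) >= 1.0:         # refresh the average once per second
+        fps_text = "FPS: %.1f" % (len(times) / sum(times))
+        times.clear()
+
+    cv2.putText(frame, fps_text, (10, 30),
+                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
+
+    # openCV (BGR) -> Qt: QImage first, then QPixmap for the widget
+    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
+    h, w, ch = rgb.shape
+    qimg = QImage(rgb.data, w, h, ch * w, QImage.Format_RGB888)
+    pixmap = QPixmap.fromImage(qimg)        # hand this to a QLabel, for example
+
+rospy.init_node('interface_node_example')
+rospy.Subscriber('/iris/usb_cam/image_raw', Image, image_cb)
+# in the real node the Qt event loop runs here instead of rospy.spin()
+```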
+Finally, with openCV we can use a function called putText to draw the averaged FPS on the image.
+
+### Control Functions
+For the drone commands we have different buttons and sliders:
+
+- The buttons are LAND, TAKE_OFF, POSITION and VELOCITY. When one of these buttons is pressed, a String message is sent to the topic '/commands/mode'.
+POSITION and VELOCITY select how we want to command the drone, whether in position control or in velocity control.
+LAND and TAKE_OFF are the modes to land and take off the drone.
+
+- The sliders send position, velocity and orientation to the drone, depending on which control the user has chosen. The topics '/commands/control_position' and '/commands/control_velocity' are used, filling messages of type PoseStamped and Twist respectively (a sketch of both publishers follows below).
+
+
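+A minimal sketch of how the interface can publish these messages (illustrative names; the real node also builds the full PyQt5 window with the camera image):
+
+```python
+import rospy
+from std_msgs.msg import String
+from geometry_msgs.msg import PoseStamped, Twist
+from PyQt5.QtWidgets import QApplication, QWidget, QPushButton, QSlider, QVBoxLayout
+from PyQt5.QtCore import Qt
+
+rospy.init_node('interface_node')
+mode_pub = rospy.Publisher('/commands/mode', String, queue_size=1)
+pos_pub = rospy.Publisher('/commands/control_position', PoseStamped, queue_size=1)
+vel_pub = rospy.Publisher('/commands/control_velocity', Twist, queue_size=1)
+
+app = QApplication([])
+window = QWidget()
+layout = QVBoxLayout(window)
+
+# One button per mode; each press publishes its mode as a String
+for mode in ('LAND', 'TAKE_OFF', 'POSITION', 'VELOCITY'):
+    btn = QPushButton(mode)
+    btn.clicked.connect(lambda _, m=mode: mode_pub.publish(String(m)))
+    layout.addWidget(btn)
+
+# Example slider: x position in [-10, 10] m, published as a PoseStamped
+x_slider = QSlider(Qt.Horizontal)
+x_slider.setRange(-10, 10)
+
+def send_position(value):
+    msg = PoseStamped()
+    msg.header.stamp = rospy.Time.now()
+    msg.pose.position.x = float(value)
+    pos_pub.publish(msg)
+
+x_slider.valueChanged.connect(send_position)
+layout.addWidget(x_slider)
+
+window.show()
+app.exec_()
+```
+
+The velocity sliders work the same way, filling a Twist message and publishing it on '/commands/control_velocity'.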
+
+### Teleop node
+
+The teleop node subscribes to the interface node's topics, processes them (deciding whether we are in position or velocity control, or taking off or landing) and commands the drone.
+
+I use the OFFBOARD flight mode, which controls the movement and attitude of the vehicle by setting the position, velocity, acceleration, etc. For the drone commands to work, we have to make sure the vehicle is armed, and for that we use a service that tells the autopilot the vehicle is ready. Without arming, the drone cannot fly.
+
+For each control I subscribe to the topics of the interface node to obtain the messages and then process them.
+
+To keep track of which mode we are in, we use some simple checks:
+
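+A minimal sketch of the arming and mode handling (assuming the standard mavros services; variable names are illustrative):
+
+```python
+import rospy
+from std_msgs.msg import String
+from mavros_msgs.srv import CommandBool, SetMode
+
+rospy.init_node('teleop_node')
+
+rospy.wait_for_service('/mavros/cmd/arming')
+rospy.wait_for_service('/mavros/set_mode')
+arm_srv = rospy.ServiceProxy('/mavros/cmd/arming', CommandBool)
+mode_srv = rospy.ServiceProxy('/mavros/set_mode', SetMode)
+
+current_mode = None  # 'POSITION', 'VELOCITY', 'LAND' or 'TAKE_OFF'
+
+def mode_cb(msg):
+    # Simple checks: remember the requested mode and react to it
+    global current_mode
+    current_mode = msg.data
+    if current_mode in ('POSITION', 'VELOCITY', 'TAKE_OFF'):
+        arm_srv(value=True)                # vehicle must be armed to fly
+        # note: PX4 only accepts OFFBOARD if setpoints are already streaming
+        mode_srv(custom_mode='OFFBOARD')
+
+rospy.Subscriber('/commands/mode', String, mode_cb)
+rospy.spin()
+```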
+#### Control Position
+Position control consists of commanding positions in the Gazebo world, which the drone will fly towards; in addition we also command rotations around the z axis, in radians from 0 to pi.
+This control only works once the drone has taken off.
+
+To make the turns around the z axis we subscribe to the topic '/mavros/local_position/pose' to obtain the local pose of the drone and read its current orientation. When we publish the angle we are working in quaternions, not in Euler angles, so we use two methods from tf, euler_from_quaternion and quaternion_from_euler, to convert between Euler angles (here we are only interested in the z axis, the yaw) and quaternions:
+
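+A minimal sketch of the yaw handling (assuming the standard mavros setpoint topic /mavros/setpoint_position/local, which is not named in this post; the target yaw is illustrative):
+
+```python
+import rospy
+from geometry_msgs.msg import PoseStamped
+from tf.transformations import euler_from_quaternion, quaternion_from_euler
+
+rospy.init_node('position_control_example')
+setpoint_pub = rospy.Publisher('/mavros/setpoint_position/local',
+                               PoseStamped, queue_size=1)
+
+def pose_cb(pose_msg):
+    q = pose_msg.pose.orientation
+    # quaternion -> Euler, to know the drone's current yaw
+    roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
+    rospy.loginfo_throttle(1.0, "current yaw: %.2f rad" % yaw)
+
+    sp = PoseStamped()
+    sp.header.stamp = rospy.Time.now()
+    sp.pose.position = pose_msg.pose.position    # hold position...
+    target_yaw = 1.57                            # ...and turn to ~pi/2 rad
+    qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, target_yaw)
+    sp.pose.orientation.x, sp.pose.orientation.y = qx, qy
+    sp.pose.orientation.z, sp.pose.orientation.w = qz, qw
+    setpoint_pub.publish(sp)
+
+rospy.Subscriber('/mavros/local_position/pose', PoseStamped, pose_cb)
+rospy.spin()
+```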
+#### Control Velocity
+Velocity control consists of commanding linear velocities to the drone in the Gazebo world; in addition we also command an angular velocity around the z axis, in radians/second from 0 to pi.
+This control only works once the drone has taken off.
+
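+A minimal sketch of the velocity commanding (assuming the standard mavros topic /mavros/setpoint_velocity/cmd_vel_unstamped; the values are illustrative):
+
+```python
+import rospy
+from geometry_msgs.msg import Twist
+
+rospy.init_node('velocity_control_example')
+vel_pub = rospy.Publisher('/mavros/setpoint_velocity/cmd_vel_unstamped',
+                          Twist, queue_size=1)
+
+rate = rospy.Rate(20)            # keep the setpoint stream alive
+while not rospy.is_shutdown():
+    cmd = Twist()
+    cmd.linear.x = 1.0           # m/s forward
+    cmd.angular.z = 0.5          # rad/s around z
+    vel_pub.publish(cmd)
+    rate.sleep()
+```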
+#### Mode Land
+In order to land the drone we use a service called "/mavros/cmd/land": when the user presses the LAND button, the drone lands where it is.
+
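+A minimal sketch of the landing call (CommandTOL is the standard mavros service type; leaving the fields at zero means "land at the current position"):
+
+```python
+import rospy
+from mavros_msgs.srv import CommandTOL
+
+rospy.init_node('land_example')
+rospy.wait_for_service('/mavros/cmd/land')
+land_srv = rospy.ServiceProxy('/mavros/cmd/land', CommandTOL)
+
+# altitude/latitude/longitude/min_pitch/yaw left at 0 -> land in place
+resp = land_srv(min_pitch=0.0, yaw=0.0,
+                latitude=0.0, longitude=0.0, altitude=0.0)
+rospy.loginfo("Land accepted: %s", resp.success)
+```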
+#### Mode Take Off
+In order to take off the drone and make it fly, we send the desired height on the z axis as a position setpoint.
+
+### Results
+
+### Conclusions
+It is a very simple teleoperator and a first contact with the controls we can command a drone with. Both the interface and the teleoperator could be improved with other types of behaviors, such as commanding velocities, angular velocities, turns and positions in all 3 axes.
diff --git a/docs/_posts/2023-06-16-Airsim.md b/docs/_posts/2023-06-16-Airsim.md
new file mode 100644
index 0000000..ec3454c
--- /dev/null
+++ b/docs/_posts/2023-06-16-Airsim.md
@@ -0,0 +1,80 @@
+---
+title: "Airsim"
+last_modified_at: 2023-03-20T13:05:00
+categories:
+  - Blog
+tags:
+  - Airsim
+  - UnRealEngine
+---
+
+## Why use Airsim?
+First of all, AirSim is an open-source simulator used in robotics and machine-learning applications. AirSim provides a realistic simulated environment for experimenting with control, navigation and perception algorithms on unmanned aerial vehicles (UAVs), ground vehicles and water vehicles. The simulator is compatible with platforms such as Linux, Windows and macOS.
+
+In our case, we use Linux.
+
+Airsim has different environments depending on the application.
+
+### Scenarios within Airsim
+
+As mentioned above, AirSim offers different scenarios, such as:
+
+1. AbandonedPark
+2. Africa
+3. AirSimNH: a small urban neighborhood
+4. Blocks
+5. Building_99
+6. LandscapeMountains
+7. MSBuild2018 (soccer field)
+8. TrapCamera
+9. ZhangJiajie
+
+Depending on the version released for Linux, other scenarios are available, for instance:
+
+- City: large environment with moving vehicles and pedestrians. This scenario consists of 2 packages.
+- Forest
+
+### Sensors
+
+Airsim provides different sensors, for instance:
+
+1. Camera
+2. Barometer
+3. Imu
+4. Gps
+5. Magnetometer
+6. Distance Sensor
+7. Lidar
+
+### Environment
+
+In Airsim, you can configure the environment:
+
+1. The weather: you can have effects such as rain, fog, dust, snow, etc.
+2. Time of day and atmospheric effects
+3. Collision detection
+
+### Vehicles types
+
+In Airsim, there are different types of vehicles available:
+
+1. PhysXCar: represents a ground vehicle with realistic physics based on the PhysX physics engine.
+
+2. SimpleFlight: represents a drone with a simplified flight model.
+
+3. SimpleQuadcopter: represents a quadcopter-type drone with a basic flight model.
+
+4. SimpleWheeledVehicle: represents a wheeled ground vehicle with a simplified physics model.
diff --git a/docs/_posts/2023-10-31-PX4.md b/docs/_posts/2023-10-31-PX4.md
new file mode 100755
index 0000000..a37b83b
--- /dev/null
+++ b/docs/_posts/2023-10-31-PX4.md
@@ -0,0 +1,137 @@
+---
+title: "PX4 SITL + Mavros + Airsim "
+last_modified_at: 2023-03-20T13:05:00
+categories:
+  - Blog
+tags:
+  - PX4
+  - Airsim
+  - Mavros
+  - QGControl
+---
+
+## Introduction
+In this post we will talk about the integration of PX4 + Mavros + Airsim for drone behavior, and the different configurations that have to be done.
+
+Airsim offers a ROS wrapper. The ROS wrapper is composed of two ROS nodes: the first is a wrapper over the AirSim multirotor C++ client library, and the second is a simple PD position controller. For more information: [AirSim/airsim_ros_pkgs](https://microsoft.github.io/AirSim/airsim_ros_pkgs/)
+
+The first approach was to use this ROS wrapper to give the drone a velocity-based behavior through the topic provided by the AirSim ROS Wrapper Node. The difficulty is that when we command velocities to the drone on the x and y axes, the height of the drone is not constant: as we command velocities on those axes, the drone loses height and ends up on the ground, since this node has no position controller on the z axis and no velocity controllers on the x and y axes.
+It is true that there is a node called Simple PID Position Controller Node that provides a position controller on the x, y and z axes, but what we need is to control the z position while commanding velocities on the x and y axes and angular velocities around the z axis. Given this, we opted for integrating PX4 together with Mavros and Airsim, since PX4 offers position and velocity controllers, and Mavros lets us command velocities.
+
+## Configuration of Airsim settings file
+
+To use PX4 together with Airsim, the first thing to do is to configure the Airsim settings file to specify that we want to use PX4. Note that the PX4 simulator uses TCP, so we must add "UseTcp": true. Note that we are also enabling LockStep; see PX4 LockStep for more information: [Lockstep](https://microsoft.github.io/AirSim/px4_lockstep/)
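+A settings.json along these lines enables PX4 over TCP with lockstep (a sketch based on the AirSim PX4 SITL documentation; the exact ports may differ in your setup):
+
+```json
+{
+  "SettingsVersion": 1.2,
+  "SimMode": "Multirotor",
+  "Vehicles": {
+    "PX4": {
+      "VehicleType": "PX4Multirotor",
+      "UseSerial": false,
+      "UseTcp": true,
+      "TcpPort": 4560,
+      "LockStep": true
+    }
+  }
+}
+```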
+All the installation steps, and what to build for the PX4 firmware in SITL mode, are on this page: [PX4 SITL](https://microsoft.github.io/AirSim/px4_sitl/). Step 4 is important to analyze in order to know which ports are enabled, so that the connections with Mavlink can be made through the mavros node.
+
+For this purpose a diagram is shown:
+
+The ports 14030 and 14020 appear in the PX4 SITL window when we execute: `make px4_sitl_default none_iris`
+
+
+## PX4 Flight Modes Overview
+Flight modes define how the autopilot responds to remote control input, and how it manages vehicle movement during fully autonomous flight.
+
+The modes provide different types/levels of autopilot assistance to the user (pilot), ranging from automation of common tasks like takeoff and landing, through to mechanisms that make it easier to regain level flight, hold the vehicle to a fixed path or position, etc.
+
+In this diagram, you can view the flight modes:
+
+
+In our case, we are interested in the OFFBOARD flight mode, and in the HOLD/POSITION flight modes to keep the drone at a constant altitude.
+
+### Offboard Mode
+This mode allows us to control the movement and attitude of the vehicle by setting position, velocity, acceleration, attitude, attitude-rate or thrust/torque setpoints.
+
+PX4 must receive a stream of MAVLink setpoint messages (or the ROS 2 OffboardControlMode) at 2 Hz as proof that the external controller is healthy.
+If the rate falls below 2 Hz while under external control, PX4 will switch out of offboard mode after a timeout (COM_OF_LOSS_T) and attempt to land or perform some other failsafe (safety) action. The action depends on whether or not RC control is available, and is defined in the parameter COM_OBL_RC_ACT. For more information: [Safety](https://docs.px4.io/v1.14/en/config/safety.html) and [Failsafes](https://docs.px4.io/v1.14/en/simulation/failsafes.html)
+
+The parameter COM_FAIL_ACT_T is disabled in our case. If it is enabled, then before entering the failsafe (RTL, Land, Hold) the vehicle waits COM_FAIL_ACT_T seconds in Hold mode for the user to become aware; during this time the user cannot take control. After that, the configured failsafe action is triggered and the user can take control.
+
+In summary, this mode will help us command velocities to the drone through the topic provided by mavros.
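+A minimal sketch of streaming velocity setpoints fast enough to keep OFFBOARD mode alive (20 Hz here, well above the required 2 Hz; the topic and services are the standard mavros ones, the velocity values are illustrative):
+
+```python
+import rospy
+from geometry_msgs.msg import TwistStamped
+from mavros_msgs.srv import CommandBool, SetMode
+
+rospy.init_node('offboard_velocity_example')
+vel_pub = rospy.Publisher('/mavros/setpoint_velocity/cmd_vel',
+                          TwistStamped, queue_size=1)
+
+rospy.wait_for_service('/mavros/cmd/arming')
+rospy.wait_for_service('/mavros/set_mode')
+arm = rospy.ServiceProxy('/mavros/cmd/arming', CommandBool)
+set_mode = rospy.ServiceProxy('/mavros/set_mode', SetMode)
+
+rate = rospy.Rate(20)
+cmd = TwistStamped()
+cmd.twist.linear.x = 1.0          # m/s forward (BODY_NED frame, see below)
+
+# PX4 rejects OFFBOARD unless setpoints are already streaming
+for _ in range(40):
+    cmd.header.stamp = rospy.Time.now()
+    vel_pub.publish(cmd)
+    rate.sleep()
+
+arm(value=True)
+set_mode(custom_mode='OFFBOARD')
+
+while not rospy.is_shutdown():    # stop publishing -> COM_OF_LOSS_T failsafe
+    cmd.header.stamp = rospy.Time.now()
+    vel_pub.publish(cmd)
+    rate.sleep()
+```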
+
+#### Frames Mavros
+Mavros supports 21 coordinate frames (you can see them in the mavros messages, in the SetMavFrame service: [SetMavFrame](https://github.com/mavlink/mavros/blob/master/mavros_msgs/srv/SetMavFrame.srv))
+
+We use the coordinate frame BODY_NED (number 8), because it is the same as MAV_FRAME_BODY_FRD when used with velocity/acceleration values.
+
+To change the coordinate frame, we must go to the file px4_config.yaml and, under "setpoint_velocity", set "mav_frame" to BODY_NED.
+
+#### Offboard Parameters
+Offboard mode is affected by the following parameters:
+
+1. COM_OF_LOSS_T: time-out (in seconds) to wait when the offboard connection is lost before triggering the offboard lost failsafe (COM_OBL_RC_ACT)
+
+2. COM_OBL_RC_ACT: flight mode to switch to if offboard control is lost (values: 0 - Position, 1 - Altitude, 2 - Manual, 3 - Return, 4 - Land).
+
+3. COM_RC_OVERRIDE
+4. COM_RC_STICK_OV
+5. COM_RCL_EXCEPT
+
+We will use the parameters COM_OF_LOSS_T and COM_OBL_RC_ACT for when we stop publishing velocities.
+
+Note: to learn more, see the full parameter reference in PX4: [Parameters](https://docs.px4.io/v1.14/en/advanced_config/parameter_reference.html)
+
+### Hold Mode
+The Hold flight mode causes the vehicle to stop and hover at its current GPS position and altitude.
+
+### Position Mode
+Position is an easy-to-fly RC mode in which the roll and pitch sticks control acceleration over ground in the vehicle's left-right and forward-back directions (similar to a car's accelerator pedal), and the throttle controls the speed of ascent-descent.
+
+Position mode is the safest manual mode for new fliers.
+
+## QGControl
+QGroundControl is an application that provides full flight control and vehicle setup for PX4- or ArduPilot-powered vehicles.
+
+To install it on Ubuntu Linux: [Download QGControl](https://docs.qgroundcontrol.com/master/en/getting_started/download_and_install.html)
+
+This application allows you to see which parameters have been loaded with PX4 SITL and to visualize which flight mode you are in. It is also useful for changing parameters if you wish, and we can teleoperate the vehicle with a joystick.
diff --git a/docs/_posts/2023-11-1-Yolop.md b/docs/_posts/2023-11-1-Yolop.md
new file mode 100644
index 0000000..8936f2c
--- /dev/null
+++ b/docs/_posts/2023-11-1-Yolop.md
@@ -0,0 +1,67 @@
+---
+title: "Perception with Yolop "
+last_modified_at: 2023-03-20T13:05:00
+categories:
+  - Blog
+tags:
+  - Yolop
+  - Pytorch
+  - Onnx
+---
+
+## Introduction
+In this post we will talk about YOLOP for drivable-area and lane detection. YOLOP will be used for perception.
+
+## What is Yolop?
+YOLOP is a panoptic vision perception system to aid autonomous driving in real time. It is one of the very first end-to-end panoptic vision perception models aimed at self-driving systems running in real time.
+
+It performs traffic object detection, drivable area segmentation, and lane detection simultaneously.
+
+To use it, you must install all the dependencies: [Requirements](https://github.com/hustvl/YOLOP/blob/main/requirements.txt)
+
+In our case, we are interested in the drivable area and lane detection, because we are not detecting cars at the moment.
+
+### Models
+YOLOP follows an end-to-end **MCNet (convolutional neural network, CNN)** architecture. The model is built in PyTorch: 'End-to-end.pth'
+
+From the end-to-end model it is possible to export models in onnx format. This format offers optimized operators and computation graphs that improve runtime performance and reduce computational cost.
+
+With all this information, we have tested 3 models:
+
+- **End-to-end**: PyTorch model. For this model, the dimensions of the input image must be **multiples of 32**. This is because the maximum stride of the network is 32 and it is a fully convolutional network.
+
+- **Yolop-320-320**: Onnx model. Input image 320x320
+
+- **Yolop-640-640**: Onnx model. Input image 640x640
+
+Note: Yolop-320-320 and Yolop-640-640 are models exported from the End-to-end model.
+
+For more information about Onnx and how to install it: [Onnx](https://onnxruntime.ai/) , [Install onnx](https://onnxruntime.ai/getting-started)
+
+## Results with YOLOP
+To analyze each model, we measured the inference time of each one (the sketch below shows how) and then represented the results in a bar graph:
+
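+A minimal sketch of such a timing measurement with onnxruntime (the model path, input name and image size are illustrative):
+
+```python
+import time
+import numpy as np
+import onnxruntime as ort
+
+# CUDA first, CPU as fallback
+session = ort.InferenceSession(
+    "yolop-320-320.onnx",
+    providers=["CUDAExecutionProvider", "CPUExecutionProvider"])
+
+input_name = session.get_inputs()[0].name
+x = np.random.rand(1, 3, 320, 320).astype(np.float32)  # dummy NCHW image
+
+session.run(None, {input_name: x})        # warm-up run, not timed
+
+times = []
+for _ in range(100):
+    t0 = time.perf_counter()
+    session.run(None, {input_name: x})    # detection + two segmentation heads
+    times.append(time.perf_counter() - t0)
+
+print("mean inference time: %.2f ms" % (1000 * np.mean(times)))
+```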
+
+As you can see, yolop-320-320 has a better mean inference time than end-to-end and yolop-640-640; therefore it is a good candidate model to choose, since with it we can achieve a rate of approximately 100 fps.
+
+Finally, in this video you can see each model and the results regarding the drivable area segmentation and lane detection.
+
+Note: the drone is remote-controlled with a joystick, and to compute the inference in each model we used CUDA to speed up the computation.
+
+[YOLOP](https://youtu.be/G0New6pOUbs?si=_XqWbcm6EAjRD-w9)
diff --git a/docs/_posts/2023-2-23- Perception.md b/docs/_posts/2023-2-23- Perception.md
new file mode 100644
index 0000000..f6891e7
--- /dev/null
+++ b/docs/_posts/2023-2-23- Perception.md
@@ -0,0 +1,43 @@
+---
+title: "Perception"
+last_modified_at: 2023-03-20T13:05:00
+categories:
+  - Blog
+tags:
+  - DBSCAN
+  - Pytorch
+  - Onnx
+---
+
+## Introduction
+In this post we will talk about perception.
+
+## Perception
+In the previous post we talked about YOLOP and about which model to choose to get the best result. From this neural network we keep the detected lane lines of the road, and we run an unsupervised learning algorithm called clustering (DBSCAN) to choose the groups of lines belonging to the lane we are interested in following; for more details on this algorithm you can visit the following page: [DBSCAN](https://scikit-learn.org/stable/modules/clustering.html#dbscan).
+From this, a quadratic regression is performed on both chosen groups of lines to represent them as curves, since the lines of the scenario's road are not straight at all. Finally, once both regressions have been performed, an interpolation is done to know which points lie between the 2 regressions, so that the lane can be represented as a mass of points and its centroid computed.
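+A minimal sketch of this pipeline, given a binary lane-line mask from YOLOP (the eps/min_samples values and the cluster selection are illustrative):
+
+```python
+import numpy as np
+from sklearn.cluster import DBSCAN
+
+def lane_centroid(lane_mask):
+    """lane_mask: binary image from YOLOP's lane-line head."""
+    # (x, y) pixel coordinates of the lane-line pixels, shape (N, 2)
+    points = np.argwhere(lane_mask > 0)[:, ::-1].astype(float)
+
+    labels = DBSCAN(eps=5.0, min_samples=20).fit(points).labels_
+
+    # keep the two largest clusters as the lines of the lane to follow
+    ids, counts = np.unique(labels[labels >= 0], return_counts=True)
+    left_id, right_id = ids[np.argsort(counts)[-2:]]
+    left = points[labels == left_id]
+    right = points[labels == right_id]
+
+    # quadratic regression x = f(y) for each group of lines
+    left_fit = np.polyfit(left[:, 1], left[:, 0], 2)
+    right_fit = np.polyfit(right[:, 1], right[:, 0], 2)
+
+    # interpolate both curves over the shared rows; the lane is the mass of
+    # points between them, and its centroid is the point to steer towards
+    ys = np.arange(max(left[:, 1].min(), right[:, 1].min()),
+                   min(left[:, 1].max(), right[:, 1].max()))
+    xl = np.polyval(left_fit, ys)
+    xr = np.polyval(right_fit, ys)
+    return float(np.mean((xl + xr) / 2.0)), float(np.mean(ys))
+```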
+
+The result is as follows:
+
+When the perception was working, a simple PID controller was written to see how it performed.
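+A minimal sketch of such a controller, steering the drone so that the lane centroid stays in the middle of the image (the gains, image width and centroid value are illustrative):
+
+```python
+class PID:
+    def __init__(self, kp, ki, kd):
+        self.kp, self.ki, self.kd = kp, ki, kd
+        self.integral = 0.0
+        self.prev_error = 0.0
+
+    def update(self, error, dt):
+        self.integral += error * dt
+        derivative = (error - self.prev_error) / dt
+        self.prev_error = error
+        return self.kp * error + self.ki * self.integral + self.kd * derivative
+
+yaw_pid = PID(kp=0.005, ki=0.0, kd=0.001)
+
+image_width = 640
+centroid_x = 400.0                 # e.g. from lane_centroid() each frame
+error = centroid_x - image_width / 2.0
+yaw_rate = yaw_pid.update(error, dt=1.0 / 30.0)  # rad/s, sent as angular.z
+```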
+
+The result is as follows:
diff --git a/docs/_posts/2024-02-12-intro.md b/docs/_posts/2024-02-12-intro.md
deleted file mode 100644
index 51db1be..0000000
--- a/docs/_posts/2024-02-12-intro.md
+++ /dev/null
@@ -1,10 +0,0 @@
----
-title: "Introducción"
-last_modified_at: 2024-02-12T120:37:00
-categories:
-  - Blog
-tags:
-  - Carla
----
-
-Conducción autónoma
diff --git a/docs/images/AddSliders.png b/docs/images/AddSliders.png
new file mode 100644
index 0000000..eeae330
Binary files /dev/null and b/docs/images/AddSliders.png differ
diff --git a/docs/images/BottomsInterface.png b/docs/images/BottomsInterface.png
new file mode 100644
index 0000000..a3be437
Binary files /dev/null and b/docs/images/BottomsInterface.png differ
diff --git a/docs/images/Capture-Video-Yolop.png b/docs/images/Capture-Video-Yolop.png
new file mode 100644
index 0000000..87fb60a
Binary files /dev/null and b/docs/images/Capture-Video-Yolop.png differ
diff --git a/docs/images/ControlPosition.png b/docs/images/ControlPosition.png
new file mode 100644
index 0000000..ad34c0a
Binary files /dev/null and b/docs/images/ControlPosition.png differ
diff --git a/docs/images/ControlVelocity.png b/docs/images/ControlVelocity.png
new file mode 100644
index 0000000..85e5e5f
Binary files /dev/null and b/docs/images/ControlVelocity.png differ
diff --git a/docs/images/ImageCapture.png b/docs/images/ImageCapture.png
new file mode 100644
index 0000000..53fd65f
Binary files /dev/null and b/docs/images/ImageCapture.png differ
diff --git a/docs/images/LAND.png b/docs/images/LAND.png
new file mode 100644
index 0000000..888e4ab
Binary files /dev/null and b/docs/images/LAND.png differ
diff --git a/docs/images/MAPS_AIRSIM.png b/docs/images/MAPS_AIRSIM.png
new file mode 100644
index 0000000..c2ec42d
Binary files /dev/null and b/docs/images/MAPS_AIRSIM.png differ
diff --git a/docs/images/Modes.png b/docs/images/Modes.png
new file mode 100644
index 0000000..4fefc43
Binary files /dev/null and b/docs/images/Modes.png differ
diff --git a/docs/images/PositionMode.png b/docs/images/PositionMode.png
new file mode 100644
index 0000000..4dcef0e
Binary files /dev/null and b/docs/images/PositionMode.png differ
diff --git a/docs/images/Results-Yolop.png b/docs/images/Results-Yolop.png
new file mode 100644
index 0000000..591c449
Binary files /dev/null and b/docs/images/Results-Yolop.png differ
diff --git a/docs/images/Sliders.png b/docs/images/Sliders.png
new file mode 100644
index 0000000..a131fd0
Binary files /dev/null and b/docs/images/Sliders.png differ
diff --git a/docs/images/Teleop-Interface.png b/docs/images/Teleop-Interface.png
new file mode 100644
index 0000000..25e99ff
Binary files /dev/null and b/docs/images/Teleop-Interface.png differ
diff --git a/docs/images/Try-PX4.png b/docs/images/Try-PX4.png
new file mode 100644
index 0000000..e660917
Binary files /dev/null and b/docs/images/Try-PX4.png differ
diff --git a/docs/images/Whiteboard.png b/docs/images/Whiteboard.png
new file mode 100644
index 0000000..6fc3412
Binary files /dev/null and b/docs/images/Whiteboard.png differ
diff --git a/docs/images/commander-flow-diagram.png b/docs/images/commander-flow-diagram.png
new file mode 100755
index 0000000..33d9c95
Binary files /dev/null and b/docs/images/commander-flow-diagram.png differ
diff --git a/docs/images/load_vehicle.launch.png b/docs/images/load_vehicle.launch.png
new file mode 100644
index 0000000..04af59e
Binary files /dev/null and b/docs/images/load_vehicle.launch.png differ
diff --git a/docs/images/mavros_posix_sitl.launch.png b/docs/images/mavros_posix_sitl.launch.png
new file mode 100644
index 0000000..8103b31
Binary files /dev/null and b/docs/images/mavros_posix_sitl.launch.png differ
diff --git a/docs/images/perception1.png b/docs/images/perception1.png
new file mode 100644
index 0000000..31007a0
Binary files /dev/null and b/docs/images/perception1.png differ
diff --git a/docs/images/perception2.png b/docs/images/perception2.png
new file mode 100644
index 0000000..d7067d7
Binary files /dev/null and b/docs/images/perception2.png differ
diff --git a/docs/images/portsPX4.png b/docs/images/portsPX4.png
new file mode 100644
index 0000000..3b41648
Binary files /dev/null and b/docs/images/portsPX4.png differ
diff --git a/docs/images/yolop-architecture.png b/docs/images/yolop-architecture.png
new file mode 100644
index 0000000..fa56159
Binary files /dev/null and b/docs/images/yolop-architecture.png differ