#Coding4Fun – How to control your #drone with 20 lines of code! (9/N)

Hi!

Let’s take some frames-per-second (FPS) measurements on the UDP and OpenCV connection. It seems that, working with simple movements, the values move between 30 and 60 FPS.

showing FPS information with the drone camera

I just added a couple of lines in the main while loop to calculate the FPS.

# open
i = 0
while True:
    i = i + 1
    start_time = time.time()

    battery = sendReadCommand('battery?')
    print(f'battery: {battery} % - i: {i}')

    try:
        ret, frame = cap.read()
        img = cv2.resize(frame, (640, 480))

        if (time.time() - start_time ) > 0:
            fpsInfo = "FPS: " + str(1.0 / (time.time() - start_time)) # FPS = 1 / time to process loop
            font = cv2.FONT_HERSHEY_DUPLEX
            cv2.putText(img, fpsInfo, (10, 20), font, 0.4, (255, 255, 255), 1)

        cv2.imshow('@elbruno - DJI Tello Camera', img)
    except Exception as e:
        print(f'exc: {e}')
        pass

    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

As a final note, I ran some tests using different camera resolutions and the FPS averages were similar. I tested with 640×480 pixels and 1024×768 pixels.

In the next posts, let’s do some face detection and rock some AI with the drone!

Happy coding!

Greetings

El Bruno

#Coding4Fun – How to control your #drone with 20 lines of code! (8/N)

Hi!

Now that I have started to understand how UDP works, I also did some research to find the best options for accessing a UDP video feed. Lucky for me, there are plenty of resources on doing this with my old friend OpenCV.

Most of the OpenCV documentation is written for C++. However, in the end it all comes down to these basic lines of code:

# open UDP
videoUDP = 'udp://192.168.10.1:11111'
cap = cv2.VideoCapture(videoUDP)

# read a frame from the feed
ret, frame = cap.read()
img = cv2.resize(frame, (320, 240))

# display the frame in an OpenCV video window
cv2.imshow('Video', img)


Note: In the references section below I shared some of my posts about my experience installing OpenCV on Windows 10.

Let’s go back to our sample Python app. Starting from the previous sample that displays the battery level, I changed the code to stay alive all the time while displaying the video feed in a 320×240 window.

The following video shows how fast this works:

And of course, the complete code with these notes (a sketch of the main loop follows the list):

  • Lines 96-100. Open the video feed and wait 2 seconds
  • Lines 104-118. Main app loop:
    • Get and display battery level
    • Get UDP video frame
    • Resize frame to 320×240
    • Display frame
    • When Q key is pressed, exit app
  • Line 121. Close video stream
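
Here is a minimal sketch of that main loop, assuming the sendReadCommand() helper from the other posts in this series and that “streamon” was already sent to the drone (the line numbers in the list refer to the original file, not to this sketch):

import cv2
import time

# open the UDP video feed and give it a couple of seconds to start
videoUDP = 'udp://192.168.10.1:11111'
cap = cv2.VideoCapture(videoUDP)
time.sleep(2)

while True:
    # get and display the battery level
    battery = sendReadCommand('battery?')
    print(f'battery: {battery} %')

    # read a UDP video frame, resize it to 320x240 and display it
    ret, frame = cap.read()
    if ret:
        img = cv2.resize(frame, (320, 240))
        cv2.imshow('@elbruno - DJI Tello Camera', img)

    # exit the app when the Q key is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# close the video stream
cap.release()
cv2.destroyAllWindows()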

Happy coding!

Greetings

El Bruno

References

My Posts

#Coding4Fun – How to control your #drone with 20 lines of code! (7/N)

Hi!

No code today, mostly because I spent a decent amount of time trying to understand how the DJI Tello camera feed works.

In order to access the camera feed remotely we need to perform two steps: first we send the command “command” to the drone, and then the command “streamon” to enable the video stream. Of course, there is also a command to stop the stream: “streamoff”.

In the following sample, I enable the camera feed, keep it live for 90 seconds, and then disable it.
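
A minimal sketch of that sample, assuming a hypothetical sendCommand() helper that sends a UTF-8 encoded string to the drone over UDP (192.168.10.1, port 8889) as in the earlier posts:

import time

sendCommand('command')    # enter SDK mode; this must be the 1st command
sendCommand('streamon')   # enable the video stream

time.sleep(90)            # keep the camera feed live for 90 seconds

sendCommand('streamoff')  # disable the video stream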

OK, once I got this working, I needed to spend some time figuring out how to consume the feed. Based on the SDK details, I realized that the URL to access the video feed is:

udp://192.168.10.1:11111

First I tried to access the UDP feed using VLC; however, it didn’t work. So I did a little research and found that I could use FFmpeg for this. In case you don’t know about FFmpeg:

FFmpeg is the leading multimedia framework to decode, encode, transcode, mux, demux, stream, filter and play. All builds require at least Windows 7 or Mac OS X 10.10. Nightly git builds are licensed as GPL 3.0, and release builds are licensed as GPL 3.0 and LGPL 3.0. LGPL 3.0 release builds can be found using the “All Builds” links.

FFmpeg Builds (see references)

I downloaded the latest FFmpeg build and ran the following command locally:

.\ffplay.exe -i udp://192.168.10.1:11111

And after a couple of seconds, I got my drone video feed displayed locally. There is a huge delay between the real action and the camera feed, so there are some improvement opportunities here.

drone camera video feed using ffmpeg and udp

The video is at 3X speed, and I skipped the initial comments and setup. The main idea was to access the video feed, and it’s done. In the next posts, I’ll try to use OpenCV to work with the feed and maybe process and display each frame independently.

Important: when you run the ffplay command, it will show some scary output, and then we get a nice window with the camera feed. This is the PowerShell output:

Happy coding!

Greetings

El Bruno

References

#Coding4Fun – How to control your #drone with 20 lines of code! (6/N)

Hi!

Today is code time again, continuing from my previous sample!

Yesterday I showed how to read a static value: the battery. When you work with a device like a drone, there are other important values to analyze in order to send commands, like altitude, position, time of flight, etc.

So, based on yesterday’s sample, I’ll show how to create a simple Python app that displays the accelerometer X value (agx). In the following video you can see how the value stays “static” during the first couple of seconds, until I pick up the device and move it around.

Important: don’t blame me for the low battery. Playing with the drone drains the battery very fast!

Once again, the code is very straightforward: it runs a loop for 10 seconds, showing the battery and agx information (see the sketch below).
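
A minimal sketch of that loop, assuming the global response variable kept up to date by the receive thread from the earlier sample and the exact state string format shown below:

import time

start = time.time()
while time.time() - start < 10:
    if response:
        # split 'key:value;key:value;...' into a flat list of tokens
        data = response.decode('utf-8').strip().replace(';', ':').split(':')
        battery = data[21]   # the value right after the 'bat' token
        accel_x = data[27]   # the value right after the 'agx' token
        print(f'battery: {battery} % - agx: {accel_x}')
    time.sleep(0.5)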

Just as a reminder, this is the information we get from the drone:

pitch:0;roll:1;yaw:0;vgx:0;vgy:0;vgz:0;templ:79;temph:82;tof:10;h:0;bat:39;baro:50.42;time:0;agx:-8.00;agy:-17.00;agz:-999.00

As you can read, all the information is condensed in a single line, and we can split it to get:

  • pitch
  • roll
  • yaw
  • vgx
  • vgy
  • vgz
  • templ (temperature low)
  • temph (temperature high)
  • tof (time of flight)
  • h (height)
  • bat (battery)
  • baro (barometer)
  • time
  • agx
  • agy
  • agz

Happy coding!

Greetings

El Bruno

#Coding4Fun – How to control your #drone with 20 lines of code! (5/N)

Hi!

Today is code time! In my previous post I shared some code to send commands to the drone. Today I’ll show how to read information from the drone. Before I start, someone asked if my kids are having fun with the drone. An image will be enough to answer this.

Reading drone information

So, as far as I understand, the drone is constantly sending information to the connected client. That’s why we have the following function running all the time in a separate thread:

def receiveData():
    # read the information that the drone constantly sends back via UDP
    global response
    while True:
        try:
            response, _ = clientSocket.recvfrom(1024)
        except:
            break
# ... more code
response = None
recThread = threading.Thread(target=receiveData)
recThread.daemon = True
recThread.start()

The response is stored in a global variable named response, and it’s very easy to understand the information that the drone sends back. This is a sample of the received data:

pitch:0;roll:1;yaw:0;vgx:0;vgy:0;vgz:0;templ:79;temph:82;tof:10;h:0;bat:39;baro:50.42;time:0;agx:-8.00;agy:-17.00;agz:-999.00

As you can read, all the information is condensed in a single line, and we can split it to get:

  • pitch
  • roll
  • yaw
  • vgx
  • vgy
  • vgz
  • templ (temperature low)
  • temph (temperature high)
  • tof (time of flight)
  • h (height)
  • bat (battery)
  • baro (barometer)
  • time
  • agx
  • agy
  • agz

In Python this is a simple routine: after splitting the string into a list, the battery value is at index 21:

>>> ls = 'pitch:0;roll:1;yaw:0;vgx:0;vgy:0;vgz:0;templ:79;temph:82;tof:10;h:0;bat:39;baro:50.42;time:0;agx:-8.00;agy:-17.00;agz:-999.00'
>>> ls1 = ls.replace(';', ':').split(':')
>>> ls1
['pitch', '0', 'roll', '1', 'yaw', '0', 'vgx', '0', 'vgy', '0', 'vgz', '0', 'templ', '79', 'temph', '82', 'tof', '10', 
'h', '0', 'bat', '39', 'baro', '50.42', 'time', '0', 'agx', '-8.00', 'agy', '-17.00', 'agz', '-999.00']
>>> ls1[21]
'39'
>>>

So with this, getting the battery level of the drone requires code like the following:
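
A minimal sketch, assuming the global response variable updated by the receiveData() thread shown above:

def getBattery():
    # parse the latest state string and return the value after the 'bat' token
    if response is None:
        return None
    data = response.decode('utf-8').replace(';', ':').split(':')
    return data[21]

print(f'battery: {getBattery()} %')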

The output is not very amazing, but it works!

powershell console displaying the drone battery level.

Happy coding!

Greetings

El Bruno

#Coding4Fun – How to control your #drone with 20 lines of code! (3/N)

Hi!

Today I’ll write the equivalent of a Hello World in the drone ecosystem. This is a very complex app which:

  • takes off the drone
  • waits a couple of seconds
  • lands the drone

I’ve followed the Python code sample from the SDK, and the final code is very complex (see below). It deserves some remarks:

  • Line 11. The function recv() runs in a separate thread to receive messages from the drone. The thread is started on line 44.
  • Line 19. The function sendMessage() sends messages to the drone. Important: the messages must be UTF-8 encoded. This took me some time until I figured it out. It also waits 5 seconds while the response is processed in the separate thread.
  • Lines 31-41. Connection information and sockets to communicate with the drone.
  • Line 48. Main code for the app: start the SDK mode, wait 5 seconds, send the takeoff message, and then send the land message. A very simple exception catch is implemented here.

Important: when using the SDK, the first command should be “command”, as I did on line 49.

Here is the code:
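
What follows is a condensed sketch with the same structure rather than the original gist, so the line numbers in the remarks above refer to the original file:

import socket
import threading
import time

# connection information and socket to communicate with the drone
tello_address = ('192.168.10.1', 8889)
local_address = ('', 9000)
clientSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
clientSocket.bind(local_address)

def recv():
    # receive messages from the drone in a separate thread
    while True:
        try:
            data, _ = clientSocket.recvfrom(1024)
            print(f'recv: {data.decode("utf-8")}')
        except Exception:
            break

def sendMessage(message):
    # messages must be UTF-8 encoded; wait 5 seconds while the
    # response is processed in the receive thread
    clientSocket.sendto(message.encode('utf-8'), tello_address)
    time.sleep(5)

recvThread = threading.Thread(target=recv)
recvThread.daemon = True
recvThread.start()

try:
    sendMessage('command')   # the 1st command must always be "command"
    sendMessage('takeoff')   # take off and wait a couple of seconds
    sendMessage('land')      # land the drone
except Exception as e:
    print(f'error: {e}')
finally:
    clientSocket.close()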

Here is the app running at 3X speed, so you don’t spend all day watching boring drone videos:

Drone Hello World, take off and land

Happy coding!

Greetings

El Bruno

#Coding4Fun – How to control your #drone with 20 lines of code! (2/N)

Hi!

In my previous posts I shared some links about the DJI Tello drone. One of them is the SDK 1.3.0.0 document, where we can find the main commands and the descriptions of the specific commands used to communicate with the drone.

dji tello drone sdk architecture

The document also links a Python sample file with the following code:
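
The sample itself is not reproduced here; a condensed sketch of its structure (UDP socket to the drone, a receive thread, and a command loop) could look like this, with the line numbers in the walkthrough below referring to the original file:

import socket
import threading
import sys

# UDP communication: drone at 192.168.10.1:8889, bound locally to port 9000
locaddr = ('', 9000)
tello_address = ('192.168.10.1', 8889)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(locaddr)

def recv():
    # receive data from the drone in a different thread
    while True:
        try:
            data, server = sock.recvfrom(1518)
            print(data.decode(encoding='utf-8'))
        except Exception:
            break

recvThread = threading.Thread(target=recv)
recvThread.daemon = True
recvThread.start()

# main loop: wait for the user to type a command and send it to the drone
while True:
    try:
        # the original sample also supports Python 2 (raw_input)
        msg = raw_input('') if sys.version_info[0] == 2 else input('')
        if not msg:
            continue
        if 'end' in msg:
            sock.close()
            break
        sock.sendto(msg.encode(encoding='utf-8'), tello_address)
    except KeyboardInterrupt:
        sock.close()
        break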

The code is very easy to read:

  • Lines 10-14. Define the main libraries to be used. I had never used sockets and threads in Python, so this is an excellent chance for me to learn about them.
  • Lines 16-26. Implement the basic UDP communication described in the SDK architecture: accessing the drone via IP 192.168.10.1 and port 8889, and binding to localhost on port 9000.
  • Lines 28-48. Function to receive data from the drone. It’s executed in a different thread, so here is my first multi-threading Python app.
  • Lines 50-74. Main app loop, where it waits for the user to type a command and sends that command to the drone. It also checks the Python version and closes the socket before ending.

An amazing way to start playing with the drone. Just 5 minutes to connect and have something up and running.

Happy coding!

Greetings

El Bruno

#Coding4Fun – How to control your #drone with 20 lines of code! (1/N)

Hi!

I’ll start to write a couple of posts about controlling a drone using Python. This all started at the end of 2019 when my friend Daniel (@danielcauser) told me that he was playing around with the DJI Tello Drone (see references). This drone is a small quadcopter that features a Vision Positioning System and an onboard camera.

The drone is a perfect balance between a toy and a capable device to play around with some code. The price is around USD 100, so it’s not very expensive, and the main specifications are very cool for a device at this price.

DJI Tello Specs

  • 720p Videos
  • 5MP Photos
  • Takeoff and Landing from Your Hand
  • Intel Processor
  • Programmable via Scratch SDK
  • Free Tello App with User-Friendly UI
  • Compatible with Bluetooth Controllers

DJI Tello Features

  • Collision detection system
  • Auto Takeoff
  • Auto Landing
  • Low Battery protection features
  • DJI’s flight stabilization technology

This is not a new device, so you may find several reviews online. I’ll keep these posts mostly focused on the programming side.

One of the key features of the device is its programming capabilities. DJI provides a couple of applications to interact with the drone, and also a Tello EDU app focused on education: an introduction to programming and controlling the drone using Scratch.

As part of this EDU package there is a Tello SDK PDF document (see references) where we can find the description of all the commands to control the drone. The communication is via UDP, so we only need to connect to the device’s WiFi and … start to have fun!

Happy coding!

Greetings

El Bruno

References

#RaspberryPi – Install Virtual Environments

Hi!

Virtual environments are a great way to isolate our dev tests, and after having used them on Windows, working with them on the Raspberry Pi makes a lot of sense.

I’ll leave the necessary steps here; however, full credit goes to some posts from Adrian Rosebrock and his amazing blog (see references).

Once we have everything updated in our Raspbian, let’s run the following commands:

sudo pip install virtualenv virtualenvwrapper
sudo rm -rf ~/get-pip.py ~/.cache/pip

Now we need to update the ~/.bashrc file using nano (I’m not a big fan of vi):

nano ~/.bashrc

And then add the following lines:

# virtualenv and virtualenvwrapper
export WORKON_HOME=$HOME/.virtualenvs
export VIRTUALENVWRAPPER_PYTHON=/usr/bin/python3
source /usr/local/bin/virtualenvwrapper.sh

Now every time we open a new terminal session, these commands will be applied and we will have our virtual environments up and running.

Next, source the ~/.bashrc file:

source ~/.bashrc

And we can create a new virtual environment. Let’s create one named devOpenCV using Python 3:

mkvirtualenv devOpenCV -p python3

We can enable and access the virtual environment with the workon command, and we will see the virtual environment name as a prefix in our terminal prompt:

workon devOpenCV

As we can see in the following screenshot, the virtual environment uses the latest Python 3 version and has just a few packages installed.

Happy coding!

Greetings

El Bruno

References

My posts on Raspberry Pi

Dev posts for Raspberry Pi
Tools and Apps for Raspberry Pi
Setup the device
Hardware

#RaspberryPi – Performance differences in #FaceRecognition using #OpenVino (code with @code!)

Hi!

I’ve been looking to use the amazing Intel Neural Compute Stick 2 for a while, and one of the first ideas I had was to check how fast my Raspberry Pi 4 can run using this device.

The Intel team released a nice step-by-step installation process for the Raspberry Pi, and it works great. There are a couple of minor glitches that you need to figure out, like the latest package version, but everything else works great.

Note: I downloaded my openvino toolkit from here (https://download.01.org/opencv/2019/openvinotoolkit/R3/), and the downloaded file is (l_openvino_toolkit_runtime_raspbian_p_2019.3.334.tgz).

Once installed, the first Python sample is a face detection one. This sample analyzes an image file using OpenCV to detect faces and creates a new output file with the detected faces. As I said, it’s very straightforward.

So, I decided to create a new Python sample to run live face detection using the camera feed and also display the FPS. This is the resulting code:

The code is very straightforward, and the main points are (a sketch follows this list):

  • It uses the face-detection-adas-0001 model (two files) from the Intel model zoo to perform the face detection: face-detection-adas-0001.xml and face-detection-adas-0001.bin
  • Lines 22 and 23 are key: they tell OpenCV to load and run the model on the Intel device
  • I use imutils to resize the image to 640×480. Feel free to use any other library for this, even OpenCV
  • It also works with smaller resolutions; however, 640×480 is good for this demo
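
A minimal sketch of that app, assuming the face-detection-adas-0001 model files are in the working folder (the setPreferableBackend / setPreferableTarget calls likely correspond to the “lines 22 and 23” remark above; the FPS overlay from the earlier posts is omitted to keep it short):

import cv2
import imutils

# load the model and tell OpenCV to run it on the Intel device (NCS2)
net = cv2.dnn.readNet('face-detection-adas-0001.xml',
                      'face-detection-adas-0001.bin')
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break
    frame = imutils.resize(frame, width=640)

    # the model expects a 672x384 BGR input blob
    blob = cv2.dnn.blobFromImage(frame, size=(672, 384), ddepth=cv2.CV_8U)
    net.setInput(blob)
    detections = net.forward()

    # each detection row: [image_id, label, conf, x_min, y_min, x_max, y_max]
    h, w = frame.shape[:2]
    for detection in detections.reshape(-1, 7):
        confidence = float(detection[2])
        if confidence > 0.5:
            x_min, y_min = int(detection[3] * w), int(detection[4] * h)
            x_max, y_max = int(detection[5] * w), int(detection[6] * h)
            cv2.rectangle(frame, (x_min, y_min), (x_max, y_max), (0, 255, 0), 2)

    cv2.imshow('@elbruno - Face Detection', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()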

And here is the final app running, analyzing almost 8 frames per second (8 FPS).

That is almost 10 times faster than the 0.7 FPS I got without the Intel NCS2.

I already wrote about running Visual Studio Code on the Raspberry Pi (see references); it’s an amazing experience. I did all my Python coding in VS Code, remotely accessing my device via VNC. Python runs like a charm!

You can download the code from https://github.com/elbruno/rpiopenvino/tree/master/facedetection

References

My posts on Raspberry Pi

Dev posts for Raspberry Pi
Tools and Apps for Raspberry Pi
Setup the device
Hardware