#MSIgniteTheTour Sydney Summary

Rebecca Jackson

I went to my first ever big Microsoft event, Microsoft Ignite the Tour, which landed in Sydney on February 13-14, 2020. Here is my summary and a video that Megan Strant and I recorded.



#Coding4Fun – How to control your #drone with 20 lines of code! (9/N)


Hi!

Let’s take some frames-per-second (FPS) measurements on the UDP and OpenCV connection. It seems that, working with simple movements, the values move between 30 and 60 FPS.

showing FPS information with the drone camera

I just added a couple of lines in the main while loop to calculate the FPS.

# main loop: read the battery, grab a camera frame and overlay the FPS value
i = 0
while True:
    i = i + 1
    start_time = time.time()

    # sendReadCommand() is the UDP command helper from the earlier posts in this series
    battery = sendReadCommand('battery?')
    print(f'battery: {battery} % - i: {i}')

    try:
        ret, frame = cap.read()
        img = cv2.resize(frame, (640, 480))

        if (time.time() - start_time) > 0:
            # FPS = 1 / time to process this loop iteration
            fpsInfo = "FPS: " + str(1.0 / (time.time() - start_time))
            font = cv2.FONT_HERSHEY_DUPLEX
            cv2.putText(img, fpsInfo, (10, 20), font, 0.4, (255, 255, 255), 1)

        cv2.imshow('@elbruno - DJI Tello Camera', img)
    except Exception as e:
        print(f'exc: {e}')

    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

As a final note, I just need to mention that I ran some tests using different camera resolutions, and the FPS averages were similar. I tested with 640×480 pixels and 1024×768 pixels.

In the next posts, let’s do some face detection and rock some AI with the drone!

Happy coding!

Greetings

El Bruno

References

My Posts

Office 365: Pinning apps to the Microsoft Teams app bar (II)!

#Coding4Fun – How to control your #drone with 20 lines of code! (8/N)


Hi!

Now that I have started to understand how UDP works, I also did some research to find the best options to access a UDP video feed. Luckily for me, there are plenty of resources on doing this with my old friend OpenCV.

Most of the OpenCV documentation is written for C++. However, in the end it all comes down to these basic lines of code:

import cv2

# open the UDP video feed
videoUDP = 'udp://192.168.10.1:11111'
cap = cv2.VideoCapture(videoUDP)

# read a frame from the feed and resize it
ret, frame = cap.read()
img = cv2.resize(frame, (320, 240))

# display the frame in an OpenCV video window
# (in a loop, cv2.waitKey() is also needed so the window refreshes)
cv2.imshow('Video', img)


Note: In the references section below, I share some of my posts about my experiences installing OpenCV on Windows 10.

Let’s go back to our sample Python app. Starting from the previous sample that displays the battery level, I changed the code to run continuously and display the video feed in a 320×240 window.

The following video shows how fast this works:

And of course, here is the complete code with these notes (a minimal sketch of how the steps fit together follows the list):

  • Lines 96-100. Open the video feed and wait 2 seconds
  • Lines 104-118. Main app:
    • Get and display the battery level
    • Get a UDP video frame
    • Resize the frame to 320×240
    • Display the frame
    • When the Q key is pressed, exit the app
  • Line 121. Close the video stream
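
Since the embedded gist is not shown here, this is a minimal sketch of how those steps might fit together, reusing the sendReadCommand() helper and the UDP video URL from the earlier posts in this series (the line numbers will not match the original gist):

import time
import cv2

# open the video feed and wait 2 seconds for the stream to start
videoUDP = 'udp://192.168.10.1:11111'
cap = cv2.VideoCapture(videoUDP)
time.sleep(2)

# main app
while True:
    # get and display the battery level (sendReadCommand() comes from the earlier posts)
    battery = sendReadCommand('battery?')
    print(f'battery: {battery} %')

    try:
        # get a UDP video frame and resize it to 320x240
        ret, frame = cap.read()
        img = cv2.resize(frame, (320, 240))

        # display the frame
        cv2.imshow('@elbruno - DJI Tello Camera', img)
    except Exception as e:
        print(f'exc: {e}')

    # when the Q key is pressed, exit the app
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# close the video stream
cap.release()
cv2.destroyAllWindows()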

Happy coding!

Greetings

El Bruno

References

My Posts

Understanding Ingress Controllers and Azure App Gateway for Azure Kubernetes Part 2: AGIC

Roy Kim on Azure, Office 365 and SharePoint

The previous part 1 blog post went over the fundamental concepts of ingress and ingress controllers. This part 2 post builds on those concepts and gives a review of the App Gateway Ingress Controller (AGIC).

First of all, what happens when you deploy AKS with its default settings?

The default AKS deployment through the Azure Portal has its network-related configuration set with HTTP application routing disabled.

The HTTP application routing add-on makes it really easy to set up an ingress controller so that you can access your deployed apps in AKS. However, it is not recommended for production deployments.

Essentially, there is no real ingress controller set up in a default AKS deployment, so you have to install one. A popular approach is NGINX (i.e. running in the cluster); you can read about it in Create an ingress controller in Azure Kubernetes Service (AKS). This will work…


#Event – Let's rock some #AI and #ComputerVision at @devdotnext #devdotnext2020


Hi!

In a couple of weeks, I’ll be attending one of the biggest events in Broomfield, Colorado: @devdotnext.

DevDotNext

DevDotNext hosts 150+ 75-minute presentations, 4 keynotes/panels, and 11 all-day pre-conference workshops.

The schedule is available at https://www.devdotnext.com/schedule, with some of these amazing topics:

  • Languages
  • Design and Architecture
  • Cloud
  • Server-Side
  • Frontend
  • DevOps
  • Microservices
  • Machine Learning
  • Testing
  • Being agile
  • Leadership
  • And more

I’ll be sharing some experiences and insights around Machine Learning, Computer Vision and IoT.

Registration and event details

Hurry up, regular registration ends soon.
Register at https://www.devdotnext.com/register

Hope to see you there. Use coupon code LEARNWITHME.

Happy coding!

Greetings

El Bruno

Understanding Ingress Controllers and Azure App Gateway for Azure Kubernetes Part 1: Intro

Roy Kim on Azure, Office 365 and SharePoint

I will share my experiences with a design and implementation of Azure Application Gateway for an Azure Kubernetes Service (AKS) cluster. This is so that you may get some practical insight as you plan and design your own use of the Azure App Gateway.

In this blog series, I will go over:

  1. Fundamental Ingress concepts
  2. Architecture and deployment components
  3. My review, observations and insights

Typical Use Case: For an application that is required to be internet-facing, it is recommended to have a networking appliance that handles ingress traffic into your application, where you can manage web security, networking and HTTP/layer 7 traffic.

As a result, a valuable option is to leverage the Azure Application Gateway as part of your overall Azure Kubernetes system architecture.

AKS Architecture

My implementation was based on the following documentation, so read that article to further understand the background details.

The fundamentals of an Ingress and an…


#Coding4Fun – How to control your #drone with 20 lines of code! (7/N)


Hi!

No code today, mostly because I spent a decent amount of time trying to understand how the DJI Tello camera feed works.

In order to access the camera feed remotely, we need to perform two steps. First we need to send the command “command” to the drone, and then the command “streamon” to enable the video stream. Of course, there is also a command to stop the stream: “streamoff”.

In the following sample, I enable the camera feed, keep it live for 90 seconds, and then disable it.
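
The embedded sample is not shown here, but this is a minimal sketch of that flow. It assumes the commands are sent as plain UTF-8 text over UDP to the drone’s command port (192.168.10.1:8889 per the Tello SDK; the port is not mentioned in this post, so treat it as an assumption):

import socket
import time

# the Tello listens for text commands on this address (SDK command port, an assumption here)
tello_address = ('192.168.10.1', 8889)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', 9000))  # any free local port works for receiving the replies

def send_command(message):
    # commands are plain UTF-8 text sent over UDP; the drone answers "ok" or "error"
    sock.sendto(message.encode('utf-8'), tello_address)
    response, _ = sock.recvfrom(1024)
    print(f'{message}: {response.decode("utf-8")}')

send_command('command')    # enter SDK mode
send_command('streamon')   # enable the video stream on udp://192.168.10.1:11111
time.sleep(90)             # keep the camera feed live for 90 seconds
send_command('streamoff')  # disable the video stream
sock.close()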

OK, once I got this working, I needed to spend some time trying to figure out how to consume this feed. Based on the SDK details, I realized that the URL to access the video feed is:

udp://192.168.10.1:11111

First, I tried to access the UDP feed using VLC; however, it didn’t work. So I did a little research and found that I could use FFmpeg for this. In case you don’t know about FFmpeg:

FFmpeg is the leading multimedia framework to decode, encode, transcode, mux, demux, stream, filter and play. All builds require at least Windows 7 or Mac OS X 10.10. Nightly git builds are licensed as GPL 3.0, and release builds are licensed as GPL 3.0 and LGPL 3.0. LGPL 3.0 release builds can be found using the “All Builds” links.

FFmpeg Builds (see references)

I downloaded the latest FFmpeg build and ran the following command locally:

.\ffplay.exe -i udp://192.168.10.1:11111

And after a couple of seconds, I got my drone video feed displayed locally. There is a huge delay between the real action and the camera feed, so there are some improvement opportunities here.

drone camera video feed using ffmpeg and udp

The video is at 3X speed, and I skipped the initial comments and setup. The main idea was to access the video feed, and that’s done. In the next posts, I’ll try to use OpenCV to work with the feed and maybe process and display each frame independently.

Important: when you run the ffplay command, it will show some scary output, and then we will get a nice window with the camera feed. This is the PowerShell output:

Happy coding!

Greetings

El Bruno

References

#Coding4Fun – How to control your #drone with 20 lines of code! (6/N)


Hi!

Today is code time again! This is a continuation of my previous sample.

Yesterday I showed how to read a static value: the battery level. When you work with a device like a drone, there are other important values to analyze in order to send commands to the drone, like altitude, position, time of flight, etc.

So, based on yesterday’s sample, I’ll show how to create a simple Python app that displays the accelerometer X value (agx). In the following video you can see how the value stays “static” during the first couple of seconds, until I pick up the device and move it around.

Important: don’t blame me for the low battery. Playing with the drone drains the battery very fast!

Once again, the code is very straightforward. It runs a loop for 10 seconds, showing the battery and accelX (agx) information.

Just as a reminder, this is the information we get from the drone:

pitch:0;roll:1;yaw:0;vgx:0;vgy:0;vgz:0;templ:79;temph:82;tof:10;h:0;bat:39;baro:50.42;time:0;agx:-8.00;agy:-17.00;agz:-999.00

As you can see, all the information is condensed into a single line, and we can split it to get the following fields (a parsing sketch follows the list):

  • pitch
  • roll
  • yaw
  • vgx
  • vgy
  • vgz
  • templ (temperature low)
  • temph (temperature high)
  • tof (time of flight)
  • h (height)
  • bat (battery)
  • baro (barometer)
  • time
  • agx
  • agy
  • agz
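
The embedded code is not shown here, but this is a minimal sketch of what that 10-second loop could look like. It assumes the drone pushes the state string above to local UDP port 8890 (per the Tello SDK; this post does not list the port, so treat it as an assumption):

import socket
import time

# the Tello broadcasts its state string to this local port (assumption: 8890 per the SDK)
state_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
state_sock.bind(('', 8890))

def parse_state(raw):
    # "pitch:0;roll:1;...;bat:39;...;agx:-8.00;..." -> dict of field name to value
    fields = {}
    for pair in raw.strip().strip(';').split(';'):
        key, _, value = pair.partition(':')
        fields[key] = value
    return fields

# run the loop for 10 seconds, showing the battery and accelX (agx) values
start = time.time()
while (time.time() - start) < 10:
    raw, _ = state_sock.recvfrom(1024)
    state = parse_state(raw.decode('utf-8'))
    print(f"battery: {state['bat']} % - agx: {state['agx']}")

state_sock.close()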

Happy coding!

Greetings

El Bruno

References