#Python – The best way to explain how Jupyter notebooks work with Visual Studio Code @Code

Hi!

So, after yesterday's post [Edit and work with Jupyter notebooks in Visual Studio Code], today some people asked me how the Jupyter notebooks and Python integration works.

The best way to explain this is with a simple animated video showing the following actions (a small code sketch of the cells follows the list):

  • Create a cell using the prefix # %%
  • Run the cell and display the output in Python Interactive
  • Create a new cell
  • Run the new cell and the previous one
  • Analyze output in Python Interactive
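To make this a bit more concrete, here is a rough sketch of my own (not taken from the video) of what such a file could look like; the data is invented, and the only relevant part is the # %% cell markers:

    # %%
    # First cell: build a small DataFrame (assumes pandas is installed).
    import pandas as pd

    df = pd.DataFrame({"name": ["Ada", "Grace"], "score": [95, 98]})
    df

    # %%
    # Second cell: reuse the DataFrame created by the previous cell.
    # Running this cell after the first one shows the mean in Python Interactive.
    df["score"].mean()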

I think these 15 seconds are enough to understand the benefits of Jupyter notebooks in Visual Studio Code.

Happy Coding!

Greetings @ NY

El Bruno

References


#VSCode – Edit and work with #jupyter notebooks in Visual Studio Code

Hi!

I’ve been using Python and Jupyter notebooks more and more. And somehow, during this learning path, I also realized that I can use Visual Studio Code to code amazing Python apps, and also to edit and work with Jupyter notebooks.

If you are a VSCode Python developer, you may already know some of the features available in the tool. I won't describe them here, because the official documentation covers them very well (see the links in the references below).

The Python extension provides many features for editing Python source code in Visual Studio Code.

However, during the past months I've also been working a lot with Jupyter notebooks, and I was very happy when I realized that VSCode also has some cool features to work with notebooks. The core of a notebook is its cells, and we can define them in a Python file using the #%% prefix.

This is how it looks inside the IDE, running a cell in the code
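Since the animation doesn't translate well into text, here is a minimal cell of my own as it would appear in a .py file; the Python extension shows a "Run Cell" link above the marker, and the value of the last expression is displayed in the Python Interactive window:

    #%%
    # A minimal cell: click "Run Cell" above this marker to execute it.
    message = "Hello from a VSCode cell"
    message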

Another interesting feature is running notebooks on a remote Jupyter server, maybe using Azure Notebooks. I haven't tried this one yet; it's on my ToDo list for the near future.

On top of adding cell features to standard Python [.py] files, we can also edit standard Jupyter files. I've installed jupyter in one of my local Anaconda environments, and now I can edit notebook files inside VSCode.

First, I'll be prompted to import the file as a standard Python file

And, done! Now I have my Jupyter notebook inside VSCode

The final step will be to export my file or debug session, and for this we have the [Python: Export …] command

Super useful!

Happy coding!

Greetings @ NY

El Bruno

References

#Event – Resources used during #GlobalAINight with the friends from @metrotorontoUG

 

Hi!

After a great night with the friends from Metro Toronto UG, it's time to share the resources I used during the session. The initial idea was to talk a bit about Azure Notebooks, and somehow we ended up also talking about Cognitive Services and Custom Vision; it was great!

To start, the 15-minute Keynote video:

My Slides

And some of the links I used during the session

My posts on Custom Vision and ONNX

  1. Object recognition with Custom Vision and ONNX in Windows applications using WinML
  2. Object recognition with Custom Vision and ONNX in Windows applications using WinML
  3. Object recognition with Custom Vision and ONNX in Windows applications using Windows ML, drawing frames
  4. Object recognition with Custom Vision and ONNX in Windows applications using Windows ML, calculate FPS
  5. Can’t install Docker on Windows 10 Home, need Pro or Enterprise
  6. Running a Custom Vision project in a local Docker Container
  7. Analyzing images in a Console App using a Custom Vision project in a Docker Container
  8. Analyzing images using PostMan from a Custom Vision project hosted in a Docker Container
  9. Building the CustomVision.ai project in Docker in a RaspberryPi
  10. Container dies immediately upon successful start in a RaspberryPi. Of course, it’s all about TensorFlow dependencies
  11. About ports, IPs and more to access a container hosted in a Raspberry Pi
  12. Average response times using a CustomVision.ai docker container in a RaspberryPi and a PC

Windows 10 and YOLOV2 for Object Detection Series

Greetings @ Toronto

El Bruno

#Event – Resources used during the #GlobalAINight @metrotorontoUG

 

Hi!

After an amazing event with my friends from Metro Toronto UG, it's time to share some resources. The session was initially supposed to be focused only on Azure Notebooks, but somehow we spent a lot of time talking about Cognitive Services and Custom Vision; that was great!

Let's start with the 15-minute Keynote video:

My Slides

And some interesting online resources

My posts on Custom Vision and ONNX

  1. Object recognition with Custom Vision and ONNX in Windows applications using WinML
  2. Object recognition with Custom Vision and ONNX in Windows applications using WinML
  3. Object recognition with Custom Vision and ONNX in Windows applications using Windows ML, drawing frames
  4. Object recognition with Custom Vision and ONNX in Windows applications using Windows ML, calculate FPS
  5. Can’t install Docker on Windows 10 Home, need Pro or Enterprise
  6. Running a Custom Vision project in a local Docker Container
  7. Analyzing images in a Console App using a Custom Vision project in a Docker Container
  8. Analyzing images using PostMan from a Custom Vision project hosted in a Docker Container
  9. Building the CustomVision.ai project in Docker in a RaspberryPi
  10. Container dies immediately upon successful start in a RaspberryPi. Of course, it’s all about TensorFlow dependencies
  11. About ports, IPs and more to access a container hosted in a Raspberry Pi
  12. Average response times using a CustomVision.ai docker container in a RaspberryPi and a PC

Windows 10 and YOLOV2 for Object Detection Series

Greetings @ Toronto

El Bruno

#Event – Resources used on the [#ArtificialIntelligence and #MachineLearning in #Azure] event


Hi!

Let me start with a big thanks to my friends at [The Azure Group (Canada's Azure User Community)] for all the work and the amazing time at my session [Artificial Intelligence and Machine Learning in Azure].

As usual, now it's time to share resources. This time it will be slides and tons of links; the source code was too basic to even push to GitHub.

And some interesting links

Windows 10 and YOLOV2 for Object Detection Series

Happy coding!

Greetings @ Burlington

El Bruno

 

 

#Event – Resources used during the [#ArtificialIntelligence and #MachineLearning in #Azure] session


Hi!

Thanks to the friends from [The Azure Group (Canada's Azure User Community)] for the excellent time a couple of days ago at the [Artificial Intelligence and Machine Learning in Azure] session.

As always, now it's time to share the slides and resources used during the session

And this time, instead of code, a long list of resources

Windows 10 and YOLOV2 for Object Detection Series

Happy coding!

Greetings @ Burlington

El Bruno

 

 

#AutoML – Automated Machine Learning, AKA #Skynet

Hi!

IMHO, one of the most important announcements presented last week at Ignite was the Azure preview of AutoML: Automated Machine Learning.

I'm not going to get into details about AutoML; the best option is to read the official post from the Azure Machine Learning team (see references). My best attempt at a summary: the objective of this new tool is to let you automatically identify the best pipeline for a machine learning scenario.

A pipeline comprises the basic steps of an ML process:

  • Working with data: sorting, filtering, checking for nulls, labeling, etc.
  • Selecting a learning algorithm: SVM, Fast Tree, etc.
  • Defining features and labels, adjusting parameters, etc.

The [try / error / learn] cycle in each of these steps helps us improve our model and get better results (better accuracy).
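As a point of reference (a sketch of my own, not part of the AutoML announcement), this is roughly what that manual loop looks like with scikit-learn: prepare the data, pick an algorithm, and tune its parameters by trial and error:

    # A manual pipeline sketch, for comparison with what AutoML automates.
    from sklearn.datasets import load_iris
    from sklearn.impute import SimpleImputer
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Step 1: work with the data (impute missing values, normalize).
    # Step 2: select a learning algorithm (here, an SVM).
    pipeline = Pipeline([
        ("impute", SimpleImputer(strategy="mean")),
        ("scale", StandardScaler()),
        ("model", SVC()),
    ])

    # Step 3: adjust parameters by trial and error (a small grid search).
    search = GridSearchCV(pipeline, {"model__C": [0.1, 1, 10]}, cv=3)
    search.fit(X_train, y_train)
    print("Best parameters:", search.best_params_)
    print("Test accuracy:", search.score(X_test, y_test))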

AutoML proposes an automated service that identifies the best combination to create a pipeline with the best possible accuracy. As always, an image helps the explanation:

[Image: AutoML process]

Official description

Automated ML is available to try in the preview of Azure Machine Learning. We currently support classification and regression ML model recommendation on numeric and text data, with support for automatic feature generation (including missing values imputations, encoding, normalizations and heuristics-based features), feature transformations and selection. Data scientists can use automated ML through the Azure Machine Learning Python SDK and Jupyter notebook experience. Training can be performed on a local machine or by leveraging the scale and performance of Azure by running it on Azure Machine Learning managed compute. Customers have the flexibility to pick a pipeline from automated ML and customize it before deployment. Model explainability, ensemble models, full support for Azure Databricks and improvements to automated feature engineering will be coming soon.

From here on, I strongly recommend reading the official documentation, which is where AutoML is explained in detail. Also, if you are familiar with Jupyter notebooks, in a few seconds you can clone and access a library with a tutorial to try AutoML from zero. You need to clone the repo from https://github.com/Azure/MachineLearningNotebooks

[Image: AutoML Jupyter Notebooks]

The tutorial is pretty straightforward, and with few Azure resources you can see how a classification model is optimized with AutoML.

[Image: AzureML local tutorial]
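For reference, this is roughly what a local AutoML classification run looks like with the preview Azure ML Python SDK. Treat it as a sketch: parameter names may differ between SDK versions, and the notebooks in the repo above are the authoritative source.

    # Sketch of a local AutoML run (preview azureml-sdk; check the tutorial
    # notebooks for the exact, up-to-date API).
    from azureml.core import Workspace, Experiment
    from azureml.train.automl import AutoMLConfig
    from sklearn.datasets import load_digits

    ws = Workspace.from_config()          # reads config.json with your Azure subscription details
    X, y = load_digits(return_X_y=True)   # any labeled dataset works here

    automl_config = AutoMLConfig(
        task="classification",            # classification and regression are supported
        primary_metric="accuracy",        # the metric AutoML optimizes
        X=X,
        y=y,
        iterations=10,                    # number of pipelines to try
        n_cross_validations=3,
    )

    experiment = Experiment(ws, "automl-local-classification")
    run = experiment.submit(automl_config, show_output=True)

    # AutoML returns the best pipeline it found, ready to use or customize.
    best_run, fitted_model = run.get_output()
    print(fitted_model)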

Although for now only classification and regression models are supported, AutoML is a tool to keep in mind when you start working in ML.

See you at the event, Happy Coding!

Greetings @ Toronto

El Bruno

References

#AutoML – Automated Machine Learning, #MachineLearning models that learn to optimize themselves! (in the movies it's called #skynet)

Hi!

One of the most important announcements presented last week at Ignite was the preview of Azure AutoML: Automated Machine Learning.

The best way to get into the details of AutoML is to read the official post from the Azure Machine Learning team (see references). I'll try to summarize it as a new framework that automatically identifies the best pipeline to work with your data.

A pipeline comprises the basic steps of an ML process:

  • Working with data: sorting it, removing nulls, labeling it, etc.
  • Selecting a learning algorithm: SVM, Fast Tree, etc.
  • Defining features and labels, adjusting parameters, etc.

The [try / error / learn] cycle in each of these steps defines the accuracy our final model will have.

AutoML proposes an automated service that identifies the best combination to create a pipeline with the best possible accuracy. As always, an image helps the explanation:

[Image: AutoML process]

And the official description:

Automated ML is available to try in the preview of Azure Machine Learning. We currently support classification and regression ML model recommendation on numeric and text data, with support for automatic feature generation (including missing values imputations, encoding, normalizations and heuristics-based features), feature transformations and selection. Data scientists can use automated ML through the Azure Machine Learning Python SDK and Jupyter notebook experience. Training can be performed on a local machine or by leveraging the scale and performance of Azure by running it on Azure Machine Learning managed compute. Customers have the flexibility to pick a pipeline from automated ML and customize it before deployment. Model explainability, ensemble models, full support for Azure Databricks and improvements to automated feature engineering will be coming soon.

From here on, I recommend reading the official documentation, which is where AutoML is explained in detail.

If you are familiar with Jupyter notebooks, in a few seconds you can access a very complete tutorial just by cloning a library from https://github.com/Azure/MachineLearningNotebooks

[Image: AutoML Jupyter Notebooks]

The tutorial is pretty simple, and with few Azure resources you can see how a classification model is optimized with AutoML

[Image: AzureML local tutorial]

Although for now only classification and regression models are supported, AutoML is a tool to keep in mind when you start working in ML.

See you at the event, happy coding!

Greetings @ Toronto

El Bruno

References