It turns out that I’ve been fortunate to participate, once again, in Interfaz: the podcast that my friend Rodrigo Diaz Concha runs and coordinates (link). This time, I talked about one of the coolest preview products we have in Azure: Azure Notebooks.
This product may sound odd to .NET developers; however, the power, productivity, and collaboration capabilities that Jupyter notebooks provide are something the Python community has long taken advantage of.
I’d better just leave the podcast link and hope you enjoy it (remember, it is in Spanish):
Interfaz Podcast Episodio 113 – Azure Notebooks con Bruno Capuano
I’ve been using Python and Jupyter notebooks more and more, and somewhere along this learning path I realized that I can use Visual Studio Code to build amazing Python apps, and also to edit and work with Jupyter notebooks.
If you are a Python developer using VSCode, you may already know some of the features available in the tool. I won’t describe them here, because the official documentation covers them very well (see the links in the references below).
The Python extension provides many features for editing Python source code
in Visual Studio Code:
However, during the past months I’ve also been working a lot with Jupyter notebooks, and I was very happy to realize that VSCode also has some cool features for working with them. The core of a notebook is the cell, and we can define cells in a Python file using the #%% prefix.
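For instance, a plain .py file becomes a notebook-style document just by adding #%% markers; the extension then shows a “Run Cell” link above each one. A minimal sketch:

```python
# %% First cell: everything down to the next marker runs as one unit
import math

radius = 3.0
area = math.pi * radius ** 2
print(f"area: {area:.2f}")

# %% Second cell: cells run independently but share the same session state
circumference = 2 * math.pi * radius
print(f"circumference: {circumference:.2f}")
```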
This is how it looks inside the IDE, running a cell in the code:
Another cool feature is the ability to run notebooks on a remote Jupyter server, maybe using Azure Notebooks.
I haven’t tried this one yet; it’s on my to-do list for the near future.
On top of adding cell features to standard Python [.py] files, we can also edit standard Jupyter files. I’ve installed Jupyter into one of my local Anaconda environments, and now I can edit notebook files inside VSCode.
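Under the hood, a Jupyter notebook file is just JSON, which is what makes this import/export workflow possible. A minimal sketch of the structure (nbformat 4):

```python
import json

# A minimal Jupyter notebook document: notebook files are plain JSON
notebook = {
    "nbformat": 4,
    "nbformat_minor": 2,
    "metadata": {},
    "cells": [
        {
            "cell_type": "code",
            "source": ["print('hello')"],
            "metadata": {},
            "outputs": [],
            "execution_count": None,
        },
    ],
}

# Round-trip it through JSON, exactly as an editor would read it from disk
text = json.dumps(notebook, indent=1)
parsed = json.loads(text)
print(len(parsed["cells"]), parsed["cells"][0]["cell_type"])
```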
First, I’m prompted to import the file as a standard Python file.
Now I have my Jupyter notebook inside VSCode.
The final step is to export my file or debug session, and for this we have the [Python: Export …] command.
After an amazing evening with my friends from the Metro Toronto UG, it’s time to share the materials I used during the session. The initial idea was to talk a bit about Azure Notebooks, and somehow we also ended up talking about Cognitive Services and Custom Vision. It was great!
To start, the 15-minute keynote video:
And some of the links I used during the session:
IMHO, one of the most important announcements presented last week at Ignite was the Azure preview of AutoML: Automated Machine Learning.
I’m not going to get into details about AutoML; the best option is to read the official post from the Azure Machine Learning team (see references). I’ll do my best to summarize: the objective of this new tool is to let you automatically identify the best pipeline for a machine learning scenario.
A pipeline comprises the basic steps of an ML process:
Working with the data: sorting, filtering, checking for nulls, labeling, etc.
Selecting a learning algorithm: SVM, Fast Tree, etc.
Defining features and labels, adjusting parameters, etc.
The [try / error / learn] loop in each of these steps helps us improve our model and get better results (better accuracy).
AutoML proposes an automated service that identifies the best combination of these steps to build a pipeline with the best possible accuracy. As always, an image rocks the explanation:
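The steps above can be sketched as a toy pipeline in plain Python. All names here are illustrative; a real pipeline would use a library such as scikit-learn or the Azure ML SDK:

```python
# Toy illustration of the ML pipeline stages described above
raw = [
    {"age": 25, "income": 40000, "label": 0},
    {"age": None, "income": 52000, "label": 1},  # row with a missing value
    {"age": 47, "income": 91000, "label": 1},
]

# 1. Work with the data: filter out rows with nulls
clean = [row for row in raw if all(v is not None for v in row.values())]

# 2. Define features and labels
X = [[row["age"], row["income"]] for row in clean]
y = [row["label"] for row in clean]

# 3. "Select" a learning algorithm: a trivial threshold rule standing in
#    for SVM, Fast Tree, etc.
def predict(features, threshold=50000):
    return 1 if features[1] > threshold else 0

# 4. Evaluate accuracy, the metric we try to improve on each iteration
accuracy = sum(predict(x) == label for x, label in zip(X, y)) / len(y)
print(f"accuracy: {accuracy}")
```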
Automated ML is available to try in the preview of Azure Machine Learning. We currently support classification and regression ML model recommendation on numeric and text data, with support for automatic feature generation (including missing values imputations, encoding, normalizations and heuristics-based features), feature transformations and selection. Data scientists can use automated ML through the Azure Machine Learning Python SDK and Jupyter notebook experience. Training can be performed on a local machine or by leveraging the scale and performance of Azure by running it on Azure Machine Learning managed compute. Customers have the flexibility to pick a pipeline from automated ML and customize it before deployment. Model explainability, ensemble models, full support for Azure Databricks and improvements to automated feature engineering will be coming soon.
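The core search idea can be illustrated with a toy sweep over algorithm/parameter combinations. This is purely illustrative, in plain Python; the real service explores the pipeline space far more intelligently than brute force:

```python
from itertools import product

# Toy dataset: the true rule is y = 1 when x > 5
data = [(x, 1 if x > 5 else 0) for x in range(10)]

# Candidate "pipeline" choices: a preprocessing step and a threshold parameter
preprocessors = {"identity": lambda x: x, "double": lambda x: 2 * x}
thresholds = [2, 5, 8]

def accuracy(prep, threshold):
    """Score one candidate pipeline against the dataset."""
    return sum((1 if prep(x) > threshold else 0) == y for x, y in data) / len(data)

# Try every combination and keep the best, the way AutoML (much more
# cleverly) searches for the pipeline with the best possible accuracy
best = max(product(preprocessors, thresholds),
           key=lambda c: accuracy(preprocessors[c[0]], c[1]))
print(best, accuracy(preprocessors[best[0]], best[1]))
```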
From here, I strongly recommend reading the official documentation, which explains AutoML in detail. Also, if you are familiar with Jupyter notebooks, in a few seconds you can clone and access a library with a tutorial to try AutoML from scratch. You need to clone the repo from https://github.com/Azure/MachineLearningNotebooks
The tutorial is pretty straightforward, and with few Azure resources you can see how to optimize a classification model with AutoML.
Although only classification and regression models are supported for now, AutoML is a tool to keep in mind when you start working in ML.