
Hi!

A team at Microsoft Research has released a new deep neural network for learning universal language embeddings: Multi-Task Deep Neural Networks for Natural Language Understanding (MT-DNN). Language embedding is the process of mapping the elements of a sentence to vector representations. This is what tools like LUIS (Language Understanding) rely on to analyze sentences and identify intents, entities, and more.
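
As a rough illustration of the idea (not code from the post), the toy PyTorch sketch below maps each token in a sentence to a vector, averages them into a single sentence embedding, and compares two sentences with cosine similarity, which is the kind of operation an intent classifier builds on. The vocabulary, dimensions, and sentences are made up for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy vocabulary and embedding table; a real system learns these from data
vocab = {"book": 0, "a": 1, "flight": 2, "to": 3, "toronto": 4, "cancel": 5, "my": 6}
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

def sentence_embedding(sentence: str) -> torch.Tensor:
    """Map each known token to a vector and average them into one sentence vector."""
    ids = torch.tensor([vocab[w] for w in sentence.lower().split() if w in vocab])
    return embedding(ids).mean(dim=0)

v1 = sentence_embedding("Book a flight to Toronto")
v2 = sentence_embedding("Cancel my flight")

# Cosine similarity between the two sentence vectors
print(F.cosine_similarity(v1, v2, dim=0).item())
```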

A complete description of MT-DNN can be found on the official Microsoft Research Blog. What I found interesting is that the approach incorporates BERT, the pre-trained bidirectional transformer language model developed by Google AI, and extends it with multi-task learning.
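
For the BERT part, here is a minimal sketch of pulling a sentence-level embedding out of a pre-trained BERT checkpoint. It uses the Hugging Face transformers library rather than the MT-DNN code itself, so treat it as one common way to do this, not the method described in the paper.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load a pre-trained BERT checkpoint (weights are downloaded on first use)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Book a flight to Toronto", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The [CLS] token's hidden state is commonly used as a sentence-level embedding
sentence_embedding = outputs.last_hidden_state[:, 0, :]
print(sentence_embedding.shape)  # torch.Size([1, 768])
```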

As usual, the code, based on PyTorch, is available on GitHub: https://github.com/namisan/mt-dnn. The repo contains the pre-trained models, the source code, and a README that describes, step by step, how to reproduce the results reported in the MT-DNN paper.

More information: Towards universal language embeddings

Greetings @ Toronto

El Bruno
