[# VS2010] New expiration date for the Visual Studio 2010 ALM virtual machines



For those who are still using the Visual Studio 2010 ALM virtual machine for demos and testing, the good news is that its expiration date has been extended from April 9, 2012 to September 10, 2012. :)

Download details can be found at https://elbruno.com/2012/03/02/vs1-maquinas-virtuales-para-probar-visual-studio-11-alm-y-visual-studio-2010-alm/

Saludos @ Home

El Bruno


Source: http://blogs.msdn.com/b/briankel/archive/2012/03/27/updated-visual-studio-2010-alm-virtual-machine.aspx

[# ALM] How 30 minutes a day can change your life



Today's post is self-help. I've always thought that self-help books are made for people who don't fit in or aren't happy with what they have. These books describe a series of problems that most likely nobody actually has, but people identify with them and... boom! An editorial hit: millions of people buying a book that basically repeats the lessons your parents gave you when you were a teenager.

Although of course, it is better to shell out €10 for a book that will help you than to take advantage of the wisdom given to you for free out there.

In my case, the 30 minutes is something that took me 35 years to learn. Some time ago I wrote about how I customized my own version of GTD, working with 25-minute pomodoros and managing everything with OneNote (link). Over time, dedicating 25- or 30-minute periods of exclusive focus to a single subject has helped me achieve things I previously could not.

But the most important thing I've learned is that in these small blocks of time I can achieve a lot, if I learn to split the tasks I take on into small blocks. What is easy to write in 3 lines turns out to be far more complicated to put into practice. But over time you learn to distinguish the big stones from the small stones; and in the end all you are left with are small stones.

Note: don't you know the story of the rocks and the lake at Toyota? Well, you should (link)

This reminds me that my father, ever since I was a kid, always told me that the best way to start a journey is to take the first step. Or that if I have a very big problem in front of me, the best thing is to break it into small problems, and then solve them one by one. Come on: it was free advice I could have had for nothing, and after studying ALM for more than 10 years, I'm finally grasping his idea.

Note 2: my father is an engineer and a crack, in case you didn't know (link). In the next photo you can see how the next generation has also understood that things sink in better with hammer blows to the head :)


So, what should I do with this to improve my development lifecycle? Apply common sense and devote 30 minutes a day to finding answers to the following questions:

  • Am I developing quality software?
  • What can I do to improve the quality of my development process?
  • Am I working with the best tools?
  • Do I know how to work with these tools to be more productive?
  • Am I adding value to my business?
  • etc.

These questions seem trivial, but the last one, for example, is one of the points we computer people neglect the most. Many times we think the killer technological solution is the best we can do, when in reality something simple and concrete provides more value than we need. That said, never forget that "technical debt" (link) can destroy a project little by little.

To summarize:

Dedicate 30 minutes a day to thinking about how to improve your way of working.

Then dedicate another 30 minutes to implementing those improvements.

Finally, have a 30-minute session to assess whether those improvements are actually reflected in your day-to-day work.

Beware: this is not a 90-minutes-in-one-day task. Spread it out, and keep thinking about how to improve over time… but keep thinking!

I do it, and I also dedicate a daily pomodoro to:

  • Doing a CodeKata, so as not to forget the foundations of programming
  • Reading a bit about technology, trends, etc., so as not to miss the train

Saludos @ La Finca

El Bruno


[# VS11] DemoMates presentations for Visual Studio 11 ALM available for download



Those who usually give presentations of Microsoft ALM tools know the power of DemoMate. DemoMate is a Silverlight client that lets us replay, step by step, a scene recorded from interaction with a computer.

That is, like a video, but much more interactive, and one where you know the demo cannot fail. The only thing to bear in mind is that DemoMate is a paid product, so if we want to record demos of our own we must first purchase a license.

Well, the great Brian Keller steps up with a solution: he leaves us several pre-recorded demos of Visual Studio 11 and Team Foundation 11 that show the main capabilities of both products.

You can download the demos from here, and obviously, for live demonstrations, the virtual machine from here.

You can also view them online.

Saludos @ Home

El Bruno


References: http://blogs.msdn.com/b/briankel/archive/2012/03/15/visual-studio-11-beta-alm-demomates-now-available.aspx


[# CODEMOTION] Materials from the #KinectSdk session and my time at CodeMotion.es



Yesterday I was fortunate to take part in the first CodeMotion Spain with a presentation alongside @vgaltes. Overall the event was great: many people attended, there was a lot of networking, many familiar faces, and I was lucky enough to finally meet more than one online acquaintance in person.

It was also the ideal excuse to learn a bit about tools and technologies outside my scope, and I have to admit that Groovy and Grails were a big surprise for me.

Finally, for the last session I had the choice of going to see Luis Fraile or David Bonilla. Since I already know Luis, and I had never seen @David_Bonilla live though I knew of his reputation, I went to his session. The truth is he's a crack: on the one hand, a display of resources like I hadn't seen in a long time; on the other, an approachable person with his ideas well in place.

As for the Kinect session with Vicenç, it went well for us. We threw in a couple of lines of code and started showing examples of Kinect applications, where the best we managed was to get a couple of smiles. Poor Vicenç had to endure the chatter of an Argentine, and also a laptop he wasn't used to, but he stepped up like a crack!

I'll keep the anecdote that the room for our session was so full that, while struggling to get in, I had the following conversation with a guy standing in front of me:

Bruno – Can you let me through?

Guy – There's no more room.

Bruno – But @vgaltes and I are the speakers.

Guy – Ahh, in that case you can go in :)

We almost didn't get in!



Code Motion KinectSdk

And the source code examples can be downloaded from https://skydrive.live.com/redir.aspx?cid=bef06dffdb192125&SPL=BEF06DFFDB192125!3904&parid=BEF06DFFDB192125!3842&authkey=!AHlC-AoSBzrGWnA

[# CODEPLEX] Now also supports GIT



Although it has already been covered in several places, it is worth highlighting the news.

CodePlex now supports the creation of projects based on Git.


Seen from the outside it doesn't seem like such a big deal, but if we read between the lines there are two interesting things to highlight.

Firstly, this change is not based on a strategic decision by MS; it was driven by the large number of requests made by the community. Beware, we must not be naive: this does not mean that MS will now start doing whatever the community says, nor should we think they gain nothing from this change. But it is important to note that including software of this kind within Microsoft's community code-management platform opens up many possibilities.

On the one hand, it helps improve the Visual Studio family of products. VS11 and TFS11 are fine, but compared with the fluidity of work that a DVCS like Git provides, there's no contest. So behind this decision there is a desire to guide developers toward a new way of working (already included in VS).

In addition, this change aims to bring into CodePlex the large set of developers who currently use Git for their projects.

Secondly, don't lose sight of the fact that MS is incorporating open-source software into its platform. This is not the first time, nor will it be the last, but it serves as a reference for how powerful the idea of open source is for MS teams; something not very visible from the outside, where only Microsoft's boxed-software-selling facet tends to be noticed.

In the long run we'll see whether it ends up being a copy of GitHub or of other, more advanced Git-based sites. What's important not to lose sight of is that CodePlex doesn't just offer support for a source-control repository; it adds several more possibilities… I'll leave it there.

Saludos @ Home

El Bruno





[# TFS11] Microsoft Visual Studio Team Foundation Server 11 Beta Power Tools



I say it and I repeat it:

You cannot work with Visual Studio without having ReSharper

The phrase applies equally to Team Foundation:

You can not work with Team Foundation Server without the TFS Power Tools

Visual Studio 11 and Team Foundation 11 are in beta, but that has not stopped the guys in Redmond from getting their act together and shipping a version of the Power Tools specific to TFS11.

The Power Tools do not include anything new or any new tool (for now ;)). In them you will find the classics: Team Process Editor, Best Practices Analyzer, etc.

This is a good one.


Saludos @ Home

El Bruno


Download: http://visualstudiogallery.msdn.microsoft.com/27832337-62ae-4b54-9b00-98bb4fb7041a

[# KINECTSDK] HowTo: Paint a skeleton



A couple of days ago, in this post, I was asked how to paint a skeleton with the new Kinect SDK. Today's post explains the basics, in a few steps, of painting the skeleton.

For this example we will use a WPF window, to which we add a Canvas where the skeleton will be drawn.

    <Window x:Class="KinectSkeleton01.MainWindow"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            Title="MainWindow" Height="480" Width="640">
        <Grid>
            <Canvas Name="skeletonCanvas"
                    Height="480" Width="640"
                    HorizontalAlignment="Center"/>
        </Grid>
    </Window>

The next thing to keep in mind is working with the Kinect sensor as a local variable of the window. In this post (link) I talk a little about this.

Once the state of the Kinect is under control, the next step is to initialize skeleton capture (lines 6 and 7) and subscribe to the skeleton frame-ready event (line 8).

In the handler for this event, we first clear the canvas (line 14) and, once the received frame has been validated (line 19), copy the skeleton array to a local variable (lines 21 and 22).

The closing lines check the state of the head Joint to see whether tracking is correct, and then use an ElBruno.Kinect helper to paint the skeleton.

   1: void MainWindowLoaded(object sender, RoutedEventArgs e)
   2: {
   3:     if (KinectSensor.KinectSensors.Count == 0)
   4:         return;
   5:     _kinect = KinectSensor.KinectSensors[0];
   6:     _kinect.SkeletonStream.Enable();
   7:     _kinect.Start();
   8:     _kinect.SkeletonFrameReady += KinectSkeletonFrameReady;
   9: }
  10:
  11: void KinectSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
  12: {
  13:     // Remove the old skeleton
  14:     skeletonCanvas.Children.Clear();
  15:     Skeleton[] skeletons = null;
  16:
  17:     using (var frame = e.OpenSkeletonFrame())
  18:     {
  19:         if (frame != null)
  20:         {
  21:             skeletons = new Skeleton[frame.SkeletonArrayLength];
  22:             frame.CopySkeletonDataTo(skeletons);
  23:         }
  24:     }
  25:
  26:     if (skeletons == null) return;
  27:
  28:     foreach (var skeleton in skeletons)
  29:     {
  30:         if (skeleton.TrackingState != SkeletonTrackingState.Tracked) continue;
  31:         var headJoint = skeleton.Joints[JointType.Head];
  32:         if (headJoint.TrackingState != JointTrackingState.NotTracked)
  33:         {
  34:             var skeletonDraw = new SkeletonDraw();
  35:             skeletonDraw.DrawSkeleton(_kinect, skeletonCanvas, skeleton);
  36:         }
  37:     }
  38: }

The class responsible for painting the skeleton basically draws lines between each of its joints. As we can see in the following code, lines are drawn between 2 points, using the canvas and the sensor as references.

    void AddLine(KinectSensor kinectSensor, Canvas drawCanvas, Joint j1, Joint j2)
    {
        var boneLine = new Line { Stroke = SkeletonBrush, StrokeThickness = 5 };

        var j1P = kinectSensor.MapSkeletonPointToDepth(j1.Position, DepthImageFormat.Resolution640x480Fps30);
        boneLine.X1 = j1P.X;
        boneLine.Y1 = j1P.Y;

        var j2P = kinectSensor.MapSkeletonPointToDepth(j2.Position, DepthImageFormat.Resolution640x480Fps30);
        boneLine.X2 = j2P.X;
        boneLine.Y2 = j2P.Y;

        drawCanvas.Children.Add(boneLine);
    }

    public float JointDistance(Joint first, Joint second)
    {
        float dx = first.Position.X - second.Position.X;
        float dy = first.Position.Y - second.Position.Y;
        float dz = first.Position.Z - second.Position.Z;

        return (float)Math.Sqrt((dx * dx) + (dy * dy) + (dz * dz));
    }
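The JointDistance helper is a plain 3-D Euclidean distance between two skeleton points. As a quick sanity check of the formula (a Python stand-in just to verify the math, not part of the original sample):

```python
import math

def joint_distance(p1, p2):
    """3-D Euclidean distance between two skeleton points given as (x, y, z)."""
    dx, dy, dz = (a - b for a, b in zip(p1, p2))
    return math.sqrt(dx * dx + dy * dy + dz * dz)

print(joint_distance((0, 0, 0), (3, 4, 0)))  # 5.0
```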

The sample code can be downloaded from

https://skydrive.live.com/redir.aspx?cid=bef06dffdb192125&SPL=BEF06DFFDB192125!3903&parid=BEF06DFFDB192125!1932&authkey=!AKQC01rb-avYBVg

Saludos @ Home

El Bruno




[# ALM] Automating processes saves costs in the long run (more than you think)



I repeat the title of the post because, more than a title, it is a statement


This is not an easy task, but one way to approach it is the following:

1. Identify the repetitive tasks that we perform manually during the development of an application

2. Assess the possibility of creating an automated process to handle those tasks

3. Define a trial period for the implementation of this process

4. Verify the time gained by using this process

If we follow these steps while implementing an automation process, we will probably see one of these two outcomes:

- What initially appeared to be a task that could quickly be replaced by a script turns out to be quite complicated, and it makes no sense to abandon the manual process

- The automated process becomes part of a growing automation effort that helps us gain quality in our developments

This may sound like Friday-night-beer theory, but it is actually fairly close to our day-to-day. Here is an example with one of the greats, "Javi Gallardo" (let's see when you create yourself a blog, che!)

To share the output of an application, it was necessary to compress it in a special format, split the compressed file into multiple chunks, sign them, and a couple more steps.

When we did this process by hand it took us less than a minute. But there was always the possibility of getting the ZIP password wrong, splitting the chunks badly, etc. Javi took 30 minutes and created a script that takes care of this process.

This way, we always get the same OUTPUT from a repetitive and predictable process (which is one of the foundations we must build on).

At the metrics level, simply by preventing a mistake in the generation of a package, we had already earned back the time spent writing the script. Javi is a crack, but let's assume he could get it wrong twice a day.

Every subsequent execution gave us a profit of +300 seconds. In the end, after a couple of months, we had gained 2 man-days. (Translated into €uros, this is always good news)
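As a quick back-of-the-envelope check of those numbers (assuming an 8-hour working day; the assumptions are mine, not the original metrics):

```python
SECONDS_SAVED_PER_RUN = 300        # reported gain per execution
WORK_DAY_SECONDS = 8 * 60 * 60     # assuming an 8-hour working day

# Number of script runs needed to pay back two man-days of saved time
runs_for_two_days = 2 * WORK_DAY_SECONDS / SECONDS_SAVED_PER_RUN
print(runs_for_two_days)  # 192.0, i.e. roughly 3-4 runs a day over two months
```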

With Javi we didn't go further; the script was enough for us. But there is always the possibility of a little out-of-the-box thinking:

  • Delegate the responsibility for carrying out this process to a Team Foundation build
  • Run this process only on a successful compilation, and only if the tests execute correctly
  • Automate the packaging and distribution from this process
  • etc.

When we reach these scenarios, we are closer to having Continuous Delivery scenarios (which I talked about in this link), simply by automating tasks.

There are many scenarios where we can automate. Most end up in deployments, but the following quickly come to mind:

  • Deployments: for example, when we deploy to Azure, why do we always do it manually?
  • Tests: the main point where automation guarantees quality
  • Code generation: not one of the most recommended, but working with templates, for example, is a way to always guarantee the same OUTPUT from a given INPUT
  • Many more…

Finally, regarding the best moment to apply these processes: it is perhaps the point where we integrate our code. When working with Team Foundation, a build definition is incredibly powerful for implementing these processes.

At AVANADE Spain we have a number of build definitions that let us perform different automation tasks: from processes that ensure quality, such as running coding-style analysis (StyleCop) and generating custom reports based on unit tests and MTM tests, to automated deployments to Azure, ClickOnce, WebDeploy, etc.

Saludos @ Home

El Bruno


[# RESHARPER] HowTo: Create a template for a property (for those weird RaisePropertyChanged ones)



Once again ReSharper comes to my rescue and saves me hours and hours of unnecessary work, this time thanks to its extensive ability to generate templates.

If you use Visual Studio a lot, I assume you know that if you write "prop" and press the TAB key twice, the code snippet that comes by default with the IDE creates the template of an auto-implemented property.


This is enough in most cases, but when the fancy stuff arrives you have to start working with Dynamic Properties (I think that's what they're called), where your class must implement the INotifyPropertyChanged interface and then fire the notification in the setter of the property.

Let's see: one way of implementing a property Name of this type would be something like this.

    #region Name
    [XmlIgnore] public const string NamePropertyName = "Name";
    [XmlIgnore] private string _name;
    public string Name
    {
        get { return _name; }
        set
        {
            _name = value;
            RaisePropertyChanged(NamePropertyName);
        }
    }
    #endregion

The problem comes when you have to create properties of this type quickly, and what happened to me on Friday happens to you: you are tired (this serves as an excuse for not thinking a bit more).

But luckily, ReSharper comes with its extended template editor and lets you shake off the problem as if it were nothing.

The first thing is to open ReSharper's "Templates Explorer" panel, via the "ReSharper // Templates Explorer" menu. In it we will see the list of templates we can work with.


For the case described earlier, I've created a new template called PropNot with the following text inside it:

    #region $Name$
    [XmlIgnore]
    public const string $Name$PropertyName = "$Name$";
    [XmlIgnore]
    private $type$ _$NameLower$;
    public $type$ $Name$
    {
        get { return _$NameLower$; }
        set
        {
            _$NameLower$ = value;
            RaisePropertyChanged($Name$PropertyName);
        }
    }
    #endregion

As you can see, variables are defined within the template by wrapping them with the $ sign, and each of these variables has a special behavior:

- The variable $Name$ is an input variable used to define the name of the property

- The variable $type$ defines the type of the property, offering the list of types via IntelliSense

- The variable $NameLower$ takes the value of the variable $Name$ and changes its first character to lower case
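To see what those substitutions produce, here is a small Python stand-in that emulates the PropNot expansion (an illustration only; ReSharper's actual macro engine works differently):

```python
def expand_propnot(name, type_name):
    """Emulate the PropNot template expansion for $Name$, $type$ and $NameLower$."""
    # $NameLower$: the value of $Name$ with its first character lower-cased
    name_lower = name[0].lower() + name[1:]
    return (
        f'public const string {name}PropertyName = "{name}";\n'
        f"private {type_name} _{name_lower};\n"
        f"public {type_name} {name}\n"
        "{\n"
        f"    get {{ return _{name_lower}; }}\n"
        "    set\n"
        "    {\n"
        f"        _{name_lower} = value;\n"
        f"        RaisePropertyChanged({name}PropertyName);\n"
        "    }\n"
        "}"
    )

print(expand_propnot("Name", "string").splitlines()[0])
# public const string NamePropertyName = "Name";
```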


And that's it: now, with two clicks, I can knock out these rare properties in one go :)


Saludos @ La Finca

El Bruno


[# VS11] What happened to my unit tests migrated from # VS2010?



After a good while working with Visual Studio 11 and ReSharper 7, you start to get the hang of it. Those who complain about the icons and the look & feel complain for free because, in reality, once you get used to it you hardly notice the IDE has changed.

Clarification: I personally think that if the colors of the VS11 IDE bother you, it's because you are more concerned with the IDE than with the code… and it shouldn't be that way.

But hey, quite a few things have changed, and it is important to know and learn about them. One of them is that now, with Visual Studio 11, we can integrate the execution of unit tests from different unit-test frameworks: NUnit, MSTest, etc. But of course, to support this scenario, they had to modify the way Visual Studio 2010 worked, which was tied to MSTest.

Let's see an example. I have a solution in Visual Studio with a class library exposing Foo() and Bar(), and then 2 unit tests with the following code.

    [TestMethod()]
    public void AskForFooAndGetFoo()
    {
        var target = new Class1();
        var actual = target.Foo();
        Assert.AreEqual("Foo", actual);
    }

    [TestMethod()]
    public void AskForBarAndGetBar()
    {
        var target = new Class1();
        var actual = target.Bar();
        Assert.AreEqual("Bar", actual);
    }

As I am an organized person, I created my test list


and of course, the unit tests passed OK and code coverage was very good.


Up to here, everything great. And there I was, happily working with Visual Studio 2010.

However, things change a bit when we open the same solution with Visual Studio 11: we see that test lists are no longer supported. We can see the tests and the categories we created, but we cannot run the tests.


The reference article we are redirected to is very good, although I am sure it will change in the near future.

For running unit tests we now have a new panel in Visual Studio 11 called "Unit Test Explorer". And here's an interesting detail about how this panel works.

The "Unit Test Explorer" panel is responsible for automatically inspecting the code of our solution and detecting the unit tests in it.


Then we can run the unit tests, relaunch the failed tests or the ones that passed correctly, etc.


We also have the ability to filter these views and to select the TestSettings with which we want to run the tests.

But the best part (or one of the best) is slightly hidden. The IDE now allows us to configure Visual Studio 11 so that unit tests run automatically after each compilation (something we previously did with a macro).


So… our tests are not lost; we only have to organize them again. And it is also a great time to review our tests, starting with their names.

As the ReSharper team (@ReSharper) said this morning via Twitter:

Documenting a method name with a comment? Try giving it a better name. Better name doesn't fit in one line? Now you have a serious issue!

Saludos @ Home

El Bruno


References: http://msdn.microsoft.com/library/dd286595(VS.110).aspx