Performance benchmarks for the manager

I promised last week (here) to post a target table of contents for this series of posts about performance benchmarks.

It is quite a long list of topics, which I split into two main areas:

  • Topics that are relevant for everyone in the organization, labeled “for the manager”;
  • Topics that are of interest mostly to the technical people, labeled “for the operational people”.

Below you will find the candidate subjects that I believe are of general interest.


1)   What is a performance benchmark

2)  Types of technical benchmark

  1.   The check mark benchmark
  2.   The confirmation benchmark
  3.   The industry standard benchmark
  4.   The valuable benchmark

3)  Organizational challenges to implement a valuable benchmark

  1.  The IT architecture perspective
  2.  The IT operations perspective
  3.  The CFO perspective
  4.  The CEO perspective

The next post will contain the list of technical subjects, while the following ones will start to dig into each subject in the two lists in an orderly fashion.

As I wrote earlier, your feedback will be key in shaping how this moves forward. Feel free to comment here or to reach me directly.

Performance benchmarks

It is time for me to give back.

Dealing with performance benchmarks has occupied a fair share of my life since my early days in the computer world in the mid ’80s.

In the beginning it was mostly reading, with just a bit of writing (which today I would be ashamed of) in one of the early Italian BBS “newspapers”, called “Corriere Telematico”.

At the time I could never have imagined that benchmarks would play such a large role in my career, to the point that for about 8 years they even defined my job title.

Now, as my transition into a new role is almost complete, it feels like the right time to write something about benchmarks that can help many people in the industry.


I recall reading in one of the paper magazines of my early days something along the lines of “benchmarks don’t lie, but liars do use benchmarks”. I believe it was in MCmicrocomputer, but I can’t swear to it.

This bleak statement about benchmarks was true 30+ years ago and it’s still true now, but we should not throw the good away with the bad: proper benchmarks were, and still are, useful tools for individuals and organizations alike. It’s all about defining “proper” correctly in each context.

For a while, given the scarcity of published material on the subject, I was thinking of putting together a book with the help of a friend of mine.

I fear I will not be able to put in all the time needed to complete it within a reasonable time frame, so I decided to blog on the subject instead.

In the coming weeks (or months, I don’t know yet how this will work) I will share what I have learned over many years, as a source for anyone wanting to get closer to the holy grail of the “proper benchmark”.

I will be vendor and technology neutral, covering both the business and the technical sides.

Your feedback will be key in shaping how this will move forward.

In the next post I’ll share the target table of contents for this series.

Disclaimer

The opinions expressed in this blog are my own, not the opinions of my current employer or the opinions of any of my former employers and might diverge from their current, past or future opinions on the subjects I discuss. 

The opinions expressed are not the opinions of any of the associations I was part of in the past or am part of today.

All the content of this blog is mine: it can be freely quoted in academic papers and free publications as long as credit is provided and the content is linked, but it can’t be included in any paid publication such as books, e-books, web content behind a paywall or any other money-generating media without prior written authorization.

I reserve the right to change my mind on any of the subjects I write about and the freedom to share or not to share my revised opinion.

While I take no responsibility for the comments made by others on my blog, I reserve the right to delete any comment as I see fit, without any obligation to explain my action.

I provide credit to my sources to the best of my knowledge: I will be happy to revise and correct any quote and credit if notified.

While I share my experience (positive or otherwise) in full honesty, your mileage may vary and what worked for me might not work for you: readers are solely responsible for taking or not taking any action or decision (financial or otherwise) based on what I write in this blog.

I am not making money from this blog, nor do I receive free products or perks of any kind in connection with my blogging activity.

There is no conflict of interests between this blog and my work or financial activities.

I reserve the right to change the terms of this disclaimer at any time.

Google maps killed my earlier habit of blogging here about places and food

When I started this blog, a fair share of the content was related to my travel and food experiences around the world.

Formal blogging is, in my opinion, a fairly involved process: you need to ensure not only that the content is relevant, but also that the writing is at least properly structured. This, together with the fact that blogging is not a job for me and I have other hobbies too, led me to write only about significant experiences rather than about everything.

The excitement of writing was drying up over time (as is easy to see in the posting history) when the mechanism for contributing to Google Maps became available.

Although reviews on Maps tend to be shorter and a bit “Twitter-style”, the mechanism for contributing them is so convenient that I ended up being much more active than I was before.

An added stimulus to keep writing there is the monthly feedback showing the visibility of my contributions: reviews on Google Maps reach a visibility that this blog never achieved, nor was ever going to achieve while keeping its nature of a small side project.

If you are interested in keeping up with my reviews you should be able to find them here.

In the end, Google Maps contributions might be considered the last nail in the coffin of my writing, but I prefer to look at them as an evolution of it, and of greater value for the community than the earlier model.

Smart car software update: chronicle of an unacceptable journey.

I recently posted about my very unsatisfactory experience with service personnel while attempting to get a few problems fixed on my Renault Megane.
The mechanics had no clue how to fix them, but a factory reset of the on-board computer (just like on current personal computing devices) did the trick.

I inferred from this that updating the software, again as on personal computing devices, was the way to avoid facing the same problems in the future, and so I started my long journey to accomplish it.

I followed the manufacturer’s instructions: I downloaded the downloader application to my notebook, inserted an 8GB USB flash drive previously initialized in the car and, after a byzantine procedure requiring web interaction to select the updates the application would then fetch, I started downloading.
Again. And again. And again…

What looked strange is that the download counter reached the full size, but then kept going!
After a few dozen attempts, all failing in the same way and with no success in sight, I decided to get in touch with the country-level support.

In reply to my first contact I received a cut-and-paste of the standard procedure.
This is a fairly common practice in every sector and makes a lot of sense, because most people do not read the manuals.

Unfortunately I was already following the standard procedure, so I replied with more data, including the fact that to get 5.4GB of updated maps the tool had already downloaded over 113GB (from a non-Renault domain) without success.
The solution they proposed was to use a larger flash drive.
I could not get an answer from them about why an empty 8GB drive was not enough to hold 5.4GB.
And a 16GB drive did not fix the problem anyway.

During the fruitless exchanges with the support I kept attempting the download until it finally worked. On the 8GB drive.
I believed that, even if this was not communicated to me, they had fixed whatever issue there was, and I was happy with that.

A few months later I found out that it had been just a lucky astral alignment.
The situation is back where it was: dozens of download attempts are needed to get an updated version of the maps, and failed downloads leave the flash drive in an inconsistent state where the car attempts the update anyway, only to fail after a few minutes.

In my earlier post I guessed that the challenges I faced were due to the time needed for knowledge to travel from the top of the manufacturer’s organization to the service people.
But from my experience attempting the software update it looks like I was wrong: even at the country level the manufacturer appears unable to support the smartness they are putting into their vehicles.

According to discussions I have had with a few colleagues in the office, other manufacturers offer a much smoother user experience.
In my opinion Renault really needs to evolve quickly to stay relevant.

Smart cars without smart mechanics are not going to work as a business model in the long run.

A few months ago I started driving a 4th-generation Renault Megane in the (Italian) Bose trim:
in this version you get almost as many gadgets as possible.

While everything works, driving the car is a very enjoyable experience for the vehicle class, but as soon as problems started to appear and I went looking for a fix, I realized that the support personnel had been left behind in the product’s evolution.

After a few months the electric massage seat and the lumbar support stopped working; some time later the rear camera no longer disengaged when moving forward; after some more time the parking sensors stopped working and the lane assist stopped producing sound feedback; finally, the HUD reset its position to the default every time I turned off the engine.

In an attempt to get the issues resolved I contacted 3 different mechanics from the official support network, getting only vague statements about what the problem could be; all of them, however, agreed that it would take multiple days to fix. One stated: “for electrical problems you need to plan at least a 3-day stop”.
I tried contacting the online support, describing the issue, and all I got back was a link to the list of services.

None of the mechanics offered a replacement car during the troubleshooting and repair, even though the vehicle is well within the warranty period: very upsetting.
I was planning the right time to bring the car in, when I could do without it for an extended period, when, by pure chance, I ended up in a menu of the car’s computer offering a reset to factory defaults.
Having some past experience with consumer electronics, I decided to trigger it, counting on the fact that in the worst case, if the car stopped completely, I could call the service to pick it up free of charge.
To my surprise, all of the problems I was having suddenly disappeared.

How is it possible that not only did 3 authorized service centers have no clue about this basic troubleshooting step, but the online support also never came up with the advice to reset?

In my opinion, putting cars ahead of the support structure is not a safe bet.
Not for the manufacturer, nor for the consumers.

Tinkering with my desk: an attempt at IoT as in “intranet of things” rather than “internet of things”.

Microsoft sends me a periodic email (the TechNet Flash Newsletter) listing news related to their products and ecosystem, which I read sporadically; a few weeks ago, thanks to the Christmas holidays, I had a bit of extra time and read through one containing an invitation to a challenge on hackster.io.

Joining the community was quick and straightforward.
After a few days I submitted an idea for the challenge pre-contest, and earlier today I found out that it was selected: I should receive the Genuino MKR1000 to make it a reality.

The tools are installed on my W10 phone and notebook, ready to consume my weekend spare time for the next 46 days.

I believe this marketing initiative is a very smart one: it gives good visibility to MS tools in the IoT space among the people who should really care (developers and tinkerers), the ones who can create the tools, applications and devices that will feed the Azure infrastructure with major volumes of data in the coming years.

Homeplug AV2 with MIMO: real life

I posted about my early experiences with this technology about 8 months ago here and here.

At the time of those posts the apartment was completely empty except for the adapter and the notebook I was using.
Now, after a full renovation including the electrical infrastructure, it is a real home, with all the associated devices and appliances connected and operating.
Another important change is that, due to the limits of the electrical conduits connecting the upper and lower levels, I had to link the two levels using the same technology, so I now have 4 adapters from the same manufacturer in place.

What I read right now in the monitor on the upper level is:
110 Mbit/s to the lower level
70 Mbit/s to the garage
60 Mbit/s to the underground room.

At the application level, moving data to the garage I get about 3MB/s (24Mbit/s) when using a backup program targeting a share on the LS220D, and about half of that when moving files with Windows Explorer directly over a Samba share on the same device.
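For anyone wanting to reproduce the comparison between the link rate the adapters report and what an application actually sees, a minimal sketch of the unit conversion (using the figures quoted above, and 1 byte = 8 bits):

```python
def mbytes_to_mbits(mb_per_s: float) -> float:
    """Convert application throughput from MB/s to Mbit/s (1 byte = 8 bits)."""
    return mb_per_s * 8

# Figures from the measurements above
link_rate = 70                   # Mbit/s reported by the adapter toward the garage
app_rate = mbytes_to_mbits(3)    # ~3 MB/s measured by the backup program

print(f"application throughput: {app_rate:.0f} Mbit/s")          # 24 Mbit/s
print(f"fraction of reported link rate: {app_rate / link_rate:.0%}")
```

So the backup traffic uses only about a third of the rate the adapter claims; the reported figure is the PHY rate, which includes protocol overhead an application never sees.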

The numbers reported are in the lowest range of the day; they can easily get 30% better than this depending on the amount of electrical noise that is injected into the line from the other apartments in my building.

Marketing proves to be even more distant from reality than I had already complained about. But it is still better than WiFi in my environment.

I’m experiencing some significant brownouts during the day, with the worst quality in the evening and at lunchtime.
This might also be a factor; I have opened a complaint, and it should be addressed by my electricity distribution company within a couple of months.

Windows 7 network performance monitor inconsistency

I am doing a backup using disk2vhd, by Mark Russinovich of the former Sysinternals (now Microsoft), and I have found something puzzling.
The source is a W7 Ultimate machine; the destination is an SMB share on a W7 Pro machine connected to the same 1Gb/s Ethernet switch.
The two systems are not doing any other significant network activity, but Resource Monitor reports very different network usage on the two sides of the transfer.

I am monitoring the destination machine from the source machine using TeamViewer (a great tool, free for personal use), and here is the picture of the two paired resource monitors:
[screenshot: networkuse]
The source machine (on the right side) reports a network use that is significantly lower than what the destination machine (on the left side) reports.
The source machine also shows that the largest transfer alone (the backup) has a higher throughput than the total reported.

I thought the discrepancy might be related to a time drift in the reporting, but the graphs show that is not the case: the destination system consistently reports higher network usage.
Does anyone know whether this is a known bug in W7, and how to fix it?

Dell 6430u updated to BIOS A10

Two days ago I applied a new BIOS update to the notebook.

The process worked fine as usual and, also as usual, did not fix or improve the fan noise issue.
After 36 months with it, I will have to bear it for only 12 more months, until the notebook is due for refresh.

A positive note about the 6430u: it no longer triggers the security scanner at Ben Gurion airport. Whatever chemical was there has now completely evaporated.