Back in business…

I haven’t blogged for some time now.  This has largely been due to a heavy workload, close deadlines, and the fact that I was carrying that workload alone.  Over the past few weeks, I’ve been able to get my head above water.  While the open position on our team is still “open”, we’ve filled our contractor position. Not only have we “filled” it, we’ve actually brought back one of our old contractors who is more than capable.  He is already helping to alleviate my workload.  I’ve finished my Vista RTM handoffs, which has taken off some more pressure.  I’ve also completed my first review at Microsoft and, while I definitely see room for improvement in the process, I was pretty pleased with the outcome.

All three of these events have helped me free up time to start blogging again.  In fact, this newfound freedom has given me some time to start taking classes at Microsoft. As we speak, I’m typing this blog post during a break in a class on managed-code threading.  Those of you who know me may be saying, “Didn’t you write books on threading? Why would you sit in a class on that very topic?”  Well, I’m attending for two reasons. The first is that the class is being taught by Jeffrey Richter.  No matter how much you think you know about anything, I guarantee you that Jeffrey Richter can make you feel like a “n00b”. OK, there may be a small percentage of you out there who know more about obscure printer driver hacks, but even there, I’d defer to Mr. Richter.  If you ever get a chance to sit in on one of Wintellect’s classes, I recommend you take advantage of that opportunity.  If you can’t afford it, I recommend reading the many books published by Wintellect employees.  The second reason I’m sitting in this class is that I think threading is increasingly important. When I co-authored my first book on this topic, I believed that the multi-core and multi-processor industry would grow by leaps and bounds, making threading knowledge extremely valuable.  This is proving true: Intel has just announced that it will have 80-core processors by 2011.  If you don’t know how to use multi-threading techniques PROPERLY, I highly suggest you start learning.  Despite my involvement in three books on the topic of threading, Richter’s class, in my opinion, is one of the best ways to get solid, current multi-threading advice today.

I hope you’ll forgive the silence on my blog over the past few months.  I also hope you’ll come back often and trust me to provide you with relevant articles on a more regular basis.

Accessing IIS 7 APIs through PowerShell (Part I)

I’ve caught the PowerShell bug. In between stints with my ever-expanding code samples, I play with PowerShell a lot.  I thought I’d share a quick example of how to load Microsoft.Web.Administration.dll and use it to perform some basic tasks.

Note: I’m running these samples on Windows Vista RTM, but I have no reason to believe they won’t work with the PowerShell release candidates that are available now for the Vista RC* builds.

So let’s get started.

First, PowerShell has no idea where Microsoft.Web.Administration.dll is, so you have to tell it how to load the assembly. Anyone who has written code to dynamically load an assembly should be familiar with this syntax.  Type the following command:

PS C:\> [System.Reflection.Assembly]::LoadFrom( "C:\windows\system32\inetsrv\Microsoft.Web.Administration.dll" )

The path to your assembly may differ depending on your install.  I’ll show you later how to use environment variables to calculate the correct path.  In the meantime, the output of the line above should display something like the following:

GAC  Version    Location
---  -------    --------
True v2.0.50727 C:\Windows\assembly\GAC_MSIL\Microsoft.Web.Administration\7.0.0.0__31bf3856ad364e35\Microsoft . . .
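If you’d rather not hard-code the drive and directory, here’s a quick preview of the environment-variable approach I mentioned (a minimal sketch using the standard join-path cmdlet and the $env:windir variable; adjust the file name if your install differs):

PS C:\> [System.Reflection.Assembly]::LoadFrom( (join-path $env:windir "system32\inetsrv\Microsoft.Web.Administration.dll") )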

Once the assembly is loaded you can use PowerShell’s “New-Object” command to create a ServerManager object that is defined in Microsoft.Web.Administration.

PS C:\> (New-Object Microsoft.Web.Administration.ServerManager)

This doesn’t give you much except the list of properties the ServerManager exposes:

ApplicationDefaults      : Microsoft.Web.Administration.ApplicationDefaults
ApplicationPoolDefaults  :
ApplicationPools         :
SiteDefaults             : Microsoft.Web.Administration.SiteDefaults
Sites                    : {Default Web Site}
VirtualDirectoryDefaults : Microsoft.Web.Administration.VirtualDirectoryDefaults
WorkerProcesses          : {}

To get more detail, you need to use the properties and methods of the ServerManager object to drill down and get the information you want. The ServerManager provides access to all of the sites on your machine through a SiteCollection object, which is exposed by the ServerManager’s “Sites” property:

PS C:\> (New-Object Microsoft.Web.Administration.ServerManager).Sites

This will produce a list view of all the sites and their associated property names and values:

ApplicationDefaults        : Microsoft.Web.Administration.ApplicationDefaults
Applications               : {DefaultAppPool, Classic .NET AppPool}
Bindings                   : {}
Id                         : 1
Limits                     : Microsoft.Web.Administration.SiteLimits
LogFile                    : Microsoft.Web.Administration.SiteLogFile
Name                       : Default Web Site
ServerAutoStart            : True
State                      : Started
TraceFailedRequestsLogging : Microsoft.Web.Administration.SiteTraceFailedRequestsLogging
VirtualDirectoryDefaults   : Microsoft.Web.Administration.VirtualDirectoryDefaults
ElementTagName             : site
IsLocallyStored            : True
RawAttributes              : {name, id, serverAutoStart}
ApplicationDefaults        : Microsoft.Web.Administration.ApplicationDefaults
Applications               : {DefaultAppPool}
Bindings                   : {}
Id                         : 2
Limits                     : Microsoft.Web.Administration.SiteLimits
LogFile                    : Microsoft.Web.Administration.SiteLogFile
Name                       : Test Web Site 1
ServerAutoStart            : False
State                      : Stopped
TraceFailedRequestsLogging : Microsoft.Web.Administration.SiteTraceFailedRequestsLogging
VirtualDirectoryDefaults   : Microsoft.Web.Administration.VirtualDirectoryDefaults
ElementTagName             : site
IsLocallyStored            : True
RawAttributes              : {name, id, serverAutoStart}

Of course, this isn’t the easiest view to read, so let’s list just the site names by piping our site list to PowerShell’s “ForEach-Object” command:

PS C:\> (New-Object Microsoft.Web.Administration.ServerManager).Sites | ForEach-Object {$_.Name}

This looks much more concise:

Default Web Site
Test Web Site 1

We could also use Select-Object to project the list into a table format:

PS C:\> (New-Object Microsoft.Web.Administration.ServerManager).Sites | Select Id, Name
        Id Name
        -- ----
         1 Default Web Site
         2 Test Web Site 1
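You can mix in other standard cmdlets as well. For example, a filter like the following should list only the sites that are currently stopped (Where-Object is a built-in cmdlet; State is the same property you saw in the list view above):

PS C:\> (New-Object Microsoft.Web.Administration.ServerManager).Sites | Where-Object {$_.State -eq "Stopped"} | Select Id, Name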

Now let’s use PowerShell to manage application pools. We can fit several commands on one line by separating them with semicolons.  The following command line actually performs four different operations: it stores the application pool collection in a variable, displays the name and runtime status of the first application pool, stops that application pool, and then displays the name and status again.

PS C:\> $pools = (New-Object Microsoft.Web.Administration.ServerManager).ApplicationPools; $pools.Item(0) | Select Name, State; $pools.Item(0).Stop(); $pools.Item(0) | Select Name, State

Running this sample should display the following:

Name                                         State
----                                         -----
DefaultAppPool                               Started
Stopped
DefaultAppPool                               Stopped
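To bring the pool back up, the ApplicationPool class also exposes Start() and Recycle() methods, so something along these lines should do the trick:

PS C:\> $pools.Item(0).Start(); $pools.Item(0) | Select Name, State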

This is nice, but we can already do this with appcmd.exe, right? Well, to some extent.  With appcmd.exe we don’t get the PowerShell features that let us format the output data to our liking. Also, as a developer, I find it much easier to use the API syntax I’m already familiar with than to remember appcmd.exe syntax.  Furthermore, PowerShell allows us to use WMI alongside our managed code calls, and unlike appcmd.exe, PowerShell and its cmdlets can be extended. PowerShell also gives you the ability to easily manage multiple servers from one command prompt on one machine.  Watch the PowerShell/IIS 7 interview on Channel9 if you want to see this remote administration in action.
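As a quick taste of the WMI angle (hedging a bit here: this assumes the IIS 7.0 WMI provider is installed, which exposes a root\WebAdministration namespace), a query like the following should return site information without touching the managed API at all:

PS C:\> Get-WmiObject -Namespace "root\WebAdministration" -Class Site | Select Name, Id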

One last thing that PowerShell brings to the table is the ability to “spot-weld” our object models (as Scott Hanselman quipped). You can create, modify, and extend type data and formatting to your heart’s desire.  For more information, check out the documentation included with the PowerShell install.
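For a small taste of that spot-welding, you could tack a computed property onto each site object with the standard Add-Member cmdlet (the AppCount property name here is just something I made up for illustration):

PS C:\> (New-Object Microsoft.Web.Administration.ServerManager).Sites | Add-Member -MemberType ScriptProperty -Name AppCount -Value {$this.Applications.Count} -PassThru | Select Name, AppCount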

So, I would be remiss in this post if I didn’t try to make your PowerShell / IIS 7.0 life easier.  As such, I’ve created a profile script that loads all the IIS 7.0 managed assemblies for you.  The script is simple and contains more  echo commands than actual working script lines.

To install this script run the following command inside PowerShell:

PS C:\> if ( test-path $profile ) { echo "Path exists." } else { new-item -path $profile -itemtype file -force }; notepad $profile

This will create the profile file for you if you don’t already have one, then open your profile in Notepad.  If you haven’t added anything to the file before, it will obviously be empty.  Paste the following into Notepad when it opens:

echo "Microsoft IIS 7.0 Environment Loader"
echo "Copyright (C) 2006 Microsoft Corporation. All rights reserved."
echo "  Loading IIS 7.0 Managed Assemblies"

$inetsrvDir = (join-path -path $env:windir -childPath "system32\inetsrv")
Get-ChildItem -Path (join-path -path $inetsrvDir -childPath "Microsoft*.dll") | ForEach-Object {[System.Reflection.Assembly]::LoadFrom( (join-path -path $inetsrvDir -childPath $_.Name)) }

echo "  Assemblies loaded."

Now, save the profile and close Notepad.  You will likely have to sign this script or weaken your script execution policy considerably to make it run (obviously I’m not recommending the latter). To find out more about signing scripts, type “get-help about_signing” in PowerShell. The instructions for creating a self-signed certificate found in that help topic are as follows:

In an SDK Command Prompt window, run the following commands.
The first command creates a local certificate authority for your computer.
The second command generates a personal certificate from the certificate authority:

makecert -n "CN=PowerShell Local Certificate Root" -a sha1 `
   -eku 1.3.6.1.5.5.7.3.3 -r -sv root.pvk root.cer `
   -ss Root -sr localMachine
makecert -pe -n "CN=PowerShell User" -ss MY -a sha1 `
   -eku 1.3.6.1.5.5.7.3.3 -iv root.pvk -ic root.cer

MakeCert will prompt you for a private key password.

Go ahead and make a certificate for yourself following those instructions. To sign your profile, within PowerShell type the following:

PS C:\> Set-AuthenticodeSignature $profile @(get-childitem cert:\CurrentUser\My -codesigning)[0]

So far, you’ve created a certificate and signed your script. Now you’ll have to drop your script execution policy at least one level from the default, which doesn’t allow scripts at all.  To get scripts to execute, at a minimum you’ll have to set the policy to “AllSigned”, which allows only signed scripts to run.  In this mode, each time you execute a script from a new publisher, you’ll be asked what level of trust to assign to that publisher (unless you respond to the prompt with “Always Run” or “Never Run” for scripts from that publisher). The prompt looks like this:

Do you want to run software from this untrusted publisher?
File C:\Users\TobinT\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1 is published by CN=PowerShell User
and is not trusted on your system. Only run scripts from trusted publishers.
[V] Never run  [D] Do not run  [R] Run once  [A] Always run  [?] Help (default is "D"):
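For reference, switching to that policy is a one-liner with the built-in Set-ExecutionPolicy cmdlet (run it from an elevated PowerShell prompt):

PS C:\> Set-ExecutionPolicy AllSigned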

Now, each new instance of PowerShell that you run will automatically load the IIS 7.0 managed assemblies.  I know it seems like a great deal of work, but it really isn’t once you’ve made a few rounds inside PowerShell. Consider that you only have to create the script once, and then you have the full range of the managed IIS 7.0 SDK at your fingertips inside PowerShell.

If you have problems, feel free to leave comments and I’ll do my best to help you.

Why am I smiling?

I moved to Redmond just over four months ago.  In the time I have been here, my rental car was side-swiped, my truck was broken into, my headlight and bumper were damaged by someone in our own parking garage, someone stole my copy of “Professional Visual C++/CLI” from my office today (clearly someone missed the “corporate values” talk at New Employee Orientation), and my relocation to the great Pacific Northwest has been less than smooth or swift; I’m still waiting for the insurance company to assess and pay my claim for the furniture the movers damaged.   I’m the only PW in my group, and we’ve been unable to find anyone who can fill the shoes for our open position. I have deadlines looming, with tons of work to do and not enough time to do it all by myself.  Bill Gates has announced he is reducing his role here, both Windows and Office have announced schedule changes, and Microsoft’s stock has dropped over four dollars since I arrived on campus.

So why am I smiling?

In four months’ time I’ve learned so much.  I’ve been able to look at the technologies we will implement in the future before most people even know they are in the pipeline. I sit in on meetings and get to give real feedback that can influence products used by more people than I could have ever imagined.  I have taken over ownership of an internal tool our team uses and have written a few of my own.  I’ve gathered customer feedback and helped several customers personally or put them in touch with others who could help them. The amount of responsibility piled on me is less of a burden and more of a compliment, in my opinion.  Who puts that amount of pressure on someone they feel can’t handle it?

Apart from all the benefits provided by Microsoft there are other reasons I’m happy to be working here. I’m in a technological heaven.  The people are brilliant and open-minded (except when it comes to “Red State” ideas, but give me time — I’m still working on it).  I pass those same brilliant people in the halls every day.  If I have a question about something, I can go hit our Global Address Book and track down the person who owns the feature to discuss the matter with them personally. 

I also get to see the company make huge changes in the way it delivers software.  With the industry changing so quickly, it’s awesome to see a company of this size roll with the punches and adapt.

It’s hard to explain why I’m so happy to work here. The only thing I can say is that you can tell that the majority of people working here love working here and finding new ways to make customers happy.  That reason alone is enough to make me love working at Microsoft.

Enjoy the weekend!

OT: Bank transaction privacy

BankRate.com is reporting a story about financial transactions being used to track down terrorism.  Our anti-American partners at MSN are, of course, lapping up the story like a dog at a toilet bowl.

The article states:

According to their rules, any group of transactions totaling $5,000 or more that “is not the sort in which the particular customer would normally be expected to engage” can cause enough suspicion to create a SAR [Suspicious Activity Report].

Having engaged in more than my fair share of transactions last year that involved more than $5,000 on a weekly basis, and having made my fair share of “irregular” bank transaction requests, I can state that I more than likely have one or more of these reports generated about me, and surprisingly enough, I’m not cowering in a corner waiting for the feds to bust down my door.  Now, if I had engaged in any illegal activity, I’m sure I’d be singing a different tune. I also appreciate the fact that our government is trying to track possible terrorist activities, and I’m thankful that we have a President who is willing to do what is necessary to meet that need.

It seems to me I remember some certain “Jersey Girls” complaining:

I watched my husband murdered live on TV. . . . At any point in time the casualties could have been lessened, and it seems to me there wasn’t even an attempt made.

Among other things, they charge that nothing was done in a meaningful timeframe to save anyone’s life, that the delay was on purpose, and that [President] George Bush was responsible for the deaths of three thousand people.

I actually can understand the sentiment and complaints. I can even sympathize with the rationale, but these same women are complaining when we do try to gather intelligence.  Democrats are proud to bring up the intelligence failures of 9/11, but are also quick to leak intelligence to ensure failure again in the future — and they wonder why they lost the Presidential election as well as the House and Senate in 2004?

Now, I’m certain I’ll be attacked for taking this stand, but I want to get one thing straight: I don’t believe the government should be so involved in our lives either.  I think we have let the government etch itself too deeply into our way of life.  Let’s not forget that the government was never intended to control so much of our lives.  Instead, we’ve perverted it to do so, and now we are paying the consequences of those decisions.  Socialism has crept into our society years after we supposedly defeated it.  We’ve put the burden of our everyday lives onto the government, and now that the government is taking what we have given it over the years, we are going to complain?

You cannot have it both ways. Either you have a government that is in charge of distributing wealth, protecting the citizenry, giving “free” health care to “everyone”, establishing what is moral and what is not, literally robbing from the not-so-rich and giving to the not-so-poor, educating our children (if you can call it that), and caring for our every need, or you have a thin layer of government that conforms to the will of the people and has neither the ability nor the business of prying into our everyday lives.

Looking left and turning right: management style

Today, I was returning from my manager’s office to my own when I nearly collided with another manager type in the hall.  As I approached a hallway intersection, a manager emerged in a bit of a hurry, looking to the left while turning to the right.  She held her view to the left for so long that her path veered directly into mine.  In a motorcycle safety course several years ago, we were taught to look in the direction of the curve while taking a corner; looking in the opposite direction will often cause you to veer off course toward the direction of your gaze. Back to my manager-turned-missile: of course, I scrambled to get out of her way before she hit me. This was rather awkward, and by the time the manager looked back at me shuffling around, she looked at me like I was the stupid one and didn’t so much as say “oops, sorry”.

Nothing in this world enrages me more than managers with an inflated view of their own self-importance.  But this is rather indicative of a problem I think we face as a company.  We know where we want to go, and if we just focused on our own goals, we would get there in spectacular fashion.  That isn’t the case, however. We fixate on what other companies are doing and what else we could be doing, instead of directing our gaze at what we are working on until it is completed.  Couple these misguided glances with all of our team meetings, morale events, office sharing, and quarterly group/org/company rah-rah meetings that do nothing more than tell us what we already know (or tell us more than we care to know), and it’s no wonder we cannot get anything done.

I encourage Microsoft to start training our managers — and our non-management employees for that matter — to stay focused on the direction of our company. Stop worrying about what every other company out there is doing and start worrying about what we are NOT getting done on time.  Our customers depend on us.  You want to drive up customer satisfaction rates?  How about delivering a product for them to be satisfied with!  You want to drive up revenue?  How about filling some warehouses with some freshly minted retail bits!

Obsessing over our career options at myMicrosoft and worrying about work-life balance cannot continue to be our main focus.  Putting our focus in that direction will only take us off course from our real goals. Trust me, when we deliver quality products to our customers on time and under budget, our career options will open up on their own. And nothing makes work-life balance easier than getting performance bonuses that we can spend on our nights, weekends, and vacations, or put toward our children’s college education funds.

Edit:
I should clarify that I am also guilty of this very same problem.  Re-reading my annual review, I see that my commitments are filled with goals that aren’t in my direct line of responsibility.  This is as much a criticism of myself as it is of anyone else.  Furthermore, the managers up my direct line have been pretty wonderful and supportive, and have kept me fairly focused on my tasks.

Change the runtime of an IIS 7 application pool

The following quick SDK sample demonstrates how to list application pools and change their managed runtime version programmatically in IIS 7.

[VB]

Imports System
Imports Microsoft.Web.Administration
Public Class AppPoolSample
 Shared manager As ServerManager = New ServerManager()
  ' Main application processing
  Public Shared Sub Main(ByVal args As String())
      ' Get the apppool to change
      Dim iPool As Integer = GetAppPool()
      ' Get the framework version desired
      Dim rtVersion As String = GetVersion()
      ' Set the apppool runtime
      Dim poolToSet As ApplicationPool = manager.ApplicationPools(iPool)
      Console.WriteLine( _
            "Setting application pool '{0}' to runtime version: {1}...", _
            poolToSet.Name, rtVersion)
      poolToSet.ManagedRuntimeVersion = rtVersion
      ' Commit the changes and recycle the application pool
      manager.CommitChanges()
      poolToSet.Recycle()
      Console.WriteLine("Your changes have been committed.")
  End Sub

  ' Prompts the user to select an application pool
  Public Shared Function GetAppPool() As Integer
    Dim pool As String = String.Empty
    Dim iPool As Integer = 0
    While (Not Integer.TryParse(pool, iPool)) OrElse (iPool > manager.ApplicationPools.Count OrElse iPool <= 0)
      Console.WriteLine("Available ApplicationPools: Managed runtime version")
      Dim i As Integer
      For i = 0 To manager.ApplicationPools.Count - 1
        Dim appPool As ApplicationPool = manager.ApplicationPools(i)
        Console.WriteLine("{3}{0,3}.{3}{1}: {2}", i + 1, _
            appPool.Name, appPool.ManagedRuntimeVersion, vbTab)
      Next
      Console.Write("{0}Choose an application pool to change: ", vbCrLf)
      pool = Console.ReadLine()
    End While
    Return iPool - 1
  End Function

    ' Prompts a user to select the version of runtime they would like
    ' the application pool to use
    Public Shared Function GetVersion() As String
        Dim rtVersion As String = String.Empty
        Dim iVersion As Integer = 0
        While (Not Integer.TryParse(rtVersion, iVersion)) OrElse (iVersion < 1 OrElse iVersion > 3)
            Console.WriteLine("{0}  1.{0}Framework version 1.0", vbTab)
            Console.WriteLine("{0}  2.{0}Framework version 1.1", vbTab)
            Console.WriteLine("{0}  3.{0}Framework version 2.0", vbTab)
            Console.Write("Choose the new managed runtime version: ")
            rtVersion = Console.ReadLine()
        End While
        Select Case iVersion
            Case 1
                rtVersion = "v1.0"
            Case 2
                rtVersion = "v1.1"
            Case 3
                rtVersion = "v2.0"
        End Select
        Return rtVersion
    End Function
End Class

[C#]

using System;
using Microsoft.Web.Administration;

public class AppPoolSample 
{
  static ServerManager manager = new ServerManager();
  // Main application processing
  public static void Main(string[] args)  
  {
    // Get the apppool to change
    int iPool = GetAppPool();
    // Get the framework version desired
    string rtVersion = GetVersion();
    // Set the apppool runtime
    ApplicationPool poolToSet = manager.ApplicationPools[iPool];
    Console.WriteLine(
        "Setting application pool '{0}' to runtime version: {1}...",
        poolToSet.Name, rtVersion);
    poolToSet.ManagedRuntimeVersion = rtVersion;
    // Commit the changes and recycle the application pool
    manager.CommitChanges();
    poolToSet.Recycle();
    Console.WriteLine("Your changes have been committed.");
  }
  // Prompts the user to select an application pool
 public static int GetAppPool()
 {
    string pool = String.Empty;
    int iPool = 0;
    while ((!int.TryParse(pool, out iPool)) ||
            (iPool > manager.ApplicationPools.Count || iPool <= 0))
    {
      Console.WriteLine(
                    "Available ApplicationPools: Managed runtime version");
      for (int i = 0; i <= manager.ApplicationPools.Count - 1; i++)
      {
        ApplicationPool appPool = manager.ApplicationPools[i];
        Console.WriteLine("\t{0,3}.\t{1}: {2}", i + 1, 
            appPool.Name, appPool.ManagedRuntimeVersion);
      }
      Console.Write("\r\nChoose an application pool to change: ");
      pool = Console.ReadLine();
    }
    return iPool - 1;
  }
  // Prompts a user to select the version of runtime they would like
  // the application pool to use
  public static string GetVersion()
  {
    string rtVersion = String.Empty;
    int iVersion = 0;
    while ((!int.TryParse(rtVersion, out iVersion)) ||
            (iVersion > 3 || iVersion < 1))
    {
      Console.WriteLine("\r\n\t   1.\tFramework version 1.0");
      Console.WriteLine("\t   2.\tFramework version 1.1");
      Console.WriteLine("\t   3.\tFramework version 2.0");
      Console.Write("Choose the new managed runtime version: ");
      rtVersion = Console.ReadLine();
    }
    switch (iVersion)
    {
        case 1:
            rtVersion  = "v1.0";
            break;
        case 2:
            rtVersion = "v1.1";
            break;
        case 3:
            rtVersion = "v2.0";
            break;
    }
    return rtVersion;
  }
}
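If it helps, here is one way I’d compile and run the C# version from a PowerShell prompt (a sketch with a couple of assumptions: the source is saved as AppPoolSample.cs, that file name is mine, and csc.exe from the .NET Framework 2.0 directory is on your path). You’ll likely need an elevated prompt for the commit to succeed:

PS C:\> csc /r:"$env:windir\system32\inetsrv\Microsoft.Web.Administration.dll" AppPoolSample.cs
PS C:\> .\AppPoolSample.exe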

Internal spam affects productivity

When I interviewed with Microsoft, I was asked, “What is one thing we can count on you to do at Microsoft?” 

My reply was brief: “You can count on me to complain.” 

My interviewer suddenly took on the puzzled look that an interviewee would normally take when presented with an unfamiliar scenario.  The facial expression of my inquisitor demanded an explanation.  I decided to end the torment by presenting further detail;  “When something is wrong, I’m going to bring it up and someone will hear me.  I’m not going to just complain, but I’m going to offer at least one alternative solution.”  Apparently, my explanation cured the torment of my initial declaration. I was hired and here I sit; typing my 23rd Microsoft blog post to an audience that has graced me with tens of thousands of views.  This particular post is a follow-up on my promise to complain and provide an alternative solution.

As many have said in the past, email is the life-blood of our company. We communicate everything in email.  That’s why our anti-spam measures that counteract external tormentors are so critical to our business.  But what are we doing about measures to counteract internal spammers?  Adam Barr made light of the need for Microsoft Outlook rules in his short story, “The Microsoft Code”, but the premise is 100% genuine.  Internally, we have people who just LOVE to toot their own horns and, as you might guess, these horn-tooters are typically management way up the line.  They do this horn-tooting in a barrage of email that cripples productivity.  This is fine, I suppose, but it comes from an internal culture that supports working toward better reviews each year rather than making meaningful contributions to their teams, organizations, the company and (hopefully) the customers.  When a director sends you a “professional” email that contains exclamation points and smiley faces, you can almost assuredly delete it. Can anyone make an Outlook rule for that?
 
I’ve been at Microsoft for less than four months now, and I’ve been swamped with work since I came to Redmond.   I am the only worker of my particular discipline in my particular group.  I have deadlines; big ones; looming ones. I sincerely mean no offense here, but I don’t have time to listen to everybody in our company talk about what they did and what they are going to do.  To take that one step further, I don’t care to know everything that’s going on. I lose my focus as I try to decipher each email and understand how that particular communication affects my day-to-day work.  If our upper management stopped to think about the opportunity cost of each email they sent out, we might actually have time to stop reading email and start getting products shipped on time.  Seriously, I wonder how many people send out these emails to distribution lists that “appear” correct (apparently based on the name of the group) without actually looking to see who is on the distribution.  Those distribution lists in the GAL can be nested so many levels deep that I venture to say no one knows, with any degree of specificity, who is going to receive a particular email.
 
If upper management is going to send email to everyone, what is intermediate management meant to do?  I personally prefer a military style approach where one level talks directly to the level above them and the level below them with rare communications in between.  If those particular levels above or below think the information is important to the next level up or down, they can forward that communication accordingly.  Dissemination of information in this particular manner assures that the appropriate people get the data, and everyone else doesn’t suffer from information overload.  Each management level can also summarize the information and parse out the pieces that they feel are important for their group – a human Outlook rule, if you will.  That’s obviously not the culture we have here at Microsoft. 
 
One idea I tossed about in my head was the idea that everyone internally should have their own RSS feed or VLOG.  When someone wants to toot their own horn, they can do so on their blog; their managers can see those accolades, their team can see it, and the people who really care about it can see it.  If that particular employee keeps blogging about stuff that doesn’t affect me, I can unsubscribe and stop wasting my time reading/watching that feed – providing incentive for everyone to keep their topics relevant or risk humiliation with low agg-view participation.  In this approach, new employees or employees who transfer to groups could then subscribe to manager-recommended OPMLs that generally contain information that you will want so you can be effective in your new role.   Furthermore, RSS feeds can be much more easily tagged than email can.  I can subscribe to “John Doe’s IIS posts” but filter out “John Doe’s Accomplishments” (much like my own blog that will allow some of you to ignore this particular “Microsoft Culture” post if you so desire).  Sure, this is not a perfect solution, but it’s a start.
 
I have tons of important data to keep track of.  Communication is very important to the company – too important to gum up with internal spam.  We need to find a solution or we will forever find ourselves bogged down in self-congratulatory expression and no real work getting done.

Security: Incompetence

It’s one type of incompetence to keep customers’ personal identifiers and financial data on your laptop and then lose it (twice); it’s an entirely different type of incompetence to allow government data to be compromised through a network.  Last year at TechEd, a demo showed how a completely patched network could be compromised using an exploit in a web site.  The best part was that the exploit was made possible by turning on more functionality than was necessary: one issue in the demo was that the router configuration allowed both port 80 and port 443 traffic, despite the fact that SSL was not in use on the web site.

Regardless of the platform being used, many of these compromises are possible these days not due to the operating system itself, but due to assumptions made about users, lack of planning, or pure laziness of administrators and developers.  This is one major reason why I’m not a big fan of agile. Despite the best arguments I’ve heard for agile software development, I have witnessed too much emphasis on feature completion without regard to overall system security. I would encourage you all to read Michael Howard’s new book on the security development lifecycle (link provided below).

Whatever the case — whatever the cause — I would urge the community to pay attention to the recent news stories, learn to start protecting important data and please stop putting personal and financial information that doesn’t belong to you on your laptop!

For more Microsoft resources on security please check out the following:

General Security Websites:
http://www.microsoft.com/security/default.mspx

Blogs:
http://blogs.technet.com/msrc/default.aspx
http://blogs.msdn.com/michael_howard/

Books:
http://books.mcgraw-hill.com/getbook.php?isbn=0072260858
http://www.microsoft.com/MSPress/books/5957.asp
http://www.microsoft.com/MSPress/books/8753.asp
http://www.microsoft.com/mspress/books/6893.asp
http://www.microsoft.com/mspress/books/6788.asp
http://www.microsoft.com/mspress/books/6892.asp
http://www.microsoft.com/mspress/books/6432.asp

ADO.NET 2.0 Boot Camp

Sahil Malik, a prolific speaker, Microsoft MVP, and author of “Professional ADO.NET 2.0”, is holding a one-day ADO.NET boot camp in Charlotte next month.  If you are in the area, I think this class will definitely give you your money’s worth.  Sahil has a unique way of teaching that is easy to follow and highly effective.  If you are going to be in the area on July 21st and want to master ADO.NET, I encourage you to take a look at this great opportunity in Charlotte, NC.

Workaround: Adding a script map in IIS 5.1

I was contacted by a customer who commented that he could not add a Script Map to IIS 5.1.  After selecting his executable for the script map and adding his extension, the “OK” button was still disabled — preventing him from committing the script map change.

To work around this issue, once you have selected the executable and set the extension, click inside the “Executable” text box to expand the full path to the executable.  Doing so will enable the OK button and you will be able to commit your script map change.