
    Why Didn't We Know About This?

    Tom Hollingsworth

      Those six words can make you shudder.  When they are the only text in an email, below a link leading to a story about a zero-day exploit or a CNN news flash about stolen personal information or compromised systems, your blood will run cold.  The only way to fix that problem is through judicious patching.


      Patch management isn't fun.  It is a lot of painstaking work.  You have to test your patches against baselines and ensure that the fix for one minor graphical glitch doesn't crater the CRM system.  It has gotten to the point where taking care of patches manually is impossible.
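
      One small, scriptable piece of that testing is verifying exactly what a patch run changed against a known baseline.  Here is a minimal sketch in Python, assuming a Windows host where the wmic qfe query is available:

        import subprocess

        def installed_hotfixes():
            """Return the set of installed hotfix (KB) IDs on the local host."""
            out = subprocess.run(
                ["wmic", "qfe", "get", "HotFixID"],
                capture_output=True, text=True, check=True,
            ).stdout
            # First line is the column header; the rest are KB identifiers.
            return {line.strip() for line in out.splitlines()[1:] if line.strip()}

        baseline = installed_hotfixes()
        # ... apply the patch cycle here ...
        print("Newly installed:", sorted(installed_hotfixes() - baseline))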


      Thankfully, operating system manufacturers are tired of being embarrassed by news stories as well.  Most modern operating systems have regularly scheduled patch downloads and installations.  They go fine for the most part.  But what happens when you have to manage those patches across hundreds of systems?
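
      One common starting point is scripting the status check itself.  The sketch below is a minimal example rather than a full solution: the hosts.txt file and the KB number are hypothetical, and wmic's remote /node: query assumes you have admin rights on the targets.

        import subprocess

        # Hypothetical list of machines to audit, one hostname per line.
        with open("hosts.txt") as f:
            hosts = [line.strip() for line in f if line.strip()]

        KB = "KB1234567"  # hypothetical patch we're being asked about

        missing = []
        for host in hosts:
            # Query a remote node's installed hotfixes over wmic.
            result = subprocess.run(
                ["wmic", f"/node:{host}", "qfe", "get", "HotFixID"],
                capture_output=True, text=True,
            )
            # An unreachable host shows up as missing; that is the safe default.
            if KB not in result.stdout:
                missing.append(host)

        print(f"{len(missing)} of {len(hosts)} hosts missing {KB}:")
        for host in missing:
            print("  " + host)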


      True enterprise patch management systems should give you granular information about patch status and on-demand report generation for stakeholders.  You need to be able to answer the above email with a detailed list of affected systems and mitigation status.  You need to prove you *did* know about it and you've already fixed it.
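
      As a sketch of the reporting half, assuming per-host results like those gathered above, a stakeholder-ready CSV takes only a few more lines; the column layout here is just one plausible choice:

        import csv
        from datetime import date

        # `status` maps hostname -> patched?  (Example data; in practice it
        # would come from a fleet check like the one sketched earlier.)
        status = {"crm-01": True, "web-02": False, "db-03": True}

        with open(f"patch-report-{date.today()}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["Host", "Patch installed", "Mitigation status"])
            for host, patched in sorted(status.items()):
                writer.writerow([host, "yes" if patched else "NO",
                                 "patched" if patched else "scheduled for next window"])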


      Windows Server Update Services won't cut it any longer.  You've got to have the ability to generate reports for operating systems, applications, and even mobile devices.  Visibility into all aspects of the enterprise is critical.  You need more robust tools to fix your problems.


      Have you ever gotten the above email?  What did you do about it?  How did you fix the issue?  Were you scrambling to prove you didn't drop the ball?

        • Re: Why Didn't We Know About This?
          syldra

          Have you ever gotten the above email?

          Yes, many times, in different environments and from various levels of management... not something I like to receive, though.


          What did you do about it?

          I'm usually honest when dealing with those kinds of questions. Either I was aware of the problem or I wasn't, and either I had already acted on it or I hadn't. If I had already taken steps to protect the company, then good for me; another good point to raise on my yearly eval. If not, then it gets a little trickier...


          How did you fix the issue?

          Saying "I'll look into it" and coming up quickly with a bulletproof plan to mitigate the issue is sometimes as good as answering "It's already taken care of". Reactive maintenance is as highly viewed as preventive maintenance, if the delay between the two is not dangerously long. Of course, I never worked in health care or for a financial institution so maybe our standards might have been a little lower.


          Were you scrambling to prove you didn't drop the ball?

          No. Never. Making up excuses doesn't change the facts and is usually frowned upon. If I did drop the ball, I take it as a signal to change the way we do things. When the systems are humming along, it is easy to just sit back and enjoy the calm, which can get dangerous if you become complacent. Those emails are a reminder to always remain alert.

            • Re: Why Didn't We Know About This?
              Aaron Denning

              Have you ever gotten the above email?

              Oh yeah, more times than I care to count.


              What did you do about it?

              I didn't sugarcoat anything. Either I did know about it and forgot to add that person to the email, or I didn't know anything about it, and I'll straight up say so.


              How did you fix the issue?

              I look into the issue on forums or threads like this to see if anyone else has had the same problem before I beat myself up trying to fix it. Usually I search Google, Bing, and Yahoo and see who has the best answers or fixes. If nothing turns up, I'll just work through it, from the dumbest thing it might be to the hardest thing it could be.


              Were you scrambling to prove you didn't drop the ball?

              Forget that; if I screwed up, I'll be the first to say it. Like I said before, I don't sugarcoat it. If I missed it, then it's on me. But I also don't like the blame game, so if I didn't do it and I find out who did, I'll just ask that group/team if they did it. If they say no and I can prove it, I'll show them, but that's as far as it will go. If they don't want to fess up, then whatever; just don't ask me to help you out of a bind that you created and now want my knowledge for.

            • Re: Why Didn't We Know About This?
              wbrown

              As one of my coworkers likes to paraphrase: "nothing ever gets safer until somebody dies."

              I believe the same is true for any monitoring system.  A number of different metrics may go unmonitored until somebody deems it necessary to keep track of them.

              We've had a number of instances in our environment, for both the network and server teams, where something happened because we did not know about it.  Most of those were because we did not deem the relevant metrics important enough to monitor, or we did not have thresholds and alerts configured.  Luckily, all those situations ended with nobody being fired and with another metric added to the monitoring.


              As for patching servers and workstations, I cannot comment as I have no role in doing so.

              As for patching firewalls and switches, this is where our InfoSec department does us a huge favor.  They keep track of vulnerability alerts and let us know when one is applicable to our hardware.  We review it and respond as to whether the specified bug impacts our operation.  If so, they provide the necessary arguments to our change review committee when we want to upgrade firmware.
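
              That review step can be partially scripted.  Below is a minimal sketch with a made-up inventory and advisory format (real vendor bulletins would have to be parsed into this shape) that flags which devices an advisory applies to:

                # Hypothetical device inventory: name -> (platform, firmware version).
                inventory = {
                    "fw-edge-01": ("ASA", "9.1.2"),
                    "sw-core-01": ("IOS", "15.0.2"),
                    "sw-core-02": ("IOS", "15.2.4"),
                }

                # Hypothetical advisory: versions older than `fixed_in` are affected.
                advisory = {"id": "EXAMPLE-2013-001", "platform": "IOS", "fixed_in": (15, 2)}

                def version_tuple(v):
                    """'15.0.2' -> (15, 0, 2) so versions compare numerically."""
                    return tuple(int(x) for x in v.split("."))

                affected = [
                    name for name, (platform, version) in inventory.items()
                    if platform == advisory["platform"]
                    and version_tuple(version) < advisory["fixed_in"]
                ]

                print(f"Advisory {advisory['id']} applies to:", affected or "no devices")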

              • Re: Why Didn't We Know About This?
                bsciencefiction.tv

                Yes, many times. Fortunately for us, the answer is usually that security did not give us access to that server.  Our patch team is well on the ball and has a judicious testing and implementation procedure.


                We have for the most part gotten our upper management to pull the cowboys in security into line, so we get this statement less and less. However, when it is our problem, we:


                Own it: Take Responsibility.

                Own it: Find the Solution.

                Own it: Make Sure the Same Problem Does Not Happen Twice.


                Did I mention... Own it.

                • Re: Why Didn't We Know About This?
                  cahunt

                  This email has come from the Suits, and from time to time it still finds its way through.

                  Though when I get the email, there is usually a chain three or four deep: the Suits email our Director, and then it finds its way through my manager or supervisor into my inbox with the full stream attached, so I can see every one of the Suits who was just informed about the lack of monitoring in a situation.


                  Like bsciencefiction.tv: if it is ours, We Own It! And we own it well, every time... so that when it is not ours, we can say 'it is not ours' and the Suits actually believe us and move on.

                  As big as we are, a new circuit or service implementation can get set up and have some portion, some small token of it, never make it into monitoring... and in those cases it always seems to be the most important link/node.  Sometimes we have to go back and check documentation and labeling to determine whether we really missed something or whether an Engineer missed something and we did our part correctly.


                  Either way, it is never a good feeling to see that email, and we make every effort to get things set up as soon as possible; but being as big as we are, it is hard to have eyes and ears everywhere at all times. Every now and again another group will get a task or directive and decide that it does not need to be documented. When that happens, we lock the managers of each department in a cage, err, huddle room, and see who walks out.

                  • Re: Why Didn't We Know About This?
                    rharland2012

                    Have you ever gotten the above email?

                    A few times over the years.

                    What did you do about it? 

                    Respond honestly - and sometimes that hurts when you, your reports, or your team has to own the shortfall and the mistake.

                    How did you fix the issue?

                    Remediation, either by process or documentation: fix the broken things now and put safeguards in place to prevent recurrence.

                    Were you scrambling to prove you didn't drop the ball?

                    Here's the deal - if I didn't drop the ball, I don't have to scramble. And if I did drop the ball - which happens to many people - there's no point in scrambling. If you work somewhere that will fire people for a dropped ball, then you're going to get fired eventually unless you're a scrambler.

                    Scrambling/covering/throwing others under the bus can save your bacon, but it roasts your soul. I'd rather admit to being a dolt than pretend to be smart.

                    • Re: Why Didn't We Know About This?
                      byrona

                      Like everybody else here, I have seen that email as well; however, not regarding patching-related activities.  Normally when I see it, it's because something happened on a customer system and management wants to know why our monitoring system and/or NOC didn't catch it.  Unfortunately, when customers and management buy "monitoring" as a service, they just assume we are looking at every possible thing that could go wrong; they don't realize that there is a defined set of things we watch for.


                      When it comes to patching, we manage that relatively well and watch it very closely to correct issues as they happen, so that normally isn't a problem.

                      • Re: Why Didn't We Know About This?
                        freid.42

                        I must be one of the lucky few who have not received an email like the above... at least not directly to me.

                        • Re: Why Didn't We Know About This?
                          fitzy141

                          Have gotten one in the past - it is what it is.


                          Process, adoption of that process, support from leadership, and a good tool - that solves it.

                          • Re: Why Didn't We Know About This?
                            Kevin Rak

                            Thankfully, we're a pretty small shop, and WSUS in combination with NiniteOne (for Java, .NET, Silverlight, Flash, etc.) has been doing okay so far. When I started here just a few years ago, we didn't even have that! And to make things worse, Windows Update had been blocked by the previous system admin because it was taking up too much bandwidth... WSUS fixed that problem by downloading the updates once and then distributing them.


                            I've been told WDS has some services that may take over patch management in the future, but I haven't had much time to look into those yet.