
It’s no secret that SolarWinds does a lot of web marketing. With that, we do a lot of web development. Our ability to generate revenue is dependent on the health of our website and our web applications. Over the years we have grown internationally and needed to test the health of our website from many locations around the world – places where we do not have a physical presence. To ensure global high availability of our websites, we turned to a SaaS provider to test our website and registration-form availability and performance from 10 locations around the world. This year, however, we were able to monitor our websites and web applications ourselves – at a much lower cost.

In addition to growing internationally, SolarWinds has been expanding its product footprint, with a special focus on the sysadmin market. Last year we brought to market our Server & Application Monitor product, which monitors server and application resources, like web servers. What we heard from these customers is that they also wanted to monitor the transaction response times of their web applications. So, in August 2011 we launched Web Performance Monitor (formerly named SeUM) 1.0. Like the SaaS provider we were using, Web Performance Monitor monitors the individual steps of a web transaction to proactively identify and isolate performance problems for web applications, but our first release did not have the ability to monitor from remote locations unless you had a physical presence there. On Valentine’s Day, SolarWinds released version 1.5, which includes the ability to deploy players to the Amazon® EC2 portal without leaving the product UI. That finally allowed us to deploy players to all those locations we needed to monitor around the world – the reason we went with a SaaS vendor to begin with.

Now you are thinking, “Of course this is lower cost – you are eating your own dog food.” Well, yes, it is more economical for us, but it is also more economical for everyone. With SaaS, we were monitoring 6 transactions from 10 locations around the world, for a total of 60 transactions, and we were paying subscription fees of $20k per year, which translates to $60k over a 3-year period. The cost to manage the same number of transactions with SolarWinds Web Performance Monitor over a 3-year period is around $18k at list price. What’s more, we can now manage our external-facing web apps with the same tool we use to manage our internal web apps, a feature not available with the SaaS options on the market today.

IT users want software that is powerful, easy to use, and easy to install. Can anyone argue with that statement? If we can all agree that this is one of the goals of users, then why are there so many solutions in the market that don’t meet this fundamental requirement? Why are so many vendors putting out expensive, hard-to-use, and difficult-to-install products?

It is my opinion that the core of the problem is that most suppliers sell software, under the auspices of being “powerful” or “flexible,” that simply cannot be used out of the box. Perhaps they do this because they also sell professional services that provide greater financial gain. With many suppliers, professional services are needed just to get the software deployed, and they’re couched under the customization umbrella, when in reality it’s building the software out to do what you thought you were buying in the demo.

An example of this hard-to-use problem is illustrated by one of the log management vendors, Splunk. Based upon a recent review of Splunk, it would appear that what you get is plenty of complexity and less out-of-the-box value. Below are just a few direct quotes from this review:

“If you want to make Splunk work, you’ve got to be ready to abandon the slick GUI and dive deep into difficult technical configuration, editing configuration files, writing regular expressions, and taking the time to understand where your data are coming from and how Splunk will see them.”

“We got Splunk working very smoothly in our multi-vendor environment, but only after investing serious effort in understanding how Splunk collects and indexes data.”

“Overall, getting data into Splunk is much more of your typical open source experience with a confusing maze of pointers, wikis, product tech notes and documentation…”

“The Search manual is 289 pages long, and starts with Splunk's idea of the top search commands you have to learn…there are almost 125 search commands.”

 

Now, to be fair, the reviewer had many positive things to say about the product, but the comments above illustrate a core problem. Just because a product is powerful doesn’t justify making it hard to use. Do they not agree with the customer’s goals of powerful, easy-to-use, easy-to-install software?

By no means is SolarWinds perfect in this regard. I’m sure you could pick at many things in our products that aren’t the easiest to use, but I will say this: when we know a feature is causing problems in the product, we will focus on reworking it so it becomes easier – something I never hear from other vendors. Will Splunk shorten the search command list so it’s easier to use? Will they make it easier so I don’t have to dive into deep technical configuration editing to get value out of my data? Maybe they will, but most likely, like many other vendors’, those ‘powerful features’ will remain in the product for years to come.

What do you think, are we listening to you and creating better, more powerful, easy to use software?

Well, evidently I caused a stir with my recent blog post questioning whether VMware has ceded the SMB market to Microsoft Hyper-V. The VMware author was quick to strongly critique my post, so let me take a minute to set the record straight.

Today, the VMware platform is undoubtedly more mature and robust than Hyper-V. Heck, we even ran a webcast last week with TechRepublic’s Scott Lowe comparing VMware and Hyper-V, which I thought offered a very fair comparison that favored VMware in many, if not most, areas. There are definitely a lot of questions around how much Microsoft will close the functionality gap with Hyper-V 3.0, but since it’s not shipping until at least later this year, I’ll reserve my comments on that until another time.

The statistics about market share used in my earlier post were from an article by Tim Greene published in Network World back in November entitled “Virtualization wars: VMware vs. Hyper-V vs. XenServer vs. KVM,” where the author says the following:

This flurry of improvements is in addition to progress Hyper-V has been making against ESX in licenses issued. Hyper-V grew 62% last year compared to ESX's 21% growth and Citrix's 25%, according to IDC. Separately, Gartner projects that by next year Hyper-V will account for 27% of the market, up from 11% two years ago. Within that projected 27%, Gartner says Microsoft will have captured 85% of all businesses with less than 1,000 employees that use virtual servers.

I’ve asked Mr. Greene via email to provide a citation for the stats above, and will update here when/if I receive it. Even if Mr. Greene misquoted Gartner, it’s not a big secret that Microsoft has had strong penetration into SMBs with Hyper-V and that their overall market share is growing. This has been a topic of discussion for years (e.g.: a post from the average blogger in 2010, a PC World post from 2009). The price of “free” often just resonates better with IT shops of this size.

The VMware Essentials kits are great, but from everything I can see, they’re not applicable to environments of more than three physical servers or six sockets. This is fine for very small businesses, but probably won’t meet the needs of many medium-sized organizations. These larger-than-very-small organizations will have to move up to higher-priced VMware products in order to get the functionality they need, but many will not need or want to pay for the most advanced features that VMware offers. So, a lot of these folks are at least considering Hyper-V as a lower-cost alternative. Every business will have to buy a Microsoft Windows Server license for their virtual environment. Since this Windows Server license includes Hyper-V with features like Live Migration for no additional cost, a lot of them choose to forego licensing separate virtualization software.

Again, these are just observations. SolarWinds doesn’t develop its own hypervisor (we just manage virtual environments), so we don’t have a horse in this race. We’ve got tons of customers of all sizes happily using both Hyper-V and VMware, some in mixed-hypervisor environments. We probably have an equal number on either side that have some angst with their hypervisor. To that end, we’ve made versions of our FREE VM Monitor tool for both VMware and Hyper-V. We think the market is big enough for at least two hypervisors!

There is a battle brewing in the IT industry. As I’ve noted before, virtualization is at the core of many of the fundamental changes in today’s data center. Pure virtualization served as the bridge between the old client-server model and new cloud computing, and, in most cases, a hypervisor exists in the cloud stack.

For years, the bulk of the data center was a two-horse race between Microsoft Windows Server and one of many Unix or Linux platforms. That battle appears to have calmed down, with most environments settling into some mix of these two technologies. Now, the battle is on a new front. It’s not really about hardware, and it’s not totally about software in the sense that we’ve talked about software before. It’s about software that emulates hardware. VMware was first to market and jumped out to a huge lead, primarily because they were a company dedicated to virtualization. Microsoft was a relative latecomer, but they’ve been able to use their considerable resources to play catch-up rather nicely…oh, and the fact that they give their product away for free doesn’t hurt much either!

I think we’ve only seen the beginning of this battle. Today, VMware owns somewhere north of 70% of the hypervisors deployed, but Gartner predicts that Microsoft will have around 27% of the market by the end of 2012. Given that there also has to be room for some KVM and Xen, that projection puts a significant cramp on VMware’s market share.

Even more telling is Gartner’s prediction that 85% of businesses with fewer than 1,000 employees will be Microsoft Hyper-V virtualization shops. So, that raises an important question: Has VMware ceded small and medium businesses to Microsoft Hyper-V? I assert that they have, based on a few observations:

1) VMware continues to add enterprise-focused capability to their product, most of which will not be used by SMBs.

2) The additional functionality wouldn’t be a big deal for SMBs except that VMware needs to charge more for licensing to support further development. So, SMBs end up paying more for functionality that they won’t use.

3) Unlike Microsoft, VMware doesn’t really have a way to profit from customers using a free version of their product, giving us reason to expect that they won’t be a price competitor with Microsoft in the short term.

I’m not suggesting that Hyper-V is better than VMware. As usual, the answer to the question of which one is better is really “it depends.” However, I think Hyper-V is becoming a much more formidable competitor all the way up to the enterprise and is actually dominating the SMB market.

We’ll explore the differences between VMware and Microsoft Hyper-V in depth in a webcast with industry expert Scott Lowe on March 21st at 11:00 AM CDT. If you’re interested in attending, we’d love to have you. Just click the following link to register.

In researching information on application optimization, I recently came upon a research report published by TRAC Research, “WAN Acceleration Can’t Do it All; Organizations Looking for More Advanced Capabilities for WAN Visibility & Control.” WAN optimization is a hot topic, especially with the influx of new cloud providers, as evidenced by all the WAN optimization vendors at last year’s VMworld and by recent VC funding. According to the survey, the top selection criteria for deploying a WAN optimization solution are improvements in application speed, ease of deployment and management, and improvements in network throughput. Survey respondents also noted that one of the top challenges with managing applications is a lack of visibility into network traffic. With this, the report concludes that while WAN optimization will improve application throughput, it is not enough to solve the challenge of managing application performance, and that WAN appliance solutions should be supplemented with tools for network and application visibility.

I agree with this assessment but would like to take it a step further. If the main goal is to improve application responsiveness, it is important to look at the application as a whole. The problem may be in the network tier, but maybe not. The network guy is often the fall guy. To determine whether a WAN accelerator actually solves an application latency problem in the network, it is critical to monitor traffic latency levels before and after the WAN optimization appliance is implemented. Is the network faster? Yes. Did it solve the application responsiveness issue? Umm….

Application responsiveness can be affected by problems anywhere in the application, from the network tier, to a slow database SQL query, to a memory-constrained web server. To understand whether you have an application response time issue from the end user’s perspective, it is important to monitor the transaction through their eyes. By monitoring the responsiveness of each step of the transaction, you can better isolate where the problem is coming from – was the checkout process slow, or did the application have a problem loading catalog data?
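To make that idea concrete, here is a minimal sketch of per-step synthetic transaction timing in Python. The step names and URLs are hypothetical placeholders; a real monitoring product would drive an actual browser, handle logins, and record far more detail:

```python
# A minimal sketch of per-step web transaction timing.
# The step names and URLs below are hypothetical placeholders.
import requests

# Each logical step of the transaction we want to time individually.
STEPS = [
    ("load catalog", "https://example.com/catalog"),
    ("view product", "https://example.com/product/42"),
    ("checkout page", "https://example.com/checkout"),
]

def time_transaction(steps):
    """Fetch each step in order and report its status and response time."""
    with requests.Session() as session:  # reuse one connection, as a browser would
        for name, url in steps:
            resp = session.get(url, timeout=10)
            # resp.elapsed measures the time from sending the request
            # until the response headers arrived.
            print(f"{name}: HTTP {resp.status_code} "
                  f"in {resp.elapsed.total_seconds():.2f}s")

if __name__ == "__main__":
    time_transaction(STEPS)
```

Run the same script before and after the WAN optimization appliance goes in, and you get a crude per-step before-and-after comparison instead of a single end-to-end number.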

I am curious to understand your experience – did deployment of WAN acceleration in your environment in fact solve your application response time problem…or do you know?

I recently read this article from InformationWeek on Cisco ruling the data center, but with threats on the horizon, and it got me thinking about the IT management software landscape. For years, if you wanted to manage your network or servers well, you were forced to buy a management platform and tailor it to your needs. Many big-name vendors (who shall remain nameless) charged folks through the nose for features they showed you in a demo and then charged you again to build those features (through services) in your environment. What’s that got to do with proprietary features, you may ask? Well, here it is – more than likely someone in your business bought those big, unwieldy platforms because they were sold a vision, and in that vision were a sprinkling of features that made their mouths water. You know the features – the ones you imagine in your head right after you’ve spent three days over Christmas tracing cables or rebuilding servers. They were proprietary features that you could only get if you bought the platform – the whole enchilada.

So you’ve been around the block with that story once, maybe two or three times now, and you’ve got to wonder whether the products that offer 95% of what the big platforms do are good enough. It’s the software equivalent of what the InformationWeek article talked about when it refers to vendors like Force10 (now Dell) using commodity silicon and focusing on price per port and raw performance. It’s the ethos we have in building our software at SolarWinds: build the things you can actually use, make them surprisingly easy, and do it at a price most of the market can afford. We believe that there are lots of problems that can be solved using this approach and that you, the IT user, know it’s a better way to build and buy products.

I hope you agree that the future of IT management software is squarely grounded in the idea that companies (and the people in those companies) build complete products for other people – products that help them solve problems and actually remove complexity from their lives – and make these products easy to try and easy to buy. If you do, then welcome to the SolarWinds Way!

Related recent news

Over the past few weeks, we’ve added to our family of products. So in addition to having the best network management products out there (if you haven’t tried NPM network monitor, you should), we now have the best systems management products out there, and we’ve got a number of problem areas covered (feel free to try any or all):

 

I was recently reading an interesting article from Eric Parizo at TechTarget – “Time to ban dangerous apps? Exploring third party app security” – and while I agree with a lot of the points he makes in the article, I would argue that his suggestion that banning common applications is an answer to protecting your organization will not fly in today’s business world.

As Dan Guido says in the same article, “every single piece of software you have is crap.”  From a hacker/exploiter perspective, there will always be a vulnerable app. When you close down one application with holes, whatever you choose to use instead is going to have similar or other issues attackers can exploit.  

Businesses will also incur the cost and penalty of having to re-train users on the new applications, and if there are dependencies on other software you use – either COTS (commercial off-the-shelf) or internal homegrown apps – then those need to be updated as well. One example is applications that leverage JREs.

From a business perspective, I recommend following the old adage that “the best defense is a strong offense.”

Parizo references two patch management solutions in the article, but says that users “either struggle to quickly identify and test high-priority security patches, or simply don't make it a priority.”  Isn’t that what patch management solutions are for? 

The root problem with many of the solutions in the market today is twofold. First, many are just too darn expensive for most organizations to afford. Unfortunately, the true cost of being exploited is not realized by many until it’s too late.

Second is ease of use. As Parizo writes, Microsoft has gotten much better at protecting its OSs from a security standpoint. It is also one of the few vendors with a mature update service: Windows Server Update Services (WSUS) is built into its server OSs and helps distribute patches for Microsoft products. However, as Parizo writes, third-party applications get left behind and do not enjoy the same luxury.
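To illustrate the gap, the first step of third-party patching is simply knowing what’s installed. Here is a minimal Python sketch of that inventory step, assuming a Windows host and reading the registry’s Uninstall key; a real patch management product would go much further and match these versions against vendor advisories:

```python
# A minimal sketch (Windows-only) of inventorying installed applications
# by reading the registry's Uninstall key. Note: on 64-bit Windows this
# reads only the native view; 32-bit apps live under WOW6432Node.
import winreg

UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def installed_apps():
    """Yield (name, version) for each app registered under the Uninstall key."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
        subkey_count = winreg.QueryInfoKey(root)[0]  # number of subkeys
        for i in range(subkey_count):
            with winreg.OpenKey(root, winreg.EnumKey(root, i)) as sub:
                try:
                    name = winreg.QueryValueEx(sub, "DisplayName")[0]
                    version = winreg.QueryValueEx(sub, "DisplayVersion")[0]
                except FileNotFoundError:
                    continue  # entry has no display name/version; skip it
                yield name, version

if __name__ == "__main__":
    for name, version in sorted(installed_apps()):
        print(f"{name} {version}")
```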

I believe patch management should protect both Windows and third-party applications. What about you? Would you ban common third-party apps at your company?

Never really giving it much thought, I, like many others, insert a USB drive into my company PC, highlight a company folder with confidential information, and drag…copy…paste it to the little, portable, and infamous USB flash drive that we’ve all come to depend on for easy sharing of information with others. Later: Hmmm! I can’t find it! Where is my USB drive?

Just how many companies out there worry about external security breaches but don’t think about their own internal organization?

A recent survey of 302 IT decision makers in the U.S. and Canada (conducted by Harris Interactive on behalf of Imation) revealed that nearly 37 percent of IT decision makers report unintentional exposure of corporate data through theft or loss of removable devices in the past two years. And 91 percent of companies allow removable storage devices on their corporate networks, but only 34 percent enforce encryption on those devices.

In addition, 81 percent of the U.S. and Canadian IT decision makers reported that their companies have a policy regarding the encryption of corporate data on employees’ removable storage devices; however, only 34 percent enforce encryption on both personal and company devices on their networks, and only 35 percent enforce encryption on company-issued devices. Twelve percent leave it to the user to enforce encryption.

And of the 91 percent of companies that allow removable storage devices on their corporate networks, 83 percent of users use USB flash drives as one of their devices. No encryption, no policy, or no enforcement of the encryption policy. So, how does a company easily and affordably start getting control of what its own employees plug into their corporate PCs?

In the new mobile era, USB monitoring and defense is becoming more important: the increasing use of USB storage devices has created a new security hole in the corporate network – your own internal organization! Copying a file to a flash drive can literally take just seconds. And so can losing it! IT organizations can consider a few different types of options to reduce the risk of security breaches and the compromise of corporate data through USB devices:

 

  • Entirely shut down all USB ports
  • Ignore the issue
  • Spend a lot of money on a Data Loss Prevention (DLP) solution

 

Any way you look at these options, you’re taking the chance of reducing productivity, making users unhappy, asking for security breaches, or explaining to your boss why you’re going over budget on a solution that’s far more than you need.
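For what it’s worth, the first option above is at least technically trivial. Here is a minimal Python sketch of it, assuming a Windows host with the stock USBSTOR mass-storage driver and admin rights:

```python
# A minimal sketch (Windows-only, requires admin rights) of the blunt
# "shut down USB storage" option: disabling the USBSTOR driver via the registry.
import winreg

USBSTOR_KEY = r"SYSTEM\CurrentControlSet\Services\USBSTOR"
DISABLED, MANUAL = 4, 3  # standard Windows service start values

def set_usb_storage(enabled: bool) -> None:
    """Enable or disable the USB mass-storage driver for newly attached devices."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, USBSTOR_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD,
                          MANUAL if enabled else DISABLED)

if __name__ == "__main__":
    set_usb_storage(False)  # block USB mass storage on this machine
```

That bluntness is exactly the problem: it blocks every flash drive for every user, with no monitoring, no exceptions, and no record of what was copied.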

A USB defender, by contrast, will allow IT to set rules on which types of USB devices are allowed on the network, monitor usage, alert on or block devices, and record every file a user copies to a USB drive. This type of option will keep productivity up, keep your users happy, block security breaches, and, with the right product, stay within your budget.

Are you one of those companies who only worry about external security breaches?  THINK AGAIN!

SolarWinds’ award-winning Log & Event Manager provides log collection, aggregation, and correlation, along with unique active response and USB defender technology. A free 30-day trial is available here.

 

Debbie Russo is SolarWinds’ Product Marketing Manager.  Debbie has over 10 years of experience in developing and implementing successful direct and indirect marketing strategies for high-tech companies ranging from start-ups to Fortune 100 organizations.
