
At Tech Field Day 13, we gave everyone a sneak peek at some of the cool features we’ll be launching in March. If you missed the sneak peek, check out the footage below:

 

 

Our PerfStack Product Manager has also provided a detailed blog post on what you can expect from this cool feature. In the near future, look for the other Product Managers to give you more exclusive sneak peeks right here on THWACK.

 

Join the conversation

We are curious to hear what you think of PerfStack. After all, we built this feature from your requests. With that said, I’ve seen several requests on THWACK describing what you need from an Orion dashboard. I would love to hear from those of you directly in the IT trenches, specifically some ideas on how you would manipulate all this Orion data with PerfStack.

 

Personally, I would use PerfStack to visually correlate the page load times of synthetic transactions observed by WPM with web server performance data, network latency from NetPath, and maybe storage performance from SRM. That would give me a better understanding of what drives web server performance, and of what is likely to become a bottleneck that could impact end-user experience if load goes up. But we want to hear from you.

 

If you had access to PerfStack right now, what would you do with it?

What interesting problems could you solve if you could take any monitoring object from any node that you are monitoring with Orion? What problems would you troubleshoot? What interesting use cases can you imagine? What problems that you face today would it help you solve?

 

Let us know in the comments!

 

It seems like every organization is looking at what can, or should, be moved to the cloud. However, the cloud is clearly not for everything; as with any technology, there are benefits and tradeoffs. As such, it is important for all IT professionals to understand when and how the cloud is advantageous for their applications.

 

In this evaluation process, and in planning the migration of applications to the cloud, databases are usually the most difficult element to understand. Of course, data is the heart of every application, so knowing how databases can work reliably in the cloud is key. Here are a few ideas and recommendations to keep in mind when considering moving databases to the cloud:

 

1. It starts with performance. If I had a penny for every time I have heard “the cloud is too slow for databases,” I might have enough for a double venti latte. Performance uncertainty is the key concern that stops professionals from moving databases to virtualized environments or the cloud. However, this concern is often unfounded, as many applications have performance requirements that are easy to meet in a number of different cloud architectures. Cloud technology has evolved over the past three years to offer multiple deployment options for databases, some of them with very high performance capabilities.

 

2. Visibility can help. The easiest way to solve performance problems is to throw hardware at them, but that is obviously not a best practice, nor is it cost effective. A database monitoring tool can help you understand the true database and resource requirements of your application, such as:

    • CPU, storage, memory, latency, and storage throughput (IOPS can be deceiving)
    • Planned storage growth and backup requirements
    • Resource fluctuation based on peak application usage or batch processes
    • Data connection dependencies: aside from application connectivity, there may be other application data interchange requirements, backups, or flows of incoming data

One of the advantages of the cloud is the ability to dynamically scale resources up and down. So, rather than being the source of performance uncertainty concerns, the cloud can actually give you peace of mind that the right amount of resources can be allocated to your applications to ensure adequate performance. The key, however, is knowing what those requirements are. You can use Database Performance Analyzer (there is a 14-day free trial) to understand these requirements.
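
If you want a rough sense of these numbers yourself, the sketch below samples SQL Server’s file-level I/O statistics over a one-minute window. It assumes SQL Server and the pyodbc driver; the connection string is a placeholder, and a purpose-built tool like DPA goes much deeper than this.

```python
import time
import pyodbc

# Hypothetical connection string; replace with your own server and credentials.
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=db-host;DATABASE=master;Trusted_Connection=yes")

# Per-file I/O counters accumulated since instance start:
# reads/writes, bytes moved, and time stalled waiting on I/O.
IO_QUERY = """
SELECT DB_NAME(database_id) AS db, file_id,
       num_of_reads, num_of_writes,
       num_of_bytes_read, num_of_bytes_written,
       io_stall_read_ms, io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL)
"""

def snapshot(cursor):
    cursor.execute(IO_QUERY)
    # Key each row by (database, file) so two snapshots can be diffed.
    return {(r.db, r.file_id): r for r in cursor.fetchall()}

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    before = snapshot(cur)
    time.sleep(60)  # sample over one minute; longer windows smooth out spikes
    after = snapshot(cur)

for key, b in before.items():
    a = after[key]
    reads = a.num_of_reads - b.num_of_reads
    writes = a.num_of_writes - b.num_of_writes
    if reads + writes == 0:
        continue
    # Average stall per I/O is a rough latency figure; IOPS alone can be deceiving.
    stall = (a.io_stall_read_ms + a.io_stall_write_ms
             - b.io_stall_read_ms - b.io_stall_write_ms)
    mb = (a.num_of_bytes_read + a.num_of_bytes_written
          - b.num_of_bytes_read - b.num_of_bytes_written) / 1024 / 1024
    print(f"{key}: {(reads + writes) / 60:.1f} IOPS, "
          f"{mb / 60:.2f} MB/s, {stall / (reads + writes):.1f} ms/IO")
```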

 

3. Take a test drive. One of the obvious benefits of the cloud is low cost and accessibility. Even if you don’t have a migration plan in the works yet, it is a good idea to play with cloud databases to become familiar with them, experiment, and learn. In an hour of your time, you can get a database running in the cloud. Set one up, play with it, and kill it. The cost is minimal. With a bit more time and a few more dollars, you can even move a copy of a production database to the cloud, test deployment options, and learn how things specific to your application and database will work in the cloud.
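
As an example of how quick a test drive can be, here is a hedged sketch using boto3 against AWS RDS. AWS is just one possible provider, and the identifiers, instance class, and credentials shown are placeholders to adapt to your own account; check current pricing before you run it.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # region is a placeholder

# Create a small, throwaway PostgreSQL instance to experiment with.
rds.create_db_instance(
    DBInstanceIdentifier="test-drive-db",   # hypothetical name
    DBInstanceClass="db.t3.micro",          # small, cheap class for experiments
    Engine="postgres",
    MasterUsername="dbadmin",
    MasterUserPassword="change-me-please",  # placeholder; use a real secret
    AllocatedStorage=20,                    # GiB
)

# Block until the instance is reachable (typically several minutes).
rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="test-drive-db")
print("Instance is up; connect, load a copy of your data, and experiment.")

# When you are done playing, kill it so the meter stops running.
rds.delete_db_instance(
    DBInstanceIdentifier="test-drive-db",
    SkipFinalSnapshot=True,  # fine for a scratch instance, never for production
)
```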


4. Carefully plan your deployment model. The cloud offers multiple deployment options that should be considered. For example, Database as a Service (DBaaS) provides simplicity in deployment, automation, and a managed service. Leveraging Infrastructure as a Service (IaaS) to run database instances on cloud servers is an alternative that provides more control and that looks and feels like a traditional on-premises deployment. There are also various storage options, including block storage, SSD drives, guaranteed IOPS, dedicated connections, and database-optimized instances. Because the cloud is mostly a shared environment, it is also important to understand and test for performance consistency and variability, not just peak theoretical performance.
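
Because peak numbers say little about variability, one simple consistency test is to run the same query many times and look at latency percentiles rather than averages. A minimal sketch, assuming a PostgreSQL instance and the psycopg2 driver, with a hypothetical query and placeholder connection details:

```python
import statistics
import time
import psycopg2

# Placeholder connection details for the cloud database under test.
conn = psycopg2.connect(host="my-cloud-db", dbname="app",
                        user="bench", password="secret")
cur = conn.cursor()

samples = []
for _ in range(500):
    start = time.perf_counter()
    cur.execute("SELECT count(*) FROM orders WHERE status = 'open'")  # hypothetical query
    cur.fetchall()
    samples.append((time.perf_counter() - start) * 1000)  # milliseconds

# Percentiles reveal the variability that averages hide; a shared environment
# with noisy neighbors shows up as a long tail (p99 far above the median).
pct = statistics.quantiles(samples, n=100)
print(f"median={statistics.median(samples):.1f} ms  "
      f"p95={pct[94]:.1f} ms  p99={pct[98]:.1f} ms")

conn.close()
```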

 

5. Make the move. There is no single migration plan that covers all use cases. Rather than trying to use some formula for making the move to the cloud, I recommend talking to your cloud provider, explaining your environment, and getting their guidance. It is also usually a good idea to create a duplicate environment in the cloud and verify that it runs well before switching over the production application. In addition to your data recovery and backup requirements, it is also important to consider replication or standby servers in a different region than where your primary servers are located.
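
On a managed service, standing up a standby in another region can be a single API call. Here is a hedged sketch for AWS RDS with boto3, assuming the primary runs in us-east-1; the ARN and identifiers are placeholders:

```python
import boto3

# Create the replica in a *different* region than the primary,
# so a regional outage does not take out both copies.
rds_west = boto3.client("rds", region_name="us-west-2")

rds_west.create_db_instance_read_replica(
    DBInstanceIdentifier="mydb-standby",  # hypothetical replica name
    # Cross-region replicas reference the source by its full ARN (placeholder).
    SourceDBInstanceIdentifier="arn:aws:rds:us-east-1:123456789012:db:mydb",
    SourceRegion="us-east-1",  # lets boto3 presign the cross-region copy request
)
```

As noted above, verify the replica keeps up under production-like load before you rely on it for failover.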

 

6. Monitor and optimize. Just as with on-premises deployments, it is important to monitor and optimize your cloud environment once it is up and running. Database optimization tools that offer wait time analysis and resource correlation can speed database operations significantly, alert you to issues before they become big problems, increase application performance, and monitor resources to help with planning. Database administrators, developers, and IT operations can all benefit from a performance analysis tool like SolarWinds DPA that allows them to write good code and pinpoint the root cause of whatever could be slowing down the database, whether that be queries, storage events, server resources, or something else.
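
To make wait time analysis concrete, the sketch below diffs two snapshots of SQL Server’s sys.dm_os_wait_stats to surface the top waits over a five-minute window. The connection string is again a placeholder, and a tool like DPA correlates these waits with specific queries and resources automatically.

```python
import time
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=db-host;DATABASE=master;Trusted_Connection=yes")

WAITS = "SELECT wait_type, wait_time_ms FROM sys.dm_os_wait_stats"

def wait_snapshot(cursor):
    cursor.execute(WAITS)
    return dict(cursor.fetchall())  # {wait_type: cumulative ms}

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    before = wait_snapshot(cur)
    time.sleep(300)  # watch a five-minute window
    after = wait_snapshot(cur)

# Rank wait types by time accumulated during the window; the top entries
# tell you whether the database is waiting on storage, locks, CPU, etc.
deltas = {w: after.get(w, 0) - ms for w, ms in before.items()}
for wait_type, ms in sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    if ms > 0:
        print(f"{wait_type}: {ms} ms waited in the last 5 minutes")
```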

 

The cloud is evolving quickly. It is getting better, more reliable, and more flexible all the time. Just as five years ago most of us could not envision how transformative the cloud would be today, we should expect the technology to continue evolving at the same pace over the next five years. This is one more reason to start experimenting with the cloud today. It is a journey that requires breaking some paradigms and shifting your mindset, but also a journey that can provide significant benefits for your applications and your job.
