Malware is an issue that has been around since shortly after the start of computing and isn't something that is going to go away anytime soon. Over the years, the motivations, sophistication, and appearance have changed, but the core tenets remain the same. The most recent iteration of malware is called ransomware. Ransomware is software that takes control of the files on your computer, encrypts them with a key known only to the attacker, and then demands money (a ransom) to unlock the files and return the system to normal.


Why is malware so successful? It’s all about trust. Users need to be trusted to some degree so that they can complete the work they need to do. Unfortunately, the more we entrust to the end-user, the more ability a bad piece of software has to inflict damage on the local system and all the systems it’s attached to. Limiting how much of your systems, files, and network the end-user can modify helps mitigate this risk, but it has the side effect of inhibiting productivity and the ability to complete assigned work. Determining how much security is enough is often a catch-22 for businesses, and malicious actors have been taking advantage of this balancing act to successfully implement their attacks. Now that these attacks have been systematically monetized, we're unlikely to see them diminish anytime soon.


So what can you do to move the balance back to your favor?


There are some well-established best practices that you should consider implementing in your systems if you haven't done so already. These practices are not foolproof, but implemented well they should deter all but the most determined attackers and limit the scope of impact for those that do get through.


End-user Training: This has been recommended for ages and hasn't been the most effective tool for mitigating computer security risks. That being said, it still needs to be done. The safest way to mitigate the threat of malware is to avoid it altogether, so regularly training users to identify risky computing situations and to avoid them is critical to minimizing risk to your systems.


Implement Thorough Filtering: This refers to both centralized and distributed filtering tools put in place to automatically identify threats and stop users from making a mistake before it can cause any damage. Examples of centralized filtering are systems like web proxies, email spam/malware filtering, DNS filters, intrusion detection systems, and firewalls. Examples of local filtering include regularly updated anti-virus and anti-malware software. These filtering systems are only as good as the signatures they carry, though, so regular definition updates are critical. Unfortunately, signatures can only be developed for known threats, so this too is not foolproof, but it’s a good tool for ensuring that known, older variants aren't making it through to end-users to be clicked on and run.
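To make the signature idea concrete, here is a minimal sketch of how hash-based detection works. The set of known-bad SHA-256 digests is a hypothetical stand-in for a vendor's definition feed (the digest shown is actually that of an empty file, used here purely for illustration); real products use far richer signature formats and heuristics.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad SHA-256 digests. In practice this would be
# refreshed regularly from your AV vendor's definition feed -- which is why
# stale signatures quietly weaken the whole system.
KNOWN_BAD_HASHES = {
    # Illustrative entry only: this is the SHA-256 of an empty file.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path: Path) -> str:
    """Hash a file in fixed-size chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_malware(path: Path) -> bool:
    """Flag a file only if its digest matches a known signature."""
    return sha256_of(path) in KNOWN_BAD_HASHES
```

Note the limitation the sketch makes obvious: change one byte of the file and the digest no longer matches, which is exactly why signature-only filtering misses novel variants.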


The Principle of Least Privilege: This is exactly what it sounds like. It is easy to state, hard to implement, and sits at the center of the balance between security and usability. If a user has administrative access to anything, they should never be logged in with that account for day-to-day activities, and should use the higher-privileged account only when necessary. Users should be granted write access only to the files and shares they actually need to modify; malware can't do anything with files it can only read. Implementing software that either whitelists specific applications, or blacklists applications run from non-standard locations (temporary internet files, the downloads folder, etc.), can go a long way toward mitigating the threats that signature-based tools miss.
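The location-blacklist idea can be sketched in a few lines. The blocked directories below are illustrative assumptions, not a complete policy, and a production tool would enforce this at the OS level rather than in a script.

```python
from pathlib import Path

# Hypothetical deny-list: directories that executables should never launch
# from, such as the user's downloads folder and world-writable temp space.
BLOCKED_DIRS = [
    (Path.home() / "Downloads").resolve(),
    Path("/tmp").resolve(),
]

def execution_allowed(executable: Path) -> bool:
    """Deny execution when the binary resolves to a path under a blocked
    directory; resolving first defeats symlink and ../ tricks."""
    resolved = executable.resolve()
    return not any(resolved.is_relative_to(d) for d in BLOCKED_DIRS)
```

Because this checks location rather than content, it catches the dropper that a stale signature database misses, at the cost of occasionally inconveniencing a legitimate user.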


Patch Your Systems: This is another very basic concept, but one that is often neglected. Many pieces of malware exploit vulnerabilities that the vendor has already patched. Yes, patches sometimes break things. Yes, distributing patches across a large network can be cumbersome and time-consuming. You simply don't have an option, though. It needs to be done.
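As a rough illustration of why this is an inventory problem as much as a deployment problem, flagging hosts that are still below a vendor's fixed version is a simple comparison. The inventory layout, host names, and package names here are hypothetical.

```python
def parse_version(version: str) -> tuple:
    """Turn a dotted numeric version string into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def unpatched(inventory: dict, minimum_fixed: dict) -> list:
    """Return (host, package) pairs still below the patched version.

    inventory:     {host: {package: installed_version}}
    minimum_fixed: {package: first_version_containing_the_fix}
    """
    findings = []
    for host, packages in inventory.items():
        for pkg, installed in packages.items():
            fixed = minimum_fixed.get(pkg)
            if fixed and parse_version(installed) < parse_version(fixed):
                findings.append((host, pkg))
    return findings
```

Even a toy report like this makes the neglected machines visible, which is usually the first step toward getting patching prioritized.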


Have Backups: If you do get infected with ransomware, and it is successful in encrypting local or networked files, backups are going to come to the rescue. You are doing backups regularly, right? You are testing restores of those backups, right? It sounds simple, but so many find out that their backup system isn't working when they need it the most. Don't make that mistake.
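Testing a restore doesn't have to be elaborate: at minimum, restore a file to scratch space and compare its checksum against the source. A minimal sketch of that verification step, with all paths supplied by the caller:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of a file, hashed in chunks to handle large files."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(original: Path, restored: Path) -> bool:
    """A restore only counts if the restored bytes match the source exactly."""
    return file_digest(original) == file_digest(restored)
```

Run something like this on a rotating sample of files after every backup cycle and a silently broken backup job gets caught long before ransomware does the catching for you.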


Store Backups Offline: Backups that are stored online are at the same risk as the files they are backing up. Backups need to be written to removable media, and that media needs to be removed from the network and stored off-site. The more advanced ransomware variants look specifically to infect backup locations, since a functioning backup guarantees the attackers don't get paid. Don't let your last recourse become useless because you weren't diligent enough to take it offline and off-site.


Final Thoughts


For those of you who have been in this industry for any length of time (yes, I'm talking to you, graybeards of the bunch), you'll recognize the above list of action items as a simple set of good practices for a secure environment. However, I would be willing to bet you've worked in environments (yes, plural) that haven't followed one or more of these recommendations, due to a lack of discipline or a lack of proper risk assessment skills. Regardless, these tried-and-true strategies still work because the problem hasn't changed. It still comes down to the blast radius of a malware attack being directly correlated with the amount of privilege you grant the end-users in the organizations you manage. Help your management understand this tradeoff and the tools you have in your arsenal to manage it, and you can find the sweet spot between usability and security.