
8/12/2019
Terry Slattery

IT Security Refresh: Practical Tips for a Good Foundation (Part 1)

IT system intrusions and malware are increasing. What can you do about it?

I’ve stayed away from IT security for a long time. It has seemed to me like standing next to a dike, poking fingers into as many holes as I can and eventually running out of fingers. Then I look around and see malware coming through more leaky holes, and look up to see malware sloshing over the top. Is it really possible to provide reasonable IT security to an organization? Are there practical things enterprises can and should be doing?

NetCraftsmen co-worker John Cavanaugh recently pulled me in to assist with a security assessment. I was skeptical at first, since IT security seemed so futile. However, he shared some excellent material about how to protect corporate IT infrastructure. It turns out to be easier than I expected to build a solid security foundation. More importantly, there is factual evidence to indicate that the foundational steps actually work.

Ransomware

Ransomware is the most common malware, often delivered in phishing emails that get someone to click through to a Web page that infects the user’s computer. The malware then uses horizontal propagation to distribute itself within an organization. It stays dormant for a few days to allow the infection to spread, then wakes up and encrypts files. The encrypted data can be decrypted by paying a ransom, typically in Bitcoin, to the encryption key holder. Ransomware authors have generally been good about providing the keys; otherwise, their victims would have no reason to pay the ransoms.

Businesses have frequently been paying ransomware fees as the easiest approach to returning to business. But that only encourages more ransomware. Even if an organization has paid the ransomware fee, it still needs to spend on making sure that the IT systems aren’t vulnerable to a repeat attack. Some organizations have been hit multiple times because they continue with a business-as-usual approach. Other organizations have used system backups to return to business, although with a loss of some data since the last good backup.

Macros in Word documents propagate one version of ransomware. Infections spread when users share those documents, regardless of the sharing mechanism. Storing documents in a cloud system doesn’t prevent such malware unless the cloud storage system includes additional security scanning measures.

Ransomware that encrypts large shared document repositories is particularly painful. The same goes for backup partitions. If the backup repository is read/write accessible all the time, the ransomware can encrypt the old backups, rendering them useless for a recovery (read related article, “How Ransomware is Beating Your Backup”).

Building a Security Foundation

The National Security Agency’s cybersecurity unit has identified three basic security functions that would have prevented a substantial percentage of the cyber incidents it has experienced (read related article, “NSA: We have not responded to a zero-day in two years…”). That’s a pretty darn good foundation upon which to build.

Multi-Factor Authentication (MFA)

The city of Atlanta could have protected itself from its 2018 cyberattack using MFA. The ransomware easily propagated internally by using a brute-force password guessing mechanism. MFA relies on two or more secrets to gain access to an IT system. The typical mechanism uses two factors: something the user knows and something the user possesses. A password is frequently the factor the user knows, while a security token the user possesses generates a numeric passcode that’s valid for only a minute or two. Another common second factor is a numeric code sent via SMS message to the user’s cell phone. The user must enter the code as part of the two-step login process. The Atlanta ransomware would have failed the second step of the login process and been unable to propagate.
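
For illustration, here is a minimal sketch (Python, standard library only) of how a server might verify a time-based one-time passcode like the one a security token generates. The base32 shared secret, 30-second time step, and six digits are assumptions of the sketch, not any particular vendor’s scheme.

```python
# Minimal sketch of verifying a time-based one-time passcode (TOTP), the
# "something the user possesses" factor described above. The shared secret,
# 30-second time step, and six digits are illustrative assumptions.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, step=30, digits=6, at=None):
    """Derive the one-time code from a base32 shared secret (RFC 6238 style)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_second_factor(secret_b32, submitted_code):
    """Accept the current code or the previous step's code to allow for clock skew."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, at=now - drift), submitted_code)
               for drift in (0, 30))
```

A brute-force password guesser that never sees the token’s current code fails this second check, which is exactly why MFA blunts that style of propagation.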

Role-Based Access

The next function is role-based access, which prevents users from performing functions that aren’t part of their roles in the organization. Only IT administrators, not computer users, should be allowed to modify computer system files and privileges. There may also need to be an additional authentication step for system administrators to perform system modification functions.
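
As a rough illustration, a role-based check can be as simple as a mapping from roles to permitted actions, with everything else denied. The role and action names below are hypothetical examples, not a specific product’s permission model.

```python
# A minimal sketch of a role-based access check. Role and action names here
# are hypothetical, not drawn from any specific system.
ROLE_PERMISSIONS = {
    "user":     {"read_files", "run_applications"},
    "it_admin": {"read_files", "run_applications",
                 "modify_system_files", "change_privileges"},
}

def is_allowed(role, action):
    """Permit an action only if it is part of the role's allowed set."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("it_admin", "modify_system_files")
assert not is_allowed("user", "modify_system_files")   # ordinary users are refused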

Attempts by a non-administrator to modify system settings should be automatically logged to a security server for reporting and tracking of intrusion attempts. This acts as an early warning system, showing that malware is knocking on the door and where it originates. IT security professionals can then take defensive measures.
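
For example, a denied attempt could be forwarded to a central collector. The sketch below assumes a syslog server at 192.0.2.10 (a documentation address) and a hypothetical report_denied() helper; it is not a specific product’s logging API.

```python
# A sketch of forwarding a denied attempt to a central security server so
# intrusion attempts can be reported and tracked. The collector address
# (192.0.2.10, a documentation address) and field names are assumptions.
import logging
import logging.handlers

security_log = logging.getLogger("security-events")
security_log.setLevel(logging.WARNING)
security_log.addHandler(logging.handlers.SysLogHandler(address=("192.0.2.10", 514)))

def report_denied(user, role, action):
    """Send one event per denied attempt; the collector correlates repeat offenders."""
    security_log.warning("denied action=%s user=%s role=%s", action, user, role)

# e.g., called whenever the role check refuses an action:
report_denied("jdoe", "user", "modify_system_files")
```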

White-Listing Applications

White-listing applications is similar to role-based access, only allowing IT systems to communicate with other IT systems that are necessary for performing their functions. In a three-tier (Web, application, and database server) application model, the Web server should only communicate with the application server and with the external user, perhaps via a proxy and/or load balancer. In turn, the application server should only be able to communicate upstream with the Web tier servers and downstream with the database tier servers. Finally, the database tier servers should only be able to communicate upstream with the application tier servers and perhaps downstream with a back-end storage system. Each of these systems may also need to communicate with other systems like Domain Name System servers or be monitored by network management systems.
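
One way to express this kind of policy is an explicit allow list of tier-to-tier flows, with everything else denied by default. The tier names and ports below are illustrative assumptions, not a reference architecture.

```python
# A minimal sketch of a tier-to-tier allow list for the three-tier model
# described above. Tier names and ports are illustrative assumptions.
ALLOWED_FLOWS = {
    ("load_balancer", "web", 443),   # external users reach the Web tier via the proxy/LB
    ("web", "app", 8443),            # Web tier talks only to the application tier
    ("app", "db", 5432),             # application tier talks only to the database tier
    ("db", "storage", 3260),         # database tier talks to back-end storage
    ("web", "dns", 53),              # shared services such as DNS
    ("app", "dns", 53),
}

def flow_permitted(src_tier, dst_tier, dst_port):
    """Default deny: anything not explicitly listed is blocked."""
    return (src_tier, dst_tier, dst_port) in ALLOWED_FLOWS
```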

Unfortunately, many application vendors can’t tell you which protocols (TCP or UDP) and ports their applications use. Some application vendors just request that all communications be left open, but that’s not a good idea. Fortunately, you can use tools like NetFlow to determine what communication is actually happening. You should be prepared to monitor some applications for longer periods of time to catch periodic activity, like monthly and quarterly financial reporting. You should also investigate tools that automatically build access control lists that you can use to allow the desired connectivity and block everything else.
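
As a rough sketch of what such a tool does, the snippet below turns a CSV export of observed flow records into allow-list style ACL lines with a trailing deny. The export columns (src, dst, protocol, port) and the ACL name are assumptions about a generic flow collector, not a specific product’s format.

```python
# A sketch of building an access control list from observed flow records.
# The CSV columns (src, dst, protocol, port) and the ACL name are assumptions
# about a generic NetFlow collector export, not a specific product's format.
import csv

def acl_from_flows(csv_path, acl_name="APP-WHITELIST"):
    """Emit one permit line per distinct observed flow, then deny everything else."""
    rules = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["protocol"].lower(), row["src"], row["dst"], row["port"])
            rules[key] = f"permit {key[0]} host {key[1]} host {key[2]} eq {key[3]}"
    return [f"ip access-list extended {acl_name}", *rules.values(), "deny ip any any log"]

# print("\n".join(acl_from_flows("observed_flows.csv")))
```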

It’s important that you review the communication that is occurring between systems, just in case your IT infrastructure is already infected and is trying to propagate horizontally. I know that this isn’t generally feasible for organizations that have hundreds of applications. However, by using automation and exception reporting, combined with the other foundational steps, constructing basic white-listing security is possible.
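
Exception reporting can be as simple as comparing observed flows against the documented allow list and flagging anything unexpected. The flow tuples below reuse the illustrative (source tier, destination tier, port) format from the earlier sketch.

```python
# A sketch of exception reporting: flag any observed flow that the allow list
# never sanctioned, since it may indicate horizontal propagation.
def exception_report(observed_flows, allowed_flows):
    """Return flows seen on the wire that are not on the allow list."""
    return sorted(set(observed_flows) - set(allowed_flows))

allowed = {("web", "app", 8443), ("app", "db", 5432)}
observed = [("web", "app", 8443), ("web", "db", 5432)]   # the second flow bypasses the app tier
for flow in exception_report(observed, allowed):
    print("unexpected flow, investigate:", flow)
```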

Building on the Foundation

From this foundation, you can take many additional steps. I’ll look at recommendations for next steps in my next post.

To read the original blog post, view No Jitter’s post here.

Terry Slattery

Principal Architect

Terry Slattery is a Principal Architect at NetCraftsmen, an advanced network consulting firm that specializes in high-profile and challenging network consulting jobs. Terry is currently working on network management, SDN, business strategy consulting, and interesting legal cases. He is the founder of Netcordia, inventor of NetMRI, has been a successful technology innovator in networking during the past 20 years, and is co-inventor on two patents. He has a long history of network consulting and design work, including some of the first Cisco consulting and training. As a consultant to Cisco, he led the development of the current Cisco IOS command line interface. Prior to Netcordia, Terry founded Chesapeake Computer Consultants, which became a Cisco premier training and consulting partner. At Chesapeake, he co-invented and patented the v-LAB system to provide hands-on access to real hardware for the hands-on component of internetwork training classes. Terry co-authored the successful McGraw-Hill text "Advanced IP Routing in Cisco Networks," is the second CCIE (1026) awarded, and is a regular speaker at Enterprise Connect and Interop. He currently blogs at TechTarget, No Jitter and our very own NetCraftsmen.
