
DoD-Certified Trusted Systems And You - Part Two 

Mar 22, 2000 02:00 AM

by Ben Malisow

 

Windows NT 4.0 was deemed C2 compliant on December 2, 1999, as a result of an extensive evaluation performed according to the Trusted Computer System Evaluation Criteria (TCSEC), the United States Department of Defense's analysis protocols. The famed Orange Book review means that DoD service components and agencies are authorized to use NT 4.0 as purchased "off the shelf" for government and military sensitive information.

While this is great news for Microsoft, what does it mean to you as an NT administrator?

The TCSEC Orange Book Interpretation

The Orange Book itself is not that useful for evaluating or assessing NT 4.0; the TCSEC guidance exists solely to give a broad overview of what an organization should examine when determining the security level compliance of a given system. The Orange Book is therefore necessarily vague, and its terms and conditions may or may not apply to NT 4.0.

There are four general divisions of security criteria, A, B, C, and D, with A being the most rigorous standard. Divisions B and C are further broken into classes C1, C2, B1, B2, and B3; division A contains class A1 and an open-ended category "beyond class A1." It is readily apparent that C2 is a relatively low security class.

Each class is defined by six integral requirements: security policy, marking, identification, accountability, assurance, and continuous protection. The Orange Book lists the six C2 requirements tersely, in fewer than three printed pages; even when combined with the C1 requirements, the description of necessary evaluation criteria is not very substantial.

That leaves the Orange Book widely open to interpretation. In June 1998, Science Applications International Corporation (SAIC) began that interpretation for NT Workstation and NT Server 4.0. In December 1999, both were found C2 compliant. The complete 193-page study can be found at http://www.radium.ncsc.mil/tpep/library/fers/TTAP-CSC-FER-99-001.pdf .

There is much dissension in the industry over exactly what C2 compliance means- what it's worth, what it covers, how it's applied, how the rating should affect your decision to purchase, implement, or utilize NT systems.

Please bear in mind that a Trusted Systems designation means only that the system is approved, as an off-the-shelf purchase, to handle data classified at the level assigned. This is not to say that the product in question is "secure," but only that it is "trusted." This may seem like a spurious distinction, but it is quite relevant.

Another point of contention is whether C2 approval means that NT is cleared in a networking configuration or solely as a stand-alone unit. According to the SAIC study, NT 4.0 with Service Pack 6a is C2 compliant in both designations.

On the other hand, "trusted" does not mean secure, and NT, right out of the box, does not necessarily provide as stringent protection as you might need, and may not even be a particularly wise option. That is where Trusted Systems Services comes in.

The Guidelines

The complete analysis of securing the NT product is found in the Windows NT Security Guidelines, published by Trusted Systems Services, and can be found at www.trustedsystems.com . Trusted Systems Services is an independent contractor commissioned by the National Security Agency (NSA) in 1998 to determine a specific means of securing Windows NT for the Department of Defense and other government agencies. The following is a brief description of those facets of the Guidelines that you, as an NT administrator, might find useful, but it is highly recommended that you download a copy and peruse it for your own purposes.

The Guidelines make use of an arbitrary distinction created by the authors, called Level 1 and Level 2 security, where the latter is for "those who wish to maximize the protection that Windows NT affords." Please bear in mind, and this can't be stressed enough, that C2 is already a relatively low security requirement, and most organizations will desire a much more stringent set of criteria.

Chapter 3 of the Guidelines involves the process of setting up a new NT network, dealing with both hardware concerns and a step-by-step installation process. If you are in the advantageous position of creating a new network for your organization, this can be of great help; if your system is already in place, reviewing this guide might be of some benefit in double-checking your current safeguards.

Key points offered in this chapter are: a recitation of common security practices, recommendations on excluding alternate boot methods, and strong warnings against installing other operating systems on NT machines. This chapter also highly recommends following the prescribed procedures when installing NT, and admonishes administrators against taking shortcuts; this is always good advice.

The warning concerning installation of multiple operating systems, with its rather urgent wording, might be considered an homage to the mindset of the National Security Agency, which sponsored this analysis. As I was told by a person who formerly worked for the NSA, institutional paranoia is standard practice, and the "single target" philosophy is a watchword of this belief. That is, by minimizing the number of targets the enemy might exploit, enhanced security measures are much simpler to implement. Therefore, this particular prohibition might be taken with a grain of salt if your particular situation calls for such an architecture.

Chapter 4 governs three integral NT aspects: domains and trusts; logon rights; and per-account restrictions. For domains and trusts, a concise, instructional background on the relationships between domains and users is offered, including some advice for reducing risk by restricting superfluous domain trusts. There is also a suggestion to use the Resource Domain model for small to medium installations. This chapter also presents various combinations of logon group memberships with differing security postures- you might find this extremely helpful, depending on the nature of your user base.

Chapter 6, entitled General Practices, should be reviewed for comparison with your own policies. Recommendations such as restricting remote and anonymous access to the Registry are very helpful. A suggestion to use the ProtectionMode registry value is wisely tempered by acknowledging the chance its employment can impact operations- again, see what is right for you and act accordingly. There are several other good suggestions in this chapter, such as prohibiting unauthenticated event log review, unapproved print driver installation, viewing of user account and share names, and display of the last logged-on user name. It also recommends several security features that are always good practice: screen saver lockout, SYSKEY encryption of hashed passwords, and password failure notification.
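Several of these recommendations correspond to well-known NT 4.0 registry values. As a sketch only- verify the exact keys and data against the Guidelines and your service pack level- they look roughly like this in REGEDIT4 (.reg) format:

```
REGEDIT4

; Restrict anonymous (null-session) enumeration of accounts and shares
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa]
"RestrictAnonymous"=dword:00000001

; Tighten the default ACLs applied to base system objects
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager]
"ProtectionMode"=dword:00000001

; Do not display the name of the last logged-on user at the logon prompt
[HKEY_LOCAL_MACHINE\Software\Microsoft\Windows NT\CurrentVersion\Winlogon]
"DontDisplayLastUserName"="1"
```

As always with registry edits, test these on a non-production machine first; some applications depend on the looser defaults.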

Chapter 7 is the prize-winning chapter; it covers the file system and Registry ACLs. Pages 38-40 are an extensive chart covering the WINNT directories and files that are of notable import in maintaining a secure environment, and the ACL settings relevant for effective restrictions. You should by all means review this table and check it against your system- if there are disparities, you should have a rationale for each one, usually in terms of how your system's specific requirements preclude compliance. This is a good tool for your own understanding.
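A hypothetical baseline audit in the spirit of that table might be scripted as follows; the paths and ACL strings below are illustrative, not quoted from the Guidelines' actual chart:

```python
# Compare observed ACL settings against a recommended baseline and report
# disparities, so each one can be documented with a rationale.
# Entries here are illustrative placeholders, NOT the Guidelines' values.

RECOMMENDED = {
    r"C:\WINNT\repair":          "Administrators: Full Control",
    r"C:\WINNT\system32\config": "Administrators: Full Control; SYSTEM: Full Control",
}

def find_disparities(observed):
    """Return {path: (recommended, observed)} for every setting that differs."""
    return {
        path: (rec, observed.get(path))
        for path, rec in RECOMMENDED.items()
        if observed.get(path) != rec
    }
```

Feeding the function the settings actually observed on a system yields exactly the list of deviations you need to justify, one by one.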

Page 44 is another, similar table, this one for the Registry ACLs. The same advice applies.

Chapter 13 covers policies for auditing and security logs. It is preceded by a very rational disclaimer that auditing policy is a direct function of the amount of time administrators are willing to spend conducting maintenance activity- plan accordingly.

The Guidelines notably do not prescribe setting your system to halt automatically if the security log fills, as some sites opt to do. There is also a good table on page 78 which suggests events that should be logged for security audits.
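For reference, that halt-on-full behavior is, to the best of my knowledge, controlled by a single LSA registry value on NT 4.0; sites that do want it (and accept the self-inflicted denial-of-service risk) set something like:

```
REGEDIT4

; Halt the system when the security event log is full (1 = enabled)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa]
"CrashOnAuditFail"=dword:00000001
```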

Chapter 14 deals with system services, programs that run upon booting and continue to operate during use. There is some very useful information here regarding restricting the number and power of services; administrators are urged to review this chapter.

Chapter 16 adds some input about networking, the heart of NT. There are very few suggestions here, as most of the crucial data is in previous chapters. This chapter does contain some basic security measures, such as limiting networking services and creating policies for client software.

Other aspects of this chapter that make for very interesting reading: assessment of threats, such as LANMAN format risks and service attacks, network eavesdropping and interception, and how and why to apply cryptography to network traffic.

Chapter 18 contains a great deal of quality information regarding spoofing and how to prevent it. The beginning of the chapter is rather vague and broad, suggesting logon separation and discussing Trusted Path and system-wide PATH qualities and options.

The rest of the chapter is extremely helpful, covering such topics as the "." issue (how the DOS window searches the current directory for commands before the PATH directories), data files that hold hidden programs, CD-ROM auto-run programs, shortcut spoofing, the suggestion to remove the Read ("R") permission from executable files, and spoofing of Internet browsers.
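The "." issue can be simulated in a few lines; the directories and file names below are illustrative:

```python
# Minimal simulation of the "." problem: the NT command prompt resolves a
# bare command name against the current directory before the PATH
# directories, so a planted copy of a common command shadows the real one.

def resolve(command, current_dir, path_dirs, filesystem):
    """Return the full path the shell would execute for `command`."""
    for directory in [current_dir] + path_dirs:  # current dir is searched first
        if command in filesystem.get(directory, set()):
            return directory + "\\" + command
    return None
```

If a hostile copy of edit.exe sits in the directory a user happens to be working in, this resolution order runs the hostile copy rather than the one in the system directory- which is exactly why the chapter treats "." so seriously.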

The final, and most comprehensive, portion of Chapter 18 concerns DLL spoofing, and provides a detailed, extensive list of precautions. The authors indicate, however, that no current methods for completely ameliorating the threat of DLL spoofing are available, and that quasi-preventative measures should not be relied upon.
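One of the quasi-preventative measures can be sketched as a simple audit: flag any directory that appears in an application's DLL search order ahead of the system directories and is writable by ordinary users, since a spoofed DLL planted there wins. The search order and writable set here are hypothetical inputs, not read from a live system:

```python
# Flag directories an attacker could use to shadow a system DLL: they sit
# earlier in the search order than the system directories AND are writable
# by ordinary users. Directory names are illustrative.

SYSTEM_DIRS = {r"C:\WINNT\system32", r"C:\WINNT"}

def spoofable_positions(search_order, world_writable):
    """Return the search-order directories usable for DLL spoofing."""
    risky = []
    for directory in search_order:
        if directory in SYSTEM_DIRS:
            break  # from here on, the legitimate copy is found first
        if directory in world_writable:
            risky.append(directory)
    return risky
```

As the Guidelines' authors caution, checks like this reduce exposure but do not eliminate it.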

The Guidelines, while helpful, are in no way comprehensive or complete. This is a baseline perspective of DoD acceptance standards, not a recipe for surety. Several criticisms are glaringly apparent, but most are not the fault of the authors, as the publication was not designed as an NT panacea.

The Guidelines do not deal with legal constraints and issues for a non-DoD workplace. If you administrate systems for a commercial organization, or even a government agency outside the DoD, this work is in no way the last word in analysis.

It is also severely limited in the scope of products recommended for all architectures; there are several good recommendations for applications that can assist you in your security efforts, but they are limited to other trusted systems- the government is not capable of reviewing the cutting-edge technology that you, as an independent administrator, might be able to.

And perhaps the most readily apparent fault with the Guidelines is the product they are applied to: MS Windows NT 4.0. Not Windows 2000, not UNIX, not Linux, not any other system or platform. If you don't have NT 4.0, this publication might be a useful reference, but it might not. This is only good for one product, from one perspective.

Learning to Trust Your System

As mentioned in Part I of this article, C2 compliance is nice, but integrating your system and your organization in a secure operational fashion is a matter for the DoD Information Technology Security Certification and Accreditation Process (DITSCAP). The DITSCAP is a continual, participatory method of designing, implementing, documenting, and securing your system, from inception through the end of the product life cycle. The DITSCAP is highly recommended, no matter what security level you're trying to attain, or what product you are using.

The process is broken down into four phases: Definition, Verification, Validation, and Post-Accreditation. The numbering of the phases is a bit misleading; the order is not set in stone, and each phase is in some manner continuous, so tailoring the DITSCAP to fit your needs is a fairly easy matter.

Definition is something many current systems are in dire need of; many organizations went about acquiring and employing their information technology in a rather ad hoc manner, without much procedural documentation. This can be a severe security hazard, as knowing what to protect makes creating security measures much more effective.

In this portion of the DITSCAP, your team is assembled to conduct the rest of the process. You'll need an organization information technology expert, your Certification Authority (CA), to act as the responsible party for inspecting each component of the system, related procedures, and structure for each of the phases. Then there is the Designated Approving Authority (DAA), the person in your organization who will ultimately be responsible for the integrity of the system; without the signature and approval of the DAA, the system is operating in the absence of organizational sanction. This should probably be your CIO, if your agency has one, or another member of the executive chain who understands the risks and uses of information technology. The team should also include the system administrator or manager, and a representative from the user community.

This phase will produce a document contractually outlining the responsibilities of each party or group of parties in your organization in terms of the system, how to protect it, and the policies for accomplishing this. If this is a new or soon-to-be-purchased system, the Definition phase will concentrate on exactly what uses would best be met by what components. If the system is already in place, this phase should concentrate on documentation of existing components, policies, and responsibilities- you'll probably be surprised to learn how much work needs to be done on this aspect of your system, but, in the long run, this is extremely useful.

Phase two, Verification, is the meat of the security certification aspect of the DITSCAP. You'll refine the documents compiled in the Definition phase, conduct development or modification of the system, design certification procedures, and analyze the results of the certification plan.

A risk assessment and agreement on what the organization considers acceptable risk is necessary before proceeding beyond this phase, as your actual accreditation activity takes place next. All preliminary work should be completed here.

Validation is the crux of the DITSCAP; in this phase the system is integrated into the organization, and the accreditation is accomplished. All of the preceding agreements and analysis should be incorporated with the accreditation process, and each member of the team has a distinct role in this phase; however, the CA is the primary actor in Validation.

All of the system requirements must be matched against threats, and compared to the risk assessment to determine if the minimal acceptable levels are met. A formal report is prepared and forwarded to the DAA for acceptance and approval. Once the system has been accredited, it is now considered operational, functioning, and within acceptable risk parameters- it should be trusted for the use your organization intended.

The last phase, Post-Accreditation, is a continual process, incorporating portions of each previous phase. The team should have determined a time frame for rigorous but non-intrusive risk testing; this can include Red Team "hostile" attack simulations, compiling security incident reports and determining if the system needs further protective modification, or any number of other means of testing reliability.

Additionally, changes to the system require completion of each of the DITSCAP phases before they are implemented. Your team should define the degree of the change necessary for formal, dedicated procedures, and which modifications require only simple documentation. Adding a new user is one end of the spectrum, while creating a new LAN or switching to a different platform is the other.
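The change-control spectrum above amounts to a triage rule your team defines up front. A hypothetical version- the categories and examples are illustrative only, and the real thresholds are your team's to set- might look like:

```python
# Classify a proposed system change by the DITSCAP process it requires.
# Category membership below is a placeholder; a real team would maintain
# these sets as part of the Phase One agreement document.

MINOR = {"add user", "remove user", "reset password"}
MAJOR = {"new LAN", "platform change", "new external connection"}

def required_process(change):
    """Return the process a proposed change triggers."""
    if change in MAJOR:
        return "full DITSCAP re-accreditation"
    if change in MINOR:
        return "document only"
    return "team review to classify"
```

The point of writing the rule down, in whatever form, is that no change slips through unclassified: anything not already on a list forces a team decision.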

The DITSCAP is incredibly helpful for any system, and, in terms of those rated as C2 compliant, can be of great value to your organization to define exactly what system integrity and acceptable risk really mean.

You And Your Untrusted System

Please bear in mind that while the DoD and Microsoft have a lot to be happy about since the C2 certification of NT 4.0, C2 is actually a very low security classification. It is extremely doubtful that you will want to consider your systems secure at C2 compliance. If you are undergoing the C2 certification process for an express purpose, such as possible consideration for military contracts at the government-sensitive level, or an executive mandate, then by all means adhere to the Guidelines. Most organizations, however, will want to prescribe a much stricter set of policies and procedures.

The Orange Book, C2 classification, Guidelines, and DITSCAP do not constitute a magic bullet for systems security. If you are protecting the secret formula for a popular soft drink or the specifications for the next transonic jumbo jet, C2 is almost comical.

The government, and in particular the military, relies on a credo of CYA- Cover Your Assets, to be polite. C2 is not designed to counter threats from dedicated sources; its purpose is to provide an organization with a modicum of security so that, should a successful attack take place, the owners, administrators, and users of said system can indicate that the requisite steps had been taken to prevent such an occurrence. There is a lot to be said for the government's reliance on the CYA philosophy, but if you're not working for the government, it might not be any comfort to you.

With that said, there is one particular way the Guidelines, the TCSEC, and any number of other government regulations can assist you in your efforts: standardization. Gone are the years of freewheeling, unhindered systems security practices; professional organizations, no matter their nature, are going to need a set security template, something that can be read, understood, repeated, and modified. While the C2 certification might not mean much in terms of classification, it is a standard, and all organizations will be well served to choose a mold, similar or not.

Unfortunately, many organizations may run into common problems: groupthink, uninformed management or executive levels unconcerned with actual system integrity opting for some sort of simple verification, ad hoc decisions, and overreliance on buzzworded solutions.

Let's face it: real security is expensive, time-consuming, and fairly monotonous and difficult. Explaining to your organization's senior staff why the process is never-ending and oftentimes intrusive is not easy. The real threat to your system is that programs like C2 certification will be seen as some sort of security immunization and will become the industry standard. Like ISO 9000 management certification for manufacturing, many organizations will ignore the real benefits of the C2 compliance designation and instead pay a large one-time fee for outside accreditation to buy a sense of invulnerability, only to never again deal with actual security concerns.

If the best aspects of the C2, TCSEC, and DITSCAP are distilled and implemented for your organization, you have really accomplished something.

As one security expert wrote concerning this topic: "Everyone should... (have) the understanding that even if they don't like the rigidity of the Orange Book - they still need to define, or adopt, a security policy and then make sure that their system configuration can enforce that policy."

I couldn't put it any better.

 


 

This article originally appeared on SecurityFocus.com -- reproduction in whole or in part is not allowed without expressed written consent.
