Intelligence cycle security

National intelligence programs, and, by extension, the overall defenses of nations, are vulnerable to attack. It is the role of intelligence cycle security to protect the process embodied in the intelligence cycle, and that which it defends. A number of disciplines go into protecting the intelligence cycle. One of the challenges is that there is a wide range of potential threats, so threat assessment, if it is to be complete, is a complex task. Governments try to protect three things:

  • Their intelligence personnel
  • Their intelligence facilities and resources
  • Their intelligence operations

Defending the overall intelligence program, at a minimum, means taking actions to counter the major disciplines of intelligence collection techniques:

  • Human Intelligence (HUMINT)
  • Signals Intelligence (SIGINT)
  • Imagery Intelligence (IMINT)
  • Measurement and Signature Intelligence (MASINT)
  • Technical Intelligence (TECHINT)
  • Open Source Intelligence (OSINT)

To these is added at least one complementary discipline, counterintelligence (CI), which, besides defending the six above, can itself produce positive intelligence. Much, but not all, of what it produces comes from special cases of HUMINT.

Also complementing intelligence collection are additional protective disciplines that are unlikely to produce intelligence, such as the physical security, personnel security, communications security, and operations security discussed below.

These disciplines, along with CI, form intelligence cycle security, which, in turn, is part of intelligence cycle management. Disciplines involved in "positive security", or measures by which one's own society collects information on its actual or potential security, complement security. For example, when communications intelligence identifies a particular radio transmitter as one used only by a particular country, detecting that transmitter inside one's own country suggests the presence of a spy that counterintelligence should target.
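
The reasoning in the transmitter example lends itself to a simple correlation check. The following is a minimal sketch in Python, assuming a watchlist of emitter signatures already attributed by communications intelligence; the signature names, service attribution, and detections are all hypothetical and for illustration only.

# A minimal sketch of the "positive security" example above: communications
# intelligence has characterized a transmitter type used only by a particular
# foreign service, so a domestic detection of that signature becomes a
# counterintelligence lead. All names and detections here are invented.

KNOWN_FOREIGN_SERVICE_EMITTERS = {
    "burst-transmitter-type-A": "Country X clandestine service",  # hypothetical attribution
}

detections = [
    {"signature": "amateur-band-cw", "location": "rural repeater site"},
    {"signature": "burst-transmitter-type-A", "location": "apartment block, capital city"},
]

for d in detections:
    service = KNOWN_FOREIGN_SERVICE_EMITTERS.get(d["signature"])
    if service is not None:
        print(f"CI lead: emitter associated with {service} detected at {d['location']}")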

CI refers to efforts made by intelligence organizations to prevent hostile or enemy intelligence organizations from successfully gathering and collecting intelligence against them. Frank Wisner, a well-known CIA operations executive, said of the book by Director of Central Intelligence Allen W. Dulles[1] that Dulles "disposes of the popular misconception that counterintelligence is essentially a negative and responsive activity, that it moves only or chiefly in reaction to situations thrust upon it and in counter to initiatives mounted by the opposition." Rather, he sees that CI can be most effective, both in information gathering and in protecting friendly intelligence services, when it creatively but vigorously attacks the "structure and personnel of hostile intelligence services."[2] In the 1991[3] and 1995 US Army manuals dealing with counterintelligence,[4] CI had a broader scope against the then-major intelligence collection disciplines. While MASINT was defined as a formal discipline in 1986,[5][6] it was sufficiently specialized not to be discussed in general counterintelligence documents of the next few years.

All US departments and agencies with intelligence functions are responsible for their own security abroad.[7]

In many governments, the responsibility for protecting intelligence and military services is split. Historically, the CIA assigned responsibility for protecting its personnel and facilities to its Office of Security, while it assigned the security of operations to multiple groups within the Directorate of Operations: the counterintelligence staff and the area (or functional) unit, such as the Soviet Russia Division. At one point, the counterintelligence unit operated quite autonomously, under the direction of James Jesus Angleton. Later, operational divisions had subordinate counterintelligence branches, as well as a smaller central counterintelligence staff. Aldrich Ames was in the Counterintelligence Branch of Europe Division, where he was responsible for directing the analysis of Soviet intelligence operations. US military services have had a similar and even more complex split.

Some of the overarching CI tasks are described as:

  • Developing, maintaining, and disseminating multidiscipline threat data and intelligence files on organizations, locations, and individuals of CI interest. This includes insurgent and terrorist infrastructure and individuals who can assist in the CI mission.
  • Educating personnel in all fields of security. A component of this is the multidiscipline threat briefing. Briefings can and should be tailored, both in scope and classification level. Briefings could then be used to familiarize supported commands with the nature of the multidiscipline threat posed against the command or activity.

Changes in doctrine for protecting the entire intelligence cycle?

The US definition of counterintelligence, however, is narrowing, while the definition of Operations Security (OPSEC) seems to be broadening. The manuals of the early 1990s described CI as responsible for overall detection of, and protection from, threats to the intelligence cycle. With the 2005-2007 National Counterintelligence Strategy statements, it is no longer clear what function is responsible for the overall protection of the intelligence cycle. In this recent US doctrine, although not necessarily that of other countries, counterintelligence (CI) is now seen as primarily a counter to the HUMINT of a Foreign Intelligence Service (FIS). FIS is a term of art that covers both nations and non-national groups, the latter including terrorists or organized crime involved in areas considered to be fundamental threats to national security.

The National Counterintelligence Strategy of 2005[8] states the CI mission as:

  • counter terrorist operations
  • seize advantage
  • protect critical defense technology
  • defeat foreign denial and deception
  • level the economic playing field
  • inform national security decisionmaking
  • build a national CI system

The 2007 US joint intelligence doctrine[9] restricts its primary scope to counter-HUMINT, which usually includes counter-terror. It is not always clear, under this doctrine, who is responsible for all intelligence collection threats against a military or other resource. The full scope of US military counterintelligence doctrine has been moved to a classified publication, Joint Publication (JP) 2-01.2, Counterintelligence and Human Intelligence Support to Joint Operations, so publicly it is unknown if that problem is clarified there.

Countermeasures to specific collection disciplines

More specific countermeasures against the intelligence collection disciplines are listed below.

CI roles against intelligence collection disciplines, 1995 doctrine (FM 34-60)

  Discipline | Offensive CI | Defensive CI
  HUMINT | Counterreconnaissance, offensive counterespionage | Deception in operations security
  SIGINT | Recommendations for kinetic and electronic attack | Radio OPSEC, use of secure telephones, SIGSEC, deception
  IMINT | Recommendations for kinetic and electronic attack | OPSEC countermeasures, deception (decoys, camouflage); if accessible, use SATRAN reports of satellites overhead to hide or stop activities while being viewed

Counter-HUMINT

See additional detail on Project Slammer, an effort of the Intelligence Community Staff, under the Director of Central Intelligence, to characterize personnel who had committed espionage; it was an Intelligence Community sponsored study of espionage.

Aspects of physical security, such as guard posts and wandering guards that challenge unidentified persons, certainly help with counter-HUMINT. Security education, also part of OPSEC, is important to this effort.

Counter-SIGINT

Military and security organizations will provide secure communications, and may monitor less secure systems, such as commercial telephones or general Internet connections, to detect inappropriate information being passed through them. Education on the need to use secure communications, and instruction on using them properly so that they do not become vulnerable to specialized technical interception, is also required. Methods including encryption and traffic-flow security may be needed in addition to, or instead of, specialized shielding of the equipment.

The methods possible in counter-SIGINT cover a wide range, from what is appropriate in a combat zone to what can be done in a research laboratory. At the combat end, if an enemy SIGINT interception antenna can be targeted, thoroughly bombing it and its associated electronics will definitely end its career as a SIGINT threat. In slightly less dramatic fashion, it can be taken under electronic attack, with strong enough electromagnetic signals directed at it to conceal any friendly signals, and perhaps to overload and destroy the electronics of the SIGINT facility. SIGINT protection for office buildings may require a room to be electronically shielded.
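
Encryption protects message content, while traffic-flow security hides how much is being said and when. The following is a minimal sketch, assuming the third-party Python "cryptography" package is available; the fixed frame size and the sample message are illustrative only, not drawn from any operational system.

# A minimal sketch of two counter-SIGINT ideas from the text: encrypting message
# content, and padding every message to a constant frame size so that an
# interceptor cannot distinguish short messages from long ones (traffic-flow security).
import struct
from cryptography.fernet import Fernet

FRAME = 1024  # every plaintext frame is padded to this many bytes (illustrative value)

def pad(message: bytes) -> bytes:
    """Prefix the true length, then pad with zero bytes to a constant frame size."""
    if len(message) > FRAME - 2:
        raise ValueError("message too long for one frame")
    return struct.pack(">H", len(message)) + message + b"\0" * (FRAME - 2 - len(message))

def unpad(frame: bytes) -> bytes:
    (length,) = struct.unpack(">H", frame[:2])
    return frame[2:2 + length]

key = Fernet.generate_key()          # in practice, keys come from a managed keying system
cipher = Fernet(key)

token = cipher.encrypt(pad(b"FLASH: move the convoy at 0400"))
print(len(token))                    # ciphertext length is independent of message length
print(unpad(cipher.decrypt(token)))  # b'FLASH: move the convoy at 0400'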

Counter-IMINT

The basic methods of countering IMINT are to know when the opponent will use imaging against one's own side, and to interfere with the taking of images. In some situations, especially in free societies, it must be accepted that public buildings may always be subject to photography or other techniques.

Countermeasures include putting visual shielding over sensitive targets or camouflaging them. When countering threats such as imaging satellites, awareness of the orbits can guide security personnel to stop an activity, or perhaps to cover the sensitive parts, while the satellite is overhead. This also applies to imaging from aircraft and UAVs, although the more direct expedient of shooting them down, or attacking their launch and support area, is an option in wartime.
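
As an illustration of overpass awareness, the following is a minimal sketch in Python. In practice the overpass windows would come from SATRAN-style reporting; the times, the warning margin, and the decision messages below are invented for illustration.

# A minimal sketch: pause or cover a sensitive activity whenever a known imaging
# overpass is in progress or about to begin. Window times are hypothetical.
from datetime import datetime, timedelta, timezone

# (start, end) of predicted imaging overpasses, in UTC
OVERPASS_WINDOWS = [
    (datetime(2024, 1, 1, 9, 40, tzinfo=timezone.utc),
     datetime(2024, 1, 1, 9, 52, tzinfo=timezone.utc)),
    (datetime(2024, 1, 1, 21, 5, tzinfo=timezone.utc),
     datetime(2024, 1, 1, 21, 18, tzinfo=timezone.utc)),
]

def exposed(now: datetime, margin_minutes: int = 10) -> bool:
    """True if an imaging pass is in progress, or predicted to start within the margin."""
    lead = timedelta(minutes=margin_minutes)
    return any(start - lead <= now <= end for start, end in OVERPASS_WINDOWS)

if exposed(datetime.now(timezone.utc)):
    print("Pause the sensitive activity / deploy camouflage: imaging pass predicted")
else:
    print("No predicted imaging pass; activity may proceed")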

Counter-OSINT

While the concept long predates the recognition of OSINT as a discipline, the idea of censorship of material directly relevant to national security is a basic OSINT defense. In democratic societies, even in wartime, censorship must be watched carefully lest it violate reasonable freedom of the press, but the balance is set differently in different countries and at different times.

Britain is generally considered to have a very free press, but the UK does have the DA-Notice, formerly D-Notice, system. Many British journalists find that this system is used fairly, though there will always be arguments. In the specific context of counterintelligence, note that Peter Wright, a former senior member of the Security Service who left the service without his pension, moved to Australia before publishing his book Spycatcher. While much of the book was reasonable commentary, it did reveal some specific and sensitive techniques, such as Operation RAFTER, a means of detecting the existence and setting of radio receivers.

Operations Security

Even though the principles of OPSEC go back to the beginning of warfare, formalizing OPSEC as a US doctrine began with a 1965 study called PURPLE DRAGON, ordered by the Joint Chiefs of Staff, to determine how the North Vietnamese could get early warning of ROLLING THUNDER fighter-bomber strikes against the North, and ARC LIGHT B-52 missions against the South.[10]

The methodology used was to consider what information the adversary would need to know in order to thwart the flights and the sources from which the adversary might collect this information.

It became apparent to the team that although traditional security and intelligence countermeasures programs existed, reliance solely upon them was insufficient to deny critical information to the enemy—especially information and indicators relating to intentions and capabilities. The group conceived and developed the methodology of analyzing U.S. operations from an adversarial viewpoint to find out how the information was obtained.

The team then recommended corrective actions to local commanders. They were successful in what they did, and to name what they had done, they coined the term "operations security."

Establishment within DOD

After the Vietnam War ended in 1973, the JCS adopted the OPSEC instruction developed by the CINCPAC OPSEC branch and published it as JCS Publication 18, "Doctrine for Operations Security". Originally classified, the publication was followed the next year by an unclassified version. The JCS published the first JCS OPSEC Survey Planning Guide and distributed this publication within DOD and to other Federal agencies.

The JCS began presenting a series of annual OPSEC conferences for representatives throughout government. Attendees discussed ways to adapt the OPSEC concept developed for combat operations to the peacetime environment. Throughout the late 1970s, the military services established their own OPSEC programs and published implementing directives and regulations. By the end of the decade, they were conducting their own surveys.

Establishment outside DOD

After the Vietnam War, a number of the individuals who had been involved in the development or application of the OPSEC concept were either already working for, or went to work with, the National Security Agency (NSA). As a result, NSA played a major role in adapting OPSEC to peacetime operations, and in informally communicating OPSEC concepts to other agencies. Thus, non-DoD agencies began to establish their own programs, and OPSEC was on its way to becoming a national program.

Establishment within DOE

DOE began to assume a role in the peacetime application of OPSEC by participating in the JCS OPSEC conferences and interfacing with other Federal agencies. In 1980, DOE began establishing its own OPSEC program and, by 1983, the Department had the only formal OPSEC program outside DOD. Since that time, DOE has continued to refine and adapt the OPSEC concept to meet the specific needs of its mission.

In 1983, DOE was a member of the Senior Interagency Group for Intelligence (SIG-I), an advisory group of the National Security Council. SIG-I proposed the establishment of a national policy on OPSEC and the formation of a National OPSEC Advisory Committee (NOAC).

In 1988, President Ronald Reagan issued National Security Decision Directive 298 (NSDD-298), which established a national policy and outlined the OPSEC five-step process. Also mandated within NSDD-298 was the establishment of the Interagency OPSEC Support Staff (IOSS). The IOSS mission is to assist in implementing the national-level OPSEC program as directed by the President. The NSDD directs the IOSS to provide or facilitate OPSEC training and to act as a consultant to Executive departments and agencies required to have OPSEC programs. Operations security (OPSEC), in a widely accepted meaning,[11] relates to identifying the information that is most critical to protect regarding future operations, and planning activities to:

  • Identify those actions that can be observed by adversary intelligence systems
  • Determine indicators that adversary intelligence systems might obtain that could be interpreted or pieced together to derive critical information in time to be useful to adversaries
  • Design and execute measures that eliminate or reduce to an acceptable level the vulnerabilities of friendly actions to adversary exploitation.

Contrary to the US Department of Defense definition, a 2007 webpage of the US Intelligence Board[12] describes (emphasis added) "the National Operations Security (OPSEC) Program - a means to identify, control, and protect unclassified information and evidence associated with U.S. national security programs and activities. If not protected, such information often provides an opportunity for exploitation by adversaries or competitors working against the interests of the US."

Even though there are legitimate concerns about adversaries exploiting open societies, the broader definition of OPSEC has created concerns that national security may be invoked to conceal the politically embarrassing or the marginally legal, rather than to protect legitimate functions; such use requires vigilance.

What constitutes OPSEC? A Presidential-level National Security Decision Directive formalized it[13] as the five steps below, the details of which have been expanded upon by the Department of Energy;[14] a sketch of the risk-assessment step follows the list.

  1. Identification of the critical information to be protected: Basic to the OPSEC process is determining what information, if available to one or more adversaries, would harm an organization's ability to effectively carry out the operation or activity. This critical information constitutes the "core secrets" of the organization, i.e., the few nuggets of information that are central to the organization's mission or the specific activity. Critical information usually is, or should be, classified or at least protected as sensitive unclassified information.
  2. Analysis of the threats: Knowing who the adversaries are and what information they require to meet their objectives is essential in determining what information is truly critical to an organization's mission effectiveness. In any given situation, there is likely to be more than one adversary and each may be interested in different types of information. The adversary's ability to collect, process, analyze, and use information, i.e., the threat, must also be determined.
  3. Analysis of the vulnerabilities: Determining the organization's vulnerabilities involves systems analysis of how the operation or activity is actually conducted by the organization. The organization and the activity must be viewed as the adversaries will view it, thereby providing the basis for understanding how the organization really operates and what are the true, rather than the hypothetical, vulnerabilities.
  4. Assessment of the risks: Vulnerabilities and specific threats must be matched. Where the vulnerabilities are great and the adversary threat is evident, the risk of adversary exploitation is expected. Therefore, a high priority for protection needs to be assigned and corrective action taken. Where the vulnerability is slight and the adversary has a marginal collection capability, the priority should be low.
  5. Application of the countermeasures: Countermeasures need to be developed that eliminate the vulnerabilities, threats, or utility of the information to the adversaries. The possible countermeasures should include alternatives that may vary in effectiveness, feasibility, and cost. Countermeasures may include anything that is likely to work in a particular situation. The decision of whether to implement countermeasures must be based on cost/benefit analysis and an evaluation of the overall program objectives.
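
The risk-assessment step (step 4) amounts to matching each vulnerability against the adversary's collection capability and assigning a protection priority. The following minimal Python sketch illustrates that matching; the numeric scale, the thresholds, and the sample entries are assumptions for illustration, not part of any official methodology.

# A minimal sketch of OPSEC step 4: score each critical-information item by how
# vulnerable it is and how capable the adversary is of collecting it, then assign
# a protection priority. Scales and thresholds are illustrative assumptions.

def priority(vulnerability: int, threat: int) -> str:
    """Both inputs are on a 1 (slight/marginal) to 5 (great/evident) scale."""
    score = vulnerability * threat
    if score >= 16:
        return "HIGH - apply countermeasures now"
    if score >= 8:
        return "MEDIUM - plan countermeasures"
    return "LOW - accept or monitor"

assessment = [
    # (critical information item, vulnerability, adversary collection capability)
    ("convoy departure time discussed on commercial phones", 5, 4),
    ("unit strength posted on an internal, access-controlled site", 2, 2),
]

for item, vuln, threat in assessment:
    print(f"{item}: {priority(vuln, threat)}")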

OPSEC Scope

OPSEC complements (emphasis in NSDD 298) physical, information, personnel, computer, signals, communications, and electronic security measures. Note that this does not include counter-HUMINT or counter-IMINT. If the definition of counterintelligence is redefined to cover counter-HUMINT, a scope begins to emerge, although an official term still seems lacking to deal with the totality of security against all threats. This series of articles simply tries to define it for the security of intelligence.

Another way of describing the scope was coined by Kurt Haase of the OPSEC Program at the DOE Nevada Operations Office, in order to simplify the understanding of OPSEC; it is titled the "Laws of OPSEC":

  • 1. If you don't know the threat, how do you know what to protect? Although specific threats may vary from site to site or program to program, employees must be aware of the actual and postulated threats. In any given situation, there is likely to be more than one adversary, although each may be interested in different information.
  • 2. If you don't know what to protect, how do you know you are protecting it? The "what" is the critical and sensitive, or target, information that adversaries require to meet their objectives.
  • 3. If you are not protecting it (the critical and sensitive information), the adversary wins! OPSEC vulnerability assessments (referred to as "OPSEC assessments" - OAs - or sometimes as "surveys") are conducted to determine whether or not critical information is vulnerable to exploitation. An OA is a critical analysis of "what we do" and "how we do it" from the perspective of an adversary. Internal procedures and information sources are also reviewed to determine whether there is an inadvertent release of sensitive information.

OPSEC techniques

OPSEC measures may include, but are not limited to, counterimagery, cover, concealment, and deception.

According to (NSDD 298), the Director, National Security Agency, is designated Executive Agent for interagency OPSEC training. In this capacity, he has responsibility to assist Executive departments and agencies, as needed, to establish OPSEC programs; develop and provide interagency OPSEC training courses; and establish and maintain an Interagency OPSEC Support Staff (IOSS), whose membership shall include, at a minimum, a representative of the Department of Defense, the Department of Energy, the Central Intelligence Agency, the Federal Bureau of Investigation, and the General Services Administration. The IOSS will:

  • Carry out interagency, national level OPSEC training for executives, program managers, and OPSEC specialists;
  • Act as a consultant to Executive departments and agencies in connection with the establishment of OPSEC programs and OPSEC surveys and analyses; and
  • Provide an OPSEC technical staff for the SIG-I.

Nothing in this directive:

  • Is intended to infringe on the authorities and responsibilities of the Director of Central Intelligence to protect intelligence sources and methods, nor those of any member of the Intelligence Community as specified in Executive Order No. 12333; or
  • Implies an authority on the part of the SIG-I Interagency Group for Countermeasures (Policy) or the NOAC to examine the facilities or operations of any Executive department or agency without the approval of the head of such Executive department or agency.

NSDD 298 preceded the creation of a Director of National Intelligence (DNI), who assumed some responsibilities previously held by the Director of Central Intelligence (DCI). The Director of Central Intelligence was no longer the head of the intelligence community, and was retitled the Director of the Central Intelligence Agency (DCIA). The National Counterintelligence Executive (NCIX), reporting directly to the DNI, also was created after NSDD 298. It is not clear, therefore, if the italicized responsibilities (above) of the DCI may have moved to the DNI or NCIX.

Communications security

Communications security forms an essential part of counterintelligence, preventing an adversary from intercepting sensitive information that is transmitted, especially through free space, but also through wired networks capable of being wiretapped. It includes several disciplines, both those protecting against the hostile acquisition of information from the patterns of message flow and those protecting the content of messages (e.g., encryption). It also includes a number of disciplines that protect against the inadvertent emanation of unprotected information from communications systems.

Physical security

This article discusses physical security in the context of intelligence cycle security; see Physical security for a more general view of the topic. Protection of both sensitive information in human-readable form, as well as of cryptographic equipment and keys, is the complement of communications security. The strongest cryptography in the world cannot protect information that, when not being sent through strong cryptographic channels, is left in an area where it can be copied or stolen.

It is useful to look at some of the extremes of physical security, to have a standard of reference. A U.S. Sensitive Compartmented Information Facility (SCIF) is a room or set of rooms, built to stringent construction standards,[15] to contain the most sensitive materials and to allow discussions of any security level to take place. A SCIF is not a bunker, and might slow down, but certainly not stop, a determined entry attempt that used explosives.

There can be individual variations in construction based on specific conditions; the standards for one inside a secure building on a military base in the continental US are not quite as stringent as those for one in an office building in a hostile country. In the past, it was assumed that the SCIF itself, and a good deal of the electronics in it, needed specialized and expensive shielding and other methods to protect against eavesdropping; most of these fall under the unclassified code word TEMPEST. TEMPEST requirements have generally been waived for facilities in the continental US. At the other extreme, the secure conference and work area in the US Embassy in Moscow, where the new building was full of electronic bugs, had extra security measures making it a "bubble" inside as secure a room as was possible. In each case, however, the plans must be preapproved by a qualified security expert, and the SCIF inspected before use and periodically reinspected.

Facilities for less sensitive material do not need the physical protection provided by a SCIF. While the most recent policies should be consulted, TOP SECRET material needs both an alarm (with a response force) and a high-security filing cabinet with a combination lock; SECRET may relax the alarm requirement but still requires the same high-security filing cabinet; CONFIDENTIAL may accept a regular filing cabinet that has had a bracket welded to each drawer, with a strong steel bar run through the brackets and secured with a combination padlock.
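
As a simple illustration, the storage requirements just described can be treated as a policy lookup by classification level. The sketch below merely mirrors the text above; it is illustrative only, and current policy documents govern in practice.

# A minimal sketch: map classification level to the storage described in the text.
STORAGE_REQUIREMENTS = {
    "TOP SECRET": "High-security filing cabinet with combination lock, plus alarm with response force",
    "SECRET": "High-security filing cabinet with combination lock (alarm requirement may be relaxed)",
    "CONFIDENTIAL": "Standard filing cabinet with welded brackets, steel bar, and combination padlock",
}

def required_storage(level: str) -> str:
    return STORAGE_REQUIREMENTS.get(level.upper(), "Consult current policy for this level")

print(required_storage("Secret"))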

Other security measures might include closing drapes or blinds over the window of a room where classified information is handled and might otherwise be photographed by an adversary at a height comparable to that of the window.

No matter how physically secure a SCIF may be, it is no more secure than the people with access to it. One severe security breach was by a clerk, Christopher John Boyce, who worked inside the SCIF that held communications equipment and stored documents for a TRW facility in Redondo Beach, California. At this facility, among other sensitive programs, TRW built U.S. reconnaissance satellites for the National Reconnaissance Office and Central Intelligence Agency.[16] Boyce stole documents and gave them to a drug dealer, Andrew Daulton Lee, who sold them to the Soviet KGB. Personnel security measures are as important as physical security. Counter-HUMINT measures might have detected the documents being transferred and taken to the Soviet Embassy in Mexico City.

Personnel security

Many of the greatest enemy penetrations of a country's secrets came from people who were already trusted by that government. Logically, if your service desires to find out your opponent's secrets, you do not try to recruit a person that would have no access to those secrets – unless the person you recruit gives you access to such a person.

The general process of determining whether a person can be trusted is security clearance; the British term, used in many Commonwealth countries, is positive vetting. In the most sensitive positions, clearances are periodically renewed. Indications of possible compromise, such as spending patterns that are inconsistent with one's known income, are supposed to be reported to personnel security officers.

Security clearance is a starting point, but, while respecting individual rights, a certain amount of monitoring is appropriate. Some of the worst US betrayals, of recent years, have been money-driven, as with Aldrich Ames and Robert Hanssen. Electronic reviews of financial records, raising flags if there are irregularities that need to be checked, may be among the least obtrusive means of checking for unexplained income. There are cases where such checks simply revealed an inheritance, and no further questions were raised.

Detecting other vulnerabilities may come from things that are simply good management. If someone is constantly clashing with colleagues, it is good people management, not uniquely security management, for a manager to talk with the individual, or have them talk to an Employee Assistance Program. It is not considered a disqualification to seek mental health assistance; it may be considered excellent judgement. For people with especially high security access, it may be requested that they see a therapist from a list of professionals with security clearance, so there is not a problem if something sensitive slips or they need to discuss a conflict.


References

  1. Dulles, Allen W. (1977). The Craft of Intelligence. Greenwood. ISBN 0-8371-9452-0. Dulles-1977.
  2. Wisner, Frank G. (22 September 1993). "On The Craft of Intelligence". CIA-Wisner-1993. Archived from the original on 15 November 2007. Retrieved 2007-11-03.
  3. US Department of the Army (15 January 1991). "Field Manual 34-37: Echelons above Corps (EAC) Intelligence and Electronic Warfare (IEW) Operations". Retrieved 2007-11-05.
  4. US Department of the Army (3 October 1995). "Field Manual 34-60: Counterintelligence". Retrieved 2007-11-04.
  5. Interagency OPSEC Support Staff (IOSS) (May 1996). "Operations Security Intelligence Threat Handbook: Section 2, Intelligence Collection Activities and Disciplines". IOSS Section 2. Retrieved 2007-10-03.
  6. US Army (May 2004). "Chapter 9: Measurement and Signals Intelligence". Field Manual 2-0, Intelligence. Department of the Army. FM2-0Ch9. Retrieved 2007-10-03.
  7. Matschulat, Austin B. (2 July 1996). "Coordination and Cooperation in Counterintelligence". CIA-Matschulat-1996. Archived from the original on 10 October 2007. Retrieved 2007-11-03.
  8. Van Cleave, Michelle K. (April 2007). "Counterintelligence and National Strategy" (PDF). School for National Security Executive Education, National Defense University (NDU). USNDU-Van Cleave-2007. Archived from the original (PDF) on 2007-11-28. Retrieved 2007-11-05.
  9. Joint Chiefs of Staff (22 June 2007). "Joint Publication 2-0: Intelligence" (PDF). US JP 2-0. Retrieved 2007-11-05.
  10. "History of OPSEC" (PDF). Retrieved 2005-11-05.
  11. US Department of Defense (12 July 2007). "Joint Publication 1-02 Department of Defense Dictionary of Military and Associated Terms" (PDF). Archived from the original (PDF) on 23 November 2008. Retrieved 2007-10-01.
  12. US Intelligence Board (2007). "Planning and Direction". USIB 2007. Archived from the original on 2007-09-22. Retrieved 2007-10-22.CS1 maint: ref=harv (link)
  13. "National Security Decision Directive 298: National Operations Security Program". July 15, 2003. NSDD 298. Retrieved 2007-11-05.
  14. Department of Energy. "An Operational Security (OPSEC) Primer". Archived from the original on 2007-10-28. Retrieved 2007-11-05.
  15. Director of Central Intelligence Directive (18 November 2002). "Physical Security Standards for Sensitive Compartmented Information Facilities". DCID 6/9. Retrieved 2005-11-05.
  16. Lindsey, Robert (1979). The Falcon and the Snowman: A True Story of Friendship and Espionage. Lyons Press. ISBN 1-58574-502-2.