Considerations, Adaptation, and Sharing

Cyber Denial, Deception and Counter Deception

Part of the book series: Advances in Information Security (ADIS)


Abstract

Adaptability and agility are essential in planning, preparing, and executing deception operations. Deception planners must be prepared to respond so that they can still achieve their goals even when everything seems to be going wrong. This chapter brings together considerations for the cyber-D&D planner, covering the realities of applying cyber-D&D. Cyber-D&D poses risks and has the potential for unintended consequences: operations can be compromised, and even the best-laid plans can fail. Although the defender can gain advantages by using D&D in each phase of the kill chain, cyber-D&D TTPs always involve challenges and potential drawbacks. We review some of these to inform and encourage cyber-D&D cadres to explore them early in the planning phases of cyber-D&D operations.


Notes

  1.

    See Appendix C for a list of the deception maxims and their application to cyber-D&D.

  2.

    Blow-offs and cool-outs are typically associated with confidence games (see Goffman, E. (1952), “On cooling the mark out: Some aspects of adaptation to failure.” Psychiatry, 15 (4): 451–463.), but can be adapted for espionage. A blow-off is the means taken by the entity executing the covert operation to recuse itself from any or all activities attributed to the blow-back. A cool-out is the means taken to assuage a complainant, particularly one that is a blow-off target, in order to minimize the extent of the blow-back.

  3.

    See Sect. 6.4 for a more detailed exploration of this topic.

  4.

    Chapter 7 describes this concept in detail.

  5.

    Chapter 5 of this book presents a case study of a cyber-wargame Red/Blue team exercise in which the EEFI was fully compromised, putting Blue's military operation at risk of failure. With an additional iteration through the deception chain, however, the cyber-D&D operation enabled Blue to plan and successfully execute its military operation.

  6.

    Briefly, the lessons learned are:
      • Know your commander's (or organization's) intentions and battle plans.
      • Know the desired enemy reaction.
      • Know the available deception assets (i.e., human and material) and their capabilities and limitations.
      • Communicate your deception capabilities up (e.g., to your commander), across (e.g., to parallel staffs), and down (e.g., to subordinate staffs).
      • Coordinate the deception plan with other planning and operational units whose cooperation is needed.
      • Notify in advance the HQs of all friendly units that could be adversely affected if they fell for the deception.
      • Exploit the target's military doctrine, capabilities, intelligence standard operating procedures, and preconceptions in the deception plan.
      • Keep the deception plan simple, so that one detected discrepancy does not unravel the whole deception operation.
      • Make plans within plans, with separate deception sub-plans for lower-level units.
      • Ensure the deception plan becomes operational in time to serve its purpose, though that timeline can be unexpectedly shortened in order to achieve surprise.
      • Keep the deception operation as brief as possible to minimize the likelihood of its detection.
      • Seek accurate and timely feedback; these are the only clues to whether the deception operation is succeeding or failing.
      • Fine-tune the deception operation based on the feedback received.
      • Stay flexible by designing the deception plan to include at least one "out."

  7.

    Cave Brown, A. (2007) Bodyguard of Lies: The Extraordinary True Story Behind D-Day. Lyons Press.

  8.

    Norton, Michael I., Mochon, Daniel, and Ariely, Dan. (2012). “The IKEA effect: When labor leads to love,” Journal of Consumer Psychology, 22 (3), 453–460.

  9.

    The IKEA effect occurs when consumers place a disproportionately high value on self-made products as compared to objectively similar products they did not assemble. Empirical studies show that this effect occurs only when labor results in successful completion of tasks: when experiment participants built and then destroyed their creations, or failed to complete them, the effect dissipated.

  10.

    Cialdini, Robert B., (1993). Influence: The Psychology of Persuasion. William Morrow and Company: New York.

  11.

    The principle of commitment and consistency deals with our desire to be, and to appear, consistent with our words, beliefs, attitudes, and deeds. Being consistent with earlier decisions reduces the need to process all the relevant information in future similar situations; instead, one simply needs to recall the earlier decision and respond consistently with it, even if the decision was erroneous. People will add new reasons and justifications to support the wisdom of commitments they have already made. As a result, some commitments remain in effect long after the conditions that initiated them have changed. It should be noted that not all commitments are equally effective: those that are most effective are active, public, effortful, and viewed as internally motivated (i.e., uncoerced).

  12.

    Such measure–countermeasure duels are basic elements of electronic warfare; see, e.g., Sergei A. Vakin, Lev N. Shustov, and Robert H. Dunwell. Fundamentals of Electronic Warfare. Norwood, MA: Artech House, 2001.

  13.

    George A. Crawford (2009) Manhunting: Counter-Network Organization for Irregular Warfare. JSOU Report 09-7. Hurlburt Field, FL: JSOU Press.

  14.

    Steven Marks, Thomas Meer, and Matthew Nilson (2005) Manhunting: A Methodology for Finding Persons of National Interest. Monterey, CA: Naval Postgraduate School, p. 19.

  15.

    Stokes, Mark A. and L.C. Russell Hsiao (2012) Countering Chinese Cyber Operations: Opportunities and Challenges for U.S. Interests. Project 2049 Institute, October 29, 2012, p. 3.

  16.

    One community-driven solution to this problem is the Structured Threat Information eXpression (STIX™) language, which extends indicator sharing to also include other full-spectrum cyber threat information (Barnum, 2014). STIX is a language for the specification, capture, characterization, and communication of standardized cyber threat information. Cyber threat information represented as STIX is shared through Trusted Automated eXchange of Indicator Information (TAXII™) services that allow organizations to share cyber threat information in a secure and automated manner (see http://stix.mitre.org/ and http://taxii.mitre.org/). STIX, TAXII, and the STIX and TAXII logos are trademarks of The MITRE Corporation.
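    As a sketch (not from the chapter), the kind of standardized threat object that STIX describes can be illustrated with nothing more than the Python standard library. The field names below follow the STIX 2.1 specification, but the indicator name, pattern, and values are hypothetical; a real deployment would use a dedicated library such as python-stix2 and exchange the objects over a TAXII service rather than hand-building dictionaries:

    ```python
    import json
    import uuid
    from datetime import datetime, timezone

    def make_indicator(name, pattern):
        """Build a minimal STIX 2.1-style indicator as a plain dict.

        Field names follow the STIX 2.1 specification; the values
        supplied by callers are illustrative only.
        """
        now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
        return {
            "type": "indicator",
            "spec_version": "2.1",
            "id": f"indicator--{uuid.uuid4()}",   # STIX ids are type--UUID
            "created": now,
            "modified": now,
            "name": name,
            "pattern": pattern,                   # STIX patterning expression
            "pattern_type": "stix",
            "valid_from": now,
        }

    # Hypothetical indicator for a command-and-control domain.
    indicator = make_indicator(
        name="Suspected C2 domain",
        pattern="[domain-name:value = 'evil.example.com']",
    )
    print(json.dumps(indicator, indent=2))
    ```

    Because the object is plain JSON, it can be serialized and handed to any transport; the value of STIX and TAXII lies in every sharing partner agreeing on these field names and the pattern grammar, so automated tooling on both ends can parse them without coordination.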

  17.

    We are currently exploring the development of a structured cyber-D&D data solution via the STIX language (see https://github.com/STIXProject/schemas/pull/334/files).

  18.

    See, for example, The MITRE Corporation. (2012). Cyber Information-Sharing Models: An Overview. http://www.mitre.org/sites/default/files/pdf/cyber_info_sharing.pdf. Last accessed October 25, 2015.

References

  • Barnum, S. (2014). Standardizing Cyber Threat Intelligence Information with the Structured Threat Information eXpression (STIX™). MITRE White Paper.

  • Endsley, M. R., & Robertson, M. M. (1996). Team Situation Awareness in Aircraft Maintenance. Lubbock, TX: Texas Tech University.

  • Marks, S., Meer, T., & Nilson, M. (2005). Manhunting: A Methodology for Finding Persons of National Interest. Monterey, CA: Naval Postgraduate School, p. 19.

  • The MITRE Corporation. (2012). Cyber Information-Sharing Models: An Overview. http://www.mitre.org/sites/default/files/pdf/cyber_info_sharing.pdf.

  • Rowe, N., & Rothstein, H. (2004). Two taxonomies of deception for attacks on information systems. Journal of Information Warfare, 3(2), 27–39.

  • Rowe, N. C. (2007). Deception in defense of computer systems from cyber-attack. In A. M. Colarik & L. J. Janczewski (Eds.), Cyber War and Cyber Terrorism. Hershey, PA: The Idea Group.

  • Whaley, B. (2010a). Practise to Deceive: Learning Curves of Military Deception Planners. Washington, DC: Office of the Director of National Intelligence, National Intelligence Council, Foreign Denial & Deception Committee.

  • Whaley, B. (2010b). When Deception Fails: The Theory of Outs. Washington, DC: Office of the Director of National Intelligence, National Intelligence Council, Foreign Denial & Deception Committee.

Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Heckman, K.E., Stech, F.J., Thomas, R.K., Schmoker, B., Tsow, A.W. (2015). Considerations, Adaptation, and Sharing. In: Cyber Denial, Deception and Counter Deception. Advances in Information Security. Springer, Cham. https://doi.org/10.1007/978-3-319-25133-2_6

  • DOI: https://doi.org/10.1007/978-3-319-25133-2_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-25131-8

  • Online ISBN: 978-3-319-25133-2
