Information Security Triad Model and Framework

Information security is understood as the protection of information and of the means used to process it against accidental or deliberate impacts, whether natural or man-made. Security of information is not an end in itself: it is necessary for the adequate protection of the company's information assets (the information itself and the hardware and software complex used to store and/or process it). It is also needed to reduce the risks and economic losses associated with threats to the company's information assets and resources (the company's main information systems and databases, data on issued loans, the company's methodologies, and its computing, telecommunication, and server equipment).

Information security supports the main attributes of information (Figure 1):

Figure 1. Basic Information Security Attributes (CIA Triad).

The main goals of information security are:

  1. ensuring adequate protection of information, depending on its significance for the company;
  2. ensuring confidentiality, integrity, and availability of information, protection of personal data;
  3. ensuring information security as an integral part of information systems throughout their entire life cycle (Brantly, 2019).

To achieve the goals of guaranteeing information security, the company provides an effective solution to the following tasks:

  1. inventory and classification of the company’s information assets (by the owners of information assets in all business units within the scope of the ISMS);
  2. identification and assessment of information security risks and the potential of these risks (based on the information security threat model);
  3. development of measures for information security risk management (including methodological, manual, and automated controls);
  4. defining and documenting the basic requirements and procedures for ensuring information security;
  5. implementation and configuration of information security tools;
  6. training of the company’s personnel in the field of information security;
  7. timely identification and elimination of vulnerabilities in the company's assets, thereby preventing damage and disruption of the company's business processes that could result from realized information security threats;
  8. reduction of the possible damage to the company from realized information security threats to an acceptable level, including reducing the time required to restore business processes after possible interruptions;
  9. monitoring and processing of information security events and incidents;
  10. planning and optimization of costs for ensuring the information security of the company (McDermott, 2019).

The cybersecurity framework must constantly monitor the changing priorities of all stakeholders and the capabilities of adversaries, using analysis tools to match technical information with expert judgment (Woods & Simpson, 2017). Otherwise, the likelihood of the proliferation of counterfeit electronic components increases, with dire consequences for the economy and national security. In its current philosophy, risk analysis is concerned only with describing physical failures, their consequences, and events that may occur in the future. Difficulties arise in how people perceive it and in defining the values that should influence practical decisions.

The first step is to reframe how analytical work feeds into decision making, using algorithms that combine performance metrics with the perspectives and values that matter from a stakeholder point of view. When analyzing decisions, the human factor must be kept in mind: it arises in defining the boundaries of a task, in evaluating each action or mitigation strategy against the overall goals, and in informing decision-makers about how the various strategies are performing.
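One simple way to combine performance metrics with stakeholder values, as described above, is a weighted additive score. The sketch below is illustrative only: the strategy names, weights, and metric values are hypothetical assumptions, not figures from any standard or study.

```python
# Illustrative weighted additive scoring of mitigation strategies.
# All strategy names, weights, and metric values are hypothetical.

# Stakeholder-assigned weights over the criteria (sum to 1.0).
weights = {"detection_rate": 0.5, "cost": 0.3, "downtime": 0.2}

# Normalized metrics per strategy on a 0..1 scale where 1 is best
# (cost and downtime are already inverted: higher = cheaper/faster).
strategies = {
    "patch_and_monitor": {"detection_rate": 0.8, "cost": 0.6, "downtime": 0.9},
    "full_replacement":  {"detection_rate": 0.95, "cost": 0.2, "downtime": 0.4},
}

def score(metrics, weights):
    """Weighted additive value of one strategy (higher is better)."""
    return sum(weights[k] * metrics[k] for k in weights)

for name, metrics in sorted(strategies.items()):
    print(f"{name}: {score(metrics, weights):.2f}")
```

Under these particular weights the cheaper, lower-downtime strategy outscores the one with the best detection rate, which is exactly the kind of trade-off the weights are meant to surface.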

The cyber risk management standard NIST Special Publication 800-39 is used as a baseline by US government agencies, including the Department of Energy and the Nuclear Regulatory Commission. The document details the risk assessment and management procedure and grounds national standards in a quasi-quantitative risk assessment under conditions of high uncertainty. It discusses identifying risk factors, combining them, and determining their relative importance; the factors are divided into threats, vulnerabilities, impacts, likelihoods, and predisposing conditions. However, it offers few specific methods, algorithms, or rules for combining the factors and determining their significance. NIST emphasizes that relevance and consolidation issues must be addressed with organizational risk tolerance in mind, and a risk assessment is considered incomplete unless the limits of its applicability to important organizational decisions are specified.
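A minimal sketch of the quasi-quantitative approach is to place likelihood and impact on ordinal scales and combine them through a lookup rule. The five-level scale mirrors the semi-quantitative scales NIST uses, but the specific combination rule below (a floor-average of the two ordinals) is an illustrative assumption, not a table from the publication.

```python
# Simplified semi-quantitative risk combination in the spirit of
# NIST SP 800-30 (the assessment companion to SP 800-39). The
# combination rule here is an illustrative assumption, not the
# publication's actual tables.

LEVELS = ["very_low", "low", "moderate", "high", "very_high"]

def risk_level(likelihood: str, impact: str) -> str:
    """Combine likelihood and impact ordinals into a risk level."""
    li = LEVELS.index(likelihood)
    im = LEVELS.index(impact)
    # Floor-average of the two ordinals: one common simple rule.
    return LEVELS[(li + im) // 2]

print(risk_level("high", "very_high"))  # -> high
print(risk_level("low", "moderate"))    # -> low
```

In practice an organization would replace the floor-average with a matrix tuned to its own risk tolerance, which is exactly the consolidation question NIST leaves open.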

The preliminary structure of the NIST framework maintains the status quo in cybersecurity: risk assessment is treated in detail, but there is no specific guidance on mitigation. At the core of the framework are five central cybersecurity functions: Identify, Protect, Detect, Respond, and Recover. Each function has several nested categories and subcategories that reflect specific outcomes and means of performing the function. The Respond function, for example, includes five categories: response planning, communications, analysis, mitigation, and improvements. The mitigation category of the Respond function contains subcategories covering the containment and resolution of incidents. Discrepancies between an organization's current and target profiles are viewed as weaknesses in its risk management.
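The profile comparison at the end of the paragraph amounts to a set difference over subcategory outcomes. The sketch below uses real Respond-function subcategory identifiers from the framework core, but which outcomes appear in each profile is an illustrative choice for this example.

```python
# Sketch of a framework profile comparison: outcomes present in the
# target profile but missing from the current profile are the gaps,
# i.e., risk-management weaknesses. The profile contents chosen here
# are illustrative.

target_profile = {
    "RS.RP-1 response plan executed",
    "RS.MI-1 incidents contained",
    "RS.MI-2 incidents mitigated",
}
current_profile = {
    "RS.RP-1 response plan executed",
}

# Set difference: achieved nothing beyond the response plan itself.
gaps = sorted(target_profile - current_profile)
for gap in gaps:
    print("gap:", gap)
```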

Electronic systems are increasingly targeted by fraudsters who alter the labeling and packaging or refresh the appearance of electronic components in an attempt to pass them off as genuine goods. Figure 2 shows samples of a real and a fake inductor. Beyond supplying defective components, an adversary can plant hardware Trojans or destructively alter chip circuits. All of this can lead to serious security problems, from the compromise of classified information to the disabling of the system.

Figure 2. Comparison of genuine and falsified elements: a – inductor, top view; b – inductor, bottom view; c – inductor, side view. In photographs a and b, the falsified element (left) shows a different coating of noticeably poorer quality. In photograph c, the coil of the counterfeit element (above) is wound unevenly and has a different conductor diameter.

Research initiatives to detect counterfeit electronic components focus on finding appropriate technological solutions or improving supply chains. Technological solutions include marking components with plant-derived DNA or providing authentication through physically unclonable functions (PUFs). Proposals for improved supply chain management include guidelines for supplier selection, disposal of electronic waste, and management of equipment obsolescence and decommissioning. Such efforts can reduce the risks associated with counterfeit electronic components, but the assessment procedure itself is developed individually in each case. This case-by-case approach is dangerous because, in the long term, risk management cannot succeed without an effective and carefully planned risk assessment.
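PUF-based authentication works by enrolling a device's response to a challenge and later checking that a fresh measurement matches the enrolled value within a noise tolerance. The sketch below only simulates this flow: a seeded random generator stands in for the physical device, and the bit width and tolerance are assumptions made for illustration.

```python
import random

# Toy simulation of authentication via a physically unclonable
# function (PUF). A real PUF derives its response from manufacturing
# variation unique to each chip; here a seeded random generator
# stands in for the device, purely as an illustrative assumption.

N_BITS = 64
TOLERANCE = 6  # max Hamming distance still accepted as the same device

def puf_response(device_seed: int, challenge: int, noisy_bits: int = 0) -> int:
    """Deterministic per-device response; flip a few bits to model noise."""
    resp = random.Random(device_seed * 1_000_003 + challenge).getrandbits(N_BITS)
    for i in range(noisy_bits):
        resp ^= 1 << i
    return resp

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def authenticate(enrolled: int, measured: int) -> bool:
    return hamming(enrolled, measured) <= TOLERANCE

challenge = 0x1234
enrolled = puf_response(device_seed=42, challenge=challenge)  # at enrollment
genuine = puf_response(device_seed=42, challenge=challenge, noisy_bits=3)
fake = puf_response(device_seed=99, challenge=challenge)

print(authenticate(enrolled, genuine))  # True: 3 noisy bits are tolerated
print(authenticate(enrolled, fake))     # a different device almost surely fails
```

The "unclonable" property rests on the fact that an adversary cannot manufacture a second device with the same challenge-response behavior, so a counterfeit's responses land far from the enrolled ones.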

The tests performed must ensure that incoming electronic components are authenticated even when counterfeit items are labeled using the most advanced technology. This is one reason test coverage must be balanced: insufficient testing increases the likelihood of counterfeit components entering the supply chain, while excessive testing increases lead times and costs. Testing standards should give an organization an optimal balance between the two.
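This balance can be framed as minimizing expected cost: per-unit test cost grows with test depth, while the expected loss from an escaped counterfeit shrinks. Every number below (counterfeit rate, detection curve, and costs) is a hypothetical assumption chosen only to make the trade-off concrete.

```python
# Toy expected-cost model for choosing test depth. The counterfeit
# rate, detection curve, and costs are all hypothetical assumptions.

COUNTERFEIT_RATE = 0.02      # fraction of incoming parts that are fake
ESCAPE_LOSS = 50_000.0       # loss if one counterfeit reaches the field
TEST_COST_PER_LEVEL = 40.0   # added per-unit cost of each test level

def detection_prob(level: int) -> float:
    """Diminishing returns: each extra level halves the escape rate."""
    return 1.0 - 0.5 ** level

def expected_cost(level: int) -> float:
    escaped = COUNTERFEIT_RATE * (1.0 - detection_prob(level))
    return level * TEST_COST_PER_LEVEL + escaped * ESCAPE_LOSS

best = min(range(11), key=expected_cost)
print("optimal test depth:", best)  # -> optimal test depth: 4
```

Below the optimum, escaped counterfeits dominate the cost; above it, the added testing costs more than the residual risk it removes, which is the balance a testing standard is meant to codify.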


Brantly, A. F. (2019). Conceptualizing cyber policy through complexity theory. Journal of Cyber Policy, 4(2), 275–289. Web.

McDermott, R. (2019). Some emotional considerations in cyber conflict. Journal of Cyber Policy, 4(3), 309–325. Web.

Woods, D., & Simpson, A. (2017). Policy measures and cyber insurance: A framework. Journal of Cyber Policy, 2(2), 209–226. Web.
