
Pavel I. Pilyugin, Alexey A. Salnikov
Institute of Information Security Issues,
Lomonosov Moscow State University

Today the concept of “trust” is used more and more often when discussing relations in cyberspace (the Internet, social networks, financial transactions, etc.), gradually replacing such terms as “reliability”, “protectability”, and “security” (and their measurable parameters). Furthermore, a number of experts believe that the concept of “trust” can completely replace the conventional notion of security and its characteristics.

This is primarily due to economic issues: the sale of hardware, software, services, data, and other goods and services in cyberspace, where buyers often have no access to objective characteristics of the goods and services they purchase.

Overconfidence in cyberspace may prove to be unsafe. In particular, a study conducted by researchers from the University of Pennsylvania and Duke University along with Intel Labs® shows that two-thirds of Android® applications track dialed numbers, gather information on the geographic location of the device (a cellphone, tablet computer, etc.), and carry out other “suspicious” operations with the user’s personal data.

It will therefore be worthwhile to give a more precise definition to the notion of “trust” and identify its role in terms of emergence, development and elimination of cyber-conflicts.

Cyber-Conflicts as Determined by the Unified Theory of Conflict

The Unified Theory of Conflict (UTC) [3] views a conflict as a dialectical contradiction which sets the system in motion. The results of a conflict are either synergy or antagonism.

In the case of synergy, all the interacting systems are linked to each other in a directly proportional relationship. Therefore, synergy represents a symmetrical reinforcement (or weakening) of the interacting systems. In other words, synergy is a type of coexistence (collaboration, merging into one system) of systems or elements of a system in which their parameters are reinforced or weakened simultaneously.

Antagonism, often understood as a conflict situation, represents a type of coexistence of elements of a system in which the changes in the properties of the system elements go in different directions. It is reasonable to say that antagonism joins elements into one system while separating them into two poles (coalitions) in such a way that the elements in each coalition are linked to one another through a direct relation while maintaining an inverse relation to the elements in the opposite coalition.

Interactions (including conflicts) in cyberspace have their own peculiar properties. Very often, so-called cyber-conflicts are in fact nothing less than cyber-wars. For instance, one can recall the recent cyber-attack (or cyber-conflict) that took place between activists of two groups: Spamhaus (London/Geneva) and Cyberbunker (the Netherlands). In essence, this conflict is an example of the antagonism between the economic interests of spammers and anti-spammers. The conflict could have been avoided altogether if spamming activity had faced more robust legal barriers or, alternatively, if barriers to the dissemination of any information were removed.

Interacting parties in cyberspace (including parties in conflict) are represented by users (possibly joined through commercial, public, or government organizations), providers (of services, communication, content, etc.), as well as software and hardware and thus, their respective manufacturers and developers.

In the proposed model all participants, the causes and the subject matter of a conflict are interlinked with positive (cooperation, accord) or negative relations. Analysis of the model helps to identify a conflict situation and resolve it by replacing negative relations with positive ones and vice versa.

The Role of Trust in Cyber Conflicts

Considering that synergy means cooperation and antagonism means confrontation, the prerequisites of a conflict situation are trust, which is essential for cooperation, and mistrust, which appears when relations begin to show antagonism.

Under the UTC, both synergy and antagonism may be acceptable as results of a conflict situation. An obvious solution would be to build up trust to promote cooperation, yet, although less obvious, excessive trust might also be reduced to diminish negative effects of antagonism.

As an example of the first approach we can consider software and hardware with a good record of practical application and open marketing policies offered by their developers.

An example of the second approach is excessive (unreasonable) trust exercised by users of social networks and other online services which causes leaks of personal data and financial losses.

Both cases require special methods and means of measuring the level of trust.

Trust Models (Metrics)

Over recent years, various mechanisms for measuring trust (reputation) have been in use, which is especially important for virtual communities and systems of electronic commerce [5]. The problem of trust and reputation in online systems is the topic of many studies [6] conducted in economics (building up reputation, social learning), computer science (computational models for trust and reputation, scalability, distribution and security of computing operations), sociology and psychology (rationality, the importance of emotional and cognitive factors), management science (the role of reputation/trust in marketing, brand building, etc.) and political science (the effect of reputation on public opinion).

Simple summation or the average value

The reputation is represented by the sum of the numbers of positive and negative comments; eBay is a well-known example of this approach in use. This method is primitive, and the resulting reputation values are a rather rough approximation. Its advantages, however, are transparency and clarity for the end user. Epinions and Amazon apply more complex counting algorithms with weighted values (the weights are determined based on the rater’s reputation, the time of evaluation, distance, etc.).
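The two approaches can be sketched as follows. This is an illustration only: the particular weighting scheme (rater reputation times an exponential time decay) is our assumption, not eBay’s or Amazon’s actual algorithm.

```python
# Illustrative sketch: a plain feedback sum versus a weighted average in
# which each rating is discounted by the rater's own reputation and by
# the age of the rating. The weighting scheme is an assumption.
from dataclasses import dataclass

@dataclass
class Rating:
    value: int        # +1 positive, -1 negative
    rater_rep: float  # reputation of the rater, assumed to lie in [0, 1]
    age_days: float   # how old the rating is

def simple_sum(ratings):
    """eBay-style score: positives minus negatives."""
    return sum(r.value for r in ratings)

def weighted_average(ratings, half_life=180.0):
    """Each rating weighted by rater reputation and exponential time decay."""
    weights = [r.rater_rep * 0.5 ** (r.age_days / half_life) for r in ratings]
    total = sum(weights)
    if total == 0:
        return 0.0
    return sum(w * r.value for w, r in zip(weights, ratings)) / total

ratings = [Rating(+1, 0.9, 10), Rating(+1, 0.2, 400), Rating(-1, 0.8, 5)]
print(simple_sum(ratings))                    # 1
print(round(weighted_average(ratings), 3))    # a value in [-1, 1]
```

Note how the weighted variant discounts the old rating from a low-reputation rater, so the aggregate ends up much closer to neutral than the raw sum suggests.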

Aggregators of trust values

Trust and reputation as applied to e-commerce and social networks may be evaluated through a number of mathematical aggregators:

  • Abdul-Rahman and Hailes – the graph theory [7];
  • Advogato Trust Metric – network flows [8];
  • Bayesian approach (based on the beta distribution function [9] and the subjective logic model [10]);
  • the lattice model [12].

In the above list, the Abdul-Rahman and Hailes model and the lattice model focus not on the quantitative, but qualitative indicators of trust (which is more natural and convenient for the estimate).

Analytical relationship model

S. Marsh has proposed a more general formalization of trust in his research [13]. He suggests a set of variables arranged so as to arrive at a single trust value within the range [-1, 1]. Since this is the model used hereinafter, we provide the notation and relationships for the basic, general, and situational trust values depending on the values of knowledge (expertise), importance, and usefulness in a particular situation:

[Figure: notation for the basic, general, and situational trust values]

The above parameters are linked through the following relationship:

Tx(y,α) = Ux(α) ∙ Ix(α) ∙ Tx(y)*

or through a more complex relationship:

Tx(y,α) = (Ux(α) + Tx(y)*) ∙ Ix(α) ∙ Tx(y)*x ,

where

Tx(y)* represents the value of trust x has for y (the mean trust value over all possible situations, in other words, reputation);

Tx(y)*x represents the subjective value of the trust of x in y (the mean trust value taking into account the knowledge x has of y).
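The simpler relationship can be sketched in a few lines of Python; the clamping of the result to [-1, 1] is our assumption, not part of Marsh’s formulation:

```python
# A minimal sketch of Marsh-style situational trust (notation as in [13]);
# the parameter names and the clamping to [-1, 1] are our assumptions.
def situational_trust(utility, importance, general_trust):
    """Tx(y, a) = Ux(a) * Ix(a) * Tx(y)*, with all inputs in [-1, 1]."""
    t = utility * importance * general_trust
    return max(-1.0, min(1.0, t))

# x values the situation highly (U = 0.8, I = 0.9) and has moderate
# general trust in y (T* = 0.5):
print(situational_trust(0.8, 0.9, 0.5))  # 0.36
```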

The model by S. Marsh assumes that the above values may change over time. It allows for a certain “depth” of memory, which, however, leads to more complex relationships that are not considered in this document.

The trust values are assessed to assist an agent in deciding whether to interact with another agent, by comparing them against a certain threshold value. Cooperation between x and y in a given situation α is possible when the situational trust is above a certain threshold value:

Tx(y,α) > Cooperation_Thresholdx(α) ⇒ cooperation is possible,

where the cooperation threshold Cooperation_Thresholdx(α) is calculated, for instance, from the presumed risk, the presumed competence, and the importance of the situation:

Cooperation_Thresholdx(α) = (PRx(α) / PCx(y,α)) ∙ Ix(α)

or, more precisely:

Cooperation_Thresholdx(α) = (PRx(α) / (PCx(y,α) + Tx(y)*)) ∙ Ix(α)
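The decision rule can be sketched as follows; the explicit guard against a zero denominator is our addition (the analytical formula is simply undefined at that point, as discussed below):

```python
# Sketch of the cooperation decision in Marsh's model [13]; the guard for
# a zero denominator is our addition, since the analytical formula is
# undefined when PC + T* = 0.
def cooperation_threshold(presumed_risk, presumed_competence,
                          general_trust, importance):
    """Cooperation_Thresholdx(a) = (PRx(a) / (PCx(y,a) + Tx(y)*)) * Ix(a)."""
    denom = presumed_competence + general_trust
    if denom == 0:
        raise ZeroDivisionError("threshold undefined when PC + T* = 0")
    return (presumed_risk / denom) * importance

def will_cooperate(situational_trust, threshold):
    return situational_trust > threshold

# High competence and some general trust keep the threshold low:
th = cooperation_threshold(0.5, 0.8, 0.2, 1.0)  # 0.5
print(will_cooperate(0.6, th))                  # True
```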

Measurement Model for Qualitative Scales

Using the above analytical model, we will demonstrate how to build a model that accounts for changes of the primary parameters in qualitative scales. In that case the resulting trust value will no longer lie in the range [-1, 1], but will be expressed in terms of a qualitative scale (e.g., “very poor”, “poor”, “satisfactory”, “good”, “excellent”). We shall hereinafter use a rank scale with n grades.
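One possible way to move from the continuous range to such a scale is equal-width binning; the mapping below is our illustration, not a construction from the source model:

```python
# One possible mapping (an assumption, for illustration) from a continuous
# trust value in [-1, 1] to a rank scale with n grades, here n = 5.
import math

GRADES = ["very poor", "poor", "satisfactory", "good", "excellent"]

def to_rank(trust, n=5):
    """Split [-1, 1] into n equal intervals and return the rank 1..n."""
    rank = math.floor((trust + 1.0) / 2.0 * n) + 1
    return min(rank, n)  # trust == 1.0 falls into the top rank

print(to_rank(-1.0), GRADES[to_rank(-1.0) - 1])  # 1 very poor
print(to_rank(0.0),  GRADES[to_rank(0.0) - 1])   # 3 satisfactory
print(to_rank(1.0),  GRADES[to_rank(1.0) - 1])   # 5 excellent
```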

Assuming that all of the above values are expressed through the same qualitative scale (the trust and mistrust values are to be considered separately), the above analytical relationships may be represented as shown below:

[Figure: the trust relationship rewritten in rank-scale terms]

Or, accordingly:

[Figure: the more complex trust relationship in rank-scale terms]

for the cooperation threshold:

[Figure: the rank-scale expression for the cooperation threshold]

Such hierarchical structures were previously processed (e.g., [14]) using various aggregation and scale-transformation techniques: lexicographical ordering (an ordering relation on the Cartesian product of sets), the rank-sum method, etc. Since our hierarchical structures are built on analytical expressions, it is advisable to use scale convolutions that correspond to the arithmetic operations while keeping the same scale graduation.

In the general case, the problem of building such convolutions amounts to linearly ordering the Cartesian product of linearly ordered sets of the same dimension n and classifying the product elements into n equivalence classes. Such a Cartesian product induces a partial order relation: (i,j) ≥ (k,l) ⇔ i ≥ k and j ≥ l, where i, j, k, l are the indices of the linearly ordered sets and their Cartesian product.

Convolutions that satisfy the induced partial order relation can be obtained by ordering the rank sums (additive) and the rank products (multiplicative). Examples of the multiplicative (×) and additive (+) convolutions for rank scales with a graduation of 5 are shown below:

[Table: multiplicative (×) and additive (+) convolutions for a 5-grade rank scale]
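Such convolutions can also be built programmatically. The sketch below uses one possible binning (rounded-up arithmetic mean for +, rounded-up geometric mean for ×), which need not coincide with the tables in this paper, and verifies that the induced partial order is respected:

```python
# A sketch of monotone convolutions for a rank scale with n = 5 grades.
# The specific binning (rounded-up mean for "+", rounded-up geometric
# mean for "x") is one possible choice that orders pairs by rank sum /
# rank product and maps them back into the classes 1..N.
import math

N = 5

def conv_add(i, j):
    """Additive convolution: class of the rank sum, mapped back to 1..N."""
    return math.ceil((i + j) / 2)

def conv_mul(i, j):
    """Multiplicative convolution: rounded-up geometric mean of the ranks."""
    return math.ceil(math.sqrt(i * j))

# Both convolutions respect the induced partial order
# (both are symmetric, so checking one argument suffices):
for f in (conv_add, conv_mul):
    for i in range(1, N + 1):
        for j in range(1, N + 1):
            assert 1 <= f(i, j) <= N
            if i < N:
                assert f(i + 1, j) >= f(i, j)

for i in range(1, N + 1):
    print([conv_mul(i, j) for j in range(1, N + 1)])
```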

For the operations of subtraction and division the respective convolutions are less obvious. In this case the ordering relation for one of the linearly ordered sets is reversed, which changes the partial order relation on the Cartesian product accordingly: (i,j) ≥ (k,l) ⇔ i ≤ k and j ≥ l, where i, j, k, l are the indices of the linearly ordered sets and their Cartesian product.

Examples of convolutions of rank scales with a graduation of 5 for the operations of division (/) and subtraction (−), based respectively on the ordering of the quotient and the difference of the ranks and satisfying the new order relation for the Cartesian product of sets, are given below:


[Table: division (/) and subtraction (−) convolutions for a 5-grade rank scale]
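A division-type convolution can be sketched in the same way; the class boundaries below are our own choice, not the ones used in the tables of this paper. Note that no division by zero can occur, since ranks start at 1:

```python
# A sketch of a division-type convolution on a 5-grade rank scale: pairs
# are ordered by the rank quotient i / j (increasing in i, decreasing in
# j) and binned back into ranks 1..5. The binning thresholds are assumed.
N = 5

def conv_div(i, j):
    """Rank of the quotient i/j; no division by zero since ranks start at 1."""
    q = i / j                         # ranges over [1/5, 5]
    thresholds = [0.5, 1.0, 2.0, 4.0] # class boundaries (assumed)
    return 1 + sum(q > t for t in thresholds)

# Increasing in the numerator rank, decreasing in the denominator rank,
# as the reversed order relation requires:
for i in range(1, N):
    for j in range(1, N + 1):
        assert conv_div(i + 1, j) >= conv_div(i, j)
        assert conv_div(j, i + 1) <= conv_div(j, i)

print([conv_div(i, 1) for i in range(1, N + 1)])  # [2, 3, 4, 4, 5]
```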

Although the proposed methods of scale convolution can be applied to scales with an arbitrary number of gradations, the graduation used above is convenient for practical calculations, as demonstrated below.

Let us now compare the results of the analysis of the modified formula for Cooperation_Thresholdx(α) given by S. Marsh in his model [13] with the data obtained after the transition to the qualitative scales, for Ix(α) = 1.

Values of Cooperation_Thresholdx(α) (Model)

PRx(α) \ PCx(y,α)+Tx(y)*     -1      -0.5     0      +0.5     +1      +2
 0                            0       0       —       0        0       0
+0.5                         -0.5    -1       —      +1       +0.5    +0.25
+1                           -1      -2       —      +2       +1      +0.5

Values of Cooperation_Thresholdx(α) (Rank scales)

PRx(α) \ PCx(y,α)+Tx(y)*      1       2       3       3        5       5
 1                            1       1       1       1        1       1
 3                            5       4       3       3        2       2
 5                            5       4       4       4        3       3

(“—” marks the cells where the analytical formula is undefined, the denominator PCx(y,α) + Tx(y)* being zero)

As regards the comparative analysis of the tables, the commentary provided in the study [13] indicates the following:

  • in the absence of risk the cooperation threshold must be minimal (true for both tables; the rank-scale table involves no division by zero, so no math error arises);
  • when PCx(y,α) + Tx(y)* = 0 the value given by the analytical formula is not defined, while the corresponding rank-scale value is;
  • low values of competence combined with high values of risk must result in a high cooperation threshold, which is observed only in the rank scales and is altogether absent in the analytical model (the author considers this one of its main drawbacks);
  • when the competence value grows while the risk value remains constant, the cooperation threshold should decrease, which is observed in both tables.

The study [13] offers a total of 12 comments, mostly related to inconsistencies in calculating the analytical relationships. As illustrated above, the inconsistencies of the analytical model noted in the commentary disappear after the transition to qualitative scales, while all of the behavioral characteristics relating to high and low cooperation thresholds are preserved.

Conclusion

We considered the problem of measuring the level of trust when representing analytical models as hierarchical structures and applying qualitative measurement scales. This is just one, albeit very important, characteristic of the conflict resolution process. It is worth noting that transparency and simplicity of the measured parameters are of particular importance when analyzing conflict situations: the measurements must be clear to all parties to the conflict, which, of course, makes its resolution an easier task.

Footnotes

[1] Babanin A. Cloud Security: Myths and Realities // Information Security (Информационная безопасность). #1, 2013.

[2] Goodin D. 2 out of 3 Android apps use private data ‘suspiciously’ // The Register. 2010. http://www.theregister.co.uk/2010/09/30/suspicious_android_apps/

[3] Svetlov V.A. Analysis of a Conflict. St. Petersburg: Rostok, 2001. (in Russian)

[4] Coser L. The Functions of Social Conflict. 3rd ed. London, 1968. P. 8.

[5] Gubanov D.A. Overview of Online Reputation/Trust Systems. Moscow: Institute of Control Sciences, Russian Academy of Sciences. (in Russian)

[6] Dellarocas C., Resnick P. Online Reputation Mechanisms: A Roadmap for Future Research. 2003. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.15.514

[7] Abdul-Rahman A., Hailes S. Supporting trust in virtual communities // In: Proc. of Hawaii International Conference on System Sciences. 2000.

[8] Advogato Trust Metric http://www.advogato.org/trust-metric.html

[9] Mui L., Mohtashemi M., Halberstadt A. A computational model of trust and reputation //System Sciences. 2002. P. 2431-2439.

[10] Josang A., Ismail R., Boyd. C. A Survey of Trust and Reputation Systems for Online Service Provision // Decision Support Systems. 2007. Vol. 43. P.618-644.

[11] Kamvar S.D., Schlosser M.T., Garcia-Molina H. The EigenTrust Algorithm for Reputation Management in P2P Networks // Proceedings of the 12th International Conference on World Wide Web. 2003. P. 640-651.

[12] Wagealla W., Carbone M., English C., Terzis S., Nixon P. A Formal Model of Trust Lifecycle Management // Workshop on Formal Aspects of Security and Trust (FAST2003), part of the 12th Formal Methods Europe Symposium (FM2003), 8-12 September 2003, Pisa, Italy.

[13] Marsh S. Formalizing Trust as a Computational Concept. 1994. Ph.D. dissertation, University of Stirling.

[14] Rameev O.A. Methods for Expert Evaluations. A Course of Lectures. Moscow 2004. (in Russian)

 

This article is based on a presentation delivered at the 7th Scientific conference of the International Research Consortium on Information Security, as part of the International Forum on «Partnership of state authorities, civil society and business community in ensuring international information security», held on 22-25 April 2013 in Garmisch-Partenkirchen, Germany. It is published on Digital.Report with an explicit permission from the conference organizers.
