DecideIT Manual


User Manual Version 2.60

www.preference.bz info@preference.bz

User Manual DecideIT Decision Tool Preference AB 2006-2011

Copyright Notice
Copyright 2006 - 2011 M.A.D. Preference AB. All rights reserved. No part of this manual may be reproduced in any manner or translated into another language without written permission from Preference AB.

Trademarks
DecideIT is a trademark of Preference AB. Windows is a registered trademark of Microsoft Corporation.

Reservation
The information in this manual has been carefully reviewed and is believed to be accurate. The vendor assumes no responsibility for any inaccuracies that may be contained in this document, makes no commitment to update or to keep current the information in this manual, or to notify any person or organization of the updates. Preference AB reserves the right to make changes to the product described in this manual at any time and without notice. This product, including software, and documentation may not, in whole or in part, be copied, photocopied, reproduced, translated or reduced to any medium or machine without prior written consent. IN NO EVENT WILL PREFERENCE AB BE LIABLE FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING FROM THE USE OR INABILITY TO USE THIS PRODUCT OR DOCUMENTATION, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.


Table of Contents
SYSTEM REQUIREMENTS ... 8
INSTALLATION ... 8
CONTACT ... 8
1 THE PROBLEM ... 10
  1.1 Mapping of Product and Information Flow ... 10
  1.2 Stock Levels ... 11
  1.3 Administration ... 12
  1.4 Production ... 12
  1.5 Price of the System ... 12
  1.6 Quality Aspects ... 12
  1.7 Modelling the Problem ... 13
2 USING DECIDEIT ... 14
  2.1 Start DecideIT ... 14
  2.2 Create a Tree and Label Alternatives ... 14
  2.3 Define Consequences ... 16
  2.4 Saving and Naming the Tree ... 16
  2.5 Label Consequences ... 17
  2.6 Define and Label Sub-Consequences ... 17
  2.7 Assign Probabilities ... 19
  2.8 Define Templates ... 21
  2.9 Assign Templates ... 23
  2.10 Assign Values ... 25
  2.11 Evaluate the Decision Problem ... 27
  2.12 Total Ranking ... 30
  2.13 Cardinal Ranking ... 31
  2.14 Study Critical Probabilities and Values ... 32
  2.15 Security Thresholds ... 33
  2.16 Extreme Values ... 35
  2.17 Cumulative Risk Profile ... 36
  2.18 Define Criteria ... 36
  2.19 Assign Criteria Weights ... 38
  2.20 Assign a Tree to a Criterion ... 40
  2.21 Define New Decision Models ... 41
  2.22 Asserting Value Relations ... 41
  2.23 Evaluating Multi-Criteria Models ... 44
  2.24 A Rough Sensitivity Analysis ... 45
  2.25 Conclusions ... 47
1 PREPARATIONS ... 49
  1.1 Install DecideIT ... 49
  1.2 Start DecideIT ... 49
2 MENUS AND TOOLBARS; FILE ... 50
  2.1 Create Model ... 50
  2.2 Open an Existing Model ... 50
  2.3 Close a Model ... 50
  2.4 Save a Model ... 51
  2.5 Save a Copy of Current Tree ... 51
  2.6 Print Model ... 51
3 MENUS AND TOOLBARS; EDIT ... 52
  3.1 Undo ... 52
  3.2 Redo ... 52
  3.3 Alternative Properties ... 52
  3.4 Set Value/Weight Relations ... 53
  3.5 Set Value Scale ... 54
    3.5.1 Value Scales and Multi-Criteria Decision Problems ... 54
  3.6 Set Background Color ... 57
4 MENUS AND TOOLBARS; VIEW ... 57
  4.1 Overview ... 57
  4.2 Hide/Show All Evaluation Windows ... 57
  4.3 Update Model ... 58
5 MENUS AND TOOLBARS; TEMPLATES ... 58
6 MENUS AND TOOLBARS; EVALUATION ... 59
  6.1 Security Thresholds ... 59
  6.2 Total Ranking ... 60
  6.3 Cardinal Ranking ... 61
  6.4 Expected Value Graph ... 62
  6.5 Cumulative Risk Profile ... 63
  6.6 Risk Profile ... 64
  6.7 Critical Probabilities/Values/Weights ... 64
  6.8 Total Ranking - All Criteria ... 65
  6.9 Cardinal Ranking - All Criteria ... 66
  6.10 Expected Value Graph - All Criteria ... 66
  6.11 Extreme Values ... 66
  6.12 Preference Order ... 68
7 MENUS AND TOOLBARS; TOOLS ... 69
  7.1 Document History ... 69
  7.2 Choose Active Excel Spreadsheet ... 69
  7.3 Settings ... 69
8 MENUS AND TOOLBARS; HELP ... 70
  8.1 About ... 70
  8.2 Contents and Index ... 70
  8.3 Enter License Key ... 70
9 NODE PROPERTY FRAME ... 71
  9.1 Identify Decision Alternatives ... 71
  9.2 Identify Event and Consequence Nodes ... 73
10 EVALUATION WINDOWS ... 79
  10.1 File ... 79
  10.2 Edit ... 80
  10.3 View ... 80
  10.4 Update ... 82
11 MULTIPLE AND SEQUENTIAL DECISIONS ... 83
12 HISTORICAL BACKGROUND ... 87
  12.1 Decision Analysis ... 89
  12.2 Perspectives on Decision Theory ... 92
13 PROBABILITY THEORY ... 94
14 UTILITY THEORY ... 96
15 DECISION MODELLING ... 98
  15.1 Decisions under Certainty ... 99
  15.2 Decisions under Strict Uncertainty ... 100
    15.2.1 Laplace ... 100
    15.2.2 Wald ... 100
    15.2.3 Hurwicz ... 101
    15.2.4 Savage ... 101
  15.3 Decisions under Risk - Bayesian Decision Analysis ... 102
  15.4 Assumptions and Axioms in Utility Theory ... 103
    15.4.1 Axiom Systems ... 103
    15.4.2 Some Criticism Against the Utility Theory ... 108
    15.4.3 Risk Attitudes ... 109
    15.4.4 Security Thresholds ... 110
16 MULTIPLE AND CONFLICTING OBJECTIVES ... 111
17 ELICITATION TECHNIQUES ... 114
  17.1 Assessing Utilities ... 114
  17.2 Assessing Probabilities ... 115
  17.3 Assessing Weights ... 116
18 IMPRECISE DOMAINS ... 117
  18.1 Measurable and Immeasurable Uncertainties ... 117
  18.2 Imprecise Probability ... 118
  18.3 Imprecise Utility ... 119
  18.4 Second-Order Beliefs ... 120
19 GRAPH MODELS ... 122
  19.1 Decision Trees ... 122
  19.2 Influence Diagrams ... 123
    19.2.1 Relationship between Influence Diagrams and Trees ... 126
20 THE METHOD OF DECIDEIT ... 127
  20.1 Information Gathering ... 128
  20.2 Modelling ... 128
  20.3 Information and Decision Frames ... 128
  20.4 Frame Structure ... 130
  20.5 Bases ... 131
  20.6 Probability Bases ... 132
  20.7 Value Bases ... 132
  20.8 Frames ... 133
  20.9 Sanity Checks ... 133
  20.10 Security Thresholds ... 133
  20.11 Evaluations ... 134
  20.12 Cutting the Hull ... 136
  20.13 Sensitivity Analyses ... 138
  20.14 Decision Process Results ... 138
REFERENCES ... 140
INDEX ... 144


PART I: GETTING STARTED

DecideIT is a user-friendly tool for decision analysis developed by Preference AB. It has several features, such as:
- Good overview to yield a better overall picture
- Easy to document, review, and adjust the underlying data
- Hard problems are solvable within reasonable time
- Supports evaluation of imprecise probability and value estimates
- Supports comparative statements of values
- Supports evaluation of multiple-criteria decision problems
- Simple ways of detecting lack of information
- Applicable within both decision and risk analysis

This manual describes how to use DecideIT for modelling and evaluating decision situations. This version of the manual is written for DecideIT 2.5 to 2.69 and may be lacking some of the recently added features of the 2.69 version. The manual consists of three main parts:
- A tutorial that introduces the terminology and working procedures in DecideIT
- A reference manual for DecideIT
- An introduction to the area of decision analysis


System Requirements
Operating system: Windows XP, Windows Vista, or Windows 7 (with the Java Runtime Environment)
Processor: 500 MHz (may depend on operating system)
RAM: 512 MB (may depend on operating system)
Hard disk: 100 MB of free disk space

Installation
If you downloaded DecideIT from our homepage on the web, run the file DecideIT_Setup.exe by double-clicking it. If you received DecideIT on a CD-ROM, insert the installation CD-ROM into your CD/DVD reader. If the installation procedure does not start automatically, follow this procedure.
1. Run DecideIT_Setup.exe from the installation CD and follow the instructions.
2. Restart the computer if needed.
Note that you might need administrative rights on your operating system to be allowed to install this software. Contact your system administrator if you need such rights. If you are running Windows XP or 2000, this step might not be necessary. For portability reasons, the graphical user interface of DecideIT is developed in the Java Programming Language. The Java Runtime Environment is needed.

Contact
For questions and comments concerning this manual, please contact info@preference.bz


PART II: TUTORIAL

This part describes how to model and analyze a decision problem. It starts by presenting a problem, followed by step-wise instructions on how to use DecideIT to analyze the options.


1 The Problem
The decision problem concerns whether or not a new system for logistic control should be implemented in a company.

1.1 Mapping of Product and Information Flow


The customers of the company make monthly forecasts of the estimated consumption for three months at a time. These forecasts constitute the basis for the production planning. During this time interval, the customers make daily call-offs where they specify the delivery size for the following day. The order is checked against the stock levels and, if the order can be fulfilled, a confirmation is sent to the customer and a delivery order is sent to the central warehouse, where the goods are loaded on trucks for delivery to the customer. The daily process is handled in approximately eight hours, while the forecast is made once a month. An overview of the process can be seen in Figure 2-1. A, B, and C are locations for the respective activities.
Figure 2-1: Overview of the logistics. (Flows shown: 1. call-offs (daily) and forecasts (monthly); 2. order confirmation (daily); 3. delivery order (daily); 4. delivery (daily); 5. replenishment (weekly); with physical and information flows between the locations.)

With a new system for logistic control, the forecasts will still be made, but they are entered directly into the system and will, together with information regarding stock levels and order and production planning, be transferred in real time to the main factory. This means that all information about the central warehouse will be available to this factory, which then has the ability to check the stock levels, planned production, and orders already made. The order administration will then have been replaced by a demand-driven information system. With this information visible, replenishment can be carried out considerably more efficiently. The flow can be seen in Figure 2-2.


Figure 2-2: Flow after the introduction of the new system. (Flows shown: 1. demand (real time); 2. delivery order (daily); 3. replenishment (daily); 4. replenishment (weekly); with physical and information flows between locations A and B.)

1.2 Stock Levels


The stock level statistics for the six best-selling articles can be used as a basis for suggested improvements. Because the graphs representing the stock level statistics are similar, only one of them is presented in Figure 2-3.
Figure 2-3: Stock level statistics. (Stock level of article 8343 over time, January to October 2001.)

The production has to be reliable for a reduction in the stock levels to be possible. As can be seen from Figure 2-3, the stock level decreases to zero on a number of occasions. This depends to a large extent on production disturbances and planned production stops that were delayed. Assuming that the production can be stabilized, stock level reductions are possible. This depends foremost on the better forecasts that can be made when customer needs are followed in real time. Better forecasts lead to better production planning, which in turn facilitates lower stock levels. No change in the customers' stock levels will occur. The possible effects of introducing the system for logistic control are (see the consistency check sketched after this list):
- Reduction in stock levels by 40%, with a probability between 0% and 20%.
- Reduction in stock levels by 25%, with a probability between 25% and 55%.
- Reduction in stock levels by 15%, with a probability between 20% and 40%.
- No change in stock levels, with a probability between 10% and 30%.
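These intervals are deliberately imprecise, but they must still be consistent: at least one precise probability assignment within the intervals has to sum to one. The following minimal Python sketch (not part of DecideIT, only an illustration of the arithmetic) checks this for the four outcomes.

```python
# Consistency check for interval probabilities over the four stock-level outcomes:
# some precise assignment within the intervals must sum to 1, which holds exactly
# when the lower bounds sum to at most 1 and the upper bounds sum to at least 1.
intervals = {
    "40% reduction": (0.00, 0.20),
    "25% reduction": (0.25, 0.55),
    "15% reduction": (0.20, 0.40),
    "No change":     (0.10, 0.30),
}

low = sum(lo for lo, _ in intervals.values())
high = sum(hi for _, hi in intervals.values())
print(f"sum of lower bounds = {low:.2f}, sum of upper bounds = {high:.2f}")
assert low <= 1.0 <= high, "the probability intervals are inconsistent"  # 0.55 <= 1 <= 1.45
```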


The storage time is 16 days on average, which is equivalent to approximately 4200 tons. The price per ton of these articles is around 400 to 500 USD, and the stock-keeping cost is 15% to 25% of the stock value.
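From these figures, the annual stock-keeping cost that a reduction would act on can be estimated. The sketch below illustrates the arithmetic only (it assumes, as the scenario suggests, that the savings scale linearly with the reduction percentage).

```python
# Rough savings arithmetic (illustration only; assumes savings are proportional
# to the stock reduction percentage).
stock_tons = 4200
price_per_ton = (400, 500)        # USD per ton
keeping_cost_rate = (0.15, 0.25)  # share of the stock value per year

stock_value = (stock_tons * price_per_ton[0], stock_tons * price_per_ton[1])
annual_cost = (stock_value[0] * keeping_cost_rate[0],
               stock_value[1] * keeping_cost_rate[1])   # 252,000 to 525,000 USD per year

for reduction in (0.40, 0.25, 0.15):
    lo, hi = annual_cost[0] * reduction, annual_cost[1] * reduction
    print(f"{reduction:.0%} reduction saves {lo:,.0f} to {hi:,.0f} USD per year")
```

These savings intervals, combined with the administration savings and the system cost below, appear to be what the consequence value intervals used later in the tutorial are built from.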

1.3 Administration
The administration cost will probably be reduced after an introduction of the system. Because of the automated process of order receiving and order confirmation, these resources can be redirected to more qualified work. There are three possibilities:
- Reduction of two employees, with a probability between 10% and 30%.
- Reduction of one employee, with a probability between 50% and 70%.
- No change, with a probability between 10% and 30%.

One employee corresponds to a cost between 40,000 and 60,000 USD to the company.

1.4 Production
Savings in production costs can probably not be achieved, but through more efficient production planning, the production can become demand driven. This will not result in any direct cost reductions, but it will affect the stock levels.

1.5 Price of the System


The cost of the system is around 25,000 USD per node, but for a larger investment the price will probably be somewhat lower. In addition, there is an annual fee of 20% of the investment cost and also a development cost for the interface between the business system and the new system. The company will altogether require eight nodes of the system: one in each factory and six nodes at the customers' sites. This will cost approximately 150,000 to 200,000 USD initially, in addition to the annual fee of 30,000 to 40,000 USD. The development cost of the interface is difficult to estimate, but is probably between 50,000 and 90,000 USD. If the depreciation period of the system is estimated at five years, an annual cost of the system of between 70,000 and 98,000 USD is obtained.
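The 70,000 to 98,000 USD annual cost follows from spreading the initial outlay (purchase plus interface development) over the five-year depreciation period and adding the annual fee. A sketch of that arithmetic, using the figures above (straight-line depreciation assumed):

```python
# Annual system cost (illustration of the arithmetic; straight-line depreciation assumed).
initial = (150_000, 200_000)     # purchase of the eight nodes, USD
interface = (50_000, 90_000)     # interface development, USD
annual_fee = (30_000, 40_000)    # USD per year
years = 5

depreciation = ((initial[0] + interface[0]) / years,
                (initial[1] + interface[1]) / years)   # 40,000 to 58,000 USD per year
annual_cost = (depreciation[0] + annual_fee[0],
               depreciation[1] + annual_fee[1])        # 70,000 to 98,000 USD per year
print(f"annual system cost: {annual_cost[0]:,.0f} to {annual_cost[1]:,.0f} USD")
```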

1.6 Quality Aspects


In addition to the financial aspects above, there is also a quality criterion involved. One assumption is that more employed personnel increase the probability of better quality in the process. Another assumption is that such an investment will also raise the quality of the product, even if it is currently unclear in which respects. Thus, the quality criterion is divided into


two sub-criteria: process quality and product quality. The quality criterion is considered to be important, but definitely not as important as the financial aspect. The financial aspect is considered to be at least four to five times more important. The criterion Finance has a weight between 0.8 and 1, whereas the weight of the criterion Quality is between 0 and 0.2. Furthermore, Product quality is considered to have a weight between 0.7 and 1.0, whereas Process quality has a weight between 0 and 0.3.
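Since the sub-criteria weights are stated relative to their parent criterion, the global weight of a sub-criterion is the product of the weights along its path in the criteria hierarchy. A small sketch of this (an illustration only; the actual feasible weights are also constrained by having to sum to one at each level, which DecideIT handles internally):

```python
# Global weight bounds as products of local weight bounds (illustration only; the
# feasible set is further constrained by weights summing to 1 at each level).
quality = (0.0, 0.2)           # weight of Quality relative to the root
product_quality = (0.7, 1.0)   # weight of Product quality relative to Quality
process_quality = (0.0, 0.3)   # weight of Process quality relative to Quality

def product_interval(a, b):
    """Interval product for non-negative intervals [a0, a1] and [b0, b1]."""
    return (a[0] * b[0], a[1] * b[1])

print("global Product quality weight:", product_interval(quality, product_quality))  # (0.0, 0.2)
print("global Process quality weight:", product_interval(quality, process_quality))  # (0.0, 0.06)
```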

1.7 Modelling the Problem


Figure 2-4 shows a simplified view of the updated problem structure and the collected data from the financial perspective.

Figure 2-4: Structure of the decision problem

As Figure 2-4 shows, two areas with improvement potential have been identified. These are changes in stock levels in one of the factories (factory B) and changes in administration in the other (factory A). The cost of the system has been adjusted to include the actual purchase together with the annual fee as well as the development cost for the interface between the business system and the new system.


As can be seen, we discuss only two alternatives. Alternative 1 means investing in the system, and alternative 2 means not investing. Other alternatives, such as considering other suppliers, are omitted.

2 Using DecideIT
Consider the tree in Figure 2-4. You will now construct this tree in DecideIT and perform various kinds of analyses.

2.1 Start DecideIT


To construct the tree, you must first start DecideIT, if you have not already done this. 1. Double-click the DecideIT icon.

After the program has been loaded, your screen will look as in Figure 2-5.

Figure 2-5: DecideIT Start Screen

2.2 Create a Tree and Label Alternatives


You will now start constructing the decision tree. 1. Select New from the File menu. You are now asked whether you want to create a decision tree or a multi-criteria model. See Figure 2-6. You will select a decision tree.


Figure 2-6: Choosing decision model to create.

2. Click OK. A rudimentary decision tree opens. It contains two alternatives and two consequences. See Figure 2-7.

Figure 2-7: A decision tree.

As you saw from Figure 2-4, the decision problem consists of two alternatives. These have the labels Invest and Not Invest. You will now label the alternatives in the tree. 1. Left-click the upper yellow box. The dialog box Node Properties opens. In this you can label the first alternative. See Figure 2-8.

Figure 2-8: Node properties.


2. Write the text Invest in the text field Scenario and click OK. Now the text Invest appears in the upper yellow box in the tree. 3. Open the Node Properties dialog box for the other alternative and label the alternative Not Invest (short for Do Not Invest).

2.3 Define Consequences


You will now start defining the consequences. As you saw in Figure 2-4, the alternative Invest has four direct consequences, each of which has three sub-consequences. 1. Left-click the consequence node of the alternative Invest. The dialog box Add Nodes opens. In this you can define the number of direct consequences to the alternative Invest. See Figure 2-9.

Figure 2-9: Number of sub-nodes.

2. Write the number 4 in the text field and click OK. The tree now contains four consequences of the alternative Invest. See Figure 2-10.

Figure 2-10: Decision tree after adding of consequences.

2.4 Saving and Naming the Tree


You will now save the tree.


1. Select Save as from the File Menu. The Save dialog box opens. In this you can save the tree under a name. 2. Write the text Investment in the text field and click OK. The tree is now saved in the file Investment.tree.

2.5 Label Consequences


You will now label the consequences as in Figure 2-4. 1. Left-click the yellow box before the event node E1. The Node Properties dialog box opens. In this you can label the consequence in the same way as you labelled the alternatives before. 2. Write the text 40% Reduction in the text field Scenario and click OK. Now the text 40% Reduction appears in the yellow box. 3. Open the Node Properties dialog box for the other consequences and label these in accordance with Figure 2-4. The tree should now look like Figure 2-11.

Figure 2-11: Labelled consequences.

2.6 Define and Label Sub-Consequences


You will now define sub-consequences to each of the direct consequences of the alternative Invest. 1. Left-click the consequence node C1. The dialog box Add Nodes opens again. In this you can define the number of sub-consequences to the first consequence. 2. Write the number 3 in the text field and click OK.

The consequence C1 now has three sub-consequences. See Figure 2-12.



Figure 2-12: Decision tree after adding of consequences to C1, converting it to the event node E1.

3. Name the three sub-consequences No, -1 Person and -2 Persons, respectively. You will now define these new sub-consequences for each of the remaining direct consequences of alternative Invest. You can repeat the above procedure or copy the branch to the other nodes: 1. Right-click the node E1 and select Copy Branch from the pop-up menu. 2. Right-click the node C4 and select Paste Branch from the pop-up menu. A copy of the branch from node E1 now replaces the node C4. 3. Define the remaining sub-consequences in the tree. The tree should now look as in Figure 2-13.


Figure 2-13: Decision tree with twelve consequences associated with Alt. 1.

2.7 Assign Probabilities


As you saw above, the possible effects of introducing the system for logistic control are:
- Reduction in stock levels by 40%, with a probability between 0% and 20%.
- Reduction in stock levels by 25%, with a probability between 25% and 55%.
- Reduction in stock levels by 15%, with a probability between 20% and 40%.
- No change in stock levels, with a probability between 10% and 30%.

You will now assert these values for the direct consequences of the alternative Invest. 1. Left-click the yellow box 40% Reduction. The Node Properties dialog box opens. In this you can define a probability interval for this event. 2. Click the tab Probability (%). The dialog box Node Properties now looks as in Figure 2-14.


Figure 2-14: Probability tab.

There are four radio buttons in the dialog box. These are used for selecting the type of probability you want to assign to an event. You check the leftmost radio button when you want to define a precise probability. The second radio button is used when you want to assign a probability interval, as in this case. You will now assign the probability interval 0% to 20% to the event. 3. Click the second radio button from the left. You can now define a probability interval for the event. See Figure 2-15.

Figure 2-15: Probability tab, entering of interval probability.


4. Write the number 0 in the left text field and the number 20 in the right text field. Then click OK. The event now has the probability interval 0% to 20%. 5. Assign probability intervals to the remaining direct consequences of the alternative Invest. The tree should now look as in Figure 2-16.

Figure 2-16: Assigned interval probabilities.

2.8 Define Templates


Recall that there are three possibilities for the reduction of administrative costs:
- Reduction of two employees, with a probability between 10% and 30%.
- Reduction of one employee, with a probability between 50% and 70%.
- No change, with a probability between 10% and 30%.

These options are the same for all sub-consequences. Therefore, it is useful to define templates for the probability assignments.


1. Select Probability Templates from the Templates menu. The Probability Templates dialog box now opens. In this you define templates that can be used to simplify the handling of probability assignments. See Figure 2-17.

Figure 2-17: Probability templates.

You will define the three templates: None, One and Two. 2. Write the text None in the textbox Name of Probability Templates. 3. Click the second radio button. You can now assign an interval probability to this template. See Figure 2-18.

Figure 2-18: Probability templates.

The probability for No change is between 10% and 30%. 4. Write the number 10 in the left textbox and the number 30 in the right textbox. You define the template by clicking the button Add.


5. Click the button Add. The template is now defined. See Figure 2-19.

Figure 2-19: Added probability template.

6. Define the two remaining templates in the same way. The templates should now be One (50-70%) and Two (10-30%). 7. Click OK. You have now defined the three templates.

2.9 Assign Templates


You will now assign the templates to the consequences in the tree. 1. Left-click the text P:[0.0%,100.0%] over the uppermost consequence C1. The Node Properties dialog box opens. You can now use the templates you defined earlier. 2. Click the rightmost radio button and select None in the combo box. The template is selected. See Figure 2-20.


Figure 2-20: Using probability template.

3. Assign probability templates to the remaining consequences. The tree should now look as in Figure 2-21.

Figure 2-21: Assigned probability templates.


2.10 Assign Values


You will now assign values to the consequences. According to the information in Chapter 1 above, the assignments should be as follows (values in USD):
C1: 0 to 140,000
C2: 40,000 to 200,000
C3: 80,000 to 260,000
C4: -37,000 to 60,000
C5: 3,000 to 120,000
C6: 43,000 to 180,000
C7: -62,000 to 10,000
C8: -22,000 to 70,000
C9: 18,000 to 130,000
C10: -100,000 to -70,000
C11: -60,000 to -10,000
C12: -20,000 to 50,000
C13: 0
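Each interval combines the stock-keeping savings, the administration savings, and the annual system cost estimated in Chapter 1. As an illustration (not part of DecideIT, and rounded as in the list above), C1, i.e. a 40% stock reduction with no change in administration, can be reconstructed like this:

```python
# Reconstruction of the C1 value interval (illustration only; figures in USD,
# rounded in the manual's list).
savings_40pct = (100_800, 210_000)   # 40% stock reduction, from the earlier arithmetic
admin_savings = (0, 0)               # "No" change in administration
system_cost = (70_000, 98_000)       # annual cost of the system

c1 = (savings_40pct[0] + admin_savings[0] - system_cost[1],
      savings_40pct[1] + admin_savings[1] - system_cost[0])
print(f"C1: {c1[0]:,.0f} to {c1[1]:,.0f} USD")   # about 2,800 to 140,000, rounded to 0 to 140,000
```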

Before you assert these values, you must define a value scale. 1. Select Set Value Scale in the Edit menu. The Value Scale Settings dialog box opens. In this you can set the value scale. See Figure 2-22.

Figure 2-22: Value scale settings.

You will let the scale be between -100 and 260. This means that the minimal and maximal scale values will be the values of the worst and the best consequences (divided by 1,000), respectively. 2. Write the number -100 in the upper textbox and the number 260 in the lower textbox and click OK. You will now assert values for the consequences.


1. Left-click the uppermost yellow box No. The Node Properties dialog box opens. In this you can define a value interval for this consequence. 2. Click the tab Value. The Node Properties dialog box now looks as in Figure 2-23.

Figure 2-23: Entering values.

There are three radio buttons in this dialog box. As for probabilities, these are used for selecting the type of value you want to assign to the consequence. You check the leftmost radio button when you want to assign a precise value. The second radio button is used when you want to assign a value interval, as in this case. You will now assign the value interval 0 to 140,000 to the consequence. 3. Click the second radio button. You can now define a value interval for the consequence. 4. Write the number 0 in the left text field and the number 140 in the right text field. Then click OK. The consequence now has the value interval 0 to 140 (in thousands of USD). 5. Assign value intervals to the remaining consequences. The tree should now look as in Figure 2-24.


Figure 2-24: Assigned values.

2.11 Evaluate the Decision Problem


After the decision problem is structured and all the variables are estimated, an analysis can be made of the courses of action. You can analyze the problem with different decision rules (as is discussed in more detail in Part III):
- The principle of maximizing the expected utility (PMEU)
- Qualitatively, i.e., excluding alternatives that have bad consequences with too high a probability
- Extreme values, such as maximin and maximax
You can also perform various kinds of sensitivity analyses.

You will first analyze the alternatives with the PMEU rule. 1. Select Expected Value Graph from the Evaluation menu. The Evaluation Properties dialog box is shown. See Figure 2-25.


Figure 2-25: Evaluation properties.

In this dialog box, you have various options:
- Perform pair-wise comparisons; if there are more than two alternatives, pair-wise comparisons can be carried out between all of them.
- Compare alternatives against a mean value of all others; this is only applicable if you have more than two alternatives.
- Evaluate the alternatives separately; keep in mind that this means that all relations between the alternatives are dismissed during the analysis.
- Set how many evaluation steps will be shown; usually 20% is sufficient. This is further described below.
- Choose the preferred contraction mode. Before the evaluation, you can choose to contract only the probability base or only the value base, or to pre-contract one of them to the most probable point first. The default is to contract both the probability and value bases. Normally you just use the default choice.

You will now perform a pair-wise evaluation of the two alternatives. 2. Click OK in the Evaluation Properties dialog box. A window presenting the result is shown. See Figure 2-26.

Figure 2-26: Evaluation window.


The result of the analysis is shown for a depreciation period of five years. The upper graph of Figure 2-26 shows the maximal possible difference between the alternatives Invest and Not Invest, i.e., when the former is made as good as possible compared to the latter. The lower graph shows the opposite, i.e., when the alternative Not Invest is made as good as possible compared to the alternative Invest.

The result also shows the values in relation to the degree of contraction of the decision frame. The values for various degrees of contraction demonstrate the stability of the decision. Values near the boundaries of the constraint intervals are likely less reliable than the centre values, since the intervals are deliberately imprecise. If the decision problem is evaluated on a sequence of ever-smaller intervals, a good appreciation of the solution's dependency on boundary values can be obtained. This is taken into account by cutting off the dominated regions indirectly using reductions of the probability and value bases, and is denoted cutting the bases. The amount of cutting is indicated as a percentage, which can range from 0% to 100%. In the figures, the numerical difference in expected value is shown for each 20% contraction step. The intuition behind contractions is to zoom in on increasingly believable sub-intervals of the deliberately imprecise original statements, thus forming a succession of belief-denser solution sub-spaces. Before the evaluation, you also have the choice to contract only the probability or value base, or to pre-contract one of them to the most probable point first. The default is to contract both the probability and value bases.

In this case, the alternative Invest is significantly better than the alternative Not Invest. The easiest way to see this is to consider the sizes of the respective areas above and below the Contraction axis. In Figure 2-26, the area above the axis is considerably larger than the area below it. This means that Alt. 1, i.e., the alternative Invest, is better.

You can also see other aspects of the result. For instance, you can have the results presented in a numerical format. 3. Select Numerical in the View menu. A window presenting the result numerically is shown. See Figure 2-27.


Figure 2-27: Numerical view.

Considering the alternative to invest in relation to not investing: if the worst case occurs (from the perspective of the alternative to invest), the difference in expected monetary values is a loss of about USD 34,000. On the other hand, if the best case occurs, the difference in expected monetary value is a gain of about USD 125,000. The most likely difference in expected monetary value is a positive result of approximately USD 37,000, as can be seen at 100% contraction.
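The contraction steps can be thought of as shrinking every probability and value interval towards a central point; at 100% contraction only that point remains. The sketch below illustrates the idea on a single interval, contracting towards the midpoint (a conceptual illustration only, not DecideIT's internal algorithm, which contracts whole probability and value bases and can use other focal points than the midpoint).

```python
# Conceptual illustration of contracting an interval towards its midpoint
# (not DecideIT's internal algorithm).
def contract(interval, cut):
    """Shrink [lo, hi] by the fraction 'cut' (0.0-1.0) towards its midpoint."""
    lo, hi = interval
    mid = (lo + hi) / 2
    return (lo + cut * (mid - lo), hi - cut * (hi - mid))

value_c1 = (0.0, 140.0)   # the C1 value interval, in thousands of USD
for cut in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0):
    lo, hi = contract(value_c1, cut)
    print(f"{cut:.0%} contraction: [{lo:.0f}, {hi:.0f}]")
```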

2.12 Total Ranking


A more direct way to see the relationship between the alternatives is to use total ranking. 1. Select Total Ranking in the Evaluation menu. The Total Ranking Properties dialog box is shown. See Figure 2-28.

Figure 2-28: Total ranking properties.

In this dialog box, you set by how large a percentage of the expected value, at a given contraction level, the alternatives must differ in order to be considered different. The default is a 5% difference at 100% contraction. 2. Click OK. The Total Ranking dialog box is shown. See Figure 2-29.


Figure 2-29: Total ranking.

In this it can clearly be seen that the Invest alternative is ranked higher than Not invest.

2.13 Cardinal Ranking


The relationship between the alternatives can also be seen in more detail using cardinal ranking. 1. Select Cardinal Ranking in the Evaluation menu. The Cardinal Ranking Properties dialog box is shown. See Figure 2-30.

Figure 2-30: Cardinal ranking properties.

In this dialog box, you set the contraction level and the contraction mode. The default contraction level is 0% contraction, i.e., the entire intervals are taken into account. 2. Click OK. The Cardinal Ranking dialog box is shown. See Figure 2-31.


Figure 2-31: Cardinal ranking.

In this it can be seen that there is an overlap between the alternatives, but that the Invest alternative should be preferred to the alternative Not invest.

2.14 Study Critical Probabilities and Values


There are also other interesting observations that can be made when using DecideIT. For instance, it is possible to see how large an impact the various consequences have on the result. 1. Select Critical Probabilities/Values in the Evaluation menu. A window presenting the impact of the different consequences is shown. See Figure 2-32.


Figure 2-32: Critical probabilities and values.

The window presents the possible variation of the expected value when the respective probabilities and values vary within their admissible intervals. For instance, the value of consequence C6 can affect the expected value with an impact of USD -5,480 to 5,480, and the probability of the same consequence can affect the expected value with an impact of USD -3,000 to 3,000. This type of information can be vital when the alternatives under consideration are close to equal. In such cases, it is important to know which consequences affect the situation most, i.e., which are the most critical for the analysis. When more information has to be collected and when resources have to be allocated, the consequences that are critical in this respect should be focused on primarily.
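The value impact for a consequence can be read roughly as the probability of reaching the consequence (taken at the interval midpoints) times how far its value can move from the centre of its interval. The sketch below reproduces the figure of roughly plus/minus 5,480 USD for C6 under that reading (an interpretation only; DecideIT's exact computation may differ, in particular for probability impacts, where the normalisation constraints on sibling probabilities matter).

```python
# Rough reading of the value impact of C6 (25% stock reduction, minus two persons).
# Interpretation only; not necessarily DecideIT's exact computation.
p_25_reduction = (0.25, 0.55)
p_two_persons = (0.10, 0.30)
value_c6 = (43_000, 180_000)     # USD

p_mid = (sum(p_25_reduction) / 2) * (sum(p_two_persons) / 2)   # 0.4 * 0.2 = 0.08
half_range = (value_c6[1] - value_c6[0]) / 2                   # 68,500 USD

impact = p_mid * half_range
print(f"value impact of C6: about +/- {impact:,.0f} USD")      # about +/- 5,480
```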

2.15 Security Thresholds


It might be the case that even if an alternative has a high expected value, some combination of consequences makes it impossible to select it. Another useful evaluation rule is therefore that an alternative should be dismissed if it has too bad consequences with too high a probability. In DecideIT, you analyze such situations in the following way. 1. Select Security Thresholds in the Evaluation menu. The Security Thresholds dialog box is shown. See Figure 2-33.


Figure 2-33: Security thresholds.

In this dialog box, you make settings for when you consider a consequence, or a combination of consequences, too dire. In the upper text box you set the value threshold. The lower text box is used for defining the probability threshold. We assume that you do not accept that an alternative has consequences with values lower than USD -30,000 if they can occur with probabilities greater than 40%. 2. Write -30 in the left text box and 40 in the right text box. See Figure 2-34.

Figure 2-34: Defining thresholds.

3. Click OK. A window presenting the result of the analysis is shown. See Figure 2-35.

Figure 2-35: Result of analysing security thresholds.


The window shows that there are consequences of the alternative Invest that are unacceptable. This is shown in red. However, already after about 10% contraction, this is not the case any longer. It is a matter of your own risk attitude whether this is acceptable or if the alternative should be dismissed. However, since the alternative Invest becomes acceptable after just a few contractions, it seems reasonable to still consider it as the best candidate for selection.
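To see why the Invest alternative can be flagged at low contraction levels, note that several of its consequences (C4, C7, C10 and C11 in the value assignments above) can fall below -30,000 USD. A crude upper bound on their combined probability already exceeds the 40% threshold, as the sketch below shows (an illustration only; it ignores the requirement that sibling probabilities sum to one, which DecideIT's evaluation respects, so it overstates the true maximum).

```python
# Crude upper bound on the probability that Invest yields a consequence that can be
# below -30,000 USD (illustration only; ignores that sibling probabilities sum to 1).
paths = {                                 # (max P(stock event), max P(admin event))
    "C4 (25% reduction, no change)":  (0.55, 0.30),
    "C7 (15% reduction, no change)":  (0.40, 0.30),
    "C10 (no change, no change)":     (0.30, 0.30),
    "C11 (no change, -1 person)":     (0.30, 0.70),
}
bound = sum(p1 * p2 for p1, p2 in paths.values())
print(f"upper bound on the 'bad' probability: {bound:.3f}")   # 0.585 > 0.40
```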

2.16 Extreme Values


A similar phenomenon can also be seen by studying the values of the extreme consequences in the same way as in the maximin and maximax principles (see Part III). 1. Select Extreme Values in the Evaluation menu. The Extreme Values dialog box is shown. See Figure 2-36.

Figure 2-36: Extreme values.

In this dialog box, you can see how the alternatives look considering the best and worst consequences only. There is obviously a consequence (-100) that can make the Invest alternative worse than Not invest, but, on the other hand, the best consequence is much better for the Invest alternative (260). The Not invest alternative is of course constant zero. The values indicated in the dialog box are the weighted averages between the best and the worst consequences of the respective alternatives, in the same way as for the Hurwicz criterion (see Part III).
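That weighting follows the familiar Hurwicz form, where an optimism index alpha between 0 and 1 balances the best consequence against the worst. The snippet below shows the textbook formula with alpha set to 0.5 as an example (an illustration only; the index actually applied in the dialog is a setting not stated here).

```python
# Hurwicz-style weighting of the extreme consequences (illustration; alpha = 0.5
# is just an example value, not necessarily what the dialog uses).
def hurwicz(best, worst, alpha=0.5):
    return alpha * best + (1 - alpha) * worst

print(hurwicz(best=260, worst=-100))   # Invest:     80.0 (thousands of USD)
print(hurwicz(best=0, worst=0))        # Not invest:  0.0
```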


2.17 Cumulative Risk Profile


In DecideIT you can also investigate the cumulative risk profiles of the alternatives and therefore also whether an alternative statistically dominates another. See Part III. 1. Select Risk Profile in the Evaluation menu. The Cumulative Risk Profile dialog box is shown. See Figure 2-37.

Figure 2-37: Cumulative risk profile.

In this dialog box, several graphs can be seen. The (blue) graphs for the Invest alternative show the cumulative risk profiles for the minimum and maximum possible values (0% contraction) as well as the average of these two. Since the alternative Not invest has a constant value, all the (green) graphs coincide. In this case, neither of the alternatives statistically dominates the other, not even when considering the extreme (lower and upper) graphs.
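Statistical (first-degree stochastic) dominance here means that one alternative's cumulative risk profile never lies above the other's. A minimal sketch of that test on two discrete profiles (the points for Invest are hypothetical, made up for illustration; Not invest genuinely has all its probability mass at zero):

```python
# First-degree stochastic dominance check on discrete cumulative risk profiles.
# A dominates B if F_A(x) <= F_B(x) at every point of a common value grid.
def dominates(cdf_a, cdf_b):
    """Each cdf is a list of (value, cumulative probability) over the same grid."""
    return all(fa <= fb for (_, fa), (_, fb) in zip(cdf_a, cdf_b))

invest = [(-100, 0.1), (0, 0.3), (140, 0.8), (260, 1.0)]       # hypothetical profile
not_invest = [(-100, 0.0), (0, 1.0), (140, 1.0), (260, 1.0)]   # all mass at value 0

print(dominates(invest, not_invest))   # False
print(dominates(not_invest, invest))   # False: neither dominates, as in Figure 2-37
```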

2.18 Define Criteria


The analyses above were performed considering financial data only. However, as you saw in Section 1.6, there is also a quality criterion involved. One assumption is that more employed personnel increase the probability of better quality in the process. Another assumption is that such an investment will also raise the quality of the product, even if it is unclear in which respects. The quality criterion is thus divided into two sub-criteria: process quality and product quality. The quality criterion is considered to be important, but definitely not as important as the financial aspect. The financial aspect is considered to be at least four to five


times more important. Thus, the criterion Finance has a weight between 0.8 and 1, whereas the weight of the criterion Quality is between 0 and 0.2. Furthermore, Product quality is considered to have a sub-weight between 0.7 and 1.0, whereas Process quality has a sub-weight between 0 and 0.3 (i.e., relative to the Quality criterion). In this section, you will build a multi-criteria model, consisting of a criteria tree and connected decision trees. Let us first build the criteria tree. 1. Select New from the File menu. You are now asked whether you want to create a decision tree or a multi-criteria model. See Figure 2-38. You will create a multi-criteria model.

Figure 2-38: Choosing multi-criteria model to create.

2. Click the radio button Multi-criteria model and then click OK. A criteria tree opens. See Figure 2-39.

Figure 2-39: A criteria tree.

You will now create the multi-criteria tree. 1. Left-click the node Cr. 1. The dialog box Add Criteria opens. In this you can define the number of sub-criteria. 2. Write the number 2 in the text field and click OK. You are now asked to confirm that you cannot use weight relations. 3. Confirm the message. The tree should now look as in Figure 2-40.


Figure 2-40: A criteria hierarchy.

4. Put in names as in Figure 2-41. This is done exactly in the same way as for decision trees.

Figure 2-41: A criteria hierarchy.

2.19 Assign Criteria Weights


You will now assert the respective weights of the criteria:
- The criterion Quality has a weight between 0 and 0.2.
- The criterion Finance has a weight between 0.8 and 1.
- The criterion Process quality has a weight between 0 and 0.3.
- The criterion Product quality has a weight between 0.7 and 1.0.

1. Left-click the yellow box Quality. The Node Properties dialog box opens. In this you can define the weight of this criterion. 2. Click the tab Weight (%). The dialog box Node Properties now looks as in Figure 2-42.


Figure 2-42: Weight tab.

There are three radio buttons in the dialog box. These are used for asserting the type of weights you want to assign to a criterion. You check the leftmost radio button when you want to define a precise weight. The second radio button is used when you want to assign a weight interval (as in your model).

3. Click the second radio button from the left. You can now enter a weight interval for the criterion.
4. Write the number 0 in the left text field and the number 20 in the right text field. Then click OK. The criterion is now assigned the weight interval 0% to 20%.
5. Assign intervals to the remaining criteria. The tree should now look as in Figure 2-43.

Figure 2-43: Assigned interval weights.


2.20 Assign a Tree to a Criterion


You will now connect the earlier defined tree to this criteria model.

1. Right-click the node Cr. 3 and choose the option Connect decision model to this weight. The Node Properties dialog box opens again. In this dialog box you can connect a decision model to this criterion. See Figure 2-44.

Figure 2-44: Node properties.

2. Click the radio button Connect decision model, choose T1:Investment.tree in the pop-up menu and click OK. The decision tree that was created earlier is now connected to the criteria model. See Figure 2-45.

Figure 2-45: Connected decision tree.

3. Save the criteria tree under the name MCDM Investment. The tree is now saved in the file MCDM Investment.ch.


2.21 Define New Decision Models


You will now define two simple decision models, one for each Quality sub-criterion. The two assumptions were that employing more personnel increases the probability of better quality in the process and that an investment will raise the quality of the product. Considering the employees, recall that there are three possibilities when investing: reduction of two employees, with a probability between 10% and 30%; reduction of one employee, with a probability between 50% and 70%; and no change, with a probability between 10% and 30%.

The assumption, from the process quality perspective, is that two employed persons are preferred to one, which in turn is preferred to none. Such a decision tree is shown in Figure 2-46. Note that the scale is chosen as in the financial decision model to make these trees comparable.

Figure 2-46: Process quality.

1. Define a new decision model as shown above and label it Process quality. This is done in the same way as for the financial decision model.

2.22 Asserting Value Relations


You will now assert relations between the consequences. Recall that two employed persons are preferred to one, which in turn is preferred to none. This means that C1 is preferred to C2, which is preferred to C3. Furthermore, C4 is equally preferred to C1.

1. Select Value Relations from the Edit menu. The Value Relations dialog box is shown. In this dialog box, you can assert qualitative estimates between the consequences. See Figure 2-47.


Figure 2-47: Value relations.

2. Select C1 in the left upper combo box and C2 in the right upper combo box. See Figure 2-48.

Figure 2-48: Adding value relation.


3. Select C2 in the 2nd left upper combo box and C3 in the 2nd right upper combo box.
4. Select C4 in the 3rd left upper combo box, = in the 3rd pop-up menu and C1 in the 3rd right upper combo box. Note that you can add more cells by clicking the button Extra relations. The Value Relations dialog box should now appear as Figure 2-49.

Figure 2-49: Adding value relations.

5. Click OK. The relations are now defined.
6. Save the model under the name Process quality.

The decision tree for the product quality is very simple. See Figure 2-50.

Figure 2-50: Product quality.

Recall that an investment will raise the quality of the product. Thus the only assertion to be made is that C1 is preferred to C2.


1. Create the decision tree above and set the scale between -100 and 260.
2. Select Value Relations from the Edit menu.
3. Select C1 in the left upper combo box and C2 in the right upper combo box.
4. Click OK.
5. Save the model under the name Product quality.

Now the decision model for product quality is defined. The only thing that remains is to connect the decision models to the multi-criteria tree.

1. Right-click the node Cr. 1 in the tree MCDM Investment.ch and assign the tree Product quality.tree.
2. Right-click the node Cr. 2 in the tree MCDM Investment.ch and assign the tree Process quality.tree.

The multi-criteria tree should now look like Figure 2-51.

Figure 2-51: Connected decision tree.

3. Save the criteria tree. The entire model is now saved under the name MCDM Investment.ch.

2.23 Evaluating Multi-Criteria Models


You will now perform a total evaluation of the alternatives, in which all criteria weights, probabilities, and values are taken into account.

1. Select Expected Value Graph - All Criteria from the Evaluation menu and perform a pair-wise comparison between the alternatives. The result of this analysis should now look as in Figure 2-52.


Figure 2-52: Multi-criteria evaluation.

It is still the case that the alternative Invest is the best, but with a slightly lower degree of dominance.

2.24 A Rough Sensitivity Analysis


As you could see from the above analysis, it seems to be better to invest in the system than not to invest. However, it is interesting to investigate how stable the result is with respect to the input data. You will now investigate how critical the weight assessments are. You will change the criterion Finance to have a weight between 0.7 and 0.9, whereas the weight of the criterion Quality should be between 0.1 and 0.3.

1. Right-click the node Cr. 1 and choose the option Criteria properties. The Node Properties dialog box opens again.
2. Click the tab Weight (%). Now you can change the weight intervals.
3. Change the weights according to the above. See Figure 2-53.


Figure 2-53: Editing the criteria weights.

4. Then click OK. The new weights are assigned to the criteria.

You will now again perform a total evaluation of the alternatives.

5. Select Expected Value Graph - All Criteria from the Evaluation menu and perform a pair-wise comparison between the alternatives. The result of this analysis should now look as in Figure 2-54.

Figure 2-54: Multi-criteria evaluation.

The standing of the alternative Not Invest relative to Invest changes only slightly. The result is therefore not very sensitive to changes in the criteria weights for Quality and Finance. This is further emphasized when the criteria are set to be equally important. The result can be seen in Figure 2-55.


Figure 2-55: The criteria are set to be equally important.

As can be seen from the result, the alternative Invest is still much better than Not Invest.

2.25 Conclusions
You have now modelled an investment decision problem under two criteria and performed various kinds of analyses. All weights, probabilities, and values have been taken into account in the evaluation phase. Despite the imprecision of the input data, important results could be obtained. In short, given the original estimates, the alternative Invest should definitely be chosen.


PART III: REFERENCE GUIDE

This part contains instructions on how to install DecideIT and information about the different parts of the application. The reference guide also contains descriptions of all menus and commands.


1 Preparations
1.1 Install DecideIT
If you downloaded DecideIT from our homepage on the web, run the file DecideIT_Setup.exe by double-clicking it. If you received a CD-ROM with DecideIT, insert the installation CD-ROM in your CD/DVD reader. If the installation procedure does not start automatically, follow the procedure below.

1. Run DecideIT_Setup.exe from the installation CD by double-clicking it and follow the instructions.
2. Restart the computer if needed.

Note that you might need administrative rights on your operating system to be allowed to install this software. Contact your system administrator if you need such rights. If you are running Windows XP or 2000, this step might not be necessary. For portability reasons, the graphical user interface of DecideIT is developed in the Java programming language. The Java Runtime Environment is needed.

1.2 Start DecideIT


1. Double-click the DecideIT icon in the program menu or on your desktop. The DecideIT window will be displayed. You can enlarge or shrink the window by clicking the zoom button in the upper right corner.


2 Menus and Toolbars; File


The File menu contains the following commands:

New
Open
Close
Save
Save as
Print model
Page setup
Export Tree to JPEG-format
Exit

2.1 Create Model


1. Select New from the File menu to create a new decision or criteria model. The New button in the toolbar corresponds to the command New in the File menu.
2. Choose to create a Decision Tree or a Multi-Criteria Model by checking the desired radio button. You may set a number of alternatives; the default is two alternatives. Alternatives may naturally be modified/added/removed later on.
3. Click OK.

2.2 Open an Existing Model


1. Select Open from the file menu to open an existing model. The Open button in the toolbar corresponds to the command Open in the file menu.
2. Browse for a specific file. When selecting Open from the file menu or clicking the button Open, you are able to browse for existing models on your disk.

2.3 Close a Model


1. Select Close from the File menu. This command closes the active model. If the model has been modified and not saved, DecideIT will ask if the model should be saved.
1a. Close by using the close button in the window corner. The Close button in a model window corner corresponds to Close in the file menu.
1b. Close DecideIT via Exit in the File menu. This command will be followed by a question whether you want to save an open model or not. The Close button in the corner of the program session corresponds to the Exit command in the file menu, and will also be followed by a question whether you want to save an open model or not.

2.4 Save a Model


1. Select Save/Save as from the File menu. The Save button in the toolbar corresponds to Save/Save as in the file menu. If the model is a new, not yet saved file, the Save button corresponds to Save as in the file menu. Save as is used to save a model in a different file on your disk.
2. Name your structure in the text box File name. It is possible to use longer, descriptive file names.
3. Click Save.

2.5 Save a Copy of Current Tree


Use this command if you wish to save a decision tree, which is connected to a multi-criteria model, to a stand-alone decision tree file.

2.6 Print Model


Information on how to install and connect a printer is found in the printer documentation.

1. Select Print Model from the file menu. The Print button in the toolbar corresponds to Print Model in the file menu.

2. Select Page Setup to optimize printing properties. Orient the structure horizontally or vertically. Choose margin settings for a suitable print.

3 Menus and Toolbars; Edit


The menu Edit contains the commands used for editing the properties of a decision tree. The menu has the following commands:

Undo
Redo
Alternative Properties
Value/Weight Relations
Set Value Scale
Set Background Color

3.1 Undo
The command Undo undoes the latest action.

3.2 Redo
The command Redo reapplies the action that was undone by the latest Undo command.

3.3 Alternative Properties


The command Alternative Properties opens the dialog box Alternative properties. In this dialog box you get an overview of the multi-criteria model and its bindings with decision models. You can also give internal names to the alternatives.

Figure 3-1: Alternative properties.


The Alternative Properties button in the toolbar corresponds to Alternative Properties in the Edit menu.

3.4 Set Value/Weight Relations


Value/Weight Relations are used when adding qualitative relations to the model. For example, the consequence C1 is better than C2 in the sense that the value of C1 cannot be lower than the value of C2. It is also possible to state that C1 is equal to C2, that C1 is at least X value units greater than C2, or that C1 approximately equals C2. Press Clear All to delete all value relations (be careful), and use Clear to delete a single value or weight relation. If more relations are desired, press Extra relations. When value or weight relations are added and OK is pressed, a consistency check will be carried out. If the relations cannot be made consistent, you will be asked to modify the relations marked in red.

Figure 3-2: Value relations.

In the screenshot above, C1 is better than C2, C3 is at least 250 value units better than C4, and the value of C5 is between 100 and 180 value units greater than the value of C7. The Value/Weight Relations buttons in the toolbar correspond to Value Relations and Weight Relations respectively in the edit menu. If a criteria tree contains more than one level, weight relations cannot be asserted.
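The consistency check can be thought of as a linear feasibility problem over the consequence values. The sketch below is not the DecideIT kernel; it only illustrates the idea with hypothetical value intervals and relations, using scipy's linear programming routine as a feasibility solver.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical intervals for C1..C4 on a 0-100 value scale, and the relations
# C1 >= C2, C2 >= C3 and C4 = C1 (equality written as two inequalities).
bounds = [(40, 80), (30, 70), (10, 50), (0, 100)]
relations = [(0, 1, 0.0), (1, 2, 0.0), (3, 0, 0.0), (0, 3, 0.0)]  # (greater, smaller, margin)

A_ub, b_ub = [], []
for greater, smaller, margin in relations:
    row = [0.0] * len(bounds)
    row[smaller], row[greater] = 1.0, -1.0      # v_smaller - v_greater <= -margin
    A_ub.append(row)
    b_ub.append(-margin)

# Feasibility only: a zero objective, we just ask whether any point satisfies everything.
result = linprog(c=np.zeros(len(bounds)), A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("consistent" if result.status == 0 else "inconsistent")   # consistent
```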


3.5 Set Value Scale


The scale is set manually. Note that when the scale is defined, all values assigned to any consequence node must fall within the given value scale. By default the scale is from zero to 100. The Set Value Scale button in the toolbar corresponds to Set Value Scale in the Edit menu.

3.5.1 Value Scales and Multi-Criteria Decision Problems

Changing the value scale of any criterion in the decision problem will affect the evaluation. It is extremely important that, prior to any multi-criteria evaluation, all criteria have well-defined value scales. The greatest value in each value scale represents the best possible outcome with respect to the given criterion, and the lowest value in each value scale represents the worst possible outcome with respect to the given criterion. Manipulating a value scale in a multi-criteria decision problem means that the given best and/or worst possible outcome is assigned new values, making the already defined values of the different consequences less good and/or less bad relative to the best and/or worst cases. Example: Consider a decision situation with two criteria: ROI and Research. ROI will be measured in monetary units and Research will be measured in number of active researchers. A pre-investigation leads the decision analyst to state that the worst possible ROI is to lose 1 million, and the best possible ROI is to gain 5 million. Another investigation leads the analyst to state that the worst possible research outcome is to employ zero researchers, and the best possible outcome is to have ten active full-time researchers. Now, assume that the company holds the two criteria to be equally important, thus assigning them both the weight 0.5, and that the following two alternatives are considered: Alt. 1, which means an ROI of 4 million and 4 researchers, and Alt. 2, which means an ROI of 3 million and 8 researchers.


Figure 3-3: Decision tree of the example.

According to the semantics of the additive utility function described in section 5, the utility of 5 million will be one, and the utility of -1 million will be zero. A transformation[1] of 4 million onto the [0,1] scale will then be 4E6 / |5E6 - (-1E6)| = 0.667, and a transformation of 4 researchers onto the [0,1] scale will be 4 / |10-0| = 0.4. Building this decision situation in DecideIT and performing a multi-criteria pairwise comparison of the alternatives yields the evaluation window below.

[1] The given example is made under the assumption that the utility is linear in the size of the ROI and the number of researchers, but the issue discussed is not restricted to such circumstances.


Figure 3-4: Multi-criteria evaluation.

Now, suppose we change the scale for the criterion Research to [0,30].

Figure 3-5: Changing the value scale for criterion Research.

Then the multi-criteria evaluation of the same decision problem would yield the result in Figure 3-6.

Figure 3-6: Evaluation after changing of the scale.

As can be seen, the result of the evaluation now says that Alt. 1 is the better option, and all we have done is change the scale for one criterion in the decision context! To clarify this, recall that the utility of 4 researchers mapped to 0.4 using the first scale [0,10]. Now, since the scale is set to [0,30], the utility of 4 researchers will map to 4 / |30-0| = 0.133; thus the change of scale implies a depreciation of the utility of 4 researchers, implicitly making the values in the ROI criterion greater relative to the Research criterion. The important conclusion of this example is that the individual value scales for all criteria must be well defined prior to any multi-criteria evaluation; when a scale is adjusted, all values defined on this scale must be re-set relative to the new worst and best cases represented by the lower and upper bounds.
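The arithmetic of the example can be checked with the short sketch below. It uses the normalization exactly as printed above (the value divided by the width of the criterion's scale) and equal weights of 0.5; only the Research scale is changed between the two runs.

```python
# Worked sketch of the example above, reproducing the text's own normalization.
def normalize(value, lo, hi):
    return value / abs(hi - lo)

def total(weights, normalized_values):
    return sum(w * v for w, v in zip(weights, normalized_values))

weights = [0.5, 0.5]                      # ROI and Research equally important
roi_scale = (-1e6, 5e6)

for research_scale in [(0, 10), (0, 30)]:
    alt1 = total(weights, [normalize(4e6, *roi_scale), normalize(4, *research_scale)])
    alt2 = total(weights, [normalize(3e6, *roi_scale), normalize(8, *research_scale)])
    print(research_scale, round(alt1, 3), round(alt2, 3))
# With scale (0, 10) Alt. 2 scores higher; with scale (0, 30) Alt. 1 scores higher,
# so the ranking flips merely by changing one value scale.
```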

3.6 Set Background Color


By default the background is white. Use this command to change this colour.

4 Menus and Toolbars; View


The menu View contains the following commands to control the open sessions in DecideIT:

Overview
Hide/Show all Evaluation Windows
Update Tree

4.1 Overview
Use this command to zoom out the view of a decision tree, so that a better overview of large trees is obtained. Use the command again to restore the view to normal. Note that it is still possible to click with the mouse button on the nodes in the tree when overview is chosen. The Overview button in the toolbar corresponds to Overview in the view menu.

4.2 Hide/Show all Evaluation Windows


Use this command to hide all evaluation windows, e.g., windows containing evaluation graphs, security thresholds, critical values etc. Use the command again to make the evaluation windows visible again. The Hide/Show all Evaluation Windows button in the toolbar corresponds to Hide/Show all Evaluation Windows in the view menu.


4.3 Update Model


Update Model will force an update of the model with new values, if the model is not automatically updated. This may be the case if DecideIT is running on a relatively slow and/or low-memory machine, or if you have edited an active Excel sheet related to the model. The Update Model button in the toolbar corresponds to Update Model in the view menu.

5 Menus and Toolbars; Templates


The menu Templates contains the command Probability Templates... The feature Probability Templates is used to create templates, which can be easily and repeatedly used as probability statements in the model. Every template needs to have a unique name. The probability can be set to be a precise probability C, an interval OI, or an interval with a most likely point OI+P. Enter the name of the template in the text field, for example Prob. of Rain, assign probabilities, and click Add. See Figure 3-7.

Figure 3-7: Probability templates.

In the screenshot above a probability template Prob. of rain has been added, with a probability between 0.15 and 0.25 and no explicitly given most likely point. A probability template Prob. of snow would be added if the user clicked Add. The Probability Templates button in the toolbar corresponds to Probability Templates in the Templates menu.


6 Menus and Toolbars; Evaluation


The menu Evaluation contains commands for evaluating a decision tree. The commands are:

Security Thresholds
Total Ranking
Cardinal Ranking
Expected Value Graph
Risk Profile
Critical Probabilities/Values/Weights
Total Ranking All Criteria
Cardinal Ranking All Criteria
Expected Value Graph - All Criteria
Extreme Values
Preference Order

Note: When the values in a decision tree are modified, the evaluation windows of the decision tree need to be updated. To mark that an evaluation window needs to be updated, it will turn grey (shaded).

6.1 Security Thresholds


Use this command to define the security thresholds of a given decision problem. First, define the lowest value not to fall short of, then state the maximum acceptable probability of ending up with such a low value. The threshold frame is represented in Figure 3-8.

Figure 3-8: Security thresholds.

When pressing OK or Apply, an evaluation of the security thresholds is performed.


Figure 3-9: Security thresholds evaluation window.

In the screenshot above, the given security threshold finds Alt. 2 to fit the specified risk profile. Alt. 1 might be at risk even if it fulfils the thresholds at an early contraction level. The Security Thresholds button in the toolbar corresponds to Security Thresholds in the evaluate menu.

6.2 Total Ranking


Use this command to get an overview of the result. Before performing the calculations, you will be asked for an indifference interval in percentage of the value scale, between 0% and 10%; by default this interval is 5%. See Figure 3-10.

Figure 3-10: Ranking properties.

Total Ranking presents an overview of a preference order of the alternatives based on the alternatives' expected values at a specified level of contraction. The ranking is obtained through the following procedure: 1) Pick the alternative with the greatest expected value at the specified contraction level. 2) Let any remaining alternatives whose expected values do not differ by more than the indifference interval percentage of the value scale receive the same rank as the alternative picked in the previous step. 3) Remove the alternatives picked in previous steps. 4) If there are remaining alternatives, go to step 1. A result of such an analysis is shown in Figure 3-11.

Figure 3-11: Total Ranking.
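A minimal sketch of the ranking procedure described above, assuming the expected values at the chosen contraction level have already been computed (the numbers below are illustrative placeholders, not output from DecideIT):

```python
def total_ranking(expected_values, scale_width, indifference_pct=5.0):
    """expected_values: dict alternative -> expected value.
    Returns dict alternative -> rank (1 is best)."""
    tolerance = scale_width * indifference_pct / 100.0
    remaining = dict(expected_values)
    ranks, rank = {}, 1
    while remaining:
        best = max(remaining.values())                       # step 1
        tied = [a for a, v in remaining.items()
                if best - v <= tolerance]                     # step 2
        for a in tied:
            ranks[a] = rank
            remaining.pop(a)                                  # step 3
        rank += 1                                             # step 4: repeat
    return ranks

print(total_ranking({"Invest": 92.0, "Not invest": 0.0}, scale_width=360))
# {'Invest': 1, 'Not invest': 2}
```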

The Total Ranking button in the toolbar corresponds to Total Ranking in the Evaluate menu.

6.3 Cardinal Ranking


Use this command to get an overview of the result. Before performing the calculations, you will be asked to set the contraction level and the contraction mode. See Figure 3-12.

Figure 3-12: Ranking properties.

Cardinal Ranking presents an overview of the respective range of the alternatives' expected values at a specified level of contraction. A result of such an analysis is shown in Figure 3-13, where the ranges of the expected values can be seen at 0% contraction level.


Figure 3-13: Cardinal Ranking.

The Cardinal Ranking button in the toolbar corresponds to Cardinal Ranking in the evaluate menu.

6.4 Expected Value Graph


The command Expected Value Graph opens the Evaluate Property Frame. See Figure 3-14.

Figure 3-14: Evaluation property frame.

The comparisons take account of value relations between consequences belonging to different alternatives. If there are no such relations between the alternatives, the result is the same as taking the difference between two single evaluations.
a. Compare alternatives pair-wise (compare two alternatives against each other).
b. Compare alternative to average (compare one alternative against an average of the other alternatives).
c. Single alternative (study the expected value of a single alternative).
Size of evaluation step sets the number of calculated points in the evaluation graph, i.e., how fast the interval contractions should be performed. An evaluation step of 20% gives the smallest number of evaluation steps, and the calculation process will run quicker. Before the evaluation, you choose the Contraction Mode, i.e., you choose to contract only the probability or value base, or to pre-contract one of them to the most probable point first. The default is to contract both the probability and value bases. Normally you just use the default choice. When pressing OK the calculation of the evaluation graphs is carried out and the graphs will be presented in a new window. The Expected Value Graph button in the toolbar corresponds to Expected Value Graph in the evaluate menu.
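The following is only an illustration of what contracting an interval toward its most likely point means; the exact computations performed by the DecideIT kernel may differ.

```python
# Illustrative sketch: at 0% contraction the full interval is kept; at 100% it
# collapses to the most likely point.
def contract(lo, hi, most_likely, level):
    """level is the contraction level in [0, 1]."""
    return (lo + level * (most_likely - lo),
            hi - level * (hi - most_likely))

for level in [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]:
    print(level, contract(10.0, 30.0, 15.0, level))
# The interval [10, 30] shrinks stepwise onto the most likely point 15.
```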

6.5 Cumulative Risk Profile


In DecideIT you can also investigate the cumulative risk profiles of the alternatives and therefore also whether an alternative statistically dominates another. The command Risk Profile opens the Cumulative Risk Profile. See Figure 3-15.

Figure 3-15: Cumulative risk profile.

In this dialog box, several graphs can be seen. The graphs show the cumulative risk profiles for the minimum and maximum possible values (0% contraction) as well as the average of these two. In Figure 3-15 none of the alternatives statistically dominates the other. However in Figure 3-16 alternative 1 (middle graph) statistically dominates alternative 2.


Figure 3-16: Cumulative risk profile.

The Cumulative Risk Profile button in the toolbar corresponds to Cumulative Risk Profile in the evaluate menu.

6.6 Risk Profile


Basically, the risk profile is a discrete density function over the consequences for a given alternative. It essentially holds the same information as the cumulative risk profile, although it is more cumbersome to investigate the stochastic dominance using this feature. The Risk Profile button in the toolbar corresponds to Risk Profile in the evaluate menu.

6.7 Critical Probabilities/Values/Weights


The feature Critical Probabilities/Values shows what impact the probability and value intervals of the consequences have on the expected value (the expected value span). The span is obtained by letting each probability and value interval assume the consistent extreme points within the given intervals. Note that when a consequence is assigned a probability, the size of the probability interval will also affect the expected value span. The result is then ordered from greatest impact to lowest impact in a so-called tornado diagram. The result can be interpreted such that value intervals implying a greater impact on the expected value are critical, and information or data related to these consequences is important for the decision situation at hand.
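As an illustration of the idea behind the tornado diagram, the sketch below varies one value interval at a time between its end points and records the resulting span of the expected value. It holds the probabilities fixed and keeps all other values at their midpoints, which is a simplification of the procedure described above; the numbers are hypothetical.

```python
def expected_value(probs, values):
    return sum(p * v for p, v in zip(probs, values))

def tornado(probs, value_intervals):
    mid = [(lo + hi) / 2 for lo, hi in value_intervals]
    spans = {}
    for i, (lo, hi) in enumerate(value_intervals):
        low_case = mid[:i] + [lo] + mid[i + 1:]
        high_case = mid[:i] + [hi] + mid[i + 1:]
        spans[f"C{i + 1}"] = expected_value(probs, high_case) - expected_value(probs, low_case)
    return dict(sorted(spans.items(), key=lambda kv: -kv[1]))

print(tornado([0.2, 0.5, 0.3], [(-120, -80), (60, 100), (200, 320)]))
# Consequences are listed from greatest to smallest impact on the expected value.
```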


Red colour indicates that the expected value is influenced in a negative way, and green colour indicates a positive influence on the expected value. A representation of this is shown in Figure 3-17.

Figure 3-17: Critical probabilities/values.

In the screenshot above, the consequence C4 has the most critical probability assignment and the consequence C5 the most critical value assignment, considering the impact on the expected value. The Critical probabilities button in the toolbar corresponds to Critical probabilities in the evaluate menu. Critical probabilities apply to decision trees having event nodes (red circles). The Critical values button in the toolbar corresponds to Critical values in the evaluate menu. Critical values apply to both decision trees and multi-criteria models. For the latter, the values are also weighted. The Critical weights button in the toolbar corresponds to Critical weights in the evaluate menu. Critical weights apply to multi-criteria models only.

6.8 Total Ranking All Criteria


This is the same function as Total Ranking, but with respect to all criteria in the decision model. The Total Ranking - All Criteria button in the toolbar corresponds to Total Ranking All Criteria in the Evaluate menu.

6.9 Cardinal Ranking All Criteria


This is the same function as Cardinal Ranking, but with respect to all criteria in the decision model. The Cardinal Ranking - All Criteria button in the toolbar corresponds to Cardinal Ranking All Criteria in the Evaluate menu.

6.10 Expected Value Graph - All Criteria


The command Expected Value Graph - All Criteria works like the Expected Value Graph command described above, except that it must be used when evaluating multi-criteria decisions. The Expected Value Graph - All Criteria button in the toolbar corresponds to Expected Value Graph - All Criteria in the evaluate menu.

6.11 Extreme Values


When no probability estimates can be made, or they can be considered to be meaningless, a decision is said to be under strict uncertainty. For such cases, there exist a number of other decision rules besides the maximization of the expected value. The feature Extreme Values evaluates the given decision tree according to the following decision rules: maximin, maximax, pessimism-optimism index, value span, and the principle of insufficient reason. See Figure 3-18.
a. Maximin. Choose the alternative that gives the best result if the worst possible outcome occurs for each alternative.
b. Maximax. Choose the alternative that gives the best result if the best possible outcome occurs for each alternative.


Figure 3-18: Extreme values.

c. Pessimism-Optimism Index. This rule can be regarded as a mixture of maximin and maximax. Let the number A in the interval [0,1] be the index; when A = 1 we are just as pessimistic as in maximin, and when A = 0 we are just as optimistic as in maximax. Slightly more formally explained: let A in [0,1] be the PO-index, let Pi denote the value of the best consequence of each alternative i, and let Qi be the value of the worst consequence of each alternative i. The decision rule will choose the alternative whose A*Qi + (1 - A)*Pi is greatest. Again, note that when A = 1 the rule is the same as maximin, and when A = 0 the rule is the same as maximax. In the graphs to the left in the window, DecideIT lets the index assume the values 0, 0.2, 0.4, 0.6, 0.8 and 1 for all evaluated alternatives, and each graph corresponds to an alternative.
d. Value Span. Choose the alternative where the maximum value span over the consequences of that alternative is lowest.
e. Principle of Insufficient Reason. This rule is based on the assumption that if the probabilities of the different consequences are completely unknown, then they can be assumed to be equal. Choose the alternative such that the average most likely point value of the possible consequences is maximized.
The Extreme Values button in the toolbar corresponds to Extreme Values in the evaluate menu.
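A minimal sketch of the five rules follows, with each alternative given as a list of (lowest, highest) value intervals and the interval midpoint standing in for the most likely point. This is illustrative only (hypothetical data, not the DecideIT implementation), and the pessimism-optimism index follows the convention above where A = 1 means maximin.

```python
def evaluate_rules(alternatives, po_index=0.5):
    """alternatives: dict name -> list of (lowest, highest) consequence intervals."""
    worst = {a: min(lo for lo, hi in cs) for a, cs in alternatives.items()}
    best = {a: max(hi for lo, hi in cs) for a, cs in alternatives.items()}
    return {
        "maximin": max(alternatives, key=lambda a: worst[a]),
        "maximax": max(alternatives, key=lambda a: best[a]),
        "pessimism-optimism": max(
            alternatives,
            key=lambda a: po_index * worst[a] + (1 - po_index) * best[a]),
        "value span": min(
            alternatives, key=lambda a: max(hi - lo for lo, hi in alternatives[a])),
        "insufficient reason": max(
            alternatives,
            key=lambda a: sum((lo + hi) / 2 for lo, hi in alternatives[a]) / len(alternatives[a])),
    }

alts = {"Alt. 1": [(-100, -60), (40, 120), (200, 260)],
        "Alt. 2": [(0, 10), (0, 10)]}
print(evaluate_rules(alts, po_index=0.5))
```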

6.12 Preference Order


This feature presents the preference order on the set of all ordered consequences in the decision problem. The order is derived from the assigned value intervals and value relations.

Figure 3-19: Preference order.

If the value intervals of two consequences are overlapping and no ordering value relation is set between the consequences, they will be assumed to be indifferent in this evaluation and will not be present in the window. The order is presented from top to bottom, i.e., the most preferred consequence(s) is at the top and the least preferred consequence(s) is at the bottom. See Figure 3-19. Unordered consequences will not be present in the window. The Preference Order button in the toolbar corresponds to Preference Order in the evaluate menu.


7 Menus and Toolbars; Tools


7.1 Document History
Use this command to open a simple text editor, mainly with the purpose of keeping track of changes made in the current decision model.

7.2 Choose Active Excel Spreadsheet


In the node property frame, it is possible to read from value cells in an Excel spreadsheet instead of entering a value, weight, or probability. In order to do this you must choose an active Excel document for each model in DecideIT. Use this command to choose an active Excel document. Browse for the desired file, then press OK. More details on how to create references from the node property frame in DecideIT to a cell in a spreadsheet are offered in the DecideIT help-system, Contents and Index.

7.3 Settings
Use this command to edit general settings for each decision model.
a. Default Settings. Press this button to restore the default settings.
b. State probability in percentage. If this is set, probabilities are stated in percentage, 0%-100%, instead of being stated on the scale 0-1.
c. Allow usage of solver optimization. If this is set (highly recommended), the computational kernel of DecideIT performs approximations in the cases where there are dependencies between alternatives (value relations) and the values of the consequences within each alternative cannot be strictly ordered.
d. Use approximate contraction point. If this is set (highly recommended), approximations in the computational steps of computing contraction points are allowed, resulting in a significant increase in speed.
e. Auto save every X minutes. If set, your model will be automatically saved every X minutes.

f. Explanation width. Sets the width of the yellow rectangles in the tree. Increase this size if the sentences are too wide to fit within the rectangle.

Figure 3-20: Settings.

8 Menus and Toolbars; Help


8.1 About
This command opens the About-frame, where you can see information about your license and version of DecideIT.

8.2 Contents and Index


This command opens the DecideIT help-system. The help-system is very useful in order to get acquainted with DecideIT.

8.3 Enter License Key


Use this command if you wish to enter a license key to access a fully featured version of DecideIT. This frame will automatically open every time you start the software if it is not registered.


9 Node Property Frame


9.1 Identify Decision Alternatives
To begin designing a new tree structure, select New from the file menu. A rudimentary tree opens and you are able to begin designing a tree structure of the problem.

Figure 3-21: Right-click pop-up menu when right-clicking on the initial decision node (green square).

a. Increase the number of alternatives. Left-click the green box to open the dialog box Add Alternatives, or right-click the green box (which then turns yellow) and select Add alternative(s). Enter the identified number of alternatives in the problem structure. The minimum number of alternatives is two and the maximum is eight. During the designing phase it is of course possible to add and remove alternatives from the decision structure.

Figure 3-22: Defining alternatives.

b. Copy the tree to another location. Right-click a node and choose Copy tree. Choose a new node in another tree and select Paste.


Figure 3-23: Copying trees.

c. Hide Sub-nodes. The command Hide sub-nodes is useful for large tree structures. Right-click the green box (which then turns yellow) and select Hide sub-nodes. The command will collapse (hide) the tree structure from the specific node you are editing. To unfold the collapsed tree structure, repeat the given procedure with the command Show sub-nodes.
d. Label identified decision structure. Left/Right-click the light yellow box to open the dialog box Node Properties: D1, or right-click the green box (which then turns yellow) and select Node properties. It is also possible to select Decision from the Open Node menu to open the dialog box Node Properties for the specific node. The decision structure can be labelled partly with a short name under Tree name, occurring in the yellow box, partly with an extended description of the structure under Decision.


Figure 3-24: Node property frame for single decision situations.

9.2 Identify Event and Consequence Nodes


There are several commands for changing a tree. Many of these can be applied by right-clicking a node.

Figure 3-25: Right-click pop-up menu.

a. Identify the number of sub-nodes (consequences) for each specific event. Left-click the blue triangle to open the dialog box Add Nodes, or right-click the blue triangle (which then turns yellow) and select Add node(s). Enter the identified number of sub-nodes in the problem structure. The minimum number of sub-nodes is two and the maximum is 512. The maximum number of sub-nodes altogether is around 900. During the designing phase it is of course possible to add and remove sub-nodes from the decision structure.


b. Convert nodes. Right-click a node and choose Convert to probability/decision node. The available option depends on the selected node type.

Figure 3-26: Changing Node Type.

c. Delete a branch. Right-click a node (which then turns yellow) to open the node menu and select Delete branch. A dialog box Delete branch will open and ask whether to delete or not. d. Move up and down branches. Right-click a node and choose Move up/Move down. This enables you to move up or down a branch in a tree.


Figure 3-27: Moving branches.

e. Copy node/branch. If two nodes are identified to contain identical values, the command Copy node/branch is useful. Right-click the blue triangle (which then turns yellow) to open the node menu and select Copy node/branch. Select the specific node where to Paste the previously copied node.
f. Hide Sub-nodes. The command Hide sub-nodes is useful for large and complex tree structures. Right-click the green box (which then turns yellow) and select Hide sub-nodes.
g. Choose/Disregard/Regard alternative. The commands Choose/Disregard/Regard alternative are primarily used when handling multiple decisions in the same tree. Multiple decisions are described below. Right-click the green box (which then turns yellow) and select Choose/Disregard/Regard alternative.


h. Label identified event nodes and consequence nodes. Left/Right-click the light yellow box to open the dialog box Node Properties, or right-click a node (which then turns yellow) and select Node properties. The Event node/Consequence node can be labelled partly with a short name under Scenario, occurring in the yellow box, partly with an extended description of the structure under Extended explanation of scenario. It is also possible to select Event or Consequence from the Open Node menu to open the dialog box Node Properties: D1.
i. Edit the probabilities of an event node. Left/Right-click the light yellow box to open the dialog box Node Properties, or right-click the blue box (which then turns yellow) and select Node properties. The valid probability statements are: a precise probability P, an interval I, an interval with a most likely point I+P, and a probability template PT. The probabilities must be consistent with the given constraints, and the fully consistent probabilities are shown to the right as hull-probabilities (a small sketch of this tightening is given at the end of this section). The consistency checks are performed when OK or Apply is pressed in the node property frame. If the assigned probabilities cannot be consistent, you will be asked to modify your probability statements. The probabilities may be given as percentages (%) between 0% and 100% or on the 0-1 scale, see Settings.


Figure 3-28: Entering probabilities (in percentage).

j. Edit the value of an event node. Left/Right-click the light yellow box to open the dialog box Node Properties, or right-click the blue box (which then turns yellow) and select Node properties. The valid value statements are: a precise value P, an interval I and an interval with a most likely point I+C. The values must be consistent with the given constraints, and the fully consistent values are shown at the bottom as hull-values. The consistency checks are performed when OK or Apply is pressed in the node property frame. If the assigned values cannot be consistent with the given value relations, you will be asked to modify your value statements.


Figure 3-29: Entering values.
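The tightening into hull probabilities mentioned under item i above can be illustrated as follows. This is a sketch of the idea only (each interval's bounds are tightened so that the remaining probability mass can still sum to one); the exact kernel computation in DecideIT may differ.

```python
def hull(intervals):
    """intervals: list of (lower, upper) probability bounds for one event node."""
    tightened = []
    for i, (lo, hi) in enumerate(intervals):
        others_lo = sum(l for j, (l, h) in enumerate(intervals) if j != i)
        others_hi = sum(h for j, (l, h) in enumerate(intervals) if j != i)
        tightened.append((max(lo, 1.0 - others_hi), min(hi, 1.0 - others_lo)))
    return tightened

print(hull([(0.1, 0.3), (0.5, 0.7), (0.1, 0.3)]))
# [(0.1, 0.3), (0.5, 0.7), (0.1, 0.3)]  -- already consistent
print(hull([(0.1, 0.2), (0.5, 0.9), (0.1, 0.3)]))
# The second interval tightens to (0.5, 0.8): its stated upper bound 0.9 cannot be reached.
```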


10 Evaluation Windows

The evaluation windows are Security Thresholds, Total Ranking, Cardinal Ranking, Expected Value Graph, Cumulative Risk Profile, Critical Probabilities/Values, Total Ranking All Alternatives, Cardinal Ranking All Alternatives, Expected Value Graph All Alternatives, Extreme Values and Preference Order. These windows contain menus, a toolbar, and a frame showing the result. The choices available in the menus and toolbar are slightly different depending on the given evaluation window. The menus contained in the windows are:

File
Edit
View
Update

Figure 3-30: Basic expected value graph window.

10.1 File
a. Select Export Analysis to JPEG-format from the File menu. The Export button in the toolbar corresponds to Export Analysis to JPEG-format in the file menu.
b. Name the image in the panel File name.
c. Click Save. The main purpose of this command is to facilitate further documentation and presentation of problem structures.


d. Close. Use this command to close the selected evaluation window.

10.2 Edit
The Edit menu contains the following commands:

Set y-scale (Expected value graphs and Cardinal ranking)
Reset y-scale (Expected value graphs and Cardinal ranking)
Set Color

a. Set y-scale. Use this command to set the vertical scale in the evaluation window. This will only change the graphical presentation of the graph.

Figure 3-31: Setting Y-scale in evaluation window.

The Set y-scale button in the toolbar corresponds to Set y-scale in the Edit menu.
b. Reset y-scale. This command sets the vertical scale in the evaluation window to be determined automatically, which is the default.
c. Set Color. Use this command to change the colour of the background and some objects.

10.3 View
The View menu contains the following commands:

Hide window
Compare positive graphs (Expected value graphs)
Numerical (Expected value graphs)
Contraction (Expected value graphs)
Size (Expected value graphs)


a. Hide window. Use this command to hide all evaluation windows, e.g., windows containing evaluation graphs, security thresholds, critical values etc. The Hide/Show button in the toolbar corresponds to Hide window in the view menu. Use this button to show the evaluation window again.
b. Compare Positive Graphs. Use this command to compare the positive graphs of two alternatives instead of one positive and one negative. Consider a comparison between Alt. 1 and Alt. 2. Comparing the positive graphs means that we compare the upper line in the comparison Alt. 1 against Alt. 2 and the upper line in the comparison Alt. 2 against Alt. 1 in the default evaluation graphs.

Figure 3-32: Comparing positive graphs.

The Compare Positive Graphs button in the toolbar corresponds to Compare Positive Graphs in the view menu.
c. Numerical. Use this command to show some of the calculated values in the graph.


Figure 3-33: Numerical view.

The Numerical button in the toolbar corresponds to Numerical in the view menu.
d. Contraction. Use this command to show the level of contraction along the x-axis.
e. Size. Use this command to set the size of the evaluation windows. The options are small and large, where small is the default size. The Show Small/Large Graph button in the toolbar corresponds to Small/Large in the view menu.

10.4 Update
The Update menu contains the following commands:

Update
Automatic update

a. Update. Use this command to update the evaluation according to new parameters in the decision tree. The Update button in the toolbar corresponds to Update in the update menu.
b. Automatic update. When Automatic Update is selected, the evaluation graph will automatically be updated according to new parameters in the decision tree. The default setting is that the evaluation windows are updated manually, since if many evaluation windows are open it may take some time to update them all.

11 Multiple and Sequential Decisions

Sometimes you have to analyse a sequence of decisions. See Figure 3-34.

Figure 3-34: A decision sequence.

In Figure 3-34, you can see that there are three decisions involved. Two of them are crossed. This means that you have to evaluate decision D2 first. Obviously, you should choose Alt. 1 High, since it has a higher value.

1. Right-click the node C2 and select Choose alternative. This means that you decide that the alternative High in decision D2 is the preferred one. The other alternative (Low) then becomes grey and the cross in decision node D3 disappears. This means that you can evaluate decision node D3. See Figure 3-35.


Figure 3-35: A decision sequence.

Obviously, you should choose Alt. 1 Modify, since it has a higher value (60) than Not modify (25).

2. Right-click the node D2 and select Choose alternative. The other alternative (Not modify) then becomes grey and the cross in decision node D1 disappears. This means that you can make a decision also in node D1.

Now, you should choose Alt. 2 Buy, since it has a higher value (60) than Sell (50).

3. Right-click the node D3 and select Choose alternative. The other alternative (Sell) then becomes grey and you have finished the analysis. The decision sequence to make is then [Buy, Modify, High]. See Figure 3-36.

Figure 3-36: A decision sequence.


You can also reset a selection by right-clicking a grey node and selecting Regard Alternative. As an alternative to choosing an alternative, you can also select Disregard Alternative for the other ones. A further way of making these selections is to right-click a node, select Node Properties, and then click the tab Alt. status in the dialog box that appears. In this panel you can change the status. See Figure 3-37.

Figure 3-37: Alt. status panel in the node property frame.


PART IV: THEORY

This part describes general concepts and procedures in the area of decision analysis and the theoretical background of the tool DecideIT. The content is provided as a general background to the area, and parts of it might be complicated to grasp at a first reading. However, a complete understanding of the details is not necessary when working with DecideIT, and this part can be skipped. The background theory of DecideIT is provided in Chapter 20.


12 Historical Background

Cogito, ergo sum. Following these words, Descartes concluded the existence of free will without the presence of pre-determinism. In a non-deterministic world, we are capable of choosing for ourselves from the possible courses of action we identify. But with the right to choose comes the responsibility for the consequences of our actions. It is up to ourselves to discriminate between the different alternatives, and we are expected to do the right thing. The majority of such discriminations are trifling little choices, natural parts of our everyday lives, but some are of such importance that a structured approach is desired and a careful analysis is undertaken before choosing and implementing a particular course of action. However, the origin of the field of decision analysis can be traced back beyond any Descartes meditation, as the theory has evolved from the statistical aspects of games. Fibonacci's Liber Abaci (1202) and Paccioli's Summa de arithmetica, geometria et proportionalità (1494) constitute crucial early written work on such questions. Paccioli raises the question of how the stakes should be divided between two players of balla, who have agreed to play until one of them wins six rounds, but are interrupted and cannot continue when one player has won five rounds and his counterpart has won three ([David, 1962], p. 37). Later, Gerolamo Cardano (1501-1576) tried to answer the question in his Liber de ludo aleae (1663), in which he formulated the fundamental concept of solving a probability problem by identifying a sample space with equally likely outcomes. Pierre Rémond de Montmort further stimulated the early work on probability theory in his Essay d'Analyse sur les Jeux de Hazard (1708), where he wanted to show superstitious gamblers how to behave rationally. Other important early contributors to a general theory of probability include Blaise Pascal (1623-1662) and Pierre de Fermat (1601-1665), who, after they encountered a gambling question from the French nobleman Antoine Gombaud (a.k.a. Chevalier de Méré, 1607-1684), initiated an exchange of letters in which fundamental principles of probability theory were formulated. Gombaud's game consisted in throwing two six-sided dice 24 times, and the problem was to decide whether or not to bet even money on the occurrence of at least one pair of sixes among the 24 throws. A seemingly well-established but deceiving gambling rule had led Gombaud to believe that betting on a double six in 24 throws would be profitable; however, his calculations had indicated the opposite.


The importance of statistics grew in the 17th and 18th century with the introduction of life annuities and insurance. Mortality statistics and life annuities were research areas of Abraham de Moivre (1667-1754), and in his Doctrine of Chances (1718) de Moivre defines statistical independence. Later, in Miscellanea Analytica (1730) the same de Moivre introduced the normal distribution as an approximation of the binomial distribution for use in prediction of gambles. In the second edition of Miscellanea Analytica (1738), de Moivre improved the formula for the normal distribution with the support of James Stirling (1692-1770). Furthermore, Reverend Thomas Bayes (1702-1761), an English Presbyterian minister, famous for his posthumously published An Essay Towards Solving a Problem in the Doctrine of Chances (1763), introduced the widely applied Bayes' theorem and the concept of Bayesian updating. As a result, Bayes is credited with the introduction of subjective probability theory as well as the theory of information. Bayes' conclusions were later accepted by Pierre-Simon Laplace (1749-1827), and published in his double volume Théorie Analytique des Probabilités (1812). In this comprehensive work, Laplace investigated generating functions, approximations to various expressions occurring in probability theory, methods of finding probabilities of compound events when the probabilities of their simple components are known, and a discussion of the method of least squares. Alongside the early development of a theory of probability, the Swiss physician and mathematician Daniel Bernoulli (1700-1782) wrote a landmark paper, Specimen Theoriae Novae de Mensura Sortis (1738), in which a motivation for the concept of utility is given, commonly referred to as his solution to the famous St. Petersburg Paradox posed in 1713 by Daniel Bernoulli's cousin, Nicolaus Bernoulli. The name St. Petersburg Paradox is due to the fact that the distinguished Bernoulli family was in many ways connected to St. Petersburg. In this paradox, Nicolaus Bernoulli considered a fair coin, defined by the property that the probability of heads is 1/2. This coin is tossed until heads appears. The gambler is rewarded with 2^n ducats if the first heads appears on the n:th trial. The expected monetary value of this game is

EMV = Σ_{n=1}^∞ (1/2^n)·2^n = (1/2)·2 + (1/4)·2² + (1/8)·2³ + ... = 1 + 1 + 1 + ... = ∞    (emv)

Thus, it is infinite. It is nevertheless difficult to believe that any gambler would be willing to pay an infinite amount of money to participate in such a game. Bernoulli concluded therefore that the expected monetary value is inappropriate as a decision rule. Bernoulli's solution to this paradox involved two ideas that have had great impact on economic theory. Firstly, he stated that the utility of money cannot be linearly related to the amount of money; it rather increases at a decreasing rate.
To make this clear it is perhaps advisable to consider the following example: Somehow a very poor fellow obtains a lottery ticket that will yield with equal probability either nothing or twenty thousand ducats. Will this man evaluate his chance of winning at ten thousand ducats? Would he not be ill-advised to sell this lottery ticket for nine thousand ducats? To me it seems that the answer is in the negative. On the other hand I am inclined to believe that a rich man would be ill-advised to refuse to buy the lottery ticket for nine thousand ducats. If I am not wrong then it seems clear that all men cannot use the same rule to evaluate the gamble [...] the value of an item must not be based on its price, but rather on the utility it yields. The price of the item is dependent only on the thing itself and is equal for everyone; the utility, however, is dependent on the particular circumstances of the person making the estimate. (Bernoulli, 1954, p.23)

Bernoulli identified the value of the consequences of a choice as being different from the objective economic outcome, commonly referred to as the idea of diminishing marginal utility. Bernoulli's second idea is that a person's valuation of a risky prospect is not the expected return of that prospect, but rather the prospect's expected utility,

E(u | p, X) = Σ_{x ∈ X} p(x)·u(x)

where X is the set of possible outcomes, p(x) is the probability of a particular outcome x ∈ X, and u: X → R is a utility function over the outcomes X on the real numbers. Thus, expected utility is the mathematically expected value when subjective utility is taken into account. In the St. Petersburg Paradox, the value of the game is finite due to the principle of diminishing marginal utility. Originally Bernoulli employed a logarithmic utility function, u(x) = log x, where the scaling parameter is dependent on the gambler's wealth prior to the gamble itself and x is the outcome. Substituting this utility for x in (emv) yields a finite number. Consequently, people would only be willing to pay a finite amount of money to participate, even though the expected monetary value of the game is infinite.
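The contrast can be checked numerically with the small sketch below: the truncated expected monetary value of the game grows without bound as more rounds are included, whereas the expected utility under a plain logarithmic utility u(x) = log x (ignoring the wealth parameter of Bernoulli's original formulation) converges.

```python
import math

def truncated_emv(rounds):
    # Expected monetary value, summing only the first `rounds` possible outcomes.
    return sum((0.5 ** n) * (2 ** n) for n in range(1, rounds + 1))

def truncated_expected_log_utility(rounds):
    # Expected utility with u(x) = log(x), truncated at `rounds` outcomes.
    return sum((0.5 ** n) * math.log(2 ** n) for n in range(1, rounds + 1))

for rounds in (10, 20, 40):
    print(rounds, truncated_emv(rounds), round(truncated_expected_log_utility(rounds), 4))
# The EMV column grows linearly with the truncation point (it diverges),
# while the expected log-utility approaches 2*ln 2 ≈ 1.3863 (it converges).
```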

1.1 Decision Analysis


Decision analysis is often regarded as a conjunction of subjective probability and subjective utility. Frank P. Ramsey (1903-1930) suggested a theory that integrated these areas in his Truth and Probability (1926). In this paper, Ramsey informally presented a general set of axioms for preference comparisons between acts with uncertain outcomes. From this set of axioms, he could justify a procedure to measure a person's degree of belief from preferences between acts of certain forms. Preceding Ramsey's work, the concept of degree of belief as an approach to subjective probability had been introduced by John Maynard Keynes (1883-1946) in his A Treatise on Probability (1921). Subjective probability, as opposed to objective probability, means that the different values reflect the decision-maker's actual beliefs; thus they are a measure of the degree of belief in a statement. These beliefs are not necessarily logical or rational; rather, they should be interpreted in terms of the willingness to act in a certain way.
[Under uncertainty] there is no scientific basis on which to form any calculable probability whatever. We simply do not know. Nevertheless, the necessity for action and for decision compels us as practical men to do our best to overlook this awkward fact and to behave exactly as we should if we had behind us a good Benthamite calculation of a series of prospective advantages and disadvantages, each multiplied by its appropriate probability waiting to be summed. (Keynes, 1937)

In contrast, an objective or classical view on probabilities, as defined by Laplace, says that probabilities are exogenously given by nature. In Probability, Statistics and Truth (1928), Richard von Mises (1883-1953) introduced the relative frequency view, which argues that the probability of a specific event in a particular trial is the relative frequency of occurrence of that event in an infinite sequence of similar trials. The modern and formal approach to game theory is attributed to John von Neumann (1903-1957), who in Zur Theorie der Gesellschaftsspiele (1928) laid the foundation for a theory of games and conflicting interests. Later he wrote, together with Oskar Morgenstern (1902-1976), the important book Theory of Games and Economic Behaviour (1947), in which they introduced a considerable number of important elements, such as the axiomatization of utility theory per se and a formalization of the expected utility hypothesis. This axiomatization is sometimes deemed reasonable for a rational decision-maker, and it is demonstrated that a decision-maker who acts in accordance with the axioms is obliged to prefer the alternative with the highest expected utility in order to act rationally. Of further importance, through this work von Neumann and Morgenstern bridged the gap between the mathematics of rationality and social science. However, von Neumann and Morgenstern did not take subjective probability into account, since they regarded probability in an objective sense and thus the decision-maker could not influence the probabilities.
Leonard J. Savage (1917-1971) combined the ideas of Ramsey with those of von Neumann and Morgenstern in The Theory of Statistical Decision (1951). Savage here gives a thorough treatment of a complete theory of subjective expected utility and associated utility functions. In Statistical Decision Functions (1950), Abraham Wald (1902-1950) makes use of loss functions and an expected loss criterion, as opposed to utility functions and the expected utility criterion. Loss functions and expected loss criteria later became standard basic elements in what is commonly referred to as Bayesian or statistical decision theory. The name Bayesian derives from the fact that this theory utilizes prior information and non-experimental sources of information. However, in the general case it is easy to adjust Wald's statistical decision theory to include utilities (cf. Savage, 1972, p.159). Further, Wald had an objective view on probabilities. His concern focused on characterizing admissible acts and strategies for experimentation, where an act or strategy is admissible if no other act is better. Hence, Wald's decision analysis could result in a family of admissible strategies, i.e., the non-dominated set of strategies. In more recent literature, many modern characterizations of decision theory and decision analysis have been suggested. Simon French, Ralph Keeney, Michael D. Resnik, and Peter Gärdenfors and Nils-Eric Sahlin, respectively, have given their, more or less technical, views on the area as follows:
Decision analysis is the term used to refer to the careful deliberation that precedes a decision. More particularly it refers to the quantitative aspects of that deliberation. (French, 1988, p.27)

A philosophy, articulated by a set of logical axioms, and a methodology and collection of systematic procedures, based upon those axioms, for responsibly analyzing the complexities inherent in decision problems. (Keeney, 1982, p.806)

Decision theory is the product of the joint efforts of economists, mathematicians, philosophers, social scientists, and statisticians toward making sense of how individuals and groups make or should make decisions. (Resnik, 1987, p.3)

The main aims of a decision theory are, first, to provide models for how we handle our wants and our beliefs and, second, to account for how they combine into rational decisions. (Gärdenfors and Sahlin, 1988, p.1)

Solving decision problems computationally is usually categorized as belonging to the area of optimization, and in particular linear optimization subject to linear constraints. Typically, such questions are of the form: what is the maximum/minimum value of this variable subject to these constraints? When discussing decision problems, such constraints typically include economic, time, or personnel aspects. The use of formal methods and mathematics for evaluating possible strategies had an important upswing during the Second World War, and following the war the terms operations analysis and operations research became closely related to decision analysis and optimization techniques. Later, the originally military area of operational research came to be studied together with topics such as management science, industrial engineering, and mathematical programming. At the present time, the widespread use of computers and the rise of the graphical user interface have made it possible to bring decision analytic techniques to a wider group of users. The growth of operational research since its beginnings is, to a large extent, the result of the increasing computational power and widespread availability of desktop computers. Finally, due to the well-foundedness of decision theory, research in artificial intelligence has merged classical theories of decision making with other techniques for handling uncertainty into a sub-field of artificial intelligence commonly referred to as uncertain reasoning.

1.2 Perspectives on Decision Theory


Decision theory serves different purposes. Throughout the 20th century, it has evolved into a widespread tool for economists, mainly for predicting how a population will react to changes in its environment (Friedman, 1953). From this perspective, the logical foundation of the theory is less important; what matters is the ability to predict the behaviour of decision-makers. When decision theory is used in such contexts, it is said to be descriptive, and thus we speak in terms of descriptive decision theory. The aim of a descriptive decision theory is to explain how decisions are being made and why human decision-makers choose to act in a certain way. A central result is the bounded rationality theorem, which states that due to limitations in the processing of information, people cannot act entirely rationally (Simon, 1955), (March and Simon, 1958). Further, depending on how information is presented, people tend to choose differently even though, according to the theory of expected utility, the alternatives are the same. This behaviour is referred to as the framing process in the descriptive theory (Tversky and Kahneman, 1986).
Another violation of the expected utility hypothesis occurs when gains are replaced by losses in choosing between alternatives with uncertain outcomes; people tend to be less keen on risk taking when there are gains involved rather than losses (Markowitz, 1952). However, the perspective of main interest here is of the normative kind. The aim of normative decision theory is to recommend decision procedures and decision rules that imply rational decision making when followed. In this case, the logical foundations and the validity of the model do matter. The proponents of such models often argue for them by constructing axiom systems (like the one of Savage presented below) and then deducing decision rules, which induce a (normative) preference order on a set of alternatives. The area of decision tools is clearly derived from the normative kind of decision theory. According to Danielson (1997, p.2), this area contains approaches that deal with mechanizing the structuring and analysis of decision situations. A salient idea is to model the situations according to a normative model of rational behaviour. Presuming the decision-maker to be rational, the mechanical model can devise suitable courses of action given the supplied information. A decision analytic tool then handles a smaller number of alternative courses of action and supports the evaluation and selection of those alternatives. Such a tool aids the human decision-maker in her search for a preference order over a set of alternatives and in her striving for rationality. Prescriptive decision theory is a more recent perspective. The prescriptive theory focuses on identifying the discrepancies between how decisions are made (descriptive) and how the normative theory suggests they should be made (Riabacke, 2002). One purpose of the theory is to bridge the gap between decision analysis and actual decision making.

13 Probability Theory

Something that is not certain is a matter of some uncertainty. When a decision-maker has to act in situations where uncertainty prevails, and this uncertainty can be quantified in terms of a probability measure, the decision is said to be made under risk. In Bayesian decision theory, probabilities are used to capture and model beliefs. Thus, they are considered to be measures of degrees of belief. Needless to say, performing statistical investigations to obtain these degrees of belief is recommended, but in many real-life situations historical data is not available and the probability assessment has to be made on more subjective grounds. Although theories of probability can be traced back to the 16th century, the foundations of modern probability theory were laid by Andrey Nikolaevich Kolmogorov (1902-1987). Kolmogorov rigorously constructed a probability theory from fundamental axioms, defined conditional expectation, and laid the foundations of Markov random processes in Grundbegriffe der Wahrscheinlichkeitsrechnung (1933) and in Analytic Methods in Probability Theory (1938). The basic formulas of probability calculus usually take the form P(A) = pA, read as "the probability of the uncertain event A is pA", where pA ∈ [0,1] is a real number. For example, A can be the statement "it will not rain on your next birthday and you will receive at least ten gifts". Every event is a subset of a sample space Ω, supposed to capture every possible event. The Kolmogorov axioms are usually stated as follows:

1. 0 ≤ P(A) ≤ 1, for all events A
2. P(Ω) = 1
3. If A and B are mutually exclusive events, then P(A ∪ B) = P(A) + P(B), and P(A ∩ B) = 0.

The second axiom can be interpreted as saying that it is certain that one of the events in the sample space will be the true outcome, i.e., a condition of exhaustiveness. Conditional probability arises when additional information is obtained, and is written P(A | B), which can be interpreted as the probability of A given B. Thus, the decision-maker knows that B is true, and this may have an impact on the probability of A. For example, in medical applications a test yields a positive result, which in turn implies some probability of an actual disease.

Definition: Conditional probability: P(A | B) = P(A ∩ B) / P(B).

Definition: Independence: Events A with outcomes {A1, ..., An} and B with outcomes {B1, ..., Bm} are independent if and only if P(Ai | Bj) = P(Ai) for all Ai and Bj.

Definition: Conditional independence: Events A and B are conditionally independent given event C if and only if P(Ai | Bj, Ck) = P(Ai | Ck).

Theorem: Bayes' theorem: P(B | A) = P(A | B)P(B) / (P(A | B)P(B) + P(A | ¬B)P(¬B)), where ¬B denotes not B.

It follows from these definitions that two mutually exclusive events (with nonzero probabilities) cannot be independent. The set of probabilities associated with all possible outcomes is a probability distribution. When the sample space consists of a discrete set of outcomes, the probability distribution on it is discrete.
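As a small illustration (the numbers are hypothetical and not taken from the manual), the following Python sketch applies Bayes' theorem to a medical test with a prevalence of 1%, a sensitivity of 95%, and a false-positive rate of 5%.

    # Bayes' theorem: P(B | A) = P(A | B)P(B) / (P(A | B)P(B) + P(A | not B)P(not B))
    # Here B = "patient has the disease" and A = "the test is positive".

    def bayes(p_b, p_a_given_b, p_a_given_not_b):
        numerator = p_a_given_b * p_b
        return numerator / (numerator + p_a_given_not_b * (1.0 - p_b))

    posterior = bayes(p_b=0.01, p_a_given_b=0.95, p_a_given_not_b=0.05)
    print(f"P(disease | positive test) = {posterior:.3f}")   # about 0.161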

14 Utility Theory

The term utility can be regarded as a measure of some degree of satisfaction, and a utility function is a mapping from outcomes, i.e., losses or gains, to real numbers representing this degree of satisfaction. The logarithmic utility function defined by Bernoulli was in itself considered adequate for almost two hundred years. However, Karl Menger (1902-1985) showed in his Das Unsicherheitsmoment in der Wertlehre (1934) that the Bernoulli function was heuristic and ad hoc, since the function was unsatisfactory already on formal grounds. Menger showed the existence of a game, related to the game presented in the St. Petersburg Paradox, in which the subjective expectation of the gambler on the basis of this value function is infinite when additions to a fortune are evaluated by any unbounded function (Menger, 1934, p.264). The implication of this is that it is always possible to construct a paradox, in the important respects equivalent to the St. Petersburg Paradox, which cannot be resolved only through the idea of diminishing marginal utility. Menger also showed the inadequacy of mathematical utility functions of the type suggested by Bernoulli's contemporary Gabriel Cramer (1704-1752). Consequently, we have to elaborate a bit more on utility theory. Before we continue, we first present some notation:

- a >p b means that the decision-maker holds alternative a to be strictly preferred to alternative b. This binary relation is transitive and asymmetric; thus it is a strict order.
- a ≥p b means that the decision-maker holds alternative a to be at least as good as alternative b, i.e., a is weakly preferred to b. This binary relation is complete and transitive; thus it is a weak order.
- a ~p b means that the decision-maker is indifferent between alternative a and alternative b. This binary relation is reflexive, transitive, and symmetric; thus it is an equivalence relation.

If the decision-maker can assign a number u(a) such that u(a) ≥ u(b) if and only if a ≥p b, then it is said that there exists a utility function over a and b. Utility functions are defined on an interval scale, i.e., they are unique up to a positive affine transformation; such transformations are the only admissible transformations of utility functions. In formal terms:
Let U be a utility function on a set C of consequences; then there exist α > 0 and β such that W(x) = αU(x) + β is a utility function representing the same preferences, i.e., two different interval scales count as equivalent if and only if they can be obtained from each other by means of positive affine transformations. Unlike ratio scales, interval scales do not have an absolute zero (e.g., zero length); nor do they represent the ratio of some measured entity to some standard unit of measurement (e.g., meters or seconds). Thus, on an interval scale the difference between two values has a meaning, while the ratio between two values does not. In general, people are willing to pay more money for what they consider to be more desirable. In this respect a monetary scale can at least be expected to be an ordinal scale, i.e., a scale measuring the preference ordering without the possibility of stating, e.g., magnitudes of desires. For a majority of business decisions, the use of monetary scales is considered a reasonable and acceptable measure of utility. However, it is not uncommon that monetary values are used to scale non-monetary outcomes, such as public health and environmental damage. In many cases, this practice is due to a lack of means and usable tools for representing and evaluating intangibles and vague valuations. It is particularly troublesome when aggregating ordinal information and can be severely misleading. Ordinal scales are described further below.
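The following minimal Python sketch, with arbitrary illustration numbers, shows why only positive affine transformations are admissible: such a transformation leaves both the ordering of the consequences and the ordering of expected utilities over lotteries unchanged.

    # W(x) = a*U(x) + b with a > 0 represents the same preferences as U.
    consequences = {"c1": 0.2, "c2": 0.5, "c3": 0.9}               # U(c), arbitrary values
    a, b = 3.0, 7.0
    transformed = {c: a * u + b for c, u in consequences.items()}  # W(c)

    lottery_1 = {"c1": 0.5, "c3": 0.5}   # probabilities over consequences
    lottery_2 = {"c2": 1.0}

    def expected(utilities, lottery):
        return sum(p * utilities[c] for c, p in lottery.items())

    for utilities in (consequences, transformed):
        e1, e2 = expected(utilities, lottery_1), expected(utilities, lottery_2)
        print(f"EU(lottery_1)={e1:.2f}  EU(lottery_2)={e2:.2f}  lottery_1 preferred: {e1 > e2}")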

15 Decision Modelling

A world can be modelled as having different possible future states, and in many situations it is beyond the capabilities of the decision-maker to tell in advance which state will be the true state. In this world, the decision-maker is an entity facing a choice between a set of alternatives. Every alternative in turn has a set of consequences connected to the states via the alternatives, i.e., given an alternative and a state there is a consequence of the performed alternative. The concern of the decision-maker is to choose the best alternative given the sets of consequences and states. Given this, there are at least four basic types of difficulties:

- How should the decision-maker compare the alternatives with respect to the different objectives of the decision?2
- How should the decision-maker compare the alternatives for each objective?
- How should the decision-maker estimate the probabilities that the given states occur, given that a certain act is performed?
- How should the decision-maker estimate the different values of the consequences?

If not considering multiple objectives, a decision table such as the one in Figure 4-1 is a frequently used representation of a decision problem.

          s1     s2     ...    sn
  a1      c11    c12    ...    c1n
  a2      c21    c22    ...    c2n
  ...     ...    ...    ...    ...
  am      cm1    cm2    ...    cmn

Figure 4-1: A decision table.

2Typical perspectives can be environmental, financial, security, etc. Such concerns will be further demonstrated

in section 5.

The possible states (s1,...,sn) describe a set of mutually exclusive (disjoint) and complete descriptions of the world, not leaving any relevant state out. These determine the consequences (such as cij) of the different alternatives (a1,...,am). The true state is the state that does in fact occur. Thus, if the decision-maker selects the alternative a2, and s3 turns out to be the true state, consequence c23 will occur. An immediate question is which world to use as an adequate frame, and how well this world serves as a description of the actual world the decision-maker perceives. Depending on the purpose of the model, this world has to be large enough. Building such a world, with all its relevant state descriptions, usually requires a thorough investigation and analysis. Luce and Raiffa (1957, p.13) provided a useful classification of decision situations, pointing out that an important factor in every decision problem is the decision-maker's knowledge and beliefs about the situation. They distinguish between the following three types of decision situations:

- Decisions under certainty
- Decisions under strict uncertainty
- Decisions under risk

15.1 Decisions under Certainty


In decisions under certainty, the decision-maker knows the true state before she performs an act, or can predict the consequences with certainty. This means that n = 1 in the decision table for this kind of decision (cf. Figure 4-1). Thus, in this case, it is reasonable to demand of a rational decision-maker that she choose the alternative whose one and only consequence has a value not less than the value of any other alternative. The value of a consequence may be expressed by an ordinal value function defined on an ordinal scale.

Definition: Given a set of consequences P and a relation ≥p denoting the decision-maker's preferences over P, an ordinal value function v(x), representing these preferences, is a real-valued function with domain P such that v(ci) ≥ v(cj) iff ci ≥p cj.

When the set P of consequences is finite, and a reasonable ordering relation is defined, a numerical order preserving function v(x) can be constructed.


In decisions under certainty, such a function is all that is needed, since it is enough in this context to treat only the cases involving a finite number of consequences.3 Because an ordinal value function can always be constructed, it makes sense to talk about the value of a consequence. This also holds when P is an arbitrary set of objects that a decision-maker can have preferences over.

15.2 Decisions under Strict Uncertainty


In decisions under strict uncertainty, the decision-maker cannot quantify her uncertainty in any way, thus no probability estimations are possible or they are meaningless. Milnor (1954) provides an exposition of four proposals by four different authors:

- The Principle of Insufficient Reason (Laplace, 1825)
- The Maximin Principle (Wald, 1950)
- The Pessimism-optimism Index (Hurwicz, 1951)
- The Minimax-Regret Principle (Savage, 1951)

15.2.1 Laplace

The decision rule of Laplace is based on the assumption that if the probabilities of the different states are completely unknown, then they can be assumed to be equal. This idea is commonly referred to as the principle of insufficient reason. Choose the alternative ak such that the average value of the possible outcomes of this alternative is maximized, i.e., choose ak such that (Σ_{j=1}^{n} v_kj)/n = max_i (Σ_{j=1}^{n} v_ij)/n, where vij denotes the value of cij.

15.2.2 Wald

Wald's rule can be expressed as follows:

1. Set a security level by choosing an index pi = min{vij : j = 1,...,n}.
2. Choose ak such that its index pk = max{pi}.

As can be seen, Wald's view on strict uncertainty was not an optimistic one, since according to Wald you should always choose the alternative that gives the best result if the worst possible outcome occurs for each alternative.

3Uncountable sets are treated in (Debreu, 1952) (which demands that you are comfortable with topological

arguments) as well as in (Krantz, 1971), Chapter 4. The corresponding result for countable sets can be found in (French, 1988), p.98, together with a simple induction argument.


Hence the name maximin utility criterion, which originated in Wald's work within game theory.

15.2.3 Hurwicz

In contrast to Wald's rule, the rule of Hurwicz takes a less pessimistic approach. Hurwicz recommends a mixture of an optimistic and a pessimistic attitude:

1. Select a constant α ∈ [0,1] as the pessimism-optimism index.
2. Let oi = max{vij : j = 1,...,n} and pi = min{vij : j = 1,...,n}.
3. Choose ak such that αpk + (1 - α)ok = max{αpi + (1 - α)oi}.

Note that if α = 1 this is again the maximin utility criterion, whereas if α = 0 it is the so-called maximax utility criterion. Different ways of choosing appropriate pessimism-optimism indices have been presented, but we will not enter into that discussion here.

15.2.4 Savage

In Savage's own words: "[...] the minimax rule recommends the choice of such an act that the greatest loss that can possibly accrue to it shall be as small as possible." (Savage, 1972, p.164). Informally speaking, the decision-maker should choose the alternative giving the smallest possible regret.

1. Let rij = max{vsj : s = 1,...,m} - vij.
2. Let pi = max{rij : j = 1,...,n}.
3. Choose ak such that pk = min{pi}.

This minimax risk criterion was first suggested as an improvement over Wald's maximin utility criterion. Figure 4-2 shows Milnor's example (Milnor, 1954, p.50) of a decision problem where all of the above criteria give different results.

          s1    s2    s3    s4
  a1       2     2     0     1    Laplace
  a2       1     1     1     1    Wald
  a3       0     4     0     0    Hurwicz (1 - α > 1/4)
  a4       1     3     0     0    Savage

Figure 4-2: Milnor's example.

The question remains: to act rationally, which one of the above rules should be employed?


Milnor proved that no decision criterion is compatible with the ten seemingly reasonable axioms that constitute his test set (Milnor, 1954, p.53). As it turns out, it is relatively easy to show that it is impossible to find a decision rule that fulfils all desirable properties. Further, Ackoff (1962) argues that any concept of strict uncertainty is inappropriate, i.e., strict uncertainty implies that there is always some information or some beliefs being disregarded. In DecideIT, it is nevertheless possible, but not recommended, to employ the decision rules suggested by Laplace, Wald, and Hurwicz if they are felt to be appropriate in certain situations.
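For illustration only (this is not DecideIT functionality), the following Python sketch applies the four rules to Milnor's example from Figure 4-2 and reproduces the result that each rule recommends a different alternative.

    # Milnor's example: rows are alternatives a1..a4, columns are states s1..s4.
    values = [
        [2, 2, 0, 1],   # a1
        [1, 1, 1, 1],   # a2
        [0, 4, 0, 0],   # a3
        [1, 3, 0, 0],   # a4
    ]

    def argmax(scores):
        return max(range(len(scores)), key=lambda i: scores[i])

    def laplace(v):
        return argmax([sum(row) / len(row) for row in v])

    def wald(v):
        return argmax([min(row) for row in v])

    def hurwicz(v, alpha=0.5):
        # alpha is the pessimism index; here 1 - alpha = 0.5 > 1/4.
        return argmax([alpha * min(row) + (1 - alpha) * max(row) for row in v])

    def savage(v):
        col_max = [max(row[j] for row in v) for j in range(len(v[0]))]
        regrets = [[col_max[j] - row[j] for j in range(len(row))] for row in v]
        return min(range(len(v)), key=lambda i: max(regrets[i]))

    for name, rule in [("Laplace", laplace), ("Wald", wald), ("Hurwicz", hurwicz), ("Savage", savage)]:
        print(f"{name:8s} recommends a{rule(values) + 1}")
    # Output: Laplace -> a1, Wald -> a2, Hurwicz -> a3, Savage -> a4.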

15.3 Decisions under Risk: Bayesian Decision Analysis


When the decision-maker is able to quantify her beliefs in terms of a probability distribution on the set of possible outcomes given a chosen course of action, the decision is said to be made under risk. If all utilities and probabilities in a decision problem are subjectively assigned numerical values by the decision-maker, and the problem is then evaluated according to the principle of maximizing the expected utility, the decision-maker conforms to Bayesian decision analysis. This kind of decision problem is our main concern. However, as will be pointed out later, DecideIT does not require the decision-maker to characterize a single probability distribution or to provide precise utility estimates. The decision method is called Bayesian, named after the English clergyman Thomas Bayes, due to the use of subjective probability assignments and the common procedure of updating the probabilities by employing Bayes' theorem. In this respect, the probabilities are treated subjectively, in a statistical procedure that, in many cases, updates a prior probability distribution with observed information into a posterior distribution. Suppose that each alternative a can be represented by a set of consequences and a set of numbers, {ci}, {pi}, where {ci} is the set of possible consequences of a, and pi is the probability that ci occurs given that a is implemented.4 The meaning of accepting the utility principle and the principle of maximizing the expected utility can then be formulated as follows (Malmnäs, 1990b):

4 Note here that probabilities are assigned to consequences instead of being assigned to states of the world.

These two models are fully compatible when considering only a finite number of states and consequences.


Definition: If a is {ci}, {pi}, and Va is a real-valued function on {ci}, then a has a value equal to Σi piVa(ci), denoted by EV(a).

Definition: A decision-maker accepts the utility principle if and only if she assigns the value Σi piVa(ci) to a, given that she has assigned the value Va(ci) to ci.

Definition: An ordering ≥p of the alternatives is compatible with the principle of maximizing the expected utility if and only if a ≥p b implies EV(a) ≥ EV(b).

Definition: A decision-maker accepts the principle of maximizing the expected utility if and only if her ordering of the values of the alternatives is compatible with that principle.
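A minimal Python sketch of the principle, using hypothetical alternatives, probabilities, and values:

    # Each alternative is a list of (probability, value of consequence) pairs.
    alternatives = {
        "a1": [(0.6, 100.0), (0.4, -20.0)],
        "a2": [(0.9, 40.0), (0.1, 10.0)],
    }

    def expected_value(consequences):
        return sum(p * v for p, v in consequences)

    ranking = sorted(alternatives, key=lambda a: expected_value(alternatives[a]), reverse=True)
    for a in ranking:
        print(f"EV({a}) = {expected_value(alternatives[a]):.1f}")
    print("recommended:", ranking[0])   # a1, since EV(a1) = 52 > EV(a2) = 37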

15.4 Assumptions and Axioms in Utility Theory


Utility theory was, taking Menger's results into account, obviously not a well-founded subject until the late 1930s, when the works of Ramsey and of von Neumann and Morgenstern appeared. They proposed reasonable principles governing decisions, from which they constructed a theory in which a set of axioms was formulated, whose purpose was to justify their particular attitude towards the utility principle.5

15.4.1 Axiom Systems

The idea is to define, in a systematic way, the meaning of rationality. The point is that if a decision rule can be deduced from an indisputable axiomatization, then this rule should be the natural and obvious rule for a rational entity, provided that the necessary information is available. Føllesdal (1984, p.268) suggests the following conditions for a decision rule:

- A decision rule should recommend an alternative with valuable consequences before an alternative with less valuable consequences.
- A decision rule should recommend an alternative with high probability of valuable consequences before an alternative with low probability of valuable consequences.

5Cf., e.g., (Savage, 1972), (Herstein, 1953), (Suppes, 1956), (Luce, 1971), and (Jeffrey, 1983). Surveys over a

wide variety of axiomatizations are given in (Fishburn, 1981) and (Malmnäs, 1990b).


- A decision rule should recommend an alternative with low probability of bad consequences before an alternative with high probability of bad consequences.

This seems reasonable, but it is too vague to fill the needs of a normative decision theory and has to be elaborated a bit. For this purpose, we introduce the technique of axiomatization. The axiom systems that will be presented consist of primitives, and of axioms constructed from the primitives. Typical primitives include states, sets of states, and ordering relations such as ≥p. The axioms then imply a numerical representation of probabilities and preferences, i.e., the axioms imply the existence of a probability distribution and a utility function. Although Ramsey (1931) and von Neumann and Morgenstern (1947) are credited with the axiomatic foundation of utility theory, we present the axiom system of Luce and Raiffa (1957), which is very similar to the aforementioned, and later the axiomatic justification of the utility principle according to Savage (1972). At a first glance, the two systems seem dissimilar, but the important implications boil down to the same central results. Starting with Luce and Raiffa, alternatives (or gambles) with uncertain outcomes are called lotteries. An alternative is denoted ⟨p1v1, ..., pivi, ..., prvr⟩, which can be considered as a lottery with the probability pi for the outcome vi. All the probabilities are supposed to sum to one. For example, the alternative a with uncertain outcomes v1 and v2, associated with probabilities p1 and (1 - p1) respectively, is represented as the lottery a = ⟨p1v1, (1 - p1)v2⟩.

Axiom 1: Ordering of alternatives and transitivity: For any two alternatives a and b, either a ≥p b or b ≥p a, and if a ≥p b and b ≥p c then a ≥p c.

Axiom 2: Reduction of compound lotteries: Any compound lottery6 is indifferent to a simple lottery with v1, v2, ..., vr as prizes, in which the probabilities for the prizes in the simple lottery are computed according to ordinary probability calculus.

Axiom 3: Continuity: Each prize vi is indifferent to some lottery involving just v1 and vr. Thus, there exists some number (or probability) pi ∈ [0,1] such that vi ~p ⟨piv1, 0v2, ..., 0v(r-1), (1 - pi)vr⟩.

6 A compound lottery may be thought of as a mixture of lotteries, i.e., the prize of a lottery consists of another

lottery instead of a certain reward.


Axiom 4: Substitutability (independence of irrelevant alternatives): In any lottery L, v′i is substitutable for vi; that is, ⟨p1v1, ..., pivi, ..., prvr⟩ ~p ⟨p1v1, ..., piv′i, ..., prvr⟩ when vi ~p v′i.

Axiom 5: Monotonicity: ⟨piv1, (1 - pi)vr⟩ ≥p ⟨p′iv1, (1 - p′i)vr⟩ if and only if pi ≥ p′i.

Note that nothing is explicitly said about the origin of the probability distributions; they are just assumed to exist, and thus the view on probabilities is of the objective kind. From these axioms, the principle of maximizing the expected utility, as well as some other important results in utility theory, are readily derived. Shifting our attention to the system of Savage, he argues7 that if utility is regarded as affecting only consequences (rather than acts), then for a weakly ordered consequence set C the following is valid: v1(x) and v2(x) are numerical order preserving functions representing the ordering relation between the consequences if and only if there is a strictly increasing function r such that, for every ci ∈ C, v1(ci) = r(v2(ci)). This shows that v(ci) is just an ordinal scale: it cannot be interpreted as quantitatively measuring the strength of preferences in any meaningful way.
The probability-less idea of utility of economics has been completely discredited in the eyes of almost all economists, the following argument against it [...] being widely accepted. If utility is regarded as controlling only consequences, rather than acts, it is not true as it is when acts, or at least gambles, are considered and the formal definition in 3,8 is applied that utility is determined except for a linear transformation. Indeed, confining attention to consequences, any strictly monotonically increasing function of one utility is another utility. Under these circumstances there is little, if any, value in talking about utility at all [...] In particular, utility as a function of wealth can have any shape whatsoever in the probability-less context, provided only that the function in question is increasing with increasing wealth, the provision following from the casual observation that almost nobody throws money away. (Savage, 1972, p.96).

7 Savage adopted this argument from Vilfredo Pareto (1848-1923).
8 (Savage, 1972, p.73).


The primitives building up the axiom system of Savage9 differ slightly from those of Luce and Raiffa. Savage proposes the following primitives: (i) the binary preference relation ≥p, (ii) a set S = {s1, s2, ...} of states, (iii) a set C = {c1, c2, ...} of consequences, and (iv) a set F = {f: S → C} of all possible mappings from S to C, where such a mapping is called an act. Now, Savage defines E as the power set of S, where the elements of E are called events, denoted by A, B, C, ..., and further defines the following concepts:

1. For f, g, f′, g′ ∈ F and B, Bc ∈ E, f ≥p g given B if and only if f′ ≥p g′ for every f′ and g′ that agree with f and g respectively on B, and with each other on Bc, and also g′ ≥p f′ either for all such pairs or for no such pair (where Bc is the complement of B).
2. ci ≥p cj if and only if f ≥p f′ when f(s) = ci and f′(s) = cj, for all s ∈ S.
3. B is null (B = ∅) if and only if f ≥p g given B, for all f, g ∈ F.
4. A is not more probable than B (A ≼ B) if and only if fA ≤p fB or ci ~p cj, for every fA, fB, ci, cj such that fA(s) = ci for s ∈ A, fA(s) = cj for s ∈ Ac, fB(s) = ci for s ∈ B, and fB(s) = cj for s ∈ Bc.
5. f ≥p ci given B (ci ≥p f given B) if and only if f ≥p h given B (h ≥p f given B), when h(s) = ci for all s ∈ S.

To clarify some of these concepts: in the first concept, when act f′ agrees with act f on B, performing f′ yields the same consequence as performing f given the event (set of states) B, thus f′(s) = f(s) for all s ∈ B. The third concept says that if weak preference holds regardless of which pair of acts is compared given the event B, implying that all acts are indifferent given B, then B is an empty set of states (and vice versa). Further, looking at the fourth concept, the act fA yields the consequence ci exactly when A occurs, and fB yields it exactly when B occurs; if a decision-maker prefers fB to fA, this means that she holds event B to be more probable than event A (and vice versa).

9 We adopt the notations of (Malmnäs, 1990) and (Ekenberg, 1994).


Axiom 2: Completeness: For every f, g, and B, f ≥p g or g ≥p f given B.

Axiom 3: Resolution independence: If f(s) = ci and f′(s) = cj for every s ∈ B, with B not null, then f ≥p f′ given B if and only if ci ≥p cj.

Axiom 4: Qualitative probability: For every A, B ∈ E, A ≼ B or B ≼ A.

Axiom 5: Minimal strict preference: It is false that ci ~p cj for every ci, cj.

Axiom 6: Continuity: Suppose h >p g; then for every ci there is a finite partition {Bi} of S such that, if g′ = ci on Bi (and agrees with g elsewhere) and h′ = ci on Bi (and agrees with h elsewhere), for some i, then h >p g′ or h′ >p g.

Axiom 7: Dominance: If f ≥p g(s) given B (g(s) ≥p f given B) for every s ∈ B, then f ≥p g given B (g ≥p f given B).

The second axiom says that when two acts have the same consequences, the relation between f and f′ must be independent of the states. Furthermore, the third axiom says that knowledge of an event cannot overturn any preference between two consequences. Together, axioms 2 and 3 constitute Savage's debated sure-thing principle. Informally, if a decision-maker does not prefer f to g either knowing that the event B obtained, or knowing that B did not obtain, then the decision-maker does not prefer f to g (Savage, 1972, p.21). Further, from axiom 3 we can deduce that preferences between acts depend only on realized consequences, and not possible ones. The fourth axiom says that ≼ is a qualitative probability; thus ≼ is a weak order, and B ≼ C if and only if (B ∪ D) ≼ (C ∪ D) when (B ∩ D) = (C ∩ D) = ∅. Furthermore, ∅ ≼ B and ∅ ≺ S (all events are at least as probable as the impossible event, and the universal event S must not be regarded as impossible). Axiom 5 says that there is at least one pair of consequences such that one is strictly preferred to the other, and axiom 6 implies the existence of a unique probability measure P on E. This probability measure is consistent with the qualitative probability in that E is not more probable than E′ if and only if P(E) ≤ P(E′). The last axiom says that if f ≥p g(s) for every consequence g(s) of g on a set of states B, then f ≥p g if one of those states occurs; of further importance, this axiom implies that the utility function is bounded (nothing is infinitely bad or infinitely good). Given these assumptions, Savage proved the existence of a real-valued utility function u on C with the following property: Let {Li} be a partition of S and let f be an act with consequences {f(si)} on {Li}, and let {L′i} be another partition of S and let g be an act with consequences {g(s′i)} on {L′i}.


Then f ≥p g if and only if Σ piu(f(si)) ≥ Σ qiu(g(s′i)), where pi = P(Li) and qi = P(L′i), i.e., the principle of maximizing the expected utility. Looking back at the system of Luce and Raiffa, it was proved by von Neumann and Morgenstern (1947) that if a decision-maker has preferences between lotteries, i.e., given that the assumptions in the axiom system are fulfilled, then there is a real-valued utility function, unique up to a positive affine transformation, on the set of lotteries. Furthermore, let Lc = {L1, L2, ...} be a set of lotteries on C (alternatives with uncertain outcomes in the consequence set C); they then showed that the utility function u: Lc → R has the representation u(Li) = Σ pi(ci)u(ci), and Li ≥p Lj if and only if u(Li) ≥ u(Lj). Thus, both axiom systems serve as attempts at a formal justification of the utility principle and the principle of maximizing the expected utility. Due to the subjective vein in the approach of Savage, his theory is often referred to as subjective expected utility (SEU).

15.4.2 Some Criticism of Utility Theory

The assumptions in both systems may seem reasonable at first glance, but they have been subject to severe controversy. Human decision-makers tend to, under given circumstances, behave inconsistently with the utility principle. Famous so-called paradoxes include Allais' paradox and Ellsberg's paradox. Allais' paradox shows that people tend to act inconsistently with the sure-thing principle. This paradox derives from a common human behaviour of preferring a good outcome for certain to having a chance between something not as good and something even better. Ellsberg's paradox is quite similar, in that it shows people's tendency to prefer known risks to unknown uncertainties, thereby violating the utility principle. Paradoxes of these kinds are often resolved by arguing that even intelligent beings make mistakes, and that after some explanation of the inconsistency in their choices, they change their minds. However, an empirical study by Slovic (1974) has, for instance, shown that as many as about 30% of subjects refuse to change their opinion and conform to the utility principle. Tversky (1981) tries to answer why this is the case, and his conclusion is that irrelevant contextual effects often influence people, making them act inconsistently with the utility principle, i.e., the framing process. Further, it can be argued that it is impossible for any normative theory of decision making to embrace all inherent peculiarities in a free world of heterogeneous decision-making inhabitants.


Furthermore, and independently of this, in real-life decision making the requirements of precise probability and utility estimates are often too strong, thus making utility theory in this form inapplicable.

15.4.3 Risk Attitudes

Defenders of classical Bayesian decision theory often argue that the concept of utility captures different risk attitudes. The assumption is that to each expected utility there corresponds a certainty equivalent, a monetary amount xce such that the decision-maker is indifferent between having this monetary value with certainty and performing an alternative with uncertain outcomes, i.e., u(xce) = Σi piu(xi), where u(xi) is the utility of gaining the monetary value xi. The risk premium, p, of an act is now defined as the demand that a decision-maker has for carrying out the act instead of having the monetary equivalent xce for certain, i.e., p = Σi pixi - xce. With respect to the risk premium p, a classification of decision-makers into three classes can be made: a decision-maker is risk averse if p > 0, risk prone if p < 0, and risk neutral if p = 0. As an example, assume that a decision-maker is in desperate need of a certain amount of money, and any lesser amount would not be useful. For instance, a person may need money for medical treatment of a disease that, if not cured, will result in death. If this person seizes the opportunity of entering a bet with her last funds that gives her a chance of winning an amount sufficient for the treatment to be affordable, she would probably not be labelled irrational. In this situation, the risk premium p is probably negative. However, some argue that it will never be possible to formalize the decision process with all reasonable risk attitudes by a utility function and an associated risk premium. Many critics emphasize that a majority of the mathematical models of decision analysis are oversimplified. Consider, e.g., the reasons for gambling. Most people would agree that there is a pleasure involved in the pure act of participating in a game with uncertain outcomes. If mathematical expectation were the only criterion for gambling, no games would ever be arranged by rational beings, since whenever the rules of the game made it rational for the gambler to bet, it would be irrational for the arranger to offer the bet. However, people still arrange and participate in games, although either the gambler or the bookmaker will be on the irrational side.


Furthermore, it has also been argued that humans tend to disregard very small probabilities, even in games with finite mathematical expectations (like nation-wide lotteries), and that, in the case of very high probabilities, a gambler is not willing to risk arbitrary amounts (Menger, 1934).
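A minimal Python sketch, assuming a logarithmic (and hence concave, risk-averse) utility over total wealth and hypothetical numbers, of how a certainty equivalent and a risk premium can be computed for a simple gamble:

    import math

    wealth = 1000.0
    gamble = [(0.5, 200.0), (0.5, -100.0)]   # (probability, monetary change)

    u, u_inv = math.log, math.exp

    expected_payoff = sum(p * x for p, x in gamble)
    expected_utility = sum(p * u(wealth + x) for p, x in gamble)
    certainty_equivalent = u_inv(expected_utility) - wealth   # sure change with the same utility
    risk_premium = expected_payoff - certainty_equivalent     # positive => risk averse

    print(f"expected payoff      : {expected_payoff:.2f}")        # 50.00
    print(f"certainty equivalent : {certainty_equivalent:.2f}")   # about 39.23
    print(f"risk premium         : {risk_premium:.2f}")           # about 10.77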

15.4.4 Security Thresholds

In many decision contexts, decision-makers wish to avoid strategies which involve some risk of ending up in a consequence that the decision-maker considers a catastrophe, or at least highly undesirable. Even if the probability of such an event is estimated to be extremely low, it is simply not a risk the decision-maker is willing to be exposed to. An insurance company serves as a pertinent example, since insurance companies probably find it irrational to let their clients insure themselves against nuclear war, meteorites, acts of terrorism, and similar catastrophes. Although the insurance company might find such events to be highly improbable, the occurrence of any such event would without doubt imply bankruptcy. With such concerns in mind, a decision theory should be sensitive to different risk attitudes and provide the decision-maker with means to express her risk attitudes in a number of different ways. As indicated above, the possibility of shaping a utility function is not a sufficient model alone in this respect. One way to express such attitudes is the ability to define security thresholds, together with procedures for handling the inevitable vagueness in the estimations of the probabilities and values that is often inherent in all decision modelling.
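As a rough sketch of the idea (the data and the screening rule below are hypothetical and do not describe DecideIT's implementation), alternatives whose probability of a catastrophic outcome exceeds an acceptable level can be ruled out before the remaining alternatives are compared:

    # Rule out alternatives that violate the security threshold, then compare the rest.
    alternatives = {
        "a1": [(0.98, 120.0), (0.02, -5000.0)],   # small chance of catastrophe
        "a2": [(1.00, 80.0)],
    }
    catastrophe_level = -1000.0
    acceptable_risk = 0.01

    def violates_threshold(consequences):
        return sum(p for p, v in consequences if v <= catastrophe_level) > acceptable_risk

    def expected_value(consequences):
        return sum(p * v for p, v in consequences)

    admissible = {a: c for a, c in alternatives.items() if not violates_threshold(c)}
    best = max(admissible, key=lambda a: expected_value(admissible[a]))
    print("admissible:", sorted(admissible), "-> recommended:", best)   # only a2 survives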


16 Multiple and Conflicting Objectives
In the previous section, the question was raised of how the decision-maker should compare the alternatives with respect to different types of objectives of the decision. Keeney and Raiffa (1976) present four illustrative examples of decision situations where the decision-maker cannot hide from the fact that there are multiple objectives in conflict with each other. One of the examples considers the choice of a site for a new airport near Mexico City, where the head of the Ministry of Public Works was obliged to balance objectives such as minimizing costs, maximizing the capacity of airport facilities, improving regional development, and minimizing access time for travellers. Such decision problems are the concern of multi-attribute utility theory (MAUT), or multi-criteria decision analysis. In MAUT, each objective is referred to as one attribute in the decision context, and the approach is to define one individual utility function for each attribute. These are then aggregated into a global utility function, in which weights express the relative importance of each attribute. Each consequence Ci may be thought of as a vector of achievement levels regarding the identified attributes; in the case of n attributes, the consequence Ci = (ci1, ci2, ..., cin). There is a vast body of literature on decision making with multiple objectives; some of it uses the terms criteria or perspective instead of attribute, but from the decision-maker's point of view we can use these terms interchangeably. A number of approaches to aggregating utility functions under a variety of attributes have been suggested, such as (Keeney and Raiffa, 1976), (Keeney, 1980), (Saaty, 1980), and (von Winterfeldt and Edwards, 1986). The most widely employed method is the additive utility function, sometimes referred to as the weighted sum. Some conditions must be fulfilled in order for the additive utility function to serve properly as an aggregated utility function. Firstly, the assumption of mutual preferential independence must hold, which states that when a subset of alternatives differs only on a subset Gi ⊆ G of the set of attributes G, the preferences between the alternatives must not depend on the common performance levels on G \ Gi. Secondly, the condition of additive independence must hold, meaning that changes in the uncertain outcomes (the probability distribution) in one attribute will not affect preferences for lotteries in other attributes.


The weights are restricted by a normalization constraint Σj wj = 1, wj ∈ [0,1], where wj denotes the weight of attribute Gj. A global utility function U using the additive utility function is then expressed as

    U(x) = Σ_{i=1}^{n} wi·ui(x),

where wi is the weight representing the relative importance of attribute i, ui: Xi → [0,1] is the increasing individual utility function for attribute Gi, and Xi is the state space for attribute Gi. It is assumed that the ui:s map to zero for the worst possible state regarding the i:th attribute, and map to one for the best. Another global utility function is the multiplicative utility function, introduced in (Keeney and Raiffa, 1976). The multiplicative model requires that every attribute must be mutually utility independent of all other attributes, saying that changes in sure levels of one attribute do not affect preferences for lotteries in the other attributes. In contrast to additive independence, the condition of utility independence allows the decision-maker to consider two attributes to be substitutes or complements of each other. In this respect, it is a weaker preference condition than additive independence. Generally, the global utility function is usually expressed as

    1 + K·U(x) = Π_{i=1}^{n} [K·ki·ui(x) + 1],

where ui: Xi → [0,1] is the increasing individual utility function for attribute Gi, and Xi is the state space for attribute Gi. As for the additive function, the ui:s map to zero for the worst possible state regarding the i:th attribute, and map to one for the best. The scaling constant K is the nonzero solution to

    1 + K = Π_{i=1}^{n} (1 + K·ki),

where the ki represent scaling constants, similar in their meaning to weights, but without the normalization requirement.
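A minimal Python sketch, with hypothetical weights, scaling constants, and individual utilities, of the two aggregation models; the scaling constant K of the multiplicative model is obtained numerically (by bisection) as the nonzero root of the equation above.

    u = [0.7, 0.4, 0.9]          # individual utilities u_i(x) in [0, 1]
    w = [0.5, 0.3, 0.2]          # additive weights, sum to 1
    k = [0.4, 0.3, 0.2]          # multiplicative scaling constants, sum < 1 here

    def additive(u, w):
        return sum(wi * ui for wi, ui in zip(w, u))

    def prod(xs):
        out = 1.0
        for x in xs:
            out *= x
        return out

    def solve_scaling_constant(k, lo=1e-9, hi=100.0, iters=200):
        # f(K) = prod(1 + K*k_i) - (1 + K); with sum(k) < 1 the nonzero root is positive.
        f = lambda K: prod(1 + K * ki for ki in k) - (1 + K)
        for _ in range(iters):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
        return (lo + hi) / 2

    def multiplicative(u, k, K):
        return (prod(K * ki * ui + 1 for ki, ui in zip(k, u)) - 1) / K

    K = solve_scaling_constant(k)
    print(f"additive U(x)       = {additive(u, w):.3f}")        # 0.650
    print(f"scaling constant K  = {K:.3f}")                     # about 0.372
    print(f"multiplicative U(x) = {multiplicative(u, k, K):.3f}")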


Other formal methods of decision evaluation under multiple objectives include the outranking approach (Roy, 1991), (Vincke, 1992), often referred to as the European/French school of decision aid. This approach is based on a search for outranking relations deduced from a set of binary preference relations. However, these approaches do not incorporate the modelling of uncertainty in the probabilistic sense, and thus do not capture the risk associated with different courses of action. Nevertheless, the approach has proved to be useful in a number of applications.


17 Elicitation Techniques

In any model for decision analysis, the input parameters matter. All input parameters must be elicited carefully, since they are supposed to reflect the attitudes and beliefs of the decision-maker. Consequently, much deliberation must go into the elicitation of these input parameters, and methods for such processes have been suggested by a number of authors.

17.1 Assessing Utilities


Many decision situations under risk in the context of business and investments can be based on expected monetary values. The value of each consequence is then represented as the monetary payoff estimated and/or calculated with respect to that consequence. When basing a decision on its expected utility, a systematic elicitation is helpful in order to create legitimate utility measures. As utilities are defined on an interval scale, a zero point and a unit may be arbitrarily defined. A common approach, direct assessment, is this: Let x- and x+ denote a least preferred10 and a most preferred consequence respectively, and define u(x-) = 0 and u(x+) = 1. Now, for each other consequence xi, there is a probability pi such that having xi for sure is considered equal to the alternative of ending up with x+ with probability pi and x- with probability (1 - pi). Because of the decision-maker's indifference, together with the continuity assumption, the utility of xi must then equal the expected utility of the reference alternative with uncertain outcomes. Thus u(xi) = piu(x+) + (1 - pi)u(x-). According to Keeney and Raiffa (1976), the procedure above is suitable for decision problems involving smaller consequence sets (fewer than 50 consequences), because the number of required consistency checks increases with the size of the set. For larger sets, a procedure for assessing a complete utility function is suggested. This is done by determining whether the utility function is monotonically increasing, determining the risk attitude, determining quantitative restrictions (fixing utilities at a few particular points of the utility function), and finally performing consistency checks.

10 There may be several least preferred as well as several most preferred consequences.


However, one drawback of assessing utilities in this way is the requirement for considerable introspective powers. For instance, when fixing the probabilities in the reference alternatives, the procedure results in a precise utility measure of a certain consequence. Although this consequence may be well defined in terms of which state of the modelled world it represents, the actual implications of this consequence may be very hard to grasp due to insufficient or imperfect information. Under such circumstances, a decision-maker may feel more comfortable defining utilities by vague or imprecise statements.
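A minimal Python sketch of this direct assessment, with hypothetical indifference probabilities elicited from a decision-maker:

    # u(x-) = 0 and u(x+) = 1; p_i is the elicited indifference probability for x_i.
    u_worst, u_best = 0.0, 1.0
    indifference_probabilities = {"x1": 0.25, "x2": 0.6, "x3": 0.85}

    utilities = {x: p * u_best + (1 - p) * u_worst   # u(x_i) = p_i*u(x+) + (1 - p_i)*u(x-)
                 for x, p in indifference_probabilities.items()}
    print(utilities)   # {'x1': 0.25, 'x2': 0.6, 'x3': 0.85}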

17.2 Assessing Probabilities


Whenever we go to the station we are betting that a train will really run, and if we had not a sufficient degree of belief in this we should decline the bet and stay at home. (Ramsey, 1931, p. 173)

In its simplest subjective form, assessing a probability is done by asking the question "What is my belief regarding the probability of A?" If the phenomenon is known to occur in a given number of trials, or an investigation regarding relative frequencies has been carried out, the decision-maker may feel confident when answering such a question. Another way of assessing probabilities is to investigate the decision-maker's attitude to placing bets concerning the uncertain event that is to be given a probability estimate. This technique assumes that the decision-maker has accepted the principle of maximizing the expected utility, since it is based on a search for indifference between two alternatives with uncertain outcomes, which then implies equality in the mathematical expectation. As an example, when assessing the probability of A, consider the following two bets:

1. Win x if A is true, lose y if A is false.
2. Win y if A is false, lose x if A is true.

When indifference holds, then P(A)x - P(¬A)y = P(¬A)y - P(A)x, and the subjective probability of A is P(A) = y/(x + y). However, the assessed probabilities must be consistent with the axioms of probability theory. If not, they must be modified until they are. Assessing probabilities in this way is commonly referred to as the reference lottery approach.


As in the case of utility assessment, the degree of introspection demanded of the decision-maker has been discussed, and a common understanding is that, in real-life decision problems, a decision-maker in many situations finds it too difficult to define precisely when indifference holds.
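A minimal Python sketch of the computation, with hypothetical stakes at which indifference is assumed to hold:

    # Indifference between 'win x if A, lose y if not A' and the complementary bet
    # implies P(A)*x = P(not A)*y, i.e., P(A) = y / (x + y).
    def subjective_probability(x, y):
        return y / (x + y)

    print(subjective_probability(x=30.0, y=70.0))   # 0.7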

17.3 Assessing Weights


There are several different techniques for assessing the weights in a multi-attribute model, e.g., pricing out (Keeney and Raiffa, 1976), swing weighting (von Winterfeldt and Edwards, 1986), the reference lottery approach (Keeney and Raiffa, 1976), and the analytic hierarchy process, AHP (Saaty, 1980). The pricing out approach is based on the amount of money a decision-maker is willing to pay in order to obtain a given individual utility measure for a certain attribute; thus one attribute in the model must be measured in monetary units, and preferential independence must hold between the monetary attribute and the one being priced out. Using the reference lottery approach, the first alternative is a lottery with one outcome having the best value for all attributes and the other outcome having the worst value for all attributes. The probability pi of ending up with the best value for all attributes, at which the decision-maker is indifferent between the lottery and the alternative of having the best value for attribute Gi and the worst value for all other attributes for sure, then equals wi. Thus

    wi = U(u1-, u2-, ..., ui+, ui+1-, ..., un-),

where ui+ is the best utility for attribute Gi and uj- the worst utility for attribute Gj. If the weights are assessed in this manner, but the decision-maker cannot agree with assigning weights consistent with Σj wj = 1, the multiplicative model is more appropriate. In this model, the assessment of the scaling constants can be done through the reference lottery approach, such that

    ki = U(u1-, u2-, ..., ui+, ui+1-, ..., un-),

where ui+ is the best utility and ui- the worst utility for attribute i. In the case of two attributes with scaling constants k1 and k2, such that k1 + k2 < 1, it can be said that the attributes complement each other. If k1 + k2 > 1, the attributes can be considered substitutes of each other. Assessing weights through the swing weighting technique begins with defining the worst-case scenario, which is used as a benchmark and assigned a value of zero. The term swing derives from the fact that hypothetical outcomes are constructed in which each attribute swings from worst to best, and the decision-maker creates a preference order on this set of hypothetical outcomes.
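A minimal Python sketch of swing weighting, with hypothetical attributes and swing scores:

    # The decision-maker scores each swing from worst to best relative to the most
    # important swing (here given 100 points); the scores are then normalized.
    swing_scores = {"cost": 100, "quality": 60, "delivery time": 40}

    total = sum(swing_scores.values())
    weights = {attribute: score / total for attribute, score in swing_scores.items()}
    print(weights)   # {'cost': 0.5, 'quality': 0.3, 'delivery time': 0.2}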


18 Imprecise Domains

In the vast majority of real-life decision situations, the decision-maker does not have access to the large amount of statistical data required to derive precise numerical values and probabilities, nor does the decision-maker have the ability to perform precise estimations of utilities. Furthermore, people find it hard to discriminate between probabilities ranging from approximately 0.3 to 0.7 (Shapira, 1995). A great deal of attention has been given to problems of imprecise information as a source of decision uncertainty. Morgan and Henrion (1990) identify two main types of uncertainty. The first type derives from lack of historical data and takes its form in statistical variation, subjective judgments, linguistic imprecision, variability, inherent randomness, disagreement, and approximation. For example, in experiments, errors in the measurement of quantities give rise to statistical variation. The second type of uncertainty arises from the model chosen, for example a utility function. Furthermore, uncertainty due to biases in communication and value differences is unavoidable in the use of expertise in policy processes. Instead of addressing the sources of uncertainty, Funtowicz and Ravetz (1990) discuss different types of uncertainties, including inexactness (or technical uncertainty), unreliability (or methodological uncertainty), and the border with ignorance (or epistemological uncertainty). These authors consider ignorance to be endemic to scientific research. Finally, Wynne (1992) addresses uncertainty in the foundations of information and knowledge, as well as in the processing of information.

18.1 Measurable and Immeasurable Uncertainties


Even if a decision-maker is able to discriminate between different probability measures, adequate, reliable, and precise information is very often missing. Consequently, there seem to be significant reasons for discriminating between measurable and immeasurable uncertainty. Measurable uncertainty is often referred to as risk and can be represented by precise probabilities. In contrast, immeasurable uncertainty occurs frequently in high-consequence/low-frequency situations, since the low frequency implies a lack of statistical data, and thereby the axiom systems given by, e.g., Savage and von Neumann and Morgenstern are not satisfied. Ellsberg (1961) proposes a class of choice situations involving immeasurable uncertainty, in which the behaviour of people is inconsistent with the suggested axiomatic systems. He does not object to the use of the principle of maximizing the expected utility, but suggests that the underlying axiomatic systems should not be applied in situations where the available information is to some extent not precisely defined. Doyle and Thomason (1997) give an approach where impreciseness is modelled using only qualitative data. However, in many cases this restriction yields too narrow an outlook on a decision problem; numerical estimates should still play a role. In the words of Ekenberg:
"A useful theory for decision analysis should include procedures for handling such qualitative aspects in connection with a quantitative evaluation." (Ekenberg, 1994, p. 39)

18.2 Imprecise Probability


"Consider the uncertainty about whether it will rain in Brisbane next weekend. A weather forecaster may be able to assess a precise probability of rain, such as 0.3285, although even an expert should feel uncomfortable about specifying a probability to more than one or two decimal places. Someone who has little information about the prospects for rain may be able to make only an imprecise judgement such as 'it will probably rain', or 'it is more likely to rain tomorrow than at the weekend', or 'the probability of rain is between 0.2 and 0.4'. People living outside Australia may be completely ignorant about the weather in Brisbane and assign lower probability 0 and upper probability 1. Probabilities based on extensive data can be distinguished, through their precision, from those based on ignorance." (Walley, 1997, p. 1)

There is a wide variety of mathematical models for the representation of imprecise probability. According to Walley (1997), most research in imprecise probabilities has been concerned with different types of upper and lower probability. However, some common and useful kinds of uncertainty cannot be modelled through the use of upper and lower probability models; in particular, commonly used comparative statements of the form "A is at least as probable as B" cannot be allowed for.11 Walley's highly influential Statistical Reasoning with Imprecise Probabilities (1991) introduces the concept of upper and lower previsions. Briefly speaking, the lower prevision of a gamble is defined from the amount a gambler is willing to pay for a lottery ticket, and the upper prevision from how much he is willing to sell the same ticket for.

11 Such statements are allowed for in DecideIT.


When more than one probability distribution defined on the same set of outcomes is reasonable given the information obtained, we speak in terms of sets of probability distributions. The American philosopher Isaac Levi gives three conditions that such sets of probability measures B must satisfy. These imply (among other things) that the probability distributions in B for a given state of nature form an interval; in the literature, such sets are commonly referred to as convex sets of probability measures. The significance of Levi's work is emphasized when Levi compares the different alternatives in decision situations. He gives an example in which two similar decision situations with different sets of probability measures yield different results in his theory, even though the generated intervals are the same (Levi, 1974, pp. 416-418). Many attempts have been made to express imprecise probabilities in terms of intervals. In Choquet (1953) the concept of capacities is introduced. These capacities can be used for defining a framework for modelling imprecise probabilities as intervals (Huber, 1973). The use of interval-valued probability functions, by means of classes of probability measures, has also been integrated into classical probability theory by, e.g., (Good, 1962) and (Smith, 1961). A similar approach was taken by Dempster (1967), where a framework for modelling upper and lower probabilities is investigated. This was further developed in (Shafer, 1976), where the concept of basic probability assignments was also introduced. Within the field of artificial intelligence, the Dempster-Shafer theory for quantifying subjective judgments has received a lot of attention, but it seems to be unnecessarily strong with respect to interval representation (Weichselberger and Pöhlmann, 1990). Weichselberger's theory of interval-probability argues in favour of an axiom system for interval probabilities closely related to that of Kolmogorov. In his own words:
"Altogether, theory of interval-probability comes nearer to the classical understanding of probability assignment than those approaches relying on more general types of assessment." (Weichselberger, 1999)
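
To make the idea of sets of probability distributions and lower/upper previsions concrete, here is a rough sketch in Python. A finite list of candidate distributions stands in for a convex set of probability measures (all numbers are made up); the lower prevision of a gamble is approximated by its minimum expectation over the set, and the upper prevision by its maximum:

    def expectation(p, gamble):
        # expected value of the gamble under one candidate distribution
        return sum(pi * gi for pi, gi in zip(p, gamble))

    credal_set = [
        (0.20, 0.50, 0.30),
        (0.30, 0.40, 0.30),
        (0.25, 0.45, 0.30),
    ]
    gamble = (10.0, 0.0, -5.0)       # payoff attached to each of three outcomes

    lower = min(expectation(p, gamble) for p in credal_set)   # lower prevision
    upper = max(expectation(p, gamble) for p in credal_set)   # upper prevision
    print(lower, upper)              # 0.5 and 1.5 for these numbers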

18.3 Imprecise Utility


Imprecision in decision situations often prevails both in probability estimates and in utility assessments. For example, in business decisions where one acts upon a forecast, the forecasted value is often subject to some forecast error, encouraging the use of a prediction interval instead of a predicted fixed number, which in almost every case will be more or less incorrect. Furthermore, many types of decisions involve utility measures of non-monetary outcomes, which then must be measured on some precisely defined interval scale; such measurements are often hard to motivate, e.g., due to underlying ethical responsibilities and democratic values. Levi uses a set G of permissible utility functions, which do not have to obey the classical Bayesian requirement that all elements in G are positive affine transformations of each other. He then stipulates the following definitions:

Definition: An alternative A is E-admissible if and only if there is a probability distribution p in B and a utility function u in G such that E(A), defined relative to p and u, is optimal among all alternatives.

Definition: An alternative A is S-admissible if and only if it is E-admissible and there is a function u in G such that the minimum u-value assigned to some possible consequence of A is at least as great as the u-values assigned to the consequences of any of the remaining alternatives.

However, a problem with Levi's approach is the violation of the independence of irrelevant alternatives.
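
The E-admissibility test can be illustrated with a rough Python sketch. Here B and G are small finite lists standing in for Levi's convex sets, the same two-point distribution is, for simplicity, applied to each alternative's consequences, and all names and numbers are invented for illustration:

    def e_admissible(alternatives, B, G):
        # an alternative is E-admissible if it maximizes expected utility for at
        # least one combination of a distribution p in B and a utility function u in G
        admissible = set()
        for p in B:
            for u in G:
                def ev(alt):
                    return sum(pi * u[c] for pi, c in zip(p, alternatives[alt]))
                admissible.add(max(alternatives, key=ev))
        return admissible

    alternatives = {"A1": ["c11", "c12"], "A2": ["c21", "c22"]}
    B = [(0.4, 0.6), (0.6, 0.4)]
    G = [{"c11": 1.0, "c12": 0.0, "c21": 0.7, "c22": 0.4},
         {"c11": 0.9, "c12": 0.1, "c21": 0.8, "c22": 0.3}]
    print(e_admissible(alternatives, B, G))   # both A1 and A2 are E-admissible here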

18.4 Second-Order Beliefs


According to Gärdenfors and Sahlin (1982), one major drawback of the classic Bayesian approach, as well as of Levi's approach, is that it does not account for variations in the epistemic reliability in different decision situations. Even if an outcome is associated with a set of probability measures and a set of utility measures, some of these measures are often regarded as more reliable than others, due to the nature of the obtained information. Thus, we have a second-order belief in the sense that we hold some of our beliefs to be more reliable. Gärdenfors and Sahlin provide an example demonstrating variations in epistemic reliability, in which a certain Miss Julie is invited to bet on the winner of three different tennis matches:
"As regards match A, she is very well-informed about the two players: she knows everything about the results of their earlier matches, she has watched them play several times, she is familiar with their present physical condition and the setting of the match, etc. Given all this information, Miss Julie predicts that it will be a very even match and that a mere chance will determine the winner. In match B, she knows nothing whatsoever about the relative strength of the contestants (she has not even heard their names before) and she has no other information that is relevant for predicting the winner of the match. Match C is similar to match B, except that Miss Julie has happened to hear that one of the contestants is an excellent tennis player, although she does not know anything about which player it is, and the second player is indeed an amateur so that everybody considers the outcome of the match a foregone conclusion." (Gärdenfors and Sahlin, 1982, p. 362)

There are, however, several complications with this theory, which are addressed in (Ekenberg and Thorbiörnson, 2001), where another approach to second-order decision analysis is suggested. Their theory not only supports the use of interval statements to model imprecise information, but also takes into account various belief distributions over the intervals as measures of the epistemic reliability of the different probability and utility distributions on a set of outcomes. In DecideIT, it is possible to model and evaluate second-order beliefs by explicitly defining points with a higher density of belief within the given intervals.
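
The following Python sketch is a rough illustration of the idea, not DecideIT's evaluation procedure: a belief distribution over an interval is represented here by a triangular distribution peaked at the most likely point, and Monte Carlo sampling shows where the resulting belief over a product of a probability and a value is concentrated (all numbers are invented):

    import random

    random.seed(1)

    def sample_belief(low, high, mode):
        # draw from a triangular belief distribution over [low, high] peaked at 'mode'
        return random.triangular(low, high, mode)

    # Interval statements for a probability and a value, each with a focal point.
    p_low, p_high, p_mode = 0.2, 0.5, 0.3
    v_low, v_high, v_mode = 40.0, 90.0, 70.0

    samples = []
    for _ in range(10000):
        p = sample_belief(p_low, p_high, p_mode)
        v = sample_belief(v_low, v_high, v_mode)
        samples.append(p * v)        # contribution of one consequence to the expected value

    print(sum(samples) / len(samples))   # average of the sampled contributions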


19 Graph Models

Graphical models often have an intuitive appeal to humans. They serve well as an instrument for communication, they are dynamic, and they are easy to manipulate (especially with the assistance of a graphical user interface). Two important graph models that have proliferated within the area of risk and decision analysis are the decision tree and the influence diagram. Generally, a graph model G is a structure consisting of a set of nodes N and a set of edges E between these nodes, thus G = (N, E). A directed graph is a graph in which the edges have a direction, i.e., an edge from ni to nj does not imply an edge in the opposite direction. A tree graph is a connected graph with no cycles; thus it is an acyclic graph, and a tree with m nodes has m - 1 edges. In a rooted tree, each node nj one edge further away from a given node ni is called a child of ni, and nodes connected to the same parent node, at the same distance from the root node, are referred to as siblings. The root node is the node without a parent, nodes without children are called leaves, and there is a unique path from the root node to any leaf.
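
A minimal sketch in Python of such a rooted tree (the class and labels are purely illustrative) shows the node/children structure and the m nodes versus m - 1 edges property:

    class Node:
        def __init__(self, label, children=None):
            self.label = label
            self.children = children or []      # an empty list means the node is a leaf

    def count_nodes(node):
        return 1 + sum(count_nodes(c) for c in node.children)

    def count_edges(node):
        return len(node.children) + sum(count_edges(c) for c in node.children)

    def leaves(node):
        if not node.children:
            return [node.label]
        return [leaf for c in node.children for leaf in leaves(c)]

    root = Node("root", [Node("a", [Node("c"), Node("d")]), Node("b")])
    print(count_nodes(root), count_edges(root))   # 5 nodes, 4 edges
    print(leaves(root))                           # ['c', 'd', 'b']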

19.1 Decision Trees


Raiffa (1968) is commonly credited with popularizing the use of decision trees. A decision tree is a rooted tree with three different types of nodes: decision nodes, event nodes (chance nodes), and consequence nodes. In a decision tree, squares represent decisions to be made (decision nodes), and circles represent chance events (event nodes). The edges emanating from a square represent the identified alternatives, or the choices available to the decision-maker, and the edges from an event node represent the possible outcomes of a chance event with an associated probability distribution. The third decision element, the consequence, is specified at the leaves as consequence nodes. These are associated with a real-valued number representing the utility of the corresponding consequence. An example of a decision tree is presented in the screenshot from DecideIT in Figure 4-3.


Figure 4-3: A decision tree in DecideIT (note that this screenshot is a zoomed-out view and shows fewer details than the default view).

Usually, the root node is a decision node representing the initial decision, as in Figure 4-3. The tree often indicates a temporal order in which the events take place, i.e., if event Ei is said to occur before Ej, then Ej usually does not precede Ei in the model. This is especially the case for decision nodes, i.e., all outcomes related to preceding nodes are known prior to the actual decision that the decision node represents. Furthermore, the tree is a representation of a conditional expansion order. For example, the probability of C1 in Figure 4-3 is a conditional probability, P(C1) = P(E11 | E21, D11).

Decision trees are usually evaluated by pruning the tree, sometimes called rolling back or folding back the tree. This technique creates a preference ordering according to PMEU and is quite straightforward. Start at the consequence nodes and move towards the root node. Calculate the expected value of a chance node when one is encountered, and replace the chance node with its expected value. When a decision node is encountered, choose the branch with the highest expected value, discarding the branches with lower expected values. When this algorithm terminates, the path that remains is the one to choose. This is the evaluation algorithm of DecideIT, generalized for imprecise input parameters.
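
For precise (non-interval) probabilities and values, the folding-back procedure can be sketched in a few lines of Python. This is only an illustration of the rollback idea, not the generalized DecideIT algorithm; the tree encoding with nested tuples is invented for the example:

    # Node encodings (illustrative): ('decision', [(label, subtree), ...]),
    # ('chance', [(probability, subtree), ...]), or ('consequence', value).

    def roll_back(node):
        # return (expected value, chosen decision path) of a precise decision tree
        kind, content = node
        if kind == "consequence":
            return content, []
        if kind == "chance":
            ev = 0.0
            for prob, subtree in content:
                sub_ev, _ = roll_back(subtree)
                ev += prob * sub_ev
            return ev, []
        if kind == "decision":
            best_label, best_ev, best_path = None, float("-inf"), []
            for label, subtree in content:
                sub_ev, sub_path = roll_back(subtree)
                if sub_ev > best_ev:
                    best_label, best_ev, best_path = label, sub_ev, sub_path
            return best_ev, [best_label] + best_path
        raise ValueError(kind)

    tree = ("decision", [
        ("A1", ("chance", [(0.6, ("consequence", 100.0)),
                           (0.4, ("consequence", -20.0))])),
        ("A2", ("consequence", 40.0)),
    ])
    print(roll_back(tree))   # A1 gives 0.6*100 - 0.4*20 = 52, which beats 40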

19.2 Influence Diagrams


An influence diagram is a compact representation of a symmetric decision tree, in the form of a directed acyclic graph (DAG). Influence diagrams are closely related to belief networks and Bayesian nets, and may be looked upon as an extension of Bayesian nets tailored to decision making. According to Howard and Matheson (1984, p. 721), the influence diagram is "a formal description of a problem that can be treated by computers and a representation easily understood by people in all walks of life and degrees of technical proficiency". Shachter (1986) continues in an analogous fashion:
"An influence diagram...is an intuitive framework in which to formulate problems as perceived by decision-makers and to incorporate the knowledge of experts. At the same time, it is a precise description of information that can be stored and manipulated by a computer." (Shachter, 1986, p. 871)

The classic influence diagram is a network with three types of nodes: decision nodes, chance nodes (event nodes), and one value node (utility node, payoff node, consequence node). In an influence diagram, squares represent the decisions to be made, circles represent chance nodes, and a rounded rectangle represents the value node. There are two types of directed arcs: conditional arcs, with chance nodes or the value node as successors, and informational arcs, with decision nodes as successors. An example of an influence diagram is shown in Figure 4-4.

Figure 4-4: Example of an influence diagram and corresponding symmetric decision tree if every node contains two outcomes.


With respect to any given node in an influence diagram, Howard and Matheson (1984, p. 737) provide the following definitions:

- The predecessor set of a node is the set of all nodes having a path leading to the given node.
- The direct predecessor set of a node is the set of nodes having an arc connected directly to the given node.
- The indirect predecessor set of a node is the set formed by removing from its predecessor set all elements of its direct predecessor set.
- The direct successor set of a node is the set of nodes having an arc connected directly from the given node.
- The indirect successor set of a node is the set formed by removing from its successor set all elements of its direct successor set.

Each chance node can be thought of as a random variable with an assigned probability distribution. There is an underlying joint probability distribution for all chance nodes. This joint distribution can be decomposed into a set of conditional distributions, to be assessed by the analyst, with conditioning represented by arcs in the influence diagram.

"If there are no undirected paths between two nodes, then they must be independent. If a chance node has no arcs into it, then its probability distribution is a marginal distribution." (Shachter, 1986, p. 872)

For chance nodes, the diagram partially constrains the probabilistic conditioning order. Let Nx be the set of all non-successors of node x, and Dx the set of direct predecessors of x, so that Dx ⊆ Nx. The influence diagram then asserts that the probability assignment to x given Nx is the same as to x given Dx, so that {x | Nx} = {x | Dx}. With respect to x, Dx is a sufficient statistic for Nx (Howard and Matheson, 1984, p. 739). This assertion is noticeable in the DecideIT implementation (not in version 2.5): when setting the conditional probabilities of a conditionally dependent chance node, the number of expansions in the probabilistic conditioning order for x equals the size of Dx.
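
The predecessor sets defined above are straightforward to compute from the arc list of a DAG. A minimal Python sketch (the node names and arcs form a small invented influence-diagram skeleton):

    def predecessors(arcs, node):
        # return (direct, indirect, all) predecessor sets of 'node' in a DAG
        # given as a list of arcs (parent, child)
        direct = {p for p, c in arcs if c == node}
        all_preds = set(direct)
        frontier = set(direct)
        while frontier:                       # walk arcs backwards until nothing new appears
            frontier = {p for p, c in arcs if c in frontier} - all_preds
            all_preds |= frontier
        return direct, all_preds - direct, all_preds

    # Chance node E1 conditions E2, both feed the decision D, and D and E2 feed the value node V.
    arcs = [("E1", "E2"), ("E1", "D"), ("E2", "D"), ("D", "V"), ("E2", "V")]
    print(predecessors(arcs, "V"))   # direct {D, E2}, indirect {E1}, all {D, E2, E1}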


The influence diagram asserts that the only information available when a decision is made is that represented by the direct predecessors of the relevant decision node; thus the name informational arcs. A common practice is therefore the use of no-forgetting arcs in diagrams with more than one decision node. This makes it explicit in the structure that the decision-maker does not forget the information from an earlier decision in the same problem.12

The original way of evaluating influence diagrams is to convert them into a corresponding decision tree and evaluate the tree (Howard and Matheson, 1984). The influence diagram is then only used to formulate the situation. There are also methods for evaluating influence diagrams without converting them into decision trees. The most famous is that developed by Shachter (1986), based on node removal and arc reversal. Due to the workings of the DELTA method (Danielson and Ekenberg, 1998), DecideIT uses the original method to evaluate influence diagrams. Influence diagrams are not supported in DecideIT 2.5 and 2.6; however, work is in progress and future releases of DecideIT are planned to support this type of graph model.

19.2.1 Relationship between Influence Diagrams and Trees

As stated above, an influence diagram is a compact representation of a symmetric decision tree. Decision trees and influence diagrams are isomorphic structures, i.e., any properly built influence diagram can be converted into a corresponding decision tree and vice versa (Clemen, 1996, p. 74). The conversion from an influence diagram to its corresponding decision tree is very useful for the implementation of influence diagrams in the DecideIT software. Unlike the nodes in a decision tree, the nodes in an influence diagram do not have to be totally ordered, nor do they have to depend directly on all predecessors. To convert an influence diagram into a corresponding decision tree, two main requirements must be fulfilled in the diagram:

- The influence diagram must imply a total ordering over the decision nodes.
- Each decision node in the influence diagram and its direct predecessors directly influence all successor decision nodes.

The first condition is quite obvious, as a decision tree with multiple decisions clearly defines a sequence in which the decision-maker makes his or her choices. The second condition is the no-forgetting condition, which assures that a single decision-maker does not forget information. According to Howard and Matheson (1984, p. 744), these two conditions guarantee that a decision tree can be constructed.

12 No-forgetting arcs also fill a purpose when converting an influence diagram into a decision tree.


However, for a proper conversion some probabilistic processing may be necessary. This is because a non-direct predecessor z of a decision node x in an influence diagram does not imply that the decision rule in x depends on z; it simply implies that z is used in the probability assignment model (Howard and Matheson, 1984, p. 740). When such a situation is converted into a decision tree, the tree would nevertheless imply that the decision rule in x depends on z.

20 The Method of DecideIT

Suppose a decision-maker wants to evaluate a specific decision situation. In order to approach the problem in a reasonable way, given available resources, a decision process such as the following could be employed, not necessarily in the exact order given:

- Clarify the problem, divide it into sub-problems if necessary
- Decide which information is a prerequisite for the decision
- Collect and compile the information
- Define possible courses of action
- For each alternative: identify possible consequences
- For each consequence: estimate how probable it is, and estimate the value of it occurring for each criterion
- Disregard obviously bad courses of action
- Based on the above, evaluate the remaining alternatives
- Carry out a sensitivity analysis

The method described in the following should be seen in the context of such a decision process. The process above is supported by DecideIT and is carried out in a number of steps. The first step is a bit special, since there is much information to collect. The initial information is gathered from different sources. Then it is formulated in statements and entered into DecideIT. Following that, an iterative process commences where, step by step, the decision-makers gain further insights. During this process, decision-makers receive help in realizing which information is missing, is too vague, or is too precise. They can also change the problem structure by adding or removing consequences or even entire alternatives, as more information becomes available.

20.1 Information Gathering


In some cases, the first information collection phase can be a very long and tedious step. Sometimes, it might take man-months. In other cases, it might only require a few half-day discussions with experts. It is impossible to describe any typical case because the situations are too diverse. In many situations, much work ranging from interviews to simulation is required.

20.2 Modelling
After the data collection phase, a modelling task commences where the decision-maker structures and orders the information. Given the set of criteria, she tries to compile a smaller number of reasonable courses of action and to identify the consequences belonging to each alternative. For instance, simulation results can be clustered into meaningful sets. There is no requirement for the alternatives to have the same number of consequences. However, within any given alternative, it is required that the consequences are exclusive and exhaustive, i.e., whatever the result, it should be covered by the description of exactly one consequence. This is unproblematic, since a residual consequence can be added to take care of unspecified events. The probability and value statements, plus the weights, are represented by interval constraints and core intervals, described later. Intervals are a natural form in which to express such imprecise statements. It is not required that the consequence sets are determined from the outset. A new consequence may be added at a later stage, thus facilitating an incremental style of working.

20.3 Information and Decision Frames


The framework for decision analysis implemented in DecideIT is suggested in (Danielson and Ekenberg, 1998). In this framework, imprecise probabilities as well as imprecise utilities are handled by modelling a decision situation with numerically imprecise sentences such as "the probability of consequence c11 is greater than 5%" and comparative sentences such as "consequence c11 is preferred to consequence c12". These kinds of sentences are simply represented by suitable intervals and comparisons. Sentences such as "the probability of cij lies between the numbers ak and bk" are translated to pij ∈ [ak, bk]. Similarly, sentences such as "the probability of cij is greater than the probability of ckl" are translated into inequalities such as pij > pkl. In this way, each statement is represented by one or more constraints. The conjunction of such constraints, together with Σj pij = 1 for each strategy Ai involved,13 is called the probability base (P). The value base (V) consists of similar translations of value estimates. The collection of probability and value statements constitutes the decision frame. The idea with such a frame is to collect all information necessary for the model in one structure. This structure is then filled in with user statements. All the probability statements in a decision problem share a common structure because they are all made relative to the same decision frame. The correspondence between the user model and the representation is summarized in Table 1.

User model                    Representation
Decision problem              Decision frame
Alternative                   Consequence set
Consequence, event            Consequence
Collection of statements      Base
Interval statement            Core interval, interval constraint

Table 1: Representation of user model

In practice, a model of the situation is created with criteria, relevant courses of action, and their consequences when specific events occur. The courses of action are called alternatives in the user model, and they are represented by consequence sets in the decision frame. If the problem contains more than one decision level, it is internally transformed into alternative-consequence form (AC-form), a one-level decision tree that is a computationally equivalent representation. In the user interface, all levels are kept as they were originally entered. Following the establishment of a decision frame in the tool, the probabilities of the events and the values of the consequences are subsequently filled in.

13 The normalization constraint is added because the consequences are assumed to be exhaustive as well as pairwise disjoint.


20.4 Frame Structure


A decision frame must capture the structure of the tree internally in the tool once it has been transformed into one-level form. A one-level tree consists primarily of sets of consequences. Then there are statements of probability and value, collected in structures called constraint sets and cores.

Definition: Given a set of variables S = {xi}, a continuous function g : Sn → [0,1], and real numbers a, b ∈ [0,1] with a ≤ b, an interval constraint g(x1,…,xn) ∈ [a,b] is a shorter form for a pair of weak inequalities g(x1,…,xn) ≥ a and g(x1,…,xn) ≤ b.

In this manner, both equalities and inequalities can be handled in a uniform way, since equalities are represented by intervals [a,a]. A collection of interval constraints concerning the same set of variables is called a constraint set. It follows that a constraint set can be seen as a system of inequalities. For such a system to be meaningful, there must exist some vector of variable assignments that satisfies each inequality in the system simultaneously.

Definition: Given a set of variables {xi}, a constraint set X in {xi} is consistent iff the system of weak inequalities in X has a solution.14 Otherwise, the constraint set is inconsistent. A constraint Z is consistent with a constraint set X iff the constraint set {Z} ∪ X is consistent.

In other words, a consistent constraint set is a set where the constraints are at least not contradictory.

Definition: Given a consistent constraint set X in {xi} and a function f, Xmax(f(x)) =def sup(a : {f(x) > a} ∪ X is consistent). Similarly, Xmin(f(x)) =def inf(a : {f(x) < a} ∪ X is consistent).

The orthogonal hull is a concept that in each dimension signals which parts are definitely incompatible with the constraint set. The orthogonal hull can be pictured as the result of wrapping the smallest orthogonal hyper-cube around the constraint set.

Definition: Given a consistent constraint set X in {xi}, i ∈ I, the set of pairs {⟨Xmin(xi), Xmax(xi)⟩} is the orthogonal hull of the set and is denoted ⟨Xmin(xi), Xmax(xi)⟩n.

14 Then there is a non-empty solution set for X.
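
To make these definitions concrete, the following rough Python sketch (assuming SciPy is available; it is an illustration, not DecideIT's internal solver) translates a few probability statements into a constraint set and computes the orthogonal hull by minimizing and maximizing each variable:

    from scipy.optimize import linprog

    # Statements for three consequence probabilities p1, p2, p3 of one alternative:
    # p1 in [0.2, 0.5], p2 in [0.1, 0.6], p3 in [0.1, 0.4], the comparative
    # statement p1 >= p2, and the normalization p1 + p2 + p3 = 1.
    bounds = [(0.2, 0.5), (0.1, 0.6), (0.1, 0.4)]
    A_ub, b_ub = [[-1.0, 1.0, 0.0]], [0.0]     # p2 - p1 <= 0 encodes p1 >= p2
    A_eq, b_eq = [[1.0, 1.0, 1.0]], [1.0]

    hull = []
    for i in range(3):
        c = [1.0 if j == i else 0.0 for j in range(3)]
        lo = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        hi = linprog([-cj for cj in c], A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        hull.append((round(lo.fun, 3), round(-hi.fun, 3)))   # (Xmin(pi), Xmax(pi))

    print(hull)   # [(0.3, 0.5), (0.1, 0.45), (0.1, 0.4)] for these statements
    # An infeasible programme would instead indicate an inconsistent constraint set.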


Constraints and core intervals have different roles in specifying a decision situation. The constraints represent negative information: they specify which vectors are not part of the solution sets, i.e., which ranges are infeasible, by excluding them from the solutions. This is in contrast to core intervals, which represent positive information in the sense that the decision-maker enters information about sub-intervals that are felt to be the most central ones and within which no further discrimination is possible.

Definition: Given a constraint set X in {xi} and the orthogonal hull ⟨ai,bi⟩n of X, a core interval of xi is an interval [ci,di] such that ai ≤ ci ≤ di ≤ bi. A core [ci,di]n of {xi} is a set of core intervals {[ci,di]}, one for each xi.

As for constraint sets, the core might not be meaningful in the sense that it may contain no variable assignments able to satisfy all the inequalities. This is quite similar to the concept of consistency for constraint sets, but for core intervals the requirement is slightly different: it is required that the most likely point is contained within the core.

Definition: Given a consistent constraint set X in {xi} and a most likely point r = (r1,…,rn), the core [ci,di]n of {xi} is permitted with respect to r iff ci ≤ ri ≤ di.

Together, constraint sets and cores delimit the shape of the belief in the numerical values for the variables. See Figure 4-5.
Figure 4-5: The hull, core and most likely point (focal point) for a variable, shown as a belief profile over the value axis.

20.5 Bases
A base consists of a constraint set for a set of variables together with a core; it is simply a collection of constraints and the core that belongs to the variables in the set. The idea with a base is to represent a class of functions over a finite, discrete set of consequences.

Definition: Given a set {xi} of variables and a most likely point r, a base X in {xi} consists of a constraint set XC in {xi} and a core XK of {xi}. The base X is consistent if XC is consistent and XK is permitted with respect to r.

20.6 Probability Bases


The collection of probability statements in a decision situation is called the probability base. A probability base is said to be consistent if at least one real number can be assigned to each variable so that all inequalities are simultaneously satisfied. The idea is that no meaningful operations can take place on a set of statements that have no variable assignments in common, since there is no way to take all the requirements into account. Note that the method deals with classes of functions of which there are infinitely many instantiations, and insists on at least one of them yielding consistent results.

Definition: Given a set {Cik} of disjoint and exhaustive consequences, let P be a base in {pik} and p : C → [0,1] a discrete, finite probability mass function over {Cik}. Let pik denote the function value p(Cik). Since p obeys the standard probability axioms, pik ∈ [0,1] and Σk pik = 1 are default constraints in the constraint set PC. Then P is a probability base.

Thus, a probability base can be seen as characterizing a set of discrete probability distributions. The core PK can be thought of as an attempt to estimate a class of mass functions by estimating the individual discrete function values.

20.7 Value Bases


Requirements similar to those for probability variables can be found for value variables. There are apparent similarities between probability and value statements, but there are differences as well. The normalization (Σk pik = 1) requires the probability variables of a set of exhaustive and mutually exclusive consequences to sum to one. No such dimension-reducing constraint exists for the value variables.

Definition: Given a set {Cik} of disjoint and exhaustive consequences, let V be a base in {vik} and v : C → [0,1] a discrete, finite value function. Let vik denote the function value v(Cik). Because of the range of v, vik ∈ [0,1] are default constraints in the constraint set VC. Then V is a value base.

Similar to probability bases, a value base can be seen as characterizing a set of value functions. The value core VK can be seen as an attempt to estimate a class of value functions.


The probability and value bases together with structural information constitute the decision frame.

20.8 Frames
Using the above concepts of consequence, constraint, core, and base, it is possible to model the decision-maker's situation in a decision frame; compare the decision frame to Table 1 above. The frame captures a decision problem on AC-form, a one-level tree problem in normal form. The frame is also the key data structure in the tool implementation, holding references to other structural information and to the bases containing most of the information. All statements entered via the tool user interface are collected in the decision frame. When all statements in the current state of the problem have been entered, the data entry phase is over for the time being. As insight into the decision problem accumulates during all the following phases, it is possible to add new information and alter or delete information already entered.

20.9 Sanity Checks


Thereafter, the work continues with evaluating the alternatives. It begins by comparing the alternatives as they are entered. As the first evaluation step, the sanity of the decision frame is checked. Much of the information collected, especially in large investigations, runs the risk of being cluttered or misunderstood during the process. If some data in the frame is problematic, the decision-maker could consider leaving it out of the current cycle or re-collecting it. Missing data is easily handled for later inclusion. For example, a missing consequence can be added at a later stage. If the set of consequences for some alternative is not exhaustive, a residual consequence can be temporarily added. Missing value constraints can be temporarily substituted with very wide intervals or simply left out. Such possibilities have certain advantages, as the results emerging at the outset of the evaluation may be viewed with greater confidence than if erroneous data had been entered.

20.10 Security Thresholds

Many decisions are one-off decisions, or are important enough not to allow a too undesirable outcome regardless of its having a very low probability. The common aggregate decision rules will not rule out an alternative with such a consequence, provided it has a very low probability. If the probability of a very undesirable consequence is larger than some security level, it seems reasonable to require that the alternative should not be considered, regardless of whether the expected value shows it to be a good course of action. If the security level is violated by one or more consequences in an alternative, and this persists beyond a predetermined rate of cutting (described below), then the alternative is unsafe and should be disregarded. An example of security levelling is an insurance company that does not wish to enter into insurance agreements where the profitability is high but there is a very small, yet not negligible, risk of a loss large enough to put the company's existence at stake. The security analysis requires some parameters to be set, and security thresholds serve as an important supplement to the expected value.
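
With precise numbers, the check can be illustrated with a minimal Python sketch: an alternative is flagged if any consequence worse than the security value has a probability above the security level. The parameters and figures are made up, and DecideIT applies this kind of test to interval statements and cut levels rather than to fixed numbers:

    def violates_security(consequences, security_value, security_level):
        # consequences: list of (probability, value) pairs for one alternative;
        # flag the alternative if a consequence below security_value is more
        # probable than security_level
        return any(p > security_level and v < security_value
                   for p, v in consequences)

    # Insurance-style example: high expected profit, but a small chance of a ruinous loss.
    alternative = [(0.97, 120.0), (0.03, -5000.0)]
    print(violates_security(alternative, security_value=-1000.0, security_level=0.01))  # True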

20.11 Evaluations

After having taken security thresholds into account, which value does a particular decision have? In cases where the outcomes can be assigned monetary values, it seems natural that the value of the decision should be some kind of aggregation of the values of the individual consequences. The ultimate comparison rule of an evaluation in DecideIT, as well as in many other methods, is the expected value (EV), sometimes instantiated as the expected utility or the expected monetary value. Since neither probabilities nor values are fixed numbers, the evaluation of the expected value yields quadratic (bilinear) objective functions of the form

EV(Ai) = pi1vi1 + … + pinvin,

where the pik and vik are variables. Further complicating the picture is the presence of multiple criteria. For s criteria, this leads to the expression

EV(Ai) = w1(p1i1v1i1 + … + p1inv1in) + … + ws(psi1vsi1 + … + psinvsin),

where wk is the weight of criterion k. Maximization of such expressions is a computationally demanding problem to solve in the general case, using techniques from the area of non-linear programming. This leaves us with differing values and weights. By multiplying in the weights and making the probabilities common, the expression can be rewritten as

EV(Ai) = pi1w1v1i1 + … + pinw1v1in + … + pi1wsvsi1 + … + pinwsvsin,

which finally is written

EV(Ai) = pi1(w1v1i1 + … + wsvsi1) + … + pin(w1v1in + … + wsvsin);


thus permitting local (at consequence level) culling of weighted values. Maximization of such expressions is a less demanding, but still computationally hard, problem to solve, using techniques from the area of quadratic programming. In (Danielson, 1998) there are discussions about, and proofs of, the existence of computational procedures that reduce the problem to systems with linear objective functions, solvable with ordinary linear programming methods.

When a rule for calculating the EV for decision frames containing interval statements is established, the next question is how to compare the courses of action using this rule. This is not a trivial task, since the possible EVs of several alternatives usually overlap. The most favourable assignments of numbers to variables for each alternative usually render that alternative the preferred one. The first step towards a usable decision rule is to establish some concepts that tell when one alternative is preferable to another.

Definition: The alternative A1 is at least as good as A2 if EV(A1) ≥ EV(A2) for all consistent assignments of the probability and value variables. The alternative A1 is better than A2 if it is at least as good as A2 and, further, EV(A1) > EV(A2) for some consistent assignment of the probability and value variables. The alternative A1 is admissible if no other alternative is better.

If there is only one admissible alternative, it is obviously the preferred choice. Usually there is more than one, since apparently good or bad alternatives are normally dealt with on a manual basis long before decision tools are brought into use. All non-admissible alternatives are removed from the considered set and take no further part in the evaluation. The existence of more than one admissible alternative means that for different consistent assignments of numbers to the probability and value variables, different courses of action are preferable. When this occurs, how is it possible to find out which alternative to prefer?

Let δ12 = EV(A1) - EV(A2) be the difference in expected value between the alternatives. The strength of A1 compared to A2, given as a number max(δ12) ∈ [-1,1], shows how the most favourable consistent assignments of numbers to the probability and value variables lead to the greatest difference in expected value between A1 and A2. In the same manner, A2 is compared to A1. These two strengths need not sum to one or to any other constant; the first might, for example, be 0.2 and the second 0.4. If there are more than two alternatives, pair-wise comparisons are carried out between all of them.


Furthermore, there is a strong element of comparison inherent in a decision procedure. For example, statements such as v11 > v22 are not taken into account when calculating maximal and minimal EV(Ai) unless they influence the hull. As the results are interesting only in comparison to other alternatives, it is reasonable to consider the differences in strength as well. Therefore, it makes sense to evaluate the relative strength of A1 compared to A2 in addition to the strengths themselves, since such strength values would be compared to some other strengths anyway in order to rank the alternatives. The relative strength between the two alternatives A1 and A2 is calculated using the formula

mid(δ12) = [max(δ12) + min(δ12)]/2 = [max(δ12) - max(δ21)]/2.

The concept of strength is actually somewhat more complicated. Dominance means that one consequence set is superior to another, at least in a part of the solution space to the bases. The weakest relation would be if such a part refers to a single solution vector. A more reasonable interpretation of a part is that it is superior in a substantial fraction of the solutions. Dominance in the strongest sense would require that the part consists of all solution vectors. This idea is captured in the concepts of strong, marked, and weak dominance, which correspond to the minimal, medium, and maximal differences. Alternative A1 is said to strongly dominate alternative A2 if min(δ12) > 0, to markedly dominate if mid(δ12) > 0, and finally to weakly dominate if max(δ12) > 0. This is further explained in (Danielson, 2003). In DecideIT, the relative strength is shown as the middle line in the evaluation graphs.
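
A simplified Python sketch illustrates these quantities under strong assumptions: precise probabilities and value intervals that vary independently (no shared variables or comparative statements). In that special case max(δ12) = maxEV(A1) - minEV(A2) and min(δ12) = minEV(A1) - maxEV(A2); the general bilinear case handled by DecideIT is considerably more involved, and all numbers below are invented:

    def ev_bounds(probs, value_intervals):
        # lower and upper expected value with precise probabilities and
        # independently varying interval values
        lo = sum(p * v_lo for p, (v_lo, _) in zip(probs, value_intervals))
        hi = sum(p * v_hi for p, (_, v_hi) in zip(probs, value_intervals))
        return lo, hi

    def strengths(p1, v1, p2, v2):
        lo1, hi1 = ev_bounds(p1, v1)
        lo2, hi2 = ev_bounds(p2, v2)
        d_max = hi1 - lo2                 # max(delta12)
        d_min = lo1 - hi2                 # min(delta12)
        d_mid = (d_max + d_min) / 2       # mid(delta12)
        return d_min, d_mid, d_max

    A1 = ([0.6, 0.4], [(0.5, 0.8), (0.2, 0.4)])
    A2 = ([0.5, 0.5], [(0.3, 0.6), (0.3, 0.5)])
    d_min, d_mid, d_max = strengths(*A1, *A2)
    print(d_min, d_mid, d_max)
    # A1 strongly dominates A2 if d_min > 0, markedly if d_mid > 0, weakly if d_max > 0.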

20.12 Cutting the Hull

The hull cut (in DecideIT called contraction) is a generalized sensitivity analysis carried out in a large number of dimensions. In non-trivial decision situations, when a decision frame contains numerically imprecise information, the different principles suggested above are often too weak to yield a conclusive result by themselves. Studying only the differences in expected value for the complete bases often gives too little information about the mutual strengths of the alternatives. Thus, after the elimination of undesirable consequence sets, the decision-maker could still find that no conclusive decision has been made. One way to proceed is to determine the stability of the relation between the consequence sets under consideration. A natural way to investigate this is to consider values near the boundaries of the constraint intervals as being less reliable than the core, since the former are deliberately imprecise. Hence, it is important to be able to study the strengths (or dominances) between the alternatives on sub-parts of the bases. If dominance is evaluated on a sequence of ever-smaller sub-bases, a good appreciation of the strengths' dependency on boundary values can be obtained. This is taken into account by indirectly cutting off the dominated regions using the hull cut operation. This is denoted cutting the bases, and the amount of cutting is indicated as a percentage, which can range from 0% to 100%. For a 100% cut, if no core is specified, the bases are transformed into single points, and the evaluation becomes the calculation of the ordinary expected value.

Definition: Given a base X with hull ⟨ai,bi⟩n and core [ci,di]n, and a real number λ ∈ [0,1], a λ-cut of X is to replace the hull by [ci + (1 - λ)(ai - ci), di + (1 - λ)(bi - di)]n.

It is possible to regard the hull cut as an automated kind of sensitivity analysis. In order to maintain consistency, the cut decreases the bases in predefined ways. Since the belief in peripheral values is somewhat less, the interpretation of the cut is to zoom in on more believable values that are more centrally located. The principle can also be motivated by the difficulties of performing simultaneous sensitivity analyses in several dimensions. It can be hard to gain real understanding of the solutions to large decision problems using only low-dimensional analyses, since different combinations of dimensions can be critical to the evaluation results. Investigating all such combinations would lead to a procedure of high combinatorial complexity in the number of cases to investigate. Using hull cuts, such difficulties are circumvented. The evaluation idea behind the principle is to investigate how much the hull can be cut before dominance appears between the consequence sets compared. If there is no dominance even in the original core, it may be further cut towards the most likely point in order to achieve dominance. The cut avoids the complexity inherent in combinatorial analyses, but it is still possible to study the stability of a result by gaining a better understanding of how important the constraint boundaries really are. By co-varying the cut of an arbitrary set of intervals, it is possible to gain much better insight into the influence of the structure of the decision frame on the solutions. Consequently, a cut can be regarded as a focus parameter that zooms in from the full statement intervals to central sub-intervals (the core). The results of the comparisons can be displayed either in a diagram for each pair of alternatives (Delta diagrams) or as a summary for each alternative (Gamma diagrams). Figure 4-6 below deals only with Delta diagrams.


In the figure, the evaluation of three alternatives is shown as three pair-wise comparisons between the alternatives. The x-axis shows the cut in per cent, ranging from 0 to 100. The y-axis is the expected value difference δij for each pair. The cone (which need not be linear if comparative statements are involved) consists of three lines. For the comparison of alternatives A1 and A2, the upper line is max(δ12), the middle line is mid(δ12), and the lower line is min(δ12). Thus, one can see from which cut level an alternative dominates weakly, markedly, and strongly. As the cut progresses, one of the alternatives eventually dominates strongly. The cut level necessary for that to occur shows the separability between the expected values.

Figure 4-6: Evaluation of alternatives.
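
A minimal Python sketch of the cut operation defined above (the numbers are made up): each hull interval is shrunk toward its core interval, or toward a single focal point when no core interval is given, so that a 0% cut leaves the hull unchanged and a 100% cut reaches the core:

    def cut_interval(hull, core, level):
        # shrink one hull interval toward its core; 'level' is the cut in [0, 1]
        (a, b), (c, d) = hull, core
        return (c + (1.0 - level) * (a - c), d + (1.0 - level) * (b - d))

    def cut_base(hulls, cores, level):
        return [cut_interval(h, k, level) for h, k in zip(hulls, cores)]

    hulls = [(0.1, 0.6), (0.0, 0.5)]
    cores = [(0.25, 0.35), (0.2, 0.2)]       # the second variable has a single focal point
    for level in (0.0, 0.5, 1.0):            # 0%, 50% and 100% cut
        print(level, cut_base(hulls, cores, level))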

20.13 Sensitivity Analyses


After the evaluation, a sensitivity analysis is the next step. The analysis tries to show which parts of the given information are most critical for the obtained results and must therefore be given extra careful consideration. This is accomplished by manually varying a number of statements in desired ways: increasing or decreasing intervals, modifying structural information, etc. The decision-maker might also have other ideas of interesting modifications to make to the bases, such as decreasing or even increasing only selected intervals. He or she might have structural or problem-specific information that motivates manipulating certain intervals in special ways. A common strategy is to decrease intervals until only one alternative is admissible. The analysis also points out which information is too vague to be of any assistance to the ongoing evaluation. Information identified in this way is subject to reconsideration, thereby triggering iteration in the process.

20.14 Decision Process Results


The selection procedure then continues with:

(i) Remove all strongly dominated consequence sets
(ii) If more than one consequence set remains:
    (ii a) Cut the frame until only one consequence set remains
    (ii b) Remove the markedly dominated consequence sets
    (ii c) A combination of (ii a) and (ii b)
(iii) If only one consequence set remains:
    (iii a) Uncut the frame until other consequence sets appear
    (iii b) Study the markedly dominated consequence sets
    (iii c) A combination of (iii a) and (iii b)

Before a new iteration starts, alternatives found to be undesirable or obviously inferior in the light of other information can be removed from the decision process. Likewise, a new alternative can be added, should the information gathered indicate the need for it. Consequences in an alternative can be added or removed as necessary to reflect changes in the model. Often a number of cycles are necessary to produce an interesting and reliable result. After the appropriate number of iterations has been completed, both the decision problem and its proposed solution(s), in the form of preferred courses of action, will be fairly well understood and documented. Anyone interested and with access to the information can afterwards check, verify (and criticize) the decision based on the output documentation, which, because all consequences are clearly presented, shows how all the alternative courses of action have been valued. Also, during the decision process, the analysis is open for comments and can become the basis for further discussions.


References
(Bernoulli, 1954) D. Bernoulli, Specimen Theoriae Novae de Mensura Sortis, translated as Exposition of a New Theory on the Measurement of Risk, Econometrica 22, pp. 23-36, 1954.
(Choquet, 1953) G. Choquet, Theory of Capacities, Annales de l'Institut Fourier 5, pp. 131-295, 1953/1954.
(Clemen, 1996) R. T. Clemen, Making Hard Decisions, Brooks/Cole Publishing Co., Pacific Grove CA, 1996.
(Danielson, 1997) M. Danielson, Computational Decision Analysis, PhD Thesis, Department of Computer and Systems Sciences, Stockholm University and KTH, Report No. 97011, 1997.
(Danielson, 2003) M. Danielson, Generalized Evaluation in Decision Analysis, to appear in European Journal of Operational Research, 2003.
(Dempster, 1967) A. P. Dempster, Upper and Lower Probabilities Induced by a Multivalued Mapping, Annals of Mathematical Statistics 38, pp. 325-339, 1967.
(Doyle and Thomason, 1997) R. Doyle and R. H. Thomason (eds.), Qualitative Preferences in Deliberation and Practical Reasoning, Working Notes, Stanford University, 1997.
(Ekenberg, 1994) L. Ekenberg, Decision Support in Numerically Imprecise Domains, PhD Thesis, Department of Computer and Systems Sciences, Stockholm University and KTH, Report No. 94-003-DSV, 1994.
(Ekenberg and Danielson, 1998) L. Ekenberg and M. Danielson, A Framework for Analysing Decisions under Risk, European Journal of Operational Research 104/3, pp. 474-484, 1998.
(Ekenberg and Thorbiörnson, 2001) L. Ekenberg and J. Thorbiörnson, Second-Order Decision Analysis, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 9/1, pp. 13-38, 2001.
(Ellsberg, 1961) D. Ellsberg, Risk, Ambiguity, and the Savage Axioms, Quarterly Journal of Economics 75, pp. 643-669, 1961.


(French, 1988) S. French, Decision Theory: An Introduction to the Mathematics of Rationality, Ellis Horwood Ltd., 1988.
(Friedman, 1953) M. Friedman, Essays in Positive Economics, University of Chicago Press, 1953.
(Funtowicz and Ravetz, 1990) S. O. Funtowicz and J. R. Ravetz, Uncertainty and Quality in Science for Public Policy, Kluwer, Dordrecht, 1990.
(Good, 1962) I. J. Good, Subjective Probability as the Measure of a Non-measurable Set, in Logic, Methodology, and the Philosophy of Science, P. Suppes, E. Nagel, and A. Tarski (eds.), Stanford University Press, pp. 319-329, 1962.
(Gärdenfors and Sahlin, 1982) P. Gärdenfors and N-E. Sahlin, Unreliable Probabilities, Risk Taking, and Decision Making, Synthese 53, pp. 361-386, 1982.
(Huber, 1973) P. J. Huber, The Case of Choquet Capacities in Statistics, Bulletin of the International Statistical Institute 45, pp. 181-188, 1973.
(Hurwicz, 1951) L. Hurwicz, Optimality Criteria for Decision Making under Ignorance, Cowles Commission Discussion Paper No. 370, 1951.
(Keeney, 1980) R. Keeney, Siting Energy Facilities, Academic Press, New York, 1980.
(Keeney, 1982) R. Keeney, Decision Analysis: An Overview, Operations Research 30, pp. 803-838, 1982.
(Keeney and Raiffa, 1976) R. Keeney and H. Raiffa, Decisions with Multiple Objectives: Preferences and Value Tradeoffs, Wiley, New York, 1976.
(Levi, 1974) I. Levi, On Indeterminate Probabilities, Journal of Philosophy 71, pp. 391-418, 1974.
(Luce and Raiffa, 1957) R. D. Luce and H. Raiffa, Games and Decisions: Introduction and Critical Survey, John Wiley and Sons, 1957.
(Malmnäs, 1990) P-E. Malmnäs, Axiomatic Justification of the Utility Principle, HSFR 677/87, 1990.
(March and Simon, 1958) J. G. March and H. Simon, Organizations, Wiley, New York, 1958.
(Markowitz, 1952) H. Markowitz, The Utility of Wealth, Journal of Political Economy 60, pp. 151-158, 1952.


(Menger, 1934) K. Menger, Das Unsicherheitsmoment in der Wertlehre, Zeitschrift für Nationalökonomie 5, pp. 459-485, 1934.
(Morgan and Henrion, 1990) M. G. Morgan and M. Henrion, Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, Cambridge University Press, 1990.
(von Neumann and Morgenstern, 1947) J. von Neumann and O. Morgenstern, Theory of Games and Economic Behaviour, 2nd ed., Princeton University Press, 1947.
(Raiffa, 1968) H. Raiffa, Decision Analysis: Introductory Lectures on Choices under Uncertainty, Random House, 1968.
(Ramsey, 1931) F. P. Ramsey, Truth and Probability, in The Foundations of Mathematics and other Logical Essays, 1931.
(Resnik, 1987) M. D. Resnik, Choices: An Introduction to Decision Theory, University of Minnesota Press, Minneapolis, 1987.
(Riabacke, 2002) A. Riabacke, Computer Based Prescriptive Decision Support, Department of Information Technology and Media, Mid Sweden University, Fibre Science and Communication Network Report No. R-02-33, 2002.
(Roy, 1991) B. Roy, The Outranking Approach and the Foundations of the ELECTRE Methods, Theory and Decision 31, pp. 49-73, 1991.
(Saaty, 1980) T. L. Saaty, The Analytic Hierarchy Process, McGraw-Hill, 1980.
(Savage, 1972) L. Savage, The Foundations of Statistics, 2nd ed., Dover, 1972.
(Shachter, 1986) R. D. Shachter, Evaluating Influence Diagrams, Operations Research 34, pp. 871-882, 1986.
(Shapira, 1995) Z. Shapira, Risk Taking: A Managerial Perspective, Russell Sage Foundation, 1995.
(Simon, 1955) H. Simon, A Behavioral Model of Rational Choice, Quarterly Journal of Economics 69, pp. 99-118, 1955.
(Smith, 1961) C. A. B. Smith, Consistency in Statistical Inference and Decision, Journal of the Royal Statistical Society, Series B 23, pp. 1-25, 1961.


(Tversky and Kahneman, 1986) A. Tversky and D. Kahneman, Rational Choice and the Framing of Decisions, Journal of Business 59/2, pp. 251-278, 1986.
(Vincke, 1992) Ph. Vincke, Multicriteria Decision Aid, John Wiley and Sons, Chichester, 1992.
(Wald, 1950) A. Wald, Statistical Decision Functions, John Wiley and Sons, 1950.
(Walley, 1991) P. Walley, Statistical Reasoning with Imprecise Probabilities, Chapman and Hall, 1991.
(Walley, 1997) P. Walley, Imprecise Probabilities, The Imprecise Probabilities Project, http://ippserv.rug.ac.be, 1997.
(Weichselberger, 1999) K. Weichselberger, The Theory of Interval-Probability as a Unifying Concept for Uncertainty, Proc. of ISIPTA'99, Ghent, 1999.
(Weichselberger and Pöhlmann, 1990) K. Weichselberger and S. Pöhlmann, A Methodology for Uncertainty in Knowledge-Based Systems, Springer-Verlag, New York, 1990.
(von Winterfeldt and Edwards, 1986) D. von Winterfeldt and W. Edwards, Decision Analysis and Behavioral Research, Cambridge University Press, 1986.
(Wynne, 1992) B. Wynne, Uncertainty and Environmental Learning: Reconceiving Science and Policy in the Preventive Paradigm, Global Environmental Change 2, pp. 111-127, 1992.

Index
Add alternatives, 72
Add nodes, 74
affine transformation, 97, 121
alternative
    define, 15
Alternative Properties, 53
Automatic update, 83
axiom
    probability, 95
    utility theory, 105, 107
axiom system, 104
Bayes theorem, 96
Bayesian decision analysis, 103
cardinal ranking, 32
Cardinal ranking, 80
Cardinal Ranking, 62, 67
Cardinal ranking – all alternatives, 80
Cardinal Ranking – All Criteria, 67
Choose alternative, 76
Close, 51, 52, 81
command
    Alternative Properties, 53
    Automatic update, 83
    Cardinal Ranking, 60
    Cardinal Ranking – All Criteria, 60
    Close, 51
    Critical Probabilities/Values, 60
    Edit, 80
    Exit, 51, 52
    Expected Value Graph, 60
    Expected Value Graph – All Criteria, 60
    Export tree to JPEG-format, 51
    Extreme Values, 60
    File, 80
    Hide/Show all Evaluation Windows, 58
    New, 51
    Open, 51
    Overview, 58
    Page setup, 51
    Preference Order, 60
    Print tree, 51
    Probability Templates, 59
    Redo, 53
    Risk Profile, 60
    Save, 51
    Save as, 51
    Security Thresholds, 60
    Set Background Color, 53
    Set Value Scale, 53
    Settings, 70
    Total Ranking, 60
    Total Ranking – All Criteria, 60
    Undo, 53
    Update, 58, 80, 83
    Update Tree, 58
    Value/Weight Relations, 53
    View, 80
Compare positive graphs, 81, 82
conditional independence, 96
conditional probability, 96
consequence
    define, 17
    label, 18
consequence node, 74
contraction, 30, 31, 36, 137
Contraction, 81
Convert to probability/decision node, 75
Copy branch, 76
Copy node, 76
Copy tree, 72
create a tree, 15
criteria, 112
Criteria, 37, 39, 45
    assign a tree, 41
    define, 37
    evaluate, 45
    weights, 39
criteria weights, 39
critical probabilities, 33
Critical probabilities/values, 80
Critical Probabilities/Values, 33, 65
critical values, 33
cumulative risk profile, 37, 64
Cumulative risk profile, 80
decision
    evaluate, 28, 29
decision analysis, 90, 92, 93, 128
decision frame, 30, 130, 131, 134, 137, 138
decision frames, 129
decision modelling, 99
decision process, 139
decision table, 99, 100
decision tree, 15, 123
decisions under certainty, 100
decisions under risk, 103
decisions under strict uncertainty, 101
Delete branch, 75
Disregard alternative, 76
Edit, 81
elicitation, 115
Evaluation, 60
Evaluation windows, 80
event node, 74
Exit, 52
expected utility, 90
expected value, 135
Expected value graph, 80
Expected Value Graph, 28, 45, 63, 64, 67
Expected Value Graph – All Criteria, 67
Expected value graph – all alternatives, 80
extreme values, 36
Extreme values, 80
Extreme Values, 67, 69
File, 80
File menu, 51
graph models, 123
Hide Sub-nodes, 73, 76
Hide window, 81, 82
Hide/Show all Evaluation Windows, 58
hull cut, 30
imprecise probability, 119
imprecise utility, 120
influence diagrams, 124, 127
install DecideIT, 50
installation, 8
interval constraint, 131
Kolmogorov-axioms, 95
maximax, 67
maximax utility criterion, 102
maximin, 67, 101
maximin utility criterion, 102
minimax, 101
minimax risk criterion, 102
model
    create, 51
most likely point, 59, 77, 78, 132, 138
Move branch, 75
multi attribute utility theory, 112
multi criteria decision analysis, 37, 112
multiple and conflicting objectives, 112
Multiple Decisions, 84
Node properties, 77
Node Properties, 73
Node Property Frame, 72
Numerical, 81
Open, 51, 73, 77
ordinal scale, 98, 100
ordinal value function, 100
orthogonal hull, 131
Overview, 58
pessimism-optimism index, 68, 101
PMEU, 28, 124
Preference order, 80
Preference Order, 69
principle of insufficient reason, 68, 101
principle of maximizing the expected utility, 28, 103, 104, 109
Print Model, 52
probability, 91, 95, 116, 133
    assign, 20
    conditional, 96, 124
    edit, 77
    elicitation, 116
    independent, 96
    qualitative, 108
probability base, 130, 133
probability theory, 95
qualitative probability, 108
Redo, 53
Regard alternative, 76
Reset y-scale, 81
risk attitude, 110
risk constraint, 111
risk premium, 110
Risk Profile, 64
save, 17
Save, 52
Save as, 18, 52
second-order beliefs, 121
security threshold, 60, 111, 134
security thresholds, 34
Security thresholds, 80
sensitivity analysis, 46, 128, 137, 138, 139
Set Background Color, 58
Set color, 81
Set Value Scale, 55
Set y-scale, 81
Settings, 70
Size, 81
start DecideIT, 15, 50
system requirements, 8
template, 23, 24
    define, 22
Templates, 23, 59
Tools, 70
total ranking, 31
Total ranking, 80
Total Ranking, 61, 62, 63, 66
Total ranking – all alternatives, 80
Total Ranking – All Criteria, 66
uncertainty, 118
Undo, 53
Update, 59, 83
Update Tree, 59
utility, 115
    elicitation, 115
utility base, 130
utility function, 97, 113
utility imprecise, 120
utility theory, 97, 104, 105, 107, 109
    axiom, 91, 104, 105, 107
    criticism, 109, 110
    independence, 113
value
    assign, 26
value base, 133
value relations, 42
Value Relations, 42, 44
Value Span, 68
Value/Weight Relations, 54
View, 58, 81
weight, 117
