Security Policies and Security Models
J. A. Goguen and J. Meseguer
SRI International
Menlo Park CA 94025
CH1753-3/82/0000/0011 $00.75 © 1982 IEEE
pre-existing mathematical model. Most work in computer security has ignored the social contexts in which systems are actually used. However, different organizations have different security needs, and use their systems in different ways; in general, they have different security policies. Providing a model without understanding the needs of the community involved is unlikely to yield the most useful results. What is ultimately necessary is that the actual community of users should be satisfied that the system they are using is sufficiently secure in a sense which is appropriate for their particular purposes. Once the needs of a community have been understood, it may be possible to formalize those needs and to model their information processing system; at this point it will be meaningful and useful to provide proofs. In this, we disagree with [DeMillo, Lipton & Perlis 77], who seem to believe that the social process in itself can be adequate for security verification; see [Dijkstra 78] for related dialectics.

Thus, we envision a four stage approach: first, determine the security needs of a given community; second, express those needs as a formal requirement; third, model the system which that community is (or will be) using; and last, verify that this model satisfies the requirement. Only the last of these steps is purely mathematical, although the other steps have mathematical aspects; the remainder of this paper concentrates on such aspects.

Information flow techniques attempt to analyze what users (or processes, or variables) can potentially interfere with others, whereas we begin oppositely, by saying what users (or processes, or variables) must not interfere with others in order that some security policy hold. Information flow techniques generally proceed to form the transitive closure of a "potentially flows" relation, and thus end up with a large number of cases that actually cannot occur, but nonetheless must be analyzed. We hope to avoid this bog by using a more refined analysis based on noninterference and containing explicit information about the operations invoked by users.

Our work was in part inspired by the approach of [Feiertag 80] and [Feiertag, Levitt & Robinson 77]. Some differences are that we are more explicit and rigorous about our framework and assumptions, we treat the general dynamical case in which capabilities may be passed between users, and we consider arbitrary security policies, rather than just MLS. Our approach is related to that of [Rushby 81a], which also uses an automaton model. Differences are that [Rushby 81a] only considers the static case (no passing of capabilities) of a policy that separates processes by permitting them to communicate only through specified channels.

The reader who wants more background information on computer security should consult [Denning 82], [Landwehr 81], [Rushby 81b] or [Turn 81].
We define what it means for one set of users to be "noninterfering with" another; this formalizes the notion of information not flowing where it shouldn't, without assuming that information flow is necessarily transitive. We next provide a general definition of "security policy" and discuss what it means for a given capability system to satisfy a given security policy, thus defining the security verification problem. Finally, we suggest some methods for actually carrying out such verifications.

    do: S x U x C -> S ,

where C is the set of state changing commands and U is the set of users. It might be that if u is not permitted to perform a command c, then do(s,u,c) = s; such security restrictions can be implemented by consulting the capability table, and are here simply assumed to be built into the function do.

We will also assume that for a given state and user, we know what output (if any) is sent to that user. This aspect of the system can be described by a function

    out: S x U -> Out .
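The transition function do and the output function out can be given a concrete, purely illustrative reading in code. In the Python sketch below, the states, the capability table CAPT, the command names, and the effect of "write" are all hypothetical choices; the only point carried over from the text is that an unauthorized command leaves the state unchanged.

```python
# A toy state machine in the spirit of the functions do and out described
# above. All names and the table CAPT are hypothetical; the one modelled
# assumption from the text is that unauthorized commands have no effect.

CAPT = {("alice", "write"): True, ("bob", "write"): False}  # capability table

def do(s, u, c):
    """State transition do(s,u,c): apply c for u, or return s unchanged
    when the capability table does not authorize (u, c)."""
    if not CAPT.get((u, c), False):
        return s                                  # restriction built into do
    if c == "write":
        new = dict(s)
        new["f"] = new.get("f", "") + u[0]        # an arbitrary visible effect
        return new
    return s

def out(s, u):
    """Output out(s,u) sent to user u in state s: here, the contents of f."""
    return s.get("f", "<no file>")

s0 = {}
s1 = do(s0, "alice", "write")    # authorized: the state changes
s2 = do(s1, "bob", "write")      # unauthorized: do returns the state unchanged
assert s2 == s1 and out(s2, "bob") == "a"
```

Here out lets every user read f; a finer-grained example would differentiate outputs per user, which is what the noninterference definitions below exploit.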
In order to handle the dynamic case, in which what users are permitted to do can change with time, we will assume that in addition to the state machine features there are also "capability commands" that can change the capability table. The effects of such commands can be described by a function

    cdo: Capt x U x CC -> Capt ,

where Capt is the set of all possible capability tables, U is the set of all users, and CC is the set of capability commands. (If a user u is not allowed to perform the capability command c on the table t, then cdo(t,u,c) may be just t again; as before, this is determined by consulting the capability table.) In order to distinguish capability commands from state commands, in the following we will denote the set of all state commands by SC. For convenience in the following, let C = CC U SC, the set of all commands. So that the state transitions and the outputs can be checked for security against the capability table, we will add a capability table component to the state transition function and to the output function.

Adding all this to the definition of a state machine, and also adding an initial capability table, we get the following as our basic concept:

Definition 2: A capability system M consists of the following:

    A set U whose elements are called "users."

    A set S whose elements are called "states."

In particular, the combined transition function csdo leaves the state unchanged on capability commands:

    csdo(s,t,u,c) = (s,cdo(t,u,c))  if c is in CC.

As we have formulated capability systems, there are no commands that change both the state and the capability table; however, if it is desired to accommodate such commands for some application, our definition can be easily changed to do so.

The capability component of a capability system is itself a state machine, with state set Capt and input set U x CC; it models the way in which the capability table is updated, and includes such possibilities as passing and creating capabilities. We will call it the capability sub-machine of the capability system. The entire capability system is a cascade connection (in the sense of automaton theory, e.g. what is called "serial connection" in [Hartmanis & Stearns 66]) of this machine with another that models the processing of all the non-capability information, such as user files. In the following figure illustrating this cascade connection, the function Check returns the information from the capability table needed to determine whether or not a given command is authorized for a given user.

Let us call a subset of C = SC U CC an ability, and let Ab denote the set of all such subsets. Then it would make sense, for example, to let Capt = [U -> Ab], the set of all functions from U to Ab, so that a capability table would tell us exactly what ability each user actually has. However, our abstract model does not require any such particular representation. We will discuss some such representations later on.

We can now view a capability system as a state machine, with state space S x Capt, input space (U x C) and output space Out. We can therefore extend the system transition function in the classical way to strings of inputs, by defining

    csdo: S x Capt x (U x C)* -> S x Capt

with the equations

    csdo(s,t,NIL) = (s,t)

and

    csdo(s,t,w.(u,c)) = csdo(csdo(s,t,w),u,c)

for s in S, t in Capt, u in U, c in C, and w in (U x C)*, where NIL denotes the empty string and the "dot" denotes string concatenation. We will use the notation

    [[ w ]] = csdo(s0,t0,w)

for the "denotation" or effect of the input string w on states, starting from the initial state of the whole system.

We now recall one more concept from classical automaton theory, and then specialize it to the case at hand. A state s of a state machine M is reachable iff there is some w in (U x C)* such that [[ w ]] = s. But notice that the set of reachable states of a capability system will not in general be a Cartesian product of a set of (ordinary) states and a set of capability tables. However, the set of reachable states will always be the state set of a reachable submachine, the reachable capability subsystem of the given capability system.

3 Security Policies

Whereas the previous section presented our approach to modelling secure systems, this section presents our approach to defining security policies. The purpose of a security policy is to declare which information flows are not to be permitted. Giving such a security policy can be reduced to giving a set of noninterference assertions. Each noninterference assertion says that what one group of users does using certain commands has no effect on what some other group of users sees.

3.1 Static Policies

Let us begin with some auxiliary notation. Given a state machine M, let w be an element of (U x C)* and let u be a user. We define [[ w ]]u to be the "output to u after doing w on M," i.e.,

    [[ w ]]u = out([[ w ]],u) .

It is the "denotation of w from the point of view of user u." Note that M may have additional structure as a capability system. In that case, the state space has the form S x Capt, the set C of commands is a disjoint union CC U SC of state and capability table changing commands, and the state transition function is csdo. Thus, all the general definitions given below also apply to capability systems. However, it is simpler to state them for an arbitrary state transition system.

Our second auxiliary notation has to do with the selection of subsequences of user-command pairs.

Definition 3: Let G ⊆ U be a "group" of users, let A ⊆ C be an ability, and let w be in (U x C)*. Then we let P_G(w) denote the subsequence of w obtained by eliminating those pairs (u,c) with u in G. Similarly, we let P_A(w) denote the subsequence of w obtained by eliminating those pairs (u,c) with c in A. Combining these two, we let P_{G,A}(w) denote the subsequence of w obtained by eliminating the pairs (u,c) with u in G and c in A. []

For example, if G = {u,v} and A = {c1,c2}, then

    P_{G,A}( (u',c1)(u,c3)(u,c2)(v',c1) ) = (u',c1)(u,c3)(v',c1) ,

where u' and v' are other users and c3 is another command.

Now we are ready for the basic technical concept of noninterference, which we give in three different forms (we will later generalize all this to conditional noninterference).
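The string extension of csdo and the projection functions of Definition 3 are directly executable. The Python sketch below uses a hypothetical encoding of input strings as lists of (user, command) pairs, and replays the worked example following Definition 3.

```python
from functools import reduce

def csdo_star(csdo, s, t, w):
    """Extend a one-step transition csdo(s,t,u,c) -> (s',t') to a string w of
    (user, command) pairs: the empty string is the identity, and w.(u,c)
    first runs w and then takes one more step, as in the defining equations."""
    return reduce(lambda st, uc: csdo(st[0], st[1], uc[0], uc[1]), w, (s, t))

def purge_G(w, G):       # P_G: eliminate pairs whose user lies in G
    return [(u, c) for (u, c) in w if u not in G]

def purge_A(w, A):       # P_A: eliminate pairs whose command lies in A
    return [(u, c) for (u, c) in w if c not in A]

def purge_GA(w, G, A):   # P_{G,A}: eliminate pairs with u in G AND c in A
    return [(u, c) for (u, c) in w if not (u in G and c in A)]

# The worked example: G = {u, v}, A = {c1, c2}.
G, A = {"u", "v"}, {"c1", "c2"}
w = [("u'", "c1"), ("u", "c3"), ("u", "c2"), ("v'", "c1")]
assert purge_GA(w, G, A) == [("u'", "c1"), ("u", "c3"), ("v'", "c1")]

# csdo_star with a transition that merely logs its inputs, to show the shape:
assert csdo_star(lambda s, t, u, c: (s + [(u, c)], t), [], {}, w) == (w, {})
```

Note that (u,c2) is the only pair eliminated: u is in G but c3 is not in A, and u' and v' are not in G, exactly as in the text's example.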
Definition 4: Given a state machine M and sets G, G' of users, we say that G does not interfere with (or is non-interfering with) G', written G :| G', iff for all w in (U x C)* and all u in G',

    [[ w ]]u = [[ P_G(w) ]]u

holds for M.

Although we have stated these definitions for state machines, they apply immediately to capability systems because we have shown that capability systems are also state machines. In the following we will generally be applying these definitions to the case of dynamically changing capabilities as covered by capability systems.

Example 1: It follows from the above definition that if A :| {u}, then the commands in A have no effect whatsoever on the output seen by u. For example, if A is the ability to create, write, modify or delete a file F, then "A noninterfering with u" means that the information read from F by u cannot be changed by any commands in A. In particular, if F did not originally exist, then u will always be told that F doesn't exist, independently of what commands in A may actually have done. []

In this notation, MLS is expressed with a function Level from users to a simply ordered set of security levels: writing U[x,+oo] for the set of users u' with Level(u') >= x, and U[-oo,x) for the set of those with Level(u') < x, MLS requires that

    U[x,+oo] :| U[-oo,x)

for every level x.

The function Level might be stored in the capability component of the system, and it is not assumed to be necessarily constant. Or the capability component might be much more complex, containing information on who is allowed to change classifications, or to pass capabilities to change classifications. This will be discussed below.

Let us call a group G of users invisible (relative to the other users) iff G :| U - G ("invisible" seems appropriate for these users because they can see without being seen). It is very easy to express MLS using this notion:

    for every level x, U[x,+oo] is invisible.

Conditional noninterference assertions describe situations where passing information across boundaries is permitted, for example when information is declassified, or when it is controlled by discretionary access.
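Before the conditional case is developed, it is worth noting that Definition 4 and the MLS invisibility assertion can be checked by brute force on a small finite machine, by comparing [[ w ]]u with [[ P_G(w) ]]u over all short command strings w. The machine below is a hypothetical two-level example built for this sketch, not anything defined in the paper.

```python
from itertools import product

def noninterferes(machine, G, Gp, users, commands, max_len=3):
    """Brute-force check of Definition 4: G :| G' iff for every command
    string w (here, bounded by max_len) and every u in G', the output to u
    after w equals the output after purging G's pairs from w."""
    run, out = machine["run"], machine["out"]   # run(w) plays [[ w ]]
    pairs = list(product(users, commands))
    for n in range(max_len + 1):
        for w in product(pairs, repeat=n):
            purged = tuple(p for p in w if p[0] not in G)   # P_G(w)
            if any(out(run(w), u) != out(run(purged), u) for u in Gp):
                return False
    return True

# A hypothetical two-level machine: writes are stored per level, the low
# user sees only the low data, the high user sees both.
def run(w):
    state = {"hi": "", "lo": ""}
    for (u, c) in w:
        if c == "write":
            state["hi" if u == "hi" else "lo"] += u
    return state

def out(state, u):
    return state["lo"] if u == "lo" else state["hi"] + state["lo"]

M = {"run": run, "out": out}
# MLS-style invisibility: the high user does not interfere with the low one.
assert noninterferes(M, G={"hi"}, Gp={"lo"}, users=["hi", "lo"], commands=["write"])
# But the low user does interfere with the high one.
assert not noninterferes(M, G={"lo"}, Gp={"hi"}, users=["hi", "lo"], commands=["write"], max_len=2)
```

As expected, the high user is invisible to the low one, while the low user interferes with the high one: information may flow upward but not downward.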
The constraints implied by this picture include the following noninterference assertions:

    {d} :| {c} .

We will show in a future paper how to formally produce a complete set of noninterference assertions from an information flow diagram like that of Figure 2. For the moment, let us note that the left group of assertions are purely topological, while the right group encode information about the specific abilities mentioned in the graph. []

A conditional noninterference assertion, written G,A :| G' if P, requires that [[ w ]]u = [[ p(w) ]]u for all w and all u in G', where p is defined by

    p(NIL) = NIL
    p(o1 ... on) = o'1 ... o'n ,

with o'i = oi unless oi is a pair (u,a) with u in G and a in A for which the condition P holds at that point, in which case o'i is deleted. The point of the projection function p is to take account of the fact that there may be some subsequences in a long sequence of commands that are noninterfering, while others may be interfering.

The examples below give several conditional noninterference assertions and show how they apply to command sequences.

Example 7: Discretionary Access. In this example, we assume the existence of a function CHECK(w,u,c), which looks at the capability table in state [[ w ]] to see whether or not u is authorized to do command c; it returns true if he is, and false if not. We can then regard CHECK(u,c) as a predicate on command sequences w. One general policy that we wish to enforce for all users u and all commands c is

    (*) {u},{c} :| U if not CHECK(u,c),
where U is the set of all users. This just says that u cannot interfere using c if he does not have the capability to use c in that state.

Now let us consider the case where for each user u and command c, there is another command, denoted pass(u,c), which says to pass to u the ability to do c; of course, the user issuing this command may or may not be authorized to do so. We need a bit more notation. If w is a command sequence of the form w'.o, let previous(w) = w' and let last(w) = o. Then let us write CHECK(previous,u,c) for the predicate CHECK(previous(w),u,c) of w. Now the policy that we wish to enforce regarding use of the pass command is

    (**) {u},{c} :| U if [not CHECK(previous,u,c)] and
         [if CHECK(previous,u',pass(u,c)) then not last = (u',pass(u,c))] ,

for any user u'. Suppose, for example, that the commands d of a user u'' do not affect u's authorization to do c, i.e., that

    CHECK(w.(u'',d),u,c) = CHECK(w,u,c) .

Then, for example, (**) says that for any user v,

    [[ (u,c)(u',pass(u,c))(u'',d)(u,c) ]]v = [[ (u',pass(u,c))(u'',d)(u,c) ]]v ,

i.e., that the first instance of (u,c) in the command sequence has no observable effect. []

Example 8: Bailout Function. We now express a dynamic policy having one function that differs from standard MLS; the purpose of this example is of course to illustrate the use of our conditional noninterference assertion formalism, rather than to argue for or against any particular security policy. In this policy, there is a command B that when executed changes the level of the user to the lowest security level in an irrevocable manner. Thus, we assume a simply ordered set L of security levels, with bottom level "Unc" say, and with a function Level from users to levels. Then the policy is stated with the following noninterference assertions for each user u:

    {u} :| U[-oo,Level(u)) if CHECK(u,B),

where U[-oo,x) = { u' | Level(u') < x }, and

    U(Unc,+oo] :| {u} if not CHECK(u,B).

Specific systems, such as PSOS, can be modelled in our framework, by instantiating the various sets and functions involved in Definition 2. This can be done in many different specification formalisms, including the state machine approach of SPECIAL [Levitt, Robinson & Silverberg 79]; the first order logic decision procedure approach of STP [Shostak, Schwartz & Melliar-Smith 81]; the inductive definition over lists and numbers approach of [Boyer & Moore 80]; the parameterized procedure approach of CLEAR [Burstall & Goguen 77] or, in a more usable form, ORDINARY [Goguen & Burstall 80]; and the executable abstract data type/rewrite rule approach of OBJ [Goguen & Tardo 79].
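As a toy illustration of such an instantiation, in Python rather than in any of these specification formalisms, the sketch below builds a small capability system with a pass command and replays the pass example of (**) above. The command encoding, the table representation, and the log-style state are all hypothetical choices made for this example.

```python
# A hypothetical instantiation of Definition 2's sets and functions,
# illustrating the pass example: the first (u,c), issued before u holds
# the capability, has no observable effect.

def csdo(s, t, u, cmd):
    """One step of the cascade: capability commands update the table t,
    state commands update the state s only if the table authorizes them."""
    if isinstance(cmd, tuple) and cmd[0] == "pass":   # pass(tgt, c2)
        _, tgt, c2 = cmd
        if t.get((u, ("pass", tgt, c2)), False):
            t = {**t, (tgt, c2): True}                # grant tgt ability c2
        return (s, t)
    if t.get((u, cmd), False):                        # authorized state command
        s = s + [(u, cmd)]                            # the state is a visible log
    return (s, t)

def run(w, t0):
    """[[ w ]] from the empty log and initial capability table t0."""
    s, t = [], dict(t0)
    for (u, cmd) in w:
        s, t = csdo(s, t, u, cmd)
    return s

def out(s, v):
    return list(s)                # every user sees the whole log in this toy

t0 = {("u2", ("pass", "u", "c")): True}   # u2 may pass c to u; u may not yet do c
w_full = [("u", "c"), ("u2", ("pass", "u", "c")), ("u3", "d"), ("u", "c")]
w_purged = w_full[1:]                     # drop the unauthorized first (u,c)
assert out(run(w_full, t0), "v") == out(run(w_purged, t0), "v")
```

Here the irrelevant command d of u3 is simply unauthorized, so it trivially satisfies the CHECK-invariance assumption; only the second (u,c), issued after the pass, reaches the log.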
Another possibility is the formalism surveyed in [Snyder 81], one of the few which considers the passing of capabilities. We observe that this is a graph grammar formalism in the sense of [Kreowski, Rosen & Winkowski 78]; generally, it seems to be difficult to verify policies in this formalism.(6)

5 Verification of Security Policies

How can we verify that a security policy P is satisfied by a capability system M? For example, how can we verify that MLS has been correctly implemented in some version of PSOS? From a general point of view, this is a matter of verifying that the noninterference assertions in the policy are true of some particular abstract machine. This can be done by induction over the commands of the system. We have some hope that a great deal can be accomplished with purely syntactic checking of specifications for the operating system, as with the Feiertag MLS tool [Feiertag 80], because of the simple form of the assertions occurring in the security policy definitions. More detailed discussion of this will be the subject of a future report.

But, you may ask, how can we verify that some given code running on a given machine actually satisfies some policy? One approach is just to verify the policy for a high level specification, and then to verify that a lower level machine correctly implements the specification, perhaps through a sequence of intermediate abstract machines; this is essentially the approach of HDM [Levitt, Robinson & Silverberg 79].

6 Summary

This paper has described an approach to security which is based on:

(6) This observation seems to be new, and might be of some use for the further development of the formalism described in [Snyder 81], because of the existence of a considerable body of results on graph grammars, including pumping lemmas, decidability results, and normal forms.

References

[Bell & LaPadula 74]
Bell, D. E. and LaPadula, L. J. Secure Computer Systems: Mathematical Foundations and Model. Technical Report, MITRE Corporation, Bedford, MA, 1974.

[Boyer & Moore 80]
Boyer, R. and Moore, J. S. A Computational Logic. Academic Press, 1980.

[Burstall & Goguen 77]
Burstall, R. M. and Goguen, J. A. Putting Theories together to Make Specifications. Proceedings, Fifth International Joint Conference on Artificial Intelligence 5:1045-1058, 1977.

[DeMillo, Lipton & Perlis 77]
DeMillo, R. A., Lipton, R. J. and Perlis, A. J. Social Processes and Proofs of Theorems and Programs. In Proceedings, Fourth ACM Symposium on Principles of Programming Languages. ACM, 1977.

[Denning 82]
Denning, D. Cryptography and Data Security. Addison-Wesley, 1982.

[Dijkstra 78]
Dijkstra, E. W. On a Political Pamphlet from the Middle Ages. Software Engineering Notes 3(2):14-16, 1978.

[Feiertag 80]
Feiertag, R. A Technique for Proving Specifications are Multilevel Secure. Technical Report CSL-109, SRI International, 1980.

[Feiertag, Levitt & Robinson 77]
Feiertag, R. J., Levitt, K. N. and Robinson, L. Proving Multilevel Security of a System Design. In Proceedings, Sixth ACM Symposium on Operating Systems Principles, 1977.
[Goguen & Burstall 80]
Goguen, J. A. and Burstall, R. M. An Ordinary Design. Draft report, SRI International, 1980.

[Goguen & Tardo 79]
Goguen, J. A. and Tardo, J. An Introduction to OBJ: A Language for Writing and Testing Software Specifications. In Specification of Reliable Software, pages 170-189, 1979.

[Goguen, Thatcher & Wagner 78]
Goguen, J. A., Thatcher, J. W. and Wagner, E. An Initial Algebra Approach to the Specification, Correctness and Implementation of Abstract Data Types. In R. Yeh (editor), Current Trends in Programming Methodology, 1978. Originally published as IBM T. J. Watson Research Center Report RC 6487, October 1976.

[Guttag 75]
Guttag, J. V. The Specification and Application to Programming of Abstract Data Types. PhD thesis, University of Toronto, 1975. Computer Science Department, Report CSRG-59.

[Levitt, Robinson & Silverberg 79]
Levitt, K., Robinson, L. and Silverberg, B. The HDM Handbook. Technical Report, SRI International, Computer Science Lab, 1979. Volumes I, II, III.

[Neumann, Boyer, Feiertag, Levitt & Robinson 80]
Neumann, P. G., Boyer, R. S., Feiertag, R. J., Levitt, K. N. and Robinson, L. A Provably Secure Operating System: The System, Its Applications, and Proofs. Technical Report, SRI International, Computer Science Laboratory, 1980.

[Popek & Farber 78]
Popek, G. J. and Farber, D. A. A Model for Verification of Data Security in Operating Systems. Communications of the ACM, 1978.

[Rushby 81a]
Rushby, J. M. Proof of Separability: A Verification Technique for a Class of Security Kernels. Technical Report, Computing Laboratory, University of Newcastle upon Tyne, 1981.

[Rushby 81b]
Rushby, J. M. Verification of Secure Systems. Technical Report, University of Newcastle upon Tyne, 1981.

[Shostak, Schwartz & Melliar-Smith 81]
Shostak, Schwartz and Melliar-Smith. STP: A Mechanized Logic for Specification and Verification. Technical Report, Computer Science Lab, SRI International, 1981.