Submitted by: Maciej Machulak
Social networks and other social applications are becoming increasingly important for a large part of society. Young and mature Internet users participate in social networks and exchange information about their personal or professional activities. They create connections with friends or other professionals. They share their personal information and digital content using various social applications.
Young people, in particular, have little knowledge about the technical complexities of social networks and other such applications. They have little understanding of the value of the information that they submit and share among their peers and other users of those applications. Personal information such as age, sex, telephone numbers or hobbies is often not perceived as valuable. Similarly, other digital content such as photos, short video clips or documents is viewed as any other information which can be made freely available to other users of social networks.
In reality, information submitted by users of social networks may be of great value to third parties. Personal information is often used for advertising purposes or can be abused by malicious users. Digital content, on the other hand, influences how a particular individual is perceived by others, be it employers or peers. As such, restricting access to information and ensuring one's privacy is a necessity and an active area of research.
Younger users of social networking applications may not be aware of the security and privacy issues described above. As such, they may expose too much information to their friends, which is not desirable. It is often the case that information is shared only with members with whom a person has a direct connection in a social network. However, to increase popularity by having many so-called friends, many users of those applications make connections with others even if they do not know them in person.
To prevent information leakage, parents often require insight into what information is submitted and shared. They can then restrict publishing of sensitive information. In order to be able to control information, parents need to be given usernames and passwords. This, however, is often perceived as too intrusive from the perspective of younger users.
In this scenario we discuss how User-Managed Access can be used to support parents in restricting what information their children publish. We present how younger users of social networking applications can benefit from our proposal. With our scenario we show how the User-Managed Access approach allows a user to delegate access control related tasks to other entities that may have a better understanding of the security requirements for the resources owned by that user.
Use Case: Delegating Access Management to Content on Social Applications (Pending)
Submitted by: Maciej Machulak
- Would it be possible for Alice not to be concerned with security and privacy issues and only take care of publishing her data on her favorite social networking application?
- Can Alice achieve that without revealing her username and password to the entity which should have access to her privacy and security settings?
- The Social Networking Application may allow users to add accounts which could be used to control security and privacy of data published by those users (e.g. Alice may create an additional account with a different username and password which she can hand over to her father; this account could be used by her father to change security and privacy settings for Alice). This approach, however, is not efficient: if Alice uses more than one Social Networking Application then she might need to create multiple accounts, and she will end up having security and privacy settings hosted in multiple places.
- Could Alice allow the Social Networking Application to delegate access control related tasks to a third party component? Can such component be under control of a different entity than Alice? Could such entity define arbitrary terms that must be met in order to access Alice’s Web resources hosted by her Social Networking Application?
Alice, a 14-year-old girl, wants to have an account on a popular social networking application, FaceSpace. She wants to create a network of friends with whom she can share photos and discuss her hobbies. She wants to keep in touch with them and does not want to be left behind with new technologies that her peers have been using for some time now.
When Alice sets up the account at FaceSpace she needs to provide a variety of information, including her age. When Alice states that she is 14, FaceSpace detects that she is very young and informs her that she will need parental control over all the information that she submits and wishes to share with other users of the application. This means that an adult (legal guardian) will be able to restrict information dissemination for information that Alice submits. However, Alice will still be able to share information if her access control rules do not contradict those specified by her legal guardian. To set up the parental control, Alice asks her father Bob for help with setting up her account and providing the required parental control functionality.
Bob is happy that his daughter will be able to communicate with her friends but he is concerned with what information will be released and how this information might be used by legitimate or malicious users. He knows that the social networking application FaceSpace has been certified to support parental control and allows third party UMA-based access control systems to be used for that purpose.
Bob is already using a specialized Authorization Manager – CopMonkey – for his own purposes. He uses this AM to define access control policies for his various online resources, such as documents and photos that he shares with friends and colleagues at work. He also protects his personal data (home address, telephone number) so that he can point other Web services to pull such data as required (e.g. during the registration process at different Web sites). Moreover, he protects his online calendar service using CopMonkey and has established the necessary relationships between his calendar feeds and his Visa payment service. He wants to be sure that Visa knows where he is and can correlate such data with data regarding his credit and debit card usage.
Bob decides that CopMonkey will be perfect for parental control over Alice's information. Therefore, he introduces this AM to FaceSpace (establishing a trust relationship between these two services). As such, he is able to restrict how different information published by his daughter is accessed by her friends or other users of the application. Bob sets up basic policies regarding sharing of such information:
(1) Alice's pictures and video clips can be shared with her friends only.
(2) If a request for a picture or a video clip comes from a user who does not have a connection with Alice but is a member of the same group as Alice, then that user needs to agree not to share the picture further. Moreover, he needs to provide a certified claim that he is a member of that group.
(3) Only friends of Alice can see her personal information such as telephone number or email address.
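For illustration only, Bob's three mandatory policies could be sketched as a small policy-evaluation routine. The rule format, function names, claim names and user identities below are all hypothetical; UMA itself does not prescribe any particular policy language.

```python
# Hypothetical sketch of Bob's three mandatory policies for Alice's resources.
# All names and the rule format are illustrative, not part of the UMA spec.

FRIENDS = {"jane", "tom"}                     # friends approved by Bob
GROUP_MEMBERS = {"alice", "patrick", "sara"}  # "Youth Sport" group members

def authorize(resource_type, requester, claims):
    """Return (allowed, missing_claims) for a request against Alice's data."""
    if resource_type in ("photo", "video"):
        if requester in FRIENDS:
            return True, []                      # policy 1: friends only
        if requester in GROUP_MEMBERS:
            # policy 2: group members must supply both claims
            needed = ["no-redistribution", "group-membership"]
            missing = [c for c in needed if c not in claims]
            return (not missing), missing
        return False, []
    if resource_type == "personal-info":
        return (requester in FRIENDS), []        # policy 3: friends only
    return False, []
```

For example, `authorize("photo", "patrick", [])` would deny access but report which claims are still required, matching the claim-gathering step described later in the scenario.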
When the account is set up and parental control has been configured, Alice is able to use FaceSpace just like any other user. She writes comments about her day and posts links to interesting movies. Additionally, she shares some of her photos and short video clips with her friends. She knows that her father is very concerned with her safety and the privacy of her information, and that only comments and links are automatically shared with her friends.
However, pictures and videos are only shared with a predefined set of her friends which was approved by her father. To extend the set and share such multimedia content with other users, Alice must ask for her father's consent.
Knowing about the security constraints imposed by her father, Alice decides to upload a picture from her birthday party. She wants to share it with all the friends who attended the party. When the picture is uploaded she clicks a share button to make a list of friends who should be able to access it. Up to this point, Alice performs sharing-related tasks just like any other user. However, once the 'Share' button is clicked, Alice is presented with information that her picture has been shared with Tom and Patrick only, as those two out of her list are considered trustworthy by her father. Sharing the picture with the rest of the group is subject to her father's approval.
Under the hood, the social networking application sends an access control policy request to the Authorization Manager configured by Bob. The picture is not shared unless a reply is sent back confirming that the policy, defined in the form of a list by Alice, is proper. This policy request waits within the Authorization Manager for Bob's consideration. As Bob checks his Authorization Manager on a daily basis, he sees that a new request for an access control policy has been received. He checks the resource that is shared (i.e. the picture of his daughter at her birthday party) and the possible consumers of this resource (i.e. the identities of his daughter's friends). The list seems fine to Bob apart from a single identity, that of his daughter's older friend who misbehaved at the party. Therefore, Bob removes that identity from the list and approves a new access control policy. When this happens, a notification is sent back to the social networking application that the policy for the picture has changed.
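The approval step above could be sketched, under heavy simplification, as a pending-policy queue inside the AM. The class and method names are hypothetical; the real host/AM exchange would happen over the UMA protocol rather than direct method calls.

```python
# Illustrative sketch (hypothetical API) of the approval step: the host
# submits Alice's proposed sharing list to the AM, where it waits for
# Bob's decision. Not actual UMA wire protocol.

class AuthorizationManager:
    def __init__(self):
        self.pending = {}    # resource -> proposed recipient list
        self.approved = {}   # resource -> effective recipient list

    def submit_policy(self, resource, recipients):
        """Host-side call: queue Alice's proposed policy for review."""
        self.pending[resource] = list(recipients)

    def review(self, resource, remove=()):
        """Custodian-side call: Bob may prune identities, then approve."""
        proposed = self.pending.pop(resource)
        effective = [r for r in proposed if r not in set(remove)]
        self.approved[resource] = effective
        return effective  # the host is notified of the changed policy

am = AuthorizationManager()
am.submit_policy("birthday-photo", ["tom", "patrick", "older-friend"])
effective = am.review("birthday-photo", remove=["older-friend"])
# effective == ["tom", "patrick"]
```

The key property shown is that the photo stays unshared while the request sits in `pending`, and the effective policy is whatever survives the custodian's review.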
When Alice logs in to her account at the social networking application she sees that her father approved her sharing list. This means that Alice's proposed access control policy has been validated by her father and applied to her picture. However, she notices that some identities have been removed from the list. She checks which of her friends have been removed and decides not to negotiate with her father. After all, she was mad at her friend for not acting properly at this very important event of hers. Her sharing options are therefore limited. She knows that she can share the information by herself but that she cannot override the higher-level sharing constraints imposed by her father.
She’s not sure how that works but leaves this issue to her father – after all he is more proficient in defining correct privacy and security settings than she is.
Alice uploads a photo from her birthday party and wishes to share it with some of the users of FaceSpace. When the upload is finished, she clicks on the “Share” link next to the photo and defines the users with whom the picture should be shared. Her sharing options include all of her friends and members of one of the groups that Alice participates in - "Youth Sport". She confirms her sharing choice by clicking on the "Apply Sharing Options" link.
Jane, one of Alice's best friends, heard that a new photo from the birthday party has been uploaded and wishes to see it immediately. She logs in to FaceSpace and clicks on Alice's photo album link. She can see a link to the newly uploaded photo. When she clicks the link, the application detects that a request has been made to a UMA-protected resource. Therefore, this access request is subject to access control by the configured Authorization Manager in addition to the internal mechanisms used at FaceSpace. Jane knows nothing about UMA and does not see the underlying protocol that is being executed by FaceSpace. She is not aware of the fact that FaceSpace, acting on her behalf, obtains an access token from Bob's CopMonkey (meeting the requirements of policy no. 1). She observes a very small delay before accessing the photo. This delay, however, is visible only when she accesses the photo for the first time.
Patrick does not have a connection with Alice but is a member of the "Youth Sport" group. He saw that a new picture has been uploaded by one of the group members. Therefore, he clicks on a link to see the photo. As in Jane's case, FaceSpace detects that a request has been made to a UMA-protected resource, so the access request is subject to access control by the configured Authorization Manager in addition to the internal mechanisms used at FaceSpace. Patrick gets redirected to a page presenting a short description of what just happened (i.e. that he is subject to additional access control mechanisms as defined in policy no. 2). He sees that he must agree not to distribute the photo and must confirm that he is a member of the "Youth Sport" group. He happily agrees and is redirected back to the photo hosted by FaceSpace. Just like Jane, he is not aware of any complexities of the UMA protocol and only knows that he had to agree to some simple terms. Thanks to the UMA protocol, CopMonkey collects the following claims: (a) one self-asserted by Patrick that he will not share the photo, and (b) a claim certified by FaceSpace that Patrick is a member of the "Youth Sport" group. Both claims are required to authorize FaceSpace to access Alice's photo on behalf of Patrick.
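The claims check behind policy no. 2 could be sketched as follows. The claim names, the dict-based "token", and the distinction between a self-asserted and a host-certified issuer are illustrative assumptions, not the UMA wire format.

```python
# Hypothetical sketch: CopMonkey issues a token only when both required
# claims are present and come from the expected issuer. Illustrative only.

def issue_token(claims):
    required = {
        "no-redistribution": "self-asserted",  # promised by Patrick himself
        "group-membership": "FaceSpace",       # certified by the host
    }
    for name, issuer in required.items():
        claim = claims.get(name)
        if claim is None or claim["issuer"] != issuer:
            return None  # refuse: a claim is missing or not properly certified
    return {"subject": "patrick", "resource": "birthday-photo", "scope": "read"}

patrick_claims = {
    "no-redistribution": {"issuer": "self-asserted"},
    "group-membership": {"issuer": "FaceSpace"},
}
token = issue_token(patrick_claims)  # both claims satisfied, token granted
```

With the token in hand, FaceSpace (acting as Requester on Patrick's behalf) can retrieve the protected photo; with either claim absent, no token is issued and access is denied.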
Patrick likes the photo very much and he decides to see who Alice is. Therefore, he clicks on her "Profile" link. This access request is also subject to parental control configured for FaceSpace. In this case, Patrick is denied access which conforms to Bob's policy no. 3.
Over time, Alice learns that allowing her father to have an impact on the security of the resources she shares with her friends is not a bad thing. She feels safe and knows that everything she submits to FaceSpace is secure. She also learns more about security and sees what information is prevented from being shared with her friends. In the future she hopes to make better security decisions by herself; at some point she will be fully responsible for controlling access to her resources.
Her father Bob is also happy, as he knows that his daughter can communicate with her friends in a safe and secure way. He checks his Authorization Manager on a daily basis, composes access control policies if any requests are sent, and defines terms as new photos and other resources are uploaded by his daughter's social networking application. Moreover, he audits all access requests and sees how Alice's friends access her photos and video clips. He hasn't noticed any abuse and is confident in whatever his daughter does. After all, he is fully responsible for her privacy and security and he puts much effort into ensuring that his daughter stays safe and still enjoys the benefits of social networking on the Web.
The architecture for a User-Managed Access for the provided scenario is depicted below.
A user delegates access control functionality for his resources to a component that is managed by a different entity. Therefore, the user is only concerned with creating and submitting resources online. Another entity (custodian) is then responsible for defining access control rules for those resources.
Actors in the described scenario are as follows:
- Primary Resource User – Alice
- Authorizing User – Bob
- Protected Resource #1 – Photo
- Protected Resource #2 - Personal Information
- Host - FaceSpace
- Requester – FaceSpace
- Requesting Party #1 – Jane
- Requesting Party #2 - Patrick
- Authorization Manager – CopMonkey
- The typical UMA Authorizing User is split into two roles: Primary Resource User and Authorizing User (Custodian)
- UMA serves the purpose of Mandatory Access Control (MAC) with regards to Discretionary Access Control (DAC). MAC policies are defined by Bob. They are unconditionally enforced and cannot be overridden by sharing options chosen by Alice.
- The same Web application acts as a Host and a Requester, externalizing part of its access control functionality to AM.
- Claims provided by the Requester must be certified by an authoritative body (e.g. Jane needs to provide a certified claim that she is a friend of Alice; such a claim is signed by FaceSpace).
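The MAC-over-DAC relationship noted above can be illustrated as a simple set intersection: Alice's discretionary sharing choices can only narrow, never widen, the audience permitted by Bob's mandatory policy. The names below are hypothetical and serve only to make the composition rule concrete.

```python
# Illustrative composition of mandatory (Bob, MAC) and discretionary
# (Alice, DAC) policies: the effective audience is the intersection,
# so Alice can narrow but never widen what Bob allows. Names are made up.

def effective_audience(mac_allowed, dac_requested):
    return mac_allowed & dac_requested

bobs_policy = {"jane", "tom", "patrick"}                # MAC: Bob's whitelist
alices_share_list = {"tom", "patrick", "older-friend"}  # DAC: Alice's choice

allowed = effective_audience(bobs_policy, alices_share_list)
# "older-friend" is dropped by the MAC layer; allowed == {"tom", "patrick"}
```

This mirrors the birthday-photo episode: the identity Bob rejects never appears in the effective policy regardless of what Alice requested.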
The following scenario shows how a user can delegate (a part of) access control functionality to a different user. In this case, the owner of a resource decides that a different entity (a custodian) will also be responsible for the security of their resources. The user is only concerned with producing and submitting content on the Web and may define sharing options for such content (e.g. Alice can upload a picture and define who sees it). However, it is the custodian who is responsible for ensuring that such content is well protected, by defining mandatory access control policies. Such MAC rules cannot be overridden by rules defined by the primary resource user (e.g. Bob may define a rule that Alice's photos can be shared with her friends only, which may restrict Alice's sharing options). It is then up to the custodian what access control rules are effectively applied to resources. An Authorization Manager in such a setting can be viewed as an access control module externalized from a Web application that is simply under the control of a different entity.
View of the actors presented in this scenario with regards to the generic architecture of a User-Managed Access is depicted below:
The presented diagram shows an Authorization Manager (1), a User (2), a Host (3), a Requester (4) and a Custodian (5). Such an Authorization Manager may serve the purpose of Mandatory Access Control.
A custodian can be fully responsible for defining access control policies and may be fully separated from the owner of the resources. In the described scenario, Bob could be the only entity that defines access control rules for Alice's resources, and Alice would only be concerned with producing and submitting these resources to FaceSpace. In such a case no direct interactions are needed between the primary resource user and the authorizing user. A primary resource user may not have any knowledge about the security that is applied to a resource. As such, he or she can focus on the main tasks related to producing a resource (e.g. writing a document, submitting a photo) and can leave applying security to those who have greater knowledge of security requirements and experience in defining security rules.
In the scenario, a particular Host could agree to establish its trust relationship only with a whitelisted AM. Such an AM would be able to certify that a particular Authorizing User is an adult and the legal guardian of the Primary Resource User. Bob needs to get involved in the registration process to establish the trust relationship between FaceSpace and CopMonkey. Another option would be for Bob to receive an email with instructions on introducing FaceSpace to the AM and to perform this introduction later. Before this happens (i.e. before resources submitted by Alice are protected by an adult), Alice can share resources by herself, but FaceSpace will not accept any liability for potential harm that Alice may suffer.