
InBloom:

Building Trust for The Protected Sharing and Analysis of Student Data for Personalized Learning
Dr. John Henry Clippinger, MIT Media Lab / ID3

Abstract

Over the last several years there has been an explosion in the collection, analysis and monetization of personal data. This trend has spurred grassroots movements and regulators around the world to update antiquated privacy policies to reflect the realities of the present era. In education, the collection and analysis of student data presents special challenges. While potentially valuable in designing personalized learning programs, such data are taken without the consent and understanding of students and their families, and have the Orwellian potential for significant abuse and stigmatization. Privacy policies and practices formed 40 years ago, such as FERPA, have little credibility, let alone efficacy, in an era of sensors, Big Data, and the mobile Internet. InBloom's privacy policies, practices and partnerships have done little to assuage its critics, and significantly lag contemporary privacy policies in the U.S. and the EU. Hence, if InBloom is to achieve its exemplary goals of personalized learning, it will need to develop privacy policies and practices that are consistent with the Obama Administration's Consumer Privacy Bill of Rights, as well as with the concerns of numerous advocacy and stakeholder groups. This paper outlines five concrete steps that InBloom could undertake to quell its critics, address its shortcomings, and restore trust and credibility among its stakeholders in order to achieve its goals of personalized learning.

Introduction

Within just the last five years, there has been an explosion of online and mobile services that collect, analyze and monetize personal data. In one of its reports (Personal Data: The Emergence of a New Asset Class, January 2011), the World Economic Forum has called such personal data a new asset class and "the new oil," signifying its importance as a business resource that can be used to build a new global economy. Entrepreneurs, governments, enterprises, and NGOs are all pursuing this new oil with a vengeance and, in keeping with the metaphor, often with

little regard for the collateral, ecological side effects: the erosion of personal privacy and trust. With the imminent proliferation of billions of sensors (the "Internet of Things"), combined with the location and sensor data captured on over 5 billion mobile phones, no one will be invisible to the watchful eyes of data aggregators and analyzers. Given the ubiquity of anyone's data footprints and the ever-compounding power of machine learning and analytics, no one can opt out. Everyone is being tracked and analyzed. We are quickly transitioning to a global data ecology and economy where one's data is the equivalent of one's identity and reputation: the marker for how one is known and treated by the world at large. Proposals to legally require data brokers to "do not track" (i.e., not collect personal data) are about as realistic as abstinence vows as a policy to prevent teenage pregnancy; they work only for a conscientious few. The sad truth is that today's privacy policies are still wedded to principles and expectations shaped in the 1970s and 1980s, when there were no laptops, Internet, mobile phones, sensors or tablets, and data were relatively scarce and expensive. It was a wholly different era, in which the greatest harms were the unauthorized collection and use of personal data. It was also a time when regulators within the US and the EU believed that regulatory remedies could be timely and effective in protecting personally identifying information. But today digital data are essential to a growing spectrum of technologies and infrastructures that revolve around the use of large databases. While privacy violations from improper disclosures and use of data remain a problem, a whole new class of public and private harms may now result from inhibitions in the flows and use of data. In education, new personal learning analytics and techniques are absolutely dependent upon the collection of student data over time.
Failures to appropriately share data can result in catastrophic security breaches, epidemic outbreaks, medical failures, and public safety failures, let alone failed educational

opportunities. With the imminent advent of smart cities and of sensors to monitor cars, food, appliances, homes, pets, and children, how data are collected, shared and analyzed will effectively define our personal freedoms and the quality of civic life. In education, the collection, sharing, and analysis of personal data are essential to devising tailored learning programs and assessments. Better data mean better learning and educational innovation and advancement. Yet while the very intimate information revealed by personal data can enable better learning experiences, supervision and results, it also increases the vulnerability of students and their families. Testing data, teacher assessments, truancy data, health records, personality and aptitude assessments, residence data, police records: all can be used both to improve learning and to stigmatize and control the fate of students and their families. The more hands that touch the data, the greater the risk that the interests of the child and her family will not be put first. The many teachers, counselors, social workers and administrators who have some relationship with students are not always motivated to act, or even capable of acting, in their best interests. Such professionals have enormously varied competencies, commitments and understandings of what is best for a child. When these factors are blended with commercial interests and influences over a child's education, along with the institutional powers of large enterprises and government agencies, the potential for real harms and abuses escalates. In this context, simply relying upon assurances of professionalism and trust by educators, researchers, and administrators is inadequate. Parents already have too little trust in our educational institutions, much less our governmental ones.
The public is understandably skeptical that regulations such as the 1974 Family Educational Rights and Privacy Act (FERPA) will be followed, or will actually protect the interests of students and their families. Such regulations are often written with the

interests of educators, administrators and even private third parties in mind. In the case of breaches, the child and family are relatively powerless, and it is they who bear the lifelong burden of institutional failures. In short, educational data, especially children's data, are a precious, personal and vulnerable resource that cannot be handled credibly through traditional terms-of-service agreements, opaque privacy policies or disclosure agreements. FERPA is nearly forty years old and was written without contemplation of current data collection, protection, analysis, and dissemination methods. Furthermore, the type of student data now available (see Appendix) is not only inherently PII (Personally Identifiable Information), but potentially highly prejudicial and injurious to students and their families. Conventional regulations, even those of more recent vintage, such as the proposed Do-Not-Track Online Act of 2013, remain woefully inadequate means of assuring that people's privacy interests are rigorously protected. The daunting challenge facing programs such as InBloom, which function primarily through local schools and state educational institutions, is to establish themselves as legitimate, trusted stewards of student data in the eyes of parents, students, and advocacy groups.

Technology and Policy Trends Favor User Control and a New Deal on Data

It is hard to overstate how quickly mobile, sensor and digital technologies are changing the way data are being collected, analyzed and monetized. Virtually all forms of human activity (calls, purchases, personal movements, social and commercial interactions, texting, health, financial dealings) are being captured as data and analyzed. Large institutions with the capacity to assess the data can thereby make an astonishing assortment of predictions, commercial offers and assessments of markets, public behavior, social activities, and more.

Users and regulators have responded to this trend by seeking to give people greater control over their personal data through the creation of protected Personal Data Stores (PDS) for individuals. The Obama Administration has embraced this position through a broad array of data protection initiatives, including its consumer data protection policy, the guidelines issued under the National Strategy for Trusted Identities in Cyberspace (NSTIC), the Department of Commerce's Green Paper in 2011 and the Federal Trade Commission's privacy report in 2012. In the EU, data privacy commissioners have advocated a bill of rights for data, and the World Economic Forum in three annual reports has advocated user control over personal data. This shift towards user-centric control of personal data is also reflected in the rapid rise of personal data locker services such as Dropbox, Box, IDrive, Mega, SkyDrive, Singly, iCloud and many others. In a similar fashion, the U.S. Government has led initiatives that give citizens easier, more reliable access to their government data: the Green Button for utility data, the Blue Button for VA health data, and the MyData initiative launched by the Department of Education. This shift in norms is not only about giving people the right to control their data, but about enabling self-service, online management of personal and family affairs. Increasingly, we expect people to use their personal data to manage their personal affairs and to negotiate acceptable terms of service with retailers and online service providers. This trend will certainly grow stronger in educational services, from K-12 through postsecondary, in the future. Within the U.S., a major driver of these user-centric policies is distrust of governmental institutions in general, and especially a fear of or disdain for governmental overreach.
Whether this fear is warranted, or simply a symptom of other sociological and cultural factors (the pace of change, institutional dysfunction, the sheer complexity of contemporary technology), such attitudes are common on both the political left and right. Advocacy groups are quick to see potential dangers, however improbable or remote, without acknowledging the potential educational

benefits of data sharing, core standards and an active government role, even a local one. Such fears and distrust cannot be dispelled by bland assurances of "just trust us," PR campaigns, or even detailed citations of legal agreements and regulations. Rather, parents need to feel that they are indeed in control, especially when it comes to protecting and educating their children. This can only be achieved by giving them control in ways that they can understand, influence and trust directly and personally. They must be able to opt in and out of protective systems in simple and meaningful ways. They must be able to control the flow and use of their personal data, or that of their child, in ways that they understand. The presumption that parents can be made to defer to school authorities or third-party vendors to act in their child's best interests, or that data sharing can simply be mandated, no longer holds. The skepticism and distrust are all the more pronounced as the process for protecting privacy becomes more opaque and convoluted, and as the personal benefits appear more remote and abstract. Assurances based on federal or state protections are greeted with similar distrust.

InBloom's Privacy Policies and Assurances

"I consider InBloom Identity Theft. We need a class action lawsuit to protect students' privacy." -- Diane Ravitch, education policy expert

From its inception, InBloom's goal of collecting student data to improve educational success through personalized learning was greeted with significant skepticism in many quarters. Especially worrisome to its critics was InBloom's partnership with Rupert Murdoch's Wireless Generation. Leonie Haimson, co-founder of Parents Across America, makes these points:

The Gates Foundation, in association with Wireless Generation, a subsidiary of Rupert Murdoch's News Corporation, recently formed a private LLC called the Shared Learning Collaborative. This LLC will collect confidential student and teacher data provided to them by states throughout the country and, in some form, share it with vendors and other commercial enterprises. The purpose of this project is, at least in part, to help vendors develop and market their educational products. NYS and NYC, along with school districts in Colorado, Illinois, Massachusetts, and North Carolina, have agreed to participate in Phase I of this project, starting in late 2012, with Delaware, Georgia, Kentucky and Louisiana participating in Phase II soon after. This project provokes serious privacy concerns as to the security of this confidential information, and the lack of any parental consent in the decision to share it with the LLC. The concerns are intensified by the fact that News Corp has been charged with serious privacy violations, including phone and computer hacking and the bribing of public officials in the UK. The NY Post, another subsidiary of News Corp, recently provoked controversy by publishing teacher data reports based on student test scores in its paper, and running inflammatory articles about teachers who received low scores. There are also serious questions about the legality of this project. The US Dept. of Education has recently rewritten the regulations for FERPA, the Family Educational Rights and Privacy Act, to allow more liberal sharing of student data, especially for research purposes. The new regulations went into effect in January of 2012.

The public's growing distrust of business-as-usual approaches to privacy protection can be seen in the recent adverse reactions to InBloom's presentation at the SXSW conference in March 2013. This followed intense public criticism of InBloom for its privacy policies and relationships with third-party educational service vendors.
Educational, privacy and civil liberties activist groups across the country have challenged the legality and ethics of the InBloom program. Students and their families found little reassurance in the Reuters article describing the InBloom program (March 13, 2013), which was cited in Diane Ravitch's highly charged blog post, "Identity Theft." As Reuters wrote: Federal officials say the database project complies with privacy laws. Schools do not need parental consent to share student records with any school official who has a "legitimate educational interest," according to

the Department of Education. The department defines school official to include private companies hired by the school, so long as they use the data only for the purposes spelled out in their contracts. The database also gives school administrators full control over student files, so they could choose to share test scores with a vendor but withhold social security numbers or disability records.

Indeed, it could be argued that the Department of Education's privacy regulations privilege the interests of school administrators and third parties more than the privacy interests of students and their parents. Recent modifications to the FERPA regulations do little to dispel such concerns, given their legal obfuscation and self-serving bureaucratese. It is hard to imagine a typical parent being reassured if presented with the following text:

Notice of Proposed Rulemaking

In the NPRM, we proposed regulations to: Amend § 99.3 to define the term authorized representative to include individuals or entities designated by FERPA-permitted entities to carry out an audit or evaluation of Federal- or State-supported education programs, or for the enforcement of or compliance with Federal legal requirements related to these programs (audit, evaluation, or enforcement or compliance activity); Amend the definition of directory information in § 99.3 to clarify that a unique student identification (ID) number may be designated as directory information for the purposes of display on a student ID card or badge if the unique student ID number cannot be used to gain access to education records except when used in conjunction with one or more factors that authenticate the user's identity, such as a Personal Identification Number, password, or other factor known or possessed only by the authorized user; Amend § 99.3 to define the term education program as any program principally engaged in the provision of education, including, but not limited to, early childhood education, elementary and secondary
education, postsecondary education, special education, job training,

career and technical education, and adult education; Amend § 99.31(a)(6) to clarify that FERPA-permitted entities are not prevented from redisclosing PII from education records as part of agreements with researchers to conduct studies for, or on behalf of, educational agencies and institutions; Remove the provision in § 99.35(a)(2) that required that any FERPA-permitted entity must have legal authority under other Federal, State, or local law to conduct an audit, evaluation, or enforcement or compliance activity; Amend § 99.35(a)(2) to provide that FERPA-permitted entities are responsible for using reasonable methods to ensure that their authorized representatives comply with FERPA; Add a new § 99.35(a)(3) to require that FERPA-permitted entities must use a written agreement to designate an authorized representative (other than an employee) under the provisions in § 99.31(a)(3) and § 99.35 that allow the authorized representative access to PII from education records without prior written consent in connection with any audit, evaluation, or enforcement or compliance activity; Add a new § 99.35(d) to clarify that in the event that the Department's Family Policy Compliance Office (FPCO or Office) finds an improper redisclosure in the context of § 99.31(a)(3) and § 99.35 (the audit or evaluation exception), the Department would prohibit the educational agency or institution from which the PII originated from permitting the party responsible for the improper disclosure (i.e., the authorized representative, or the FERPA-permitted entities, or both) access to PII from education records for a period of not less than five years (five-year rule); Amend § 99.37(c) to clarify that while parents or eligible students (students who have reached 18 years of age or are attending a postsecondary institution at any age) may opt out of the disclosure of directory information, this opt out does not prevent an educational agency or institution from requiring a student to wear, display, or disclose a student
ID card or badge that exhibits directory information; Amend § 99.37(d) to clarify that educational agencies or institutions may develop policies that allow the disclosure of directory information only to specific parties, for specific purposes, or both; and Add § 99.60(a)(2) to authorize the Secretary to take appropriate actions to enforce FERPA against any entity that receives funds under any program administered by the Secretary, including funds provided by

grant, cooperative agreement, contract, subgrant, or subcontract.

This language is impenetrable at best to any layman. (Reader: I doubt that you actually read it all or understood it to be a legal standard.) Such text is insider legal jargon that is self-referential, self-protective and not transparent to those whom it most affects. This has been widely noted by highly respected organizations such as the ACLU, the Electronic Privacy Information Center, Citizens for Public Schools, and the Massachusetts PTA. Below is the testimony of Josh Golin of the Campaign for a Commercial-Free Childhood:

Commissioner Chester assures that all parties involved will be obligated to comply with the Family Educational Rights and Privacy Act (FERPA). Yet critics have charged that the U.S. Department of Education's 2011 changes to FERPA violate the original intent of the law. Recently, the Electronic Privacy Information Center filed suit against the DOE over these changes to FERPA. Commissioner Chester's letter also did not reference the Federal Trade Commission's recent changes to the Children's Online Privacy Protection Rule. These changes restrict the capture and use of a child's personally identifiable information, in recognition of the huge risks to safety and privacy that occur when commercial entities obtain access to it. Given the above concerns, we believe it is imperative that parental consent be obtained before any child's data is shared with InBloom or any private corporation. We also request that you make public the types of data that have been or will be collected from students in Everett as part of the initial pilot.

When one consults the InBloom website to review its privacy policy, the language and approach are minimal, vague, perfunctory and hardly reassuring.
More significantly, as noted in Golin's testimony, the InBloom privacy policy and its deference to FERPA do not acknowledge the more recent policies of the Obama Administration, such as the FTC report, Protecting Consumer Privacy in an Era of Rapid Change, or the White House report, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy.

These and other reports espouse a "privacy by design" approach to enforce a new Consumer Privacy Bill of Rights (Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy, White House, p. 1, 2012) that endorses:

Individual Control: Consumers have a right to exercise control over what personal data companies collect from them and how they use it.

Transparency: Consumers have a right to easily understandable and accessible information about privacy and security practices.

Respect for Context: Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data.

Security: Consumers have a right to secure and responsible handling of personal data.

Access and Accuracy: Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate.

Focused Collection: Consumers have a right to reasonable limits on the personal data that companies collect and retain.

Accountability: Consumers have a right to have personal data handled by companies with appropriate measures in place to assure they adhere to the Consumer Privacy Bill of Rights.

These general principles for the collection and use of personal data, including student data, present significant challenges in accommodating legitimate research and commercial uses of personal data with privacy protections. If one looks at the scope and type of data that InBloom intends to collect (see Appendix), it is as comprehensive, and as latent with potential for abuse, as any medical record. Within the InBloom website's Privacy Policy there is no mention of focused collection, respect for context, security or accountability. Nor is there any acknowledgement of the Fair Information Practice Principles (FIPPs) or of role-based permissions that are contingent on the purpose, retention and use of student data. Such student data are easily re-identifiable (see Professor Latanya Sweeney,

Policy and Law: Identifiability of de-identified, 2013), and given the likely lack of budget, training, sophistication, and enforcement within public education institutions, the likelihood of security breaches and leaks will be unacceptably high. What thin protections do exist will only be further eroded by the strong financial incentives of third parties to monetize the data. The fact that InBloom management comes from the very industry that seeks to benefit from the monetization of the data, as bluntly described in the Reuters article, does little to quell parental and activist concerns. Given the groundswell of opposition to InBloom's data collection, sharing, monetization, and privacy policies, it is hard to see how, with its current privacy policies, it can achieve its exemplary goal of providing evidence-based personalized learning. The service is not likely to win consumer acceptance without a major overhaul of its privacy policies and extensive dialogue and trust-building with key stakeholder groups. The project needs a comprehensive legal, policy, and technical framework that conforms to current standards and expectations (e.g., the FTC and White House policy reports), recognizes the right of students and their families to control their educational data, and provides demonstrable systems of transparency and accountability.

Proposed Course of Action and Remedy for InBloom Privacy Issues

This problem will not go away and cannot be finessed. It threatens to undermine the overall goals of the InBloom program. It needs to be dealt with swiftly and openly, and it needs to squarely address the legitimate concerns of parents, students, activists and other stakeholders. I would recommend a concerted effort to develop a transparent and accountable privacy policy along the lines of privacy by design, the Consumer Privacy Bill of Rights and the NSTIC trust frameworks, tailored to meet the needs of all students and their families.
This will entail not only drafting new legal language, but

developing and testing actual implementations of trusted platforms. Different trust frameworks and mechanisms for the protected sharing of personal data must be independently field-tested and evaluated. It would make practical sense to develop effective, reliable use cases that are credible for researchers, educators and third parties as well as for students and their families. Such use cases could be vetted by different stakeholders to assess how effectively they address their concerns. There may be a way of conducting field trials to address some stakeholders' fears and help restore credibility. ID3, in conjunction with the MIT Media Lab, has over the last two years been developing an open source software platform that gives individuals control over their personal data and provides a highly secure and auditable means for permissions- and policy-based sharing of highly sensitive personal data. As part of a project to help returning veterans identify and cope with depression and PTSD, the Defense Advanced Research Projects Agency (DARPA) funded the development of a trust framework for the collection, analysis and sharing of mobile sensor data. This platform is now being used in test trials by Telefónica and Telecom Italia in Trento, Italy, as part of field trials for the protected sharing and analysis of mobile data to offer new urban and other services. The Open Mustard Seed (OMS) version of the platform is now being developed to support Quantified Self and other applications using location and sensor data. This system uses trust frameworks and rule-based permission engines to enforce context-, age-, and jurisdiction-sensitive data-sharing rules. OMS is also designed to express and enforce different governance and enforcement agreements, such as audit logs detailing access to data, the enforcement of permissions, and the resolution of disputes.
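To make the idea of a rule-based permission engine concrete, the following is a minimal sketch of how context-, age-, and jurisdiction-sensitive data-sharing rules with an audit log might be expressed. It is an illustration in the spirit of the OMS design described above, not the actual OMS API; all names, rule sets, and thresholds here are hypothetical assumptions.

```python
from datetime import datetime, timezone

# Hypothetical sketch of a deny-by-default, rule-based permission engine
# with an append-only audit log. Not the actual OMS API.

AUDIT_LOG = []  # every access decision is recorded, allowed or not

def log_decision(request, allowed, reason):
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "request": request,
        "allowed": allowed,
        "reason": reason,
    })

def rule_purpose(request):
    # Respect for context: data may only be used for the purpose it was shared for.
    if request["purpose"] not in {"instruction", "assessment"}:
        return "purpose not permitted"

def rule_minor_consent(request):
    # Age-sensitive rule: a minor's records require recorded parental consent.
    if request["subject_age"] < 18 and not request["parental_consent"]:
        return "minor without parental consent"

def rule_jurisdiction(request):
    # Jurisdiction-sensitive rule: release only within participating states
    # (an illustrative list, not an actual agreement).
    if request["jurisdiction"] not in {"NY", "MA", "CO", "IL"}:
        return "jurisdiction not covered by agreement"

RULES = [rule_purpose, rule_minor_consent, rule_jurisdiction]

def authorize(request):
    # Deny by default: any rule that returns a reason vetoes the request.
    for rule in RULES:
        reason = rule(request)
        if reason is not None:
            log_decision(request, False, reason)
            return False
    log_decision(request, True, "all rules satisfied")
    return True
```

For example, a teacher's request for instructional purposes would pass all three rules, while an identical request for a "marketing" purpose would be refused, and both decisions would land in the audit log for later review.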
In short, a version of OMS may be highly useful in testing different use cases to determine how trust and confidence might be restored to the InBloom endeavor through highly demonstrable, transparent and testable means.

Prospective Next Steps for Restoring Trust:

1. Develop a Comprehensive Privacy and Data Sharing Framework for Student Data Reflecting Best Practices: As noted earlier, the InBloom privacy policy as represented on its website is neither transparent nor reflective of the latest or best privacy practices. Moreover, it is not sufficient to simply state policies, aspirations and assurances. It is important to have a technology architecture and appropriate security and privacy-protecting principles (authentication, authorization, permissions, auditing, etc.) that are aligned with the appropriate software components (see the Electronic Privacy Information Center, www.epic.org: OpenID Connect, OAuth 2.0, encryption, data minimization, anonymization, role-based permissions, zero-knowledge proofs, ephemeral identifiers, independent audit logs, etc.). The objective is to express and enforce privacy-by-design principles in the data-sharing modules themselves.

2. Convene Stakeholders to Help Assess and Set Trust Framework Principles, Student Data Commons, and Identify Use Cases: The goal here is to engage the key stakeholders (researchers, students, parents, teachers, administrators, third parties, activists, regulators) and learn about their requirements, aspirations, fears and objections. InBloom should also work with stakeholders to identify use cases and criteria of success that, if achieved, would overcome users' objections and gain their approval; identify the key deal-breakers; and set design priorities, and even a phased-in process, for building credible acceptance. This can begin as an open process but should then evolve into a highly structured and rigorous process in which options, remedies and priorities can be articulated and agreed upon. In other words, reference use cases are needed to produce testable trials that would allow anyone to scrutinize and question the results.
It would be necessary to have a rough stakeholder consensus on measurable outcomes for success, not only in terms of user acceptance and privacy, but also in collecting, sharing, and analyzing data for successful learning analytics.
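As an illustration of the "ephemeral identifiers" and data-minimization components named in step 1, the sketch below shows one assumed scheme for per-study pseudonymous student IDs: a keyed hash of the real ID under a study-specific key, so that records are linkable within a study but unlinkable across studies once the key is discarded. This is a hypothetical design for discussion, not InBloom's or OMS's actual mechanism.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: ephemeral, pseudonymous student identifiers.
# The real student ID never leaves the school; researchers see only the
# pseudonym, and destroying the study key unlinks all pseudonyms.

def make_study_key():
    # A fresh random key per study or field trial.
    return secrets.token_bytes(32)

def pseudonym(student_id, study_key):
    # Same student -> same pseudonym within one study (so longitudinal
    # analysis still works), but pseudonyms from different studies cannot
    # be correlated without both keys.
    digest = hmac.new(study_key, student_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]
```

A researcher running two separate trials would receive two unrelated pseudonyms for the same child, which limits the damage of any single data set leaking.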

3. Manage Student Data as a Common Pool Resource and Conduct Field Trials of Use Cases in Representative Settings: The goal here is to apply principles from the Nobel laureate economist Elinor Ostrom, whose work on the management of common pool resources suggests ways that different stakeholders could forge appropriate legal agreements and social understandings for using a shared pool of data. Field trials could be undertaken that use mobile phones, tablets and PCs, and that reflect different kinds of likely settings and environments for the collection and use of student data. Depending upon the experimental design, the field trials could include thousands of users. Hence, there would need to be experimental designs that address the concerns of stakeholders and provide rigorous and replicable results.

4. Compile and Assess Results with Stakeholders: A meeting of stakeholders could be convened and the results presented and discussed. Out of that meeting would come guidelines for data collection, sharing and analysis, and proposals for additional research. It is hoped that the experiments would be sufficiently successful in key respects to identify near-term projects that could be undertaken as pilots.

5. Develop a Scalable, User-Centric Approach for the Protected Sharing of Student Data for InBloom Learning Objectives: Depending upon the success of the field trials and their reception by the different stakeholders, a follow-on goal would be to develop a scalable platform, model agreements and governance practices to be used by researchers, students, parents, third parties and school administrators in conducting their field trials and experiments. Such a platform would provide user control, audits and transparency throughout the course of developing and testing effective, personalized learning programs. It would also have specialized policies, potentially with safe harbor provisions, to enable exploratory research while at the same time providing anonymity and privacy.
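Given the re-identification risks raised earlier (Sweeney's work on de-identified data), a field trial along the lines of step 3 could run an automated k-anonymity check before any dataset is released to researchers or vendors. The sketch below is illustrative only; the choice of ZIP code, birth year, and gender as quasi-identifiers is an assumption, and a real deployment would need a vetted disclosure-review process, not just this check.

```python
from collections import Counter

# Illustrative k-anonymity gate for a field-trial release. A dataset is
# k-anonymous (with respect to the chosen quasi-identifiers) when every
# combination of quasi-identifier values is shared by at least k records,
# so no individual stands out within their group.

def is_k_anonymous(records, quasi_identifiers, k):
    groups = Counter(
        tuple(record[field] for field in quasi_identifiers)
        for record in records
    )
    return all(count >= k for count in groups.values())
```

A release script could refuse to export any extract for which this check fails, forcing coarser generalization (e.g., age bands instead of birth years) until the threshold is met.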

Conclusion

We are living in a data-rich environment where the trusted collection and use of personal data is both an economic necessity and a basic human right. This poses unprecedented challenges in moving forward. New ways must be developed to give individuals and families meaningful control over their personal data, both in protecting their privacy and in giving them new opportunities to use their data as they see fit. This principle applies especially to the use of student data. In this context, as technologies and the social negotiations about their use evolve, it is untenable for businesses to rely on historic standards or government policies alone. Educators, students, parents, researchers and third parties are increasingly distrustful of opaque privacy agreements and the assurances of state and Federal regulators. Future innovation in this field therefore requires that the educational community take the lead in pioneering new best practices in privacy-by-design and safeguards for the data rights of students and their families. Moving in this direction will require convening all stakeholders in an open and continuous process that blends experimentation and validation of privacy-protection practices with important educational research goals. To be a leader in learning analytics and the design of effective personalized learning, InBloom will also need to become a thought leader and advocate for the privacy and data rights of students and their families.


Bibliography

Clippinger, John Henry, A Crowd of One: The Future of Individual Identity, Public Affairs, Perseus, 2007.
DARPA, Detection and Computational Analysis of Psychological Signals (DCAPS), http://www.darpa.mil/Our_Work/I2O/Programs/Detection_and_Computational_Analysis_of_Psychological_Signals_%28DCAPS%29.aspx
de Montjoye, Y.-A., Wang, S., and Pentland, A., "On the Trusted Use of Large-Scale Personal Data," IEEE Data Engineering Bulletin, 35-4 (2012), http://sites.computer.org/debull/A12dec/issue1.htm
Department of Commerce, Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Framework, 2012.
Electronic Privacy Information Center (EPIC), www.epic.org
Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policy Makers, March 2012.
InBloom, https://www.inbloom.org
Lohr, Steve, "Big Data Is Opening Doors, but Maybe Too Many," The New York Times, Business Day Technology, March 23, 2013.
Open Mustard Seed, http://idcubed.org/open-platform/platform/
Ostrom, Elinor, Gardner, Roy, and Walker, James, eds., Rules, Games, and Common-Pool Resources, Ann Arbor: University of Michigan Press, 1994.
Ostrom, Elinor, and Hess, Charlotte, eds., Understanding Knowledge as a Commons: From Theory to Practice, Cambridge, MA: The MIT Press, 2006.
Pivato, Marco, "Il tuo smartphone ti osserva e studia cosa fai e cosa pensi. Test a Trento in collaborazione con il Mit" ["Your smartphone watches you and studies what you do and what you think. A trial in Trento in collaboration with MIT"], La Stampa, TuttoScienze, April 17, 2013.
PURE, "CPS set to sign away student privacy," May 22, 2012, http://pureparents.org/?tag=ferpa-gates-foundation-murdoch
Ravitch, Diane, Diane Ravitch's Blog, http://dianeravitch.net/2013/04/07/is-inbloom-engaged-in-identity-theft/
Simon, Stephanie, "K-12 database jazzes tech startups, spooks parents," March 3, 2013.
Sweeney, Latanya, "Policy and Law: Identifiability of De-Identified Data," 2013, http://latanyasweeney.org/work/identifiability.html
Telecom Italia SKIL Lab, Mobile Territorial Lab (MTL), http://skil.telecomitalia.com/index.php?option=com_content&view=article&id=96%3Amtlproject&catid=35%3Acatprogetti&Itemid=68
The Telefónica M2M Team, "Telefónica and Telecom Italia collaborate in Smart City innovation projects in Trento," October 31, 2012, http://blog.digital.telefonica.com/?press-release=telefonica-and-telecom-italia-collaborate-in-smart-city-innovation-projects-in-trento
Trento Mobile Territorial Lab (MTL), http://www.mobileterritoriallab.eu
White House, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy, February 2012.
White House, National Strategy for Trusted Identities in Cyberspace: Enhancing Online Choice, Security, Efficiency, Privacy, February 2010.
World Economic Forum, Personal Data: The Emergence of a New Asset Class, January 2011, http://www.weforum.org/reports/personal-data-emergence-new-asset-class
World Economic Forum, Rethinking Personal Data: Strengthening Trust, 2012, http://www.weforum.org/issues/rethinking-personal-data
World Economic Forum, Unlocking the Value of Personal Data: From Collection to Use, January 2013, http://www3.weforum.org/docs/WEF_IT_UnlockingValuePersonalData_CollectionUsage_Report_2013.pdf

APPENDIX: InBloom Data Types

All of the enumerations below are derived from the W3C data type token.

DisabilityType
A disability condition that describes a child's impairment.
Values: Autistic/Autism; Deaf-Blindness; Deafness; Developmental Delay; Emotional Disturbance; Hearing/Auditory Impairment; Infants and Toddlers with Disabilities; Mental Retardation; Multiple Disabilities; Orthopedic Impairment; Other Health Impairment; Speech or Language Impairment; Specific Learning Disability; Traumatic Brain Injury; Visual Impairment

DisciplineActionLengthDifferenceReasonType
Indicates the reason for the difference, if any, between the official and actual lengths of a student's disciplinary assignment.
Values: No Difference; Term Modified By District; Term Modified By Court Order; Term Modified By Mutual Agreement; Student Completed Term Requirements Sooner Than Expected; Student Incarcerated; Term Decreased Due To Extenuating Health-Related Circumstances; Student Withdrew From School; School Year Ended; Continuation Of Previous Years Disciplinary Action Assignment; Term Modified By Placement Program Due To Student Behavior While In The Placement; Other

CourseRepeatCodeType
Indicates that an academic course has been repeated by a student and how that repeat is to be computed in the student's academic grade average.
Values: RepeatCounted; RepeatNotCounted; ReplacementCounted; ReplacedNotCounted; RepeatOtherInstitution; NotCountedOther

AssessmentReportingMethodType
The method that the instructor of the class uses to report the performance and achievement of all students. It may be a qualitative method such as individualized teacher comments or a quantitative method such as a letter or numerical grade. In some cases, more than one type of reporting method may be used.
Values: Achievement/proficiency level; ACT score; Adaptive scale score; Age score; C-scaled scores; College Board examination scores; Composite Score; Composite Rating; Composition Score; Grade equivalent or grade-level indicator; Graduation score; Growth/value-added/indexing; International Baccalaureate score; Letter grade/mark; Mastery level; Normal curve equivalent; Normalized standard score; Number score; Pass-fail; Percentile; Percentile rank; Proficiency level; Promotion score; Ranking; Ratio IQ's; Raw score; Scale score; Standard age score; Standard error measurement; Stanine score; Sten score; Theta; T-score; Vertical score; Workplace readiness score; Z-score; Other; Not applicable; Quantile Measure; Lexile Measure; Vertical Scale Score; National College-Bound Percentile; State College-Bound Percentile

AssessmentCategoryType
The category of an assessment based on format and content; for example, achievement test, advanced placement test, alternate assessment/grade-level standards, attitudinal test, or cognitive and perceptual skills test.
Values: Achievement test; Advanced Placement; International Baccalaureate; Aptitude test; Attitudinal test; Benchmark test; Class test; Class quiz; College entrance exam; Cognitive and perceptual skills test; Developmental observation; English proficiency screening test; Foreign language proficiency test; Interest inventory; Manual dexterity test; Mental ability (intelligence) test; Performance assessment; Personality test; Portfolio assessment; Psychological test; Psychomotor test; Reading readiness test; State summative assessment 3-8 general; State high school subject assessment; State high school course assessment; State alternative assessment/grade-level standards; State alternative assessment/modified standards; State alternate assessment/ELL; State English proficiency test; Other

IncidentLocationType
Identifies where the incident occurred and whether or not it occurred on school grounds.
Values: On School; Administrative offices area; Cafeteria area; Classroom; Hallway or stairs; Locker room or gym areas; Restroom; Library/media center; Computer lab; Auditorium; On-School other inside area; Athletic field or playground; Stadium; Parking lot; On-School other outside area; Off School; Bus stop; School bus; Walking to or from school; Off-School at other school; Off-School at other school district facility; Online; Unknown

OldEthnicityType
Previous definition of Ethnicity combining Hispanic/Latino and Race.
Values: American Indian Or Alaskan Native; Asian Or Pacific Islander; Black, Not Of Hispanic Origin; Hispanic; White, Not Of Hispanic Origin

PersonalInformationVerificationType
The evidence presented to verify one's personal identity; for example, driver's license, passport, or birth certificate.
Values: Baptismal or church certificate; Birth certificate; Drivers license; Entry in family Bible; Hospital certificate; Immigration document/visa; Life insurance policy; Other; Other non-official document; Other official document; Parents affidavit; Passport; Physicians certificate; Previously verified school records; State-issued ID

ReasonNotTestedType
The primary reason a student is not tested; for example, absent, refusal by parent, refusal by student, medical waiver, illness, disruptive behavior, or LEP exempt.
Values: Absent; LEP exempt; LEP postponement; Not appropriate (ARD decision); Not tested (ARD decision); Alternate assessment administered; Parental waiver; Foreign exchange student waiver; Refusal by parent; Refusal by student; Medical waiver; Disruptive behavior; Previously passed the examination; Other

RelationType
The nature of an individual's relationship to a student.
Values: Adopted daughter; Adopted son; Adoptive parents; Advisor; Agency representative; Aunt; Brother, half; Brother, natural/adoptive; Brother, step; Brother-in-law; Case Worker, CPS; Court appointed guardian; Cousin; Daughter; Daughter-in-law; Dependent; Doctor; Employer; Emergency Contact; Family member; Father's significant other; Father, foster; Father; Father, step; Father-in-law; Fiance; Fiancee; Former husband; Former wife; Foster daughter; Foster parent; Foster son; Friend; Granddaughter; Grandparent; Great Grandparent; Grandson; Great aunt; Great uncle; Guardian; Husband; Life partner; Life partner of parent; Minister or priest; Mother's significant other; Mother, foster; Mother; Mother, step; Mother-in-law; Nephew; Niece; None; Other; Parent; Partner; Partner of parent; Probation officer; Sibling; Sister, half; Sister, natural/adoptive; Sister, step; Sister-in-law; Son; Son-in-law; Spouse; Stepdaughter; Stepson; Stepsibling; Uncle; Ward; Wife

ResponseIndicatorType
Indicator of the response; for example, nonscorable response, ineffective response, effective response, or partial response.
Values: Nonscorable response; Ineffective response; Effective response; Partial response

RestraintEventReasonItemType
The items of categorization of the circumstances or reason for the restraint.
Values: Imminent Serious Physical Harm To Themselves; Imminent Serious Physical Harm To Others; Imminent Serious Property Destruction

SeparationReasonType
Reason for terminating the employment; for example, employment in education, employment outside of education, retirement, family/personal relocation, or change of assignment.
Values: Employment in education; Employment outside of education; Retirement; Family/personal relocation; Change of assignment; Formal study or research; Illness/disability; Homemaking/caring for a family member; Layoff due to budgetary reduction; Layoff due to organizational restructuring; Layoff due to decreased workload; Discharge due to unsuitability; Discharge due to misconduct; Discharge due to continued absence or tardiness; Discharge due to a falsified application form; Discharge due to credential revoked or suspended; Discharge due to unsatisfactory work performance; Death; Personal reason; Lay off due to lack of funding; Lost credential; Unknown; Other

StaffIdentificationSystemType
A coding scheme that is used for identification and record-keeping purposes by schools, social services, or other agencies to refer to a staff member.
Values: Drivers License; Health Record; Medicaid; Professional Certificate; School; District; State; Federal; Other Federal; Selective Service; SSN; US Visa; PIN; Canadian SIN; Other

WeaponItemType
The enumeration items for the types of weapon used during an incident.
Values: Firearm; Illegal Knife; Non-Illegal Knife; Club; Other Sharp Objects; Other Object; Substance Used as Weapon; Knife; Unknown; None; Other
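To show how a client application might consume enumerations like the InBloom data types in this appendix, the sketch below mirrors one of them, CourseRepeatCodeType, as a Python Enum and validates incoming token values before accepting a record. This is an illustration only: inBloom's actual schema expresses these as W3C token types rather than Python enums, and the validate_repeat_code helper is invented for the example.

```python
from enum import Enum

class CourseRepeatCodeType(Enum):
    """Illustrative mirror of the CourseRepeatCodeType enumeration:
    how a repeated course is computed in the student's grade average."""
    REPEAT_COUNTED = "RepeatCounted"
    REPEAT_NOT_COUNTED = "RepeatNotCounted"
    REPLACEMENT_COUNTED = "ReplacementCounted"
    REPLACED_NOT_COUNTED = "ReplacedNotCounted"
    REPEAT_OTHER_INSTITUTION = "RepeatOtherInstitution"
    NOT_COUNTED_OTHER = "NotCountedOther"

def validate_repeat_code(value: str) -> bool:
    """Return True if value is a legal CourseRepeatCodeType token."""
    try:
        CourseRepeatCodeType(value)
        return True
    except ValueError:
        return False
```

Rejecting out-of-vocabulary tokens at ingest, as validate_repeat_code does, is one simple way to keep shared student records consistent across the districts and vendors exchanging them.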
