Privacy Green Paper Response
Advice on current situation: they are in a huge rush to get this turned around.
Advice about writing:
- Be clear - a staffer will be reading and summarizing the points.
- Keep it short - don't try to answer every question they ask, just the ones you care about.
- Make each point a heading, so the argument survives even if the body text is stripped out.
- Be careful about technology mandates (e.g., mandating OAuth specifically).
MAIN POINT: Companies that hold transactional data about people should be required to make it available with one-click authentication, so people can access their records and link them together.
Today, getting your own records out of a service takes something like six to twelve manual steps, repeated every month.
Dear Secretary Locke,
We are submitting these comments in response to the December 2010 Commerce Department Privacy Green Paper.
We represent a community of end-user advocates and technology innovators focused on individuals' rights to access and control their own personal data, and on the business and innovation opportunities this user-managed model offers.
We will first outline where we are coming from, and then comment on how this forward-looking view informs our response to the Green Paper and its questions for further comment.
Personal Data Storage and Services: A Middle Way between Do Not Track and Business-as-Usual Stalking
There is a way to deal with users' personal data that most have not yet explored. It sits between the two extremes of a familiar spectrum.
On one end is "Do Not Track": technology plus a legal mandate to prevent data collection. In this scenario, cross-site behavioral targeting stops because users signal that they do not want information collected about them as they move about the web. The economic value advertisers gained through higher click-through rates on more relevant ads is eliminated, and sites that depend on advertising revenue may see that revenue reduced. The economic value of the data is captured neither by the end-user nor by the media/advertising/data-aggregation complex.
On the other end is "business as usual": leaving the door open to ever more "innovative," pervasive, and intrusive data collection and cross-referencing for behavioral targeting - digital dossiers on billions of people built from IP addresses, device identifiers, e-mail addresses, and so on. This is highly invasive of people's privacy, linking their activities across contexts they wish to keep separate. Material decisions about people's lives are beginning to be made on such data, without their awareness. Economic value is derived, but at the expense of the basic dignity and privacy rights of the individual.
Personal data storage services are emerging as a middle way that provides greater choice and control to the individual along with greater economic value to businesses. These services allow individuals to collect and manage their own personal data, then give permissioned access to their digital footprint to the businesses and services they choose - businesses they trust to provide better customization, more relevant search results, and real value in exchange for their data.
Over the last year, activity in this space has grown tremendously. In this emerging field we have identified over ten startups, at least three open source projects, and several technical standards efforts in recognized standards organizations, along with companies in the web, mobile, entertainment, and banking industries considering this model.
One of the most important features of this emerging space is that it has active business development both in the United States and across Europe, because the model fits with European privacy norms and practices. It thus offers a path toward global interoperability, one of the key goals the Commerce Department has articulated for this forthcoming set of policies and regulations.
People are the Only Ethical Integration Point for Disparate Data Sets
Almost everyone today unknowingly participates in a personal data ecosystem. People emit information about themselves - their activities and intentions - in various digital forms. It is collected by a wide range of institutions and businesses with which people interact directly, then assembled by data brokers and sold to data users. This chain of activity happens with almost no participation or awareness on the part of the data subject: the individual.
We believe that the individual is the only ethical integration point for the vast range of disparate data sets that together give a comprehensive picture of the individual.
The following list of data types was put together by Marc Davis for the World Economic Forum Re-Thinking Personal Data event in June of 2010. It highlights the vast range of data that might exist about an individual in digital form in some database somewhere.
Identity and Relationships
- Identity (IDs, User Names, Email Addresses, Phone Numbers, Nicknames, Passwords, Personas)
- Demographic Data (Age, Sex, Addresses, Education, Work History, Resume)
- Interests (Declared Interests, Likes, Favorites, Tags, Preferences, Settings)
- Personal Devices (Device IDs, IP Addresses, Bluetooth IDs, SSIDs, SIMs, IMEIs, etc.)
- Relationships (Address Book Contacts, Communications Contacts, Social Network Relationships, Family Relationships and Genealogy, Group Memberships, Call Logs, Messaging Logs)
- Location (Current Location, Past Locations, Planned Future Locations)
- People (Copresent and Interacted with People in the World and on the Web)
- Objects (Copresent and Interacted with Real World Objects)
- Events (Calendar Data, Event Data from Web Services)
- Browser Activity (Clicks, Keystrokes, Sites Visited, Queries, Bookmarks)
- Client Applications and OS Activity (Clicks, Keystrokes, Applications, OS Functions)
- Real World Activity (Eating, Drinking, Driving, Shopping, Sleeping, etc.)
- Text (SMS, IM, Email, Attachments, Direct Messages, Status Text, Shared Bookmarks, Shared Links, Comments, Blog Posts, Documents)
- Speech (Voice Calls, Voice Mail)
- Social Media (Photos, Videos, Streamed Video, Podcasts, Produced Music, Software)
- Presence (Communication Availability and Channels)
- Private Documents (Word Processing Documents, Spreadsheets, Project Plans, Presentations, etc.)
- Consumed Media (Books, Photos, Videos, Music, Podcasts, Audiobooks, Games, Software)
Financial and Health Data
- Financial Data (Income, Expenses, Transactions, Accounts, Assets, Liabilities, Insurance, Corporations, Taxes, Credit Rating)
- Digital Records of Physical Goods (Real Estate, Vehicles, Personal Effects)
- Virtual Goods (Objects, Gifts, Currencies)
- Health Care Data (Prescriptions, Medical Records, Genetic Code, Medical Device Data Logs)
- Health Insurance Data (Claims, Payments, Coverage)
Other Institutional Data
- Governmental Data (Legal Names, Records of Birth, Marriage, Divorce, Death, Law Enforcement Records, Military Service)
- Academic Data (Exams, Student Projects, Transcripts, Degrees)
- Employer Data (Reviews, Actions, Promotions)
Service Providers Must Work for the End-User
Just as most people do not host their own e-mail servers or websites on machines in their basements, most individuals will not have the technical skill or desire to manage the collection, integration, analysis, permission management, and other services needed to derive value from their data.
Individuals need to trust that service providers in this space are working on their behalf. Market models need to emerge that let personal data store service providers make money while working on behalf of their users. The Personal Data Ecosystem Collaborative Consortium has a Value Network Mapping and Analysis project to outline this model and is raising money to carry it out.
Personal Data Is Like Personal Money
Individuals must be able to move their data between service providers, just as today they can move money between banks and it retains its value.
End-user choice and the right to move data from one service provider to another are key. Just as our money does not become worthless when we move it from one bank to another, the same must hold true for an individual's data.
Consumers Need to Be Able to Collect and Aggregate Their Data from Product and Service Providers
For this personal data ecosystem and economy to emerge, it is essential that consumers have easy access to their data from the providers they do business with. Today, getting data out of services is tedious and onerous: where export is available at all, it is often not machine readable, and the steps must be repeated every month as new statements are issued.
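Machine-readable export need not be complicated. As a sketch only (the account and field names here are illustrative, not any provider's actual schema), a provider could emit statement data as JSON with stable field names and ISO-8601 dates, which any personal data store could consume without screen-scraping:

```python
import json
from datetime import date

# Hypothetical transaction records; a real provider would draw these
# from its billing or statement database.
transactions = [
    {"date": date(2010, 12, 1).isoformat(),
     "description": "Monthly service fee", "amount": -9.99},
    {"date": date(2010, 12, 15).isoformat(),
     "description": "Refund", "amount": 4.50},
]

# A machine-readable export: stable field names, ISO-8601 dates,
# parseable by any program rather than locked in a PDF statement.
export = json.dumps({"account": "12345", "transactions": transactions},
                    indent=2)
print(export)
```

The point is not the particular format but that the data is structured and self-describing, so it retains its value when moved.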
There is no reason for this: simple open standards like OAuth allow account linking without the dangerous practice of handing one's username and password to another service provider.
Another reason this capability matters is that services disappear, and user data and digital assets (like photographs) disappear with them. Users create content and generate data through their use of sites, and they should be able to export it easily.
The Commerce Department could mandate that companies holding data generated by people's use of their services make it available with one-click authentication, so people can access their records and link them together.
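To illustrate the OAuth pattern mentioned above: the linking service redirects the user to the data holder's own login page, the user approves a specific scope there, and the service receives a short-lived authorization code - never the user's password. A minimal sketch of building that authorization request (the endpoint, client ID, and scope names are hypothetical, not a real provider's API):

```python
import secrets
from urllib.parse import urlencode

# Hypothetical endpoint: a real deployment would use the data holder's
# published OAuth 2.0 authorization URL.
AUTHORIZE_URL = "https://bank.example.com/oauth/authorize"

def build_authorization_url(client_id: str, redirect_uri: str,
                            scope: str) -> tuple[str, str]:
    """Build the URL the personal data store sends the user to.

    The user authenticates directly with the data holder and approves
    the requested scope; the data store never sees their password.
    """
    state = secrets.token_urlsafe(16)  # anti-forgery token, checked on callback
    params = {
        "response_type": "code",   # authorization-code flow
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}", state

url, state = build_authorization_url(
    client_id="personal-data-store",
    redirect_uri="https://pds.example.org/callback",
    scope="statements:read",
)
```

After the user approves, the data holder redirects back with a code that the service exchanges for a revocable access token scoped only to what was granted.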
Create a Level Playing Field around Data Aggregation and Services
Which companies can do what with what kinds of data?
Today's regulatory patchwork around data protection means that different kinds of data are treated differently, which affects how different industries can compete.
For example, Google and Facebook have vast collections of data about individuals - their activities on these sites, what they click on, whom they know, what they search for, where they go, and so on. They analyze these data sets and serve relevant ads based on those activities.
Today, with mobile devices connected to the web, mobile carriers collect a very similar set of data - where an individual goes, whom they call and text, which websites they visit. Yet carriers are prohibited by regulation from using this data the way Google and Facebook do.
A model in which individuals choose a core data service provider - a "data bank" where they collect and aggregate their data - and then choose from a variety of third- and fourth-party service providers offering services based on that data creates enormous market and business opportunity and spurs competition.
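The mechanism underlying the "data bank" model is simple: the individual holds the data and issues scoped, time-limited, revocable grants to services. A minimal sketch, with all names illustrative rather than any existing system's API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Grant:
    service: str       # the third- or fourth-party service receiving access
    scope: str         # which data set, e.g. "location:history"
    expires: datetime  # grants are time-limited

class DataBank:
    """The individual's data store; access flows only through grants."""

    def __init__(self) -> None:
        self._grants: list[Grant] = []

    def grant(self, service: str, scope: str, days: int) -> None:
        # The individual authorizes a service for one scope, for a set period.
        expiry = datetime.now() + timedelta(days=days)
        self._grants.append(Grant(service, scope, expiry))

    def revoke(self, service: str) -> None:
        # The individual can withdraw a service's access at any time.
        self._grants = [g for g in self._grants if g.service != service]

    def allowed(self, service: str, scope: str) -> bool:
        # Checked on every data request: right service, right scope, unexpired.
        now = datetime.now()
        return any(g.service == service and g.scope == scope
                   and g.expires > now for g in self._grants)

bank = DataBank()
bank.grant("maps-service", "location:history", days=30)
```

The design point is that permission, not possession, governs use: a mapping service granted location history gets nothing else, and losing the individual's trust means losing the data.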
Individuals Should Be Able to Keep Their Data for a Lifetime
What if the individual could choose to retain all the information they wanted for as long as they wanted? Marc Davis has publicly presented a graph contrasting today's data environment with a future in which people are in control.
The red dot shows what is happening today: some data aggregators self-regulate by limiting how long they keep data, and governments are imposing limits on data retention and requiring anonymization practices.
The green dot shows what would happen if people were given the capacity to store and manage their own data - if they could keep as much data as they wanted for as long as they wanted. The digital footprints of a lifetime could be shared with future generations.
In a user-centric model where the individual can aggregate information about themselves, new classes of services - more specific to the individual, based on data accessed with user permission - can emerge.
The foundation of this ecosystem is personal data storage services that are fully under the control of the individual. These new data service providers become more viable when individuals have simple ways to link their accounts together.
The model presented above - a personal data ecosystem in which individuals control their own data - aligns with the interests of all the stakeholders the Commerce Department seeks to balance. Companies that collect personal data win: by sharing and synchronizing with people's personal data stores, they get more accurate information, and new services can be offered on data sets not previously permitted for use or access (telephone log records or mobile geolocation data, for example).
People win: by collecting, managing, and authorizing access to their own personal data, they gain trust in and make greater use of digital services. This also empowers people to work together in communities and groups more efficiently and effectively.
Regulators, advocates, and legislators win: new frameworks protect people while encouraging innovation and new business opportunities.
Response to Questions
1. The Task Force recommends adoption of a baseline commercial data privacy framework built on an expanded set of Fair Information Practice Principles (FIPPs).
a. Should baseline commercial data privacy principles, such as comprehensive FIPPs, be enacted by statute or through other formal means to address how current privacy law is enforced?
b. How should baseline privacy principles be enforced? Should they be enforced by non-governmental entities in addition to being the basis for FTC enforcement actions?
c. As policymakers consider baseline commercial data privacy legislation, should they seek to grant the FTC the authority to issue more detailed rules? What criteria are useful for deciding which FIPPs require further specification through rulemaking under the Administrative Procedure Act?
d. Should baseline commercial data privacy legislation include a private right of action?
2. To meet the unique challenges of information intensive environments, FIPPs regarding enhancing transparency; encouraging greater detail in purpose specifications and use limitations; and fostering the development of verifiable evaluation and accountability should receive high priority.
a. What is the best way of promoting transparency so as to promote informed choices? The Task Force is especially interested in comments that address the benefits and drawbacks of legislative, regulatory, and voluntary private sector approaches to promoting transparency.
b. What incentives could be provided to encourage the development and adoption of practical mechanisms to protect consumer privacy, such as PIAs, to bring about clearer descriptions of an organization’s data collection, use, and disclosure practices?
c. What are the elements of a meaningful PIA in the commercial context? Who should define these elements?
d. What processes and information would be useful to assess whether PIAs are effective in helping companies to identify, evaluate, and address commercial data privacy issues?
e. Should there be a requirement to publish PIAs in a standardized and/or machine-readable format?
f. What are consumers' and companies' experiences with systems that display information about companies' privacy practices in contexts other than privacy policies?
g. What are the relative advantages and disadvantages of different transparency-enhancing techniques in an online world that typically involves multiple sources being presented through a single user interface?
h. Do these (dis)advantages change when one considers the increasing use of devices with more limited user interface options?
i. Are purpose specifications a necessary or important method for protecting commercial privacy?
j. Currently, how common are purpose specification clauses in commercial privacy policies?
k. Do industry best practices concerning purpose specification and use limitations exist? If not, how could their development be encouraged?
l. What incentives could be provided to encourage companies to state clear, specific purposes for using personal information?
m. How should purpose specifications be implemented and enforced?
n. How can purpose specifications and use limitations be changed to meet changing circumstances?
o. Who should be responsible for demonstrating that a private sector organization’s data use is consistent with its obligations? What steps should be taken if inconsistencies are found?
p. Are technologies available to allow consumers to verify that their personal information is used in ways that are consistent with their expectations?
q. Are technologies available to help companies monitor their data use, to support internal accountability mechanisms?
r. How should performance against stated policies and practices be assessed?
s. What incentives could be provided to encourage companies to adopt technologies that would facilitate audits of information use against the company’s stated purposes and use limitations?
3. Voluntary, enforceable codes of conduct should address emerging technologies and issues not covered by current application of baseline FIPPs. To encourage the development of such codes, the Administration should consider a variety of options, including (a) public statements of Administration support; (b) stepped up FTC enforcement; and (c) legislation that would create a safe harbor for companies that adhere to appropriate voluntary, enforceable codes of conduct that have been developed through open, multi-stakeholder processes.
a. Should the FTC be given rulemaking authority triggered by failure of a multi-stakeholder process to produce a voluntary enforceable code within a specified time period?
b. How can the Commerce Department best encourage the discussion and development of technologies such as “Do Not Track”?
c. Under what circumstances should the PPO recommend to the Administration that new policies are needed to address failure by a multi-stakeholder process to produce an approved code of conduct?
d. How can cooperation be fostered between the National Association of Attorneys General, or similar entities, and the PPO?
5. The FTC should remain the lead consumer privacy enforcement agency for the U.S. Government.
a. Do FIPPs require further regulatory elaboration to enforce, or are they sufficient on their own?
b. What should be the scope of FTC rulemaking authority?
c. Should FIPPs be considered an independent basis for FTC enforcement, or should FTC privacy investigations still be conducted under Federal Trade Commission Act Section 5 "unfair and deceptive" jurisdiction, buttressed by the explicit articulation of the FIPPs?
d. Should non-governmental entities supplement FTC enforcement of voluntary codes?
e. At what point in the development of a voluntary, enforceable code of conduct should the FTC review it for approval? Potential options include providing an ex ante “seal of approval,” delaying approval until the code is in use for a specific amount of time, and delaying approval until enforcement action is taken against the code.
f. What steps or conditions are necessary to make a company’s commitment to follow a code of conduct enforceable?
6. The U.S. government should continue to work toward increased cooperation among privacy enforcement authorities around the world and develop a framework for mutual recognition of other countries’ commercial data privacy frameworks. The United States should also continue to support the APEC Data Privacy Pathfinder project as a model for the kinds of principles that could be adopted by groups of countries with common values but sometimes diverging privacy legal frameworks.
7. Consideration should be given to a comprehensive commercial data security breach framework for electronic records that includes notification provisions, encourages companies to implement strict data security protocols, and allows States to build upon the framework in limited ways. Such a framework should track the effective protections that have emerged from State security breach notification laws and policies.
What factors should breach notification be predicated upon (e.g., a risk assessment of the potential harm from the breach, a specific threshold such as number of records, etc.)?
8. A baseline commercial data privacy framework should not conflict with the strong sectoral laws and policies that already provide important protections to Americans, but rather should act in concert with these protections.
9. Any new Federal privacy framework should seek to balance the desire to create uniformity and predictability across State jurisdictions with the desire to permit States the freedom to protect consumers and to regulate new concerns that arise from emerging technologies, should those developments create the need for additional protection under Federal law.
b. How could a preemption provision ensure that Federal law is no less protective than existing State laws? What are useful criteria for comparatively assessing how protective different laws are?
c. To what extent should State Attorneys General be empowered to enforce national FIPPs-based commercial data privacy legislation?
d. Should national FIPPs-based commercial data privacy legislation preempt State unfair and deceptive trade practices laws?
10. The Administration should review the Electronic Communications Privacy Act (ECPA), with a view to addressing privacy protection in cloud computing and location-based services. A goal of this effort should be to ensure that, as technology and market conditions change, ECPA continues to appropriately protect individuals’ expectations of privacy and effectively punish unlawful access to and disclosure of consumer data.
a. The Task Force seeks case studies and statistics that provide evidence of concern—or comments explaining why concerns are unwarranted—about cloud computing data privacy and security in the commercial context. We also seek data that link any such concerns to decisions to adopt, or refrain from adopting, cloud computing services.
b. The Task Force also seeks input on whether the current legal protections for transactional information and location information raise questions about what privacy expectations are reasonable and whether additional protections should be mandated by law. The Task Force also invites comments that discuss whether privacy protections for access to location information need clarification in order to facilitate the development, deployment and widespread adoption of new location-based services.
c. The Task Force seeks information from the law enforcement community regarding the use of ECPA today and how investigations might be affected by proposed amendments to ECPA’s provisions.