By Gary Schwartz
The way we define the term privacy is subjective. In the United States, we police privacy based on a very broad definition under Section 5 of the Federal Trade Commission Act that prohibits “unfair or deceptive acts or practices in or affecting commerce.” The devil is in the policy details.
If the news headlines over the past few months are any indication, we are mightily confused about what to call private and what to call public, what to sanction and what not to sanction. How can we begin to solve small-screen privacy when we have not yet resolved our digital angst on the desktop?
Jules Polonetsky, director of the Future of Privacy Forum, notes that when a browser inevitably crashes, it pops up a commiserating dialog box asking your permission to send an anonymous diagnostic report to the browser company to help it fix bugs and build a better product.
Faced with this privacy brief, only 3 percent of users click “Yes.”
Is it because we are digital immigrants? Our children happily offer data about their personal activities every day without hesitation.
Is the challenge to simplify the legal narrative enough that consumers can make an informed decision without interrupting their next click on the small screen? It seems an improbable feat.
In March, the Federal Trade Commission issued a report on best practices for businesses that collect personal data, called “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers.”
The FTC, which is taking a proactive lead on privacy inside the Beltway, seems cognizant that it needs a flexible framework to best interpret what is unfair or deceptive under Section 5 of the Federal Trade Commission Act.