The software and services that we use online are specifically designed to leave us at risk of security threats unless we change the default options. That was the message of privacy and security researcher Christopher Soghoian’s speech, “Evil Defaults: Why the software and services we use are intentionally designed to violate our privacy,” on March 1 in the Student Lounge.
Defaults are not exclusive to software. High school cafeterias across the nation are experimenting with changing defaults. Soghoian said that one cafeteria simply moved the pizza up so that students had to reach for it, and placed salads at eye level. The results: higher consumption of salads. A quick Google search shows dozens of similar experiments nationwide.
Software companies are doing the same thing, just without the good motives. “There’s a saying that if you’re using something for free, you are the product,” Soghoian said. He means that your privacy and data are the price you pay for something like Facebook or Gmail. “Facebook earns less than $5 per user per year from the data they sell. They could offer a premium alternative for $10, but they don’t.” The privacy invasion is a default for which there is no alternative.
But there are also more subtle default choices – ones which users can avoid. Soghoian said that many users, even in the United States, do not know that packet data sent over Wi-Fi is not encrypted. Hackers using packet sniffers can intercept user names and passwords in order to hijack accounts. There’s a very simple alternative: HTTPS (encrypted or secure HTTP). When you are accessing a service protected by HTTPS encryption, a lock icon appears in your browser. But “few users notice the lock icon when it’s there, and even fewer notice the lack of the lock icon,” Soghoian said.
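The fix Soghoian advocated can be illustrated in a few lines. Browser extensions such as HTTPS Everywhere rewrite http:// URLs to https:// before a request leaves the browser. Here is a minimal sketch of that rewrite (the function name is mine, not from the talk; real extensions consult per-site rulesets, since not every server offers the same content over HTTPS):

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https://, leaving other schemes alone.

    A simplified illustration of the idea behind HTTPS-upgrading
    browser extensions; real tools use per-site rulesets.
    """
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)
```

The point of the example is how small the change is: the page the user asked for is the same, but the upgraded request travels encrypted and cannot be read off the air by a packet sniffer.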
For many years, Google’s Gmail service relegated HTTPS to the thirteenth option on a settings screen that most users ignored. In late 2009, a group of security experts led by Soghoian petitioned Google to change the default (see Encrypt the Cloud, Security Luminaries Tell Google, June 16, 2009). Google made HTTPS the Gmail default on Jan. 12, 2010. When it did the same for its search engine in October 2011, marketers complained that they were being deprived of your search data (see Google SSL Default, Goodbye Query Referrer Data).
It is also possible to use HTTPS on Microsoft’s Hotmail e-mail service, but enabling HTTPS takes six steps.
Why would these companies do this? After all, it’s not in their interest for users to be afraid of identity theft as a result of using their service.
One reason may be because the government is asking them to. Valerie Caproni, General Counsel for the FBI, said in Congressional testimony that weak security helps the government catch criminals. “Criminals tend to be somewhat lazy, and a lot of times, they will resort to what is easy … [as] long as we have a solution that will get us the bulk of our targets, the bulk of criminals, the bulk of terrorists, the bulk of spies, we will be ahead of the game. We can’t have … to design individualized solutions as though they were a very sophisticated target who was self-encrypting and putting a very difficult encryption algorithm on for every target we confront because not every target is using such sophisticated communications.”
Third parties are watching you
Another reason companies do this is because it helps their own affiliates. First parties, Soghoian said, are the websites we all visit, such as the BBC, New York Times, and Reddit. Third parties are companies like Google’s Doubleclick, Microsoft’s Atlas, and independent companies such as Bluekai, all of which collect user data from first party websites and sell it.
Microsoft’s conflict of interest became clear shortly before the release of Internet Explorer 8. The original design for IE8 had sophisticated anti-tracking technology turned on by default: the browser would accept cookies from first parties and block those from third parties. This default made sense to the programmers, but it upset Microsoft’s advertising department, which complained directly to CEO Steve Ballmer. As a result, users can turn on IE8’s privacy options, but the browser turns the option off every time it shuts down, so users have to reselect it each time they start the application.
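The rule IE8 would have enforced is simple to state: a request is “third party” when its site differs from the site of the page you are visiting. A toy classifier under that definition (my own simplification; real browsers consult the Public Suffix List rather than just counting DNS labels, since registrable domains like example.co.uk span three labels):

```python
def is_third_party(request_host: str, page_host: str) -> bool:
    """Decide whether a request is third-party relative to the page.

    Simplified sketch: treat the last two DNS labels as the "site".
    Real browsers use the Public Suffix List for this comparison.
    """
    def site(host: str) -> str:
        return ".".join(host.lower().split(".")[-2:])
    return site(request_host) != site(page_host)
```

Under this rule, a tracking pixel served from an ad network’s domain would be blocked, while the news site’s own image server would not.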
Soghoian had a more complex story to tell about Google. In 1999, usability researcher Alma Whitten co-wrote a paper called Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0, which called for higher standards of usability in security software. Today, Whitten is Google’s Director of Privacy for Product and Engineering. When Google recently announced that it would back a “Do Not Track” initiative, one might have expected Whitten to make the announcement. Instead, it came from Google’s senior vice president of advertising, Susan Wojcicki, and it is likely that Wojcicki, not Whitten, made the decision.
In companies like Microsoft and Google, defaults are set by divisions that need to make money, such as advertising — not by programmers or privacy experts.
In the software industry, Soghoian said, “dark patterns” are interfaces that are optimized to trick people—the same way that casinos in Las Vegas are designed so that you get lost, lose track of time, and keep on gambling. They’re optimized for the benefit of the business, not the customers.
Risks grow as hacking gets easier
The risks of using unsecured networks are growing, Soghoian warned. Firesheep, a tool released a few years ago, makes it easy to hijack Facebook and Twitter accounts over unsecured Wi-Fi connections (see Firesheep addon allows the clueless to hack Facebook, Twitter over Wi-Fi). Owning and using the tool is likely illegal, but that won’t stop hackers: Firesheep makes hijacking easy, and the easier an attack becomes, the more prevalent it is likely to be.
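Attacks of this kind work because many sites that log you in over HTTPS then send the resulting session cookie over plain HTTP as well, unless the cookie carries the Secure attribute that restricts it to encrypted connections. A hypothetical audit check for that misconfiguration (the function name and header examples are illustrative, not from the talk):

```python
def lacks_secure_flag(set_cookie_header: str) -> bool:
    """Return True if a Set-Cookie header omits the Secure attribute.

    A session cookie without Secure is sent over plain HTTP too,
    which is exactly what a sniffer on open Wi-Fi captures.
    """
    attrs = [part.strip().lower() for part in set_cookie_header.split(";")]
    # attrs[0] is the name=value pair; the rest are cookie attributes.
    return "secure" not in attrs[1:]
```

A cookie set as `session=abc123; Path=/; HttpOnly` would fail this check; adding `; Secure` keeps it off unencrypted connections entirely.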
In the question and answer after his speech, Soghoian added that although a similar tool does not yet exist for hacking smartphone services, it’s likely only a matter of time.
Why Private Browsing Modes Do Not Deliver Real Privacy by Christopher Soghoian.
Freedom In the Cloud: Software Freedom, Privacy, and Security for Web 2.0 and Cloud Computing, the speech by Eben Moglen at NYU that inspired Diaspora, the open source alternative to Facebook.
In the ten years since 9/11, the American government’s willingness to sacrifice civil liberties to preserve security has provoked fierce debate. On September 15, just a few days after nationwide ceremonies commemorated the tenth anniversary of the attacks, a panel of Brooklyn Law School faculty gathered at the Subotnick Center to discuss what freedoms Americans had lost in the war on terror. At the same time, the discussion celebrated the publication of Professor Susan Herman’s new book, Taking Liberties: The War on Terror and the Erosion of American Democracy (Oxford University Press, 2011).
Professor Herman, who is also the president of the American Civil Liberties Union, began the discussion by sharing excerpts from her book, illustrating the dramatic impact this legislation has had on the day-to-day lives of Americans. “We started all this change in law after 9/11 without debate or discussion. There are lots of reasons to ask questions … and now is the time to start doing that.”
Professor Derek Bambauer shared his thoughts on the impact 9/11 has had on the Internet: “The Internet before 9/11 was the domain of Pets.com, and after 9/11 one of WikiLeaks.” And the ripple effect goes far beyond our Internet domains. As Professor Maryellen Fullerton explained, in commenting on a controversial provision in the Patriot Act that outlaws providing “material support” to a terrorist organization, “the standards of material support have defined terrorist organizations so broadly that almost any group may fall into that category.” Prof. Fullerton remarked that despite the immense impact these “material support” laws have had on immigration and the funding of non-profit groups worldwide, unfortunately, “Congress has shown no interest in narrowing the law.”
Professor Nelson Tebbe acted as moderator and discussed the effects of anti-terrorist legislation on the free exercise of religion: “How easy is it to tell if government is targeting terrorists or Muslims? … The power of [Professor Herman’s] book is that she shows through stories of real people … how difficult it is to [distinguish] between these two things.”
The well-attended discussion sparked a stimulating conversation between panel members as well as members of the audience. “It was a great mix of perspectives on a topic we’ve already heard so much about,” said Jason Stewart, ’13.
Sean Hymowitz, ’12, asked the panelists to comment on the TSA Secure Flights program. “Is it going to get better? Because I don’t like taking my shoes off,” he said, generating laughs from the audience. Prof. Herman responded by pointing out that she covered that topic in her book, and referred specifically to the story of a twenty-two-year-old who was interrogated for five hours by a TSA official because he carried Arabic-English flashcards onto the plane.
As the discussion drew to a close, Interim Dean Michael Gerber asked a poignant question: is there any optimistic outlook we can take from the current trend? The panelists looked to one another before Professor Bambauer spoke up: “It’s hard to tell anything but a pessimistic story … but perhaps our lens is not broad enough yet.” To this Professor Herman added, “if we can’t change these laws while we have a former Constitutional Law professor in office… it’s not going to happen. This doesn’t mean change isn’t possible [but] politicians are not going to do it. It’s up to us.”
Listen to a podcast with Prof. Herman about her new book, courtesy of the BLS Library Blog.