Here’s the thing about different people playing the same piece of music: sometimes, they’re going to sound similar. And when the music is by a composer who died 268 years ago, putting his work in the public domain, plenty of people will record it, and some of them will put those recordings online. In this case, a combination of copyright bots and corporate intransigence led to a Kafkaesque attack on music. Musician James Rhodes put a video of himself playing Bach on Facebook. Sony Music Entertainment claimed that 47 seconds of that performance belonged to it, and Facebook muted the video as a result.

So far, this is stupid, but not unusually stupid in the world of takedowns. It’s what happened after Rhodes received Sony’s notice that earned this story its place in the Hall of Shame.

One argument in favor of this process is that it is supposed to have checks and balances. Takedown notices are supposed to be sent only by someone who owns the copyright in the material and actually believes that copyright has been infringed. And if a takedown notice is wrong, the target can send a counter-notice explaining that they own the work or that it isn’t infringing. Counter-notices have a lot of problems, not the least of which is that the requirements are onerous for small-time creators, demanding a fair bit of personal information. There’s always the fear that, even for someone who knows they own the work, the other side will sue anyway, a fight they cannot afford.

Rhodes did dispute the claim, explaining that “this is my own performance of Bach. Who died 300 years ago. I own all the rights.” Sony rejected this reasoning. While we don’t know for sure what Sony’s process is, we can guess that a copyright bot, or a human acting just as mechanically, was at the center of this mess. A human doing actual analysis would have looked at a video of a man playing a piece of music older than American copyright law and determined that it was not something Sony owned.
It almost feels like an automatic response also rejected Rhodes’ appeal, because we certainly hope that a thoughtful person reviewing his notice would have accepted it. Rhodes took his story to Twitter, where it picked up steam, and emailed the heads of Sony Classical and Sony’s public relations, eventually getting his audio restored. He tweeted, “What about the thousands of other musicians without that reach…?” He raises a good point.

None of the supposed checks worked. Rhodes’ persistence and public pressure were the only reasons this complaint went away, despite rules that are supposed to protect fair use and the public domain.

How many more ways do we need to say that copyright bots and filters don’t work? That mandating them, as the European Union is poised to do, is dangerous and shortsighted? We hear about these misfires in roughly the same way they get resolved: because they generate enough noise. How many more lead to a creator’s work being taken down with no recourse?
 
A decade ago, before social media was a widespread phenomenon and blogging was still a nascent activity, it was nearly unthinkable outside of a handful of countries (namely China, Tunisia, Syria, and Iran) to detain citizens for their online activity. Ten years later, the practice has become all too common, and remains on the rise in dozens of countries. In 2017, the Committee to Protect Journalists found that more than seventy percent of imprisoned journalists were arrested for online activity, while Reporters Without Borders’ 2018 press freedom barometer cited 143 citizen journalists imprisoned globally, and ten killed. While Tunisia has inched toward democracy, releasing large numbers of political prisoners following the 2011 revolution, China, Syria, and Iran remain major offenders, and are now joined by several other countries, including the Philippines, Saudi Arabia, and Egypt.

When we first launched Offline in 2015, we featured five cases of imprisoned or threatened bloggers and technologists, and later added several more. We hoped to raise awareness of their plight and advocate for their freedom, but we knew it would be an uphill struggle. In two cases, our advocacy helped to secure their release: Ethiopian journalist Eskinder Nega was released from prison earlier this year, and the Zone 9 Bloggers, also from Ethiopia, were acquitted in 2015 following a sustained campaign for their freedom.

[Embedded video from youtube-nocookie.com]
Award-winning Ethiopian journalist Eskinder Nega on the power of the Internet and journalism.

Today, the situation in several countries is dire. In Egypt, where a military coup brought the country back toward dictatorship, dozens of individuals have been imprisoned for expressing themselves. Activist Amal Fathy was detained earlier this year, and still awaits trial, after a video she posted to Facebook detailing her experiences with sexual harassment in Cairo went viral. And Wael Abbas, an award-winning journalist whose experiences with censorship we’ve previously documented, has been detained without trial since May 2018. We also continue to advocate for the release of Alaa Abd El Fattah, the Egyptian activist whose five-year sentence was upheld by an appeals court last year.
Three new Offline cases demonstrate the lengths to which states will go to silence their critics. Eman Al-Nafjan, a professor, blogger, and activist from Saudi Arabia, was arrested in May for her advocacy against the country’s ban on women driving, a ban that was repealed just one month later. Ahmed Mansoor is currently serving a ten-year sentence for “cybercrimes” in his home country of the United Arab Emirates after being targeted several times in the past for his writing and human rights advocacy. And Dareen Tatour, a Palestinian citizen of Israel, recently began a five-month prison sentence after several years of house arrest and a lengthy trial over content she posted on social media that had been misinterpreted by police.

Advocacy and campaigns on behalf of imprisoned technologists, activists, and bloggers can make a difference. In the coming months, we will share more details and actions that the online community can take to support these individuals, defend their names, and keep them safe. To learn more about these and other cases, visit Offline.
 

What We Mean When We Say "Data Portability"

“Data portability” is a feature that lets a user take their data from a service and transfer or “port” it elsewhere. This often comes up in discussions about leaving a particular social media platform and taking your data with you to a rival service. But bringing data to a competing service is just one use for data portability; other, just-as-important goals include analyzing your data to better understand your relationship with a service, building something new out of your data, self-publishing what you learn, and generally achieving greater transparency. Regardless of whether you are “porting” your data to a different service or to a personal spreadsheet, data that is “portable” should be easy to download, organized, tagged, and machine-parsable.

EFF supports users’ legal right to obtain a copy of the data they have provided to an online service provider. Once you move beyond that, however, the situation gets more complicated. Data portability interacts, and sometimes even conflicts, with other digital rights priorities, including privacy and security, transparency, interoperability, and competition. Here are some of the considerations EFF keeps in mind when looking at the dynamics of data portability.

Privacy and Security

Any conversation about data portability in practice should keep privacy and security considerations front and center. First, security is a critical concern. Ported data can contain extremely sensitive information about you, and companies need to be clear about the potential risks before users move their data to another service. Users shouldn’t be encouraged to share information with untrustworthy third parties, and data must always be protected with strong security in transit and at its new location.

Second, it’s not always clear what data a user should have the right to port.
There are a lot of questions to grapple with here: When does “data portability” presume inclusion of one’s social graph, including friends’ contact information? What are all the ways that can go wrong for those friends’ privacy and security? How do we unravel the data you provide about yourself, the data your friends provide about you, and all the various posts, photos, and comments you may interact with? And then, how can we ensure data portability respects all of those users’ right to control their information?

While there are no easy answers, the concept of consent is a starting point. For example, a service could ask friends for their specific, informed consent to share contact information when you initiate a download of all your data. Companies should also explore technical solutions that might allow users to export lists of friends in an obfuscated, privacy-protective form.

Transparency

Portability works hand-in-hand with transparency. If some of your data is easy to download and use (portable) but the rest is secret (not transparent), then you are left with an incomplete picture of your relationship with a service. Conversely, if you are able to find out all the information a company has about you (transparent) but have no way to take it and interact with it (not portable), you are denied opportunities to further understand and analyze it.

Companies should first be transparent about the profile data that they collect or generate about you for marketing or advertising purposes, including data from third parties and inferences the company itself makes about you. Comprehensive portability should include this information, too; these data should be just as easy for you to access and use as the information you share voluntarily. Together, portability and transparency return power to users.
For example, a comprehensive download of the data Facebook stores about a user’s browsing habits and advertising preferences might help her reverse-engineer Facebook’s processes for making inferences about users for targeted advertising. Or the ability to take complete metadata about one’s music preferences and listening patterns from Spotify to another streaming service might make for a better user experience; Spotify might have figured out over time that you can’t stand a certain genre of music, and your next streaming service can accommodate that immediately.

Interoperability

Data portability can also work alongside “interoperability.” Interoperability refers to the extent to which one platform’s infrastructure can work with others. In software parlance, interoperability is usually achieved through Application Programming Interfaces (APIs): interfaces that allow other developers to interact with an existing software service. This can allow “follow-on innovators” not only to interact with and analyze existing platforms, but also to build on them in ways that benefit users. For example, PadMapper started by organizing data about rental housing pulled from Craigslist posts and presenting it in a useful way, and Trillian allowed users to use multiple IM services through the same client and added features like encryption on top of AIM, Skype, and email. On a larger scale, digital interoperability enables decentralized, federated services like email, modern telephony networks, and the World Wide Web.

Competition

Depending on the context and platform, data portability is vital but not sufficient for encouraging competition. In many markets, it’s hard for competition to exist without portability, so we must get this part right.
But on its own, data portability cannot magically improve competition; the ability to take your data to another service is not helpful if there are no viable competitors. Similarly, data portability cannot fend off increasing centralization as big players buy up or squash smaller competitors. Initiatives like the Data Transfer Project among Facebook, Microsoft, Twitter, and Google could ultimately be important, but they won’t meaningfully help competition unless they allow users to move their data beyond a small cabal of incumbent services. Right now, they don’t.

Combined with other substantive changes, data portability can support users’ right to “vote with their feet”: leaving a platform or service that isn’t working for them and taking their data and connections to one that does. Making these options real for people can encourage companies to work to keep their users, rather than hold them hostage.
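To make concrete what “organized, tagged, and machine-parsable” buys a user, here is a minimal sketch in Python. The export format shown is entirely hypothetical (it is not any real service’s schema), but it illustrates how a structured download can be analyzed directly, instead of scraping a web interface.

```python
import json
from collections import Counter

# A hypothetical, well-formed "portable" data export: structured,
# tagged, and machine-parsable JSON. Field names are illustrative only.
export = json.loads("""
{
  "profile": {"name": "example_user", "joined": "2015-03-01"},
  "posts": [
    {"date": "2018-09-01", "tags": ["music"], "text": "Practicing Bach today."},
    {"date": "2018-09-12", "tags": ["music", "travel"], "text": "Concert in Berlin."},
    {"date": "2018-10-02", "tags": ["news"], "text": "Reading about Article 13."}
  ]
}
""")

# Because the data is machine-parsable, the user (or a rival service
# importing it) can compute over it directly.
tag_counts = Counter(tag for post in export["posts"] for tag in post["tags"])
print(len(export["posts"]))       # total posts in the export
print(tag_counts.most_common(1))  # the user's most frequent tag
```

The same property that makes this trivial to analyze in a few lines is what lets a competing service ingest the data, or a researcher audit it, without the exporting company’s cooperation.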
 
Despite waves of calls and emails from European Internet users, the European Parliament today voted to accept the principle of a universal pre-emptive copyright filter for content-sharing sites, as well as the idea that news publishers should have the right to sue others for quoting news items online, or even for using their titles as links to articles. Out of all the potential amendments offered that would have fixed or ameliorated the damage caused by these proposals, MEPs voted for the worst on offer.

There are still opportunities, at the EU level, at the national level, and ultimately in Europe’s courts, to limit the damage. But make no mistake: this is a serious setback for the Internet and digital rights in Europe.

It also comes at a perilous moment for pro-Internet voices in the heart of the EU. On the same day as the vote on these articles, another branch of the European Union’s government, the Commission, announced plans to introduce a new regulation on “preventing the dissemination of terrorist content online.” Doubling down on speedy, unchecked censorship, the proposal would create a new “removal order” obliging hosting service providers to remove content within one hour of being ordered to do so. Echoing the language of the copyright directive, the terrorism regulation “aims at ensuring smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes”; it encourages the use of “proactive measures, including the use of automated tools.” Not content with handing copyright enforcement over to algorithms and tech companies, the EU now wants to expand that approach to defining the limits of political speech, too.

And as bad as all this sounds, it could get even worse. Elections are coming up in the European Parliament next May, and many of the key parliamentarians who have worked on digital rights in Brussels will not be standing.
Marietje Schaake, author of some of the better amendments to the directive, announced this week that she would not be running again. Julia Reda, the German Pirate Party representative, is moving on, and Jan Philipp Albrecht, the MEP behind the GDPR, has already left Parliament to take up a position in domestic German politics. The European Parliament’s reserves of digital rights expertise, never that full to begin with, are emptying.

The best that can be said about the Copyright in the Digital Single Market Directive, as it stands, is that it is so extreme that it looks set to shock a new generation of Internet activists into action, just as the DMCA, SOPA/PIPA, and ACTA did before it. If you’ve ever considered stepping up to play a bigger role in European politics or activism, whether at the national level or in Brussels, now would be the time. It’s not enough to hope that these laws will lose momentum or fall apart from their own internal incoherence, or that those who don’t understand the Internet will refrain from breaking it.

Keep reading and supporting EFF, and join Europe’s powerful partnership of digital rights groups, from Brussels-based EDRi to your local national digital rights organization. Speak up for your digital business, your open source project, your hobby or fandom, and your role as a contributor to the global Internet commons. This was a bad day for the Internet and for the European Union, but we can make sure there are better days to come.
 

Yes, You Can Name A Website “Fucknazis.us”

Jeremy Rubin just wanted to speak out about the rise of white supremacist groups in the U.S. and raise some money to fight those groups. But the Internet domain name he registered in late 2017 for his campaign, “fucknazis.us,” ran afoul of a U.S. Department of Commerce policy banning certain words from .US domain names. A government contractor took away his domain name, effectively shuttering his website. Last month, after EFF and the Cyberlaw Clinic at Harvard Law School intervened, Mr. Rubin got his site back.

A government agency shutting down an Internet domain based on the contents of its name runs afoul of the First Amendment. After a long back-and-forth with EFF and the Cyberlaw Clinic, the Commerce Department’s contractor Neustar agreed to give Mr. Rubin back his domain, and to stop banning “dirty words.” fucknazis.us has proudly returned to the Internet.

As anyone with a business or personal website knows, a meaningful domain name can be the cornerstone of an online presence. Mr. Rubin, moved to act after anti-Semitic and white supremacist incidents last summer, created a “virtual lapel pin” on the Ethereum computing platform as a fundraiser for opposition to these causes. The virtual pins, and the domain he registered to sell them, declared his message in pithy fashion: “fucknazis.us.”

The Internet’s domain name system as a whole is governed by ICANN, an independent nonprofit organization. While ICANN imposes questionable rules from time to time, a blanket ban on naughty words in domain names has never been one of them. Unluckily for Mr. Rubin, the .US top-level domain is a different animal, because it’s controlled by the U.S. government. Originally used only for government websites, .US is now open to anyone with a connection to the U.S. Since 1998, it has been controlled by the National Telecommunications and Information Administration (NTIA), a part of the Department of Commerce.
And it’s managed by registry operator Neustar, Inc., under contract with NTIA. Shortly after Mr. Rubin registered “fucknazis.us,” Neustar suspended the domain, calling it a violation of an NTIA “seven dirty words” policy, a phrase with particular First Amendment significance.

As a general rule, First Amendment law makes clear that the government can rarely impose restrictions on speech based on the content of that speech, and when it does, it must show some level of necessity. The well-known case of Federal Communications Commission v. Pacifica Foundation upheld the FCC’s decision to reprimand (though not fine, or revoke the license of) a public broadcaster after it aired George Carlin’s famous monologue “Filthy Words.” In so doing, the Court approved the FCC’s definition of “indecency,” a word otherwise without a constitutional definition. But the Supreme Court explained that “indecency” as a legal concept was limited to over-the-air broadcast media, because broadcasts made use of limited radio spectrum, were a scarce and highly regulated public resource, and were easily overheard by children in their everyday surroundings. Many years later, the Supreme Court directly rejected the U.S. government’s attempt to impose a similar indecency regime on the Internet, and that regime has never been applied to any medium other than over-the-air radio and television broadcasts.

Last month, we learned that Neustar and NTIA were reversing course, allowing Mr. Rubin to proceed with his use of fucknazis.us, and more generally removing these kinds of restrictions from future .US domain name registrations. Thanks to the First Amendment, the .US domain, advertised as “America’s Address,” is a place where one can say “Fuck Nazis” without censorship.
 

Today, Europe Lost The Internet. Now, We Fight Back.

How did this terrible state of affairs come to pass?