The World of P2P: Peer-to-Peer (P2P)

What is P2P?

 
[Diagram of a peer-to-peer computer network.]

[Diagram of a server-based computer network.]

Generally, a peer-to-peer (or P2P) computer network refers to any network that does not have fixed clients and servers, but a number of autonomous peer nodes that function as both clients and servers to the other nodes on the network. This model of network arrangement contrasts with the client-server model, which has proven unable to scale up to today's needs. In the P2P model, any node should be able to initiate or complete any supported transaction. Peer nodes may differ in local configuration, processing speed, network bandwidth, and storage capacity. This is the basic definition of any P2P system.
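As a minimal illustration of this dual client/server role, consider the following Python sketch; the port number and the one-line request format are arbitrary choices for this toy, not part of any real P2P protocol. A single process answers requests from other peers while also issuing its own:

    import socket
    import threading
    import time

    PORT = 9000  # illustrative port, not any real P2P protocol's

    def serve():
        # Server role: answer requests coming from other peers.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("", PORT))
            srv.listen()
            while True:
                conn, _addr = srv.accept()
                with conn:
                    conn.sendall(b"reply to " + conn.recv(1024))

    def ask(peer_host: str, request: bytes) -> bytes:
        # Client role: issue a request to another peer.
        with socket.create_connection((peer_host, PORT)) as cli:
            cli.sendall(request)
            return cli.recv(1024)

    # One process, both roles: it answers peers while querying them.
    threading.Thread(target=serve, daemon=True).start()
    time.sleep(0.2)  # give the listener a moment to come up
    print(ask("127.0.0.1", b"ping"))  # -> b'reply to ping'

In a real network each peer would run this same pair of roles and talk to many other peers at once; the point here is only that no machine is exclusively a client or exclusively a server.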

The term P2P may mean different things to different people in different contexts. For instance, although the term has been applied to Usenet and IRC in all their incarnations, and is even applicable to the network of IP hosts known as the Internet, it is most often used in a narrower sense for the networks of peers developed starting in the late 1990s, characterized by transmission of data upon the receiver's request rather than the sender's. Such early networks included Gnutella, FastTrack, and the now-defunct Napster, all of which provided facilities for free (and somewhat anonymous) file transfer between personal computers connected in a dynamic and unreliable way to a network, in order to work collectively toward a shared objective.

Even those early networks did not all work around the same concept or implementation. In some networks, such as Napster, OpenNap, or IRC, the client-server structure is used for some tasks (e.g., searching) and a peer-to-peer structure for others, and even that is not consistent across each. Networks such as Gnutella or Freenet use a peer-to-peer structure for all purposes and are sometimes referred to as true peer-to-peer networks, even though recent evolutions are turning them into hybrids in which each peer is no longer equal in its functions.

When the term peer-to-peer was used to describe the Napster network, it implied that the peer-protocol nature was important, but in reality the great achievement of Napster was the empowerment of the peers (i.e., the fringes of the network). The peer protocol was just a common way to achieve this.

The best approach, then, is to define peer-to-peer not as a set of strict definitions but as a technical/social/cultural movement that attempts to provide a decentralized, dynamic, and self-regulated structure (in direct opposition to the old model of central control, the server-client model, which failed to scale up to today's expectations), with the objective of providing content and services. In this way, any computer program or protocol that attempts to escape the need for a central server or repository, and aims to empower or provide a similar level of service/access to a collection of similar computers, can be referred to as a P2P implementation; it will in fact enable everyone to be a creator/provider, not only a consumer. Every P2P system is by definition self-feeding: the more participants it has, the better it will satisfy its objectives.

From a Computer Science Perspective


Technically, a true peer-to-peer application must implement only peering protocols that do not recognize the concepts of "server" and "client". Such pure peer applications and networks are rare. Most networks and applications described as peer-to-peer actually contain or rely on some non-peer elements, such as DNS. Also, real-world applications often use multiple protocols and act as client, server, and peer simultaneously, or over time.

From a computer science perspective, P2P creates interesting new fields for research, not only because of the (not so recent) switch of roles among network components, but because of the unforeseen benefits and resource optimizations it enables in network efficiency and stability.

Peer-to-peer systems and applications have attracted a great deal of attention from computer science research; some prominent research projects include the Chord lookup service, the PAST storage utility, and the CoopNet content distribution system (see below for external links related to these projects).
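To give a concrete flavor of such research, the sketch below shows the placement rule at the heart of a Chord-style lookup: nodes and keys are hashed onto a single identifier ring, and each key is stored at its successor, the first node at or after the key's position on the ring. This is a toy under stated assumptions (a small ring, made-up node names, and a linear successor scan instead of Chord's finger-table routing), not the project's actual code:

    import hashlib
    from bisect import bisect_right

    M = 2 ** 16  # toy ring size; real Chord uses a 160-bit SHA-1 space

    def ring_hash(name: str) -> int:
        # Map node names and keys onto the same identifier ring.
        return int(hashlib.sha1(name.encode()).hexdigest(), 16) % M

    # Eight hypothetical nodes placed on the ring.
    nodes = sorted(ring_hash(f"node-{i}") for i in range(8))

    def successor(key: str) -> int:
        # A key lives on the first node at or after its hash, wrapping
        # around the end of the ring.
        h = ring_hash(key)
        i = bisect_right(nodes, h)
        return nodes[i % len(nodes)]

    print(successor("some-file.txt"))  # ring position of the responsible node

Because every peer can evaluate the same rule, any of them can locate any key without asking a central index, which is exactly the property that made such lookup services attractive to P2P designers.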

It is also important to note that the computer is primarily an information device, whose primary function is to copy data from location to location, even more than to perform other types of computation. This makes digital duplication intrinsic to the normal functioning of any computer; it is impossible to realize the goal of general-purpose open computing with any type of copy protection. Enforcement of copyright in the digital era should therefore be seen not as a technical issue but as a new reality to which society needs to adapt.

Distributed Systems


Distributed systems are becoming key components of IT companies for data-centric computing. A general example of these systems is the Google infrastructure or any similar system. Today most of the evolution of these systems is focused on how to analyze and improve performance. A P2P system is also a distributed system and shares, depending on the implementation, the characteristics and problems of distributed systems (error/failure detection, aligning machine time, etc.).
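One classic answer to the clock-alignment problem just mentioned is Lamport's logical clock, in which events are ordered by counters instead of unreliable wall clocks. The following is a minimal sketch of that textbook device, not code from any particular P2P system:

    class LamportClock:
        # Logical clock: order events by counters, not wall-clock time.
        def __init__(self):
            self.time = 0

        def tick(self) -> int:
            # Any local event (including sending) advances the clock.
            self.time += 1
            return self.time

        def receive(self, msg_time: int) -> int:
            # On receipt, jump past the sender's timestamp.
            self.time = max(self.time, msg_time) + 1
            return self.time

    a, b = LamportClock(), LamportClock()
    stamp = a.tick()       # a sends a message carrying timestamp 1
    b.receive(stamp)       # b's clock becomes 2: send ordered before receive
    print(a.time, b.time)  # -> 1 2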

Ganglia


Ganglia is a scalable distributed monitoring system for high-performance computing systems such as clusters and Grids. It is based on a hierarchical design targeted at federations of clusters. It leverages widely used technologies such as XML for data representation, XDR for compact, portable data transport, and RRDtool for data storage and visualization. It uses carefully engineered data structures and algorithms to achieve very low per-node overheads and high concurrency.
Ganglia has been ported to an extensive set of operating systems and processor architectures, and is currently in use on thousands of clusters around the world. It has been used to link clusters across university campuses and around the world and can scale to handle clusters with 2000 nodes. ( http://ganglia.info/ )

Distributed Computation


The basic premise behind distributed computation is to spread computational tasks among several machines distributed in space. Most new projects focus on harnessing the idle processing power of "personal" distributed machines, the average home user's PC. This current trend is an exciting technology area that covers a subset of distributed systems (client/server communication, protocols, server design, databases, and testing).

This new implementation of an old concept has its roots in the realization that there is now a staggering number of computers in our homes that are vastly underutilized; not only home computers, but few businesses use their computers for the full 24 hours of any day. In fact, seemingly active computers may be using only a small part of their processing power: word processing, email, and web browsing require very few CPU resources. So the "new" concept is to tap this underutilized resource (CPU cycles), which in aggregate can surpass several supercomputers at substantially lower cost, since the machines are individually owned and operated by the general public.
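A minimal sketch of this coordinator/work-unit pattern follows, with local processes standing in for volunteer machines and a toy computation standing in for real science; nothing here comes from any actual volunteer-computing framework:

    from multiprocessing import Pool

    def work_unit(chunk):
        # A volunteer machine crunches one independent unit; here, a toy
        # sum of squares stands in for real scientific computation.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        job = list(range(1_000_000))
        # The coordinator cuts the job into independent work units...
        units = [job[i:i + 100_000] for i in range(0, len(job), 100_000)]
        # ...and dispatches them; each pool process plays one idle PC.
        with Pool() as volunteers:
            partials = volunteers.map(work_unit, units)
        print(sum(partials))  # the coordinator merges the partial results

The design hinges on the units being independent: any unit can go to any idle machine, results can arrive in any order, and a lost unit can simply be re-dispatched, which is what makes unreliable home PCs usable at all.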

SETI@Home


One of the most famous distributed computation projects is SETI@home, hosted by the Space Sciences Laboratory at the University of California, Berkeley, in the United States. SETI is an acronym for the Search for Extra-Terrestrial Intelligence. SETI@home was released to the public on May 17, 1999.

On average, it used hundreds of thousands of home Internet-connected computers in the search for extraterrestrial intelligence. The whole point of the program is to use your CPU's free cycles when the machine would otherwise be idle. The original project has since been deprecated and folded into BOINC.

BOINC

BOINC has been developed by a team based at the Space Sciences Laboratory at the University of California, Berkeley, led by David Anderson, who also leads SETI@home.

BOINC stands for Berkeley Open Infrastructure for Network Computing: a non-commercial (free/open-source), LGPL-licensed middleware system for volunteer computing. Originally developed to support the SETI@home project, and still hosted at ( http://boinc.berkeley.edu/ ), it is intended to be useful for applications in areas as diverse as mathematics, medicine, molecular biology, climatology, and astrophysics. It is an open-source software platform for computing with volunteered resources that extends the original concept and lets you donate computing power to other scientific research projects, such as:

  • Climateprediction.net: study climate change.
  • Einstein@home: search for gravitational signals emitted by pulsars.
  • LHC@home: improve the design of the CERN LHC particle accelerator.
  • Predictor@home: investigate protein-related diseases.
  • Rosetta@home: help researchers develop cures for human diseases.
  • SETI@home: look for radio evidence of extraterrestrial life.
  • Folding@Home ( http://www.stanford.edu/group/pandegroup/folding/ ): understand protein folding, misfolding, and related diseases.
  • Cell Computing: biomedical research (Japanese; requires nonstandard client software).
  • World Community Grid: advance our knowledge of human disease (requires BOINC 5.2.1 or greater).

As a "quasi-supercomputing" platform, BOINC has over 435,000 active computers (hosts) worldwide. BOINC is funded by the National Science Foundation through awards SCI/0221529, SCI/0438443, and SCI/0506411.

It is also put to commercial use, as some private companies are beginning to use the platform to assist in their own research. The framework is supported by various operating systems: Windows (XP/2K/2003/NT/98/ME), Unix (GNU/Linux, FreeBSD), and Mac OS X.

World Community Grid (WCG)


Created by IBM, World Community Grid ( http://www.worldcommunitygrid.org/ ) is similar to the above systems. Fourteen IBM servers serve as "command central" for WCG. When they receive a research assignment from an organization, they will scour it for security bugs, parse it into data units, encrypt them, run them through a scheduler and dispatch them out in triplicate to the army of volunteer PCs.

To be a volunteer one only needs to download a free, small software agent (similar to a screensaver).

Projects are selected based on their potential to benefit from WCG technology and to address humanitarian concerns, and are chosen by an independent, external board of philanthropists, scientists, and officials.

The software is open source (LGPL), written in C/C++ with wxWidgets, and is available for Windows, Mac, or Linux.

Grid Networks


Grids first emerged in the use of supercomputers in the U.S., as scientists and engineers sought access to scarce high-performance computing resources that were concentrated at a few sites.

Open Science Grid

The Open Science Grid ( http://www.opensciencegrid.org/ ) was built and is operated by the OSG Consortium. It is a U.S. grid computing infrastructure that supports scientific computing via an open collaboration of science researchers, software developers, and storage and network providers from universities and national laboratories.

Globus Alliance

The Globus Alliance ( http://www.globus.org/ ) is a community of organizations and individuals developing fundamental technologies behind the "Grid," which lets people share computing power, databases, instruments, and other on-line tools securely across corporate, institutional, and geographic boundaries without sacrificing local autonomy.

The Globus Alliance also provides the Globus Toolkit, an open-source software toolkit used for building robust, secure grid systems (peer-to-peer distributed computing on supercomputers, clusters, and other high-performance systems) and applications. A wiki is available to the Globus developer community ( http://dev.globus.org/wiki/Welcome ).

High Throughput Computing (HTC)


Some scientists try to extract as many floating-point operations per second (FLOPS) or per minute as possible from their computing environment; we refer to these environments as High Performance Computing (HPC) environments. Others concentrate on the same goal over larger time scales, like months or years, which is the domain of High Throughput Computing.

The term HTC was coined in a seminar at the NASA Goddard Space Flight Center in July 1996 as a distinction between High Performance Computing (HPC) and High Throughput Computing (HTC).

HTC's focus is on processing power rather than on the network, but such systems can also be built over a network, and so can be seen as a grid network optimized for processing power.

Condor Project

The goal of the Condor Project ( http://www.cs.wisc.edu/condor/ ) is to develop, implement, deploy, and evaluate mechanisms and policies that support High Throughput Computing (HTC) on large collections of distributively owned computing resources. Guided by both the technological and sociological challenges of such a computing environment, the Condor Team has been building software tools that enable scientists and engineers to increase their computing throughput.

IBM Grid Computing


IBM, among other big fish in the IT pond, spends some resources investigating grid computing. Its grid computing efforts are listed on the project's portal page ( http://www.ibm.com/grid ). All seem to leverage the company's enterprise position in server machines to provide grid services to customers. The most active project is the Grid Medical Archive Solution, a scalable virtual storage solution for healthcare, research, and pharmaceutical clients.

Content distribution/Hosting


The traditional method of distributing large files is to put them on a central server. The server and the client then share the content across the network using agreed-upon protocols (from HTTP and FTP to an infinite number of variations). Over IP connections the data can be sent over TCP or UDP, or a mix of the two; this all depends mostly on the requirements of the service, the machines, the network, and many security considerations.
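As a baseline for the contrast drawn below, here is a minimal sketch of that central-server model using only Python's standard library; the address and port are arbitrary, and the server simply publishes the current directory:

    from http.server import HTTPServer, SimpleHTTPRequestHandler
    import threading
    import urllib.request

    # One central HTTP server holds the content...
    server = HTTPServer(("127.0.0.1", 8000), SimpleHTTPRequestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # ...and every client pulls from that single address, so bandwidth,
    # storage, and CPU all concentrate on one machine.
    data = urllib.request.urlopen("http://127.0.0.1:8000/").read()
    print(len(data), "bytes fetched from the central server")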

The advantages regarding optimization of speed, availability, and consistency of service with respect to optimal localization are nothing unheard of. Akamai Technologies and Limelight Networks, among other similar solutions, have attempted to address this issue commercially, and even Google has distributed the location of its data centers to improve the responsiveness of its services. This addresses the requirements of large-scale content and service distribution, but it is not a full decentralization of the control structure.

P2P evolved to solve a distinct problem: central servers do not scale well. Bandwidth, storage, and CPU constitute a single point of failure that can easily bring a system's function to an end, as does any centralization of services.

Note:
One simple example of how centralization is problematic is the denial-of-service attack, which generally consists of the efforts of one or more machines to temporarily or indefinitely interrupt or suspend the services of a server connected to the Internet, generally by overloading it.

Conferences and papers


P2P is not yet a well-established field of research, or even a computer-science-specific field. P2P technology covers so many subjects that it is still hard to delimit all its interactions as a field of its own. As a result, much relevant information is hard to find.

Conferences

For conferences, one of the locations with up-to-date information from a non-commercial and platform-agnostic viewpoint is the list provided by the GNUnet project at https://gnunet.org/conferences .


 

To do:
Create list of annual conferences


Papers
 

To do:
Location for free access paper regarding P2P

From an Economics Perspective


For a P2P system to be viable there must be a roughly equal share of work between peers; the goal should be a balance between consumption and production of resources, with the aim of maintaining a single class of participant on the network, sharing the same set of responsibilities. Most P2P systems have a hard time creating incentives for users to produce (contribute), and end up generating a pyramidal (or multiple, tree-like) scheme as users interact, making the systems dependent on the network effect created. The more users the system has, the more attractive it is (and the more value it has); as with any system that depends on the network effect, its success is based on compatibility and conformity issues.

The digital revolution has created a wave of changes, some of which are yet to be fully understood. One of the most important in regard to economics, and one that producers of commercial goods are fully aware of, is the dilution of value due to the increasing accumulation and durability of older creations. Digital media has made old creations not only more durable but also more easily accessible, more visible, and cheap to accumulate.

Even if hardly anyone defends works in the public domain, those works continue to relieve consumers of the need to acquire new ones. It follows that this makes our common cultural and historic record a prime target for the dark forces that arise from basic economic interests.

The popularization of the production and distribution of Cultural goods


P2P radically shifts the economics of distribution and the business models dealing with intangible cultural goods (intellectual property), since most content is virtual, made only of information. This information can be any type of non-material object made from ideas (text, multimedia, etc.). In this way, content is also the myriad ways ideas can be expressed. It may consist of music, movies, books, or any single aspect or combination of these.

Music


The digital revolution has forced the music industry to reshape itself in various ways, from promotion to distribution and everything in between.

Radio

Radio had long been the way the industry managed demand for its products. It was not a question of quality but of product "visibility", and an easy way to generate revenue from royalties.

The advent of the transistor radio and the Walkman should have clued the industry in to the changes to come. Even here, simple digitization and the possibility of moving radio from the airwaves to the Internet caused much pain in the industry and considerably eroded its ability to shape demand.

From Tape to MP3

Another front of attack came from the very media on which the content was sold. This, too, should have been foreseen since the advent of the 8-track tape; its culmination in the compact cassette was another revolutionary change to the business model, as revolutionary to the music industry as it was to video. Adaptation led to the CD and mixed-content offerings. In fact, it seems that all the pressures and incentives to change were being dictated by consumers and by the industry's own economic efforts to reduce production costs, but those who held decision power over the industry remained blind to the technological changes they themselves were fostering.

Here the move to digital not only permitted even easier reproduction, but ultimately made the product completely virtual and independent of the media it was sold on. The Walkman evolved into the portable CD player, and ultimately died silently as solid-state memory and portable players took over the market.

Internet
Sidenote

Dan Bull's music video D.O.A.C.T.A. (Death of ACTA), ACTA being the Anti-Counterfeiting Trade Agreement, in which the creator explains concisely how independent music creators see this continued call for intellectual-property control policies.

The need for content intermediaries continues its rapid decline. Most intermediaries do not add much value to the product beyond providing better marketing orientation and general business know-how to content producers.

The time when volume permitted record companies to offer better production facilities is over, as the cost of producing an audio work is now accessible to all, even in physical form. Intermediaries are in fact becoming too costly for the perks they can still provide. They create unnecessary barriers between producers and consumers.

In today's interconnected world, distribution channels are so diversified that creating artificial control schemes for digital distribution (physical or virtual) will only degrade consumer satisfaction without increasing product value, while increasing costs for the sanctioned distributors.

If customers are faced with a DRM-encumbered product, then unauthorized copies, if made publicly available, constitute a competing product without limitations: a better product with a better price tag. In fact, the use of DRM creates differentiation and promotes the creation of parallel markets (if one can call them that, since most offerings are gratis; multiple DRM schemes would fragment the market in the same way). This results from consumers' wishes not being satisfied by the primary offer, or simply from more choices being made available.

Today radio, TV, and the press as publicity vehicles are becoming increasingly ineffective compared to the interactive media the Internet permits, which can also be used as a direct distribution channel. More and more artists are becoming aware of the advantages of controlling the copyright of their productions and taking on the responsibility of distributing their own works; this has also increased the level of communication with the consumer.

This quickly became evident in the music industry, mostly because the medium has always been extremely volatile and consumers have had a great number of ways of using the content; attempts to reduce the content's freedom of movement have always been made, and have always failed. The same is becoming true for video, and with time even books will have to deal with this new reality, as is now seen with the written press. As the medium for the content becomes ubiquitous, cheap, and acceptable to consumers, producers will have to adapt.

Video

Movies
TV

Recently some television networks have been rethinking their approach to audiences; this has resulted from the level of acceptance and interest in DVD show collections and from several online attempts to improve distribution. Since anyone can now easily (if illegally) download their favorite shows, a problem similar to the fragmentation of distribution channels seen in the music recording industry with the rise of alternative delivery technologies will have a similar result if the television industry fails to adapt and meet audiences' expectations of quick and easy access to fresh content.


 

To do:
extend, address the rise of independent productions, compare the p2p meme of decentralization with the static settings of the industry, we are all producers now, real time interaction


ISPs


ISPs have been shaping and throttling P2P traffic, especially on the more popular networks, for years, resulting in an ongoing cat-and-mouse game between ISPs and P2P developers. In the US, the network neutrality discussion, and recently the evidence of these actions by ISPs against P2P traffic, has turned the matter into a political issue.

In November 2007, Vuze, creators of Azureus (a BitTorrent application), petitioned the FCC, resulting in an FCC hearing held in December 2007. One of the issues raised there was the level of data available on BitTorrent throttling. This led to a statement by the General Counsel at Vuze, Jay Monahan: “We created a simple software “plug-in” that works with your Vuze application to gather information about potential interference with your Internet traffic.”

This plugin has been gathering more hard data on the actions of ISPs; a growing list of ISPs that interfere with P2P protocols is maintained on the Azureus wiki ( http://www.azureuswiki.com/index.php/Bad_ISPs ).

From a Sociological Perspective


From person to person, or user to user, a new world is being born in which all are at the same time producers and consumers. Information will be free, since the costs of distribution will continue to fall and the power of creative participation is in anyone's hands.

Is it morally wrong?


As discussed previously, there is no common ground on which to answer this question; views differ wildly, and even states disagree on the interpretation and legality of restricting or implementing intellectual property rights.


 

To do:


For every action there is a reaction

It is evident today that there is a social movement against what is generally perceived as the corruption of copyright over public goods; that is, a minority is legally attempting to impose extensions of copyright and reductions of liberties to defend the economic interests of mostly sizable international corporations, which in their vast majority are not even the direct creators of the goods. In this particular case the goods are virtual, mostly digital, with a cost of replication approaching zero, and are not eroded by time or use.

DRM (Digital Rights Management)


When we talk about DRM it is useful to keep in mind that the rights being "managed" are completely distinct from the simple intellectual property rights that were granted protection on non-digital media. Since the level of control permitted can be extreme, for those who respect the DRM, the rights removed from the consumer are sometimes simply freedoms that existed in past media, for example the freedom to lend. It has reached the point where the concept of buying a good has slowly been changing into renting, in such a way that you ultimately do not own, or have full control over, what you paid for.

Note:
The issues around DRM have become of such magnitude, as this type of technology grows more abusive of consumers' rights and more surreptitious in its implications, that, given the lack of consumer awareness, education efforts have begun to appear, providing labels and indicators that permit a more educated selection of what is consumed. The Free Software Foundation, via its defectivebydesign.org campaign, started in 2008 to promote the adoption of a distinctive label indicating DRM-free content/media (http://www.defectivebydesign.org/drm-free).

A more unified marker for DRM-free files that also educates downloaders about DRM is a powerful way to increase the value of being DRM-free. People looking for ebooks in places like Amazon often have trouble figuring out which ebooks have DRM and which don't because Amazon does not advertise that information. This label is a step toward solving that problem, making it easy for people who oppose DRM to find like-minded artists, authors, and publishers to support.

In late 2005, market-based rationales influenced Sony BMG's deployment of DRM systems on millions of compact discs that threatened the security of its customers' computers and compromised the integrity of the information infrastructure more broadly. This became known as the Sony BMG rootkit debacle (see the paper Mulligan, Deirdre and Perzanowski, Aaron K., "The Magnificence of the Disaster: Reconstructing the Sony BMG Rootkit" for detailed information).

In February 2007, Steve Jobs wrote an open letter addressing DRM, since it was impacting Apple's business on the iTunes/iPod store ( http://www.apple.com/hotnews/thoughtsonmusic/ ).

In a presentation at Arizona State University (2007), David Hughes, senior vice president of technology for the RIAA, dubbed Apple's spiritual leader Steve Jobs a "hypocrite" over his attitude to DRM on iTunes: "While Steve has been banging on about the music companies dropping DRM he has been unwilling to sell his Pixar movies through iTunes without DRM and DVDs without CSS encryption."

A danger for historical records

 

To do:


P2P United


A now-disbanded organization formed by six of the biggest P2P groups (those behind eDonkey, Grokster, Morpheus, Blubster, LimeWire, and BearShare), with Adam Eisgrau as executive director. It was started in mid-July 2003 to provide a way to lobby for P2P in the U.S. Congress and at WIPO, the UN organization that administers intellectual property treaties, since the file-sharing industry (as an industry) had no identifiable name and face in Washington or in the media.

This attempt was a bust, and since then most of the members of the group have lost court cases or have settled and closed operations.


 

To do:
Complete


Peer-to-Peer working group


The Peer-to-Peer WG (P2Pwg).

A great article about the problems with the creation of the working group, by Tim O'Reilly (10/13/2000), is available at openp2p.com ( http://www.openp2p.com/pub/a/p2p/2000/10/13/working_grp.html ).

P2P in non technological fields


There are also several movements attempting to establish how to apply the concept of P2P to non-technological fields like politics, economics, and ecology.
One such attempt is the Foundation for P2P Alternatives ( http://p2pfoundation.net ), which functions as a clearinghouse for open/free, participatory/P2P, and commons-oriented initiatives, and aims to be a pluralist network to document, research, and promote peer-to-peer alternatives.


The most commonly shared files on such networks are MP3 files of popular music and DivX movie files. This has led many observers, including most media companies and some peer-to-peer advocates, to conclude that these networks pose grave threats to the business models of established media companies. Consequently, peer-to-peer networks have been targeted by industry trade organizations such as the RIAA and MPAA as a potential threat. The Napster service was shut down by an RIAA lawsuit; both the RIAA and the MPAA spend large amounts of money lobbying lawmakers for legal restrictions. The most extreme manifestation of these efforts to date (as of January 2003) has been a bill introduced by California Representative Berman, which would grant copyright holders the legal right to break into computer systems believed to be illegally distributing copyrighted material, and to subvert the operation of peer-to-peer networks. The bill was defeated in committee in 2002, but Rep. Berman indicated that he would reintroduce it during the 2003 sessions.

As attacks from media companies expand, the networks have adapted at a quick pace and become technologically more difficult to dismantle. This has caused the users of such systems to become targets. Some have predicted that open networks may give way to closed, encrypted ones, where the identity of the sharing party is not known by the requesting party. Other trends toward immunity from media companies appear in wireless ad hoc networks, where each device is connected in a true peer-to-peer sense to those in the immediate vicinity.

While historically P2P file sharing has been used to illegally distribute copyrighted materials (like music, movies, and software), future P2P technologies will certainly evolve and be used to improve the legal distribution of materials.


 

To do:
..."IP addresses are an identifier used to locate a particular network interface on the Internet. Be this a router, a PC, Mac, PDA, mobile phone or otherwise (with modules capable of utilizing one ranging to the size of a finger nail). IP addresses are not proof that a particular TYPE (PC running Windows, Linux or other free software, PDA, mobile phone, etc.) of computer hardware was used in the transmission. Nothing about this hardware can be *assumed*, and also nothing about the users IF ANY, of this hardware. So, I define my second point, which is that these electronic devices (of the types I listed above) may be operated without regard to physical location or the actual OWNER of the IP address." in "Patricia Santangelo files Answer, Demands Trial by Jury"...


As should be obvious by now, the problem P2P technologies pose to the owners of content, to the control of distribution channels, and to the limitation of users' (consumers') rights is huge; the technology is making holes in the standard ideology that governs the relations between producers and consumers, and some new models have been proposed (see for example Towards solutions to “the p2p problem” - http://groups.sims.berkeley.edu/pam-p2p/ ).

In 2007, a handful of the wealthiest countries and blocs (the United States, the European Commission, Japan, Switzerland, Australia, New Zealand, South Korea, Canada, and Mexico) started secretive negotiations toward a treaty-making process to create a new global standard for intellectual property rights enforcement: the Anti-Counterfeiting Trade Agreement (ACTA). Initially due to be adopted at the 34th G8 summit in July 2008, it is now hoped to be concluded in 2010.

It has been argued that the main purpose of the treaty is to provide safe harbor for service providers so that they may not hesitate to provide information about infringers; this may be used, for instance, to quickly identify and stop infringers once their identities are confirmed by their providers.

Similarly, it provides for the criminalization of copyright infringement, granting law enforcement the power to perform criminal investigations and arrests, and to pursue criminal citations or prosecution of suspects who may have infringed on copyright.

More pressingly, being an international treaty, it allows for these provisions—usually administered through public legislation and subject to judiciary oversight—to be pushed through via closed negotiations among members of the executive bodies of the signatories, and once it is ratified, using trade incentives and the like to persuade other nations to adopt its terms without much scope for negotiation.

Is it Illegal?


Peer-to-peer in itself is nothing particularly new. We can say that an FTP transfer or any other one-on-one transfer is P2P, like an IRC user sending a DCC file to another, or even email; the only thing that can be illegal is the use one makes of a particular tool.

Legal uses of P2P include distributing open or public content: movies, software distributions (Linux, updates), and even Wikipedia DVDs are found on P2P networks. It can also be used to bypass censorship (as, for instance, when Michael Moore's film Sicko leaked via P2P), as a publicity machine to promote products and ideas, or even as a market analyst's tool.

However, trading copyrighted information without permission is illegal in most countries. You are free to distribute your favorite Linux distribution, videos or pictures you have taken yourself, MP3 files of a local band that gave you permission to post their songs online, maybe even a copy of open-source software or a book. The view of legality rests foremost on cultural and moral grounds, and in a globally networked world there is no fixed line you should avoid crossing. One thing is certain: most people don't produce restricted content; most view their creations as a gift to the global community. So it is mathematically evident that a minority is "protected" by the restrictions imposed on the use and free flow of ideas, concepts, and culture in general.

P2P, as we will see, is not only about file sharing; it is more generally about content and service distribution.

 
Sharing is not theft, and theft is not the same as piracy; this is true under any law. Is sharing theft? And is theft piracy? Surely not.

Sharing content that you have no right to is not theft. It has never been theft anywhere in the world; anyone who says it is theft is wrong. Sharing content that you do not own (or have the rights to distribute) is copyright infringement. Duplicating a digital good does not reduce the value of the original good, nor does it signify a subtraction of the same from the owner. On the other hand, making that same digital good available to others without a license can have a well-understood effect of augmenting the visibility of said product, resulting in free advertisement and discussion about the product; this generally results in an increase in demand for it, and has been validated in tests done with digital book, music, and video releases.

Using the term "piracy" to describe copyright infringement is a metaphoric heuristic, a public relations stunt from the lobbies of big corporations that represent copyright holders, or hold the copyright over commercializable cultural goods, as a way to mislead the public and legislators, leading to practices that directly damage society and culture (see the Sonny Bono Copyright Term Extension Act, also known as the Mickey Mouse Protection Act).

The legal battles we are now accustomed to hearing about deal mostly with control, and to a lesser degree with rights preservation. Control over the way distribution is achieved (who gets what, in what way) results in artificial scarcity. This is about money, as there is added value in controlling and restricting access by format and by limitations in time and space.

 
[Cartoon about free culture, intellectual property, and Internet piracy, found on The Pirate Bay in late February / early March 2009.]

Content and Indexers


One other distinction that needs to be made is between distributing content (especially in violation of copyright) and indexing said content. How it is indexed, and to what ends, seems to be important, if for nothing else because copyright holders will prefer to prosecute services that can pay for infringing on their rights. But indexing may not constitute an illegal activity at all, since no rights are directly affected; in some jurisdictions the issue seems to be how openly a service promotes said infringement and whether that constitutes an illegal action by itself, and that can be a point that is extremely hard to make. In any case, by 2015 most public BitTorrent trackers, for instance, would openly comply with DMCA requests and implement takedown procedures, even if often complaining that requests are at times too broad, going so far as to cover works over which the requester has no rights.


 

To do:

  • Encryption
  • paedophiles and terrorists


World Intellectual Property Organization (WIPO)


The World Intellectual Property Organization is one of the specialized agencies of the United Nations. WIPO was created in 1967 with the stated purpose "to encourage creative activity, [and] to promote the protection of intellectual property throughout the world". The convention establishing the World Intellectual Property Organization was signed at Stockholm on July 14, 1967.


 

To do:
Add more info on the WIPO and relevant treaties.


 
[An Italian manifesto saying "to share is not to steal", referring to the legal status of P2P in Italy.]

In August 2007, the music industry was rebuffed in Europe on file-sharing identifications: a court in Offenburg, Germany refused to order ISPs to identify subscribers when asked to by music industry representatives who suspected specified accounts were being used for copyright-infringing file sharing. The refusal was based on the court's understanding that ordering the ISPs to hand over the details would be "disproportionate", since the music industry representatives had not adequately explained how the actions of the subscribers would constitute the "criminally relevant damage" that could be a basis for requesting access to the data.

This was not an isolated incident in Germany: also in 2007, the Celle chief prosecutor's office refused a data request on the justification that substantial damage had not been shown. This follows the opinion of a European Court of Justice (ECJ) Advocate-General, Juliane Kokott, who had published advice two weeks earlier backing this stance, stating that countries whose law restricted the handing over of identifying data to criminal cases were compliant with EU directives. The advice was directed at a Spanish case in which a copyright holders' group wanted subscriber details from the ISP Telefónica. The ECJ isn't obliged to follow an Advocate-General's advice, but does so in over three-quarters of cases.

In most European countries, copyright infringement is only a criminal offense when conducted on a commercial scale. This distinction is important because it means public funds are not directed into investigating and prosecuting personal, most often private, copyright violations with limited economic impact.


 

To do:


On June 12, 2007, the Société des Producteurs de Phonogrammes en France (SPPF - http://www.sppf.com/ ), an entity that represents the legal interests of, and collects copyright revenue on behalf of, independent French audio creators, publicly announced that it had launched a civil action in the Paris Court of First Instance requesting a court order to terminate the distribution and operation of Morpheus (published by StreamCast) and Azureus, and demanding compensation for monetary losses. On 18 September 2007 a similar action was brought against Shareaza, and on 20 December 2007 the SPPF announced a new action, this time against LimeWire. All of these legal actions seem to be based on an amendment to the national copyright law stipulating that civil action can be taken against software creators/publishers that do not take steps to prevent users from accessing illegal content.


 

To do:
Also mention ALPA (Association against audiovisual piracy)


 

To do:
Digital Economy Bill


Federation Against Copyright Theft (FACT)

edit

FACT is a trade organization in the United Kingdom established to represent the interests of its members in the film and broadcasting business on copyright and trademark issues. Established in 1983, FACT works with law enforcement agencies on copyright-infringement issues.

FACT has produced several adverts which have appeared at the beginning of videos and DVDs released in the UK, as well as trailers shown before films in cinemas. While operating with the same function as the Motion Picture Association of America (MPAA), FACT has avoided public outcry by focusing most of its actions on targeting serious and organized crime involving copyright infringement.

In an interesting demonstration of cross-border mutual support between similar business organizations, in 2008 FACT helped the MPAA in a sting operation against the streaming-links site SurfTheChannel. The MPAA not only participated in the questioning by bringing its own investigators; it was allowed to examine the apprehended equipment, and managed to find a United States programmer who had worked for the site to testify in the legal proceedings. The programmer was not prosecuted in the US, but agreed to pay the MPAA the amount he had made working at SurfTheChannel (see MPAA Agents Expose Alleged Movie Pirates for details).

In July 2009 in Barcelona, Spain, Judge Raul N. García Orejudo declared that “P2P networks, as a mere transmission of data between Internet users, do not violate, in principle, any right protected by Intellectual Property Law,” when dismissing the Sociedad General de Autores y Editores (SGAE) legal action for the closing of the eD2K link site elrincondejesus.com.

In August 2009, the Inspecção-Geral das Actividades Culturais (IGAC) sent a notification to the biggest ISP in Portugal, Portugal Telecom (PT), to remove pages that hosted links to unlicensed, freely downloadable copyrighted material on external pages. This notification came into being after the situation was reported by the Internet Anti-Piracy Civic Movement (Movimento Cívico Antipirataria na Internet, MAPINET); this association's most prominent members are the Association for the Audiovisual Commerce of Portugal (Associação de Comércio Audiovisual de Portugal, ACAPOR), the Portuguese Phonographic Association (Associação Fonográfica Portuguesa, AFP), the Association for the Management and Distribution of Rights (Associação para a Gestão e Distribuição de Direitos, AUDIOGEST), the Federation of Video Editors (Federação de Editores de Videogramas, FEVIP), the Cooperative for the Management of the Artists' Rights (Cooperativa de Gestão dos Direitos dos Artistas, GDA), the Association for the Management of Authors' Rights (Associação para a Gestão de Direitos de Autor, GEDIPE, part of the AGICOA Alliance), the Portuguese Society of Authors (Sociedade Portuguesa de Autores, SPA), and some other producers and editors as well as some translators. Since no actual illegal content is hosted on servers under Portugal Telecom's control, it seems the links are not a violation of the users' terms of use at the identified blog portal service, and no action has been taken so far. A list of the 28 offending pages is available in an article in Portuguese.

Norway's Personal Data Act (PDA) makes it mandatory for ISPs in the country to delete all IP address logs on their customers more than three weeks old, as such logs are considered personal information. This is a huge step forward in personal data protection laws, but it will also make the work of "pirate-hunters" more difficult. The Simonsen law firm is an example, known for its lawyer Espen Tøndel, a figurehead on these matters, and for having held since 2006 (now terminated) a temporary license from Norway's data protection office to monitor suspected IP addresses without legal supervision.

Under US law, "the Betamax decision" (Sony Corp. of America v. Universal City Studios, Inc.) holds that copying "technologies" are not inherently illegal if substantial non-infringing use can be made of them. This decision, predating the widespread use of the Internet, applies to most data networks, including peer-to-peer networks, since legal distribution of some files can be performed. These non-infringing uses include sending open-source software, Creative Commons works, and works in the public domain. Other jurisdictions tend to view the situation in somewhat similar ways.

The US is also a signatory of the WIPO treaties, treaties that were partially responsible for the creation and adoption of the Digital Millennium Copyright Act (DMCA).

As stated in US copyright law, one must keep in mind the provisions for fair use, licensing, copyright misuse, and the statute of limitations.

MGM v. Grokster


 

To do:
http://www.eff.org/IP/P2P/MGM_v_Grokster/


Recording Industry Association of America (RIAA)


The RIAA and the labels took an aggressive stance as soon as online music file sharing became popular. They won an early victory in 2001 by shutting down the seminal music-sharing service Napster.

The site was an easy target because Napster physically maintained the computer servers where illegal music files, typically in high-fidelity, compressed, download-friendly MP3 format, were stored. [With P2P networks, the files are stored on individual user computers; special software lets consumers "see" the files and download them onto their own hard drives.]
—Daphne Eviatar, "Record industry, music fans out of tune," The Recorder, August 20, 2003

The Recording Industry Association of America (RIAA) ( http://www.riaa.com/ ) is the trade group that represents the U.S. recording industry. The RIAA receives funding from the four major music groups (EMI, Warner, Sony BMG, and Universal) and hundreds of small independent labels.

Motion Picture Association of America (MPAA)


The MPAA is an American trade association that represents the six biggest Hollywood studios. It was founded in 1922 as the Motion Picture Producers and Distributors of America (MPPDA). It focuses on advancing the business interests of its members and administers the MPAA film rating system.

In the early 1980s, the Association opposed the videocassette recorder (VCR) on copyright grounds. In a 1982 congressional hearing, MPAA president Jack Valenti decried the "savagery and the ravages of this machine" and compared its effect on the film industry and the American public to the Boston Strangler.

The MPAA lobbies for stricter legislation regarding copyright safeguards, protection extensions, and sanctions, and actively pursues copyright infringement, including fighting against the sharing of copyrighted works via peer-to-peer file-sharing networks, both legally and in technologically disruptive ways.

The MPAA has promoted a variety of publicity campaigns designed to increase public awareness about copyright infringement, including Who Makes Movies?; You can click, but you can't hide; and You Wouldn't Steal a Car, a 2004 advertisement appearing before program content on many DVDs.

The MPAA's British counterpart is the Federation Against Copyright Theft (FACT).


 

To do:
Warner Bros. to Try File Sharing in Germany
MPAA sues newsgroup, P2P search sites By John Borland, Published on ZDNet News, February 23, 2006


Canada


Canada has a levy on blank audio recording media, created on March 19, 1998, by the adoption of new federal copyright legislation. Canada introduced this levy regarding the private copying of sound recordings; other states that share a similar copyright regime include most of the G-7 and European Union members. In-depth information regarding the levy may be found in the Canadian copyright levy on blank audio recording media FAQ ( http://neil.eton.ca/copylevy.shtml ).

Sharing a border and close ties with its neighbor, Canada has historically been less prone to serve corporate interests and has a policy that contrasts in its social aspects with that of any other country in the American continent. The reality is that Canada has been highly influenced and even pressured (economically and politically) by its strongest neighbor, the USA, to comply with its legal, social, and economic evolution. Recently (November 2007), the government of Canada attempted to push for the adoption of a DMCA-modeled copyright law, so as to comply with the WIPO treaties the country signed in 1997, in a move similar to the USA's; this resulted in a popular outcry against the legislation and will probably result in its alteration. The visibility of this last attempt was due to the efforts of Dr. Michael Geist, a law professor at the University of Ottawa considered an expert in copyright and the Internet, who feared the law would copy the worst aspects of the U.S. Digital Millennium Copyright Act.

Darknets vs Brightnets


Due to the refusal to legislate in accordance with the public's needs and wants, by adding copyright extensions (US, UK) and by actively promoting the promulgation of laws similar to the DMCA in other countries, a monoculture is created in which a virtual monopoly on cultural goods emerges, generating something of a cultural imperialism.

This reality pushes the population to move its support from transparent distribution systems (brightnets) to more closed systems (darknets), which will increasingly depend on social connections to get into, like the old speakeasy bars that popped up during Prohibition. Legislating against the people will once more prove to be a failure.

A P2P brightnet for content distribution, where no one breaks the law and so no one needs to hide in the dark, can only be feasible if built around an owner-free medium/system, or by being as heavily controlled and owned a medium/system as the old centralized one; the latter would probably revolve around a centralized entity guaranteeing control and managing the content, maybe even requiring the use of some DRM scheme, an overall failed system.

These types of networks have already been tested, and failed. Since content is also information, the need for privacy (or the lack of it) on an open system will always generate a more layered system, which will ultimately degenerate into a darknet in order to survive legal actions.

Shadow play


In any open society, secrecy, intentional obfuscation of facts, and usurpation or suppression of rights should be seen not only negatively but often as a civilizational step backward. Often illegal or strictly restricted, these types of activities are not only carried out by states but also actively pursued by large corporations, which in some nations (or standard-setting organizations) are able to exert unprecedented control over policy.

In a world where end users intend to have control over their own hardware and software, some actions are not intended to see the light of day. This section is dedicated to bringing out some of these subjects/actions, in an attempt to help the reader fully appreciate some of the less publicized information that has some bearing on the evolution of P2P.

Some organizations (or groups with vested interests) still think that it is up to them to think for the masses, instead of just pushing the information out and granting the public the ability to make their own informed decisions. There is a clear, organized attempt to hide facts from the public. Information is power.

Attempting to control access and the flow of information is a fool's errand. People have always rebelled against industry-mandated black-box solutions and their artificial restrictions, which serve no purpose but the economic interests of those attempting to exert control.

Since P2P (and P2P-related technologies) started to pop up and gather momentum, user security on the OS has been championed and placed above user freedoms, by imposing choices that disregard the need to properly educate users, diminishing their ability to make their own choices. This philosophy keeps people technologically challenged and afraid of change. It is even funny to see to what degree efforts are made to keep these "security enhancements" hidden.

Not all is lost, though: some people cannot be made to comply with this state of things, and some information can be found and some actions reversed.

about MS Windows
  • TCPIP.SYS - [fix],[info] for Windows XP.

ISPs


One of the most important aspects of running an Internet service provider (ISP) is being quick to adapt to customers' demands and to changes in demands and uses. This adaptation can be done from two sides: by modifying the network and creating new service offerings, or by suppressing and degrading new consumer trends. The latter is easily done if the ISP is part of a larger media conglomerate that can not only influence public opinion but shape legislation.

ISPs are not very pleased with P2P technologies due to the load they bring to their networks. Although they sell their Internet connections as unlimited usage, if people actually take them up on the offer, ISPs will eventually be unable to cope with the demand at the same price/profit level. The simpler solution would be to match the offer to the use, by increasing capacity and fulfilling at least their contractual obligations, but some have decided instead to simply throttle (i.e., slow down) peer-to-peer traffic, or even to intentionally interfere with it. This has made clients increasingly worried about some ISPs' actions, from traffic shaping (protocol/packet prioritization) to traffic tampering.

The San Francisco-based Electronic Frontier Foundation (EFF), a digital rights group, has successfully verified these kinds of efforts by Internet providers to disrupt some uses of their services, and the evidence seems to indicate an increasing trend, as other reports have reached the EFF and been verified by an investigation by The Associated Press.

The EFF has released a report on interference with Internet traffic at Comcast ( http://www.eff.org/wp/packet-forgery-isps-report-comcast-affair ); other information about this subject is available on the EFF site.

Metrics


Forced traffic shaping


Some ISPs are now using more sophisticated measures (e.g., pattern/timing analysis, or categorizing ports based on side-channel data) to detect P2P traffic. This means that even encrypted traffic can be throttled. However, against ISPs that continue to use simpler, less costly methods to identify and throttle P2P networks, the current countermeasures added to P2P solutions remain extremely effective.

Traffic tampering


Traffic tampering is more worrying than traffic shaping, and harder to notice or verify. It can also be defined as spoofing: the injection of adulterated/fake information into a communication by gaming a given protocol. It is like the post office taking the identity of one of your friends and sending you mail in their name.

Pcapdiff ( http://www.eff.org/testyourisp/pcapdiff/ ) is a free Python tool developed by the EFF to compare two packet captures and identify potentially forged, dropped, or mangled packets.
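The following sketch re-creates the idea behind pcapdiff rather than its actual code: compare captures taken simultaneously at the two endpoints, and flag packets that one side attributes to the other but that were never actually sent. It assumes the third-party scapy library for reading pcap files, and the capture filenames and IP address are hypothetical:

    from scapy.all import rdpcap, IP, TCP  # assumes scapy is installed

    def fingerprints(pcap_path: str, src_ip: str) -> set:
        # Identify every TCP packet claiming to come from src_ip by a
        # (sequence number, ack number, flags) triple.
        prints = set()
        for pkt in rdpcap(pcap_path):
            if IP in pkt and TCP in pkt and pkt[IP].src == src_ip:
                prints.add((pkt[TCP].seq, pkt[TCP].ack, str(pkt[TCP].flags)))
        return prints

    # Hypothetical captures taken at the two ends of one conversation.
    sent = fingerprints("sender_side.pcap", "192.0.2.1")
    seen = fingerprints("receiver_side.pcap", "192.0.2.1")

    # Packets the receiver attributes to the sender that the sender never
    # emitted are candidates for in-path forgery, e.g. injected TCP resets.
    for forged in sorted(seen - sent):
        print("possibly forged packet:", forged)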

When there is a demand, offers to satisfy it quickly follow. One example is the Sandvine Incorporated application, which is able to intercept P2P communications and subvert the protocols. This type of application has dual purposes (from the owner's perspective); for instance, the Sandvine application was primarily designed to alter Gnutella network traffic as a path "optimizer". But as BitTorrent adoption now seems to be taking primacy, recent versions of the Sandvine application are capable of intercepting BitTorrent peer-to-tracker communication in order to identify peers based on the IP address and port numbers in the peer list returned from the tracker. When Sandvine later sees connections to peers in the intercepted peer lists, it may (according to policy) break these connections by sending counterfeit TCP resets. Even if the BitTorrent protocol continues to implement countermeasures, they have costs, and the problem turns into an "arms race"; the issue is moral or legal in nature, with security implications.

The fight for network neutrality

Network neutrality deals with the need to prevent ISPs from double-dipping on charges: billing the customers who already pay for their broadband connections while also charging websites and organizations for prioritization of traffic according to its origin, destination, or protocol.

The secretive Anti-Counterfeiting Trade Agreement

The Anti-Counterfeiting Trade Agreement (ACTA) is a proposed plurilateral agreement for the purpose of establishing international standards on intellectual property rights enforcement.

ACTA aims to establish a new international legal framework that countries can join on a voluntary basis, and it would create its own governing body outside existing international institutions such as the World Trade Organization (WTO), the World Intellectual Property Organization (WIPO), or the United Nations.

The idea of creating a plurilateral agreement on counterfeiting was developed by Japan and the United States in 2006. Canada, the European Union and Switzerland joined the preliminary talks throughout 2006 and 2007. Official negotiations began in June 2008, with Australia, Mexico, Morocco, New Zealand, the Republic of Korea and Singapore joining the talks. Negotiations are planned to finish in September 2010.

Negotiating countries have described it as a response "to the increase in global trade of counterfeit goods and pirated copyright protected works".

The scope of ACTA is broad, covering counterfeit goods, generic medicines and copyright infringement on the Internet. Because it is in effect a treaty, ACTA would override many court precedents defining consumer rights as to "fair use", and would either change or remove limitations on the application of intellectual property laws.

After a series of draft text leaks in 2008, 2009 and 2010 the negotiating parties published the official version of the current draft on 20 April 2010.

United States

Both the Obama administration and the Bush administration had rejected requests to make the text of ACTA public, with the White House saying that disclosure would cause "damage to the national security."

In 2009, Knowledge Ecology International filed a FOIA (Freedom of Information Act) request in the United States, but their entire request was denied. The Office of the United States Trade Representative's Freedom of Information office stated the request was withheld for being material "properly classified in the interest of national security."

US Senators Bernie Sanders (I-VT) and Sherrod Brown (D-OH) penned a letter on 23 November 2009, asking the United States Trade Representative to make the text of the ACTA public.

Secret negotiations

The Electronic Frontier Foundation (EFF) opposes ACTA and calls for a greater public spotlight on the proposed treaty in its paper "Sunlight for ACTA" ( http://www.eff.org/action/sunlight-acta ).

Since May 2008 discussion papers and other documents relating to the negotiation of ACTA have been uploaded to Wikileaks, and newspaper reports about the secret negotiations swiftly followed.

In June 2008 Canadian academic Michael Geist, writing for Copyright News, argued that "Government Should Lift Veil on ACTA Secrecy", noting that before documents leaked on the Internet, ACTA was shrouded in secrecy. Coverage of the documents by the Toronto Star "sparked widespread opposition as Canadians worry about the prospect of a trade deal that could lead to invasive searches of personal computers and increased surveillance of online activities." Geist argues that public disclosure of the draft ACTA treaty "might put an end to fears about iPod searching border guards" and that it "could focus attention on other key concerns including greater Internet service provider filtering of content, heightened liability for websites that link to allegedly infringing content, and diminished privacy for Internet users." Geist also argues that greater transparency would lead to a more inclusive process, highlighting that the ACTA negotiations have excluded both civil society groups and developing countries. Geist notes that "reports suggest that trade negotiators have been required to sign non-disclosure agreements for fear of word of the treaty's provisions leaking to the public." He argues that there is a need for "cooperation from all stakeholders to battle counterfeiting concerns" and that "an effective strategy requires broader participation and regular mechanisms for feedback".

In November 2008 the European Commission responded to these allegations as follows:

It is alleged that the negotiations are undertaken under a veil of secrecy. This is not correct. For reasons of efficiency, it is only natural that intergovernmental negotiations dealing with issues that have an economic impact, do not take place in public and that negotiators are bound by a certain level of discretion. However, there has never been any intention to hide the fact that negotiations took place, or to conceal the ultimate objectives of the negotiations, the positions taken in the negotiations or even details on when and where these negotiations are taking place. The EU and other partners (US, Japan, Canada, etc.) announced their intention to start negotiations of ACTA on 23 October 2007, in well publicised press releases. Since then we have talked about ACTA on dozens of occasions, including at the European Parliament (INTA committee meetings), and in numerous well attended seminars. Commission organised a stakeholders' consultation meeting on 23 June in Brussels, open to all – industry and citizens and attended by more than 100 participants. US, Australia, Canada, New Zealand and other ACTA partners did the same.

This position changed on 10 March 2010, when the European Parliament passed a resolution directly criticizing ACTA, the negotiation proceedings, and the infringements of fundamental human rights.

Threats to freedom and fundamental human rights

An open letter signed by many organizations, including Consumers International, EDRi (27 European civil rights and privacy NGOs), the Free Software Foundation (FSF), the Electronic Frontier Foundation (EFF), ASIC (French trade association for web 2.0 companies), and the Free Knowledge Institute (FKI), states that "the current draft of ACTA would profoundly restrict the fundamental rights and freedoms of European citizens, most notably the freedom of expression and communication privacy."

Aaron Shaw, Research Fellow at the Berkman Center for Internet & Society at Harvard University, argues that "ACTA would create unduly harsh legal standards that do not reflect contemporary principles of democratic government, free market exchange, or civil liberties. Even though the precise terms of ACTA remain undecided, the negotiants' preliminary documents reveal many troubling aspects of the proposed agreement", such as removing "legal safeguards that protect Internet Service Providers from liability for the actions of their subscribers", in effect giving ISPs no option but to comply with privacy invasions. Shaw further says that ACTA "would also facilitate privacy violations by trademark and copyright holders against private citizens suspected of infringement activities without any sort of legal due process".

The Free Software Foundation (FSF) has published "Speak out against ACTA" ( http://www.fsf.org/campaigns/acta ), arguing that ACTA will create a culture of surveillance and suspicion, and that it threatens free software by creating a culture "in which the freedom that is required to produce free software is seen as dangerous and threatening rather than creative, innovative, and exciting."

ACTA would also require that existing ISPs no longer host free software that can access copyrighted media; this would substantially affect many sites that offer free software or host software projects, such as SourceForge. Specifically, the FSF argues that ACTA will make it more difficult and more expensive to distribute free software via file sharing and P2P technologies like BitTorrent, which are currently used to distribute large amounts of free software. The FSF also argues that ACTA will make it harder for users of free operating systems to play non-free media, because DRM-protected media would not be legally playable with free software.

On 10 March 2010, the European Parliament adopted a resolution criticizing the ACTA, with 663 in favor of the resolution and 13 against, arguing that "in order to respect fundamental rights, such as the right to freedom of expression and the right to privacy" certain changes in the ACTA content and the process should be made.

Nate Anderson of Ars Technica pointed out in his article ( http://arstechnica.com/news.ars/post/20080602-the-real-acta-threat-its-not-ipod-scanning-border-guards.html ) that ACTA encourages service providers to hand over information about suspected infringers by giving them "safe harbor from certain legal threats". Similarly, it provides for the criminalization of copyright infringement, granting law enforcement the power to perform criminal investigations, make arrests, and pursue criminal citations or prosecution of suspects who may have infringed on copyright. It also allows criminal investigations and invasive searches to be performed against individuals for whom there is no probable cause, and in that regard weakens the presumption of innocence and allows what would in the past have been considered unlawful searches.

Since ACTA is an international treaty, it is an example of policy laundering used to establish and implement legal changes. Policy laundering allows legal provisions (usually administered through public legislation and subject to judicial oversight) to be pushed through via closed negotiations among members of the executive bodies of the signatories. Once ratified, companies based in non-member countries may be forced to follow the ACTA requirements, since they would otherwise fall outside the safe harbor protections. The use of trade incentives and the like to persuade other nations to adopt treaties is also a standard approach in international relations, and additional signatories would have to accept ACTA's terms with little scope for negotiation.

From 16–18 June 2010, a conference was held at the Washington College of Law, attended by "over 90 academics, practitioners and public interest organizations from six continents". Their conclusions were published on 23 June 2010 on the American University Washington College of Law website. They found "that the terms of the publicly released draft of ACTA threaten numerous public interests, including every concern specifically disclaimed by negotiators."

Requests for disclosure

In September 2008 a number of interest groups urged parties to the ACTA negotiations to disclose the language of the evolving agreement. In an open letter the groups argued: "Because the text of the treaty and relevant discussion documents remain secret, the public has no way of assessing whether and to what extent these and related concerns are merited." The interest groups included the Consumers Union, the Electronic Frontier Foundation, Essential Action, IP Justice, Knowledge Ecology International, Public Knowledge, Global Trade Watch, the US Public Interest Research Group, IP Left (Korea), the Canadian Library Association, the Consumers Union of Japan, the National Consumer Council (UK) and Doctors Without Borders' Campaign for Essential Medicines.