A Study in Counterfeiting Remedies – Denmark’s Approach


This is the first in a series of articles on remedies considered for online counterfeiting and piracy, in light of the dismantling of the proposals set forth in the Protect IP Act (PIPA) and the Stop Online Piracy Act (SOPA) from earlier this year. For more on PIPA and SOPA, please see the prior posts on these topics.

The purpose of these articles is to explore potential ways to combat online counterfeiting and piracy, particularly counterfeiting and piracy that occurs overseas but is directed at a U.S.-based audience. The most notable example in recent months is Megaupload, which was taken down through a cooperative effort of seven countries. (For more on the Megaupload take-down, visit The Guardian (UK)'s Megaupload Page and the U.S. Department of Justice's news release announcing the indictment. For more about the U.S. Immigration and Customs Enforcement take-downs, see ICE's news releases about its intellectual property enforcement efforts.)

However, cooperative cross-border efforts are only possible with countries that share the U.S.'s commitment to protecting intellectual property rights. Not all countries do. So what are trademark and copyright owners to do to protect their IP rights in the online world, where geographic borders mean very little?

This series will examine enforcement efforts in other countries as an illustration of possible enforcement mechanisms that might be available, depending on how new legislation on this topic might be written.

Danish Maritime & Commercial Court Decision

A few days ago, Norsker & Co. (a Danish law firm) posted an article about a recent case, Hublot SA Geneve v. Bronsztejin, decided by the Maritime and Commercial Court on May 3, 2012. According to the article, Danish purchasers ordered five counterfeit Hublot watches from a Chinese online service, paying Dkr 2,250 for them. When the watches arrived at Danish customs, they were seized pending proof that they had been purchased for private use. The purchasers did not provide such proof, and the counterfeits were destroyed.

The court then punished the purchasers of these counterfeit watches by assessing monetary fines and destroying the counterfeit watches. There does not appear to have been any action taken against the sellers or any other entities in the distribution chain. The purchasers were required to pay the Danish Customs Office's cost to destroy the counterfeit goods (Dkr 2,500 = USD $438.34), damages for the trademark violation (presumably paid to the trademark holder) in the amount of Dkr 5,000 (USD $859.49), and "costs" (presumably court costs) in the amount of Dkr 15,500 (USD $2,664.41). (The currency converter used here was accessed on May 22, 2012.)
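For readers who want a rough sense of how the kroner figures compare, here is a minimal conversion sketch in Python. The exchange rate is my own assumption (roughly 0.17 USD per krone, approximately the rate in late May 2012), so the results will differ slightly from the converted figures quoted above, which came from an online converter accessed on May 22, 2012.

```python
# Minimal sketch: convert the Danish kroner amounts from the Hublot decision
# into approximate U.S. dollars. The rate below is an assumption (roughly the
# late-May-2012 DKK-to-USD rate); plug in the rate from your preferred converter.
DKK_TO_USD = 0.172  # assumed exchange rate: USD per 1 DKK

amounts_dkk = {
    "purchase price of the five watches": 2250,
    "cost of destroying the goods": 2500,
    "damages for the trademark violation": 5000,
    "court costs": 15500,
}

for label, dkk in amounts_dkk.items():
    usd = dkk * DKK_TO_USD
    print(f"{label}: Dkr {dkk:,} is roughly USD ${usd:,.2f}")
```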

Based on this decision, it appears that in Denmark the courts have chosen to punish the purchasers of counterfeit goods rather than the intermediaries in the distribution chain. The summary did not mention any other defendants – such as the payment processor that processed the credit card payment, or the shipping service that carried the goods across borders.

Future Articles

The articles to follow in this series will consider enforcement mechanisms imposed in other countries – and perhaps competing approaches within the same jurisdiction – to see what other enforcement possibilities have been considered. Please note that I take no position on the effectiveness or fairness of any of these measures; I am simply collecting a laundry list of possible sanctions, and targets of those sanctions, for research purposes.

Second Circuit Partially Overturned Viacom v. YouTube Case

In an April 5, 2012 opinion, the Second Circuit concluded that the district court erred in part when it granted summary judgment in YouTube's favor on August 10, 2010, and remanded the case in part for further proceedings. Viacom Int'l et al. v. YouTube, Inc. et al., No. 10-3270, slip op. (2d Cir. Apr. 5, 2012).

The Second Circuit affirmed certain parts of the underlying opinion – particularly the district court's conclusion that the DMCA's § 512(c) safe harbor requires "knowledge or awareness of specific infringing activity" and its conclusion that three of YouTube's four software functions at issue in this case qualified for the safe harbor – but vacated the opinion in at least one important respect. Specifically, the Second Circuit rejected the district court's interpretation that the "right and ability to control infringing activity" requires "item-specific" knowledge. It also remanded the case to the district court for further fact-finding on whether the fourth software function qualified for a safe harbor. Id. at 2.

In the underlying case, Viacom was joined by several other film studios, television networks, music publishers, and sports leagues in alleging direct and secondary copyright infringement "based on the public performance, display, and reproduction of approximately 79,000 audiovisual 'clips'" that appeared on YouTube between 2005 and 2009. Id. at 8. Plaintiffs sought statutory damages under § 504(c) or, alternatively, actual damages – as well as declaratory and injunctive relief. Id. at 9.

After considering the arguments on appeal, the Second Circuit vacated the order granting summary judgment in YouTube's favor because 1) an issue of fact existed as to whether YouTube had actual knowledge or awareness of specific infringing activity on its website (making summary judgment inappropriate), 2) the "right and ability to control" provision does not require "item-specific" knowledge, and 3) further fact-finding was required before a conclusion could reasonably be reached about whether YouTube's fourth software process qualified for the safe harbor. Id. at 9.

The Court spent a fair amount of time discussing the evidence that YouTube officials may have had more than a general understanding that someone might post infringing material to the site. Id. at 19-22. Among other facts, the Court pointed to the following as evidence from which a jury could conclude that YouTube was on notice that infringing content probably existed on its site:

  • YouTube founder Jawed Karim prepared a March 2006 report in which he explained that several "well-known shows" could still be found on YouTube. He stated, "although YouTube is not legally required to monitor content . . . and complies with DMCA takedown requests we would benefit from preemptively removing content that is blatantly illegal and likely to attract criticism." Id. at 20-21. He added that a "more thorough analysis" was required before any content could be removed. Id. at 21.
  • Another YouTube founder, Chad Hurley, concluded in 2005 that certain videos titled "budlight commercials" should be rejected as potentially infringing, but another co-founder, Steve Chen, asked, "can we please leave these in a bit longer? Another week or two can't hurt," after which Karim replied that he "added back in all 28 bud videos." Id.
  • Finally, the Court described another 2005 email exchange (Id.) in the following way:
    • First, “Hurley urged his colleagues ‘to start being diligent about rejected copyrighted/inappropriate content,’ noting that ‘there is a cnn clip of the shuttle clip on the site today, if the boys from Turner would come to the site, they might be pissed?'”
    • Then, his colleagues responded: “but we should just keep that stuff on the site. I really don’t see what will happen. what? someone from cnn sees it? he happens to be someone with power? he happens to want to take it down right away. he gets in touch with cnn legal. 2 weeks later, we get a cease & desist letter. we take the video down.”
    • Finally, Karim agreed, stating “the CNN space shuttle clip, I like. we can remove it once we’re bigger and better known, but for now that clip is fine.”

Based on these and other similar facts, the Court found that plaintiffs had presented sufficient evidence for "a reasonable juror . . . [to] conclude that YouTube had actual knowledge of specific infringing activity, or was at least aware of facts or circumstances from which specific infringing activity was apparent." Id. at 22. As a result, the grant of summary judgment in YouTube's favor was simply "premature." Id.

Willful Blindness

The Court also analyzed whether "willful blindness" is equivalent to actual knowledge for purposes of direct or vicarious liability – an issue that the Court labeled "an issue of first impression." Id. at 22-24. Relying on Tiffany (NJ) Inc. v. eBay, Inc., 600 F.3d 93, 109 (2d Cir. 2010), the Court explained that when a service provider "has reason to suspect that users of its service are infringing a protected mark, it may not shield itself from learning of the particular infringing transactions by looking the other way." Viacom, No. 10-3270 at 23 (citing cases).

The Court further clarified that “willful blindness” could not be defined as “an affirmative duty to monitor.” Id. at 24.

The Court ultimately held that "the willful blindness doctrine may be applied, in appropriate circumstances, to demonstrate knowledge or awareness of specific instances of infringement under the DMCA." Id. The Court did not decide whether YouTube had actually been willfully blind to this type of infringing behavior by its users; it held only that willful blindness could serve as a basis for liability under the right conditions, and it remanded the case to the district court for consideration of whether the defendants "made a deliberate attempt to avoid guilty knowledge." Id.

Right of Control

As the Court stated, the § 512(c) safe harbor provides that eligible service providers must “not receive a financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity.” Id. at 24. The district court considered this argument only briefly, and concluded that this “right and ability to control” required item-specific knowledge. Id. (quoting 718 F. Supp. 2d at 527).

On appeal, the Court rejected both of the competing interpretations offered by the parties, concluding that the provision neither 1) requires the service provider to know of the particular infringing content before being able to control it (as proffered by defendants) – a construction that would render this provision (§ 512(c)(1)(B)) duplicative of § 512(c)(1)(A), a result disfavored as a matter of statutory interpretation – nor 2) merely codifies the common law doctrine of vicarious copyright liability (as proffered by plaintiffs). In its analysis of the common law doctrine, the Court quipped, "Happily, the future of digital copyright law does not turn on the confused legislative history of the control provision." Id. at 26.

The Court instead concluded that “the right and ability to control infringing activity under § 512(c)(1)(B) ‘requires something more than the ability to remove or block access to materials posted on a service provider’s website.'” Id. at 27 (quoting Capitol Records, Inc. v. MP3tunes, LLC, __ F. Supp. 2d __, 2011 WL 5104616, at *14 (S.D.N.Y. Oct. 25, 2011), among other cases).

The Court acknowledged that the "something more" remains to be defined, although it cited two cases in which the activity at issue rose to a level sufficient to establish the "right and ability to control." Id. at 28 (citing Perfect 10, Inc. v. Cybernet Ventures, Inc., 213 F. Supp. 2d 1146 (C.D. Cal. 2002) and MGM Studios, Inc. v. Grokster, Ltd., 545 U.S. 913 (2005)). Rather than setting a specific definition in this opinion, the Court remanded the case to the district court to consider whether plaintiffs "adduced sufficient evidence to allow a reasonable jury to conclude" that YouTube met the "right and ability to control" test, and whether it received a financial benefit directly attributable to that activity. Id. at 28-29.

Summary

The Court held the following:

1) § 512(c)(1)(A) requires knowledge or awareness of facts or circumstances that indicate specific and identifiable instances of infringement;

2) A reasonable jury could conclude that YouTube had knowledge or awareness with respect to a handful of specific clips; the case was remanded to determine whether YouTube had the requisite knowledge or awareness with respect to the clips actually at issue in this case;

3) The willful blindness doctrine can be applied, in appropriate circumstances, to demonstrate knowledge or awareness of specific instances of infringement; the case was remanded to determine whether the willful blindness doctrine was applicable here;

4) The district court erred by finding that the “right and ability to control” required item-specific knowledge of the infringing material; the case was remanded for further fact-finding on this issue;

And,

5) The district court was correct that three of the four software functions qualify for the safe harbor; the case was remanded for consideration of whether the fourth function so qualified.

Id. at 34-35. In connection with the remand, the Court left it to the district court’s discretion whether additional discovery would be required. The Court did not award reimbursement of costs to either party.

Google’s Privacy Policy Under Fire Before it Became Effective

On February 22, thirty-six attorneys general signed and sent a letter (through the National Association of Attorneys General) to Google objecting to its new privacy policy, scheduled to take effect on March 1. (See prior post about the provisions of the new policy.) The National Association of Attorneys General reports that the letter objects to Google's one-size-fits-all approach for all consumers of all of its various services. Specifically, the letter states, "Google's new privacy policy is troubling for a number of reasons. On a fundamental level, the policy appears to invade consumer privacy by automatically sharing personal information consumers input into one Google product with all Google products. Consumers have diverse interests and concerns, and may want the information in their Web History to be kept separate from the information they exchange via Gmail." Feb. 22, 2012 Letter. Indeed, the policy requires consumers to "allow information across all of these products to be shared, without giving them the proper ability to opt out." Id.

The letter also points out that users of Android phones will be significantly impacted: “Even more troubling, this invasion of privacy is virtually impossible to escape for the nation’s Android-powered smartphone users, who comprise nearly 50% of the national smartphone market. . . . For these consumers, avoiding Google’s privacy policy change may mean buying an entirely new phone at great personal expense. No doubt many of these consumers bought an Android-powered phone in reliance on Google’s existing privacy policy, which touted to these consumers that ‘We will not reduce your rights under this Privacy Policy without your explicit consent.'” Id. (Footnotes omitted). So much for that promise.

The letter requests a response by February 29. It’s unclear whether a response was provided.

EPIC v. FTC Lawsuit

In a related story, the Electronic Privacy Information Center filed suit on February 17 against the FTC to require it to enforce the Google Consent Order, thus barring the amended privacy policy from becoming effective. The court dismissed the complaint on February 24 for lack of jurisdiction over the FTC, but noted its own concerns about the terms of the privacy policy. EPIC filed an emergency appeal with the U.S. Court of Appeals for the D.C. Circuit on February 24, seeking argument before the March 1 effective date. Details about EPIC's efforts, copies of its pleadings, and information about the FTC Chairman's interview on C-SPAN, the EU's objection to the privacy policy changes, and the attorneys general's objections can be found on EPIC's Consent Order Page.

Note also that EPIC obtained (through a FOIA request) a copy of Google’s Privacy Compliance Report that it filed with the FTC on January 26, 2012. EPIC has posted a copy on its Consent Order Page (see the heading entitled, “‘FOIA Matters’ – EPIC Obtains Google Privacy Compliance Report”). The Privacy Compliance Report describes the March 1 privacy policy changes, although the description is rather watered down and focuses on Google’s efforts to notify its users that the change was coming.

Five Privacy Organizations Request Congressional Hearing

On February 24, five privacy organizations wrote to Representative Mary Bono Mack and Representative G.K. Butterfield of the House Energy and Commerce Committee's Subcommittee on Commerce, Manufacturing and Trade, objecting to the privacy policy and requesting that the currently scheduled private hearing with Google to discuss the changes be opened to the public. Feb. 24, 2012 Letter. The signatories included the Center for Digital Democracy (CDD), Consumer Watchdog, the Consumer Federation of America (CFA), and the U.S. Public Interest Research Groups. As of this writing, a hearing has not yet been scheduled, but continue to check the Committee's hearing schedule for updates.

Foreign Organizations Respond in Opposition to New Privacy Policy 

On February 27, 2012, the Commission Nationale de l’Informatique et des Libertés (CNIL) – an independent commission in the French government charged with “ensuring that information technology remains at the service of citizens, and does not jeopardize human identity or breach human rights, privacy or individual or public liberties” – sent a letter to Google, reporting that it has preliminarily concluded that “Google’s new policy does not meet the requirements of the European Directive on Data Protection (95/46/CE), especially regarding the information provided to data subjects.” (The phrase “data subject” refers to “an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity.” Art. 2, Definitions, (a))

The Commission had been asked by the Article 29 Data Protection Working Party of the EU to take the lead on this investigation.  (Google’s response to the initial letter from the Article 29 Data Protection Working Party was sent on February 3, 2012, and basically argued that its policies had not changed, but were merely consolidated.)

Earlier, and for similar reasons, on February 23, 2012, the Australian Privacy Commissioner, Timothy Pilgrim, wrote to Google on behalf of the Technology Working Group of the Asia Pacific Privacy Authorities, expressing concern about the implementation of the new changes. Google responded on February 29.

News Coverage

Several news outlets have published articles on this topic in the past few days.

Google’s Response Thus Far

Google has not posted any response on its press releases page, but that's not to say that Google hasn't responded directly to any of these organizations. At some point, I'm sure, Google will make a public statement – in some forum – defending its decision to consolidate its privacy policies, and the accumulated consumer data, into a single source, probably on the grounds that consolidation benefits consumers by allowing Google to customize its services to their use.

Conclusions

It appears that the only recourse for consumers who do not want to participate in the new consolidation of their data, currently spread over various Google services, is to cancel all of their Google accounts. Finding replacement services could be very time-consuming: setting up and transitioning to a new email account, removing YouTube video content and re-posting it somewhere that does not require such a broad license to the host, porting a blog from Blogger to WordPress (for instance) and publicizing the new address. For anyone who uses these services for business or advertising/marketing purposes, the cost in time and money – and perhaps in goodwill developed from a loyal following – of transitioning to new providers could be significant. As a result, perhaps it's not really a valid "choice."

Senate Committee on the Judiciary Schedules Hearing on FOIA


The Senate Committee on the Judiciary announced today that it will be holding a hearing on March 13, 2012 at 10:30 am in the Dirksen Senate Office Building on “The Freedom of Information Act: Safeguarding Critical Infrastructure Information and the Public’s Right to Know.” A list of expected witnesses has not yet been published.

It appears that this hearing will be simulcast over the Internet, so please check back on the Senate’s hearing announcement as the date gets closer for more details and a link to the simulcast.

Google Announces New Privacy Policy and Terms of Service

On March 1, 2012, Google's new Privacy Policy and Terms of Service go into effect.  These changes are billed as simplifying and consolidating over 60 different privacy policies that apply to Google's library of services and tools – specifically, that the result is "a lot shorter and easier to read."  (See the Overview for this text.)  What appears below is a brief summary of each document, but I encourage you to read the originals, as other issues may jump out at you based on your individual circumstances.  Following these summaries is a description of how to opt out.
Privacy Policy
A quick comparison between the new Privacy Policy and the October 20, 2011 version of the main Privacy Policy suggests that what has changed is perhaps not the information Google collects – or how that information is used – but how the policy is explained.  (I did not compare the March policy with any of the other 60-odd policies that Google referenced in its Overview, so there may be significant changes there.)
Among other notable provisions of the new policy are the following: 
  • Google may collect device-specific information (such as specifics about your hardware model and mobile network, including your phone number). Google may associate such device-identifying information or phone number with your Google account.
  • Google may collect and store server logs showing how you used its services (such as search engine queries), call history (to/from phone numbers, duration of calls, SMS routing information, forwarding numbers, and time and date), IP address, device crash history, browser type and browser language, and may also use cookies. (A purely hypothetical illustration of such a log entry appears after this list.)
  • Google may collect information about your location using GPS signals sent by a mobile device or sensor data searching for nearby WiFi access points and cell towers.
  • Google may use information from cookies or "pixel tags" to "improve your user experience and the overall quality of our services." The example Google gives is remembering your language preferences, but the breadth of this tool could be rather large.
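To make the server-log bullet above a bit more concrete, here is a purely hypothetical sketch (in Python) of the categories of data such a log entry might contain. It is built only from the items the policy itself lists; the field names and sample values are my own invention, not Google's actual log format.

```python
# Hypothetical illustration only: the categories below come from the Privacy
# Policy's own list (queries, telephony details, IP address, crash history,
# browser info, cookies). Field names and sample values are invented.
hypothetical_log_entry = {
    "search_query": "weather copenhagen",
    "ip_address": "192.0.2.1",  # documentation-reserved example address
    "browser": {"type": "ExampleBrowser", "language": "en-US"},
    "device_crash_history": [],
    "telephony": {
        "to_number": "+1-555-0100",   # fictional example numbers
        "from_number": "+1-555-0199",
        "call_duration_seconds": 120,
        "sms_routing_info": "example-route",
        "forwarding_number": None,
        "timestamp": "2012-03-01T10:30:00Z",
    },
    "cookies": ["example_cookie=abc123"],
}

for category, value in hypothetical_log_entry.items():
    print(f"{category}: {value}")
```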
Google offers several tools to provide “Transparency and Choice,” including links to review and control information tied to your Google Account, view and edit ad preferences, adjust your Google Profile, control with whom you share information and port your data from Google’s services through a tool called Dataliberation.
Google also reminds users that any information they share publicly will be indexable by search engines, including Google.  Google explains that it provides mechanisms to correct or remove incorrect data residing on its servers, but provides no links to place a request to begin the process.
Finally, Google provides information about what it shares with non-Google entities and explains its security protections.
Terms of Service
Google's new Terms of Service are pretty straightforward and contain provisions addressing warranties, disclaimers, limitations on liability, business use of Google's services, and choice of law (California – although the terms specifically disclaim California's conflict-of-laws rules).  The Terms of Service use expressions like "Don't misuse our services" and "Don't interfere with our services."  They also confirm that "you retain ownership of any intellectual property rights that you hold in" content that you upload to Google's services.  But, and this is significant, "When you upload or otherwise submit content to our Services, you give Google (and those we work with) a worldwide license to use, host, store, reproduce, modify, create derivative works (such as those resulting from translations, adaptations, or other changes we make so that your content works better with our Services), communicate, publish, publicly perform, publicly display and distribute such content."  Google follows this broad automatic license with this explanation:  "The rights you grant in this license are for the limited purpose of operating, promoting, and improving our Services, and to develop new ones."
Additional Details about Managing Your Online Profile

In its Privacy Policy, Google also provides information about how to opt out of certain advertising delivery (such as DoubleClick) – more information can be found here: https://www.google.com/intl/en/privacy/ads/.  Google explains that you can opt out of Network Advertising through a single page (http://www.networkadvertising.org/managing/opt_out.asp), which tells you whether certain cookies are present on your machine and allows you to opt out of each individually or all of them at once.  You can also permanently block the DoubleClick cookie.  Be sure to read all of the disclaimers before making permanent changes to your browser.

Note also that in the Advertising and Privacy section, Google explains, "Ads that appear next to Gmail messages can also be personalized based on emails in your account. Read more about ads in Gmail and your personal data."

You can also request removal of content that you don't want included in Google's search engine results.  Details are here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=164734. (Google cautions that these tools should only be used to remove pages urgently – such as when a private credit card number is exposed – where immediate action is required.  Google adds that using the tools too liberally within your own web site could cause functionality problems.)

As mentioned above, these new policies go into effect across the board for Google services on March 1, 2012.  You have a little time between now and then, and I'd encourage you to read these policies for yourself and determine what pieces (if any) matter to you so that you can make changes or opt out if necessary to protect your interests.