
InfoRM Privacy Law Update – October 2015


In this edition of InfoRM:

Big Data to help Vulnerable Children

Debate around the use of Predictive Risk Modelling (PRM) has arisen following a recent Radio New Zealand Insight documentary highlighting the use of PRM in New Zealand. PRM is an algorithm that uses information collected from various sources to generate a risk score predicting the probability that a child will be mistreated.

Following publication of the White Paper for Vulnerable Children in October 2012, the New Zealand Government commissioned Auckland University to undertake PRM research. The Ministry of Social Development (MSD) has since modified this research and is currently evaluating the feasibility of using information gathered from welfare and health agencies to “identify people at risk early enough to allow for effective intervention”. Variables include family members’ contact with welfare services, indicators of mental health, location, sentencing history, family violence, single-parent status and caregiver age. This data can be used to assess whether intervention is necessary to protect vulnerable children from maltreatment.
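For readers interested in the mechanics, a risk score of this kind is typically produced by a statistical classifier fitted to historical administrative data. The sketch below is purely illustrative and is not MSD’s actual model: the variable names, weights and intercept are invented to show the general shape of a logistic-regression risk score.

import math

# Illustrative only: a real PRM model is trained on historical administrative
# data. These variables, weights and the intercept are invented for this sketch.
WEIGHTS = {
    "welfare_contact": 0.9,     # family members have had contact with welfare services
    "mental_health_flag": 0.6,  # recorded indicators of mental health issues
    "family_violence": 1.1,     # recorded family violence
    "single_parent": 0.4,       # single-parent household
    "young_caregiver": 0.5,     # caregiver below a given age threshold
}
INTERCEPT = -4.0  # baseline log-odds (invented)

def risk_score(case):
    """Return a probability-style risk score in [0, 1] via logistic regression."""
    log_odds = INTERCEPT + sum(w * case.get(name, 0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-log_odds))

# A case would be flagged for review only if its score fell in the top band
# (for example, the highest-scoring 1% of a cohort).
example = {"welfare_contact": 1, "family_violence": 1, "single_parent": 1}
print(f"risk score: {risk_score(example):.3f}")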

The latest research suggests that PRM is reasonably accurate: of the children ranked in the top 1% for risk (tracked from the cohort of children born between 2007 and 2010), 42.2% had been found to be maltreated by the age of five. However, many people are concerned about the use of PRM. Opponents highlight the risks surrounding big data and the inaccuracies that can arise in its collection and use, suggesting that such inaccuracies could lead to people being wrongly identified as ‘at risk’ and unnecessarily exposed to welfare and enforcement agencies.

The debate highlights both the potential benefits and risks of big data. PRM is facilitated by an approved information sharing agreement between MSD, the Ministries of Justice, Health and Education, and the New Zealand Police that allows the exchange and use of information to identify and respond to vulnerable children. While that agreement ensures the holding and sharing of information is subject to certain requirements, given the information’s highly sensitive nature, careful use and control will be essential to maintain public confidence and trust. On the other hand, the insights available through PRM and similar modelling techniques offer a glimpse of a new level of accuracy and responsiveness that may be possible in the provision of public services.


New Zealand Developments

Deputy PM outlines role of data in effective public services

The spotlight on PRM (see above) comes at the same time as Hon Bill English, following the release of the Productivity Commission’s report into more effective social services, has outlined the need for a data-driven “social investment” approach to providing public services.

In a speech given as part of the Treasury Guest Lecture Series on Social Investment, Minister English noted that a key challenge for government is working out whether interventions are effective and that data technology is an essential tool for doing this.

“[Effective measurement] will require the development of the data and measurement infrastructure that delivers the feedback loop to support decision makers. We want to take advantage of developments in data technology and analytics that have transformed many service industries across the economy. We are incorporating these tools in Budget 2016, and we expect in future that they be applied across all social services – that is, becoming systemic and systematic.”

The Minister’s comments also signal that the Government may be considering a more permissive regime for the sharing of information within the public service, and also increased involvement from private and community providers who have knowledge of circumstances “on the ground”.

“Communities need to know what is happening in their area, so we’ll be increasing the sharing of anonymised data with the public. Social services providers need to understand their clients so that they can better tailor their services. The Government as the funder of services needs to know what interventions make a real difference. Each of these requires data to flow securely, but freely.”

“Protecting privacy is essential. The problem to solve is finding the balance between protecting identity and accessing the opportunity to do better. That’s an important part of why the Government is establishing the Data Futures Partnership, a permanent organisation that will work on these difficult problems. It is also why we are reviewing the Privacy Act.”

HRRT decision highlights new damages scale and dangers of personal liability

Director of Human Rights Proceedings v Crampton [2015] NZHRRT 35 (29 July 2015)

The Human Rights Review Tribunal (Tribunal) recently awarded damages of $18,000 for humiliation and loss of dignity after a damaging letter about the Plaintiff (President of the Massey University Extramural Students’ Society) was disclosed to a student magazine. The decision is another reminder for agencies to be aware of privacy considerations in heated employment (or similar) environments, and (following Hammond v Credit Union Baywide) indicates the Tribunal’s increased willingness to make substantial awards for humiliation, even at the lower end of the spectrum. The case is also notable for the fact that the individual who disclosed the information, rather than the organisation for which he worked, appears to have been treated as wholly liable for the damages.

The Plaintiff complained to the Privacy Commissioner (Commissioner) after her Vice-President (the Defendant) released a letter, written by members of the Society, to Massey University’s student magazine, Massive Magazine. The letter outlined concerns regarding the Plaintiff’s performance. The Tribunal found that there was a breach of Information Privacy Principle (IPP) 11, as disclosure of the letter was not for the purpose for which the information was collected (the letter having been created to address performance concerns, not to embarrass and harm the Plaintiff).

The Tribunal accepted that the disclosure caused the Plaintiff significant humiliation, significant loss of dignity and significant injury to her feelings and awarded $18,000, being “towards the lower end of the middle band”. The Tribunal noted that it was careful to exclude humiliation arising from causes other than disclosure of the letter. In addition, the Tribunal required the Defendant to attend an “Introduction to the Privacy Act” course run by the Office of the Commissioner.

Interestingly, the Tribunal was careful to distance the actions of the Defendant (and others who wrote the letter) from the Society more generally. The Tribunal noted that the letter was drafted by only four members of the Society, which fell short of quorum. There was no evidence of a resolution or decision of the Society’s Executive Committee directing or authorising the release of the letter. The Tribunal considered that, whatever the capacity in which the Defendant disclosed the letter, s 126 of the Privacy Act (which makes an individual liable for his or her actions when acting as employee or agent) meant that the Defendant could be liable.

The decision serves as a reminder for individuals to be careful before disclosing information that could breach IPP 11 or cause embarrassment to others. An employer may not be willing to accept responsibility for an employee’s actions (including by contribution to legal costs).


Immigration New Zealand refuses to correct birth date

A recent decision of the Commissioner raises questions about what agencies are required to do in situations where a person requests a correction of their personal information but the proposed correction cannot be conclusively verified as true. 

The Complainant, a former Ethiopian refugee, produced documents that cast doubt on Immigration New Zealand’s (INZ) records of his birth year and sought correction of his age. Rather than his recorded age of 12, the information suggested the Complainant could be as old as 18.

How did such a significant disparity between the Complainant’s recorded and actual age arise? The Complainant’s aunt estimated the Complainant was born in 2000 when she sponsored his refugee application to come to New Zealand in 2011. Her estimate was used to apply for an Ethiopian passport and birth certificate (there being no official record of his birth). Although a paediatrician advised INZ that the Complainant looked older than his date of birth suggested, INZ relied on the aunt’s information.

Following a bone density scan and a dental examination in 2012 and 2013, which suggested the Complainant was likely significantly older than recorded, the Complainant sought correction of his birth date, but INZ denied the request. The Complainant later obtained an amended birth certificate from Ethiopia and again sought correction of his information. INZ again refused, but offered to add a “correction requested but not granted” note to the Complainant’s file.

In his decision, the Commissioner held that INZ’s (likely incorrect) recorded birth date would cause harm to the Complainant on an ongoing basis unless it was amended. As INZ’s date restricted the Complainant’s entitlement to a driver’s licence, financial assistance and the adult minimum wage, full correction (rather than a file note) was appropriate.

The case raises questions about what agencies should do where, even if it is accepted that an agency’s records are incorrect, no information conclusively shows what the correct information is. While it appears INZ got the Complainant’s birth date wrong, it was arguably in an invidious position, having to guess an age somewhere between 15 and 18 years old. The Commissioner does not indicate what date INZ should have used or how it should have chosen one. The matter also raises questions about what investment (in money and time) an agency should make to ascertain the “truest” information, and whether, in circumstances with a less significant impact on the individual concerned, the better option is for an agency to record no information rather than either of two potentially incorrect pieces of information.

The Commissioner has referred the case to the Director of Human Rights Proceedings to consider bringing a claim before the Human Rights Review Tribunal.

Privacy codes of practice amended

The Commissioner has recently issued updated Privacy Act codes of practice following the enactment of the Harmful Digital Communications Act 2015. The changes ensure the codes are consistent with amendments to the Privacy Act which place a limitation on use and disclosure of personal information from publicly available publications. Additional minor changes have also been made to update the codes.

Digitisation of old records poses problems for accuracy of personal information

The Commissioner has talked to the Department of Internal Affairs (DIA) after a person complained that details of a bankruptcy, since annulled, were accessible online.

The complaint arose after the New Zealand Gazette became an online publication in 2014 and retrospectively published material from 1993 onwards. DIA is responsible for publishing the Gazette, and is required to publish notices of people who are declared bankrupt or are under a No Asset Procedure (NAP). These notices are required to be accessible in perpetuity under the Public Records Act 2005.

In response to the complaint, DIA accepted that this information was readily available and, in many cases, out of date. It undertook to communicate further to affected individuals that the publications are permanent public records, and to add text to the online Gazette notices indicating the period of bankruptcy or NAP and whether it had been discharged. The complainant was satisfied with this response and withdrew the complaint.

This example illustrates that obligations to maintain historical records do not exempt an entity from the requirement in the Privacy Act to correct information that is inaccurate, out of date, incomplete or misleading. That obligation continues even if the information was accurate at the time it was published. The case also shows how the concept of “practical obscurity” - where administrative or other barriers reduce the accessibility of public records, making them almost private - has been significantly eroded by the rise of the internet.

It is also relevant that the Commissioner’s discussion was confined to the role of the agency holding the data, rather than the search engines that provide access to it (in contrast to Google Spain v AEPD & González, which recognised a right to be forgotten in the European Union). DIA accepted that it could technically block direct searches of personal information on the Gazette website, but this would ultimately be ineffective, as the information could still be accessed through cached search results and duplication of the information elsewhere.

Around the World of Privacy:

USA

Cable company Comcast has settled regulator-brought proceedings for US$33m after it accidentally published the names, phone numbers and addresses of 75,000 people (including judges and law enforcement officials) who had paid to keep that information private. The breach resulted from a system upgrade that failed to identify those who had asked for their information to remain confidential. The case is a reminder of the importance of robust systems to ensure information is protected.

Elsewhere, a Minnesota judge has certified a class action brought by banks against Target Corp. The case is the latest development following Target’s 2013 data breach, which compromised at least 40 million credit cards during the busy holiday season, and demonstrates the potentially massive cost of a large-scale data breach.

As well as reacting to breaches, there have also been moves to prevent harm before it happens. Texas has passed the Relationship Privacy Act, intended to guard against “revenge porn” being posted online. The Act is similar to New Zealand’s Harmful Digital Communications Act, with both providing penalties for those who post harmful content and mechanisms for victims to have it removed.


Australia

Vodafone Australia has received stinging criticism after it emerged that, in 2011, an employee accessed the text and call information of a journalist who had reported on a major data breach at the company. While Vodafone commissioned KPMG to conduct an internal investigation, neither the journalist, the Australian Privacy Commissioner nor the police were notified of the privacy breach. New South Wales’ Minister for Innovation and Better Regulation, Victor Dominello, has criticised Vodafone’s explanation that the breach was the work of a rogue employee, and has asked for evidence of Vodafone’s processes for dealing with privacy breaches.

While Australian telcos are not currently permitted to access metadata in that way, they will be required to retain it once the Telecommunications (Interception and Access) Amendment (Data Retention) Act comes into force next month. The Act compels telcos to “keep” their customers’ metadata so that security agencies can later access it. A point for debate will be whether the Act does away with the carve-out in the recent Grubb decision (holding that Privacy Act requests can only be made for telco metadata that is readily retrievable by the particular telco), on the basis that any information “kept” by a telco will be retrievable by it. However, as the information must be “readily” retrievable, it would appear that simply keeping the information will not always meet that standard. Some commentators also suggest that the cost of compliance with the new Act will be passed on to consumers or will put some small ISPs out of business.

The Australian Information Commissioner has also launched an investigation into the Ashley Madison privacy breach.


Russia

A Russian court has awarded a man 50,000 rubles (NZ$1,180) on the basis that Google illegally accessed his emails and sent him targeted advertising. Google rejected the decision, stating: “Humans are not reading your emails. Our automated systems scan emails in order to prevent spam reaching your inbox and to detect bad things like malware. We use the same system to ensure you see advertising that is relevant to you”. The case raises questions regarding the extent to which liability can (or should) be avoided on the basis that information is collected by an automated computer program rather than a human.


This publication is intended only to provide a summary of the subject covered. It does not purport to be comprehensive or to provide legal advice. No person should act in reliance on any statement contained in this publication without first obtaining specific professional advice. If you require any advice or further information on the subject matter of this newsletter, please contact the partner/solicitor in the firm who normally advises you.
