10.49 In IP 31, the ALRC asked whether the privacy principles should be amended to deal with the impact of developing technology on privacy.[63] In DP 72, the ALRC expressed the view that these issues should not be covered in the model UPPs, but instead could form the subject of technology-specific guidance.
10.50 As noted at the beginning of this chapter, to operate effectively, principles-based and compliance-oriented regulatory schemes require the issuing of non-binding guidance that clarifies the rights and obligations contained in the primary legislation.[64] In Chapter 9, the ALRC examines a number of developing technologies that may require such guidance when used by agencies and organisations to handle personal information.[65] For example, RFID systems, surveillance devices and internet software could allow an agency or organisation to collect personal information about an individual without his or her knowledge or consent. Security issues may arise when information is transmitted by wireless technologies or when large quantities of information are stored electronically. It may also be difficult for an individual to gain meaningful access to personal information that an organisation holds in an encrypted form.
10.51 Agencies and organisations using such technologies to handle an individual’s personal information may be required to do certain things to meet the obligations set out in the model UPPs. Guidance issued by the OPC—for example, technology-specific guidelines—could specify what is required to fulfil the obligations in the UPPs when personal information is handled using a particular technology.
10.52 The OPC has the power to prepare guidelines that assist agencies and organisations to avoid acts or practices that may interfere with, or otherwise adversely affect, the privacy of individuals.[66] The OPC has used this power to issue guidelines that deal with the data-matching activities of agencies.[67] While guidelines such as these are not binding, they indicate the OPC’s understanding of the requirements set out in the privacy principles. Guidelines can provide greater detail than high-level principles, and help to modify behaviour. They can also be highly persuasive in the complaint-handling process. The OPC may also take into account compliance with guidelines when conducting an audit or Privacy Performance Assessment.[68]
10.53 In formulating guidance, the OPC could examine similar guidance published in other jurisdictions. For example, in June 2006, the Information and Privacy Commissioner of Ontario issued Privacy Guidelines for RFID Information Systems. These guidelines are not mandatory, but encourage agencies and organisations to comply with certain limits on the collection, use and disclosure of information collected by RFID tags embedded in retail items.[69] In January 2008, the United Kingdom Information Commissioner’s Office published an updated CCTV code of practice.[70] The non-binding code provides advice about how agencies and organisations that use CCTV or similar devices can meet the requirements of the Data Protection Act 1998 (UK).
Collection
10.54 The ‘Collection’ principle requires an agency or organisation to collect information only by fair means, and not in an unreasonably intrusive way.[71] In DP 72, the ALRC proposed that guidance issued by the OPC could explain the meaning of the terms ‘fair means’ and ‘unreasonably intrusive’ in relation to certain technologies.[72] This may be required where technologies allow the collection of information without the knowledge of an individual.
10.55 In DP 72, the ALRC provided examples of situations that the OPC could consider in formulating this guidance. The ALRC suggested that an unreasonably intrusive collection of information by RFID systems might involve the collection of information from an RFID tag, combined with a location detection sensor, embedded in an item of clothing sold by a retailer. The ALRC also suggested that the collection of information by keystroke-logging software installed on internet cafe computers may not amount to collection by ‘fair means’.
10.56 This proposal—and most of the proposals relating to technology-specific guidance—received general support from stakeholders.[73] The Department of Human Services indicated that the proposed guidance would be a cost-effective way to raise awareness of individual rights, educate agencies and organisations about their privacy obligations with respect to developing technology, and facilitate compliance with the Act.[74]
10.57 The OVPC submitted that the proposed guidance should be produced jointly with Privacy Commissioners in all Australian jurisdictions and ‘should be applicable to the federal public sector, the private sector and to state and territory public sectors’.[75]
10.58 The Communications Alliance Ltd submitted that the OPC guidance should be developed in consultation with the communications industry.[76] ACMA observed that the development of appropriate guidance for new and emerging technologies will also require ongoing Australian involvement in international standard-making forums, and OPC liaison with relevant agencies.
10.59 ACMA also suggested that the privacy impact assessment (PIA) process
could be used as part of guidance provided by [the] OPC to describe and de-mystify the particular technology in question; identify and analyse possible privacy implications; make recommendations for minimising privacy intrusion and maximising privacy protections, while ensuring that the objectives associated with the proposed deployment and use of the technology are met.[77]
10.60 In the ALRC’s view, the OPC should issue guidance on what agencies and organisations need to do to meet the requirements in the ‘Collection’ principle when using technologies, such as RFID and biometric systems, to handle personal information. This guidance should include examples of when the use of a certain technology to collect personal information is not done by ‘fair means’ and is done ‘in an unreasonably intrusive way’.
Notification
10.61 In response to IP 31, the ALRC received limited support for requiring agencies and organisations that use certain technologies to collect personal information to comply with additional notice requirements. In DP 72, the ALRC stated that including such requirements in the ‘Notification’ principle would not be consistent with its high-level, technology-neutral approach. The ALRC proposed that this issue could form one subject of technology-specific guidance on the ‘Notification’ principle.[78]
10.62 The ALRC provided a non-exhaustive list of what could be included in such guidance. For example, guidance on RFID could encourage agencies and organisations to inform an individual about how to remove or deactivate an RFID tag embedded in a product. Agencies and organisations using biometric systems could be required to inform individuals of the error rates of the systems, and the steps that can be taken by an individual wishing to challenge the system’s results.[79] Further, guidance could encourage agencies or organisations to inform individuals of the format in which personal information may be disclosed—for example, whether it will be disclosed in an electronic format.
10.63 The OPC stated that ‘technology-specific notice requirements are likely to be prescriptive and therefore at odds with the concept of principles-based law’. The OPC submitted that, instead, ‘requirements for new technologies should be incorporated in technologically specific binding guidelines and industry codes’.[80]
10.64 In the ALRC’s view, the OPC should issue guidance that clarifies the requirements under the ‘Notification’ principle when personal information is handled by new and emerging technologies. This recommendation is consistent with the ALRC’s regulatory approach set out at the beginning of this chapter and in Chapter 4. It is also consistent with the other recommendations in this section.
10.65 In addition, if a particular technology requires more stringent notification requirements, these could be included in a technology-specific privacy code approved by the OPC and made mandatory under Part IIIAA of the Privacy Act.[81] The OPC could also initiate its own code for the handling of personal information by a particular technology, and lobby the minister responsible for administering the Privacy Act to have this promulgated by regulations.[82]
Data security
10.66 The ‘Data Security’ principle in the model UPPs requires agencies and organisations to take reasonable steps to protect personal information from loss, misuse and unauthorised access, modification or disclosure. In relation to the IPPs and the NPPs that deal with data security, the OPC has indicated that ‘reasonable steps’ will depend on: the sensitivity of the personal information held; the circumstances in which the personal information is held; the risks of unauthorised access to the personal information; the consequences to the individual of unauthorised access; and the costs of security systems.[83]
10.67 In Chapter 28, the ALRC recommends that the OPC provide guidance on the meaning of the term ‘reasonable steps’ in the ‘Data Security’ principle. This guidance should refer to technological developments in this area and, in particular, relevant encryption standards.[84] In addition, the ALRC recommends that the OPC provide guidance on what is required of an agency or organisation to destroy or render non-identifiable personal information, particularly when that information is held or stored in an electronic form.[85]
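The relationship between encryption, ‘reasonable steps’ and rendering information non-identifiable can be illustrated briefly in code. The following is a minimal sketch only, assuming the third-party Python cryptography package; the record contents and key handling are hypothetical simplifications, not a statement of what the ‘Data Security’ principle requires.

```python
# Illustrative sketch: symmetric encryption of personal information at
# rest, using Fernet from the Python "cryptography" package. The record
# and key handling here are hypothetical, not a compliance recipe.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, held in a key-management system
cipher = Fernet(key)

record = b"name=J Citizen;dob=1970-01-01"
stored = cipher.encrypt(record)  # stored form is unintelligible without the key

assert cipher.decrypt(stored) == record  # the key holder can still recover it

# Rendering information non-identifiable is a stronger step than encryption:
# securely destroying the key leaves the ciphertext permanently unrecoverable.
```

On this view, encryption protects information only while the key exists; secure destruction of the key is one way of rendering electronically stored information unrecoverable.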
Access and correction
10.68 The ‘Access and Correction’ principle provides individuals with a general right to access personal information about them that is held by agencies and organisations.[86] In IP 31, the ALRC noted that some personal information may be stored in a way that makes it difficult to analyse or comprehend.[87] The European Parliament Directive on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (EU Directive) requires personal information to be communicated to an individual in an ‘intelligible form’.[88] This could mean, for example, that a machine capable of reading biometric information, or an expert with the ability to interpret the results of a machine’s analysis of biometric information, should be made available to an individual seeking to exercise his or her right of access to this type of personal information.[89]
10.69 In DP 72, the ALRC suggested that it is implicit in the ‘Access and Correction’ principle that, where an individual requests access to personal information about him or her that is held by an organisation, the organisation should provide access to the information in an intelligible form where this is practicable. Moreover, the ALRC noted that it is best practice for organisations to hold some information only in encrypted form. For example, the sensitive nature of biometric information—and the difficulty of replacing such information if it is compromised—means that an organisation should generally destroy the raw data, such as digital photographs, that are obtained when an individual enrols in a biometric system.[90] The ALRC proposed that the OPC should provide guidance on the type of information that an agency or organisation should make available to an individual when information is held in an encrypted form.[91]
10.70 In its response to DP 72, the OPC disagreed with the ALRC’s view that the proposed ‘Access and Correction’ principle necessarily implied that access to personal information be provided in an intelligible form where this is practicable.
This Office believes that a change which allowed for information to be presented in a comprehensible form would enhance individuals’ access rights. The Office is aware that there will be occasions where it may be extremely difficult for information to be presented in an intelligible form. For this reason, the Office submits that personal information should be made accessible in an intelligible form where practicable.[92]
ALRC’s view
10.71 There is no evidence to indicate that agencies and organisations are providing information to individuals in an unintelligible form. It is implicit in the ‘Access and Correction’ principle that, where an individual requests access to personal information about him or her that is held by an agency or organisation, the agency or organisation should provide access to the information in an intelligible form, where this is practicable. Section 25A of the Acts Interpretation Act 1901 (Cth) states that, where a person is required by or under an Act to make available to a court, tribunal or person, information that is kept in a mechanical, electronic or other device, that information should be reproduced in a form capable of being understood by the court, tribunal or person. This provision appears applicable to individuals’ rights of access under the Privacy Act.
10.72 The ALRC is also concerned that drafting the ‘Access and Correction’ principle in line with the OPC’s submission could have unintended consequences. Requiring agencies and organisations to provide an individual with information held about them in an intelligible form may encourage organisations to hold certain information, which for security reasons should be encrypted permanently, in an unencrypted form or a form that is able to be decrypted.
10.73 On balance, therefore, the ‘Access and Correction’ principle should not be drafted to provide individuals with a right to access information in an intelligible form. The OPC should provide guidance on the type of information that an agency or organisation should make available to an individual when information is held in an encrypted form. This could include, for example, information about whether an encrypted biometric template is a facial or fingerprint biometric.
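To illustrate the kind of disclosure contemplated here, the following minimal sketch (all names, fields and values are hypothetical) separates an encrypted biometric template from plain metadata, such as the biometric modality, that could be made available on an access request without decrypting the template.

```python
# Hypothetical sketch: an encrypted biometric template held alongside
# plaintext metadata that can be disclosed on an access request without
# decrypting the template itself.
from dataclasses import dataclass

@dataclass
class StoredTemplate:
    ciphertext: bytes   # the encrypted template; never disclosed
    modality: str       # e.g. "facial" or "fingerprint"
    enrolled_on: str    # date of enrolment in the biometric system

def access_response(template: StoredTemplate) -> dict:
    """The subset of held information releasable to the individual."""
    return {"modality": template.modality, "enrolled_on": template.enrolled_on}

t = StoredTemplate(b"\x93\x1f\xa0...", "facial", "2008-05-01")
print(access_response(t))  # {'modality': 'facial', 'enrolled_on': '2008-05-01'}
```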
10.74 The ‘Access and Correction’ principle requires an agency or organisation to take reasonable steps to correct personal information about an individual when that individual establishes that information held by the agency or organisation is not accurate, complete, up-to-date or relevant.[93] For example, if an organisation’s biometric system repeatedly fails to identify or authenticate an individual who had provided the organisation with biometric information to enrol in the system, this indicates that the biometric information held by the organisation is not accurate, complete or up-to-date. In this context, it would be reasonable for the organisation to re-enrol that individual in the biometric system.[94]
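Purely as a hypothetical sketch, a re-enrolment trigger of the kind described in the preceding paragraph might be expressed as follows; the threshold and names are invented for illustration.

```python
# Hypothetical sketch: count consecutive verification failures and offer
# re-enrolment once a threshold is reached, as a reasonable step to correct
# biometric information that is no longer accurate or up-to-date.
FAILURE_THRESHOLD = 3  # invented figure, for illustration only

def record_attempt(failures: dict, user_id: str, matched: bool) -> str:
    if matched:
        failures[user_id] = 0
        return "ok"
    failures[user_id] = failures.get(user_id, 0) + 1
    if failures[user_id] >= FAILURE_THRESHOLD:
        return "offer re-enrolment"
    return "retry"

failures: dict = {}
for matched in (False, False, False):
    status = record_attempt(failures, "user-42", matched)
print(status)  # -> "offer re-enrolment"
```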
Automated decision review mechanisms
10.75 Article 15(1) of the EU Directive reflects concern about the increasing automation of decisions that affect individuals.[95] It states:
Member States shall grant the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.[96]
10.76 In the United Kingdom, a data controller, on request in writing by an individual, is required ‘to ensure that no decision taken by or on behalf of the data controller which significantly affects that individual is based solely on the processing by automatic means of personal data’.[97]
10.77 In 2004, the Administrative Review Council (ARC) published a report that contained a number of principles for agencies carrying out automated decision-making processes, and included a principle that provided for the manual review of decisions in certain circumstances.[98] The ARC Report was the basis for a guide published in February 2007 by the Australian Government Information Management Office (AGIMO).[99] This guide provides suggestions on when automated systems may be suitable for administrative decision making, the development and governance of automated systems and the design of such systems.
10.78 Research into computer systems indicates that such systems are not inherently accurate and reliable. Dr Cameron Spenceley notes that the reliability of computer hardware ‘is governed not only by the validity and integrity of its design, but also by the lifespan of its physical components’.[100] Further, Spenceley states that the ‘proposition that the reliability of computer software generally meets or exceeds some threshold is not demonstrable on an inherent or empirical basis with information and data that are generally available’.[101]
10.79 In IP 31, the ALRC asked whether an additional privacy principle for automated decision-making processes was required. In DP 72, the ALRC proposed that human review of automated decision-making processes should be the subject of guidance issued by the OPC.[102] The ALRC suggested that such guidance could be based on the material on automated decision-making processes produced by the ARC and AGIMO.
Submissions and consultations
10.80 In response to DP 72, some stakeholders supported the inclusion in the ‘Data Quality’ principle of a requirement for the review of automated decisions. The OPC was concerned that the ALRC’s proposal would be ‘a reiteration of the ARC’s guidelines rather than an enforceable mechanism’. The OPC submitted that the ‘Data Quality’ principle should be amended to include a technology-neutral review mechanism.[103]
10.81 The Cyberspace Law and Policy Centre stated that the proposed ‘Data Quality’ principle (though not the ‘Access and Correction’ principle) in the UPPs could be taken to impose such a requirement in some circumstances, but submitted that it should be made an express requirement as part of UPP 7, with an appropriate ‘reasonable steps’ limitation.[104]
10.82 ANZ submitted that it had considerable experience in using automated decision-making models to assess applications for products such as credit cards and personal loans. ANZ stated that ‘well designed models do not produce inaccurate or unreliable results and have not generated consumer dissatisfaction’.[105]
ALRC’s view
10.83 Ensuring that accurate decisions are made about individuals is in the interests of the agencies and organisations that make such decisions. The ALRC supports the practice of human review of decisions that are made by automated means, particularly when an agency or organisation plans to take adverse action against an individual on the basis of such a decision. In supporting this practice, the ALRC notes research that indicates that computer software and hardware may not necessarily produce accurate and reliable results.
10.84 The ALRC, however, received no concrete example of harm resulting from automated decision making by agencies and organisations. This indicates that the inclusion in the ‘Data Quality’ principle of a prescriptive requirement for human review of automated decisions is not required. Instead, the OPC should provide guidance on when it would be appropriate for an agency or organisation to involve humans in the review of decisions made by automated mechanisms.
10.85 The model UPPs generally provide high-level and outcomes-based requirements.[106] The ‘Data Quality’ principle requires an agency or organisation to take reasonable steps to make sure that personal information that it collects, uses or discloses is accurate, complete, up-to-date and relevant. This principle provides for outcomes relevant to review of decisions made by automated means.
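The practice contemplated in this section, routing automated decisions to a human reviewer before adverse action is taken, can be illustrated with a short sketch. The criteria below (an adverse outcome, or low confidence in the automated result) are hypothetical illustrations only, not requirements drawn from the ARC or AGIMO material.

```python
# Hypothetical sketch: queue automated decisions for human review where the
# outcome is adverse or the system's own confidence in the result is low.
from dataclasses import dataclass

@dataclass
class Decision:
    subject: str
    outcome: str       # e.g. "approve" or "refuse"
    confidence: float  # system's confidence estimate, 0..1 (invented field)

def needs_human_review(decision: Decision) -> bool:
    return decision.outcome == "refuse" or decision.confidence < 0.9

decisions = [
    Decision("applicant-1", "approve", 0.97),
    Decision("applicant-2", "refuse", 0.95),   # adverse: review before acting
    Decision("applicant-3", "approve", 0.62),  # uncertain: review as well
]
review_queue = [d for d in decisions if needs_human_review(d)]
print([d.subject for d in review_queue])  # ['applicant-2', 'applicant-3']
```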
Data-matching
10.86 Data-matching is ‘the large scale comparison of records or files … collected or held for different purposes, with a view to identifying matters of interest’.[107] Privacy concerns about data-matching include: the revealing of previously unknown information about an individual without the knowledge or consent of that individual; the profiling of an individual; the difficulty for an individual of gaining access to information contained in a new data-set compiled without his or her knowledge; the accuracy of the matched data; and the security of large amounts of data collected for the purposes of data-matching or data mining.[108]
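At its core, data-matching is a keyed comparison across data-sets collected for different purposes. The following minimal sketch (identifiers, fields and values are hypothetical) shows how such a comparison produces a new, combined data-set, which underlies several of the concerns listed above.

```python
# Hypothetical sketch: matching two record sets, collected for different
# purposes, on a shared identifier to identify "matters of interest".
benefits = {"ID-001": {"name": "J Citizen", "benefit": "unemployment"}}
payroll = {"ID-001": {"name": "J Citizen", "employer": "Acme Pty Ltd"},
           "ID-002": {"name": "A Smith", "employer": "Acme Pty Ltd"}}

matches = {
    ident: {**benefits[ident], **payroll[ident]}
    for ident in benefits.keys() & payroll.keys()  # identifiers in both sets
}
print(matches)  # a new, combined data-set about the matched individual
```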
Regulation of data-matching
10.87 Currently, agencies that conduct data-matching activities are subject to regulation additional to the privacy principles. Mandatory rules apply to data-matching involving tax file numbers[109] and the OPC has published guidance that applies to other data-matching activities of agencies.[110] Data-matching conducted by organisations may be subject to some of the privacy principles, but there is no specific regulation of the data-matching activities of organisations. This section considers whether greater regulation of data-matching activities is required.
10.88 The Privacy Commissioner has functions relating to data-matching. These include undertaking research and monitoring developments in data processing and computer technology (including data-matching and data linkage) to help minimise any adverse effects of such developments on privacy.[111] In addition, the Privacy Commissioner can examine (with or without a request from a minister) any proposal for data-matching or data linkage that may involve an interference with privacy or that may have any adverse effects on the privacy of individuals.[112] The Privacy Commissioner may report to the minister responsible for administering the Privacy Act[113] about the results of any research into developments in data-matching or proposals for data-matching.[114]
10.89 The Data-matching Program (Assistance and Tax) Act 1990 (Cth) and the Data-matching Program (Assistance and Tax) Guidelines (the Guidelines) regulate the use of tax file numbers to match data held by certain agencies, such as the Australian Taxation Office and Centrelink.[115] The Privacy Commissioner monitors compliance with the Act and the Guidelines. The Privacy Commissioner advises agencies about the interpretation of the Act and inspects the way in which they undertake data-matching regulated by the Act.[116] An act or practice that breaches Part 2 of the Data-matching Program (Assistance and Tax) Act, or the Guidelines, constitutes an ‘interference with privacy’.[117] An individual can complain to the Privacy Commissioner about any such act or practice.[118]
10.90 Agencies may also engage in data-matching activities that do not involve the use of tax file numbers. For example, in early 2004, the Australian Securities and Investments Commission (ASIC) began matching data from its public database with data from the Insolvency and Trustee Service Australia’s National Personal Insolvency Index.[119] The purpose of this data-matching program is to identify individuals who should be disqualified automatically from managing corporations under the Corporations Act 2001 (Cth).[120]
10.91 The Privacy Commissioner has issued guidelines for agencies that engage in data-matching practices that are not regulated by the Data-matching Program (Assistance and Tax) Act 1990 (Cth) (the voluntary data-matching guidelines).[121] The voluntary data-matching guidelines aim to ensure that data-matching programs ‘are designed and conducted in accordance with sound privacy practices’.[122] Although the guidelines are not legally binding, a number of agencies have agreed to comply with them.[123]
10.92 The voluntary data-matching guidelines apply to agencies that match data from two or more databases, if at least two of the databases contain information about more than 5,000 individuals.[124] In summary, the guidelines require agencies to: give public notice of any proposed data-matching program; prepare and publish a ‘program protocol’ outlining the nature and scope of a data-matching program; provide individuals with an opportunity to comment on matched information if the agency proposes to take administrative action on the basis of it; and destroy personal information that does not lead to a match. Further, the voluntary data-matching guidelines generally prohibit agencies from creating new, separate databases from information about individuals whose records have been matched.[125]
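Two of these requirements, the threshold for when the guidelines apply and the destruction of non-matching information, lend themselves to a short illustration. In the hypothetical sketch below, only the 5,000-individual figure is drawn from the guidelines; the function names and data structures are invented.

```python
# Hypothetical sketch of two constraints from the voluntary data-matching
# guidelines; only the 5,000-individual threshold comes from the guidelines.

def guidelines_apply(database_sizes: list) -> bool:
    # The guidelines apply where at least two of the matched databases each
    # contain information about more than 5,000 individuals.
    return sum(1 for size in database_sizes if size > 5000) >= 2

def destroy_non_matches(records: dict, matched_ids: set) -> dict:
    # Personal information that does not lead to a match is destroyed.
    return {k: v for k, v in records.items() if k in matched_ids}

print(guidelines_apply([12000, 8000, 300]))          # True
print(destroy_non_matches({"a": 1, "b": 2}, {"a"}))  # {'a': 1}
```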
Submissions and consultations
10.93 In IP 31, the ALRC asked whether data-matching programs that fall outside the Data-matching Program (Assistance and Tax) Act should be regulated more formally.[126] In DP 72, the ALRC proposed that the OPC should issue guidance that applies to data-matching by organisations.[127] The ALRC suggested that this guidance could be in the form of guidelines based on the existing data-matching guidelines that apply to agencies.
10.94 A number of stakeholders submitted that the privacy principles do not regulate adequately the data-matching activities of organisations.[128] Centrelink supported the ALRC’s proposal, noting that the proposed guidance ‘would enhance the privacy of citizens and make processes consistent across agencies and organisations’.[129] The OPC submitted that it ‘is committed to issuing guidelines on best practice for data-matching activities’.[130]
10.95 Other stakeholders expressed concern that the ALRC’s proposal fell short of providing adequate privacy protection for data-matching activities conducted by organisations. The OVPC supported the introduction into the Privacy Act of statutory provisions modelled on Part X of the Privacy Act 1993 (NZ)—which provides regulation of information-matching programs conducted by New Zealand agencies.[131] The Office of the Privacy Commissioner submitted that it be empowered to make binding codes for data-matching activities undertaken by specific industries.[132]
10.96 Several stakeholders stated that agencies should be subject to greater regulation when conducting data-matching programs.[133] The OPC submitted that the existing voluntary data-matching guidelines should be reviewed and made mandatory.[134]
ALRC’s view
10.97 While several stakeholders expressed concern about the privacy implications of data-matching activities, there is no indication that agencies are not currently complying with the voluntary data-matching guidelines issued by the OPC. A case has not been made out, therefore, for making these guidelines mandatory. In reaching this decision, the ALRC notes that one of the functions of the OPC, discussed above, is to research and monitor technology, including data-matching, and report the results to the minister. The OPC could exercise this function to review the adequacy of, and compliance with, the existing guidelines if the OPC deems this to be necessary.
10.98 In Chapter 5, the ALRC recommends that the regulation-making power in the Privacy Act should be amended to provide that the Governor-General may make regulations, consistent with the Act, modifying the operation of the UPPs to impose different or more specific requirements on agencies and organisations.[135] This mechanism could be used to provide greater regulation of the data-matching activities of agencies and organisations if the minister responsible for administering the Privacy Act deems this to be necessary. If the OPC finds that agencies are not complying with the voluntary data-matching guidelines, the OPC can lobby the relevant minister to have more stringent regulation of data-matching promulgated by regulation.
10.99 A similar staged approach could be followed for the regulation of data-matching activities conducted by organisations. There is currently no OPC guidance that applies to organisations engaged in data-matching. The OPC should develop and publish guidance for organisations on the privacy implications of data-matching. This guidance could be in the form of guidelines that are based on the existing data-matching guidelines that apply to agencies. If this guidance is found to be inadequate, the OPC could lobby the relevant minister to introduce regulations.
Recommendation 10-3 The Office of the Privacy Commissioner should develop and publish guidance in relation to technologies that impact on privacy. This guidance should incorporate relevant local and international standards. Matters that such guidance should address include:
(a) developing technologies such as radio frequency identification (RFID) or data-collecting software such as ‘cookies’;
(b) when the use of a certain technology to collect personal information is not done by ‘fair means’ and is done ‘in an unreasonably intrusive way’;
(c) when the use of a certain technology will require agencies and organisations to notify individuals at or before the time of collection of personal information;
(d) when agencies and organisations should notify individuals of certain features of a technology used to collect information (for example, how to remove an RFID tag contained in clothing; or error rates of biometric systems);
(e) the type of information that an agency or organisation should make available to an individual when it is not practicable to provide access to information in an intelligible form (for example, the type of biometric information that is held as a biometric template); and
(f) when it may be appropriate for an agency or organisation to provide human review of a decision made by automated means.
Recommendation 10-4 The Office of the Privacy Commissioner should develop and publish guidance for organisations on the privacy implications of data-matching.
[63] See Australian Law Reform Commission, Review of Privacy, IP 31 (2006), [11.125]–[11.140].
[64] See also Chs 4, 18.
[65] In Ch 71, the ALRC recommends that ACMA, in consultation with the OPC, Communications Alliance, the Telecommunications Industry Ombudsman, and other relevant stakeholders, should develop and publish guidance that addresses privacy issues raised by new technologies such as location-based services, voice over internet protocol and electronic number mapping: Rec 71–4.
[66] This power is discussed in Ch 47.
[67] Office of the Federal Privacy Commissioner, The Use of Data Matching in Commonwealth Administration—Guidelines (1998).
[68] Office of the Privacy Commissioner, Privacy Audit Manual—Part I (Information Privacy Principles) (1995), 5. In Ch 47, ‘Privacy Performance Assessments’ are recommended: Rec 47–6.
[69] Information and Privacy Commissioner of Ontario, Privacy Guidelines for RFID Information Systems (2006).
[70] United Kingdom Government Information Commissioner’s Office, CCTV Code of Practice—Revised Edition (2008).
[71] See the ‘Collection’ principle set out in the model UPPs.
[72] Australian Law Reform Commission, Review of Australian Privacy Law, DP 72 (2007), Proposal 7–5(a).
[73] See, eg, Australian Government Department of Agriculture‚ Fisheries and Forestry, Submission PR 556, 7 January 2008; Public Interest Advocacy Centre, Submission PR 548, 26 December 2007; Australian Government Attorney-General’s Department, Submission PR 546, 24 December 2007; Office of the Privacy Commissioner, Submission PR 499, 20 December 2007; Australia Post, Submission PR 445, 10 December 2007; Australian Unity Group, Submission PR 381, 6 December 2007.
[74] Australian Government Department of Human Services, Submission PR 541, 21 December 2007.
[75] Office of the Victorian Privacy Commissioner, Submission PR 493, 19 December 2007. See Rec 17–3.
[76] Communications Alliance Ltd, Submission PR 439, 10 December 2007.
[77] Australian Communications and Media Authority, Submission PR 522, 21 December 2007.
[78] Australian Law Reform Commission, Review of Australian Privacy Law, DP 72 (2007), Proposal 7–5(b).
[79] See Council of Europe, Progress Report on the Application of the Principles of Convention 108 to the Collection and Processing of Biometric Data (2005), 11.
[80] Office of the Privacy Commissioner, Submission PR 499, 20 December 2007.
[81] Codes are discussed further in Ch 48.
[82] The regulation-making power is discussed in Ch 5.
[83] Privacy Act 1988 (Cth) s 14, IPP 4; sch 3, NPP 4.1; Office of the Federal Privacy Commissioner, Security and Personal Information, Information Sheet 6 (2001), 1.
[84] Rec 28–3.
[85] Rec 28–4.
[86] See Ch 29.
[87] Australian Law Reform Commission, Review of Privacy, IP 31 (2006), [11.132].
[88] European Parliament, Directive on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Directive 95/46/EC (1995), art 12(a).
[89] Council of Europe, Progress Report on the Application of the Principles of Convention 108 to the Collection and Processing of Biometric Data (2005), [82].
[90] See, eg, Biometrics Institute, Biometrics Institute Privacy Code (2006), Principle 11.
[91] Australian Law Reform Commission, Review of Australian Privacy Law, DP 72 (2007), Proposal 7–5(d).
[92] Office of the Privacy Commissioner, Submission PR 499, 20 December 2007.
[93] See Ch 29.
[94] Biometric systems are discussed in detail in Ch 9.
[95] L Bygrave, ‘Minding the Machine: Art 15 of the EC Data Protection Directive and Automated Profiling’ (2000) 7 Privacy Law & Policy Reporter 67, 68.
[96] European Parliament, Directive on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Directive 95/46/EC (1995), art 15(1). The United Nations Human Rights Committee has also stated that art 17 of the International Covenant on Civil and Political Rights, 16 December 1966, [1980] ATS 23, (entered into force generally on 23 March 1976) means that an individual has the right to ‘ascertain in an intelligible form’ personal data that is stored in automatic data files: United Nations Office of the High Commissioner for Human Rights, General Comment No 16: The Right to Respect of Privacy, Family, Home and Correspondence, and Protection of Honour and Reputation (Art 17) (1988), [10].
[97] Data Protection Act 1998 (UK) s 12(1).
[98] Administrative Review Council, Automated Assistance in Administrative Decision Making, ARC 46 (2004), Principle 22.
[99] Australian Government Information Management Office, Automated Assistance in Administrative Decision-Making Better Practice Guide (2007).
[100] C Spenceley, ‘Evidentiary Treatment of Computer-Produced Material: A Reliability Based Evaluation’, Thesis, University of Sydney, 2003, 121.
[101] Ibid, 151.
[102] Australian Law Reform Commission, Review of Australian Privacy Law, DP 72 (2007), Proposal 7–5(e).
[103] Office of the Privacy Commissioner, Submission PR 499, 20 December 2007.
[104] Cyberspace Law and Policy Centre UNSW, Submission PR 487, 19 December 2007.
[105] ANZ, Submission PR 467, 13 December 2007.
[106] See Chs 4, 18.
[107] Office of the Federal Privacy Commissioner, The Use of Data Matching in Commonwealth Administration—Guidelines (1998), [14].
[108] The impact on privacy of data-matching is discussed further in Ch 9.
[109] Data-matching Program (Assistance and Tax) Act 1990 (Cth); Office of the Federal Privacy Commissioner, Schedule—Data-matching Program (Assistance and Tax) Guidelines (1997).
[110] Office of the Federal Privacy Commissioner, The Use of Data Matching in Commonwealth Administration—Guidelines (1998) <www.privacy.gov.au> at 5 May 2008.
[111] See Privacy Act 1988 (Cth) s 27(1)(c).
[112] Ibid s 27(1)(k).
[113] Commonwealth of Australia, Administrative Arrangements Order, 25 January 2008 [as amended 1 May 2008].
[114] Privacy Act 1988 (Cth) ss 27(1)(c), 32(1).
[115] See Ch 9.
[116] Office of the Privacy Commissioner, The Operation of the Privacy Act Annual Report: 1 July 2004–30 June 2005 (2005), [3.8].
[117] Privacy Act 1988 (Cth) s 13.
[118] Ibid s 36; Data-matching Program (Assistance and Tax) Act 1990 (Cth) s 14.
[119] Australian Securities and Investments Commission, ITSA Data Matching Protocol <www.asic.gov.au> at 1 May 2008.
[120] Ibid.
[121] Office of the Federal Privacy Commissioner, Schedule—Data-matching Program (Assistance and Tax) Guidelines (1997).
[122] Office of the Federal Privacy Commissioner, The Use of Data Matching in Commonwealth Administration—Guidelines (1998), 1.
[123] Office of the Federal Privacy Commissioner, The Operation of the Privacy Act Annual Report: 1 July 2003–30 June 2004 (2004), 68–71.
[124] Office of the Federal Privacy Commissioner, The Use of Data Matching in Commonwealth Administration—Guidelines (1998), [15].
[125] Ibid, [33]–[41], [42]–[47], [63], [69].
[126] Australian Law Reform Commission, Review of Privacy, IP 31 (2006), Question 7–6(h).
[127] Australian Law Reform Commission, Review of Australian Privacy Law, DP 72 (2007), Proposal 7–6.
[128] See, eg, Office of the Privacy Commissioner, Submission PR 215, 28 February 2007; Office of the Victorian Privacy Commissioner, Submission PR 217, 28 February 2007; Centre for Law and Genetics, Submission PR 127, 16 January 2007; Office of the Information Commissioner (Northern Territory), Submission PR 103, 15 January 2007.
[129] Australian Government Centrelink, Submission PR 555, 21 December 2007. See also National Health and Medical Research Council, Submission PR 397, 7 December 2007.
[130] Office of the Privacy Commissioner, Submission PR 499, 20 December 2007.
[131] Office of the Victorian Privacy Commissioner, Submission PR 493, 19 December 2007.
[132] Office of the Privacy Commissioner, Submission PR 499, 20 December 2007.
[133] See, eg, Australian Privacy Foundation, Submission PR 553, 2 January 2008; Office of the Victorian Privacy Commissioner, Submission PR 493, 19 December 2007; Office of the Privacy Commissioner, Submission PR 215, 28 February 2007; Office of the Victorian Privacy Commissioner, Submission PR 217, 28 February 2007; Australian Privacy Foundation, Submission PR 167, 2 February 2007.
[134] Office of the Privacy Commissioner, Submission PR 215, 28 February 2007.
[135] Rec 5–1.