China has launched an ambitious project to combine the nation's public and private security cameras into a centralised national surveillance and data-sharing platform.

Through its "Sharp Eyes" project, China intends to combine this data-sharing platform with facial recognition software to track suspects, predict crimes and monitor its immense population. At a time when trust in private companies to use data appropriately is at an all-time low, the project raises questions about how governments will ensure data is used in accordance with individuals' wishes.

There are two sides to this coin. The technologies may well be very helpful to governments for activities such as law enforcement, but they also:

1) raise concerns around data use by governments in excess of citizens' expectations;

2) carry risks of hacking and subsequent fraud;

3) risk worsening the profiling of minority groups.

China is by no means alone among nations in using facial recognition software to help law enforcement prevent crimes. But it is the huge reach of the programme and the proposed combination of this data with other personal information - medical records, social media presence, travel plans - that sets it apart, raising the question of what privacy rights Chinese citizens can expect. In New Zealand, facial recognition software is already in use by government departments such as Customs and the Police. But what does the public, whose data is being collected, expect these departments to do with it? We like to think that our government would respect the privacy of New Zealanders and limit the use of this technology to matters essential for public health and safety. However, "Sharp Eyes" shows how far this technology can be taken if care is not exercised.

"Sharp Eyes" and other uses of facial recognition also raises concerns regarding fraud and criminal justice. If a government - or private - database holding facial recognition data was to be hacked there is little that individuals could do to prevent fraud. You may be able to change your password or reset a PIN, but the bio metrics of your face are not as easy to change.

There is also the potential for an increase in the profiling of minority groups. If one group is disproportionately arrested and photographed for mugshots - and if databases are not regularly audited to remove those subsequently found innocent - then members of that group face a higher chance of being falsely identified as suspects. Civil rights groups in the United States have previously made this argument.

The "Sharp Eyes" programme offers an insight into where the use of facial recognition software by government and law enforcement could go. While there is no indication the software would be used to the same extent in New Zealand, the question remains whether the public is willing to trust our leaders to use our data appropriately - and not for other Orwellian goals.