Bonefishblues wrote:The earlier link I posted suggested that it is highly application-specific, i.e. the algorithms vary in quality, but at its best it is now very reliable. It also suggests it's making great strides in terms of quality - one wouldn't expect otherwise in the current technological climate. Describing it as 'not accurate' is a generalisation.
There is a latent gender and racial bias, seemingly.
SF has indeed banned it - if anywhere was going to be in that vanguard, then SF's a likely candidate.
So what's to do? Is it the case that we should put this in a cupboard, just as the railway pioneers did when Huskisson was killed?
To be clear, I am not saying this is fully resolved, but the way to deal with it is surely not for a random case to be brought under the Human Rights Act; we have to be more proactive than that, surely? Or is our legislature tied up with other matters, I wonder?
Sorry if I was perhaps a bit strong, bonefish, but I still think you are being blasé.
Yes, I do think a Human Rights Act case should be brought. It is not a random case at all but a very pertinent one.
The authorities and police have massively mishandled this by just ploughing ahead with no regulation.
If the current non-regulation and "screw your rights" situation goes on, we may as well tear up any privacy and data protection laws now and have done with it.
Yes, the police are under-resourced these days - that is why, more than ever, they need policing by consent.
Reduce society's consent to policing and you will be talking about crippling financial outlays.
The Stasi comes to mind - and no, I am not exaggerating.
Reduce society to that level and you are not just talking about the mass trashing of human rights but also about economically crippling levels of surveillance.
The police are in my view being profoundly stupid.
I wish the very relevant human rights case well.