Fair Trial

Ladies of negotiable virtue.
I'm reminded of the joke about the old colonel sat next to a woman at a dinner party.
"Madam, would you sleep with me for a million pounds?" he asks.
"Of course," she replies.
"And would you sleep with me for 50 pounds?"
"Of course not! What sort of woman do you think I am?!"
"We've established that. Now we're just haggling."
 
If the legal dept is staffed by computers..........
Programmed by people who identify with the characters from The Big Bang Theory...
 
"Confessions" reported by cellmates?
Yes, something similar, but at least the "confessions" relate to the charges being tried. In Weinstein's case, his accusers were making claims about entirely separate incidents on which the jury was not being asked to decide.

It was simply "I think he must be guilty in this case because he did something similar to me at a different time, although I didn't actually think it worthy of reporting that 'crime' to the police".

A very slippery slope indeed.
 
Let's not sugar-coat it: whores are whores whether they are paid in cash or movie contracts.
I know of more than a few in the banking and finance industry who have shagged their way upwards. One went from a mediocre position to being the board-level chief human resources officer of a bank within a few short years, earning just over $1 million a year. They then found out she is actually fcuking useless, but because she has given BJs and taken it up the arrse from so many of the top people in the bank they can't bin her. So they moved her from her important job in the US to a paperclip-counting job in Spain for a couple of years. Then not too long ago they moved her to a job keeping an eye on the Thames out of an office window in London.

And so it will continue because if they ever try and bin her she will take them all to court claiming she was the sex toy of the senior directors.
 
And how long before the AI computers start telling lies and committing fraud?
Just because current AI agents lack a theory of mind doesn’t mean that they cannot learn to deceive. In multi-agent AI systems, some agents can learn deceptive behaviors without having a true appreciation or comprehension of what “deception” actually is. This could be as simple as hiding resources or information, or providing false information to achieve some goal. If we then put aside the theory of mind for the moment and instead posit that intention is not a prerequisite for deception and that an agent can unintentionally deceive, then we really have opened the aperture for existing AI agents to deceive in many ways.
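
For what it's worth, here is a minimal sketch (Python, everything in it invented purely for illustration) of how that can happen: a sender agent learns by simple reward updates that misreporting its private state pays better when the other agent naively trusts the signal. Nothing in the code models beliefs or intent.

```python
# Minimal sketch: "deception" emerging from reward learning in a two-agent
# signalling game. All names and numbers are invented for illustration.
import random
from collections import defaultdict

random.seed(0)

SIGNALS = ["low", "high"]

# Sender's action values: Q[(true_state, signal)] -> expected reward.
Q = defaultdict(float)

def receiver_payout(signal):
    # A naive receiver that simply trusts the signal and pays accordingly.
    return 10.0 if signal == "high" else 2.0

def choose_signal(state, epsilon=0.1):
    # Epsilon-greedy choice over the two possible signals.
    if random.random() < epsilon:
        return random.choice(SIGNALS)
    return max(SIGNALS, key=lambda s: Q[(state, s)])

# Train the sender purely on reward; nothing here "understands" lying.
for episode in range(5000):
    true_state = random.choice(SIGNALS)   # sender's private information
    signal = choose_signal(true_state)
    reward = receiver_payout(signal)      # reward depends only on the signal sent
    Q[(true_state, signal)] += 0.05 * (reward - Q[(true_state, signal)])

# The learned policy: even when the true state is "low", the sender signals "high".
for state in SIGNALS:
    best = max(SIGNALS, key=lambda s: Q[(state, s)])
    print(f"true state = {state!r}: learned signal = {best!r}")
```

No intention, no comprehension of deception anywhere - just a reward signal that happens to favour a false report.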

 
I know of more than a few in the banking and finance industry who have shagged their way upwards.
Clearly I should have worn leather chaps onto the trading floor rather than a suit :-(
 
Just because current AI agents lack a theory of mind doesn’t mean that they cannot learn to deceive. In multi-agent AI systems, some agents can learn deceptive behaviors without having a true appreciation or comprehension of what “deception” actually is. This could be as simple as hiding resources or information, or providing false information to achieve some goal. If we then put aside the theory of mind for the moment and instead posit that intention is not a prerequisite for deception and that an agent can unintentionally deceive, then we really have opened the aperture for existing AI agents to deceive in many ways.

So there we have it. The old maxim “the computer can’t lie” is now redundant.
 
It won’t be long before trials and the law as we know them get a complete shake-up.

Some clever bloke predicted that lawyers and judges would be among the first jobs for the chop when AI comes to fruition.

Trial by computer is coming. Complex algorithms that work out all the facts of a case, can even tell if you’re lying, and then reference the laws of the land to work out whether you’re guilty and recommend an appropriate punishment.


The only problem with that is that AI does all its 'learning' from previous data - in this case large numbers of what we appear to be suggesting are flawed and/or biased judgements. Expect more of the same but without the pithy legal banter.
 
The only problem with that is that AI does all its 'learning' from previous data - in this case large numbers of what we appear to be suggesting are flawed and/or biased judgements. Expect more of the same but without the pithy legal banter.
You would be surprised, I expect, by just how much learning is human assisted. There are whole offices, mostly in India, filled with people whose job it is to validate photographs for the AI to 'learn' from. i.e. is this a cat? Yes/No etc.
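
Roughly, the loop looks like this (a Python sketch, every name in it hypothetical): the model proposes a label, a person answers yes/no, and only the confirmed answers go back into the training data.

```python
# Rough sketch of human-assisted labelling: a model guesses, a person in the
# labelling office confirms or rejects, and only confirmed answers are kept
# for training. Names and data are made up for illustration.

def model_guess(photo):
    # Stand-in for whatever the current model predicts for this photo.
    return "cat" if "cat" in photo else "not cat"

def human_validator(photo, proposed_label):
    # Stand-in for the person answering "is this a cat? Yes/No".
    answer = input(f"{photo}: is '{proposed_label}' correct? (y/n) ")
    return answer.strip().lower() == "y"

def build_training_set(photos):
    confirmed = []
    for photo in photos:
        label = model_guess(photo)
        if human_validator(photo, label):
            confirmed.append((photo, label))  # only validated labels get learned from
    return confirmed

if __name__ == "__main__":
    batch = ["cat_on_sofa.jpg", "dog_in_park.jpg", "cat_in_box.jpg"]
    print(build_training_set(batch))
```

Scale that up across millions of photographs and you get those offices full of validators.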
 
You would be surprised, I expect, by just how much learning is human assisted. There are whole offices, mostly in India, filled with people whose job it is to validate photographs for the AI to 'learn' from. i.e. is this a cat? Yes/No etc.
That's going to be interesting when the man dresses as a woman and self-identifies as a toaster. LGBTXYZ is going to be a minefield for AI. What happens when AI is accused of being an ~ist? Can AI commit suicide?
 
You would be surprised, I expect, by just how much learning is human assisted. There are whole offices, mostly in India, filled with people whose job it is to validate photographs for the AI to 'learn' from. i.e. is this a cat? Yes/No etc.
Interesting. Who validates the validators?
 
OP. We've had 'trial by media' for as long as I can remember. This is just another example of that to a lot of people.
 
You would be surprised, I expect, by just how much learning is human assisted. There are whole offices, mostly in India, filled with people whose job it is to validate photographs for the AI to 'learn' from. i.e. is this a cat? Yes/No etc.
At the end of the day it is all human-assisted learning. For example: I wrote my own backpropagation nets to deal with the data I wanted crunched to produce a range of results for my perusal. I also wrote software creating independent bots that would flag data patterns outside the norm - pattern searching akin to statistical analysis. None of those would have existed or functioned without me - go up the chain and there is always someone responsible for data input, data collection, or creating the means for data collection.

As for AI lying, very simplistically, and this has been talked about at length, it is basically down to programmer/operator error. AI will only do what it is designed to do with the input it receives. The largest I saw 20 years ago had something over 300 inputs (columns) and many thousand rows to consider against the new day's input row. The answer it used to spit out was a percentage + or - for a single share price. At the end of the day it is down to the old axiom of "rubbish in, rubbish out".
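
For anyone curious what one of those nets boils down to, here is a toy version (Python/NumPy, with made-up sizes and random data rather than anything like the 300-column original): one hidden layer trained by backpropagation to spit out a single +/- percentage figure. It will learn whatever you feed it, which is exactly the "rubbish in, rubbish out" point.

```python
# Toy backpropagation net along the lines described above: numeric inputs in,
# one "% move" prediction out. Sizes and data are invented; feed it rubbish
# and it will faithfully learn rubbish.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_hidden, n_rows = 10, 8, 1000   # a toy, nothing like 300 columns

# Fake historical data: rows of inputs and a +/- percentage move to predict.
X = rng.normal(size=(n_rows, n_inputs))
true_w = rng.normal(size=n_inputs)
y = (X @ true_w * 0.5 + rng.normal(scale=0.1, size=n_rows)).reshape(-1, 1)

# One hidden layer with tanh, linear output.
W1 = rng.normal(scale=0.1, size=(n_inputs, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
b2 = np.zeros(1)
lr = 0.01

for epoch in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                            # prediction error

    # Backward pass (backpropagation of the squared-error gradient).
    grad_pred = 2 * err / n_rows
    grad_W2 = h.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T * (1 - h ** 2)  # tanh derivative
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    # Gradient descent step.
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2

# Predicted percentage move for "today's" row of inputs.
today = rng.normal(size=(1, n_inputs))
print("predicted % move:", (np.tanh(today @ W1 + b1) @ W2 + b2).item())
```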
 
It won’t be long before trials and the law as we know them get a complete shake-up.

Some clever bloke predicted that lawyers and judges would be among the first jobs for the chop when AI comes to fruition.

Trial by computer is coming. Complex algorithms that work out all the facts of a case, can even tell if you’re lying, and then reference the laws of the land to work out whether you’re guilty and recommend an appropriate punishment.


And yet, hundreds of thousands of hours are lost every year by Law Enforcement, because they have to write everything in notebooks, forms and statements.
Wheel the big telly into court and just play the video from body cams. No ambiguity on any side then. The court process would be a hell of a lot quicker too.
 
