What Does ChatGPT (And AI) Mean for Criminal Law?
Blog by Arun S. Maini
Until a few months ago, "artificial intelligence" and "ChatGPT" were expressions nobody other than computer nerds used. Now they are all anybody talks about, because apparently they will change EVERYTHING.
As one hapless lawyer recently found out when he trusted a chatbot to do his legal research, and then failed to check the work to make sure it was accurate, the technology can't replace humans (at least, not yet!).
So what do chatbots and AI have to do with criminal law? No doubt, the technology will be an important tool for searching and organizing enormous amounts of information. It will do the work of many lawyers, paralegals and students at a fraction of the cost, and often in no time at all.
The problem is that not all the advancements promised by artificial intelligence will benefit society. Some of the consequences now being discussed are downright frightening. A network that outgrows its human overseers and decides to interfere with digital systems (banking, major infrastructure) is not out of the question. And fear of these consequences may mean that it will be a long time before we let a driverless car steer us down the highway, or a pilotless plane fly us over the ocean. But some of the nefarious uses of this technology are already here.
Besides the doomsday scenarios of self-directed robots launching nuclear strikes or unleashing deadly viruses, there are more everyday concerns. When a bot can create and release a pop song that sounds just like Drake, or a political campaign can release "secret footage" of a candidate "caught in the act", how will you know what is true and what is a "deepfake"?
The misuse of information and the doctoring (or wholesale fabrication) of video, photos and text are going to bedevil the justice system for a long time to come: fake fingerprints that just happen to match a suspect; a CCTV surveillance clip that purports to show a robbery or an assault; fake text messages that constitute a threat. The justice system is slow to adapt to new technology (the whole court system had to shut down for months during COVID), and there may be many wrongful convictions based on fake or altered evidence. It is expensive to hire digital experts who can detect doctored evidence. It may even take the development of other AI technologies to counter the crimes committed by rogue actors who have learned to harness these inventions for devious purposes.
So how do you protect yourself from the negative effects of AI? One way is to be more careful about sharing your data and letting machines track your every movement. What may seem like routine information can, in the hands of the wrong person, become a weapon. We are all going to have to become more conversant in the language of the tech geeks as these inventions start to power all of our systems and raise the possibility of chaos.
And if you find yourself accused of a crime you did not commit, based on evidence that is not real, then you will need a team with the knowledge and experience to protect you from the perils of this kind of “progress”.
Arun S. Maini at The Defence Group has been a criminal lawyer for over 25 years. If you or a loved one is facing criminal charges and needs the advice of an experienced and skilled lawyer to help you through the legal process, call The Defence Group for a free consultation at 877-295-2830 or email us through the Contact Us link on our website.