I'm a civil litigator. I draft and argue motions, go to trial, take depositions, etc. I have found AI most helpful in shortening the time required to find the cases I need (e.g., 10 minutes instead of an hour). As far as I can tell, it's finding me the same cases I would have otherwise found.
Also, the AI doesn't write well enough for me to use it for writing without it being counterproductive. I know people who use it for document review, but I don't do personal injury or product liability work, so I don't typically have tens of thousands of pages to review.
Also, I believe Westlaw (my primary research tool) operates in a somewhat closed manner (i.e., they aren't training on slop or jeets). All it's really done is reduce reliance on Boolean operators.
Other than some outrageous examples, I'm pretty happy with the current status quo in the legal industry. I am, of course, extremely biased because it's working out quite well for me. In my opinion, the primary issue with the legal industry is the hordes of childless women, with unlimited time on their hands, who are intent on conquering every legal institution.
Heed the author's disclaimer, all ye who seek insight into what is happening with AI and the law. If you want AI-doomer titillation, then this article is for you.
There are many logical leaps based on incorrect premises.
1. "The slop recursion doom loop": Legal research AIs that attorneys pay for are not training on jeetslop.
2. "AI-generated briefs with their AI citations will force judges to adopt the AI's view of the law": Judges are not bound to decide cases only on the precedent cited by the parties.
3. "AI will narrow the wiggle room for justice provided by the common law through the AI's ability to find cases that are so on-point that the judges will be bound to follow this previously undiscovered case law": Not all cases have precedential effect. Trial court cases have essentially zero precedential effect on other trial court cases. They may be persuasive but they do not restrict a judge's discretion.
Fun thought experiment, but it's as useful as poetry if you want to learn something.
Poetry can be quite instructive.
Nope. Good luck finding a job 🙏
I think most of the other comments on this post have a very limited grasp of the ways an AI-law feedback loop could work, & an unwarranted optimistic evaluation of the average quality of the legal practitioner.
The real power over a conversation isn't the express manipulation of what is being said, but the editorial power to simply remove entire areas of the conversation from consideration. While this may be incorrect, the popular conception of precedent is that the deciding judge must consider it & justify breaking from it - precedent is a form of editorial control. It constrains the judge's thinking like the gravity field of a planet. No human has been able to produce anything novel since the time of Adam, merely iterating on what already exists.
The model becomes: AI sources the precedent for cases, which affects the judge's decision, which forms new precedent, which the AI proffers up for citation in new cases, and so on. The AI doesn't need to consume its own jeetslop; the feedback loop is one step removed. A mediocre legal practitioner uses AI as a research crutch to stay competitive in the market. A mediocre judge uses AI to keep up with his caseload. Incentivized to conserve fuel, the legal profession remains in the orbit of ever-greater AI-driven precedent. The sclerosis of the Common Law advances at ever-greater speed.
To get something out of a book like Fahrenheit 451, & see how it applies today, the reader must look past the lack of 4-wall television rooms. You have to be able to apply the same kind of logic to a piece like this.
I think this article is overly alarmist about AI's effects on the law. It's important to understand that there is a LOT of incompetence out there in the profession. There always has been, as a matter of fact. The examples the author gives of the lawyer turning in a brief with bogus cases and the judge using AI for sentencing are hardly novel, and not really caused by AI in my opinion. Attorneys have been handing in motions and briefs that distort or misunderstand appellate holdings, misstate the law, or outright lie for decades and decades. Many judges, likewise, have been lazy or just plain stupid, and have issued moronic rulings since the inception of the American judicial system. Despite all this, the system has chugged along for centuries without going up in flames.
Law is like any other vocation: there are skilled and genuinely intelligent attorneys/judges out there practicing alongside midwits (or at least the legal equivalent) and absolute retards. The cream of the profession is a small minority amidst a sea of mediocrity. AI programs are just tools, similar to Westlaw or LexisNexis (which both extensively use AI now too, by the way). In the hands of incompetent people, AI is going to assist them in producing slop that is little different from the slop they'd be producing otherwise. In the hands of talented people, it's likely going to be a useful, but probably not game-changing, weapon in the arsenal. The upper echelon of attorneys and judges will almost certainly continue to be the ones driving the development of case law into the future, just as they always have, only now they'll be using AI as they do so.
The fact is that AI is just a reality now, and it's not going to be put back in the box. I'd bet money there were people predicting a similar doom for the field when the aforementioned legal research engines came about too, but there isn't a lawyer alive today who creeps around law libraries pulling physical books off the shelves to look up precedent. I seriously doubt this is going to lead to "fully automated anarchotyranny," although I suppose time will tell.
By that logic, it is inevitable that we are doomed to a future of unintelligible X’s. Dumb.
Good article. Scary possibilities.
A good starting point for protecting oneself from this would be to understand when Common Law, Equity Law, and Contract Law are being used. The majority of our legal system deals in Contract Law.
Common law applies to almost everything. Don't give legal advice, don't shill Sov Cit nonsense.
You can’t give legal advice without speaking directly with someone about their own personal legal problems. Don’t echo CIA psyops like “sovereign citizen” and cite your shit if you’re going to talk it.