After hundreds of would-be California lawyers complained that the February bar exam was plagued by technical problems and irregularities, the state’s legal licensing agency has sparked new outrage by acknowledging that some multiple-choice questions were developed with the help of artificial intelligence.
The State Bar of California said in a news release Monday that it will ask the California Supreme Court to adjust the test scores of those who took its February bar exam.
But it declined to acknowledge any serious issues with its multiple-choice questions, even as it revealed that a subset of questions was recycled from a first-year law student exam, while others were developed with AI assistance by ACS Ventures, the State Bar’s independent psychometrician.
“The debacle that was the February 2025 bar exam was worse than we imagined,” said Mary Basick, assistant dean of academic skills at UC Irvine Law School. “I’m almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable.”
After the exam, Basick said, some test-takers complained that some of the questions felt as if they had been written by AI.
“I defended the bar,” Basick said. “‘No way! They wouldn’t do that!’”
Basick argued that having AI-developed questions written by “the same psychometricians tasked with establishing that the questions are valid and reliable” represented “an obvious conflict of interest.”
“It’s a staggering admission,” agreed Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation.
“The State Bar has admitted that it hired a company to have non-lawyers draft questions that were given on the actual bar exam,” she said. “It then paid that same company to assess and ultimately approve the questions on the exam, including the questions the company itself wrote.”
The State Bar, an administrative arm of the California Supreme Court, said Monday that the majority of the multiple-choice questions were developed by Kaplan Exam Services, a company it contracted with last year in an effort to save money.
According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law student exam. A smaller subset of 23 scored questions was created by ACS Ventures, the State Bar’s psychometrician, and developed with the assistance of artificial intelligence.
“We have confidence in the validity of the [multiple-choice questions] to accurately and fairly assess the legal competence of test-takers,” Leah Wilson, the State Bar’s executive director, said in a statement.
On Tuesday, a State Bar spokesperson told The Times that all questions, including the 29 scored and unscored questions developed with AI assistance by the agency’s independent psychometrician, were reviewed before the exam for factors such as legal accuracy, minimum competence and potential bias.
When measured for reliability, the State Bar told The Times, the combined scored multiple-choice questions from all sources, including AI, “performed above the psychometric target of 0.80.”
The State Bar also rejected the notion of a conflict of interest.
“The process of validating questions and testing for reliability is not a subjective one,” the State Bar said.
Alex Chan, an attorney who chairs the State Bar’s Committee of Bar Examiners, told The Times that only a small subset of questions used AI, and that the AI was not necessarily used to create them.
“The professor is suggesting that we used AI to draft all the multiple-choice questions, as opposed to using AI to vet them,” Chan said. “That is not my understanding.”
Chan noted that in October, the California Supreme Court urged the State Bar to “consider the availability of any new technologies, such as artificial intelligence, that might innovate and improve upon the reliability and cost-effectiveness of such testing.”
“The court gave guidance to consider using AI, and that’s exactly what we’re trying to do,” Chan said.
But a California Supreme Court spokesperson said Tuesday that the justices learned only this week that the State Bar had used AI to develop exam questions.
“Until yesterday’s press release from the State Bar, the court was unaware that AI had been used to draft any of the multiple-choice questions,” the spokesperson said in a statement.
Last year, as the State Bar faced a $22-million deficit in its general fund, it decided to cut costs by abandoning the National Conference of Bar Examiners’ Multistate Bar Examination, a system used by most states, and moving to a new hybrid model of in-person and remote testing. It struck an $8.25-million deal with test prep company Kaplan Exam Services to create test questions and hired Meazure Learning to administer the exam.
The rollout of the new State Bar exam was beset with problems. Some test-takers reported that they were kicked off the online testing platform or experienced screens that lagged and displayed error messages. Others complained that the multiple-choice questions contained typos, left out important facts, or made no sense.
The botched exam led some test-takers to file a federal lawsuit against Meazure Learning. Meanwhile, state Senate Judiciary Committee Chair Thomas J. Umberg (D-Santa Ana) called for an audit of the State Bar, and the California Supreme Court directed the agency to return to traditional in-person administration for the July bar exam.
But the State Bar is moving forward with its new system of multiple-choice questions.
“Many have expressed concern about the speed with which the Kaplan questions were drafted and the resulting quality of those questions,” Basick and Moran wrote April 16 in a public comment to the Committee of Bar Examiners. “The 50 released practice questions (which were edited and re-released just weeks before the exam) still contain numerous errors. This further erodes confidence in the quality of the questions.”
Historically, Moran noted, it has taken years to develop and vet exam questions written by the National Conference of Bar Examiners.
Basick said the use of questions from the first-year law student exam raised red flags. An exam that tests whether students have learned enough in their first year of law school, she argued, is different from one that determines whether test-takers have the minimum competence to practice law.
“That’s a much different standard,” she said. “It’s not just, ‘Hey, do you know this rule?’ It’s, ‘Do you know how to apply it in a situation where there’s ambiguity, and determine the correct course of action?’ ”
Recycling questions from the first-year law exam and using AI also represent a significant change in bar exam preparation, Basick said. She argued that such changes require two years’ notice under California’s Business and Professions Code.
But the State Bar told The Times that the sources of its questions did not trigger the two-year notice requirement.
“The fact that there were multiple sources for the development of questions did not impact how applicants should have prepared for the exam,” the State Bar said.
Basick said her concerns began in early March, when she said the State Bar removed her and other academic experts from a panel reviewing exam questions.
She said the State Bar argued that because those law professors had worked over the previous six months with questions drafted by the National Conference of Bar Examiners, their involvement could raise potential copyright infringement issues.
“Ironically, what they did instead was use artificial intelligence and have non-lawyers draft the questions,” she said. “There’s no other place the artificial intelligence could have gotten its information than NCBE questions. What else would the AI be using?”
Since the February exam fiasco, the State Bar has downplayed the idea that there were major problems with its multiple-choice questions. Instead, it has focused on the issues with Meazure Learning.
“We are scrutinizing the vendor’s performance in meeting its contractual obligations,” the State Bar said in a document that listed the problems experienced during the exam and highlighted the relevant performance expectations laid out in its contract.
But critics accused the State Bar of shifting blame and failing to acknowledge the seriousness of the problems with its multiple-choice questions.
For transparency, Moran called on the State Bar to release all 200 questions that were tested and to give future test-takers the opportunity to familiarize themselves with the variety of question styles. She also called on the State Bar to return to the Multistate Bar Examination for the July exam.
“They’ve just shown that they can’t produce a fair test,” she said.
Chan said the Committee of Bar Examiners will meet May 5 to discuss scoring adjustments and remedies. But he doubted the State Bar would release all 200 questions or return to the National Conference of Bar Examiners’ exam in July.
He noted that the NCBE’s exam security rules do not allow any form of remote testing, and that a recent State Bar survey found nearly 50% of California bar applicants want to keep the remote option.
“We are not going back to the NCBE, at least not in the near term,” Chan said.