The California Supreme Court on Thursday ordered the State Bar of California to explain how and why it used artificial intelligence to develop multiple-choice questions for the February bar exam.
The state's highest court, which oversees the State Bar, said it was not notified before the exam that the bar had allowed its independent psychometrician to use AI to develop a small subset of the questions.
On Thursday, amid mounting public pressure, the court demanded that the State Bar explain how it used AI to develop the questions and what steps it took to ensure their reliability.
The demand comes as the State Bar petitions the court to adjust the test scores of hundreds of California bar exam takers who complained of multiple technical problems and irregularities during the February exam.
The controversy is about more than the State Bar's use of artificial intelligence itself. It is about how the bar used AI to develop questions, and how rigorous its review process was, for a high-stakes exam that determines whether thousands of aspiring lawyers can practice law in California each year.
It also raises questions about how transparent State Bar officials have been as they move away from the National Conference of Bar Examiners' multistate exams, a system used in most states, and roll out a new hybrid model of in-person and remote testing to reduce costs.
In its order Thursday, the Supreme Court directed the State Bar to explain how and why AI was used in drafting, revising, or otherwise developing the questions; what efforts were taken to ensure the reliability of the AI-assisted multiple-choice questions; and the reliability of the remaining multiple-choice questions used for scoring.
Last year, the court approved the State Bar's plan to enter a five-year contract with Kaplan to create 200 test questions for the new exam. The State Bar also hired a separate company, Meazure Learning, to administer the exam.
It was not until this week, almost two months after the exam, that the State Bar disclosed in a news release that it had deviated from its plan to have Kaplan Exam Services write all of the multiple-choice questions.
In the release, the State Bar revealed that of the 171 scored multiple-choice questions, 100 were made by Kaplan and 48 were drawn from a first-year law students' exam. A smaller subset of 23 scored questions was created by ACS Ventures, the State Bar's independent psychometrician, and developed with the assistance of artificial intelligence.
"We are confident in the validity of these [multiple-choice questions] to accurately and fairly assess the legal capabilities of test takers," Leah Wilson, the State Bar's executive director, said in a statement.
Alex Chan, an attorney who chairs the State Bar's Committee of Bar Examiners, told The Times on Tuesday that only a small subset of the questions used AI, and that AI was not necessarily used to create the questions outright.
Chan also said that the California Supreme Court urged the State Bar in October to "consider the availability of new technologies, such as artificial intelligence, which could innovate and improve the reliability and cost-effectiveness of such tests."
“The court gave guidance to consider using AI, and that’s exactly what we’re trying to do,” Chan said.
That process, he said, would be subject to the court's review and approval.
On Thursday, Chan told The Times that State Bar officials had not informed the Committee of Bar Examiners before the exam that they planned to use AI.
"The committee was not able to consider its use, as it was never notified of the use of AI before the exam was administered," Chan said.
Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, posed a series of questions.
"Who at the State Bar directed ACS Ventures to create multiple-choice questions to be featured on the bar exam?" she wrote on LinkedIn. "What guidelines, if any, did the State Bar provide?"
Mary Basick, a dean of academic skills at UC Irvine School of Law, said it was a big deal that the change in how the State Bar drafted the questions was not approved by the California Supreme Court.
"What they approved was a multiple-choice exam with questions drafted by Kaplan," she said. "Since Kaplan is a bar preparation company, it has, of course, knowledge of the legal concepts being tested, the bar exam itself, and how to structure the questions. So the idea was that this wasn't a major change."
She noted that under the California Business and Professions Code, any major changes that could affect how test takers prepare for the exam require two years' notice.
"Usually, these types of questions take years to develop to make sure they are valid and reliable, with multiple steps of review," Basick said. "There wasn't enough time to do that."
Basick and other professors also raised concerns that having psychometricians, who are not trained lawyers, both develop questions with AI and then determine whether those questions were valid and reliable represented a conflict of interest.
The State Bar disputed that idea. "The process of validating the questions and testing their reliability is not a subjective one, and the statistical parameters used by the psychometricians remain the same regardless of the source of the questions," it said in a statement.
On Tuesday, the State Bar told The Times that all of the questions were reviewed before the exam by content validation panels and subject matter experts for factors including legal accuracy, minimum competence, and potential bias.
As for reliability, the State Bar said that scored multiple-choice questions from all sources, including those developed with AI, "performed above the psychometric target of 0.80."
The State Bar has yet to answer questions about why it deviated from its plan to have Kaplan draft all of the exam's multiple-choice questions. It also has not detailed how ACS Ventures used AI to develop the questions.