Mary Beth Kuznik, Founder and President of VotePA and 2020 graduate of Duquesne University School of Law, compares attempts to implement remote bar exam software to failed election technologies...
What do bar exams have in common with elections in the age of COVID-19, aside from the obvious implication that both are related to justice and the rule of law?
While elections have been dealing with the pressures of technology for decades, state bar exams are traditionally huge in-person testing rituals relying heavily on paper and pencil. This year, it is not feasible to conduct bar exams involving hundreds or thousands of applicants in one large hall amidst the global COVID-19 pandemic. In place of live testing, some bar examiners and state supreme courts are announcing remotely-proctored online bar exams. In doing so, these officials are making many of the same misguided assumptions about technology that left our elections at risk for years.
Since late 2004 I have studied voting systems, the vendors that sell them, and the election officials that buy them. I’ve observed election vendors spouting puffery and soothing words to get officials to purchase and trust products that lack cybersecurity and capacity to do the job needed on Election Day. Time and time again I have heard vendors tell election officials that everything will be fine if their company provides the technology. Just choose us and your election will run smoothly. All is well. These are not the droids you are looking for.
As a 2020 law graduate facing an online bar exam in October, I shudder to see these same tactics apparently being directed at bar examiners and state supreme courts by another set of vendors, despite warnings that mass online bar exams are not feasible. I also see bar examiners and supreme courts falling victim to the misguided notion that anything and everything can quickly and easily be put online and will work just fine.
“Everyone shops and banks on their computer, so why can’t we vote online like that too?” In my work for secure and verified elections, it seems I’ve heard this question at least once a week for years. The answer is simple: the requirement of a secure, accurate, and anonymous ballot in public elections is far beyond the capability of today’s internet.
But there is more to it than just that. Although remote technology seems ubiquitous in today’s life, nothing in this field is accomplished by simply taking a process and putting it online. Successful companies that operate online shopping and banking spend millions of dollars and take years to develop and test their software for security, accuracy, and the ability to handle its load of customers. Online operations are also insured – meaning that customers are protected when a failure occurs. In most cases, problems with online sales and banking are ultimately corrected and customers are compensated for losses. Nonetheless, banks and major online retailers lose billions of dollars annually to fraud and system failure. The expectation of technology issues and accompanying losses is built into the system.
But normal banking and sales are not the same as elections (or exams). In these high-stakes events, software must perform flawlessly on a large scale at one crucial time on one crucial day. Here, any significant failure is catastrophic. Fortunately, we have not yet had a known massive failure of technology in an election. But smaller events have provided us with warning signs of what can happen when we ignore careful code development and fail to adequately test online systems.
The Orca and the App
It was mid-morning, Tuesday, November 6, 2012. Barack Obama was running for re-election as President of the United States. His challenger was former Massachusetts governor Mitt Romney. I was working inside my local polling place as Judge of Election, excited for my fifth presidential election as a poll officer. After making it through the early morning crush of voters, my fellow workers and I were settling in for a busy, steady day.
Suddenly the relative quiet was interrupted by a loud voice. “I don’t know what the heck is wrong with this app! Nothing I try to do will go through!” It was one of the Romney poll watchers who had been observing voters, and she was very upset. She had been trying since a little after 7 AM to use her smartphone to log into Romney headquarters — but it seemed that the system had gone dead.
The problem turned out to be Orca, a Romney campaign app designed to let 37,000 volunteer poll watchers in certain states check off supporters as they signed in to vote. Once these watchers’ reports were filed, other campaign workers would then telephone voters who had not come out yet and urge them to come to the polls. In other words, Romney’s campaign was conducting a classic get-out-the-vote (GOTV) operation, an election day staple for both major parties since the earliest days of the United States.
But Orca was a whale of a different animal. It was bold in its plans to use real-time data. The system was designed for the checked-off voter lists to be uploaded to servers in the Romney headquarters in Boston. Analytics from these up-to-the-minute reports would allow the campaign to mobilize its Election Day GOTV resources all over the key swing states of Ohio, Florida, Pennsylvania, Iowa, and Colorado.
Orca sounded like a great idea, but its development had been rushed. It was quickly coded and prepped on a “lightning schedule” in the few short months between the spring primaries and the November election. The system had never been fully tested for load capacity. When the 37,000 volunteers all tried to log in and get their precinct lists at the same time on Election Day morning, the Romney campaign servers crashed. They were never able to recover. As reporter Sean Gallagher put it,
Orca had not been tested under real-world conditions and repeatedly failed when it was needed the most . . . whatever testing environment Romney’s campaign team and IT consultant used, it wasn’t one that mimicked the conditions of Election Day. As a result, Orca’s launch on Election Day was essentially a beta test of the software – not something that most IT organizations would do in such a high-stakes environment.
This past February, with all eyes on 2020’s first-in-the-nation Iowa Democratic Presidential Caucus, another poorly-tested app failed spectacularly. The app was built for the Iowa Democratic Party by Shadow, Inc., a small for-profit tech company. Cybersecurity experts expressed concern that the app had not been “vetted, tested at scale, or even shown to independent analysts before being introduced in Iowa.” It failed badly on Caucus Night. A New York Times article headline called it “a systemwide disaster.”
The 2020 Bar Exam
If bar examiners and state supreme courts overseeing the bar would look to the history of technology failures in election software, they would see the enormity of what they are undertaking. Over 30,000 examinees in approximately 16 states are scheduled to take the Multistate Bar Exam at exactly the same time on October 6. It is incredibly risky to stage an unprecedented remotely-proctored online bar exam of this magnitude. It is even more risky to rely on a couple of small vendors to hastily provide the programming for it.
Earlier this year, three companies offered technology to allow proctoring of bar exams from students’ homes. One of them, Extegrity, has since pulled out of the remotely proctored online bar exam due to concerns of undue risk.
At this writing, only two vendors remain. It is unclear whether either of these two comparatively tiny companies, each with a history of failures during normally proctored non-remote bar exams, has the resources and ability needed to develop and robustly test the examination and the anti-cheating remote proctoring programs necessary for a successful and secure online bar exam by October. Even for a large company with unlimited resources, it would be a difficult or impossible task to prepare and test such software in a few weeks or months.
Early results are not promising. In late July, software by one of the remaining vendors, ILG, failed badly during practice exams. As a result, Indiana and Nevada postponed their testing days. Indiana announced an open-book test.
The other remaining vendor, ExamSoft, claimed that a Distributed Denial of Service (DDoS) cyberattack caused their remotely-proctored bar exam to fail with 800 examinees online in Michigan on July 28. This company’s new remote exam monitoring software had been beta tested as recently as March, with just eight customers.
What’s the Remedy?
In this historic election year, state legislatures and officials are wisely making unprecedented changes to allow as many people as possible to vote safely in our global pandemic. These common-sense changes include increased vote-by-mail capability, but thankfully they do not include a massive national call for dangerously insecure internet voting. Since evidence surfaced of election interference by Russia and other nation-states in 2016, officials and the media have started to pay appropriate attention to the danger facing our elections, especially to the increasing danger posed by online operations using the internet.
State bar examiners and supreme courts should follow this lead. Common-sense measures should be adopted to allow 2020 law graduates to attain licensure without subjecting applicants to dubious online schemes in this unprecedented pandemic year. A reasonable emergency alternative to a hastily-prepared online bar exam would be diploma privilege followed perhaps by increased supervision for a time and/or additional legal education requirements.
At the very least, bar examiners should follow Indiana and allow open-book exams. Such a test would allow the exam to bypass poorly-tested remote proctoring software. It would also be more representative of real-life practice, where responsible lawyers normally look things up.
What bar examiners should not do is to continue to trust that two tiny companies will be able to successfully program and test a massive online remote-proctoring service by the first week of October. While I have the deepest respect for my state Supreme Court and Board of Bar Examiners, everything I have learned about election technology over the years tells me that this online bar exam scheme is unlikely to work as they intend. I have zero confidence that the online proctoring system will not crash when I sign on for the Multistate Bar Exam along with 30,000 other applicants on October 6. I shudder at the thought of all my years of preparation in law school and my months of studying for the bar exam being wiped away by one software error or an overloaded server when a viable and safe compromise is available.
JURIST carries extended coverage of Bar Exams in the Pandemic.
Mary Beth Kuznik is the founder and President of VotePA, a statewide, nonpartisan alliance of groups and individuals dedicated to voting rights and verified elections in Pennsylvania. An elected Judge of Elections, she has served as a poll worker since before the 1992 Clinton-Gore election. Mary Beth has represented VotePA at meetings of the National Association of Secretaries of State and as a member of the Election Verification Network, VoteTrustUSA, and numerous in-state coalitions. She participated in election policy planning with Pennsylvania Governor Tom Wolf and his staff in 2016, with the U.S. Senate Rules Committee in 2007, and has been quoted in The New York Times, The Wall Street Journal, and many other publications. In May 2020, Mary Beth graduated with a Juris Doctor degree from Duquesne University School of Law in Pittsburgh. She is scheduled to take the Pennsylvania Bar Exam in October.
Suggested citation: Mary Beth Kuznik, The Orca, the App, and the Bar Exam, JURIST – Professional Commentary, August 15, 2020, https://www.jurist.org/commentary/2020/08/mary-beth-kuznik-bar-exam-orca/.
This article was prepared for publication by Gabrielle Wast. Please direct any questions or comments to her at firstname.lastname@example.org
Opinions expressed in JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of JURIST's editors, staff, donors or the University of Pittsburgh.