Viewing entries tagged: testing


Voting System (De)certification - A Way Forward? (2 of 2)

Yesterday I wrote about the latest sign of the downward spiral of the broken market in which U.S. local election officials (LEOs) purchase product and support from vendors of proprietary voting system products: monolithic technology that is the result of years of accretion, and that costs years and millions of dollars to test and certify for use -- including a current case where the process didn't catch flaws that may result in a certified product being de-certified and replaced by a newer system, at the LEOs' expense. Ouch! But could you really expect a vendor in this miserable market to give away a new product that it spent years and tens of millions of dollars to develop, to every customer of the old product to whom the vendor had planned to sell upgrades -- just because of flaws in the old product? The situation is actually worse: LEOs don't actually have the funding to acquire a hypothetical future voting system product in which the vendor was fully open about true costs, including

(a) certification costs, both direct (fees to VSTLs) and indirect (staff time), as well as

(b) development costs, including rigorously designed and documented testing.

Actually, development costs alone are bad enough, but certification costs make things much worse -- as well as creating a huge barrier to entry for anyone foolhardy enough to try to enter the market (or even stay in it!) and make a profit.

A Way Forward?

That double-whammy is why I and my colleagues at OSDV are so passionate about working to reform the certification process, so that individual components can be certified for far less time and money than a mess o' code accreted over decades, including wads of interwoven functionality that might not even need to be certified! And then, of course, these individual components could also be re-certified after bug fixes by re-running a durable test plan that the VSTL created the first time around. That in turn requires common data formats for inter-operation between components -- for example, between a PCOS device and a Tabulator system that combines and cross-checks all the PCOS devices' outputs, in order either to find errors and omissions or to produce a complete election result.
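
To make that PCOS-to-Tabulator interoperation a bit more concrete, here is a minimal sketch in Python. The file layout and field names are purely hypothetical -- this is not the IEEE 1622 / NIST common data format -- just an illustration of a tabulator consuming per-device results expressed in a shared format and cross-checking them before combining:

    import json
    from collections import Counter
    from pathlib import Path

    def load_device_results(path):
        """Read one PCOS device's results file.

        The format is purely illustrative: a JSON object with a device id,
        a ballots-counted figure, and per-contest vote totals.
        """
        with open(path) as f:
            record = json.load(f)
        for field in ("device_id", "ballots_counted", "contests"):
            if field not in record:
                raise ValueError(f"{path}: missing required field '{field}'")
        return record

    def tabulate(results_dir):
        """Combine all device results in a directory, cross-checking as we go."""
        totals = {}              # contest name -> Counter of candidate -> votes
        ballots_counted = 0
        seen_devices = set()
        for path in sorted(Path(results_dir).glob("*.json")):
            record = load_device_results(path)
            if record["device_id"] in seen_devices:
                raise ValueError(f"duplicate results for device {record['device_id']}")
            seen_devices.add(record["device_id"])
            ballots_counted += record["ballots_counted"]
            for contest, votes in record["contests"].items():
                # Cross-check: a contest cannot receive more votes on this
                # device than the device counted ballots.
                if sum(votes.values()) > record["ballots_counted"]:
                    raise ValueError(f"device {record['device_id']}: "
                                     f"contest '{contest}' has more votes than ballots")
                totals.setdefault(contest, Counter()).update(votes)
        return {"ballots_counted": ballots_counted, "contests": totals}

The point of a sketch like this is that if every scanner emits results in the same durable format, the combining and cross-checking logic can be tested against that format once, independent of any particular device.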

So once again, our appreciation to NIST, the EAC, and IEEE 1622 for actually doing the detailed work of hashing out these common data formats, which are the bedrock of inter-operation, which is the prerequisite for certification reform, which enables reduction of certification costs, which might result in voting system component products being available at true costs that are affordable to the LEOs who buy and use them.

Yes, that's quite a stretch, from data standards committee work to a less broken market that might be able to deliver to customers at reasonable cost. But to replace a rickety old structure with a new, solid, durable one, you have to start at the bedrock, and that's where we're working now.

-- EJS

PS: Thanks again to Joe Hall for pointing out that the current potential de-certification and mandatory upgrade scenario (described in Part 1) illustrates the untenable nature of a market that would require vendors to pay for expensive testing and certification efforts, and also (as some have suggested) to forgo revenue when otherwise-for-pay upgrades are required because of defects in software.


Voting System (De)certification - Another Example of the Broken Market (1 of 2)

Long-time readers will certainly recall our view that the market for U.S. voting systems is fundamentally broken. Recent news provides another illustration of the downward spiral: the likely de-certification of a widely used voting system product from the vendor that owns almost three quarters of the U.S. market. The current stage of the story is that the U.S. Election Assistance Commission is formally investigating the product for serious flaws that led to errors of the kind seen in several places in 2010, and perhaps best documented in Cuyahoga County. (See:  "EAC Initiates Formal Investigation into ES&S Unity 3.2.0.0 Voting System".) The likely end result is the product being de-certified, rendering it no longer legal for use in many states where it is currently deployed. Is this a problem for the vendor? Not really. The successor version of the product is due to emerge from a lengthy testing and certification process fairly soon. Having the current product banned is actually a great tool for migrating customers to the latest product!

But at what cost, and to whom? The vendor will charge the customers (local election officials, or LEOs) for the new product, just as it would have if the migration were voluntary and the old product version still legal. The LEOs will have to sign and pay for a multi-year service agreement. And they will have the same indirect costs of staff effort (at the expense of other duties like running elections, or getting enough sleep to run an election correctly), and direct costs for shipping, transportation, storage, etc. These are real costs! (Example: I've heard reports of some under-funded election officials opting not to use election equipment that they already have, because they have no funding for the expense of moving it from the warehouse to a testing facility and doing the required pre-election testing.)

Some observers have opined that vendors of flawed voting system products should pay: whether damages, or fines, or doing the migration gratis, or something. But consider this deeper question, from UCB and Princeton's Joe Hall:

Can this market support a regulatory/business model where vendors can't charge for upgrades and have to absorb costs due to flaws that testing and certification didn't find? (And every software product, period, has them).

The funding for a high level of quality assurance has to come from somewhere, and that's not voting system customers right now. Perhaps we're getting to the point where the amount of effort it takes to produce a robust voting system and get it certified -- at the vendor's expense -- creates a cost that customers are not willing or able to pay when the product gets to market.

A good question, and one that illustrates the continuing downward spiral of this broken market. The cost to vendors of certification is large, and you can't really blame a vendor for the sort of overly rapid development, marketing, and sales that leads to the problems being investigated. These folks are in this business to make a profit, for heaven's sake; what else could we expect?

-- EJS

PS - Part Two, coming soon: a way out of the spiral.


EAC Guidelines for Overseas Voting Pilots

Last Friday was a busy day for the federal Election Assistance Commission (EAC). They issued their Report to Congress on efforts to establish guidelines for remote voting systems. And they closed their comment period at 4:00pm for the public to submit feedback on their draft Pilot Program Testing Requirements. This is being driven by the MOVE Act implementation mandates, which we have covered previously here (and summarized again below). I want to offer a comment or two on the 300+ page Report to Congress and on the Pilot Program guidelines, for which we submitted some brief comments, most of which reflected the comments submitted by ACCURATE, friends and advisers of the OSDV Foundation.

To be sure, the size of the Congressional Report is due to the volume of content in the Appendices including the full text of the Pilot Program Testing Requirements, the NIST System Security Guidelines, a range of example EAC processing and compliance documents, and some other useful exhibits.

Why Do We Care? The TrustTheVote Project’s open source elections and voting systems framework includes several components useful for configuring a remote ballot delivery service for overseas voters.  And the MOVE Act updates existing federal regulations intended to ensure that voters stationed or residing (not merely visiting) abroad can participate in elections at home.

A Quick Review of the Overseas Voting Issue The Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA) protects the absentee voting rights of U.S. citizens, including active members of the uniformed services and the merchant marine, and their spouses and dependents who are away from their place of legal voting residence.  It also protects the voting rights of U.S. civilians living overseas.  Election administrators are charged with ensuring that each UOCAVA voter can exercise their right to cast a ballot.  To fulfill this responsibility, election officials must provide a variety of means for voters to obtain information about voter registration and voting procedures, and to receive and return their ballots.  (As a side note, UOCAVA also establishes requirements for reporting statistics on the effectiveness of these mechanisms to the EAC.)

What Motivated the Congressional Report? The Military and Overseas Voter Empowerment (MOVE) Act, which became law last fall, is intended to bring UOCAVA into the digital age.  Essentially, it mandates a digital means to deliver a blank ballot.

Note: the law is silent on a digital means to return prepared ballots, although several jurisdictions are already asking the obvious question:  "Why improve only half the round trip of an overseas ballot casting?"

And accordingly, some Pilot programs for MOVE Act implementation are contemplating the ability to return prepared ballots.  Regardless, there are many considerations in deploying such systems, and given that the EAC is allocating supporting funds to help States implement the mandates of the MOVE Act, the agency is charged with ensuring that those monies go to programs adhering to guidelines it promulgates.  I see it as a "checks and balances" effort to ensure EAC funding is not spent on system failures that put UOCAVA voters' participation at risk of disenfranchisement.

And this is reasonable given the MOVE Act's intent.  After all, in order to streamline the process of absentee voting and to ensure that UOCAVA voters are not adversely impacted by the transit delays inherent in mail delivery around the world, technology can be used to facilitate overseas absentee voting in many ways, from managing voter registration to balloting, and notably for our purposes (a small illustrative sketch follows this list):

  • Distributing blank ballots;
  • Returning prepared ballots;
  • Providing for tracking ballot progress or status; and
  • Compiling statistics for UOCAVA-mandated reports.
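
To make the last two items a bit more concrete, here is a minimal sketch in Python of what tracking ballot status and compiling statistics might look like. The status names, fields, and report categories are invented for illustration; they are not drawn from UOCAVA, the EAC's Pilot Program Testing Requirements, or the TrustTheVote framework:

    from dataclasses import dataclass
    from enum import Enum

    class BallotStatus(Enum):
        # A hypothetical lifecycle for an overseas absentee ballot.
        REQUESTED = "requested"
        BLANK_SENT = "blank ballot delivered"
        RETURNED = "prepared ballot returned"
        ACCEPTED = "accepted for counting"
        REJECTED = "rejected"

    @dataclass
    class UocavaBallot:
        voter_id: str          # an opaque identifier, not personal data
        jurisdiction: str
        status: BallotStatus = BallotStatus.REQUESTED

    def status_report(ballots):
        """Compile per-status counts -- the kind of raw statistic a
        UOCAVA-mandated report might draw on (real categories differ)."""
        counts = {status: 0 for status in BallotStatus}
        for ballot in ballots:
            counts[ballot.status] += 1
        return {status.value: count for status, count in counts.items()}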

The reality, however, is that systems deployed to provide these capabilities face a variety of threats.  If technology solutions are not developed, chosen, configured, and managed according to guidelines commensurate with the importance of the services provided and the sensitivity of the data involved, a system compromise could carry severe consequences for the integrity of the election or the confidentiality of sensitive voter information.

The EAC was therefore compelled to report to Congress and to establish (at least voluntary) guidelines.  And so we commented on those Guidelines, as did colleagues of ours from other organizations.

What We Said - In a Nutshell Due to the very short comment period, we were unable to dive into the depth and breadth of the Testing Requirements.  And that’s a matter for another commentary.  Nevertheless, here are the highlights of the main points we offered.

Our comments were developed in consultation with ACCURATE; they consisted of (a) underlining a few of the ACCURATE comments that we believed were most important from our viewpoint, and (b) adding a few suggestions for how Pilots should be designed or conducted.  Among the ACCURATE comments, we underscored:

  • The need for a Pilot's voting method to include a robust paper record, as well as complementary data, that can be used to audit the results of the pilot.
  • Development and publication of security specifications that are testable.

In addition, we recommended:

  • Development of a semi-formal threat model, and comparison of it to threats of one or more existing voting methods.
  • Testing in a mock election, in which members of the public can gain understanding of the mechanisms of the pilot, and perform experimentation and testing (including security testing), without impacting an actual election.
  • Auditing of the technical operations of the Pilot (including data center operations), publication of audit results, and development of a means of accounting for the cost of operating the pilot.
  • Publication of ballot data, cast vote records, and the results of auditing them, without compromising the anonymity of the voter or the secrecy of the ballot.
  • Post-facto reporting on means and limits of scaling the size of the pilot.

You can bet this won't be the last we'll hear about MOVE Act Pilot issues; I think it's just the second inning of an interesting ball game... GAM|out


Election Tech Pilots -- Panel Video Now Online

I've got a word to say about "pilots". It seems timely, given a serious uptick in discussion, legislation, and trials of "pilots" of new uses of election technology. Actually, the words have already been said, and by people who know much more than I do about it, at the UOCAVA Summit 2010, now available on YouTube's Overseas Voting Foundation Channel - Summit 2010 Panels. The video is on the topic of pilots, with real-world experiences provided by Alec Yasinsac, Paul Stenbjorn, Carol Paquette, and Paul Docker. To save you the trouble of listening to yours truly in the initial segment, I'll summarize: the term "pilot" can have two different meanings. One is the neutral meaning: we're some election officials who are thinking about modifying or adding to the way we conduct an election, so we thought we'd try it out in a small, limited way, and learn whether it is actually useful, and if so what issues there might be in making this change at full scale. Elections in recent years included some pilots like this, of early voting and of vote centers. Not all were successful, and some of the lessons learned were about the real challenges of doing it right in a full election.

The second meaning is the scary one: we're some election officials who have pretty much decided that we're going to modify or add to the way we conduct elections, and we're going to use the term "pilot" to sneak in some changes as experimental, and use that as a step toward making the changes permanent and full scale. I don't actually know any election officials who work like that, but it is a model that some concerned people have in mind.

What you can learn from the video of the panel discussion are some real stories of pilots that people actually ran as experiments: what they learned, what they decided was a failure, and what they decided was worthwhile but needed more work before being ready for prime time.

One thing that I have learned by immersing myself in election-land is that election practices in the U.S. are constantly changing -- every week I hear news of some possible change in some state or locality. Election practices are not at all fixed. That's why I thought it worthwhile to learn how thoughtful election officials try to learn whether a possible change is actually a good change, or just another good idea.

-- EJS


Tim Bray on the way Enterprise Systems are built (compared to open source)

Tim Bray is one of the main people behind XML, so he has some serious cred in the world of building and deploying systems. So it was with interest (and some palpable butterflies) that I read a recent missive of his: "Doing it Wrong". I don't know how much of what he says applies to what we at TrustTheVote are doing and how we are doing it, but it makes for interesting and highly relevant reading. I do know that many of his examples are very different from elections technology, in fundamental ways and for many, many reasons. So there's no one-to-one correlation, but listen to what he says:

"Doing it wrong:Enterprise Systems, I mean. And not just a little bit, either. Orders of magnitude wrong. Billions and billions of dollars worth of wrong. Hang-our-heads-in-shame wrong. It’s time to stop the madness." (from "Doing it Wrong" from Tim Bray)

and:

"What I’m writing here is the single most important take-away from my Sun years, and it fits in a sentence: The community of developers whose work you see on the Web, who probably don’t know what ADO or UML or JPA even stand for, deploy better systems at less cost in less time at lower risk than we see in the Enterprise." (from "Doing it Wrong" from Tim Bray)

and:

"The Web These Days · It’s like this: The time between having an idea and its public launch is measured in days not months, weeks not years. Same for each subsequent release cycle. Teams are small. Progress is iterative. No oceans are boiled, no monster requirements documents written." (from "Doing it Wrong" from Tim Bray)

and:

"The point is that that kind of thing simply cannot be built if you start with large formal specifications and fixed-price contracts and change-control procedures and so on. So if your enterprise wants the sort of outcomes we’re seeing on the Web (and a lot more should), you’re going to have to adopt some of the cultures and technologies that got them built." (from "Doing it Wrong" from Tim Bray)

All of these quotes are from "Doing it Wrong" from Tim Bray. I suggest reading it.


Ballot Design and the importance of (simple) usability tests

In another department of our megaplex, one of my colleagues, Aleks Totic, is working on ballot layout and design for the TrustTheVote technology suite. I came across this great blog post from the Brennan Center at NYU that describes a recent situation where it appears a simple bit of questionable (but valid) layout may have caused many voters to skip past a ballot initiative. From the conclusion of the article:

"What probably would have alerted officials to this problem ahead of time, and at little or no cost, would have been a simple usability test: observing ten or fifteen King County citizens as they "voted" on the ballot before the design was finalized. This solution is simple, easy and cheap. The Usability Professionals Association has a great explanation of how it's done." (from Ballot Design Still Matters)

Yes, it's true: no matter how wonderful our ballot design guidelines are, and no matter how well an automated checklist is applied to a ballot before printing, a simple usability test ("it ain't rocket science") is so cheap and easy that it should never be skipped.
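
For what it's worth, here is a minimal sketch in Python of the kind of automated layout checklist mentioned above. The rules and the ballot data layout are invented for illustration -- they are not the TrustTheVote design guidelines -- and, as the Brennan Center post argues, passing automated checks like these is no substitute for watching real voters use the ballot:

    def check_ballot_layout(ballot):
        """Run a few illustrative pre-print layout checks.

        `ballot` is assumed to be a dict with a 'page_height' (in points)
        and a list of 'contests', each having a 'title' and a bounding
        'box' of (x, y, width, height). The rules below are hypothetical.
        """
        problems = []
        for contest in ballot["contests"]:
            x, y, width, height = contest["box"]
            if height < 36:  # roughly half an inch
                problems.append(f"{contest['title']}: contest box may be too small to notice")
            if y + height > ballot["page_height"] - 18:
                problems.append(f"{contest['title']}: crowded against the bottom margin")
        return problems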

It's a good article: read the whole thing!


Federally Approved Voting System - Not Tested for Security

We now have a federally certified voting system product that has completed the required testing by a federally certified independent test lab. That's a milestone in itself, as is the public disclosure of some of the results of the testing process. Thanks to that disclosure, though, we now know that the test lab did practically zero security testing of the Premier product, because of a gross misunderstanding of a prior security review. For a complete, accurate, and brief explanation of the whole situation, I urge you to read this letter to the EAC. The letter is from a group of absolutely top-notch voting technology and/or computer security experts, who were involved in California's Top to Bottom Review (TTBR) of voting systems, which included the Premier system that was recently certified.

At the risk of over-simplifying, the story goes like this.

  1. The TTBR found loads of technical security problems and concluded that
    • the security problems were so severe that the system's technological security mechanisms were unable to protect it; and
    • the problems could be addressed only with strict procedural security - chain of custody, tamper-evident seals, and the like.
  2. Next, the test lab misinterpreted these conclusions, assuming that addressing the system's vulnerabilities depended only on effective procedural controls -- and therefore that there was no need to test technical security mechanisms!
  3. The test plan included no additional security tests, and hence the Premier system passed testing despite the many security flaws found in the TTBR.

That's the gist, but do read the letter to the EAC. It's a fine piece of writing in which Joe Hall and Aaron Burstein set out everything fair and square, chapter and verse. I have to say it's astonishing.

Now, maybe this seems exceptionally geeky, with cavils over test plans and test lab results, and so on. Or maybe it seems critical of the EAC/NIST testing program. But in fact that test program is incredibly important as a gate through which computing technology must pass before being used to count votes. In a very real sense, the current testing program is just getting started, so perhaps it's not surprising that there are many lessons to learn. And my thanks go to all these TTBR veterans for speaking out to remind us how much there is to learn on the road to excellence, both of voting systems and of the program to test them.

-- EJS


Tales From Real Life: Testing

Another in our series of real life stories ... how it actually works for real election officials to test a new voting system that they might be adopting for use in the state. The backplot is that New York State has been unwilling to give up its admittedly no-longer-legal* lever machines until the state Board of Elections is confident that it has a replacement that not only meets Federal standards, but is also reliable and meets requirements similar to those met by the old lever machines. There have been several setbacks in the adoption process, but the latest phase is some detailed testing of the candidate systems. (For the real voting tech geeks, what's being tested is a hybrid of the Dominion ImageCast scanner/Ballot Marking Device and the ES&S DS200 scanner with the AutoMARK BMD.)

Bo Lipari is on the Advisory Committee for this process, and has reported in detail on the testing. You don't have to be a complete election geek to scan Bo's tales and be impressed with the level and breadth of diligence, and the kinds of kinks and hiccups that can occur. And of course the reportage is very helpful to us, as a concrete example of the kind of real life testing needed for any new voting system, open or closed, to be accepted for use in U.S. elections.

-- EJS

* No-longer-legal means that NY state law was changed to require replacement of lever machines. In the initial release of this note I erroneously said that the replacement requirement was driven by HAVA. Thanks again to alert readers (see comments below) for the catch, and for providing many resources on the vexed question of "HAVA compliant" generally and lever machines specifically.
