Viewing entries tagged: certification


State Certification of Future Voting Systems — 3 Points of Departure

This is a more technical post than most here, reflecting the broadening audience visiting this Foundation web site in search of content like the article below, rather than hanging out on our more geeky Project site (which is soon to be relaunched and will be far more engaging for all audiences, we're excited to report).  Usually you will find this kind of content over there, while here we'll talk more about voting experience innovations, policy matters, and progress of the Project.  So, for those who are passionate about elections reform and improving the voting experience, but are not as fluent in some of the technical issues, feel free to look this over, but do not fret if it seems like gobbledygook.  There is more relevant material for your concerns to come.  Ready?  Here we go...


A Northern Exposed iVoting Adventure


Alaska's extension of its iVoting venture may have raised the interest of at least one journalist at one highly visible publication.  When we were asked for our "take" on this form of iVoting, we thought we should also comment here on this "northern exposed adventure." (Apologies to fans of the wacky mid-90s TV series of a similar name.)  Alaska has been among the states that allow military and overseas voters to return marked absentee ballots digitally, starting with fax, then email, and then adding a web upload as a third option.  Focusing specifically on the web-upload option, the question was: "How is Alaska doing this, and how do their efforts square with common concerns about security, accessibility, Federal standards, testing, certification, and accreditation?"

In most cases, a voting system has to run that whole gauntlet through to accreditation by a state in order to be used in that state. To date, none of the iVoting products has even tried to run that gauntlet.

So, what Alaska is doing with respect to security, certification, and a host of other things is essentially: flying solo.

Their system has not gone through any certification program (State, Federal, or otherwise that we can tell); hasn't been tested by an accredited voting system test lab; and nobody knows how it does or doesn't meet  federal requirements for security, accessibility, and other (voluntary) specifications and guidelines for voting systems.

In Alaska, they've "rolled their own" system.  It's their right as a State to do so.

In Alaska, military voters have several options, and only one of them is the ability to go to a web site, indicate their vote choices, and have their votes recorded electronically -- no actual paper ballot involved, no absentee ballot affidavit or signature needed. In contrast to the sign/scan/email method of returning an absentee ballot and affidavit (used in Alaska and 20 other states), this is straight-up iVoting.

So what does their experience say about all the often-quoted challenges of iVoting?  Well, of course in Alaska those challenges apply the same as anywhere else, and they are facing them all:

  1. insider threats;
  2. outsider hacking threats;
  3. physical security;
  4. personnel security; and
  5. data integrity (including that of the keys that underlie any use of cryptography)

In short, the Alaska iVoting solution faces all the challenges of digital banking and online commerce -- the ones that every financial services titan and eCommerce giant spends big money on every year (capital and expense), and yet they still routinely suffer attacks and breaches.

Compared to those technology titans (banking, finance, technology services, or even the Department of Defense), how well are Alaskan election administrators doing on their (by comparison) shoestring budget?

Good question.  It's not subject to annual review (like a bank's IT operations audit under SAS 70), so we don't know.  That, too, is their right as a U.S. state.  However, the fact that we don't know does not debunk any of the common claims about these challenges.  Rather, it simply says that in Alaska they took on the challenges (which are large), and the general public doesn't know much about how they're doing.

To get a feeling for the risks involved, just consider one point: think about the handful of IT geeks who manage the iVoting servers where the votes are recorded and stored as bits on a disk.  They are not election officials, and they are no more entitled to stick their hands into paper ballot boxes than anybody else outside a county elections office.  Yet they have the ability (though not the authorization) to access those bits.

  • Who are they?
  • Does anybody really oversee their actions?
  • Do they have remote access to the voting servers from anywhere on the planet?
  • Using passwords that could be guessed?
  • Who knows?

They're probably competent, responsible people, but we don't know.  Not knowing any of that, every vote on those voting servers is actually a question mark -- and that's simply being intellectually honest.

Lastly, to get a feeling for the possible significance of this lack of knowledge, consider a situation in which Alaska's electoral college votes swing an election, or where Alaska's Senate race swings control of Congress (not far-fetched given Murkowski's close call back in 2010.)

When the margin of victory in Alaska, for an election result that affects the entire nation, is a low four-digit number of votes, and the number of digital votes cast is similar, what does that mean?

It's quite possible that many digital votes will be cast in the next Alaska Senate race.  If the contest is that close again, think about the scrutiny those IT folks will get.  Will they be evaluated any better than every banking data center investigated after a data breach?  Any better than Target?  Any better than Google's or Adobe's IT management after having trade secrets stolen?  Or any better than the operators of military unclassified systems that for years were penetrated by hackers located in China, who may well have been supported by the Chinese Army or intelligence groups?

Probably not.

Instead, they'll be lucky (we hope) like the Estonian iVoting administrators, when the OSCE visited back in 2011 to have a look at the Estonian system.  Things didn't go so well.  The OSCE found that one guy could have undermined the whole system.  Good news: it didn't happen.  Cold comfort: that one guy didn't seem to have the opportunity -- most likely because he and his colleagues were busier than a one-armed paper hanger during the election, worrying about Russian hackers attacking again after they had previously shut down the whole country's Internet-connected government systems.

So far the threat is remote, and it is still early days even for small-scale usage of Alaska's iVoting option.  But while the threat remains remote, it might be good for the public to see more about what's "under the hood" and who's in charge of the engine -- that would be our idea of more transparency.

<Wandering Off the Main Point for a Few Paragraphs>

So, in closing, I'm going to run the risk of being a little preachy here (signaled by that faux HTML tag above); again, probably due to the recent surge in media inquiries about how the Millennial generation intends to cast their ballots one day.  Lock and load.

I (and all of us here) am all for advancing the hallmarks of the Millennial mandates of the digital age: ease and convenience.  I am also keenly aware there are wing-nuts looking for their Andy Warhol moment.  And whether enticed by some anarchist rhetoric, their own reality distortion field, or most insidious, the evangelism of a terrorist agenda (domestic or foreign), said wing-nut(s) -- perhaps just for grins and giggles -- might see an opportunity to derail an election (see my point above about a close race that swings control of Congress, or worse).

Here's the deep concern: I'm one of those who believes that the horrific attacks of 9.11 had little to do with body count or the implosions of western icons of financial might.  The real underlying agenda was to determine whether it might be possible to cause a temblor of sufficient magnitude to take world financial markets seriously off-line, and whether doing so might cause a rippling effect of chaos in world markets, and what disruption and destruction that might wreak.  If we believe that, then consider the opportunity for disruption of the operational continuity of our democracy.

It's not that we are Internet haters: we're not -- several of us came from Netscape and other technology companies that helped pioneer the commercialization of that amazing government and academic experiment we call the Internet.  It's just that THIS Internet and its current architecture simply were not designed to be inherently secure or to ensure anyone's absolute privacy (and strengthening one necessarily means weakening the other).

So, while we're all focused on ease and convenience, and we live in an increasingly distributed democracy, and the Internet cloud is darkening the doorstep of literally every aspect of society (and now government too), great care must be taken as legislatures rush to enact new laws and regulations to enable studies, build so-called pilots, or simply advance the Millennial agenda to make voting a smartphone experience.  We must be very careful and considerably vigilant, because it's not beyond the realm of reality that some wing-nut is watching, cracking their knuckles in front of their screen and keyboard, mumbling, "Oh please. Oh please."

Alaska has the right to venture down its own path in the northern territory, but in doing so it exposes an attack surface.  They need not (indeed, cannot) see this enemy from their back porch (I really can't speak for others).  But just because it cannot be identified at the moment doesn't mean it isn't there.

One other small point:  As a research and education non-profit, we're asked why we shouldn't be "working on making Internet voting possible."  Answer: perhaps in due time.  We do believe that on the horizon responsible research must be undertaken to determine how we can offer an additional digital alternative for casting a ballot, alongside the absentee and polling place experiences.  And that "digital means" might be over the public packet-switched network.  Or maybe some other type of network.  We'll get there.  But candidly, our charge for the next couple of years is to update the outdated architecture of existing voting machinery and elections systems, and bring about substantial but still incremental innovation that jurisdictions can afford to adopt, adapt, and deploy.  We're taking one thing at a time and first things first; or as our former CEO at Netscape used to say, we're going to "keep the main thing, the main thing."

Onward
GAM|out


Comments Prepared for Tonight's Elections Technology Roundtable

This evening at 5:00pm, members of the TrustTheVote Project have been invited to attend an elections technology roundtable discussion in advance of a public hearing in Sacramento, CA, scheduled for tomorrow at 2:00pm PST, on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations.  Due to the level of activity, only our CTO, John Sebes, is able to participate.

We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems.  These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.

Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project

We appreciate an opportunity to participate in this important discussion.  We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.

1. Certification

For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point.  Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model.  This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.

We are collaborating with NIST on a way to update this outdated model with a "component-ized" approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system.  There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.
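To make the unit-level idea concrete, here is a minimal sketch (in Python, with invented component and record shapes -- this is our illustration, not the NIST work or any standard's actual schema) of the kind of conformance test that could be re-run against a single discrete component, so that a change to that one element would not drag the entire system back through certification:

```python
# Hypothetical sketch: unit-level conformance test for one discrete component
# (a ballot counter), exercised against a stable, documented interface.
# If only this component changes, only this suite needs to be re-run.

def count_ballots(cast_vote_records, contest_id):
    """Toy ballot-counter component: tally the votes for one contest."""
    tally = {}
    for cvr in cast_vote_records:
        choice = cvr["contests"].get(contest_id)
        if choice is not None:
            tally[choice] = tally.get(choice, 0) + 1
    return {"contest_id": contest_id, "tally": tally}


def test_counter_produces_expected_tally():
    cvrs = [
        {"contests": {"mayor": "Candidate A"}},
        {"contests": {"mayor": "Candidate B"}},
        {"contests": {"mayor": "Candidate A"}},
        {"contests": {}},  # undervote: no selection in this contest
    ]
    result = count_ballots(cvrs, "mayor")
    assert result["tally"] == {"Candidate A": 2, "Candidate B": 1}


if __name__ == "__main__":
    test_counter_produces_expected_tally()
    print("counter component conformance checks passed")
```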

We're glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California.  Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.

2. Standards

Second, there is a prerequisite for component-level certification that until recently wasn't available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot counter's output of vote tally data that also serves as input to a tabulator component.  Without common data formats, elections officials have to acquire a whole integrated product suite whose parts communicate in a proprietary manner.  With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as "forklift upgrades" or "fleet replacements."
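As a purely illustrative sketch (the field names below are invented for this post, not any standards body's actual format), here is what that hand-off could look like: a ballot counter emits a tally record in a shared open format, and a tabulator from a different supplier consumes it with no proprietary translation layer.

```python
# Illustrative only: an invented, simplified vote-tally record passed from a
# ballot-counter component to a tabulator component via a shared, open format.
import json

counter_output = {
    "precinct": "012",
    "contest_id": "mayor",
    "tally": {"Candidate A": 412, "Candidate B": 388},
    "ballots_counted": 805,
}

# The counter writes the open format...
serialized = json.dumps(counter_output)

# ...and a tabulator from a different supplier reads the same format and
# aggregates across precincts, with no vendor-specific translation layer.
def tabulate(precinct_reports):
    totals = {}
    for report in precinct_reports:
        for candidate, votes in report["tally"].items():
            totals[candidate] = totals.get(candidate, 0) + votes
    return totals

print(tabulate([json.loads(serialized)]))
```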

The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem.  And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote.  They deserve congratulations for reaching this imperative state of data standards détente.  It's not finished, but the effort and momentum is there.

So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.

3. Commercial Revitalization

Finally, this may be the opportunity to realize a vision we have: that open data standards, a new certification process, and lowered bars to innovation through open sourcing will reinvigorate a stagnant voting technology industry.  Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation.  That should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.

One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery.  To that point, let us offer an important clarification and dispel a misconception about "rolling your own."  This does not mean that elections officials are about to be left to self-vend -- that is, to self-construct and support their own open, standard, commodity voting system components.  A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice, rather than forcing anyone to become a do-it-yourself type.  Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software.  Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.

Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation.  To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.

In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.

 

EDITORIAL NOTES: There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified, non-certified voting systems a possibility in California.  We don't read SB-360 that way at all.  We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest.  In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot and should not do away with certification, and we think charges that this is what will happen are misinformed.  We simply need to modernize how certification works to enable this kind of innovation.  We think our comments today bear that out.

Moreover, have a look at the Agenda for tomorrow's hearing on implementation of SB-360.  In sum and substance the agenda is to discuss:

  1. Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
  2. Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
  3. Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.

Finally, there has been an additional charge that SB-360 is intended to "empower" LA County, such that LA County (or someone on their behalf) will sell the voting systems it builds to other jurisdictions.  We think this allegation is also misinformed, for two reasons: [1] assuming LA County builds their system on open source, there is a question as to what specifically would or could be offered for sale; and [2] notwithstanding offering open source for sale (which technically can be done... technically), it seems to us that if such a system is built with public dollars, then it is, in fact, publicly owned.  From what we understand, a government agency cannot offer for sale assets developed with public dollars, but they can give them away.  And indeed, this is what we've witnessed over the years in other jurisdictions.

Onward.


"Why is There a Voting Tech Logjam in Washington

"Why is There a Voting Tech Logjam?" -- that's a good question! A full answer has several aspects, but one of them is the acitivty (or in-activity) at the Federal level, that leads to very limited options in election tech. For a nice pithy explanation of that aspect, check out the current issue of the newsletter of the National Conference of State Legislators, on page 4. One really important theme addressed here is the opportunity for state lawmakers to make their decisions about what standards to use, to enable the state's local election officials make their decisions about what technology to make or adopt -- including purchase, in-house build, and (of course) adoption and adaptation of open-source election technology.

-- EJS


Election Transparency Must be Apolitical

For those of you who have been following the recount saga in Wisconsin, here is a bit of news, and a reflection on that. So, the news from a couple of days ago (I'm just catching up) is that the process of re-counting is complete, but the resolution of that close election may not be.  The re-counting did not change which candidate is leading, and apparently expanded the margin slightly.

Trailing candidate Joanne Kloppenburg explains her motivation for the recount in a newspaper letter to the editor, building on the old but true assertion that, "One may be entitled to their own opinions, but they are not entitled to their own facts."

We steer clear of political food fights, and I have no opinion on her motivation. But we are all about transparency and transparency should not have any political agenda attached.

To that end, the points Kloppenburg raises about some of the irregularities, problems, and issues with the re-counting process (which are not the same as the problems with the original count) -- including lack of physical security on ballots, and uncertainty as to whether the re-counted ballots were the same ballots as the originally counted ones -- are reasonable questions about transparency.  More importantly, Kloppenburg offers some reflections about the re-count that are important and correctly apolitical:

When races are this close, there is a significant public interest established both by statute and by common sense in determining that votes were counted and counted accurately.

This election was close, and there were many who have expressed doubts about whether it was clean. The right to vote is fundamental. It is a right that courageous people fight and die for every day.  In America, that right carries with it a promise: that elections are fair and open, that election results are untainted by deceit or fraud, and that the electoral process provides every eligible voter with an equal opportunity to privately and independently cast a ballot.

In order to make that promise real, there are appropriate and established steps that help make sure the outcome of elections, when in doubt, can withstand scrutiny. That, no more and no less, is exactly why this recount is so important.

That is, in fact, a fine description of the purpose of a recount.

It's unfortunate that in this particular case, the re-count process seems to have a similar or greater level of problems that cast doubt on the result.  We can only hope that the full scope of the process, warts and all, becomes transparent to the public.

For me, I find that regardless of candidate or political preferences, there is a point couched in the last two sentences excerpted from her letter that matters most:

...there are appropriate and established steps that help make sure the outcome of elections, when in doubt, can withstand scrutiny

Transparency in process.  There should be nothing political about that.

GAM|out


Voting System (De)Certification, Reloaded (Part 3 of 2)

Thanks to some excellent recent presentations by EAC folks, we have today a pleasant surprise of an update to our recent blogs Voting System Decertification: A Way Forward (in Part 1 and Part 2). As you might imagine with a government-run test and certification program, there is an enormous amount of detail (much of it publicly available on the EAC web site!) but Greg and I have boiled it down to a handful of point/counterpoints. Very short version: EAC seems to be doing a fine job, both in the test/certification/monitoring roles, and in public communication about it. At the risk of oversimplifying down to 3 points, here goes:

1. Perverse Incentive

Concern: ES&S's Unity 3.2.0.0 would be de-certified as a result of EAC's investigation into functional irregularities documented in Cuyahoga County, Ohio, by erstwhile elections direction Jane Platten (Kudos to Cuyahoga). With the more recent product 3.2.1.0 just certified, the "fix" might be for customers of 3.2.0.0 to upgrade to the latest version, with unexpected time and unbudgeted upgrade expense to customers, including license fees. If so, then the product defect, combined with de-certification, would actually benefit the vendor by acting to spur customers toward paid upgrades. Update: Diligent work at EAC and ES&S has resulted in in ES&S providing an in-place fix to its 3.2.0.0 product, so that EAC doesn't have to de-certify the product, and customers don't have to upgrade. In fact, one recent result of EAC's work with Cuyahoga County, the county was able to get money back from the vendor because of the issues identified.

Next Steps: We'll be waiting to hear whether the fix is provided at ES&S's expense (or at least at no cost to customers), as appears may be the case. We'll also be watching with interest the process by which the fixed version of 3.2.0.0 goes through test and certification to become legal for real use in elections. As longtime readers know, we've stressed the importance of a timely re-certification process for products that have been certified, need a field update, and need the previously used test lab to test the updated system with testing that is as rigorous as the first time, but less costly and more timely.

2. Broken Market

Concern: This situation may be an illustration of the untenable nature of a market that would require vendors to pay for expensive testing and certification efforts, and to also have to forego revenue when otherwise for-pay upgrades are required because of defects in software.

Update: By working with the vendor and their test lab on both the earlier and current versions of the product, all customers will be able to obtain a no-fee update of their existing product version, rather than being required to do a for-fee upgrade to a later product version. Therefore, the "who pays for the upgrade?" question applies only to those customers who actually want to pay for the latest version.

Next Steps: Thanks to the EAC's new practice of publishing timelines for all product evaluation versions, it should be possible to compare the timeframe for the original 3.2.0.0 testing, the more recent 3.2.1.0 testing, and the testing of the bug-fixed version of 3.2.0.0. We can hope that this case demonstrates that a re-certification process can indeed be equally rigorous, less costly, and more timely.

ES&S Evaluation Timeline

3. Lengthy Testing and Certification

Concern: The whole certification testing process costs millions and takes years for these huge voting system products of several components and dozens of modules of software. How could a re-test really work at a tiny fraction of a fraction of that time and cost?

Update: Again, thanks to those published timelines, and with the experience of recent certification tests, we can see the progress that EAC is making toward their goal of an end-to-end testing campaign for a system taking less than 9 months and a million dollars, perhaps even a quarter or a third less. The key, of course, is that a system be ready for testing. As we've seen with some of the older systems that simply weren't designed to meet current standards, and weren't engineered with a rigorous and documented QA process that could be disclosed to a test lab to build on, well, it can be a lengthy process -- or even one that a vendor withdraws from in order to go back and do some re-engineering before trying again.

Next Steps: A key part of this time/cost reduction is EAC's guidance to vendors on readiness for testing. That guidance is another relatively recent improvement by EAC. We can hope for some public information in the future about how the readiness assessment has worked, and how it helped a test process get started right. But even better, EAC folks have again repeated a further goal for time/cost reduction: moving to voting system component certification, rather than certifying the whole enchilada -- or perhaps I should say, a whole enchilada, rather than the whole plato gordo of enchiladas, quesadillas, and chiles rellenos, together with the EMS for it all with its many parts -- rice, frijoles, pico de gallo, fresh guacamole ... (I detect an over-indulgence in metaphor here!)

One More Thing: As we've said before, we think that component-level testing and re-testing is the Big Step toward getting the whole certification scheme into a shape that really serves its real customers -- election officials and the voting public. And we're proud to jump in and try it out ourselves: work with EAC, use the readiness assessment ourselves, do a pilot test cycle, and see what can be learned about how that Big Step might actually work in the future.

-- EJS


Voting System (De)certification - A Way Forward? (2 of 2)

Yesterday I wrote about the latest sign of the downward spiral of the broken market in which U.S. local election officials (LEOs) purchase product and support from vendors of proprietary voting system products: monolithic technology that is the result of years' worth of accretion, and that costs years and millions to test and certify for use -- including a current case where the process didn't catch flaws that may result in a certified product being de-certified, and being replaced by a newer system, at cost to LEOs. Ouch! But could you really expect a vendor in this miserable market to give away a new product that they spent years and tens of millions to develop, to every customer of the old product, to whom the vendor had planned to sell upgrades -- just because of flaws in the old product? The situation is actually worse: LEOs don't actually have the funding to acquire a hypothetical future voting system product in which the vendor was fully open about true costs, including

(a) certification costs, both direct (fees to VSTLs) and indirect (staff time), as well as

(b) costs of development including rigorously designed and documented testing.

Actually, development costs alone are bad enough, but certification costs make it much worse -- as well as creating a huge barrier to entry for anyone foolhardy enough to try to enter the market (or even stay in it!) and make a profit.

A Way Forward?

That double-whammy is why I and my colleagues at OSDV are so passionate about working to reform the certification process, so that individual components can be certified for far less time and money than a mess o' code accreted over decades, including wads of interwoven functionality that might not even need to be certified! And then, of course, these individual components could also be re-certified for bug fixes by re-running a durable test plan that the VSTL created the first time around.  And that, of course, requires common data formats for inter-operation between components -- for example, between a PCOS device and a Tabulator system that combines and cross-checks all the PCOS devices' outputs, in order either to find errors and omissions or to produce a complete election result.
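Here is a toy sketch of that PCOS-to-Tabulator hand-off (Python, with invented field names -- this is not TrustTheVote code): the tabulator reads every device's output in a common format, cross-checks it, and either reports errors and omissions or produces a combined result.

```python
# Toy sketch (not TrustTheVote code): a tabulator that consumes the outputs of
# several PCOS devices in a shared format, cross-checks them, and either
# reports discrepancies or produces a combined election result.
def tabulate(pcos_reports, expected_precincts):
    reported = {r["precinct"] for r in pcos_reports}
    missing = expected_precincts - reported
    errors = [f"missing report from precinct {p}" for p in sorted(missing)]

    totals = {}
    for report in pcos_reports:
        counted = sum(report["tally"].values())
        # Internal consistency check: tallies should not exceed ballots cast.
        if counted > report["ballots_cast"]:
            errors.append(f"precinct {report['precinct']}: tally exceeds ballots cast")
        for candidate, votes in report["tally"].items():
            totals[candidate] = totals.get(candidate, 0) + votes

    return {"totals": totals, "errors": errors}


reports = [
    {"precinct": "001", "ballots_cast": 100, "tally": {"A": 60, "B": 40}},
    {"precinct": "002", "ballots_cast": 80, "tally": {"A": 35, "B": 44}},
]
print(tabulate(reports, expected_precincts={"001", "002", "003"}))
```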

So once again, our appreciation to NIST, the EAC, and IEEE 1622 for actually doing the detailed work of hashing out these common data formats, which are the bedrock of inter-operation, which is the prerequisite for certification reform, which enables reduction of certification costs, which might result in voting system component products being available at true costs that are affordable to the LEOs who buy and use them.

Yes, that's quite a stretch, from data standards committee work to a less broken market that might be able to deliver to customers at reasonable cost. But to replace a rickety old structure with a new, solid, durable one, you have to start at the bedrock, and that's where we're working now.

-- EJS

PS: Thanks again to Joe Hall for pointing out that the current potential de-certification and mandatory upgrade scenario (described in Part 1) illustrates the untenable nature of a market that would require vendors to pay for expensive testing and certification efforts, and to also have to (as some have suggested) forego revenue when otherwise for-pay upgrades are required because of defects in software.


Voting System (De)certification - Another Example of the Broken Market (1 of 2)

Long-time readers will certainly recall our view that the market for U.S. voting systems is fundamentally broken. Recent news provides another illustration of the downward spiral: the likely de-certification of a widely used voting system product from the vendor that owns almost three quarters of the U.S. market. The current stage of the story is that the U.S. Election Assistance Commission is formally investigating the product for serious flaws that led to errors of the kind seen in several places in 2010, and perhaps best documented in Cuyahoga County. (See:  "EAC Initiates Formal Investigation into ES&S Unity 3.2.0.0 Voting System".) The likely end result is the product being de-certified, rendering it no longer legal for use in many states where it is currently deployed. Is this a problem for the vendor? Not really. The successor version of the product is due to emerge from a lengthy testing and certification process fairly soon. Having the current product banned is actually a great tool for migrating customers to the latest product!

But at what cost, and to whom? The vendor will charge the customers (local election officials, or LEOs) for the new product, the same as it would have if the migration were voluntary and the old product version still legal. The LEOs will have to sign and pay for a multi-year service agreement. And they will have the same indirect costs of staff effort (at the expense of other duties, like running elections, or getting enough sleep to run an election correctly), and direct costs for shipping, transportation, storage, etc. These are real costs! (Example: I've heard reports of some under-funded election officials opting not to use election equipment that they already have, because they have no funding for the expense of moving it from the warehouse to a testing facility and doing the required pre-election testing.)

Some observers have opined that vendors of flawed voting system products should pay: whether damages, or fines, or doing the migration gratis, or something. But consider this deeper question, from UCB and Princeton's Joe Hall:

Can this market support a regulatory/business model where vendors can't charge for upgrades and have to absorb costs due to flaws that testing and certification didn't find? (And every software product, period, has them).

The funding for a high level of quality assurance has to come from somewhere, and that's not voting system customers right now. Perhaps we're getting to the point where the amount of effort it takes to produce a robust voting system and get it certified -- at the vendor's expense -- creates a cost that customers are not willing or able to pay when the product gets to market.

A good question! And one that illustrates the continuing downward spiral of this broken market. The cost to vendors of certification is large, and you can't really blame a vendor for the sort of overly rapid development, marketing, and sales that leads to the problems being investigated. These folks are in this business to make a profit, for heaven's sake; what else could we expect?

-- EJS

PS - Part Two, coming soon: a way out of the spiral.


Open-Source Election Software Hosting -- What Works

Putting an open source application into service -- or "deployment" -- can be different from deploying proprietary software. What works, and what doesn't? That's a question that's come up several times in the past few weeks, as the TTV team has been working hard on several proposals for new projects in 2011.

Based on our experiences in 2009-10, here is what we've been saying about deployment of election technology that is not a core part of certified ballot casting/counting systems, but is part of the great range of other types of election technology: data management solutions for managing election definitions, candidates, voter registration, voter records, pollbooks and e-pollbooks, election results, and more -- and reporting and publishing the data.

For proprietary solutions -- off the shelf, or with customization and professional services, or even purely custom applications like many voter record systems in use today -- deployment is most often the responsibility of the vendor. The vendor puts the software into the environment chosen by the customer (state or local election officials), ranging from the customer's IT plant, to outsourced hosting, to the vendor's offering of a managed service in an application-service-provider approach. All have distinct benefits, but share the drawback of "vendor lock-in."

What about open-source election software? There are several approaches that can work, depending on the nature of the data being managed and the level of complexity in the IT shop of the election officials. For today, here is one approach that has worked for us.

What works: outsourced hosting, where a system integrator (SI) manages outsourced hosting. For our 2010 project for VA's FVAP solution, the project was led by an SI that managed the solution development and deployment, providing outsourced application hosting and support. The open-source software included a custom Web front-end to existing open-source election data management software, customized to VA's existing data formats for voters and ballots. This arrangement worked well because the people who developed the custom front-end software also performed the deployment, on a system completely under their control. VA's UOCAVA voters benefited from the blank-ballot distribution service, while the VA state board of elections was involved mainly by consuming reports and statistics about the system's operation.

That model works, but not in every situation. In the VA case, this model also constrained the way that the blank ballot distribution system worked. In this case, the system did not contain personal private information -- the VA-provided voter records were "scrubbed." As a result, it was OK for the system's limited database to reside in a commercial hosting center outside the direct control of election officials. The deployment approach was chosen first, and it constrained the nature of the Web application.

The constraint arose because the FVAP solution allowed voters to mark ballots digitally (before printing and returning them by post or express mail). Therefore it was essential that the ballot-marking be performed solely on the voter's PC, with absolutely no visibility by the server software running in the commercial datacenter. Otherwise, each specific voter's choices would be visible to a commercial enterprise -- clearly violating ballot secrecy. The VA approach was a contrast to some other approaches in which a voter's choices were sent over the Internet to a server, which prepared a ballot document for the voter. To put it another way ...

What doesn't work: hosting of government-privileged data. In the case of the FVAP solution, this would have been outsourced hosting of a system that had visibility on the ultimate in election-related sensitive data: voters' ballot choices.

What works: engaged IT group. A final ingredient in this successful recipe was engagement of a robust IT organization at the state board of elections. The VA system was very data-intensive during setup, with large amounts of data from legacy systems. The involvement of VA SBE IT staff was essential to get the job done on the process of dumping the data, scrubbing and re-organizing it, checking it, and loading it into the FVAP solution -- and doing this several times as the project progressed to the point where voter and ballot data were fixed.
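For a flavor of that cycle, here is a hypothetical sketch (the field names and rules are invented; this is not the actual VA data or process) of the dump/scrub/check/load steps, with personal identifiers dropped before anything reaches the hosted system:

```python
# Hypothetical sketch of the dump/scrub/check/load cycle described above --
# not the actual VA process; field names are invented for illustration.
import csv
import io

RAW = """voter_id,name,ssn,locality,ballot_style
1001,Jane Doe,123-45-6789,Fairfax,FX-07
1002,John Roe,,Richmond City,RC-02
"""

def scrub(row):
    # Drop personal identifiers the hosted system must never hold.
    return {"locality": row["locality"].strip(),
            "ballot_style": row["ballot_style"].strip()}

def check(record):
    # Reject records that would break ballot lookup downstream.
    return bool(record["locality"]) and bool(record["ballot_style"])

def load(records):
    # Stand-in for loading into the hosted data management system.
    print(f"loaded {len(records)} scrubbed records")

rows = csv.DictReader(io.StringIO(RAW))
scrubbed = [scrub(r) for r in rows]
good = [r for r in scrubbed if check(r)]
load(good)
```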

To sum up what worked:

  • data that was OK to be outside direct control of government officials;
  • government IT staff engaged in the project so that it was not a "transom toss" of legacy data;
  • development and deployment managed by a government-oriented SI;
  • deployment into a hosted environment that met the SI's exact specifications for hosting the data management system.

That recipe worked well in this case, and I think would apply quite well for other situations with the same characteristics. In other situations, other models can work. What are those other models, or recipes? Another day, another blog on another recipe.

-- EJS


EAC Guidelines for Overseas Voting Pilots

Last Friday was a busy day for the federal Election Assistance Commission.  They issued their Report to Congress on efforts to establish guidelines for remote voting systems.  And they closed their comment period at 4:00pm for the public to submit feedback on their draft Pilot Program Testing Requirements.  This is being driven by the MOVE Act implementation mandates, which we have covered previously here (and summarized again below).  I want to offer a comment or two on the 300+ page report to Congress and on the Pilot Program guidelines, for which we submitted some brief comments, most of which reflected the comments submitted by ACCURATE, friends and advisers of the OSDV Foundation.

To be sure, the size of the Congressional Report is due to the volume of content in the Appendices including the full text of the Pilot Program Testing Requirements, the NIST System Security Guidelines, a range of example EAC processing and compliance documents, and some other useful exhibits.

Why Do We Care?

The TrustTheVote Project’s open source elections and voting systems framework includes several components useful for configuring a remote ballot delivery service for overseas voters.  And the MOVE Act updates existing federal regulations intended to ensure that voters stationed or residing (not visiting) abroad can participate in elections at home.

A Quick Review of the Overseas Voting Issue

The Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA) protects the absentee voting rights of U.S. citizens, including active members of the uniformed services and the merchant marine, and their spouses and dependents, who are away from their place of legal voting residence.  It also protects the voting rights of U.S. civilians living overseas.  Election administrators are charged with ensuring that each UOCAVA voter can exercise their right to cast a ballot.  In order to fulfill this responsibility, election officials must provide a variety of means for these voters to obtain information about voter registration and voting procedures, and to receive and return their ballots.  (As a side note, UOCAVA also establishes requirements for reporting statistics on the effectiveness of these mechanisms to the EAC.)

What Motivated the Congressional Report?

The MOVE (Military and Overseas Voter Empowerment) Act, which became law last fall, is intended to bring UOCAVA into the digital age.  Essentially, it mandates a digital means to deliver a blank ballot.

Note: the law is silent on a digital means to return prepared ballots, although several jurisdictions are already asking the obvious question:  "Why improve only half the round trip of an overseas ballot casting?"

And accordingly, some Pilot programs for MOVE Act implementation are contemplating the ability to return prepared ballots.  Regardless, there are many considerations in deploying such systems, and given that the EAC is allocating supporting funds to help States implement the mandates of the MOVE Act, they are charged with ensuring that those monies are allocated to programs adhering to guidelines they promulgate.  I see it as a "checks and balances" effort to ensure EAC funding is not spent on system failures that put UOCAVA voters' participation at risk of disenfranchisement.

And this is reasonable given the MOVE Act intent.  After all, in order to streamline the process of absentee voting and to ensure that UOCAVA voters are not adversely impacted by the transit delays involved due to the difficulty of mail delivery around the world, technology can be used to facilitate overseas absentee voting in many ways from managing voter registration to balloting, and notably for our purposes:

  • Distributing blank ballots;
  • Returning prepared ballots;
  • Providing for tracking ballot progress or status; and
  • Compiling statistics for UOCAVA-mandated reports.

The reality is, however, systems deployed to provide these capabilities face a variety of threats.  If technology solutions are not developed or chosen so as to be configured and managed using guidelines commensurate with the importance of the services provided and the sensitivity of the data involved, a system compromise could carry severe consequences for the integrity of the election, or the confidentiality of sensitive voter information.

The EAC was therefore compelled to report to Congress and to establish (at least) voluntary guidelines.  And so we commented on those guidelines, as did colleagues of ours from other organizations.

What We Said - In a Nutshell

Due to the very short comment period, we were unable to dive into the depth and breadth of the Testing Requirements.  And that’s a matter for another commentary.  Nevertheless, here are the highlights of the main points we offered.

Our comments were developed in consultation with ACCURATE; they consisted of (a) underlining a few of the ACCURATE comments that we believed were most important from our viewpoint, and (b) adding a few suggestions for how Pilots should be designed or conducted.  Among the ACCURATE comments, we underscored:

  • The need for a Pilot's voting method to include a robust paper record, as well as complementary data, that can be used to audit the results of the pilot.
  • Development of, and publication of, security specifications that are testable.

In addition, we recommended:

  • Development of a semi-formal threat model, and comparison of it to threats of one or more existing voting methods.
  • Testing in a mock election, in which members of the public can gain understanding of the mechanisms of the pilot, and perform experimentation and testing (including security testing), without impacting an actual election.
  • Auditing of the technical operations of the Pilot (including data center operations), publication of audit results, and development of a means of cost accounting for the cost of operating the pilot.
  • Publication of ballot data, cast vote records, and the results of auditing them, but without compromising the anonymity of the voter and the ballot (see the sketch after this list).
  • Post-facto reporting on means and limits of scaling the size of the pilot.
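On the publication recommendation above, here is one minimal illustration (ours, not part of the submitted comments or the EAC requirements) of a simple precaution: strip voter identifiers and shuffle record order before cast vote records are published for auditing.

```python
# Illustrative sketch only: publish cast vote records for auditing without
# linking them to voters -- drop identifiers and shuffle the publication order.
import json
import random

cast_vote_records = [
    {"voter_id": "V-0001", "received": "2010-10-29T14:02:11", "contests": {"senate": "A"}},
    {"voter_id": "V-0002", "received": "2010-10-29T14:05:47", "contests": {"senate": "B"}},
    {"voter_id": "V-0003", "received": "2010-10-30T09:12:30", "contests": {"senate": "A"}},
]

def anonymize_for_publication(cvrs):
    # Keep only the contest selections; drop identifiers and timestamps.
    published = [{"contests": cvr["contests"]} for cvr in cvrs]
    # Shuffle so publication order cannot be matched to receipt order.
    random.shuffle(published)
    return published

print(json.dumps(anonymize_for_publication(cast_vote_records), indent=2))
```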

You can bet this won't be the last we'll hear about MOVE Act Pilot issues; I think it's just the 2nd inning of an interesting ball game...

GAM|out


A License to Adopt

Open Source Technology Licensing... We've been promising to respond to the chorus of concerns that we may drift from the standard GPL for our forthcoming elections and voting systems software platform and technology.  Finally, we can begin talking about it (mainly because I found a slice of time to do so, and not because of any event horizon enabling the discussion, although we're still working out issues and there will be more to say soon.)

From the beginning we’ve taken the position that open source licenses as they exist today (and there are plenty of them) are legally insufficient for our very specific purposes of putting publicly owned software into the possession of elections jurisdictions across the nation.

And of course, we’ve heard from activists, essentially up in arms over our suggestion that current licensing schemes are inadequate for our purposes.  Those rants have ranged from the sublime to the ridiculous, with some reasonable questions interspersed.  We’d like to now gently remind those concerned that we [a] benefit from a strong lineage of open source licensing experience dating back to the Mozilla code-release days of Netscape catalyzed by Eric Raymond’s Manifesto, [b] have considerable understanding of technology licensing (e.g., we have some deep experience on our Board and within our Advisory group, and I'm a recovering IP lawyer myself), and [c] are supported by some of the best licensing lawyers in the business. So, we’re confident of our position on this matter.

We’ve dared to suggest that the GPL as it stands today, or for that matter any other common open source license, will probably not work to adequately provide a license to the software sources for elections and voting systems technology under development by the Open Source Digital Voting Foundation.  So, let me start to explain why.

I condition this with “start” because we will have more to say about this – sufficient to entertain your inner lawyer, while informing your inner activist.  That will take several forms, including a probable white paper, more blog posts, and (wait for it) the actual specimen license itself, which we will publicly vet (to our Stakeholder Community first, and the general public immediately thereafter).

The Why of it…

The reasons we’re crafting a new version of a public license are not primarily centered on the grant of rights or other “meat” of the license, but on ancillary legal terms that may be of little significance to most licensees of open source software technology, but that turn out to be of considerable interest to, and in many cases requirements of, government agencies intending to deploy open source elections and voting technology in real public elections, where they’re “shooting with live ballots,” as Bob Carey of FVAP likes to say.

It is possible that an elections jurisdiction, as a municipal entity, could contract around some of the initial six points I make below, but the GPL (and most other “copyleft” licenses) expressly disallows placing "additional restrictions" on the terms of the license.  And most of the items I describe below could or would be considered "additional restrictions."  Therefore, such negotiation of terms would render a standard copyleft license invalid under its terms of issuance today.

It’s not like we haven’t burnt through some whiteboard markers thinking this through – again, we’re blessed with some whip smart licensing lawyers.   For instance, we considered a more permissive license, wrapped in some additional contract terms.  But a more permissive license would be a significant decision for us, because it would likely allow proprietary use of the software – an aspect we’ve not settled on yet.

With that in mind, here are six of the issues we’re grappling with that give rise to the development of the “OSDV Public License.”  This list is by no means exhaustive.  And I'm waiting for some additional insight from one of our government contracting lawyers who is a specialist in government intellectual property licensing.  So we’ll have more to say beyond here.

  1. Open source licenses rarely have “law selection” clauses.  Fact: Most government procurement regulations require the application of local state law or federal contracting law to the material terms and conditions of any contract (including software “right to use” licenses).
  2. Open source licenses rarely have venue selection clauses (i.e., site and means for dispute resolution).  Fact: Many state and federal procurement regulations require that disputes be resolved in particular venues.
  3. There are rights assignment issues to grapple with.  Fact: Open source licenses do not have "government rights" provisions, which clarify that the software is "commercial software" and thus not subject to the draconian rules of federal procurement that may require an assignment of rights to the software when the government funds development.  (There may be state equivalents, we’re not certain.)  On the one hand, voting software is a State or county technology procurement and not a federal activity.  But we’ve been made aware of some potential parallelism in State procurement regulations.
  4. Another reality check is that our technology will be complex mix of components some of which may actually rise to the level of patentability, which we intend to pursue with a “public assignment” of resulting IP rights.  Fact: Open source licenses do not contain "march-in rights" or other similar provisions that may be required by (at least) federal procurement regulations for software development.  Since some portion of our R&D work may be subject to funding derived from federal-government grants, we’ll need to address this potential issue.
  5. There is a potential enforceability issue.  Fact: Contracting with states often requires waiver of sovereign immunity to make licenses meaningfully enforceable.
  6. In order to make our voting systems framework deployable for legal use in public elections, we will seek Federal and State certifications where applicable. Doing so will confer a certain qualification for use in public elections, on which will be predicated a level of stability in the code and a rigid version control process.  It may be necessary to incorporate additional terms into “deployment” licenses (versus “development” licenses) specific to certification assurances and, therefore, stipulations on “out-of-band” modifications, extensions, or enhancements.  Let’s be clear: this will not incorporate any restrictions that would otherwise be vexatious to the principles of open source licensing, but it may well require some procedural adherence.

And this is the tip of the iceberg on the matter of providing an acceptable comprehensive, enforceable, open source license for municipalities, counties, and States who desire to adopt the public works of the Open Source Digital Voting Foundation TrustTheVote Project for deployment in conducting public elections.

At this juncture, it's looking like we may end up crafting a license somewhat similar in nature to the Mozilla MPL.

Hopefully, we’ve started a conversation here to clarify why it’s a bit uninformed to suggest we simply need to "bolt on" the GPL3 for our licensing needs.  Elections and voting technology – especially that which is deployed in public elections – is a different animal than just about any other open source project, even in a majority of other government IT applications.

Stay tuned; we’ll have more to show and tell.

Cheers GAM|out


Voting System Certified in New York

New York state recently certified two voting systems, and the end of the process offers an interesting insight into current certification and standards -- particularly the view of the dissenting participant, Bo Lipari, who explained his vote in his blog post My Vote on NY Voting Machine Certification. It's certainly worth reading Bo's complete rationale, but I think the most important take-away is very aptly expressed by today's guest blogger, Candice Hoke, the Director of the Center for Election Integrity and Associate Professor of Law at the Cleveland-Marshall College of Law at Cleveland State University.

I read Bo Lipari's blog regarding the NY VS certification issue, and the 9:1 vote in favor of certification, with Bo's vote the only dissent. To provide a lawyer's view, I would mention that Anglo-American law includes a principle termed "substantial compliance." It has limitations and caveats, but it's worth considering how this principle might apply to the voting tech certification area, or instead be excluded from it.

At base, Bo's blog, and certification facts he presents, pose a very important question:

Do we really want voting system vendors to be able to "substantially comply" with the certification standards, or do we want to require more rigorous, complete compliance; and if so, why?

This is a critical question, of course.  Certainly, in the earlier NASED certification process, the ITAs (labs operating as Independent Testing Authorities) viewed substantial compliance to be all that was required.  The ITA view of “substantial” seemed to be inchoate and ad hoc, perhaps based on a general gestalt of the voting system product under review. As the California TTBR and other independent voting system studies documented, "substantial” offers a great deal of interpretive wiggle room.

My thanks to Candice both for posing this important question and for pointing out that any answer is not going to be tidy, whether it is black-or-white or a paler shade of gray.

-- EJS


Sequoia Makes Good on Publishing New Source Code

Kudos to Sequoia Voting Systems for making good on a promise to publish ("disclose") new source code for a future release of their Frontier voting system.  We applaud their recognition of the importance of transparency in voting technology.  That is, after all, the hallmark of our work and the mission of the Open Source Digital Voting Foundation. Rather than regurgitate the details of this (and the important implications), I want to point you to a tour de force treatment of this topic by one of our esteemed advisers, Joe Hall, and his post about it today.

But there is one point we do want to make.  Much as we applaud Sequoia's move today, there is an important distinction (and value proposition) in the work of the TrustTheVote Project.  The foundation of our project is the principle of "integrity assurance through open source."

And we believe that publication of code alone will not achieve integrity assurance.  The reason is that although code may be disclosed for inspection, there is no assurance whatsoever that any input from an observer, whether suggestions for improvement or discoveries of errors, will ever be heeded.

On the other hand, a truly open source effort means that the source code can be far more than simply visually inspected.  It means that anyone is free (and encouraged) to offer corrections and modifications, and to actually submit them for integration into the TrustTheVote Project source tree.  This is truly an open, transparent process of public engagement and collaboration.  And this is markedly different from a commercial vendor simply publishing their source code for arm's-length inspection.

Admittedly, it appears that Sequoia is going somewhat beyond mere publication and making their code available to compile and run (as per their terms).  But again, to what extent anything uncovered, discovered, or recommended will actually find its way back into production versions of their code remains to be seen.

As our development work produces code open to public advice, comment, and modification, rest assured that the TrustTheVote Project will empower public participation in making sure that publicly owned voting technology achieves accuracy, transparency, trustworthiness, and security.

Bear in mind that our objective is to assemble an entire system that can qualify for a federal certification process (something we still need to start a conversation about, separate from this thread).  And to do so, there will need to be a publicly vetted process for incorporating contributions, modifications, and error corrections.  We promise this "integration qualification" process should not be as intensive as obtaining Apple's approval for an iPhone app, but there will need to be a process that upholds the integrity assurance mandate.

Bear with us as our Core Team begins publishing component prototypes and Beta versions as per specifications driven by our Stakeholder Community.  We already have stuff for your perusal.  Check out our Project Wiki to stay abreast of the work.

We'll see how far Sequoia is willing to go; after all, this is the same company that threatened computer science professors just last year for daring to reverse engineer their code in support of a State's legitimate request to audit its voting systems.  What an about-face they've made; but again, we think their decision is an important move.

So again, kudos to Sequoia for making an effort toward transparency, albeit a commercially constrained one.  Let's see where it goes.

What do you think? -GAM|out

Comment

Comment

Sequoia Announces Published Source Code

Sequoia Voting Systems announced today that they will be moving toward a disclosed-source model in which they will soon begin publishing their source code. I must say that the tone and language of the press release are gratifying, especially that they thought to note that the product is also open-data, which is critical to the goal of transparency in the operation of a voting system. But perhaps most satisfying is the about-face on security by obscurity. Sequoia's VP of R&D, Eric D. Coomer, PhD, was quoted:

Security through obfuscation and secrecy is not security. Fully disclosed source code is the path to true transparency and confidence in the voting process for all involved.

I couldn't agree more! Even though the product is still proprietary (disclosed-source, not open-source), it's nice to see a vendor come around to the idea that open is not weak, and indeed to have taken the leap of doing the R&D to make a product that they say was intended from the beginning to be disclosed.

-- EJS

Comment

1 Comment

Federally Approved Voting System - Not Tested for Security

We now have a federally certified voting system product that has completed the required testing by a federally accredited independent test lab. That's a milestone in itself, as is the public disclosure of some of the results of the testing process. Thanks to that disclosure, though, we now know that the test lab did practically zero security testing of the Premier product, because of a gross misunderstanding of a prior security review. For a complete, accurate, and brief explanation of the whole situation, I urge you to read this letter to the EAC. The letter is from a group of absolutely top-notch voting technology and/or computer security experts, who were involved in California's Top to Bottom Review (TTBR) of voting systems, which included the Premier system that was recently certified.

At the risk of over-simplifying, the story goes like this.

  1. The TTBR found loads of technical security problems and concluded that
    • the security problems were so severe that the system's own technological security mechanisms could not protect it; and
    • the problems could be addressed only with strict procedural security - chain of custody, tamper-evident seals, and the like.
  2. Next, the test lab misinterpreted these conclusions, assuming that mitigating the system's vulnerabilities depended only on effective procedural controls -- and therefore that there was no need to test the technical security mechanisms!
  3. The test plan included no additional security tests, and hence the Premier system passed testing despite the many security flaws found in the TTBR.

That's the gist, but do read the letter to the EAC. It's a fine piece of writing in which Joe Hall and Aaron Burstein set out everything fair and square, chapter and verse. I have to say it's astonishing.

Now, maybe this seems exceptionally geeky, with cavils over test plans and test lab results, and so on. Or maybe it seems critical of the EAC/NIST testing program. But in fact that test program is incredibly important as a gate through which computing technology must pass before being used to count votes. In a very real sense, the current testing program is just getting started, so perhaps it's not surprising that there are many lessons to learn. And my thanks go to all these TTBR veterans for speaking out to remind us how much there is to learn on the road to excellence, both for voting systems and for the program that tests them.

-- EJS

1 Comment

2 Comments

Tales From Real Life: Testing

Another in our series of real life stories ... how it actually works for real election officials to test a new voting system that they might be adopting for use in the state. The backplot is that New York State has been unwilling to give up its admittedly no-longer-legal* lever machines until the state Board of Elections is confident that it has a replacement that not only meets Federal standards, but is also reliable and meets requirements similar to those met by the old lever machines. There have been several setbacks in the adoption process, but the latest phase is some detailed testing of the candidate systems. (For the real voting tech geeks, what's being tested is a hybrid of the Dominion ImageCast scanner/Ballot Marking Device and the ES&S DS200 scanner with the Automark BMD.)

Bo Lipari is on the Advisory Committee for this process, and has reported in detail on the testing. You don't have to be a complete election geek to scan Bo's tales and be impressed with the level and breadth of diligence, and the kinds of kinks and hiccups that can occur. And of course the reportage is very helpful to us, as a concrete example of what kind of real life testing is needed for any new voting system, open or closed, to be accepted for use in U.S. elections.

-- EJS

* No-longer-legal means that NY state law was changed to require replacement of lever machines. In the initial release of this note I erroneously said that the replacement requirement was driven by HAVA. Thanks again to alert readers (see comments below) for the catch, and for providing many resources on the vexed question of "HAVA compliant" generally and lever machines specifically.

2 Comments

Comment

Gold or Pyrite?

In a couple of prior posts, I explained the "gold copy" or "trusted build" concept, and the role of the EAC, NIST, and test labs. I can't seem to completely bury this tale, because it raised another question about the process of checking a voting system to see if it is legit: "Why is this checking so hard? Is the government doing its job here?" Yes, the government is doing its job, in that voting system products are tested, and in theory the test results include fingerprint data that an election official could use effectively. That's the general idea. Now, I am not saying that the current test lab approach is great. It isn't, and it lacks transparency, among several other defects. Nevertheless, if there were a certified system that you trusted, the independent test process could in fact use some techniques to fingerprint the system tested, so that you could compare a government-certified fingerprint against your system to see if it was exactly the same as the tested system. It's just that with the current voting devices, the checking is impractical and -- if done without some very-difficult-to-achieve auditable physical security measures -- almost meaningless. But that's old news from old posts.
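
To make the fingerprint idea concrete, here is a minimal sketch -- illustrative Python only, with hypothetical file paths and a placeholder certified value, not any actual certification tooling -- of what such a comparison amounts to: hash the software image pulled from the device and compare it to the fingerprint recorded when the system was tested.

```python
import hashlib

def fingerprint(image_path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 fingerprint of a software/disk image, reading it in chunks."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical values: in a real check, the image would be pulled from the voting
# device, and the expected fingerprint would come from the published certification record.
CERTIFIED_FINGERPRINT = "expected-sha256-hex-digest-from-certification-record"
observed = fingerprint("/media/voting-device/boot-image.bin")

if observed == CERTIFIED_FINGERPRINT:
    print("Image matches the certified (gold) build.")
else:
    print("MISMATCH: this is not the software that was tested and certified.")
```

Computing the hash is the easy part; getting at the right image on a sealed device, and knowing it hasn't changed since you looked, is where the trouble lies -- which is the point of the rest of this post.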

The remaining question for today is why current systems are impractical to check. What is it with these existing voting systems? They are not easily validated because in most cases they simply weren't designed for it. Validation was not required, and few thought it important at the time (the post-HAVA gold rush). Then, once a system is built, you really can't go back and tack on a "field validation" feature. The government regulatory groups do not want to "change the rules" on the vendors -- in part for the very good reason that the existing Gold-Rush-era rules (the Voluntary Voting System Guidelines of 2002, the year HAVA was enacted) remain the current rules, with revisions still in process. So for the foreseeable future, there is no pragmatic reason for existing systems to change.

And that brings us back to a frequent theme that shows the real benefit of openness: public benefit. Existing systems weren't built to be validated; it's expensive to re-architect and re-implement the systems; for-profit vendors have no incentive to do so. Hence, trust benefits can only be expected from people working on technology that is delivered and maintained in the public trust by a public interest organization, in order to maintain public confidence.

-- EJS

Comment

Comment

Identifying the Gold, Redux

I recently commented on the specific connection, in the case of the TrustTheVote Project, between open source methods and the issue of identifying a "gold build" of a certified voting system. As a reminder to more recent readers, most states have laws that require election officials to use only those specific voting system products that were previously certified for use in that state -- and not some slightly different version of the same product. But recently I got a good follow-up question: what is the role of the Federal government in this "gold build" identification process? There is in fact an important role, one that is potentially very helpful, and one where openness can magnify the benefit. Here's the scoop. The EAC has the fundamental responsibility for Federal certification, which is used in varying degrees as part of some states' certification. Testing is the main body of work leading up to certification, and it is performed by private companies that have qualified in a NIST-managed accreditation program as official Voting System Test Labs. There are two key steps in the overall process. First, a test lab verifies that it can re-do the "trusted build" process to re-create the soon-to-be "gold" version, confirming that the trusted build did in fact produce exactly the same software that was tested. Then, as the EAC Web site briefly states: "Manufacturer provides software identification tools to EAC, which enables election officials to confirm use of EAC-certified systems."
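
To illustrate what a "software identification tool" might do under the hood, here is a minimal sketch in Python. It is purely illustrative -- the manifest format, file paths, and names are our assumptions, not anything the EAC actually publishes: walk the installed software tree and compare each file's hash against a manifest published for the certified ("gold") build.

```python
import hashlib
import json
from pathlib import Path

def file_hash(path: Path) -> str:
    """SHA-256 of a single file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_against_manifest(install_root: str, manifest_path: str) -> bool:
    """Compare every file named in a certification manifest ({relative path: sha256})
    against the installed software tree; report mismatches and return overall result."""
    manifest = json.loads(Path(manifest_path).read_text())
    all_ok = True
    for rel_path, expected_hash in manifest.items():
        installed_file = Path(install_root) / rel_path
        if not installed_file.is_file():
            print(f"MISSING:  {rel_path}")
            all_ok = False
        elif file_hash(installed_file) != expected_hash:
            print(f"MODIFIED: {rel_path}")
            all_ok = False
    return all_ok

# Hypothetical paths: the installed voting software tree and a manifest that a
# certifying authority might publish for the tested ("gold") build.
if verify_against_manifest("/opt/voting-system", "gold-build-manifest.json"):
    print("Installed software matches the certified build manifest.")
```

Openness magnifies the benefit here: with published source and a reproducible trusted build, anyone -- not just the manufacturer -- could re-derive such a manifest and check it independently.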

But here is the fly in the ointment: for your typical PC or server, this is not easy! And the same is true for current voting systems. Yes, you could crack open the chassis, remove the hard drive, examine it as the boot medium, re-derive a fingerprint, and compare the fingerprint to something on the EAC web site. But in practice this is not going to happen in real election offices, and in any case it would be fruitless -- even if you did, you would still have no assurance that the device in the precinct remained the same as the gold build, because the boot media can be written after the central office tests the device but before it goes into use in a polling place.

That's quite an annoying fly in the ointment, but it doesn't have to be that way. In fact, for a carefully designed dedicated system, the fingerprinting and re-checking can be quite feasible -- and that applies to carefully made voting systems too, as we've previously explained. Such carefully made voting systems would be a real improvement in trustworthiness (which is why we're building them!), but they aren't a silver bullet, since you can never 100% trust the integrity of a computing system. That's why vote tabulation audits are an important ingredient, and why I periodically bang on about auditing in election processes.

-- EJS

Comment

Comment

Open and Secret?

Scanning the news last week, I found rumors of Premier Systems (the voting system vendor formerly known as Diebold) going open-source, and of the Federal government pondering cases where voting system test results should be confidential. An interesting juxtaposition! The first item I call a rumor not because I disbelieve the blogger in question, but because Premier hasn't announced anything. But the blog article does contain some interesting stuff, including a paraphrase of Premier's CEO opining that releasing source code to the public would be an approach that results in several beneficial outcomes.

At the other end of the spectrum we have some news from a recent meeting of the EAC's standards committee, including discussion of the new Voluntary Voting System Guidelines draft, intended for use in Federal certification testing of products like those from Premier. The draft would require that the results of the testing process include documentation of all attacks the system is designed to resist or detect, as well as any vulnerabilities known to the manufacturer. Some committee members pondered whether the vendors should mark this information as confidential.

Some observers have questioned whether it would be appropriate to certify a system that has known security vulnerabilities. Others have pointed out that every system has vulnerabilities, and that the important issue is to be clear about the definition and boundaries of security, sometimes called a "threat model." Within those defined limits, customers should be very clear about what deployment and operational procedures they need to adhere to in order to maintain the integrity of the system as defined by the vendor; beyond those limits, caveat emptor.

We might take up that debate another day. But for now, the obvious old adage applies: you can't have it both ways. If a system is truly open, then there are no secrets -- though you can try hushing up issues that can be readily discovered by directly inspecting the open system. However, I think there is a case to be made that there really are no secrets, especially for systems that are important enough that people really do want to know "whether it really works." Next time, a couple of specific examples from recently published voting machine security studies that have put me in mind of another saying: "Open, Sesame!"

-- EJS

Comment

Comment

Arizona: a New Definition of "Sufficiently" Mis-Counted?

There's a fascinating nugget inside a fine legal story unfolding in Arizona. I know that not all our readers are thrilled by news of court cases related to election law and election technology, so I'll summarize the legal story in brief, and then get to the nugget. The Arizona Court of Appeals has been working on a case that considers this interesting mix:

  • The State's constitutional right of free and fair elections;
  • The recognition that voting systems can mis-count votes;
  • The idea that a miscounted election fails to be fair;
  • The certification for use in AZ of voting system products that had counting errors before;
  • The argument over whether certified systems can be de-certified on constitutional grounds.

For the latest regular press news on the case, see the Arizona Daily Star's article "Appeals court OKs group's challenge to touch-screen voting."

Now let's look at what Judge Philip Hall actually said in the decision (thanks to Mark Lindeman for trolling this out). The judge refers to a piece of AZ law, A.R.S. § 16-446(B)(6), which says: "An electronic voting system shall . . . [w]hen properly operated, record correctly and count accurately every vote cast." That "every" is a pretty strong word! Judge Hall wrote:

We conclude that Arizona’s constitutional right to a "free and equal" election is implicated when votes are not properly counted. See A.R.S. §16-446(B)(6). We further conclude that appellants may be entitled to injunctive and/or mandamus relief if they can establish that a significant number of votes cast on the Diebold or Sequoia DRE machines will not be properly recorded or counted.

As election-ologist Joe Hall pointed out, "Of course, I'm left wondering 'what is significant?' here. Sounds like a question we'll hear a lot about in the future of this case!" Indeed we will. Of course, neither AZ law nor the legal ruling provides a prescription for "significant," but note also that "significant" may be a relative concept, depending on how close a race is. (Thanks again to Mark Lindeman for the point.) We know it's pretty easy for today's voting systems to miscount modest numbers (hundreds) of votes and escape the notice of humans; and we know that contests that close will occur. Does that mean we can't use these voting systems?
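
To make the relativity of "significant" concrete, here's a toy bit of arithmetic -- entirely hypothetical numbers, not drawn from any real contest: a miscount in the hundreds is noise in a landslide, but it can swallow the whole margin in a close race.

```python
# Illustrative arithmetic only -- hypothetical numbers, not from any real contest.
votes_a, votes_b = 100_150, 99_850      # reported totals in a close two-way contest
margin = abs(votes_a - votes_b)         # a 300-vote reported margin

possible_miscount = 400                 # a "modest" error, in the hundreds of votes

# One simple framing: an error is outcome-significant once it is at least as large
# as the reported margin (and shifted votes can matter at even smaller sizes,
# since each vote shifted between candidates moves the margin by two).
if possible_miscount >= margin:
    print(f"A {possible_miscount}-vote error could change a {margin}-vote outcome.")
else:
    print(f"A {possible_miscount}-vote error could not, by itself, overcome a {margin}-vote margin.")
```

With those made-up numbers, a few hundred miscounted ballots is the whole ballgame -- which is exactly why "significant" can't be a fixed number.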

I guess the argument is going to continue, both on "significant" in Hall's decision, and on "properly operated" in the AZ law. And as we saw in Humboldt County and many other places, "operator error" is often in the eye of the beholder.

-- EJS

Comment