To our elections official stakeholders: Chief Technology Officer John Sebes covers a point that seems to be popping up in discussions more and more. There seems to be some confusion about what "open source" means in the context of software used for election administration or voting. That's understandable, because some election I.T. folks, and some current vendors, may not be familiar with the prior usage of the term "open source" -- especially since it is now used in so many different ways to describe (variously) people, code, legal agreements, etc. So, John hopes to get our Stakeholders back to basics on this.
This evening at 5:00pm members of the TrustTheVote Project have been invited to attend an elections technology round table discussion in advance of a public hearing in Sacramento, CA scheduled for tomorrow at 2:00pm PST on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations. Due to the level of activity, only our CTO, John Sebes, is able to participate.
We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems. These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.
Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project
We appreciate an opportunity to participate in this important discussion. We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.
For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point. Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model. This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.
We are collaborating with NIST on a way to update this outdated model with a "component-ized" approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system. There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.
We're glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California. Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.
Second, there is a prerequisite for component-level certification that until recently wasn't available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot counter's output of vote tally data that also serves as input to a tabulator component. Without common data formats, elections officials have to acquire a whole integrated product suite that communicates in a proprietary manner. With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as "forklift upgrades" or "fleet replacements."
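To make the idea concrete, here is a minimal sketch of what component interoperability via a common data format might look like. The field names and structure below are invented for illustration; they are not drawn from any actual IEEE or NIST election data standard.

```python
import json

# Hypothetical illustration only: the field names here are invented,
# not taken from any real election data format standard.

# A ballot counter emits its vote totals in a shared, documented format...
counter_output = {
    "device_id": "counter-07",
    "precinct": "Downtown-001",
    "contest": "Mayor",
    "tallies": {"F. Flintstone": 412, "B. Rubble": 390},
}
serialized = json.dumps(counter_output)

# ...and a tabulator from a different vendor can consume it, because both
# sides implement the same published format rather than a proprietary one.
def consolidate(tally_documents):
    totals = {}
    for doc in tally_documents:
        record = json.loads(doc)
        for candidate, count in record["tallies"].items():
            totals[candidate] = totals.get(candidate, 0) + count
    return totals

print(consolidate([serialized]))  # {'F. Flintstone': 412, 'B. Rubble': 390}
```

The point is not the particular syntax; it is that once the format is published, either side of the exchange can be replaced independently.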
The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem. And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote. They deserve congratulations for reaching this imperative state of data standards détente. It's not finished, but the effort and momentum is there.
So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.
3. Commercial Revitalization
Finally, this may be the opportunity to realize a vision we have that open data standards, a new certification process, and lowered bars to innovation through open sourcing, will reinvigorate a stagnant voting technology industry. Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation. Such should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.
One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery. To that point, let us offer an important clarification and dispel a misconception about "rolling your own." This does not mean that elections officials are about to be left to self-vend; that is, to self-construct and support their own open, standard, commodity voting system components. A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice rather than forcing you to become a do-it-yourself type. Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software. Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.
Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation. To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.
In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.
EDITORIAL NOTES: There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified, non-certified voting systems a possibility in California. We don't read SB-360 that way at all. We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest. In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot, and should not, do away with certification, and we think charges that this is what will happen are misinformed. We simply need to modernize how certification works to enable this kind of innovation. We think our comments today bear that out.
Moreover, have a look at the Agenda for tomorrow's hearing on implementation of SB-360. In sum and substance the agenda is to discuss:
- Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
- Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
- Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.
Finally, there has been an additional charge that SB-360 is intended to "empower" LA County, such that what LA County may build, they (or someone on their behalf) will sell as voting systems to other jurisdictions. We think this allegation is also misinformed, for two reasons. First, assuming LA County builds their system on open source, there is a question as to what specifically would or could be offered for sale. Second, notwithstanding that open source can technically be offered for sale, it seems to us that if such a system is built with public dollars, then it is, in fact, publicly owned. From what we understand, a government agency cannot offer for sale assets developed with public dollars, but it can give them away. And indeed, this is what we've witnessed over the years in other jurisdictions.
In a recent posting, I recalled the old-fashioned traditional proprietary-IT-think of vendors leveraging their proprietary data for their customers, and contrasted that with election technology, where the data is public. In the "open data" approach, you do not need integrated reporting features as part of a voting system or election management system. Instead, you can choose your own reporting system, hook it up to your open database of election data, and mine that data for whatever reports you want. And if you need help, a few days of a reporting-systems consultant's time can get you set up quite quickly. The same applies to what we used to call "ad hoc querying" in the olden enterprise IT days, and what now might be called "data mining." Every report is the result of running one or more database queries and formatting the results. When you can create a new report template ad hoc, an ad hoc query is really just a new report. With the open-data approach, there is no need to buy additional "modules" from a voting system vendor in order to do querying, reporting, or data mining. Instead, you have ready access to the data with whatever purpose-built tools you choose.
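As a concrete illustration of the point that a report is just a query plus formatting, here is a minimal sketch using an invented schema and made-up numbers; it is not the schema of any real election management system.

```python
import sqlite3

# Sketch of the "open data" idea: election results live in an ordinary,
# openly documented database, so any tool can query it directly.
# The schema and numbers below are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE vote_totals (
    precinct TEXT, contest TEXT, candidate TEXT, votes INTEGER)""")
conn.executemany(
    "INSERT INTO vote_totals VALUES (?, ?, ?, ?)",
    [("Downtown-001", "Mayor", "Flintstone", 412),
     ("Downtown-001", "Mayor", "Rubble", 390),
     ("Quarrytown-002", "Mayor", "Flintstone", 301),
     ("Quarrytown-002", "Mayor", "Rubble", 355)])

# An "ad hoc query" is just a report you hadn't templated yet:
rows = conn.execute("""
    SELECT candidate, SUM(votes) FROM vote_totals
    WHERE contest = 'Mayor' GROUP BY candidate ORDER BY SUM(votes) DESC
""").fetchall()
print(rows)  # [('Rubble', 745), ('Flintstone', 713)]
```

Any off-the-shelf reporting package works the same way underneath: it issues a query like this one and formats the result.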
Today, I want to underline that point as applied to mobility, that is, the use of apps on mobile devices (tablets, smart phones, etc.) to access useful information in a quick and handy on-the-go small-screen form factor. Nowadays, lots of folks want "an app for that" and election officials would like to be able to provide one. But the options are not so good. A proprietary system vendor may have an app, but it might not be what you had in mind; and you can't alter it. You might get a friendly government System Integrator to crack open your proprietary voting system data and write some apps for you, but that is not a cheap route, either.
What, in contrast, is the open route? It might seem a detour to get you where you want to go, but consider this. With open data, there is no constraint on how you use it, or what you use it with. If you use an election management system that has a Web services API, you can publish all that data to the whole world in a way that anyone's software can access it -- including mobile apps -- and including all the data, not just what happens to be available in a proprietary product's Web interface. That's not just open source and "open data" but also "complete data."
Then for some basic apps, you can get friendly open-gov techies to make something simple but effective for starters, and make the app open source. From there on out, it is up to the ingenuity of the tens of thousands of mobile app tinkerers and good government groups (for an example, read about one of them here, and then try the app yourself) to come up with great ideas about how to present the data -- and the more options there are, the more election data, the public's data, gets used for the public good.
I hope that that picture sounds more appealing than closed systems. But to re-wind to Proprietary Election Technology Vendors' (PETV) offerings to Local Election Officials (LEO), consider this dialogue as the alternative to "open data, complete data."
LEO: I'd like to get an election data management solution with flexible reporting, ad hoc querying, a management dashboard, a nifty graphical public Web interface, and some mobile apps.
PETV: Sure, we can provide it. We have most of that off the shelf, and we can do some customization work and professional services to tailor it to your needs. Just guessing from what you asked for, that will be $X for the software license, $Y per year for support, $Z for the customization work, and we'll need to talk about yearly support for the custom stuff.
LEO: Hmmm. Too much for me. Bummer.
PETV: Well, maybe we can cut you a special deal, especially if you lower your sights on that customization stuff.
LEO: Hmmm. Then I'm not really getting all I asked for, but I am getting something I can afford. ... But will you all crack open your product's database with a Web services API so that anybody can write a mobile app for it, for any mobile device in the world?
PETV: Wow! That would be some major customization. I think you'll find our mobile app is just fine.
LEO: What about cracking open the database so I can use my choice of reporting tools?
PETV: Ah, no, actually, and I think you'll find our reporting features are really great.
I'll stop the dialogue (now getting painful to listen to) and actually stop altogether for today, leaving the reader to contrast it with the open-data, complete-data approach of an open election data management system with core functions and features, basic reporting, basic mobility, and above all the open-ness for anyone to data-mine or mobilize the election data that is, in fact, the people's information.
During some recent election technology adoption discussions, I've realized how some standard proprietary-IT-think has affected acquisitions of election technology. And it is a mind-set that I used to have too, back when I was in the enterprise IT infrastructure business. Back then, the normal thing was to have a core technology with some primary value, a road map of a couple major extensions of the core technology, and a product roadmap for adding functions and features. Of course we wanted our customers to want more of our stuff as time went by, and we wanted to support our pricing model with customer options for this growing set of features.
And one more-or-less knee-jerk response was an expanding feature set for "reporting." The idea was familiar: the vendor lets you, the customer, use their software; the software builds up a valuable base of information (a proprietary information base) about its history of use and what it can tell you about your IT usage; so the software should be able to prepare you reports that tell you various kinds of juicy information nuggets. And the big assumption was that only that software had the smarts to do so.
And that went double for the cases where a few "reports" were small enough in scope but commonly enough used that it was better to present a handful of them as graphics on a single administrative screen. Thus, the "management dashboard" and new spin on higher product value.
Rewind to the present day, and I find it curious that this mindset is still around, including among adopters of election technology. But in election-land, there is a huge missing concept here: inside of election technology, the data is not proprietary, not specific to a vendor. Sure, a closed-system vendor may make data format(s) proprietary, but the data of elections, contests, candidates, ballots, voters, vote totals -- all that and more is by rights public data.
Now, here is the "open" factor: In an open system, all that public data is freely available. Anyone, or anyone's code, can access the data. Take the example of the TTV Election Manager and TTV Tabulator working to consolidate vote counts. The Election Manager's database is an ordinary database with a public schema. If an election official wants some specific reports generated, it is only one option to ask for Election Manager or Tabulator features to slice and dice the data and prepare nifty tables and graphics. And it is tempting to want that in the same Web application interface of the Election Manager. That temptation is underlined because existing proprietary EMSs do have the "you can only get reports from me" concept -- though seemingly to not able please all users with one set of limited reporting features.
But a better option is to recognize that all the data is there already, sitting in a publicly documented database which can be accessed directly by any purpose-built reporting system. Get the reporting system of your choice -- there are tons of them, ranging from the grand-daddy of them all, Crystal Reports (now offered by software giant SAP), to the reporting offering of the venerable open-source GNU project. Hook up the reporting system to your database of election data (yes, that can be a real election management database in the picture above), and design and generate reports to your heart's content. And even better: a purpose-built reporting package probably has many more handy features than either a product manager or a customer of a voting system product would think of.
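To illustrate the hook-up step, here is a minimal sketch of pulling data out of a documented election database and exporting it as CSV, a lowest-common-denominator format that virtually any reporting package can ingest. The table, columns, and numbers are invented; this is not the actual TTV Election Manager schema.

```python
import csv
import io
import sqlite3

# Sketch: because the election database schema is publicly documented,
# a separate, purpose-built reporting tool can pull data straight out of it.
# Table name, columns, and figures here are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE precinct_results (
    precinct TEXT, contest TEXT, candidate TEXT, votes INTEGER)""")
db.executemany("INSERT INTO precinct_results VALUES (?, ?, ?, ?)", [
    ("QuarryCounty-003", "Quarry Commissioner", "B. Rubble", 122),
    ("Quarrytown-002", "Quarry Commissioner", "B. Rubble", 97),
])

# Export to CSV -- a format that essentially every reporting package,
# from Crystal Reports to spreadsheets, can consume directly.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["precinct", "contest", "candidate", "votes"])
writer.writerows(db.execute("SELECT * FROM precinct_results"))
print(buf.getvalue())
```

The export step is deliberately boring: when the schema is public, getting the data into your tool of choice is a one-time plumbing job, not a vendor negotiation.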
And that's the power of "open data": using the best tool for each job -- an election data management system to manage election data, a voting system to collect votes, and a reporting system to generate a wide variety of customizable reports. And that power creates options and trade-offs, which are essential in funding-constrained U.S. election-land. It's tempting to want one vendor to have a completely integrated product for everything, but it may well be more cost-effective -- and ultimately more useful -- to have a collection of packages, each of which gives you the best bang for the buck for each task you need automated.
PS: Next time on "Detours" -- mobile computing as another example of a detour from traditional proprietary-IT-think in election-land.
Continuing our Bedrock election story (see parts one, two, and three if you need to catch up), we find the County of Bedrock Board of Elections staff, including design guru Dana Chisel, in the "ballot design studio," a dusty back room of the BBoE. Chisels in hand, staffers ponder the blank slate, or rather sandstone, of sample ballot slabs on easels. With the candidate and referendum filing periods closed and the election only a couple weeks away, it's time to make the ballots.
Now, you might think that the ballot consists of the 3 items we know of - the race for Mayor, the race for Quarry Commission, and the question on the quarry fee. However, recall that each precinct in Bedrock County has a distinct set of districts. In this election, each precinct has a distinct ballot with a distinct set of contests corresponding to the districts that the precinct is part of. At a first cut, the contests by precinct are:
- Downtown-001: the contest for mayor, and the referendum on quarry fees;
- Quarrytown-002: the contests for mayor and quarry commissioner, and the referendum on quarry fees;
- QuarryCounty-003: the contest for quarry commissioner, and the referendum on quarry fees;
- County-004: the referendum on quarry fees.
You'll note that only Town residents -- in Precincts 1 or 2 -- are entitled to vote for mayor, while residents of the Mineral District -- in Precincts 2 or 3 -- are the only voters entitled to vote for Quarry Commissioner. Last, all voters in the county are eligible to vote on county revenue issues such as taxes and fees imposed by the county.
That, plus the list of candidates and the text of the referendum, comprise what might be called the content of each of the 4 ballots, or the ballot configuration. But the ballots themselves need to be designed: the ballot items have to appear in some order, and the candidates likewise; the ballot items have to be arranged in some visual design, vertically or horizontally, with sufficient space between each, fitting the size of ballot slates that they will be etched on … and so on.
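The ballot-configuration logic described above can be sketched in a few lines: each contest belongs to one district, and a precinct's ballot contains every contest whose district includes that precinct. The data below restates the Bedrock example; the code itself is only an illustration, not part of any real system.

```python
# Each precinct belongs to a set of electoral districts (per the story).
precinct_districts = {
    "Downtown-001":     {"Town", "County"},
    "Quarrytown-002":   {"Town", "County", "Mineral"},
    "QuarryCounty-003": {"County", "Mineral"},
    "County-004":       {"County"},
}

# Each ballot item belongs to exactly one district.
contest_district = {
    "Mayor": "Town",
    "Quarry Commissioner": "Mineral",
    "Quarry Fee Referendum": "County",
}

def ballot_content(precinct):
    """A precinct's ballot content: every contest whose district
    contains the precinct."""
    return sorted(c for c, d in contest_district.items()
                  if d in precinct_districts[precinct])

print(ballot_content("County-004"))  # ['Quarry Fee Referendum']
```

Running this for all four precincts reproduces the per-precinct contest lists above, which is the point: ballot configuration is derivable data, not hand-assembled content.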
So, armed with chisels, the proverbial blank slate, and several tablets stating the legal requirements for contest and candidate order, design guru Dana Chisel marks out a prototype ballot containing all the requisite ballot content, laid out according to usability principles known since the Stone Age (left-justified text, instructions separate from content, instructions with simple words along with pictures, and more). After a few tries and consultation with their boss Rocky, they have a design model for each of the 4 ballots. The next steps are usability testing with volunteer voters, and using the results to create the final slabs that serve as the model for each ballot style. Then they're ready for mass reproduction of ballots for the upcoming election -- get those duplidactylsaurs into action!
Now, you might think that they're ready for election day, but wait, there's more, including the preparation of pollbooks, and then early voting, and then eventually election day operations.
Next time: Pollbooks and Early Voting
Sorry we've been away from the podium here for a couple of weeks. We're heads-down on some very exciting projects. But not nearly as exciting as what I have to announce today. Let's get right to it.
The time has come. Some might argue it’s overdue. Growth of the activities and work here, and the need for speed in advancing the agenda of open source elections technology triggers today’s announcement:
The OSDV Foundation Leadership Team is growing, and we're officially recruiting for a new Chief Executive Director.
The search is on, and we want your help in locating an absolute “A-player” to lead the next level of growth for the Open Source Digital Voting Foundation.
“Wait a minute,” you say. “Wait a minute! Doesn’t the Foundation already have an Executive Director… or actually like two of them?” Oh, definitely—you're right, two of them. John Sebes and myself, co-founders and co-executive directors (as mandated by the Foundation’s by-laws), have been tirelessly leading and managing this 4-year effort since Day 1 with the generous support and advice of our Board.
We have also been managing all aspects of Foundation development (read: funding) and technology work (e.g., the TrustTheVote Project). And the workload has become overwhelming. We each now need to focus on our particular domain expertise in order to sustain and accelerate the momentum the TrustTheVote Project is gaining.
So, it is time for both of us to narrow our respective scope of efforts. For myself, this means focusing on stakeholder community development, public outreach, adoption and deployment, and strategic alliances and backing. In the commercial world, this might be akin to the kind of role I’ve played in the tech sector for about 1/2 of my career: running marketing and business development.
For John, this means the heavy responsibility for leading the core mission of the non-profit: open source elections technology design and development efforts. This is aligned with his commercial world experience: as an engineering manager and chief technology officer.
What’s left are all of the activities associated with day-to-day operational leadership, to effectively manage and grow the Foundation. This includes executive leadership in major fund raising from all sources, accounting, finance, administration, legal affairs, and public relations. In the commercial world, it is a CEO role. In other words, with the growth in activities and work, the leadership team must expand and bring in the right talent to take this to the next level.
We’ve successfully been managing what essentially amounts to nearly a $1.0M operation; a tiny start-up by commercial comparison, but significant by some non-profit comparisons. We realize that we must now elevate this to a $7-10M annual operation in order to maintain the momentum we’re generating and be the kind of change agent for public elections integrity and trust according to our Charter.
And we’re experienced enough to appreciate that neither of us is well suited to provide that non-profit leadership and somehow keep doing what we do best.
The details of technology architecture and building the stakeholder community are more than full-time efforts alone. To be sure, both John and I have managed commercial technology operations greater than $10M per year (but in those cases had staffing and resources commensurate with the size of the operation). However, the nuances of a non-profit operation, its methods of funding, and the need for our acquired domain expertise in elections technology flat out prohibit us from trying to do it all any longer.
So, Here We Go.
We’ve uploaded a position description on the TrustTheVote Wiki. You will find it here. And there is a companion document that provides some background, here. We’ve engaged with our Board, an Executive Recruiter, and our advisers to expand the search.
With today’s announcement, we look to you, our backers, supporters, stakeholders, and other interested onlookers to join in the search for our ideal candidate to lead this exciting and important project blending the best in technology innovation with the imperative agenda of “critical democracy infrastructure.”
And it’s a helluva lot of fun working to be the change agent for accuracy, transparency, verification, and security of public elections technology in a digital age. To be sure, there's a bunch of great stuff going on here: the digital poll book project based on the Apple iPad; the election night reporting system project using open data and web services distribution; work with the federal Election Assistance Commission on component-level certification for the open source Tabulator we're building; and working with the IEEE 1622 Standards Group on our proposed standard for open election data formats.
Please spread the word; the search is ON. If you know of an ideal candidate, or even think you might be one yourself, we want to hear from you. Ping us. You can also drop a note to "edsearch" sent to our Foundation web site domain.
At the end of our last visit to the fictional Town of Bedrock, we left Fred as he applied to run for mayor. Now we'll continue the story, but with a focus on Bedrock itself, in order to continue building up a detailed, yet simplified, account of actual U.S. election practice. The focus is on Bedrock rather than its colorful denizens, because the answer to the current question -- can Fred be a candidate for mayor in the upcoming election? -- lies partly in the details of Cobblestone County and the Town of Bedrock, and how they are structured and administered for elections. At a first glance of the Cobblestone County map, you'll see that the Town of Bedrock is entirely in Cobblestone County, dividing the county into two regions: the part that is incorporated in the Town, and the unincorporated portion.
Look a bit more closely, though, and you'll see the Mineral District -- not a town but a political division called an electoral district (in some states in the U.S., called a jurisdiction rather than a district). The Mineral District is the part of the county that's affected by quarrying operations at the Bedrock Quarry, and the Bedrockites who live there get to elect the Quarry Commission to regulate the Quarry. Look a bit more carefully and you'll notice that part of the Mineral District is in the Town of Bedrock, and the rest is in the unincorporated county.
To keep our Election Tale simple, that's almost all of the electoral structure of Cobblestone County that is the jurisdiction of the Bedrock BoE. The remaining part may be a bit more familiar: the precincts. Each precinct is a region in which all of the voters are entitled to vote on exactly the same ballot items; put another way, in one precinct all of the voters reside in the exact same set of electoral districts. So in Bedrock County, there are 4 precincts:
- The "Downtown-001" precinct, part of two districts: the district of the Town of Bedrock, and the district for Bedrock County;
- The "Quarrytown-002" precinct, part of those same two districts, plus the Mineral District;
- The "QuarryCounty-003" precinct, part of the Mineral District and the County;
- The "County-004" precinct, part of just the district for the County.
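The rule that a precinct is exactly a group of voters sharing one set of electoral districts can be sketched as a simple grouping operation. The voters below are invented for illustration (Perry Gravelite is not part of the story); the code is only a sketch of the principle.

```python
# Each voter's residence places them in some set of electoral districts.
# Fred and Barney both live in Town + County + Mineral territory, per the
# story; Perry Gravelite is an invented voter in the unincorporated county.
voters = {
    "Fred Flintstone":  frozenset({"Town", "County", "Mineral"}),
    "Barney Rubble":    frozenset({"Town", "County", "Mineral"}),
    "Perry Gravelite":  frozenset({"County"}),
}

def group_into_precincts(voter_districts):
    """Group voters whose district sets are identical -- each distinct
    district set corresponds to one precinct (one ballot style)."""
    precincts = {}
    for voter, districts in voter_districts.items():
        precincts.setdefault(districts, []).append(voter)
    return precincts

groups = group_into_precincts(voters)
# Two distinct district sets -> two precincts needed for these voters.
print(len(groups))  # 2
```

In real election administration the grouping runs the other way (precinct boundaries are drawn, and voters fall into them), but the invariant is the same: one precinct, one set of districts, one ballot style.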
Looking a little more carefully, you'll notice the Flintstone residence is in the Quarrytown-002 precinct, which means the Flintstones (or at least those of them who are registered voters) are eligible to run for offices in either the Town or the Mineral District. To say that more generally, in order to be eligible to run for an office, you have to reside in the district that the office is part of. Fred wants to run for Mayor of the Town of Bedrock, so he has to reside in the Town of Bedrock.
Back at the BBoE, Rocky has completed the eligibility check for Fred, having ensured that:
- he resides in the Town of Bedrock,
- he is registered to vote,
- his current address matches the address in his voter record,
- he is not serving jail time,
and perhaps some other eligibility requirements in Stone Age election law that we are not aware of. Fred is satisfied to find that on the Bedrock slab-site's Upcoming Election slab, he is listed as a candidate for mayor. However, there is also a bit of a surprise: his neighbor Betty Rubble is running against him! And also Barney Rubble is running for Fred's old Quarry Commission seat. Also, the commission's clerical errors seem to have been resolved, and the quarry fee referendum will be on the ballot. With a few more days of filing time left, an irritated Fred ponders who lives in the Mineral District who might be convinced to run against Barney.
Next Time: it's time for ballot design - get out your Chisel!
Thanks to some excellent recent presentations by EAC folks, we have today a pleasant surprise of an update to our recent blogs Voting System Decertification: A Way Forward (in Part 1 and Part 2). As you might imagine with a government-run test and certification program, there is an enormous amount of detail (much of it publicly available on the EAC web site!) but Greg and I have boiled it down to a handful of point/counterpoints. Very short version: EAC seems to be doing a fine job, both in the test/certification/monitoring roles, and in public communication about it. At the risk of oversimplifying down to 3 points, here goes:
1. Perverse Incentive
Concern: The affected version of ES&S's Unity product would be de-certified as a result of EAC's investigation into functional irregularities documented in Cuyahoga County, Ohio, by erstwhile elections director Jane Platten (kudos to Cuyahoga). With a more recent version of the product just certified, the "fix" might be for customers of the older version to upgrade to the latest one, with unexpected time and unbudgeted upgrade expense to customers, including license fees. If so, then the product defect, combined with de-certification, would actually benefit the vendor by acting to spur customers toward paid upgrades. Update: Diligent work at EAC and ES&S has resulted in ES&S providing an in-place fix to the affected Unity version, so that EAC doesn't have to de-certify the product, and customers don't have to upgrade. In fact, as one recent result of EAC's work with Cuyahoga County, the county was able to get money back from the vendor because of the issues identified.
Next Steps: We'll be waiting to hear whether the fix is provided at ES&S's expense (or at least at no cost to customers), as appears may be the case. We'll also be watching with interest the process by which the fixed version goes through test and certification to become legal for real use in elections. As longtime readers know, we've stressed the importance of the emergence of a timely re-certification process for products that have been certified, need a field update, and need the previously used test lab to test the updated system with testing that is as rigorous as the first time, but less costly and more timely.
2. Broken Market
Concern: This situation may be an illustration of the untenable nature of a market that would require vendors to pay for expensive testing and certification efforts, and to also have to forego revenue when otherwise for-pay upgrades are required because of defects in software.
Update: By working with the vendor and its test lab on both the earlier and current versions of the product, the EAC has ensured that all customers will be able to obtain a no-fee update of their existing product version, rather than being required to do a for-fee upgrade to a later product version. Therefore, the "who pays for the upgrade?" question applies only to those customers who actually want to pay for the latest version.
Next Steps: Thanks to the EAC's new practice of publishing timelines for all product evaluations, it should be possible to compare the timeframe for the original Unity 3.2.0.0 testing, the more recent Unity 3.2.1.0 testing, and the testing of the bug-fixed version of 3.2.0.0. We can hope that this case demonstrates that a re-certification process can indeed be equally rigorous, less costly, and more timely.
3. Lengthy Testing and Certification
Concern: The whole certification testing process costs millions and takes years for these huge voting system products of several components and dozens of modules of software. How could a re-test really work at a tiny fraction of that time and cost?
Update: Again, thanks to those published timelines, and with the experience of recent certification tests, we can see the progress that EAC is making toward its goal that an end-to-end testing campaign for a system take less than 9 months and a million dollars, perhaps even a quarter or a third less. The key, of course, is that a system be ready for testing. As we've seen with some of the older systems that simply weren't designed to meet current standards, and weren't engineered with a rigorous and documented QA process that could be disclosed to a test lab to build on, well, it can be a lengthy process -- or even one that a vendor withdraws from in order to go back and do some re-engineering before trying again.
Next Steps: A key part of this time/cost reduction is EAC's guidance to vendors on readiness for testing. That guidance is another relatively recent improvement by EAC. We can hope for some public information in future about how the readiness assessment has worked, and how it helped a test process get started right. But even better, EAC folks have again repeated a further goal for time/cost reduction: moving to voting system component certification, rather than certifying the whole enchilada -- or perhaps I should say, a whole enchilada, rather than the whole plato gordo of enchiladas, quesadillas, and chile relleno, together with the EMS for it all with its many parts - rice, frijoles, pico de gallo, fresh guacamole ... (I detect an over-indulgence in metaphor here!)
One More Thing: As we've said before, we think that component-level testing and re-testing is the Big Step to get the whole certification scheme into a shape that really serves its real customers - election officials and the voting public. And we're proud to jump in and try it out ourselves -- work with EAC, use the readiness assessment ourselves, do a pilot test cycle, and see what can be learned about how that Big Step might actually work in the future.
Today, we'll continue our illustrative story of elections -- and as in the first installment of the story, we'll keep it simple with the setting in the Town of Bedrock. As we tune in, we find Fred Flintstone in downtown Bedrock at the offices of Cobblestone County's Bedrock Board of Elections (BBoE). He's checking up on the rumor that Mayor Flint Eastrock has resigned, and that there is a Special Mayoral Election scheduled. When he asks BBoE staffer Rocky Stonerman, Rocky replies, "Of course Fred! Just check the BBoE's public slab-site." Going back outside the BBoE offices, he checks the public slab-site, and sure enough there is a newly posted slab announcing the election. Fred tells Rocky he'd like to run, and Rocky explains how Fred needs to apply as a candidate, and what the eligibility rules are. "Fred, I'll tell you straight up, don't bother to fill out the application, you're not eligible because you're a Quarry Commissioner. If you want to run for Mayor, you'll need to resign first, and then apply as a mayoral candidate."
"Yabba dabba doo - that's what I'm here to do!" Forms and formalities of resignation then taken care of, Rocky gives Fred an application slab and chisel, and then grabs a chisel and runs out to update the Upcoming Elections slab page to include information about the contest for Quarry Commission, Seat #2. In the meantime, Fred has finished his application, and hands it in when Rocky returns. "Did you put me on the candidate list?"
"Of course not, Fred. We have to process your application! Best bet is to come down tomorrow -- I'm going to have to pull your voter record from our voter record tablet-base system. And I've got to tell you, it'll take a while -- the VRTB has thousands of records. We're still running on an unsupported old ScryBase system! Wish we had funding to upgrade, but not so far. Petro tells me we should look at an open-stone MyScryql system, and …"
Not so interested in Rocky and Petro's slabs and tablets, Fred interrupts, "And what about that referendum?" Rocky replies, "Oh yes! The Quarry Courier brought over an application yesterday, but it didn't have all the commissioners' signatures on the application. You probably want to get the commission to fix that -- unless you want to go the petition route, though you'd need 300 signatures and frankly I don't know if our TBMS has room, because …"
Having heard more than enough about stone-age election technology for one day, Fred beats a hasty retreat. Tune in for the next installment, to find out if Fred actually gets to run for mayor.
You'll often find the term "open source" here, used to describe either the source code for software, or the license that allows you to take that source code and use it. But "open data" is just as important. A recent New York Times article read almost like I would have said it, starting with "It's not boring, really!" or to be precise, the title "This Data Isn’t Dull. It Improves Lives". NYT's Richard H. Thaler starts on exactly the right point:
Governments have learned a cheap new way to improve people’s lives. Here is the basic recipe: Take data that you and I have already paid a government agency to collect, and post it online in a way that computer programmers can easily use. Then wait a few months. Voilà! The private sector gets busy, creating Web sites and smartphone apps that reformat the information in ways that are helpful to consumers, workers and companies.
That's exactly the approach to election open-data that we're taking in the next steps of election data management at TrustTheVote. Right now, I have to admit that the current set of election data might actually be fairly boring unless you have an interest in ballot proofing or electoral districting (which we'll get to in a couple more installments in our Bedrock series). But the next step might be more interesting: combining that data with election-result information, which up to now we've managed only in the context of the TTV Tabulator and some common data formats that we're working on with some help from EAC and NIST.
But by adding election result data back into the election definition data, we get the next cool part: a new TTV component that is like the current Election Manager (which would remain deployed privately within a BoE or state), but with only the ability to publicly provide election and election result data via a Web services API. That, in turn, becomes the back end for an election night reporting system Web site and smartphone app.
But perhaps just as important, that API would be publicly accessible to any software, including 3rd party sites and apps, as Mr. Thaler points out. Right now, most election definition data and result data is locked up in the EMS of proprietary voting system products, with a sliver of it published in human-oriented reports and sometimes web content. A new TTV component for Web publication of that information would be a fine first step, but by itself, the data would still be limited to availability in whatever form (however broad) the human-oriented Web interface provides. The really key point, instead, is this:
Not only publish via a Web site, but also make all the data accessible, so anyone can do their own thing to slice and dice the data to gain confidence that the election results are right.
And that point -- confidence -- is where we rendezvous with the NYT's point about data improving people's lives. I'm not sure that open-data for elections can save lives, but I think it can help save some people's faith in election integrity. And now is certainly a trying time, with a variety of election-related litigation news from NY and CO and IN and SC, all seeming to say that election irregularities or outright fraud -- even at the top with IN's highest election official being indicted -- is all over the country.
There's an old saying "one bad apple doesn't spoil the whole barrel" but we should not have to take it on faith for the large barrel of honest diligent election officials and the valid results of their well-run elections. A bit of open-data might actually help.
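To make the "slice and dice" point a bit more concrete, here is a minimal sketch of the kind of independent check anyone could run against an open election-results feed. The data layout is invented for illustration (any published format would do, so long as it is documented and machine-readable):

```python
# Sketch of third-party verification against a hypothetical open results feed:
# recompute contest totals from per-precinct data and compare them with the
# officially published totals.

from collections import Counter

def recompute_totals(precinct_results):
    """Sum per-precinct vote counts into contest-wide totals."""
    totals = Counter()
    for precinct in precinct_results:
        for candidate, votes in precinct["votes"].items():
            totals[candidate] += votes
    return dict(totals)

def check_against_published(precinct_results, published_totals):
    """Return the candidates whose recomputed total disagrees with the published one."""
    recomputed = recompute_totals(precinct_results)
    return [c for c in published_totals
            if recomputed.get(c, 0) != published_totals[c]]

feed = [
    {"precinct": "Bedrock-01", "votes": {"Flintstone": 120, "Rubble": 95}},
    {"precinct": "Bedrock-02", "votes": {"Flintstone": 80, "Rubble": 110}},
]
published = {"Flintstone": 200, "Rubble": 205}
print(check_against_published(feed, published))  # an empty list means the totals check out
```

The point isn't this particular code, of course; it's that once the data is out in the open, anyone can write their own version of it.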
Yesterday I wrote about the latest sign of the downward spiral of the broken market in which U.S. local election officials (LEOs) purchase product and support from vendors of proprietary voting system products: monolithic technology that is the result of years' worth of accretion, and that costs years and millions to test and certify for use -- including a current case where the process didn't catch flaws, so that a certified product may be de-certified and replaced by a newer system, at the LEOs' cost. Ouch! But could you really expect a vendor in this miserable market to give away a new product that it spent years and tens of millions of dollars to develop, to every customer of the old product -- customers the vendor had planned to sell upgrades to -- just because of flaws in the old product? The situation is actually worse: LEOs don't actually have the funding to acquire a hypothetical future voting system product in which the vendor was fully open about true costs, including
(a) certification costs both direct (fees to VSTLs) and indirect cost (staff time), as well as
(b) costs of development including rigorously designed and documented testing.
Actually, development costs alone are bad enough, but certification costs make it much worse -- as well as creating a huge barrier to entry for anyone foolhardy enough to try to enter the market (or even stay in it!) and make a profit.
A Way Forward?
That double-whammy is why I and my colleagues at OSDV are so passionate about working to reform the certification process, so that individual components can be certified in far less time and for far less money than a mess o'code accreted over decades, including wads of interwoven functionality that might not even need to be certified! And then of course, these individual components could also be re-certified for bug fixes by re-running a durable test plan that the VSTL created the first time around. And that of course requires common data formats for inter-operation between components -- for example, between a PCOS device and a Tabulator system that combines and cross-checks all the PCOS devices' outputs, in order either to find errors and omissions, or to produce a complete election result.
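As a sketch of what that component inter-operation could look like, here is a toy single-purpose tabulator consuming per-device count reports in a shared format. The dict layout below is an invented stand-in, not the actual schemas being standardized by NIST and IEEE 1622:

```python
# Sketch of a standalone tabulator: combine per-device counts expressed in a
# common (here, invented) data format, and cross-check that every expected
# counting device actually reported.

def tabulate(device_reports, expected_devices):
    """Combine per-device counts; report missing devices and the contest totals."""
    missing = sorted(expected_devices - {r["device_id"] for r in device_reports})
    totals = {}
    for report in device_reports:
        for contest, counts in report["counts"].items():
            contest_totals = totals.setdefault(contest, {})
            for choice, n in counts.items():
                contest_totals[choice] = contest_totals.get(choice, 0) + n
    return {"missing_devices": missing, "totals": totals}

reports = [
    {"device_id": "PCOS-001", "counts": {"Mayor": {"Fred": 40, "Barney": 35}}},
    {"device_id": "PCOS-002", "counts": {"Mayor": {"Fred": 22, "Barney": 51}}},
]
result = tabulate(reports, expected_devices={"PCOS-001", "PCOS-002", "PCOS-003"})
print(result["missing_devices"])   # a device that never reported is flagged
print(result["totals"]["Mayor"])
```

Because the input format is common and documented, any PCOS device from any vendor could feed such a tabulator, and any observer could re-run the same arithmetic.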
So once again, our appreciation to NIST, the EAC, and IEEE 1622 for actually doing the detailed work of hashing out these common data formats -- the bedrock of inter-operation, which is the prerequisite for certification reform, which enables certification cost reduction, which might result in voting system component products being available at true costs that are affordable to the LEOs who buy and use them.
Yes, that's quite a stretch, from data standards committee work to a less broken market that might be able to deliver to customers at reasonable cost. But to replace a rickety old structure with a new, solid, durable one, you have to start at the bedrock, and that's where we're working now.
PS: Thanks again to Joe Hall for pointing out that the current potential de-certification and mandatory upgrade scenario (described in Part 1) illustrates the untenable nature of a market that would require vendors to pay for expensive testing and certification efforts, and to also have to (as some have suggested) forego revenue when otherwise for-pay upgrades are required because of defects in software.
Long-time readers will certainly recall our view that the market for U.S. voting systems is fundamentally broken. Recent news provides another illustration of the downward spiral: the likely de-certification of a widely used voting system product from the vendor that owns almost three quarters of the U.S. market. The current stage of the story is that the U.S. Election Assistance Commission is formally investigating the product for serious flaws that led to errors of the kind seen in several places in 2010, and perhaps best documented in Cuyahoga County. (See: "EAC Initiates Formal Investigation into ES&S Unity 3.2.0.0 Voting System".) The likely end result is the product being de-certified, rendering it no longer legal for use in many states where it is currently deployed. Is this a problem for the vendor? Not really. The successor version of the product is due to emerge from a lengthy testing and certification process fairly soon. Having the current product banned is actually a great tool for migrating customers to the latest product!
But at what cost, and to whom? The vendor will charge the customers (local election officials, or LEOs) for the new product, the same as it would have if the migration were voluntary and the old product version still legal. The LEOs will have to sign and pay for a multi-year service agreement. And they will have the same indirect costs of staff effort (at the expense of other duties, like running elections, or getting enough sleep to run an election correctly), and direct costs for shipping, transportation, storage, etc. These are real costs! (Example: I've heard reports of some under-funded election officials opting not to use election equipment that they already have, because they have no funding for the expense of moving it from the warehouse to a testing facility and doing the required pre-election testing.)
Some observers have opined that vendors of flawed voting system products should pay: whether damages, or fines, or doing the migration gratis, or something. But consider this deeper question, from UCB and Princeton's Joe Hall:
Can this market support a regulatory/business model where vendors can't charge for upgrades and have to absorb costs due to flaws that testing and certification didn't find? (And every software product, period, has them).
The funding for a high level of quality assurance has to come from somewhere, and that's not voting system customers right now. Perhaps we're getting to the point where the amount of effort it takes to produce a robust voting system and get it certified -- at the vendor's expense -- creates a cost that customers are not willing or able to pay when the product gets to market.
A good question, and one that illustrates the continuing downward spiral of this broken market. The cost to vendors of certification is large, and you can't really blame a vendor for the sort of overly rapid development, marketing, and sales that leads to the problems being investigated. These folks are in this business to make a profit, for heaven's sake; what else could we expect?
PS - Part Two, coming soon: a way out of the spiral.
As I said in my recent MLK posting, I'm starting a series of blogs that should provide a concrete example of election management, at a small scale and (I hope) with some interest value. But before I tell a story of election management, we need to first have a story of an election, and this particular election starts with a candidate.
So, let me tell you a little story about a man named Jed -- oops, sorry, a man named Fred*. Fred lives in the Town of Bedrock, and has just heard that the famous mayor, Flint Eastrock, resigned in order to start a new film project. Fred decides to run for mayor in the Special Mayoral Election, because he's ready for the big time, having served on the Quarry Commission for some years. Like modern-day Americans, Bedrockites prefer to elect as many government positions as possible; rather than trusting the Mayor or Bedrock City Council to appoint Quarry Commissioners, the 5 commissioners are elected. So, in the special election, Fred's open seat on the Commission will also be up for election.
Lastly, as Fred's last act as Commissioner before resigning to run for mayor, Fred proposes a new referendum about the Quarry: a question for the voters to approve or reject a new usage fee for quarrying -- some needed additional revenue for the quarry upgrade that he hopes to be the centerpiece of his tenure as mayor.
So, there we have an election coming up, with three ballot items:
- An open seat for mayor, which Fred wants to run for;
- An open seat on the Quarry Commission, from which Fred has resigned;
- A referendum on the new quarry usage fee.
That's almost enough for getting started on our Bedrock election story, but we've also seen a bit of Bedrock election law and election administration in action:
- When the office of Mayor is vacant, it is filled by special election, not by appointment or by remaining vacant until the next regular election.
- The Bedrock Board of Elections (BBoE) called a special election.
- If a current local office-holder wants to run for a vacant office, he or she must resign from the office they already hold.
- If there is a local referendum pending for the next election, and a special election is called, then the referendum is held during the special election.
Next time, Fred applies to be a candidate for mayor, and gets an earful about how the BBoE works in practice. Fred knows, as I do, that there is always more to learn in election-land!
* My thanks and apologies to David Pogue on this one.
Putting an open source application into service - or "deployment" - can be different from deploying proprietary software. What works, and what doesn't? That's a question that's come up several times in the past few weeks, as the TTV team has been working hard on several proposals for new projects in 2011. Based on our experiences in 2009-10, here is what we've been saying about deployment of election technology that is not a core part of certified ballot casting/counting systems, but is part of the great range of other types of election technology: data management solutions for managing election definitions, candidates, voter registration, voter records, pollbooks and e-pollbooks, election results, and more - and reporting and publishing the data. For proprietary solutions - off the shelf, or with customization and professional services, or even purely custom applications like many voter record systems in use today - deployment is most often the responsibility of the vendor. The vendor puts the software into the environment chosen by the customer - state or local election officials - ranging from the customer's IT plant, to outsourced hosting, to the vendor's offering of a managed service in an application-service-provider approach. All have distinct benefits, but share the drawback of "vendor lock-in."
What about open-source election software? There are several approaches that can work, depending on the nature of the data being managed, and the level of complexity in the IT shop of the election officials. For today, here is one approach that has worked for us.
What works: outsourced hosting, where a system integrator (SI) manages outsourced hosting. Our 2010 project for VA's FVAP solution was led by an SI that managed the solution development and deployment, providing outsourced application hosting and support. The open-source software included a custom Web front-end to existing open-source election data management software, customized to VA's existing data formats for voters and ballots. This arrangement worked well because the people who developed the custom front-end software also performed the deployment, on a system completely under their control. VA's UOCAVA voters benefited from the blank-ballot distribution service, while the VA state board of elections was involved mainly by consuming reports and statistics about the system's operation.
That model works, but not in every situation. In the VA case, this model also constrained the way that the blank ballot distribution system worked. In this case, the system did not contain personal private information -- VA-provided voter records were "scrubbed". As a result, it was OK for the system's limited database to reside in a commercial hosting center outside of the direct control of election officials. The deployment approach was chosen first, and it constrained the nature of the Web application.
The constraint arose because the FVAP solution allowed voters to mark ballots digitally (before printing and returning by post or express mail). Therefore it was essential that the ballot-marking be performed solely on the voter's PC, with absolutely no visibility by the server software running in the commercial datacenter. Otherwise, each specific voter's choices would be visible to a commercial enterprise -- clearly violating ballot secrecy. The VA approach was a contrast to some other approaches in which a voter's choices were sent over the Internet to a server which prepared a ballot document for the voter. To put it another way …
What doesn't work: hosting of government-privileged data. In the case of the FVAP solution, this would have been outsourced hosting of a system that had visibility on the ultimate in election-related sensitive data: voters' ballot choices.
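The privacy boundary described above can be sketched in a few lines. This is only an illustration of the design principle, not the actual FVAP implementation (which was a web application); the data shapes and function names are invented:

```python
# Illustration of the client-side-only marking boundary: the hosted server may
# see and serve the blank ballot, but the marking step is a pure local function
# that a voter's machine runs with no further server contact, so the hosting
# provider never sees the voter's choices.

BLANK_BALLOT = {  # the only thing the hosted server knows or sends
    "contests": {"Mayor": ["Fred", "Barney"], "Quarry Fee": ["Yes", "No"]}
}

def mark_ballot_locally(blank_ballot, selections):
    """Produce a printable marked-ballot document entirely on the client.

    `selections` never leaves the caller's machine -- there is no network
    call here, which is the whole point of the design."""
    lines = []
    for contest, choices in blank_ballot["contests"].items():
        if selections.get(contest) not in choices:
            raise ValueError(f"invalid or missing selection for {contest}")
        lines.append(f"{contest}: [X] {selections[contest]}")
    return "\n".join(lines)

document = mark_ballot_locally(BLANK_BALLOT, {"Mayor": "Fred", "Quarry Fee": "Yes"})
print(document)
```

The contrast with the "what doesn't work" case is exactly that in the latter, the equivalent of `mark_ballot_locally` runs on the hosted server, where the selections become visible to a commercial enterprise.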
What works: engaged IT group. A final ingredient in this successful recipe was the engagement of a robust IT organization at the state board of elections. The VA system was very data-intensive during setup, with large amounts of data from legacy systems. The involvement of VA SBE IT staff was essential to getting the job done: dumping the data, scrubbing and re-organizing it, checking it, and loading it into the FVAP solution -- and doing this several times as the project progressed to the point where voter and ballot data were fixed.
To sum up what worked:
- data that was OK to be outside direct control of government officials;
- government IT staff engaged in the project so that it was not a "transom toss" of legacy data;
- development and deployment managed by a government-oriented SI;
- deployment into a hosted environment that met the SI's exact specifications for hosting the data management system.
That recipe worked well in this case, and I think would apply quite well for other situations with the same characteristics. In other situations, other models can work. What are those other models, or recipes? Another day, another blog on another recipe.
As I often do, I had a thoughtful Martin Luther King Day -- as you can see from my still pondering a couple days later. But I think I now have something to share. Last time I wrote on MLK, I likened two unlikely things:
- King's demand for social justice and peace, using Isaiah's prophetic words that "Justice shall roll down like water, and righteousness like a mighty stream."
- My vision of really meaningful election transparency, stemming from a mighty torrent of data that details everything that happened in a county's conduct of an election, published in a form anyone can see, and can use to check whether the election outcomes are actually supported by the data.
Still a bit of a stretch, no doubt; since my little moment by the waterfalls of the MLK memorial in San Francisco, I've had rather mixed success in explaining why this kind of transparency is so difficult. Among the reasons are the complexity of the data, and the very inconvenient way it is locked up inside voting system products and proprietary data formats.
But perhaps more important, it is just a vexingly detailed and complicated process to administer elections and conduct voting and counting -- paradoxically made even more complex with the addition of new technology. (Just ask a New York state election admin person about 2010.) In some cases, I am sure that local election officials would not take umbrage at the phrase "Rube Goldberg Machine" to describe the whole passel of people, process, and tools.
So, among my new year's resolutions, I am going to try to communicate, by example, a large part of the scope of data and transparency that is needed in U.S. elections. It will take some time to do in small digestible blogs, but I hope the example will serve to illustrate several things:
- What election administration is really like;
- What kinds of information and operations are used;
- How a regular process of capturing and exposing the information can prevent some of the mishaps, doubts, and litigation you've often read about here;
- Last but not least, how the resulting transparency connects directly to the nuts-and-bolts election technology work that we are doing on vote tabulation and on digital pollbooks.
One challenge will be keeping the example at an artificially small scale, for comprehensibility, while still providing meaningful examples of the data and the election officials' work to use it. On that point especially, feedback will be particularly welcome!
As Greg said in his New Year's posting, we've been planning a variety of activities for 2011, and reflecting on what we did in 2010, on much that remains to do, and on doing it better. But at the risk of boring you with a laundry list, I wanted to provide some additional detail on some of the 2010 activities that Greg mentioned. Many of the items listed below serve to indicate how much of the work in election technology (ours and others') has to get very detail-oriented in order to actually deliver.
Voter Registration
- Released version 2.0 of the TTV Online Voter Registration tool.
- Put OVRv2 into production, operated by Open Source Labs and managed by RockTheVote.
- Under RTV's management, OVR has served well over 200,000 registrants for the 2010 election cycle, nearing the quarter-million total.
Election Management System
- First-ever open source election management software deployed for use in DC and VA overseas voting projects in November 2010 elections.
- TTV Election Manager supports DC legacy data formats, VIP standard election data for VA, DC-specific jurisdiction definitions, and first-ever new VA custom jurisdictions for local referenda.
- First-ever system for computing and proofing an entire state's worth of election data and ballot definitions.
- First-ever open source paper ballot design system supporting local and state-specific ballot formats and composition rules for multiple jurisdictions, including DC, VA, and NH.
- For VA statewide election, over 2,700 locality-specific ballots generated, including first-ever state-law compliant ballots for special classes of non-local UOCAVA voters.
- First-ever generation of dual-use ballot documents, the same document marked either digitally or physically to become the same legal paper ballot of record.
Overseas Ballot Distribution
- Fully localized ballots delivered to thousands of UOCAVA voters worldwide
- Data integration with state voter record databases, ensuring every eligible UOCAVA voter gets their correct ballot
- Public test of Digital Ballot Return - a controversial activity with many lessons learned on all sides, but we're proud to have supported the D.C. BOEE in a rare example of responsible open public testing that should be the model for any assessment of new election technology.
Open-Source Software License
- Released the OSDV Public License, or OPL, the first open source license specifically designed to aid state and local governments in acquiring open-source technology.
- Published the OPL Rationale document, explaining the goals of the OPL and the reasoning behind each element of the OPL as meeting government needs for software licensing.
Public Speaking and Education
- Co-sponsored the Overseas Vote Foundation's UOCAVA Summit, moderated the Internet Voting Debate, and led sessions on Pilots and UOCAVA Technology Futures
- Published invited position papers at National Institute of Standards and Technology conferences on election data standards and overseas voting
- Gave invited presentations at Gov 2.0 Summit, National Civic Summit, OSCON Open Source Convention, Government Open Source Conference among others
As you can see from these highlights -- the tip of the proverbial iceberg -- 2010 was a busy year for us. And 2011 is shaping up to be even busier!
Yesterday, judges in New York state were hearing calls for a hand recount, while elsewhere other vote counts were being factored into the totals, and on the other side of the Atlantic, the same question -- "where are the election results?" -- was getting very serious. In the Ivory Coast, as in some places in the U.S., there is a very close election that still isn't decided. There, it's gotten serious: the military has closed off all points of entry into the country as a security measure related to unrest over the close election and the lack of a winner. We are lucky to have avoided such distrust and unrest here; despite the relatively low levels of trust in U.S. electoral processes (less than half of eligible people vote, and of voters polled in years past, a third to a half were negative or neutral), we are content to let the courts and the election finalization process wind on for weeks. OK, so maybe not content -- maybe extremely irate and litigious in some cases -- but not burning cars in the streets.
That's why I think it is particularly important that Americans better understand the election finalization process -- which of course, like almost everything in U.S. elections, varies by state or even locality. But the news from Queens NY (New York Times, "A Month After Elections, 200,000 Votes Found"), though it sounds awful as a headline, is actually enormously instructive -- especially about our hunger for instant results.
It's not awful; it's complicated. As the news story outlines, there is a complicated process on election night, with lots of room for human error after a 16-hour day. The finalization process is conducted over days or weeks to aggregate vote data and produce election results carefully, catching errors, though usually not changing preliminary election-night results. As Douglas A. Kellner, co-chairman of the State Board of Elections, said:
The unofficial election night returns reported by the press always have huge discrepancies — which is why neither the candidates or the election officials ever rely on them.
That's particularly true as NY has moved from lever machines to paper optical scan voting, and the finalization process has changed. But even in the old days, it was possible to misplace one or a few lever machines' worth of vote totals through human error in the paper process of reading dials, writing numbers on reporting form sheets, transporting the sheets, etc. Then, add to that the computer factor for human error, and you get your 80,000-vote variance in Queens.
Bottom line -- when an election is close, of course we want the accurate answer, and getting it right takes time. Using computerized voting systems certainly helps with getting quicker answers for contests that aren't close and won't change in the final count. And certainly they can help by enabling audits and recounts that lever machines could not. But for close calls, it's back to elbow grease and getting out the i-dotters and t-crossers -- and being thankful for their efforts.
In my last post, I recounted an incident from Erie County NY, but deferred to today an account of what the technology troubles were that prevented the routine use of a Tabulator to create county-wide vote totals by combining count data from each of the opscan paper ballot counting devices. The details are worth considering as a counter-example of technology that is not transparent, but should be. As I understand the incident, it wasn't the opscan counting systems that malfunctioned, but rather the portion of the voting system that tabulates the county-wide vote totals. As I described in an earlier post, the ES&S system has no tabulator per se, but rather some aggregation software that is part of the larger body of Election Management System (EMS) software that runs on an ordinary Windows PC. Each opscan device writes data to a USB stick, and election officials aggregate the data by feeding each stick into the EMS. The EMS is supposed to store all the data on the stick, and add up all the opscan machines' vote counts into a vote total for each contest.
Last week, though, when Erie County officials tried to do so, the EMS rejected the data sticks. Election officials had no way to use the sticks to corroborate the vote totals that they had made by visually examining the election-night paper-tapes from the 130 opscan devices. Sensible questions: Did the devices' software err in writing the data to the sticks? If so, might the tapes be incorrect as well? Is the data still there? It turns out that the cause was a bug in the EMS software, not the devices, and in fact the data on the sticks was just fine. With a workaround on the EMS, the data was extracted from the sticks and used as planned. Further, the workaround did not require a bug fix to the software, which would have been illegal. Instead, some careful hand-crafting of EMS data enabled the software to stop choking on the data from the sticks.
Now, I am not feeling 100% great about the need for such hand-crafting, or indeed about the correctness of the totals produced by a voting system operating outside of its tested ordinary usage. But some canny readers are probably wondering about a simpler question. If the data was on the sticks, why not simply copy the files off the stick using a typical PC, and examine the contents of the files directly? With 40-odd contests countywide and 100-odd sticks and paper tapes, it's not that much work to just look at them to see whether the numbers on each stick match those on the tapes. Answer: the voting system software is set up to prevent direct examination, that's why! The vote data can only be seen via the software in the EMS. And when that software glitches, you have to wonder about what you're seeing.
This is at least one area where better software design can lead to a higher-confidence system:

- write-once media for storing each counting device's tallies;
- use of public, standard data formats so that anyone can examine the data;
- use of human-readable formats so that anyone can understand the data;
- use of a separate, single-purpose tabulator device that operates autonomously from the rest of the voting system;
- publication of the tally data and the tabulator's output data, so that anyone can check the results either manually or with their choice of software.

At least that's the TrustTheVote approach that we're working out now.
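To make that design idea concrete, here is a minimal sketch of what a public, human-readable per-device tally format and a tiny standalone tabulator could look like. The data format, device IDs, and candidate names are all invented for illustration; this is not any real voting system's format.

```python
import json

# Hypothetical per-device tally records, as they might appear on write-once
# media in a public, human-readable format (all names are illustrative only).
device_tallies = [
    json.loads('{"device_id": "scanner-001", "counts": {"Smith": 120, "Jones": 98}}'),
    json.loads('{"device_id": "scanner-002", "counts": {"Smith": 75, "Jones": 110}}'),
]

def tabulate(tallies):
    """Sum per-device counts into county-wide totals -- the entire job of a
    single-purpose tabulator, re-checkable by anyone with the published data."""
    totals = {}
    for tally in tallies:
        for candidate, count in tally["counts"].items():
            totals[candidate] = totals.get(candidate, 0) + count
    return totals

print(tabulate(device_tallies))  # {'Smith': 195, 'Jones': 208}
```

The point of keeping the format this simple is that when the official tabulation software glitches, anyone can re-run the arithmetic with a few lines of their own code, or by hand.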
Behind the election news in Buffalo, NY, there is a cautionary tale about voting system complexity and confidence. The story is about a very close race for the state Senate's 60th district. One news article includes a reference to "software problems with the new electronic voting machines in Erie County." The fundamental issue here is whether to trust the vote count numbers, in a case where the race is very close and where the voting system malfunctioned at least once, because of a software bug later identified by the vendor. If one part of the system malfunctioned, shouldn't we also be concerned that another part may also have malfunctioned? An error on even one of the over 100 paper-ballot-counting devices could easily swamp the very small margin between the top two candidates.
Those are good questions, and as frequent readers will already know, the typical answer is "audit", that is, hand-counting a portion of the paper ballots to ensure that the hand-counts match the machine counts, using statistical science to guide how many ballots to hand count to achieve confidence that the overall election results are valid. That's what the state of Connecticut -- another recent adopter of paper ballots over lever machines -- is doing with a manual count of ballots from 73 of the 734 precincts statewide.
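The basic mechanics of that kind of audit can be sketched in a few lines: pick a random sample of precincts, hand count those ballots, and flag any precinct where the hand count disagrees with the machine count. The precinct names and counts below are invented for illustration; real audits use statistically derived sample sizes, not just a fixed percentage.

```python
import random

# Hypothetical machine counts per precinct (734 precincts, as in the
# Connecticut example; all numbers here are invented for illustration).
machine_counts = {f"precinct-{i}": {"A": 100 + i, "B": 90 + i} for i in range(734)}

def audit_sample(machine, hand_counts, sample):
    """Compare hand counts to machine counts for the sampled precincts;
    return the list of precincts where the two disagree."""
    return [p for p in sample if machine[p] != hand_counts[p]]

# Randomly sample 73 of 734 precincts, mirroring Connecticut's audit.
sample = random.sample(sorted(machine_counts), 73)

# If the hand counts match the machine counts exactly, no discrepancies:
hand = {p: machine_counts[p] for p in sample}
print(audit_sample(machine_counts, hand, sample))  # []
```

Any non-empty result would be a signal to widen the audit, which is exactly the escalation logic that gives a small random sample its statistical power.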
But that's not happening in Buffalo (as far as I can tell), where instead there is wrangling over doing a full re-count, with confusion over the voting system malfunction muddying the waters. And that's a shame, because election technology properly used (including routine audits) should not cause this kind of legal activity over the validity of an election result -- in this case an important one that could influence party control in the state Senate, with re-districting on the horizon.
But some of the finger-pointing goes to the technology too. What actually malfunctioned? Could the glitch have affected the election result? What can we learn from the incident? Questions for next time ...
Continuing on with our recap of election technology faults and oddities in the recent election, not the most alarming but perhaps the most perplexing is a story from Gadsden, AL. From the news article, it seems that Etowah County's election officials rely on their voting system vendor, Election Systems & Software (ES&S), to provide election supplies for using ES&S's opscan ballot counter. So far so good, but the supplies include what ES&S coyly refers to as "marking devices", a.k.a. pens for voters to use to fill in bubbles on the paper ballot. Why the county needs to buy pens from ES&S rather than a local Office Depot or similar, I couldn't say, but here is the weird bit ... ES&S couldn't manage to find enough pens that both adequately marked their ballots and met the $9 a dozen price point, which an ES&S representative implied was a money-loser for them in their service contract with Etowah County. And here is the really weird bit: instead of buying some $12 a dozen pens, or letting the election officials know about the pen shortage (!), ES&S supplied pencils instead of pens. That's right, pencils, with an implied recommendation that it is OK for voters to mark a ballot with an erasable mark, and an implied endorsement that the opscan counting devices read pencil marks as well as marks from ink-based "marking devices."
It will come as no surprise to readers here that the counting machines got flakey when presented with pencil marked ballots, and caused some trouble at the polling places -- or at least the need to run over to a local store and buy some more pens. It seems that no great harm was done, at least based on what I was able to glean from the news article. People marked ballots with pencil, got them kicked back from the scanners, and had to wait for one of those scarce pens to become available to be able to mark a fresh ballot in pen. One hopes that poll workers correctly stored the pencil ballots as spoiled, and properly failed to examine the ballots to determine how a voter voted. These are normal operational risks, of course, but here again we see where flakey machines and/or flakey vendors create the polling place conditions where:
- Accurate, private, timely voting is more at risk than need be.
- Election officials and volunteers have to operate with more power and discretion, and less transparency, than anyone would prefer under normal circumstances.
But honestly, the story is short, and has several bits that are so wacky, I really urge you to read it yourself. Here are some of the bits that leapt out at me when I read it. These are quotes, but italics are mine, indicating where my jaw dropped. :-)
- Election Systems and Software ... included pencils in supply packets because of a shortage of pens.
- County election officials discovered that when they checked supplies prior to Election Day.
- ES and S apologized for the problems that arose from the change to pencils, which he said “created more issues than we anticipated.”
- ... continual changes have made it difficult to evaluate and settle on a “reliable, consistently available pen.”
- ... cost of pens that meet technical requirements had increased to more than $9 a dozen, and with the quantities needed for the election, that amounted to a “significant cost.”
- [ES&S] won’t provide pencils for future elections and “will be looking for a suitable pen that meets the various needs of all.”
Really weird. I guess that what Etowah County deserves, but does not have today, is a voting system with optical scanners that are "functionally compatible" with a "marking device" that local election officials can easily buy by the hundred dozen at a local store. I know that I often explain here why some aspect of election technology is not rocket science, but I can assure you, no combustion engineering or orbital mechanics is required to find a black felt-tip pen to mark a bubble that a scanner can scan and counting software can find. It's too bad that's not the case in Gadsden.