“Digital Voting”—Don’t believe everything you think

In a recent blog post we examined David Plouffe’s forward-looking Wall Street Journal op-ed [paywall] and rebalanced his vision with some practical reality.

Now, let’s turn to Plouffe’s notion of “digital voting.”  Honestly, that phrase is confusing and vague.  We should know: it catalyzed our name change last year from Open Source Digital Voting Foundation (OSDV) to Open Source Election Technology Foundation (OSET).

David Plouffe’s View of the Future of Voting — We Agree and Disagree

David Plouffe, President Obama’s top political and campaign strategist and the mastermind behind the winning 2008 and 2012 campaigns, wrote a forward-looking op-ed [paywall] in the Wall Street Journal recently about the politics of the future and how they might look.

He touched on how technology will continue to change the way campaigns are conducted – more use of mobile devices, even holograms, and more micro-targeting of individuals. But he also mentioned how people might cast their votes in the future, and that is what caught our eye here at the TrustTheVote Project.  There is a considerable chasm to cross between vision and reality.

Money Shot: What Does a $40M Bet on Scytl Mean?

…not much we think.

Yesterday’s news of Microsoft co-founder billionaire Paul Allen’s investing $40M in the Spanish election technology company Scytl is validation that elections remain a backwater of innovation in the digital age.

But it is not validation that there is a viable commercial market for voting systems of the size that typically attracts venture capitalists; the market is small and dysfunctional, and governments continue to be without budget.

And the challenge of building a user-friendly, secure online voting system that simultaneously protects the anonymity of the ballot is an interesting problem that only an investor of the stature of Mr. Allen can tackle.

We think this illuminates a larger question:

To what extent should the core technology of the most vital aspect of our Democracy be proprietary and black box, rather than publicly owned and transparent?

To us, that is a threshold public policy question, commercial investment viability issues notwithstanding.

To be sure, it is encouraging to see Vulcan Capital and a visionary like Paul Allen invest in voting technology. The challenges facing a successful elections ecosystem are complex and evolving and we will need the collective genius of the tech industry’s brightest to deliver fundamental innovation.

We at the TrustTheVote Project believe voting is a vital component of our nation’s democracy infrastructure, and that American voters expect and deserve a voting experience that’s verifiable, accurate, secure, and transparent.  Will Scytl be the way to deliver that?

The Main Thing

The one thing that stood out to us in the various articles on the investment was Scytl’s assertion of its security, backed by international patents on cryptographic protocols.  We’ve been around the INFOSEC space for a long time and know a lot of really smart people in the crypto field.  So, we’re curious to learn more about their IP innovations.  And yet that assertion is actually a red herring to us.

Here’s the main thing: transacting ballots over the public packet-switched network is not simply about security.  It’s also about privacy; that is, the secrecy of the ballot.  Here is an immutable maxim about the digital world of security and privacy: there is an inverse relationship, which holds that as security is increased, privacy must be decreased, and vice versa.  Just consider any airport security experience.  If you want maximum security, then you must surrender a bunch of privacy.  This is the main challenge of transacting ballots across the Internet, and why that transaction is so very different from banking online or looking at your medical record.

And then there is the entire issue of infrastructure.  We continue to harp on this, and still wait for a good answer.  If, by their own admissions, the Department of Defense, Google, Target, and dozens of others have challenges securing their own data centers, how exactly can we be certain that a vendor on a cloud-based service model, or an in-house data center of a county or State, has any better chance of doing so?  Security is an arms race.  Consider the news today about Heartbleed alone.

Oh, and please, for the sake of credibility, can the marketing machinery stop using the phrase “military grade security”?  There is no such thing.  And it has nothing to do with increasing the standard 128-bit encryption keys to, say, 512 or 1024 bits.  128-bit keys are fine, and there is nothing military about them (other than that the military uses them).  Here is an interesting article from some years ago on the sufficiency of current crypto and the related marketing arms race.  Saying “military grade” is meaningless hype.  Besides, the security issues run far beyond the transit of data between machines.

In short, there is much the public should demand to understand from anyone’s security assertions, international patents notwithstanding.  And that goes for us too.

The Bottom Line

While we laud Mr. Allen’s investment in what surely is an interesting problem, no one should think for a moment that this signals some sort of commercial viability or tremendous growth market opportunity.  Nor should anyone assume that throwing money at a problem will necessarily fix it (or deliver us from the backwaters of Government elections I.T.).  Nor should we assume that this somehow validates Scytl’s “model” for “security.”

Perhaps more importantly, while we need lots of attention, research, development, and experimentation, the bottom line to us is whether the outcome should be a commercial, proprietary, black-box result or an open, transparent, publicly owned result… where the “result” as used here refers to the core technology of casting and counting ballots, and not the viable and necessary commercial business of delivering, deploying, and servicing that technology.

The “VoteStream Files” A Summary

The TrustTheVote Project Core Team has been hard at work on the Alpha version of VoteStream, our election results reporting technology. They recently wrapped up a prototype phase funded by the Knight Foundation, and then forged ahead a bit to incorporate data from additional counties, provided by participating state or local election officials after the official wrap-up.

Along the way, there have been a series of postings here that together tell a story about the VoteStream prototype project. They start with a basic description of the project in Towards Standardized Election Results Data Reporting and Election Results Reload: the Time is Right. Then there was a series of posts about the project’s assumptions about data, about software (part one and part two), and about standards and converters (part one and part two).

Of course, the information wouldn’t be complete without a description of the open-source software prototype itself, provided in Not Just Election Night: VoteStream.

Actually, the project was as much about data, standards, and tools as it was about software. On the data front, there is a general introduction to a major part of the project’s work in “data wrangling” in VoteStream: Data-Wrangling of Election Results Data. After that there were more posts on data wrangling, quite deep in the data-head shed — but still important, because each one is about the work required to take real election data and real election result data from disparate counties across the country, and fit it into a common data format and a common online user experience. The deep data-heads can find quite a bit of detail in three postings about data wrangling, in Ramsey County MN, in Travis County TX, and in Los Angeles County CA.

Today, there is a VoteStream project web site with VoteStream itself and the latest set of multi-county election results, but also with some additional explanatory material, including the election results data for each of these counties.  Of course, you can get that from the VoteStream API or data feed, but there may be some interest in the actual source data.  For more on those developments, stay tuned!

A Northern Exposed iVoting Adventure

Alaska's extension to its iVoting venture may have raised the interest of at least one journalist at one highly visible publication.  When we were asked for our "take" on this form of iVoting, we thought we should also comment here on this "northern exposed adventure" (apologies to fans of the mid-90s wacky TV series of a similar name).  Alaska has been among the states that allow military and overseas voters to return marked absentee ballots digitally, starting with fax, then eMail, and then adding a web upload as a 3rd option.  Focusing specifically on the web-upload option, the question was: "How is Alaska doing this, and how do their efforts square with common concerns about security, accessibility, Federal standards, testing, certification, and accreditation?"

In most cases, any voting system has to run that whole gauntlet through to accreditation by a state in order for the voting system to be used in that state. To date, none of the iVoting products have even tried to run that gauntlet.

So, what Alaska is doing with respect to security, certification, and a host of other things is essentially: flying solo.

Their system has not gone through any certification program (State, Federal, or otherwise, that we can tell); hasn't been tested by an accredited voting system test lab; and nobody knows how it does or doesn't meet federal requirements for security, accessibility, and other (voluntary) specifications and guidelines for voting systems.

In Alaska, they've "rolled their own" system.  It's their right as a State to do so.

In Alaska, military voters have several options, and only one of them is the ability to go to a web site, indicate their vote choices, and have their votes recorded electronically -- no actual paper ballot involved, no absentee ballot affidavit or signature needed. In contrast to the sign/scan/email method of returning an absentee ballot and affidavit (used in Alaska and 20 other states), this is straight-up iVoting.

So what does their experience say about all the often-quoted challenges of iVoting?  Well, of course in Alaska those challenges apply the same as anywhere else, and they are facing them all:

  1. insider threats;
  2. outsider hacking threats;
  3. physical security;
  4. personnel security; and
  5. data integrity (including that of the keys that underlie any use of cryptography)

In short, the Alaska iVoting solution faces all the challenges of digital banking and online commerce that every financial services industry titan and eCommerce giant spends big $ on every year (capital and expense), and yet they still routinely suffer attacks and breaches.

Compared to those technology titans of industry (Banking, Finance, Technology services, or even the Department of Defense), how well are Alaskan election administrators doing on their (by comparison) shoestring budget?

Good question.  It's not subject to annual review (like banks' IT operations audits for SAS-70), so we don't know.  That also is their right as a U.S. state.  However, the fact that we don't know does not debunk any of the common claims about these challenges.  Rather, it simply says that in Alaska they took on the challenges (which are large) and the general public doesn't know much about how they're doing.

To get a feeling for the risks involved, just consider one point: think about the handful of IT geeks who manage the iVoting servers where the votes are recorded and stored as bits on a disk.  They are not election officials, and they are no more entitled to stick their hands into paper ballot boxes than anybody else outside a county elections office.  Yet they have the ability (though not the authorization) to access those bits.

  • Who are they?
  • Does anybody really oversee their actions?
  • Do they have remote access to the voting servers from anywhere on the planet?
  • Using passwords that could be guessed?
  • Who knows?

They're probably competent, responsible people, but we don't know.  Not knowing any of that, every vote on those voting servers is actually a question mark -- and that's simply being intellectually honest.

Lastly, to get a feeling for the possible significance of this lack of knowledge, consider a situation in which Alaska's electoral college votes swing an election, or where Alaska's Senate race swings control of Congress (not far-fetched given Murkowski's close call back in 2010).

When the margin of victory in Alaska, for an election result that affects the entire nation, is a low 4-digit number of votes, and the number of digital votes cast is similar, what does that mean?

It's quite possible that that many digital votes could be cast in the next Alaska Senate race.  If the contest is that close again, think about the scrutiny those IT folks will get.  Will they be evaluated any better than every banking data center investigated after a data breach?  Any better than Target?  Any better than Google or Adobe's IT management after having trade secrets stolen?  Or any better than the operators of unclassified military systems that for years were penetrated by hackers located in China, who may well have been supported by the Chinese Army or intelligence groups?

Probably not.

Instead, they'll be lucky (we hope) like the Estonian iVoting administrators, when the OSCE visited back in 2011 to have a look at the Estonian system.  Things didn't go so well.  The OSCE found that one guy could have undermined the whole system.  Good news: it didn't happen.  Cold comfort: that one guy didn't seem to have the opportunity -- most likely because he and his colleagues were busier than a one-armed paper hanger during the election, worrying about Russian hackers attacking again after they had previously shut down the whole country's Internet-connected government systems.

So far, the threat is remote, and it is still early days even for small-scale usage of Alaska's iVoting option.  But while the threat is still remote, it might be good for the public to see some more about what's "under the hood" and who's in charge of the engine -- that would be our idea of more transparency.

<Wandering off the Main Point for a Few Paragraphs>

So, in closing, I'm going to run the risk of being a little preachy here (signaled by that faux HTML tag above); again, probably due to the recent surge in media inquiries about how the Millennial generation intends to cast their ballots one day.  Lock and load.

I (and all of us here) are all for advancing the hallmarks of the Millennial mandates of the digital age: ease and convenience.  I am also keenly aware there are wing-nuts looking for their Andy Warhol moment.  And whether enticed by some anarchist rhetoric, their own reality distortion field, or, most insidious, the evangelism of a terrorist agenda (domestic or foreign), said wing-nut(s) -- perhaps just for grins and giggles -- might see an opportunity to derail an election (see my point above about a close race that swings control of Congress, or worse).

Here's the deep concern: I'm one of those who believes that the horrific attacks of 9.11 had little to do with body count or the implosions of western icons of financial might.  The real underlying agenda was to determine whether it might be possible to cause a temblor of sufficient magnitude to take world financial markets seriously off-line, and whether doing so might cause a rippling effect of chaos in world markets, and what disruption and destruction that might wreak.  If we believe that, then consider the opportunity for disruption of the operational continuity of our democracy.

It's not that we are Internet haters: we're not -- several of us came from Netscape and other technology companies that helped pioneer the commercialization of that amazing government and academic experiment we call the Internet.  It's just that THIS Internet and its current architecture simply was not designed to be inherently secure or to ensure anyone's absolute privacy (and strengthening one necessarily means weakening the other).

So, while we're all focused on ease and convenience, and we live in an increasingly distributed democracy, and the Internet cloud is darkening the doorstep of literally every aspect of society (and now government too), great care must be taken as legislatures rush to enact new laws and regulations to enable studies, or build so-called pilots, or simply advance the Millennial agenda to make voting a smartphone experience.  We must be very careful and considerably vigilant, because it's not beyond the realm of reality that some wing-nut is watching, cracking their knuckles in front of their screen and keyboard, mumbling, "Oh please. Oh please."

Alaska has the right to venture down its own path in the northern territory, but in doing so it exposes an attack surface.  They need not (indeed, cannot) see this enemy from their back porch (I really can't say about others).  But just because it cannot be identified at the moment doesn't mean it isn't there.

One other small point:  As a research and education non-profit, we're asked why we shouldn't be "working on making Internet voting possible."  Answer: Perhaps in due time.  We do believe that on the horizon responsible research must be undertaken to determine how we can offer an additional alternative by digital means to casting a ballot, next to the absentee and polling place experiences.  And that "digital means" might be over the public packet-switched network.  Or maybe some other type of network.  We'll get there.  But candidly, our charge for the next couple of years is to update the outdated architecture of existing voting machinery and elections systems and bring about substantial, but still incremental, innovation that jurisdictions can afford to adopt, adapt, and deploy.  We're taking one thing at a time and first things first; or as our former CEO at Netscape used to say, we're going to "keep the main thing, the main thing."

Onward
GAM|out

PCEA Report Finally Out: The Real Opportunity for Innovation Inside

This week the PCEA finally released its long-awaited report to the President.  It's loaded with good recommendations.  Over the next several days or posts we'll give you our take on some of them.  For the moment, we want to call your attention to a couple of underpinning elements now that it's done.

The Resource Behind the Resources

Early in the formation of what initially was referred to as the "Bauer-Ginsberg Commission" we were asked to visit the co-chairs in Washington D.C. to chat about technology experts and resources.  We have a Board member who knows them both and when asked we were honored to respond.

Early on we advised the Co-Chairs that their research would be incomplete without speaking with several election technology experts, and of course they agreed.  The question was how to create a means to do so and not bog down the progress governed by layers of necessary administrative regulations.

I take a paragraph here to observe that I was very impressed in our initial meeting with Bob Bauer and Ben Ginsberg.  Despite being polar political opposites they demonstrated how Washington should work: they were respectful, collegial, sought compromise to advance the common agenda and seemed to be intent on checking politics at the door in order to get work done.  It was refreshing and restored my faith that somewhere in the District there remains a potential for government to actually work for the people.  I digress.

We advised them that looking to the CalTech-MIT Voting Project would definitely be one resource they could benefit from having.

We offered our own organization, but with our tax exempt status still pending, it would be difficult politically and otherwise to rely on us much in a visible manner.

So the Chairs asked us if we could pull together a list -- not an official subcommittee, mind you, but a list of the top "go to" minds in the elections technology domain.  We agreed and began a several-week process of vetting a list that needed to be winnowed down to about 20 for manageability.  These experts would be brought in individually as desired, or collectively -- it was to be figured out later which would be most administratively expedient.  Several of our readers, supporters, and those who know us were aware of this confidential effort.  The challenge was lack of time to run the entire process of public recruiting and selection.  So, they asked us to help expedite that, having determined we could gather the best in short order.

And that was fine because anyone was entitled to contact the Commission, submit letters and comments and come testify or speak at the several public hearings to be held.

So we did that.  And several of that group were in fact utilized.  Not everyone though, and that was kind of disappointing, but a function of the timing constraints.

The next major resource we advised they had to include, besides CalTech-MIT and a tech advisory group, was Rock The Vote.  And that was because (notwithstanding their being a technology partner of ours) Rock The Vote has its ear to the rails of new and young voters, starting with their registration experience and initial opportunity to cast their ballot.

Finally, we noted that there were a couple of other resources they really could not afford to overlook, including the Verified Voting Foundation, L.A. County's VSAP Project, and Travis County's STAR-Vote Project.

The outcome of all of that brings me to the meat of this post about the PCEA Report and our real contribution.  Sure, we had some behind the scenes involvement as I describe above.  No big deal.  We hope it helped.

The Real Opportunity for Innovation

But the real opportunity to contribute came in the creation of the PCEA Web Site and its resource toolkit pages.

On that site, the PCEA took our advice and chose to utilize Rock The Vote's open source voter registration tools, and specifically the foundational elements the TrustTheVote Project has built for a State's Voter Information Services Portal.

Together, Rock The Vote and the TrustTheVote Project are able to showcase the open source software that any State can adopt, adapt, and deploy--for free (at least the adoption part) and without having to reinvent the wheel by paying for a ground-up custom build of their own online voter registration and information services portal.

We submit that this resource on their PCEA web site represents an important ingredient to injecting innovation into a stagnant technology environment of today's elections and voting systems world.

For the first time, there is production-ready open source software available for an important part of an elections official's administrative responsibilities that can lower costs, accelerate deployment and catalyze innovation.

To be sure, it's only a start -- it's the lower-hanging fruit of an election technology platform that doesn't require any sort of certification. With our exempt status in place, and lots of things happening we'll soon share, there is more, much more, to come.  But this is a start.

There are 112 pages of goodness in the PCEA report.  And there are some elements in there that deserve further discussion.  But we humbly assert it's the availability of some open source software on their resource web site that we think represents a quiet breakthrough in elections technology innovation.

The news has been considerable.  So, yep, we admit it.  We're oozing pride today. And we owe it to your continued support of our cause. Thank you!

GAM | out

Comments Prepared for Tonight's Elections Technology Roundtable

This evening at 5:00pm, members of the TrustTheVote Project have been invited to attend an elections technology roundtable discussion in advance of a public hearing in Sacramento, CA, scheduled for tomorrow at 2:00pm PST, on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations. Due to the level of activity, only our CTO, John Sebes, is able to participate.

We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems.  These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.

Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project

We appreciate an opportunity to participate in this important discussion.  We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.

1. Certification

For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point.  Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model.  This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.

We are collaborating with NIST on a way to update this outdated model with a "component-ized" approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system.  There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.

We're glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California.  Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.

2. Standards

Second, there is a prerequisite for component-level certification that until recently wasn't available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot counter's output of vote tally data that also serves as input to a tabulator component.  Without common data formats, elections officials have to acquire a whole integrated product suite that communicates in a proprietary manner.  With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as "forklift upgrades" or "fleet replacements."
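
To make that concrete, here is a minimal sketch of what a common tally record might look like, one a ballot counter could emit and a tabulator could consume.  It is purely illustrative -- the field names below are ours, not those of any published or draft standard:

    import json

    # Hypothetical, illustrative tally record: one precinct's counts for one contest.
    # Field names are assumptions for this sketch, not a published standard.
    tally_record = {
        "jurisdiction": "Example County",
        "precinct_id": "P-042",
        "contest_id": "us-senate-2014",
        "counts": [
            {"candidate_id": "cand-01", "votes": 1234},
            {"candidate_id": "cand-02", "votes": 1198},
        ],
        "ballots_cast": 2450,
        "undervotes": 12,
        "overvotes": 6,
    }

    # A ballot counter component would serialize this record for export...
    exported = json.dumps(tally_record)

    # ...and a separately certified tabulator component could read it back in
    # and aggregate, without either component knowing the other's internals.
    imported = json.loads(exported)
    assert sum(c["votes"] for c in imported["counts"]) <= imported["ballots_cast"]

The point is not this particular format, but that once such a format is open and standardized, the counter and the tabulator can be tested, certified, and replaced independently.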

The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem.  And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote.  They deserve congratulations for reaching this imperative state of data standards détente.  It's not finished, but the effort and momentum are there.

So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.

3. Commercial Revitalization

Finally, this may be the opportunity to realize a vision we have that open data standards, a new certification process, and lowered bars to innovation through open sourcing, will reinvigorate a stagnant voting technology industry.  Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation.  Such should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.

One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery.  To that point, let us offer an important clarification and dispel a misconception about rolling your own.  This does not mean that elections officials are about to be left to self-vend.  And by that we mean self-construct and support their open, standard, commodity voting system components.  A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice rather than forcing you to become a do-it-yourself type.  Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software.  Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.

Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation.  To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.

In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.

 

EDITORIAL NOTES: There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified, non-certified voting systems a possibility in California.  We don't read SB-360 that way at all.  We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest.  In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot, nor should we, do away with certification, and we think charges that this is what will happen are misinformed.  We simply need to modernize how certification works to enable this kind of innovation.  We think our comments today bear that out.

Moreover, have a look at the Agenda for tomorrow's hearing on implementation of SB-360.  In sum and substance the agenda is to discuss:

  1. Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
  2. Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
  3. Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.

Finally, there has been an additional charge that SB-360 is intended to "empower" LA County, such that LA County (or someone on its behalf) would sell the voting systems it builds to other jurisdictions.  We think this allegation is also misinformed, for two reasons: [1] assuming LA County builds their system on open source, there is a question as to what specifically would or could be offered for sale; and [2] notwithstanding offering open source for sale (which technically can be done... technically), it seems to us that if such a system is built with public dollars, then it is, in fact, publicly owned.  From what we understand, a government agency cannot offer for sale assets developed with public dollars, but it can give them away.  And indeed, this is what we've witnessed over the years in other jurisdictions.

Onward.

Crowd Sourcing Polling Place Wait Times – Part 2

Last time, we wrote about the idea of a voter information service where people could crowd-source the data about polling place wait times, so that other voters would benefit by not going when the lines are getting long, and so that news media and others could get a broad view of how well or poorly a county or state was doing in terms of voting time. As we observed, that would be a fine idea, but the results from that crowd-sourced reporting would be way better if the reporting were not on the “honor system.”  Without going on a security and privacy rampage, it would be better if this idea were implemented using some existing model for people to do mobile-computing voter-stuff, in a way that is not trivial to abuse, unlike the honor system.

Now, back to the good news we mentioned previously: there is an existing model we could use to limit the opportunity for abuse.  You see, many U.S. voters, in a growing number of States, already have the ability to sit in a café and use their smart phone and a web site to identify themselves sufficiently to see their full voter record, and in some cases even update part of that voter record.

So, the idea is: why not extend that with a little extra record keeping of when a voter reports that they have arrived at the polls, and when they said they were done? In fact, it need not even be an extension of existing online voter services, and could be done in places that are currently without online voter services altogether.  It could even be the first online voter service in those places.

The key here is that voters “sufficiently identify themselves” through some existing process, and that identification has to be based on an existing voter record.  In complex online voter services (like paperless online voter registration), that involves a 3-way real-time connection between the voter’s digital device, the front-end web server that it talks to, and a privileged and controlled connection from the front-end to obtain specific voter data in the back-end.  But in a service like this, it could be even simpler, with a system that’s based on a copy of the voter record data; indeed, just the part that the voter needs to use to “identify themselves sufficiently.”
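
As a sketch only (the field names, identifiers, and logic below are ours for illustration, not the actual TrustTheVote design), a standalone wait-time service working from a copy of a few voter-record fields might look something like this:

    from datetime import datetime, timezone

    # Hypothetical, illustrative sketch of a standalone wait-time reporter.
    # The voter-record "copy" holds only the fields needed to sufficiently
    # identify a voter; field names here are assumptions, not a real schema.
    VOTER_RECORD_COPY = {
        "AK-000123": {"last_name": "DOE", "birth_year": 1980, "precinct": "P-042"},
    }

    check_ins = {}   # voter_id -> arrival timestamp
    check_outs = {}  # voter_id -> done timestamp

    def identify(voter_id, last_name, birth_year):
        """Match supplied details against the local copy of the voter record."""
        rec = VOTER_RECORD_COPY.get(voter_id)
        return (rec is not None
                and rec["last_name"] == last_name.upper()
                and rec["birth_year"] == birth_year)

    def report_arrival(voter_id, last_name, birth_year):
        """Record 'I just arrived at my polling place' for a known voter, once."""
        if identify(voter_id, last_name, birth_year) and voter_id not in check_ins:
            check_ins[voter_id] = datetime.now(timezone.utc)
            return True
        return False  # unknown voter or duplicate report: ignored, limiting abuse

    def report_done(voter_id):
        """Record 'I just finished voting' for a voter who reported arriving."""
        if voter_id in check_ins and voter_id not in check_outs:
            check_outs[voter_id] = datetime.now(timezone.utc)

    def average_wait_minutes(precinct):
        """Average arrival-to-done time for voters assigned to one precinct."""
        waits = [(check_outs[v] - check_ins[v]).total_seconds() / 60
                 for v in check_outs
                 if VOTER_RECORD_COPY[v]["precinct"] == precinct]
        return sum(waits) / len(waits) if waits else None

Any real deployment would of course sit behind the same kinds of controls as existing online voter services; the point here is only that a copy of a few voter-record fields is enough to keep the reporting off the honor system.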

Well, let’s not get ahead of ourselves.  The fact is, State and/or local elections officials generally manage the voter database.  And our Stakeholders inform us it’s still likely these jurisdictions would want to operate this service in order to retain control of the data, and to control the ways and means of “sufficient identity” to be consistent with election laws, current usage practices, and other factors.  On the other hand, a polling place traffic monitor service can be a completely standalone system – a better solution we think, and more likely to be tried by everyone.

OK, that’s enough for the reasonably controlled and accurate crowd-sourced reporting of wait times. What about the benefits from it – the visibility on wait times?  As is almost always the case in transparent, open government computing these days, there are two parallel answers.

The first answer is that the same system that voters report into, could also provide the aggregated information to the public.  For example, using a web app, one could type in their street address (or get some help in selecting it, akin to our Digital Poll Book), and see the wait time info for a given precinct.  They could also view a list of the top-5 shortest current wait times and bottom-5 longest wait times of the precincts in their county, and see where their precinct sits in that ranking.  They could also study graphs of moving averages of wait times – well, you can ideate for yourself.  It’s really a question of what kind of information regular voters would actually value, and that local election officials would want to show.

The second answer is that this system must provide a web services API so that “other systems” can query this wait-time-reporting service.  These other systems should be able to get any slice of the raw data, or the whole thing, up to the minute.  Then they could do whatever visualization, reporting, or other services the clever people operating those other systems think up.
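
Purely as an illustration, a consumer of such an API might pull a slice of the raw data and rank precincts itself; the endpoint and parameter names below are invented for this sketch, not a published interface:

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    # Hypothetical base URL and query parameters, invented for illustration only.
    BASE = "https://pollmon.example.gov/api/v1"

    def wait_time_reports(county, since_iso):
        """Fetch raw wait-time reports for one county since a given UTC time."""
        url = f"{BASE}/wait-times?" + urlencode({"county": county, "since": since_iso})
        with urlopen(url) as resp:
            return json.load(resp)

    def longest_waits(reports, n=5):
        """Rank precincts by current reported wait, longest first."""
        return sorted(reports, key=lambda r: r["current_wait_minutes"], reverse=True)[:n]

    # Example use by a news site or phone app (against a real deployment):
    #   reports = wait_time_reports("Example County", "2014-11-04T09:00:00Z")
    #   print(longest_waits(reports))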

For me, I’d like an app on my phone that pings like my calendar reminders, one that I set to ping me after 9am (no voting without adequate caffeine!) but before 3pm (high school lets out and street traffic becomes a sh*t show ;-)), but only when the waiting time is <10 minutes.  I’d also like something that tells me if/when turnout in my precinct (or my county, or some geographic slice) tips over 50% of non-absentee voters.  And you can imagine others.  But the main point is that we do not expect our State or local election officials to deliver that to us.  We do hope that they can deliver the basics, including that API, so that others can do cool stuff with the data.

Actually, it’s an important division of labor.

Government organizations have the data and need to “get the data out” both in raw form via an API, and in some form useful for individual voters’ practical needs on Election Day.  Then other organizations or individuals can use that API with their different vision and innovation to put that data to a range of additional good uses.  That's our view.

So, in our situation at the TrustTheVote Project, it’s actually really possible.  We already have the pieces: [1] the whole technology framework for online voter services, based on existing legacy databases; [2] the web and mobile computing technology framework with web services and APIs; [3] existing voter services that are worked examples of how to use these frameworks; and [4] some leading election officials who are already committed to using all these pieces, in real life, to help real voters.  This “voting wait-time tracker” system we call PollMon is actually one of the simplest examples of this type of computing.

We’re ready to build one.  And we told the Knight News Challenge so.  We say, let’s do this.  Wanna help?  Let us know.  We’ve already had some rockin’ good ideas and some important suggestions.

GAM | out

Crowd Sourcing Polling Place Wait Times

Long lines at the polling place are becoming a thorn in our democracy. We realized a few months ago that our elections technology framework’s data layer could provide information that, when combined with community-based information gathering, might lessen the discomfort of that thorn.  Actually, that realization happened while hearing friends extol the virtues of Waze.  Simply enough, the idea was crowd-sourcing wait information to at least gain some insight on how busy a polling place might be at the time one wants to go cast their ballot.

Well, to be sure, lots of people are noodling around lots of good ideas, and there is certainly no shortage of discussion on the topic of polling place performance.  And we’re all aware that the President has taken issue with it and, after a couple of mentions in speeches, created the Bauer-Ginsberg Commission.  So, it seems reasonable to assume this idea of engaging some self-reporting isn’t entirely novel.

After all, it’s kewl to imagine being able to tell – in real time – what the current wait time at the polling place is, so a voter can avoid the crowds, or a news organization can track the hot spots of long lines.  We do some “ideating” below, but first I offer three observations from our noodling:

  • It really is a good idea; but
  • There’s a large lemon in it; yet
  • We have the recipe for some decent lemonade.

Here’s the Ideation Part

Wouldn’t it be great if everybody could use an app on their smarty phone to say, “Hi All, it’s me, I just arrived at my polling place, the line looks a bit long,” and then later, “Me again, OK, just finished voting, and geesh, like 90 minutes from start to finish… not so good,” or “Me again, I’m bailing.  Need to get to the airport.”

And wouldn’t it be great if all that input from every voter was gathered in the cloud somehow, so I could look-up my polling place, see the wait time, the trend line of wait times, the percentage of my precinct’s non-absentee voters who already voted, and other helpful stuff?  And wouldn’t it be interesting if the news media could show a real time view across a whole county or State?

Well, if you’re reading this, I bet you agree, “Yes, yes it would."  Sure.  Except for one thing.  To be really useful it would have to be accurate.  And if there is a question about accuracy (ah shoot, ya know where this is going, don-cha?)…  Yes, there is always that Grinch called “abuse.”

Sigh. We know from recent big elections that apparently, partisan organizations are sometimes willing to spend lots of money on billboard ads, spam campaigns, robo-calls, and so on, to actually try to discourage people from going to the polls, within targeted locales and/or demographics. So, we could expect this great idea, in some cases, to fall afoul of similar abuse.  And that’s the fat lemon.

But, please read on.

Now, we can imagine some frequent readers spinning up to accuse us of wanting everything to be perfectly secure, of letting the best be the enemy of the good, and noting that nothing will ever be accomplished if first every objection must be overcome. On other days, they might be right, but not so much today.

We don’t believe this polling place traffic monitoring service idea requires the invention of some new security, or integrity, or privacy stuff.  On the other hand, relying on the honor system is probably not right either.  Instead, we think that in real life something like this would have a much better chance of launch and sustained benefit if it were based on some existing model of voters doing mobile computing in a responsible way that’s not trivial to abuse, unlike the honor system.

And that leads us to the good news – you see, we have such an existing model, in real life. That’s the new ingredient, along with that lemon above, and a little innovative sugar, for the lemonade that I mentioned.

Stay tuned for Part 2, and while waiting you might glance at this.

For (Digital) Poll Books -- Custody Matters!

Today, I am presenting at the annual Elections Verification Conference in Atlanta, GA, and my panel is discussing the good, the bad, and the ugly about the digital poll book (often referred to as the “e-pollbook”).  For our casual readers, the digital poll book or “DPB” is—as you might assume—a digital relative of the paper poll book… that pile of print-outs containing the names of registered voters for a given precinct wherein they are registered to vote. For our domain-savvy readers, the issues to be discussed today are on the application, sometimes overloaded application, of DPBs and their related issues of reliability, security, and verifiability.  So as I head into this, I wanted to echo some thoughts here about DPBs as we are addressing them at the TrustTheVote Project.

[Image: OSDV_pollbook_100709-1]

We've been hearing much lately about State and local election officials' appetite (or infatuation) for digital poll books.  We've been discussing various models and requirements (or objectives), while developing the core of the TrustTheVote Digital Poll Book.  But in several of these discussions, we’ve noticed that only two out of three basic purposes of poll books of any type (paper or digital, online or offline) seem to be well understood.  And we think the gap shows why physical custody is so important—especially so for digital poll books.

The first two obvious purposes of a poll book are to [1] check in a voter as a prerequisite to obtaining a ballot, and [2] to prevent a voter from having a second go at checking-in and obtaining a ballot.  That's fine for meeting the "Eligibility" and "Non-duplication" requirements for in-person voting.

But then there is the increasingly popular absentee voting, where the role of poll books seems less well understood.  In our humble opinion, those in-person polling-place poll books are also critical for absentee and provisional voting.  Bear in mind, those "delayed-cast" ballots can't be evaluated until after the post-election poll-book-intake process is complete.

To explain why, let's consider one fairly typical approach to absentee evaluation.  The poll book intake process results in an update to the voter record of every voter who voted in person.  Then, the voter record system is used as one part of absentee and provisional ballot processing.  Before each ballot may be separated from its affidavit, the reviewer must check the voter identity on the affidavit, and then find the corresponding voter record.  If the voter record indicates that the voter cast their ballot in person, then the absentee or provisional ballot must not be counted.
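
Here is a minimal sketch of that evaluation step; the names and structures below are ours for illustration only, not actual election-system code:

    # Illustrative sketch of absentee/provisional evaluation after poll book intake.
    # Names are invented for this example, not actual TrustTheVote Project code.

    def intake_poll_books(voter_records, poll_book_entries):
        """Mark every voter who checked in and voted in person on Election Day."""
        for voter_id in poll_book_entries:
            voter_records[voter_id]["voted_in_person"] = True

    def evaluate_delayed_ballot(voter_records, affidavit):
        """Decide whether an absentee/provisional ballot may be separated and counted."""
        record = voter_records.get(affidavit["voter_id"])
        if record is None:
            return "reject: no matching voter record"
        if record.get("voted_in_person"):
            return "reject: voter already cast a ballot in person"
        if record.get("delayed_ballot_accepted"):
            return "reject: a delayed ballot was already accepted for this voter"
        record["delayed_ballot_accepted"] = True
        return "accept: separate ballot from affidavit and count it"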

So far, that's a story about poll books that should be fairly well understood, but there is an interesting twist when it comes to digital poll books (DPBs).

The general principle for DPB operation is that it should follow the process used with paper poll books (though other useful features may be added).  With paper poll books, both the medium (paper) and the message (who voted) are inseparable, and remain in the custody of election staff (LEOs and volunteers) throughout the entire life cycle of the poll book.

With the DPB, however, things are trickier. The medium (e.g., a tablet computer) and the message (the data that's managed by the tablet, and that represents who voted) can be separated, although they should not be.

Why not? Well, we can hope that the medium remains in the appropriate physical custody, just as paper poll books do. But if the message (the data) leaves the tablet, and/or becomes accessible to others, then we have potential problems with accuracy of the message.  It's essential that the DPB data remain under the control of election staff, and that the data gathered during the DPB intake process is exactly the data that election staff recorded in the polling place.  Otherwise, double voting may be possible, or some valid absentee or provisional ballots may be erroneously rejected.  Similarly, the poll book data used in the polling place must be exactly as previously prepared, or legitimate voters might be barred.
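
To illustrate one possible ingredient (our sketch of a general technique, not a description of any particular DPB product): election officials could seal the poll book data with a keyed hash before deployment, and verify it again at intake, so that any alteration in between is detectable.

    import hashlib, hmac

    # Illustrative only: an HMAC over the poll book data, keyed with a secret held
    # by election officials, lets staff confirm that the data loaded before the
    # polls open and the data returned at intake are exactly what was prepared
    # and recorded. This is a sketch, not a complete design; key management,
    # device custody, and procedures matter just as much.

    def seal(poll_book_bytes: bytes, official_key: bytes) -> str:
        return hmac.new(official_key, poll_book_bytes, hashlib.sha256).hexdigest()

    def verify(poll_book_bytes: bytes, official_key: bytes, expected_tag: str) -> bool:
        return hmac.compare_digest(seal(poll_book_bytes, official_key), expected_tag)

    # Before deployment: officials seal the prepared poll book data.
    key = b"held-by-election-officials-only"   # placeholder for a properly managed secret
    prepared = b'{"precinct": "P-042", "voters": ["AK-000123", "AK-000456"]}'
    tag = seal(prepared, key)

    # At intake: officials verify that what came back is what the poll workers
    # actually recorded, and was not altered in transit or in any cloud hop.
    returned = prepared  # in a real election, this is the data read off the device
    assert verify(returned, key, tag)

Of course, the cryptography is the easy part; the custody of the key and the device, and the procedures around them, are what actually make the data trustworthy.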

That's why digital poll books must be carefully designed for use by election staff in a way that doesn't endanger the integrity of the data.  And this is an example of the devil in the details that's so common for innovative election technology.

Those devilish details derail some nifty ideas, like one we heard of recently: a simple and inexpensive iPad app that provides the digital poll book UI based on poll book data downloaded (via 4G wireless network) from “cloud storage” where an election official previously put it in a simple CSV file; and where the end-of-day poll book data was put back into the cloud storage for later download by election officials.

Marvelous simplicity, right?  Oh heck, I'm sure some grant-funded project could build that right away.  But it turns out that is wholly unacceptable in terms of chain of custody of the data that accurate vote counts depend on.  You wouldn't put the actual vote data in the cloud that way, and poll book data is no less critical to election integrity.

A Side Note:  This is also an example of the challenge we often face from well-intentioned innovators of the digital democracy movement who insist that we’re making a mountain out of a molehill in our efforts.  They argue that this stuff is way easier and ripe for all of the “kewl” digital innovations at our fingertips today.  Sure, there are plenty of very well designed innovations and combinations of ubiquitous technology that have driven the social web and now the emerging utility web.  And we’re leveraging and designing around elements that make sense here—for instance, the powerful new touch interfaces driving today’s mobile digital devices.  But there is far more to it than a sexy interface with a 4G connection.  Oops, I digress to a tangential gripe.

This nifty example of well-intentioned innovation illustrates why the majority of technology work in a digital poll book solution is actually in [1] the data integration (to and from the voter record system); [2] the data management (to and from each individual digital poll book), and [3] the data integrity (maintaining the same control present in paper poll books).

Without a doubt, the voter's user experience, as well as the election poll worker or official’s user experience, is very important (note pic above)—and we're gathering plenty of requirements and feedback based on our current work.  But before the TTV Digital Poll Book is fully baked, we need to do equal justice to those devilish details, in ways that meet the varying requirements of various States and localities.

Thoughts? Your ball (er, ballot?) GAM | out

The 2013 Annual Elections Verification Conference Opens Tonight

If it's Wednesday, 13 March, it must be Atlanta.  And that means the opening evening reception for the Elections Verification Network's 2013 Annual Conference.  We're high on this gathering of elections officials, experts, academicians, and advocates because it represents a unique interdisciplinary collaboration of technologists, policy wonks, legal experts, and even politicians, all with a common goal: trustworthy elections. The OSDV Foundation is proud to be a major sponsor of this event.  We do so because it is precisely these kinds of forums where discussions about innovation in HOW America votes take place, and it represents a rich opportunity for collaboration, debate, education, and sharing.  We always learn much and share our own research and development efforts as directed by our stakeholders -- those State and local elections officials who are the beneficiaries of our charitable work to bring increased accuracy, transparency, verification, and security (i.e., the 4 pillars of trustworthiness) to elections technology reform through education, research, and development for elections technology innovation.

Below are my opening remarks, to be delivered this evening or tomorrow morning at the pleasure of the Planning Committee, depending on how they slot the major sponsors' opportunities to address the attendees.  There are 3 points we wanted to get across in the opening remarks: [1] why we support the EVN; [2] why there is a growing energy around increased election verification efforts; and [3] how the EVN can drive that movement forward.

Greetings Attendees!

On behalf of the EVN Planning Committee and the Open Source Digital Voting Foundation I want to welcome everyone to the 2013 Elections Verification Network Annual Conference.  As a major conference supporter, the Planning Committee asked if I, on behalf of the OSDV Foundation, would take 3 minutes to share 3 things with you:

  • 1st, why the Foundation decided to help underwrite this Conference;
  • 2nd, why we believe there is a growing energy and excitement around election verification; and
  • 3rd, how the EVN can bring significant value to this growing movement

So, we decided to make a major commitment to underwriting and participating in this conference for two reasons:

  1. We want to strengthen the work of this diverse group of stakeholders and do all that we can to fortify this gathering to make it the premier event of its kind; and
  2. The work of the EVN is vital to our own mission because there are 4 pillars to trustworthy elections: Accuracy, Transparency, Verification, and Security, and the goals and objectives of these four elements require enormous input from all stakeholders.  The time to raise awareness, increase visibility, and catalyze participation is now, more than ever.  Which leads to my point about the movement.

We believe the new energy and excitement being felt around election verification is due primarily to 4 developments, which, when viewed in the aggregate, illustrate an emerging movement.  Let’s consider them quickly:

  1. First, we’re witnessing an increasing number of elections officials considering “forklift upgrades” of their elections systems, which are driving public-government partnerships to explore and ideate on real innovation – the Travis County STAR-Vote Project and LA County’s VSAP come to mind as two showcase examples, which are, in turn, catalyzing downstream activities in smaller jurisdictions;
  2. The FOCE conference in CA, backed by the James Irvine Foundation was a public coming out of sorts to convene technologists, policy experts, and advocates in a collaborative fashion;
  3. The recent NIST Conferences have also raised the profile as a convener of all stakeholders in an interdisciplinary fashion; and finally,
  4. The President’s recent SOTU speech and the resulting Bauer-Ginsberg Commission arguably will provide the highest level of visibility to date on the topic of improving access to voting.  And this plays into EVN’s goals and objectives for elections verification.  You see, while on its face the visible driver is fair access to the ballot, the underlying aspect soon to become visible is the reliability, security, and verifiability of the processes that make fair access possible.  And that leads to my final point this morning:

The EVN can bring significant value to this increased energy, excitement, and resulting movement if we can catalyze a cross-pollination of ideas and rapidly increase awareness across the country.  In fact, we spend lots of time talking amongst ourselves.  It’s time to spread the word.  This is critical because, while elections are highly decentralized, there are common principles that must be woven into the fabric of every process in every jurisdiction.  That said, we think spreading the word requires 3 objectives:

  1. Maintaining intellectual honesty when discussing the complicated cocktail of technology, policy, and politics;
  2. Sustaining a balanced approach of guarded optimism with an embracing of the potential for innovation; and
  3. Encouraging a breadth of problem awareness, possible solutions, and pragmatism in their application, because one size will never fit all.

So, welcome again, and let’s make the 2013 EVN Conference a change agent for raising awareness, increasing knowledge, and catalyzing a nationwide movement to adopt the agenda of elections verification.

Thanks again, and best wishes for a productive couple of days.

Do Trade Secrets Hinder Verifiable Elections? (Duh)

Slate Magazine posted an article this week which, in sum and substance, suggests that trade secret law makes it impossible to independently verify that voting machines are working correctly.  In short, we say, "Really, and is this a recent revelation?"  Of course, those who have followed the TrustTheVote Project know that we've been suggesting this in so many words for years.  I appreciate that author David Levine refers to elections technology as "critical infrastructure."  We've been suggesting the concept of "critical democracy infrastructure" for years.

To be sure, I'm gratified to see this article appear, particularly as we head to what appears to be the closest presidential election since 2000.  The article is totally worth a read, but here is an excerpt worth highlighting from Levine's essay:

The risk of the theft (known in trade secret parlance as misappropriation) of trade secrets—generally defined as information that derives economic value from not being known by competitors, like the formula for Coca-Cola—is a serious issue. But should the “special sauce” found in voting machines really be treated the same way as Coca-Cola’s recipe? Do we want the source code that tells the machine how to register, count, and tabulate votes to be a trade secret such that the public cannot verify that an election has been conducted accurately and fairly without resorting to (ironically) paper verification? Can we trust the private vendors when they assure us that the votes will be assigned to the right candidate and won’t be double-counted or simply disappear, and that the machines can’t be hacked?

Well, we all know (as he concludes) that all of the above have either been demonstrated to be a risk or have actually transpired.  The challenge is that the otherwise legitimate use of trade secret law ensures that the public has no way to independently verify that voting machinery is properly functioning, as was discussed in this Scientific American article from last January (also cited by Levine).

Of course, what Levine is apparently not aware of (probably our bad) is that there is an alternative approach on the horizon, regardless of whether the government ever determines a way to "change the rules" for commercial vendors of proprietary voting technology with regard to ensuring independent verifiability.

As a recovering IP lawyer, I'll add one more thing we've discussed within the TrustTheVote Project and the Foundation for years: this is a reason that patents -- including business method patents -- are arguably helpful.  Patents are about disclosure and publication; trade secrets are, by definition, not.  Of course, a patent alone would not be sufficient, because within the intricacies of a patent prosecution there is an allowance that requires only partial disclosure of software source code.  That "partial disclosure" must meet a test of sufficiency for one "reasonably skilled in the art" to "independently produce the subject matter of the invention."  And therein lies the wonderfully mushy ground on which to argue a host of issues if put to the test.  But ironically, the intention of partial code disclosure is to protect trade secrets while still facilitating a patent prosecution.

That aside, I also note that in the face of all the nonsense floating about in the blogosphere and mainstream media -- whether charges that Romney's ownership interest in voting machinery companies is a pathway to steal an election, or suggestions that a Soros-connected, Spanish-based voting technology company is conspiring to deliver tampered tallies -- Levine's article is a breath of fresh air deserving of the attention being ridiculously lavished on these latest urban myths.

Strap in... T-12 days.  I fear a nail-biter from all viewpoints.

GAM|out

Comment

Comment

Movement to Bring Open Source to Government Being Reorganized

Greetings- Just a quick post to suggest an interesting report out this afternoon on the TechPresident blog.  The move to consolidate the efforts of Civic Commons (home of Open311.org) and Code for America (CfA) -- notwithstanding the likely trigger being Civic Commons' leader, Nick Grossman, moving on -- actually makes sense to us.  Jennifer Pahlka's write-up for CfA is here.

Recently in a presentation, I was asked where our work fits into the whole Gov 2.0 movement.  It seems to us that we are probably a foundational catalyst to the movement; related, but only tangentially.  To be sure, we share principles of accuracy, transparency, verification, and security in government information (ours being elections information).  But Gov 2.0 (and its thought leaders such as CfA) is a considerably different effort from ours at the TrustTheVote Project.  That's mainly because the backbone of the Civic Commons, Open311.org, and CfA efforts is Web 2.0 technology (read: the social web and related mash-up tools).  There is nothing wrong with that; in fact, it's downright essential for transparency.

But to keep the apples in their crate and the oranges elsewhere, our work is about a far heavier lifting exercise.  Rather than liberating legacy government data stores to deliver enlightened public information sites, or to shed sunlight on government operations, we're building an entirely new open source elections technology stack from the OS kernel up through the app layer, with particular emphasis on an open standards common data format (more news on that in coming posts).

Ours is about serious fault-tolerant software architecture, design, and engineering: components built in C++ and Objective-C, dropping down to the machine level -- potentially as far as firmware if necessary -- while at the app layer we use higher-level programming tools as well, including frameworks like Rails and UX/UI delivery vehicles like HTML5 and AJAX (for browser-based or iOS5-based applications).

And that point is the segue to my closing comment: the Gov 2.0 movement is smartly delivering government information via the web, the social web in particular.  That's huge.  By contrast, remember that a good portion of our work is focused on purpose-built, application-specific devices like optical scanners to "read" ballots, devices to mark a ballot for printing and processing, or mobile tablets to serve as digital poll books.  Sure, the web is involved in some voter-facing services in our framework, like voter registration.  But unlike the Gov 2.0 effort, we have no plans to leverage the web or Internet in general for anything (save blank ballot delivery or voter registration updates).

So by contrast, we're in the rough, while Code for America is on the putting green.  And as such, you should have a look at the TechPresident article today. Cheers GAM|out

Comment

Comment

At the Risk of Running off the Rails

So, we have a phrase we like to use around here, borrowed from the legal academic world and used to describe conduct when analyzing a nuance of negligence in tort law: "frolic and detour."  I am taking a bit of a detour and frolicking in an increasingly noisy element of explaining the complexity of our work here.  (The detour comes from the fact that as "Development Officer" my charge is ensuring the Foundation and projects are financed, backed, supported, and succeed in adoption.  The frolic is in the form of commentary below about software development methodologies, although I am not currently engaged in or responsible for technical development outside of my contributions to UX/UI design.)  Yet, I won't attempt to deny that this post is also a bit of promotion for our stakeholders -- elections IT officials who expect us to address their needs for formal requirements, specifications, benchmarks, and certification, while embracing the agility and speed of modern development methodologies.

This post was catalyzed by chit-chat at dinner last evening with an energetic technical talent who is jacked-up about the notion of elections technology being an open source infrastructure.  Frankly, in 5 years we haven't met anyone who wasn't jacked-up about our cause, and their energy is typically around "damn, we can do this quick; let's git 'er done!"  But it is about at this point where the discussion always seems to get a bit sideways.  Let me explain.

I guess I am exposing a bit of old school here, but having had formal training in computer systems science and engineering (years ago), I believe data modeling -- especially for database-backed enterprise apps -- is an absolute priority.  The stuff of elections systems is serious technology, requiring a significant degree of fault tolerance, integrity and verification assurance, and, perhaps most important, a sound data model.  And modeling takes time and requires documentation, both of which are nearly antithetical in today's pop culture of agile development.

Bear in mind, the TTV Project embraces agile methods for UX/UI development efforts. And there are a number of components in the TTV elections technology framework that do not require extensive up-front data modeling and can be developed purely in an iterative environment.

However, we claim that data modeling is critical for certain enterprise-grade elections applications because (as many seasoned architects have observed): [a] the data itself has meaning and value outside of the app that manipulates it, and [b] scalability requires a good DB design, because you cannot just add scalability in later.  The data model or DB design defines the structure of the database and the relationships between the data sets; it is, in essence, the foundation on which the application(s) are built.  A solid DB design is essential to achieve a scalable application.  Which leads to my lingering question:  How do agile development shops design a database?

I've heard the "Well, we start with a story..." approach.  And when I ask those whom I really respect as enterprise software architects with real DB design chops, who also respect and embrace agile methodologies, they tend to express reservations about the agile mindset being boorishly applied to truly scalable, enterprise-grade relational DB design -- the kind that results in a well-performing application and sound data integrity.

Friends, I have no intention of hating on agile principles of lightweight development methods -- they have an important role in today's application software development space and an important role here at the Foundation, but at the same time, I want to try to explain why we cannot simply just "bang out" new elections apps for ballot marking, tabulation, or ballot design and generation in a series of sprints and scrums.

First, in all candor, I fear this confusion rests in the reality that fewer and fewer developers today have had a complete computer science education, and cannot really claim to be disciplined software engineers or architects.  Many (not all) have just "hacked" with, and taught themselves, development tools because they built a web site or implemented a digital shopping bag for a friend (much like the well-intentioned developer my wife and I met last evening).

Add in the fact that the formality and discipline of compiled code has given way to the rapid prototyping benefits of interpreted code.  And in the process of this new, modern training in software development (almost exclusively for the sandbox of the web browser as the UX/UI vehicle), what has been forgotten is that data modeling exists not because it creates overhead and delays, but because it removes such impediments.

Look at this another way.  I like to use building analogies -- perhaps because I began my collegiate studies long ago in architectural engineering before realizing that computer graphics would replace drafting.  There is a reason we spend weeks, sometimes months, driving by large holes in the ground with towers of re-bar, forms, and concrete pouring, without any clue of what really will stand there once finished.  And yet, later, as the skyscraper takes form, the speed with which it comes together seems to accelerate almost weekly.  Without that foundation carefully laid, the building cannot stand for any extended period of time, let alone bear the dynamic and static loads of its appointments, systems, and occupants.  So too is this the case with complex, highly scalable, fault-tolerant enterprise software -- without the foundation of a solid data model, the application(s) will never be sustainable.

I admit that I have been out of production-grade software development (i.e., in the trenches coding, compiling, linking, loading, dealing with lint, and running in debug mode) for years, but I can still climb on the bike and turn the pedals.  The fact is, a data flow and a data model could not be more different, and the former cannot exist without the latter.  It is well understood, and has been demonstrated many times, that one cannot create a data flow out of nothing: there has to be a base model as a foundation for one or more data flows, each mapping to its application.  Yet in our discussion, punctuated by a really nice wine and great food, this developer seemed to want to dismiss modeling as something that can be done later... perhaps like refactoring (!?)

I am beginning to believe this fixation of modern developers on "rapid" non-data-model development is misguided, if not dangerous for its latent, time-shifted costs.

Recently, a colleague at another company was involved with the development of a system where no time whatsoever was spent on data model design.  Indeed, the screens started appearing in record time.  The UX/UI was far from complete, but usable.  And the team was cheered for having achieved great "savings" in the development process.  However, when it came time to expand and extend the app with additional requirements, the developers waffled and explained they would have to recode the app in order to meet the new process requirements.  The data was unchanged, but the processes were evolving.  The balance of the project ground to a halt amid the dismissal of the first team, arguments about why up-front requirements planning should have been done, and the scramble to figure out whom to hire to solve it.

I read somewhere of another development project where the work was getting done in 2-week cycles.  They were about 4 cycles away from finishing when a task called "concurrency" appeared on the tracker schedule for the penultimate cycle.  The project subsequently imploded because all of the code had to be refactored (a core entity was actually determined to be two entities).  Turns out that no up-front modeling led to this sequence of events, but unbelievably, the (agile) development firm working on the project spun this as a "positive outcome"; that is, they explained, "Hey, it's a good thing we caught this a month before go-live."  Really?  Why wasn't that caught before that pungent smell of freshly cut code started wafting through the lab?

Spin doctoring notwithstanding, the scary thing to me is that performance and concurrency problems caused by a failure to understand the data are being caught far too late in the Agile development process, which makes it difficult if not impossible to make real improvements.  In fact, I fear that many agile developers have the misguided principle that all data models should be:

create table DATA
 (key INTEGER,
 stuff BLOB);

Actually, we shouldn't joke about this.  That idea comes from a scary reality: a DBA (database administrator) friend tells of a development team he is interacting with on an outsourced state I.T. project that has decided to migrate a legacy non-Oracle application to Oracle using precisely this approach.  Data that had been stored as records in old ISAM-type files will be stored in Oracle as byte sequences in BLOBs, with an added generated surrogate primary key.  When he asked what the point of that approach was, no one at the development shop could give him a reasonable answer other than "in the time frame we have, it works."  It raises the question: what do you call an Oracle database where all the data in it is invisible to Oracle itself and cannot be accessed and manipulated directly using SQL?  Or said differently, would you call a set of numbered binary records a "database," or just "a collection of numbered binary records"?
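
By way of contrast, here is a minimal, purely hypothetical sketch -- table and column names are ours, illustrative only, not anyone's spec -- of what even a modest amount of up-front modeling buys: the database can constrain, index, and query the data directly instead of treating it as opaque bytes.

-- A hypothetical relational sketch: data the database itself can understand.
create table precinct (
  precinct_id  INTEGER PRIMARY KEY,
  name         VARCHAR(100) NOT NULL
);

create table voter (
  voter_id     INTEGER PRIMARY KEY,
  precinct_id  INTEGER NOT NULL REFERENCES precinct(precinct_id),
  last_name    VARCHAR(60) NOT NULL,
  first_name   VARCHAR(60) NOT NULL,
  status       VARCHAR(20) NOT NULL
);

create index voter_by_precinct on voter (precinct_id, last_name);

-- A question the blob-in-a-table approach cannot answer without decoding
-- every record in application code:
select count(*) from voter where precinct_id = 42 and status = 'ACTIVE';

Nothing fancy -- but every line of it is something the byte-sequences-in-BLOBs approach surrenders.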

In another example of the challenges of agile development in a database-driven app world, a DBA colleague describes being brought in on an emergency contract basis to an Agile project under development on top of Oracle, to deal with "performance problems" in the database.   Turns out the developers were using Hibernate and apparently relied on it to create their tables on an as-needed basis, simply adding a table or a column in response to incoming user requirements and not worrying about the data model until it crawled out of the code and attacked them.

This sort of approach to app development is what I am beginning to see as "hit and run."  Sure, it has worked so far in the web app world of start-ups: get it up and running as fast as possible, then exit quickly and quietly before they can identify you as triggering the meltdown when scale and performance start to matter.

After chatting with this developer last evening (and listening to many others over recent months lament that we're simply moving too slowly), I am starting to think of Agile development as a methodology of "do anything rather than nothing, regardless of whether it's right."  And this may be to support the perception of rapid progress: "Look, we developed X components/screens/modules in the past week."  Whether any of this code will stand up to production performance environments is to be determined later.

Another Agile principle is incremental development and delivery.  It's easy for a developer to strip out a piece of poorly performing code and replace it with a chunk that offers better or different capabilities.  Unfortunately, you just cannot do this in a database: you cannot throw away old data in old tables and simply create new empty tables.
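
To make that concrete, here is a rough, hypothetical sketch (illustrative names only) of what "replacing" a table actually entails -- the existing rows have to be carried forward, not thrown away:

-- Hypothetical migration sketch: the old structure can be retired only
-- after its data has been moved into the new one.
create table checkin_record_v2 (
  record_id    INTEGER PRIMARY KEY,
  voter_id     INTEGER NOT NULL,
  precinct_id  INTEGER NOT NULL
);

insert into checkin_record_v2 (record_id, voter_id, precinct_id)
  select record_id, voter_id, precinct_id
    from checkin_record;        -- carry the existing rows forward

drop table checkin_record;      -- only once the data has safely moved

In code you can simply discard the old chunk; in the database, the migration is itself part of the work, and it has to be planned.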

The TrustTheVote Project continues to need the kind of talent this person exhibited last evening at dinner.  But her zeal aside (and her obvious passion for the cause of open source in elections), and at the risk of running off the (Ruby) rails here, we simply cannot afford to have these problems happen here.

Agile methodologies will continue to have their place in our work, but we need to be guided by some emerging realities, and appreciate that for as fast as someone wants to crank out a poll book app or a ballot marking device, we cannot afford to short-cut simply for the sake of speed.  Some may accuse me of being a waterfall Luddite in an agile world; however, I believe there has to be some way to mesh these things, even if it means requirements scrums, data modeling sprints, or animated data models.

Cheers GAM|out

Comment

Comment

Temporarily Missing, But Still in Action

Happy "Holidaze"

On the eve of 2012 we so need to check in here and let you know we're still fighting the good fight and have been totally distracted by a bunch of activities.  There is much to catch you up on and we'll start doing that in the ensuing days, but for now we simply wanted to check in and wish everyone a peaceful and prosperous new year.  And of course, we intend that to "prosper" is to enrich yourself in any number of ways -- not simply financially, but intellectually, physically, and spiritually as well... however you choose to do so ;-)

Looking back while looking ahead, as this afternoon before the new year urges us all to do, we are thankful for the great headway we made in 2011 (and we'll have much more to say about those accomplishments separately), and we are energized (and resting up) for the exciting and intense election year ahead.  And that brings me to two thoughts I want to share as we approach the celebration of this New Year's Eve 2011.

1. A Near #FAIL

First, if there was one effort or project that approached "#fail" for us this year, it was our intended work to produce a new open data, open source election night reporting system for Travis County, TX, Orange County, CA, and others.  We were "provisionally chosen" by Travis County pending our ability to shore up a gap in the required funding to complete some jurisdiction-specific capabilities.

We approached prospective backers in addition to our current ones, but unfortunately we could not get everyone on board quickly enough -- we tried to do so on the eve of their budgets being finalized for other 2012 election-year commitments, mostly around voter enfranchisement (more on that in a moment).  We were also short answers to two questions from Travis County, answers that could well have dramatically reduced the remaining funding gap and allowed us to accelerate toward final selection and be ready in time for 2012.

For unexplained reasons, Travis County has fallen silent -- not answering our questions, responding to our inquiries, or even continuing to advance our discussions.  We fear that something has happened in their procurement process and they simply haven't gotten around to the courtesy of letting us know.  This is frustrating because we've been left in a state of purgatory -- really unable to determine where and how to allocate resources until this is resolved.  The buck stops with me (Gregory) on this point, as I should've pushed harder for answers from both sides: Travis on the technical issues and our backers on the funding question.

I say this was a "near #fail" because it clearly is unresolved: we know Orange County, other jurisdictions, and media channels such as the AP remain quite keen on our design, the capabilities for mobile delivery, the open data, and of course the open source alternative to proprietary black-box solutions that are expensive on a total cost of ownership ("TCO") basis.  Moreover, the election night reporting system is a not-insignificant component of our open source elections technology framework, and its design and development will continue.  And perhaps we'll get some clarity on Travis County, close the funding gap, and get that service launched in time for next Fall's election frenzy.  Stay tuned.

So, that is but one of several distractions that allowed this vital blog to sit idle for the last half of summer and all of the Fall.  We'll share more about the other distractions in upcoming posts as we get underway with 2012.  But I have a closing comment about the 2012 election season on this final evening of 2011.

2.  The 2012 Battles on the Front-lines of Democracy Will Start at the Polling Place

Millions of additional Americans will be required to present photo ID when they arrive at the polls in four states next year.  Kansas, Rhode Island, Tennessee and Texas will require voters to prove their identities, bringing to 30 the total number of states that require some form of voter identification, according to the National Conference of State Legislatures.

This is an issue that has reached the boiling point, and we predict it will set off a storm of lawsuits (some are happening already).  According to one side of the argument, it ranks very close to redistricting in terms of its impact on voter enfranchisement.  Opponents also argue that such regulations impose an unfair barrier on those who are less likely to have photo IDs, including the poor and the elderly.  The proponents stand steadfast that the real issue is voter fraud and this is the best way to address it.  Of course, the trouble with that argument is that a five-year U.S. DoJ probe spanning two different administrations found little discernible evidence (53 cases) of widespread voter fraud.  And yet, there are also reasonable arguments suggesting that, voter fraud aside, the elderly, disabled, and poor seem to have no difficulty obtaining ID cards (where required) in order to obtain Medicare, Medicaid, and food stamps.

To be clear: the Foundation has no opinion on the matter of voter ID.  We see arguments on both sides.  Our focus is simply this: any voter identification process must be fair, not burdensome, transparent, and uniformly applied.  We're far more invested in making technology that facilitates friction-free access to the polling place and produces a verifiable, audit-ready, and accountable paper trail for all votes.  We do believe that implementing voter ID as a means to restrict the vote is troublesome... as troublesome as blocking voter ID in order to passively enable those who are not entitled as a matter of citizenship to cast a ballot.

Regardless of how you come down on this issue, we believe it is where the 2012 election season's battles over enfranchising or disenfranchising voters will begin.

And with that, we say, 2012: bring it.  We're ready.  Be there: it's going to be an interesting experience.  Here we go. Cheers Greg

Comment

3 Comments

An Independence Holiday Reflection: IP Reform and Innovation in Elections Technology

On this Independence Day I gave some reflection to the intentions of our founding fathers, and how that relates to our processes of elections and the innovations we should strive for to ensure accuracy, transparency, verification, and security.  And as I thought about this more while gazing out at one of the world’s most precious natural resource treasures and typing this post, it occurred to me that innovation in elections systems is largely around the processes and methods more than any discrete apparatus. That’s when the old recovering IP lawyer in me had an “ah ha” moment.   And that’s what this long-winded post is about—something that actually should matter to you, a reader of this forum about our on-going effort to make elections and voting technology critical democracy infrastructure.

You see, in America, innovation has long been catalyzed by intellectual property law, specifically patents.

And as you probably also know, patent law is going through major reform efforts in Congress as you read this.  Now here is what you may have missed, and what dawned on me while reflecting on this Fourth of July holiday, the efforts of the TrustTheVote Project, and innovations in voting technology: there is a bad ingredient in the current patent reform legislation that threatens not only to undermine the very foundations on which patent law is used to catalyze innovation, but also to undermine some very basic ideals our founding fathers had in mind as this nation was born.  Bear with me while I unravel this for you; I think it will grab your attention.

So it starts with Members of Congress debating patent reform through the America Invents Act (H.R. 1249).  You see, few may be aware of the role that business method patents (BMPs) play in the political process, especially during elections.  BMPs have been used to protect innovations designed to improve the operation of the political process.  And it is not unreasonable to assume that the TrustTheVote Project itself is working on innovations that could well qualify for patent protection, resulting in patents whose ownership we would assign to the general public.  Weakening the protection for such innovations may in turn reduce the motivation for companies and individuals to continue innovating in these technologies.  And it certainly could impact our work as well.  But this is exactly what Section 18 of H.R. 1249, the America Invents Act of 2011, as currently drafted, would likely do.

There is a long history of inventors using BMPs to protect their innovations related to voting systems.  As such systems have developed, from paper voting, to electronic voting, to on-line voting, companies both large and small have continued to innovate, and to protect their new technologies via the patent system, often through the use of BMPs.  Let’s look at just two major areas.

  1. Electronic Voting Systems - it is estimated that between 20% and 30% of American voters now cast their ballots electronically, chiefly via Direct Recording Electronic (DRE) systems. Yet these systems have encountered many problems related to their ability to record votes accurately, verifiably, and securely.  In an effort to remedy these problems (but largely to no demonstrable gain), companies have developed technologies designed to overcome these shortcomings, and have protected these technologies with a series of patents, many of which are classed as BMPs. Organizations with numerous BMPs related to improving electronic voting systems include large companies such as IBM, Accenture and Pitney Bowes, and smaller specialist companies such as Hart InterCivic, Avante International and Smartmatic.
  2. Internet Voting Systems – in DRE systems, voters typically have to be physically present at a polling station in order to cast their ballots.  The next logical progression is for voters to cast their ballots remotely, for example via the Internet.  For reasons repeatedly explained here and elsewhere, this is just not a good idea given today’s “Internet.”  But in any event, such ill-advised efforts require a whole new level of network security in order to ensure that the votes are recorded both accurately and in a verifiable fashion (both being extremely difficult to do, and it’s unclear any system exists, patented or not, that can do so -- but bear with me for the sake of argument).  A search of the patents in this area, however, reveals that companies such as Accenture, Hart InterCivic, Scytl, and Avante have BMPs describing so-called Internet voting.  These BMPs sit alongside their earlier BMPs covering DRE systems, as these companies develop successive generations of voting technology.

In short, companies are continuing to seek patent protection for innovations in this sector, and business methods continue to be a vehicle for doing so.

Section 18 of H.R. 1249, the America Invents Act of 2011, aims to give one special interest—banks—a “get out of jail free” card.  As I read it, the provision does this: if you sue a bank for infringement of a business method patent, the bank can stay the court litigation and take your patent to the USPTO for a special post-grant review (PGR) process.  If the bank loses the first round at the office, it has an automatic appeal to the board within the office.  If it loses again, it has another automatic appeal to the Court of Appeals for the Federal Circuit (CAFC), the sole appeals court for patent cases.  This process takes between 4 and 7 years based on the existing reexamination systems at the office.  And it is special in that the bank can bring in additional forms of prior art not permitted in other reexamination systems.

There is a good reason why the range of prior art that can be used in court to challenge a patent is not available at the office.  A judge, jury, rules of evidence, cross-examination, and other time-tested features of court do not exist at the Patent and Trademark Office.  A patent examiner does not have the experience, procedures, institutional knowledge, or time to ascertain the veracity or fraud of the prior art.  More importantly, examiners do not have the resources to deal with the increased volume of art.  Worse, a Section 18 review can be conducted regardless of whether the patent has already been deemed valid in a prior proceeding.

And on this Independence Day Holiday, it occurs to me this violates separation of powers and should be unconstitutional.

Before I explain how I can envision this impacting what we’re doing, let me state that I will not delve into the debate over BMPs, because it devolves into a religious war, and one in which I, as both a computer scientist and an IP lawyer, have actually shifted viewpoints from one side to the other over the years.  But suffice it to say that there are examples of useful business method patents that would be eliminated by Section 18 of the patent reform legislation winding its way through Congress.  We are all very familiar with one example: SSL.  Indeed, secure sockets layer is covered by two BMPs.  Everyone in the world, each day, touches this patented innovation.  If Section 18 had been law in 1995, then Visa and Mastercard, with their SET proposal, could have stalled Netscape and SSL in the USPTO for many years.  Microsoft and CommerceNet, with SHTTP, could have done the same.  The world would be worse off with competing security protocols. Ecommerce itself may not have taken off at all; at the very least its growth would have been stunted.

It is worth noting that since about the year 2000 the USPTO has employed a "second pair of eyes" process to examine BMP applications twice.  Moreover, given the public acrimony over BMPs, the USPTO is very slow to grant BMPs, and the allowance rate is 20% lower than in other art areas.  And recently the Supreme Court, in its Bilski decision, affirmed the patentability of BMPs.

Yet, in spite of the acrimony and higher threshold to get a BMP, many companies large and small innovate and invest in BMPs.  The top 20 owners of BMPs are Fortune 100 companies and/or respectable startups. Non-practicing entities comprise a very small portion of the ownership pool of BMPs.  And in considering the innovations resident in the open source elections technology framework we’re developing, we too may find ourselves in the middle of the BMP and Section 18 crossfire.

The challenge is that, as a non-profit organization (501(c)(3) status pending), we cannot and do not engage in the political process of legislation or lobbying.  Yet we’re wary of where this is going, and I think you should be too.  You see, policy makers don't often have the time to consume, absorb, and digest the data.  They prefer anecdotes, headline-grabbing stories, one-page summaries, and talking points.

So let me turn to our thinking about BMPs and the impact of Section 18.

As mentioned above, without debating the basis for BMPs we at the TrustTheVote Project have come to accept that they are an essential part of technology IP.  One reason is that the scope for IT innovation far exceeds the scope for inventing new technology, and includes innovation in the use of existing technology for new purposes.  That's been increasingly true for some 20 years, with the scope of the online world coming to encompass so many areas of human activity.  One of the more recent advances is the use of IT innovations for public benefit.  I'll explain that in terms of elections and political activity, but first let me give a general idea and one specific existing example.

In our experience with IT IP, a BMP can be used as a way to make a claim that "X has been used for many things before, but not in the area of Y; here is a way to use X for a particular purpose in the area of Y; this enables a new human activity Z."  Now, I could forgo that claim and limit myself to a claim about a Y-inspired extension of X, which might be a sufficiently significant extension to warrant a patent for a technical innovation; or it might not.

If I limited myself that way, then another party could claim the innovation of using that new method for a particular purpose Z.  So in general, I want to claim both, to protect the right to use X in Y for Z.

Here’s a big idea: "Protect" in the public benefit world means "anyone can do so, not limited by a private or for profit IP holder." That applies whether or not my extension of prior X is sufficiently innovative by itself.

As an example of this idea, let’s return to SSL, the subject of very well known and high quality BMPs.  When SSL was invented, the use of cryptography for communication security was already well established, including the use of digital certificates to establish (a chain of) trust in the identity of parties communicating.  In fact, there were many examples of cryptographic protocols and communication protocols.  So for X, let's say "use of cryptographic protocols and communication protocols together for communication with security properties."

Now, SSL as a protocol may well have been sufficiently innovative to warrant patents of algorithms. But whether or not that was true, SSL was used for several purposes, including a particular kind of communication in which one party trusts a third to vouch for the second party's identity as being sufficiently established for a financial transaction. That's Y.  Z is "digital commerce" meaning financial transactions performed as part of exchange in which one party pays another party for goods and services – including digital goods and digital services.  Without X used for Y, digital commerce wouldn't exist, and many forms of digital services and digital goods simply would not be provided. With X used for Y, Z is enabled for the first time.  And I view Z -- digital commerce -- as a major public benefit, even if it was primarily for private for-profit commercial transactions.

The public benefit is a larger economy with the addition of digital commerce.

So far so good, but let's revisit the value of the BMP.  If it didn't exist, the holders of patents for X could effectively block Z, or insert themselves as intermediaries into every use of X in Y for Z, or X in A for B -- any use. For example, IBM holds many patents on cryptographic protocols.  I don't know if those protocols and patents were sufficiently broad to cover the SSL protocol as an algorithm or apparatus.  But if that were so, and BMPs didn't exist, then IBM could have insisted that it be a party to every digital commerce transaction, only allowing transaction services by parties that made payments to IBM on terms dictated by IBM.  Any other parties would be barred from digital commerce.  Of course, that public benefit may be a matter of opinion on which many people would differ.

In elections and politics, public benefit may be clearer.

For a first example, consider technical innovations for online voter registration. Such innovations might include the use of a "forms wizard" to help people follow complicated rules for filling out voter registration forms; digital means for capturing a signature for the form; digital transmission of the form itself, or its data; and more.  All these techniques have been invented before and used in other areas of human endeavor. Adapting them for use in voter registration is probably not an adaptation that qualifies as an innovation. But if one wants to ensure that the public be able to use IT implementations of online voter registration, a BMP can cover the use of forms wizards (or other X) for online voter registration (Y) to enable a more rapid and more widespread ability of citizens to vote (Z).  Many people would definitely regard that as a public benefit.  The BMP protects that benefit when the BMP holder permits anyone to use the business process, barring a patent holder for X (the specific IT technique) from claiming that online VR implementations infringe their patent.

I don't know who, if anyone, holds a patent relevant to the technology of the types of innovation in online VR that I refer to here.  However, I suspect that many would regard it as a public detriment for citizens to have to pay a for-profit company for the right to use an online VR service; or for local or state governments to have to pay for the privilege of operating such a service.

Other examples lie in the activities around political campaigns to form communities of supporters, organize volunteers, raise money, etc.  The use of social media and other online technology has increased, and I expect it will continue to increase, enabling more citizens to more easily participate in the political process. As in elections technology, such innovation is often the application of established technology for a new purpose.

BMPs can protect the right of political organizations to use such established technology.  I can easily imagine a PAC or other issues-based political organization building a membership organization that includes online interaction with members, including gaining and retaining credit card information for future contributions to the organization, or directly to a candidate or campaign. If I were a member of such an organization, I might expect to get an email about a new set of candidate reviews for candidates in an upcoming election.  I could go to the organization's web site and read up on candidates.  I could choose to make a donation directly to the candidate's campaign, immediately, with a single click of a "give $100" button in the candidate review.

Suppose that there were a private company with a patent on making payment in digital commerce using a similar method.  Without a BMP for the process of a citizen contributing to a campaign as part of a web session with the web site of an issues-based voluntary membership association, that patent holder could insist that it be the sole conduit of such contributions.  I suspect most people would view it as a public detriment to either pay a for-profit company for the privilege of a quick and easy campaign contribution, or use a more cumbersome and error-prone method for free.

Worse, one could imagine selective enforcement of the patent, or selectively preferential licensing agreements, to make the quick and easy contribution method available only to political campaigns that the patent holder favored.

The same selective approach could be applied to any part of the political process.  Back to voter registration, it's possible that a patent holder would choose to license its innovations selectively, only to those local election officials in locales where the majority of unregistered voters are perceived as friendly to the politics of the patent holder.

A selective approach could also be applied to disputes.  For example, a financial transactions company might be able to stop a political campaign from collecting online contributions in a certain manner during the time in which the dispute is resolved. If the time frame stretches long enough, it doesn't matter if the campaign wins the dispute—the election will already be over and the opportunity to raise and use funds will be gone.

And these types of scenarios could fit pretty much any use of social media technology, where a patent holder of a purely technical patent could assert the right to constrain the use of the technique in any field of human activity, including elections or politics.

These examples may be fanciful, and may not be based on a real scenario in which an election-relevant or politics-relevant technology-using process is the subject of a BMP that involves a particular use of a particular underlying technology for enabling or automating the process.  But I believe that the general benefit of BMPs would apply to real cases.

This may be a new idea—organizations with a public-benefit motivation wanting to ensure general use of technology-enabled innovations in electoral or political processes, rather than trying to control or reserve or profit from BMPs.  And it is certainly not what BMPs might have been intended for.  But I believe that BMPs could be used—and for all I know are already being used—for electoral or political processes.  It would be a shame, and a public detriment, if BMPs became less useful, either in general, or less useful in disputes with a particular class of organizations. This might be counter intuitive, but as we see the growth of digital democracy, open government, online activism, and the like, it shouldn't come as a surprise that these new forms of technology-enabled human activity also create new uses for pre-existing IP protections that pre-date the existence of these evolving activities.

Setting aside the efficacy of BMPs and the related religious debates, I bet we can all agree that without BMPs, Goliath—IBM in my perhaps fanciful example above—can block the public, especially the little guy.  Section 18 in the patent bill gives banks a new tool, unique to banks, to stop David from getting his idea to market.  And this troubles me, for it moves us toward that proverbial slippery slope.

At the end of the day, Section 18 of H.R. 1249, the America Invents Act of 2011, is, frankly, akin to a government regime not granting a permit to open a business simply because one is from the wrong caste or religion or political party... and that's not the government regime of this nation, whose independence we celebrate today. Yet it appears some special interests in patent reform may have an otherwise misguided view to the contrary.

Your ball GAM|out

3 Comments

Comment

Help Wanted; The Search is On

Greetings All-
Sorry we've been away from the podium here for a couple of weeks.  We're heads-down on some very exciting projects.  But not nearly as exciting as what I have to announce today.  Let's get right to it.

The time has come.  Some might argue it’s overdue.  Growth of the activities and work here, and the need for speed in advancing the agenda of open source elections technology, triggers today’s announcement:

The OSDV Foundation Leadership Team is growing, and we're officially recruiting for a new Chief Executive Director.

The search is on, and we want your help in locating an absolute “A-player” to lead the next level of growth for the Open Source Digital Voting Foundation.

Wait a minute,” you say. “Wait a minute!  Doesn’t the Foundation already have an Executive Director… or actually like two of them?”  Oh, definitely—you're right, two of them.  John Sebes and myself, co-founders and co-executive directors (as mandated by the Foundation’s by-laws), have been tirelessly leading and managing this 4-year effort since Day 1 with the generous support and advice of our Board.

We also have been managing all aspects of Foundation development (read: funding) and technology work (e.g., the TrustTheVote Project).  And the workload has become overwhelming.  We each now need to focus on our particular domain expertise in order to sustain and accelerate the momentum the TrustTheVote Project is gaining.

So, it is time for both of us to narrow our respective scope of efforts.  For myself, this means focusing on stakeholder community development, public outreach, adoption and deployment, and strategic alliances and backing.  In the commercial world, this might be akin to the kind of role I’ve played in the tech sector for about 1/2 of my career: running marketing and business development.

For John, this means the heavy responsibility for leading the core mission of the non-profit: open source elections technology design and development efforts. This is aligned with his commercial world experience: as an engineering manager and chief technology officer.

What’s left are all of the activities associated with day-to-day operational leadership needed to effectively manage and grow the Foundation.  This includes executive leadership in major fund raising from all sources, accounting, finance, administration, legal affairs, and public relations.  It is, in the commercial world, a CEO role.  In other words, with the growth in activities and work, the leadership team must expand and bring in the right talent to take this to the next level.

We’ve successfully been managing what essentially amounts to nearly a $1.0M  operation; a tiny start-up by commercial comparison, but significant by some non-profit comparisons.  We realize that we must now elevate this to a $7-10M annual operation in order to maintain the momentum we’re generating and be the kind of change agent for public elections integrity and trust according to our Charter.

And we’re experienced enough to appreciate that neither of us is well suited to provide that non-profit leadership and somehow keep doing what we do best.

The details of technology architecture and building the stakeholder community are more than full-time efforts alone.  To be sure, both John and I have managed commercial technology operations greater than $10M per year (but in those cases we had staffing and resources commensurate with the size of the operation).  However, the nuances of a non-profit operation, its methods of funding, and the need for our acquired domain expertise in elections technology flat-out prohibit us from trying to do it all any longer.

So, Here We Go.
We’ve uploaded a position description on the TrustTheVote Wiki.  You will find it here. And there is a companion document that provides some background, here.  We’ve engaged with our Board, an Executive Recruiter, and our advisers to expand the search.

With today’s announcement, we look to you, our backers, supporters, stakeholders, and other interested onlookers to join in the search for our ideal candidate to lead this exciting and important project blending the best in technology innovation with the imperative agenda of “critical democracy infrastructure.”

And it’s a helluva lot of fun working to be the change agent for accuracy, transparency, verification, and security of public elections technology in a digital age.  To be sure, there's a bunch of great stuff going on here: the digital poll book project based on the Apple iPad; the election night reporting system project using open data and web services distribution; work with the federal Election Assistance Commission on component-level certification for the open source Tabulator we're building; and work with the IEEE 1622 standards group on our proposed standard for open election data formats.

Please spread the word; the search is ON.  If you know of an ideal candidate, or even think you might be one yourself, we want to hear from you.  Ping us.  You can also drop a note to "edsearch" sent to our Foundation web site domain.

Onward.
GAM|out

Comment

2 Comments

How Digital Pollbooks Can Ease the Voter ID Challenge

Some of you have heard the rumors and rumblings. Yes, an exciting new project in our open source elections technology framework is in the works.  And yes, it is an important tool for the front lines of democracy: election polling places. We'll have a bunch more to officially say about our digital poll book project shortly.

But first, a thought about how this tool can help the Voter ID challenge.

The Progressive States Network recently posted a call for participation in a teleconference to discuss fighting a rising wave of renewed interest in compulsory photo identification at the Polls.  They note in part:

With a shift of control of state legislatures and governorships across the country taking shape this month, many conservative lawmakers are pushing laws that would require photo identification for all voters at the polls.  While these laws are touted as a catchall way to prevent voter fraud, in reality they only address voter impersonation, an extremely rare form of fraud.  More importantly they will cost states money that could be better spent in these difficult economic times and serve primarily to disenfranchise hundreds of thousands of voters.

Maybe so, maybe so.  But we’ll sidestep that argument for a moment to point out that our newest framework project—the Digital Poll Book—can help address this problem.  And that is but one of several reasons the Digital Poll Book (as envisioned and being designed by the TrustTheVote Project) is a near-imperative piece of election technology—open source, of course!

[Ed Note: watch for a post in the near future to provide a more proper overview of this exciting 2011 project—something we think will easily outshine work in 2009 on voter registration systems and work in 2010 on ballot design and generation.]

So, let's have a look at some concerns people have about Voter ID, and where digital Poll books can help.

Concern #1: It's a bad idea to have to trust poll workers

It's a bad idea to trust poll workers to accurately and honestly perform the check, for each voter, that the ID document they present is valid and that the document contains ID information matching the voter ID information in the poll book.  Erroneous or mendacious poll workers can incorrectly reject valid ID, perform a false negative on the match of ID with poll book records, or just take enough extra time during check-in to intimidate some people and force longer lines at polling places.

Our Response:

  • That's a valid concern—but about the proper performance of ID checks, rather than the ID check itself.
  • Digital poll books can ameliorate these concerns when combined with digital capture of ID.  Here’s how:  Increasingly, states’ driver's licenses and state ID cards are card-reader ready (i.e., they can be swiped through a device to pick up or “read” the vital data encoded on the card).  Such a swipe can be the basis for a digital poll book looking up the matching voter record, without reliance on the poll worker (a minimal, hypothetical sketch of that lookup follows this list).  In other states an even simpler method of voter ID has the same effect—the Board of Elections issues single-purpose voter-ID cards, including a bar code that can be scanned to provide the voter ID information.
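
Here is that sketch -- purely illustrative, with hypothetical table and column names that are not a spec or our actual design.  The point is simply that the value read from the swipe or scan drives the lookup, and the poll worker confirms the result rather than performing the match by hand:

-- Hypothetical pollbook lookup driven by the scanned ID value.
select voter_id, full_name, precinct, status
  from pollbook
 where state_id_number = :scanned_id    -- value read from the card or bar code
   and status = 'ELIGIBLE';             -- i.e., not already checked in

-- If exactly one row comes back, check the voter in and record it.
update pollbook
   set status = 'CHECKED_IN'
 where state_id_number = :scanned_id;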

Concern #2: A Registered Voter may not have a valid State ID

Not every registered voter has valid state ID, and for some people it is a physical or financial hardship to obtain state-verified identification.

Our Response:

  • That may well be true for a small population of people, but the statement assumes that State ID is the only valid voter ID. BoEs can choose to adopt alternatives, for example  BoE-issued voter-ID cards as used in some states today. Sending these to voters can be as easy as current routine BoE-voter interaction, along with sample ballot mail-outs, with no cost or effort to the voter.

Concern #3: The alternative of provisional voting in the absence of valid ID is disenfranchising

If a voter arrives at the Polling Place without valid ID where such is required, then at best they have to vote provisionally—which is potentially disenfranchising given the inconsistencies of counting provisional ballots.

Our Response:

  • It is true that many provisional voters do not have their ballot counted because of errors on, or the legibility of, the provisional affidavit.  However, digital poll books can help by providing a provisional affidavit form helper that collects all of the required information and prints a complete, correct, and legible affidavit for the voter.
  • It is also true that some people believe that provisional votes are often not counted. Notwithstanding the accuracy of claims of uncounted provisional ballots, sunshine is the best remedy for these concerns.  Digital poll books can help by capturing—for subsequent aggregation and publication—accurate information about provisional voters and affidavits, so members of the public can verify whether the number of counted provisional ballots matches the number that should have been counted (a rough sketch of that kind of check follows this list).
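
For illustration only -- the table and column names below are hypothetical, not part of any design -- the kind of published check we have in mind could be as simple as:

-- Hypothetical sketch: affidavits issued vs. provisional ballots counted,
-- per precinct, suitable for publication and public verification.
select precinct,
       count(*)                                        as affidavits_issued,
       sum(case when counted = 'Y' then 1 else 0 end)  as ballots_counted
  from provisional_affidavit
 group by precinct;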

Concern #4: Voter ID requirements are inconsistent with vote-by-mail

Voter ID has little deterrence value for voter impersonation fraud because of the option of voting by mail without voter ID. For voters who might be intimidated by an ID check at a polling place, voter ID shifts participation to vote-by-mail, where voters have additional risk (compared to in-person voting) of not having their vote counted due to errors in preparing vote-by-mail materials.

Our Response:

  • The comparison of voter-ID in person, vs. vote-by-mail without ID, is a valid comparison in general, but varies by State -- both in States' use of vote-by-mail, and in States' methods of identifying or authenticating absentee voters.  In a state with no-fault absentee, permanent absentee, permanent vote-by-mail, and similar practices, it may well be fruitless to impose voter-ID requirements on the minority of participating voters who vote in person.
  • However, other States have more limited and controlled use of absentee voting, with the large majority of voters voting in person.  In those cases, digital poll books can help ameliorate some of the above concerns and help enable voter ID benefits in States where such benefits are sought.

We think the Voter ID issue is thorny.  We also believe people should get involved with this debate, as it's likely to have a real impact on how America votes (and the polling place remains the epicenter of that civic duty).  We further believe that eliminating paper-based poll books, and reducing if not removing the related issues that run with their people-based processes, is an equally important part of this issue.  Our newest elections technology framework project for 2011 is the open source digital poll book.  It's truly exciting, and we envision it being based on some highly desirable, easy-to-use, and insanely great technology.

Stay tuned for a briefing on the project.

GAM|out

2 Comments

Comment

Welcome 2011; The Movement Continues...

Happy New Year Readers! We're so ready for 2011 (and you probably are too).  The work of the TrustTheVote Project in particular, and the Foundation in general, grew steadily this year, as has awareness and knowledge of our efforts to create an open source elections technology framework.  Our mission to make elections technology tantamount to "critical democracy infrastructure" was significantly advanced in 2010 thanks to far too many individuals to mention here (with special appreciation to our Foundation Advisers, including Mitch Kapor and Debra Bryant, as well as the folks at O'Reilly Media).  And we expect this effort to become a movement in 2011.

The Foundation achieved several objectives this year in terms of the heavy lifting to design and develop software components of the elections technology framework, mostly addressing the Ballot Design Studio, Ballot Generator, and Elections Management System.

On tap for 2011 are two major initiatives for which we are securing funding now:

  1. The Digital Poll Book; and
  2. The Tabulator.

Stay tuned for exciting news about both of those major 2011 projects.  We also plan several conference speaking engagements in 2011, starting with the 2011 Overseas Vote Foundation's UOCAVA Summit, which we are proud to be sponsoring again this coming February in Washington, D.C.  And watch for us at GOSCON, OSCON, Gov 2.0, and other important gatherings, as well as election technology-specific events including IEEE and NIST workshops.

We look forward to your continued advice, engagement, and support in 2011.

On behalf of the Core Architecture Team including Pito, Ann, Jeff, John, Alexsey, and Tom, and the Foundation Ops Team including Sarah, Matt, Tom, Bill, Barb, and myself, we wish you a prosperous and healthy 2011.

Happy New Year All!

Comment

Comment

Different Transparencies; Different Realities

While heads-down on year-end activities for the Foundation, I've not had much breathing room to think, reflect, and offer commentary here.  But then I received an email this weekend that caught my attention and somewhat caught me off guard.  It just goes to show that what we often presume to be clear isn't always so. The message went along the lines of,

So here is another example of so-called transparency, which is leading me to conclude it's not all it's cracked up to be.

The writer was referring to the Wikileaks maneuver this past week to publish more content provided to it through downloads of classified documents (in this case, cable transmissions from State Department staff and diplomats) by a Pentagon-based soldier (arguably turned rogue).

The Wikileaks site asserts that "Publishing improves transparency, and this transparency creates a better society for all people."  Maybe so.  Maybe so.

But this kind of transparency is a far-cry from the kind of transparency that the OSDV Foundation seeks to advance in the administration, conduct, and verification of public elections.

Transparency in government is a sound ideal.  How it is implemented, however, requires responsible conduct.  We believe that while ensuring protection of whistle-blowers and the discovery of corruption and illegal behavior is imperative to the integrity of governments and democracy, the reckless publication of content that is tantamount to malicious behavior intended to embarrass or disrupt otherwise legal activity of government is simply a bad idea.  And it's not our idea of transparency.

So let's be clear.  If Wikileaks transparency leads to the illumination of bad, corrupt, illegal or oppressive government activity (which was the original motivation for Wikileaks when it began, co-founded by Chinese dissidents and focused on the conduct of the Chinese government), we think that is good.  What Wikileaks did last week, with regard to publishing U.S. diplomatic classified communications, is not good.  And more to the point and our agenda here, if the open source TrustTheVote elections technology leads to greater transparency in democratic electoral processes, then that is a very good thing.  And that's the role transparency plays in our charter and mission.  It does not (and should not) lead to reckless witch-hunting of otherwise legal and well-intended acts by elections officials.  We'll leave that to ambitious, politically motivated journalists.

Our intention with transparency is to increase trust in the machinery used to conduct our public elections.  We have no mission, charter, or intent to put tools in place to play "gotcha" with elections officials thanklessly toiling in the trenches and on the firing lines of public elections administration.

So, we agree with the individual who wrote me confidentially that Wikileaks' idea of transparency in this case was recklessly executed (and could put innocent individuals at risk, if not disrupt vital efforts by all governments in negotiating cooperation and co-existence).  But we disagree with their leap to the conclusion that all transparency is bad or, vaulting off the edge, that the OSDV Foundation's efforts are even remotely similar in intent.

Transparency, from our point of view, remains an important and vital, if not fundamental, component of the electoral process.  Transparency can lead to better audit and verification, and ensure that voters' ballots are counted as cast.  The transparency acts of Wikileaks can lead to a better society if they remain true to their original mission.  The reckless, sensationalist efforts Wikileaks seems to be engaged in today are disturbing.  And they are not our idea of transparency, although we can assume they are a new reality likely to persist.

GAM|out

Comment