Viewing entries tagged: Commentary


Critical Democracy Infrastructure: Our Briefing Launches

We are pleased to announce the release of the OSET Institute’s Critical Democracy Infrastructure (CDI) Briefing.  It has been over a year in development.  Early reviews by several people in government, media, and advisory circles tell us this may be the most important publication yet on the issue of election infrastructure.  We humbly hope so.  This Briefing provides a thorough review of the technology infrastructure of election administration and operation.  We address its critical nature and what is required for it to be treated as such, assess the challenges of official designation, and examine the immediate and longer-term challenges to protecting this vital aspect of our democracy...


Reality Check: Cost of Software Development

Even philanthropic efforts to produce public benefits in the form of civic technology have real costs associated with software development.  The open source model, however, means the costs are significantly lower than current proprietary commercial alternatives, while the innovative benefits, unconstrained by commercial mandates, can be significantly greater.  More importantly, there is some reality distortion over the real costs of building civic engagement IT such as election administration and voting systems.  These systems are markedly different from many other civic engagement tools that require only APIs and interactive web services leveraging government data stores to better engage and serve citizens.  Tuesday's post by Ms. Voting Matters on our Voter Services Portal ignited comments and questions about the real cost to build it.  The VSP is not "yet another simple web site," but a collection of software that provides services to voters, integrates with back-end legacy systems, and sets the foundation for a series of voter service innovations as well as other election management tools in the near future.  We break down the cost model and actual costs here...


Automatic Voter Registration: Oregon Governor Signs Bill to "Just Do it."

Oregon, relying on its pioneering heritage and Nike spirit, says "Just Do It" to automatic voter registration.  And this move seems to provide a worked example for our CTO's recent blog post about the technical simplicity of doing so.  Oregon, already a vote-by-mail state with online voter registration to boot, was likely able to benefit from those prior innovations.  But regardless, as our Foundation's Secretary and General Counsel points out in this post, it's a smart move...


Three-Step Test for "Open Source"

For our elections official stakeholders, Chief Technology Officer John Sebes covers a point that seems to be popping up in discussions more and more.  There seems to be some confusion about what "open source" means in the context of software used for election administration or voting.  That's understandable, because some election I.T. folks, and some current vendors, may not be familiar with the prior usage of the term "open source" -- especially since it is now used in so many different ways to describe (variously) people, code, legal agreements, and more.  So, John hopes to get our Stakeholders back to basics on this.


Ms. Voting Matters' Take: "No Magic Will Bring About Online Voting"

Ms. Voting Matters would really like to wave her magic wand and allow everyone on the planet to cast their votes, securely, with their smart phones, tablets, or laptops. Really truly, I would do it if I could. But I can’t. The Internet of Voting is just not safe and secure enough now, no matter how much we all would wish it so.  Let me share why.


David Plouffe’s View of the Future of Voting — We Agree and Disagree

David Plouffe, President Obama’s top political and campaign strategist and the mastermind behind the winning 2008 and 2012 campaigns, wrote a forward-looking op-ed [paywall] in the Wall Street Journal recently about the politics of the future and how they might look.

He touched on how technology will continue to change the way campaigns are conducted – more use of mobile devices, even holograms, and more micro-targeting of individual voters.  But he also mentioned how people might cast their votes in the future, and that is what caught our eye here at the TrustTheVote Project.  There is a considerable chasm to cross between that vision and reality.


Money Shot: What Does a $40M Bet on Scytl Mean?

…not much, we think.

Yesterday’s news that Microsoft co-founder and billionaire Paul Allen is investing $40M in the Spanish election technology company Scytl is validation that elections remain a backwater of innovation in the digital age.

But it is not validation that there is a viable commercial market for voting systems of the size that typically attracts venture capitalists; the market is dysfunctional and small, and governments continue to be without budget.

And the challenge of building a user-friendly, secure online voting system that simultaneously protects the anonymity of the ballot is an interesting problem, one that only an investor of Mr. Allen’s stature can tackle.

We think this illuminates a larger question:

To what extent should the core technology of the most vital aspect of our Democracy be proprietary and black box, rather than publicly owned and transparent?

To us, that is a threshold public policy question, commercial investment viability issues notwithstanding.

To be sure, it is encouraging to see Vulcan Capital and a visionary like Paul Allen invest in voting technology. The challenges facing a successful elections ecosystem are complex and evolving and we will need the collective genius of the tech industry’s brightest to deliver fundamental innovation.

We at the TrustTheVote Project believe voting is a vital component of our nation’s democracy infrastructure and that American voters expect and deserve a voting experience that’s verifiable, accurate, secure, and transparent.  Will Scytl be the way to deliver that?

The Main Thing

The one thing that stood out to us in the various articles on the investment was Scytl’s assertion of security backed by international patents on cryptographic protocols.  We’ve been around the INFOSEC space for a long time and know a lot of really smart people in the crypto field.  So, we’re curious to learn more about their IP innovations.  And yet that assertion is actually a red herring to us.

Here’s the main thing: transacting ballots over the public packet-switched network is not simply about security.  It’s also about privacy; that is, the secrecy of the ballot.  Here is an immutable maxim about the digital world of security and privacy: there is an inverse relationship, which holds that as security is increased, privacy must be decreased, and vice versa.  Just consider any airport security experience.  If you want maximum security, then you must surrender a bunch of privacy.  This is the main challenge of transacting ballots across the Internet, and why that transaction is so very different from banking online or looking at your medical record.

And then there is the entire issue of infrastructure.  We continue to harp on this, and still wait for a good answer.  If, by their own admissions, the Department of Defense, Google, Target, and dozens of others have challenges securing their own data centers, how exactly can we be certain that a vendor on a cloud-based service model, or an in-house data center of a county or State, has any better chance of doing so?  Security is an arms race.  Consider the news today about Heartbleed alone.

Oh, and please, for the sake of credibility, can the marketing machinery stop using the phrase “military grade security?”  There is no such thing.  And it has nothing to do with increasing key lengths from the standard 128 bits to, say, 512 or 1,024 bits.  128-bit keys are fine, and there is nothing military about them (other than the fact that the military uses them).  Here is an interesting article from some years ago on the sufficiency of current crypto and the related marketing arms race.  Saying “military grade” is meaningless hype.  Besides, the security issues run far beyond the transit of data between machines.
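For a sense of scale, here is a back-of-the-envelope sketch (the attacker guess rate is an assumption chosen purely for illustration) of why key length is not the weak link:

```python
# Back-of-the-envelope arithmetic: brute-forcing a 128-bit key is not the weak link.
# The attacker speed below (one trillion guesses per second) is an illustrative assumption.

keyspace = 2 ** 128                    # number of possible 128-bit keys (~3.4e38)
guesses_per_second = 10 ** 12          # assumed attacker speed
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace / (guesses_per_second * seconds_per_year)
print(f"Years to try every key: {years_to_exhaust:.2e}")   # ~1.1e19 years
```

The real issues, as noted above, run far beyond key lengths and the transit of data between machines.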

In short, there is much the public should demand to understand from anyone’s security assertions, international patents notwithstanding.  And that goes for us too.

The Bottom Line

While we laud Mr. Allen’s investment in what surely is an interesting problem, no one should think for a moment that this signals some sort of commercial viability or tremendous growth market opportunity.  Nor should anyone assume that throwing money at a problem will necessarily fix it (or deliver us from the backwaters of Government elections I.T.).  Nor should we assume that this somehow validates Scytl’s “model” for “security.”

Perhaps more importantly, while we need lots of attention, research, development and experimentation, the bottom line to us is whether the outcome should be a commercial proprietary black-box result or an open transparent publicly owned result… where the “result” as used here refers to the core technology of casting and counting ballots, and not the viable and necessary commercial business of delivering, deploying and servicing that technology.


A Northern Exposed iVoting Adventure


Alaska's extension to its iVoting venture may have raised the interest of at least one journalist at a highly visible publication.  When we were asked for our "take" on this form of iVoting, we thought we should also comment here on this "northern exposed adventure" (apologies to fans of the wacky mid-90s TV series of a similar name).  Alaska has been among the states that allow military and overseas voters to return marked absentee ballots digitally, starting with fax, then eMail, and then adding a web upload as a third option.  Focusing specifically on the web-upload option, the question was: "How is Alaska doing this, and how do their efforts square with common concerns about security, accessibility, Federal standards, testing, certification, and accreditation?"

In most cases, any voting system has to run that whole gauntlet through to accreditation by a state in order for the voting system to be used in that state.  To date, none of the iVoting products have even tried to run that gauntlet.

So, what Alaska is doing with respect to security, certification, and a host of other things is essentially: flying solo.

Their system has not gone through any certification program (State, Federal, or otherwise, that we can tell); hasn't been tested by an accredited voting system test lab; and nobody knows how it does or doesn't meet federal requirements for security, accessibility, and other (voluntary) specifications and guidelines for voting systems.

In Alaska, they've "rolled their own" system.  It's their right as a State to do so.

In Alaska, military voters have several options, and only one of them is the ability to go to a web site, indicate their vote choices, and have their votes recorded electronically -- no actual paper ballot involved, no absentee ballot affidavit or signature needed.  In contrast to the sign/scan/email method of returning an absentee ballot and affidavit (used in Alaska and 20 other states), this is straight-up iVoting.

So what does their experience say about all the often-quoted challenges of iVoting?  Well, of course in Alaska those challenges apply the same as anywhere else, and they are facing them all:

  1. insider threats;
  2. outsider hacking threats;
  3. physical security;
  4. personnel security; and
  5. data integrity (including that of the keys that underlie any use of cryptography)

In short, the Alaska iVoting solution faces all the challenges of digital banking and online commerce -- the same challenges that every financial services industry titan and eCommerce giant spends big $ on every year (capital and expense), and yet they still routinely suffer attacks and breaches.

Compared to those technology titans of industry (Banking, Finance, Technology services, or even the Department of Defense), how well are Alaskan election administrators doing on their (by comparison) shoestring budget?

Good question.  It's not subject to annual review (like the SAS-70 audits of banks' IT operations), so we don't know.  That also is their right as a U.S. state.  However, the fact that we don't know does not debunk any of the common claims about these challenges.  Rather, it simply says that in Alaska they took on the challenges (which are large) and the general public doesn't know much about how they're doing.

To get a feeling for the risks involved, just consider one point: think about the handful of IT geeks who manage the iVoting servers where the votes are recorded and stored as bits on a disk.  They are not election officials, and they are no more entitled to stick their hands into paper ballot boxes than anybody else outside a county elections office.  Yet, they have the ability (though not the authorization) to access those bits.

  • Who are they?
  • Does anybody really oversee their actions?
  • Do they have remote access to the voting servers from anywhere on the planet?
  • Using passwords that could be guessed?
  • Who knows?

They're probably competent, responsible people, but we don't know.  Not knowing any of that, every vote on those voting servers is actually a question mark -- and that's simply being intellectually honest.

Lastly, to get a feeling for the possible significance of this lack of knowledge, consider a situation in which Alaska's electoral college votes swing an election, or where Alaska's Senate race swings control of Congress (not far-fetched given Murkowski's close call back in 2010).

When the margin of victory in Alaska, for an election result that affects the entire nation, is a low 4-digit number of votes, and the number of digital votes cast is similar, what does that mean?

It's quite possible that a similar number of digital votes could be cast in the next Alaska Senate race.  If the contest is that close again, think about the scrutiny those IT folks will get.  Will they be evaluated any better than every banking data center investigated after a data breach?  Any better than Target?  Any better than Google or Adobe's IT management after having trade secrets stolen?  Or any better than the operators of military unclassified systems that for years were penetrated by hackers located in China, who may well have been supported by the Chinese Army or intelligence groups?

Probably not.

Instead, they'll be lucky (we hope) like the Estonian iVoting administrators, when the OSCE visited back in 2011 to have a look at the Estonian system.  Things didn't go so well.  The OSCE found that one guy could have undermined the whole system.  Good news: it didn't happen.  Cold comfort: that one guy didn't seem to have the opportunity -- most likely because he and his colleagues were busier than a one-armed paper hanger during the election, worrying about Russian hackers attacking again after they had previously shut down the whole country's Internet-connected government systems.

So far, the threat is remote, and it is still early days even for small-scale usage of Alaska's iVoting option.  But while the threat remains remote, it might be good for the public to see some more about what's "under the hood" and who's in charge of the engine -- that would be our idea of more transparency.

<Wandering off the Main Point for a Few Paragraphs>

So, in closing I'm going to run the risk of being a little preachy here (signaled by that faux HTML tag above); again, probably due to the surge in media inquiries recently about how the Millennial generation intends to cast their ballots one day.  Lock and load.

I am (as are all of us here) all for advancing the hallmarks of the Millennial mandates of the digital age: ease and convenience.  I am also keenly aware there are wing-nuts looking for their Andy Warhol moment.  And whether enticed by some anarchist rhetoric, their own reality distortion field, or -- most insidious -- the evangelism of a terrorist agenda (domestic or foreign), said wing-nut(s) might see an opportunity, perhaps just for grins and giggles, to derail an election (see my point above about a close race that swings control of Congress, or worse).

Here's the deep concern: I'm one of those who believes that the horrific attacks of 9.11 had little to do with body count or the implosions of western icons of financial might.  The real underlying agenda was to determine whether it might be possible to cause a temblor of sufficient magnitude to take world financial markets seriously off-line, and whether doing so might cause a rippling effect of chaos in world markets, and what disruption and destruction that might wreak.  If we believe that, then consider the opportunity for disruption of the operational continuity of our democracy.

It's not that we are Internet haters: we're not -- several of us came from Netscape and other technology companies that helped pioneer the commercialization of that amazing government and academic experiment we call the Internet.  It's just that THIS Internet and its current architecture simply was not designed to be inherently secure or to ensure anyone's absolute privacy (and strengthening one necessarily means weakening the other).

So, while we're all focused on ease and convenience, and we live in an increasingly distributed democracy, and the Internet cloud is darkening the doorstep of literally every aspect of society (and now government too), great care must be taken as legislatures rush to enact new laws and regulations to enable studies, or build so-called pilots, or simply advance the Millennial agenda to make voting a smartphone experience.  We must be very careful and considerably vigilant, because it's not beyond the realm of reality that some wing-nut is watching, cracking their knuckles in front of their screen and keyboard, mumbling, "Oh please.  Oh please."

Alaska has the right to venture down its own path in the northern territory, but in doing so it exposes an attack surface.  Alaskans need not (indeed, cannot) see this enemy from their back porch (I really can't speak for others).  But just because the enemy cannot be identified at the moment doesn't mean it isn't there.

One other small point: as a research and education non-profit, we're asked why we shouldn't be "working on making Internet voting possible?"  Answer: perhaps in due time.  We do believe that on the horizon responsible research must be undertaken to determine how we can offer an additional, digital alternative for casting a ballot alongside the absentee and polling place experiences.  And that "digital means" might be over the public packet-switched network.  Or maybe some other type of network.  We'll get there.  But candidly, our charge for the next couple of years is to update an outdated architecture of existing voting machinery and elections systems and bring about substantial, but still incremental, innovation that jurisdictions can afford to adopt, adapt, and deploy.  We're taking one thing at a time and first things first; or as our former CEO at Netscape used to say, we're going to "keep the main thing, the main thing."

Onward
GAM|out


PCEA Report Finally Out: The Real Opportunity for Innovation Inside

This week the PCEA finally released its long-awaited report to the President.  It's loaded with good recommendations.  Over the next several days or posts we'll give you our take on some of them.  For the moment, now that it's done, we want to call your attention to a couple of underpinning elements.

The Resource Behind the Resources

Early in the formation of what initially was referred to as the "Bauer-Ginsberg Commission" we were asked to visit the co-chairs in Washington D.C. to chat about technology experts and resources.  We have a Board member who knows them both and when asked we were honored to respond.

Early on we advised the Co-Chairs that their research would be incomplete without speaking with several election technology experts, and of course they agreed.  The question was how to create a means to do so and not bog down the progress governed by layers of necessary administrative regulations.

I'll take a paragraph here to observe that I was very impressed by Bob Bauer and Ben Ginsberg in our initial meeting.  Despite being polar political opposites, they demonstrated how Washington should work: they were respectful, collegial, sought compromise to advance the common agenda, and seemed intent on checking politics at the door in order to get work done.  It was refreshing and restored my faith that somewhere in the District there remains a potential for government to actually work for the people.  I digress.

We advised them that looking to the CalTech-MIT Voting Project would definitely be one resource they could benefit from having.

We offered our own organization, but with our tax exempt status still pending, it would be difficult politically and otherwise to rely on us much in a visible manner.

So the Chairs asked us if we could pull together a list -- not an official subcommittee, mind you, but a list of the top "go to" minds in the elections technology domain.  We agreed and began a several-week process of vetting a list that needed to be winnowed down to about 20 for manageability.  These experts would be brought in individually as desired, or collectively -- it was to be figured out later which would be most administratively expedient.  Several of our readers, supporters, and those who know us were aware of this confidential effort.  The challenge was lack of time to run the entire process of public recruiting and selection.  So, they asked us to help expedite that, having determined we could gather the best in short order.

And that was fine because anyone was entitled to contact the Commission, submit letters and comments and come testify or speak at the several public hearings to be held.

So we did that.  And several of that group were in fact utilized.  Not everyone though, and that was kind of disappointing, but a function of the timing constraints.

The next major resource we advised they had to include, besides CalTech-MIT and a tech advisory group, was Rock The Vote.  And that was because (notwithstanding their being a technology partner of ours) Rock The Vote has its ear to the rails of new and young voters, starting with their registration experience and initial opportunity to cast their ballot.

Finally, we noted that there were a couple of other resources they really could not afford to overlook, including the Verified Voting Foundation, L.A. County's VSAP Project, and Travis County's STAR-Vote Project.

The outcome of all of that brings me to the meat of this post about the PCEA Report and our real contribution.  Sure, we had some behind the scenes involvement as I describe above.  No big deal.  We hope it helped.

The Real Opportunity for Innovation

But the real opportunity to contribute came in the creation of the PCEA Web Site and its resource toolkit pages.

On that site, the PCEA took our advice and chose to utilize Rock The Vote's open source voter registration tools and, specifically, the foundational elements the TrustTheVote Project has built for a State's Voter Information Services Portal.

Together, Rock The Vote and the TrustTheVote Project are able to showcase the open source software that any State can adopt, adapt, and deploy--for free (at least the adoption part) and without having to reinvent the wheel by paying for a ground-up custom build of their own online voter registration and information services portal.

We submit that this resource on their PCEA web site represents an important ingredient to injecting innovation into a stagnant technology environment of today's elections and voting systems world.

For the first time, there is production-ready open source software available for an important part of an elections official's administrative responsibilities that can lower costs, accelerate deployment and catalyze innovation.

To be sure, it's only a start -- it's the lower-hanging fruit of an election technology platform, the part that doesn't require any sort of certification.  With our exempt status in place, and lots of things happening that we'll soon share, there is more, much more, to come.  But this is a start.

There are 112 pages of goodness in the PCEA report.  And there are some elements in there that deserve further discussion.  But we humbly assert it's the availability of some open source software on their resource web site that we think represents a quiet breakthrough in elections technology innovation.

The news has been considerable.  So, yep, we admit it.  We're oozing pride today. And we owe it to your continued support of our cause. Thank you!

GAM | out


Comments Prepared for Tonight's Elections Technology Roundtable

This evening at 5:00pm, members of the TrustTheVote Project have been invited to attend an elections technology round table discussion in advance of a public hearing in Sacramento, CA, scheduled for tomorrow at 2:00pm PST, on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations.  Due to the level of activity, only our CTO, John Sebes, is able to participate.

We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems.  These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.

Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project

We appreciate an opportunity to participate in this important discussion.  We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.

1. Certification

For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point.  Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model.  This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.

We are collaborating with NIST on a way to update this outdated model with a "component-ized" approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system.  There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.

We're glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California.  Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.

2. Standards

2nd, there is a prerequisite for component-level certification that until recently wasn't available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot-counter's output of vote tally data, that also serves as input to a tabulator component.  Without common data formats elections officials have to acquire a whole integrated product suite that communicates in a proprietary manner.  With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as "forklift upgrades" or "fleet replacements."
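To make that concrete, here is a minimal sketch.  The JSON structure below is purely hypothetical (not drawn from any actual standard under development); the point is only that when a counter's output and a tabulator's input share one documented format, the two components can come from different suppliers.

```python
import json

# Hypothetical tally record a ballot-counter might emit -- field names are illustrative only.
counter_output = json.dumps({
    "precinct_id": "precinct-042",
    "contest_id": "county-measure-a",
    "tallies": {"yes": 1204, "no": 1187, "blank": 23},
})

def tabulate(tally_records):
    """A tabulator component: aggregate per-precinct tallies into contest totals."""
    totals = {}
    for raw in tally_records:
        record = json.loads(raw)            # the shared, documented wire format
        for choice, count in record["tallies"].items():
            totals[choice] = totals.get(choice, 0) + count
    return totals

print(tabulate([counter_output]))           # {'yes': 1204, 'no': 1187, 'blank': 23}
```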

The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem.  And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote.  They deserve congratulations for reaching this imperative state of data standards détente.  It's not finished, but the effort and momentum is there.

So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.

3. Commercial Revitalization

Finally, this may be the opportunity to realize a vision we have that open data standards, a new certification process, and lowered bars to innovation through open sourcing, will reinvigorate a stagnant voting technology industry.  Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation.  Such should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.

One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery.  To that point, let us offer an important clarification and dispel a misconception about rolling your own.  This does not mean that elections officials are about to be left to self-vend -- by which we mean self-construct and support their own open, standard, commodity voting system components.  A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice rather than forcing officials to become do-it-yourself types.  Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software.  Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.

Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation.  To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.

In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.

 

EDITORIAL NOTES: There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified, non-certified voting systems a possibility in California.  We don't read SB-360 that way at all.  We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest.  In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot, and should not, do away with certification, and we think charges that this is what will happen are misinformed.  We simply need to modernize how certification works to enable this kind of innovation.  We think our comments today bear that out.

Moreover, have a look at the Agenda for tomorrow's hearing on implementation of SB-360.  In sum and substance the agenda is to discuss:

  1. Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
  2. Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
  3. Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.

Finally, there has been an additional charge that SB-360 is intended to "empower" LA County, such that whatever LA County builds, they (or someone on their behalf) will sell the resulting voting systems to other jurisdictions.  We think this allegation is also misinformed, for two reasons: [1] assuming LA County builds their system on open source, there is a question as to what specifically would or could be offered for sale; and [2] notwithstanding offering open source for sale (which technically can be done... technically), it seems to us that if such a system is built with public dollars, then it is, in fact, publicly owned.  From what we understand, a government agency cannot sell assets developed with public dollars, but it can give them away.  And indeed, this is what we've witnessed over the years in other jurisdictions.

Onward.


Free at Last: We Earn Our 501(c)(3) Tax Exempt Status

I am pleased to announce to our readers that the IRS has granted our 7-year-old organization full, unbridled tax exempt status under section 501(c)(3) of the Internal Revenue Code as a public charity.  This brings to a close an application review that consumed over 6 years—one of the longest for a public benefits non-profit organization.  Our Chief Development Officer, Gregory Miller, has already offered his insight this morning, but I want to offer a couple of thoughts from my viewpoint (which I know he shares).  By now, you may have seen the WIRED Magazine article that was published this morning.  Others here will surely offer some additional comments of their own in separate posts.  But it does set the context for my brief remarks here.

First, to be sure,  this is a milestone in our existence because the Foundation’s fund raising efforts and corresponding work on behalf of elections officials and their jurisdictions nationwide has been largely on hold since we filed our original IRS Form 1023 application back in February 2007.

The Foundation has managed to remain active through what self-funding we could afford, and through generous grants from individuals and collaborating organizations that continued to support the “TrustTheVote™ Project” despite our "pending" status.

A heartfelt "thank you" to Mitch Kapor, Heather Smith and Rock the Vote, Alec Totic, Matt Mullenweg, Pito Salas, the Gregory Miller family and the E. John Sebes family (to name a few of the those who so believed in us early on to offer their generous support).  The same thanks goes to those who wished to remain anonymous in their support.

In addition to our being set free to move full speed ahead on our charter, I think this is interesting news for another reason: this project, which has a clear charitable cause with a compelling public benefit, was caught up in an IRS review perhaps mostly for having the wrong words in its corporate name.

Our case became entangled in the so-called “Bolo-Gate” scandal at the IRS Exempt Division.  And we unintentionally became a poster child for be-on-the-lookout reviews as they applied to entities involved in open source technology.

In sum and substance, our case required 6 years and 4 months for the IRS to decide.  The Service ultimately dragged us into our final administrative remedy, the "conference-of-right" we participated in last November, following their "intent to deny" letter in March of last year.  Then it took the IRS another 220 days to finally decide the case, albeit in our favor, but not before we had a] filed close to 260 pages of interrogatory responses, of which 182 were under affidavit; b] developed nearly 1,600 pages of total content; and c] run up a total bill for legal and accounting fees over those years in excess of $100,000.

We’ve definitely learned some things about how to handle a tax exempt application process for an organization trying to provide public benefit in the form of software technology, although frankly, we have no intentions or interest in ever preparing another.

But there is a story yet to be told about what it took for us to achieve our 501(c)(3) standing—a status that every single attorney, CPA, or tax expert who reviewed our case over the years believed we deserved.   That noted, we are very grateful to our outside tax counsel team at Caplin Drysdale led by Marc Owen, who helped us press our case.

I am also deeply relieved that we need not raise a legal defense fund, but instead can finally start turning dollars towards the real mission: developing accurate, transparent, verifiable, and more secure elections technology for public benefit rather than commercial gain.  It's not lost on us, nor should it be on you, that the money we had to pay our lawyers and accountants could have been spent advancing the substantive cause of the TrustTheVote Project.

So, now it's time to focus ahead, get to work, and raise awareness of the TrustTheVote Project and the improvements it can bring to public elections.

We're a legitimate, legally recognized 501(c)(3) tax exempt public benefits corporation.  And with that, you will begin to see marked changes in our web sites and our activities.  Stay tuned.  We're still happily reeling a bit from the result, but wrapping our heads around what we need to do now that we have the designation we fought 6 years to earn in order to fund the work our beneficiaries -- elections jurisdictions nationwide -- so deserve.

Please join me in acknowledging this major step and consider supporting our work going forward.  After all, now it really can be tax deductible (see your accountant and lawyer for details).

Best Regards,
Christine M. Santoro
Secretary, General Counsel


The Gift Has Arrived: Our Exempt Status After 6 Years

Today is a bit of a historic point for us: we can publicly announce the news of the IRS finally granting our tax exempt status.  The digital age is wreaking havoc, however, on the PR and news processes.  In fact, we knew about this nearly 2 weeks ago, but due to a number of legal and procedural issues and a story we were being interviewed for, we were on hold in making this important announcement.  And we're still struggling to get this out on the wires (mostly due to a change of our PR Agency at the most inopportune moment).

The result: WIRED Magazine actually got the jump on us (oh, the power of digital publishing), and now hours after their article posting, we're finally getting our own press release to the world.

I have to observe that, notwithstanding a paper chase of near-epic proportions with the IRS to be granted what we know our charter deserves in order to foster the good work we intend, at the end of the day 501(c)(3) status is a gift from the government.  And we cannot lose sight of that.

So, for the ultimate outcome we are deeply grateful, please make no mistake about that.  The ways and means of getting there were exhausting... emotionally, financially, and intellectually.  And I notice that the WIRED article makes a showcase of a remark I made about being "angry" in one of the many interviews and exchanges leading up to that story.

I am (or was) angry at the process, because taking 6 years to ask and re-ask us many of the same questions, and to perform what I humbly believe at some point amounted to intellectual navel gazing, was crazy.  I can't help but feel like we were being bled.  I fear there are many other valuable public benefit efforts, which involve intangible assets, striving for the ability to raise public funds to do public good, who are caught up in this same struggle.

What's sad is that it took the guidance and expertise (and lots of money that could have been spent on delivering on our mission) of high-powered Washington D.C. lawyers to negotiate this to a successful conclusion.  That's sad, because the vast majority of projects cannot afford to do that.  Had we not been so resolute in our determination, and willing to risk our own financial stability to see this through, the TrustTheVote Project would have withered and died in the prosecution of our tax exempt status over 6 years and 4 months.

Specifically, it took the expertise and experience of Caplin Drysdale lawyers Michael Durham and Marc Owen himself (who actually ran the IRS Tax Exempt Division for 10 years).  If you can find a way to afford them, you can do no better.

There is so much that could be shared about what it took and what we learned from issues of technology licensing, to nuances of what constitutes public benefit in terms of IRS regulations -- not just what seems obvious.  Perhaps we'll do so another time.  I note for instance that attorney Michael Durham was a computer science major and software engineer before becoming a tax lawyer.  I too have a very similar combination background of computer science and intellectual property law, and it turned out to be hugely helpful to have this interdisciplinary view -- just odd that such would be critical to a tax exempt determination case.

However, in summary, I was taught at a very young age, and through several life lessons, that only patience and perseverance empower prevailing.  I guess it's just the way I, and all of us on this project, are wired.

Cheers,
GAM | out


Crowd Sourcing Polling Place Wait Times – Part 2

Last time, we wrote about the idea of a voter information service where people could crowd-source data about polling place wait times, so that other voters would benefit by not going when the lines are getting long, and so that news media and others could get a broad view of how well or poorly a county or state was doing in terms of voting time.  As we observed, it would be a fine idea, but the results from that crowd-sourced reporting would be way better if the reporting were not on the “honor system.”  Without going on a security and privacy rampage, it would be better if this idea were implemented using some existing model for people to do mobile-computing voter-stuff in a way that is not trivial to abuse, unlike the honor system.

Now, back to the good news we mentioned previously: there is an existing model we could use to limit the opportunity for abuse.  You see, many U.S. voters, in a growing number of States, already have the ability to sit in a café and use their smart phone and a web site to identify themselves sufficiently to see their full voter record, and in some cases even update part of that voter record.

So, the idea is: why not extend that with a little extra record keeping of when a voter reports that they have arrived at the polls, and when they report that they are done?  In fact, it need not even be an extension of existing online voter services, and could be done in places that are currently without online voter services altogether.  It could even be the first online voter service in those places.

The key here is that voters “sufficiently identify themselves” through some existing process, and that identification has to be based on an existing voter record.  In complex online voter services (like paperless online voter registration), that involves a 3-way real-time connection between the voter’s digital device, the front-end web server that it talks to, and a privileged and controlled connection from the front-end to obtain specific voter data in the back-end.  But in a service like this, it could be even simpler, with a system that’s based on a copy of the voter record data -- indeed, just the part that the voter needs to use to “identify themselves sufficiently”.
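As a purely illustrative sketch (the matching fields below are assumptions; the actual fields are dictated by each State's law and practice), the simpler flavor amounts to little more than a lookup against that copy of the voter record data:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VoterStub:
    """Just the slice of the voter record needed for 'sufficient identification'."""
    voter_id: str
    last_name: str
    date_of_birth: str   # "YYYY-MM-DD"
    house_number: str
    zip_code: str

def identify(voter_stubs, last_name, date_of_birth, house_number, zip_code):
    """Return a matching voter_id or None; the full voter record never leaves the back-end."""
    for v in voter_stubs:
        if (v.last_name.lower() == last_name.strip().lower()
                and v.date_of_birth == date_of_birth
                and v.house_number == house_number
                and v.zip_code == zip_code):
            return v.voter_id
    return None
```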

Well, let’s not get ahead of ourselves.  The fact is, State and/or local elections officials generally manage the voter database.  And our Stakeholders inform us it’s still likely these jurisdictions would want to operate this service in order to retain control of the data, and to control the ways and means of “sufficient identity” to be consistent with election laws, current usage practices, and other factors.  On the other hand, a polling place traffic monitor service can be a completely standalone system – a better solution, we think, and one more likely to be tried by everyone.

OK, that’s enough for the reasonably controlled and accurate crowd-source reporting of wait times. What about the benefits from it – the visibility on wait times?  As is almost always the case in transparent, open government computing these days, there are two parallel answers.

The first answer is that the same system that voters report into, could also provide the aggregated information to the public.  For example, using a web app, one could type in their street address (or get some help in selecting it, akin to our Digital Poll Book), and see the wait time info for a given precinct.  They could also view a list of the top-5 shortest current wait times and bottom-5 longest wait times of the precincts in their county, and see where their precinct sits in that ranking.  They could also study graphs of moving averages of wait times – well, you can ideate for yourself.  It’s really a question of what kind of information regular voters would actually value, and that local election officials would want to show.
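Here's a minimal sketch of the aggregation behind those views, assuming each completed report boils down to a (precinct, wait-in-minutes) pair; the window size and top/bottom-5 depth simply mirror the examples above:

```python
from collections import defaultdict
from statistics import mean

def current_waits(reports, window=10):
    """Moving average of the most recent `window` reported waits, per precinct."""
    by_precinct = defaultdict(list)
    for precinct_id, wait_minutes in reports:      # reports assumed to be in time order
        by_precinct[precinct_id].append(wait_minutes)
    return {p: mean(waits[-window:]) for p, waits in by_precinct.items()}

def rankings(waits, n=5):
    """Top-n shortest and bottom-n longest current wait times in a county."""
    ordered = sorted(waits.items(), key=lambda item: item[1])
    return ordered[:n], ordered[-n:]
```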

The second answer is that this system must provide a web services API so that “other systems” can query this wait-time-reporting service.  These other systems should be able to get any slice of the raw data, or the whole thing, up to the minute.  Then they could do whatever visualization, reporting, or other services are thought up by the clever people operating those other systems.

For me, I’d like an app on my phone that pings like my calendar reminders -- one I set to ping me after 9am (no voting without adequate caffeine!) but before 3pm (high school lets out and street traffic becomes a sh*t show ;-)), and only when the waiting time is <10 minutes.  I’d also like something that tells me if/when turn-out in my precinct (or my county, or some geographic slice) tips over 50% of non-absentee voters.  And you can imagine others.  But the main point is that we do not expect our State or local election officials to deliver that to us.  We do hope that they can deliver the basics, including that API, so that others can do cool stuff with the data.
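That wish list is exactly the kind of thing the API makes possible.  A rough sketch of such a personal "ping me" client follows; the endpoint URL and response fields are hypothetical stand-ins for whatever the wait-time service would actually publish:

```python
import datetime
import time
import requests  # third-party HTTP library, assumed installed

# Hypothetical endpoint -- a real deployment would publish its own URL and schema.
API = "https://pollmon.example.gov/api/v1/precincts/{precinct_id}/wait"

def ping_me_when_short(precinct_id, threshold_minutes=10,
                       earliest_hour=9, latest_hour=15, poll_seconds=300):
    """Check every few minutes between 9am and 3pm; alert when the wait drops below the threshold."""
    while True:
        now = datetime.datetime.now()
        if earliest_hour <= now.hour < latest_hour:
            data = requests.get(API.format(precinct_id=precinct_id), timeout=10).json()
            if data.get("wait_minutes", float("inf")) < threshold_minutes:
                print(f"Wait is {data['wait_minutes']} minutes -- time to go vote!")
                return
        time.sleep(poll_seconds)
```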

Actually, it’s an important division of labor.

Government organizations have the data and need to “get the data out” both in raw form via an API, and in some form useful for individual voters’ practical needs on Election Day.  Then other organizations or individuals can use that API with their different vision and innovation to put that data to a range of additional good uses.  That's our view.

So, in our situation at the TrustTheVote Project, it’s actually really possible.  We already have the pieces: [1] the whole technology framework for online voter services, based on existing legacy databases; [2] the web and mobile computing technology framework with web services and APIs; [3] existing voter services that are worked examples of how to use these frameworks; and [4] some leading election officials who are already committed to using all these pieces, in real life, to help real voters.  This “voting wait-time tracker” system we call PollMon is actually one of the simplest examples of this type of computing.

We’re ready to build one.  And we told the Knight News Challenge so.  We say, let’s do this.  Wanna help?  Let us know.  We've already had some rockin good ideas and some important suggestions.

GAM | out


Crowd Sourcing Polling Place Wait Times

Long lines at the polling place are becoming a thorn in our democracy.  We realized a few months ago that our elections technology framework data layer could provide information that, when combined with community-based information gathering, might lessen the discomfort of that thorn.  Actually, that realization happened while hearing friends extol the virtues of Waze.  Simply enough, the idea was crowd-sourcing wait information to at least gain some insight into how busy a polling place might be at the time one wants to go cast a ballot.

Well, to be sure, lots of people are noodling around lots of good ideas, and there is certainly no shortage of discussion on the topic of polling place performance.  And we’re all aware that the President has taken issue with it and, after a couple of mentions in speeches, created the Bauer-Ginsberg Commission.  So, it seems reasonable to assume this idea of engaging some self-reporting isn’t entirely novel.

After all, it's kewl to imagine being able to tell – in real time – what the current wait time at the polling place is, so a voter can avoid the crowds, or a news organization can track the hot spots of long lines.  We do some "ideating" below, but first I offer three observations from our noodling:

  • It really is a good idea; but
  • There’s a large lemon in it; yet
  • We have the recipe for some decent lemonade.

Here’s the Ideation Part

Wouldn’t it be great if everybody could use an app on their smarty phone to say, “Hi All, it’s me, I just arrived at my polling place, the line looks a bit long,” and then later, “Me again, OK, just finished voting, and geesh, like 90 minutes from start to finish… not so good,” or “Me again, I’m bailing.  Need to get to the airport.”

And wouldn’t it be great if all that input from every voter was gathered in the cloud somehow, so I could look up my polling place, see the wait time, the trend line of wait times, the percentage of my precinct’s non-absentee voters who have already voted, and other helpful stuff?  And wouldn’t it be interesting if the news media could show a real-time view across a whole county or State?
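To picture what each voter's contribution might look like on the wire, here's a tiny honor-system sketch (Part 2 takes up how to make this less trivial to abuse); the field names and status values are illustrative assumptions:

```python
from datetime import datetime

def make_report(precinct_id, status, timestamp=None):
    """One crowd-sourced report; status is 'arrived', 'done', or 'bailed'."""
    return {"precinct_id": precinct_id,
            "status": status,
            "at": (timestamp or datetime.now()).isoformat()}

def wait_minutes(arrived_report, done_report):
    """Wait time implied by one voter's pair of reports."""
    arrived = datetime.fromisoformat(arrived_report["at"])
    done = datetime.fromisoformat(done_report["at"])
    return (done - arrived).total_seconds() / 60
```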

Well, if you’re reading this, I bet you agree, “Yes, yes it would.”  Sure.  Except for one thing.  To be really useful, it would have to be accurate.  And if there is a question about accuracy (ah shoot, ya know where this is going, don-cha?)... yes, there is always that Grinch called “abuse.”

Sigh.  We know from recent big elections that, apparently, partisan organizations are sometimes willing to spend lots of money on billboard ads, spam campaigns, robo-calls, and so on, to actually try to discourage people from going to the polls within targeted locales and/or demographics.  So, we could expect this great idea, in some cases, to fall afoul of similar abuse.  And that’s the fat lemon.

But, please read on.

Now, we can imagine some frequent readers spinning up to accuse us of wanting everything to be perfectly secure, of letting the best be the enemy of the good, and noting that nothing will ever be accomplished if first every objection must be overcome. On other days, they might be right, but not so much today.

We don’t believe this polling place traffic monitoring service idea requires the invention of some new security, or integrity, or privacy stuff.  On the other hand, relying on the honor system is probably not right either.  Instead, we think that in real life something like this would have a much better chance of launch and sustained benefit if it were based on some existing model of voters doing mobile computing in a responsible way that’s not trivial to abuse, unlike the honor system.

And that leads us to the good news – you see, we have such an existing model, in real life.  That’s the new ingredient, along with that lemon above and a little innovative sugar, for the lemonade that I mentioned.

Stay tuned for Part 2, and while waiting you might glance at this.


The 2013 Annual Elections Verification Conference Opens Tonight

If it's Wednesday, 13 March, it must be Atlanta.  And that means the opening evening reception for the Elections Verification Network's 2013 Annual Conference.  We're high on this gathering of elections officials, experts, academicians, and advocates because it represents a unique interdisciplinary collaboration of technologists, policy wonks, legal experts, and even politicians, all with a common goal: trustworthy elections.  The OSDV Foundation is proud to be a major sponsor of this event.  We do so because it is precisely these kinds of forums where discussions about innovation in HOW America votes take place, and it represents a rich opportunity for collaboration, debate, education, and sharing.  We always learn much, and we share our own research and development efforts as directed by our stakeholders -- those State and local elections officials who are the beneficiaries of our charitable work to bring increased accuracy, transparency, verification, and security (i.e., the 4 pillars of trustworthiness) to elections technology reform through education, research, and development.

Below are my opening remarks, to be delivered this evening or tomorrow morning at the pleasure of the Planning Committee, depending on how they slot the major sponsors' opportunities to address the attendees.  There are 3 points we want to get across in these opening remarks: [1] why we support the EVN; [2] why there is a growing energy around increased election verification efforts; and [3] how the EVN can drive that movement forward...

Greetings Attendees!

On behalf of the EVN Planning Committee and the Open Source Digital Voting Foundation I want to welcome everyone to the 2013 Elections Verification Network Annual Conference.  As a major conference supporter, the Planning Committee asked if I, on behalf of the OSDV Foundation, would take 3 minutes to share 3 things with you:

  • 1st, why the Foundation decided to help underwrite this Conference;
  • 2nd, why we believe there is a growing energy and excitement around election verification; and
  • 3rd, how the EVN can bring significant value to this growing movement

So, we decided to make a major commitment to underwriting and participating in this conference for two reasons:

  1. We want to strengthen the work of this diverse group of stakeholders and do all that we can to fortify this gathering to make it the premier event of its kind; and
  2. The work of the EVN is vital to our own mission because there are 4 pillars to trustworthy elections: Accuracy, Transparency, Verification, and Security, and the goals and objectives of these four elements require enormous input from all stakeholders.  The time to raise awareness, increase visibility, and catalyze participation is now, more than ever.  Which leads to my point about the movement.

We believe the new energy and excitement being felt around election verification is due primarily to 4 developments which, when viewed in the aggregate, illustrate an emerging movement.  Let’s consider them quickly:

  1. First, we’re witnessing an increasing number of elections officials considering “forklift upgrades” of their elections systems, which are driving public-government partnerships to explore and ideate on real innovation – the Travis County STAR-Vote Project and LA County’s VSAP come to mind as two showcase examples, which are, in turn, catalyzing downstream activities in smaller jurisdictions;
  2. The FOCE conference in CA, backed by the James Irvine Foundation, was a public coming-out of sorts to convene technologists, policy experts, and advocates in a collaborative fashion;
  3. The recent NIST Conferences have also raised the profile as a convener of all stakeholders in an interdisciplinary fashion; and finally,
  4. The President’s recent SOTU speech and the resulting Bauer-Ginsberg Commission arguably will provide the highest level of visibility to date on the topic of improving access to voting.  And this plays into EVN’s goals and objectives for elections verification.  You see, while on its face the visible driver is fair access to the ballot, the underlying aspect soon to become visible is the reliability, security, and verifiability of the processes that make fair access possible.  And that leads to my final point this morning:

The EVN can bring significant value to this increased energy, excitement, and resulting movement if we can catalyze a cross pollination of ideas and rapidly increase awareness across the country.  In fact, we spend lots of time talking amongst ourselves.  It’s time to spread the word.  This is critical because while elections are highly decentralized, there are common principles that must be woven into the fabric of every process in every jurisdiction.  That said, we think spreading the word requires 3 objectives:

  1. Maintaining intellectual honesty when discussing the complicated cocktail of technology, policy, and politics;
  2. Sustaining a balanced approach of guarded optimism with an embracing of the potential for innovation; and
  3. Encouraging a breadth of problem awareness, possible solutions, and pragmatism in their application, because one size will never fit all.

So, welcome again, and let's make the 2013 EVN Conference a change agent for raising awareness, increasing knowledge, and catalyzing a nationwide movement to adopt the agenda of elections verification.

Thanks again, and best wishes for a productive couple of days.

Comment

Comment

Do Trade Secrets Hinder Verifiable Elections? (Duh)

Slate Magazine posted an article this week which, in sum and substance, suggests that trade secret law makes it impossible to independently verify that voting machines are working correctly.  In short, we say, "Really, and is this a recent revelation?"  Those who have followed the TrustTheVote Project know that we've been suggesting this in so many words for years.  I appreciate that author David Levine refers to elections technology as "critical infrastructure."  We've been advancing the concept of "critical democracy infrastructure" for years.

To be sure, I'm gratified to see this article appear, particularly as we head to what appears to be the closest presidential election since 2000.  The article is totally worth a read, but here is an excerpt worth highlighting from Levine's essay:

The risk of the theft (known in trade secret parlance as misappropriation) of trade secrets—generally defined as information that derives economic value from not being known by competitors, like the formula for Coca-Cola—is a serious issue. But should the “special sauce” found in voting machines really be treated the same way as Coca-Cola’s recipe? Do we want the source code that tells the machine how to register, count, and tabulate votes to be a trade secret such that the public cannot verify that an election has been conducted accurately and fairly without resorting to (ironically) paper verification? Can we trust the private vendors when they assure us that the votes will be assigned to the right candidate and won’t be double-counted or simply disappear, and that the machines can’t be hacked?

Well, we all know (as he concludes) that all of the above have either been demonstrated to be a risk or have actually transpired.  The challenge is that the otherwise legitimate use of trade secret law ensures that the public has no way to independently verify that voting machinery is properly functioning, as was discussed in this Scientific American article from last January (also cited by Levine).

Of course, what Levine is apparently not aware of (probably our bad) is that there is an alternative approach on the horizon, regardless of whether the government ever determines a way to "change the rules" for commercial vendors of proprietary voting technology with regard to ensuring independent verifiability.

As a recovering IP lawyer, I'll add one more thing we've discussed within the TrustTheVote Project and the Foundation for years: this is a reason that patents -- including business method patents -- are arguably helpful.  Patents are about disclosure and publication; trade secrets, by definition, are not.  To be sure, a patent alone would not be sufficient, because within the intricacies of a patent prosecution there is an allowance that requires only partial disclosure of software source code.  Of course, "partial disclosure" must meet a test of sufficiency for one "reasonably skilled in the art" to "independently produce the subject matter of the invention."  And therein lie the wonderfully mushy grounds on which to argue a host of issues if put to the test.  But ironically, the intention of partial code disclosure is to protect trade secrets while still facilitating a patent prosecution.

That aside, I also note that, in the face of all the nonsense floating about in the blogosphere and mainstream media (whether charges that Romney's ownership interest in voting machinery companies is a pathway to stealing an election, or suggestions that a Soros-connected, Spanish-based voting technology company is conspiring to deliver tampered tallies), Levine's article is a breath of fresh air deserving the attention ridiculously lavished on these latest urban myths.

Strap in... T-12 days.  I fear a nail-biter from all viewpoints.

GAM|out

Comment

Comment

A New Voice

Hello - My name is Christine Santoro, the Foundation's General Counsel.  Although you probably have never heard from me before (well, at least not read my writing here), I am responsible for the legal machinery underneath the OSDV Foundation and the TrustTheVote Project.  We've heard from our readers that they would like to read more from us on issues of law and policy concerning elections and voting technology.  So, I've decided to start voicing my thoughts and musings on topics of interest to our readership, ranging from election law to issues of technology policy related to voting systems and machinery.

I look forward to sharing my thoughts with you and, more importantly, reading your comments and feedback and engaging in a conversation.  If there is anything in particular you would like us to talk about, let me know.  Talk to you soon.

Comment

Comment

Movement to Bring Open Source to Government Being Reorganized

Greetings- Just a quick post to suggest an interesting report out this afternoon on the TechPresident blog.  The move to consolidate the efforts of Civic Commons (home of Open311.org) and Code For America (CfA) -- notwithstanding the likely trigger being Civic Commons' leader, Nick Grossman, moving on -- actually makes sense to us.  CfA's Jennifer Pahlka's write-up is here.

Recently in a presentation, I was asked where our work fits into the whole Gov 2.0 movement.  It seems to us that we are probably a foundational catalyst to the movement; related, but only tangentially.  To be sure, we share principles of accuracy, transparency, verification, and security in government information (ours being elections information).  But Gov 2.0 (and its thought leaders such as CfA) is a considerably different effort from ours at the TrustTheVote Project.  That's mainly because the backbone of the Civic Commons, Open311.org, and CfA efforts is Web 2.0 technology (read: the social web and related mash-up tools).  There is nothing wrong with that; in fact, it's downright essential for transparency.

But to keep the apples in their crate and the oranges elsewhere, our work is about a far heavier lifting exercise.  Rather than liberating legacy government data stores to deliver enlightened public information sites, or to shed sunlight on government operations, we're building an entirely new open source elections technology stack from the OS kernel up through the app layer, with particular emphasis on an open standards common data format (more news on that in coming posts).

Ours is about serious fault-tolerant software architecture, design, and engineering: components built in C++ and Objective-C, dropping down to the machine level (potentially as far as firmware if necessary), while at the app layer we also use higher-level programming tools, including frameworks like Rails and UX/UI delivery vehicles like HTML5 and AJAX (for browser-based or iOS5-based applications).

And that point is the segue to my closing comment: the Gov 2.0 movement is smartly delivering Government information via the web, the social web in particular.  That's huge.  By contrast, remember that a good portion of our work is focused on purpose-built, application-specific devices like Optical Scanners to "read" ballots, devices to mark a ballot for printing and processing, or mobile tablets to serve as digital poll books.  Sure, the web is involved in some voter-facing services in our framework, like voter registration.  But unlike the Gov 2.0 effort, we have no plans to leverage the web or Internet in general for anything else (save blank ballot delivery or voter registration updates).

So by contrast, we're in the rough, while Code for America is on the putting green.  And as such, you should have a look at the TechPresident article today. Cheers GAM|out

Comment

Comment

At the Risk of Running off the Rails

So, we have a phrase we like to use around here, borrowed from the legal academic world: "frolic and detour," used to describe conduct when analyzing a nuance of negligence in tort law.  I am taking a bit of a detour and frolicking in an increasingly noisy element of explaining the complexity of our work here.  (The detour comes from the fact that as "Development Officer" my charge is ensuring the Foundation and its projects are financed, backed, supported, and succeed in adoption.  The frolic is the commentary below about software development methodologies, although I am not currently engaged in or responsible for technical development outside of my contributions to UX/UI design.)  Yet, I won't attempt to deny that this post is also a bit of promotion for our stakeholders -- elections IT officials who expect us to address their needs for formal requirements, specifications, benchmarks, and certification, while embracing the agility and speed of modern development methodologies.

This post was catalyzed by chit-chat at dinner last evening with an energetic technical talent who is jacked-up about the notion of elections technology being an open source infrastructure.  Frankly, in 5 years we haven't met anyone who wasn't jacked-up about our cause, and their energy is typically around "damn, we can do this quick; let's git 'er done!"  But it is about at this point where the discussion always seems to get a bit sideways.  Let me explain.

I guess I am exposing a bit of old school here, but having had formal training in computer systems science and engineering (years ago), I believe data modeling -- especially for database-backed enterprise apps -- is an absolute priority.  And the stuff of elections systems is serious technology, requiring a significant degree of fault tolerance, integrity and verification assurance, and, perhaps most important, a sound data model.  Modeling takes time and requires documentation, both of which are nearly antithetical to today's pop culture of agile development.

Bear in mind, the TTV Project embraces agile methods for UX/UI development efforts. And there are a number of components in the TTV elections technology framework that do not require extensive up-front data modeling and can be developed purely in an iterative environment.

However, we claim that data modeling is critical for certain enterprise-grade elections applications because, as many seasoned architects have observed: [a] the data itself has meaning and value outside of the app that manipulates it, and [b] scalability requires a good DB design, because you cannot just add it in later.  The data model, or DB design, defines the structure of the database and the relationships between the data sets; it is, in essence, the foundation on which the application(s) are built.  A solid DB design is essential to achieving a scalable application.  Which leads to my lingering question:  How do agile development shops design a database?

I've heard the "Well, we start with a story..." approach.  And when I ask those whom I really respect as enterprise software architects with real DB design chops -- people who also respect and embrace agile methodologies -- they tend to express reservations about the agile mindset being boorishly applied to truly scalable, enterprise-grade relational DB design, the kind that results in a well-performing application and sound data integrity.
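To make concrete the kind of up-front decisions at stake, here is a minimal, purely hypothetical sketch; the table and column names are illustrative only and are not the TrustTheVote schema.  The point is that every relationship and constraint below is a modeling decision that has to be made, and documented, before the first screen exists, because every application and report downstream depends on it.

-- Hypothetical, highly simplified illustration (not our actual data model).
CREATE TABLE election (
  election_id   INTEGER PRIMARY KEY,
  election_name VARCHAR(200) NOT NULL,
  election_date DATE NOT NULL
);

CREATE TABLE contest (
  contest_id   INTEGER PRIMARY KEY,
  election_id  INTEGER NOT NULL REFERENCES election (election_id),
  contest_name VARCHAR(200) NOT NULL
);

CREATE TABLE ballot_style (
  ballot_style_id INTEGER PRIMARY KEY,
  election_id     INTEGER NOT NULL REFERENCES election (election_id)
);

-- The many-to-many decision: which contests appear on which ballot styles.
CREATE TABLE ballot_style_contest (
  ballot_style_id INTEGER NOT NULL REFERENCES ballot_style (ballot_style_id),
  contest_id      INTEGER NOT NULL REFERENCES contest (contest_id),
  PRIMARY KEY (ballot_style_id, contest_id)
);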

Friends, I have no intention of hating on agile principles and lightweight development methods -- they have an important role in today's application software development space and an important role here at the Foundation -- but at the same time, I want to try to explain why we cannot simply "bang out" new elections apps for ballot marking, tabulation, or ballot design and generation in a series of sprints and scrums.

First, in all candor, I fear this confusion rests in the reality that fewer and fewer developers today have had a complete computer science education, and cannot really claim to be disciplined software engineers or architects.  Many (not all) have simply hacked around with development tools and taught themselves, because they built a web site or implemented a digital shopping bag for a friend (much like the well-intentioned developer my wife and I met last evening).

Add to that the fact that the formality and discipline of compiled code has given way to the rapid prototyping benefits of interpreted code.  And in the process of this modern training in software development (almost exclusively for the sandbox of the web browser as the UX/UI vehicle), what has been forgotten is that data modeling exists not because it creates overhead and delays, but because it removes such impediments.

Look at this another way.  I like to use building analogies -- perhaps because I began my collegiate studies long ago in architectural engineering before realizing that computer graphics would replace drafting.  There is a reason we spend weeks, sometimes months, traveling past large holes in the ground with towers of re-bar, forms, and concrete pouring, without any clue of what will really stand there once finished.  And yet, later, as the skyscraper takes form, the speed with which it comes together seems to accelerate almost weekly.  Without that foundation carefully laid, the building cannot stand for any extended period of time, let alone bear the dynamic and static weights of its appointments, systems, and occupants.  So too is this the case with complex, highly scalable, fault-tolerant enterprise software -- without the foundation of a solid data model, the application(s) will never be sustainable.

I admit that I have been out of production-grade software development (i.e., in the trenches: coding, compiling, linking, loading, dealing with lint, and running in debug mode) for years, but I can still climb on the bike and turn the pedals.  The fact is, data flow and data model could not be more different, and the former cannot exist without the latter.  It is well understood, and has been demonstrated many times, that one cannot create a data flow out of nothing.  There has to be a base model as the foundation of one or more data flows, each mapping to its application.  Yet in our discussion, punctuated by a really nice wine and great food, this developer seemed to want to dismiss modeling as something that can be done later... perhaps like refactoring (!?)

I am beginning to believe this fixation of modern developers on "rapid," non-data-model development is misguided, if not dangerous for its latent, time-shifted costs.

Recently, a colleague at another company was involved with the development of a system where no time whatsoever was spent on data model design.  Indeed, the screens started appearing in record time.  The UX/UI was far from complete, but usable.  And the team was cheered as having achieved great "savings" in the development process.  However, when it came time to expand and extend the app with additional requirements, the developers waffled and explained they would have to recode the app in order to meet the new process requirements.  The data was unchanged, but the processes were evolving.  The balance of the project ground to a halt amid the dismissal of the first team, arguments over why up-front requirements planning had not been done, and figuring out whom to hire to fix it.

I read somewhere of another development project where the work was getting done in 2-week cycles.  They were about 4 cycles away from finishing when a task called "concurrency" appeared on the tracker schedule for the penultimate cycle.  The project subsequently imploded because all of the code had to be refactored (a core entity actually turned out to be two entities).  Turns out that no upfront modeling led to this sequence of events, but unbelievably, the (agile) development firm working on the project spun this as a "positive outcome;" that is, they explained, "Hey, it's a good thing we caught this a month before go-live."  Really?  Why wasn't that caught before that pungent smell of freshly cut code started wafting through the lab?

Spin doctoring notwithstanding, the scary thing to me is that performance and concurrency problems caused by a failure to understand the data are being caught far too late in the agile development process, which makes it difficult if not impossible to make real improvements.  In fact, I fear that many agile developers operate on the misguided principle that every data model should be:

create table DATA
 (key INTEGER,
 stuff BLOB);

Actually, we shouldn't joke about this.  That idea comes from a scary reality: a DBA (database administrator) friend tells of a development team he is interacting with on an outsourced State I.T. project that has decided to migrate a legacy non-Oracle application to Oracle using precisely this approach.  Data that had been stored as records in old ISAM-type files will be stored in Oracle as byte sequences in BLOBs, with an added system-generated surrogate primary key.  When he asked what the point of that approach was, no one at the development shop could give him a reasonable answer other than "in the time frame we have, it works."  It raises the question: what do you call an Oracle database where all the data in it is invisible to Oracle itself and cannot be accessed or manipulated directly using SQL?  Or said differently, would you call a set of numbered binary records a "database," or just "a collection of numbered binary records?"
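To illustrate the difference (with invented table names, not the actual State project), compare the blob-migration approach to even the most minimal relational treatment of the same records:

-- The migration approach described above: Oracle can store the bytes,
-- but cannot see inside them (illustrative sketch only).
CREATE TABLE legacy_record (
  record_id NUMBER PRIMARY KEY,  -- surrogate key generated during migration
  payload   BLOB                 -- the old ISAM record, verbatim
);

-- Even a minimal relational treatment of the same data lets SQL do its job:
CREATE TABLE voter_transaction (
  record_id   NUMBER PRIMARY KEY,
  county_code VARCHAR2(10) NOT NULL,
  status      VARCHAR2(20) NOT NULL,
  updated_at  DATE         NOT NULL
);

-- Possible against the second design, meaningless against the first:
SELECT county_code, status, COUNT(*)
  FROM voter_transaction
 GROUP BY county_code, status;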

In another example of the challenges of agile development in a database-driven app world, a DBA colleague describes being brought in on an emergency contract basis to an Agile project under development on top of Oracle, to deal with "performance problems" in the database.   Turns out the developers were using Hibernate and apparently relied on it to create their tables on an as-needed basis, simply adding a table or a column in response to incoming user requirements and not worrying about the data model until it crawled out of the code and attacked them.
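For what it's worth, here is a hedged sketch (entirely invented, not that project's actual schema) of what schema-by-accretion tends to look like: columns appended one user story at a time, everything nullable free text, no relationships declared, and the indexes arriving only after the "performance problem" does.

-- Hypothetical result of letting the schema accrete column by column.
CREATE TABLE request (
  id            NUMBER PRIMARY KEY,
  voter_name    VARCHAR2(400),
  voter_name_2  VARCHAR2(400),   -- added in a later sprint
  county        VARCHAR2(400),   -- free text; should reference a county table
  status_flag   VARCHAR2(400),
  misc_notes    VARCHAR2(4000)
);

-- The constraints and indexes show up only after the pain does.
CREATE INDEX request_county_ix ON request (county);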

This sort of approach to app development is what I am beginning to see as "hit and run."  Sure, it has worked so far in the web app world of start-ups: get it up and running as fast as possible, then exit quickly and quietly before they can identify you as triggering the meltdown when scale and performance start to matter.

After chatting with this developer last evening (and listening to many others over recent months lament that we're simply moving too slowly), I am starting to think of agile development as a methodology of "do anything rather than nothing, regardless of whether it's right."  And this may be to support the perception of rapid progress: "Look, we developed X components/screens/modules in the past week."  Whether any of this code will stand up to production performance environments is to be determined later.

Another agile principle is incremental development and delivery.  It's easy for a developer to strip out a piece of poorly performing code and replace it with a chunk that offers better or different capabilities.  Unfortunately, you just cannot do this in a database: you cannot throw away old data in old tables and simply create new empty tables.
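To see why, here is a hypothetical sketch (invented names, not our schema) of what that earlier "one core entity is really two" discovery means once real rows exist: the old data has to be migrated forward, not discarded.

-- Hypothetical: "voter" turns out to conflate a person with a registration.
-- With code you can delete the old class; with data you must carry it forward.
CREATE TABLE person (
  person_id NUMBER PRIMARY KEY,
  full_name VARCHAR2(200) NOT NULL
);

CREATE TABLE registration (
  registration_id NUMBER PRIMARY KEY,
  person_id       NUMBER NOT NULL REFERENCES person (person_id),
  county_code     VARCHAR2(10) NOT NULL,
  status          VARCHAR2(20) NOT NULL
);

-- Migrate the existing rows out of the old, conflated table ...
INSERT INTO person (person_id, full_name)
  SELECT voter_id, full_name FROM voter;

INSERT INTO registration (registration_id, person_id, county_code, status)
  SELECT voter_id, voter_id, county_code, status FROM voter;

-- ... and only then retire it.
DROP TABLE voter;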

The TrustTheVote Project continues to need the kind of talent this person exhibited last evening at dinner.  But her zeal aside (and obvious passion for the cause of open source in elections), and at the risk of running off the (Ruby) rails here, we simply cannot afford to have these problems happen with the TrustTheVote Project.

Agile methodologies will continue to have their place in our work, but we need to be guided by some emerging realities and appreciate that, however fast someone wants to crank out a poll book app or a ballot marking device, we cannot afford to short-cut simply for the sake of speed.  Some may accuse me of being a waterfall Luddite in an agile world; however, I believe there has to be some way to mesh these things, even if it means requirements scrums, data modeling sprints, or animated data models.

Cheers GAM|out

Comment