Title: eVote System Certification in the USA
The Florida Recount Disaster of the 2000 Elections
- Started the move towards eVote systems in the US
- Old-fashioned manual punch-card systems (Votomatic)
  - Often used in low-income counties that had no money to buy new equipment
  - Hanging chads: holes not fully punched through
- Confusing paper ballot design
- → Uncertainty about voter intentions
The old federal certification process
- National Association of State Election Directors (NASED), in effect since 1994
  - No federal funding
- Voting systems tested by Independent Testing Authorities (ITAs) against the 1990 Federal Election Commission Voting System Standards (VSS)
  - Slightly updated in 2002 (before HAVA passed)
- NASED reviews the ITA report and certifies a system as meeting federal standards
- Conflict of interest: ITAs are commercial companies; vendors select the ITAs and pay them directly → ITAs have no interest in negative reports
- Almost all systems used in US elections were NASED/ITA certified, yet the certification failed to prevent disasters like Florida 2000 or to find the errors uncovered by the CA TTBR (see below)
5"Help America Vote Act" (HAVA)
- Passed in October 2002
- Objective
- Modernize US election technology to avoid
situations like Florida 2000 in the future,
through - Creation of the Federal Election Assistance
Commission (EAC), which would - Establish uniform election system standards and
create a new, more efficient federal
certification system - And 3.9 billion dollars in federal funding for
states to buy new technology, guided by the EAC
New certification process delayed
- HAVA requires the EAC to develop new voting system standards by January 1, 2004
- These standards were to help states select technology to upgrade their election systems (using the federal funding) by January 1, 2006
- BUT the appointment of EAC commissioners was delayed by almost 10 months
- BUT only US$2 million (of the US$30 million planned 2003 EAC budget for testing and R&D) was provided
- → No guidelines in 2003
2004: Money without guidance
- In 2004, of the US$50 million budgeted for testing, research, and development of standards, only US$1.2 million was paid out
- → No standards / certification in 2004
- BUT in 2004, US$1,300 million was paid out to states to buy new technology
- The US Dept. of Justice insists on states having new equipment ready by January 1, 2006
- → Huge new, unregulated market for voting equipment makers
Sell whatever you have, quickly
- Equipment makers rush to market
  - Immature products; focus on features, not code design
  - Insecure software
- Counties buy whatever looks good
  - No in-house IT expertise to evaluate
  - No EAC guidance on what's good and what's not
- → Thousands of small and not-so-small disasters caused by faulty voting systems
First guidelines only in late 2005
- Voluntary Voting System Guidelines (VVSG) published only on December 13, 2005 (designed by NIST, approved by the EAC)
  - Went into effect only in 2007
- To bridge the gap, in June 2006 the EAC essentially took over the NASED/ITA program, with all its flaws
- The EAC's own testing and certification program started only in January 2007
Current EAC System
- Similar system to NASED's (ITAs are now "voting system test laboratories", or VSTLs)
- Testing against VVSG 2005
- BUT a similar conflict of interest (vendors select and pay the VSTLs directly)
- Still voluntary: states may require EAC certification, but don't have to
- Better: a Quality Monitoring Program reviews systems after certification, and may de-certify for vendor misinformation, use of non-certified versions in the field, unauthorized changes, malfunctions and bugs in the field, etc.
- The updated VVSG II is still not finished; the EAC tests against the 2005 standards
Friendly testing vs. adversarial testing
- VVSG 2005 is fairly comprehensive, but the EAC testing methods used to verify it are not sufficient
- EAC does "friendly" testing: test cases are defined based on the functions the equipment is supposed to have
  - "Does it do what it says it does?"
  - Predictable; does not anticipate unusual situations or creative attacks
- Adversarial testing: assemble a group of smart people and say "Let's see if we can break this!" (see the sketch after this list)
- → State certification programs like the California TTBR, Ohio EVEREST, Florida SAIT
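To make the contrast concrete, here is a minimal Python sketch. Everything in it is hypothetical and invented for illustration (the tally() function, its blank-ballot flaw, and both tests); it is not code from any real voting system or EAC test suite. The friendly test exercises only the documented behavior and passes; the adversarial test throws random malformed ballots at the same function and breaks it.

```python
# Hypothetical sketch: "friendly" (spec-based) vs. adversarial testing.
import random
import string

def tally(ballots):
    """Count votes per candidate from a list of ballot strings."""
    counts = {}
    for ballot in ballots:
        name = ballot.strip()
        # Hypothetical flaw: blank ballots are silently counted under
        # the empty-string key instead of being rejected.
        counts[name] = counts.get(name, 0) + 1
    return counts

def friendly_test():
    # Friendly: only checks the documented happy path.
    assert tally(["Alice", "Bob", "Alice"]) == {"Alice": 2, "Bob": 1}
    print("friendly test passed")

def adversarial_test():
    # Adversarial: feed random, malformed input and check an invariant
    # (every counted key must be a non-empty candidate name).
    random.seed(0)
    for _ in range(1000):
        ballots = [
            "".join(random.choices(string.ascii_letters + " \t",
                                   k=random.randint(0, 5)))
            for _ in range(random.randint(0, 10))
        ]
        for name in tally(ballots):
            assert name, f"blank ballot counted as a vote: {ballots!r}"

friendly_test()     # passes: the documented behavior works
adversarial_test()  # fails fast, exposing the blank-ballot flaw
```

The point is not this specific bug but the method: friendly testing confirms the specification, while adversarial testing checks invariants under input the designer never anticipated.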
California Top-to-Bottom Review (TTBR)
- Introduced in 2007 by Secretary of State (SoS) Debra Bowen in response to weak federal certification
- All currently certified systems in use in CA are reviewed under a new methodology
- Severe security flaws found in all systems
- The SoS office decertifies all systems for use in California (both scanners and DREs)
- Imposes strict usage conditions for re-certification:
  - For Sequoia and Diebold: only early voting; on Election Day, only one machine per polling place (for disabled access)
  - All results from them must be manually recounted (100%)
  - Hart InterCivic may be used more freely
  - ES&S didn't submit its software and was directly decertified
  - All vendors must produce plans to harden their equipment against the security vulnerabilities found by the TTBR
Consequences for the states
- States had been rushed by the Dept. of Justice to buy machines by January 1, 2006, even without EAC guidance
- Now, in CA, millions of US dollars' worth of equipment (especially DREs) sat in storage and could not be used → wasted taxpayer dollars
- Counties had to revert to paper elections (e.g. Santa Clara County) or buy different, certified machines, spending extra money
Modules of the TTBR
- Penetration analysis / red-team attacks
  - First without system knowledge, then with full system knowledge
- Source code / architectural review (see the illustrative sketch after this list)
- Hardware review
- Documentation review
- Accessibility review
- → Threat assessment; define use conditions to mitigate the security weaknesses found
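As an illustration of what a source code review can surface, here is a minimal, hypothetical Python sketch of one well-known flaw class in this domain: a cryptographic key compiled into every unit, so extracting it from one machine compromises all of them. The constants and function names are invented for this sketch and are not taken from any system examined by the TTBR.

```python
# Hypothetical illustration of a flaw class source-code reviews flag.
import hmac
import hashlib

# Anti-pattern: the same key ships compiled into every machine, so
# anyone who extracts it from one unit can forge records on all of them.
HARDCODED_KEY = b"SHARED-SECRET-123"  # invented placeholder

def sign_results(results: bytes) -> bytes:
    return hmac.new(HARDCODED_KEY, results, hashlib.sha256).digest()

# Remediation pattern: a per-machine key provisioned at install time
# (e.g. loaded from a protected key store rather than compiled in).
def sign_results_per_machine(results: bytes, machine_key: bytes) -> bytes:
    return hmac.new(machine_key, results, hashlib.sha256).digest()
```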
Advantages
- The vendor pays the SoS, not the test lab
  - The SoS then selects the team who will audit
  - → No conflict of interest
- Audit teams are from state universities (professors and grad students), not commercial companies
- The name and CV of each participating auditor are published online → academic reputation as guarantor of integrity
- The teams elaborate a report; the SoS issues
  - certification,
  - conditional certification (under use conditions), or
  - rejection
- The complete reports of the teams are available online, not just summaries
Handling system changes
- The SoS must be informed of each system change
- The SoS decides:
  - If the change is minor, the certification rolls over to the new version
  - Otherwise, a full new certification is required
- Temptation for vendors not to declare system changes, to avoid the cost of re-certification (a detection sketch follows this list)
- Case of ES&S: in November 2007, the SoS sued ES&S for selling 972 AutoMARK Model A200 ballot-marking machines to several counties that contained hardware changes not authorized by the Secretary of State
  - Settled for a fine of US$3.25 million in 2009
Vendors squeezed by cost and deadlines
- Problem: the need for system upgrades often arises at short notice
- Not enough time to develop new software and pass through the certification process in time for elections (it takes months)
- Because EAC certification is weak, states run their own certification systems, but this forces vendors to pay for all the different certifications in every state they want to sell in → prohibitively costly and time-consuming
- → Market consolidation; only the strongest vendors survive
Outlook
- One strong federal certification system (modeled on state best practice) should make state certification superfluous
- Cheaper for vendors, easier market entry
- Thank you!
- Ingo.boltz@gmail.com