Thursday, April 4, 2013

How long will it take to Brute Force AES?

How long would it take to brute force a given ciphertext encrypted with 256-bit AES?

As of April 3, 2013, breaking AES-256 by guessing the key used on a given block of data with the biclique cryptanalysis method developed by Andrey Bogdanov, Dmitry Khovratovich, and Christian Rechberger [i] requires a minimum of 2^254 operations over a data space of 2^40 bits (128 gigabytes).

Processor                                            Operations Per Second   Number of Operations   Time in Years (@ 100% Success)
Xeon E5-4650                                         1.4969 x 10^13          2^254                  6.12818 x 10^55
i7-3970XM                                            1.2969 x 10^13          2^254                  7.07323 x 10^55
AMD 7970 (GPGPU)                                     3.584 x 10^15           2^254                  2.55951 x 10^53
Titan (Oak Ridge National Labs, current Top 500 king) 1.759 x 10^18          2^254                  5.2150 x 10^50

Again, this is a perfect scenario: a sample size of 2^40 bits (128 gigabytes), all encrypted with the same 256-bit symmetric key.
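The figures in the table above are easy to reproduce; here is a back-of-the-envelope sketch using the per-processor rates listed there (365.25-day years and 100% efficiency assumed):

```python
# Years to perform 2^254 key-guessing operations at a given sustained
# rate, assuming 100% efficiency (rates taken from the table above).
OPERATIONS = 2 ** 254
SECONDS_PER_YEAR = 365.25 * 24 * 3600

rates = {
    "Xeon E5-4650": 1.4969e13,
    "i7-3970XM": 1.2969e13,
    "AMD 7970 (GPGPU)": 3.584e15,
    "Titan": 1.759e18,
}

for name, ops_per_sec in rates.items():
    years = OPERATIONS / ops_per_sec / SECONDS_PER_YEAR
    print(f"{name}: {years:.4e} years")
```

Swapping in a different processor is just a matter of changing its sustained operations-per-second figure.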

The world’s fastest computer would take 5.215 x 10^50 years (a 51-digit number of years) at 100% accuracy, and since the research stipulates a best-case hit rate of 63%, we may infer that the real figure would be longer still. Bogdanov stated the following in an interview:

"To put this into perspective: on a trillion machines, that each could test a billion keys per second, it would take more than two billion years to recover an AES-128 bit key"[ii]
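That quoted figure squares with the biclique paper's complexity estimate of roughly 2^126.1 operations for AES-128; a quick sanity check (the 2^126.1 figure comes from the paper, the machine counts from the quote):

```python
# Sanity-check the quote: a trillion machines, each testing a billion
# keys per second, against the ~2^126.1 biclique complexity for AES-128.
ops_needed = 2 ** 126.1
ops_per_second = 1e12 * 1e9              # a trillion machines x a billion keys/s
seconds = ops_needed / ops_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.2e} years")              # comfortably "more than two billion years"
```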

Now let us take Moore’s Law into account, assuming that Titan’s operations per second will double every twelve months.

Number of Years = Operations Required / (Operations Available per Year x Efficiency)

A good ball-park method is to assume that the "Operations Available per Year" for a given computing system doubles annually.
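Applying that ball-park rule to Titan — a sketch assuming its 1.759 x 10^18 ops/s starting rate from the table above and a clean doubling of capacity every year:

```python
# How many years until cumulative operations reach 2^254, if each
# year's sustained rate is double the previous year's? (Starting
# rate is Titan's 1.759e18 ops/s; 100% efficiency assumed.)
TARGET = 2 ** 254
ops_per_year = 1.759e18 * 365.25 * 24 * 3600  # first year's budget

total, years = 0.0, 0
while total < TARGET:
    total += ops_per_year   # spend this year's budget
    ops_per_year *= 2       # Moore's-Law doubling for next year
    years += 1

print(years)
```

Even with the search capacity doubling every single year, the loop still runs on the order of 170 iterations: the doubling swallows most of the exponent, but not enough of it to matter on a human timescale.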

Even a multibillion-dollar supercomputer like Titan, running Bogdanov et al.'s method at 100% efficiency, would need roughly 5.215 x 10^50 years.

That’s 521,500,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years.
 
Current scientific estimates state that our sun will become a red giant, expanding as it exhausts its hydrogen for fusion and consuming all inner planetary bodies, including the Earth, when it is around 7.59 billion years old. Current estimates put the sun at 4.5 billion years of age, which means around 3.09 billion (3,090,000,000) years are left before this event occurs, give or take a millennium.

This analysis covers only the most recently published method to brute force AES. Side-channel analysis and other attacks on the implementation of the algorithm, such as improper key storage, poor memory management, or the failure of security controls on a given system, are excluded from discussion here; this is a purely theoretical exercise that does not consider the platform or implementation of AES on a particular computer.

These estimates assume the current environment remains unchanged. That holds unless someone builds a quantum computer capable of executing Shor’s, Grover’s, or similar quantum factorization and search algorithms (requiring on the order of 1,000 or more entangled qubits), or unless there is some radical break in Moore’s Law, such as inexpensive atomic-scale replication and manufacture of complex systems (the Drexler revolution), or Kurzweil’s singularity arrives and self-aware systems more intelligent than humanity become better at designing chips than we are. Barring those, this timeline is likely to remain roughly constant.

IBM demonstrated an implementation of Shor’s algorithm with 7 qubits back in 2001[iv]. The key requirement for quantum systems, though, is the sheer number of qubits; the evolution of these systems is proving far more complex than that of traditional transistor-on-silicon VLSI computing and its associated developmental challenges.

Estimates for the number of qubits needed vary, but somewhere between 512 and 1,000 would be required to break AES or RSA in a timely way, i.e., before the sun expands. Their development faces both real theoretical problems in physics and the practical engineering difficulties, such as environmental isolation, that come with those physics problems. Consider that large physics experiments around CERN, on the Franco-Swiss border, are built deep underground partly for this reason: there is little to no background radiation down there.
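Grover's algorithm is the quantum threat relevant to a symmetric cipher like AES: it searches an unsorted space of N keys in roughly the square root of N oracle queries, effectively halving the key's bit strength. A sketch of the arithmetic (not of the quantum circuit itself):

```python
import math

# Grover's search needs on the order of sqrt(N) queries over N = 2**n
# keys, so an n-bit symmetric key retains roughly n/2 bits of quantum
# security -- one reason AES-256 is preferred over AES-128 long-term.
for n in (128, 256):
    classical_guesses = 2 ** n
    grover_queries = math.isqrt(classical_guesses)  # exactly 2**(n/2) here
    print(f"AES-{n}: 2^{n} classical guesses vs ~2^{int(math.log2(grover_queries))} Grover queries")
```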

As Schneier notes[iii], attacks against cryptosystems always get better, never worse. D-Wave Systems is making massive inroads (more like major highways) into discrete optimization problems using quantum computing, and its board has announced that, per Rose’s Law, it will soon command more computing power than is otherwise available in this universe — though its architecture limits it to those optimization problems. Scholars and industry pundits do not fear that event itself, but openly acknowledge the risk it poses to cryptosystems including RSA and AES; the only real question is when a company or organization will produce a D-Wave-like system that runs Shor’s or Grover’s algorithms.
 
In a quantum machine, environmental background radiation plays the role that electrical or RF noise plays in a traditional electrical machine. Ever notice how a personal radio hums next to a fluorescent light? The only difference is that the fix requires a kilometer of hard stone rather than a grounded metal box, an isolated power supply, and an exterior antenna.

A cryptosystem is considered broken when a method exists to solve for a key more efficiently than brute force, using known ciphertexts and cryptanalysis. Even broken cryptosystems often remain useful because of the time and effort still required to search for a given key.

Ways to defeat this and future attacks include moving public-key systems to schemes, such as lattice-based cryptosystems, that do not rest on the discrete logarithm or large-prime factorization problems — and, for symmetric ciphers like AES, simply using longer keys.

References;

[i] Bogdanov, Andrey; Khovratovich, Dmitry; Rechberger, Christian (K.U. Leuven, Microsoft Research, ENS Paris / France Telecom, 2011) Biclique Cryptanalysis of the Full AES [PDF Document] Available Online: http://research.microsoft.com/en-us/projects/cryptanalysis/aesbc.pdf (Accessed on April 2nd 2013)

[ii] Neal, Dave (The Inquirer, August 17th 2011) AES encryption is cracked [World Wide Web] Available Online: http://www.theinquirer.net/inquirer/news/2102435/aes-encryption-cracked (Accessed on April 3rd 2013)

[iii]Schneier, Bruce (August 8th 2011) New Attack on AES [World Wide Web] Available Online: http://www.schneier.com/blog/archives/2011/08/new_attack_on_a_1.html (Accessed on April 3rd 2013) 

 [iv]N.a. (IBM, 2001)  IBM's Test-Tube Quantum Computer Makes History [World Wide Web] Available Online: http://www-03.ibm.com/press/us/en/pressrelease/965.wss (Accessed on April 4th 2013)


Monday, August 13, 2012

Eyes are the Windows to the Soul


On the topic of biometrics: the goal is to use a machine-readable component of your physiology as one means of identifying you as an individual to a machine and its systems. In very strong authentication, a password is something you know, a token is something you have, and a biometric is generated by something you are. As with any identification system, accuracy tracks cost: the cheaper the device, the less accurate the metrics it generates, and thus the more susceptible to exploitation it becomes.
 
Biometric systems generate two key error figures from whatever it is they scan: the False Acceptance Rate (FAR) and the False Rejection Rate (FRR). The two are at odds in the identification mechanisms commonly used, because driving either down without simply inflating the other means increasing the accuracy — and therefore the cost — of the system.
 
Figure 1: FAR & FRR of Biometrics



The crossover error rate (CER), the point at which FAR equals FRR, is often used as the primary measure of both a given biometric and its accuracy as a system. The (ISC)2[ii] also provides, within the same page, an overview of this accuracy expressed as the odds of an error on a given scan.
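To make the FAR/FRR trade-off concrete, here is a small simulation — the match-score distributions are invented purely for illustration — that sweeps the acceptance threshold and locates the crossover point where the two error rates meet:

```python
import random

random.seed(42)  # deterministic illustration

# Hypothetical match scores: genuine attempts score higher on average
# than impostor attempts, but the two distributions overlap.
genuine  = [random.gauss(0.75, 0.08) for _ in range(10000)]
impostor = [random.gauss(0.45, 0.10) for _ in range(10000)]

def far(threshold):
    """False Acceptance Rate: fraction of impostors scored at/above threshold."""
    return sum(s >= threshold for s in impostor) / len(impostor)

def frr(threshold):
    """False Rejection Rate: fraction of genuine users scored below threshold."""
    return sum(s < threshold for s in genuine) / len(genuine)

# Sweep the threshold; the CER sits where FAR and FRR cross.
cer_threshold = min((t / 1000 for t in range(1001)),
                    key=lambda t: abs(far(t) - frr(t)))
print(f"CER near threshold {cer_threshold:.3f}: "
      f"FAR={far(cer_threshold):.2%}, FRR={frr(cer_threshold):.2%}")
```

Raising the threshold trades false acceptances for false rejections; a better (and dearer) sensor narrows the overlap between the two distributions, pushing the crossover error rate down.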

Biometric                                                   Crossover Accuracy
DNA Sequencing                                              PCR is error-prone[1] and expensive
Retinal Scan                                                1:100,000,000
Vascular Scan (ultrasound or infrared on hand or finger)    1:100,000,000[iii]
Iris Scan                                                   1:131,000
Facial Recognition                                          1:2,000[iv]
Fingerprint                                                 1:500
Hand Geometry                                               1:500
Signature Dynamics                                          1:50
Voice Dynamics                                              1:50

Figure 2: CER Rates of Common Biometrics

The (ISC)2 has identified that the method used to access a system should be determined by the sensitivity of the data it contains, and further that this valuation should be the primary metric when establishing acceptable spending on access to that system. The cost of biometric systems is proportional to their accuracy. The most common biometric used today is the fingerprint reader, owing to its low production cost coupled with our accepted fingerprint culture — police have used fingerprints for over a century. Being the cheapest biometric available also makes it the most likely to be exploited: the same methods used to lift fingerprints from a crime scene can be used to fool a fingerprint reader, whether capacitive or optical. Of the two, the capacitive type is the most widely deployed because of cost, and also the most easily fooled.

DNA scanners are at present too unreliable, and the process too error-prone, to be used; however, rapid sequencing technology is being developed by various competitors as a means to screen for hereditary health issues.[v] DNA is often called the one true biometric identifier for humans, but identical twins share essentially the same DNA. A truly unique biometric trait for every human is vascular structure; it differs even between twins, as it is a function of a person’s individual growth. Although vascular scans of any type are the most accurate and the hardest to fool, since they often detect a pulse, they are also the most expensive to implement: they may require medical-grade equipment in the case of retinal scans, or near-infrared imaging in the case of thermal hemoglobin images. Palm vein scanners have existed commercially since 1997[vi]; retinal scanners have been around even longer, as optical cameras adapted from optometry and ophthalmology.

Another class of biometric is the behavioral trait: gait analysis, typing analysis, and any consistently repeated behavior. Touch typists have very consistent key flight and dwell times; because these depend on a learned skill they may drift over time, but one’s typing, once taught and learned, generally stays within a given range. Since each user is unique, their typing pattern can serve as a kind of signature, and the traditional field of signature analysis offers further behavioral options. The limiting issues here, again, are the cameras and sensors used to capture these behaviors.

Ultimately, all authentication systems are built on electronics and electrical equipment. So regardless of the biometric used, somewhere in the access-control chain sit the wires that interface with the access point in question and its mechanical retaining mechanisms, such as an electric door strike or a similar magnetic locking mechanism. More often than not these wires are routed to be accessible by maintenance staff, and they constitute the easiest way to circumvent a biometric in access control — although you will be hard-pressed to "hotwire" an electric strike lock without being noticed by any local security personnel present.

Common Methods of Compromise

Many a great murder mystery has turned on a protagonist framed by the lifting and placing of a single fingerprint. Notable real attacks include using gummy bears together with the grease print left by a finger and the water vapor of one’s exhaled breath (Leyden, 2002)[vii], although most scanners now measure ambient humidity or take a picture to prevent this. Homemade gelatin molds seem to work best, as they have a capacitance similar to skin. Another method is to use a "latent" fingerprint harvested from a device or from the subject’s environment to create the required mold (Stén, Kaseva, Virtanen)[viii]; this method showed 100% success. Combined with the very public nature of people’s work environments, this means fingerprints offer no security against a well-motivated attacker able to create a fake finger that fools the scanner. Scanners that also integrate thermal or vascular readings do an excellent job of raising the overall security of the device; nevertheless, most laptops sold today use a capacitive fingerprint scanner because of its low cost.

The availability of made-to-order contact lenses with printed elements has given rise to digitally printed impostor irises that require only a high-resolution picture of the target iris. With improving camera sensors and the reach of social media, such a picture is easier to obtain than previously thought: using open-source intelligence methods, a motivated individual could harvest photos from public profiles and determine iris composition with some educated guesswork and recently demonstrated advances. A team of security researchers recently showed that even the 200-plus-point comparison algorithm used by iris scanners can be fooled by subjecting it to a database of computer-generated irises; although the exercise involved only images, it demonstrated that the software algorithm matching our eyes to a stored value is vulnerable to such manipulation. (Zetter, 2012)[ix]

The common themes amongst methods used to exploit biometric authentication systems are as follows:
  1. Use a fake replica of the real object to fool the sensor.
  2. Leverage weaknesses in the sensor's implementation of the matching algorithm.
  3. Manipulate the conditions of the environment.
  4. Avoid the entire process of authentication and authorization.
  5. Manipulate the process of authorization, for example by committing help-desk fraud.
In an ideal world, entrance to a secure area would involve a mantrap with armed guards and a multi-factor authentication point combining at least one behavioral trait, one token (such as a key fob), and one biometric — either retinal or a near-infrared/thermal vascular map such as a palm or finger vasculature scan — with each device contributing one component of authentication and authorization to the whole system.

Economical and fast DNA sequencing does not yet exist; once it does, a twin presenting the right DNA could still register as a false positive, so all the other components would remain necessary. Gait analysis would require high-resolution video and machine-learning systems to identify a person as they walk, and sports injuries or changes in muscle tone or health would affect its accuracy. Facial geometry likewise requires video and software, whereas typing analysis needs only an LCD screen, a keyboard, a computer, and software.

Investment in the systems used to authenticate people to the computers they use, and to the areas they access, can be seen as increasing the value of the information generated — and it encourages workplace accountability by reminding individuals that they can be held to account for any and all actions in the workplace.

 

Figure 3: The three standard factors of authentication

As stated often by security professionals and pundits alike: a password is "something you know"; a fingerprint is "something you are"; a key card, smart card, token, fob, or USB key with a certificate is "something you have". This Venn diagram omits behavioral traits, as those systems are in their infancy, priced well beyond most organizations, and raise privacy concerns. "Two-factor" authentication means picking two of the above domains and building an authentication system that uses both. RSA’s smart-card-enabled fobs with one-time passwords are an excellent example of something you have plus something you know; CryptoCard has a system that generates its own seed files on site.
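The "pick two domains" rule is simple enough to express directly. A minimal sketch — the credential-to-factor mapping below is a hypothetical illustration, not a standard:

```python
# Hypothetical mapping from credential types to authentication factors.
FACTOR_OF = {
    "password":    "something you know",
    "pin":         "something you know",
    "smart_card":  "something you have",
    "otp_fob":     "something you have",
    "fingerprint": "something you are",
    "iris_scan":   "something you are",
}

def is_multi_factor(credentials):
    """True only if the credentials span at least two distinct factor domains."""
    return len({FACTOR_OF[c] for c in credentials}) >= 2

print(is_multi_factor(["password", "pin"]))      # two of the same domain -> False
print(is_multi_factor(["password", "otp_fob"]))  # know + have -> True
```

The point the check makes explicit: a password plus a PIN is two credentials but still one factor, while a password plus a fob genuinely spans two domains.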

A smart card containing biometrically encoded data, inserted into a reader alongside a hand-geometry, iris, fingerprint, or retina comparison (plus a PIN), is an excellent example of something you have, something you know, and something you are. However, unless the biometric is extended to include some real-time data such as ambient temperature, pulse rate, or even smell — relevant information about the "something you are" — then, as Schneier has observed, the sensor in question may be fooled or the authorization process may be open to abuse (Schneier, 1999)[x].



Conclusions

There is a trend now to rely on the Trusted Platform Module (TPM, a controlled chip) or on a virtual smart card service to protect a system’s integrity and confidentiality. TPMs store cryptographic keys, passwords, and hashes of secure configurations of the device in which they operate; virtual smart card services store smart card certificates locally. Although TPMs offer a degree of security, they are a component of the very system you are authenticating to, whereas the point of presenting a token is to authenticate you, the user, to a given system. Presenting a token — whether it’s a password, smart card, or fingerprint — improves the security posture of the device only so long as the user knows how to treat the token.

The systems and tokens should always be separate; when was the last time you left the key in the deadbolt of the front door of your home and called it secure?  

A TPM may be the equivalent of a two-foot-thick, steel-reinforced concrete door; but any explosives expert will tell you that just means more time and money for explosives, or a little more engineering. Thermate does wonders to steel and concrete when detonated correctly.

Unifying the authentication mechanism with the system it protects is a reduction in security posture; a separate token should always be the standard when dealing with sensitive systems. Then, when a loss event does occur, the perpetrators hold only one half of what is required to attempt a compromise: any unauthorized access attempt needs both the key, whatever it may be, and the system. This separation is at the core of what it means to have "two-factor" authentication.

Where biometrics are stored on some kind of smart media, then so long as the media is controlled under a good, documented policy and lifecycle — including physical destruction — and so long as the user holds the card containing the data that makes up their fingerprint or iris, we may even alleviate some of the privacy concerns around controlled access to that highly sensitive biometric data.

A system’s protections must always be commensurate with the sensitivity of the data it holds: the higher the sensitivity, the greater the need for two-factor or stronger authentication and the associated protective mechanisms. The cost may seem large, but the value offered far outstrips the initial pain of purchase when you can definitively control access and build a foundation for dual custody within your authentication and authorization infrastructure. You could then accurately answer questions like "who did what, with which data, where?" Can you answer that one today?

As for personal devices: when you consider what you store on them, simply ask yourself, "How much do I value my personal information and data? What are my movies, pictures, and music worth to me?" You may quickly discover that your personal information is far more valuable than you previously thought. It often has the weakest protective mechanisms of all the data you deal with daily, which is indicative of how little consumers value security mechanisms; it is also why many consumers view security features as an encumbrance on functionality.

References

[1] PCR error rates depend upon the technology used to amplify the DNA obtained in the sample and can be as high as 40% per kilobase pair; the final mixture of DNA that is sequenced may be a "close" match, but not an exact copy.


[i] Tipton, Harold F. (Auerbach Publications, 2011) Official (ISC)2 Guide to the SSCP CBK, p. 21. ISBN: 978-1-4398-0484-1
[ii] Tipton, Harold F. (Auerbach Publications, 2011) Official (ISC)2 Guide to the SSCP CBK, p. 21. ISBN: 978-1-4398-0484-1
[iii] Iula, A.; De Santis, M. (University of Basilicata, August 2011) Experimental evaluation of an ultrasound technique for the biometric recognition of human hand anatomic elements [Online] World Wide Web, Available from: http://www.ncbi.nlm.nih.gov/pubmed/21367443 (Accessed on July 25th 2012)
[iv] Williams, Mark (MIT Technology Review, May 2007) Better Face Recognition Software [Online] World Wide Web, Available from: http://www.technologyreview.com/news/407976/better-face-recognition-software/ (Accessed on July 25th 2012)
[v] N.a. (Archon Genomics, X Prize Foundation, 2012) Express Scripts 100 Over 100: Archon Genomics X Prize Information Page [Online] World Wide Web, Available from: http://genomics.xprize.org/competition-details/prize-overview (Accessed on July 25th 2012)
[vi] Jain, Anil K.; Ross, Arun; Prabhakar, Salil (IEEE Transactions on Circuits and Systems for Video Technology, Volume 14, Number 1, January 2004) An Introduction to Biometric Recognition [Online] PDF Document, Available from: http://www.csee.wvu.edu/~ross/BiometricsTextBook/Papers/Introduction/JainRossPrabhakar_BiometricIntro_CSVT04.pdf (Accessed on July 25th 2012)
[vii] Leyden, John (The Register, May 2002) Gummi bears defeat fingerprint sensors: Sticky problem for biometrics firms [Online] World Wide Web, Available from: http://www.theregister.co.uk/2002/05/16/gummi_bears_defeat_fingerprint_sensors/ (Accessed on July 27th 2012)
[viii] Stén, Antti; Kaseva, Antti; Virtanen, Teemupekka (Helsinki University of Technology, 2003) Fooling Fingerprint Scanners: Biometric Vulnerabilities of the Precise Biometrics 100 SC Scanner [Online] PDF Document, Available from: http://stdot.com/pub/ffs_article_asten_akaseva.pdf (Accessed on July 27th 2012)
[ix] Zetter, Kim (Wired, July 25th 2012) Reverse-Engineered Irises Look So Real, They Fool Eye-Scanners [Online] World Wide Web, Available from: http://www.wired.com/threatlevel/2012/07/reverse-engineering-iris-scans/all/ (Accessed on July 27th 2012)
[x] Schneier, Bruce (CACM 42, 8, August 1999) Inside Risks 110, Biometrics: Uses and Abuses [Online] PDF Document, Available from: http://www.schneier.com/essay-019.pdf (Accessed on August 13th 2012)