Friday, September 18, 2009

PIN Sniffing Revisited

It's been a while since I've posted anything, so here are some recent experiences I've had.

Recently I received a call from my bank: an automated attendant, a recording advising me that my PIN had been compromised and that I should visit my branch and have it changed as soon as possible. This plague had struck my significant other not even a month previously, so I half expected it to hit me sooner or later.

I've noticed a trend lately within the realm of security, especially where merchants are concerned. PIN skimming is defined as making a copy of the IDE card (your debit or credit card is called an IDE card, not to be confused with Integrated Drive Electronics, the hard drive interface bus inside most old computers!).

Debit card fraud is on the rise, as is credit card fraud and various other forms; however, this second kind of fraud is interesting in its methods.

http://www.ctv.ca/servlet/ArticleNews/story/CTVNews/1105142446966_16/?hub=WFive

The core issue to consider here is that plastic money is designed to be cheap; the security on a debit card is limited to a four-digit PIN that is usually never changed. As any novice LAN administrator will tell you, four digits do not a password make.
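To put rough numbers on that, here is a back-of-the-envelope sketch in TypeScript (purely illustrative) comparing the keyspace of a four-digit PIN with that of a modest eight-character alphanumeric password:

```typescript
// Back-of-the-envelope keyspace comparison (illustrative only).
// A 4-digit PIN draws from 10 symbols; a password draws from a much larger alphabet.

function keyspace(alphabetSize: number, length: number): number {
  return Math.pow(alphabetSize, length);
}

const pinSpace = keyspace(10, 4);       // 10,000 possible PINs
const passwordSpace = keyspace(62, 8);  // ~2.18e14 possible 8-char alphanumerics

console.log(`4-digit PIN keyspace:      ${pinSpace.toLocaleString()}`);
console.log(`8-char alphanumeric space: ${passwordSpace.toExponential(2)}`);
console.log(`Ratio: ~${(passwordSpace / pinSpace).toExponential(2)} times larger`);
```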

Now, the equipment in question is also very cheap. PIN pads and readers are a cost-sensitive industry: merchants will not implement a digital system if the cost is too high, and banks will not sell systems if they cannot turn a profit on them, so the unit manufacturing cost must be low. The newer systems include two standards instead of a magnetic stripe, but for the most part most systems remain the same.

Magnetic Stripe Cards are old
http://en.wikipedia.org/wiki/Magnetic_stripe_card

Smart Cards can be duplicated
http://en.wikipedia.org/wiki/Smart_cards

And even RFID has been compromised:
http://en.wikipedia.org/wiki/RFID#Exploits

In fact, anyone with a university degree in electrical engineering and a working knowledge of computers and radio systems can set up a system to skim data. In my most recent experience, when speaking to my banking representative, I asked how they became aware of the breach. They replied that certain merchant providers have firmware that is compromised, so even the merchant is unaware of the skimming.

I thought to myself, "That's pure criminal genius!" Don't bother compromising the bank, the individuals or even the merchants; instead, exploit the cheap equipment responsible for the client transaction by placing a keylogger in the firmware and duplicating the cards. Most systems require network connections to process Electronic Data Interchange transactions, so if your trojan firmware can skim data and send it off to a zombie network for processing, you have an active database ripe for duplication without the merchants even being aware of the compromised device. I bet the sale of these databases is the motivator for the crime in the first place.

I mean, how much money will 7-Eleven, Quickie and the like spend on security audits of their own merchant processing hardware? It's all leased from banks or third parties who simply wish to keep transaction costs low so as to remain competitive. So it's the softest target, and therefore the most likely to be exploited.

It seems we now have a new exploit to contend with: the soft PIN pad merchant firmware exploit. I can only imagine a few organizations with the clout and money this requires, since you'd have to hire a developer who has worked on this machinery previously, and developers are never cheap.

So the question enters the ring: as the security arms race continues, when will biometrics be incorporated into our transactions?

Currently the authentication mechanism is based purely on numbers and data. In the case of credit, you can challenge any charge that you have not signed for (depending on the retailer). In the case of debit, you have something you know and something you have, both of which can be taken via compromised firmware within the device used, with no need for a camera, since you punch your PIN into the device itself. There may be minor obstacles, but third parties have already demonstrated keyboard-based exploits, so I assume a PIN pad may be subject to the same type.

As this evolution continues, the banks are left to bear the weight and responsibility, the merchants remain unaware, and we the people get screwed.

Friday, July 3, 2009

Should high-volume websites use server-side or client-side scripting?

JavaScript is client-based; although it is useful for AJAX, it relies on the browser to be executed, so using JavaScript makes the website and its developer depend on the client's system (an unknown) for website performance.

PHP (the PHP Hypertext Preprocessor) is a server-side language that began as a set of CGI scripts compiled in C/C++; as such it is a very efficient language. (Echo3)[i] The primary platform for web development with PHP is LAMP (Linux, Apache, MySQL and PHP). (Dougherty)[ii](NetCraft)[iii] As with any programming language and website, more than just the language must be considered to determine overall system performance with respect to the website experience. For example, if we had the world's most powerful supercomputer but only a dial-up connection, no one would host websites on it.

There are far more factors in performance than just the language: how many users are there? How much bandwidth is there? What is the local carrier's data load during the hours of testing?

If we conduct an "apples to apples" comparison of PHP vs. JavaScript where all other factors are equal, PHP will come out far faster, as it is a dedicated framework of byte-compiled code operating within the CGI, versus code that is compiled within the browser. (WrenSoft)[iv] When we examine the underlying frameworks and architectures, we find that this is a long-standing debate. PHP is implemented in C/C++, whereas JavaScript, like Java, runs in a managed runtime on the client; if we compare execution times of C/C++ vs. Java, we see that even with just-in-time compilation we are trading platform compatibility for performance. (Lewis et al.)[v] In short, we may reduce the issue to a single question:

Is it faster to compile once and execute many times upon request, or to compile the code at every execution after downloading its source?
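As a rough illustration of that question (a sketch only, not a benchmark of PHP or of any real browser), the following TypeScript snippet compiles a small function once and reuses it, then recompiles the same source on every call; the first path is the analogue of byte-compiled server-side code, the second of shipping source to the client for repeated compilation:

```typescript
// Illustrative micro-benchmark: compile-once vs. compile-per-call.
// This sketches the trade-off only; it measures neither PHP nor any browser engine.

const source = "return a * b + Math.sqrt(a + b);";
const iterations = 100_000;

// Path 1: compile once, execute many times.
const compiledOnce = new Function("a", "b", source) as (a: number, b: number) => number;
let t0 = Date.now();
for (let i = 0; i < iterations; i++) compiledOnce(i, i + 1);
const reuseMs = Date.now() - t0;

// Path 2: re-"compile" from source on every execution.
t0 = Date.now();
for (let i = 0; i < iterations; i++) {
  const recompiled = new Function("a", "b", source) as (a: number, b: number) => number;
  recompiled(i, i + 1);
}
const recompileMs = Date.now() - t0;

console.log(`compile once, run ${iterations}x: ${reuseMs} ms`);
console.log(`recompile on every run:          ${recompileMs} ms`);
```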

When considering performance and scalability, we must look to the medium being utilized and its architecture. LAMP platforms host much of the internet; Facebook, MySpace, Google, Amazon and eBay all run on LAMP-style architectures with custom CGI scripts, and these are by far the largest and most used (and abused) sites on the internet. The major difference between them is that the databases they use are quite large and distributed. The notion of using a database to render dynamic content existed before PHP, but PHP became the de facto framework for dynamically generated, database-driven websites via templates. As for scalability, the use of "DNS round robin" together with PHP backed by MySQL clusters allows PHP to scale into the millions of hits per minute across multiple hosts; to further improve performance, one may use geo-location to ensure that a low-latency server close to the client's DNS request is chosen for a given domain. Microsoft has no comparable product, and Oracle recently bought out Sun to ensure that MySQL would become one of their products because of this technical niche.
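A minimal sketch of the round-robin idea (the hostnames are invented for illustration): each incoming request is simply handed the next server in a fixed pool, which is roughly what DNS round robin does at the resolver level, before any geo-location weighting is applied:

```typescript
// Toy round-robin selector over a pool of web servers (hypothetical hostnames).
// Real DNS round robin rotates A records; geo-aware setups would also weight by latency.

const pool = ["web1.example.com", "web2.example.com", "web3.example.com"];
let cursor = 0;

function nextServer(): string {
  const host = pool[cursor % pool.length];
  cursor++;
  return host;
}

// Simulate ten incoming requests spread across the pool.
for (let request = 1; request <= 10; request++) {
  console.log(`request ${request} -> ${nextServer()}`);
}
```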

References


[i] N.a. (Echo 3 Services, July 7th 2002) Origins of PHP [Online] World Wide Web, Available from: http://michaelthompson.org/weblog/pages/Origins_of_PHP.html (Accessed on June 2nd 2009)

[ii] Dougherty, Dale (ONLamp, January 26th 2001) LAMP: The Open Source Web Platform [Online] World Wide Web, Available from: http://www.onlamp.com/pub/a/onlamp/2001/01/25/lamp.html (Accessed on June 2nd 2009)

[iii] N.a. (Netcraft, 2009) Market Share for All Servers Across Domains [Online] World Wide Web, Available from: http://news.netcraft.com/ (Accessed on June 2nd 2009)

[iv] N.a. (Wrensoft, 2008) Search Benchmark Information & Comparison [Online] World Wide Web, Available from: http://www.wrensoft.com/zoom/benchmarks.html (Accessed on June 2nd 2009)

[v] Lewis, J.P.; Neumann, Ulrich (University of Southern California) Performance of Java versus C++ [Online] World Wide Web, Available from: http://www.idiom.com/~zilla/Computer/javaCbenchmark.html (Accessed on June 2nd 2009)

Friday, June 26, 2009

What kinds of exploits are there for the DOM, and how can they be mitigated?

The Document Object Model is defined by the W3C as:

“The Document Object Model is a platform- and language-neutral interface that will allow programs and scripts to dynamically access and update the content, structure and style of documents. The document can be further processed and the results of that processing can be incorporated back into the presented page. This is an overview of DOM-related materials here at W3C and around the web.”(W3C)i


The DOM is an application programming interface (API) designed to give web developers access to functions and objects within the page via JavaScript. This allows the flexible creation and update of page and site elements in a manner most programmers already understand. Since the DOM uses JavaScript, it is executed within the client browser; it may also be driven by other languages, including but not limited to VBScript, C# and ASP.NET. (W3Schools)ii Since the DOM is platform independent, it may be manipulated by any script.
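For example, a few lines of script are enough to create and update page elements at runtime (a browser-side sketch; the "status" element ID is invented for illustration):

```typescript
// Minimal DOM usage: create, look up and update elements at runtime (runs in a browser).
const heading = document.createElement("h2");
heading.textContent = "Generated by script";
document.body.appendChild(heading);

// Update a hypothetical existing element's content and style, if present.
const status = document.getElementById("status");
if (status) {
  status.textContent = "Page updated via the DOM API";
  status.style.color = "green";
}
```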


Functionality vs. Security: the Balance

The functionality of any API is always inversely proportional to the security of that API. (Reguly)iii (Howard)iv


Exploits

The primary and most common type of DOM exploit involves XSS, or cross-site scripting. The variety specific to the DOM is the local (DOM-based) XSS attack as defined by Klein. (Klein)v
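A minimal sketch of that local, DOM-based variety (the page and the "greeting" element ID are invented for illustration): attacker-controllable input is read straight from the URL fragment and written into the document without sanitization, so the payload never has to pass through the server:

```typescript
// Vulnerable pattern (illustrative only): the URL fragment flows straight into innerHTML.
// A link such as  http://example.com/page#<img src=x onerror=alert(1)>  would execute
// the attacker's script entirely on the client side.
const greeting = document.getElementById("greeting");
if (greeting) {
  const name = decodeURIComponent(window.location.hash.slice(1)); // attacker-controlled source
  greeting.innerHTML = "Welcome back, " + name;                   // HTML-parsing sink
}
```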

The second type of DOM exploit is the good old buffer overflow. (Wikipedia)vi Since all new browsers must be DOM compliant to function, the browser must allow the execution and use of DOM methods. This opens a potential attack vector for malicious code that overruns a given variable's memory buffer. Since the most common browser on the internet is Internet Explorer, there have been many DOM-related buffer overflows; I will provide one example here. (Microsoft)vii

Every browser on every platform has had a buffer overflow at one time or another; they arise when the memory of a called object is not properly allocated or recovered during that object's instantiation. Since on most Microsoft platforms the browser runs under the local user's identity (which usually has administrative rights to the machine), a successful buffer overflow gives the author of the malicious code the ability to execute arbitrary code with administrative privileges. In hacking circles this is called "owning" the box. Once a box has been "owned" it may be used as a remote spam server, a zombie for DoS or DDoS attacks, for identity theft, or for whatever nefarious purposes the malicious code writer intended.

There is also "clickjacking"; however, it is a derivative of cross-site scripting, primarily used to bankrupt the advertising budgets of various competitors in order to improve one's own ad rank within search-engine-powered keyword systems.


Mitigations

To achieve any security, one must limit the type and function of object calls and implement systems with features such as automated memory management and verification. (Stallings)viii Another method used to mitigate buffer overflows is loading libraries in random order at operating system startup. (OpenBSD)ix The primary and best method to protect against XSS and buffer overflow attacks, however, is to disable scripting altogether, requiring the user to verify whether or not a site maintains valid code. (Maone)x The only alternative to this would be to implement dynamic online content validation as described by Heflin et al. (Heflin et al.)xi, where all content carries a third-party encrypted checksum with integrated public keys, thus leveraging a cryptographic system's checksumming methods to certify content.
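Short of disabling scripting entirely, the everyday mitigation for the DOM-based XSS case sketched earlier is simply to treat untrusted input as text rather than markup; a minimal sketch (same hypothetical page as above):

```typescript
// Mitigation sketch: never hand untrusted input to an HTML-parsing sink.
const greeting = document.getElementById("greeting");
if (greeting) {
  const name = decodeURIComponent(window.location.hash.slice(1)); // still untrusted
  greeting.textContent = "Welcome back, " + name; // rendered as plain text, never parsed as HTML
}
```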


References

i. N.a. (W3C, January 19th 2005) Document Object Model [Online] World Wide Web, Available from: http://www.w3.org/DOM/ (Accessed on June 25th 2009)

ii. N.a. (W3Schools, n.d.) JavaScript HTML DOM Objects [Online] World Wide Web, Available from: http://www.w3schools.com/js/js_obj_htmldom.asp (Accessed on June 25th 2009)

iii. Reguly, Tyler (nCircle 360 Security, March 11th 2009) Functionality vs Security: Who Wins? [Online] World Wide Web, Available from: http://blog.ncircle.com/blogs/vert/archives/2009/03/functionality_versus_security.html (Accessed on June 25th 2009)

iv. Howard, Michael (Microsoft, March 2007) Security Development Lifecycle (SDL) Banned Function Calls [Online] World Wide Web, Available from: http://msdn.microsoft.com/en-us/library/bb288454.aspx (Accessed on June 25th 2009)

v. Klein, Amit (Web Application Security Consortium, April 7th 2005) DOM Based Cross Site Scripting of the Third Kind [Online] World Wide Web, Available from: http://www.webappsec.org/projects/articles/071105.shtml (Accessed on June 25th 2009)

vi. N.a. (Wikipedia, June 19th 2009) Buffer Overflow [Online] World Wide Web, Available from: http://en.wikipedia.org/wiki/Buffer_overflow (Accessed on June 25th 2009)

vii. N.a. (Microsoft, December 13th 2005) Microsoft Security Bulletin MS05-054, KB 905915 [Online] World Wide Web, Available from: http://www.microsoft.com/technet/security/Bulletin/MS05-054.mspx (Accessed on June 25th 2009)

viii. Stallings, William (Prentice Hall, 2008) Operating Systems, 6th ed., Section 7.5 Security Issues, p. 331 [Online] World Wide Web, Available from: http://books.google.ca/books?id=dBQFXs5NPEYC&pg=PA331&lpg=PA331&dq=Memory+Management+Security&source=bl&ots=CtpS0WeuF8&sig=ws4AjP5HPHEPQ9DHx1X2oxfkSZs&hl=en&ei=h3FFSuLGO4WEtwf-8OCVBg&sa=X&oi=book_result&ct=result&resnum=4 (Accessed on June 25th 2009)

ix. N.a. (OpenBSD Foundation, October 3rd 2006) OpenBSD 3.4 Release Notes [Online] World Wide Web, Available from: http://www.openbsd.org/34.html (Accessed on June 25th 2009)

x. Maone, Giorgio (NoScript, n.d.) NoScript Project Home Page [Online] World Wide Web, Available from: http://noscript.net/ (Accessed on June 25th 2009)

xi. Heflin, J.; Hendler, J. (University of Maryland, IEEE Intelligent Systems, Volume 16, Issue 2, 2001) A Portrait of the Semantic Web in Action [Online] PDF Document, Available from: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=920600 (Accessed on June 25th 2009)

Thursday, June 11, 2009

Have webforms changed the workflow paradigm?

The International Organization for Standardization (ISO) refers to a "business workflow" as "A sequence of Operations and or Work for an individual or group of persons either itself or as a process". (ISO)[i] The origins of business workflows and workflow analysis are rooted in operations management. (Pilkington et al.)[ii]

Requirements analysis and engineering is a method utilized within software engineering to elicit the conditions and needs of clients for a proposed software solution to a business issue, in a structured manner, so as to facilitate the primary development stages of software design. (DoD)[iii] This method utilizes "Stakeholder Identification", "Interviews", "Requirements Lists", "Measurable Goals" and various forms of "Prototyping".

Since all web forms are implemented in software, they should meet software engineering standards, and requirements gathering should be conducted beforehand; in reality this is not always the case. However, for the purposes of this argument we will assume that all websites and forms are software and therefore should be developed according to existing models to reduce the risk of project failure and cost overruns. (Jain)[iv]

There are many software development models, and for the purposes of brevity we will not open a discussion of them here; we will simply state that these models exist and that each has its own method for development and lifecycle control, including where data may be gathered and at which points within the workflow process data should be elicited from the client using the developed application. These models include the Waterfall (Boehm)[v], Agile (Cockburn)[vi], Extreme (Beck)[vii] and Iterative (Basili)[viii] models. Each is impacted by the Boehm spiral, where cost and complexity are proportional to the software version and its history in development, e.g. the longer the development cycle and the older the software, the more it costs in both man-hours and monetary terms to maintain.

Just as there are software development models, the world of business analysis would not be complete without project and workflow management models and standards. These include the Project Management Institute's PMBOK (De Jager)[ix], IBM's Rational Unified Process (Krebs)[x] and, to a certain degree, the Unified Modeling Language (UML)[xi]. The IEEE and ISO have derivatives of the previous processes, such as the ISO 9000 certification; however, these are accreditations for institutions regarding management and not frameworks and models themselves.


Data Collection and Workflow Management

The methods used to identify how data is collected, stored and used to generate business intelligence, income, sales leads and revenue have been immensely impacted by the advent of the information age. eCommerce and B2B systems have emerged as new markets for previously location-limited businesses in the worlds of retail sales, software development, business consultation, bookkeeping and accounting, as well as entertainment. Communications across all businesses have been forever changed, now requiring a website and e-mail-based communications. The global software market alone was valued at $203.4 billion as of 2006 and is expected to grow to $271 billion by 2011 (Datamonitor)[xii]; this does not include other business sectors such as retail sales, mining and manufacturing, consumer electronics, et cetera. Each of these market segments benefits in various ways from a web presence, and workflow integration with business intelligence has given rise to the "BI" sector within the ERP and CRM software markets.

How workflows relate to a given web form depends upon the website's parent company's core business: a car manufacturer may use a web form for marketing and price quotation, thus generating potential business and valuable real-time marketing information about its products, whereas an online music store such as iTunes, Beatport or Napster uses web forms as the method for interfacing directly with its clients throughout the entire sales process. Some of the world's most valued companies exist exclusively online and function around a core website; these include search engines such as Google, eCommerce sites such as eBay and payment processors such as PayPal. Although their per-capita margins are small, the volume of transactions for each of these sites is in the billions; therefore, even though they may only make a 1% margin on sales (Google sells AdWords, eBay charges $3.99 per auction, and PayPal has moderate service fees), the sheer volume of clients makes these businesses worth billions individually; as we can see, Google alone is worth 135 billion dollars. (Google Finance)[xiii]

Therefore, not only are web forms important to a business, but how they generate data, the methods used to manage and leverage that data, and the very nature of a given workflow's paradigm have been forever changed for most if not all industrial and non-industrial businesses. Previously you had to purchase advertising space in a given market from various publications; today you may simply set up a website and send out advertising referring back to said site, or, if the site is a service, let word of mouth carry the burden of marketing.

Thus the paradigm has changed: the nature by which businesses function, how software as a platform functions, and the nature by which we collaborate and maintain our businesses and their respective communications have been forever changed by the web.

References



[i] N.a. (ISO 2006) Health informatics -- Digital imaging and communication in medicine (DICOM) including workflow and data management [Online] PDF Document, Available from: http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=43218 (Accessed on: June 11th 2009)

[ii] Pilkington, Alan; Meredith, Jack (University of London, Wake Forest University, November 2008) The evolution of the intellectual structure of operations management—1980–2006: A citation/co-citation analysis [Online] PDF Document, Available from: http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VB7-4T84K5P-1&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=dcda044cc1d0bcf78c8f9a38c50f1770 (Accessed on: June 11th 2009)

[iii] N.a. (Department of Defense, Systems Management College, January 2001) Systems Engineering Fundamentals [Online] PDF Document, Available from: http://www.dau.mil/pubs/pdf/SEFGuide%2001-01.pdf (Accessed on June 11th 2009)

[iv] Jain, Deepak (Code Project, January 11th 2007) Importance of Processes and Standards in Software Development [Online] World Wide Web, Available from: http://www.codeproject.com/KB/work/Process.aspx (Accessed on June 11th 2009)

[v] Boehm, B; (ACM, 1986) A spiral model of software development and enhancement [Online] PDF Document, Available from: http://portal.acm.org/citation.cfm?id=12944.12948 (Accessed on June 11th 2009)

[vi] Cockburn, A (A Cockburn, 2001) Agile Development [Online] PDF Document, Available from: http://www.imamu.edu.sa/Scientific_selections/Documents/IT/AgileSwDevDraft3.pdf (Accessed on June 11th 2009)

[vii] Jeffries, R; (XP Programming.com, November 8th 2001) What is Extreme Programming? [Online] World Wide Web, Available from: http://www.xprogramming.com/xpmag/whatisxp.htm (Accessed on June 11th 2009)

[viii] Basili, V. R. (IEEE, 1990) Viewing Maintenance as Reuse-Oriented Software Development [Online] PDF Document, Available from: http://www.cs.umd.edu/projects/SoftEng/ESEG/papers/82.37.pdf (Accessed on June 11th 2009)

[ix] De Jager, J-M; (12Manage, 2004) PMBOK information Page [Online] World Wide Web, Available from: http://www.12manage.com/methods_pmi_pmbok.html (Accessed on June 11th 2009)

[x] Krebs, Joe (IBM, January 15th 2007) The Value of RUP Certification [Online] World Wide Web, Available from: http://www.ibm.com/developerworks/rational/library/jan07/krebs/index.html (Accessed on June 11th 2009)

[xi] N.a. (Object Management Group, February 2009) UML Version 2.2 Formal Specification [Online] PDF Document, Available from: http://www.omg.org/spec/UML/2.2/ (Accessed on June 11th 2009)

[xii] N.a. (Datamonitor, 2006) The Global Software Market Report [Online] World Wide Web, Available from: http://74.125.95.132/search?q=cache:Yhs8myEjdJ8J:www.infoedge.com/product_type.asp%3Fproduct%3DDO-4959+global+market+value+software&cd=1&hl=en&ct=clnk&gl=ca&client=firefox-a (Accessed on June 11th 2009)

[xiii] N.a. (Nasdaq, June 11th 2009) Google Finance Quote for GOOG [Online] World Wide Web, Available from: http://www.google.ca/finance?client=ob&q=NASDAQ:GOOG (Accessed on June 11th 2009)

Tuesday, June 9, 2009

How will the growing web impact your future, and your children's future?

The web is growing at an exponential rate as a result of Moore's Law (Moore)[i], and this drives down the "cost" of information in inverse proportion; i.e., as processing power doubles, so too do storage, bandwidth and all other related technologies. A result of this is the falling cost of all technology: what once could host only 1,000 web pages may now host 10,000, and the ability to host increases exponentially every 18 months.

Information and knowledge want to be free as a result of their very nature, as stated by Stewart Brand. (Clarke)[ii] This results in the instantaneous availability of massive amounts of information, and it is the reason we now refer to the present as the "Information Age". (Ulmer)[iii]

We could easily fill volumes on what is now available and how it impacts everyday life; however, looking to the future, one prediction I foresee is the advent of wearable computing and communications, as seen with the MIT Media Lab's SixthSense project. (Maes)[iv]

Ready access to this information has given rise to other industries as a result: online search engines, source verification engines, semi-automated human resource modules and fully automated text-based search functions for every industry from electrical engineering to auto manufacturing.

As the volume of information grows, simply being able to navigate it effectively becomes a life skill. We will require intelligent systems that fully leverage mammalian models to ensure that we can understand the segments of information we choose to digest. (Rodriguez et al.)[v]

This volume of information, and the access to communications that comes with it, has already changed the way we wage war (USAF)[vi], how we shop, how we research medicine and how we make life choices concerning everything from our education to our daily consumption. Thus the volume of information enriches everyone's life by allowing each individual to delve into their respective interests and communicate instantaneously around the globe.

The impact this will have on my children's lives will be far greater than on mine; as a first-generation information-age person myself, I am duly biased. I have had access to the web for most of my adult life, and I cannot imagine the impact this information will have on my children.

I plan on ensuring that they know how to use the web, and that they become selective consumers of “Good” media; ideally I’d like to teach my children how to avoid the various propaganda, pornography and other wasteful entertainment online; however by arming them with the tools to recognize the good from the bad I hope that the World Wide Web becomes an even more valuable resource for the future generations than it is today.

Pandora's Box was never so big or so cluttered.


References



[i] Moore, Gordon (Intel, 1965) Moore’s Law [Online] World Wide Web, Available from: http://www.intel.com/technology/mooreslaw/ (Accessed on June 9th 2009)

[ii] Clarke, Roger (Xamax Consultancy, 2001) Information Wants to be Free [Online] World Wide Web, Available from: http://www.rogerclarke.com/II/IWtbF.html (Accessed on June 9th 2009)

[iii] Ulmer, Dave (Ulmer, December 23rd 2006) Beyond the Information Age [Online] World Wide Web, Available from: http://www.vias.org/beyinfoage/index.html (Accessed on June 9th 2009)

[iv] Maes, Pattie (MIT, TED, February 2009) A Sixth Sense Lecture [Online] World Wide Web, Available from: http://www.ted.com/talks/pattie_maes_demos_the_sixth_sense.html (Accessed on June 9th 2009)

[v] Rodriguez, Marko A. (Vrije Universiteit Brussel, Belgium, June 2006) The Hyper-Cortex of Human Collective-Intelligence Systems [Online] PDF Document, Available from: http://arxiv.org/ftp/cs/papers/0506/0506024.pdf (Accessed on June 9th 2009)

[vi] N.a. (USAF, April 18th 2008) AFCYBER works to define scope of new 450th Electronic Warfare Wing [Online] World Wide Web, Available from: http://thesop.org/index.php?article=10755 (Accessed on June 9th 2009)

Wednesday, April 29, 2009

Prohibition does not work in any form.

Prohibition was once instituted in the United States; as a result, Canadians made a killing selling liquor illicitly to their American neighbors. Then Al Capone started killing police, and eventually the violence in the urban areas forced the public to realize that maybe drinking isn't so bad.

The modern "War on Drugs" is another arm of the military industrial complex with wholly political aims to prevent countries like coloumbia from propping up dictators. Ronald Regan had some good ideas, but his own Intellegence agencies was caught trafficking cocciane during the 80's. However I am not an american, I am a Canadian so therefore I will concentrate on that which I may have some power to change.

Harper's government is very much pro-war-on-drugs: they aim to increase policing against gangs and gang warfare, they wish to reduce the minimum age at which a child can be tried as an adult, and during his Throne Speech he stated that they will "get tough on crime".

Canada has this nasty habit of not voting! And when we do vote, we end up with a minority; this minority government was elected by less than half of the Canadian population, but that's a whole other issue.

Recently a 14-year-old girl died as the result of an ecstasy overdose:
http://www.edmontonjournal.com/Students%20mourn%20Edmonton%20girl%20after%20ecstasy%20overdose/1538627/story.html

It's a tragic and sad story. Youth culture typically rebels in any way, shape or form, and this girl had told her parents not even a week earlier that "she would never do drugs".

I could go on for hours about the drug wars in the southern United States and Mexico, but again, that's America.

My point is this; as a country we are all hypocrites that don't vote.

Tobacco and alcohol kill and maim more people every year than illicit drugs, yet illicit drugs account for a multi-billion-dollar underground economy. This black market allows gangs to arm themselves; it's the source of criminals' ability to fight the police, and it's at the core of the opposition to the "War on Drugs".

Any general will tell you that during WWII we started bombing ball bearing factories, since without ball bearings the Germans could not fix their trucks, which were essential to the blitzkrieg style of fighting. We also bombed civilian populaces and committed horrors that remain among the worst in human history.

Milton Friedman once stated:

"If coccciane was legal, crack would not exist."


Remove the economics behind the industry and tax and regulate it. By no means is it okay to consume things like meth, heroin, cocaine or ecstasy for any length of time, given the damage incurred, but addiction is a medical issue and should be treated as such. Instead we turn addicts into criminals, or they themselves become criminals to support their habits. The medical industry can produce these substances at a fraction of the price; hell, cocaine hydrochloride was invented by Merck, methamphetamine was developed by the Germans during WWII, and the various party drugs have been around since their development in the '50s and '60s under government- and privately-funded psychedelic research programs, which were the precursors to the science behind SSRIs.

The only sure path to harm reduction is to prevent the psychosis and neurological damage before it becomes a societal and/or criminal issue. There is now a plethora of SSRIs, SNRIs, MAOIs and other drugs that act on the same potentials to block the limbic reward pathway of most illicit chemicals; methadone programs have improved the situation in many cities, and such programs could be designed for a fraction of the current cost of enforcement.

In short:
We may build schools and hospitals, or we can build armies and graveyards; it's your choice.

You may support the military industrial complex or you can support your people.

Saturday, April 25, 2009

Insider Attacks

Insider attacks are defined as security breaches where a person with access to a corporate system and/or network misappropriates information from that system, or where an internal employee of a given company commits a security violation against that company. (Einwechter)[i] NIST articulates that the most prevalent and common threat to any company is the insider attack, as it is the least monitored and most difficult to detect; this was true as of 1994 and has remained a constant fixture in network and systems security ever since. (Bassham et al.)[ii]

Forensic Techniques

The forensic techniques currently available include local system analysis, network traffic analysis, and log file reporting and analysis; however, these techniques are primarily used to detect and compile evidence where a case is already known, or where an external, foreign entity has compromised an internal system or network. Insider attacks may compromise a system, but they may do so with user accounts that have administrative access to that system, or with tools used internally to gain access to privileged information. Thus forensic techniques are not designed to detect and alert security personnel to internal violations, since those violations may be mistaken for routine administration and operations. Examples include any case where an employee conducts network traffic analysis to obtain the usernames and passwords of individuals with access to sensitive information and then impersonates those individuals within their own network to make the changes they desire, or where an employee with administrative access to network infrastructure changes that infrastructure against the policy of the company that employs them, such as modifying their own salary within the accounting database or intentionally damaging systems over a grievance with their employer.
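As a sketch of what log-file analysis can and cannot show here (the field names, accounts and business hours below are invented for illustration), flagging privileged actions performed at odd hours is easy; deciding whether they were legitimate administration is the hard part:

```typescript
// Illustrative log triage: flag privileged actions performed outside business hours.
// Field names, accounts and thresholds are hypothetical; real tooling is far richer.
interface LogEntry {
  timestamp: Date;
  account: string;
  action: string;
  privileged: boolean;
}

function flagSuspicious(entries: LogEntry[]): LogEntry[] {
  return entries.filter((e) => {
    const hour = e.timestamp.getHours();
    const afterHours = hour < 7 || hour > 19;
    return e.privileged && afterHours;
  });
}

const sample: LogEntry[] = [
  { timestamp: new Date("2009-04-20T23:12:00"), account: "jsmith-admin", action: "UPDATE payroll.salary", privileged: true },
  { timestamp: new Date("2009-04-21T10:03:00"), account: "jsmith-admin", action: "RESTART print-spooler", privileged: true },
];

console.log(flagSuspicious(sample)); // only the after-hours payroll change is flagged
```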


Anti-Forensics

According to Kerckhoffs's principle and its reformulation as Claude Shannon's maxim, "the enemy knows the system". (Kerckhoffs)[iii] Although we are referring to internal systems and operations that may or may not involve cryptography, when the "enemy" is an internal employee this truth determines the maximum extent of the system's risk and its potential for grievous damage to the company.

Anti-forensic techniques and tools include the use of alternative operating systems, data manipulation (secure data deletion, overwriting metadata, preventing data creation), encryption, encrypted network protocols, program packers, steganography, generic data hiding, and targeting forensic tools directly in order to exploit them. (Garfinkel)[iv]

Although anti-forensic techniques were initially developed to secure systems for military operations, these tools may also be used by malicious persons during internal attacks against targeted internal systems. Combined with the intimate knowledge an internal attacker has of the company, its operations and the personnel involved, this makes such attacks even more difficult to detect.

Synopsis

One may argue that the best method to prevent internal attacks is to employ good people and keep them happy. However, since companies can't please everyone all the time, there are bound to be conflicts that arise as a result of corporate restructuring, salary and pay differences, and general employee alienation.

Technical methods to prevent internal attacks include the "segregation of duties" (ISACA)[v] and the "segregation of systems and networks". (Kupersanin)[vi] By logically segregating access to resources by function, location and internal client access requirements, we may mitigate the potential for any one employee to commit attacks. In addition, internal financial systems are of paramount concern and should have segregated administrative, functional and operational client accounts limited to the resources their duties require. For example, a payroll clerk should not have the ability to modify salaries in the financial application used to control payroll; that function should be limited to executive and middle management as well as human resources personnel. General LAN/WAN administrative accounts should not have access to these systems, and access should be limited to one or two people acting at the managerial level of network support and operations. Thus a LAN/WAN administrator would have neither the permission nor the access required to change their own salary.
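A minimal sketch of that segregation in code (the roles and permission names are hypothetical): because the check is made per role, a LAN/WAN administrator simply has no path to the salary-modification function:

```typescript
// Toy segregation-of-duties check; roles and permissions are invented for illustration.
type Role = "payroll-clerk" | "hr-manager" | "lan-admin";

const permissions: Record<Role, string[]> = {
  "payroll-clerk": ["payroll.view"],
  "hr-manager": ["payroll.view", "payroll.modify-salary"],
  "lan-admin": ["network.configure", "accounts.reset-password"],
};

function canPerform(role: Role, action: string): boolean {
  return permissions[role].includes(action);
}

console.log(canPerform("hr-manager", "payroll.modify-salary"));    // true
console.log(canPerform("payroll-clerk", "payroll.modify-salary")); // false
console.log(canPerform("lan-admin", "payroll.modify-salary"));     // false
```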


References



[i] Einwechter, Nathan (Security Focus, March 20th 2002) Preventing and Detecting Insider Attacks Using IDS [Online] World Wide Web, Available from: http://www.securityfocus.com/infocus/1558 (Accessed on April 25th 2009)

[ii] Bassham, Lawrence E.; Polk, Timothy W. (NIST, Security Division, March 10th 1994) Threat Assessment of Malicious Code and Human Threats [Online] World Wide Web, Available from: http://csrc.nist.gov/publications/nistir/threats/subsection3_4_1.html (Accessed on April 25th 2009)

[iii] Kerckhoffs, Auguste (Journal des Sciences Militaires, February 1883) La Cryptographie Militaire [Online] World Wide Web, Available from: http://www.petitcolas.net/fabien/kerckhoffs/#english (Accessed on April 25th 2009)

[iv] Garfinkel, Simson (Naval Postgraduate School, 2007) Anti-Forensics: Techniques, Detection and Countermeasures [Online] PDF Document, Available from: http://simson.net/clips/academic/2007.ICIW.AntiForensics.pdf (Accessed on April 25th 2009)

[v] N.a. (ISACA, 2008) CISA Review Manual 2008, Chapter 2, Page 112 [Online] PDF Document, Available from: http://www.isaca.org/AMTemplate.cfm?Section=CISA1&Template=/ContentManagement/ContentDisplay.cfm&ContentID=40835 (Accessed on April 25th 2009)

[vi] Kupersanin, William (Insecure.org, November 15th 2002) Security Basics: Contractors on Company Networks – Network Segregation [Online] World Wide Web, Available from: http://seclists.org/basics/2002/Nov/0426.html (Accessed on April 25th 2009)