Thursday, November 3, 2011

It's All Software anyway


Over the last thirteen years I have seen many CASE tools, most of them Integrated Development Environments. IDEs are language-specific CASE tools that bundle debuggers, memory monitors, and either compilers (for compiled languages like C/C++ or Delphi) or integrated runtime environments (for interpreted languages such as Python, or for J2EE shops, which usually pair Eclipse with the enterprise JDK from Oracle; the Oracle RDBMS as the database, Swing or the SpringSource framework as the application stack, running on Apache Tomcat or BEA WebLogic, but I digress). When I was writing my previous module for Programming the Internet, I used Bluefish as an editor, and my IDE consisted of a LAMP stack on a virtual machine for rapid application testing where I could test my changes "in situ", following the Agile standard while developing a website in PHP using XML and Web 2.0 standards on LAMP, on my trusty workstation HAL(2.0). I conducted some rudimentary work in Eclipse for Java functions on the website, but I loathe Java. One of the more common IDEs today is Visual Studio from Microsoft; Visual Studio .NET has some very interesting functions and supports C# and the .NET framework, which I hear is incredibly efficient, but according to the language popularity rankings from DedaSys LLC it falls far behind Java and PHP for web development. C/C++ still reigns king in the world of systems and application development. (DedaSys LLC, April 13th 2011)i DedaSys also states that the highest demand for work is in PHP and LAMP; this is due to the popularity of that platform, which runs over 60% of the websites used every day.

My first real job was in high school (in Ontario it's called secondary school), working with a startup that did RLC-based power design in electronics for communications companies. I was only 17 at the time; not bad for a high school job, though it paid like fast food. My first introduction to procedural programming was working with the MIPS environment from Microchip Inc. I was tasked with programming a PIC 16F84 as a fan controller for a project we were consulting on; needless to say it was a trial by fire, but any programming in a junior career usually is. After the firm hit a major crisis in the local telecommunications industry meltdown of 2000, our path changed and venture capital was obtained from the same group that founded MOSAID. We went from a consulting firm specializing in power systems design to an ASIC design firm specializing in power-control ASICs, filling a void in the components market in North America. Last I heard they had a few plasma television manufacturers interested in their products and the company had been sold to Powi Systems. I was laid off before the first chip completed design. To add insult to injury, this was less than three months after the passing of my mother; but as always, compassion and business rarely mix. It was a bit of a blessing in disguise, as I began working as a network and systems consultant thereafter, and I have not looked back since.

Working as a LAN/WAN administrator I knew enough about the Sun Solaris systems that were purchased to support the Verilog IDEs that ran on them; from that experience I now know that all hardware design is in fact software code. All transistors and accompanying RLC-based circuits on any chip are really just functions described in a high-level design language such as Verilog or VHDL. Ever since Carver Mead published his tomes on VLSI design (Mead, 1978)ii, hardware electronics have been designed in EDA (electronic design automation) suites that resemble IDEs with native drawing interfaces; you compile your circuits and test them with applications using a SPICE simulator (Nagel, 1971)iii. Everything from the printed circuit board to the circuits and schematics is rendered in software and tested months before any prototype is ever built, primarily because of the sheer cost of prototyping both chips and circuits. We have been using computers and CASE tools in an iterative fashion to design better computers and so sustain Moore's Law. The standards are maintained by a consortium of firms including AMD, Intel, IBM and Motorola; companies such as NEC and others have some input, but the bulk of the design work still occurs in North America, both in Silicon Valley in California and here in Ontario. An interesting side note: ATI got started in Toronto and has offices in Waterloo and Mississauga, and Matrox is still based out of Montreal. The ATI offices now work for AMD, but the design offices are still here. We also have a Monolithic Microwave Integrated Circuit (MMIC), GaAs (gallium arsenide), and InP (indium phosphide) tech sector that survives here in Ottawa, but only as boutique design firms like xwave, not the formerly glorious houses whose innovations on crystal substrates would make any communications engineer envious.
Every technological product developed from now on will exist on a computer, in a virtual environment, in its entirety before the decision is made to build or scrap it; we can even calculate the market pressures to build said device beforehand using automated ERP-based econometrics programs.

Now, how does JAD help organizations maintain competitive advantages in the IS and IT world, where Moore's Law rules king and Brooks's (Brooks, 2006)iv and Zawinski's laws of software bloat ensure that the next version of whatever software you use now will bring it to its knees, with third-order (O^3) out-of-sequence calls in objects three times less efficient than what you use today? It boils down to one simple thing: the software design life cycle.

Software is only useful for a given amount of time; this property is its usable life. Hardware is only good for five to seven years; that is what it is designed to last, and this includes every single supercomputer on the planet. The software running on the hardware is directly responsible for achieving your business functions. If your organization's business is developing software, then your development methodologies, your software's supportability and maintainability, and your efficiencies in object reuse, testing, bug fixes and design considerations directly determine your bottom line. In short, you live or die by the quality of the software your organization produces, and this is your competitive advantage within this economy. As Lehman stated in 1980, the usable life of a given piece of software is about 5 to 7 years before it requires a complete rewrite due to environmental changes. (Lehman, IEEE 1980)v The adoption of Agile and other Rapid Application Development methodologies as defined by Highsmith results in code of greater quality. (Highsmith, 2000)vi He also states that daily builds are necessary to achieve this level of quality and to reduce overall project risk. As Galin states (Galin, 2004)vii:

As software errors are the cause of poor software quality it is important to investigate the causes of these errors in order to prevent them.

Thus, in an IS or design-oriented organization where Joint Application Development methodologies are not being used, or no use of an SDLC can be seen, we may assume that the software produced will contain many errors and will not be sustainable; it will be of relatively poor quality. If the business is not producing sustainable software, then it is itself unsustainable and may fail under the burgeoning cost of maintaining its software after it is sold or implemented in some poor organization's production environment, whether through contractual obligation or the loss of users and customers. Software and security fixes are far cheaper when implemented in the design and architecture phases regardless of the methodologies used; by choosing to adopt an Agile rapid application development environment and methodologies such as an SDLC with PDCA cycles, an organization effectively commits to reality-testing all of its software frequently. As the old proverb states, failure to commit is committing to failure.

in.a. (DedaSys LLC, April 13th 2011) Language Popularity [Online] World Wide Web, Available from: http://langpop.com/ (Accessed on November 2nd 2011)
iiMead, C.; Conway, L. (Palo Alto, Xerox Corporation, California Institute of Technology, 1978) Introduction to VLSI Systems [Online] PDF Document, Available from: http://ai.eecs.umich.edu/people/conway/VLSI/VLSIText/PP-V2/V2.pdf (Accessed on November 3rd 2011)
iiiNagel, L. W.; Rohrer, R. A. (IEEE, Journal of Solid-State Circuits, August 1971) Computer Analysis of Nonlinear Circuits, Excluding Radiation [Online] PDF Document, Available from: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1050166 (Accessed on November 2nd 2011)
ivBrooks, Frederick (Addison Wesley, 2006) The Mythical Man-Month P. 53 ISBN: 0-201-83595-9
vLehman, Meir M. (IEEE, Proceedings, Volume 68, Number 9, September 1980) Programs, Life Cycles, and Laws of Software Evolution [Online] PDF Document, Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.116.3108&rep=rep1&type=pdf (Accessed on November 2nd 2011)
viHighsmith, Jim (Cutter Consortium, 2000) Extreme Programming [Online] PDF Document, Available from: http://www.cutter.com/content-and-analysis/resource-centers/agile-project-management/sample-our-research/ead0002/ead0002.pdf (Accessed on November 2nd 2011)
viiGalin, Daniel (Pearson Addison Wesley, 2004) Software Quality Assurance: From Theory to Implementation P. 19 ISBN: 978-0-201-70945-2

Tuesday, October 25, 2011

The nature of Trust


A synopsis from the "Ethics Shmethics" point of view, and other reasons why I'll never work with Michael Schrage

http://www.cio.com.au/article/185611/ethics_shmethics/


Do the right thing, or implement the system correctly? Well, this statement is clearly flawed. If your IT department cannot implement a system correctly, then you as the CIO have failed at developing a team with the right skills; Willcocks et al. have noted that team composition and dynamics are of prime importance in ERP and CRM development. (Willcocks, Sykes, 2000)i If such difficulties arise you might be better off seeking employment as a used car salesman, a tax collector, or some other business where a lack of ethics is a good fit.

If you believe that withholding layoff information will help get your projects completed, it's probably going to be your last project as CIO, since the company will have serious issues retaining customer confidence once employee confidence and trust are lost. Failures of customer confidence include those experienced by Enron, MCI WorldCom, Nortel, Arthur Andersen and others. (Patsuris, 2002)ii

The truth is simple: the bright ones will see the writing on the wall and leave before the layoffs occur, while the deer in the headlights will expect a golden handshake for their woes. Beyond that lies the prime consideration of why: why bother hiring developers if you only need a software tool? Simply buy the tool and its rights. Microsoft bought the rights to its first software platform; this still happens every day.

Scenario:
CRM Development – Outsourced Support and Administration thereafter.

Utilitarian –
If you have a development department capable of developing a good, valuable CRM solution that will function as desired, and you plan to outsource support and administration, then you may consider engaging this software house as a means to generate real revenue for your company. If you cannot create a new and profitable business venture, then your company should also have outsourced the development, as you had no intention of retaining these employees after the project was completed; if you fail to plan, then you plan to fail. The moral good of the collective of employees is best served by ensuring their long-term employment over considerations of profit; one such consideration is that the organization owes these people work, just as this human capital owes the organization productive effort. As Economics (Samuelson, Nordhaus, 2010)iii states, all production requires factors that include human capital and land.

Normative –
The moral facts of the situation as cited by Schrage are both simplistic and narrow-minded. Yes, IT is a business, but it's up to the sales people and directors to ensure that it functions in a sustainable manner. In 2008 the software industry added $280 billion in value to the U.S. economy alone (BSA, 2008)iv; if the sales department of this organization cannot capitalize on such a large market, then they should be the first to be replaced. The above scenario would not exist if they had outsourced all development of the platform, or if they had hired developers in a cautious manner. Full-time gainful employment as a developer means that you as an individual have already chosen to devote 100% of your time and effort to your company; to assume that these people are expendable just because the company cannot find a means to sell their talents is wrong in both the utilitarian and the normative philosophical sense. The prima facie (W.D. Ross, 1930)v duties in the normative sense are that the beneficence, fidelity and justice of the organization are co-dependent upon those received by its employees and those experienced by its customers.

Short Term Losses

The short-term losses of this organization would include a lack of skills and ability to produce any software product, not to mention a loss in production and morale. Losses to corporate morale can kill a company; if your entire workforce is apathetic, then your company's production may grind to a standstill.

Long Term Gains

The long-term gains of this company or organization may be that, instead of going bankrupt, it survives in some form other than it used to. A current example is Research In Motion, which recently gutted its workforce, reducing its programmers by 11 percent, in an effort to combat a downturn in sales. (Yin, 2011)vi One strategic move would have been to refocus efforts on Android development of a BlackBerry-compatible mail client, expanding onto an exponentially growing client base, no longer depending on their own hardware business and leveraging the strength of the mobile marketplace from Samsung, Motorola, Nokia and any other Android-based phone manufacturer; however, whether or not RIM will survive into the new year remains to be seen.

Good Ethics or Financial Benefits
The example above is just one example; however, if all organizations were to invest as heavily in their employees as their employees invest in their organizations, then the returns would include competitive advantages as well as financial ones. The best examples include sponsoring an executive MBA as a means to capitalize on leadership within the ranks of an organization, as opposed to purchasing or poaching an executive. Good ethics lead to a good reputation, and a good reputation directly increases sales and revenues regardless of the product or service being sold.

iWillcocks, Leslie P.; Sykes, Richard (ACM, 2000) Enterprise Resource Planning: The Role of the CIO and IT Function in ERP [Online] PDF Document, Available from: http://dl.acm.org/citation.cfm?id=332051.332065 (Accessed on October 23rd 2011)
iiPatsuris, Penelope (Forbes, August 26th 2002) The Corporate Scandal Sheet [Online] World Wide Web, Available from: http://www.forbes.com/2002/07/25/accountingtracker.html (Accessed on October 24th 2011)
iiiSamuelson, Paul A.; Nordhaus, William D. (McGraw Hill, 2010) Economics, 19th ed. P. 295. ISBN: 978-0-07-070071-0
ivn.a. (Business Software Alliance, 2008) Software Industry Facts and Figures [Online] PDF Document, Available from: http://www.bsa.org/country/Public%20Policy/~/media/Files/Policy/Security/General/sw_factsfigures.ashx (Accessed on October 24th 2011)
vRoss, W. D. (Oxford University Press, 1930, Reprinted 2002) The Right and the Good ISBN: 0-19-925265-3
viYin, Sarah (PCMag, 2011) RIM Cuts 2,000 Jobs, Reshuffles Management [Online] World Wide Web, Available from: http://www.pcmag.com/article2/0,2817,2389071,00.asp (Accessed on October 24th 2011)

Wednesday, October 19, 2011

The Long Con


"If you can't spot the sucker in your first half hour at the table, then you are the sucker." ~ Matt Damon, Rounders

When Vint Cerf was designing TCP/IP he did not think twice about the ramifications of its use and abuse by commercial enterprise; ethics were not a consideration at the time, just a protocol for communicating between computers in a resilient fashion. (n.a., 2000)i When Tim Berners-Lee was developing HTTP and the "hypermedia experiment" at CERN in 1989, his desire was to build a system that would enable global communication using text, video, audio and any digital artifacts that might be created with a computer; in 1990 he created the first web server and client. (W3C, 2001)ii

We may argue that these ethical people created an ecosystem whose behavior is no longer deterministic or benign but profit-driven and potentially dangerous. The LANDER project, as implemented with CAIDA, surveyed the world wide web's 4,294,967,296 possible unique host addresses as of 2007 (CAIDA, 2007)iii, and as of 2011 we have exhausted all available IPv4 addresses.

Moore's Law is popularly interpreted to mean that the cost of computing is halved every 18 months (Gordon Moore, 1965)iv. The same applies to storage: the cost of storing data is halved roughly every 18 months as well. In fact, most physical storage devices have beaten Moore's Law in cost per gigabyte per watt over the last two decades. It has become ever cheaper to store the information gained from use.
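
The halving claim is easy to turn into arithmetic. A minimal sketch of the projection (the function name and the $100/GB starting figure are illustrative assumptions, not data from any source):

```python
def projected_cost(initial_cost, months, halving_period=18.0):
    """Cost after `months`, assuming it halves every `halving_period` months."""
    return initial_cost * 0.5 ** (months / halving_period)

# Storage that costs $100/GB today, projected forward under that assumption:
for years in (1.5, 3, 10):
    print(years, round(projected_cost(100.0, years * 12), 2))
```

Eighteen months halves the cost; a decade cuts it roughly a hundredfold, which is why retaining everything has become the default.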

Business intelligence is the act of mining transaction data within an organization to derive value and intelligence from the dataset and its behavior. With the advent of Web 2.0 standards, computers that host information can now exchange semantic data as XML objects with one another using XML-RPC, no longer requiring user intervention or the "duplication" of input. These include standards such as SOAP and any other Web 2.0 standard whereby you fill out information on one website such as Facebook, LinkedIn or Twitter, and third-party sites may then query those servers using your tracking cookie as an identifier of you as a person, poll your data, and use it to auto-fill their own forms.
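
The exchange described above can be sketched with Python's standard-library XML-RPC modules. Everything here is illustrative: the `get_profile` method, the tracking-cookie identifier and the profile fields are assumptions standing in for whatever a real Web 2.0 endpoint would expose, not any actual site's API.

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# One site holds profile data keyed by a tracking cookie...
profiles = {"cookie-123": {"name": "Alice", "city": "Ottawa"}}

server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(lambda cookie: profiles.get(cookie, {}), "get_profile")
threading.Thread(target=server.serve_forever, daemon=True).start()

# ...and a third-party site polls it, with no user intervention,
# using the visitor's cookie to auto-fill its own forms.
port = server.server_address[1]
client = ServerProxy("http://localhost:%d" % port)
profile = client.get_profile("cookie-123")
print(profile)
server.shutdown()
```

The point of the sketch is the machine-to-machine handoff: the "user" never retypes anything, and never sees the query happen.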

This notion of logical "glue" was heralded as the greatest human achievement of all time by the online marketing firms, social networking websites and companies with vested interests in tracking your behavior as an Internet user for profit.

One such abuse of an individual's right to privacy is the "evercookie". This cookie uses multiple methods and vectors to place itself in commonly used stores that are not erased by most browsers, such as using the Flash object store to create an "LSO" (Local Shared Object) that can be tracked and queried. (Kamkar, 2010)v

Recently in Canada it was held that a hypertext link does not by itself constitute defamation; this has massive ramifications for another case before the courts against the well-known site isoHunt (Supreme Court, 2011)vi, in that a linking site which profits from posting links to copyrighted content may not be liable for enabling said copyright infringement.

Another example is a recent class-action suit filed by a woman in Mississippi accusing Facebook of violating federal wiretap laws by tracking people's use of the world wide web via a cookie updated on any web page that contains the "Like" button. (Goodin, 2011)vii

The ethical argument presented by companies such as Google, Facebook and LinkedIn is as follows:

"People are willing to give up their rights to privacy for communication with other people that would otherwise not be possible."

The ethical flaw in this argument is that privacy is a right granted to you, if you live in Europe or North America, by either a Charter of Rights and Freedoms or a Constitution.

These companies do not sell products; they do not sell ad space; they sell people! The user base of the website is the product: your information is theirs, and their stock valuation is proportional to the number of users they have; this perception is reality.

Google, the world's largest data store, tracks over 2 billion people daily and runs over 1,000,000 servers holding e-mail, web searches, impressions of text ads, people's digital pictures, and databases of usage; to assume that it does not violate your right to privacy in mining this data for profit is utter stupidity that should sit side by side with notions of the world being flat or the sun revolving around us.

All modern Internet companies have one common product: a group of captive users, with data on what they do and how they do it. Web 2.0 simply increased the volume of data available for mining, exploitation and sale. This is accepted as the norm and perceived as perfectly normal so long as they do not attempt to commit fraud or blatantly sell to you via traditional methods such as cold calls or telemarketing.

The Association for Computing Machinery, the oldest and perhaps largest association of computing professionals, has an entire special interest group devoted to Knowledge Discovery and Data Mining. (ACM, n.d.)viii The KDD group is a collection of very talented programmers who are usually hired by organizations to engage in knowledge management and mining of internal datasets; however, those datasets now include people's personal information, and a programmer may even find himself at ethical odds with this work, having signed the ACM's ethical code upon membership.

The trend of profiting from people's privacy is growing: as more sites become intertwined with one another, the "sticky" nature of online data exchange has created an ecosystem where advertisers may profile you before you ever see any of their ads. This industry is valued in the billions, and it is being used to drive marketing and product development as I write this. The prime issue is that this trend is wholly unethical and clearly violates everyone's respective rights; as time passes, more court cases will be brought against other violators of privacy. However, I fear that due to the sheer value of the market the courts themselves may be bought as well; justice may be blind, but she is also expensive.

In the 21st century most of humanity has been conned into believing that these companies provide services and goods to make their lives better, when really they are just more venues for advertising; we, the early adopters and Internet users, have been conned into disclosing too much information without demanding compensation equivalent to its value. Every new generation will simply accept this fact of life and continue to blindly use each of these services as if entitled to them.
in.a. (Living Internet, January 7th 2000) Vinton Cerf, TCP/IP Co-designer [Online] World Wide Web, Available from: http://www.livinginternet.com/i/ii_cerf.htm (Accessed on October 19th 2011)
iiBerners-Lee, Tim (W3C, 2011/09/01) Tim Berners-Lee Biography [Online] World Wide Web, Available from: http://www.w3.org/People/Berners-Lee/ (Accessed on October 19th 2011)
iiiCai, Xue; Fan, Xun; Dougherty, Maureen; Govindan, Ramesh; Heidemann, John; Hu, Zi; Papadopoulos, Christos; Pradkin, Yuri; Quan, Lin (CAIDA, 2007) The LANDER Project Summary [Online] World Wide Web, Available from: http://www.caida.org/research/id-consumption/census-map/ (Accessed on October 19th 2011)
ivMoore, Gordon E. (Electronics, April 19th 1965) Cramming More Components onto Integrated Circuits [Online] PDF Document, Available from: ftp://download.intel.com/museum/Moores_Law/Articles-Press_Releases/Gordon_Moore_1965_Article.pdf (Accessed on October 19th 2011)
vKamkar, Samy (Samy.pl, 09/20/2010) The evercookie [Online] World Wide Web, Available from: http://samy.pl/evercookie/ (Accessed on October 19th 2011)
vin.a. (Supreme Court of Canada, 2011) Crookes v. Newton, 2011 SCC 47 [Online] World Wide Web, Available from: http://scc.lexum.org/en/2011/2011scc47/2011scc47.html (Accessed on October 19th 2011)
viiGoodin, Dan (The Register, 14th October 2011) Facebook Accused of Violating U.S. Wiretap Laws [Online] World Wide Web, Available from: http://www.theregister.co.uk/2011/10/14/facebook_tracking_lawsuit/ (Accessed on October 19th 2011)
viiin.a. (ACM, 2011) SIGKDD Information Portal [Online] World Wide Web, Available from: http://www.sigkdd.org/ (Accessed on October 19th 2011)

Sunday, October 16, 2011

The Nature of Chaos


"The world must actually be such as to generate ignorance and inquiry; doubt and hypothesis, trial and temporal conclusions... The ultimate evidence of genuine hazard, contingency, irregularity and indeterminateness in nature is thus found in the occurrence of thinking."
- John Dewey (1958)i

The software enterprise is a vast forest of applications, each serving its own genus and function, each program and system maintaining its relevant business function. This "ecosystem" has many dependent factors, yet it is usually a homogeneous environment, where most systems within an organization are similar in nature or built on a similar base computing platform to ensure that function and form are not chaotic.

To take an organization with a "chaotic" environment and standardize it is to undergo a process of maturation: a cyclical process of software and hardware audits against existing standards, determining which gaps, if any, exist and what is required to remedy them.

The key aspects to consider in organizing the software enterprise are to determine which standards and guidelines may add value to the organization and to develop a plan to implement them, using or devising road maps to apply those standards through formal methodologies such as COBIT from ISACA. (ISACA, 2010)ii

COBIT is a methodology used to manage methodologies; an example would be formally applying CMMI from the SEI to an organization to determine its maturity levels and gaps. The CMMI is available free of charge from the SEI at Carnegie Mellon and incorporates many of the standards and processes of Six Sigma. The CMMI is a collection of best practices (SEI, n.d.)iii; given that it is a collection of what can be done, determining what must be done is a matter of business practice in gap assessment and impact assessment, as well as risk and value assessments, for the organization's core business practices. CMMI-DEV applies if the organization is a development house; CMMI-SVC may apply if the organization offers services.

In addition to using methodologies to adopt standards, we must also consider which standards to adopt to reduce chaos. The ISO standard for software development quality assurance is 9000-3; the current iteration of this standard is entitled ISO 90003:2004 and is available from the ISO/IEC. (ISO, 2004)iv Other relevant standards include ISO/IEC 12207 for software life cycle processes, ISO/IEC 15504 for quality assurance plans, and ISO 27001 and ISO 27002 to improve the organization's security stance.

Standardization is one method to ensure that a software enterprise is producing quality, secure software of great value, but it cannot do so without an enterprise project management office in place to ensure that the desired standards are being met in their current versions; thus we must also ensure that project management methods, such as the broad adoption of an SDLC, are being observed.


In addition to the above formal methods there is also the question of good "due care" as defined by the (ISC)2. (Tipton, 2010)v Is the corporation or organization engaged in planning for business continuity, disaster recovery and availability requirements? Are all of these formally defined and understood by both the executive and the employees of all departments, not limited to just IT?

Thus the key considerations for any software enterprise are: is the office environment standardized and mature? Is every desktop in the software enterprise managed by a formal methodology, including the ITSM standards of ITIL as defined by the U.K. Office of Government Commerce (ITIL, OGC, 2010)vi, such as release and problem management along with formal configuration and change management? The other common-sense consideration is: do the people, process and technology function as they should to achieve the business goals of the organization?

The reasons behind adopting standards, methods, and the methodologies used to apply them to the software enterprise are very simple: they are industry-proven means of improving the value, availability and quality of the software enterprise. The people and process may be simple and the technology complex, but the goal is to reduce the amount of chaos within the software enterprise to a manageable level that can be quantified, measured and reported upon. Not only will this increase the organization's competitiveness, it will also make it a far more secure and resilient entity; however, we are assuming that these standards, methodologies and processes are adopted and implemented with care and wisdom, endorsed by the executive and understood by the employees.

The ecosystem in a rainforest is wonderfully diverse and very deadly. The ecosystem in a managed forest is less complex and far more habitable, as well as productive. The goal of these exercises in adopting methods and standards, through assessment and feedback, is to change the nature of the software enterprise from a risky and chaotic stance to a risk-averse and standardized one that is measurable and quantifiable in human terms.

If we are unaware of the dangers lurking in the trees how can we ever hope to produce any paper?

Conversely, if we have diverse, separate groups of individuals formulating software projects with no oversight or consideration for quality or management goals, how may we ever hope to maintain our level of quality or our client base?
 
iKellert, Stephen H. (University of Chicago Press, 1993) In the Wake of Chaos P. 1 ISBN: 0-226-42976-8
iin.a. (ISACA, 2010) COBIT 4.1 [Online] PDF Document, Available from: http://www.isaca.org/Knowledge-Center/cobit/Pages/Downloads.aspx (Accessed on October 16th 2011)
iiin.a. (Software Engineering Institute, Carnegie Mellon University, 2010) CMMI Solutions: Process Areas [Online] World Wide Web, Available from: http://www.sei.cmu.edu/cmmi/solutions/index.cfm (Accessed on October 16th 2011)
ivn.a. (ISO/IEC, 2005) Software Engineering -- Guidelines for the Application of ISO 9001:2000 to Computer Software [Online] PDF Document, Available from: http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=35867 (Accessed on October 16th 2011)
vTipton, Harold F. (Taylor & Francis, 2010) Official (ISC)2 Guide to the CISSP CBK, Second Edition P. 266 ISBN: 978-1-4398-0959-4
vin.a. (APM Group Ltd., 2007) Official ITIL Website [Online] World Wide Web, Available from: http://www.itil-officialsite.com/ (Accessed on October 16th 2011)

Tuesday, October 11, 2011

Privacy in the Information Age

Computing has many capabilities, and information technology is now a driver for most business ventures, including multi-billion dollar companies such as Google, Microsoft, IBM, Oracle, Facebook and Amazon.

The following five basic capabilities are available to any organization via investments in computing infrastructure.

Long term record storage

The advent of inexpensive desktop and pocket computing has had as much influence on business as the Gutenberg press did on the dissemination of information centuries ago. Computing facilities now house all the records of most organizations in digital format, usually residing either on a series of hard drives or on a highly available, redundant storage area network. Having these computing resources gives an organization or business the ability to store records of all its practices and procedures, including records of communications and meetings in the form of e-mail, minutes and letters. We no longer measure libraries in terms of physical storage space; we measure available storage in "Libraries of Congress", or in the gigabytes of data that may be stored. This facilitates the development of a long-term corporate memory: where once an agent of an organization would have to pore over archives and file folders for hours on end to locate information, they may now simply submit a query to an internal database of records to locate relevant data in the form of documents, communications, actuarial records (books of business) or any other relevant information. The availability of vast amounts of information in various fields of science is turning once-traditional research, built upon physical tests and trials, into an exercise in data management and mining. Chervenak et al. (1999)i state that a universal data-grid design for accessing these now enormous data sets (petabytes of information) must follow a standardized format to ensure interoperability and that any research conducted upon it is sound. Long-term storage equates to a perfect memory, provided the integrity of the stored data is sound.

Knowledge Management

Knowledge management is defined by Junpil Hahn and Mani Subramani (1999)ii as the ability to manage the intangible assets within an organization; that is, to distribute and manage the intangible information held by the individuals who work for said organization. Knowledge management systems are IT services and infrastructure that become business enablers and competitive advantages by ensuring that all the value derived from the knowledge of an organization and its business is maintained and transferable. This would not be possible without an investment in a knowledge management system, which resides on computing infrastructure.

Business Process Automation

Business process automation can be loosely defined as the integration of automation engines within an organization's infrastructure, where each independent cost centre may be viewed as having its own automation engines. For example, finance and accounting may use packages such as ACCPAC, Simply Accounting, Oracle Financials or PeopleSoft to accomplish common tasks like payroll, remuneration and the calculation of balance sheets for current operations. Business process automation would involve the integration of these systems with business intelligence and marketing platforms to further automate business processes such as manufacturing or software production. Again, business process automation requires that multiple IT systems be integrated and made aware of one another; this usually involves the adoption of WSDL or XML-RPC and Web 2.0-based standards for data interchange. Examples of BPA include automating the reconciliation of electronic books, say the payroll database against the transactions fed into the accounting platform directly from the bank. Basically, BPA allows an organization to streamline its operations and realize savings in staffing and human time by automating as much as can be automated.
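The book-reconciliation example above can be sketched concretely. The record layouts and field names below are invented for illustration; a real integration would receive these records over XML-RPC or WSDL-described services rather than as in-memory lists.

```python
# Hypothetical sketch of one BPA task: reconciling payroll records
# against bank transactions. Field names are invented for illustration.

def reconcile(payroll, bank):
    """Return (matched, unmatched) payroll entries.

    A payroll entry matches a bank transaction when both the
    reference ID and the amount agree.
    """
    bank_index = {(t["ref"], t["amount"]) for t in bank}
    matched = [p for p in payroll if (p["ref"], p["amount"]) in bank_index]
    unmatched = [p for p in payroll if (p["ref"], p["amount"]) not in bank_index]
    return matched, unmatched

payroll = [
    {"ref": "P-1001", "amount": 2500.00},
    {"ref": "P-1002", "amount": 3100.00},
]
bank = [
    {"ref": "P-1001", "amount": 2500.00},
]
matched, unmatched = reconcile(payroll, bank)
print(len(matched), len(unmatched))  # 1 1 -- exceptions go to a human for review
```

The point of the automation is the last line: only the exceptions ever reach a person, while the routine matches flow straight through.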

General Purpose Computing

When Alan Turing conceived of his machine, the Turing machine, in 1936 (Barker-Plummer, 2011)iii as a means to conduct thought experiments with computing systems, he did not realize at the time that it might be capable of simulating every algorithm ever created. Thus we may define general computing as the application by a computer of any system or algorithm to any data to obtain any information desired. General computing within the day-to-day activities of an organization consists of the desktop operations of its agents and their tool sets, however complex or simple they may be. These range from the mundane manipulation of information in spreadsheets all the way up to the most advanced research in statistical applications such as SAS or similar scenarios. General computing is any computing involving mathematics and a computer: it uses the records maintained in long-term storage to facilitate the creation of value or to glean insight from the available data.
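To make the idea of a universal machine concrete, here is a minimal sketch of a Turing machine simulator (an illustrative toy, not drawn from any cited source). The example machine appends a 1 to a unary number, computing n + 1.

```python
# A minimal Turing machine simulator. The transition table maps
# (state, symbol) -> (symbol_to_write, head_move, next_state).

def run(tape, transitions, state="scan", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape, extendable in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += {"R": 1, "L": -1}[move]
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: append a '1' to a unary number (n -> n + 1).
transitions = {
    ("scan", "1"): ("1", "R", "scan"),   # skip over the unary digits
    ("scan", "_"): ("1", "R", "halt"),   # write one more '1', then halt
}
print(run("111", transitions))  # 1111
```

The same `run` function executes any machine described by a different transition table, which is exactly the sense in which the hardware is "general purpose" and the behaviour is all software.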

Interconnectivity of Working groups

The information superhighway, in its current form as defined by the companies that run these networks, is a series of high-bandwidth optical interconnections between geographically dispersed municipalities, provinces, states and countries. Realistically, the Internet is mapped by the CAIDA project (n.a., 2011)iv and is governed by international standards bodies in each country with access to it. What the Internet facilitates is the rapid exchange of information among groups of geographically dispersed agents. The nature of these exchanges is usually beneficial where business and commerce are concerned; the risks of interconnection arise when organizations expose themselves via unsound software or business practices in the types of communications they engage in on the Internet. The risk of fraud and exploitation by agents in countries where little or no oversight exists has led academics to move to "Internet2", a separate ultra-high-speed network built for the sole purpose of academia.

The networks currently used by a business to manage its organization and employees are made semi-private or private by using security technologies such as application virtualization, or by running encrypted networks on top of the Internet such as virtual private networks or SSL access gateways, to ensure that employees can reach information resources from wherever they may have Internet access. As Internet access is fairly ubiquitous in the industrialized world, an organization may no longer be bound by the physical constraints of available office space. This of course raises the question of what methods and metrics are to be used to determine whether an employee is productive, since in effect the punch clock is no longer a physical device. The benefits of utilizing the Internet versus physical presence are many: inexpensive video communication over instant messaging is the once-dreamed-of "video phone", and it was developed by software companies, not phone companies. This inexpensive access allows organizations to compete at a multi-national level with a few key resources conducting far more specialized work in local or remote locations. Many organizations today, including some of the top-performing publicly traded companies, reduce travel and operations costs by holding virtual meetings.

The Death of Privacy

Privacy still exists as a legal fiction; the police and state may not commit transgressions against an individual's privacy, such as searching or monitoring their communications, without due cause, which under Canadian common law means a hearing before a judge to obtain a warrant to engage in surveillance. This due process is closely tied to habeas corpus (the writ protecting against unlawful detention) and forms part of the basis upon which civilized society functions. The truth about privacy is that it is long dead; it was killed by Visa, MasterCard, American Express, Facebook, Google, LinkedIn, Twitter and Microsoft. The standards of Web 2.0 played a minor role, but the agents are these companies, whose core business is managing and presenting our information to us. Google is the worst offender when it comes to privacy: it is estimated that Google operates over 1,000,000 systems in a custom-designed cluster, an amalgamation of data centres across the globe, mostly in North America; however, to ensure speedy searches, they co-locate as close to major population centres as possible. Eric Schmidt (Richmond, 2010)v, the previous CEO of Google, once stated:

“Our company’s policy is to get right up to the creepy line and not cross it”

The Privacy Commissioner in Canada took action following the comment, and the ensuing inquiry was designed to determine what information Google maintained on its infamous and useful Street View trucks. As it turned out, Google would also sample any Wi-Fi networks that were available while driving. They kept this information internally and did not publish it, but the use of this information without the consent of the owners of these networks was, in Canada, a clear violation of the Privacy Act.

Recently Reid Hoffman, the founder of LinkedIn, stated at the Davos Annual Meetingvi that:

“Privacy Issues are for Old People since the value of connection is greater than privacy issues”

George Cummings, the CEO of Forrester, stated (2010):

"The 15 most trafficked sites of the world seven of them social sites; which constitutes 5 to 6 hours of a day on the consumption of media, twitter has over 25 million unique visitors per day, Facebook 130 million users per day, my space 50 to 60 unique visitors per day, and linked in about 15 million and ming about 6 million these are from Com-score."

Only 17% of the on-line community uses social networking sites, and the sites are aimed at younger Internet users; the current trend is adoption among young adults. In addition to the social networking sites and on-line search and geo-location services, the other major offenders are our own institutions: our provincial and federal governments, in their efforts to ease access to services, expose databases of our hard information such as property ownership, licensing and registration records, as well as birth and death records, via sites like Ancestry.com, where we may review and pay for our genealogy. In addition, our banks and creditors routinely track our purchases to better serve us and to glean insight into our spending habits.

Thus the organizations that have served us for decades, with the advent of inexpensive computing and by utilizing the above five capabilities at various large scales, may now maintain a permanent memory of every action we have ever been involved in, including any pictures that others have taken of us and any records held by any municipal or federal government. Large businesses have made huge profits simply by asking us to disclose information we see as valueless in order to facilitate connection and communication in a transparent manner, but in doing so we have sacrificed our privacy for temporary and transient communications and connections. I believe the value of a shared identity and a smaller global village is very real, but I also believe that the value of my privacy is incalculable; it is a right in my country, and I would like my children to enjoy this right as well.

i Chervenak, Ann; Foster, Ian; Kesselman, Carl; Salisbury, Charles; Tuecke, Steven (1999) The Data Grid: Towards an Architecture for the Distributed Management and Analysis of Large Scientific Datasets [Online] PDF Document, Available from: http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.32.6963 (Accessed on October 10th 2011)
ii Hahn, Junpil; Subramani, Mani R. (1999) A Framework for Knowledge Management Systems: Issues and Challenges for Theory and Practice [Online] PDF Document, Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.114.7693&rep=rep1&type=pdf (Accessed on October 10th 2011)
iii Barker-Plummer, David (2011) The Stanford Encyclopedia of Philosophy [Online] World Wide Web, Available from: http://plato.stanford.edu/entries/turing-machine/ (Accessed on October 10th 2011)
iv n.a. (2011) CAIDA Visualizations [Online] World Wide Web, Available from: http://www.caida.org/publications/visualizations/ (Accessed on October 10th 2011)
v Richmond, Shane (October 25th, 2010) How Google crossed the creepy line [Online] World Wide Web, Available from: http://www.telegraph.co.uk/technology/google/8086191/How-Google-crossed-the-creepy-line.html (Accessed on October 10th 2011)
vi World Economic Forum (2011) Davos Annual Meeting 2010 - The Growing Influence of Social Networks [Online] Video, Available from: http://www.youtube.com/watch?feature=player_embedded&v=pexGCUPlUeA#! (Accessed on October 10th 2011)

Thursday, September 8, 2011

The Furlong may take a Fortnight and weigh a Firkin


A project metric is defined as any data that may be mapped by a mathematical function.i Functions applied to these metrics include measurement, comparison, analysis, synthesis, estimation and verification.

Performance metrics are defined as any metrics used to derive the value of a given system under scrutiny; these include financial metrics, intangible projections and EVM-related data.ii The process of generating metrics consists of defining requirements for measurement, developing the measures to determine whether said requirements are met, and establishing well-defined targets to measure against.

Methods used to create these metrics include project charting using earned value management, the use of performance-based logistics, or defining methods to determine key performance indicators.iii
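Earned value management rests on a handful of standard formulas: cost variance CV = EV - AC, schedule variance SV = EV - PV, cost performance index CPI = EV/AC, and schedule performance index SPI = EV/PV. A minimal sketch, with invented project figures:

```python
# Standard earned value management (EVM) indicators. The project
# figures passed in below are invented for illustration.

def evm_indicators(pv, ev, ac):
    """pv: planned value, ev: earned value, ac: actual cost."""
    return {
        "CV": ev - ac,    # cost variance (> 0 means under budget)
        "SV": ev - pv,    # schedule variance (> 0 means ahead of plan)
        "CPI": ev / ac,   # cost performance index (> 1 is good)
        "SPI": ev / pv,   # schedule performance index (> 1 is good)
    }

m = evm_indicators(pv=100_000, ev=90_000, ac=120_000)
print(m)  # CPI is 0.75: every dollar spent earned $0.75 of planned work
```

The same four numbers feed directly into the scorecards and heat maps discussed below, since they are already normalized ratios.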

Key performance indicators may be classified into the following groups:

Quantitative
Practical
Directional
Actionable
Financial

Each of these indicators can be used to develop balanced scorecards or heat maps, or as inputs for SWOT or magic-quadrant analysis of a given project and its deliverables. These also draw on various charting methods such as histograms, candlestick charts, inverted bar charts and others.

The distribution of this information, once created, may be via standard methods including e-mail or the internal network (the intranet) of an organization, where proper security and accounting controls may be used to audit and control access. Key performance indicators on a major project can reveal corporate risk appetite and potentially damage the reputation of the company or organization should they be disclosed publicly. For example, when Sony Online's networks were breached, their stock dropped 15% in less than one week due to widespread fear of public reprisals, including class action suits over the disclosure. If a company or organization is spending millions on a software project to improve its security stance, the very existence of this expenditure could affect public confidence. As Sun Tzu is said to have stated, "The basics of all combat rely upon a foundation of secrecy."

The methods cited here are by no means exhaustive; others exist, such as Six Sigma or TQM as cited by Schwalbe. However, these are used frequently enough to warrant discussion.

Another issue is standardization: a given organization must agree upon the methods to be used beforehand to ensure consistency in the delivery of these project performance metrics, for if the data cannot be read and understood then it might as well not exist in the first place. Thus if the organization agrees to conform to ISO 9000 and then standardizes on ISO 12004, 10007 and 15403 for quality management, it must also standardize the project methodology and the related performance metrics to be reviewed by each of these procedures, to ensure that they are both consistent and understood.

Thus we may define what performance metrics we wish to measure within our project, apply a ruler of our own design, and in turn derive value from the reports generated, to be used as inputs for business decisions related to the project in question. Odd units of measure that come to mind are the inverse femtobarn, the attoparsec or the hoppus foot; for projects we may define dollars per hour per objective feature as an efficiency standard if we so wish, or time per line of code (t over KLOC) per project.


i Pocatilu, Paul (Revista Informatica Economica, 2007) IT Project Metrics [Online] PDF Document, Available from: http://revistaie.ase.ro/content/44/26%20pocatilu.pdf (Accessed on September 8th 2011)
ii Brown, Mark Graham (Productivity Inc., 1996) Keeping Score: Using the Right Metrics to Drive World-Class Performance P. 6-10 ISBN: 0-527-76312-8
iii Fitz-Gibbon, Carol T. (Blank House, BERA Dialogues 2, 1990) Performance Indicators ISBN: 1-85359-093-2

Thursday, August 25, 2011

How to Motivate Project Teams


Ultimately a project manager must determine the best method to motivate their team; the greatest leaders in business use oration to inspire, money to reward, and business intelligence to analyze effectiveness.

Methods and Tests

Schwalbe states that there are the following motivationsi:
Intrinsic motivation is classified as motivation derived from the nature of the work itself: a boat builder probably enjoys sailing, which may be why they studied the profession; a writer may enjoy writing; a programmer may enjoy thinking in mathematical abstraction or the challenge of reorganizing abstracted data in a useful way.

Extrinsic motivation is classified as the traditional risk-reward axis: the reward of money for services rendered, the risk of being unemployed or homeless. These are "extrinsic" to the subject in question.

The following psychological models may be used;

Myers-Briggsii, otherwise referred to as the MBTI indicator test.

Abraham Maslow and his Hierarchy of Needs.iii

Frederick Herzberg and his motivation-hygiene theory of the motivation to work.iv

David McClelland based his approach on the Thematic Apperception Test, which is similar to a Rorschach test and is designed to be analyzed to discern the subject's unconscious intent; the formal scoring methods include the DMM and SCOR standards.v

Douglas McGregor's Theory X and Theory Y.

As elsewhere in the information age, the power of computers coupled with magnetic resonance imaging has allowed us to realize that many psychological arguments and models are incorrect; in fact, the entire ontology of human motivation is now being remapped within medical science, including the nature of being (the self) and the nature of motivation. This shift in paradigms will supplant much of psychology with functional neuroanatomy, where motivation is classified as a given state within a person's connectome as mapped by the anatomy of their mind.

The nature of human anatomy

Lévy states the following about motivation:vi
"To relate motivation, a psychological concept, to a neurobiology substrate requires that this concept should be translated in operations relevant at the biological level. The term of motivation includes mental elementary processes decoding the emotional value of a stimulus (endogenous or exogenous) and integrating it into the elaboration, control and execution of goal-directed behaviors in order to ensure the maintenance of homeostasis, the well-being and survival of the individual and the species. Thus, in experimental condition, responses to aversive or appetitive situations are the classical markers of motivation. Consequently, the cerebral network of motivation particularly relates to the limbic system, i.e., the amygdala, the orbital and ventromedial prefrontal cortex, the anterior cingulum and the ventral striatum. Within this network, it is possible to distinguish: on one part structures such as the orbital and ventromedial prefrontal cortex, which are essential for the highest level of adaptive responses, allowing the individual to decode the variations of the emotional contingencies in real time, to integrate them into the behavior, and to maintain a behavioral choice during a long course, regardless of endogenous or environmental constraints; on the other part structures like the amygdala allow to build stable and invariants behaviors, that can be automatically activated and are essential for survival and avoidance of noxious situations."


Vilayanur Ramachandran states that the difference between humans and other primates in their individual perception of reality is due to the input from their skin via the autonomic nervous system, which is mediated (limited or restricted) by the mirror neurons and the fusiform gyrus; both of these components of the human mind have evolved in the last 75,000 to 100,000 yearsvii, leading to our use of language, fire, tools and every last component of civilization. His research with functional MRIs and phantom-limb pain has shown that abstracted thought is centred upon the angular gyrus region of the brain; in humans this portion is eight times larger than in other primates.viii Thus the nature of all abstracted thought, the centre of all creativity, lies in this area of the mind; it is the intersection of stimuli from the visual, auditory and sensory regions; as he states, it is the seat of self.

Management Tools


Motivational theories, being based in psychology, are suspect insofar as they lack objective specificity when compared to an fMRI study using modern medical techniques. However, we may infer from Ramachandran's and Lévy's findings that basic motivations, whether greed or the desire for positive outcomes from various challenges, as mentioned under intrinsic and extrinsic motivation, filter through the structure of our minds as specific stimuli; these could theoretically be mapped with fMRI to determine the exact areas of the mind, the specific anatomy, that objectively require stimulation. As project managers, however, we cannot implant neural stimulation devices; we must substitute tangible rewards and penalties drawn from the models mentioned in this lecture, which we hope have an equal net effect in said regions. These include monetary gain and peer acceptance, as well as loss and rejection. The old adage holds true that it is far easier to bait a horse than to whip it.

Influence And Power

Influence and power are mentioned by Schwalbe; however, I have found that influence is subjective and power often fleeting. Power is too often abused by those that wield it, to an end where the team is dissolved and the risk of failure is high.

Effectiveness

Determining effective methods for team motivation includes developing a performance assessment program that is concise and driven by the subject in question; if we allow our team members to specify their respective goals, they will be aided in determining what those goals may be and will help formulate them in the process. This standard is practiced by a number of organizations; training subsidies and mentoring programs are excellent methods for developing effective influence over employees and project members alike. The goal is to instill a relationship of trust and empowerment within the team to accomplish whatever task lies ahead; the particular methods used to do so are almost moot.



Distributed Projects

Since this discussion is about motivation in distributed projects, we can use the entire open-source community as a reference. Red Hat, Ubuntu, SUSE, Gentoo, in fact every distribution of Linux, are developed by geographically diverse sets of programmers using standard and formal methods for revision control, auditing and review to develop large families of tool sets. A distribution is defined by its parent organization, and these organizations have a different business model from that of a traditional software house: by opening their code to public editing they have further reduced the overall cost of development whilst maintaining key control over the functions and quality of the thousands of programs that make up an OS. The open-source movement uses the primary motivation of programmers, to distinguish themselves within their chosen peer group; the financial reward in this business model comes in the forms of consulting, support and administration of said systems. Organizations that make billions on its back include SAP, IBM, Oracle and others.

The Truth about Managers

The truth about great project managers and leaders is that they do not manage at all; they simply find the best way to remove apathy from the team they are working with and find ways to challenge them to achieve the desired goal. I have in the past used expert advice and rapport alone to achieve the project goals defined by my clients; these have included compliance with SOX, implementation of X.509/PKI systems according to federal government standards, and distributed security systems. In each project I worked as a member of a team, and my primary contributions, apart from technical expertise, were direction and guidance on how to achieve our desired goals.

References
i Schwalbe, Kathy (Cengage Learning, 2009) Information Technology Project Management, 6th ed. ISBN: 978-0-324-78692-7
ii Myers, Isabel Briggs; Myers, Peter B. (Davies-Black Publishing, 1995) Gifts Differing: Understanding Personality Type. ISBN 0-89106-074-X
iii Maslow, A. H. (Psychological Review, 50, 1943) A Theory of Human Motivation [Online] World Wide Web, Available from: http://psychclassics.yorku.ca/Maslow/motivation.htm (Accessed on August 24th 2011)
iv Herzberg, Frederick; Mausner, Bernard; Snyderman, Barbara Bloch (Transaction Publishers, 2009; 12th reprint of John Wiley and Sons 1959 ed.) The Motivation to Work ISBN: 1-56000-X
v Westen, Drew (Journal of Personality Assessment, Volume 56, Issue 1, February 1991) Clinical Assessment of Object Relations Using the TAT. P. 56-74
vi Lévy, R. (Fédération de neurologie et Inserm U 610, Hôpital de la Salpêtrière, Paris, December 24th 2004) The neuroanatomy of motivation in man [Online] PDF Document, Available from: http://www.ncbi.nlm.nih.gov/pubmed/15683980 (Accessed on August 25th 2011)
vii Ramachandran, VS (TED Talks, November 2009) The neurons that shaped civilization [Online] Video Lecture, Available from: http://www.ted.com/talks/vs_ramachandran_the_neurons_that_shaped_civilization.html (Accessed on August 25th 2011)
viii Ramachandran, VS (TED Talks, March 2007) VS Ramachandran on your mind [Online] Video Lecture, Available from: http://www.ted.com/talks/vilayanur_ramachandran_on_your_mind.html (Accessed on August 25th 2011)

Thursday, August 18, 2011

Zen and the art of the accurate estimate

Schwalbe states that estimates are difficult for the following reasonsi:
  • Estimates are done too quickly
  • Lack of Estimate Experience
  • Humans are biased towards underestimation
  • Management Desires Accuracy
In addition to these difficulties there are many methods that may be used to estimate the cost of a software project including:
  1. Top-Down Estimatesii
  2. Bottom Up Estimatesiii
  3. Parametric Modeling iv
  4. COCOMOv
  5. COSYSMOvi
  6. Event Chain Methodologyvii
  7. Function Pointsviii
  8. Program Evaluation Review Technique (PERT)ix
  9. Proxy Based Estimation (PROBE)x
  10. The Planning Gamexi
  11. Weighted Micro Function Pointsxii
  12. Wideband Delphixiii
Thus far, within the confines of project estimation as we have examined it, the art of "triangulating" a project estimate using both a top-down and a bottom-up approach promises to be the most efficient and accurate method; however, Grimstad et al. state that one of the greatest risks of failure for any software project is an unreliable cost estimate, as previously shown by various studies.xiv
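Of the methods listed, PERT (method 8) is the simplest to show concretely: each task gets a three-point estimate, and the expected duration is the weighted mean (O + 4M + P)/6 with standard deviation (P - O)/6. A minimal sketch, with invented task durations:

```python
# PERT three-point estimation: the expected value weights the
# most-likely estimate 4x against the optimistic and pessimistic ones.

def pert(optimistic, most_likely, pessimistic):
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    stdev = (pessimistic - optimistic) / 6
    return expected, stdev

# For independent tasks, expected durations simply add.
tasks = [(2, 4, 12), (1, 2, 3), (5, 8, 17)]  # (O, M, P) in days, invented
total = sum(pert(*t)[0] for t in tasks)
print(total)  # 16.0 days expected across the three tasks
```

Note how the heavy weighting of the most-likely value partially counteracts the human bias toward underestimation listed above, since the pessimistic tail still pulls the mean upward.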

The etymology of the term "estimate" is Latin; its sense of a "builder's statement of projected costs" dates from 1796 (OED). Builders have the luxury of developing costs from tangible assets.

Software is intangible; that is to say, while we may ultimately use metrics to measure software, these standards include "guessing" how many lines of code may be required for a non-existent function, or how many man-hours may be derived using Boehm's COCOMO estimates. Since software is the art of describing functions in a language of mathematical abstraction, this abstraction itself adds to the difficulty of developing an "estimate" of how many lines of code may be needed, in which programming language, for which function.

Given that we may select a methodology and develop metrics based upon empirical data from a given project, we may assume that we must base estimates on previous projects with the same group of people; that is to say, we must have previously completed work to compare against in order to develop a more accurate method for estimating future projects. This is a common practice derived primarily from fiscal budgeting and actuarial standards: if a given business wishes to estimate income it will use the previous year's sales as a starting point; so too would a software development house use the previous year's lines of code versus profit gained and man-hours of work as metrics for the next (or any other) quoted project.
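The previous-years calibration described above can be sketched as follows. The historical figures and the simple hours-per-KLOC rate are invented for illustration; a real shop might instead fit a nonlinear model such as COCOMO's effort = a x KLOC^b.

```python
# Calibrating an estimate from historical data: derive an average
# productivity rate (hours per KLOC) from past projects, then apply
# it to a new size estimate. All figures are invented.

history = [
    {"kloc": 12.0, "hours": 2100},
    {"kloc": 30.0, "hours": 5400},
    {"kloc": 8.5,  "hours": 1700},
]

def hours_per_kloc(projects):
    total_hours = sum(p["hours"] for p in projects)
    total_kloc = sum(p["kloc"] for p in projects)
    return total_hours / total_kloc

def estimate_hours(new_kloc, projects):
    return new_kloc * hours_per_kloc(projects)

print(round(estimate_hours(20.0, history)))  # hours for a ~20 KLOC project
```

The sketch also shows the weakness discussed below: the rate is an aggregate of averages, so any change in team, tooling or scope silently invalidates it.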

The biggest problem in software cost estimation is the simple fact that, even considering all of the research and formal methods we may use, it is still just a very educated and formalized guess at a given cost.
Just as the old proverb states, the only constant in life is change (that is, next to death and taxes); and given that the software industry is the one with the greatest risk and potential for profit, the amount of change within the software market in a given year is gargantuan compared with more traditional segments of industry.

The core business of mining resources has not changed in millennia, and neither has farming or travel; all of these sectors realize greater efficiencies in the information age by using software and automated systems along with great leaps in technology.

In the information age the rate of change and evolution is increasing, as stated previously by Ray Kurzweil; although he predicts systems with greater intelligence than their human agents at some point in the near future, until then we are the agents of change within the software industry.

The biggest obstacle to developing accurate cost estimates for any given software project is change: change in the scope of the project, change in the market forces driving the product's development, internal recursive changes in the code used by the programmers to develop the project, and change in the project's environment and business, including changes in its risk profile or even in the composition of the development and testing team for said project.

We can model projects with a fair degree of accuracy given a few constants: the teams of people involved in the project, their expected salaries, the variance in the software market (which is far more volatile than most commodities and securities markets) and the expected revenues from the project given previous years' market data. However, if any of those constants changes with respect to the project, then we must revisit its model and cost estimate to maintain an accurate view.

A simple analogy for project costing is how economists model the prime interest rate for a country. The models used vary over time, but the goal of the rate remains the same: to maintain a stable level of inflation.
They may use any models at their disposal, or even develop custom models using automated systems, but ultimately, at the highest levels of research, the numbers used for CPI, GDP and the like are aggregates of averages; the amount of removal or abstraction in the process may not compensate for a change in the global economic environment. These models are slow to react to change.

Thus any methodology used to develop cost models may use aggregates of averages, or even aggregates of aggregates, to develop a high-level idea of what an enterprise spends on all of its projects; but ultimately these models are so completely abstracted from reality that they must be sanity-checked against actual expenditures to ensure that they hold any ounce of accuracy. Each of these processes is sensitive to change, and although we excel at placing variance in an automated system, we still require experience and previous data upon which to develop our cost models.
 
This reason alone is why many software startups fail, and indeed why many businesses fail.
iSchwalbe, Kathy (Cengage Learning, 2010) Information Technology Project Management P.263 ISBN: 978-0-324-78692-7
iiSchwalbe, Kathy (Cengage Learning, 2010) Information Technology Project Management P.264 ISBN: 978-0-324-78692-7
iiiSchwalbe, Kathy (Cengage Learning, 2010) Information Technology Project Management P.264 ISBN: 978-0-324-78692-7
ivSchwalbe, Kathy (Cengage Learning, 2010) Information Technology Project Management P.265 ISBN: 978-0-324-78692-7
vBoehm, Barry; Clark, Bradford; Horowitz, Ellis; Westland, Chris (USC Center for Software Engineering, Annals of Software Engineering, 1995, 57-94)
viValerdi, Ricardo (University of Southern California, August 2005) The Constructive Systems Engineering Cost Model [Online] PDF Document, Available from: http://csse.usc.edu/csse/TECHRPTS/PhD_Dissertations/files/Valerdi_Dissertation.pdf (Accessed on August 17th 2011)
viiAktas, Nihat; de Bodt, Eric; Cousin, Jean-Gabriel (IAG Louvain School of Management, Université catholique de Louvain, Belgium, November 13 2006) Event studies with a contaminated estimation period [Online] PDF Document, Available from: http://www.sciencedirect.com/science/article/pii/S0929119906000290 (Accessed on August 17th 2011)
viiiJones, Capers (McGraw-Hill, 2007) Estimating software costs: bringing realism to estimating P.246 ISBN: 978-0-07-148300-1
ixKeefer, Donald L.; Verdini, William A. (Arizona State University, Department of Decision and Information Systems, Management Science, Vol. 39, No. 9) Better Estimation of PERT Activity Time Parameters [Online] PDF Document, Available from: http://www.jstor.org/pss/2632814 (Accessed on August 17th 2011)
xJohnson, Philip M.; Moore, Carleton A.; Dane, Joseph A.; Brewer, Robert S. (University of Hawaii, IEEE, 2000) Empirically-Guided Software Effort Guesstimation [Online] PDF Document, Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.103.5058&rep=rep1&type=pdf (Accessed on August 17th 2011)
xiWood, William A.; Kleb, William L. (IEEE Software, NASA Langley Research Center, Issue 3, May/June 2003) Exploring XP for Scientific Research [Online] PDF Document, Available from: http://doi.ieeecomputersociety.org/10.1109/MS.2003.1196317 (Accessed on August 17th 2011)
xiiSymons, Charles (Proc. of the 4th European Conf. on Software, 2001) Come Back Function Point Analysis (Modernized) All is Forgiven! [Online] PDF Document, Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.8.2999&rep=rep1&type=pdf (Accessed on August 17th 2011)
xiiiWiegers, Karl E. (Process Impact, 2000) Stop Promising Miracles [Online] PDF Document, Available from: http://www.uml.org.cn/SoftWareProcess/pdf/delphi.pdf (Accessed on August 17th 2011)
xivGrimstad, Stein; Jørgensen, Magne; Moløkken-Østvold, Kjetil (Journal of Information and Software Technology 48(4):302-310, 19 April 2005) Software Estimation Terminology: The Tower of Babel [Online] PDF Document, Available from: http://simula.no/research/se/publications/Grimstad.2006.1 (Accessed on August 17th 2011)