Saturday, October 17, 2009

Poster Topic




Friday, October 16, 2009

Unauthorized Web Servers


I found this article rather shocking.  The IRS potentially had 87% of the servers on its network missing from its update and patch database.  Apparently, 1,150 of the servers in use were potentially being used for unauthorized activities.  With an institution as large and as well funded as the IRS, one would tend to think its security standards and practices would be, at a minimum, up to par; this article illustrates exactly the opposite.  It further proves the need for organizations to have procedures in place governing access to their networks.  Whether device specific or IP specific, there need to be standard policies and procedures in place so that when an employee needs to add a device, it goes through the proper channels, not only to be cleared against security standards but also to be added to the inventory or the employee's profile as an asset.  Here again lies the struggle between our thirst for convenient technology and the need to keep our information secure.
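The inventory-reconciliation procedure described above boils down to a set difference between what is seen on the network and what has been approved. A minimal sketch (all host addresses here are made up for illustration):

```python
# Toy reconciliation of discovered hosts against an authorized inventory,
# in the spirit of the patch-database audit described above.
# The host lists below are fabricated for illustration.

def find_unauthorized(discovered, inventory):
    """Return hosts seen on the network but absent from the inventory."""
    return sorted(set(discovered) - set(inventory))

inventory = {"10.0.0.5", "10.0.0.6", "10.0.0.7"}       # approved assets
discovered = {"10.0.0.5", "10.0.0.6", "10.0.0.9", "10.0.0.12"}  # scan results

rogue = find_unauthorized(discovered, inventory)
print(rogue)  # hosts that never went through the approval channel
```

Anything the scan turns up that is not in the approved list is exactly the kind of server the article says went unpatched and unmonitored.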



Reference: http://www.nextgov.com/nextgov/ng_20080904_3324.php

Data Remanence


The article linked below concerns a newer form of data remanence.  DRAM remanence, or dynamic random access memory remanence, is an up-and-coming way to access data, specifically encryption keys that may still be sitting in DRAM even after a reboot.  According to Michael Cobb, this problem has been around for some time, but because attackers rarely publicize their encounters, little is known about the level of sophistication of this type of attack.  What we do know is how to try to prevent it.  Cobb recommends first and foremost maintaining physical security, since a lax physical security posture is, as of now, how the attack is carried out.  The attacker selects a target host whose workstation has been left unlocked, hibernating, or in standby mode, then boots it from a USB stick or live CD, potentially compromising encryption keys or other valuable data stored in DRAM.  I found this article interesting because of how little is known about this sort of attack, and because the method of attack is not carried out via the web like the typical attacks we have studied this semester.
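One simplified way to see why keys left in DRAM are findable: cryptographic key material looks like high-entropy bytes amid mostly structured memory. The sliding-window entropy scan below is a toy illustration of that idea, not a real forensic tool; the "memory image" is fabricated.

```python
import math
import os

def shannon_entropy(chunk):
    """Bits of entropy per byte in the chunk."""
    counts = {}
    for b in chunk:
        counts[b] = counts.get(b, 0) + 1
    n = len(chunk)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def candidate_key_offsets(image, window=32, threshold=4.0):
    """Offsets of non-overlapping windows entropic enough to hold a key."""
    return [i for i in range(0, len(image) - window + 1, window)
            if shannon_entropy(image[i:i + window]) > threshold]

# Fake "dump": zero-filled memory with a random 32-byte "key" at offset 64.
image = bytes(64) + os.urandom(32) + bytes(64)
print(candidate_key_offsets(image))
```

Real tools are more clever (e.g., looking for AES key-schedule structure), but the entropy heuristic captures why a cold DRAM image gives an attacker so much to work with.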

Reference:  http://searchsecurity.techtarget.com.au/articles/25842-What-is-DRAM-remanence-and-can-it-undermine-your-encryption-key-management-strategies-?topic_id=178

Increased Security through Open Source



This is an argument between open source and closed source proprietary code, and the impact each has on security.  Why would open source operating systems be any safer than store-bought systems if the code is publicly available?  This article refers to Linus's Law and how it relates to open source.  I find the concept fascinating, yet comically obvious and common sense.  The Internet is a great example of Linus's Law in that the sheer number of people who use it makes it more valuable.  If there were only 100 people on the Internet, would it be such a useful tool?  I think the same respect should be shown toward open source.  Consider one of the development teams at a major software company: how much scrutiny can a finite team apply with respect to security and vulnerabilities?  Whereas if you have the whole world scrutinizing a code base for weaknesses, the output should be substantially different.  The concept of open source lends itself to the phrase “safer than store bought.”  Because there are vast numbers of talented individuals around the world willing to contribute to development, you end up with a much more stable and overall more secure system.  Linus's Law supports this theory.  Meticulous developers around the world are finding, fixing, and improving open source systems today; that will not change anytime soon, and proprietary operating systems will have to keep pace.



Reference:  Hoepman, J. and Jacobs, B., "Increased Security through Open Source", Communications of the ACM, Vol. 50, No. 1, (January 2007)

Slax

Slax, based on Slackware, is a live CD operating system that is not only small in size but also modularized, allowing the user near-limitless customization.  I chose Slax for its ease of use, its design, and most of all its customization capabilities.  Slax is versatile enough to run from a USB stick or from CD/DVD media.  Its user friendliness is so highly rated that this particular distribution is recommended to UNIX/Linux beginners, and its sleek, modern, catchy interface lets people of all ages enjoy using it.  With close to 1,000 different applications available to add to a custom ISO image, nearly anyone can go to the website, click and choose the applications they want, and voilà: a custom live CD with exactly the applications they have chosen.
  
With the appropriate applications installed, Slax can serve as a very useful tool.  With Wireshark, Nmap, ClamAV, and Firestarter installed, one can not only monitor network activity and run a system/software inventory, but also better protect against viruses and intrusions.  Slax provides preventative technical controls such as authorization (with the use of Nmap).  Slax also meets the detection and recovery technical control criteria through audit and virus detection and eradication controls.  In addition, Slax can help an organization meet supporting technical controls with respect to identification and security administration.
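As a toy illustration of what a tool like Nmap automates, here is a bare-bones TCP connect scan in Python. Real scanners are far faster and stealthier; the host and port list below are just examples.

```python
import socket

def scan_ports(host, ports):
    """Return the subset of ports accepting TCP connections on host."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.5)
            # connect_ex returns 0 when the connection succeeds,
            # i.e. something is listening on that port.
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Probe a few well-known ports on the local machine.
    print(scan_ports("127.0.0.1", [22, 80, 443]))
```

A full connect scan like this is noisy and easily logged, which is one reason real audits reach for Nmap's SYN scans and timing controls instead.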
  
With the ease of use, portability, variety of customization features, and security controls that Slax allows, its popularity continues to soar.  Organizations need to explore these benefits of open source systems such as Slax first hand.  Slax is continuously being reviewed and improved; its most recent update, version 6.1.2, was released August 4, 2009.


References: http://www.slax.org/
http://distrowatch.com/table.php?distribution=slax
http://www.geocities.com/slaxfansite/

Wireless Infidelity

What is War Driving?

War driving is a technique that uses a wireless card to sniff out 802.11 data transmissions.  It is a spin-off of war dialing, a similar technique for gaining access to a network by calling phone lines and recording which ones answer as modem lines.  Some of the software used in war driving is sophisticated enough to produce outputs such as SSIDs, device MAC addresses, channels, and signal strength.

What is Wired Equivalent Privacy (WEP)?  When did WEP become part of the 802.3 Standard?  When was it deprecated?

Wired Equivalent Privacy (WEP) is an encryption method used on wireless networks.  Due to weaknesses and flaws in WEP, newer 802.11 encryption methods such as WPA and WPA2 have since addressed some of its security shortcomings.  Some legacy devices are not capable of using newer forms of encryption such as WPA; however, according to Berghel, “The only thing worse than enabling WEP is not enabling WEP!”1  Even though the information security community does not consider WEP an adequate method of securing a wireless network, some encryption is better than none at all.  Note that the 802.3 standard refers to wired networks; WEP is an encryption method used in 802.11 wireless networks.  In 2004, the 802.11i standard was ratified, introducing WPA2, and WEP was deprecated by the computing community.
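One way to see WEP's weakness concretely: each frame's RC4 key is the shared secret plus a 24-bit initialization vector (IV), so IVs, and hence RC4 keystreams, repeat quickly on a busy network. The standard birthday approximation makes this easy to compute:

```python
import math

IV_SPACE = 2 ** 24  # WEP's 24-bit IV gives only 16,777,216 possible values

def iv_collision_probability(frames):
    """Birthday-bound probability that at least two of `frames`
    randomly chosen IVs collide (and thus reuse an RC4 keystream)."""
    return 1 - math.exp(-frames * (frames - 1) / (2 * IV_SPACE))

for n in (100, 1_000, 5_000, 40_000):
    print(f"{n:>6} frames: {iv_collision_probability(n):.4f}")
```

By roughly 5,000 frames, a few seconds of traffic on a loaded access point, the odds of a keystream reuse pass 50%, which is part of why WEP cracking became routine.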


What is the lesson to be learned from War Driving?

The lesson to be learned from war driving is that if we are using a wireless network with no encryption or only WEP encryption, the network and the devices on it are likely to be compromised.  The technology exists such that, with some easy-to-find open source software, one can gain access to just about any weakly encrypted or unencrypted network with little effort.  We need to be aware that wireless technology uses radio frequencies to transmit packets, and just as a scanner can listen in on cordless home phone calls, people with the right type of scanner (software) can intercept your data transmissions.


Reference:
Hal Berghel, “Wireless Infidelity I: War Driving”, Communications of the ACM Vol. 47, No. 9, (September 2004): p.24

Exploit of Wireless Communications


In the eyes of a criminal, no person, country, school, or organization is safe from the misuse of wireless networks.  I chose this article because it clearly illustrates the struggle between available technology and its misuse or non-use.  The persons credited with the bombings in Mumbai were brazen enough to send emails to local media sources claiming responsibility for the terrorist acts.  They accomplished this by gaining access to insecure or minimally secured Wi-Fi networks.  This article further proves that, because of readily available open source applications online, criminals are becoming more sophisticated in their attacks.  It also gives wireless network users even more reason to meet minimum security standards.  Leaving your wireless network unsecured, or minimally secured (WEP), can be detrimental to your safety as well as to the security of your personal information and its use, e.g., an attacker using your access point to distribute malicious emails or malware.


References: http://www.navhindtimes.com/story.php?story=2008110934

Live CD Distributions Survey

The table below compares three different live CD distributions: Slax 6.1.2, BackTrack 4, and NST (Network Security Toolkit) 2.11.0 minimal version.  Each of these distributions has its own pros and cons, but all of them can be useful to a security professional in a variety of scenarios.


 
Slax is a modularized distribution that comes in just under 200 MB for the basic release, giving the user flexibility and superb performance, along with an online ISO customization portal offering tons of applications across a variety of use categories.  Slax also has built-in network connection configuration (when hard wired) and automatic internal disk mounting.

BackTrack, on the other hand, needs a bit of network configuration to get up and running on the web, along with manual mounting of your hard disk.  BackTrack is considerably bigger in size, but with good reason: it ships over 300 security-oriented tools that allow a user to do everything from packet sniffing, protocol analysis, penetration testing, and digital forensics to VoIP analysis, spoofing, fingerprinting, and port scanning, just to name a few.

NST is smaller than BackTrack in size but contains many of the same applications.  While NST doesn't have a nifty GUI and is not customizable like Slax and BackTrack, security professionals should find NST useful in its own right.  If I absolutely had to choose one distribution out of the three, I would lean toward BackTrack.  It is one of the most comprehensive security-oriented distributions on the open source market today.  For me, BackTrack's graphical user interface provides better navigation and overall usability compared to NST's.





http://www.linux.com/archive/articles/127892
http://www.slax.org/
http://distrowatch.com/index.php?distribution=slax
http://www.slax.org/forum.php?action=view&parentID=25042
http://www.networksecuritytoolkit.org/nst/
http://wiki.networksecuritytoolkit.org/nstwiki/index.php/Main_Page
http://searchsecurity.techtarget.com/tip/0,289483,sid14_gci1301195,00.html?track=NL-422&ad=621619&asrc=EM_NLT_3145395
http://www.remote-exploit.org/backtrack.html


Saturday, September 19, 2009

Twitter and Facebook Victims of DDoS

This article illustrates how there is virtually no way to stop a DDoS (distributed denial of service) attack from affecting even the busiest websites in the world. I believe this article shows one great example of how vulnerable and weak the Internet security infrastructure really is across the board. The below diagram shows the taxonomy of how this attack was executed and how it left users' requests dead in the water. The attack was carried out by overloading the web servers with a flood of fake requests, rendering the websites useless. I think this article is important because of the widespread use of websites like Twitter and Facebook; it clearly shows the public that even social websites and online networking communities (where the attacker may have little to nothing to gain) can be easily exploited and taken down.
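A server-side defense has to distinguish a flood from normal traffic. The sketch below is a toy sliding-window counter per source address; the threshold, window, and request stream are all invented for illustration, and a real DDoS from thousands of sources would of course evade this per-IP view.

```python
from collections import defaultdict, deque

class FloodDetector:
    """Flag a source that exceeds `max_requests` within `window_seconds`."""

    def __init__(self, window_seconds=10, max_requests=100):
        self.window = window_seconds
        self.limit = max_requests
        self.history = defaultdict(deque)  # source IP -> request timestamps

    def record(self, source_ip, timestamp):
        """Record a request; return True if the source looks like a flood."""
        q = self.history[source_ip]
        q.append(timestamp)
        # Drop timestamps that have aged out of the window.
        while q and q[0] <= timestamp - self.window:
            q.popleft()
        return len(q) > self.limit

detector = FloodDetector(window_seconds=10, max_requests=100)
# 500 requests from one bot within a single second trips the detector:
flagged = [detector.record("203.0.113.7", t / 500) for t in range(500)]
print(flagged[-1])  # True
```

Real mitigations (upstream filtering, anycast, CDNs) work on the same principle of telling attack traffic apart from legitimate requests, just at vastly larger scale.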


Article Link:
http://www.wired.com/epicenter/2009/08/twitter-facebook-attacks-no-surprise-to-security-experts/

NIST 800-53, Chapters 2 & 3

Chapter 2 of NIST Special Publication 800-53 explains the elementary concepts behind the selection and specification of security controls for a given IT system. Topics covered in this chapter include: security control organization and structure, security control baselines, the identification and use of common security controls, security controls in external environments, security control assurance, and revisions and extensions to security controls. The below table from chapter 2 illustrates the different families of security controls along with their unique two-character identifiers (which correspond to the security control catalog in Appendix F of NIST 800-53). The class column refers to the three security control classes: management, operational, and technical.
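A few of those family-identifier pairings can be sketched as a lookup table. This list is partial and reconstructed from memory of the publication, so consult Table 2-1 and Appendix F for the authoritative catalog:

```python
# Partial sketch of NIST 800-53 control families: identifier -> (name, class).
# Reconstructed from memory; verify against the publication itself.
CONTROL_FAMILIES = {
    "AC": ("Access Control", "Technical"),
    "AT": ("Awareness and Training", "Operational"),
    "AU": ("Audit and Accountability", "Technical"),
    "CM": ("Configuration Management", "Operational"),
    "CP": ("Contingency Planning", "Operational"),
    "IA": ("Identification and Authentication", "Technical"),
    "IR": ("Incident Response", "Operational"),
    "PL": ("Planning", "Management"),
    "RA": ("Risk Assessment", "Management"),
    "SC": ("System and Communications Protection", "Technical"),
}

def controls_in_class(cls):
    """List the family names belonging to one control class."""
    return [name for name, c in CONTROL_FAMILIES.values() if c == cls]

print(controls_in_class("Management"))  # ['Planning', 'Risk Assessment']
```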


Chapter 3 of NIST Special Publication 800-53 takes it a step further and explains the specific processes involved in selecting and specifying security controls. Topics covered include: managing risk, security categorization, selecting and tailoring the initial baseline controls, supplementing the tailored baseline controls, and updating the security controls. Table 3-1 (shown below) is a graphical representation of the risk management framework security life cycle. It explicitly illustrates the flow of processes within the system security life cycle, along with the relevant NIST (and other standards, regulatory, or policy) documents that help guide the risk management process.

References:
http://csrc.nist.gov/publications/nistpubs/800-53-Rev2/sp800-53-rev2-final.pdf

NIST 800-14 Generally Accepted System Security Principles (GASSP)

In the early 90's, the National Performance Review, pursuant to and as part of the National Information Infrastructure, requested that NIST (the National Institute of Standards and Technology) develop a set of generally accepted system security principles and practices for the United States government. These principles and practices were created primarily with the government's information and data systems in mind. In 1991, rules and procedures were outlined in the National Research Council document titled Computers at Risk, and by 1992 several national and international entities had started implementing its recommendations. NIST 800-14 (GASSP) is based primarily on the OECD's Guidelines for the Security of Information Systems, created by a team of international experts in 1992. NIST built on and added to the OECD guidelines to provide a more refined and detailed set of generally accepted system security principles. The below table illustrates the principles and practices described in NIST Special Publication 800-14.


References: http://csrc.nist.gov/publications/nistpubs/800-14/800-14.pdf

Wednesday, September 16, 2009

RFC 1135 “The Helminthiasis of the Internet”

1) What was the cause of the first Internet Worm? In specific, what vulnerabilities did the worm take advantage of in order to spread through the Internet?

The cause of the first worm was “known security loopholes in applications closely related with the operating system” (1989, pg.1), specifically on VAX computers and SUN-3 workstations running the 4.2 and 4.3 Berkeley UNIX code. The two main vulnerabilities the worm exploited were basic network services: sendmail and fingerd. In the case of the sendmail service, the worm used a “non-standard debug command” (1989, pg.2) to propagate itself to other remote hosts, restarting the worm's self-replicating process. With the fingerd service, the worm triggered a buffer overflow by sending more characters than the service could handle, allowing the worm to execute an arbitrary program. Other vulnerabilities the worm took advantage of included password guessing and trusted-host features.

2) Are those vulnerabilities still present?

These particular vulnerabilities were addressed, and preventative measures were implemented to prevent future exploitation of these services. However, that is not to say another attack exploiting similar vulnerabilities cannot occur in the future. In this particular situation, the vulnerabilities might not have been noticed had it not been for the worm. While the worm was extremely destructive, it was also extremely eye-opening for the computing community about the weaknesses that needed to be addressed.

References:
http://tools.ietf.org/rfc/rfc1135.txt
http://www.faqs.org/rfcs/rfc1135.html

Friday, September 11, 2009

Local ATM's May Have Malware


This article illustrates the complexity of a well-developed malware application installed on ATMs around the world. I thought this article was significant because of the duration and maturity of this program. The level of sophistication it was able to achieve is remarkable, but at the same time very chilling. The article shows how people need to be careful not only while shopping online and using online banking, but also mindful about where they use their debit or ATM card. Staying diligent about reconciling bank accounts on a regular basis has obviously become even more essential. I chose this article because nearly everyone uses an ATM now and then, and in doing so we expose ourselves to yet more risk.

Article Link: http://www.wired.com/threatlevel/2009/06/new-atm-malware-captures-pins-and-cash/
Google Group:
http://groups.google.com/group/itec5321fall09/web/02_journalAssignment_Malware.pdf?hl=en

Thursday, September 10, 2009

NIST 800-30 Q&A


1) What is the purpose of NIST Special Publication 800-30?

The purpose of NIST 800-30 is to facilitate the risk management process. NIST 800-30 is a guide that assists entities in making better decisions about vulnerabilities in their IT systems.

2) What is the principal goal of an organization’s risk management process?

The principal goal of an organization’s risk management process according to NIST 800-30, “is to enable the organization to accomplish its missions” (NIST, 2002). Page 2, section 1.3 of NIST 800-30 recommends three practices of risk management to achieve company missions.

* Better securing the IT systems that store, process, or transmit organizational information

* Enabling management to make well-informed risk management decisions to justify the expenditures that are part of an IT budget

* Assisting management in authorizing (or accrediting) the IT systems on the basis of the supporting documentation resulting from the performance of risk management

3) According to NIST, what three processes compose risk management?

* Risk assessment
* Risk mitigation
* Evaluation assessment

4) How does risk management relate to the System Development Life Cycle (SDLC)?

The relationship between the SDLC and the risk management process is one of continuous, uninterrupted cooperation. “Effective risk management must be totally integrated into the SDLC” (2002, p.4). In addition, according to figure 2-1, each phase of the SDLC (initiation, development or acquisition, implementation, operation or maintenance, and disposal) should address risk management. Each phase of the SDLC comes with its own supporting risk management activities.

5) NIST 800-30 defines seven Information Assurance “key roles”. Name and briefly describe each of them.

Senior Management – These are the people responsible for making sure the organization meets its goals. They make sure that the project has the necessary resources and that those resources are properly utilized. Senior management is also responsible for evaluating and incorporating the results of risk assessment practices.

Chief Information Officer (CIO) – The CIO takes the input and results from the risk assessment processes and is responsible for IT planning, budgeting, execution, and performance.

System and Information Owners – This is the group responsible for ensuring the integrity, confidentiality, and availability of the project's systems and data.

Business and Functional Managers – These managers are fundamentally responsible for the “operations and IT procurement processes” (2002, p.6). Business managers have the authority to make “trade-off” decisions that are imperative to the goals of the project.

ISSO – This manager is responsible for executing the “organizations' security programs, including risk management” (2002, p.6). These professionals directly support senior and executive management, and lead the project in identifying, evaluating, and mitigating risks to its IT systems.

IT Security Practitioners – These are the people that are actually performing the major systems jobs including: “network, system, application, and database administrators; computer specialists; security analysts; security consultants” (2002, p.6)

Security Awareness Trainers (Security/Subject Matter Professionals) – This group of individuals is responsible for providing security awareness training to the IT systems user population.

6) How does NIST 800-30 define risk?

According to NIST 800-30, “Risk is the net negative impact of the exercise of a vulnerability, considering both the probability and the impact of occurrence” (2002, p.2). More specifically, “Risk is a function of the likelihood of a given threat-source’s exercising a particular potential vulnerability, and the resulting impact of that adverse event on the organization” (2002, p.8)

7) How does NIST 800-30 define a threat?

NIST 800-30 defines a threat as, “the potential for a particular threat-source to successfully exercise a particular vulnerability” (2002, p.12).

8) How is a threat source defined? Name three common threat sources.

A threat source is, “any circumstance or event with the potential to cause harm to an IT system” (2002, p.13). Three common threat sources include:

* Natural threats – hurricanes, tornadoes, earthquakes, floods, wild fires

* Human threats – intentional and non-intentional acts (e.g., typo, spilled coffee, humidity and HVAC issues, hacking)

* Environmental threats – power loss and pollution

9) How does NIST 800-30 define vulnerability?

NIST 800-30 defines a vulnerability as, “A flaw or weakness in system security procedures, design, implementation, or internal controls that could be exercised (accidentally triggered or intentionally exploited) and result in a security breach or a violation of the system’s security policy” (2002, p.15).

10) According to NIST, whose responsibility is IT Security? (technical or management)

IT security is the responsibility of nearly every major functional department in an organization. Mainly, IT security responsibility lies with management, operational, and technical areas within an organization.

11) What is a security control?

A security control is a mechanism put in place to try to mitigate risk. Security controls can be broken into two broad categories: technical controls and nontechnical (management and operational) controls.

12) Define: technical controls, management controls, and operational controls.

Technical controls consist of, “safeguards that are incorporated into computer hardware, software, or firmware (e.g., access control mechanisms, identification and authentication mechanisms, encryption methods, intrusion detection software)” (2002, p.20). Nontechnical controls, “are management and operational controls, such as security policies; operational procedures; and personnel, physical, and environmental security” (2002, p.20). “Management security controls, in conjunction with technical and operational controls, are implemented to manage and reduce the risk of loss and to protect an organization’s mission. Management controls focus on the stipulation of information protection policy, guidelines, and standards, which are carried out through operational procedures to fulfill the organization’s goals and missions” (2002, p.35). Operational controls are, “a set of controls and guidelines to ensure that security procedures governing the use of the organization’s IT assets and resources are properly enforced and implemented in accordance with the organization’s goals and mission” (2002, p.36).

13) How should the adverse impact of a security event be described?

“The adverse impact of a security event can be described in terms of loss or degradation of any, or a combination of any, of the following three security goals: integrity, availability, and confidentiality” (2002, p.22).

14) Describe the difference between quantitative and qualitative assessment?

A qualitative assessment ranks risks on relative scales (e.g., high, medium, low), which helps prioritize the vulnerabilities needing immediate attention. A quantitative assessment assigns numeric, usually monetary, values to impacts and likelihoods, which supports cost-benefit analysis of the recommended controls.
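The quantitative style can be sketched as a toy risk-level matrix in the spirit of 800-30's: multiply a likelihood weight by an impact score and bucket the product. The weights and thresholds below are illustrative, not quoted from the publication.

```python
# Hypothetical likelihood weights and impact scores for illustration.
LIKELIHOOD = {"High": 1.0, "Medium": 0.5, "Low": 0.1}
IMPACT = {"High": 100, "Medium": 50, "Low": 10}

def risk_level(likelihood, impact):
    """Bucket the likelihood-weighted impact score into a risk level."""
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score > 50:
        return "High"
    if score > 10:
        return "Medium"
    return "Low"

print(risk_level("High", "High"))    # High (score 100)
print(risk_level("Medium", "High"))  # Medium (score 50)
print(risk_level("Low", "Low"))      # Low (score 1)
```

The point of the exercise is not the particular numbers but that a quantitative approach makes the ranking reproducible and comparable across systems.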

15) Name and describe six risk mitigation options.

According to section 4.1, pg. 27, six risk mitigation options include:

Risk Assumption - to accept the potential risk and continue operating the IT system or to implement controls to lower the risk to an acceptable level

Risk Avoidance - to avoid the risk by eliminating the risk cause and/or consequence (e.g., forgo certain functions of the system or shut down the system when risks are identified)

Risk Limitation - to limit the risk by implementing controls that minimize the adverse impact of a threat’s exercising a vulnerability (e.g., use of supporting, preventive, detective controls)

Risk Planning - to manage risk by developing a risk mitigation plan that prioritizes, implements, and maintains controls

Research and Acknowledgment - to lower the risk of loss by acknowledging the vulnerability or flaw and researching controls to correct the vulnerability

Risk Transference - to transfer the risk by using other options to compensate for the loss, such as purchasing insurance.

16) What is residual risk?

Residual risk is simply the risk that remains after new or enhanced controls have been implemented through the risk management process. Since it is virtually impossible to eliminate all risk, some risk is always present, and whatever is left over after the new controls take effect is considered residual.

References: http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf

Google Group: http://itec5321fall09.googlegroups.com/web/02_classProject_NIST_800-30_Q%26A.pdf?gda=-6J3kFgAAADGz-IK4zHXi8PRhPCSPfCHRz6XUti8QGzduyUGse7EAGwUKRPiCTMMLq-0mwsOmVRd9roYZOCyGdt0OSpDR2YapsMJpo2PJSxJJuRrHXFqXxo1YHcDYvgcK1MwRk9oTs4&gsc=Wok62gsAAAB3XudsBUvSc6wDEDB1_5CO&hl=en

Saturday, September 05, 2009

Information Assurance Model

The IA model is a tool that shows the working dynamics of how an information assurance program should be approached. It encapsulates the different security functions that any organization should possess. The left face of the cube shows the information states: the states information can be in (transmission, storage, or processing), which are the different times the data may be vulnerable. The top layer of the cube contains the security services that an effective IA program should provide: availability (making information promptly and reliably accessible to users), integrity (data, hardware, and security mechanism integrity), authentication (the right users accessing the right data), confidentiality (the assurance that personal and sensitive data, financial, medical, etc., is kept safe), and non-repudiation (confirmation that the parties to a data transmission [sender, receiver] can account for their transaction). Lastly, the right face of the cube displays the security countermeasures that should be in place in an efficient IA program: training and education of people (employees or parties involved), policies and procedures, and technology such as surveillance, communications, hardware, and software.

Google Group: http://groups.google.com/group/itec5321fall09/web/Outside_Reading_Assignment_01_IA_Model.doc?hl=en

ID Theft Incident


“Court Allows Woman to Sue Bank for Lax Security After $26,000 Stolen by Hacker” shows how something as common as a bank account can turn into a major disaster in an instant. I chose this article for two reasons. First, because of its close ties to a similar experience I faced, not nearly on this scale, though still substantial. Second, because this could affect anyone with an online banking account at any time. The article also exposes obvious flaws in the security practices surrounding online banking. Regardless of the semantics, there was an apparent failure in security.

I think the lesson in this article is that the more consumers rely on technology for online financial convenience, the more prone they are to becoming victims of identity theft. This is a prime example of the never-ending battle of good versus evil, hackers versus security professionals. The Internet has become the world's preferred method of communication, and with that comes risk. Online banking has become a necessary evil.

Article Link:
http://www.wired.com/threatlevel/2009/09/citizens-financial-sued/#comments


Google Group:
http://groups.google.com/group/itec5321fall09/web/Journal_Assignment_01_ID_Theft.doc?hl=en




Friday, September 04, 2009

Live CD


Have you ever found yourself sitting in front of your PC staring at the notorious blue screen? Maybe you have an extra system around the house you want guests to use instead of your personal PC. What if, over a long holiday weekend, you happen to forget your password after a recent reset? All of these scenarios can be addressed with a live CD. A live CD is a convenient way to run an operating system on a computer without using the hard disk drive. A vast array of operating systems are available as live disks, including Ubuntu, BackTrack, Knoppix, Windows PE, Fedora, Archie, Klax, and Clusterix, just to name a few. Most live disks are some variant of, but not limited to, a Linux/UNIX distribution. Depending on the end usage, they have evolved over time to come with a variety of applications across the spectrum of disciplines and areas of interest. Live disks can serve many other functions; some of the security-specific ones include: network sniffing, file integrity checking, security testing, network discovery, network port and service identification, vulnerability scanning, wireless scanning, password cracking, remote access testing, and penetration testing.

Live CDs are extremely important from a security perspective for a number of reasons. The first situation might be a primary educational institution or municipal organization where the administration wants to limit certain user abilities, such as installation privileges or even write access to the internal hard disk or secondary memory, given that a designated, monitored hard drive is available for saving data. Utilizing live CD technology in this situation would not only reduce virus and malware infections but also give users a better performing PC. Another scenario is a corporate security professional at a Fortune 500 company using live disks to analyze the company's network and mitigate risks, vulnerabilities, and their exploitation. Running security tools from live disk distributions like BackTrack and Knoppix STD allows a security professional to run different tests on various components and hardware, on nearly any platform, and make informed decisions based on the output of the utilities. Lastly, a scenario more pertinent to the everyday PC user: ordinary household network security. With technology and computational performance changing on what seems to be a daily basis, rather than the 18-month estimate proposed by Moore and colleagues, users need to be aware of potential weaknesses in their own home networks and take appropriate action by taking advantage of live disk technology. A lack of knowledge of security risks, along with frivolous behavior while surfing the web, can be to your detriment.

The beauty of live disks is their simplicity. The media they run from (CD, DVD, or USB) are ubiquitous, inexpensive, and easily replaceable. You can take your live disk to nearly any machine and run your preferred OS and utilities with no worries about corrupting existing files or applications, and with the peace of mind that you are safe from the viruses and malicious software that would normally plague a conventional desktop OS. To take it a step further, some live disk distributions allow customization of the installed utilities and custom scripting to meet the needs of your specific application.


References:
http://csrc.nist.gov/publications/nistpubs/800-115/SP800-115.pdf

http://www.nu2.nu/pebuilder/

http://www.computerweekly.com/Articles/2006/09/05/218285/protecting-wireless-networks-step-1.htm

http://lifehacker.com/232963/screenshot-tour-how-to-crack-a-windows-password-with-ophcrack-live-cd

ftp://download.intel.com/museum/Moores_Law/Video-Transcripts/Excepts_A_Conversation_with_Gordon_Moore.pdf

Google Group:
http://groups.google.com/group/itec5321fall09/web/Portfolio_Assignment_01_Live_CD.doc?hl=en