
This continues earlier work. The assignment: this should be your 'proposal ready' review of the literature, defining why you think the study is important and defending the choices you have made (or intend to make) in the design and analysis of your data. A key component of a research proposal is a review of the literature: you need to establish what is known and what is unknown about your chosen topic. This review, along with citations of the literature related to your methods, is required. Your personal search, reading, and analysis of the current literature are demonstrated through the literature review section of your final proposal.

You are to submit a publication-ready draft of the literature review section as an assignment. The assignment should be of adequate length to demonstrate your critical analysis and synthesis (conclusions) of a minimum of 10 references, with an additional attached page of references. It should be written as a cohesive paper that integrates and discusses the relevant literature you have collected and reviewed. This component should be in APA style and 'publication ready' at the time of submission. All scores applied per the rubric are considered final for this assignment. I have attached some examples.


Literature Review: Emergency Preparedness in Families





Review of the Literature
Our military and other government components, such as the Department of Defense, develop technology to keep United States citizens safe by improving on existing methods that identify and stop combatants. One method used to identify a suspect is recovering DNA. When an improvised explosive device (IED) is detonated, bomb fragments are recovered and examined in an attempt to determine the identity of the bomb maker. Iris, facial, and vascular recognition are other biometric tools used to identify our adversaries. Linking forensic functions with biometric capabilities is a relatively new form of technology and is discussed in the literature presented.
According to a study by Chirchi, Waghmare, and Chirchi (2011), choosing the proper biometric tool to fit a specific situation requires knowledge of technological developments. One such development is the iris scan. Found to be a reliable form of authentication, iris identification has been evolved by the military into a portable tool for the battlefield. The Biometric Automated Toolset (BAT) is the primary system used by U.S. Central Command to store biometric data such as iris scans (D'Agostino, 2008). The iris scan is a unique form of identification: no two irises are the same, and the only iris characteristic dependent on genetics is its pigmentation (Chirchi, Waghmare, & Chirchi, 2011).
A less developed, though no less reliable, form of biometric identification is facial recognition. It uses automated methods to verify the identity of a person based on physiological characteristics. Tolba, El-Baz, and El-Harby (2011) describe facial recognition as a way to detect facial patterns, even in a crowded scene, using classification algorithms. A computer algorithm "normalizes" the biometric signature so that it is in the same format as the signatures in the system's database (Tolba et al., 2011). Facial recognition is seen as a convenient biometric tool because it is both machine-readable and human-readable. The ubiquity of surveillance cameras means that, in a sense, a face can leave a trace and therefore be useful forensically, as are DNA and fingerprints (DOD, 2007).
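The normalize-then-match step described above can be illustrated with a minimal sketch. The specific choices here (unit-length scaling and cosine similarity over toy feature vectors, plus the names `normalize`, `best_match`, and the sample database) are illustrative assumptions, not the method used by any system discussed in the literature:

```python
import math

def normalize(signature):
    # Scale a raw feature vector to unit length so every signature
    # shares a common format (illustrative stand-in for the
    # "normalization" step described by Tolba et al., 2011).
    norm = math.sqrt(sum(x * x for x in signature))
    return [x / norm for x in signature]

def best_match(probe, database):
    # Return the enrolled identity whose normalized signature is
    # closest to the probe (highest cosine similarity).
    p = normalize(probe)
    def similarity(entry):
        _name, sig = entry
        s = normalize(sig)
        return sum(a * b for a, b in zip(p, s))
    return max(database.items(), key=similarity)[0]

# Hypothetical enrolled signatures.
db = {"subject_a": [0.9, 0.1, 0.2], "subject_b": [0.1, 0.8, 0.5]}
print(best_match([1.8, 0.2, 0.4], db))  # probe points the same way as subject_a
```

Because both probe and stored signatures are reduced to the same unit-length format first, differences in camera scale or exposure do not change the comparison.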
A significant tool in biometric identification is DNA analysis, particularly when DNA is recovered alongside fingerprints. Esslinger, Siegel, Spillane, and Stallworth (2004) researched using short tandem repeat (STR) analysis to detect human DNA on exploded pipe bomb devices. The amount of DNA left on the components correlated with the material the pipe was made of (PVC vs. steel), the fragmentation pattern, and whether low or high explosives were used. One issue I noticed, briefly mentioned in the article, concerned the materials the pipes were made of. Steel is known to conduct heat better than PVC; the theory was that since steel generates more heat during an explosion, the chance of DNA degradation would increase. However, since steel is more durable than PVC, the percentage of larger fragments should increase, and the more fragments, the more DNA that could be collected. The data from the experiment showed that the steel and PVC pipes had similar success rates for DNA recovery.
Foran, Gehring, and Stallworth's (2009) research included the recovery and analysis of mitochondrial DNA (mtDNA) from exploded pipe bombs. The important difference from STR analysis is that mtDNA analysis allows DNA extracted from hair, fingernails, and bone to be examined when nuclear DNA cannot be recovered. Another significant difference is that mtDNA reference samples can be obtained not only from the subject but also from related family members. The article discussed the materials and methods used in the test, as well as the resulting bomb fragmentation and its correlation with the quality and quantity of DNA recovered. The results of the study showed the value of mtDNA analysis in identifying the makers of various detonated IEDs.
Recovering fingerprints and other sources of DNA from various surfaces is not always textbook. Elements such as temperature, humidity, moisture, and the surface material all affect the quality of recoverable DNA. Shalhoub et al. (2008) researched a fast-curing silicone casting material (Isomark) as an effective method for obtaining a reliable DNA profile from casts of fingerprints. Participants were asked to handle six surfaces of various textures. This study is significant because various items used in IEDs serve as projectiles. The Army field manual FM 3-34.119 (2005) describes various casings used, such as pipes, soda cans, and metal containers, all of which turn into projectiles when detonated. Once recovered, contents such as marbles, nails, rocks, and glass can all be examined for DNA. Through their research, Shalhoub et al. (2008) concluded that it was possible to recover DNA from Isomark casts made on all substrates tested. However, no link was noted between the quality of the finger marks obtained and the amount of DNA extracted from them (Shalhoub et al., 2008).
Although the research raised additional technological questions, the study summaries reported favorable results for recovering DNA from bomb components, leading toward identification of the bomb maker. Biometric tools such as iris scanning, facial recognition, and fingerprinting are valuable components in identifying our adversaries and using that intelligence to mitigate future attacks.
References
Chirchi, V., Waghmare, L. M., & Chirchi, E. R. (2011). Iris biometric recognition for person identification in security systems. International Journal of Computer Applications, 24(9).
D'Agostino, D. (2008). Defense management: DoD can establish more guidance for biometrics.
Department of Defense. (2007). Report of the Defense Science Board task force on defense biometrics.
Department of Defense. (2009). Biometrics task force annual report FY09.
Esslinger, K., Siegel, J., Spillane, H., & Stallworth, S. (2004). Using STR analysis to detect human DNA from exploded pipe bomb devices. Journal of Forensic Sciences, 49(3).
Federal Bureau of Investigation. (n.d.). Terrorist Explosive Device Analytical Center (TEDAC).
Foran, D., Gehring, M., & Stallworth, S. (2009). The recovery and analysis of mitochondrial DNA from exploded pipe bombs. Journal of Forensic Sciences, 54(1).
Makarski, R., & Marrero, J. (2002). A surveillance society and the conflict state: Leveraging ubiquitous surveillance and biometrics technology to improve homeland security.
National Science and Technology Council. (2008). Biometrics in government post-9/11.
Shalhoub, R., Quinones, I., Ames, C., Multaney, B., Curtis, S., Seeboruth, H., . . . Daniel, B. (2008). The recovery of latent fingermarks and DNA using a silicone-based casting material. Forensic Science International, 178, 190-203.
Tolba, A. S., El-Baz, A. H., & El-Harby, A. A. (2011). Face recognition: A literature review. International Journal of Signal Processing, 2(2).
United States Army. (n.d.). Unexploded ordnance and improvised explosive devices. In FM 3-21.75 (Chapter 15).
Literature Review for Risk Perception
Seanan Donovan
Review of Literature
Decision making is arguably the most important element of human cognition. The processes that aid decision making occur so rapidly that people often fail to recognize them (Gilbert, 2006). Because the cognitive processes that aid intuitive risk assessment have generally served humanity well for so long, recognition and identification of those processes can seem purely academic. Gilbert (2011) argues that these cognitive processes were forged in a much different world than our current one, and although moral heuristics often lead to accurate assessments, research suggests this may not always be the case. Several decades of research have contributed to our understanding of the decision process and helped reveal many situations in which heuristics fail us.
Although a plethora of research has identified several decision fallacies, very little has been done to improve the efforts of Disaster Managers and public health experts regarding risk behavior. Pidgeon (1998) observed that a layperson's risk assessments are influenced by the level of dread an event provokes and, to a lesser extent, the level of professional disagreement. Kreuter and Strecher (1995) found that individuals overestimate their ability to survive disasters and underestimate their peers' survivability. Allowing policies to be formed on the basis of visceral factors such as dread or optimistic biases has led to over-funding of projects for hazards that elicit fear and under-funding of risk mitigation programs for controllable events that people feel they could safely avoid (Gilbert, 2011; Sjoberg, 1998).
This study aims to apply lessons learned regarding the framing fallacy and other biases in order to test their efficacy in the field of disaster management. Using lessons drawn from heuristics theory, this study attempts to eliminate all possible internal and external influences in order to gain insight into the public's risk mitigation needs. The challenges Disaster Managers have faced in previous efforts to implement risk perception theories in their field arise from the complexity of the findings.
Many factors appear to influence decision making, including individual characteristics such as age, gender, profession, economic status, and religious beliefs (Anderson & Lundborg, 2007; Archer, Burkle, & Smith, 2010). Characteristics of the event, such as the distribution of risk (how many people are affected in a single event), the perceived possibility of avoidance, voluntariness (as in skydiving), and whether the event is caused by an agent (as in terrorism) or an object (such as an earthquake), all influence risk perception and thus behavior (Douglas & Wildavsky, 1982; Gutscher & Siegrist, 2008; Kazan & Scott, 2008). Furthermore, experience appears to temper a layperson's poor judgment (Keller, Siegrist, & Wang, 2009). These factors and more influence the success of disaster communication and have thus far prevented public health professionals from implementing lessons learned from risk perception research.
This literature review will provide an overview of risk perception research, followed by evidence supporting (a) gaps between professionals and laypeople in risk perception, (b) dual-process cognition that uses heuristics in order to simplify complex events, (c) biases formed by heuristics and their negative consequences, and (d) how some of these biases can be used to guide the public and policy-makers during risk mitigation decisions.
The Gap in Risk Perception
Much research demonstrates the difference in risk perception between laypeople and professionals (Archer, Burkle, & Smith, 2010; Sjoberg, 1998). Professionals tend to base risk perception on probability of occurrence and the number of fatalities an event accumulates in a year (Sjoberg, 1998). Conversely, laypeople rely on other factors, such as dread and the distribution of harm, to determine their tolerable level of risk (Pidgeon, 1998). Many researchers believe that these extra variables should be factored into policy-making and government spending (Pidgeon, 1998). Decision-makers adopt a utilitarian approach when forming policies, in that they attempt to do the greatest good for the greatest number of people while maintaining an acceptable level of fairness. In a hypothetical situation where people are asked the maximum number of innocent lives they would be willing to risk in order to feel safe, the answer, I am willing to assume, is zero. Therefore, factors such as dread or distribution of risk should not factor into decision making, and the goal should always be to do the greatest good for the greatest number of people. Instead of factoring these external variables into policy-making, efforts should be made to navigate around the layperson's intuitions. Since knowledge of a specific event appears to be the factor creating the gap between professionals and laypeople (being a professional in terrorism does not diminish the risk perception gap in other disaster fields), efforts to educate the public should be included in policies (Slovic, 1987).
Dual-Processing Cognition
Dual-processing theory is part of the broader theory of heuristics (Sunstein, 2005). Heuristics theory states that humans create frames, or schemata, in order to simplify events and guide decision making (Sunstein, 2005). George Miller's classic research on working memory was the impetus for heuristics theory (Reyna, 2004). The thought behind this connection was that decision making involves many factors and occurs rapidly, yet Miller demonstrated that working memory on average stores only seven items (Reyna, 2004). Heuristics was an elegant theory but lacked practical applications, since research demonstrated that reasoning is independent of remembering (Douglas & Wildavsky, 1982; Reyna, 2004).
Dual-processing theory accounts for this reasoning-remembering independence while maintaining that heuristics make rapid and efficient decision making possible (Reyna, 2004). The theory suggests that humans have two types of memory recall (Reyna, 2004). Reyna (2004) calls this the fuzzy-trace system and labels the two types of memory verbatim and gist. Verbatim recall involves conjuring up details of an event and is processed through working memory, as distracting memory tests have been shown to affect this type of recall (Sunstein, 2005).
Gist memories are fuzzier (hence fuzzy-trace theory), involve emotional and moral storage, and are used to guide decision making (Reyna, 2004). When people must solve a problem, they compare the problem's facts to several gist representations involving the same or similar facts, then choose the principle that produced the greatest number of 'wins' in the past (Reyna, 2004). This is similar to how a chess master decides on the next move. The last part of the process is important, as it can lead to the availability fallacy discussed in the next section.
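The "greatest number of wins" selection step can be sketched as a simple tally. This is only an illustrative reading of Reyna's (2004) description; the principle labels and the win history below are invented for the example:

```python
from collections import Counter

# Hypothetical record of which gist principle "won" (led to a good
# outcome) each time a similar problem was faced in the past.
past_wins = [
    "save_lives", "loss_aversion", "save_lives",
    "save_lives", "loss_aversion",
]

def choose_gist(history):
    # Pick the gist principle with the greatest number of past wins,
    # as in the fuzzy-trace account sketched above.
    return Counter(history).most_common(1)[0][0]

print(choose_gist(past_wins))  # prints "save_lives" (3 wins vs. 2)
```

Note that such a tally is exactly what makes the process vulnerable to the availability fallacy: principles that come to mind more often accumulate more "wins" regardless of their actual fit to the current problem.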
It cannot be overstated that this process evolved for a reason. Using heuristics to guide decision making allows rapid and often accurate responses to complex problems. People learned that betrayal is bad and should elicit greater outrage than harm from a stranger; that people should never buy their way out of a crime; and that we should always aim to save lives or do everything we can to avoid loss (Slovic, 1987). The danger lies in how fast and naturally this decision process works, escaping our notice and thus creating an illusion of accuracy (Gilbert, 2006). The system of comparing events with stored problem sets, for example, runs the risk of being influenced by framing. Saving 200 lives out of 600 invokes the saving-lives heuristic, but losing 400 lives out of 600 recalls the loss-aversion heuristic (Sunstein, 2005). Although the two framings describe the same risk, people are more likely to choose a policy that will save the 200 lives over the policy that risks losing 400 lives (Sunstein, 2005).
Cognitive Biases Formed by Heuristics
Our brain's ability to create schemata to aid rapid assessment and decision making has been a large factor in the success of humans. The previously mentioned belief that one should not pay one's way out of a crime will lead to more successful than unsuccessful decisions (Sunstein, 2005). When policy-makers suggested emissions trading as a way to lower overall pollution, opponents protested due to this 'paying for crime' heuristic (Sunstein, 2005). There is no societal benefit to rape, murder, or abuse, so no amount of trade would be worth their acceptance. With pollution, however, economies thrive, technology is produced, and transportation is available to everyone. Although polluting above the available allotment may be a crime, models that allow for trade-offs result in overall lower emission levels; yet, due to this heuristic, people in favor of saving the environment oppose their enactment (Sunstein, 2005). Several known heuristics are also applicable to disaster management.
Moral Framing
There is a famous scenario that has been performed in various forms many times over the past several decades. The scenario involves some type of hazard (say, an emerging infectious disease) that is guaranteed to kill 600 people (Sunstein, 2005). Subjects are given two treatment options and asked to choose one. Treatment A will save 200 lives, while treatment B has a one-third probability that everyone will be saved and a two-thirds probability that nobody will be saved (Sunstein, 2005). For the most part, people tended to play it safe and chose treatment A (Sunstein, 2005). After the subjects decided between treatments A and B, two more treatments were offered. With treatment C, 400 people will die, and with treatment D there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die (Sunstein, 2005). Despite treatments C and D being reworded versions of treatments A and B, people seemed more willing to risk everyone's life and chose treatment D (Sunstein, 2005). Knowing that people will take greater risks to avoid losses than to secure gains can have powerful implications when communicating plans in disaster planning.
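The equivalence of the four treatments can be confirmed with a few lines of expected-value arithmetic; the variable names below are simply labels for this sketch:

```python
POPULATION = 600

# Expected number of survivors under each treatment in the classic
# framing scenario described above (Sunstein, 2005).
treatment_a = 200                                   # 200 saved for certain
treatment_b = (1 / 3) * POPULATION + (2 / 3) * 0    # 1/3 chance all are saved
treatment_c = POPULATION - 400                      # 400 die for certain
treatment_d = (1 / 3) * POPULATION + (2 / 3) * 0    # 1/3 chance nobody dies

# All four expected values are (approximately) 200 survivors, so any
# preference reversal between A/B and C/D reflects framing, not risk.
print(treatment_a, treatment_b, treatment_c, treatment_d)
```

Since every option carries the same expected outcome, the only thing that differs between the A/B pair and the C/D pair is whether the result is framed as lives saved or lives lost.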
Optimistic Bias
Another fallacy that leads people to inaccurately assess their level of risk is the optimistic
bias. Kreuter and Strecher (1995) showed that people are more likely to underestimate their level
of risk if they perceived the event to be avoidable. This is because humans on average believe
that they are anything but average (which of course is statistically impossible) (Gilbert, 2006).
The only thing unique about an individual is their personal perspective, which allows for
rationalization through assessing both internal and external factors (Gilbert, 2006). This
egocentrism also prevents people from considering other peoples internal factors and thus they
overestimate their peer’s susceptibility to avoidable risks (Kreuter & Strecher, 1995).
Understandin …
