How to Write an Essay
Monday, August 24, 2020
Silicon Microchips Essays - Semiconductor Device Fabrication
Silicon Microchips

Silicon is the raw material most often used in integrated circuit (IC) fabrication. It is the second most abundant substance on the earth. It is extracted from rocks and common beach sand and put through an exhaustive purification process. In this form, silicon is the purest industrial substance that man produces, with impurities of less than one part in a billion. That is the equivalent of one tennis ball in a string of golf balls stretching from the earth to the moon.

Semiconductors are typically materials with energy band gaps smaller than 2 eV. An important property of semiconductors is the ability to change their resistivity over several orders of magnitude by doping. Semiconductors have electrical resistivities between 10^-5 and 10^7 ohms, and they can be crystalline or amorphous. Elemental semiconductors are single-element semiconductor materials such as silicon or germanium. Silicon is the most widely used semiconductor material today. It is used for diodes, transistors, integrated circuits, memories, infrared detection and lenses, light-emitting diodes (LEDs), photosensors, strain gauges, solar cells, charge transfer devices, radiation detectors and a variety of other devices. Silicon belongs to group IV of the periodic table. It is a grey, brittle material with a diamond cubic structure. Silicon is routinely doped with phosphorus, arsenic and antimony donors and with boron, aluminium and gallium acceptors. The energy gap of silicon is 1.1 eV; this value permits silicon semiconductor devices to operate at higher temperatures than germanium devices.

A brief history of the development of electronics helps explain more about semiconductors and the silicon chip. In the early 1900s, before integrated circuits and silicon chips were invented, computers and radios were made with vacuum tubes. The vacuum tube was invented in 1906 by Dr. Lee DeForest. Throughout the first half of the twentieth century, vacuum tubes were used to conduct, modulate and amplify electrical signals. They made possible a variety of new products, including the radio and the computer. However, vacuum tubes had some inherent problems: they were bulky, fragile and expensive, consumed a great deal of power, took time to warm up, got hot, and eventually burned out. The first digital computer contained 18,000 vacuum tubes, weighed 50 tons, and required 140 kilowatts of power.

By the 1930s, researchers at the Bell Telephone Laboratories were looking for a replacement for the vacuum tube. They began investigating the electrical properties of semiconductors, non-metallic substances such as silicon that are neither conductors of electricity, like metal, nor insulators, like wood, but whose electrical properties lie between these extremes. By 1947 the transistor had been invented. The Bell Labs research team sought a way of directly altering the electrical properties of semiconductor material. They learned they could change and control these properties by doping the semiconductor, that is, infusing it with selected elements heated to a gaseous phase. When the semiconductor was also heated, atoms from the gases would permeate it and alter its pure crystal structure by displacing some atoms.
Because these dopant atoms had a different number of electrons than the semiconductor atoms, they formed conductive paths. If the dopant atoms had more electrons than the semiconductor atoms, the doped regions were called n-type, to signify an excess of negative charge. Fewer electrons, or an excess of positive charge, created p-type regions. By allowing this doping to occur in carefully delineated areas on the surface of the semiconductor, p-type regions could be created within n-type regions, and vice versa. The transistor was much smaller than the vacuum tube, did not get very hot, and did not require a heated filament that would eventually burn out.

By the mid 1950s, the first commercial transistors were being shipped. However, research continued. Scientists began to think that if one transistor could be built within one solid piece of semiconductor material, why not several transistors, or even an entire circuit? Within a few years this idea became reality, and in 1958 integrated circuits were invented. These integrated circuits (ICs) reduced the number of electrical interconnections required in a piece of electronic equipment, thereby increasing reliability and speed. By contrast, the first digital electronic computer, built with 18,000 vacuum tubes, weighed 50 tons, cost around $1 million, required 140 kilowatts of power, and filled an entire room. Today a complete computer, fabricated within a single piece of silicon the size of a child's fingernail, costs only about $10.00. Next is the method by which integrated circuits and the silicon chip are formed. Before the IC is actually made, a large-scale drawing...
Saturday, August 22, 2020
Diversity in instructional methods toward meaningful learning Essay
Abstract

There is diversity in the instructional methods that teachers can use to accomplish meaningful learning. This paper discusses five of them, namely the integrated inquiry approach, the 5-E model of inquiry, the jigsaw approach, role-playing and WebQuest. These instructional methods are learner-centred strategies that take account of prior knowledge, attitudes and skills, promote the development of new knowledge and relate it to a variety of contexts. Each of them also deals with real-life situations that develop, among other things, interpersonal relationships, problem-solving skills and content knowledge. The teachers' task is to design and carry out the instructional designs effectively so that meaningful learning takes place across diverse learners, instructional methods and learning environments.

Diversity in Instructional Methods Toward Meaningful Learning

Diversity is a basic feature of success in all endeavours in life, including education. There are as many kinds of learners as there are teachers, instructional methods and learning environments; yet there is only one goal in education, and that is effective and meaningful learning. Teachers should set conditions for students so that they can think critically and independently and relate newly learned knowledge to a variety of contexts for meaningful learning. It is the task of the teachers to match the learners, the learning environments, the knowledge to be learned and the instructional methods. Learning meaningfully means that learners relate new knowledge to what they already know. Meaningful learning is a non-arbitrary, non-verbatim, substantive and intentional effort to link new knowledge with higher-order concepts in cognitive structures. It is learning related to experiences with events or objects and an affective commitment to relate new knowledge to prior learning. Diverse instructional designs aimed at meaningful learning should identify outcomes, guide the development of instructional content and establish its effectiveness.

Efforts to consider meaningful learning in the different stages of instructional design are essential. Gagne et al. (1992) identified the different stages of instructional design as: defining instructional goals; conducting instructional analysis; identifying entry behaviours and learner characteristics; developing performance objectives; selecting an instructional method; assembling instructional materials; and planning formative and summative evaluations. He and his colleagues also noted that current educational theory and research support the use of instructional methods that make students active learners. Among the different instructional methods available for teachers to explore and use, the most commonly employed approaches for the genuine construction of new knowledge are problem-based learning and the inquiry approach, cooperative learning, and technology-based strategies. Each of these methods has its own advantages and disadvantages, but when used effectively each can maximize learning.

Problem-Based and Inquiry Approach

Students in the problem-based and inquiry learning approach engage in meaningful learning by being actively involved in their own learning and constructing it on the basis of their experiences.
They further participate in active investigation, integrating knowledge rather than separating it, so that deep understanding develops from the acquisition of new facts. In this method, students are given relevant problems by teachers into which inquiry must be conducted. The general steps in this inquiry approach are: identifying the problems; gathering data; organizing the data in an attempt to analyse the problems; and analysing the approaches to be used to solve the problems.

Integrated Inquiry

In the Integrated Inquiry planning process, a model of the inquiry approach developed by K. Murdoch, sequences of activities and experiences are developed to build on and challenge student perceptions. These sequences begin with students' prior knowledge and experiences and move through deliberate processes in which that knowledge is extended, challenged and refined. Students bring their own related experiences to their classes, and teachers ought to know how to address this situation. Activities and learning experiences in this model are grouped as: tuning in, finding out, sorting out, going further, making conclusions and taking action (Murdoch, 1999). Furthermore, planning for assessment is an important element of preparing for Integrated Inquiry. Murdoch (1999) highlights the need for the collection and analysis of information about what, and how, students have learned. Assessment in the Integrated Inquiry model is meant to determine how to improve student learning, as this new information helps teachers adjust their programmes of work to suit the needs of the learners. Students' involvement in planning for assessment, such as selecting responses to particular learning experiences and designing demonstrations of understanding, is highly encouraged. Teachers are therefore also tasked with identifying and designing learning experiences that will provide information for assessment purposes. The strengths of this model are its focus on assessment of learning in context and its encouragement of a variety of demonstrations of understanding based on the learning experiences that students undertake. The learners who may benefit most from the Integrated Inquiry approach are those capable of setting goals in their own learning and of contributing substantially to deciding how assessment can be done effectively.

5-E Model

In the book "Activities for Teaching Science as Inquiry" by Carin, Bass and Contant (2005), many laboratory investigations were cited as inquiry approaches to learning. The authors focused on the 5-E Instructional Model, whose five main components are identified as Engagement, Exploration, Explanation, Elaboration and Evaluation. Each of these components is learner-centred. This investigative method may be time- and resource-consuming, but it allows learners to develop critical thinking and problem-solving skills experientially. The use of this method is not limited to teaching the sciences, which are considered difficult subjects. This experiential learning brings more opportunities for learners to achieve better understanding and longer retention of the knowledge learned.
Cooperative Learning

Cooperative learning is an instructional method that takes place in a small group of students of different levels of ability, in an environment where each student is responsible not only for his or her own understanding of the subject but also for that of the other group members. It brings additional meaning to learning, since it provides shared cognitive sets of information between students, motivates them to learn the material, ensures that they construct their own knowledge, provides formative feedback, and develops the social and group skills necessary for success outside the classroom. Cooperative learning promotes learning and academic achievement, increases retention and students' satisfaction with their learning experiences, develops skills in oral communication and social skills, promotes student self-esteem and fosters mutual responsibility. Even though this method helps students learn to be more patient, less critical and more compassionate, some students may find it difficult. Students who work alone may find it hard to share answers, aggressive students tend to dominate, and brighter students may act superior to the rest. Teachers who intend to use cooperative learning should prepare their students in how to work in groups for this method to be successful.

Jigsaw Approach

The jigsaw approach is a cooperative learning method in which each student becomes an "expert" in a particular area and then shares that knowledge with the other members of the group, so that eventually all members of the group learn the concepts. In the Modified Jigsaw, the class is divided into equal expert groups, with each of these groups working on separate sections of the activity. When each expert group has completed its tasks, it reports its findings as a group to the class. Group reporting allows greater flexibility in student presentation style and prevents the possibility of accidental misrepresentation of information (Beaudrie et al. 1998). This method best suits heterogeneous learners across disciplines. It gives learners opportunities to demonstrate a variety of skills, and students are more comfortable exchanging ideas with their co-learners because of their active, open relationship.

Role-Playing

Another instructional method of interest is role-playing. It also deals with solving problems, but through actions. In role-playing, problems are identified, acted out and discussed. The students bring their prior knowledge, values and attitudes into their role-playing. A role-playing strategy seems to work best when there are many correct approaches to solving a problem. It encourages thinking and creativity and lets students develop and practise new behaviours in a non-threatening setting. It gives students opportunities to explore their feelings further, gain insights into their attitudes and improve their problem-solving skills. It also promotes effective interpersonal relations. The learning in these role-playing activities is meaningful, as it is retained longer and will, it is hoped, be useful in the real lives of the learners.
Terms which are often used interchangeably with role-playing are "simulation," "game," "role-play," "simulation-game," "role-play simulation," and "role-playing game". Role-playing powerfully promotes effective interpersonal relationships and social exchanges among learners.

Technology-Supported Approach

Technology provides a great deal of equipment for addressing the problems of improving st...
The Declaration of Independence
According to the criminal justice department, in 2011 around 321 people were sentenced to the death penalty, and that was just in the state of Texas. This raises the question: is the Declaration of Independence being respected in the US? Well... not really! Have you ever seen MTV's show 16 and Pregnant? In some of the teen cases shown there, the girls have an abortion; and there are actually places where this is legal and it is "ok" to do it. That being said, in this paper you will find several arguments for why the Declaration of Independence is not being respected.

If you have ever read the Declaration of Independence, you probably noticed that in its second paragraph it clearly states that all men are created equal and that there are certain unalienable rights that governments should never violate. These rights include the right to life, liberty and the pursuit of happiness. So let us talk a little about the first unalienable right, the right to life. No government should have the power to take a person's life; as stated in the Declaration, the right to life is the first of the unalienable rights. So why is the government injecting lethal doses into thousands of people, or dropping bombs on Japan killing millions, or supporting abortion? There is no acceptable way of making the ending of a life "ok". The death penalty should be abolished, since it is a clear violation of what the Declaration of Independence stands for, as are abortion and the use of chemical or nuclear bombs. Taking a life is not only unacceptable, it is a clear violation of what the Declaration of Independence stands for. Abortion, the death penalty and bombs should be abolished and not resorted to in any situation. That being said, the Declaration is not being respected in the US and it is clearly being violated. It is a shame that the US actually has the death penalty, and it is definitely wrong that they do not respect what the nation is essentially founded on!

The Declaration of Independence (2_01 Revolutionary_Ideas, Alex Wasko, 4-20-13, Mr. Walsh)

The Declaration of Independence | Use this panel to give a paragraph summary of the purpose and structure of the Declaration of Independence. The Declaration of Independence is a statement adopted by the Continental Congress on July 4, 1776, which announced that the thirteen American colonies, then at war with Great Britain, regarded themselves as free states and no longer a part of the British Empire. Instead they now formed a new nation, the United States of America. Popular sovereignty is the principle that the legitimacy of the government depends upon the will or consent of its people.
"When in the course of human events it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth the separate and equal station to which the Laws of Nature and of Nature's God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation." The Declaration of Independence essentially stated that the social contract that the colonies of America had with the government of Great Britain was no longer valid. And this is in the very first sentence. The social contract reflected in the Declaration was a sign of upholding democracy. | Natural rights are rights not contingent upon the laws, customs, or beliefs of a particular culture or government, and are therefore universal and inalienable. Individual rights are rights held by individual people; even if they are group-differentiated, as most rights are, they remain individual rights if the right-holders are the individuals themselves. | The one problem with the Declaration of Independence is the "all men are created equal" line. If "men" had been meant generically, applying to all people, it would be fine, but it clearly was not. Women were not given the right to vote or anything else. In fact, only male landowners were allowed to vote at first. |
Friday, August 21, 2020
buy custom Passive Smoking essay
Around 90% of all lung cancers are caused by tobacco use (Jemal, 2005). Lung cancer risk increases with the number of cigarettes one has smoked and the length of time one has been smoking. Doctors tend to express this risk in terms of pack-years of a person's smoking history: the number of packs of cigarettes smoked per day is multiplied by the number of years the person has been smoking (a short worked example follows this essay). Pipe and cigar smoking can also cause lung cancer, although in this case the risk is not as high as with cigarette smoking. Tobacco smoke is known to contain around 4,000 chemical compounds, some of which have been shown to be carcinogenic, that is, cancer-causing. The two main carcinogenic compounds found in tobacco smoke are polycyclic aromatic hydrocarbons and nitrosamines. When one stops smoking, the risk of developing lung cancer decreases dramatically every year, because damaged cells are gradually replaced by continuously growing normal lung cells. For a former smoker, it takes about 15 years for the risk of developing lung cancer to approach that of a person who has never smoked.

Passive smoking can also cause lung cancer. Non-smokers are exposed to passive smoking by inhaling tobacco smoke produced by smokers. This happens when non-smokers share working or living quarters with smokers, and it has become an established risk factor for this type of cancer. According to the American Cancer Society, around 3,000 of the lung cancer deaths reported in the U.S. each year are attributed to passive smoking.

Lung cancer can also be caused by radon gas, asbestos fibres, familial predisposition, lung diseases and air pollution. When one is exposed to asbestos, asbestos fibres can persist in the lung tissue for a lifetime, particularly among people who work in settings where they are exposed to asbestos. Today, the use of asbestos for acoustic and thermal insulation is banned, or permitted only in limited cases, in many countries because of the danger of lung cancer it poses to people who work in the asbestos industry. Likewise, exposure to radon gas can increase the risk of developing lung cancer. With respect to familial predisposition, various studies have shown that lung cancer is more likely to occur among both non-smoking and smoking relatives of people who have had lung cancer than in the general population, and lung cancer survivors have a higher risk of developing the disease a second time than others. Finally, air pollution also raises the likelihood of someone developing lung cancer, according to an observation made by Pope (2002). Researchers believe that the lung cancer risk posed by breathing polluted air is similar to the risk posed by passive smoking.
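As a minimal illustration of the pack-year calculation mentioned above, the following sketch multiplies packs per day by years of smoking; the figures used are hypothetical and not taken from the essay or its sources.

```python
# Pack-years = packs smoked per day x years of smoking (illustrative values only).
def pack_years(packs_per_day: float, years_smoked: float) -> float:
    return packs_per_day * years_smoked

# e.g. 1.5 packs a day for 20 years corresponds to a 30 pack-year history
print(pack_years(1.5, 20))  # 30.0
```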
Friday, July 24, 2020
How Extroversion in Personality Influences Behavior
How Extroversion in Personality Influences Behavior
By Kendra Cherry, MS, an author, educational consultant, and speaker focused on helping students learn about psychology. Updated on September 29, 2019.

In the big 5 theory of personality, extroversion (often known as extraversion) is one of the five core traits believed to make up human personality. Extroversion is characterized by sociability, talkativeness, assertiveness, and excitability. People who are high in extroversion tend to seek out social stimulation and opportunities to engage with others. These individuals are often described as being full of life, energy, and positivity. In group situations, extroverts (extraverts) are likely to talk often and assert themselves.

Introverts, on the other hand, are people who are low in extroversion. They tend to be quiet, reserved and less involved in social situations. It is important to note that introversion and shyness are not the same things. People low in extroversion are not afraid of social situations, they simply prefer to spend more time alone and do not need as much social stimulation. Extroverts are often unfairly pegged as overly-talkative or attention-seeking. In reality, they simply gain energy from engaging in social interaction. People who are high in extroversion need social stimulation to feel energized. They gain inspiration and excitement from talking and discussing ideas with other people.

Common Extroversion Traits
Extroversion is often marked by a number of different sub-traits. Some include:
Warmth
Seeking novelty and excitement
Gregariousness
Assertiveness
Cheerfulness
Talkativeness
Enjoys being the center of attention
Action-oriented
Friendly
Engaging

Causes of Extroversion
The exact reason why people tend to be more extroverted or more introverted has been the subject of considerable debate and research in psychology. As with many such debates, the question tends to boil down to two key contributors: nature or nurture.
Extroversion clearly has a strong genetic component. Twin studies suggest that genetics contribute somewhere between 40 and 60 percent of the variance between extroversion and introversion.
Environment can also have an impact. Sibling studies have suggested that individual experiences carry greater weight than do shared experiences in families.
Variability in this trait may be linked to differences in cortical arousal. Extroverts tend to need more external stimulation while introverts tend to become stimulated very easily, according to some researchers, including Hans Eysenck.

Extroversion and Behavior
How does extroversion impact our behavior? Researchers have found that being high in this personality trait is linked to a number of different tendencies. In addition to contributing to our personalities, this trait may also play a role in the type of career that we end up choosing. According to researchers, extroversion is associated with leadership behavior. Since extroverts are more likely to assert themselves in groups, it makes sense that these individuals often take on leadership roles when working with other people.
Research has also shown that extroverts are less likely to experience anxiety over negative feedback. Those high in extroversion are often described as having a very positive outlook on life as well as being friendly, energetic, and highly adaptable. All of these tendencies can serve a person well, particularly in certain social situations. As you might imagine, high levels of extroversion can be particularly well suited to jobs that require a great deal of interaction with other people. Teaching, sales, marketing, public relations, and politics are all jobs in which an extrovert might do well. Introverts prefer less social interaction, so jobs that require lots of independent work are often ideal. Writing, computer programming, engineering, and accounting are all jobs that might appeal to a person low in extroversion.

How Common Is Extroversion?
While it might seem like everyone in your circle of friends and acquaintances is more extroverted than you, recent research actually indicates that extroversion is less common than previously thought. In a study published in Psychological Science, researchers found that extroverts tend to be overrepresented in social networks. Because outgoing, popular people tend to have a lot of friends, they are disproportionately represented in social networks. "If you're more extraverted, you may really have a skewed view of how extraverted other people are in general," explained researcher Daniel C. Feiler of Dartmouth University. "If you're very introverted you might actually have a pretty accurate idea." The researchers also suggested that there are two key factors that determine who people become friends with. Extroverts tend to be very sociable, making them more likely to form new friendships than introverts. People also tend to form friendships with people with similar levels of extroversion as themselves. While extroverts are more likely to become friends with other extroverts, introverts tend to forge relationships with both introverts and extroverts. To extroverts, it seems like most people are also extroverted because that personality trait is overrepresented among their group of friends and acquaintances. Introverts, however, might have a better grasp of the true structure of social networks.
Saturday, June 27, 2020
Recent Advances in DNA Sequencing Technologies
Abstract

Recent advances in DNA sequencing technologies have led to efficient methods for determining the sequence of DNA. DNA sequencing was born in 1977, when Sanger et al proposed the chain termination method and Maxam and Gilbert proposed their own method in the same year. Sanger's method proved to be the more favourable of the two. Since the birth of DNA sequencing, ever more efficient sequencing technologies have been produced, because Sanger's method was laborious, time-consuming and expensive; Hood et al proposed automated sequencers involving dye-labelled terminators. Owing to the lack of available computational power prior to 1995, sequencing an entire bacterial genome was considered out of reach. This became a reality when Venter and Smith proposed shotgun sequencing in 1995. Pyrosequencing was introduced by Ronaghi in 1996; this method produces the sequence in real time and is applied by 454 Life Sciences. An indirect method of sequencing DNA, called sequencing by hybridisation, was proposed by Drmanac in 1987 and led to the DNA array used by Affymetrix. Nanopore sequencing is a single-molecule sequencing technique in which single-stranded DNA passes through an ion channel in a lipid bilayer while the ion conductance is measured; synthetic nanopores are being produced as substitutes for the lipid bilayer. Illumina sequencing is one of the latest sequencing technologies to be developed, involving DNA clustering on flow cells and four dye-labelled terminators performing reversible termination. DNA sequencing has not only been used to determine sequences in the laboratory but has also been applied in the real world, for example in the Human Genome Project and in DNA fingerprinting.

Introduction

Reliable DNA sequencing became a reality in 1977, when Frederick Sanger perfected the chain termination method to sequence the genome of bacteriophage φX174 [1][2]. Before Sanger's proposal of the chain termination method there was the plus and minus method, also presented by Sanger together with Coulson [2]. The plus and minus method depended on the use of DNA polymerase to copy the specific DNA sequence under controlled conditions; it was considered efficient and simple, but it was not accurate [2]. Alongside Sanger's chain termination sequencing, another method of DNA sequencing, based on base-specific chemical cleavage, was introduced by Maxam and Gilbert and reported in the same year, 1977. The Maxam and Gilbert method is discussed in more detail later in this review. The proposal of these two methods spurred many further DNA sequencing methods, and as the technology developed, so did DNA sequencing. This literature review looks into the various DNA sequencing technologies, their applications in the real world and the tools that have aided DNA sequencing, e.g. PCR. The review begins with the chain termination method of Sanger.

The Chain Termination Method

Sanger discovered that the inhibitory activity of 2',3'-dideoxythymidine triphosphate (ddTTP) on DNA polymerase I depended on its incorporation into the growing oligonucleotide chain in the place of thymidylic acid (dT) [2]. In the structure of ddT there is no 3'-hydroxyl group; a hydrogen atom is present in its place. With hydrogen in place of the hydroxyl group the chain cannot be extended any further, so termination occurs at each position where a dT would otherwise be incorporated. Figure 1 shows the structures of dNTP and ddNTP.
In order to remove the 3'-hydroxyl group and replace it with a proton, the triphosphate has to undergo a chemical procedure [1], and a different procedure is employed for each of the four triphosphates. ddATP was prepared from the starting material 3'-O-tosyl-2'-deoxyadenosine, which was treated with sodium methoxide in dimethylformamide to produce the unsaturated compound 2',3'-dideoxy-2',3'-didehydroadenosine [4]. The double bond between carbons 2' and 3' of the cyclic ether was then hydrogenated with a palladium-on-carbon catalyst to give 2',3'-dideoxyadenosine (ddA). The ddA was then phosphorylated to add the triphosphate group, and purification took place on a DEAE-Sephadex column using a gradient of triethylamine carbonate at pH 8.4. Figure 2 is a schematic representation of the route to ddA prior to phosphorylation.

In the preparation of ddTTP (Figure 3), thymidine was tritylated (+C(Ph)3) at the 5'-position and a methanesulphonyl (+CH3SO2) group was introduced at the 3'-OH group [5]. The methanesulphonyl group was substituted with iodine by refluxing the compound in 1,2-dimethoxyethane in the presence of NaI. After chromatography on a silica column, the 5'-trityl-3'-iodothymidine was treated in 80% acetic acid to remove the trityl group. The resulting 3'-iodothymidine was hydrogenated to produce 2',3'-dideoxythymidine, which was subsequently phosphorylated. Once phosphorylated, ddTTP was purified on a DEAE-Sephadex column with a triethylammonium hydrogen carbonate gradient. Figure 3 is a schematic representation of the route to ddT prior to phosphorylation.

When preparing ddGTP, the starting material was N-isobutyryl-5'-O-monomethoxytrityldeoxyguanosine [1]. After tosylation of the 3'-OH group, the compound was converted to the 2',3'-didehydro derivative with sodium methoxide. The isobutyryl group was partly removed during this treatment with sodium methoxide and was removed completely by incubation in the presence of NH3 overnight at 45 °C. During the overnight incubation period the didehydro derivative was reduced to the dideoxy derivative and then converted to the triphosphate. The triphosphate was purified by fractionation on a DEAE-Sephadex column using a triethylamine carbonate gradient. Figure 4 is a schematic representation of the route to ddG prior to phosphorylation.

The preparation of ddCTP was similar to that of ddGTP, but started from N-anisoyl-5'-O-monomethoxytrityldeoxycytidine. The purification step was omitted for ddCTP, as it produced a very low yield, so the solution was used directly in the experiments described in the paper [2]. Figure 5 is a schematic representation of the route to ddC prior to phosphorylation.

With the four dideoxy samples prepared, the sequencing procedure can commence. The dideoxy samples are placed in separate tubes, together with primed template prepared from the φX174 replicative form using restriction enzymes, DNA polymerase, and the four dNTPs [2].
DNA polymerase then extends the primer using the four dNTPs, and whenever a ddNTP is incorporated into the growing polynucleotide it terminates further strand synthesis. This is because the ddNTP lacks a hydroxyl group at the 3' position, which prevents the next nucleotide from being attached to the strand. The contents of the four tubes are then separated by electrophoresis on acrylamide gels (see Gel-Electrophoresis). Figure 6 shows the sequencing procedure.

Reading the sequence is straightforward [1]. First, the band that has moved the furthest is located; this represents the smallest piece of DNA and is the strand terminated by incorporation of the dideoxynucleotide at the first position in the template. The track in which this band occurs is noted. For example (as shown in Figure 6), if the band that moved furthest is in track A, the first nucleotide in the sequence is A. To find the next nucleotide, the next most mobile band is located; it corresponds to a DNA molecule one nucleotide longer than the first, and in this example that band is in track T. The second nucleotide is therefore T, and the overall sequence so far is AT. The process is continued along the autoradiograph until the individual bands begin to close up and become inseparable, and therefore hard to read. In general it is possible to read up to about 400 nucleotides from one autoradiograph with this method. Figure 7 is a schematic representation of an autoradiograph.

Ever since Sanger perfected this method of DNA sequencing, there have been further advances in sequencing methods, along with notable achievements such as the Human Genome Project, which are discussed later in this review.
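As a minimal sketch of the chain-termination readout just described, the following Python snippet simulates which fragment lengths appear in each of the four ddNTP reactions for a short, made-up template, and then reads the ladder back from the shortest band upwards. The template, the function names and the simplification of treating the read strand directly as the template are illustrative assumptions, not details from Sanger's protocol.

```python
# Toy model of Sanger chain-termination sequencing and ladder reading.

def run_reactions(template: str) -> dict[str, list[int]]:
    """For each ddNTP tube (A, C, G, T), return the lengths of the terminated
    fragments.  A fragment of length i means synthesis stopped at position i,
    i.e. base i of the determined sequence is that dideoxy base."""
    lanes = {base: [] for base in "ACGT"}
    for position, base in enumerate(template, start=1):
        lanes[base].append(position)
    return lanes

def read_ladder(lanes: dict[str, list[int]]) -> str:
    """Read the gel from the fastest (shortest) band upwards, as in the text:
    each successive band length identifies the next base of the sequence."""
    bands = sorted((length, base) for base, lengths in lanes.items() for length in lengths)
    return "".join(base for _, base in bands)

if __name__ == "__main__":
    template = "ATGCCTA"                  # hypothetical sequence to be determined
    lanes = run_reactions(template)
    print(lanes)                          # {'A': [1, 7], 'C': [4, 5], 'G': [3], 'T': [2, 6]}
    print(read_ladder(lanes))             # ATGCCTA, recovered from the ladder
```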
Gel-Electrophoresis

Gel electrophoresis is the movement of charged molecules in an electric field [1][8]. DNA molecules, like many other biological compounds, carry an electric charge; in the case of DNA this charge is negative. When DNA is placed in an electric field, the molecules therefore migrate towards the positive pole (as shown in Figure 8). Three factors affect the rate of migration: shape, electrical charge and size. The polyacrylamide gel comprises a complex network of pores through which the molecules must travel to reach the anode.

Maxam and Gilbert Method

The Maxam and Gilbert method was proposed in the same year as Sanger's method. Whereas Sanger's method involves the enzymatic synthesis of radiolabelled fragments from unlabelled DNA strands [2], the Maxam-Gilbert method involves chemical cleavage of pre-labelled DNA strands in four different ways to form four different collections of labelled fragments [6][7]. Both methods use gel electrophoresis to separate the DNA target molecules [8]. However, Sanger's chain termination method has proven simpler and easier to use than the Maxam and Gilbert method [9]; indeed, most literature textbooks explain Sanger's method of DNA sequencing rather than Maxam and Gilbert's [1][3][9][10].

In the Maxam and Gilbert method, two kinds of chemical cleavage reaction take place [6][7]. One acts on guanine and adenine, the two purines, and the other cleaves the DNA at cytosine and thymine, the pyrimidines. Specific reagents are used for each reaction: the purine-specific reagent is dimethyl sulphate and the pyrimidine-specific reagent is hydrazine. Each of these reactions is carried out in a different way, as each of the four bases has different chemical properties. The cleavage reaction for guanine/adenine uses dimethyl sulphate to add a methyl group to the guanines at the N7 position and to the adenines at the N3 position [7]. The glycosidic bond of a methylated adenine is unstable and breaks easily on heating at neutral pH, leaving the sugar free. Treatment with 0.1 M alkali at 90 °C then cleaves the sugar from the neighbouring phosphate groups. When the resulting end-labelled fragments are resolved on a polyacrylamide gel, the autoradiograph contains a pattern of dark and light bands. The dark bands arise from breakage at the guanines, which methylate at a rate about five-fold faster than the adenines. Because the guanine bands appear stronger than the adenine bands, the pattern can be misinterpreted, and an adenine-enhanced cleavage reaction is therefore also performed. Figure 9 shows the structural changes of guanine during the modifications involved in Maxam-Gilbert sequencing. In the adenine-enhanced cleavage, the glycosidic bond of methylated adenosine is less stable than that of methylated guanosine, so gentle treatment with dilute acid at the methylation step releases the adenine, allowing darker adenine bands to appear on the autoradiograph [7].

The chemical cleavage of the cytosine and thymine residues uses hydrazine instead of dimethyl sulphate. The hydrazine cleaves the base, leaving ribosylurea [7]. After partial hydrazinolysis in 15-18 M aqueous hydrazine at 20 °C, the DNA is cleaved with 0.5 M piperidine. The piperidine (a cyclic secondary amine), as the free base, displaces all the products of the hydrazine reaction from the sugars and catalyses the beta-elimination of the phosphates. The final pattern contains bands of similar intensity from the cleavages at the cytosines and thymines. For cleavage at cytosine only, the presence of 2 M NaCl preferentially suppresses the reaction of thymine with hydrazine. Once the cleavage reactions have taken place, each original strand is broken into a labelled fragment and an unlabelled fragment [7]. All the labelled fragments start at the 5' end of the strand and terminate at the base preceding the cleaved nucleotide along the original strand, and only the labelled fragments are recorded on the gel electrophoresis.
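As a minimal sketch of how these cleavage reactions translate into a readable ladder, the snippet below lists, for a made-up 5'-end-labelled strand, the positions cleaved in each lane. The strand, the simplified lane definitions (G, A+G, C, C+T) and the function names are illustrative assumptions; partial-cleavage statistics and band intensities are ignored.

```python
# Toy model of the four Maxam-Gilbert lanes for a 5'-labelled strand.

LANES = {
    "G":   {"G"},          # dimethyl sulphate, G-specific
    "A+G": {"A", "G"},     # adenine-enhanced purine cleavage
    "C":   {"C"},          # hydrazine with 2 M NaCl
    "C+T": {"C", "T"},     # hydrazine
}

def cleavage_positions(strand: str) -> dict[str, list[int]]:
    """For each lane, the 1-based positions at which the strand is cleaved.
    A cleavage at position i leaves a 5'-labelled fragment of length i - 1."""
    return {
        lane: [i for i, base in enumerate(strand, start=1) if base in bases]
        for lane, bases in LANES.items()
    }

if __name__ == "__main__":
    strand = "CTAGGCA"     # hypothetical labelled strand
    for lane, positions in cleavage_positions(strand).items():
        print(f"{lane:>3} lane: cleavage at positions {positions}")
```

Comparing the lanes recovers the sequence: a position present only in the C+T lane is a T, one present in both the C and C+T lanes is a C, and similarly for the purine lanes.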
Dye-Labelled Terminators

For many years DNA sequencing was done by hand, which is both laborious and expensive [3]. Before automated sequencing, about 4 × 10^6 bases of DNA had been sequenced following the introduction of the Sanger and Maxam-Gilbert methods [11]. Both methods require four sets of reactions and a subsequent electrophoresis step in adjacent lanes of a high-resolution polyacrylamide gel. In the newer automated sequencing procedures, four different fluorophores are used, one in each of the base-specific reactions. The reaction products are combined and co-electrophoresed, and the DNA fragments generated in each reaction are detected near the bottom of the gel and identified by their colour. Sanger's method was chosen as the basis for automation because it had proven to be the most durable and efficient method of DNA sequencing and was the choice of most investigators in large-scale sequencing [12]. Figure 10 shows how a typical sequence is generated using an automated sequencer.

The selection of the dyes was the central development of automated DNA sequencing [11]. The fluorophores selected had to meet several criteria: the absorption and emission maxima had to be in the visible region of the spectrum [11], which is between 380 nm and 780 nm [10]; each dye had to be easily distinguishable from the others [11]; and the dyes should not impair the hybridisation of the oligonucleotide primer, as this would decrease the reliability of synthesis in the sequencing reactions. Figure 11 shows the structures of the dyes used in a typical automated sequencing procedure, where X is the moiety through which the dye is bound. Table 1 shows which dye is covalently attached to which nucleotide in a typical automated DNA sequencing procedure.

Table 1
Dye | Nucleotide attached
Fluorescein | Adenosine
NBD | Thymine
Tetramethylrhodamine | Guanine
Texas Red | Cytosine

In designing the instrumentation of the fluorescence detection apparatus, the primary consideration was sensitivity. As the concentration of each band on the co-electrophoresis gel is very low, the instrument needs to be capable of detecting dye at concentrations of that order. This level of detection can readily be achieved by commercial spectrofluorimeter systems; unfortunately, detection from a gel leads to a much higher background scatter, which in turn decreases sensitivity. This is solved by using a laser excitation source to obtain maximum sensitivity [11]. Figure 12 is a schematic diagram of the instrument, with an explanation of the instrumentation employed.

When analysing the data, Hood found some complications [11]. First, the emission spectra of the different dyes overlap; to overcome this, multicomponent analysis was employed to determine the amounts of the four dyes present in the gel at any given time. Second, the different dye molecules impart non-identical electrophoretic mobilities to the DNA fragments, so oligonucleotides detected together are not necessarily of equal base length. The third major complication in analysing the data comes from the imperfections of the enzymatic method itself; for instance, there are often regions of the autoradiograph that are difficult to sequence. These complications were overcome in five steps [11]:

1. High-frequency noise is removed using a low-pass Fourier filter.
2. A time delay (1.5-4.5 s) between measurements at different wavelengths is partially corrected for by linear interpolation between successive measurements.
3. A multicomponent analysis is performed on each set of four data points; this computation yields the amount of each of the four dyes present in the detector as a function of time (a small numerical sketch of this unmixing step is given below).
4. The peaks present in the data are located.
5. The mobility shift introduced by the dyes is corrected for using empirically determined correction factors.

Since the publication of Hood's proposal of fluorescence detection in automated DNA sequence analysis, research has focused on developing dyes that are better in terms of sensitivity [12].
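As a minimal numerical sketch of the multicomponent (spectral unmixing) step mentioned above, the snippet below recovers four dye amounts from a four-channel measurement by solving a small linear system. The 4x4 response matrix and the measurement values are invented for illustration; real calibration values would come from the instrument, not from the cited papers.

```python
import numpy as np

# response[i, j] = relative signal of dye j in detection channel i (hypothetical values)
response = np.array([
    [1.00, 0.20, 0.05, 0.01],   # channel tuned to fluorescein
    [0.15, 1.00, 0.25, 0.05],   # channel tuned to NBD
    [0.03, 0.20, 1.00, 0.30],   # channel tuned to tetramethylrhodamine
    [0.01, 0.05, 0.25, 1.00],   # channel tuned to Texas Red
])

# A raw four-channel reading at one time point (also hypothetical).
measured = np.array([0.95, 0.45, 0.30, 0.12])

# Solve response @ amounts = measured for the underlying dye amounts.
amounts, *_ = np.linalg.lstsq(response, measured, rcond=None)

for dye, amount in zip(["fluorescein (A)", "NBD (T)", "tetramethylrhodamine (G)", "Texas Red (C)"], amounts):
    print(f"{dye}: {amount:.3f}")
```

In the real instrument this computation is repeated at every time point, giving the amount of each dye passing the detector as a function of time.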
Bacterial and Viral Genome Sequencing (Shotgun Sequencing)

Prior to 1995, many viral genomes had been sequenced using Sanger's chain termination technique [13], but no bacterial genome had been sequenced. The viral genomes sequenced included the 229 kb genome of cytomegalovirus [14] and the 192 kb genome of vaccinia [15], and the 187 kb mitochondrial and 121 kb chloroplast genomes of Marchantia polymorpha had also been sequenced [16]. Viral genome sequencing had been based on the sequencing of clones, usually derived from extensively mapped restriction fragments, or from λ or cosmid clones [17]. Despite advances in DNA sequencing technology, the sequencing of genomes had not progressed beyond clones of the order of ~250 kb, owing to the lack of computational approaches that would enable the efficient assembly of a large number of fragments into a single ordered assembly [13][17]. In response, Venter and Smith proposed shotgun sequencing in 1995, which enabled Haemophilus influenzae (H. influenzae) to become the first bacterial genome to be sequenced [13][17]. H. influenzae was chosen because it has a base composition similar to that of a human, with 38% of the sequence made up of G + C. Table 2 shows the shotgun sequencing procedure [17].

When constructing the library, ultrasonic waves were used to randomly fragment the genomic DNA into fairly small pieces, roughly the size of a gene [13]. The fragments were purified and then attached to plasmid vectors [13][17], which were inserted into E. coli host cells to produce a library of plasmid clones. The E. coli host strains lacked restriction enzymes, which prevented deletions, rearrangements and loss of the clones [17]. The fragments were randomly sequenced using automated sequencers (dye-labelled terminators), with T7 and SP6 primers used to sequence the ends of the inserts and give approximately six-fold coverage [17].

Table 2 (Reference 17)
Stage | Description
Random small-insert and large-insert library construction | Shear genomic DNA randomly to ~2 kb and 15 to 20 kb respectively
Library plating | Verify the random nature of the library and maximize random selection of small-insert and large-insert clones for template production
High-throughput DNA sequencing | Sequence a sufficient number of fragments from both ends to give 6x coverage
Assembly | Assemble the random sequence fragments and identify repeat regions
Gap closure, physical gaps | Order all contigs (fingerprints, peptide links, λ clones, PCR) and provide templates for closure
Gap closure, sequence gaps | Complete the genome sequence by primer walking
Editing | Inspect the sequence visually and resolve sequence ambiguities, including frameshifts
Annotation | Identify and describe all predicted coding regions (putative identifications, starts and stops, role assignments, operons, regulatory regions)

Once the sequencing reactions have been completed, the fragments need to be assembled, and this is done using the TIGR Assembler software (The Institute for Genomic Research) [17]. The TIGR Assembler simultaneously clusters and assembles fragments of the genome. In order to obtain the speed necessary to assemble more than 10^4 fragments [17], the algorithm builds a table of all 10-bp oligonucleotide subsequences to generate a list of potential sequence fragment overlaps. The algorithm begins with an initial contig (a single fragment); to extend the contig, a candidate fragment is selected on the basis of shared oligonucleotide content. The initial contig and candidate fragment are aligned by a modified version of the Smith-Waterman algorithm [18], which allows optional gapped alignments. The contig is extended by the fragment only if strict criteria on the overlap are met; the algorithm automatically lowers these criteria in regions of minimal coverage and raises them in regions containing a possible repetitive element [17]. The TIGR Assembler is also designed to take advantage of large clone sizes [17].
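The following is a minimal sketch of the 10-bp oligonucleotide lookup idea used to propose candidate overlaps, in the spirit of the assembler step just described. The example fragments, the scoring by shared k-mer count and the function names are illustrative assumptions; a real assembler would follow this with gapped alignment and the coverage-dependent criteria described above.

```python
from collections import defaultdict

K = 10  # length of the oligonucleotide subsequences used for the lookup table

def kmer_table(fragments: list[str]) -> dict[str, set[int]]:
    """Map every K-bp subsequence to the set of fragment indices containing it."""
    table = defaultdict(set)
    for idx, frag in enumerate(fragments):
        for i in range(len(frag) - K + 1):
            table[frag[i:i + K]].add(idx)
    return table

def candidate_overlaps(fragments: list[str]) -> dict[tuple[int, int], int]:
    """Count shared K-mers between fragment pairs; pairs sharing many K-mers are
    the candidates that would be passed on to the alignment step."""
    shared = defaultdict(int)
    for ids in kmer_table(fragments).values():
        ids = sorted(ids)
        for a in range(len(ids)):
            for b in range(a + 1, len(ids)):
                shared[(ids[a], ids[b])] += 1
    return shared

if __name__ == "__main__":
    fragments = [
        "ACGTACGTTAGCCGATACGGATTACA",   # made-up read
        "TAGCCGATACGGATTACAGGCTTAAC",   # overlaps the end of the first read
        "GGGGGGCCCCCCAAAAAATTTTTTGG",   # unrelated read
    ]
    for pair, count in candidate_overlaps(fragments).items():
        print(pair, "share", count, "10-mers")     # (0, 1) share 9 10-mers
```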
The assembler also enforces the constraint that sequences from the two ends of the same template must point towards one another in the contig and lie within a certain range of base pairs of each other [17]. The TIGR Assembler therefore provides the computational power needed to assemble the fragments. Once the fragments have been assembled, the TIGR Editor is used to proofread the sequence and check for any ambiguities in the data [17]. The technique does require precautions: for instance, the small-insert library should be constructed and end-sequenced concurrently [17], and it is essential that the sequence fragments are of the highest quality and are rigorously checked for contamination [17].

Pyrosequencing

Most DNA sequencing methods have required gel electrophoresis; however, in 1996, at the Royal Institute of Technology, Stockholm, Ronaghi proposed pyrosequencing [19][20]. This is an example of sequencing-by-synthesis, in which DNA molecules are clonally amplified on a template and the template then undergoes sequencing [25]. The approach relies on detecting DNA polymerase activity through the inorganic pyrophosphate (PPi) released during DNA synthesis, measured by an enzymatic luminometric assay, and it offers the advantage of real-time detection [19]. Ronaghi used Nyren's description [21] of an enzymatic system consisting of DNA polymerase, ATP sulphurylase and luciferase to couple the release of PPi, which occurs when a nucleotide is incorporated by the polymerase, to a light emission that can easily be detected by a luminometer or photodiode [20]. When PPi is released it is immediately converted to adenosine triphosphate (ATP) by ATP sulphurylase, and the level of generated ATP is sensed by the light-producing luciferase [19][20][21]. Unused ATP and deoxynucleotides are degraded by the enzyme apyrase. The presence or absence of PPi, and therefore the incorporation or non-incorporation of each nucleotide added, is ultimately assessed on the basis of whether or not photons are detected. There is minimal time lapse between these events, and the reaction conditions are such that iterative addition of nucleotides and PPi detection are possible. The PPi released upon nucleotide incorporation is detected by ELIDA (Enzymatic Luminometric Inorganic pyrophosphate Detection Assay) [19][21]. Within the ELIDA, the PPi is converted to ATP with the help of ATP sulphurylase, and the ATP reacts with luciferin to generate light, more than 6 × 10^9 photons at a wavelength of 560 nm, which can be detected by a photodiode, photomultiplier tube or charge-coupled device (CCD) camera [19][20]. As mentioned before, the DNA molecules first need to be amplified by the polymerase chain reaction (PCR, which is discussed later).
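As a minimal sketch of how a known template maps onto a pyrogram, the snippet below reports, for a cyclic dispensation order, how many identical bases are incorporated at each dispensation (each incorporation releasing one PPi, so a homopolymer gives a proportionally taller peak). The template, the dispensation order and the cycle count are illustrative assumptions; strand complementarity and enzyme kinetics are deliberately ignored.

```python
# Toy pyrogram: light signal expected per nucleotide dispensation for a known template.

def pyrogram(template: str, dispensation: str = "ACGT", cycles: int = 4) -> list[tuple[str, int]]:
    signal = []
    pos = 0
    for _ in range(cycles):
        for base in dispensation:
            incorporated = 0
            while pos < len(template) and template[pos] == base:
                incorporated += 1              # each incorporation releases one PPi
                pos += 1
            signal.append((base, incorporated))  # 0 means no light for this dispensation
    return signal

if __name__ == "__main__":
    for base, peak in pyrogram("GGATCA"):
        print(base, "#" * peak)                # the GG run shows up as a double-height peak
```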
In practice, Ronaghi observed that dATP interfered with the detection system [19]. This interference is a major problem when the method is used to detect a single-base incorporation event, and it was rectified by replacing dATP with dATPαS (deoxyadenosine α-thiotriphosphate). Adding even a small amount of dATP (0.1 nmol) induces an instantaneous increase in the light emission followed by a slow decrease until a steady-state level is reached (as Figure 11 shows). This makes it impossible to start a sequencing reaction by adding dATP; the reaction must instead be started by the addition of DNA polymerase. The signal-to-noise ratio was also higher for dATP than for the other nucleotides. By contrast, the addition of 8 nmol of dATPαS (an 80-fold larger amount than the dATP) had only a minor effect on the luciferase (as Figure 14 shows); dATPαS is less than 0.05% as effective as dATP as a substrate for luciferase [19].

Pyrosequencing has been adapted by 454 Life Sciences for sequencing by synthesis [22] and is marketed as the Genome Sequencer (GS) FLX [23][24]. The 454 system starts from random single-stranded DNA (ssDNA) fragments, and each fragment is bound to a bead under conditions that allow only one fragment per bead [22]. Once a fragment is attached to its bead, clonal amplification takes place in an emulsion. The emulsified beads are purified and placed in microfabricated picolitre wells and then undergo pyrosequencing. A lens array in the detection part of the instrument focuses the luminescence from each well onto the chip of a CCD camera, which images the plate every second in order to follow the progress of the pyrosequencing [20][22]. The machine generates raw data in real time in the form of the bioluminescence produced by the reactions, and the data are presented as a pyrogram [20].

Sequencing by Hybridisation

The chain termination, Maxam and Gilbert and pyrosequencing methods discussed so far are all direct methods of sequencing DNA, in which each base position is determined individually [26]. There are also indirect methods, in which the DNA sequence is assembled from an experimental determination of the oligonucleotide content of the chain. One promising indirect method is sequencing by hybridisation, in which sets of oligonucleotide probes are hybridised under conditions that allow the detection of complementary sequences in the target nucleic acid [26]. Sequencing by hybridisation (SBH) was proposed by Drmanac et al in 1987 [27] and is based on Doty's observation that when DNA is heated in solution the double strand melts to form single-stranded chains, which then re-nature spontaneously when the solution is cooled [28]. This opens up the possibility of one piece of DNA recognising another, and it led to Drmanac's proposal that oligonucleotide probes hybridised under these conditions would allow complementary sequences in the DNA target to be detected [26][27]. In SBH, a positively hybridising oligonucleotide probe (an n-mer probe, where n is the length of the probe) is a substring of the DNA sample; the process is similar to doing a keyword search in a page full of text [29]. The set of positively expressed probes is known as the spectrum of the DNA sample. For example, the single-stranded DNA 5'-GGTCTCG-3' can be sequenced using 4-mer probes, five of which will hybridise onto the sequence successfully. The remaining probes will form hybrids with a mismatch at the end base and will be denatured during selective washing. The probes that match at the end base form fully matched hybrids, which are retained and detected. Each positively expressed probe then serves as a platform to decipher the next base, as shown in Figure 16 (a small sketch of this reconstruction follows below). The probes that have successfully hybridised onto the sequence need to be detected; this is achieved by labelling the probes with dyes such as Cyanine 3 (Cy3) and Cyanine 5 (Cy5), so that the degree of hybridisation can be measured by imaging devices [29]. SBH methods are ideally suited to microarray technology because of their inherent potential for parallel sample processing [29].
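As a minimal sketch of the SBH idea just outlined, the snippet below takes the unordered 4-mer spectrum of a short target and rebuilds the sequence by repeatedly looking for a probe that extends the current reconstruction by one base. The 8-base target used here is made up (it is not the essay's GGTCTCG example), and the greedy reconstruction only works for simple, repeat-free targets; real SBH has to cope with ambiguities and hybridisation errors.

```python
# Toy sequencing-by-hybridisation: spectrum construction and greedy reconstruction.

def spectrum(seq: str, n: int = 4) -> set[str]:
    """The unordered set of n-mer probes that hybridise perfectly to seq."""
    return {seq[i:i + n] for i in range(len(seq) - n + 1)}

def reconstruct(spec: set[str], start: str, length: int) -> str:
    """Extend the starting probe one base at a time using probes in the spectrum."""
    seq = start
    while len(seq) < length:
        for base in "ACGT":
            if seq[-(len(start) - 1):] + base in spec:
                seq += base
                break
        else:
            break          # no probe extends the current sequence
    return seq

if __name__ == "__main__":
    target = "GGTCTCGA"                     # hypothetical 8-base target
    spec = spectrum(target)                 # {'GGTC', 'GTCT', 'TCTC', 'CTCG', 'TCGA'}
    print(sorted(spec))
    print(reconstruct(spec, "GGTC", len(target)))   # GGTCTCGA, recovered from the spectrum
```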
An important advantage of using a DNA array rather than a multiple-probe array is that all the probe-DNA hybrids resulting from any single probe hybridisation are of identical sequence [29]. One of the main DNA hybridisation array formats is the oligonucleotide array, which is currently patented by Affymetrix [30]; its commercial use is discussed later under DNA Array (Affymetrix). Because of the small size of the hybridisation array and the small amount of target present, acquiring the signals from a DNA array is a challenge [29]. These signals must first be amplified before they can be detected by the imaging devices. Signals can be boosted in two ways, namely target amplification and signal amplification: in target amplification, such as PCR, the amount of target is increased to enhance signal strength, while in signal amplification the amount of signal per unit of target is increased.

Nanopore Sequencing

Nanopore sequencing was proposed in 1996 by Branton et al, who showed that individual polynucleotide molecules can be characterised using a membrane channel [31]. Nanopore sequencing is an example of single-molecule sequencing, in which the concept of sequencing-by-synthesis is followed but without the prior amplification step [24]. It works by measuring the ionic conductance as a nucleotide passes through a single ion channel in a biological membrane or planar lipid bilayer. Measuring ionic conductance is routine in neurobiology and biophysics [31], as well as in pharmacology (Ca2+ and K+ channels) [32] and biochemistry [9]. Although most channels undergo voltage-dependent or ligand-dependent gating, several large ion channels (for example Staphylococcus aureus α-hemolysin) can remain open for extended periods, thereby allowing a continuous ionic current to flow across a lipid bilayer [31]. A transmembrane voltage applied across an open channel of appropriate size should draw DNA molecules through the channel as extended linear chains, whose presence would detectably reduce the ionic flow. It was assumed that this reduction in ionic flow, captured in single-channel recordings, could be used to characterise the length, and in turn other characteristics, of the polynucleotide. In Branton's proposal, α-hemolysin was used to form a single channel across a lipid bilayer separating two buffer-filled compartments [31]. α-Hemolysin is a monomeric, 33 kDa, 293-residue protein secreted by the human pathogen Staphylococcus aureus [33]. The nanopore is produced when α-hemolysin subunits are introduced into a buffered solution in which a lipid bilayer separates two compartments (known as cis and trans): the head of the α-hemolysin molecule faces the cis side and the stem end the trans side [31][33][34]. The polynucleotide inserts into the cis side of the bilayer pore, which carries an ionic current of approximately 120 pA (picoamperes) when open [31][33][34]. The lipid bilayer containing the nanopore also influences its function as an ion channel. Currently most nanopore sequencing is done using α-hemolysin [34]. Figure 17 shows the structure of a nanopore in the α-hemolysin lipid bilayer, with double-stranded DNA being converted to single-stranded DNA (ssDNA) and passing through the nanopore. ssDNA is about 1.3 nm in diameter, and the α-hemolysin nanopore has a diameter of 1.5 nm [33]. Branton originally reported the pore diameter as 2.6 nm [31], but this was later rectified.
During the experiments Branton also realised that the α-hemolysin pore was too narrow for double-stranded DNA to pass from the cis side to the trans side, so the double-stranded DNA had to be denatured first [31]. Nanopores have proven useful as an analytical technique for determining the concentration and size distribution of particles down to the sub-micrometre scale [33]. Because they measure the ionic conductance as a nucleotide passes through the pore, they act as Coulter counters: molecules carrying a net electrical charge are electrophoretically driven through the pore, producing measurable changes in ionic conductivity, and each nucleotide can be distinguished by its characteristic effect on the ion conductance. Figure 18 shows the difference in ion conductance between poly A and poly C. To date there have been two general approaches: the α-hemolysin pores mentioned above and synthetic solid-state nanopores, which are being developed using various conventional fabrication techniques [33]. α-Hemolysin nanopores have limitations of size, variation and stability, because the protein is usually labile, lipid membranes are fragile, the pore diameter is fixed and the range of safe electrical operation is narrow. To overcome these difficulties, solid-state synthetic nanopores are being fabricated by various means, bearing in mind that the properties of the nanopores must be carefully chosen so that they respond sensitively to the molecules being detected. Table 3 outlines the main synthetic fabrication methods in brief.

Table 3. Synthetic nanopores and a brief outline of each method.

Ion beam sculpting [35] - A Si3N4 surface containing a bowl-shaped cavity is exposed to an argon ion beam, which removes material from the surface until it intercepts the bottom of the cavity, opening the pore (sputtering erosion). The argon ion beam can also be used to stimulate lateral transport of material, causing the pore to close.

Micromolding [36] - Conventional lithographic techniques are used to create a negative master of the pore, which is subsequently cast into a PDMS slab. This approach enables rapid, reproducible fabrication at the sub-micrometre scale and is easy to modify structurally and chemically for various detection applications.

Latent track etching [37] - A single conical nanopore is created in a polymer substrate by chemically etching the latent track of a single, energetic, heavy ion. Each ion produces an etchable track in a polymer foil, forming a one-pore membrane. The size and axial geometry of the pore can be customised with nanometre precision by controlling the type and concentration of the etchant, the temperature and the etching duration.

Electron beam-induced fine tuning [38] - High-energy electron beams are used to fine-tune the size of a silicon oxide nanopore; this has become one of the most popular methodologies for fabricating small nanopores. A commercial transmission electron microscope (TEM) operated at 300 kV shrinks the pore at about 0.3 nm per minute, a speed that allows the process to be stopped at any desired dimension.

Inorganic nanotubes [39] - Single-channel membranes are made by embedding multiwall carbon nanotubes (MWNTs). Compared with nanopores treated by high-energy beams, carbon nanotubes have well-defined chemical and structural properties, the channel size is easier to tailor, and the surface charge is zero, so electrophoretic transport is simple.
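To illustrate the Coulter-counter picture described above in code, the sketch below scans a sampled ionic-current trace for intervals where the current drops below a threshold, reporting each event's start time, dwell time and mean blocked current. The open-pore level of roughly 120 pA is taken from the text; the sampling interval, threshold and synthetic trace are illustrative assumptions, not measured values.

def find_blockade_events(current_pa, threshold_pa=60.0, sample_interval_us=10.0):
    """Return (start_us, dwell_us, mean_blocked_pa) for each blockade event."""
    events = []
    in_event, start, blocked = False, 0, []
    for i, value in enumerate(current_pa):
        if value < threshold_pa:
            if not in_event:                       # a translocation event begins
                in_event, start, blocked = True, i, []
            blocked.append(value)
        elif in_event:                             # current returns to the open-pore level
            dwell = (i - start) * sample_interval_us
            events.append((start * sample_interval_us, dwell,
                           sum(blocked) / len(blocked)))
            in_event = False
    return events

if __name__ == "__main__":
    # synthetic trace: open pore ~120 pA with two blockades of different depth,
    # standing in for, say, poly A versus poly C translocations
    trace = [120.0] * 20 + [15.0] * 8 + [120.0] * 15 + [30.0] * 5 + [120.0] * 20
    for start, dwell, depth in find_blockade_events(trace):
        print(f"event at {start:.0f} us, dwell {dwell:.0f} us, mean {depth:.1f} pA")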
Illumina Sequencing

Illumina sequencing is one of the latest sequencing technologies; it was proposed by Simon Bennett in 2004 and was originally called Solexa sequencing [40]. In 2007, Illumina acquired the technology for $600 million [41]. The technology is more than 100 times more efficient than the chain-termination method and provides a base-by-base comparison of DNA sequences [40]. It is another example of sequencing by synthesis, alongside the 454 system described under Pyrosequencing, and it is also much faster and cheaper than the 454 system [25][42]. Illumina sequencing is based on single molecules that are attached covalently to a flow cell and amplified to generate clusters of identical molecules [43]. The flow cell consists of 8 separately loadable lanes; each lane has a capacity of around 5 million reads, so a full flow cell generates more than 40 million reads, amounting to more than 1.3 Gbp (gigabase pairs), in the space of 3 days. Figure 18 shows the flow cell used by Illumina. On the flow cell, clusters are produced via clonal bridge amplification, generating 10 million single-molecule clusters per square centimetre [24]. Bridge amplification is performed after adaptor-ligated DNA fragments are attached to the flow cell: the free end of each attached fragment anneals to a nearby primer on the surface, a double-stranded bridge forms after elongation, and denaturation then produces two strands that are fixed on the surface of the flow cell. The cycle is repeated until the flow cell carries clusters containing approximately 1,000 copies within a diameter of 1 µm [24]. Once the clusters have been produced, sequencing is carried out by adding a terminator-enzyme mix (a mixture of the four fluorescently labelled reversible chain terminators and DNA polymerase) to the flow cell [25], so that each cluster incorporates a single, reversibly terminated base per cycle. Laser excitation [11] is applied to produce the fluorescent signal, which is detected by an imaging device. The reversible terminators are then removed and the second cycle can commence: the terminator-enzyme mix is added again and the process is repeated until the end of the run [25]. Because all four fluorescently labelled reversible chain terminators are present in the reaction together, sequencing accuracy is increased, as the risk of misincorporation of nucleotides is reduced [25]. At present Illumina sequencing reads lengths of 36 bases [43], but this figure is expected to rise in the near future [42]. Illumina sequencing has already led to breakthrough research; one example is the use of Illumina sequencers to obtain high-resolution maps of several histone modifications [44], in which typical patterns of histone methylation at promoters, insulators, enhancers and transcribed regions were identified. These findings by Barski et al [44] gave insight into the role of histone methylation in genome function.
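The cycle-by-cycle read-out lends itself to a very small sketch: in each cycle one reversible terminator is incorporated per cluster, the four fluorescence channels are imaged, and the call for that cycle is simply the channel with the strongest signal. The intensities below are invented for illustration, and real base callers also correct for channel cross-talk and phasing, which this sketch ignores.

CHANNELS = ("A", "C", "G", "T")

def call_read(per_cycle_intensities):
    """Call one base per cycle from four-channel intensities for one cluster."""
    read = []
    for cycle in per_cycle_intensities:          # one dict of intensities per cycle
        read.append(max(CHANNELS, key=lambda base: cycle[base]))
    return "".join(read)

if __name__ == "__main__":
    cluster = [
        {"A": 0.1, "C": 0.9, "G": 0.2, "T": 0.1},   # cycle 1 -> C
        {"A": 0.8, "C": 0.1, "G": 0.1, "T": 0.2},   # cycle 2 -> A
        {"A": 0.1, "C": 0.2, "G": 0.1, "T": 0.9},   # cycle 3 -> T
    ]
    print(call_read(cluster))                        # prints "CAT"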
Amplification of DNA by the Polymerase Chain Reaction (PCR)

Kary Mullis proposed the polymerase chain reaction (PCR) in 1985, and it has become an essential tool for DNA sequencing [50][51]. PCR is used to amplify a gene sequence and is capable of producing a selective enrichment of a DNA sequence by a factor of 10^6 [51]. PCR amplification involves two oligonucleotide primers that flank the DNA segment to be amplified, and repeated cycles of denaturation [51] at 95 °C [10], annealing of the primers to their complementary sequences, and extension of the annealed primers with DNA polymerase (Taq) [51]. The primers hybridise to opposite strands of the target sequence and are oriented so that DNA synthesis by the polymerase proceeds across the region between them, effectively doubling the amount of DNA. Moreover, since the extension products are themselves complementary to, and capable of binding, the primers, each successive cycle essentially doubles the amount of DNA synthesised in the previous cycle. This results in the exponential accumulation of the specific target fragment, by a factor of approximately 2^n, where n is the number of cycles; 20 cycles of perfect doubling, for example, already correspond to roughly 10^6 copies. Before Taq DNA polymerase was adopted, the Klenow fragment of E. coli DNA polymerase was used, but its major drawback was thermal instability: it could not withstand the 95 °C temperature required to denature the DNA segment. Figure 20 shows a schematic representation of how PCR amplifies DNA.
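The exponential accumulation is easy to check numerically. The sketch below computes copies of the target fragment after n cycles as initial x (1 + efficiency)^n, which reduces to 2^n for perfect doubling; the 90% efficiency value is only an illustrative assumption, included to show why real yields fall somewhat short of 2^n.

def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Copies of the target fragment after a given number of PCR cycles."""
    return initial_copies * (1.0 + efficiency) ** cycles

if __name__ == "__main__":
    for n in (10, 20, 30):
        ideal = pcr_copies(1, n)                   # perfect doubling, 2**n
        typical = pcr_copies(1, n, efficiency=0.9) # assumed 90% per-cycle efficiency
        print(f"{n} cycles: ideal {ideal:.0f} copies, at 90% efficiency {typical:.0f}")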
Applications of DNA Sequencing

Human Genome Project

DNA sequencing led to one of the world's leading projects, the sequencing of the human genome, which was led by James Watson [45], who with Francis Crick had proposed the double-helical structure of DNA in 1953 [46]. The purpose of the human genome project was not only to explain the functions of a human at a chemical level but also to help us understand the genetic factors in a multitude of diseases such as cancer, Alzheimer's and schizophrenia [45]. The first draft of the human genome sequence was published in 2001 [47][48]. Completion of the human genome sequence allowed scientists to analyse each chromosome. One of the chromosomes that has been analysed is chromosome 8 [49], which is 145,556,489 bases long; the majority of the genes found on it are related to development or signalling in the nervous system. One gene identified on chromosome 8 is CSMD1, which is associated with the nervous system and widely expressed in brain tissue [49]. Chromosome-by-chromosome analysis of the human genome can shed light on the theory of evolution and lead on to many more exciting discoveries.

DNA Fingerprinting

One of the major applications of DNA sequencing in the real world is DNA fingerprinting, which was proposed by Alec J. Jeffreys in 1985 [52][53] and has been used in crime scene investigation and in establishing biological relationships [53]. DNA fingerprinting grew out of the discovery of minisatellite DNA. Minisatellites are variable regions consisting of short tandem repeats (STRs) of a sequence, arising presumably from mitotic or meiotic unequal exchanges during replication [52]. To identify the minisatellites, a probe has to be made [52]. The 33-bp myoglobin probe is used because it is capable of detecting other human minisatellites. It is prepared from the human myoglobin minisatellite by purification of a single 33-bp repeat element, followed by head-to-head ligation and cloning of the resulting recombinant, pAV33.7; digestion with BamHI plus EcoRI releases a 767-bp DNA insert comprised almost entirely of 23 repeats of the 33-bp sequence [52]. The myoglobin probe hybridises to these STRs, which are then amplified by PCR.

The resulting DNA fingerprints are resolved by gel electrophoresis: the multiple hypervariable DNA fragments produced on the autoradiograph show somatic and germline stability and are specific to an individual [53]. The positions of the bands vary with the sizes of the fragments. An example of this technique is its use in forensic biology for crime scene investigation: DNA of high relative molecular mass (Mr) isolated from a 4-year-old blood stain or a semen stain made on a cotton cloth can be digested and amplified via PCR to produce DNA fingerprints suitable for individual identification [53]. Bands in a fingerprint whose fragment sizes match those from the scene of the crime can identify the culprit. DNA fingerprinting has revolutionised forensic biology with regard to the identification of culprits, and thousands of court cases have been decided on the outcome of a DNA fingerprint [10]. Figure 21 shows an example of a DNA fingerprint applied to a crime scene. Apart from forensics, DNA fingerprinting is widely used in the diagnosis of genetic disorders such as cystic fibrosis [54] and Huntington's disease [55], which can be detected in unborn babies as well as newborns [10]; detecting these diseases on a fingerprint enables early treatment of an affected child.

DNA Array (Affymetrix)

As mentioned before, Sequencing by Hybridisation has been applied to the DNA array [26]. The DNA array is a powerful tool for high-throughput identification and quantification of nucleic acids [29]. DNA arrays are often referred to as DNA microarrays, as they vastly increase the number of genes that can be studied in an experiment. There are many DNA array formats, including microarrays, macroarrays, oligonucleotide arrays and microelectronic arrays. The format discussed here is the oligonucleotide array, as it involves the application of SBH and is currently patented by Affymetrix [30]. Affymetrix developed the first successful technique for oligonucleotide synthesis on a chip, known as the Affymetrix GeneChip HIV 440 assay and eventually shortened to the Affymetrix assay [56][57]. This array format uses photolithography, a technique for manufacturing high-density oligonucleotide probe arrays by the parallel synthesis of a large number of DNA sequences. These probes are capable of acquiring mass genetic information from biological samples, for example in the identification of genetic diseases [56][57]. Figure 22 shows the light-directed synthesis of an oligonucleotide (photolithography) [28]. A surface bearing photoprotected hydroxyl groups (X-O) is illuminated through a photolithographic mask (M1), generating free hydroxyl groups (O-H) in the photo-deprotected regions. The hydroxyl groups are then coupled to a 5'-photoprotected deoxynucleoside phosphoramidite. A new mask pattern is applied, and a second photoprotected phosphoramidite is coupled. The process is repeated until the desired set of products is obtained. The oligonucleotide probes are synthesised in situ on a glass support [28] (shown in Figure 24), which is prepared by cleaning the glass in concentrated NaOH followed by thorough washing in water. The surfaces are then coated with 10% (v/v) bis(2-hydroxyethyl)aminopropyltriethoxysilane, a silane coupling agent with a hydroxyl functional group that serves as the synthesis site [28][29].
With the silane coupling agent in place, the synthesis linker is attached by reacting the derivatised substrate with 4,4'-dimethoxytrityl (DMT)-hexaethyloxy-O-cyanoethyl phosphate. The photolabile protecting group (X) is α-methyl-2-nitropiperonyloxycarbonyl (MeNPOC). The MeNPOC group is activated when regions of the surface are exposed to illumination (hv), allowing the addition of nucleoside phosphoramidite monomers: the phosphoramidite group reacts with the hydroxyl group on the substrate provided by the silane coupling agent. Under hv irradiation, the photolabile MeNPOC group is removed to regenerate a free hydroxyl group. The next MeNPOC-protected nucleotide is then added and coupled to the free hydroxyl group of the grafted molecule; the MeNPOC group protecting the 5' end of the added nucleotide is in turn removed by hv, and this procedure is repeated as many times as needed to achieve the required length of the oligonucleotide chain (usually 25-mer or less).
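The combinatorial logic of this light-directed synthesis, with different masks exposing different subsets of sites at each coupling step so that many probe sequences grow in parallel, can be sketched as follows. The masks and bases here are invented purely for illustration and do not correspond to any actual Affymetrix design.

def synthesise(n_sites, rounds):
    """rounds is a list of (mask, base); mask lists the site indices exposed."""
    probes = [""] * n_sites
    for mask, base in rounds:
        for site in mask:               # illumination removes MeNPOC only at these sites
            probes[site] += base        # coupling then adds the base at the exposed sites
    return probes

if __name__ == "__main__":
    rounds = [
        ([0, 1], "A"),    # mask M1 exposes sites 0 and 1, couple A
        ([1, 2], "C"),    # mask M2 exposes sites 1 and 2, couple C
        ([0, 2], "G"),    # mask M3 exposes sites 0 and 2, couple G
        ([0, 1, 2], "T"), # mask M4 exposes all sites, couple T
    ]
    print(synthesise(3, rounds))        # ['AGT', 'ACT', 'CGT']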
Future Aspects

More projects involving DNA sequencing lie ahead. The 1000 Genomes Project aims to sequence the genomes of 1,000 volunteers from around the globe, as a follow-up to the Human Genome Project [58]. The expected cost of this project is around $30 to $50 million, with the aim of uncovering more of the genetic factors underlying human health and disease. Institutions worldwide will participate, including the Wellcome Trust Sanger Institute and the US National Human Genome Research Institute (NHGRI). With genome sequences from 1,000 volunteers, genetic studies can be carried out on common diseases, which should lead to the identification of causal variants underlying them. Another project under way is the Cancer Genome Project [59], a 10-year project in which tumour samples are gathered from thousands of patients and analysed to find mutated regions; these regions are then resequenced in order to identify the specific mutations. DNA sequencing will find further uses in medicine: most recently it has provided a non-invasive diagnostic method for determining whether a fetus has a genetic disease, through the analysis of placentally expressed mRNA in maternal plasma [60][61], whereas the invasive test methods involve chorionic villus sampling (CVS) or amniocentesis. The development of DNA sequencing will continue, and affordable, far more efficient sequencing technologies will become available in the near future. The ultimate goal is the $1,000 human genome, in which sequencing a human genome would cost $1,000 [24]. At the moment no sequencing technology comes close to that figure: when the human genome was first sequenced it is estimated to have cost around $3 billion [24], and the Illumina method claims to be able to sequence a human genome for about $100,000 [24]. The hunt for an efficient sequencing method therefore continues. One future sequencing technology is nanopore sequencing, one of the single-molecule sequencing technologies; others include the HeliScope from Helicos Biosciences and Single-Molecule Real-Time sequencing-by-synthesis (SMRT) [24]. These technologies are currently in development or at the proof-of-concept stage [24].

Conclusion

Sanger's chain-termination method of sequencing DNA spurred many different technologies aimed at the same goal, beginning with the introduction of automated sequencers. In time the computational power improved, leading to whole-genome shotgun sequencing and the first bacterial genome to be sequenced. Pyrosequencing was then introduced; it detects the release of PPi and has been applied by 454 Life Sciences as the sequencing-by-synthesis technology known as the GS FLX. Another sequencing-by-synthesis method is Illumina sequencing, which is faster and cheaper than the 454 system. An indirect method is sequencing by hybridisation, which led to the DNA array currently patented by Affymetrix. The future of DNA sequencing lies with nanopore sequencing, an example of single-molecule sequencing that requires no amplification. DNA sequencing has been applied in the real world through applications such as DNA fingerprinting and, of course, the Human Genome Project, and it will have many more uses in the years to come.

References

[1] Brown T.A. (1998), Genetics, 3rd Edition, Chapman & Hall
[2] Sanger et al. (1977), DNA sequencing with chain-terminating inhibitors, Proc. Natl. Acad. Sci. USA, Vol 74, pp 5463-5467
[3] Pierce B.A. (2008), Genetics: A Conceptual Approach, 3rd Edition, W.H. Freeman and Co.
[4] McCartney et al. (1966), Purine nucleosides. XIV. Unsaturated furanosyl adenosine nucleosides prepared via base-catalysed elimination reactions of 2'-deoxyadenosine derivatives, J. Am. Chem. Soc., Vol 88, pp 1549-1553
[5] Geider et al. (1972), DNA synthesis in nucleoside-permeable Escherichia coli cells: the effects of nucleotide analogues on DNA synthesis, Eur. J. Biochem., Vol 27, pp 554-563
[6] Cooper N.G. (1994), The Human Genome Project: Deciphering the Blueprint of Heredity, University Science Books
[7] Maxam and Gilbert (1977), A new method for sequencing DNA, Proc. Natl. Acad. Sci. USA, Vol 74, pp 560-564
[8] Southern et al. (1975), Detection of specific sequences among DNA fragments separated by gel electrophoresis, J. Mol. Biol., Vol 98, pp 503-517
[9] Lehninger et al. (2004), Principles of Biochemistry, 4th Edition, W.H. Freeman and Co.
[10] McMurry (2003), Fundamentals of Organic Chemistry, 5th Edition, Brooks/Cole
[11] Hood et al. (1986), Fluorescence detection in automated DNA sequence analysis, Nature, Vol 321, pp 674-679
[12] Lee et al. (1992), DNA sequencing with dye-labelled terminators and T7 DNA polymerase: effect of dyes and dNTPs on incorporation of dye-terminators and probability analysis of termination fragments, Nucleic Acids Research, Vol 20, pp 2471-2483
[13] Prescott et al. (2005), Microbiology, 6th Edition, McGraw-Hill
[14] Bankier (1991), The DNA sequence of the human cytomegalovirus genome, DNA Seq., Vol 2, pp 1-12
[15] Goebel S. et al. (1990), The complete DNA sequence of vaccinia virus, Virology, Vol 179, iss 1, pp 247-266
[16] Oda K. et al. (1992), Gene organisation deduced from the complete sequence of liverwort Marchantia polymorpha mitochondrial DNA: a primitive form of plant mitochondrial genome, J. Mol. Biol., Vol 223, iss 1, pp 1-7
[17] Venter and Smith et al. (1995), Whole-genome random sequencing and assembly of Haemophilus influenzae Rd, Science, Vol 269, pp 496-512
[18] Waterman et al. (1988), Computer analysis of nucleic acid sequences, Methods Enzymol., Vol 164, pp 765-793
[19] Ronaghi et al. (1996), Real-time DNA sequencing using detection of pyrophosphate release, Analytical Biochemistry, Vol 242, pp 84-89
[20] Ronaghi and Elahi (2004), Pyrosequencing: a tool for DNA sequencing analysis, Methods in Molecular Biology, Vol 255, pp 211-219
[21] Nyren et al. (1993), Solid-phase DNA minisequencing by an enzymatic luminometric inorganic pyrophosphate detection assay, Anal. Biochem., Vol 208, pp 171-175
[22] Egholm et al. (2005), Genome sequencing in microfabricated high-density picolitre reactors, Nature, Vol 437, pp 376-380
[23] Quinn et al. (2008), Assessing the feasibility of GS FLX pyrosequencing for sequencing the Atlantic salmon genome, BMC Genomics, Vol 9
[24] Pettersson et al. (2009), Generations of sequencing technologies, Genomics, Vol 93
[25] Bentley et al. (2006), Whole-genome re-sequencing, Current Opinion in Genetics and Development, Vol 16, pp 545-552
[26] Drmanac et al. (2002), Sequencing by Hybridisation (SBH): advantages, achievements and opportunities, Advances in Biochemical Engineering/Biotechnology, Vol 77, pp 75-104
[27] Drmanac et al. (1989; reprinted from 1987), Sequencing of megabase plus DNA by hybridization: theory of the method, Genomics, Vol 4, pp 114-128
[28] Doty et al. (1960), Strand separation and specific recombination in deoxyribonucleic acids: physical chemical studies, Proc. Natl. Acad. Sci. USA, Vol 46, pp 461-476
[29] Zhang X. et al. (2007), Electrochemical Sensors, Biosensors and Their Biomedical Applications, 1st Edition, Academic Press
[30] Pease et al. (1994), Light-generated oligonucleotide arrays for rapid sequence analysis, Proc. Natl. Acad. Sci. USA, Vol 91, pp 5022-5026
[31] Branton and Deamer et al. (1996), Characterization of individual polynucleotide molecules using a membrane channel, Proc. Natl. Acad. Sci. USA, Vol 93, pp 13770-13773
[32] Rang et al. (2003), Pharmacology, 5th Edition, Elsevier
[33] Rhee et al. (2007), Nanopore sequencing technology: nanopore preparations, Trends in Biotechnology, Vol 25, pp 174-181
[34] Ashkenasy et al. (2005), Recognizing a single base in an individual DNA strand: a step toward DNA sequencing in nanopores, Angewandte Chemie Int. Ed., Vol 44, pp 1401-1404
[35] Li et al. (2001), Ion-beam sculpting at nanometre length scales, Nature, Vol 412, pp 166-169
[36] Saleh and Sohn (2003), An artificial nanopore for molecular sensing, Nano Lett., Vol 3, pp 37-38
[37] Siwy et al. (2002), Fabrication of a synthetic nanopore ion pump, Phys. Rev. Lett., Vol 89
[38] Storm et al. (2003), Fabrication of solid-state nanopores with single-nanometre precision, Nat. Mater., Vol 2, pp 537-541
[39] Ito et al. (2004), A carbon nanotube-based Coulter nanoparticle counter, Acc. Chem. Res., Vol 37, pp 937-945
[40] Bennett S. (2004), Solexa Ltd, Pharmacogenomics, Vol 5, pp 433-438
[41] (2006), Illumina, Inc. to purchase Solexa, Inc., Corporate Growth Report, Vol 206, p 3
[42] Zhang et al. (2008), Using quality scores and longer reads improves accuracy of Solexa read mapping, BMC Bioinformatics, Vol 9, art 128
[43] Dohm et al. (2008), Substantial biases in ultra-short read data sets from high-throughput DNA sequencing, Nucleic Acids Research, Vol 36, art 105
[44] Barski et al. (2007), High-resolution profiling of histone methylation in the human genome, Cell, Vol 129, pp 823-837
[45] Watson (1992), The Human Genome Project: past, present and future, Science, Vol 248, pp 44-48
[46] Watson and Crick (1953), Molecular structure of nucleic acids; a structure for deoxyribose nucleic acid, Nature, Vol 171, pp 737-738
[47] Venter et al. (2001), The sequence of the human genome, Science, Vol 291, pp 1304-1351
[48] Lander et al. (2001), Initial sequencing and analysis of the human genome, Nature, Vol 409, pp 860-921
[49] Nusbaum et al. (2006), DNA sequence and analysis of human chromosome 8, Nature, Vol 439, pp 331-335
[50] Mullis et al. (1988), Primer-directed enzymatic amplification of DNA with a thermostable DNA polymerase, Science, Vol 239, pp 487-491
[51] Mullis et al. (1985), Enzymatic amplification of β-globin genomic sequences and restriction site analysis for diagnosis of sickle cell anemia, Science, Vol 230, pp 1350-1354
[52] Jeffreys (1985), Hypervariable minisatellite regions in human DNA, Nature, Vol 214, pp 67-75
[53] Jeffreys (1985), Forensic application of DNA fingerprints, Nature, Vol 318, iss 6064, pp 577-579
[54] Grothues et al. (1988), Genome fingerprinting of Pseudomonas aeruginosa indicates colonization of cystic fibrosis siblings with closely related strains, J. Clin. Microbiol., Vol 26, pp 1973-1977
[55] Pritchard et al. (1992), Recombination of 4p16 DNA markers in an unusual family with Huntington disease, Am. J. Hum. Genet., Vol 50, pp 1218-1230
[56] Vahey et al. (1999), Performance of the Affymetrix GeneChip HIV PRT 440 platform for antiretroviral drug resistance genotyping of human immunodeficiency virus type 1 clades and viral isolates with length polymorphisms, J. Clin. Microbiol., Vol 37, pp 2533-2537
[57] Lausted et al. (2004), POSaM: a fast, flexible, open-source, inkjet oligonucleotide synthesizer and microarray, Genome Biology, Vol 4
[58] Siva N. (2008), 1000 Genomes Project, Nature Biotechnology, Vol 26, p 256
[59] Kaiser J. (2005), NCI gears up for cancer genome project, Science, Vol 307, p 1182
[60] Biever C. (2008), Promising signs for Down's blood test, New Scientist, No 2677, p 10
[61] Lo D. et al. (2007), Plasma placental RNA allelic ratio permits noninvasive prenatal chromosomal aneuploidy detection, Nature Medicine, Vol 13, pp 218-223
Friday, May 22, 2020
Sexual Assault On Public Texas University Property
For our research proposal, we have chosen to analyze a topic that hits fairly close to home for the three of us. This paper will look at the social condition of sexual assault concerning college-aged females on public Texas university property. The paper will mimic the style of a grant, as if we were an organization looking to obtain funds to implement a program that would help decrease this problem in our community. The problem of sexual assault is a very broad topic; under it falls everything from forced sexual intercourse to child molestation to fondling. In order to make our research more accurate, we have chosen to narrow the definition of sexual assault down to forced intercourse only, or in other terms, rape. We have found a… Lastly, to define our arena, public Texas university property means that we are only looking at sexual assault committed on property owned by universities that receive public funds from the state of Texas. For example, if an assault was committed in a fraternity house that is considered off campus, the crime will not be evaluated in our numbers. Every part of our research statement has now been defined, and the purpose of defining each aspect ourselves was to avoid being over- or under-inclusive. The reason we have narrowed the definition of sexual assault to forced intercourse is that, among crimes that often go unreported, rape is the form that females most commonly report to authorities. We chose seventeen as our minimum age because it is the legal age of consent in Texas, and any female older than seventeen is also at risk of being raped on a campus. It may seem over-inclusive to include every age over seventeen, but all sexual assault crimes, regardless of age, are reported under the Clery Act. Studying only females is important because they are the population most likely to be victims of sexual assault. Finally, public campuses are the only universities we are using because their data are more readily available and representative of actual crime rates. Unfortunately, the universal connotation of sexual assault is negative. Females often don't want to be associated with sexual assault because they might be considered weak, shameful, or