For experts in the integral development of societies, the lack of competitiveness, both of individuals and of the productive sectors of developing countries, stems from the absence of a research culture and from disdain for research activity, which in turn are explained by the weakness of the educational system in this field and by its attitude toward its fundamental mission as a driving force of social development. Given the importance that the appropriation and generation of knowledge have in today's society, every society, and in particular every person, is compelled to learn the basic principles of the scientific method, and to reflect and act consistently, if they wish to play a leading role and be the architect of their own life project. In view of the above, developing attitudes and skills for scientific research is an unavoidable necessity that must be the object of reflection and action for governments, business leaders, and each individual, but above all for the academic community, whose mission is to contribute to the progress and well-being of society.
A lack of technique at the outset of an investigation is one of the most frequent causes of its failure and, consequently, a source of impunity. There are always traces or marks whose discovery, collection, analysis, and interpretation demand specific knowledge; precisely herein lie the usefulness and value of the methods of criminal investigation and forensic evidence set out in this Manual de criminalística by Carlos Alberto Guzmán, a book that fills a gap in Argentine criminalistics at a time marked by the rise of highly complex communications, information technology, and diagnostic imaging. It amply satisfies expectations in the field: for students at the tertiary level, sparing them irrelevant lecture notes or foreign publications ill-suited to our systems; for security institutions; for those charged with the administration of justice, in their various specialties and ranks; for lawyers in general, and criminal lawyers in particular; for graduates in industrial hygiene; for insurance companies, as an indispensable reference for office and library; and for expert witnesses in general, especially the technical corps of the various security services.
About 20 years ago, the pharmaceutical industry started to consider mathematical model-based drug development as a means to streamline the execution of drug development programs. This constituted a new discipline, pharmacometrics, grounded in pharmacology and statistics. We have written this book to describe our learning and experiences in the emergent field of pharmacometrics applied to drug discovery and clinical development. We did not aim to compete with the many excellent contributions in this field, both theoretical and practical, but rather wanted to highlight specific areas that we have repeatedly encountered. This book may also give answers to those who wonder to what kind of problems mathematics is actually applied in the modeling and simulation department of a large pharmaceutical company. Parts of this book require some familiarity with mathematical notation and elementary calculus, including the concept of a differential equation. Besides providing some concepts behind drug discovery and development, we have added many exercises (with our solutions) designed to be solved symbolically or programmatically. As a programming language, we chose MATLAB (Version 2012b), as we found it best suited to the diversity of problems we faced. Our final message is that pharmacometrics could have an even larger impact on drug discovery and development if it were consistently applied to diseases and their potential treatment targets, thus merging with another quantitative discipline, systems biology, to form quantitative systems pharmacology. The book is organized into nine chapters. Background of Pharmacologic Modeling (Chap. 1) introduces two topics which underpin later chapters: the emergence, role, and tasks of the pharmaceutical industry as a healthcare provider; and the philosophy of modeling and simulation. Regarding modeling, we start with a First Example of a Computational Model (Chap. 2) from oncology to introduce physiologic, pharmacologic, and computational concepts that are explained and detailed in later chapters. Differential Equations in MATLAB (Chap. 3) provides the numerical and symbolic treatment of ordinary differential equations with time and state event scheduling. Pharmacologic Modeling (Chap. 4) is about dynamic concepts in relationship to drugs. This entails the modeling of drug concentrations and related body responses over time. Disease Modeling (Chap. 5) adds another component. As drugs work on a diseased human body, a model-based understanding of how the body functions under the disease will be of value to learn how
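As an illustration of the kind of pharmacologic model described above (drug concentrations evolving over time as a system of ordinary differential equations), here is a minimal sketch in Python rather than the book's MATLAB; the one-compartment structure, dose, and parameter values are illustrative assumptions, not taken from the book:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical one-compartment PK model with first-order absorption:
#   dA_gut/dt  = -ka * A_gut
#   dA_body/dt =  ka * A_gut - ke * A_body
# Parameter values are illustrative, not from the book.
ka, ke, V = 1.0, 0.2, 10.0    # absorption rate (1/h), elimination rate (1/h), volume (L)
dose = 100.0                   # mg, oral bolus placed in the gut compartment

def rhs(t, y):
    a_gut, a_body = y
    return [-ka * a_gut, ka * a_gut - ke * a_body]

sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], dense_output=True)
t = np.linspace(0.0, 24.0, 25)
conc = sol.sol(t)[1] / V       # plasma concentration (mg/L) over 24 h
```

The resulting concentration-time curve rises during absorption and decays with first-order elimination, the basic shape that pharmacometric models build on.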
Disadvantages of fossil fuel-derived transportation fuels (greenhouse gas emissions, pollution, resource depletion, unbalanced supply-demand relations) are strongly reduced or even absent with biotransportation fuels. Of all biofuels, ethanol is already produced on a fair scale. It produces slightly less greenhouse gas emissions than fossil fuel (carbon dioxide is recycled from the atmosphere to produce biomass), can replace harmful fuel additives (e.g., methyl tertiary butyl ether), and creates jobs for farmers and refinery workers. It is easily applicable in present-day internal combustion engine vehicles (ICEVs), as mixing with gasoline is possible. Ethanol is already commonly used in a 10% ethanol/90% gasoline blend. Adapted ICEVs can use a blend of 85% ethanol/15% gasoline (E85) or even 95% ethanol (E95). Ethanol addition raises the octane rating and reduces the carbon monoxide, volatile organic carbon, and particulate emissions of gasoline. And, via on-board reforming to hydrogen, ethanol is also suitable for use in future fuel cell vehicles (FCVs), which are expected to have about double the current ICEV fuel efficiency. Ethanol production and use have spread to every corner of the globe. As concerns over petroleum supplies and global warming continue to grow, more nations are looking to ethanol and renewable fuels as a way to counter oil dependency and environmental impacts. World production reached an all-time high of nearly 23 billion gallons in 2010 and is expected to exceed the 120,000 million mark by the end of 2020. While the US became the world's largest producer of fuel ethanol in 2010, Brazil remains a close second, and China, India, Thailand, and other nations are rapidly expanding their own domestic ethanol industries. Increased production and use of ethanol have also led to a growing international trade in the renewable fuel.
While the vast majority of ethanol is consumed in the country in which it is produced, some nations are finding it more profitable to export ethanol to countries like the US and Japan. High spot market prices for ethanol and the rapid elimination of MTBE by gasoline refiners led to record imports into the US in the last few years. More than 500 million gallons of ethanol entered through American ports, paid the necessary duties, and competed effectively in the marketplace. The increased trade of ethanol around the world is helping to open up new markets for all sources of ethanol. The sustainable production of bioethanol requires well-planned and reasoned development programs to assure that the many environmental, social, and economic concerns related to its use are addressed adequately. The key to making ethanol competitive as an alternative fuel is the ability to produce it from low-cost biomass. Many countries around the world are working extensively to develop new technologies for ethanol production from biomass, of which the conversion of lignocellulosic materials seems the most promising. This e-book provides an updated and detailed overview of Advances in Bioethanol. It looks at the historical perspectives, chemistry, sources, and production of ethanol and discusses biotechnology breakthroughs and promising developments, its uses, advantages, problems, environmental effects, and characteristics. In addition, it presents information about ethanol in different parts of the world and also highlights the challenges and future of ethanol.
This dissertation lays out detailed descriptions of heterogeneous chemistry, electrochemistry, and porous media transport models to simulate solid oxide fuel cells (SOFCs). An elementary-like heterogeneous reaction mechanism for the steam reforming of CH4, developed in our research group, is used throughout this work. Based on the assumptions that hydrogen oxidation is the only electrochemical reaction and that a single-step electron transfer reaction is rate limiting, a modified Butler-Volmer equation is used to model the electrochemistry. The pertinence of various porous media transport models, such as the Modified Fick Model (MFM), Dusty Gas Model (DGM), Mean Transport Pore Model (MTPM), Modified Maxwell-Stefan Model (MMS), and Generalized Maxwell-Stefan Model (GMS), under reaction conditions is studied. All model predictions are compared with experimental observations. In general, the MFM and DGM predictions are in good agreement with the experimental data. Physically realistic electrochemical model parameters are very important for fuel cell modeling. Button cell simulations are carried out to deduce the electrochemical model parameters, and those parameters are further used in the modeling of planar cells. Button cell simulations are carried out using the commercial CFD code FLUENT [1] coupled with DETCHEM [2]. For all temperature ranges, the model works well in predicting the experimental observations in the high-current-density region. However, the model predicts much higher open-circuit potentials than those observed in the experiments, mainly due to the absence of a coking model in the elementary heterogeneous mechanism, leading to nonequilibrium compositions. Furthermore, the study presented here employs the Nernst equation for the calculation of the reversible potential, which is strictly valid only at electrochemical equilibrium. It is assumed that the electrochemical charge transfer reaction involving H2 is fast enough to be in equilibrium. However, comparison of the model predictions with thermodynamic equilibrium reveals that this assumption is violated at very low current densities.
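The electrochemical relations named above can be sketched numerically. The following minimal Python example evaluates a standard Butler-Volmer expression and a Nernst reversible potential for H2 oxidation; the exchange current density, transfer coefficients, standard potential, temperature, and partial pressures are illustrative assumptions, not the dissertation's fitted parameters:

```python
import numpy as np

F, R = 96485.0, 8.314          # Faraday constant (C/mol), gas constant (J/(mol K))
T = 1073.0                     # K, an assumed SOFC operating temperature

def butler_volmer(eta, i0=2000.0, alpha_a=0.5, alpha_c=0.5):
    """Current density (A/m^2) for an activation overpotential eta (V).
    i0 and the transfer coefficients are illustrative placeholders."""
    return i0 * (np.exp(alpha_a * F * eta / (R * T))
                 - np.exp(-alpha_c * F * eta / (R * T)))

def nernst_h2(p_h2, p_o2, p_h2o, E0=0.98):
    """Reversible potential (V) for H2 + 1/2 O2 -> H2O in bar;
    E0 at temperature T is an assumed value."""
    return E0 + (R * T / (2 * F)) * np.log(p_h2 * np.sqrt(p_o2) / p_h2o)

i = butler_volmer(0.1)                 # current density at 100 mV overpotential
ocv = nernst_h2(0.97, 0.21, 0.03)      # open-circuit potential for assumed gas mix
```

At zero overpotential the Butler-Volmer current vanishes, and the Nernst potential gives the equilibrium open-circuit value that, as noted above, the full model tends to overpredict relative to experiment.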
This book focuses on microalgae rather than seaweeds, as microalgae are the most attractive option for renewable energy production, especially the production of biodiesel, although seaweed biomass can also be used. The aim of this book is to review in detail the most important aspects of the microalgae-to-bioenergy process, with an emphasis on microalgae as sources of lipids for the production of biodiesel and as potential sources of hydrogen. The book is meant as a guide and resource both for experienced practitioners in the field and for those newer to this exciting field of research. However, no single book can cover all aspects of the production of bioenergy from algae; for example, we do not cover the fermentation of algal biomass to produce methane, nor the fermentation of algal sugars to ethanol or butanol. This book begins (Chap. 1) with an introduction to the history and developments over the last 80 years or so in the area of large-scale and commercial-scale culture of microalgae and the extensive literature that is available. Much can be learned from the extensive research that has been carried out, and by knowing this history (some of which is not easily accessible) we can avoid repeating past mistakes.
Nanobiotechnology is a multidisciplinary field that covers a vast and diverse array of technologies from engineering, physics, chemistry, and biology. It is expected to have a dramatic infrastructural impact on both nanotechnology and biotechnology. Its applications could potentially be quite diverse, from building faster computers to finding cancerous tumors that are still invisible to the human eye. As nanotechnology moves forward, the development of a ‘nano-toolbox’ appears to be an inevitable outcome. This toolbox will provide new technologies and instruments that will enable molecular manipulation and fabrication via both ‘top-down’ and ‘bottom-up’ approaches. This book is organized into five major sections: 1. Introduction, 2. Biotemplating, 3. Bionanoelectronics and Nanocomputing, 4. Nanomedicine, Nanopharmaceuticals and Nanosensing, and 5. De Novo-Designed Structures. Section 1 is an introductory overview of nanobiotechnology, which briefly describes the many aspects of this field while directing the reader to relevant sources for broader information. Biological materials can serve as nanotemplates for ‘bottom-up’ fabrication. In fact, this is considered one of the most promising ‘bottom-up’ approaches, mainly due to the nearly infinite variety of templates available. This approach is demonstrated in Section 2.
The last twenty years of the last millennium were characterized by the comprehensive automation of industrial plants: a shift toward automated factories, robots, and self-adaptive optimization systems. These processes can be intensified by introducing mathematical methods into all physical and chemical processes. With a mathematical model of a process, it is possible to control it, maintain it at an optimal level, provide maximal yield of the product, and obtain the product at minimal cost. Statistical methods in the mathematical modeling of a process should not be opposed to traditional theoretical methods of complete theoretical study of a phenomenon. The higher the theoretical level of knowledge, the more efficient the application of statistical methods such as design of experiments (DOE). To design an experiment means to choose the optimal experimental design, in which all the analyzed factors are varied simultaneously. By designing an experiment one obtains more precise data and more complete information on the studied phenomenon with a minimal number of experiments and the lowest possible material costs. The development of statistical methods for data analysis, combined with the development of computers, has revolutionized research and development work in all domains of human activity. Because statistical methods are abstract and insufficiently known to all researchers, the first chapter offers the basics of statistical analysis with actual examples, physical interpretations, and solutions to problems. Basic probability distributions, with statistical estimation and testing of null hypotheses, are demonstrated. A detailed analysis of variance (ANOVA) is presented for the screening of factors according to the significance of their effects on system responses.
Considerable attention has been dedicated to regression analysis for the statistical modeling of significant factors by linear and nonlinear regression. The introduction to design of experiments (DOE) offers an original comparison between so-called classical experimental design (one factor at a time, OFAT) and statistically designed experiments (DOE). Depending on the research objective and subject, screening experiments (preliminary ranking of the factors, the method of random balance, completely randomized block designs, Latin squares, Graeco-Latin squares, Youden squares), then basic experiments (full factorial experiments, fractional factorial experiments), and designs of second order (rotatable, D-optimal, orthogonal, B-designs, Hartley designs) are analyzed. For studies aimed at reaching optima, of particular importance are the chapters dealing with the experimental attainment of an optimum by the gradient method of steepest ascent and the nongradient simplex method. The optimum zone of the response surface, i.e., the response function, can then be reached by applying second-order designs. By elaborating the results of a second-order design one obtains quadratic regression models, the analysis of which is shown in the chapter on canonical analysis of the response surface. The third section of the book is dedicated to studies in the field of mixture design. The same methodological approach is kept in this field too: one begins with screening experiments (simplex lattice screening designs, extreme vertices designs of mixture experiments as screening designs), continues through simplex lattice designs, Scheffé's simplex lattice design, simplex centroid designs, extreme vertices designs, D-optimal designs, the Draper-Lawrence design, and full factorial mixture designs, and ends with factorial designs of process factors combined with mixture designs, the so-called "crossed" designs. The significance of mixture design for developing new materials should be particularly stressed.
The book is meant for all experts who are engaged in research, development and process control.
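A two-level full factorial experiment, one of the basic designs discussed above, simply enumerates every combination of coded factor levels, so k factors require 2^k runs. A minimal Python sketch (the factor names are hypothetical, not from the book):

```python
from itertools import product

def full_factorial(factors):
    """Return all runs of a 2-level full factorial design,
    with each factor in coded units (-1 = low, +1 = high)."""
    levels = [(-1, 1)] * len(factors)
    return [dict(zip(factors, run)) for run in product(*levels)]

# Three hypothetical process factors give 2^3 = 8 runs.
design = full_factorial(["temperature", "pressure", "catalyst"])
```

Unlike OFAT, every run varies all factors simultaneously, which is what lets the subsequent ANOVA and regression steps estimate interaction effects from the same 2^k observations.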
THE DEVELOPMENT OF SEQUENCE ANALYSIS METHODS has depended on the contributions of many individuals from varied scientific backgrounds. This chapter provides a brief historical account of the more significant advances that have taken place, as well as an overview of the chapters of this book. Because many contributors cannot be mentioned due to space constraints, additional references to earlier and current reference books, articles, reviews, and journals provide a broader view of the field and are included in the reference lists to this chapter.
There are some excellent general reference works in toxicology, including Casarett and Doull's Toxicology, 6th edition, edited by Klaassen; the 13-volume Comprehensive Toxicology, edited by Sipes, Gandolfi, and McQueen; as well as many specialized monographs on particular topics. However, the scarcity of textbooks designed for teacher and student to use in the classroom setting, which impelled us to produce the first and second editions of this work, is still apparent. With the retirement of Dr. Levi, a mainstay of the first two editions, and the continuing expansion of the subject matter, it seemed appropriate to invite others to contribute their expertise to the third edition. All of the authors are, or have been, involved in teaching a course in general toxicology at North Carolina State University and thus have insights into the actual teaching process as well as the subject matter of their areas of specialization. At North Carolina State University, we continue to teach a course in general toxicology that is open to graduate students and undergraduate upperclassmen. In addition, in collaboration with Toxicology Communications, Inc., of Raleigh, North Carolina, we present an accelerated short course at the same level. Our experience leads us to believe that this text is suitable, in the junior or senior year, for undergraduate students with some background in chemistry, biochemistry, and animal physiology. For graduate students it is intended to lay the foundation for subsequent specialized courses in toxicology, such as those in biochemical and molecular toxicology, environmental toxicology, chemical carcinogenesis, and risk assessment. We share the view that an introductory text must present all of the necessary fundamental information to fulfill this purpose, but in as uncomplicated a manner as possible. To enhance readability, references have been omitted from the text, although further reading is recommended at the end of each chapter.
Clearly, the amount of material, and the detail with which some of it is presented, is more than is needed for the average general toxicology course. This, however, will permit each instructor to select and emphasize those areas that they feel deserve particular attention. The obvious biochemical bias of some chapters is not accidental; rather, it is based on the philosophy that progress in toxicology continues to depend on further understanding of the fundamental basis of toxic action at the cellular and molecular levels. The depth of coverage of each topic represents that chapter author's judgment of the amount of material appropriate to the beginning level as compared to that appropriate to a more advanced course. Thanks to all of the authors and to the students and faculty of the Department of Environmental and Molecular Toxicology at North Carolina State University, and to Carolyn McNeill for much word processing. Particular thanks to Bob Esposito of John Wiley and Sons, not least for his patience with missed deadlines and subsequent excuses.
A Thermodynamic Approach to Economics
This book, first published in 2009, stems from research that I began more than three decades ago when I was working as group economist for the Babcock International Group. Prior to that, my formal university education had included degrees in engineering and management science, the latter in particular covering economics and operations research. What started out as a train of curiosity into parallels between the disciplines of economics and thermodynamics soon developed into something deeper. Following publication of two peer-reviewed papers of mine on the subject in the journal Energy Economics, I was greatly encouraged in my research by other trans-disciplinary researchers with a similar interest, in particular Dr László Kapolyi, who was then Minister for Industry of the Hungarian government, a member of the Hungarian Academy of Science and a member of the Club of Rome. Not being based at a university and with no research grant at my disposal, my main thrust at that time had been to make a career as director of a consultancy and expert witness business, and therefore, until more recently, opportunities to spend time on research had been few. Nevertheless, by the turn of the millennium I was able to find time alongside my consultancy to return to some research, and in 2007 published another peer-reviewed paper in the International Journal of Exergy entitled ‘A Thermodynamic Theory of Economics’, which was followed up with several working papers on monetary aspects and energy models. Interest in this work has been high, spurred on no doubt by general worldwide interest in energy and climate change. This book, now in its third edition, is an attempt to bring together all the facets of the research into a coherent whole.
Topics covered include the gas laws, the distribution of income, the 1st and 2nd Laws of Thermodynamics applied to economics, economic processes and elasticity, entropy and utility, production and consumption processes, reaction kinetics, empirical monetary analysis of the UK and USA economies, interest rates, discounted cash flow, bond yield and spread, unemployment, principles of entropy maximisation and economic development, the cycle, empirical analysis of the relationship between world energy resources, climate change and economic output, and, finally, aspects of sustainability. Further developments have been added since the first and second editions, in particular thoughts on production and entropy maximisation, order and disorder and relationships to the living world, which has necessitated reorganisation of some of the chapters. The chapter on money has been updated to incorporate empirical analyses of the recent upheavals in world economic activity from 2008 to 2011, though the conclusions reached have not changed; indeed, they have been reinforced. The findings, interpretations and conclusions of this book are entirely my own, based on the research that I have conducted. While I have made every effort to be diligent and accurate, readers should satisfy themselves as to the logic and veracity of the conclusions drawn. I hope that this third edition represents an improvement and advancement on earlier editions, but I would nevertheless welcome any feedback, discussions and corrections on points that readers may have.
Chapters 1 and 2 are long and discursive, teasing out ambiguities and subtleties in thesis writing in order to demystify the thesis writing process, while Chapter 8 is much more compact. It lists steps in a concentrated writing process and has checklists and tasks instead of definitions and explanations. It is also more directive in style. The Introduction, ‘How to write 1000 words an hour’, sets out the theory, practice and assumptions that underpin the approaches to writing proposed in this book. Chapter 1 helps you think your way into the thesis writing role. Chapter 2 has strategies to start writing right away: writing before you ‘have something to say’, using freewriting and generative writing. Chapter 3 is about bringing structure to your writing. A thesis has conventions you can use to shape and progress your thinking and writing. Chapter 4 marks the first major milestone in writing a thesis: the end of the first phase. Reporting on your work and gauging your progress is the priority at this stage. Chapter 5 has strategies for regular, incremental writing, for getting into the writing habit. A writers’ group is one example. Chapter 6 marks the halfway point in the writing of your thesis: time to move on to drafting chapters. ‘Fear and loathing’ was suggested as the title of Chapter 7 by a student who had recently completed his thesis, because those words convey the frustration of constant refinements to text. Selected strategies for revising are provided here. Chapter 8 is either the introduction to the last phase or the condensed version of the whole process, depending on your progress with your thesis. This chapter shows how to pack all the writing into one full-time year or two part-time years. Chapter 9 covers ways of making your thesis ‘good enough’ – knowing it can still be improved – and defining what that means in terms of your thesis.
Chapter 10 covers ways of talking about your writing convincingly – during the viva, the examination of your thesis, with suggestions for managing final revisions and publishing from your thesis. These chapters are arranged to guide you through the thesis writing process, from start to finish, but you can use the techniques described at different phases of thesis writing. Use the contents page initially to get an overview of the whole process and then strategically to locate writing problems or challenges that you face at any given time.
Biomass Gasification, Pyrolysis, and Torrefaction
The art of energy conversion of biomass is as old as our natural habitat. Such processes have been at work since the early days of vegetation on this planet. Flames leaping from a forest fire are an example of “flaming pyrolysis.” A trace of blue flame in a swamp is an example of methane gas formation through the decomposition of biomass and its subsequent combustion in contact with air. Burning vegetation on the ground to increase soil fertility is an example of biochar production. Human beings, however, learned to harness these processes much later. The use of biomass for energy, though nearly as ancient as human civilization, did not grow at the same pace as industrialization because of the abundant supply and low prices of oil and natural gas. Only in the recent past has there been an upsurge of interest in biomass energy conversion, fueled by several factors:
- interest in the reduction of greenhouse gas emissions resulting from energy production;
- a push for independence from the less reliable supply and fluctuating prices of oil and gas;
- interest in renewable and locally available energy sources;
- the rise in the price of oil and natural gas.
Several excellent books on coal gasification are available, but only a limited few cover biomass gasification and pyrolysis, and none cover torrefaction. A large body of peer-reviewed literature on biomass gasification, pyrolysis, and torrefaction is available, and some recent books on energy also include brief discussions of these topics. For example, the previous edition of this book (Biomass Gasification and Pyrolysis), along with its Chinese and Italian versions, presents a good treatment of these topics. There is still a dearth of comprehensive publications specifically on torrefaction. For this reason, the previous book was revised and expanded with several new chapters on these topics to develop the present monograph.
Engineers, scientists, and operating personnel of biomass gasification, pyrolysis, or torrefaction plants clearly need such information from a single easy-to-access source. Better comprehension of the basics of biomass conversion can help an operator understand the workings of such plants, a design engineer size the conversion reactors, and a planner evaluate different conversion options. The present book was written to fill this important need. It attempts to mold available research results into an easy-to-use design methodology whenever possible. Additionally, it brings into focus new advanced processes such as supercritical water gasification and torrefaction of biomass.
More than two decades have elapsed since the first publication of this book in the United States in 1978. During this period the first edition went through ten printings, and the second edition, which first appeared in 1993, went through more than seven printings. An increasing number of universities in North America, Europe, Asia, and elsewhere have adopted it as a text for courses in automotive engineering, vehicle dynamics, off-road vehicle engineering, or terramechanics. Many professionals in the vehicle industry around the world have also used it as a reference. It is gratifying indeed to see that the book has achieved such wide acceptance. As we enter a new millennium, the automotive industry is facing greater challenges than ever before in providing safer, more environmentally friendly, and more energy-efficient products to meet increasingly stringent demands of society. As a result, new technologies have continually been developed and introduced into its products. Accordingly, to better serve the changing needs of the educational and professional communities related to ground transportation technology, this third edition has been prepared. To improve competitiveness, shortening the product development cycle is of critical importance to vehicle manufacturers. Virtual prototyping is therefore widely adopted in the industry. To implement this process effectively, however, the development of reliable computer simulation models for vehicle performance evaluation is essential. For a realistic simulation of the handling behavior of road vehicles, a method referred to as the Magic Formula for characterizing tire behavior from test data is gaining increasingly wide acceptance.
Physics2000 is a calculus based, college level introductory physics course that is designed to include twentieth century physics throughout. This is made possible by introducing Einstein’s special theory of relativity in the first chapter. This way, students start off with a modern picture of how space and time behave, and are prepared to approach topics such as mass and energy from a modern point of view.
This book has its origins in a set of lecture notes assembled at UCLA for a graduate course on the optical studies of solids. In preparing the course it soon became apparent that a modern, up-to-date summary of the field was not available. More than a quarter of a century has elapsed since the book by Wooten, Optical Properties of Solids – and also several monographs – appeared in print. The progress in optical studies of materials, in methodology, experiments, and theory has been substantial, and optical studies (often in combination with other methods) have made definite contributions to, and left their marks in, several areas of solid state physics. There appeared to be a clear need for a summary of the state of affairs, even if with a somewhat limited scope.
Gasification processes can accept a variety of feedstocks, but the reactor must be selected on the basis of the feedstock's properties and behavior in the process, especially when coal, biomass, and various wastes are considered as gasification feedstocks. Projections for the continued use of fossil fuels indicate that there will be at least another five decades of fossil fuel use (especially coal and petroleum) before biomass and other forms of alternative energy take a firm hold, although significant inroads are being made into the gasification of various feedstocks. However, the ever-increasing global energy demand and the fast-depleting fossil fuels have in the recent past shifted the focus to sustainable energy sources such as biomass and waste. The importance of the gasification of such alternative feedstocks as potential sources of sustainable energy, able to meet the energy demands of future generations, cannot be overstated. The various technologies currently in practice at the commercial and pilot scale, with respect to bubbling fluidized beds, circulating fluidized beds, and dual fluidized beds, are being developed for feedstocks other than coal.
Health and safety
A theoretical study is carried out on the aspects affecting the health and safety of welding and cutting operators. The health risks posed by the welding process are presented, with emphasis on the harm caused by the fumes this process produces, and some of the measures that can be taken to eliminate or reduce these harmful effects are outlined.
In the year 2000, the Max-Planck-Institut für Metallforschung in Stuttgart, in conjunction with other materials-related institutes of the Max Planck Society, and institutes and universities from all over Europe, decided to prepare a White Book on the topic of “Strategies for Future Areas of Materials Science and Basic Research in Europe.” Preparation of this White Book has involved extensive consultation with leading materials scientists, primarily within Europe (most notably from the CNRS in France and the MPG in Germany), but also from the United States and Japan. Discussions were also undertaken with industrial research directors and relevant policy makers.
Engineering metals are unstable in natural and industrial environments. In the long term, they inevitably revert to stable chemical species akin to the chemically combined forms from which they are extracted. In that sense, metals are only borrowed from nature for a limited time. Nevertheless, if we understand their interactions with the environments to which they are subjected and take appropriate precautions, degradation can be arrested or suppressed long enough for them to serve the purposes required. The measures that are taken to prolong the lives of metallic structures and artifacts must be compatible with other requirements, such as strength, density, thermal transfer, and wear resistance. They must also suit production arrangements and be proportionate to the expected return on investment. Thus, problems related to corrosion and its control arise within technologies, but solutions often depend on the application of aspects of chemistry, electrochemistry, physics, and metallurgy that are not always within the purview of those who initially confront the problems. Corrosion is the transformation of metallic structures into other chemical structures, most often through the intermediary of a third substance, i.e., water, and a first task is to characterize these structures and examine how they determine the sequences of events that result in metal wastage.