In Silico Toxicology: Principles and Applications


by Ann Richard

Overview

In silico methods to predict toxicity have become increasingly important in recent years, particularly in light of European legislation such as REACH and the Cosmetics Regulation, and they are used extensively worldwide, e.g. in the USA, Canada, Japan and Australia. In assessing the risk that a chemical may pose to human health or to the environment, focus is now being directed towards the exploitation of in silico methods to replace in vivo or in vitro techniques. A prediction of potential toxicity requires several stages:

1) Collation and organisation of the data available for the compound or, if these are not available, information for related compounds.
2) An assessment of the quality of the data.
3) Generation of additional information about the compound using computational techniques at various levels of complexity: calculation of physico-chemical properties, 2-D, 3-D and molecular orbital descriptors, and specific receptor modelling and interaction.
4) Use of an appropriate strategy to predict toxicity, i.e. a statistically valid method that makes best use of all available information (mechanism of action, activity for related compounds, extrapolation across species and endpoints, likely exposure scenarios over time, etc.).
5) Consideration of how this information is used in the real world, i.e. the use of expert systems and tools relevant to assessors, and weight-of-evidence approaches.
6) Presentation of supporting evidence from case studies in the area.

No other publication brings together information on all of these areas in one book. This volume is unique in that it provides a logical progression through each of these key stages and defines the use of computational approaches to predict the environmental toxicity and human health effects of organic chemicals.
The volume is aimed at developers and users of in silico toxicology and provides an analysis of all aspects required for the in silico prediction of toxicity, including data collation, quality assessment and computational approaches. The contributions, from recognised leaders in each of these areas, include evidence of the use and applicability of the approaches in real-world case studies concerning both environmental and human health effects. The book provides a single-source reference for those working in this area, including academics, professionals, undergraduate and postgraduate students, and governmental regulatory scientists involved in chemical risk assessment and REACH.

Product Details

ISBN-13: 9781849730044
Publisher: Royal Society of Chemistry, The
Publication date: 12/17/2010
Series: Issues in Toxicology Series, #7
Pages: 688
Product dimensions: 6.30(w) x 9.40(h) x 2.10(d)

Read an Excerpt

In Silico Toxicology

Principles and Applications


By Mark T. D. Cronin, Judith C. Madden

The Royal Society of Chemistry

Copyright © 2010 The Royal Society of Chemistry
All rights reserved.
ISBN: 978-1-84973-004-4



CHAPTER 1

In Silico Toxicology — An Introduction

M. T. D. CRONIN AND J. C. MADDEN

School of Pharmacy and Chemistry, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF, UK


1.1 Introduction

Chemistry is a vital part of everyday life. In order for our interactions with chemicals to be safe, we must understand their properties. Traditional methods to determine the safety of chemicals are centred around toxicological assessment and testing, often using animals. There is, however, great interest and a need to develop alternatives to the traditional testing regime. Given the breadth and complexity of toxicological endpoints it is likely that, to ensure the safety of all chemicals, a variety of techniques will be required. This will require a paradigm shift in thinking, both in terms of acceptance of alternatives and the recognition that these alternatives will seldom be 'one for one' replacements.

In silico toxicology is viewed as one of the alternatives to animal testing. It is a broad term that is taken, in this book, to indicate a variety of computational techniques which relate the structure of a chemical to its toxicity or fate. The purpose of in silico toxicology is to provide techniques to retrieve relevant data and/or make predictions regarding the fate and effects of chemicals. In this sense the term 'in silico' is used in the same manner as in vitro and in vivo, with 'silico' relating to the computational nature of the work. There are, obviously, many advantages to in silico techniques, including their cost-effectiveness, speed compared with traditional testing, and reduction in animal use.

The science of in silico toxicology encompasses many techniques. These include:

• Use of existing data. If suitable data exist for a compound, there should be no requirement to initiate a new test or make a new prediction (unless prediction is for the purposes of model validation). If data are lacking for the chemical of interest, then other data can be used to develop (and subsequently evaluate) a new predictive model. Data sources include the ever increasing number of available databases as well as the open scientific literature. In addition, those working in industry may be able to utilise their own in-house data. More details on the retrieval and use of existing data are given in Chapter 3.

• Structure–activity relationships (SARs) are qualitative and can be used to demonstrate that a fragment of a molecule or a sub-structural feature is associated with a particular event. SARs become particularly powerful if they are formalised into structural alerts. A structural alert can be used to associate a particular toxicity endpoint with a specific molecular fragment such that, if the fragment is present in a new molecule, that molecule may elicit the same toxicity. The use of SARs and structural fragments is discussed in more detail in Chapters 8, 13, 16 and 19.

• There is a strong theme in this book towards forming groups of similar molecules, also termed chemical categories. There are a number of approaches to 'categorise' a molecule, including mechanistic profilers (structural alerts) and chemical similarity. Once a robust group of structures has been formed, it can be populated with toxicity data for those members of the group for which experimental measurements are available. This allows a read-across approach to be used to predict the toxicity of those members of the group for which no data are available. Various strategies for category formation and read-across are discussed in Chapters 13–17.

• Quantitative structure–activity relationships (QSARs) provide a statistical relationship between the effects (toxicity and fate) of a chemical and its physico-chemical properties and structural characteristics. Linear regression analysis is often used but a variety of other multivariate statistical techniques are also used. The generation and use of QSARs are discussed in many chapters in this book.

• Expert systems (in the sense of in silico toxicology used in this book) are formalised, computerised software packages intended to make the use of SARs and QSARs easier. They usually provide an interface to enter a molecular structure and a suitable means of displaying the prediction (i.e. the result); in certain cases other supporting information is also given. Expert systems may be distributed on a commercial basis, although some are freely available, and increasingly they can be integrated with other software. These types of software are described in more detail in Chapters 16, 17 and 19.

• Some in silico models can be derived to extrapolate the toxic effect measured in one species to predict toxicity in another species, thus reducing the requirement for further testing. This is discussed in more detail in Chapter 18.

• Models for other effects are increasingly being included under the remit of in silico toxicology. For example, a number of pieces of software can be used to estimate the likely exposure of an organism to a toxicant. Models exist for both external exposure (i.e. determining the amount present in the environment) and internal exposure (i.e. the amount taken up and distributed within an organism). These will not, themselves, predict toxicity, but they provide useful supplementary information for the overall risk assessment process. External and internal exposure scenarios are described in Chapters 20 and 21 respectively.
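The structural-alert idea described in the list above can be sketched in a few lines. This is a deliberately naive illustration: the alert names and patterns below are invented, and the matching is a raw substring test on a SMILES string, whereas real alert systems match SMARTS patterns using a cheminformatics toolkit such as RDKit.

```python
# Toy structural-alert screen. Alert names and patterns are hypothetical;
# real systems use proper SMARTS substructure matching, not substring tests.

STRUCTURAL_ALERTS = {
    "epoxide": "C1OC1",                 # three-membered ether ring (toy pattern)
    "nitro_group": "[N+](=O)[O-]",      # nitro group written in charged form
}

def screen_for_alerts(smiles: str) -> list:
    """Return the names of any alerts whose toy pattern appears in the SMILES."""
    return [name for name, pattern in STRUCTURAL_ALERTS.items()
            if pattern in smiles]

# A molecule whose SMILES contains the epoxide fragment fires that alert.
print(screen_for_alerts("CC1OC1C"))
print(screen_for_alerts("CCO"))
```

The point of the sketch is only the logic: a structural alert associates a molecular fragment with a toxicity endpoint, so screening reduces to substructure detection.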
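Category-based read-across, as described in the list above, can be caricatured as 'find the most similar analogue with data and borrow its value'. In the sketch below the feature sets, analogue names and toxicity values are all hypothetical; real workflows use structural fingerprints, curated experimental data and expert judgement.

```python
# Minimal read-across sketch over a chemical category. All data are invented.

def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) similarity between two feature sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def read_across(query: set, category: dict) -> float:
    """Predict toxicity as the value measured for the most similar member."""
    best = max(category, key=lambda name: tanimoto(query, category[name][0]))
    return category[best][1]

# Hypothetical category: member name -> (feature set, measured toxicity).
category = {
    "analogue_A": ({"chain", "ester", "C8"}, 3.2),
    "analogue_B": ({"chain", "acid", "C10"}, 4.1),
}
query = {"chain", "ester", "C9"}   # untested compound of interest
print(read_across(query, category))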
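A QSAR in its simplest form, as noted in the list above, is a statistical relationship between an effect and a descriptor. The sketch below fits a one-descriptor model by ordinary least squares; the descriptor values and toxicities are invented for illustration (a classic QSAR of this shape regresses aquatic toxicity on log Kow).

```python
# One-descriptor QSAR fitted by ordinary least squares. Data are invented.

def fit_line(x, y):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical training set: descriptor (e.g. log Kow) vs. toxic potency.
log_kow = [1.0, 2.0, 3.0, 4.0]
toxicity = [2.1, 3.0, 4.2, 4.9]
slope, intercept = fit_line(log_kow, toxicity)
print(round(slope, 3), round(intercept, 3))
```

In practice multivariate regression and other statistical methods are used, as the chapter notes, but the principle is the same: a fitted equation predicts the effect of an untested compound from its calculated descriptors.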


A variety of other tools and software also included within in silico toxicology are considered within this book. These include numerous applications for modelling the properties of chemicals (Chapters 5–8), defining the applicability domains of models (Chapter 12) and assisting with weight-of-evidence predictions (Chapter 22).

A note on terminology: it is increasingly common to see the abbreviation (Q)SAR, indicating both SAR and QSAR; where possible, this term is used in this book and is intended to apply to both. A broad description of the use of alternatives to animal testing is provided by Hester and Harrison. Alternatives can be generalised into in vitro tests and in silico approaches. Alongside activities such as optimising testing and reducing harm, these approaches form part of a framework within the 3Rs philosophy (Replacement, Refinement and Reduction) to replace animal tests. In vitro toxicology includes the use of cellular systems, -omics technologies, etc.; while it supports the in silico approaches discussed in this book and is often referred to, it is not the main focus here.


1.2 Factors that Have Impacted on In Silico Toxicology: the Current State-of-the-Art

Much has been written on the history of QSAR and related techniques, and it is not the purpose of this chapter to review it. However, it is worth considering some of the key factors that have led to the current state-of-the-art. The first factors listed below (Sections 1.2.1–1.2.4) are the drivers for the development of in silico toxicology; the remaining factors (Sections 1.2.5–1.2.7) have assisted progress to the current state-of-the-art.


1.2.1 Environmental and Human Health

There is a need to ensure that any species exposed to a chemical is at minimal or no risk. The chemical cocktail to which man and environmental species are exposed potentially comprises a vast number of different substances, with more being added to that list annually. Whilst it is never the intention to allow exposure to a hazardous chemical at a concentration that may cause harm (with the exception of pesticides, pharmaceuticals, etc.), for the vast majority of chemicals there is little knowledge regarding their effects. Traditional assessment of toxicological properties has been heavily reliant on animal testing. Such tests require specialist facilities and are time-consuming and costly, even before animal welfare considerations are taken into account. It is widely acknowledged that animal testing alone cannot assess all the chemicals in common use and so ensure that harmful chemicals are identified. Therefore, in silico alternatives that can make predictions from chemical structure alone have the potential to be very powerful tools.


1.2.2 Legislation

The manufacture and release of chemicals is carefully regulated across the world. The aim is to ensure safety through legislation. In addition, companies consider it a corporate responsibility to ensure the safety of their workers and consumers, and are highly aware of the possibility of litigation if they fail to do so. Each country and geographical region has a raft of legislation allowing the risk assessment and risk management of chemicals; it is beyond the scope of this volume to discuss this further and readers are referred to the excellent 'standard text' from van Leeuwen and Vermeire.

No single piece of legislation has promoted the use and development of in silico approaches; however, many have included them implicitly or explicitly. At the time of writing, much work is being performed as a result of the European Union's Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) Regulation, not to mention the Cosmetics Regulation. Elsewhere there has been similar legislation, such as the Domestic Substances List in Canada and the Chemical Substances Control Law in Japan. Within all of these pieces of legislation there is the expectation that in silico toxicology will be applied, and in each case new tools, methods and techniques have been developed as a direct response to the legislative requirements.


1.2.3 Commercial — Product Development

Linked to the need to comply with chemicals legislation, businesses have long recognised the value of predicting toxicological properties from structure. This provides many competitive advantages, including the ability to identify toxic compounds early in the development pipeline, to design toxicity out of new molecules, and to register products with the use of fewer animals and hence at lower cost, in addition to the rationalisation of testing procedures. In silico approaches are broadly applied across many industries, with a particular emphasis on the development of pharmaceuticals. There is therefore a commercial need for reliable tools and approaches.


1.2.4 Societal

Not only does society desire safe chemicals, but it would prefer that animals were not used in assessing the properties of molecules. There is therefore both commercial and consumer pressure to find and use alternatives, and 'computer models' are often cited as a method to replace animal tests. There is an opportunity here to gain public support for this area of science and, in so doing, to improve the perception of what science can provide to society. As often happens, public opinion moves ahead of the science and of what might realistically be achievable. Everyone reading this book and considering using these methods should be encouraged to promote both their use and realistic expectations of what they can provide.


1.2.5 Commercial — Software

Sections 1.2.1–1.2.3 reveal a potentially huge marketplace for in silico approaches to predict toxicity. This has been realised by a number of companies that have developed commercial software systems; an overview of many of these is provided in Chapter 19 and elsewhere in this book. These companies have helped to raise the profile of in silico toxicology and to make it more than an academic exercise. Many of the software products are now considered 'standard' and, whilst they cannot be considered 'finished' (the models will always need updating and refining), their commercial impact on in silico toxicology should never be underestimated. A competitive marketplace is developing, with a number of companies offering free 'taster' products in the expectation that the user will want more. While often incomplete, these free products can provide invaluable training and educational tools, and allow novices to familiarise themselves with the concepts and practice of in silico toxicology.


1.2.6 Computational — Hardware and Software

Anyone reading this book will appreciate the exceptional advances in computational power, software and networking capabilities in their lifetime. Both authors fondly remember performing early computational chemistry calculations by building the classic 'straw' model of a molecule, which was manipulated manually to obtain a 'visual' optimum geometry before guessing at reasonable bond lengths and angles. The computational input file for the three-dimensional (3-D) structure was then written by hand, and the optimisation of a single bond was often an overnight calculation. Happy days indeed! The rapid progression of technology has meant that calculations can now be performed at previously unthought-of rates and for vast inventories of millions of compounds. In silico methods for computational chemistry and toxicology have been quick to embrace this new, affordable computational power and to use the internet to compile, organise and distribute information. Many of the tools described in this book would not have been possible ten, or even five, years ago.


1.2.7 New and Better Solutions to Complex (Toxicological) Problems

It is true to say that in silico models may be better able to predict certain endpoints than others. This is a result of a number of factors, in particular the data available for modelling, the extent of knowledge concerning the mechanisms involved and the complexity of the endpoint. The real current challenge in finding alternatives in toxicology lies with chronic, low-dose, long-term effects in mammals (and understanding the effects on man in particular).

The most difficult endpoints to address with alternatives include developmental toxicity and repeated-dose toxicity. For these endpoints it is necessary to move away from the direct-replacement dogma that has driven in vitro toxicology. Many alternatives are likely to be proposed, but one way forward is to capture the chemistry (i.e. the structural attributes) of compounds associated with particular pathways that lead to toxicological events; more discussion is given in Chapter 14. In a related manner, this is (partially) the vision of the report Toxicity Testing in the 21st Century: A Vision and A Strategy, published by the US National Research Council of the National Academies, which has subsequently spawned the Tox21 collaboration between the US National Institutes of Health (NIH) and the US Environmental Protection Agency (EPA).


(Continues...)

Excerpted from In Silico Toxicology by Mark T. D. Cronin, Judith C. Madden. Copyright © 2010 The Royal Society of Chemistry. Excerpted by permission of The Royal Society of Chemistry.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Meet the Author

Mark Cronin is Professor of Predictive Toxicology at Liverpool John Moores University. He has a BSc in Applied Biology and a PhD in Environmental QSAR, and has co-edited one book and published over 150 papers. He was the co-organiser of the 11th International Workshop on the Human Health and Environmental Sciences, Liverpool, May 2004, as well as a number of one-day meetings at the Society of Chemical Industry, London. Judith Madden is a Senior Lecturer in Pharmaceutical and Chemical Sciences at Liverpool John Moores University. She has a BSc in Chemistry and Pharmacology and a PhD in QSAR and Drug Design, with over 25 publications to her name. She was the co-organiser of the 11th International Workshop on the Human Health and Environmental Sciences, Liverpool, May 2004, and the SETAC-UK Annual Meeting, Liverpool, September 2006.
