Lauren Aleksunes, Pharm.D., Ph.D., Part of the Rutgers-led Team Involved in the Development of a New Algorithm for Chemical Toxicity Testing

Using animals to test the toxicity of chemicals may eventually become outdated thanks to a low-cost, high-speed algorithm developed by researchers at Rutgers and other universities.

Toxicity testing – determining the amount of exposure to a chemical that is unsafe for humans – is vital to the safety of millions of workers in various industries. But of the 85,000 compounds used in consumer products, the majority have not been comprehensively tested for safety. Animal testing, in addition to its ethical concerns, can be too costly and time-consuming to meet this need, according to the study published in the journal Environmental Health Perspectives.

“There is an urgent, worldwide need for an accurate, cost-effective and rapid way to test the toxicity of chemicals in order to ensure the safety of the people who work with them and of the environments in which they are used,” said lead researcher Daniel Russo, a doctoral candidate at the Rutgers University-Camden Center for Computational and Integrative Biology. “Animal testing alone cannot meet this need.”

Previous efforts used computers to compare untested chemicals with structurally similar compounds whose toxicity is already known. But those methods couldn’t assess structurally unique chemicals and were confounded by the fact that some structurally similar chemicals have very different levels of toxicity.
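The similarity-based approach described above is often called "read-across." A minimal sketch of the idea, using toy bit-set fingerprints in place of real structural fingerprints (the function names and data here are illustrative assumptions, not the methods evaluated in the study):

```python
# Hypothetical read-across sketch: predict an untested chemical's toxicity
# from its most structurally similar tested neighbor. Fingerprints are toy
# sets of feature "bits" standing in for real structural fingerprints.

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two sets of structural-feature bits."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def read_across(query_fp, tested):
    """Return (predicted_label, similarity) from the nearest tested compound.

    tested: list of (fingerprint, toxicity_label) pairs.
    """
    best_fp, best_label = max(tested, key=lambda item: tanimoto(query_fp, item[0]))
    return best_label, tanimoto(query_fp, best_fp)

# Toy knowledge base of tested compounds.
tested_compounds = [
    ({1, 2, 3, 4}, "toxic"),
    ({5, 6, 7, 8}, "non-toxic"),
]

label, similarity = read_across({1, 2, 3, 9}, tested_compounds)
print(label, round(similarity, 2))  # → toxic 0.6
```

The weakness the article notes falls out of this sketch directly: a structurally unique query has low similarity to everything in the knowledge base, so the "nearest" neighbor is uninformative, and two highly similar fingerprints can still belong to compounds with very different toxicity.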

The Rutgers-led group overcame these challenges by developing a first-of-its-kind algorithm that automatically extracts data from PubChem, a National Institutes of Health database on millions of chemicals. The algorithm compares chemical fragments from tested compounds with those of untested compounds, and uses multiple mathematical methods to evaluate their similarities and differences in order to predict an untested chemical’s toxicity.
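One way to picture "multiple mathematical methods" combined into a single prediction is a consensus over several similarity measures on fragment sets. The sketch below is a loose illustration under that assumption; the fragment names, the three metrics, and the voting scheme are stand-ins, not the published algorithm or its PubChem data pipeline:

```python
# Hypothetical sketch: compare fragment sets of an untested compound against
# tested compounds under several similarity measures, then take a majority
# vote across the measures. Fragments and metrics are illustrative only.

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 0.0

def cosine(a, b):
    return len(a & b) / ((len(a) * len(b)) ** 0.5) if (a and b) else 0.0

METRICS = (jaccard, dice, cosine)

def predict_toxicity(query_frags, tested):
    """Each metric votes the label of its most similar tested compound;
    the majority label across metrics is the consensus prediction.

    tested: list of (fragment_set, toxicity_label) pairs.
    """
    votes = []
    for metric in METRICS:
        _, best_label = max(tested, key=lambda item: metric(query_frags, item[0]))
        votes.append(best_label)
    return max(set(votes), key=votes.count)

# Toy fragment annotations for two tested compounds.
tested = [
    ({"benzene", "nitro"}, "toxic"),
    ({"hydroxyl", "methyl"}, "non-toxic"),
]

print(predict_toxicity({"benzene", "nitro", "methyl"}, tested))  # → toxic
```

Using several measures rather than one hedges against any single similarity definition being misleading for a given chemical class, which is in the spirit of the fragment-comparison approach the article describes.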

“The algorithm developed by Daniel and the Zhu laboratory mines massive amounts of data and discerns relationships between fragments of compounds from different chemical classes exponentially faster than a human could,” said co-author Lauren Aleksunes, an associate professor at Rutgers’ Ernest Mario School of Pharmacy and the Rutgers Environmental and Occupational Health Sciences Institute. “This model is efficient and provides companies and regulators with a tool to prioritize chemicals that may need more comprehensive testing before use in commerce.”

(Source: Rutgers Today – 4-17-2019)