Data integrity has become an industry buzzword, but do people really understand what it means? A recent survey we conducted with Chemical & Engineering News (C&EN) showed that scientists think about data integrity in markedly different ways. Read on to learn more about the survey and its results.
As the number of conversations we’re having with scientists in validated environments increases year over year, we have found a number of common misconceptions. Many arise because previous software deployments accompanied the installation of new hardware, or involved informatics systems that are tightly integrated across the development workflow and serve as the source of data and reports submitted directly to regulatory authorities. Below, we clear up some of the grey areas that have become industry myths we commonly find ourselves correcting.
In pharma, drug substances (and the resulting formulated drug products) must conform to a variety of quality specifications in order to be approved for use by healthcare practitioners and patients. While most of us who have worked in pharma know the various regulatory statutes and advisory guidance (and can quote them chapter and verse!), I believe the real challenge lies in the practical, efficient implementation of quality practices that support conformance. As the supply chains that meet demand for drug substances in clinical and healthcare systems worldwide become increasingly fractured, this challenge only continues to grow.
Analytical data plays a central role in R&D, supporting critical decision-making on a daily basis. Whether a synthetic chemist is looking to see if their reaction yielded the product they expected, a group of scientists in development are building an impurity control strategy, or experts in manufacturing are collecting data for regulatory submissions, applications of analytical data are ubiquitous. At a time when the volume of insight-rich data one can gather is extraordinary, chemists working in academic research, industry, and non-profit organizations alike face regular challenges in managing and sharing their data.
Chemical R&D generates a deluge of instrumental analytical data on a daily basis. As critical R&D decisions and regulatory submissions are based on this data, the need for quality data management is more important than ever before. A lot has changed since the days when paper notebooks were the leading data management ‘platform’ among scientists. Advancements in research and instrument hardware continue to increase the amount of data we are able to produce and process. Sanji Bhal sits down with Graham McGibbon, director of strategic partnerships at ACD/Labs, to discuss his outlook on the industry and the pressing need for better management of analytical chemistry data in R&D.
As 2017 comes to an end, Daria Thorp, President and CEO of ACD/Labs, looks back on the company's 23-year history and recounts some of its notable solutions, including ACD/Spectrus, ACD/Name, and ACD/Percepta. She also discusses Luminata, ACD/Labs' award-winning impurity control informatics solution, which was introduced earlier this year.
Andrew Anderson reports on the 2017 AAPS Annual Conference by taking a closer look at his recent byline in Laboratory Equipment. He also introduces Joe DiMartino, ACD/Labs' newly appointed solution manager for Luminata, and previews Joe's recent Q&A with Outsourcing-Pharma.com, which discusses how QbD and impurity control management directly impact process development within pharmaceutical R&D organizations.
Earlier this year, I had a conversation with Sophia Ktori, a reporter for Scientific Computing World, about security issues in the age of R&D outsourcing. In the ensuing article, Sophia stated, “The R&D sector is increasingly turning to collaborative, partnered and outsourced projects to boost innovation, reduce costs and help expedite development.” While this is no industry secret, the trend has raised a number of security concerns, and looking back, I feel our discussion still rings true almost 10 months later.
As the Analytica Trade Fair in Munich came to a close in April, I realized that I had witnessed a tradeshow unlike many others in our industry today. It is probably the largest meeting I’ve participated in, with 35,000 visitors and 1,244 exhibitors from 40 countries. The exhibition space spanned five halls in which any and every type of organization linked with laboratory research was represented. Analytica seems to have withstood shrinking travel budgets, mergers and acquisitions, and the other challenges that have hit the tradeshow circuit. In fact, this year’s venue was so large I wish I had brought my trainers (sneakers) to traverse the exhibition space.
The 'Amazon Effect' describes how online shopping giant Amazon sorts data and matches similar products to one another—providing customers with a customized list of products they may be interested in. In laboratory informatics, scientists organize and analyze data in a very similar way. Using software like the ACD/Spectrus Platform, scientists—from different laboratories using a variety of instruments—can combine and process their large data sets in a single interface that delivers quality results.