By Sanji Bhal
It’s no secret that Big Data is one of the most talked-about topics in analytical chemistry, as it is in nearly every industry. The phrase has become a buzzword among scientists and other professionals because of the many challenges involved in analyzing large sets of information. Yet while the problems of data volume are widely appreciated, the complications caused by data heterogeneity, that is, differences in the types and compositions of data, are often overlooked.
Last month, Andrew Anderson and Graham McGibbon sat down with Jack Rudd, senior editor at Technology Networks, to discuss data heterogeneity along with other challenges facing modern labs and businesses. Here are some of the questions covered during the conversation:
- What does heterogeneity of data mean, exactly?
- Besides heterogeneity, what are some of the challenges that modern labs and businesses still need to overcome?
- What are some of the ways labs and businesses have dealt with heterogeneity up until now?
- What are the specific benefits of data standardization?
- How do ACD/Labs’ offerings tie into and facilitate data standardization?
If you’d like to learn more, check out Technology Networks’ latest podcast on this topic with Andrew and Graham here, or click on our Tweet:
Also, be sure to download ACD/Labs’ new whitepaper, “Looking Beyond Analytical Data Standardization—the Fourth Paradigm,” and let us know if you have any questions by leaving a comment below!