Automating Research to Improve Reproducibility and Throughput

For several decades, researchers have sought to automate tedious and error-prone manual steps carried out in the laboratory, with the goal of improving scientific reproducibility and throughput.

We recently spoke to Charles Fracchia, CEO and Co-Founder of BioBright, to learn how automation can be adopted to help researchers analyze their data. Charles discusses the challenges to consider when performing analysis in an automated fashion, and highlights the value of your data and the importance of cybersecurity.

Laura Lansdowne (LL): How is data revolutionizing the way we do science?

Charles Fracchia (CF):
Up until now, pretty much all of science has been process-driven. In particular, at the beginning this process was very manual: subject selection was performed manually, observation was done manually, and analysis was done manually.

We have made a tremendous amount of progress when it comes to science, and automation has really helped, particularly with the first two steps (subject selection and observation). But the last frontier, if you will, is the ability to perform analysis in a completely automated fashion.

That concept was unthinkable a few years ago. Now we have wonderful technologies that can help us automate analysis.

We have an unprecedented volume of data, and we now have unprecedented computational capability.

That is how data is primarily changing the way we do science – we are going from a process-driven approach to a data-driven approach. It is turning the whole scientific process on its head. Instead of saying, “I’m going to do A, B and C, and then trust the results,” researchers are adopting a data-driven process: “I have A, B and C pieces of data… what is that telling me, and what other data do I need to collect?” This inversion can pose a lot of challenges.

For example, if you’re not careful when controlling your data, or if you are careless when collecting your data, your experiment may become completely worthless. The reproducibility crisis is related to this, and it is costing the US economy an estimated $28 billion each year. We often see situations nowadays where scientists are drowning in data with no means to handle its volume and complexity, leading to a tremendous waste of time and resources.[…]

read more – copyright by www.technologynetworks.com

Figure 1: The scientific method involves three steps: subject selection, observation, and analysis. While tremendous efforts have led to the automation of the first two steps, efforts to successfully automate the final step (analysis) are still ongoing. Credit: Charles Fracchia, BioBright.
