Around 300 BC, the rulers of Egypt created what was to become one of the largest “big data” projects in antiquity: the library at Alexandria. Scholars came to Alexandria to study and learn, because that’s where the data was. Many spent their entire lifetimes analyzing the information housed in the library and using it to answer their big questions.


For example, Euclid created a system of geometry that is still taught today. Ptolemy compiled a geography that remained the most complete work of its kind until the Renaissance. Eratosthenes used the shadow of the sun at the summer solstice to calculate the circumference of the earth, with remarkable accuracy. These ancient Greek scholars understood that the big data stored at the library was critical for the big answers they were seeking.
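
As an aside, Eratosthenes’ estimate can be reproduced with a few lines of arithmetic. This minimal sketch uses the figures commonly attributed to him (a shadow angle of about 7.2 degrees at Alexandria and roughly 5,000 stadia between Syene and Alexandria); these values are illustrative assumptions, not data from this article.

```python
# A minimal sketch of Eratosthenes' method, using commonly cited
# historical values (illustrative assumptions, not measured data).
shadow_angle_deg = 7.2   # angle of the sun's shadow at Alexandria at the solstice
distance_stadia = 5_000  # assumed distance from Syene (modern Aswan) to Alexandria

# The shadow angle is the fraction of a full 360-degree circle spanned by
# the two cities, so the circumference is the distance scaled accordingly.
circumference = distance_stadia * (360 / shadow_angle_deg)
print(f"Estimated circumference: {circumference:,.0f} stadia")  # 250,000 stadia
```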


Today, digital businesses collect vastly more data each day than the library of Alexandria ever housed; the library’s entire contents could now be stored on a single 500GB memory stick and fit into your pocket. IDC forecasts that five industries alone will drive nearly $63 billion in spending for big data analytics in 2016: banking, discrete manufacturing, process manufacturing, central governments, and professional services. These industries require new analytical tools that use the power of computing not only to store big data, but also to mine it effectively for timely answers.


With that in mind, here are a few examples of questions that NetApp customers are answering with the help of big data tools:


Can genomics data be used to improve health outcomes?

Can we harness the vast data encoded in the human genome to improve healthcare? Inova Translational Medicine Institute analyzes the genome to develop preventive therapies and treat disease. Its database of genomic studies and clinical trials contains billions of data points. Better use of data analytics has allowed Inova to cut the processing time of queries from weeks to hours. Read this case study to learn more.


What are the fundamental building blocks of the universe?

Can we peer into the nucleus of the atom to advance our understanding of the basic operation of the universe? At CERN, which I mentioned in a previous post, physicists from around the world study the data collected from its Large Hadron Collider. Not only must CERN reliably store the roughly 1 petabyte (1 million GB) of data per second that the collider produces, it must also make that data available to scientists in a useful way. Watch this short video to learn how NetApp is helping CERN manage its big data.


Can nanoscale technology be harnessed to create new materials?

In my past life as a storage systems engineer, I worked on a very large project for more than 15 years. As at CERN, my customer was providing data and compute facilities to scientists from around the world to answer big questions regarding complex nanoscale processes. When we started working together, this customer needed to manage several terabytes of capacity to support 2D and black-and-white sensors. Now, the customer requires a multipetabyte data lake to analyze the data from 3D, color, and multistream sensors. Remarkably, this vast datastore can still be managed by a single person thanks to advancements in our data management software.


What questions are you asking?

Although we are now able to answer increasingly difficult questions, we have only begun to scratch the surface. For example, there are still some very important events that no one can predict with certainty, such as “Will the citizens of the U.K. vote to leave the European Union?” “Who will be the next president of the United States, or of France?” “When and where will the next destructive earthquake take place?”


Visit our website to learn how NetApp® solutions for Hadoop, NoSQL, and Splunk can help you find answers to your own big questions.


From Jean-Francois Marie

Head of Product & Solutions Marketing EMEA
