NetApp, MeriTalk Assess Agencies’ Big Data Progress; Mark Weber Comments



MeriTalk and NetApp surveyed 151 federal information technology professionals to see if agency efforts measured up to the hype around big data.

Mark Weber, NetApp’s public sector business president, told ExecutiveGov that although there is considerable hype around the newly coined term, many agencies are not yet acting on it.

The survey found that over half of the federal IT professionals surveyed believed the top benefit of big data was improved efficiency.

Survey respondents said the ability to make accurate decisions more quickly and the ability to forecast were top benefits as well.

However, only 60 percent of IT professionals said their agency is analyzing the data it collects, and less than half said they were using that data for strategic decision making.

Many agencies are investigating how they can use these larger volumes of relevant data to improve organizational efficiency, Weber said.

The survey found that 91 percent of civilian agencies are talking about big data versus 59 percent in the defense and intelligence community.

Weber also noted that less than 20 percent of agencies have a plan right now.

He added that he believes agencies are most likely investigating their larger structured and unstructured data sets but are not calling it big data yet.

According to the survey, agencies store an average of 1.61 petabytes of data, but expect to have 2.63 petabytes in the next two years.

With data capture increasing 45 percent each year, that amount could grow to 45 times today's volume within 10 years, Weber suggested.
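As a rough sanity check of the figures cited above (an illustration only, not part of the survey or Weber's remarks), the compound-growth arithmetic works out like this:

```python
# Back-of-the-envelope check of the growth figures cited in the article.
current_pb = 1.61        # average petabytes stored today, per the survey
two_year_pb = 2.63       # petabytes expected in two years, per the survey
capture_growth = 0.45    # 45 percent annual increase in data capture

# Annual growth rate implied by the two survey data points (~28% per year):
implied_rate = (two_year_pb / current_pb) ** 0.5 - 1

# Compounding a 45 percent annual increase over 10 years (~41x today's volume,
# in the same ballpark as the roughly 45-fold figure Weber suggested):
ten_year_multiple = (1 + capture_growth) ** 10

print(f"Implied annual growth from survey: {implied_rate:.1%}")
print(f"10-year multiple at 45%/year: {ten_year_multiple:.1f}x")
```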

Weber said the first step for agencies to begin taking advantage of big data is to determine what it is they are trying to learn by harnessing, mining and applying information derived from larger and larger data sets.

“People are capturing data but they are not analyzing or putting it to practical application,” Weber said.

The survey suggests agencies need to determine data ownership as well since there was a discrepancy among respondents as to who should own the data.

Weber said it is important to determine ownership since whoever owns the data will also most likely be responsible for applying analytic tools to that data.

After determining which data to go after, Weber said agencies should figure out how to scale IT systems.

Collecting and storing data serves no purpose unless it is applied in some way, Weber explained.

According to Weber, healthcare organizations could analyze data relating to patient re-admittance and give caregivers insight into how to prevent patients from returning to the hospital for the same issues.

To ensure success in their endeavors, agencies should purchase the necessary software, determine what information they need and invest the time required to see a return on investment, according to Weber.

Big data is not a new term for NetApp, which stores more than 50 petabytes of information right now.

Weber said NetApp has been wrestling with the issues relating to big data for a while but added the difference is people are paying attention now.

With the White House’s big data research and development initiative, Weber said the government will likely make progress because the effort holds agencies to a measurable system.

At the same time, a larger opportunity lies ahead for companies that pursue contracting opportunities with the government.

Weber said there is significant opportunity in unstructured data specifically, a part of the market that has gone largely untouched so far.

With opportunities galore, Weber warned that agencies need to plan ahead for how to store data efficiently.

“To win on the storage side of retaining this data, you have to be efficient and not just add storage every time,” Weber said. “People are going to win on the storage side of this.”
