Advances in technology and the escalating volume of information are reshaping business operations across numerous industries, including the public sector. The rate at which governments generate data and archive digital records is accelerating, driven by the rapid proliferation of mobile devices and applications, smart sensors and devices, cloud computing solutions, and citizen-facing portals. As digital information expands and grows more complex, the management, processing, storage, security, and disposition of that data become increasingly intricate. New tools for capture, search, discovery, and analysis are enabling organizations to derive insights from unstructured data. The government sector is reaching a critical juncture, recognizing that information is a strategic asset. To better serve the public and meet mission requirements, governments must protect, leverage, and analyze both structured and unstructured information. As government leaders strive to evolve into data-driven organizations capable of successfully accomplishing their missions, they are laying the groundwork to correlate dependencies across events, people, processes, and information.
High-value government solutions will emerge from a combination of the most disruptive technologies:
- Mobile devices and applications
- Cloud services
- Social business technologies and networking
- Big Data and analytics
Big Data represents an intelligent industry solution that enables government entities to make better decisions by taking action based on patterns revealed through the analysis of large volumes of data—both related and unrelated, structured and unstructured.
However, achieving these outcomes requires far more than simply accumulating massive quantities of data. "Making sense of these volumes of Big Data requires cutting-edge tools and technologies that can analyze and extract useful knowledge from vast and diverse streams of information," noted Tom Kalil and Fen Zhao of the White House Office of Science and Technology Policy in a post on the OSTP Blog.
The White House took a significant step toward assisting agencies in identifying these technologies by establishing the National Big Data Research and Development Initiative in 2012. This initiative included more than $200 million to maximize the potential of the Big Data explosion and the tools required to analyze it.
The challenges posed by Big Data are nearly as daunting as its promise is encouraging. Efficient data storage is one such challenge. As always, budgets are tight, so agencies must minimize the per-megabyte cost of storage and keep data easily accessible, ensuring users can retrieve it when and how they need it. Backing up massive quantities of data further heightens this challenge.
Effectively analyzing data is another major hurdle. Many agencies use commercial tools to sift through mountains of data and identify trends that help them operate more efficiently. (A recent study by MeriTalk found that federal IT executives believe Big Data could help agencies save over $500 billion while also fulfilling mission objectives.)
Custom-developed Big Data tools are also allowing agencies to address the need to analyze their data. For instance, the Oak Ridge National Laboratory’s Computational Data Analytics Group has made its Piranha data analytics system available to other agencies. The system has helped medical researchers find a link that can alert doctors to aortic aneurysms before they strike. It is also used for more routine tasks, such as sifting through resumes to connect job candidates with hiring managers.