Big data is already changing the way business decisions are made — and it’s still early in the game. However, because big data exceeds the capacity and capabilities of conventional storage, reporting and analytics systems, it demands new problem-solving approaches. With the convergence of powerful computing, advanced database technologies, wireless data, mobility and social networking, it is now possible to bring together and process big data in many profitable ways.
Big data solutions attempt to cost-effectively solve the challenges of large and fast-growing data volumes and to realize the analytical value locked inside that data. For instance, trend analytics let you determine what happened, while root cause and predictive analytics explain why it happened and what is likely to happen in the future. Meanwhile, opportunity and innovation analytics can be applied to identifying opportunities and improving the future.
All healthcare constituents — members, payers, providers, groups, researchers, governments, etc. — will be impacted by big data, which can predict how these players are likely to behave, encourage desirable behavior and minimize less desirable behavior. These applications of big data can be tested, refined and optimized quickly and inexpensively, in turn radically changing healthcare delivery and research. Leveraging big data will certainly be part of the solution to controlling spiraling healthcare costs.
The way big data has already transformed consumer IT makes clear that its promise in healthcare is immense (think Google, Facebook and Apple's Siri, all of which rely on processing and transmitting massive amounts of data). That potential has yet to be fulfilled in healthcare, but the question is not if it will be; it is when.
This white paper will define big data, explore the opportunities and challenges it poses for healthcare, and recommend solutions and technologies that will help the healthcare industry take full advantage of this burgeoning trend.
A large amount of data becomes "big data" when it meets three criteria: volume, variety and velocity. Volume simply means that there is a lot of data — terabytes or even petabytes (a petabyte is 1,000 terabytes). This is perhaps the most immediate challenge of big data, as it requires scalable storage and support for complex, distributed queries across multiple data sources. While many organizations already have the basic capacity to store large volumes of data, the challenge is being able to identify, locate, analyze and aggregate specific pieces of data in a vast, partially structured data set.
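To make the volume challenge concrete, the sketch below shows the scatter-gather pattern that distributed queries typically follow: each storage partition is scanned in parallel for the relevant records, and the partial results are combined into one answer. The partition contents, field names and the `total_spend` query are hypothetical, chosen only to illustrate the pattern.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical partitions of a large, partially structured data set,
# e.g. claim records spread across several storage nodes. Note that
# one record lacks the expected structure entirely.
PARTITIONS = [
    [{"member_id": "M1", "amount": 120.0}, {"member_id": "M2", "amount": 75.5}],
    [{"member_id": "M1", "amount": 30.0}, {"note": "unstructured free text"}],
    [{"member_id": "M3", "amount": 210.0}],
]

def scan_partition(records, member_id):
    """Locate and sum the records for one member within a single
    partition, skipping entries that lack the expected fields."""
    return sum(r["amount"] for r in records
               if r.get("member_id") == member_id)

def total_spend(member_id):
    """Scatter the query to every partition in parallel, then gather
    and combine the partial results into a single aggregate."""
    with ThreadPoolExecutor() as pool:
        partials = pool.map(lambda p: scan_partition(p, member_id), PARTITIONS)
    return sum(partials)

print(total_spend("M1"))  # combines 120.0 + 30.0 across two partitions
```

The same scatter-gather shape underlies production frameworks for distributed aggregation; what changes at scale is that the partitions live on separate machines and the gather step must tolerate slow or failed nodes.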
Variety, on the other hand, focuses on the fact that big data is an aggregation of many types of data, both structured and unstructured, including multimedia, social media, blogs, Web server logs, financial transactions, GPS and RFID tracking information, audio/video streams and Web content. While standard techniques and technologies exist to deal with large volumes of structured data, it becomes a significant challenge to analyze and process a large amount of highly variable data and turn it into actionable information. But this is also where the potential of big data lies, as effective analytics allow you to make better decisions and realize opportunities that would not otherwise exist.
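One common first step in taming variety is to normalize heterogeneous inputs onto a common schema so that downstream analytics can treat the stream uniformly. The sketch below — with invented sample records and a deliberately simple log pattern — classifies a structured JSON transaction, a Web server log line and unstructured free text:

```python
import json
import re

# Hypothetical raw inputs of mixed variety: a structured JSON
# transaction, a Web server log line, and unstructured free text.
RAW_RECORDS = [
    '{"type": "transaction", "amount": 49.95, "member_id": "M1"}',
    '203.0.113.9 - - [10/Oct/2023:13:55:36] "GET /portal HTTP/1.1" 200',
    'Patient reports mild headache after new prescription.',
]

# A simplified pattern for common-log-format lines.
LOG_PATTERN = re.compile(r'^(\S+) .*"(\w+) (\S+) HTTP/[\d.]+" (\d{3})')

def normalize(raw):
    """Map each record, whatever its original shape, onto a common
    {"kind": ..., "data": ...} schema for uniform downstream handling."""
    try:
        return {"kind": "structured", "data": json.loads(raw)}
    except json.JSONDecodeError:
        pass
    m = LOG_PATTERN.match(raw)
    if m:
        ip, method, path, status = m.groups()
        return {"kind": "log",
                "data": {"ip": ip, "method": method,
                         "path": path, "status": int(status)}}
    # Fall through: keep unstructured text for later NLP or review.
    return {"kind": "text", "data": {"body": raw}}

for r in RAW_RECORDS:
    print(normalize(r)["kind"])  # structured, log, text
```

Real pipelines add many more recognizers (and far more robust parsers), but the design choice is the same: classify early, attach a uniform envelope, and never silently discard data you cannot yet parse.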
Velocity captures the speed at which data must be handled: while traditional data warehouse analytics tend to be based on periodic (daily, weekly or monthly) loads and updates, big data is processed and analyzed in real or near-real time. This is important in healthcare for areas such as clinical decision support, where access to up-to-date information is vital for correct, timely decision-making and the elimination of errors. Current data is needed to support automated decision-making; after all, you can't use five-minute-old data to cross a busy street. Without current data, automated decisions cannot be trusted, forcing expensive and time-consuming manual reviews of each decision.
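The freshness requirement described above can be sketched as a simple gate: a reading is admitted to automated decision-making only if it is recent enough, and anything stale is routed to manual review. The class name, field names, the five-second age limit and the threshold of 140 are all illustrative assumptions, not clinical guidance.

```python
import time

class FreshnessGate:
    """Admit a value into automated decision-making only if it is
    recent enough; stale values are routed to manual review instead."""

    def __init__(self, max_age_seconds=5.0):
        self.max_age = max_age_seconds

    def decide(self, reading, now=None):
        now = time.time() if now is None else now
        age = now - reading["timestamp"]
        if age > self.max_age:
            return "manual-review"  # too old to trust automatically
        # Illustrative rule: flag readings above an arbitrary threshold.
        return "alert" if reading["value"] > 140 else "ok"

gate = FreshnessGate(max_age_seconds=5.0)
t0 = 1_000_000.0
print(gate.decide({"timestamp": t0, "value": 155}, now=t0 + 1))   # alert
print(gate.decide({"timestamp": t0, "value": 155}, now=t0 + 60))  # manual-review
```

The point of the sketch is the branch order: staleness is checked before any automated rule fires, mirroring the argument that without current data an automated decision cannot be trusted.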