While perhaps not yet fodder for water cooler discussions, the term “Big Data” has vaulted into commonplace use today. PHEMI President & CEO, Dr. Paul Terry, has delivered numerous Big Data talks to a wide variety of audiences – hospital administrators, scientific researchers, government policy makers, corporate CEOs and more. After each talk come requests for more information. Below is a list highlighting various articles and reports that provide unique perspectives on the topic of Big Data. The list is not exhaustive, but rather illustrative – just a smattering of the great content being generated about Big Data.
Demystifying Big Data – A Practical Guide to Transforming the Business of Government
Prepared by TechAmerica Foundation’s Federal Big Data Commission, October 2012.
An excellent report that provides concise conceptual information and definitions about Big Data – particularly as it relates to the US federal government – including core technologies, strategies for leveraging Big Data as a strategic asset, and pitfalls to avoid in implementation. The report stresses the necessity of asking “how can Big Data help?” rather than “what Big Data projects can government create?” Successful Big Data programs commence with a “fit for purpose” approach, not a “let’s just see what we can do” approach. The report offers recommendations for getting started with Big Data, general policy and governance guidelines, an overview of the sectors affected by Big Data, and a look at the huge potential societal benefits if Big Data can be leveraged as an effective information technology strategy.
Privacy By Design in the Age of Big Data
Prepared by Ann Cavoukian, Information & Privacy Commissioner, Ontario, Canada, and Jeff Jonas, IBM Fellow.
This paper explores how privacy and responsibility can co-exist. With Big Data as the next “frontier for innovation, competition and privacy,” the question arises of how to protect individuals’ privacy rights. For corporations and governments to realize the full potential of Big Data, there must be a fundamental shift in mindset to ensure that privacy protections are designed into emerging systems. “Privacy by Design (PbD) is an approach to protecting privacy by embedding it into the design specifications of technologies, business practices, and physical infrastructures. That means building in privacy up front – right into the design specifications and architecture of new systems and processes.” The paper also provides an excellent example of how a Big Data sensemaking system was designed, from concept to implementation, with privacy protection as a key success metric right from the beginning.
Privacy by Design and the Emerging Personal Data EcoSystem
Prepared by the Office of the Information & Privacy Commissioner of Ontario, October 31, 2012
This paper focuses on the technologies and initiatives associated with a “Personal Data EcoSystem.” Recent years have witnessed significant consumer concern about the lack of transparency and accountability in data flows. Currently, vast volumes of personal data are collected, but they tend to serve the collector, not the person providing them. There has been much debate about who owns the data and what it can be used for. In the new Personal Data EcoSystem model, privacy must be the default setting for all processes, and the user must be placed at the center of the data flows and in control of their own data. The paper features several case studies and “seeks to address the challenge of protecting and promoting privacy, while at the same time, encouraging the socio-economic opportunities and benefits of personal information as a new asset class.”
Preserving Digital Data – Gregory Goth
Communications of the ACM, April 2012
Vol. 55. No 4.
A short article about scientific data, and how society’s demand for new research nearly destroys the old. German archivists discovered that 90% to 95% of the data produced in public science was lost over time because it was no longer accessible. The article highlights the challenge for data preservationists to create not only the infrastructure that makes data preservation possible, but also to address the “deep cultural divide” between scientists and archivists. Countries like Germany are making progress by funding projects that have a data accessibility component, demonstrating that the ability to access archived data yields a much greater return on investment than funding research alone.
Looking back at Big Data – Leah Hoffman
Communications of the ACM, April 2013
Vol. 56. No 4.
An article about how Big Data influences the field of computational history; computation is the only way to tackle large data sets, and “it gives researchers a way to make sense of resources that are intractable to traditional scholarship.” The article shows how, as computational tools open up new approaches to understanding history, the necessity of collaboration between historians and computer scientists becomes even more apparent. Also critical is the need for researchers to have training in both the humanities and the applied sciences.