Categories
Cloud Cyberinfrastructure Education Innovation Network Reading Technology

Latest Read: Actionable Intelligence

Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast! falls into the must-read category for leaders of any organization. Actionable Intelligence sits within the Lean model, well beyond the vanity metrics that so many leaders have embraced. Lessons on implementing a secure framework come from organizations including Estee Lauder, Procter & Gamble, Lifetime Brands and the CIA. Yes, the CIA.
Reading this book I found tested lessons from Keith B. Carter on the lack of actionable intelligence in many organizations. The starting point always seems to be disorganized data and determining which data is most pressing to actually use in order to succeed in a fast-changing world.

Perhaps his most powerful material addresses how executives at any company (or university) still question the value of actionable intelligence regardless of the tools already in place. Too many siloed teams reinvent the wheel while overlooking the need to understand their own data reporting methods.

Carter also covers sustaining the delivery of actionable intelligence through the evolution from Dashboards to Cockpits. IMHO too many university leaders are just beginning to understand the Dashboard, and their tools miss the Cockpit opportunities.

The business lessons alone, describing how to mine actionable intelligence, prove the validity of this book. Lessons from Estee Lauder include how the company leveraged secure data reporting to adjust following the powerful Japanese earthquake and tsunami that triggered the Fukushima Daiichi nuclear disaster. In some ways Carter points to crisis as the moment executives finally embrace actionable intelligence:

People do not trust data, they trust other people and their opinion of the data. So when the data owners, the people who input the data and/or use it, raise their hands and say, “This data is good; I trust it,” that will make it more likely for other people in the organization to believe it. It also means that it’s clear. It’s not just that they trust it from the point that 1 + 1 = 2. It is also clear how the data has to be used, and the definition of the data is clear.

Carter helps break down the old data principle "People don't trust data – they trust other people." It's true. Estee Lauder's use of actionable intelligence is a model every organization should strive toward in order to stay competitive.

Categories
Cloud Cyberinfrastructure Education Innovation Network Reading Technology

Latest read: Big Data Using smart big data

There is nothing boring in the established insights and support within Big Data: Using smart big data analytics and metrics to make better decisions and improve performance.
Read just one 'big data' book and you've read them all, right? Not so fast. Big Data: Using smart big data analytics reveals how a well-planned understanding of your business can better embrace select data sets, allowing your company, organization or school to not only remain competitive but thrive in the new global economy.

For all of the Big Data blogs, books and white papers that I have read, Big Data – smart big data analytics by Bernard Marr is one of the better-written books. Many will benefit from this knowledge.

Bernard Marr's ongoing challenge is addressing what most senior managers do not understand about Big Data, and he does an admirable job in chapter six: Transforming Business. There are so many examples of how Big Data can actually redefine your objectives, yet many are surrounded by a sales approach I recall from Apple's Michael Mace. Michael was Director of Competitive Analysis at Apple and eloquently addressed FUD – Fear, Uncertainty & Doubt – at a sales conference in the 90s. The same lessons apply today regarding Big Data.

Categories
Cloud Education Google Internet2 Network Reading Technology WiscNet

Latest read: Reliability Assurance of Big Data in the Cloud: Cost-Effective Replication-Based Storage

While focused on the task of generating data for astrophysics, Reliability Assurance of Big Data in the Cloud is a worthy read for anyone designing cloud service contracts.
The work of authors Yang, Li and Yuan surrounds capturing big data reliability and measuring disk storage solutions, including those from noted cloud vendors.

Their work at the Centre for Astrophysics and Supercomputing at Swinburne University of Technology focused on methods for reducing cloud-based storage cost and energy consumption.

They also share the impact of multiple replication-based data storage approaches built upon Proactive Replica Checking for Reliability (PRCR). That aspect of their research data gathering was very interesting.
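The intuition behind replication-based reliability is easy to sketch. This is not the authors' PRCR algorithm itself, just a minimal illustrative model: assuming independent replica failures with some per-period failure probability, you can compute the smallest replica count that meets a reliability target, which is why proactive checking can get away with fewer than the conventional three copies.

```python
def min_replicas(p_fail: float, target: float) -> int:
    """Smallest replica count n such that 1 - p_fail**n >= target.

    Assumes independent replica failures with probability p_fail per
    period -- an illustrative model, not the PRCR algorithm itself.
    """
    n = 1
    while 1 - p_fail ** n < target:
        n += 1
    return n

# With a 1% per-period disk failure probability:
print(min_replicas(0.01, 0.999))    # two replicas suffice for three nines
print(min_replicas(0.01, 0.99999))  # five nines pushes you to three
```

The takeaway for contract language: the replica count a vendor maintains follows directly from the reliability figure in the SLA, so the two should be negotiated together.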

I found Reliability Assurance of Big Data in the Cloud also addresses moving data into the cloud across advanced research networks, including Internet2.

Processing raw data inside the data center impacts network models (based upon available bandwidth) in their work. Their research gathers and stores 8-minute segments of telescope data, each generating 236GB of raw data. By no means in the petabyte stage (yet), but it still builds a solid understanding of contractual demands on big data cloud storage.
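That scaling claim is easy to sanity-check. A quick back-of-the-envelope, assuming hypothetically that capture ran around the clock (which the project does not necessarily do), shows how quickly 236GB per 8-minute segment would reach the petabyte range:

```python
# Figures from the book: 236GB of raw data per 8-minute segment.
segment_gb = 236
segment_minutes = 8

gb_per_minute = segment_gb / segment_minutes       # 29.5 GB per minute
gb_per_day = gb_per_minute * 60 * 24               # if captured continuously
tb_per_day = gb_per_day / 1000
pb_per_year = gb_per_day * 365 / 1_000_000

print(f"{gb_per_minute} GB/min, {tb_per_day} TB/day, {pb_per_year:.1f} PB/year")
```

Continuous capture at that rate lands in the double-digit petabytes per year, which is exactly why the SLA and pricing terms deserve scrutiny before the data volume arrives.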

My interest was piqued by the implications for developing knowledgeable cloud service contracts. Their background in data gathering and processing should influence procurement contract language. This is even more applicable when applied to petabyte data sets and the SLAs covering data reliability requirements. Never leave money on the table when scaling to the petabyte range. A must-read for purchasing agents and corporate (and university) CPSMs.

Categories
Cyberinfrastructure Design Education Google Innovation Internet2 Network OpenSource Reading Technology Vietnam War

The Vietnam War: Unstructured data reporting and counterinsurgency

After reading No Sure Victory: Measuring U.S. Army Effectiveness and Progress in the Vietnam War, I could not help but think about how the consequences of failed data reporting by MACV can serve as a historical lesson for re-implementing or adjusting campus data reporting systems.

Data report tickets used by MACV in the early stages of the Vietnam War

Key stakeholders on campus should be able to easily state their reasons for data collection and reporting. No Sure Victory benefits campus units by revealing an early (dare I say Big Data) approach to unstructured data reporting and delivering actionable data.

Today we would immediately turn to Google Compute Engine or Amazon Elastic MapReduce in the cloud for this demand.

Universities can thrive with diverse reporting teams. Working with Institutional Research to improve enrollment and retention efforts is a key goal, yet important roles are filled by student workers. Here unstructured data often fragments through mismanagement: many ad hoc Microsoft Excel documents are created without data governance and become siloed from the campus data warehouse. Key stakeholders on any campus include CIOs, IR Directors, research staff, Program Directors, campus data report writers and student workers. Even seasoned campus data report writers are not leveraged to streamline actionable data insights.

No Sure Victory brings to light a tragically failed data reporting implementation by Secretary of Defense Robert McNamara in addressing the war in Vietnam. It was built on his reputation as one of the Whiz Kids, the World War II Statistical Control unit that analyzed operational and logistical data to manage the war.

Categories
Cloud Education Innovation Network Reading

Latest read: Disruptive Possibilities

There can be no doubt today that Big Data has changed everything. Jeffery Needham has written a great book, Disruptive Possibilities: How Big Data Changes Everything. It's all about the impact of Hadoop in the cloud as the ultimate computing platform.

I was very pleased, reading his work, to find his personal story at the end regarding the application of Hadoop in neuroscience as a method to address Sturge-Weber Syndrome. We know it as having a port-wine stain on the face.

His story made me appreciate his desire to throw Hadoop at the datasets that may one day reveal a cure for this syndrome. I am amazed at how he described reteaching himself not only how to walk down a hallway but also how to train his body to hit a baseball after losing vision in his right eye.

My favorite segment of Disruptive Possibilities is chapter five: When Clouds Meet Big Data. Needham also makes chapters one through four a very easy read, laying the foundation from his deep experience with Hadoop. And yes, you can run Hadoop on laptops found in a dumpster.

There is much to learn in university circles about the impact of Disruptive Possibilities and Hadoop. Worry not, it's not the computing or research units that I am thinking about, but rather HR, Admissions and just about every other campus unit that would benefit from moving their data into a Hadoop cluster in order to data mine their future.
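The programming model behind Hadoop is simple enough to sketch without a cluster: a mapper emits key/value pairs and a reducer aggregates them. Here is a toy in-process version counting admissions inquiries by program; the field names and records are hypothetical, and in real Hadoop Streaming the map and reduce steps would be separate scripts reading stdin across many machines.

```python
from collections import defaultdict

def mapper(record):
    # Emit (program, 1) for each admissions inquiry (hypothetical schema).
    yield record["program"], 1

def reducer(pairs):
    # Sum the values for each key, as a Hadoop reducer would per key group.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

records = [
    {"program": "Nursing"},
    {"program": "Business"},
    {"program": "Nursing"},
]
result = reducer(pair for rec in records for pair in mapper(rec))
print(result)  # {'Nursing': 2, 'Business': 1}
```

The same map-then-aggregate shape scales from three dictionaries on a laptop to years of campus records on a cluster, which is exactly the appeal Needham describes.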