We live in a world of constant change in IT. O'Reilly's The Changing Role of the CIO provides a foundation regarding Big Data for any IT team and every manager, executive, or board member. Today, if you're not embracing change, you're getting run over by it, whether you know it or not.
From the corporate boardroom to the campus research lab, we are indeed undergoing a fundamental paradigm shift in our digital lives.
Without a doubt, it is also an educational shift. Moving from Excel cubesets to a world of unstructured big data analytics will be a much needed training opportunity, not just for your IT team but for your entire workforce.
The Changing Role of the CIO is about the opportunities to engage your IT team around data. Today, data is fueling actionable analytics, not just vanity metrics. The IT team needs to embrace the idea that data is the new oil.
After leading your organization to a cloud solution that eliminates in-house legacy enterprise systems, you can never look back. Helping my organization migrate to a CMS public cloud that cut the cost of just one enterprise service by $400,000 annually meant our senior leadership never looked at me the same way. You gain a seat at the table.
And due to the nature of the mobile beast, The Changing Role of the CIO shows it's now easier than ever to measure quality engagements with your customers in real time. The future of data, and how it can be measured and immediately reported from within your office or from the other side of the world, is a game changer.
Tom Davenport's Big Data at Work is a good book for reviewing tested analytics case studies. As I began reading it, I found myself reading an update to Davenport's great analytics book Competing on Analytics, which I read in 2008 and which IMHO really set the standard. Big Data at Work is the follow-up, with tested business cases.
It seemed to take an eternity, but analytics are now recognized as a critical business strategy for universities. Peter Drucker said it best: if you cannot measure it, you cannot manage it.
While much shorter than his Competing on Analytics, Big Data at Work is a must-read. In higher education alone, Davenport's case studies can serve as near-perfect blueprints in the dynamic world of campus networks and services migrating to the cloud.
Davenport needs to convince nobody that Big Data is a growing field, yet even in 2014 the number of colleges offering degrees in big data science is not yet up to speed. More importantly, he shares how traditional business intelligence is struggling to adjust to the analytics and big data era.
For as much as Big Data at Work details the requirements in both technology and IT professionals, Davenport suggests that management stands in the way of more game changers outside of Silicon Valley. Yes, Hadoop and MapReduce have forever empowered LinkedIn, Google, Yahoo, and other startups. Healthcare, banking, and insurance are markets that have already embraced big data and are excited about what it can do for their customers.
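For readers new to the model, the map-then-reduce pattern behind those platforms can be illustrated in miniature. This is a plain-Python sketch of the idea, not actual Hadoop or MapReduce framework code:

```python
from collections import defaultdict

# Toy illustration of the MapReduce pattern:
# map each record to (key, value) pairs, group by key, then reduce each group.

def map_phase(records):
    for line in records:
        for word in line.split():
            yield word.lower(), 1   # emit (word, 1) for each occurrence

def reduce_phase(pairs):
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value        # sum the counts per word
    return dict(groups)

counts = reduce_phase(map_phase(["big data at work", "big data"]))
print(counts)   # {'big': 2, 'data': 2, 'at': 1, 'work': 1}
```

At scale, the map and reduce steps run in parallel across many machines; the counting logic stays this simple.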
Davenport is pretty upfront about what is needed: colleges have not fully embraced Big Data. Their mistake is assuming Big Data is a Computer Science degree. A good chapter of this book reflects on the inability of management to adopt Big Data for today's competitive market. Is it surprising to see only a handful of college programs sending grads to the likes of Google? More and more companies are looking to regional campus partnerships for Hadoop big data efforts. Yet many of those colleges still have no existing undergraduate or masters-level degrees in Big Data.
Held in Indianapolis in April 2010, the Intermedia Festival of Telematic Arts was a unique series of events presenting futuristic modes of live telematic and media arts by artists from throughout North America and Europe. Telematic art synthesizes the performing arts with computers, media, and telecommunications. Over 100 artists traveled to Indianapolis, while others participated remotely via Internet2.
A combination of art performances, including dance, music, visual arts, and videography, was integrated with commentary and discussion to create a compelling set of experiences. The session included an overview of the multi-institutional activity involving students, faculty, and administrators. Classes of students from Florida State University, Indiana University Purdue University Indianapolis (IUPUI), Butler University, the University of Calgary, the University of Cincinnati, and Indiana University Bloomington met online in the months prior to the festival to plan and rehearse their respective performances.
This session examined the presentation of telematic art to the general public via Internet2 at the downtown Indianapolis Public Library. This effort involved strategies to intermingle both high and low bandwidth venues into a seamless, integrated performance environment.
Supported by a grant from the NSF, eight universities (including the UWisconsin System) have been funded to help support a "CI Days" event on their campuses. CI Days are intended to bring together various sectors of the campus (faculty, IT staff, librarians, administrators, students, and others) to better understand the needs and roles of each sector. It's a case of "you don't know what you don't know" for almost every campus.
This Friday, Wisconsin will hold its initial CI Days event at UW-Milwaukee, with remote viewing supported around the state. It was great to hear WiscNet's Shaun Abshere in Q&A at this session today regarding Friday's coming event and the remote technologies that will be used.
Arthron was conceived for experiences in the domain of art and technology. Arthron's strengths include its simple user interface and its handling of different media sources. Users can remotely add, remove, and configure the presentation format, as well as schedule the media streaming during an artistic performance.
Arthron is composed of six components, described as follows. The Articulator is responsible for remote management. This component concentrates a large part of Arthron's functionality, such as stream scheduling (manual or automatic), network monitoring and measurement, remote configuration of other modules, access control, automatic web page generation for online publication, video effects, and communication tools. The Encoder is responsible for capturing and, when necessary, encoding the media source, which can be external (a DV or HDV camera, DVD) or internal (a local file). The Decoder's main function is to decode and display the media stream on a specific device (monitor, projector, etc.). The Reflector is responsible for the replication and redistribution of media streams over the network.
The VideoServer component transcodes media streams that will be published online; it works with the flv, ogg, and h264 formats. The MapManager controls and displays an interactive map of Arthron components, offering users an overview of their geographically distributed locations.
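Read as a pipeline, the division of labor among these components might be sketched like this. All class and method names below are illustrative stand-ins based on the description above, not Arthron's actual API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of Arthron's component roles: an Encoder captures a
# source, a Reflector replicates the stream to subscribed Decoders, and the
# Articulator wires them together on schedule.

@dataclass
class Encoder:
    source: str                       # e.g. an HDV camera or a local file
    def capture(self) -> str:
        return f"stream({self.source})"

@dataclass
class Decoder:
    device: str                       # monitor, projector, etc.
    def render(self, stream: str) -> str:
        return f"{stream} -> {self.device}"

@dataclass
class Reflector:
    # Replicates one incoming stream to every subscribed decoder.
    subscribers: list = field(default_factory=list)
    def subscribe(self, decoder: Decoder) -> None:
        self.subscribers.append(decoder)
    def relay(self, stream: str) -> list:
        return [d.render(stream) for d in self.subscribers]

class Articulator:
    # Central manager: routes a scheduled source through the reflector.
    def schedule(self, encoder: Encoder, reflector: Reflector) -> list:
        return reflector.relay(encoder.capture())

art = Articulator()
cam = Encoder("HDV camera")
hub = Reflector()
hub.subscribe(Decoder("projector"))
hub.subscribe(Decoder("monitor"))
print(art.schedule(cam, hub))
# ['stream(HDV camera) -> projector', 'stream(HDV camera) -> monitor']
```

The VideoServer and MapManager would sit alongside this pipeline, one transcoding streams for web publication and the other visualizing where each component is deployed.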
Research and education network organizations are beginning to successfully integrate new communities into their membership, and funding from the National Telecommunications and Information Administration's BTOP (Broadband Technology Opportunities Program) has created opportunities to build non-traditional communities from public sectors such as public safety, state government, and healthcare.
Today's panel session included representatives from network organizations and their new community partners, discussing their experiences and providing their perspectives on the opportunities, challenges, and lessons learned when building new communities.
David Reese, CTO of CENIC (Corporation for Education Network Initiatives in California), stated that they have provided iPads to everyone, that all mapping of fiber routes is now digital (Google Earth), and that paper maps are simply being ignored.