Categories
Cloud Cyberinfrastructure Education Globalization Innovation Network Reading Technology

Latest read: The Master Switch: The Rise and Fall of Information Empires

Tim Wu’s second book The Master Switch: The Rise and Fall of Information Empires is a wonderful examination of how American information empires were established and, at the same time, stifled innovation. This is my second book by Wu, following his brilliant Who Controls the Internet?
Wu identifies long business cycles surrounding the birth of information systems. While these systems begin open, over time they are consolidated and driven by the market to become closed.

He shows how they become open again when breakthrough innovations force a business to change in order to survive in the new marketplace.

The Master Switch opens with the birth of the Bell AT&T telephone monopoly. It is a fascinating story when held against the garage startups of Apple and Google.

There is also an amazing look at how countries and cultures view information empires differently. Wu’s example is the American capitalist, independent-market approach to radio versus the UK’s BBC, a state monopoly operating under Royal Charter.

The Master Switch also reveals how four key markets effectively function as national infrastructure: telecommunications, banking, energy and transportation. For generations these industries and their capitalist owners suppressed any citizen’s attempt to challenge their monopolies. The lesson Wu draws is about corporate control through closed technologies. Yet one cannot help but see how they, almost magically, shielded the country from the devastating upheaval leading up to World War I and, more importantly, from its horrific aftermath, which forever removed Paris as the hub of film entertainment.

Categories
Cloud Education Google Internet2 Network Reading Technology WiscNet

Latest read: Reliability Assurance of Big Data in the Cloud: Cost-Effective Replication-Based Storage

While focused on the task of generating data for astrophysics, Reliability Assurance of Big Data in the Cloud is a worthy read for anyone designing cloud service contracts.
The work of authors Yang, Li and Yuan centers on capturing big data reliability and measuring disk storage solutions, including offerings from noted cloud vendors.

Their work at the Centre for Astrophysics and Supercomputing at Swinburne University of Technology focuses on methods for reducing cloud-based storage cost and energy consumption.

They also examine the impact of multiple replication-based data storage approaches built around Proactive Replica Checking for Reliability (PRCR), which was one of the more interesting aspects of their research and data gathering.
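To put the replication question in concrete terms, here is a minimal back-of-the-envelope sketch in Python comparing conventional three-replica storage against a reduced-replica scheme in the spirit of PRCR. The data size, price and replica counts are my own illustrative assumptions, not figures from the book.

```python
# Rough monthly storage-cost comparison: conventional 3-way replication vs. a
# reduced-replica scheme in the spirit of PRCR. All numbers are illustrative
# assumptions, not figures from Yang, Li and Yuan.

def monthly_storage_cost(data_tb, replicas, usd_per_tb_month):
    """Cost of keeping `replicas` copies of `data_tb` terabytes for one month."""
    return data_tb * replicas * usd_per_tb_month

data_tb = 500               # hypothetical data set size
usd_per_tb_month = 25.0     # hypothetical cloud storage price

conventional = monthly_storage_cost(data_tb, replicas=3, usd_per_tb_month=usd_per_tb_month)
reduced = monthly_storage_cost(data_tb, replicas=2, usd_per_tb_month=usd_per_tb_month)

print(f"3-way replication:               ${conventional:,.0f}/month")
print(f"2 replicas + proactive checking: ${reduced:,.0f}/month")
print(f"potential saving:                ${conventional - reduced:,.0f}/month")
```

The point of the sketch is simply that replica count is a contract-level knob; the authors’ argument, as I read it, is that proactive replica checking lets you lower it without sacrificing the reliability you would write into an SLA.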

I found that Reliability Assurance of Big Data in the Cloud also addresses moving data into the cloud across advanced research networks, including Internet2.

In their work, processing raw data inside the data center shapes the network model, depending on available bandwidth. Their research gathers and stores 8-minute segments of telescope data, each generating 236GB of raw data. That is by no means in the petabyte range (yet), but it still provides a solid understanding of the contractual demands of big data cloud storage.
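As a quick sanity check on what that figure implies for network planning, here is a small Python sketch built on the 236GB-per-8-minutes number; the 10 Gbps link speed is an assumed example of a research-network path, not a number from the book.

```python
# Back-of-the-envelope bandwidth math for the 236 GB per 8-minute telescope segment.
# The 10 Gbps link is an assumed example; real transfers add protocol overhead.

segment_gb = 236             # raw data per observation segment (figure from the book)
segment_seconds = 8 * 60     # one 8-minute segment

sustained_gbps = segment_gb * 8 / segment_seconds    # rate needed to keep up in real time
daily_tb = segment_gb * (24 * 60 / 8) / 1000         # volume if captured around the clock

link_gbps = 10               # assumed research-network link speed
transfer_minutes = segment_gb * 8 / link_gbps / 60   # ideal time to ship one segment

print(f"sustained rate to keep up: {sustained_gbps:.1f} Gbps")
print(f"around-the-clock volume:   {daily_tb:.1f} TB/day")
print(f"one segment over {link_gbps} Gbps:  {transfer_minutes:.1f} minutes")
```

Even at this modest scale the sustained rate works out to roughly 4 Gbps and over 40 TB a day of around-the-clock capture, which is why available bandwidth and research networks like Internet2 belong in the contract conversation.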

My interest was piqued by the implications for developing knowledgeable contracts for cloud services. The authors’ background in data gathering and processing should influence procurement contract language, and it becomes even more applicable with petabyte data sets and the SLAs governing data reliability requirements. Never leave money on the table when scaling to the petabyte range. This is a must read for purchasing agents and corporate (and university) CPSMs.

Categories
Cyberinfrastructure Education Globalization Google Innovation Milwaukee Network Technology

University cloud computing contracts

Did you hear about the university professor who signed up for a cloud service and unknowingly left his department on the hook for two years of service beyond his grant… or the university that had more than 500,000 student records (Social Security numbers, addresses and grades) hacked? Cloud computing poses special demands upon universities, which can no longer employ the same procurement process they have used to acquire computers and software since the 1980s.

Are you aware that many universities (and K-12 school districts) today use a popular email marketing program that sells students’ contact information to vertical marketing firms, who in turn re-sell it to other marketing and product companies?

Today’s aggressive marketplace and the business of cloud services have radically changed the procurement process. Many of us have a fiduciary duty to protect the data of our students, our research and our institutions. Regardless of how freely students give away their data on Facebook, our institutions will still be held responsible for protecting all of their data.

My views on the impact of Cloud Computing in Higher Education have been slowly evolving. This past May I was given an incredible opportunity to further my learning by participating in an Engineering & Technology Short Course with the UCLA Extension.
Remember those “must-take classes” in college? UCLA’s Contracting for Cloud Computing Services is on my list of opportunities you cannot afford to ignore. My advice: find your way to UCLA.

Again, I hope this helps as many people as possible understand the lessons taught in class. Due to the nature of the beast they are in no specific order; they are all top-level concerns:

BACKGROUND
For over a generation, traditional desktop PC vendors focused on features and price. Since the late 1980s schools have placed their trust in vendors’ products to conduct business, educate students and store student data. From floppy disks to magnetic tape, all data was stored locally on campus.

Today’s globalized internet marketplace is radically different from the modem era of computing. The cloud computing model represents a number of fundamental shifts: Software as a Service (SaaS), Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) are all well established.

And although it’s a bit further out on the radar, we should not overlook the quickly emerging Supercomputer as a Service. While there is no standard acronym, there are established vendors like SGI’s Cyclone, Amazon’s Cluster Compute and IBM’s Watson, and with the forthcoming merger between PiCloud and D-Wave‘s quantum computing, more High Performance Computing options will become available to smaller, lean and aggressive institutions.

These new services are directly tied to the “consumerization” of technology: advanced technologies at affordable price points. As a result, the new focus is on access. The shift to mobile computing via netbooks, smartphones and tablets is well underway, yet many schools do not have sufficient wireless infrastructure. Students, faculty and administrators today carry a laptop, a smartphone and probably an iPad. Schools are struggling to handle the bandwidth demands of so many devices in concentrated areas around campus, from the Student Union to the ResHalls.

IMHO the tipping point with cloud computing and digital devices is the convenience of access. Today many diverse schools have a campus community that simply demands anytime/anywhere access to data. And it’s no longer just email and web. It’s big data, from database research to the delivery of HD media. For better (or for worse), society has become trained to demand mobile solutions that integrate easily into the app economy and their mobile lifestyles.

Categories
Cyberinfrastructure Design Education Globalization Innovation Internet2 Network Technology

Internet2: Arthron – A Tool for Video Streaming Remote Management in Artistic Performances Experiences

Arthron was conceived for experiences in the domain of Art and Technology. Its facilities include a simple user interface and the manipulation of different media sources. Users can remotely add, remove and configure the presentation format as well as schedule media streams during an artistic performance.
Arthron is composed of six components, described as follows. The Articulator is responsible for remote management and concentrates a large part of Arthron’s functionality, including stream scheduling (manual or automatic), network monitoring and measurement, remote configuration of the other modules, access control, automatic generation of web pages for online publication, video effects, and communication tools. The Encoder is responsible for capturing and, when necessary, encoding a media source, which can be external (a DV or HDV camera, a DVD) or internal (a local file). The Decoder’s main functionality is to decode and display the media stream on a specific device (monitor, projector, etc.). The Reflector is responsible for the replication and redistribution of media streams over the network.

The VideoServer component transcodes media streams that will be published online and is responsible for working with the flv, ogg and h264 formats. The MapManager controls and displays an interactive map of Arthron components, offering users an overview of their geographically distributed locations.
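To keep the six components straight, here is a minimal sketch in Python of how I picture the roles fitting together. The class names follow the description above, but the wiring, method names and sample values are my own illustrative assumptions, not the actual Arthron API.

```python
# Illustrative model of the six Arthron components described above. The names
# follow the presentation; the wiring and methods are assumptions, not the real API.

class Encoder:
    """Captures and, when necessary, encodes an external or internal media source."""
    def capture(self, source: str) -> str:
        return f"stream({source})"

class Reflector:
    """Replicates and redistributes a media stream over the network."""
    def fan_out(self, stream: str, destinations: list) -> dict:
        return {dest: stream for dest in destinations}

class Decoder:
    """Decodes and displays a stream on a specific device (monitor, projector, ...)."""
    def display(self, stream: str, device: str) -> None:
        print(f"showing {stream} on {device}")

class VideoServer:
    """Transcodes streams for online publication (flv, ogg, h264)."""
    def publish(self, stream: str, fmt: str = "h264") -> str:
        return f"{stream} published as {fmt}"

class MapManager:
    """Displays an interactive map of where the components are deployed."""
    def locate(self, component: str, venue: str) -> None:
        print(f"{component} deployed at {venue}")

class Articulator:
    """Central manager: schedules streams and remotely configures the other modules."""
    def __init__(self) -> None:
        self.encoder, self.reflector = Encoder(), Reflector()
        self.decoder, self.server = Decoder(), VideoServer()

    def schedule(self, source: str, venues: list) -> None:
        stream = self.encoder.capture(source)
        for venue, copy in self.reflector.fan_out(stream, venues).items():
            self.decoder.display(copy, device=venue)
        print(self.server.publish(stream))

Articulator().schedule("HDV camera", ["projector-stage-left", "monitor-foyer"])
MapManager().locate("Decoder", "remote-venue")
```

The value of the real system, of course, is that the Articulator does all of this remotely and live during a performance; the sketch only shows how the responsibilities divide.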

Categories
Design Education Globalization Innovation Reading Technology

Latest read: Cognitive Surplus

Remember the last time you read a great story and caught yourself peeking at the remaining unread pages because you didn’t want the story to end? That’s how I can best describe Clay Shirky‘s book Cognitive Surplus: Creativity and Generosity in a Connected Age. His stories were coming to a close before I was ready to put the book down.

This is a great follow-up to his first book Here Comes Everybody: The Power of Organizing Without Organizations. Shirky is right on target, using engaging, connected stories to share his ideas about our newfound ability to pool collective knowledge.

Over 1 trillion hours of TV are watched per year. Imagine what can happen when people turn the TV off and begin contributing. Shirky also elegantly describes the shifting nature of professionals vs. amateurs in the age of the internet. Pretty amazing reading.

I believe there have been attempts to move in the direction he outlines, but the tipping point has been the mass availability of consumer devices at very affordable price points. I recall Peter Gabriel‘s interview on the Today Show in 1988, talking about Amnesty International’s efforts to videotape human rights abuses with large, analog cameras.
Today we know all too well from the murder of Oscar Grant that cameraphones have made those efforts real.

The Napster thing
IMHO Clay’s single oversight in the book concerns Napster. I think he was trying to communicate a holistic answer to why people (not just Gen Xers) were stealing music. He called it sharing; it was stealing, plain and simple.