Over breakfast this weekend at a popular farmhouse restaurant, two high school teachers sat next to me discussing how their respective LMS solutions made teaching difficult. Both were from wealthy suburbs outside Milwaukee. What really piqued my interest was hearing how one spent over 45 minutes trying to add polling for in-class feedback.
I helped lead the adoption of a Moodle LMS at a private Wisconsin college in 2007 that is still in use today, and I also had the pleasure of attending a conference at UW-Madison with Martin Dougiamas, the founder of Moodle.
Yet over that breakfast I was intrigued by their difficulty with all things LMS for the upcoming school year. Frustrations ranged from one teacher receiving no LMS training (the polling example above) to the second teacher's district migrating to a new LMS vendor over the summer.
Of course, no technology discussion can avoid a teacher mentioning K12 servers going offline for hours during the school day, making their teaching even more difficult. It seems teachers have a lot to confront on a daily basis in delivering education to a classroom of twenty-plus students. A local LMS run from an empty closet is no longer acceptable.
While focused on the task of generating data for astrophysics, Reliability Assurance of Big Data in the Cloud is a worthy read for anyone designing cloud service contracts.
The work of authors Yang, Li, and Yuan centers on capturing big data reliability and measuring disk storage solutions, including those from noted cloud vendors.
Their work at the Centre for Astrophysics and Supercomputing at Swinburne University of Technology focused on methods to reduce cloud-based storage costs and energy consumption.
They also share the impact of multiple replication-based data storage approaches built upon Proactive Replica Checking for Reliability (PRCR), which was one of the most interesting parts of their research data gathering.
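To see why replica strategy matters for storage cost, here is a minimal back-of-the-envelope sketch. This is my own illustration, not the authors' PRCR algorithm; the per-GB price is hypothetical, and the premise is simply that proactively checking replica health can let you safely keep fewer copies than conventional triple replication.

```python
# Rough storage-cost comparison: conventional triple replication vs. a
# reduced-replica approach (the PRCR idea: proactive health checking
# lets fewer copies meet the same reliability target).
# All prices are hypothetical, for illustration only.

def monthly_storage_cost(dataset_gb, replicas, price_per_gb_month):
    """Total monthly cost of storing `replicas` full copies of the data."""
    return dataset_gb * replicas * price_per_gb_month

DATASET_GB = 236   # one 8-minute telescope segment, from the book
PRICE = 0.023      # hypothetical $/GB-month

triple = monthly_storage_cost(DATASET_GB, 3, PRICE)   # conventional 3 copies
reduced = monthly_storage_cost(DATASET_GB, 2, PRICE)  # checked, 2 copies

print(f"3 replicas: ${triple:.2f}/month, 2 replicas: ${reduced:.2f}/month")
print(f"savings: {100 * (triple - reduced) / triple:.0f}%")
```

At petabyte scale that one-third savings is exactly the kind of number that belongs in a procurement conversation.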
I found Reliability Assurance of Big Data in the Cloud also covers moving data into the cloud across advanced research networks, including Internet2.
Processing raw data inside the data center impacts their network models (based upon available bandwidth). Their research gathers and stores 8-minute segments of telescope data, each generating 236GB of raw data. That is by no means in the petabyte stage (yet), but it still sets a solid understanding of contractual demands on big data cloud storage.
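A quick back-of-the-envelope calculation (my own, assuming decimal gigabytes and that a segment must move in real time) shows why available bandwidth shapes their network models:

```python
# Sustained bandwidth needed to move one telescope segment in real time.
SEGMENT_GB = 236          # raw data per segment, from the book
SEGMENT_SECONDS = 8 * 60  # each segment covers 8 minutes

gbits = SEGMENT_GB * 8               # gigabytes -> gigabits
rate_gbps = gbits / SEGMENT_SECONDS  # sustained line rate required

print(f"{rate_gbps:.2f} Gbps sustained")  # roughly a 4 Gbps stream
```

Roughly 4 Gbps per instrument, sustained, is already beyond ordinary campus uplinks and points straight at networks like Internet2.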
My interest was piqued by the implications for developing knowledgeable contracts for cloud services. The authors' background on data gathering and processing should influence procurement contract language. This is even more applicable when applied to petabyte data sets and the SLAs regarding data reliability requirements. Never leave money on the table when scaling to the petabyte range. A must read for purchasing agents and corporate (and university) CPSMs.
Eight universities (including the UWisconsin System) have received NSF grant funding to help support a “CI Days” event at their campuses.
CI Days are intended to bring together various sectors of the campus (faculty, IT staff, librarians, administrators, students, and others) to better understand the needs and roles of each sector. It’s a case of “you don’t know what you don’t know” for almost every campus.
This Friday Wisconsin will introduce its initial CI Day event at UW-Milwaukee, with remote viewing supported around the state. It was great to hear WiscNet’s Shaun Abshere in today’s Q&A session discussing Friday’s coming event and the remote technologies that will support it.
UW CI Day event program at UW-Milwaukee.
Today small K12 school districts and colleges with fewer than 1,000 students are accustomed to accessing email around the clock. Email is habit forming at best and compulsive at worst. In the digital economy, funding in-house email services can be staggering. Hidden IT costs remain even as budgets are slashed.
Annual IT costs include legacy back-end email servers and software licensing (anti-spam, anti-virus, filtering, and backup) that must run 24/7, often from multiple vendors. Annual people costs include training and technical support, especially in a high-turnover environment.
Some legacy email solutions actually require a dedicated server that cannibalizes the CPU. They are not virtualization friendly. Think OpenText’s WorstClass FirstClass email server.
So what is the largest overlooked annual cost forgotten by IT and financial managers? Electricity. The cost to power all enterprise servers 24/7 can be rather shocking. The first time I collaborated on a private college’s annual budget I was surprised to learn total energy costs for just three buildings on a small campus ran above $260,000/year. Same rates apply for K12 districts with multiple buildings.
If your organization is running real industrial servers (1U or even 3U units) there are significant costs, regardless of rack, blade, or tower form factor. Many schools on tight budgets repurpose legacy Pentium desktops into “servers” along with old, energy-sucking CRT monitors. Not a good idea. Don’t be swayed by marketing and PR efforts for “green” servers, because they run all day and still cost a surprising amount over a five-year lease… you do lease your servers, right?
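To put a number on that overlooked electricity line item, here is a minimal sketch. The wattage and utility rate are hypothetical examples of my own, so plug in your own figures:

```python
# Annual electricity cost of running a server 24/7.
# The wattage and $/kWh rate below are hypothetical, not measured values.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours of continuous operation

def annual_power_cost(watts, dollars_per_kwh):
    """Yearly utility cost for a device drawing `watts` around the clock."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * dollars_per_kwh

# A single 1U server drawing ~450W at a hypothetical $0.11/kWh:
print(f"${annual_power_cost(450, 0.11):,.2f}/year")
```

Multiply that by a closet full of repurposed desktops and their CRT monitors and the “free” in-house email server stops looking free.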
The Large Hadron Collider (LHC), a particle physics project running at CERN, will generate 15 petabytes of data a year, and that requires a very robust network. Data generated by the LHC is being distributed to over 7,000 scientists worldwide and travels across the US Midwest via BoreasNet.
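For a rough sense of scale, here is a back-of-the-envelope sketch of my own, assuming the 15 PB is produced evenly over a calendar year (real experiment traffic is bursty, so peaks run much higher):

```python
# Average data rate implied by 15 petabytes per year.
PETABYTES_PER_YEAR = 15
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 s

gb_per_year = PETABYTES_PER_YEAR * 1_000_000   # PB -> GB (decimal)
avg_gbps = gb_per_year * 8 / SECONDS_PER_YEAR  # gigabits per second

print(f"{avg_gbps:.1f} Gbps average, around the clock")
```

Close to 4 Gbps of average load, every second of the year, is why research networks like BoreasNet and Internet2 carry this traffic rather than commodity links.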
In this video, CERN technologists discuss the network’s requirements and the TeraScale switches that connect 6,000 processors and 2,000 storage devices. TeraScale supports 672 line-rate Gigabit and 56 line-rate 10 Gigabit Ethernet ports per system, allowing CERN to deploy fewer systems and simplify the architecture of its network.
Tags: Large Hadron Collider, CERN, Network, Research, Internet2, BoreasNet, WiscNet, reading