I just solved the most confounding home IT mystery in recent memory, and I want to describe what happened. For two months, I have been receiving some, but not all email sent to me. At first, I didn’t know I was missing any mail. But increasingly, people would see me and say “Hey, did you get my email about…” and I would just stare back in confusion. Sometimes, it would go the other way: I would wonder why so-and-so hadn’t responded to me and wasn’t that rude of them? I was starting to wonder if my mind was going way ahead of schedule.
If you have any experience with email, the first thing you’ll think is “SPAM filter, dummy.” Well, I do have some experience with email and I have a filtering setup that has been working great for me for over seven years. After checking all the reasonable locations for my missing messages, I started checking unreasonable locations, to no avail.
That’s when I first spotted it: I was doing some work and my email inbox was partially visible in another window. A message that I would like to have read popped into view in the corner of my eye. I shifted my attention to it and one second later POOF, it was gone. Now I had proof that my messages were making it all the way to my machine before disappearing. I assumed it must have been moved somewhere I didn’t intend. I searched everywhere: trash, spam, archives, pending, and any of a number of organizing boxes. I even tried grepping my email cache. It was just gone.
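If you ever find yourself hunting for vanished mail, one quick way to rule out a message having been refiled somewhere unexpected is a server-side sweep of every IMAP folder. Here is a minimal sketch using Python's standard imaplib; the host, account, and password are placeholders, and it assumes the server quotes mailbox names in its LIST responses:

```python
import imaplib

# Placeholders -- substitute your own server and credentials.
HOST = "imap.example.com"
USER = "user@example.com"
PASSWORD = "app-specific-password"

def folder_name(raw: bytes) -> str:
    """Pull the mailbox name out of one IMAP LIST response line,
    e.g. b'(\\HasNoChildren) "/" "INBOX"' -> 'INBOX'."""
    return raw.decode().split('"')[-2]

def find_message(subject_fragment: str) -> list:
    """Return the names of every server-side folder containing a
    message whose subject includes subject_fragment."""
    hits = []
    with imaplib.IMAP4_SSL(HOST) as conn:
        conn.login(USER, PASSWORD)
        _, folders = conn.list()
        for raw in folders:
            name = folder_name(raw)
            # Read-only select, so the sweep can't itself move anything.
            status, _ = conn.select(f'"{name}"', readonly=True)
            if status != "OK":
                continue
            _, data = conn.search(None, f'(SUBJECT "{subject_fragment}")')
            if data[0]:
                hits.append(name)
    return hits
```

Each folder is opened read-only, so the search cannot disturb anything while you investigate.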
I tried turning off all my processing rules (including the SPAM filter). No change, messages still blinked out of existence within a few seconds. This was when I started suspecting one of my other devices. I have a desktop, a laptop (actually, it’s my wife’s, on which I have an account), a phone, and a tablet, all of which are capable of reading, composing, moving, and trashing email. I started turning them off one by one. This is what finally led me to the answer.
Every once in a while, I’ll borrow my wife’s MacBook Air when she’s not using it and I need to be mobile. I have the same email setup on it as on my desktop machine (namely, Apple Mail and Michael Tsai’s SpamSieve). I ran afoul of a problem whereby SpamSieve’s plugin for Mail became disabled, which prevents it from distinguishing between good mail and bad. Whenever the laptop received any messages, it would move all of them to the Spam folder, which is a local directory (i.e. not shared over the network). Since all my accounts use IMAP, Apple Mail would dutifully update their respective servers that the mail had been moved offline.
What was fiendish about this problem is that I would only miss messages when Mail was running on the laptop, and that was only once in a while. The silver lining is that all those missing messages were sitting safely in the laptop’s Spam directory all along. My stomach rolled over this morning as I took inventory of everything over the past two months that I should have seen.
I sincerely apologize for any communications you were expecting from me and didn’t get. After so many years of living and breathing computer stuff, I thought I was inoculated against anything as mundane as losing email. If you’ll excuse me, I have a couple thousand messages to attend to.
These points were noted at the WebLearn Bytes: Tests and quizzes held on 26 November 2013, and may be helpful to other WebLearn users:
The Tests tool offers six question types: Multiple choice, True/False, Fill-in-the-blank, Matching, Essay and Task. Spend some time planning your test questions and formulating each one to fit the most appropriate of the available question types.
All questions in a question pool must have the same points value (i.e. score). It is advisable to put all matching-type questions into one pool, with a reasonable points value (e.g. 5 per question, if they contain five matches). Similarly you might want to put all essay/task questions into one pool with zero points – you will manually assign scores to these questions when you mark the test.
There is also a Likert scale question type, and most of the other questions can be turned into survey questions (i.e. there is no right/wrong answer and the score is zero). If you do use any survey-type questions, it is advisable to put them all into one question pool. It is preferable to use the specialised Surveys tool to design and conduct full surveys.
Essay and Task questions are not classified as ‘objective questions’ (meaning they cannot be marked automatically by the system) – you can go into ‘Marking’ and manually assign a score and a comment for these questions.
The Tests tool is designed for tests to be administered to named site participants in a WebLearn site. If you wish to open the test to broader groups of Oxford users, please contact the WebLearn team (email@example.com), since a special parameter needs to be set.
Question pools, assessments and parts all have titles, so it’s helpful to use meaningful titles that describe what the item is, e.g. “Pool of Matching-type questions”; “Test for Michaelmas Term 2013”; “Section A” respectively.
You can export the list of names and total scores to Excel for further manipulation of the data and generation of your own summary reports. Within the Tests tool, you can generate ‘Summary data’ — this shows the usage statistics per question, which can help to refine and improve your questions for future use.

Useful links:
- Step-by-step guide: Tests – getting started
- Step-by-step guide: Tests – more details
- Contact the Oxford Learning Institute for information on the theory of objective testing
- Leeds University Staff and Departmental Development Unit: Designing objective test questions
- Contact the WebLearn team: firstname.lastname@example.org with any questions about the Tests tool
- Book to attend Computer8 on Friday mornings during term time for free personal consultation
WebLearn is scheduled to be upgraded to version 2.8-ox8.1 on Tuesday 3 December 2013, 7–9am. There will be no service during this period.
This release includes bug fixes for:
- broken right mouse button functionality in the WYSIWYG editor
- problems associated with subscribing to calendars that have events with attachments
- failure of Oxford Podcasts to work with Internet Explorer
- poor Researcher Development Framework tagging of Graduate Training courses
- failure to update internal subgroups comprising more than one ‘role group’
- inability to assign public access to files or folders if all Oxford users have already been granted access
We apologize for any inconvenience that this essential work may cause.
It’s fair to say that Purdue University has sparked several important conversations in ed tech through their work on Course Signals. First, they pretty much put the retention early warning system as a product category on the map, conducting ground-breaking research and building a system that several major ed tech players have either licensed or imitated. More recently, they have sparked a conversation about the state of ed tech research and peer review as their more recent research has been called into question. I highly recommend reading the comment threads on these two posts to get a sense of that conversation.
Now I think Purdue may spark a third conversation—this time around the ethics of institutional learning analytics research and commercialization. Because there is no question in my mind that they have a serious ethical problem on their hands.
While I have no proof that Purdue is aware of the concerns that have been raised about the Course Signals research, I think it highly unlikely that they are unaware, after articles have been published in Inside Higher Ed and the Times Higher Education. The questions have been out for a month now, and so far we have nothing in the way of an official response from the university.
That’s a big problem for several reasons. First, as has been mentioned here before, Purdue has licensed its technology to Ellucian for sale to other schools. In other words, the university is effectively making money on the strength of research claims that have now been called into question. Second, the people who conducted and published the research are not tenured faculty but non-tenurable staff, and they did so using institutional data to which Purdue ostensibly controls access. It seems overwhelmingly likely that the researchers whose work is being challenged are effectively powerless to respond without permission and support from their institution. If so, then these people are being put in a terrible position. They are listed as the authors of the research, but they do not have the power that an academic Principal Investigator would have to be properly accountable for the work.
For both of these reasons, I believe that Purdue has an ethical obligation as an institution to respond to the criticism. Since they seem disinclined (or at least slow) to do so of their own accord, perhaps some appropriate pressure can be brought to bear. If you are an Ellucian customer, I urge you to contact them and ask why there has not been an official response to the challenge regarding the research. Both of the partners here should know that their brand reputations and therefore future revenue streams are at stake here. (I would be grateful if you would let me know, either publicly or privately, if you take this step. I would like to keep track of the pressure that is being brought to bear. I will keep your name and that of your institution private if you want me to.)
But I also think there is a broader conversation that needs to happen about the general problem. On the one hand, schools have an obligation to protect the privacy of their students. This makes releasing student success research data challenging. On the other hand, if the research cannot be properly peer reviewed because it cannot be shared, then we cannot develop confidence in the research that is coming to us. This problem is exacerbated when research is conducted by staff whose independence is not protected, and by the increasing tendency of institutions to commercialize their educational technology research and development work. There needs to be a community-developed framework to help facilitate the safe and appropriate sharing of the data so institutions can be held accountable for their research and the staff who conduct that research can be appropriately protected.
Raise interest rates on old student loans, report by Rothschild Invest – involved with the 900M sell-off http://t.co/nCqkLfnjcs #projectariel
24 November 2013 – Series: Run Your Race, week 5, “Those who ran the race to the finish, part 4”. Messenger: Pastor Hideyuki Okubo / Pastor Hide. Message notes: http://www.lighthousechurch.jp/message.html Sunday service times: 11:00–12:15 and 14:30–15:30 (J-on*). *A service centred on youth and young adults, but of course everyone is welcome. Lighthouse Christian Church, 3-6-19 Sunamichi-cho, Sakai-ku, Sakai, Osaka. http://www.lighthousechurch.jp
24 November 2013 – Series: Change Your World in 52 Days (52 days that change your life), week 6, “Seeing it through to the end”. Messenger: Pastor Hideyuki Okubo / Pastor Hide. Message notes: http://www.lighthousechurch.jp/message.html Sunday service times: 11:00–12:15 and 14:30–15:30 (J-on*). *A service centred on youth and young adults, but of course everyone is welcome. Lighthouse Christian Church, 3-6-19 Sunamichi-cho, Sakai-ku, Sakai, Osaka. http://www.lighthousechurch.jp
Douglas Belkin wrote an article yesterday in the Wall Street Journal based on a study from Moody’s Investors Service. The lede of the article is that “nearly half of the nation’s colleges and universities are no longer generating enough tuition revenue to keep pace with inflation”, which comes from Moody’s interest in institutional financial stability, but I think there are other lessons available. The revolution in college pricing is quiet, but its effects are profound.
It is worth noting that the big changes are based on FY2013 and FY2014 figures, which include survey-based estimates and projections rather than hard data. The article is behind a paywall, but here are some relevant excerpts (read the whole article if you can).
Nearly half of the nation’s colleges and universities are no longer generating enough tuition revenue to keep pace with inflation, highlighting the acceleration of a downward spiral that began as the recession ended, according to a new survey by Moody’s Investors Service.
Technically speaking, the US recession ended in June 2009, so the claim that the downward spiral began as the recession ended does not match the data. The downward spiral started in the 2012–13 school year, fully three years later. Other correlations might be more relevant, such as student loans and defaults, acceptance that job prospects are not rebounding, etc.
The survey of nearly 300 schools reflects a cycle of disinvestment and falling enrollment that places a growing number at risk. While schools for two decades were seeing rising enrollments and routine increases of 5% to 8% in net tuition, many now are facing grimmer prospects: a shrinking pool of high-school graduates, depressed family incomes and precarious job prospects after college.
These are all good points, but I would extend this argument to say that families are now becoming value shoppers for college certificates and degrees. While there is still strong sentiment that the investment in college will pay off over time, families and students want to minimize the investment amount (and risk).
The softening demand for four-year degrees is prompting schools to rein in tuition increases while increasing scholarships. Those moves are cutting into net tuition revenue—the amount of money a university collects from tuition, minus school aid.
For 44% of public and 42% of private universities included in the survey, net tuition revenue is projected to grow less than the nation’s roughly 2% inflation rate this fiscal year, which for most schools ends in June. Net tuition revenue will fall for 28% of public and 19% of private schools.
What the data shows is not stasis followed by an accelerating drop-off, but a reversal: net tuition revenue went from increasing far above the inflation rate to increasing far below it. Growth of 5% to 8% in net tuition revenue was not sustainable, and it is likely one of the major causes of the dramatic changes of the past few years.
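As a back-of-the-envelope illustration of what “growing less than inflation” means, here is a quick sketch; the figures are hypothetical, not taken from the Moody’s survey:

```python
def real_growth(nominal_growth: float, inflation: float) -> float:
    """Inflation-adjusted (real) growth rate."""
    return (1 + nominal_growth) / (1 + inflation) - 1

# A hypothetical school: 1% nominal net-tuition growth against 2% inflation.
print(f"{real_growth(0.01, 0.02):.2%}")  # roughly -1%: revenue shrinks in real terms
```

A school whose net tuition revenue grows 1% while inflation runs 2% is, in purchasing-power terms, collecting less each year, which is the quiet erosion the survey is measuring.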
As Herbert Stein taught us, trends that can’t continue, won’t.
Keep in mind that we’re talking net tuition revenue, which subtracts out school aid but includes federal financial aid. The change in school revenue cannot be explained by a drop in federal financial aid, however.
This drop in net tuition revenue also cannot be explained by a drop in state spending on public institutions – that is another category.
“We don’t know where the bottom is; if we knew, we could structure appropriately,” said [U Louisiana President] Mr. Bruno, with regard to the budget cuts. The result: “We have to look at a different business model; we can’t just depend on our region anymore.”
The context of this comment is higher tuition for out-of-state students, but that is a losing strategy in my opinion. He does have a point with “we have to look at a different business model”.
Schools with the strongest brands are less vulnerable to these trends. For instance, as the international market consolidates, flagship state schools with strong reputations already established in foreign countries stand to benefit from their alumni networks. Midtier schools lacking a presence overseas will find it harder to break into new overseas markets.
This point is key, as it backs up two important observations on higher ed we have seen recently.
- The schools most at risk are those without strong name recognition. While that might be unfair, it seems to be a fact of life.
- While Clayton Christensen has taken some heat for his projection that “in 15 years from now half of US universities may be in bankruptcy”, the data from this survey lends some credibility to the concept.
As stated before, keep in mind that this is survey-based data, but it does provide insight into some important issues faced by US colleges and universities.