This was the midterm for my grad algorithms class. It was inspired by binge-watching seasons 1-4 of Breaking Bad (yes, I’m a bit behind).
My colleague Terran Lane recently left our department for a position at Google. He’s written a great essay on why he is leaving academia. Required reading for people who feel that universities have been a force for positive social change in the past thousand years or so – and also politicians, regents and administrators – not sure if those two sets still overlap. The 144 comments are particularly interesting.
Last week, I attended the CRA Snowbird conference for the first time. This conference is held every 2 years and is attended by chairs and associate chairs of departments in the U.S. and Canada, faculty members on boards of organizations like CRA, representatives from industry and funding organizations, and several big-name researchers. The conference was much more interesting than I expected. Below are my notes from the conference, which are (as always) sketchy, biased and incomplete.
1) John Hennessy – President of Stanford University (and former CS professor)
- Until the invention of the printing press, doubling rate for universities was about every 100 years
- Major university cost was the library
- After printing press, university cost decreased dramatically -> huge increase in access to education -> dramatic societal impact
- Currently, major university cost is faculty salaries. Research faculty are very expensive (faculty salaries track dentist salaries)
- [You can see where this is going…]
- Claim: Advent of online education -> decrease in number of faculty in U.S. -> number of research universities in the U.S. has peaked.
This talk generated a *huge* amount of controversy. Hennessy (purposefully?) ignored any actual research benefits from research faculty.
2) Salman Khan (of Khan academy) and Peter Norvig (Google fellow and online ML course instructor)
- Online education has the potential to dramatically improve learning
- Only need to read 1 paper on education [Bloom ’84] (Bloom’s 2-sigmas paper), which I [Jared] will now sum up in one sentence: “Private tutoring improves student performance by two standard deviations over lecture-based instruction”
- If we can “automate” the effects of private tutoring through interactive educational technology, we can dramatically improve student performance.
- Example: Students who answer a question incorrectly in a Stanford online class can be clustered and pointed to information that helps them improve their answers
- Norvig: motivation is crucial in online classes. His primary focus is keeping students motivated and in contact with small groups of other students to create social dependencies
- Khan: wants to avoid too much polish – polished lectures can be boring and inauthentic.
- Physically attending a university will always be a better experience when done right. Universities need to ensure it’s “done right” so that physical attendance offers something more than what can be obtained online.
- Online education will likely replace or at least supplement traditional textbooks
3) Farnam Jahanian (Assistant Director of NSF for CISE)
- NSF funding for computer science has been increasing by ~6% per year for the last 5 years. (yes, really)
- Contrasts with increases of ~3% per year for most other disciplines
- Big effort to maintain these increases. Helps immensely if the CS community actively publicizes research successes – go do this.
- Feels that CS will be better shielded than most disciplines from political vagaries and educational tsunamis
4) Jeffrey Dean (Google Fellow and co-inventor of MapReduce)
- “Make a reliable whole out of unreliable parts”; “make a low-latency whole out of variable-latency parts”. Discussed a clever trick for ensuring low latency on a massive server farm
- Described several applications of neural nets distributed over 1 million CPUs. Neural nets are robust and inherently distributed, so they scale well to massive networks. Massive neural nets give huge improvements in accuracy for 1) speech recognition (“improvement is equivalent to 20 years of research in the field”) and 2) image recognition (double the accuracy of the state of the art)
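My notes don’t record the specific low-latency trick Dean described, but a standard technique in this spirit is the “hedged request”: send the request to one replica, and if it hasn’t answered within a short delay, fire off a backup copy to a second replica and take whichever reply arrives first. The sketch below is my own illustration (the replica function and latencies are simulated), not code from the talk:

```python
import concurrent.futures
import random
import time

def query_replica(replica_id, request):
    """Simulated replica: usually fast, occasionally a straggler."""
    latency = random.choice([0.01, 0.01, 0.01, 0.5])
    time.sleep(latency)
    return (replica_id, f"response to {request}")

def hedged_request(request, hedge_after=0.05):
    """Send to one replica; if no answer within hedge_after seconds,
    send a backup request to a second replica and take the first reply."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        first = pool.submit(query_replica, 0, request)
        done, _ = concurrent.futures.wait([first], timeout=hedge_after)
        if done:
            return first.result()
        backup = pool.submit(query_replica, 1, request)  # the hedge
        done, _ = concurrent.futures.wait(
            [first, backup], return_when=concurrent.futures.FIRST_COMPLETED)
        return done.pop().result()

replica_id, resp = hedged_request("lookup:42")
print(resp)  # "response to lookup:42", from whichever replica answered first
```

The cost is a small amount of duplicated work in exchange for cutting off the tail of the latency distribution.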
5) Shwetak Patel (UW): Created a device you plug into an electrical outlet that tracks the energy usage of every device in your house. How? Every type of appliance generates a unique EM signature. His device uses this noisy signal to track each device’s energy usage with surprising accuracy. Amazingly, he can do the same thing for water (using a special faucet sensor) and gas with appropriate devices. This was really cool!
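To make the “unique EM signature” idea concrete: one very simple way to use noisy signatures is nearest-neighbor matching against known profiles. The signatures and the classifier below are entirely hypothetical stand-ins of my own, not Patel’s actual method or data:

```python
import math
import random

# Hypothetical reference EM signatures (e.g. frequency-domain features) for
# known appliances; in reality these would be learned from labeled measurements.
signatures = {
    "microwave":  (0.9, 0.1, 0.3, 0.0),
    "hair_dryer": (0.2, 0.8, 0.1, 0.4),
    "laptop":     (0.1, 0.2, 0.9, 0.7),
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(observed):
    """Match a noisy observed signature to the closest known appliance
    (nearest neighbor -- a deliberately simple stand-in classifier)."""
    return min(signatures, key=lambda name: distance(observed, signatures[name]))

# Simulate a noisy reading of the microwave's signature and classify it.
noisy = tuple(v + random.gauss(0, 0.05) for v in signatures["microwave"])
print(identify(noisy))
```

As long as the appliances’ signatures are well separated relative to the noise, even this naive matcher identifies the source reliably.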
6) Daphne Koller (Stanford): Machine learning on clinical data to detect: 1) infant health, with much higher accuracy than Apgar and other more invasive tests – her analysis considers only time-signal data for infant respiration; 2) aggressiveness of breast cancer. In both cases, results from the ML approach are vastly more accurate than results from traditional medical tests (e.g. Apgar). Her recent focus is applying ML to online-education data to automatically determine what information to present to students to help them fix common mistakes.
Below is the last problem on a midterm for my graduate algorithms class. A few years ago, I started creating problems for this class that were based on (simplifications of) research problems. There’s a subset of the students that really like these problems and do well on them, but I worry sometimes that they hurt the students who are struggling in the class. I’m curious whether others assign these types of problems in general graduate classes.
Suresh Venkatasubramanian at geomblog recently blogged about active learning modules for graduate algorithms. I’ve frequently used active learning in undergraduate classes but have somehow found it less effective for graduate classes. However, I recently ran across a nice paper by Sivilotti and Pike that describes kinesthetic learning activities for a distributed algorithms class. The basic idea is to have each student simulate a processor running a distributed algorithm. Examples they discuss are non-deterministic sorting, parallel garbage collection, a stabilizing token ring, and stabilizing leader election. I also think the ideas could be applied to more advanced/theoretical distributed algorithms. A great thing about distributed computing is that it lends itself nicely to this sort of classroom activity.
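As a concrete example of the kind of protocol students could act out, here is a small simulation of a self-stabilizing token ring in the style of Dijkstra’s K-state algorithm (my choice of protocol to illustrate “stabilizing token ring”; the paper may use a different variant). Each processor holds a counter; processor 0 is privileged (holds the token) when its counter equals its left neighbor’s, every other processor when its counter differs from its left neighbor’s. From an arbitrary initial state, with K > n, the ring converges to exactly one token:

```python
import random

def has_token(x, i):
    """Processor 0 holds the token iff its counter equals its left
    neighbor's; every other processor iff its counter differs."""
    n = len(x)
    if i == 0:
        return x[0] == x[n - 1]
    return x[i] != x[i - 1]

def step(x, K):
    """Let one randomly chosen privileged processor make its move.
    (At least one processor is always privileged, so the choice is safe.)"""
    n = len(x)
    i = random.choice([j for j in range(n) if has_token(x, j)])
    if i == 0:
        x[0] = (x[0] + 1) % K   # processor 0 increments mod K
    else:
        x[i] = x[i - 1]         # others copy their left neighbor

n, K = 5, 7                               # stabilization requires K > n
x = [random.randrange(K) for _ in range(n)]  # arbitrary, possibly corrupt state
for _ in range(200):                      # more than enough moves to stabilize
    step(x, K)
tokens = sum(has_token(x, i) for i in range(n))
print(tokens)  # 1: exactly one processor is privileged after stabilization
```

In a classroom version, each student tracks one counter and announces when they are privileged, which makes the convergence to a single circulating token visible.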
I have not yet used these ideas in a graduate class. However, I frequently use active learning in undergrad classes, and the students seem to appreciate it. I’ll probably try out some of the ideas in the next academic year. I’m teaching a game theory class in the spring where I can use these types of activities – perhaps even for “hard” problems like rational secret sharing, auctions, etc.
When I first started as a professor, I felt that what students needed to do well in a class, above all else, was precise, error-free lectures and lots of practice solving problems. Based on these ideas, I did most of my lectures from slides that I had very carefully prepared beforehand, and gave the class lots and lots of problems to solve. This actually worked pretty well for a while – I got great feedback from students, was nominated for some teaching awards, and enjoyed the experience.
However, I eventually realized that my lectures were not as exciting as they could be. I read some articles on the Tomorrow’s Professor listserv that bemoaned the use of pre-prepared slides. They claimed that extensive use of slides makes lectures completely predictable and unresponsive. This semester, as an experiment, I stopped using slides, or even lecture notes of any kind. I memorize enough of the lecture to be sure I use terminology reasonably close to the textbook and to the notes I’ve posted from previous years of the class.
How have things changed? I find teaching a more exciting and sometimes scary activity. Also, it’s easier for me to respond to feedback from the class as a whole about e.g. whether it’s worthwhile to go over an example of a particular concept. When a student asks an interesting question, I definitely spend more time than I used to on exploring the answer in depth. For me, at least, the lectures are a lot more fun and challenging. From the feedback I’m getting from students so far, they also seem to be more engaged. Do I make more mistakes in lecture? Definitely. However, I feel that I’m slowly getting better at checking things on the fly and am making fewer mistakes now than when I started this.
How has this changed my teaching philosophy? I realize now that a big part of education is inspiration, motivation, and a kind of educational “entertainment”. I also realize that it’s just much more engrossing and memorable to watch someone walk carefully across a rope in the air, than to trudge across a line on the floor that was drawn before the performance even began!