Will.i.am

Will likes to see stuff at MIT whenever he is in town. This time I took him to see robots in the Computer Science and Artificial Intelligence Laboratory, wearable computing in the Media Lab, and miscellaneous cool stuff in the Precision Engineering Research Group. It wasn't hard to find people to help out.

We walked around for three hours. Then, he was off to do soundchecks. A few hours after he learned about energy-storing inverse lakes, he and his Black Eyed Peas played to a sold-out crowd at the TD Banknorth Garden.

I always like amazing people, like Will, who is highly creative, does interesting things, and is interested in the future. MIT attracts amazing, highly creative, interesting, interested people like honey attracts bears.

And on top of all that, Will is a fan of my field, Artificial Intelligence. Check out the Peas Imma Be Rocking That Body video.

Anyway, when Will and his entourage were about to leave, and all the obligatory pictures were taken, he asked, as he generally does, if I could use a few tickets for the show. “Hey, that would be great,” I said. I like the Peas, and besides, I hadn't been to a good concert since the Rolling Stones were in town in '06.

Alas, my daughter seized the tickets. “You're nowhere near cool enough to go,” she said, “and I have some friends.” Maybe I should find a new place to buy clothes.

28 February 2010

More pictures of the Will.i.am visit.


When Bose walked out

A few days ago, I was almost trampled by a herd of freshmen stampeding out of 26-100, so I went in to see what had been happening. It turned out to be a 7.01 lecture, freshman biology.

Being in 26-100, somehow the Big Event came to mind. It happened a long time ago when I took 6.01, Introduction to Circuit Theory. Professor Amar Bose, who later founded Bose Corporation, lectured. Electrical engineering sophomores sat at two-person tables, equipped with—hard to believe today—colorful stamped-aluminum ash trays.

In those days, students often made a hissing noise, like a snake, whenever an instructor announced a quiz or told a particularly corny joke. Bose didn't like it; he considered it insulting. On the first day of class, he announced there would be no hissing.

A few weeks later, somebody hissed. Bose said that whoever hissed would have to leave, and someone left.

Then, a lecture or two later, early in the lecture, someone hissed again. But this time the culprit refused to identify himself. So Bose left, and that lecture was gone forever. Those who knew the hisser or sat close to him gave him a pretty hard time. Nobody in the class ever hissed at Bose again. Alumni still talk about it at reunions: “Do you remember when Bose walked out?”

We knew Bose respected us because he put 100 hours a week into 6.01. We respected him because he didn't put up with what he considered insults. Mutual respect is the stuff from which great education emerges.

When Bose walked out, it licensed me to say, when I became a professor myself, that there would be no newspaper reading in my classes. Later, I took pride in being among the first to forbid open laptop computers in my classrooms, believing that reading papers, surfing the web, and doing email are as insulting as hissing. I respect my students, and I want them to respect me.

24 October 2010

Amar Bose passed away, but his legacy survives in the many students he taught and mentored.

14 July 2013

When I walked in

It is not just MIT's 150th anniversary; it's my 50th.

Back in 1961, in the winter of my senior year in high school, my father said, “Well, you better go see what the place is like. Change trains in Chicago and get off at the last stop.”

This was long before parents routinely showed up on campus with embarrassed offspring in tow. My father simply put me on a train at Peoria, instructing me to visit MIT. I arrived at South Station early the next day, never having gone anywhere by myself before. I was tired after my first MIT-related all-nighter, this one spent sitting up all night on the train.

After I arrived in Kendall Square, I started wandering around the edge of the campus, scared stiff, fearful that I would end up in some forbidden laboratory where I would be yelled at or even arrested. I made my way to the river, which was frozen and cold. I walked by the Great Dome, which I remember as big. And, of course, there were all those imposing names up on the buildings—Copernicus, Darwin, Newton, and lots more.

Then, I walked up Massachusetts Avenue, and there it was: the main entrance, distinguished by a door that you opened when you passed by an electric eye, a novelty in those days.

When I saw it, I knew I had found Paradise. “Ok, this is the place,” I said to myself, screwed up my courage, found B. Alden Thresher's office, got interviewed, and showed up the next fall, never to leave.

The electric eye is still there. Sadly, it doesn't work. Its function is now handled by a mat you step on. But, in a place where progress is permanent, it is nice that a few anchor points are still around, and who knows, maybe some enterprising undergraduate will update it someday with a laser in its guts.

16 January 2011


The singularity

I was recently asked a question intended to embarrass: People have been saying we will have human level intelligence in 20 years for the past 50 years; what do you have to say about that?

My answer: I'm ok with it; it will be true eventually.

My less flip answer is that, interestingly, Turing broached the subject in his original Turing test paper, Computing Machinery and Intelligence. He also wrote, in 1950, about whether computers could go supercritical, using a nuclear-reaction analogy:

Let us return for a moment to Lady Lovelace's objection, which stated that the machine can only do what we tell it to do. One could say that a man can "inject" an idea into the machine, and that it will respond to a certain extent and then drop into quiescence, like a piano string struck by a hammer. Another simile would be an atomic pile of less than critical size: an injected idea is to correspond to a neutron entering the pile from without. Each such neutron will cause a certain disturbance which eventually dies away. If, however, the size of the pile is sufficiently increased, the disturbance caused by such an incoming neutron will very likely go on and on increasing until the whole pile is destroyed. Is there a corresponding phenomenon for minds, and is there one for machines? .... we ask, "Can a machine be made to be supercritical?"

Since then, others have thought they have invented the singularity idea, but it is really an obvious question that anyone who has thought seriously about AI would ask.

Will there be a singularity? Sure, but it is not like getting a person to the moon, which we knew we could do when the space program started. That is, no breakthrough ideas were needed. A technological singularity, by contrast, requires one or more breakthroughs, and those are hard, perhaps impossible, to place on a timeline.

Of course, it depends, in part, on how many have been drawn to think about those hard problems. Now we have huge numbers studying and working on Machine Learning and Deep Learning. Some tiny fraction of those may be drawn to thinking about understanding the nature of human intelligence, and that tiny fraction constitutes a much bigger number than were thinking about human intelligence a decade ago.

So when will we have our Watson/Crick moment? Forced into a corner, with a knife at my throat, I would say 20 years, and I say that fully confident that it will be true eventually.

8 May 2018


Communication

Draft copies just came in from Acme Printing, soon to be distributed to friends, all asked to identify those places where I can get the biggest improvement for the energy expended (a Chapter 6 lesson). The book is done, but it will become undone and then redone when comments come back.

I could have called it How to Write, but it is about speaking too.

I could have called it How to Speak, as it all started with the How to Speak talk I give each year during MIT's Independent Activities Period, but it is about a lot more than just speaking.

I could have called it How to Communicate, but I don't want it shelved with books on domestic relations.

So, Communication it is. And it is about what I wish I knew when I started out, because in my world, for most people, success is determined by how well you speak, by how well you write, and by the quality of your ideas, in that order.

It will be some months before the first edition is available, but I plan to serialize some of it here before that.

10 May 2018