Posts in Uncategorized
Tape Generations - beautiful, but unbelievably laborious

[vimeo http://vimeo.com/28826269 w=700&h=580] An amazing animation by Johan Rijpma, who likes to study "the unpredictable environment".

He says: "I worked on this project for about 6 months. I tried many different compositions and then made a selection. A single composition could take more than 12 hours to develop/breakdown (the spinning of the plate was done by hand, turning the plate about 0.4 degrees every 30 seconds, this meant I was standing in the wind and the rain for hours watching the tape "grow" and watching the sun come up/go down)"
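Out of curiosity, here's a rough back-of-the-envelope check on that timing - just a sketch using the figures from the quote:

```python
# Rough check of the timing Rijpma describes: the plate advances
# about 0.4 degrees every 30 seconds.
step_degrees = 0.4
step_seconds = 30

steps_per_rotation = 360 / step_degrees              # 900 nudges of the plate
seconds_per_rotation = steps_per_rotation * step_seconds

print(f"{seconds_per_rotation / 3600:.1f} hours per full rotation")  # ~7.5 hours
```

So a single full turn of the plate is roughly seven and a half hours of standing outside, before you count setup, retakes or the tape misbehaving - which squares with the "more than 12 hours" per composition.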

I like the sound, too.

P.S. This is my favourite memo ever

[an oldie but goody from the excellent Letters of Note] Before we begin: the content of the following memo is absolutely filthy. If you're easily offended, please don't read it. Thanks.

South Park: Bigger, Longer & Uncut is a feature-length movie based on the animated series, South Park. Prior to its release in 1999, the movie's creators - Matt Stone and Trey Parker - were asked repeatedly by the MPAA to alter the film in order for it to gain an R rating rather than an NC-17. Below is a memo sent by Stone to the MPAA, in response to such a request.

The first picture is of the whole memo. Below that is a close-up of the text.

Recommended Viewing: This Film Is Not Yet Rated.


Transcript

Here is our new cut of the South Park movie to submit to the MPAA. I wanted to tell you exactly what notes we did and did not address.

1. We left in both the "fisting" and the "rimjob" references in the counselor's office scene. We did cut the word "hole" from "asshole" as per our conversation.

2. We took out the entire "God has fucked me in the ass so many times..." It is gone.

3. Although it is not animated yet, we put a new storyboard in for clarification in the scene with Saddam Hussein's penis. The intent now is that you never see Saddam's real penis, he in fact is using dildos both times.

4. We have the shot animated that reveals the fact that Winona is not shooting ping-pong balls from her vagina. She is, in fact, hitting the balls with a ping-pong paddle.

5. We took out the only reference to "cum-sucking ass" in the film. It was in the counselor's office and we took it out.

6. We left in the scenes with Cartman's mom and the horse as per our conversation. This is the one joke we really want to fight for.

Call with any questions

Matt

P.S. This is my favorite memo ever.

Now hear this - Spiegel im Spiegel, by Arvo Pärt

[youtube=http://www.youtube.com/watch?v=TJ6Mzvh3XCc&w=700] Spiegel im Spiegel was written by Arvo Pärt shortly before he left Estonia. He's one of the pioneers of a school of music known as holy minimalism.

Of his popularity, Steve Reich has written: "Even in Estonia, Arvo was getting the same feeling that we were all getting .... I love his music, and I love the fact that he is such a brave, talented man .... He's completely out of step with the zeitgeist and yet he's enormously popular, which is so inspiring. His music fulfills a deep human need that has nothing to do with fashion."

"Spiegel im Spiegel" in German literally can mean both "mirror in the mirror" as well as "mirrors in the mirror", referring to the infinity of images produced by parallel plane mirrors: the tonic triads are endlessly repeated with small variations as if reflected back and forth.

In 2011 the piece was the focus of a half hour BBC Radio 4 programme, Soul Music, which examined pieces of music "with a powerful emotional impact".

10 things everyone should know about time

“Time” is the most used noun in the English language, yet it remains a mystery. We’ve just completed an amazingly intense and rewarding multidisciplinary conference on the nature of time, and my brain is swimming with ideas and new questions. Rather than trying a summary (the talks will be online soon), here’s my stab at a top ten list partly inspired by our discussions: the things everyone should know about time. [Update: all of these are things I think are true, after quite a bit of deliberation. Not everyone agrees, although of course they should.]

1. Time exists. Might as well get this common question out of the way. Of course time exists — otherwise how would we set our alarm clocks? Time organizes the universe into an ordered series of moments, and thank goodness; what a mess it would be if reality were completely different from moment to moment. The real question is whether or not time is fundamental, or perhaps emergent. We used to think that “temperature” was a basic category of nature, but now we know it emerges from the motion of atoms. When it comes to whether time is fundamental, the answer is: nobody knows. My bet is “yes,” but we’ll need to understand quantum gravity much better before we can say for sure.

2. The past and future are equally real. This isn’t completely accepted, but it should be. Intuitively we think that the “now” is real, while the past is fixed and in the books, and the future hasn’t yet occurred. But physics teaches us something remarkable: every event in the past and future is implicit in the current moment. This is hard to see in our everyday lives, since we’re nowhere close to knowing everything about the universe at any moment, nor will we ever be — but the equations don’t lie. As Einstein put it, “It appears therefore more natural to think of physical reality as a four dimensional existence, instead of, as hitherto, the evolution of a three dimensional existence.”

3. Everyone experiences time differently. This is true at the level of both physics and biology. Within physics, we used to have Sir Isaac Newton’s view of time, which was universal and shared by everyone. But then Einstein came along and explained that how much time elapses for a person depends on how they travel through space (especially near the speed of light) as well as the gravitational field (especially if it’s near a black hole). From a biological or psychological perspective, the time measured by atomic clocks isn’t as important as the time measured by our internal rhythms and the accumulation of memories. That happens differently depending on who we are and what we are experiencing; there’s a real sense in which time moves more quickly when we’re older.

4. You live in the past. About 80 milliseconds in the past, to be precise. Use one hand to touch your nose, and the other to touch one of your feet, at exactly the same time. You will experience them as simultaneous acts. But that’s mysterious — clearly it takes more time for the signal to travel up your nerves from your feet to your brain than from your nose. The reconciliation is simple: our conscious experience takes time to assemble, and your brain waits for all the relevant input before it experiences the “now.” Experiments have shown that the lag between things happening and us experiencing them is about 80 milliseconds. (Via conference participant David Eagleman.)

5. Your memory isn’t as good as you think. When you remember an event in the past, your brain uses a very similar technique to imagining the future. The process is less like “replaying a video” than “putting on a play from a script.” If the script is wrong for whatever reason, you can have a false memory that is just as vivid as a true one. Eyewitness testimony, it turns out, is one of the least reliable forms of evidence allowed into courtrooms. (Via conference participants Kathleen McDermott and Henry Roediger.)

6. Consciousness depends on manipulating time. Many cognitive abilities are important for consciousness, and we don’t yet have a complete picture. But it’s clear that the ability to manipulate time and possibility is a crucial feature. In contrast to aquatic life, land-based animals, whose vision-based sensory field extends for hundreds of meters, have time to contemplate a variety of actions and pick the best one. The origin of grammar allowed us to talk about such hypothetical futures with each other. Consciousness wouldn’t be possible without the ability to imagine other times. (Via conference participant Malcolm MacIver.)

7. Disorder increases as time passes. At the heart of every difference between the past and future — memory, aging, causality, free will — is the fact that the universe is evolving from order to disorder. Entropy is increasing, as we physicists say. There are more ways to be disorderly (high entropy) than orderly (low entropy), so the increase of entropy seems natural. But to explain the lower entropy of past times we need to go all the way back to the Big Bang. We still haven’t answered the hard questions: why was entropy low near the Big Bang, and how does increasing entropy account for memory and causality and all the rest? (We heard great talks by David Albert and David Wallace, among others.)

8. Complexity comes and goes. Other than creationists, most people have no trouble appreciating the difference between “orderly” (low entropy) and “complex.” Entropy increases, but complexity is ephemeral; it increases and decreases in complex ways, unsurprisingly enough. Part of the “job” of complex structures is to increase entropy, e.g. in the origin of life. But we’re far from having a complete understanding of this crucial phenomenon. (Talks by Mike Russell, Richard Lenski, Raissa D’Souza.)

9. Aging can be reversed. We all grow old, part of the general trend toward growing disorder. But it’s only the universe as a whole that must increase in entropy, not every individual piece of it. (Otherwise it would be impossible to build a refrigerator.) Reversing the arrow of time for living organisms is a technological challenge, not a physical impossibility. And we’re making progress on a few fronts: stem cells, yeast, and even (with caveats) mice and human muscle tissue. As one biologist told me: “You and I won’t live forever. But as for our grandkids, I’m not placing any bets.”

10. A lifespan is a billion heartbeats. Complex organisms die. Sad though it is in individual cases, it’s a necessary part of the bigger picture; life pushes out the old to make way for the new. Remarkably, there exist simple scaling laws relating animal metabolism to body mass. Larger animals live longer; but they also metabolize slower, as manifested in slower heart rates. These effects cancel out, so that animals from shrews to blue whales have lifespans with just about an equal number of heartbeats — about one and a half billion, if you simply must be precise. In that very real sense, all animal species experience “the same amount of time.” At least, until we master #9 and become immortal. (Amazing talk by Geoffrey West.)
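The arithmetic behind that claim is easy to play with yourself. A quick sketch (the heart rates and lifespans below are rough, illustrative figures, not data from West's talk):

```python
# Rough illustration of the "a lifespan is about a billion heartbeats" claim.
# Heart rates and lifespans are approximate, illustrative figures.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def lifetime_heartbeats(beats_per_minute, lifespan_years):
    return (beats_per_minute / 60) * SECONDS_PER_YEAR * lifespan_years

animals = {
    "mouse":    (500, 3),    # ~500 bpm, ~3 year lifespan
    "cat":      (150, 15),   # ~150 bpm, ~15 years
    "elephant": (30, 60),    # ~30 bpm, ~60 years
}

for name, (bpm, years) in animals.items():
    print(f"{name:>8}: {lifetime_heartbeats(bpm, years) / 1e9:.1f} billion beats")
```

Despite lifespans spanning a factor of twenty, each total lands within a factor of two or so of a billion beats, which is the point of the scaling law.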

Via: Discover Magazine.

Nice song: Real Late, by Liam Finn

[youtube=http://www.youtube.com/watch?v=DOE_22bgneI&w=700] Liam Mullane Finn (born 24 September 1983 in Melbourne, Australia) is a New Zealand musician and songwriter who moved to New Zealand as a child. He is the son of pop musician Neil Finn (of Split Enz and Crowded House).

Soap bubbles, an electromagnet and ferrous liquid - three of my favourite things

[vimeo http://vimeo.com/28304264 w=700&h=450]


"I combined everyday soap bubbles with exotic ferrofluid liquid to create an eerie tale using macro lenses and time lapse techniques," says artist Kim Pimmel. "Black ferrofluid and dye race through bubble structures, drawn through by the invisible forces of capillary action and magnetism." The tecchy bit, for the film geeks, are: Time-lapse sequences: Nikon D90, Nikkor 60mm macro lens and custom built intervalometer. Motion-control: Arduino driven scanner platform and mirror rigs. Score: Ableton Live.

Click here to see Compressed 01, a sexy little number by Pimmel in which ferrous printer toner particles floating on the surface of water are attracted by a magnet and align to the invisible magnetic field around them.

Best jokes in the Edinburgh festival

1. Nick Helm – “I needed a password eight characters long so I picked Snow White and the Seven Dwarves.”

2. Tim Vine – “Crime in multi-storey car parks. That is wrong on so many different levels.”

3. Hannibal Buress – “People say ‘I'm taking it one day at a time.’ You know what? So is everybody. That's how time works.”

4. Tim Key – “Drive Thru McDonalds was more expensive than I thought ... once you've hired the car ...”

5. Matt Kirshen – “I was playing chess with my friend and he said, 'Let's make this interesting'. So we stopped playing chess.”

6. Sarah Millican – “My mother told me, you don’t have to put anything in your mouth you don’t want to. Then she made me eat broccoli, which felt like double standards.”

7. Alan Sharp – “I was in a band which we called The Prevention, because we hoped people would say we were better than The Cure.”

8. Mark Watson – “Someone asked me recently – what would I rather give up, food or sex. Neither! I’m not falling for that one again, wife.”

9. Andrew Lawrence – “I admire these phone hackers. I think they have a lot of patience. I can’t even be bothered to check my OWN voicemails.”

10. DeAnne Smith – “My friend died doing what he loved ... Heroin.”

The best of the worst

1. Tim Vine – “Uncle Ben has died. No more Mr Rice Guy.”

2. Josh Howie – “I've got nothing against the Chinese. Don't get me Wong.”

3. Mark Olver – “During my first murder I was like a dyslexic having my back teeth removed ... losing my morals.”

Oh...and the best of 2010

1. Tim Vine "I've just been on a once-in-a-lifetime holiday. I'll tell you what, never again."

2. David Gibson "I'm currently dating a couple of anorexics. Two birds, one stone."

3. Emo Philips "I picked up a hitch hiker. You've got to when you hit them."

4. Jack Whitehall "I bought one of those anti-bullying wristbands when they first came out. I say 'bought', I actually stole it off a short, fat ginger kid."

5. Gary Delaney "As a kid I was made to walk the plank. We couldn't afford a dog."

6. John Bishop "Being an England supporter is like being the over-optimistic parents of the fat kid on sports day."

7. Bo Burnham "What do you call a kid with no arms and an eyepatch? Names."

8. Gary Delaney "Dave drowned. So at the funeral we got him a wreath in the shape of a lifebelt. Well, it's what he would have wanted."

9. Robert White "For Vanessa Feltz, life is like a box of chocolates: Empty."

10. Gareth Richards "Wooden spoons are great. You can either use them to prepare food. Or, if you can't be bothered with that, just write a number on one and walk into a pub…"

Politeness, by A. A. Milne

If people ask me,
I always tell them:
"Quite well, thank you, I'm very glad to say."
If people ask me,
I always answer,
"Quite well, thank you, how are you to-day?"
I always answer,
I always tell them,
If they ask me
Politely.....
BUT SOMETIMES

I wish

That they wouldn't.


How to chop an onion, with breaks

[vimeo http://vimeo.com/22709715 w=700&h=500] So THIS is what old skate video makers do now...

It's nicely done, just a bit, well, could do with a bit more direction I guess - kind of form over function. I'd quite like to do one on how to make a cup of tea. Maybe how to dunk a biscuit, soundtrack by Ninjatunes.

Just turn it up and be quiet.

[youtube=http://www.youtube.com/watch?v=6XzKrYHGxzU&w=700] Yeah yeah you've heard it before and since then loads of people have sampled it and blah blah blah. But imagine hearing this for the FIRST TIME. And isn't it just the BEST break ever? Turrrrrn it up. Dance at your desk. Act the fool. It's what it's for.

The elusive big idea

Fascinating thoughts by Neal Gabler in the NY Times:

THE July/August issue of The Atlantic trumpets the “14 Biggest Ideas of the Year.” Take a deep breath. The ideas include “The Players Own the Game” (No. 12), “Wall Street: Same as it Ever Was” (No. 6), “Nothing Stays Secret” (No. 2), and the very biggest idea of the year, “The Rise of the Middle Class — Just Not Ours,” which refers to growing economies in Brazil, Russia, India and China.

Now exhale. It may strike you that none of these ideas seem particularly breathtaking. In fact, none of them are ideas. They are more on the order of observations. But one can’t really fault The Atlantic for mistaking commonplaces for intellectual vision. Ideas just aren’t what they used to be. Once upon a time, they could ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world.

They could penetrate the general culture and make celebrities out of thinkers — notably Albert Einstein, but also Reinhold Niebuhr, Daniel Bell, Betty Friedan, Carl Sagan and Stephen Jay Gould, to name a few. The ideas themselves could even be made famous: for instance, “the end of ideology,” “the medium is the message,” “the feminine mystique,” “the Big Bang theory,” “the end of history.” A big idea could capture the cover of Time — “Is God Dead?” — and intellectuals like Norman Mailer, William F. Buckley Jr. and Gore Vidal would even occasionally be invited to the couches of late-night talk shows. How long ago that was.

If our ideas seem smaller nowadays, it’s not because we are dumber than our forebears but because we just don’t care as much about ideas as they did. In effect, we are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding. Bold ideas are almost passé.

It is no secret, especially here in America, that we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock — to have gone backward intellectually from advanced modes of thinking into old modes of belief. But post-Enlightenment and post-idea, while related, are not exactly the same.

Post-Enlightenment refers to a style of thinking that no longer deploys the techniques of rational thought. Post-idea refers to thinking that is no longer done, regardless of the style.

The post-idea world has been a long time coming, and many factors have contributed to it. There is the retreat in universities from the real world, and an encouragement of and reward for the narrowest specialization rather than for daring — for tending potted plants rather than planting forests.

There is the eclipse of the public intellectual in the general media by the pundit who substitutes outrageousness for thoughtfulness, and the concomitant decline of the essay in general-interest magazines. And there is the rise of an increasingly visual culture, especially among the young — a form in which ideas are more difficult to express.

But these factors, which began decades ago, were more likely harbingers of an approaching post-idea world than the chief causes of it. The real cause may be information itself. It may seem counterintuitive that at a time when we know more than we have ever known, we think about it less.

We live in the much vaunted Age of Information. Courtesy of the Internet, we seem to have immediate access to anything that anyone could ever want to know. We are certainly the most informed generation in history, at least quantitatively. There are trillions upon trillions of bytes out there in the ether — so much to gather and to think about.

And that’s just the point. In the past, we collected information not simply to know things. That was only the beginning. We also collected information to convert it into something larger than facts and ultimately more useful — into ideas that made sense of the information. We sought not just to apprehend the world but to truly comprehend it, which is the primary function of ideas. Great ideas explain the world and one another to us.

Marx pointed out the relationship between the means of production and our social and political systems. Freud taught us to explore our minds as a way of understanding our emotions and behaviors. Einstein rewrote physics. More recently, McLuhan theorized about the nature of modern communication and its effect on modern life. These ideas enabled us to get our minds around our existence and attempt to answer the big, daunting questions of our lives.

But if information was once grist for ideas, over the last decade it has become competition for them. We are like the farmer who has too much wheat to make flour. We are inundated with so much information that we wouldn’t have time to process it even if we wanted to, and most of us don’t want to.

The collection itself is exhausting: what each of our friends is doing at that particular moment and then the next moment and the next one; who Jennifer Aniston is dating right now; which video is going viral on YouTube this hour; what Princess Letizia or Kate Middleton is wearing that day. In effect, we are living within the nimbus of an informational Gresham’s law in which trivial information pushes out significant information, but it is also an ideational Gresham’s law in which information, trivial or not, pushes out ideas.

We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.

It is certainly no accident that the post-idea world has sprung up alongside the social networking world. Even though there are sites and blogs dedicated to ideas, Twitter, Facebook, Myspace, Flickr, etc., the most popular sites on the Web, are basically information exchanges, designed to feed the insatiable information hunger, though this is hardly the kind of information that generates ideas. It is largely useless except insofar as it makes the possessor of the information feel, well, informed. Of course, one could argue that these sites are no different than conversation was for previous generations, and that conversation seldom generated big ideas either, and one would be right.

BUT the analogy isn’t perfect. For one thing, social networking sites are the primary form of communication among young people, and they are supplanting print, which is where ideas have typically gestated. For another, social networking sites engender habits of mind that are inimical to the kind of deliberate discourse that gives rise to ideas. Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show. While social networking may enlarge one’s circle and even introduce one to strangers, this is not the same thing as enlarging one’s intellectual universe. Indeed, the gab of social networking tends to shrink one’s universe to oneself and one’s friends, while thoughts organized in words, whether online or on the page, enlarge one’s focus.

To paraphrase the famous dictum, often attributed to Yogi Berra, that you can’t think and hit at the same time, you can’t think and tweet at the same time either, not because it is impossible to multitask but because tweeting, which is largely a burst of either brief, unsupported opinions or brief descriptions of your own prosaic activities, is a form of distraction or anti-thinking.

The implications of a society that no longer thinks big are enormous. Ideas aren’t just intellectual playthings. They have practical effects.

An artist friend of mine recently lamented that he felt the art world was adrift because there were no longer great critics like Harold Rosenberg and Clement Greenberg to provide theories of art that could fructify the art and energize it. Another friend made a similar argument about politics. While the parties debate how much to cut the budget, he wondered where were the John Rawlses and Robert Nozicks who could elevate our politics.

One could certainly make the same argument about economics, where John Maynard Keynes remains the center of debate nearly 80 years after propounding his theory of government pump priming. This isn’t to say that the successors of Rosenberg, Rawls and Keynes don’t exist, only that if they do, they are not likely to get traction in a culture that has so little use for ideas, especially big, exciting, dangerous ones, and that’s true whether the ideas come from academics or others who are not part of elite organizations and who challenge the conventional wisdom. All thinkers are victims of information glut, and the ideas of today’s thinkers are also victims of that glut.

But it is especially true of big thinkers in the social sciences like the cognitive psychologist Steven Pinker, who has theorized on everything from the source of language to the role of genetics in human nature, or the biologist Richard Dawkins, who has had big and controversial ideas on everything from selfishness to God, or the psychologist Jonathan Haidt, who has been analyzing different moral systems and drawing fascinating conclusions about the relationship of morality to political beliefs. But because they are scientists and empiricists rather than generalists in the humanities, the place from which ideas were customarily popularized, they suffer a double whammy: not only the whammy against ideas generally but the whammy against science, which is typically regarded in the media as mystifying at best, incomprehensible at worst. A generation ago, these men would have made their way into popular magazines and onto television screens. Now they are crowded out by informational effluvium.

No doubt there will be those who say that the big ideas have migrated to the marketplace, but there is a vast difference between profit-making inventions and intellectually challenging thoughts. Entrepreneurs have plenty of ideas, and some, like Steven P. Jobs of Apple, have come up with some brilliant ideas in the “inventional” sense of the word.

Still, while these ideas may change the way we live, they rarely transform the way we think. They are material, not ideational. It is thinkers who are in short supply, and the situation probably isn’t going to change anytime soon.

We have become information narcissists, so uninterested in anything outside ourselves and our friendship circles or in any tidbit we cannot share with those friends that if a Marx or a Nietzsche were suddenly to appear, blasting his ideas, no one would pay the slightest attention, certainly not the general media, which have learned to service our narcissism.

What the future portends is more and more information — Everests of it. There won’t be anything we won’t know. But there will be no one thinking about it.

Think about that.

Neal Gabler is a senior fellow at the Annenberg Norman Lear Center at the University of Southern California and the author of “Walt Disney: The Triumph of the American Imagination.”