Archive for the 'Technology' Category
March 21st, 2014 by Wil
The New Yorker notes that we are in the midst of a suicide epidemic. While I’m always wary of the term epidemic, it’s worth noting that American suicide rates rose about 30% from 1999 to 2010.
The article posits that suicide’s main sponsor—depression—is an illness, not a deficit or character weakness. While I agree with the gist of that, one has to ask, “Why then has this illness only recently increased so dramatically?”
I often argue that technology has substantially changed our lives over the past 15+ years. I talk about my total frustration with the intrusions of modern media (endless email messages, Facebook alerts, the incessant fucking phone, etc.) Part of what is so annoying about all this stuff is that it gives you a sense of losing control over your life. You want to just lie down and take a nap, or sit in the yard and stare into space, but there are a half dozen electronic devices poised to ruin your reverie. I wonder if all this contributes to the rise in self-destruction.
Obviously this argument is so speculative it doesn’t even deserve the term “fanciful.” But that doesn’t mean it’s wrong. And there’s another way I think technology, particularly the web, has unsettled our psychology. In the pre-internet era, people could (somewhat) comfortably settle into various tribal distinctions, often but not exclusively based on the music they listened to: punks, metal-heads, hippies, hip-hoppers, yuppies, etc. This allowed a certain sense of self-definition and self-worth. “I’m a cool, rebellious punk rock type!” one could think. But I think the web, for a variety of reasons, has weakened these tribal self-definitions, making us more like interchangeable members of the digital citizenry. And this has weakened our sense of ourselves… we find ourselves, in some hard-to-define way, asking “what am I?”
March 7th, 2014 by Wil
I’ve talked a bit about the work of computer scientist David Cope who has developed several software tools that compose music. The exact methodology he uses is complex (he’s written several books about it) but his programs have ably output hours of music in the style of various classical masters.
In one of his books, Cope comments that he has not used his software to write pop music. This is partly because he isn’t interested in pop music and partly because he concedes pop music is about a lot more than just the notes on a page (which is what his software is fundamentally creating). Pop is also about the tone of instruments, their hip factor, and a lot of contextual baggage the performing artists bring to the song (their personal history, persona, etc.)
Nonetheless, I think it’s inescapable that computers will be composing pop songs in the future. Or more likely, computers will be helping humans compose pop songs.
But, then what? Cope’s software can generate thousands of variations on a basic tune. Say someone does the same with a pop song. You have 10,000 versions of a certain melody in A minor. Obviously nobody wants to listen to all of them to find the “best one.”
But what if you could look through a data pool of what listeners were listening to and spot upcoming trends? For example, two years ago you could have noted, “Gee, it looks like people are really digging music with these wonky low-end gurgles… I bet dubstep will be popular.” Basically, you would note what properties of music seemed to be getting popular and aim the computer-composed music towards those styles.
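To make the idea concrete, here’s a toy sketch in Python. All of the data and property tags are invented for illustration; a real system would work over millions of plays and machine-extracted audio features rather than a handful of hand-labeled ones.

```python
# Toy trend-spotter: count how often songs carrying a musical property
# are played in two time windows, then rank properties by growth.

from collections import Counter

# Hypothetical play log: (musical_property_tag, week) pairs.
plays = [
    ("wobble-bass", 1), ("wobble-bass", 2), ("wobble-bass", 2),
    ("acoustic", 1), ("acoustic", 1), ("acoustic", 2),
    ("theremin-synth", 1), ("theremin-synth", 2), ("theremin-synth", 2),
]

def rising_trends(plays):
    """Return property tags sorted by week-over-week growth in play counts."""
    early = Counter(tag for tag, week in plays if week == 1)
    late = Counter(tag for tag, week in plays if week == 2)
    # Growth ratio: plays in the later window over plays in the earlier one.
    growth = {tag: late[tag] / early[tag] for tag in early}
    return sorted(growth, key=growth.get, reverse=True)

print(rising_trends(plays))  # → ['wobble-bass', 'theremin-synth', 'acoustic']
```

The tags at the top of that ranking are the styles you’d aim the computer-composed music at.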
But where would you get this data? This recent NY Times piece, noting that music analysis company Echo Nest has been bought by Spotify, may offer clues.
The Echo Nest is one of a handful of companies specializing in the arcane but valuable science of music data, examining what songs are being listened to by whom, and how. It makes this information available to its clients, including major media companies like Sirius XM, Clear Channel and Univision, which use the data primarily for music-related apps.
“Analyzing music preferences is something we’ve been doing for a long time,” Jim Lucchese, chief executive of the Echo Nest, said in a joint interview with Mr. Ek. “But being directly wired in, and sitting alongside the Spotify team, will give us the ability to push products a lot faster and learn a lot faster than we could before.”
I suspect Echo Nest is, right now, just analyzing “big picture” music trends, like “people are digging hip-hop country songs.” I think eventually they could move towards more granular observations like “major-scale melodies that climb high over three bars and then fall down in a giant octave leap in the fourth bar are getting popular,” or “synth timbres that sound like a theremin and glockenspiel are getting big.” That data could then be used to power the computer-aided composition of pop music.
I’m not saying this is a good thing; it worries me. It could certainly lead to an arms race of musical ideas that would result in fads burning out faster and faster. But I think it’s the future.
March 5th, 2014 by Wil
The Guardian has an interesting article about the challenges faced by writers in the digital age. With that distinctively British pessimism, the article states…
Roughly speaking, until 2000, if you wrote a story, made a film or recorded a song, and people paid to buy it, in the form of a book, a DVD or a CD, you received a measurable reward for your creativity. Customers paid because they were happy to honour your creative copyright. When the internet began in the 1990s, many utopian dreams of creating an open society, where information would be free for all, sprang into prominence. Wikipedia, for instance, is the child of such dreams. Today, Wikipedia is appealing to its users for subscriptions.
Among many champions of the open (and free) society, Jaron Lanier, author of You Are Not a Gadget and Who Owns the Future?, celebrated the idea of knowledge without frontiers from the comfortable security of a university post. The reckoning has been slow in coming, but now there are some crucial indicators of a change of heart. Lanier, for example, acknowledges that, in his excitement at the birth of the worldwide web, he forgot about the creative classes. He concedes that he has watched a generation of his friends – film-makers, writers, musicians – become professionally annihilated by the loss of creative copyright.
Copyright is the bone-marrow of the western intellectual tradition. Until the book world, like the music world, can reconcile the extraordinary opportunities provided by the web with the need for a well-regulated copyright system, artists of all kinds will struggle.
This is also interesting.
For Kavenna, this freedom is a reason to be optimistic about the future: “The digital age,” she says, “is an extraordinary revolution in consciousness. I grew up with the Modernists – Joyce et al – grappling with the technological developments of the early 20th century. The digital age is just as significant. We are developing a completely different mode of consciousness. So the digital age offers this new challenge for writers.”
A new consciousness… that’s some heavy stuff. I have to admit I was just thinking that despite all my complaints about this era of hyper-technology I do feel lucky to be alive during it. I think we’re seeing fundamental changes in how the human animal lives including changes that could indeed lead to something called “a new consciousness.”
February 15th, 2014 by Wil
A subject I’ve been interested in of late is the effect the advent of robots will have on the nation’s workforce, employment rates, and economy. I was particularly interested in this post (and fascinating video) about the use of robots at Amazon. These robots are changing the nature of a job I used to have: I worked at Amazon’s Seattle warehouse as a book picker and packager for one month in the mid-nineties.
The article states…
[The robots’] job in the warehouse is to deliver shelves of items to Amazon workers, effectively reversing the typical “picker” job. Instead of walking across the warehouse all day to retrieve different items, workers can be given fixed stations and let the shelves come to them. When a shelf arrives, they select the appropriate item off it, box it, and place it on the exit conveyor belt.
The article says that so far, no human jobs are being replaced. But for how long!!!???
February 14th, 2014 by Wil
I often comment here about the fact that the emergence of the internet has enabled the production of (and the cheapening of) content. By content I mean writing, music, video, art, etc. It used to be that if you wanted to hear a song you had to either buy the CD it was on, or listen to the radio and hope you heard it. Nowadays most songs can be found on Spotify, YouTube, pirate sites, etc. Additionally there are gazillions of content creators, myself included, posting all kinds of content on various sites like SoundCloud, YouTube, Noise Trade (which is now offering free books), etc.
For content consumers (i.e. most of us) this is great. Lots of choice, lots of free or cheap stuff. But there’s an obvious problem. Most content is shit. It’s actually beyond shit—it’s utterly amateurish prattling devoid of nuance or refinement. (My work is an obvious exception.) And plenty of other content is not shit, but not all that great either. Only a small percentage of content really hits the mark. So how do you weed out the crap?
One idea is to have people rank content. This is how Amazon reviews, YouTube “thumbs up and thumbs down” buttons, Facebook likes, and similar concepts work. But they’re somewhat problematic. It turns out there are a lot of people out there with no taste, so you really can’t trust their opinion on anything. How do I know the person doing the ranking is the kind of person I can trust?
Amazon has kind of gotten around this with their recommendation engine. It basically follows the logic that “this guy liked a lot of stuff you liked so you’ll like this new thing he said he likes.” It’s the obvious idea that like-minded people like the same stuff.
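That logic can be boiled down to a few lines of Python. This is a minimal sketch with invented users and titles; real recommendation engines use weighted similarity over huge, sparse preference matrices, not raw set overlap.

```python
# "Like-minded people like the same stuff": find the user whose likes
# overlap most with mine, then suggest what they liked that I haven't seen.

likes = {
    "me":   {"Blade Runner", "Eraserhead", "Stalker"},
    "guy1": {"Blade Runner", "Eraserhead", "Alien"},
    "guy2": {"Twilight", "Grease"},
}

def recommend(user, likes):
    """Recommend items liked by the taste-neighbor with the biggest overlap."""
    mine = likes[user]
    # The most like-minded other user, measured by shared likes.
    neighbor = max((u for u in likes if u != user),
                   key=lambda u: len(likes[u] & mine))
    # Whatever they liked that `user` hasn't liked yet.
    return likes[neighbor] - mine

print(recommend("me", likes))  # → {'Alien'}
```

Here “guy1” shares two likes with “me,” so his remaining pick gets recommended; “guy2,” who shares nothing, is ignored.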
It kind of works, I guess. But I’m starting to wonder about another issue. All these processes assume that whether we will like something is fairly static. I see a movie on Sunday afternoon and like it. The presumption is that had I seen the movie on Thursday evening, or Tuesday morning, I would have liked it just the same. But what if our liking something is more flexible? What if our mood before we examine the content affects whether we like it? What if whether we just ate a good meal affects our liking it? Then it matters less what some guy who has liked things we’ve liked thought. Maybe he liked it because he just ate a delicious fettuccine Alfredo?
And, I suspect there’s some truth to this supposition. Sometimes no music or TV, no matter how good, is going to keep my interest. And there are other times when anything seems pretty amusing. I may also like something simply because I like the person making the recommendation. There’s a lot of x-factors at work that are hard to weed out of the process.
You might say that I’m saying appreciating content is subjective (i.e. it depends on the person). But I’m really saying it’s beyond subjective. The person I will be Saturday may not like stuff the person I am today likes.
January 14th, 2014 by Wil
I’m the first to admit that in many ways the internet is an extraordinary thing. But I do not understand why we don’t have one-click buying at this point in the game. By this I mean, you are on a web site and see a link to a song, movie, pdf, book, html template, podcast or whatever that you would like to buy. You should be able to click a “buy” link and immediately have the file start downloading to your hard drive. No logging in, no setting up a password, no nothing.
I certainly see issues with this idea; mainly that you could go to the bathroom and your little brother could come in and buy $3,000 worth of monkey porn. But I feel confident this could be worked out. It’s really kind of amazing that buying things is still so hard on the web.
The beauty of this, I feel, is that it would really empower the little guy (e.g. me). If I could post a song or piece of writing and offer to sell it for, say, 10 cents, I think I would probably get some buyers. Maybe not a lot, but enough to make it work. I’ve never bought a Kindle book in my life, mainly because I think three bucks (a standard price) is still too much and I just don’t want to deal with the hassle of Amazon. Same goes for iTunes and the rest. There’s no need for these middlemen (for a lot of items; for some there is).
January 6th, 2014 by Wil
Over at Reason magazine, there’s an article contemplating the possibility of autonomous technology. This isn’t technology that is conscious (though plenty of people are contemplating that) but software and robots that exist as entities that can support themselves economically. The article muses on a self-driving car that operates as a cab and uses its income to pay for gas and repairs. Or investment software that buys and sells stocks; some invest-bots might make millions, some would go broke, but they would be out there.
The author states…
This little [invest-]bot can be made with technology that we have available today, and yet it is totally incompatible with our legal system. After all, it is a program that makes and spends money and acts in the world, but isn’t owned by a human or a corporation. It essentially owns itself and its capital. The law doesn’t contemplate such a thing.
It’s a fascinating idea—computer programs that are independent, money-making units. But if they are too successful, will the humans begin to eye them jealously? Will men seethe in anger when they discover millionaire robots taking their wives out for a night on the town? Are the seeds of the coming human/robot war being sown as we speak?
December 27th, 2013 by Wil
For a while now I’ve been arguing that the digitization of content is leading to the devaluation of content. This has certainly affected the news industry (see dying newspapers and magazines) and the music industry (see the loss of revenue and the general sense that music is free). I’ve long felt the movie industry will become a partial victim of this trend; the main thing sparing it right now is that ripped movie files are still pretty big in terms of data and thus unwieldy to upload and download.
That said I’ve been noticing a lot of pirated movies over at Youtube. I’ve even been passing some evenings checking them out; there’s a lot of great classic and independent horror flicks there. (I wrote about some of them here.)
The ethical problem is, of course, that these movies are pirated, and the people who produced and contributed to the films see not one dime from their films being shown on YouTube. I am fortunately free of ethics, but other folks may be bothered by this. You often find these “information should be free” types tying themselves into pretzels of logic to defend piracy.
One thing that I am struck by is the fact that Google—the owner of YouTube—is selling advertisements that run before the movies. So, not only are they looking the other way on movie piracy, they are profiting from it. I was a little curious about this state of affairs and tracked down this HuffPo article, which examines it. As you might surmise, Google is protected because they are not legally responsible for what their users upload (which, I admit, is probably a fair position). Nonetheless, this development clearly does not encourage the production of cinema, especially independent cinema.
The article states…
…according to judgments in the YouTube v Viacom case, the DMCA provides YouTube with protection against copyright infringements carried out by its users. YouTube is not under an obligation to ensure that the service it provides complies with copyright law. Instead, copyright owners must report infringement to YouTube. So rather than YouTube creating a clever piece of software, or employing a small team of compliance officers, studios and producers all over the world have to duplicate effort and cost to monitor whether their titles are being uploaded. It doesn’t look like this inefficient method of policing copyright is going to change any time soon, but things get really interesting when one looks at the money that is being made from illegal uploads. YouTube may not have a responsibility to ensure that the content it publishes complies with the law, but does that entitle it to derive revenue from illegally uploaded content?
Of related interest: this article alleging Google, Yahoo and Bing knowingly sell ads to spammers. So much for “don’t be evil.” But what are we going to do about it? Stop using Google? Of course not. We are their bitches.
December 23rd, 2013 by Wil
Al Goldstein, publisher of the famous porn mag “Screw,” died recently. He had an epic life arc; at one point he was a millionaire from “Screw,” but…
By the mid-2000s, Goldstein was homeless. He once got a job as a restaurant greeter, only to lose it when the management discovered he was sleeping on the premises.
Even in that period he reveled in irreverent humor. In 2005, when he worked at a New York deli, he told the Washington Post, “I’ve gone from broads to bagels.” But there was also the ever-present anger. “Anyone who wishes ill on me should feel vindicated,” he told the New York Times in 2004, “because my life has turned into a total horror.”
What brought the duke of porn down? The ubiquitous availability of porn once the internet appeared. I wonder if that is a sign of things to come for other industries—music, books, movies—that find their content becoming more and more digitized.
I discussed Al’s strange, public access talk show “Midnight Blue” a while back.
The general focus of the show was interviews, usually with female porn stars, though eventually non-porn guests like O.J. Simpson, Arnold Schwarzenegger, Gilbert Gottfried and Debbie Harry made appearances. Al’s primary interest was that of sexual technique; he would often throw out provocative, curse laden queries along the lines of “How do you like your pussy licked?” or “Is there anything wrong with me fucking a chick with my nose?” The porn royalty sat in the hot seat—too jaded to be shocked at Al’s questions—and offered serious answers. (One of my favorite “MB” moments occurred when Al asked Carol Connors, the “forgotten actress of Deep Throat” (and mother of current mainstream film actress Thora Birch!) whether she would ever perform bestiality. “No,” Carol demurred, “But I love animals!”)
December 22nd, 2013 by Wil
As a society, or species (or whatever we are), we tend to laud forward-thinking creative geniuses. When we find one, we hoist them onto a pedestal and treat them as (to quote this article) an “Übermensch [who] stands apart from the masses, pulling them forcibly into a future that they can neither understand nor appreciate.” This is true across all disciplines. Think of Einstein, Beethoven, Picasso, on and on.
So how does one become a genius? Clearly you have to innovate, to do something no one else has done. But there’s a catch here. You can’t be too innovative. You can’t be so ahead of the curve that nobody can really grasp what you’re saying or doing.
Let me propose a thought experiment. Jimi Hendrix travels to Vienna around 1750 and plays his music. Would he be lauded as a genius? Would his guitar playing be heard as the obvious evolution of current trends in music? No, he’d probably be regarded as an idiot making hideous noise and he might be burned at the stake.
But, let music evolve for around 220 years and yes, Jimi is rightfully regarded as a genius. His sound was made palatable by those who came before him, mainly the electric blues guitar players of the 50s and 60s. (Obviously there are a lot of other factors (like race and class and sex) relevant to who gets crowned a genius, but I’m painting in broad strokes here.)
So the trick to being a genius is to be ahead of your time but not too far ahead. The world of science and medicine is filled with examples. Gregor Mendel famously discovered that physical traits could be passed from one generation of life to another. In what was a major breakthrough in our understanding of biology, he theorized what we came to call genes. He published his results and was met with pretty much total indifference. It wasn’t until his work was rediscovered decades later that it was applied. Mendel was too far ahead of his time.
The book “The Mind’s I” notes the mathematician Giovanni Girolamo Saccheri, who contributed to the discovery of non-Euclidean geometry. His ideas were so controversial that even Saccheri himself rejected them! (At least he did according to the book; there seems to be some debate on this. See the last paragraph on the Saccheri wiki page.) Talk about being too ahead of your time.
But perhaps the best example of this sort of thing is Ignaz Semmelweis. The Hungarian physician…
…discovered that the incidence of puerperal fever could be drastically cut by the use of hand disinfection in obstetrical clinics.
That’s right, he basically came up with the crazy idea that doctors should wash their hands after touching sick people. Unfortunately…
Despite various publications of results where hand-washing reduced mortality to below 1%, Semmelweis’s observations conflicted with the established scientific and medical opinions of the time and his ideas were rejected by the medical community. Some doctors were offended at the suggestion that they should wash their hands and Semmelweis could offer no acceptable scientific explanation for his findings. Semmelweis’s practice earned widespread acceptance only years after his death, when Louis Pasteur confirmed the germ theory and Joseph Lister, acting on the French microbiologist’s research, practiced and operated, using hygienic methods, with great success.
Oh well. Semmelweis probably still had a great career and life right?
In 1865, Semmelweis was committed to an asylum, where he died at age 47 after being beaten by the guards, only 14 days after he was committed.
Don’t be too ahead of the curve folks.