quality – Techdirt

Netflix Co-CEO: I’m Very Sorry That I Promised We’d Focus On Quality A Decade Ago

from the growth-for-growth's-sake dept

Back in 2013, Netflix co-CEO Ted Sarandos noted that his company’s goal was to “become HBO faster than HBO can become us.” His point, at the time, was that Netflix wanted to become synonymous with quality and creative artistry in the same way HBO had after decades of hard work.

Eleven years later, everything has changed dramatically. HBO is a mockery of its former self after a series of pointless mergers by AT&T and Discovery resulted in thousands of layoffs, higher prices, and the death of the HBO brand. Product quality has deteriorated, with high-end television programs steadily being replaced by cheaply produced, lowest common denominator reality TV dreck.

And as streaming subscriber growth hits a wall, the other streaming giants have stopped innovating in similar ways. Netflix, Amazon, and most of their peers have resorted to familiar tactics to goose quarterly revenue growth: more pointless mergers, price hikes, annoying nickel-and-diming efforts, layoffs, new restrictions, and sagging product quality.

Speaking recently with the New York Times, Sarandos says he regrets ever having said they were hoping to emulate HBO’s approach to quality content. In short, he’s forced to admit that focusing on quality won’t deliver Wall Street the unsustainable, unrealistic, impossible and permanent growth investors so crave:

“Look, if there’s one quote that I could take back, it would have been in 2012, I said we’re going to become HBO before HBO could become us. At that time, HBO was the gold standard of original programming. What I should have said back then is, We want to be HBO and CBS and BBC and all those different networks around the world that entertain people, and not narrow it to just HBO. Prestige elite programming plays a very important role in culture. But it’s very small. It’s a boutique business.”

Sarandos kind of pooh-poohs the obvious sag in Netflix quality by pointing out that the streaming service is still winning Oscars. But, as a leading executive, Sarandos can’t really acknowledge a foundational truth: Wall Street’s need for impossible, unlimited quarterly revenue growth means that, sooner or later, Netflix is on a path toward self-immolation, just like the cable giants that preceded it.

The result: a bottomless roster of terrible reality TV shows about people trying to have sex on remote islands, peppered with a lot of movies like Under Paris.

As a publicly traded company you can’t just consistently offer a quality product people love. Inevitably, once normal subscriber growth taps out, you’re forced to get “creative.” That creativity, especially in media and telecom, almost always takes the same form: pointless mergers to goose stock valuations and nab tax cuts, cut corners on customer service and support, cheaper product quality, and, all the while, higher rates and ever more annoying restrictions.

Add lazy automation to the lowest common denominator chase for eyeballs at any cost and you have to wonder what mainstream television looks like a few decades from now.

To be clear, I still think Netflix offers a decent value proposition, especially in comparison to traditional cable TV. But there are endless warning signs that Netflix executives are dead set on pushing their luck with weird restrictions, sagging quality, and price hikes. That will end badly.

Executives think they can pull off a balancing act between quality and mass adoption at unlimited scale, but Wall Street’s demand for impossible, unlimited growth isn’t an achievable or realistic ask. And the inevitable trade-off, where consumers are consistently asked to pay more for less, ultimately isn’t sustainable, opening the door to another wave of disruption.

In Netflix’s case that will increasingly come in the form of free or ad-based short-form video apps, or piracy. And when piracy surges in response, which data suggests is already happening, streaming executives will blame absolutely everything but themselves.

Filed Under: competition, hbo, lowest common denominator, quality, streaming, ted sarandos, tv, video
Companies: netflix

False AI Obituary Spam The Latest Symptom Of Our Obsession With Mindless Automated Infotainment Engagement

from the that-you-click-is-all-that-matters dept

Tue, Feb 20th 2024 05:26am - Karl Bode

Last month we noted how deteriorating quality over at Google search and Google News was resulting in both platforms being flooded by AI-generated gibberish and nonsense, with money that should be going to real journalists instead being funneled to a rotating crop of lazy automated engagement farmers.

This collapse of online informational integrity is happening at precisely the same time that U.S. journalism is effectively being lobotomized by a handful of hedge fund brunchlords for whom accurately informing the public has long been a distant afterthought.

It’s a moment in time where the financial incentives all point toward lazy automated ad engagement, and away from pesky things like the truth or public welfare. It costs companies money to implement systems at scale that can help clean up online information pollution, and it’s far more profitable to spend that time and those resources lazily maximizing engagement at any cost. The end result is everywhere you look.

The latest case in point: as hustlebros look to profit from automated engagement bait, The Verge notes that there has been a rise in automated obituary spam.

As we’ve seen elsewhere in the field of journalism, engagement is all that matters, resulting in a flood of bizarre, automated, zero-calorie gibberish where facts, truth, and public welfare simply don’t matter. The result: automated obituaries, at unprecedented scale, for people who aren’t dead. Like this poor widower, whose death was widely (and incorrectly) reported by dozens of trash automation sites:

“[The obituaries] had this real world impact where at least four people that I know of called [our] mutual friends, and thought that I had died with her, like we had a suicide pact or something,” says Vastag, who for a time was married to Mazur and remained close with her. “It caused extra distress to some of my friends, and that made me really angry.”

Much like the recent complaints over the deteriorating quality of Google News and Google search, Google sits nestled at the heart of the problem thanks to a refusal to meaningfully invest in combating “obituary scraping”:

“Google has long struggled to contain obituary spam — for years, low-effort SEO-bait websites have simmered in the background and popped to the top of search results after an individual dies. The sites then aggressively monetize the content by loading up pages with intrusive ads and profit when searchers click on results. Now, the widespread availability of generative AI tools appears to be accelerating the deluge of low-quality fake obituaries.”

Yes, this kind of flood of automated gibberish is, like content moderation, impossible to tackle perfectly (or anywhere close) at scale. At the same time, all of the financial incentives in the modern engagement infotainment economy point toward embracing automated engagement bait, as opposed to spending time and resources policing information quality (even using AI).

As journalism collapses and a parade of engagement baiting automation (and rank political propaganda) fills the void, the American public’s head gets increasingly filled with pebbles, pudding, and hate. We’re in desperate need of a paradigm shift away from viewing absolutely everything (even human death) through the MBA lens of maximizing profitability and engagement at boundless scale at any cost.

At some point, morals, ethics, and competent leadership in the online information space need to make an appearance somewhere in the frame, in a bid to protect public welfare and even the accurate documentation of history. It’s just decidedly unclear how we bridge the gap.

Filed Under: engagement, google, information, journalism, obituary scraping, obituary spam, public welfare, quality, reporting, search, seo, spam

Study Says US Ranked 68th Out Of 100 In Mobile Video Quality

from the not-so-hot dept

Mon, Dec 9th 2019 06:14am - Karl Bode

While the telecom sector often enjoys crowing about the superiority of U.S. wireless, the reality is we’re not all that superior. Though the U.S. was among the first countries to deploy 4G LTE, US 4G speeds tend to be fairly pathetic, with one study ranking the US 47th out of 77 countries. US wireless data prices are also significantly higher than in a long list of other developed nations, thanks in no small part to regulatory capture and revolving-door regulators.

This week the US wireless sector was shamed further via a new report by OpenSignal, which found that US wireless video streaming quality also remains somewhat underwhelming. According to the study, the U.S. is ranked 68th out of 100 when it comes to video streaming quality, someplace between Kyrgyzstan and Kazakhstan. The crowdsourced study is based on 94,086,045,513 measurements from 37,671,772 devices running Opensignal’s software between August 1 and October 30. The authors, by and large, place the lion’s share of the blame at the feet of insufficient spectrum:

“While there was an improvement in Americans’ Video Experience, with the score increasing from 46.7 to 53.8 points, it was not enough to shift U.S. consumers up a gear into the Good category. Instead, Video Experience remained stuck in the Fair category. Americans had the lowest Video Experience score of any of the G7 economically leading countries as U.S. carriers struggle with the combination of enormous mobile video consumption and insufficient new spectrum. Opensignal’s results highlight the need for the release of more mid-band spectrum to help U.S. carriers meet the mobile video needs of Americans.”

But it’s not just spectrum. Another recent study showed that many of the problems with US video streaming come courtesy of bizarre restrictions imposed on U.S. wireless consumers’ connections, restrictions generally used to nickel-and-dime customers into paying significantly higher rates. Sprint, for example, has sold “unlimited” data plans that throttle all video, games, and music unless consumers pay more. Verizon has similarly been selling “unlimited” data plans that throttle all video to standard definition by default, making HD and 4K luxury options that require you to pay even more.

These restrictions are justified by claims of spectrum scarcity, but often have more to do with the US telecom sector’s allergy to genuine price competition. There’s a universe of reasons for that, from the monopolies companies enjoy over tower backhaul to revolving-door regulators who prioritize profits over healthy markets or consumers. And it’s a problem that’s likely to get worse with the repeal of net neutrality rules that attempted, albeit imperfectly, to thwart a lot of this kind of predatory nonsense in the absence of more heated competition.

While 5G is propped up as some miraculous panacea for the sector, it can’t stop the FCC from pandering to industry, fix backhaul monopolies, or stop our obsession with merger mania, which, as the T-Mobile Sprint deal will soon illustrate, only acts to erode competition and any incentive to compete on price. The real reason for substandard US telecom has long been regulatory capture and limited price competition working in concert, something OpenSignal likely isn’t keen on highlighting for fear of annoying its paying clients in the telecom sector.

Still, these studies tend to highlight how, while the US crows a lot about wireless superiority, we remain largely mediocre when it comes to most of the wireless metrics (availability, speed, quality) that actually matter. So while many prattle on about the “race to 5G” and how we must pander to AT&T and Verizon or risk losing our amazing edge in wireless, it’s worth remembering that edge doesn’t actually exist.

Filed Under: broadband, competition, fcc, innovation, mobile video, quality, regulation, us

How The Dark Net Is Making Drug Purchases Safer By Eliminating Associated Violence And Improving Quality

from the hidden-virtues dept

Despite a few daring experiments in the space, the dark net (or dark web, if you prefer) is generally seen as a dangerous, frightening place inhabited by terrorists, pornographers and general ne’er-do-wells. That makes a report in The Guardian about drug dealers moving online unusual, because it shows how the dark net can also be beneficial to society:

> Research into internet drug markets by the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) suggested the self-regulation of online markets such as Silk Road provide a safer environment for users and dealers of illicit substances.
>
> Feedback mechanisms similar to eBay mean customers are able to hold dealers to account for the service they provide, the report said, while remote access to the market almost eliminates the risk of violence that has long been an integral part of the black economy.

Moving online not only safeguards drug users from violence and theft when they buy drugs in the physical world, it provides a natural way for customers to provide feedback on the quality of the drugs provided. Just as with traditional e-commerce companies, drug dealers who go digital can no longer risk bad customer reviews by providing inferior or dangerous products, since their future sales are likely to suffer. As a result:

> Drugs available through darknet markets tend to be of a higher purity than those available on the streets.

The new report comes from the European Monitoring Centre for Drugs and Drug Addiction, which is funded by the European Union, and, as usual, is accompanied by an official comment from the relevant EU commissioner. Unfortunately, Dimitris Avramopoulos, the European Commissioner for Migration, Home Affairs and Citizenship, trots out the usual unthinking reaction to drug sales that has made the long-running and totally futile “war on drugs” one of the most destructive and counterproductive policies ever devised:

> We should stop the abuse of the internet by those wanting to turn it into a drug market. Technology is offering fresh opportunities for law enforcement to tackle online drug markets and reduce threats to public health. Let us seize these opportunities to attack the problem head-on and reduce drug supply online.

That blinkered attitude ignores the important advantages that moving drug sales from the physical world to the digital one brings, not just for users and dealers, but also for society as a whole, which does not have to deal with the social and economic consequences of violence on the streets, or with the long-term damage caused by poor-quality products. Along the way, his remarks inevitably and unhelpfully reinforce the view that the dark net is evil, and thus is something to be destroyed.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+

Filed Under: crime, dark markets, dark net, quality, safety, trust

No, The Internet Hasn't Destroyed Quality Music Either

from the panic-panic-everywhere dept

At what point will the music industry stop crying wolf? Remember that part of the reason behind the 1909 Copyright Act in the US was the arrival of the player piano, which some feared would put musicians out of business. Same with the phonograph. Remember, John Philip Sousa told Congress in 1906 how those darned “talking machines” were going to stop people from singing:

These talking machines are going to ruin the artistic development of music in this country. When I was a boy–I was a boy in this town–in front of every house in the summer evenings you would find young people together singing the songs of the day or the old songs. Today you hear these infernal machines going night and day. We will not have a vocal chord left. The vocal chords will be eliminated by a process of evolution, as was the tail of man when he came from the ape. The vocal chords will go because no one will have a chance to sing, the phonograph supplying a mechanical imitation of the voice, accompaniment, and effort.

And, of course, basically every other technological innovation was a threat of some sort. The radio was supposed to kill music. “Home taping is killing music” was a slogan! The RIAA undermined digital tapes and tried to limit CDs. It sued over the earliest MP3 players. It’s sued countless internet companies and even music fans.

Through it all, the refrain is always the same: if we don’t do this, “music will go away.”

But, of course, throughout it all, music only expanded. In the first decade of the 21st century, more music was recorded than in all of prior history combined, and it’s likely the pace has only increased over the five years since.

And because of that, we’ve started to hear a new refrain from the same folks who insisted before that music was at risk of “dying” because of new technologies: that maybe there’s more music, but it’s clearly worse in quality. Some of this can be chalked up to the ridiculous pretension of adults who insist that the music of their youth was always so much better than the music “the kids listen to nowadays.” But plenty of it seems to be just an attack on the fact that technology has allowed the riff raff in, and the big record labels no longer get to act as a gatekeeper to block them out.

However, as pointed out in an article in The Age down in Australia, not only is music doing phenomenally well these days, but a recent study suggested that the quality of music continues to increase as well. Now, obviously, quality is a subjective thing, so it’s difficult to “measure,” but here’s what the report noted:

Yet all these years on we are still surrounded by music. It follows us throughout a day from our bedside to our commutes to our earphones at work to our drive home to settling into bed.

And an astonishing amount of it is new. A decade after the arrival of file sharing, US economist Joel Waldfogel charted what had happened in a paper called Bye, Bye, Miss American Pie? The Supply of New Recorded Music since Napster.

There is no doubt that recording companies are making less money since file sharing, he says. But that doesn’t necessarily mean they are making less music, or even less good music.

Assembling data on the quality of songs from the “all-time best” lists compiled each year by Rolling Stone and other magazines he finds that the albums regarded as good tend to be recent, and increasingly so as the internet age wears on.

The good new ones aren’t even by old artists. He says around half of the good new albums are by artists who only started recording since file sharing. It has neither killed new music, nor frightened people away from beginning to make music.

Now, there are reasonable quibbles with this methodology. You can say that of course newer lists of “all time” best music will weight more recent favorites heavily, even if they might not truly stand the test of time. But, at the very least, it does suggest that plenty of people (myself included) are still finding a ton of new music to listen to that we find to be just as good as, if not better than, music from decades ago.

Filed Under: business models, internet, music, predictions, quality, streaming

Hollywood Desperate To Blame Bad Opening Box Office Of Expendables 3 On Piracy Rather Than The Fact That It Sucked

from the might-be-another-factor... dept

It’s been kind of crazy to watch movie studio Lionsgate lose its mind over the fact that The Expendables 3 leaked online a few weeks ago. Within a few days, Lionsgate had filed a massive lawsuit, been granted a restraining order, and followed it up with thousands of takedown notices targeting everyone from hosting providers to domain registrars, in a quixotic attempt to make the leaked files disappear.

The movie finally opened for real and the results — $16.2 million — were considered a disappointment. The credulous reporters over at Variety immediately decided to pin the blame on the leak, rather than the fact that almost everyone agrees the movie sucks, and that the third film in a crappy franchise almost never does particularly well anyway. The report points to some research claiming that when a film leaks, “it loses nearly 20 percent of its potential revenue.” Variety conveniently leaves out the fact that the research was done via a program “made possible through a gift from the MPAA,” which kinda seems relevant…

Meanwhile, it seems relevant that another study of a leak a few years ago of Wolverine under fairly similar circumstances suggested that the leak actually helped the film at the box office. At best, it seems that Hollywood might legitimately claim that the leaked copy made people realize that the movie sucked and told their friends not to go, but then they’re left arguing that they “made a movie so bad that pirates–who paid nothing to watch–told people it wasn’t worth seeing.” That doesn’t really sound like it’s the leak’s fault… so much as the fact that the movie sucked.

As always, the same basic rule applies to movies: make a good product and any leak isn’t going to have a significant impact at the box office. People go out to the movies for the social experience of it. A good movie is an event, and the fact that it leaks online isn’t going to change that. That’s not what happened here.

Filed Under: box office, copyright, expendables 3, infringement, leaks, piracy, quality
Companies: lionsgate

New Study: USPTO Drastically Lowered Its Standards In Approving Patents To Reduce Backlog

from the shockingly-under-shocking dept

The massive problems of the patent system really started getting renewed attention between 2002 and 2004 or so, highlighted by the publication of the book Innovation and Its Discontents: How Our Broken Patent System Is Endangering Innovation and Progress, and What to Do About It by Adam Jaffe and Josh Lerner. By that point, the combination of two key events in the late 90s was clearly being felt on the patent system. First, and most importantly, was the impact of the State Street decision, which announced to the world that the courts considered software and business method patents legal. Also important was the 1999 publication of Rembrandts in the Attic: Unlocking the Hidden Value of Patents by Kevin Rivette and David Kline, which led patent lawyers and tech companies alike to suddenly ramp up their patenting and to sell off “unused” patents to companies (lawyers) who did nothing but threaten and sue over them. Suddenly, patent trolls became a big, big issue.

Around the time of the Jaffe and Lerner book, the USPTO seemed to actually take much of the criticism to heart. One big part of Jaffe and Lerner’s criticism was the simple fact that patent examiners had significant incentives to approve patents, and almost none to reject them. That is, the metrics by which they were measured included the rate at which they processed patent applications. But, since there is no such thing as a truly final rejection of a patent, people would just keep asking the USPTO to look at their application again. Each time an examiner had to do this, their “rate” would decline, since they’d be spending even more time on the same old patent application. Approving a patent, by contrast, got it off your plate and let the court system sort out any mess. However, after the book was published, the USPTO actually seemed to pay attention, and changed its internal incentives a bit to push for high-quality approvals. Not surprisingly, this meant that the approval rate dropped. But, since there was more demand for bogus patents to sue over, more people appealed the rejections and the backlog grew.

Patent system lovers started whining about the “backlog,” but what they were really pissed off about was the fact that their bogus patents weren’t getting approved. Unfortunately, their message resonated with the new regime of the Obama administration, mainly Commerce Dept. boss Gary Locke and USPTO head David Kappos. Back in 2010, we noted that the USPTO had shifted back to approving “pretty much anything” and had clearly decreased its quality standards in an effort to rush through the backlog. Not surprisingly, in stating this, we were attacked mercilessly by patent system supporters, who insisted that we were crazy, and that the truth was that David Kappos had found some magic elixir that made all USPTO examiners super efficient (or something like that — their actual explanations were not much more coherent). No matter what, they insisted that it was entirely possible to massively ramp up the number of approvals, decrease the backlog, and not decrease patent quality.

Needless to say, we’ve been skeptical that this was possible.

And now the data is in, suggesting we were absolutely right all along. A new study by Chris Cotropia and Cecil Quillen of the University of Richmond and independent researcher Ogden Webster used information obtained via FOIA requests to delve into what was really going on in the patent office (link to a great summary of the research by Tim Lee). The key issue is (once again) the fact that patents are never truly rejected in full; the people applying for them just keep on trying again and again until someone in the USPTO approves the application. However, the USPTO, to hide some of this, counts those “rejections” that eventually get approved as rejections, artificially deflating the reported “approval rate” of patent applications.
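
To make that accounting trick concrete, here’s a toy illustration in Python. The numbers are entirely hypothetical, not figures from the study; the point is just the mechanism: if every refiled application is counted as a fresh case, the earlier “rejections” pile up in the denominator and the reported rate sinks well below the share of inventions that ultimately get a patent.

```python
# Toy model of how refiled patent applications deflate the apparent
# approval rate. All numbers are hypothetical, for illustration only.

inventions = 100          # distinct underlying applications
allowed_first_time = 55   # approved on the first filing
refiled = 35              # "rejected" applicants who simply refile
allowed_on_refile = 30    # refilings that are eventually approved

total_allowances = allowed_first_time + allowed_on_refile
total_filings = inventions + refiled   # each refiling counted as a new case

# Official-style rate: refilings are treated as separate cases, so the
# earlier "rejections" stay on the books and inflate the denominator.
official_rate = total_allowances / total_filings        # ~0.63

# Corrected rate: follow each underlying invention to its final outcome.
per_invention_rate = total_allowances / inventions      # 0.85

print(f"Official-style approval rate: {official_rate:.0%}")
print(f"Per-invention approval rate:  {per_invention_rate:.0%}")
```

Same filings, very different headline numbers; which denominator you pick decides whether the agency looks like a tough examiner or a rubber stamp.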

When the researchers corrected for all of this, they found that the actual approval rate in 2012 was almost 90%: nearly all patent applications eventually get approved. 90%! That’s about where it was in 2004 and 2005 (as discussed above), though in 2001 it actually came close to 100%! However, as noted above, by the second half of the ’00s corrections had been put in place, and the approval rate had declined to under 70% in 2009 — meaning that the USPTO was actually rejecting bad patents. But over the past three years, we’ve shot right back up. And it’s clear that with an approval rate that much higher, the USPTO is approving many, many more bad patents.

In fact, it’s likely that the story is even worse than before. Back in 2004 and 2005 when the approval rates were similar, it was really before the public was aware of just how bad the patent troll problem was, so you had many fewer people trying to get their own bad patents to troll over. In the past five years or so that has changed quite a bit. So the number of applications has shot up massively as well. In 2004 there were 382,139 applications. By 2011 that had shot up by 50% to 576,763.

I don’t think anyone thinks that we suddenly became 50% more inventive between 2004 and 2011. No, the truth is that people were suddenly flooding the USPTO with highly questionable patent applications on broad and vague concepts, hoping to get a lottery ticket to shake down actual innovators. And, the USPTO under David Kappos complied, granting nearly all of them. Incredible.

When Thomas Jefferson put together the first patent system — after being quite skeptical that patents could actually be a good thing — he was quite careful to note that patents should only be granted in the rarest of circumstances, since such a monopoly could do a lot more harm than good. And yet, today, we encourage tons of people to send in any old bogus idea, and the USPTO has turned into little more than a rubber stamp of approval, allowing patent holders to shake down tons of people and companies, knowing that many will pay up rather than fight, and then leaving the few cases where someone fights back to be handled by the courts (who seem ignorant of the game being played).

The end result is a true disaster for actual innovation and the economy. We should all be able to agree that bad patents are not a good thing. And the USPTO is, undoubtedly, approving tons of awful patents when its true approval rate is hovering around 90%.

Filed Under: approvals, innovation, patents, quality, standards

Is There Any Merit To Neil Young's Plan To Improve The Quality Of Digital Music?

from the the-(record)-needle-and-the-damage-done dept

Neil Young has been unhappy with the state of digital audio for a while, and he’s made various noises about fixing it. Now, some trademark applications found by Rolling Stone suggest his plans are in motion, though details on those plans are scarce. The only real clue comes from a tangential mention in an unrelated press release:

A press release issued last September by Penguin Group imprint Blue Rider Press, which is publishing Young’s upcoming memoir, may have revealed the working title of Young’s entire project. In addition to the memoir, says the release, “Young is also personally spearheading the development of Pono, a revolutionary new audio music system presenting the highest digital resolution possible, the studio quality sound that artists and producers heard when they created their original recordings. Young wants consumers to be able to take full advantage of Pono’s cloud-based libraries of recordings by their favorite artists and, with Pono, enjoy a convenient music listening experience that is superior in sound quality to anything ever presented.”

But does Young actually have a new idea? There are already lossless formats like FLAC that some audiophiles swear by, not to mention uncompressed formats like WAV and AIFF. But there is theoretically room for improvement: most uncompressed digital audio is sampled at a rate of 44,100 Hz, but some pro studio equipment can record at twice that or more, and technologies like DSD can go much, much further. Moreover, most consumer audio consists of 16-bit samples, which could be bumped up to 24-bit. So on the technical side, there is the potential for new formats to popularize higher-quality digital audio. Who knows if that’s what Young has in mind.
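
To put rough numbers on what those sampling rates and bit depths imply, here’s a quick back-of-the-envelope sketch in Python. This is plain PCM arithmetic; the 96 kHz / 24-bit “hi-res” target is an assumed example for illustration, not anything Young or Pono has announced.

```python
# Raw (uncompressed) PCM data rate = sample rate x bit depth x channels.
def pcm_bits_per_second(sample_rate_hz, bit_depth, channels=2):
    return sample_rate_hz * bit_depth * channels

cd = pcm_bits_per_second(44_100, 16)     # CD quality: ~1.41 Mbps
hires = pcm_bits_per_second(96_000, 24)  # assumed "hi-res" target: ~4.61 Mbps

minutes = 45  # a typical album length
for name, rate in [("CD", cd), ("Hi-res", hires)]:
    megabytes = rate * minutes * 60 / 8 / 1e6  # bits -> bytes -> megabytes
    print(f"{name}: {rate / 1e6:.2f} Mbps, ~{megabytes:.0f} MB per {minutes}-minute album")
```

An uncompressed hi-res album comes out around 1.5 GB before lossless compression (codecs like FLAC typically cut that roughly in half), which is why the bandwidth and storage questions below matter.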

That, however, leads to the bigger question: is there really a market for such a format? The digital audio debate has been raging for years, and it has a lot of contours—not just the strengths and weaknesses of digital and analog formats, but also changing approaches to sound engineering and the debates over loudness, audio compression and overprocessing. While some audiophiles insist they can tell the difference, blind listening tests have proved they rarely can.

For the average listener, convenience, selection and price surely trump such a negligible (and possibly undetectable) quality difference—and since it sounds like Young hopes to develop a proprietary, cloud-only format, I’m guessing those other factors aren’t high priorities. Moreover, since most people are listening to their music on earbuds and other low-definition systems, the quality bottleneck exists much further down the line than the file format—and since an increasing amount of music is recorded with consumer tools like GarageBand that operate at the standard sampling rates for uncompressed AIFF/WAV files, there’s another bottleneck above the file format too.

In theory, these factors are part of what Young wants to change with his push towards higher quality—and there may be some potential in that direction over time as bandwidth and storage space increase, and even some sort of immediate market among audiophiles. But it’s hard to see what he could offer that existing formats don’t already provide.

I know some people will insist that digital audio sucks, and that they can tell the difference—but frankly that’s a meaningless assertion if they haven’t done a controlled test. There are simply too many biases to account for. But even if it is a real problem for some people, it is likely to be a very small niche market, not a cultural sea-change like Young seems to envision. Some of his proclamations about the effect of music sound eerily close to Prince’s insane ramblings about how audio interacts with the brain, which is hard to swallow. Music may create transcendent human experiences once it’s inside your head, but your ears are still made of flesh and bone, not magic. And evidence suggests that most people’s ears can’t tell the difference.
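
For anyone curious what a “controlled test” means concretely: in a standard ABX protocol, a listener who genuinely can’t hear a difference is just guessing, so the odds of a strong score by luck alone follow the binomial distribution. A minimal sketch (generic statistics, not tied to any particular published test):

```python
from math import comb

def p_by_chance(n_trials, k_correct, p_guess=0.5):
    """Probability of getting k_correct or more ABX trials right by pure guessing."""
    return sum(comb(n_trials, k) * p_guess**k * (1 - p_guess)**(n_trials - k)
               for k in range(k_correct, n_trials + 1))

# 12 of 16 correct sounds convincing, but pure guessing gets there
# about 3.8% of the time, which is why serious tests demand many
# trials and a significance threshold set in advance.
print(f"{p_by_chance(16, 12):.3f}")  # 0.038
```

That is the bar that casual “I can totally hear it” claims never actually clear.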

Filed Under: digital music, flac, neil young, quality

Neil Young: Piracy Is The New Radio (But The Quality Sucks)

from the well,-there's-that... dept

Neil Young apparently isn’t too concerned about copyright infringement these days, according to his comments at the D: Dive into Media conference:

It doesn’t affect me because I look at the internet as the new radio. I look at the radio as gone. […] Piracy is the new radio. That’s how music gets around. […] That’s the radio. If you really want to hear it, let’s make it available, let them hear it, let them hear the 95 percent of it.

Of course, that’s a bit of a reversal from back when he was angry that YouTube wasn’t paying him money when people uploaded his songs. Still, it’s good to see him come around to the view that infringement is, basically, a new form of radio. Artists like Chuck D have been making that argument for over a decade.

Young is still concerned… but about the fact that the quality of MP3 files sucks. He’d prefer technologies that provide a much fuller sound:

Steve Jobs was a pioneer of digital music, his legacy was tremendous. […] But when he went home, he listened to vinyl.

Filed Under: neil young, piracy, quality, radio

Even In The Age Of Abundance It's Quality, Not Quantity, That Counts

from the do-books-need-to-be-expensive-to-be-good? dept

Seth Godin is nothing if not prolific. As well as publishing a string of popular marketing books with catchy titles like “All marketers are liars”, “The big moo” and “Small is the new big”, he writes short but smart blog posts every day, some of which are rather obvious, but many of which contain real gems of insight.

This fluency with words means he is well placed to comment on the age of abundance we are entering thanks to the rise of digital technologies. One of his latest pieces is entitled “How the long tail cripples bonus content/multimedia“, and appears as part of The Domino Project, “a new way to think about publishing. Founded by Seth Godin and powered by Amazon” — a partnership that is itself symptomatic of the digital times.

The post is in response to a HuffPo interview with David “Skip” Prichard, president and CEO of Ingram Content Group. Prichard shows himself to be optimistic and surprisingly open to new ideas for someone leading a book distribution company — not a sector known for its innovation.

But Godin concentrates on one particular aspect of Prichard’s replies, which is typified by the following exchange:

> Are there enhanced books available this holiday season that have already changed the definition of a book?
>
> Yes, for example, a biography can come to life in many ways. Jacqueline Kennedy: Historic Conversations on Life with John F. Kennedy has all of the interview audios, videos, photographs, text, and transcripts available. Even classics — Penguin has updated Pride & Prejudice with clips from the movie and even instructions on dancing. For the 75th anniversary of The Hobbit, HarperCollins released an e-version with exclusives including J.R.R. Tolkien’s book illustrations and recently discovered Tolkien recordings. Publishers are still learning what added value readers will or won’t pay for. I expect we’ll continue to see lots of experimentation in this arena.

Godin describes these “breathtaking visions of the future” as “economically ridiculous”, and comments:

> The Long Tail creates acres of choice, so much as to make the number of options almost countless. But at the same time, it embraces (in every format) much lower production values. For what Michael Jackson and Sony paid to produce the Thriller album, today’s artists can make and market more than 5,000 songs. You just can’t justify spending millions of dollars to produce a record in the long tail world.

This is an important point that the copyright industries are extremely reluctant to acknowledge, because it’s at odds with their business models based on just a few massive blockbusters that are highly profitable. There’s a good reason for their preference: the elevated costs involved in creating these works act as a barrier to entry for newcomers, and help preserve the status quo. The new model, based around large numbers of low-cost products, is available for anyone to adopt — including artists selling directly to their public.

As Godin puts it:

> it’s not a few publishers putting out a few books for the masses. No, the market for the foreseeable future is a million publishers publishing to 100 million readers.

He explains what that means for ebooks:

> The typical ebook costs about $10 in out-of-pocket expenses to write (more if you count coffee and not just pencils). But if we add in $50,000 for app coding, $10,000 for a director and another $500,000 for the sort of bespoke work that was featured in Al Gore’s recent ‘book’, you can see the problem. The publisher will never have a chance to make this money back.

Finally, Godin addresses the inevitable complaint that the imminent loss of those $500,000 multimedia ebooks — like the imminent disappearance of $100 million movies — means the end of creativity as we know it:

> The quality is going to remain in the writing and in the bravery of ideas, not in teams of people making expensive digital books.

Even in the age of abundance, it’s not about quantity, but quality.

Follow me @glynmoody on Twitter or identi.ca, and on Google+

Filed Under: abundance, business models, quality, quantity, seth godin