bottom up – Techdirt

Internet Society Says 'Oh, Hell No' To ICANN's Plan For A 'UN Security Council For The Internet'

from the run-that-by-me-again dept

Earlier this month, ICANN, along with the World Economic Forum and a Brazilian government group called CGI.br, announced a NetMundial Initiative, which is being described as a sort of “UN Security Council for the internet.” If NetMundial sounds familiar, that’s because back in April there was a big meeting on internet governance in Brazil called NetMundial. While this has the same name, it seems to be basically unrelated to that, but rather, it appears to be these three groups setting themselves up in power positions over internet governance. While those behind it tossed in a bunch of buzzwords, about how it would be “open source,” a “shared public resource” and would have a “bottom-up, transparent” process, there was a bit of a problem with all of that. You see, the three founding organizations also… installed themselves as permanent members who would control the council.

Yes, they want other members, but setting themselves up with permanent seats seems a bit iffy on the whole “bottom-up, transparency” bit. The whole thing didn’t exactly go over well at launch:

“Everything will be done bottom-up, this is the mother of all bottom-up processes,” said Chehade to widespread disbelief in both the chatroom and on Twitter.

The claim that the initiative would not overlap other organizations’ work was also derided. “Why create another platform?” asked one person in the short Q&A session after the presentation. “How do you expect to avoid duplication?”

Asked why ICANN was installing itself as a permanent member of a body that would only focus on non-technical issues when ICANN is specifically a technical body, Chehade gave an answer that left many scratching their heads:

“Why is ICANN on the Council? Precisely to clarify why our role should remain as it is: purely technical. It should not be at ICANN where these issues should be solved.”

So, ICANN is purely technical, and it needs to install itself as a permanent member of a committee that isn’t technical to clarify why its role should be purely technical. Got that?

Anyway, you can see in the slide above, the “I* group” listed as being offered one of the remaining seats. The I* group was supposed to be made up of a bunch of organizations you should already be familiar with: the Internet Society, IETF, IAB, W3C, Regional Internet Registries, ICANN and regional TLD organizations (yes, it appears that the I* group also includes ICANN, despite its separate seat on this council). Either way, the folks at the Internet Society, who have been heavily involved in a variety of internet governance efforts, often in conjunction with ICANN, have slammed on the brakes after seeing this new initiative, saying that the group cannot support a plan that seems so questionably designed:

With respect to the need for new groups, such as the NETmundial Initiative and its Coordination Council, the Internet Society Board reiterates that the Internet Society's longstanding position is that there is no single, global platform that can serve to coordinate, organize or govern all the Internet issues that may arise. At its heart, the Internet is a decentralized, loosely coupled, distributed system that allows policies to be defined by those who require them for their operations and that ensures that issues can be resolved at a level closest to their origin. The ecosystem draws its strength from the involvement of a broad range of actors working through open, transparent, and collaborative processes to innovate and build the network of networks that is the cornerstone of the global economy.

Based on the information that we have to date, the Internet Society cannot agree to participate in or endorse the Coordination Council for the NETmundial Initiative. We are concerned that the way in which the NETmundial Initiative is being formed does not appear to be consistent with the Internet Society's longstanding principles.

ISOC further notes that a much bigger priority is getting through the transition of the IANA functions, from being under NTIA/Dept. of Commerce to being separate, thus taking ICANN out from under the thumb of the US government. As we’ve noted in the past, we support this move as being necessary for a variety of reasons, including some that will help prevent dangerous changes to internet governance. However, if this is the kind of crap that ICANN is going to pull, it’s only going to raise even more skepticism about the organization’s position in managing key parts of the internet.

Kudos to the Internet Society for not just giving in. Yes, if you look over the presentation below, there may be plenty of good ideas embedded in the NetMundial project, but if it’s going to go forward, it simply cannot include founding members electing themselves to a permanent controlling position (and also giving those organizations tremendous power in selecting the other council members). This is not how the internet should be run.

Filed Under: bottom up, council, netmundial, netmundial initiative, permanent members, transparent
Companies: cgi.br, icann, internet society, isoc, wef

The Death Rattle For Blackberry: Once Again, Markets Change Very Quickly

from the from-dominant-to-dead-in... dept

There’s plenty of attention going to the news of Blackberry’s awful financial performance and massive layoffs, with many people declaring the company at death’s door (meaning it’s likely someone will buy up the pieces sometime soonish). Just a few weeks ago, we wrote about the similar effective death of Nokia, highlighting just how much the mobile ecosystem had changed in just a few years. We showed the following chart from Asymco to make the point about just how much the market had changed, and how far Nokia had fallen in that time:

That same chart, obviously, applies to RIM as well. Though since RIM (which became Blackberry) had a rather different business model, it’s not as noticeable when you’re just looking at profit share, which is what that chart shows. Instead, the Washington Post shows an even more telling image, just looking at mobile OS market share:

The shorthand: markets change. Often in big, surprising ways. A few years back, if you looked at corporate America, it would be almost unfathomable that anyone would unseat RIM/Blackberry. It was everywhere in the corporate market. You could talk to people who referred generically to all mobile phone devices as “Blackberries.” That’s how ubiquitous the brand had become.

But, then, a variety of things changed. It wasn’t because of any anti-trust efforts. It was because of basic competition, which resulted in a somewhat surprising change in how things worked. First, the iPhone came on the scene, and that was a really compelling device. No, it didn’t have the nice keyboard, or certain features that Blackberry users loved (like BBM), but it was really great in so many ways… so people started to want to just use that for both personal and work purposes.

And then the magic shift. As Tim Lee notes, it was the bottom up demand from users in the workplace that completely changed the market.

But the pace of innovation in the consumer smartphone market was so rapid that employees became dissatisfied with their BlackBerrys. And eventually, the advantages of iOS and Android devices became so obvious that corporate IT departments were forced to capitulate. They began supporting iPhones and Android devices even though doing so was less convenient.

BlackBerry eventually realized that it would need to compete effectively in the consumer market if it wanted to survive. But building consumer-friendly mobile devices wasn’t its engineers’ strong suit. And by the time BlackBerry released a modern touchscreen phone in 2010, three years after the iPhone came on the market, it had a huge deficit to make up.

Folks who study the Innovator’s Dilemma will likely recognize this as a classic case of that in action. You have a new upstart that gets mocked and pooh-poohed by the existing players. In the business world, very few people took the iPhone (or Android) seriously. After all, RIM totally dominated the market, and it was entirely top-down, with companies purchasing huge contracts. Plus, the Blackberry had so many features really focused on business users that the iPhone felt like “a toy.”

But that really is the secret to tremendously disruptive innovation. The first versions are almost always mocked as “toys.” It leads the dominant player to ignore them or mock them. And then the bottom up tidal wave takes them by complete surprise. The “toys” get better at business tasks, and even if they’re not as good as the Blackberry, they’re good enough for most business use cases, and the pressure mounts to ignore the top down process and have companies support iPhones and Android phones. And, by the time Blackberry tries to catch up, they’re way, way behind. It’s the same story of disruptive innovation that has happened time and time again.

This pattern is almost undeniable. In retrospect it always looks obvious, but companies almost always miss this. They tend to think — incorrectly — that they can “spot” the disruptive innovations ahead of time. Or, barring that, they believe they have such a dominant position that if something appears to be disruptive, they can just copy it and “catch up” leveraging their position in the market to do so. But that’s rarely the case. Almost every time, by the time they realize what’s going on, it’s way too late. When they finally get to the market, they’re seen as also-rans, way behind the times. They’re often going after a moving target. The successful early innovators have already adapted and changed and innovated some more, while the big entrants are still trying to compete with last year’s model.

It’s happened before and it’ll happen again. It’s the nature of innovation. It has nothing to do with patents (RIM’s got tons of those and has used them aggressively at times — look where that got them). It has nothing to do with antitrust issues. It has everything to do with the nature of innovation and competition.

Filed Under: antitrust, bottom up, innovator's dilemma, markets, patents, smartphones
Companies: blackberry, rim

When The Creators Of Both The Internet And The Web Come Out Against The ITU, Shouldn't You Too?

from the just-saying... dept

We’ve been talking a lot about the ITU and its WCIT (World Conference on International Telecommunications) lately, given the importance of various proposals on the future of the internet. While Vint Cerf, often considered the “father of the internet” for his early (and continued!) contributions to the core of the internet, has been quite outspoken for many months about the threats of the ITU towards the internet, now we can add the creator of the World Wide Web to the list as well. Tim Berners-Lee has spoken out against the ITU efforts at WCIT.

Sir Tim is director of a standards body himself – the World Wide Web Consortium. He said that governments can already influence changes but should resist further interference.

“I think it’s important that these existing structures continue to be used without any attempt to bypass them,” he said.

“These organisations have been around for a number of years and I think it would be a disruptive threat to the stability of the system for people to try to set up alternative organisations to do the standards.”

[…] “A lot of concerns I’ve heard from people have been that, in fact, countries that want to be able to block the internet and give people within their country a ‘secure’ view of what’s out there would use a treaty at the ITU as a mechanism to do that, and force other countries to fall into line with the blockages that they wanted to put in place.”

When the fathers of both the internet itself and the World Wide Web are speaking out against the ITU’s efforts to have further control over the future of the internet, isn’t it time to step back and ask what benefit the ITU would really provide? To date, none has been shown. Instead, we get vague talk about increasing “fairness” by diverting money from innovators to telcos who haven’t innovated, with the promise that this will lead to greater investment. Yet the evidence suggests that this doesn’t work, and historically, such transfers and subsidies tend to be pocketed by execs (or governments) rather than invested in infrastructure.

So, here we have two of the most visionary innovators out there — who created the key platforms we rely on — highlighting how the ITU process is the exact wrong way to go about things. Combine that with the fact that the ITU’s key argument is unsupportable based on history, and shouldn’t we all be wondering why this big charade is happening in the first place?

Filed Under: bottom up, itu, tim berners-lee, top down, vint cerf, wcit

Continuing The Discussion On A True Innovation Agenda

from the join-in dept

Last week, over on our Step 2 discussion platform, we kicked off a discussion on what an “innovation agenda” might look like for a US politician in 2012. What kinds of regulatory changes should they be focused on? This effort, done in partnership with Engine Advocacy, has already kicked off a nice discussion over there with some interesting ideas being tossed around. If you haven’t yet, please join in the discussion. I’m not surprised that copyright issues and open internet issues top the list of things most interesting to folks — the SOPA/PIPA debate has pretty much guaranteed that. I am a little surprised that issues around helping skilled entrepreneurs — the folks who create jobs — were seen as less important compared to some of the others on the list. Either way, the discussion is still going on there, and we’ll be taking it further over the coming weeks and months, so feel free to join in.

Filed Under: bandwidth, bottom up, copyright, decentralized, entrepreneurship, free speech, innovation, innovation agenda, patents, privacy, spectrum

Attacking The Hacker Hydra: Why FBI's LulzSec Takedown May Backfire

from the top-down-approach-to-a-bottom-up-threat dept

Interesting timing. Just about the same time that we had our story concerning how LulzSec kept its own site from getting hacked, the news was breaking that the key leaders of LulzSec were being arrested, in large part because the “leader” of the group had become an FBI informant after they tracked him down last year. Of the various hacking efforts out there, LulzSec has definitely been the most brazen, so it’s not a huge surprise that it would be targeted by the FBI. Also, unlike “Anonymous,” LulzSec was pretty clearly an effort by a few key individuals, rather than a loose collective of folks joining and leaving at will.

As I’ve been saying since these various groups started their various hacking and vandalism campaigns, I think these efforts are a really bad idea, and don’t do much to further the supposed causes that they’re trying to support. They’re only going to lead to backlash, as we’re already seeing in government officials using these groups as an excuse to try to make a power grab over the wider internet.

Given that, as I’ve said in the past, I haven’t been surprised to see the various arrests of folks supposedly associated with Anonymous or LulzSec. I expect that we’ll continue to hear such stories — in part because these kinds of stories are likely to provoke more of the same type of activity. Law enforcement keeps claiming that these arrests will frighten off others, but that shows a typical lack of understanding of what’s going on. As counterproductive as these activities are, it’s pretty clear that this isn’t about criminal activity for the sake of criminal activity, but about dissatisfaction with what’s going on in the world — and, as such, the arrests are actually only likely to create more such activity, which is the exact opposite of what law enforcement should be seeking to do.

Not understanding who they’re dealing with, and taking a top down approach to a bottom up threat, seems to be a specialty of US law enforcement.

Again, I think that the actual efforts by these folks are incredibly counterproductive and set up this “battle-siege” mentality, when the folks involved in all of this could be much more strategic in using their skills for good, rather than destruction. But that doesn’t mean that we should ignore the reality of why it’s happening, or how it’s likely to continue to evolve. More groups will pop up, more hacks will happen and (I’m sure) more disaffected skilled computer hackers will be arrested. But none of that (either the hacking or the arrests) is likely to bring us any closer to actually dealing with the problems that created this mentality in the first place.

Filed Under: anonymous, bottom up, fbi, hacking, lulzsec, top down

Help Create An 'Innovation Agenda' You Wish Politicians Would Support

from the make-a-statement dept

Join the discussion over at Step 2

In the last few months it’s become clear that it’s no longer acceptable for politicians to “not get” the internet. The internet has become such a key part of our lives that anyone who is trying to regulate it without understanding it doesn’t deserve to be in office. Of course, there are some politicians who really do want to do the right thing, and it’s time to help them out. In association with Engine Advocacy, we’re looking to do a little “crowdsourcing” around what an internet “Innovation Agenda” should look like for any politician in 2012. We’re starting with this basic principle:

New businesses are the key to job creation and economic growth, and the Internet is one of the most fertile platforms for new businesses ever established.

We believe deeply in the value of decentralized, emergent, bottom-up innovation, and we want to shape public policies that will allow it to flourish.

From there, we have a list of twelve topics that we think are important — but we want your input. So we’ve posted this same thing both here and over at our Step 2 discussion platform. Over at Step 2, we’ve also posted those initial twelve topics, with each one as a separate comment on the original post, so you can vote them up and down. If you want to really participate, please head on over to Step 2, where you can do three separate things (and, yes, your Techdirt login works there too):

  1. Suggest your own topics that should be part of an innovation agenda by responding to the main post.
  2. Vote on existing topics to show which ones are more important… and which ones are less important.
  3. Comment on the existing topics to provide feedback or suggest ways to improve them.

Please help us shape a comprehensive Innovation Agenda for 2012. Engine Advocacy is working closely with the internet community and helping give them a voice in DC, and this is one way to take part, as your suggestions may help shape what politicians are hearing.

Filed Under: bandwidth, bottom up, copyright, decentralized, entrepreneurship, free speech, innovation, innovation agenda, patents, privacy, spectrum

Can Facebook Really Bring About A More Peer-to-Peer, Bottom-Up World?

from the reality-distortion-field dept

Mark Zuckerberg’s letter to shareholders included in Facebook’s IPO filing contains a pretty bold vision for Facebook to not just connect people and enable them to share, but to fundamentally restructure the way that the world works:

By helping people form these connections, we hope to rewire the way people spread and consume information. We think the world’s information infrastructure should resemble the social graph — a network built from the bottom up or peer-to-peer, rather than the monolithic, top-down structure that has existed to date. We also believe that giving people control over what they share is a fundamental principle of this rewiring.

We have already helped more than 800 million people map out more than 100 billion connections so far, and our goal is to help this rewiring accelerate. [emphasis added]

That sounds pretty lofty, but if you recognize that Facebook provides a social networking service that hundreds of millions of people use — but forget for a moment that it’s Facebook — it’s quite a bold “social mission.” And there are many examples of how the service has been used as a key tool in effecting change on everything from opposition to the Canadian DMCA to the Arab Spring. There’s no doubt that the service makes it easier for people to organize in a more bottom-up way.

But, once you remember that it’s Facebook we’re talking about, the vision sounds more problematic. Could Facebook ever truly bring about a peer-to-peer, bottom-up network? The notion seems to be an inherent contradiction to Facebook’s architecture — as a centralized, proprietary, walled garden social networking service. Facebook may enable a more bottom-up structure, but it’s a bit disingenuous for Zuckerberg to decry a monolithic, top-down structure when Facebook inserts itself as the new intermediary and gatekeeper. As a centralized, proprietary, walled garden service, Facebook is a single point for attacks, control, and surveillance, never mind controversial policies or privacy concerns. Facebook may enable a more bottom-up and peer-to-peer network compared to many things that came before, but there is something fundamentally at odds with a truly distributed solution at the core of its architecture and its DNA.

To realize the full potential of bottom-up, peer-to-peer social networking infrastructure, we need autonomous, distributed, and free network services — the sort of vision that StatusNet/Identi.ca or Diaspora have tried to bring about. Rewiring the world to create a more bottom-up, peer-to-peer network is a bold vision for Zuckerberg to put forth — and one that Facebook has advanced in many ways — yet it’s fundamentally at odds with the reality of Facebook as a centralized and proprietary walled garden.

Filed Under: bottom up, peer to peer, social network
Companies: facebook

There Are Numbers Less Than 1%

from the reasons-to-buy... dept

I’ve pointed out in the past that any time you hear a company talk about their business model in terms of “if we only get 1% of that market… we’ll still be huge,” you should run away (and it’s even more ridiculous when you hear some talk about 10% or 15% of a market). This is top down thinking, but it’s not how businesses work. There’s no guarantee of any percent. Instead, any business needs to focus on bottom up reasoning — explaining why the very first person will buy. Then the second. Then the third, and so on. Taking the top down approach is wishful thinking. It’s making a huge assumption that people will just buy. Taking the bottom up approach is actually building a business. It’s recognizing who the customer is, what they want and how to best get it to them. It’s tempting to do the top down approach, because it looks so tantalizing and easy. But business isn’t easy. It’s hard work.

I’m reminded of this, with a submission from JohnForDummies about a Derek Sivers blog post, discussing a musician friend who took out an ad in a magazine with 1 million subscribers, repeatedly saying:

“If only one percent of the people reading this magazine buy my CD… that’ll be 10,000 copies! And that’s only one percent!”

But, as the musician learned, there are numbers much smaller than 1%: he ended up selling just 4 copies of the CD.
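The arithmetic behind the anecdote is worth spelling out. Here’s a minimal Python sketch (purely illustrative — the variable names are ours, and the numbers come from the anecdote) contrasting the top-down projection with the bottom-up reality:

```python
# Top-down thinking: assume some percentage of the market will buy.
subscribers = 1_000_000          # magazine circulation from the anecdote
assumed_rate = 0.01              # "if only one percent..."
projected_sales = int(subscribers * assumed_rate)
print(projected_sales)           # 10000 copies, on paper

# Bottom-up reality: count the people who actually bought.
actual_sales = 4
actual_rate = actual_sales / subscribers
print(f"{actual_rate:.4%}")      # 0.0004% -- far below the assumed 1%
```

The gap between the two rates — 1% assumed versus 0.0004% observed — is the whole difference between wishful top-down math and giving each specific customer a reason to buy.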

This is, in some ways, similar to the give it away and pray business models that we sometimes see people trying. Giving stuff away for free is a good part of a business model, but it’s not an entire business model by a longshot. Anyone looking to use free as a part of a business model also needs to go further and do the hard part, the bottom up part, where they figure out how they’re going to get anyone (not a percentage, but specific people) to actually find something worth paying for on its own. Because 0 from a million people is still 0. But, reaching 1,000 people with something of value that they want and can’t get any other way… that’s the start of a business model.

Filed Under: bottom up, sales, top down

OLPC Finally Decides to Open Source Its Hardware

from the it's-about-time dept

The many travails of the One Laptop Per Child program have been widely chronicled – after developing a robust, innovative laptop for the developing world, Nicholas Negroponte’s educational project failed to garner the reception he expected. One of the main reasons for this was OLPC’s belief that the market could not do better than its small project: instead of seeking out the best products for the children of the developing world, the OLPC group treated competition as anathema.

But news that the hardware from OLPC’s second version, XO-2, will be open sourced, gives hope that things are starting to change. Speaking to the Guardian, Negroponte says, “The XO-1 was really designed as if we were Apple. The XO-2 will be designed as if we were Google – we’ll want people to copy it. We’ll make the constituent parts available. We’ll try and get it out there using the exact opposite approach that we did with the XO-1.” Open hardware is an exciting new arena for innovative designs and, by embracing it, OLPC will create a new opportunity for entrepreneurs to create the best laptop for the developing world (or even the developed world). Also, instead of picking an established manufacturer from East Asia, open sourced hardware specifications will allow the developing world’s emergent technology industries to compete, strengthening the communities OLPC seeks to assist.

Filed Under: bazaar, bottom up, cathedral, competition, nicholas negroponte, olpc, open source, top down
Companies: olpc

OLPC Is A Cathedral But OLPC Tech Is Fleeing Into The Bazaar

from the top-down-or-bottom-up dept

From the outset, one of the oddities of the One Laptop Per Child project has been the tension between its organizational philosophy and its software platform. In his famous essay, “The Cathedral and the Bazaar,” Eric Raymond contrasted two organizational philosophies for developing software. In the Cathedral, software projects are organized in a top-down fashion, with the development process following a plan carefully developed by the project’s leaders. In contrast, the philosophy of the Bazaar is to “release early and often, delegate everything you can, be open to the point of promiscuity.” The OLPC project was a strange beast because it was clearly organized on the “Cathedral” model, yet it was developed around Linux, the open source project that Raymond used as the poster child for the “Bazaar” style of development. And its broader vision of empowering third-world kids to use the laptops without a lot of central support is clearly more Bazaar than Cathedral.

I think many of the problems we’ve noted with the project stemmed from this fundamental conflict of visions. Nicholas Negroponte’s vision for the OLPC organization has always been the model of the Cathedral: produce a perfect laptop on the first try and sell it in batches of a hundred thousand to the world’s governments. Negroponte’s plan left little room for the kind of gradual growth, bottom-up participation, and trial-and-error that characterizes the Bazaar. Indeed, even when customers were beating down the door to try out Negroponte’s product, he resisted selling it to them because it conflicted with his vision. And of course, he absolutely hated the idea of his customers having other options to choose from.

This tension was never sustainable, and indeed there are increasing signs that OLPC’s innovative technologies are being steadily liberated from the Cathedral. In January, we noted that one OLPC alum was starting a new firm to commercialize the OLPC’s display technology. Now CNet notes that another OLPC alum, Walter Bender, is starting a new software spinoff to license OLPC technology to a variety of laptop manufacturers. Bender’s decision to start a new company was presumably sparked by Negroponte’s decision to run OLPC more like Microsoft, which one engineer claims involved demoting Bender in favor of someone with less technical expertise.

It seems that the folks who have left OLPC have a more Bazaar-like vision for their companies, licensing their technologies to a variety of companies. In contrast, Negroponte seems to be doubling down on the “Cathedral” model. He’s reportedly considering a switch from Linux to Windows. That would be oddly appropriate given the apparent similarities between Negroponte’s management philosophy and Steve Ballmer’s.

Filed Under: bazaar, bottom up, cathedral, competition, olpc, top down