Y U No Kickstart?

The presale, best typified by Kickstarter, has become a powerful tool for hardware companies to sample market demand and fund initial manufacturing. It’s not the endless beta-test that software developers have, but it moves in that direction.

Presales are not perfect:  a successful sale is not necessarily evidence of product-market fit.   Witness Ouya, which had one of the most successful campaigns ever, shipped product as promised, but then failed to create a library of compelling games. In this case, users bought into a vision that turned out to be much harder to realize than expected (namely, enabling a vibrant, non-proprietary, micro-console game development ecosystem).
In other cases, a presale may find the hard-core early adopters, but may not represent the broader market.  Kickstarter is littered with small but successful products that never transitioned to the mainstream.
However, a presale failure can be quite telling: if you can’t find (say) a few hundred or thousand buyers among those bleeding-edge adopters, how will you succeed in the mainstream market?
Which leads to a very reasonable investor question:  if your hardware startup has no presale plan, why not?  There may be some good reasons, but that’s become the exception, not the rule.  After all, a presale yields valuable insights, early in the product cycle, for a relatively low amount of work (and work you’re mostly going to have to do anyway).  The process forces a lot of good MVP hygiene:  entrepreneurs have to describe the value clearly, converge on features & design, and understand pricing & margins.
It’s certainly possible that Kickstarter is the strategy fad of the decade, much like India offshore development was 10+ years ago.  But I don’t think so:  the hardware presale is here to stay!

Prototypes As Sales Tools

I’m continually surprised by hardware startups that meet with potential investors, advisors, or partners and don’t bring hardware to show!

If you’re making something physical and it’s transportable, bring it to your meetings.

If you don’t have something to show, consider spending some time on a prototype that you can demonstrate.  A “works-like” or “looks-like” prototype (or both) will go a long way toward conveying your vision.

The Attention Gatekeepers

I’m seeing interesting cases where Facebook, Google or other gorillas tweak content presentation, and then some other company’s business is directly impacted.  The gorillas have increasing control (and power) over user attention.

For example, Demand Media took a huge hit after Google’s search engine updates lowered rankings for low-quality content.  (DMD is down 80%).

Zynga enjoyed spectacular early success with some of the first social games on Facebook.  But as game updates clogged feeds, Facebook made presentation changes to improve feed quality, and Zynga suffered.  (ZNGA is down 60%).  Facebook continues to tweak how feed content is selected and presented.

More recently, Google added a “Promotions” tab to Gmail, moving most marketing emails out of the main inbox view.  That directly affected Groupon, which is trying to rely less on the daily coupon.  (GRPN is down 20% so far).

And now, Gmail has added a prominent “unsubscribe” link at the top of promotional emails, which will impact email marketing performance even more.  (Unsubscribe is one of many “quick action” buttons that Gmail has been rolling out.)
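
Mechanically, that link is driven by the standard List-Unsubscribe mail header (RFC 2369), which Gmail surfaces when senders include it.  Here’s a minimal sketch of a promotional message carrying the header (the addresses and URL are made-up placeholders):

    # Minimal sketch: a promotional email carrying the List-Unsubscribe
    # header (RFC 2369).  Gmail builds its unsubscribe link from this
    # header when present.  Addresses/URL below are placeholders.
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "deals@example.com"
    msg["To"] = "subscriber@example.com"
    msg["Subject"] = "This week's offers"
    # One or more unsubscribe methods, each in angle brackets:
    msg["List-Unsubscribe"] = "<mailto:unsub@example.com>, <https://example.com/unsub>"
    msg.set_content("Offer details here...")

    print(msg.as_string())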

It’s a never-ending battle between those who want user attention, and those who manage it.

Why Aren’t ISPs Surfing Moore’s Law?

Back in the mid-90s, I bought a new family PC for $3300. It had a few MB of RAM and a 386 processor.  In today’s dollars, it cost nearly $5000 — try spending that much on a PC today!

Recently, I built a machine with 32GB of RAM, a fast 4-core Intel processor, and 12TB of disk (raw size).  It is several thousand times larger and faster in nearly every dimension than that old family PC, at less than half the cost.

Moore’s Law has settled in, and we now expect our technology to get dramatically faster, more capable, and cheaper over time. Flip phones are gone, and we’re carrying around personal supercomputers.  My phone’s built-in cameras (plural!) have better performance than my first digital cameras.  Ethernet went from 10 mbits to 100, and now a 5-port gigabit switch costs $18.  Compare the current model iPhone to the original, and a $500 TV to the same-priced model a few years ago.

Given this, why isn’t our Internet bandwidth keeping up?  Verizon just notified me that my monthly rate is going up $10 (with no speed increase).  As I wrote last week, Netflix has reported a speed drop in some cases and is now trying to figure out its relationship with ISPs.  Akamai reports that US speeds have stopped increasing in some cases, and are increasing more slowly in many other cases.  Projecting my PC experience, my Internet connection should now be a gigabit and cost $50/month.  Why isn’t it?
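
For a back-of-the-envelope version of that projection, assume a 1.5 mbit DSL line for $50/month circa 2002, and let price-performance double every 18 months (roughly the rate my PC experience suggests).  The starting numbers are illustrative assumptions, but the gap is the point:

    # Back-of-the-envelope: what $50/month "should" buy if bandwidth
    # price-performance doubled every 18 months, Moore's-Law style.
    # Starting point is an assumption: ~1.5 mbit DSL for $50/month in 2002.
    start_year, start_mbits = 2002, 1.5
    doubling_years = 1.5

    for year in (2002, 2008, 2014):
        doublings = (year - start_year) / doubling_years
        mbits = start_mbits * 2 ** doublings
        print(f"{year}: ~{mbits:,.0f} mbit/s for $50/month")

    # 2014 comes out around 384 mbit/s -- hundreds of megabits, heading
    # toward a gigabit.  Actual offerings are an order of magnitude behind.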

ISPs argue that networks are incredibly expensive to build.  That’s true: ISPs have spent tens of billions building out fiber networks, and governments have offered significant incentives.

But fiber is special:  unlike copper circuits, fiber bandwidth is usually limited by the endpoint technology.  Where DSL is often running as fast as the copper can stand, fiber links have much, much higher potential bandwidth.  The price-performance of endpoint gear (subscriber terminals and core routers) improves at rates closer to Moore’s Law.  For example, Verizon’s own FiOS terminals have moved from 622 mbits to 2.4 gbit link speeds since they first rolled out the service.

For the fiber now in the ground and on the poles, why isn’t bandwidth price-performance improving more quickly?

There’s an Internet Showdown Brewing

Back home in West Virginia, our Verizon phones have no 3G service. There’s no fundamental technical issue; Verizon and US Cellular just won’t enter a 3G roaming agreement.

This scenario captures a core net neutrality concern: are we moving to an Internet where our access is determined more by business agendas and less by technical issues?

Recently, streaming video demand has been forcing this issue. Netflix’s traffic has been growing, and measured throughput has dropped for some major ISPs (e.g. down 14% for Verizon in one month). Verizon is seeking payment to carry Netflix’s traffic, and Craig Silliman (Verizon’s head of public policy and government affairs) has said that Verizon’s policy is to require payment from networks that send more data than they carry in return. “When one party’s getting all the benefit and the other’s carrying all the cost, issues will arise.”

This is going to get more interesting.

Clearly, ISPs are maneuvering to “double dip”: subscribers pay for access to content, and now ISPs want content providers to also pay for access to subscribers. That’s not bad business if you can get it, but it makes you wonder if the ISPs are merely leveraging their powerful position. After all, Netflix is sending data to Verizon’s network because a paying Verizon subscriber asked for it!

The problem is that ISPs sell (effectively) “unlimited data”, but their networks have nowhere near the capacity needed for every subscriber to use their full bandwidth at once. The ISPs bet that, on average, subscribers use only a small fraction of the bandwidth they’re sold. In the past, this model has worked well: legacy telephone and cable networks have relatively stable demand patterns.
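
A toy oversubscription model makes the bet concrete (every number here is invented for illustration):

    # Toy oversubscription model -- all numbers invented for illustration.
    # ISPs provision far less shared capacity than the sum of what they
    # sell, betting that average use stays a small fraction of rated speed.
    subscribers = 10_000
    sold_mbits = 50          # rate sold to each subscriber
    uplink_gbits = 10        # shared capacity actually provisioned

    sold_gbits = subscribers * sold_mbits / 1000
    print(f"Sold {sold_gbits:.0f} gbit/s against a {uplink_gbits} gbit/s "
          f"uplink ({sold_gbits / uplink_gbits:.0f}:1 oversubscribed)")

    # The bet holds while average use is low, and breaks as it grows:
    for avg_use in (0.02, 0.10, 0.25):
        demand = sold_gbits * avg_use
        status = "fine" if demand <= uplink_gbits else "saturated"
        print(f"average use {avg_use:.0%}: {demand:.0f} gbit/s -> {status}")

At 2% average use the network holds up; push averages toward 10% with streaming video, and the same network saturates.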

Now, for the first time, these providers are surfing Moore’s Law and Metcalfe’s Law. Advances in computation and network performance (not to mention billions of people on-line) are driving exponential demand for bandwidth, as well as an expectation that services will be better, faster, and cheaper over time.

Worse, ISPs have been famously unimaginative about future applications and bandwidth demand (e.g. comments by Time Warner’s CFO that customers don’t really want gigabit speeds). Your next TV will likely be 4K with IP-delivered video. Don’t forget video conferencing: HD-quality cameras are cheap and 4K cameras are a few hundred dollars. At some point, we’ll be able to look around a remote location with VR goggles.

Even worse, most ISPs are fundamentally conflicted: IP-streaming video competes with their own proprietary video offerings. Should they be allowed to slow down and tax these new competitors? And if they charge content providers like Netflix, can they discriminate, or must they offer identical terms to any content provider?

As I said, it’s going to be interesting.

Boy, Was I Wrong About Dropbox

A few years ago, I was bearish on Dropbox.  I thought they would be an OS feature in time and that turning down a (rumored) $800m from Apple was a bad move. On Quora, I wrote:

I think “slow fade” is another probable outcome.

I’m reminded of FTP Software, which went like gang-busters selling a TCP/IP stack for Windows. Their revenues fell very quickly after Microsoft started shipping TCP/IP as part of the OS.  (The same thing happened with all of the disk-compression companies).

Wow, was I off base!  I think it’s good to evaluate big entrepreneurial and investment misses, and here I missed several things (at least).

First, a great product goes a long way.  Dropbox has absolutely nailed the product design and user experience in virtually every aspect.

Second, the service is inherently viral.  I routinely create new Dropbox users by sharing files with them.

Third, they covered every platform equally well.  iCloud works well on OS X, and OneDrive is great on Windows, but Dropbox surfed the whitespace between all of the platforms.  They did an excellent job on everything:  the iPad version isn’t just the iPhone version running at 2X, and they even support Linux.

I can’t wait to see their IPO.

Email Will Be Mostly Mobile

When the Blackberry first came out, it was quickly dubbed the “crackberry” because mobile email access was so addictive.  Now with ubiquitous smartphones, we’re all email addicts to some extent.

So it’s no surprise that over 40% of email opens are now on mobile devices, and mobile is on track to overtake the PC this year.  It’s pretty amazing when you consider that the smartphone, as we know it, was launched less than 6 years ago.

Mobile and Web are blurring together, slowly ceasing to be distinct “things” (I’ve written before about a mobile strategy for Web sites).  This trend suggests some best practices for emails:

  • Format emails for mobile.  This is basic stuff that a lot of designers seem to mess up.  Make sure emails open and render well on mobile devices.
  • Mobile-optimize email click-through landing pages and flows.  If users are reading emails on mobile devices, they’re also clicking through links on mobile devices.  Check your Web usage stats:  you might find that a significant percentage of your site usage by mobile users is coming from email click-through paths.  Nothing kills the user experience like a landing page that hasn’t been mobile-formatted.

Why Spectrum Auctions Are a Bad Idea

Unless you’re still on a flip phone, it’s hard to miss the demand for mobile wireless bandwidth. The FCC is under intense pressure to make more spectrum (frequencies) available for data services, repurposing spectrum from underused and obsolete applications (e.g. old UHF TV channels).

As you might imagine, an exclusive FCC license can have significant commercial value.  Given this, the primary method of making wireless bandwidth available (as directed by Congress) is to auction it off.

On the surface, this seems like a reasonable approach.  Companies shouldn’t get a government “free lunch”, and we can certainly use the cash ($60 billion to date).  Companies can’t mine Federal land without paying, and the patent system shows how exclusivity incents commercial investment.  Also, a market-based system sounds appealing.

But if our goal is driving innovation and meeting growing bandwidth needs, it’s time to consider that the policy (as the primary way to allocate bandwidth) is seriously flawed.

Unlike oil drilling, spectrum is not a commodity:  a gigabyte transferred today doesn’t mean there’s less spectrum tomorrow.  And license exclusivity is not like patents:  the wireless bidders are not providing a documented technological advance.  Spectrum is a public right-of-way:  what if your local government auctioned off public roads to the highest bidder?  (To be clear:  I’m not suggesting government wireless infrastructure.)

The real problem is that we’re stuck in the translation trap that often happens when we attempt to treat intangible licenses as physical property.  The failure is becoming clearer:  much auctioned spectrum remains underused.  Winners generally have little obligation to actually do anything, and technology advances make it notoriously difficult to estimate future value and bid accurately.  Licenses run for very long periods, not matched to the rapid pace of innovation.

As a result, licenses become expensive trading cards for large wireless companies, with lawyers and regulators involved in every exchange.  Witness the arguing and posturing among Sprint, Clearwire, and all the other wireless companies.  (TL;DR:  Clearwire’s WiMAX business hasn’t gone so well, and Sprint wants to buy them for the spectrum value.)  Spectrum ends up stuck in a slow-moving, heavy-friction “market”, without being efficiently deployed.

We need a policy that removes friction, by making more unlicensed (or lightly licensed) spectrum available.  The unlicensed bands are a source of significant innovation, starting with CB Radio, and continuing with cordless phones, the Family Radio Service, Wifi, and Bluetooth.   Where else can you buy a 150mbit radio for under $3?  We all switch our phones over to Wifi if it’s available (often provided by a $50 access point).

Our current spectrum policy is a vestige of the old “walled garden” mobile market, where the wireless carriers had exclusive control of the mobile device.  We need a policy that’s aligned with the app-store world, with more spectrum available to innovators that don’t have lawyers and billions of dollars.

This is Not Your Father’s Software Industry

The software industry has seen major changes in the past 10 years, as the business of software has gotten increasingly efficient and friction-free.  Expensive software stacks, primitive tools, million dollar server farms, and 50+ person development teams have given way to free, open source, high-quality tools, small teams, and rentable infrastructure.  There are more skilled people creating software than ever before, and the market provides ways for the best talent to find opportunity well above an annual salary.  And just when you think it couldn’t get any easier to create software, it does.

As friction goes away, things become much more fine-grained.  You don’t need $5m anymore to start a company:  a laptop and a cafe wifi connection will do.  This enables an explosion of new projects, but with smaller teams and narrower ideas.  The industry gorilla platforms fuel a “feature ecosystem”:  are those icons on your phone “apps” or “features”?  In headcount terms:  a thousand 100-person software teams might now be 30,000 3-person teams.  Software is no longer a sport of kings.

This effect, in turn, is flattening the industry.  Most projects now start on nearly identical footing, often with many competitors or near-competitors.  It’s like starting a civilization in a desert vs. the mountains:  a desert has far fewer strategic passes and valleys from which to control and extract disproportionate value from the surrounding areas.  It’s a maddening conundrum for entrepreneurs and investors:  we’re all toting personal supercomputers, the world is bathed in wireless access, and there are millions & millions of mobile apps and Web sites.  But why does it feel harder than ever to create a $1b software company?  This is why.

Does this mean software’s dead?  Not at all, not even close.  When Marc Andreessen said “software is eating the world”, he got it exactly right.  Software & computation are fueling a level of innovation, disruption, and advancement never seen before.  But the way software companies extract value is evolving.  In the beginning, software was sold as a product;  then, rented as a service.  Now, many companies use software to enable other services and business models.

However, for the reasons outlined above, companies who are “just software” will have a much harder time achieving scale.  The real opportunities are in the next phase:  embedded software.  This might be software literally embedded in hardware, or cases where software value is embedded in (and enabling) some other business.  For example, Amazon is on their way to being the world’s largest retailer, and is the largest software company that doesn’t sell any software.  Uber is building the world’s largest virtual taxi fleet, and Airbnb has built the world’s largest vacation rental network.

My bet is that the next wave of disruptive software companies will look more like these examples, and less like Oracle, Microsoft, Facebook, or Salesforce.com.

This is not your father’s software business any more.

Game Consoles: The Last Remaining Walled Garden

The reddit user kmesithax wrote a brilliant comment yesterday about the realities of game console development, describing the tools and costs:

Well, no, there is no OpenGL or any graphics API for that matter, it’s all some stupid low-level hardware API that you have to tickle to get any 3D rendering to work.

and

So let’s say you get over your initial API shock, you have a decent handle on what all the little libraries do, and you wanna buy some development hardware now. Well, uh, okay. That’ll be anywhere from $2,600 (leaked 3DS devkit figures) to $10,000 or more (leaked Xbox 360/PS3 devkit figures).

This reminds me exactly of the pre-iPhone “walled garden” mobile app world, when you needed ~$10,000 for a development license for Qualcomm’s “BREW”.  The original article “The Minecraft Test” (e.g. could your platform spawn the next Minecraft?) is a fabulous way to think about platform openness.  (Also see Nate Brown’s post “Stupid, Stupid Xbox!!” for an insider critique.)

The console platforms have completely missed the market transition to open, low-friction developer on-ramps, and it’s no surprise the console market is now anemic.  In contrast, the new OUYA console (I have one on pre-order) has a fledgling but very open SDK and just had a “game jam”.  The OUYA is under-powered relative to current consoles, but I bet the openness will more than make up for that issue.