
Tuesday, June 29, 2010

Social media uselessness

The observant among you will notice that I haven't deleted my Twitter account yet, although I haven't replaced the mobile client on my phone.

I've discovered a couple of use cases that justify its continued existence for now:

- Highlighting new blog posts, although it doesn't seem to have generated a meaningful traffic uplift yet. The jury is still out on this one.
- Something to do during slow conference presentations. It makes a change from checking Facebook or my email.
- Being rude about other lousy new social media services that don't warrant the effort of a full blog post.

For example, in less than 140 characters: "Foursquare: useless except to know where to avoid meeting social-media bores, the last group anyone sane would want to socialise with". The whole thing reminds me of the famous Groucho Marx quote "I wouldn't join any club that would have me as a member".

Gowalla, on the other hand, is *so* bad that it possibly merits more than 140 characters, just to give it the deserved dose of vitriol. I initially chose to install it rather than Foursquare on the basis that most of the social-network drones had ignored it, which rather suggested it was cooler. And the stylised kangaroo icon was kinda cute.

Wrong.

After checking in at the Post Office on Baker St and being awarded a virtual pint of beer, I've been scratching my head about why dumping the contents of the Yellow Pages into an app and getting people to tick off the boxes is of any use to anyone. I also "checked in" at the statue of George V opposite the Houses of Parliament this evening, and added the new "spot" of the Institute of Economic Affairs. Wow. My life is now complete.

It's not even very good at suggesting places. If I'm in need of a nearby hostelry for a beer, I'm much better off going to the Google Maps app, typing "pub" into the search box and waiting for the hail of marker pins to rain down in a circle around my location. Gowalla has a random and incomplete selection.

Looking at the app while sitting at home, I'm now suddenly aware that I have an urgent need to visit the International Cheese Centre, which is apparently only 300m away. Or the tube station.

And it crashes or hangs half the time as well.

Seriously, it's just another flawed way of doing the proven-useless task of local mobile search, with prettier icons. Yet another in a long line of things that are a bit like one of Google's functions, but don't work.

The notion that this type of thing represents the future of mass-market mobility is a total joke. If you're about to spend lots of money on a social media consultant, give me a call instead. I'll help you check in at the Emperor's palace and confirm that he's wearing no clothes. And I'll most likely be able to suggest something else that helps you engage with your customers and make more cash instead.

Friday, June 25, 2010

Mobile operators' future voice strategies decoded

Apologies in advance, but this blog post is deliberately a bit of a tease.

I'm not going to spell out the answer here, as it's too valuable to open-source at this stage.

You may remember that a week or so ago, one of my consultancy & analysis peers released a study suggesting that some carriers might exit voice altogether.

Yes, I think that's an option too. But it's not an either-or, IMS/VoLTE or nothing situation.

I have identified at least six or seven different scenarios. In my view, the defining feature of mobile voice in 2020 is that it will be *much* more heterogeneous than it is today. We will definitely not all have the same "telephony" experience - or maybe even the same lowest common denominator.

Some people will have a version of today's voice - good coverage, pretty cast-iron reliability, excellent "reachability", quite expensive (especially roaming) and anchored in your existing operator's core network and billing plans.

But that will be only part of the environment. Others will be using VoIP of various types and qualities - some of which will be better than "traditional" circuit telephony. There will probably be a role for IMS in some operators, and Skype or Google in others, perhaps VoLGA in still more. Numbering and termination regimes will get more complex, not simpler. It certainly wouldn't surprise me to see some operators end up as so-called over-the-top providers of voice on other networks.

Luckily, the population is getting much better at multi-tasking and segmenting its communications experience: multiple IM addresses, logins, devices, numbers and other identifiers.

Despite the industry bodies crying wolf about the dangers of losing ubiquity, this is not a "tragedy of the commons" scenario. Fragmentation will add value and consumer utility, not reduce it.

As always, divergence will be more important than convergence. Multiplicity will rule, not unification.

Gateway vendors will do better than many of those dependent on integrated end-to-end standardised silos - although there will still be a few behemoths able to control particular value chains or geographic markets.

I'm not going to give away the full set of scenarios I see evolving in this blog post - or the impact on the existing mobile voice and VoIP communities, device vendors, messaging or social network providers and assorted others. In many ways, there are still plenty of variables anyway, especially regulatory ones.

I realise that many of my peers and rivals read this blog - and this is one area where I think I have a significantly different viewpoint from most of them. I also have much broader coverage - 2.0, 4G, spectrum, devices & OS's, operator business models, APIs, silicon, network policy & architecture, enterprise, cloud services - which is essential to pulling the pieces of the puzzle together in a meaningful way.

It means that I can consider factors like SMS, lawful intercept, indoor coverage, prepay, battery life, web/cloud voice integration, CEBP, HD voice, video, test & measurement, EU roaming & wholesale rules, Apple & Android, IP-PBXs, QoS, new devices...

In coming months I will be talking through my views about The Future of Mobile Voice with many of you in briefings, or at conferences. I will also be publishing some of the analysis in future research documents.

But if you want to get a heads-up in advance, or a more customised viewpoint, please contact me about ways Disruptive Analysis can help you move forward and position your company.

information AT disruptive-analysis dot com

Thursday, June 24, 2010

The risks of using someone else's numbers & forecasts

I've been at the Avren Femtocell conference in London today, and the Femto Forum awards dinner last night.

One thing that struck me is that I've seen the same set of Cisco VNI data traffic forecasts used by almost every presenter. I'm sure you've seen them too - 59x mobile data traffic growth 2009-2014, lots of it video. Exabytes, petabytes etc. About 5% of all Internet traffic going via mobile access and so forth.

Yes, it's good stuff. Really detailed, lots of interesting insight - there's a lot of work that's gone into it. I was on an analyst conference call with the VNI team recently and was impressed. But what rather worries me is that lots of people seem to be using it as a central part of their marketing, without appearing to understand what's in it, or how the forecasts have been constructed.

They certainly don't appear to have spoken to any of the VNI project's authors, understood the methodology or assumptions, or even to have bothered reading the fine detail. I upbraided one speaker today for citing the statistic that 66% of mobile data traffic is video as a justification for needing more QoS, referencing things like videotelephony.

Yet the table in Cisco's data which describes this (about 3/4 of the way down this page) splits out the 2.3m TB/month of mobile video. 1.4m TB/month of that is PC and tablet-based Internet video - basically YouTube, iPlayer and the like being watched over a USB modem or 3G module. The sort of application that can be expected to be buffered, rate-adaptive and so forth. The proportion likely to be QoS-based is tiny. Conversely, video communications (Apple FaceTime, Skype Video, the roughly 17 people using IMS video-sharing and so on) account for much less.
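To put some arithmetic behind that, here's a quick back-of-envelope sketch using the figures quoted above. The numbers are my reading of the published table, so treat them as illustrative rather than authoritative:

```python
# Back-of-envelope check on the video-traffic split quoted above. The figures
# are my reading of the published VNI table for 2014; treat them as illustrative.

total_mobile_data_tb = 3.5e6   # ~3.5 EB/month of mobile data forecast for 2014
mobile_video_tb      = 2.3e6   # ~2.3m TB/month of that classed as "video"
pc_tablet_video_tb   = 1.4e6   # ~1.4m TB/month is buffered PC/tablet Internet video

print(f"Video as a share of all mobile data:   {mobile_video_tb / total_mobile_data_tb:.0%}")
print(f"Buffered PC/tablet video alone:        {pc_tablet_video_tb / total_mobile_data_tb:.0%}")
print(f"Buffered share of the 'video' bucket:  {pc_tablet_video_tb / mobile_video_tb:.0%}")
```

In other words, well over half of the headline "66% is video" figure is buffered, rate-adaptive streaming - not the sort of traffic that justifies conversational-grade QoS.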

I sincerely hope that strategy and product planning folk at vendors and operators are being a bit more rigorous about finding *relevant* and specific addressable statistics for their own products instead of convenient data "soundbites" available for free on the web.

But given the number of pitches I hear using the same data deluge/wave/tsunami/apocalypse language to justify their femto or wifi offload, or policy-management products' value proposition, I'm starting to get worried.

Especially as none of them bothers to put up a chart of signalling traffic as well, which, if anything, causes more problems for operator networks than the bulk of data downloaded. (NSN is the main exception I can remember, in that it shows signalling growth trends - and I think it uses its own data forecasts rather than relying on Cisco's.)

Another sobering example of mistakenly trusting web numbers is Amdocs, which is obsessed with the magical word "trillion" as a means of drawing attention to its (worthy) solutions for service providers - personalisation, billing, operational efficiency, business models and so on.

Last October, they were saying "By 2015, if not sooner, experts project that the world will include a trillion networked devices providing consumers with a dizzying array of applications, services, connections, and experiences......... The business benefits could be enormous: managing (and charging for) customer access across a trillion interconnected devices, services, and locations; generating sizable new revenue streams from infinitely larger personal and community networks; and becoming the long-term trusted partner of millions of customers around the globe."

At the time, I raised an eyebrow and challenged their executives to a spread-bet with me, at say 0.000001 cent per networked device at 31st December 2015. I'm a seller at 500 billion.
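For anyone unfamiliar with how such a spread-bet would settle, here's a rough sketch of the payout arithmetic, using my illustrative terms of 0.000001 cents per device and a "sell" level of 500 billion:

```python
# Illustrative settlement of the hypothetical spread-bet: selling at 500 billion
# networked devices, staked at 0.000001 cents per device. Terms are mine, not Amdocs'.

SELL_LEVEL_DEVICES = 500e9          # my quoted "sell" level for end-2015
STAKE_CENTS_PER_DEVICE = 0.000001   # stake per device, in cents

def seller_settlement_usd(actual_devices: float) -> float:
    """Positive = the seller (me) collects; negative = the seller pays out."""
    return (SELL_LEVEL_DEVICES - actual_devices) * STAKE_CENTS_PER_DEVICE / 100.0

for devices in (15e9, 500e9, 1e12):
    print(f"{devices:,.0f} devices -> seller settles at ${seller_settlement_usd(devices):,.0f}")
```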

Now, they've upped their game in a recent white paper. "Within the next decade, the convergence of connected devices, predicted to reach 7 trillion by 2017 (World Wireless Research Forum, Technologies for the Wireless Future: Wireless World Research Forum, Vol.3, November 2008), and customer demand for ubiquitous connectivity, and pervasive digital content and applications will give rise to the connected world."

7 trillion by 2017. 1000 devices per person. 3000 devices per person living in developed economies.
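The arithmetic behind those per-person figures is simple division. A quick sketch (the population figures are my rough assumptions, not anything from the WWRF source):

```python
# Sanity check on the "7 trillion devices by 2017" figure quoted by Amdocs.
# Population numbers below are rough assumptions of mine, purely for illustration.

devices_forecast = 7e12        # WWRF figure as quoted
world_population = 7e9         # roughly 7 billion people

devices_per_person = devices_forecast / world_population
implied_developed_population = devices_forecast / 3000   # if "3000 per person" held

print(f"Devices per person worldwide:               {devices_per_person:,.0f}")
print(f"Implied 'developed economies' population:   {implied_developed_population / 1e9:.1f} billion")
```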

Time for a common-sense check. I can envisage a long, long term future with lots of RFID tags on products. Maybe massive distributed sensor networks, like the military's "smart dust" concepts. Even if it occurs, I'm pretty sure only the tiniest fraction will be of any relevance to Amdocs or its telecoms customers. When my car's indicators are controlled via wireless to the dashboard, to save on the weight of a wiring loom, I'm pretty sure I won't be being billed per corner by my mobile operator.

Not only that, but the heritage of that number is fascinating. It seems to have first surfaced in a book published by Wiley, and edited by Prof Tafazolli of the Wireless World Research Forum in 2004. It's mentioned in a 2005 presentation here on slide 26. It may well have been a worthy piece of work (unfortunately I don't have the underlying documentation) - and it seems to have been re-reported in the 2nd and 3rd editions of the same book.

But 10 minutes trying to track it down in detail yielded this other document from 2009 by WWRF, which contains the footnote "Originally, the vision was developed for the year 2017. However, work within WWRF has extended the time period to 2020". In other words, even the forecast's original authors now no longer believe it's true and have pushed it out three years.

Personally, I'm pretty skeptical of 2020 as a target for 7 trillion too, but I don't have sight of their full methodology, definitions or assumptions so I'll temper my criticism.

Come on Amdocs, drop the trillion nonsense. It's not helping your message. If anything, it does the opposite, wrapping something important and opportunity-laden in a veneer of irrelevant, non-credible big numbers just for the sake of it.

Or of course, you can put your money where your mouth is, and take me on at my bet.

Now to be fair, this is all the "science" of marketing. But to me, it illustrates the dangerous ways that poorly-understood numbers get recycled, without people sanity-checking them, or thinking through the real implications if they're wrong.

I've got similar doubts about the oft-repeated line "the next billion Internet users will first get access on a mobile phone". I think that's fabricated rubbish - perhaps the largest myth in the industry. I'm not utterly convinced by assertions that 85%+ of mobile data will be used indoors in the home/office by 2013 either - although the vagueness of whether "mobile data" includes WiFi access to fixed broadband would probably let it scrape past. A lot of hysteria about mobile apps is based on irrelevant numbers too, as Tomi Ahonen dissects with forensic detail here.

Bottom line - be very, very, wary of people citing statistics when they don't know the original source and haven't made efforts to understand the definitions, methodology and assumptions.

Sales and marketing messages are *much* more credible when they're based on real, specifically-addressable figures - ideally developed on a custom basis to illustrate the point they're being used to make. Yes, it's always easy to poke holes in forecasts - including my own - but that's no excuse for skimping on validation of the source (or even just smell-checking for plausibility) wherever possible.

Monday, June 21, 2010

If mobile data billing takes too long to change, device-based intelligence will fill the gap

My recent posts have discussed the issue of femto-originated data traffic being charged against the same caps and policies as macro traffic. I think that this is likely to be seen negatively by many consumers - it is fundamentally unjustifiable, as the cost-bases are so different.

It reminds me of one of the problems with many implementations of earlier voice FMC services such as UMA or VCC - where a mobile call is terminated via WiFi and the user's own fixed broadband rather than the inbound operator's cellular infrastructure, the justification for keeping the call-termination wholesale fees the same diminishes. In that case, the customer (and the originating operator) is paying not just for a network that isn't being used, but for spectrum as well.

The problem is that it's very difficult for billing and rating engines to take account of cost differentials involved in a given call or data stream - not least because most operators don't really have neat and accurate per-min or per-MB costs, and it can vary according to lots of specific circumstances. The cost of connecting to a voicemail server is different to connecting to a phone; the cost of a call in a rural area is different to a city centre and so on.

So operators just take a broad average, reflected in the retail and wholesale prices and hope that people don't nit-pick too much - at least not so much that the regulator gets involved.

While I think that's fair enough - we're never going to get precise cost-based pricing even if we want it - it doesn't excuse the most *egregious* examples of inequity. Counting femto traffic (versus WiFi) against the mobile data cap does seem pretty egregious, because there's an immediate A-to-B comparison available to the end user.

The industry spends a huge amount of time telling us that "customers don't care what technology they use, they just want to get connected". Which is fine, but only if there are no glaring differences in price and performance.

This comes back to my assertion that there is a need for holistic approaches to mobile data traffic management - combining all the various bits of the puzzle to offload / charge / optimise mobile traffic so that this type of head-smacking event and potential PR/loyalty disaster doesn't occur.

Although we still haven't had a detailed comment from AT&T about its billing system (Vodafone does the same thing), it is not uncommon for major changes to take a considerable time to implement. That's especially true in cases, such as here, where AT&T was also moving its whole system from flat-rate charging to tiered-and-capped. It may well have been aware of the femto vs. cap unfairness, but decided that it would take too long, or involve too much risk, to address that at the same time as the more general switch-over in policy and charging.

This situation is likely to be repeated throughout the data traffic management arena, in various scenarios. It is *theoretically* possible to hook various bits of the policy/charging/optimisation infrastructure up to the radio network, or other bits of the carrier's machinery. But it's not easy, it's not cheap, and it's not quick.

That leads me to think that there will have to be some short-term workarounds that are easier to implement. Top of my list is smart connection-management clients for smartphones and laptops, which have awareness of tariffs, data consumption and different access methods. "Are you sure you want to connect via the 3G femtocell? I can easily register with this nearby WiFi if you want free connectivity".
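As a thought experiment, the decision logic inside such a client needn't be complicated. Here's a minimal sketch (hypothetical bearer names, tariffs and thresholds, not any vendor's actual product) of a tariff-aware choice between femto, WiFi and macro access:

```python
# Minimal sketch of a tariff-aware connection manager. All bearer names, tariff
# assumptions and thresholds are hypothetical, purely to illustrate the idea.

from dataclasses import dataclass

@dataclass
class AccessOption:
    name: str
    available: bool
    counts_against_cap: bool        # does traffic on this bearer eat the monthly quota?
    est_throughput_mbps: float

def choose_access(options: list[AccessOption], quota_remaining_mb: float,
                  low_quota_threshold_mb: float = 200.0) -> AccessOption:
    """Prefer uncapped bearers when the monthly quota is running low;
    otherwise simply pick the fastest available bearer."""
    usable = [o for o in options if o.available]
    if quota_remaining_mb < low_quota_threshold_mb:
        free = [o for o in usable if not o.counts_against_cap]
        if free:
            return max(free, key=lambda o: o.est_throughput_mbps)
    return max(usable, key=lambda o: o.est_throughput_mbps)

bearers = [
    AccessOption("home WiFi",    available=True, counts_against_cap=False, est_throughput_mbps=6.0),
    AccessOption("3G femtocell", available=True, counts_against_cap=True,  est_throughput_mbps=5.0),
    AccessOption("3G macro",     available=True, counts_against_cap=True,  est_throughput_mbps=1.5),
]

print(choose_access(bearers, quota_remaining_mb=150).name)   # -> "home WiFi"
```

The point is simply that the device already knows the tariff, the quota and the alternatives, even when the operator's billing system doesn't.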

It will be interesting to see if the slowness of creating sensible end-to-end traffic management and billing principles in the OSS/BSS leads to device intelligence taking the lead again. I'm tracking a number of infrastructure vendors who are looking more closely at the role of the device as a key actor in determining mobile data routing, and I suspect their collective work will pay off.

For more detail on the technologies available for holistic mobile broadband traffic management (including device connection managers), please consider purchasing my new study, from $350.
NEW Mobile Broadband Traffic Management Paper

Sunday, June 20, 2010

Putting a value on customer data with reference to offload

There continues to be a ferocious discussion about AT&T's new data plans - and also, as I commented a couple of weeks ago, about the fact that its femtocell traffic counts against a user's monthly data cap.

Broadband Reports gives airtime to AT&T's rather unconvincing explanation:

"3G MicroCell is primarily intended to enhance the voice call quality experience in your home," AT&T's Seth Bloom tells us. "While it can carry mobile data traffic, that’s not the primary solution it provides," he says. "Wi-Fi is the optimal solution for home mobile data use. We encourage people to take advantage of Wi-Fi capabilities - that’s why all of our smartphones include Wi-Fi radios, and usage on Wi-Fi doesn’t count against your mobile data usage bucket."

Firstly, I rather pity Cisco, AT&T's femtocell supplier, having its carefully-engineered data offload capabilities talked down by its own client. I suspect it hadn't expected to be relegated to providing voice-only coverage.

But the interesting thing here is that, unwittingly, AT&T is putting a value on collecting customer data. We all keep hearing that "subscriber data management" is a big deal, and that operators urgently need to find ways to monetise their customers' "social graphs" and aggregated web access records and behaviours.

But by telling its customers to prefer WiFi to femto for data traffic, it is sending out the message "Actually, customer data isn't really worth that much after all. They might as well connect directly to the Internet via WiFi - we're not interested in collating that usage for data-mining anyway".

Either that, or AT&T's core network is weirdly expensive, and they'd much rather have all the traffic bypass it and go straight to the web by the most direct route.

Again from Broadband Reports: Bloom goes on to insist that the Microcell "uses our core wireless network just like a call placed while driving down the highway uses the core wireless network." "The only difference is how that data or call gets there – via a MicroCell connected to a wired broadband connection instead of a cell tower."

Which is funny, because I always thought that the majority of an operator's cost base was the cell tower and radio network, rather than the core. And therefore "the only difference" for femto vs. macro is really just a cost-saving.

My gut feeling is that AT&T has either not thought this through - or has major problems in changing its charging / billing structure to accommodate femto vs. macro traffic differences. Perhaps it thought nobody would care.

A counterpoint to femtocells - are they really necessary?

I've just read Andy Abramson's long and impassioned blog post about the pointlessness of femtos.

As he points out, I'm due to meet him for a few drinks later on this afternoon, so I'm sure we'll drill down a bit more. In part, I agree with him - WiFi is much better for *certain* applications and use cases.

However, I'll make a few comments upfront.

Firstly, WiFi can be a royal pain to set up, especially on smartphones - and *particularly* where it involves a third-party login. It's also difficult to monitor and control - and expensive to support if someone calls in for assistance. For this reason, many mobile carriers are wary of trusting WiFi as a suitable mechanism for either coverage extension or capacity offload.

As an example: while I've got my iPhone set up for my home WiFi's security, I hadn't provisioned it for BT Openzone - which (in theory) I should get free access to, courtesy of Vodafone - which would benefit from the offload and coverage extension when I'm in places like Starbucks. Result - 30 mins of frustration trying to use the 3G connection & browser to log into Vodafone's personal account and self-care page. Firstly the usual forgotten-password thing. Then hunting around the self-care system trying to find "switch on support for BT Openzone". Then Google the question, and be told "First, log into your Vodafone account". Aaargh. I've given up. Maybe I'll bother trying it from a PC at some point. Or seeing if it's driven by the Openzone splash page.

Or maybe Vodafone will just put femtos in all the UK branches of Starbucks, and I won't have to think about it.

Yes, I'm sure I can get some super WiFi log-on app thing for my iPhone which would automate it all, but frankly I can't be bothered to go looking for that either. I'll just use the 3G, even though I'm sitting in a basement on the edge of coverage.

So.... point #1 for femtocells is the pain / laziness / ignorance factor around setting up WiFi on devices.

Point #2 is the difficulty of extending operators' own in-house voice services over WiFi. Yes, it's possible with UMA or some of the earlier VCC-style SIP approaches. However, those are quite expensive and complex to deploy, test, maintain and support - not to mention having similar issues and dependencies on the connection manager as discussed above. My views on UMA have been pretty negative since 2004, so it's fair to assume that's an argument I don't need to re-hash - although at least it now (finally) supports 3G.

Now clearly, many of Andy's clients (and indeed Disruptive Analysis') are quite happy when operators cannot extend their in-house voice services, as they are providing alternatives. I'm also a fan of Skype, Truphone, fring and so on... but I also recognise that many customers still prefer their existing mobile operator's own-brand voice service, or cannot be bothered to shop around. There are also issues about porting mobile-specific numbers (especially outside the US) to VoIP providers.

So - point #2 for femtocells is provision of ordinary circuit mobile voice indoors at good quality: which is desirable for many users, and most carriers. On the other hand, as Vodafone found out this week to its cost, this only works where the user has a 3G phone and decent-quality fixed broadband. (Most femtos don't support 2G GSM - although the CDMA variants support 1x).

Point #3 is around regulatory oversight and the perception of mobile operators that they are being held to a higher standard of content/access/application control than fixed operators - especially in markets with anonymous prepay. Access via a femto takes the data traffic back to the core network, where it can have whatever policies the operator desires applied to it, in the same way as macrocellular traffic. If the fixed broadband is the operator's own, then clearly this doesn't apply as it has a policy enforcement point anyway - or if the fixed provider has enabled a "managed offload" service of the type described in my recent Broadband Business Models report.

A related opportunity, point #4 - also discussed in that report - is for femto data to be *faster* than WiFi (or prioritised), if the mobile operator cuts a deal with the fixed/cable provider. Let's say you've got an 8Mbit/s ADSL service, but the line is capable of supporting 20Mbit/s. In theory, the mobile operator could arrange for your femto to deliver *more than 8Mbit/s*, essentially turbo-charging it vs. your own WiFi, if it is prepared to pick up part of the bill, or pay the fixed-line telco a "two-sided, slice and dice" fee. That's very much a "happy pipe" strategy of the type I've been mentioning recently.

Other femto rationales:

  • Open-access femtos offloading 3G data traffic without your needing any WiFi setup or switching
  • Femtozone applications and services
  • Macro (outdoor) femtos providing extra 3G capacity in hotspot areas
  • Managed connectivity & security for home-workers from their enterprise. Yes, this is do-able via WiFi as well, but I can see this being linked to prioritisation and other corporate-grade teleworking services, such as operator-provided cloud computing resources.
But the final one is operational. One thing I've observed recently is that many radio network planners and engineers are deeply, inherently conservative about their infrastructure. While many of them would prefer to maintain the "outside-in always wins" approach, building macro base stations and towers, there is an acceptance that this might have to change in some circumstances. In those instances, femtocells give them a bit more control - and are a bit more tightly integrated into mainstream network operations and planning - than WiFi. They use licensed spectrum and standardised security, and are likely to be provided through the same vendors and integrators. Planners are much more likely to *trust* femtocells than WiFi provided via unknown parties.

Of course some operators - AT&T for example - seem to have grasped the nettle earlier than others. But do not underestimate the amount of WiFi-skepticism left in the RAN community. Femtos aren't seen as that much better - but at least in "greater femto" format (or picocells), they're closer to the real thing.

Incidentally - I'll be at the London Femtocell conference on Thursday, and the Femto Forum awards dinner on Wednesday. I'm also based 3 mins round the corner from the Landmark Hotel, so if anyone wants an off-conference meeting (or a beer) let me know.

NEW Mobile Broadband Traffic Management Paper

NEW Broadband Business Models Strategy Report

Saturday, June 19, 2010

Inter-technology competition and substitution - photo-sharing as an example

I recently came across a suggested use case for IMS RCS, which was essentially "display photos taken on your phone, on your home TV screen".

Fair enough, I can see the value in that as general idea. Whether it's boring friends with your holiday snaps, or showing Granny some pictures of the baby, I can definitely see that there is a valid user requirement to display photos on a big, high-res screen on occasion.

But that then got me thinking about the options for doing this, and the variables involved.

Without a comprehensive analysis, off the top of my head I could think of the following broad range of mechanisms for getting the image data from A to B:

- Beaming photos via Bluetooth from phone to TV
- Attachment via USB cable, or some proprietary cable from the handset/TV manufacturers
- WiFi transfer either peer-to-peer, or via your home gateway/router
- Memory card to a slot in the TV or set-top box (or perhaps in the remote control?)
- Some sort of integrated triple-play photo sharing service via the operator's core, conceivably across multiple operators if they interoperate
- Local break-out and sharing via a femtocell (3G from the phone, then ethernet to the TV)
- Upload to an Internet photo-sharing site, then access via an Internet-capable TV or STB
- MMS to an operator or off-portal service provider, then down to the TV via some sort of narrow-cast
- Various other options involving going via a PC as an intermediate step

And then there are numerous other variables involved:

- How technically-savvy are you?
- What sort of phone and OS is it?
- Does it have WiFi, 3G, RCS, Bluetooth, or some sort of photo-display software, and can you set them up and use them properly?
- What sort of TV or set-top do you have, and what are its interface and display features?
- Do you want to display the picture compressed or uncompressed?
- Single picture or a whole stream of them? What's the UI?
- How much control do you want from the phone? ("oops, better skip showing THAT pic to Granny")
- Do you want the photos persistently stored off the phone? Where? In the TV, in the operator's network, in your social network / photo website of choice?
- Do you want to edit or comment on the pictures? (eg comments, "like" etc)
- Do you want to display on other people's TVs & via their broadband? (eg Granny's) Do they have IPTV/triple play? Do they need some sort of "subscription" for this?
- Do you want to display on TVs when you're roaming?
- Is there adequate cellular coverage in the place you want to do this?
- How much latency will you tolerate in flipping from one pic to the next?

And perhaps most importantly - are you prepared to pay for this?

Not such an easy problem to solve, is it? There's a huge set of permutations, usage scenarios and technical options.

In this instance, without having analysed this particular problem in a lot of depth, my initial feel is that:

- sending uncompressed image files "tromboned" up through the cellular network and then down through the home broadband is usually going to be a bad experience
- many people will think "why bother?" and just either show the pictures on the phone screen (which is getting higher in quality), or use a PC or tablet as a display device instead. In future, the phone may also have a built-in projector.
- nobody is going to pay for this, unless it is such an utterly flawless user experience that it leaves all the other options in the dust.
- if the photos are good, you'll want to put them up on the web somewhere anyway, for sharing in "non realtime" with all your other friends and relations
- an operator-mediated phone-to-TV photo sharing service, even if only the signalling traffic goes up to the network & the image is transferred locally, is going to be a pain to get working in many instances. Given nobody will pay for it, and it's feasible to do it in many other ways, it will neither generate revenues nor improve loyalty. Support costs may mean it's loss-making.

The question left outstanding is: how could the phone-to-TV photo display experience be enhanced and differentiated? Clearly, ease-of-use and control is important - as is the ability to support many of the scenarios I list above.

But perhaps there's something more - a clever enhancement about comments/recommendation, or geo-tagging, or image analysis or whatever. Personally, I have no idea - and I'd be surprised if many telcos do either.

But someone will.

Which means that any added-value photo-transfer capability will almost certainly either be:

- end-to-end proprietary, eg from Apple or Sony or similar
- open to developers to do cool stuff at both ends of the connection

Incidentally - my RCS research paper is nearing completion. Details will be posted here; please contact me via information AT disruptive-analysis DOT com if you'd like to get pricing and the option of a pre-order discount.

Thursday, June 17, 2010

Doing a 180 on Vodafone 360

When Vodafone first announced its 360 service, I was pretty enthusiastic as it was one of the first examples of a major operator launching a so-called "over the top" service, decoupling access from service. It was theoretically available to anyone to download and register, irrespective of which operator they subscribed to.

Basically, it came in various tiers - fully customised and integrated devices like the launch Samsung H1, as a client or icon on Voda's other devices, or as a download to anyone else's smartphone as an app or widget. And importantly, it was web-based, rather than using legacy IMS concepts of centralised telecom services.

I firmly believe that this general philosophy is absolutely critical, especially for social networking and cloud services. Interoperability of lowest-common-denominator capabilities is not enough for real, differentiated offerings - Operator A must be able to sell unique services to Operator B's customers as well, if it is to scale to compete with Internet players with a 2-billion strong potential audience.

However, the focus of 360 - network address books and aggregating social networks - together with its initial fully-integrated devices, has been a serious disappointment. Ewan at Mobile Industry Review is possibly more outspoken than me - his coruscating comments from end-2009 speak for themselves here and here.

The general perception among many I've spoken to is that the early execution was poor. I'm also not convinced that the social aggregator role is useful anyway, although doing it web-based at least makes much more sense than shoehorning it into IMS.

Given that the basic premise didn't seem to work well, there clearly hasn't been the opportunity to expand 360 beyond the dedicated-device stage. There's no 360 app pre-loaded on Vodafone's iPhones, although its People Sync is apparently available on the AppStore. There's certainly been no viral buzz.

BUT

In the past week or so, the picture has started to shift. It is becoming much clearer that 360 genuinely is a much broader software platform, and that the address-book / social network thing is just a small part. More importantly, I think it's central to Vodafone's broader Internet strategy, and explains its lacklustre enthusiasm for traditional, conservative, walled-garden telco stuff like RCS.

Two things brought this home to me:

- The launch of 360 Shop for Android - a Voda-branded app and content store, billable to the customer's own account. Even Ewan at MIR likes it.
- Re-configuring my replacement iPhone (the old one fried its baseband last week) and re-doing my email settings. Looking online, I was advised to set my outbound SMTP server to smtp.360.com, for which I needed a 360 registration and log-in.

Separately, Vodafone has persuaded me to register for its (currently non-360) online customer portal, so I can track bills and sign up for its WiFi offload, as I'm getting close to its 1GB/month cap for iPhone data.

In other words.... Vodafone is going all-out to get people to register via web with one or more of its 360 properties. Yes, that's web username & password stuff.

For all the rhetoric about SIM cards, telco subscriber data repositories and identity, the fact remains that many of the best-organised and *most useable* datasets about individuals are those linked to traceable web log-ins. They're structured from Day 1, not based on 20-year old legacy billing platforms and chunks of data locked in vendors' proprietary databases in diverse network elements.

Think about the company that Vodafone is hoping to keep:

- Microsoft Passport / Live / Hotmail etc. login
- Google Gmail / Blogger / Android /GVoice etc. login
- Facebook login
- Yahoo! Mail / Messenger / My etc. login
- Apple iTunes / MobileMe / AppStore login
- Nokia Ovi login

For all the value of the SIM card in the traditional combined (service+access) telco world, it's much less useful for operators wanting to monetise access-independent applications. It locks them into onerous wholesale and interoperability relationships such as mobile roaming agreements. It makes it much harder to launch and monetise services across operator boundaries.

Some form of password / cookie / federated ID + certificate approach is much more flexible in these cases, despite its detractors' comments on its clunkiness. In my view, the clunkiness of username/password is a price worth paying - as billions of Internet logins have proven.

I'd estimate broadly similar numbers of SIMs vs. regularly-used web passwords. 4-5 billion, about 3-4 per Internet user. That tells a rarely-spoken story.

The next logical step will be another attempt to push 360-branded offerings to the Internet at large, perhaps downplaying the Vodafone brand in some cases. It's not immediately clear if the 360 Android shop is available to non-Voda users, but that could help, although how billing would work is an open question. If 360 can get past, say, 100m username/password registrations in aggregate, it will be in the big league.

Wednesday, June 16, 2010

Will delays to LTE help or hinder VoLTE?

While we've seen a lot of noise about LTE "commitments" and assorted trials and early rollouts, I remain unconvinced that we'll see much adoption before 2013, or for truly mobile devices (ie phone-type products) before 2015.

Given the huge diversity of frequency bands, the need for extensive work on cost-optimisation and performance-tuning of silicon, as well as the need for broad deployment both outdoors and in, the probability of LTE powering "primary mobile telephony" before mid-decade seems slim, especially for the mid-tier devices needed to get hundreds of millions or billions of users converted from 2G/3G.

What I'm uncertain about is whether this, ultimately, helps or harms the case for mobile IMS and solutions like VoLTE. On one hand, it allows the technology to mature and equipment prices to fall. But on the other hand, it gives huge opportunity for disruptive rivals to become entrenched.

The big risk I see is that something like Skype, Google Voice - or perhaps a still-unlaunched FacePhone, iTalk or OviVoIP - performs a classic "move from an adjacent market" before IMS-VoLTE becomes a real option.

Thinking through the usual pattern of IP-style application disruption, I'd expect to see something "low quality and niche" turn into "good enough" and then morph into "de facto standard".... rather than have some new and bulletproof "official standard" parachuted onto a willing audience in a single swoop.

One thing is certain - you can bet that people on the small new Nordic LTE networks are *already* playing with Skype. You can bet that someone loads up Google Voice on Day 1 of Verizon's LTE launch. DoCoMo might be a special case, but let's see how that goes at the time.

Yes, there will be problems. It probably won't do decent roaming, or properly handle emergency calls, will have poor SMS integration and may well drop at network handovers. But if it's got some other way of introducing "coolness" and virality, people won't care.

To me, this suggests that the IMS-VoLTE guys need to find ways to get their solution out "in beta" immediately, or at least in the next 6-12 months. Ideally with some "cool hooks" as well. Doesn't matter if it's a kludge. Doesn't matter if it doesn't have IMS-based QoS or even use all the core bits of IMS like the HSS. Run it off a spreadsheet if you need to. Hell, maybe just rebadge a standard SIP client as "Pre-VoLTE" or something, and leave the proper thing to the version 3 update. Full interoperability can wait too.

In other words... I think there has to be operator-branded, imperfect VoIP (with features to make up for it) ready to go on LTE devices from launch. Might be "over the top" across operator brands, or "through the middle" from the network owner. Whether to extend it down to HSPA devices as well is a more tricky decision, but one I'd probably recommend as well.

But the bottom line is to try to get as many of the early LTE adopters on board ASAP, gain experience, pick off the all-important social influencers and communications "hubs", open up some APIs for developers - and then use the advantages of being a standard as a differentiator later.

Repeating the classic telco mistake of going for interoperability first, finding the politically-expedient lowest common denominator, being a "late follower" in terms of service launch, risks being yet another failure to learn from experience.

VoLTE advocates, take this as a central planning scenario: until we get to mid-tier LTE phones with native VoLTE clients, expect 100% of LTE users to have already made a VoIP call with a 3rd-party solution before yours. So, how will you get them to switch?

PRESS RELEASE: New study forecasts $416bn worldwide broadband access market, as operators adopt "Happy Pipe" strategies

London – June 16th 2010

Both fixed and mobile broadband markets will continue growing in revenues, up to $416bn in 2020, but operators face some hard decisions about future business models, according to a new study published by the Telco 2.0™ Initiative and co-authored by research & consulting firm Disruptive Analysis.

The new report, “Mobile, Fixed & Wholesale Broadband Business Models: Best Practice Innovation, ‘Telco 2.0' Opportunities, Forecasts and Future Scenarios” finds that telecom operators will benefit from both new types of broadband wholesale, and more sophisticated direct-to-user retail propositions and tariffs. Recent introductions of new tiered and capped wireless Internet data plans are early evidence of this trend.

Key findings from the report include:

  • Global broadband access is forecast to increase from $274bn in 2010, to $416bn in 2020, an increase of 52% in revenue terms.
  • More than half the revenue growth will come from wholesale and “two-sided” fees for improved access capacity and quality. This means revenue from parties other than the end-user themselves.
  • By 2020, mobile broadband will be worth $138bn, or 32% of the total broadband access industry revenues.
  • Three new revenue streams are identified: “Bulk Wholesale”, “Comes with Data”, “Slice and Dice”.
  • New ‘upstream’ customers are forecast to generate over $90 billion in broadband revenues globally by 2020.

Today, many operators fear the supposed risks of becoming “dumb pipes”, but the study suggests the forecast market value means the term “happy pipe” is more appropriate for some. Certain telecom carriers will be able to add further value through enhanced “Telco 2.0” value-add services and platforms, but it is important to note that the basic carriage of data can itself be profitable and a source of substantial growth.

On the conventional retail broadband side, the big winners are fibre-based fixed services and mobile data plans for smartphones. Global ADSL and cable revenues will peak in mid-decade, and then decline with substitution from the progressive deployment of fibre. PC-based mobile broadband retail revenues will grow strongly in the short term, before being impacted by price competition and a shift from user-paid retail subscriptions to new wholesale-enabled models.

The ground-breaking study predicts that the wholesale market for broadband will evolve in three separate directions:

  • “Bulk wholesale” is an evolution of today's approach to MVNOs and data roaming in mobile, or loop-unbundling and open fibre access in fixed markets. The report predicts an acceleration of this type of wholesale provision, as governments force greater openness on telecoms licensees, and operators look to alternative partnerships to supply new market niches with capacity. There is also a possibility for parties other than the end-user to pick up the bill for subscriptions – for example, some local authorities are now providing free broadband to disadvantaged communities.
  • “Comes with data” business models have started to emerge recently, with devices such as the Amazon Kindle. Here, a product vendor or service provider contracts for data capacity with the broadband provider, and bundles it in a combined offer – the user does not have a subscription or direct relationship with the telco. The report expects this approach to be important for laptops, netbooks, tablets and various other new device categories.
  • “Slice and dice” wholesale is more complex, and more controversial. This involves operators selling data capacity in fine-grained “parcels” to parties other than the user, who is typically also paying for some level of access. This type of “two-sided” business model could involve deals with consumer electronics vendors for extra high-quality streams over existing broadband lines, or to content/application providers where they pick up the bill for data transmission rather than the end-user.

The incremental revenue opportunity for new “slice and dice” wholesale business models in mobile broadband alone is forecast to be $21bn worldwide by 2020.

The report’s co-author and founder of Disruptive Analysis, Dean Bubley, said “Both fixed and mobile operators need to look beyond the traditional ‘end user subscription mindset’, and examine new and innovative wholesale opportunities. At the same time, they need to embrace radical evolution of their retail portfolios – for example, supporting prepaid fixed broadband, or offering innovative tiering and policy structures for mobile Internet access from smartphones and tablets. Whoever coined the term ‘dumb pipe’ has cost the industry billions in shareholder value through negative word-association; instead, “Happy Pipe” reflects optimism and some really interesting opportunities ”.

According to Chris Barraclough, co-author of the report and Managing Director of Telco 2.0, “Telco 2.0 is not about throwing away existing operator business models, but about evolving them to generate additional value. In new Telco 2.0 style ‘two-sided’ business models, there are ‘upstream’ and ‘downstream’ customers – upstream customers are typically enterprises or merchants seeking to reach their markets – the so-called ‘downstream’ customers.”

“As we show in this report, there are many creative ways that operators can add more value for existing downstream customers. However, it is also clear that those companies providing services over the Internet will increasingly seek to mash-up connectivity more tightly with their own offerings, for example by including connectivity as a part of their products. These new ‘upstream’ customers are alone forecast to generate over $90 billion in broadband revenues globally by 2020.”

The report, “Mobile, Fixed and Wholesale Broadband Business Models: Best Practice Innovation, ‘Telco 2.0' Opportunities, Forecasts and Future Scenarios” is available to buy from Disruptive Analysis and Telco 2.0. Details are available at New Mobile, Fixed and Wholesale Broadband Business Models and www.telco2.net

Ends

Tuesday, June 15, 2010

"Comes with data" business model for TomTom SatNav

I just heard a radio advert for satnav vendor TomTom, for a device which comes with a year's free access to its over-the-air alerting service called HD Traffic. This is delivered via a GPRS module in the device - based on what appears to be a two-sided "comes with data" wholesale deal with Vodafone.

As well as traffic alerts, it apparently uses Vodafone's cell-ID capabilities to assist in location fixes - although it uses Google's local search rather than the operator's for added-value capabilities.

The promotion compares with the previous cost of around £8 per month after a free trial period. It appears to be a proper "comes with data" offer, in that there does not appear to be a need for a contract or subscription with the operator - the customer just purchases the device outright, and the manufacturer contracts for the connectivity with the operator, or one of its resellers.

"Your FREE trial of TomTom LIVE Services works straight out of the box and starts automatically when you first turn on your device, therefore you do not need to register to claim for the offer. From 9th June onwards all devices will be updated to include 1 year FREE LIVE Services"

Technically, as this uses 2G rather than 3G, it's not a "mobile broadband business model" - but it certainly points the way to possible future enhancements, using HSPA modules instead. One of the variables that remains unclear is just how fast we will see a move from 2G to 3G for this type of embedded capability.

One risk to the operator is that a shift to 3G might see a sudden explosion in data traffic with minimal additional revenue - and also the risk that poorer 3G coverage reduces usability for such truly mobile products. However, fallback ability to GPRS, coupled with rate-limiting for the 3G radio could solve that. A satnav with (say) 384kbit/s or 1Mbit/s would still be massively useful - it needn't run at full speed. In a vehicle, it could also be equipped with decent antennas and receivers to reduce impact on cell capacity.

"Comes with data"
is one of the new broadband wholesale business models identified and forecast in the new report I worked on with Telco 2.0 - full details are available here.

Thursday, June 10, 2010

Right up to the cap....

I see that O2 has become the latest operator to clamp down on mobile data usage - and at least has rather better-tiered offerings than AT&T's rather cynical options.

But I'm wondering if operators are making the same mistake as usual - assuming that, on average, people will only use xx% of their allotted quota.

Because I can foresee several scenarios in which people might use all of it, or at least 90% of it.

It would not be very difficult to install an application which tracks your usage and the day on which your allowance expires (a rough sketch of that logic follows the list below). Then, there could be the option to:

- wait until near the end of the month, then use any spare capacity to bulk-download and pre-emptively cache sites or content you might want to watch subsequently. There's probably a clever ad-supported model in there somewhere

- If you resent the imposition of the cap, you could be spiteful and just have the "up to the limit" app generate traffic for the sake of it. Added bonus: by visiting lots of random websites, you trash the supposed "customer data" that's being gleaned so you can be advertised-at

- Wait until near the end of the month, then offer any surplus data to friends / others via a tethering function. You could even try and create an auction application to attempt to resell it, although that would probably contravene a ton of T's and Cs.

- Donate spare end-of-month capacity to a charity - there must be *some* philanthropic use for a free mobile pipe.
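None of this needs sophisticated software. A rough sketch of the underlying "use it or lose it" logic, with hypothetical figures, and obviously subject to the operator's terms and conditions:

```python
# Rough sketch of an "up to the cap" helper: track usage against the monthly
# quota and decide when to start burning spare capacity. Figures are hypothetical.

from datetime import date

def should_use_spare_quota(quota_mb: float, used_mb: float, today: date,
                           cycle_end: date, days_window: int = 3) -> bool:
    """Only start bulk pre-caching (or donating, or reselling...) in the last
    few days of the billing cycle, and only if there is quota left to burn."""
    days_left = (cycle_end - today).days
    return days_left <= days_window and used_mb < quota_mb

quota_mb, used_mb = 1024, 600
today, cycle_end = date(2010, 6, 28), date(2010, 6, 30)

if should_use_spare_quota(quota_mb, used_mb, today, cycle_end):
    print(f"{quota_mb - used_mb:.0f} MB spare and the cycle is about to end - queue the downloads")
```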

Wednesday, June 09, 2010

LTE iPhone? Not until 2013, I reckon

There was a lot of speculation about whether Apple might make some sort of announcement about 3.75G or 3.9G support in the iPhone. In the end, the iPhone 4 has stuck to the tried-and-tested HSPA formula, rather than going to HSPA+ or LTE.

This is pretty much unsurprising.

There was no technological pragmatism behind the assertion that, because Sprint was launching the so-called 4G HTC Evo phone, Steve Jobs might try to one-up Apple's rivals.

I guess there was a vague chance they could have put out an HSPA+ variant, which would have as much right to a 4G marketing slogan as current WiMAX (ie nil), but given the general thinness of HSPA+ availability today, it was also improbable.

Moreover, at the risk of making myself a hostage to fortune, I don't expect an LTE iPhone in 2011 either. At present, I'd rate the distribution of probabilities to be:

2011 - 5% chance
2012 - 30%
2013 - 45%
2014 - 15%
2015 or later - 5%
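For what it's worth, those probabilities sum to 100%, and the probability-weighted midpoint lands around the start of 2013 - a trivial check:

```python
# Quick coherence check on my LTE-iPhone launch-year probabilities.
# "2015 or later" is treated simply as 2015 for this calculation.

probabilities = {2011: 0.05, 2012: 0.30, 2013: 0.45, 2014: 0.15, 2015: 0.05}

assert abs(sum(probabilities.values()) - 1.0) < 1e-9
expected_year = sum(year * p for year, p in probabilities.items())
print(f"Probability-weighted launch year: {expected_year:.1f}")   # ~2012.9
```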

There are numerous reasons that I'm dismissive of the idea of a so-called 4G LTE iPhone:

  • History - Apple has shown little interest in being first to market with a new radio technology, and would rather wait until there is a sizeable base of stable networks on which to launch. Hence the first iPhone only having EDGE. By 2011, only a few networks will have launched LTE, and none will have really decent coverage.
  • Power consumption and optimisation - who honestly believes that the first batch of LTE chipsets and devices will be that great at power management, cell-to-cell handoff and all those other "hygiene factors" that make for a decent user experience?
  • Frequency bands - as I've written before, the LTE world is going to be very fragmented. 700MHz in the US, 2100MHz and then 1700MHz in Japan, some mix of 800 / 900 / 1800 / 2600MHz in Europe, TDD versions in China and so on. Trying to get a 4- or 5-band LTE device capable of roaming or even sale in multiple geographies is a long way off. Add in delays in even selling spectrum at 2.6GHz or 800MHz digital dividend, and any idea of harmonisation, let alone network deployment, goes back further.
  • Fallback to 3.5G - any LTE smartphone will need decent 3.5G backup, probably HSPA in the current bands. A Verizon LTE one would need CDMA EVDO (which seems unlikely, unless Verizon gets an exclusive to take over from AT&T). If Europe goes to 2.6GHz as the main LTE band (which is still unclear), then fallback to 3G indoors is going to be happening a lot of the time.
  • Operator fragmentation. Apart from a couple of markets like the US, Apple is now pursuing a multi-operator strategy. The likelihood that multiple operators *within* a country will end up with similar LTE/HSPA mix of bands is looking very improbable.
  • Voice and SMS - 'nuff said.
  • Indoor usage scenarios - ditto
  • Price - adding in more expensive basebands, radio components, loads of testing, software integration.... all for what? A higher theoretical peak speed, under optimum conditions, which may not even be useable by the device?
Overall, I reckon Apple will care much more about *average* speeds attainable by users across a broad range of countries, operators and environments. It is far from clear that LTE will be the driving factor for improvements in "normal, average user experience" for several years, especially when averaged across multiple countries and operators.

There's no rush. I think Apple will wait until LTE works at least as well as HSPA does already.

Got your own opinion on when it will launch? Check out the vote set up for me by Live Talkback - http://m.livetalkback.com/disruptiveanalysis

NEW Mobile Broadband Traffic Management Paper

NEW Broadband Business Models Strategy Report

Monday, June 07, 2010

Thought for the day - mobile communities...

Question:

Of a given group of friends / associates, what is the probability that they will either:

a) Share the same mobile operator
b) Share the same mobile device / OS
c) Share the same Internet social network
d) Share the same interoperable multi-operator / multi-device client & service?

Upcoming events

I'm going to be at the following events over the next few weeks. Please let me know if you want an informal meeting or introduction, either as a potential client of Disruptive Analysis, or for a quick analyst briefing.

8/9th June - IIR In-Building Summit, London (speaking about data traffic & impact on in-building coverage and capacity)

15/16th June - TEN Telecoms Executive Network, London (moderating panel on mobile broadband)

23/24 June - Avren Femtocell event & Femto Forum Awards Dinner, London

I can be reached via information AT disruptive-analysis DOT com

Thursday, June 03, 2010

New Cisco VNI traffic report out

One of the broadband industry's "bibles" has been published in a 2010 edition.

Cisco's "Visual Networking Index" predictions of fixed and mobile data traffic are some of the most widely-cited charts and qualitative predictions in the industry.

I'll go through it with a fine-toothed comb when I get a chance, but one thing sticks out immediately:

2014 = 63.9 Exabytes / month total IP traffic
... of which 3.5 Exabytes / month is mobile data (ie 5.5%)

What is not clear to me at first sight is what happens to femtocell and WiFi offloaded traffic - ie is it double-counted? Especially femto traffic, which is likely to traverse two sets of routers first in the fixed ISP's network, and then again in the mobile operator's own core.

(I'm assuming that the figures exclude double-counting "private IP" traffic, such as transport between cell sites and RNCs and the operator core, where provided via a 3rd-party wholesale network operator)
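
The sensitivity is easy to sketch. The headline figures are Cisco's; the femto-offload fraction below is purely a hypothetical assumption of mine, just to show the mechanics of the double-counting question:

# Cisco VNI 2014 forecast figures (headline numbers from the report)
total_ip_eb_per_month = 63.9   # Exabytes/month, total IP traffic
mobile_eb_per_month   = 3.5    # Exabytes/month, mobile data

# ASSUMPTION: fraction of "mobile" traffic actually carried over femtocells
femto_offload_fraction = 0.15
femto_eb = mobile_eb_per_month * femto_offload_fraction

print(f"Headline mobile share: {mobile_eb_per_month / total_ip_eb_per_month:.2%}")

# If femto bytes are counted once in the fixed ISP's network and again in the
# mobile core, the same traffic appears twice in the total.
deduplicated_total = total_ip_eb_per_month - femto_eb
print(f"Mobile share with femto bytes counted only once: "
      f"{mobile_eb_per_month / deduplicated_total:.2%}")

Not a huge swing at these volumes - but it matters much more for the fixed/femto/WiFi split than for the headline total, and it matters for anyone costing the transport of those bytes.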

NEW Mobile Broadband Traffic Management Paper

NEW Broadband Business Models Strategy Report

Optimised Internet apps vs. RCS vs. multi-headed clients

OCT 11 2010 NEW REPORT AND BLOG POST ON RCS HERE
I'm currently digging into IMS/RCS for what may turn into an "epitaph" research paper. My current spoken and blogged views on it are well-known, but I feel it is worthy of a more weighty and analytical piece.

I'm currently sifting through assorted vendors' websites, GSMA RCS specs, YouTube video demos and so forth.

I'm struck by one very clear question:

On a half-decent phone, why would anyone want to use a multi-headed / aggregated app, hooked into various social networks and messaging services, rather than an optimised one from the underlying Internet service?

For example: The chance that an operator/RCS-mediated "Facebook experience" is ever going to be better than a native app or browser-based "Facebook experience" is surely zero, isn't it? Or am I missing something? Is the operator-based option solely for low-end devices that can't support proper apps?

Surely, the day a web-based service updates its capabilities with something cool and new (say, a "dislike" button, or an innovative photo-upload feature), it can update both its browser and app-based interfaces. But the operator-mediated version is stuck with whatever the currently-deployed device client can support.

I can perhaps see the value of importing some operator data and capabilities (eg presence, billing) inside the Facebook app - but I really struggle to see the rationale for doing things vice versa.

The way I see it, social networks become (relatively) more important through two main routes:
1) Viral adoption of standalone clients or web access because of some unique & desirable features to a specific user community
2) Piggybacking on another successful social network as a platform, and then spinning out to standalone once reaching critical mass

So... what is the "vector" for an operator-based social networking service to become widely adopted? Is there a catalyst for "virality"? Is the fundamental desire for that virality being embedded in RCS's design criteria and specifications? At the moment, I see it as being engineering-led, with little regard for basic behavioural psychology.

I'd contend that for innovative mobile applications to become successful, virality is more important than interoperability. (And then there's openness / extensibility, but that's a whole other story).

Wednesday, June 02, 2010

AT&T tiering, femtocells and holistic traffic management

AT&T has finally seen the light and recognised that "true unlimited" mobile data plans are unrealistic, moving instead to tiered offerings. Most of the rest of the world outside the US has long had tiered / capped services for both smartphones and laptop modems, so this starts to bring North America in line with practices elsewhere.

However, I'm not too convinced by the details. It's 200MB/month for $15, or 2GB/month for $25, plus additional overage charges of $15 for 200MB or $10 for 1GB on the respective plans.

The 200MB / 2GB looks too much like a cynical "Goldilocks" fee structure to me. Too Little, or Too Much. But not "Just Right". A good proportion of smartphone users have monthly usage in the general range 200-500MB - as indicated by AT&T's rather disingenuous comment that "currently, 65 percent of AT&T smartphone customers use less than 200 MB of data per month on average."

Let's think about that last statement. Firstly, we know that some smartphone users - notably corporate BlackBerry users - are relatively low-usage and bring down the average; and they will be on separate BlackBerry / corporate plans anyway. Secondly, using <200MB *on average* still implies month-to-month variation around that average. In other words, a good proportion of users who are (say) hovering between 150-250MB/month will incur overage fees on a regular basis, assuming no "rollover" of unused allowances.

The other standout is that the pricing of the packages is decidedly non-linear. If the incremental cost of 1GB of data is $10 (the overage rate on the larger plan), that points to a single-digit dollar base cost per GB. Yet the overage charge on the 200MB package - $15 for 200MB, or roughly $75 per GB - is clearly being made at an astonishing mark-up.
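
A quick sketch of the implied per-GB economics (the plan prices and overage rates are from the announcement; the 250MB user is a hypothetical example):

MB_PER_GB = 1024

plans = {
    "DataPlus (200MB / $15)": {"quota_mb": 200,  "price": 15, "overage_price": 15, "overage_mb": 200},
    "DataPro  (2GB / $25)":   {"quota_mb": 2048, "price": 25, "overage_price": 10, "overage_mb": 1024},
}

for name, p in plans.items():
    in_bundle_per_gb = p["price"] / (p["quota_mb"] / MB_PER_GB)
    overage_per_gb   = p["overage_price"] / (p["overage_mb"] / MB_PER_GB)
    print(f"{name}: in-bundle ${in_bundle_per_gb:.0f}/GB, overage ${overage_per_gb:.0f}/GB")

# A hypothetical user hovering around 250MB/month on the cheaper plan:
usage_mb = 250
monthly_cost = 15 + 15   # base fee plus one 200MB overage block
print(f"250MB on DataPlus: ${monthly_cost}/month, "
      f"i.e. about ${monthly_cost / (usage_mb / MB_PER_GB):.0f}/GB")

In other words, the small plan works out at roughly $75-80 per GB against roughly $10-13 per GB on the larger plan, and the marginal user who drifts just over 200MB ends up paying over $100 per effective GB.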

The clear message is that "normal" consumers are being pointed towards the $25 plan, with only exceptionally low-end smartphone users benefiting from the low-rate option.

The other detail missing from the press release is the apparent fact that femtocell traffic ("Microcell" in AT&T parlance) is *included* in the quota count, but WiFi traffic is *excluded*.

[Hat-tip to competitor / peer Peter Jarich via Twitter for the Microcell anecdote. However, please note that I haven't been able to source this independently, so what follows may need to be edited if AT&T issues a contrary clarification]

This goes to the heart of some of the themes in my recent research paper on Mobile Traffic Management, and the need for holistic thinking within operators. Given that the RAN generally costs much more than the core network for most operators, there should clearly be differential (or zero-rated) pricing for traffic using femtocell offload. Either that, or there should be a mechanism for customers to charge AT&T for using THE USER'S broadband pipes for backhaul.

It is critical that any policy management and charging infrastructure is capable of discerning bearer type (which could also be UMA WiFi tunnelled via the core on some other networks). Otherwise it makes a total mockery of the concept that policy is intended to align pricing with the underlying costs of service delivery.

It also makes a mockery of the femtocell concept as a mass-market proposition if the end-user has to pay more than when using their own WiFi. If I were a femto vendor today, I'd be spitting feathers about this, as it completely undermines the positioning of femtocells vs. WiFi as an offload tool.

I also know that many vendors claim it is feasible to distinguish between femto and macro traffic in their DPI / policy products, because I've been asking many of them this specific question over the last month or so. And let's face it, it's pretty obvious when traffic is coming through the carrier's femto gateway - if the operator can be bothered to do the integration, and has a rating/charging system up to the job of differentiating it on the quota and the bill.
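
To be clear, this isn't AT&T's actual charging logic - just a minimal sketch of what bearer-aware rating could look like, assuming the charging system is told which access type each data session came in over. The session fields and the zero-rating policy are my own illustrative assumptions:

# Each session: (subscriber, megabytes carried, access bearer reported by the network)
sessions = [
    ("alice", 150, "macro"),
    ("alice", 300, "femto"),
    ("alice", 500, "wifi"),
]

# ASSUMED policy: macro traffic counts fully against the quota, femto traffic
# is zero-rated (the user is already paying for the backhaul), and WiFi traffic
# never touches the mobile core so isn't metered at all.
QUOTA_WEIGHT = {"macro": 1.0, "femto": 0.0, "wifi": 0.0}

def chargeable_mb(megabytes, bearer):
    """MB counted against the monthly quota, given the session's access type."""
    return megabytes * QUOTA_WEIGHT.get(bearer, 1.0)  # unknown bearers count in full

total = sum(chargeable_mb(mb, bearer) for _, mb, bearer in sessions)
print("Counted towards quota:", total, "MB")

The hard part isn't this rating arithmetic, of course - it's getting the femto gateway, the policy layer and the billing system all passing the bearer information around consistently.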

My guess is that the RAN offload/femto project at AT&T has been disconnected from the tiering/policy initiative. This is not the first example of one isolated aspect of traffic management being disconnected from others. Nor will it likely be the last, given the proliferation of techniques and technologies being deployed. Many will have unfortunate side-effects and unintended consequences - as I discussed regarding video compression / optimisation recently.

It's possible that AT&T is aware of the issue and will fix it in time - but at the very least it ought to acknowledge it explicitly.

If you want to know more about the range of mobile broadband traffic management options - and the need for a holistic approach to avoid outcomes like this, you need to read my recent research paper. Details are here, and it's priced from just $350.

Tuesday, June 01, 2010

Does a "coalition of the losers" ever win?

I'm currently looking at a number of mobile application domains, such as messaging, social networking, VoIP and application downloads.

One thing that strikes me is that we frequently see powerful incumbents being challenged by alliances. Apple faces attack from operator-run appstores. Facebook is viewed enviously by others that would like to control social networks. MSN has been pursued by various own-brand IM proponents. Visa and Amex are regularly targeted by new payment mechanisms.

But one regular characteristic of this type of competition in the mobile domain is the "coalition of the losers" approach, usually based on the notion of interoperability as a competitive differentiator. Industry bodies like the GSMA are frequently the drivers of such initiatives, although often they take over a pre-existing coalition.

We've seen failed attempts to build an IM interoperability community. My current view is that the RCS Initiative is also on its last legs (I'm currently writing an "epitaph" paper if anyone would like to try to change my mind). Now we have the Wholesale Application Community. There have been assorted others around payments, identity and mobile broadband (sorry, WiMAX Forum).

But I am struggling to think of a single case in which a losers' coalition has ended up being successful. For that matter, I'm not sure I can think of an example outside the telecoms industry either, where a single powerful Goliath has been brought down by a coordinated horde of Davids.

Having 53 previously-ineffectual companies attacking a strong individual player usually just proves that 53 x Zero = Zero

Where change does occur, it's usually another proprietary or standalone player. BlackBerry's Messenger is taking bigger lumps out of MSN's user base than operators' messaging services ever have. It's Facebook that has given MySpace a kicking, not a consortium. Vodafone's M-Pesa has had more of an impact on mobile banking than any number of joint initiatives. Paypal has made the biggest impact on online payments.

In the airline industry, it has been the impact of individual low-cost carriers like Ryanair and Easyjet that has caused the greatest shake-ups, not Star Alliance or OneWorld.

One possible exception might be the Open Handset Alliance, aka Android. And more generally, the open-source model tends to fare a lot better than the "industry collaboration" approach at unseating incumbents.

I'm genuinely curious about this - if anyone has an example where a "coalition of the losers" has been triumphant in mobile, I'd love to know.