Prague or Berlin? Behind the scenes of the SIM industry

SIM cards, the tiny chips that authenticate the subscriber to the network, are a market of 2 billion unit sales per year. A market which is threatened by commoditisation with prices declining by 30-40% year-on-year. And perhaps surprisingly, a market where technology and politics are in continual turmoil.

This past week, two near-identical conferences on the subject of SIM cards revealed the uncertainty that has divided the SIM industry over the last 10 years: one where industry participants are divided over standards, control points and overhyped technologies. But with SIM card economics in a dire state, the industry needs alignment, not controversy.

SIMpolitics
Prague or Berlin? In the past month, this was the question on the mind of most vendors in the SIM industry. Prague was the location for Informa's SIM Summit, which took place on 24-26 April. Berlin was the location for the SIM Alliance's SIMposium, which took place on 24-25 April. Two conferences on identical subjects in different countries, allegedly due to differences of opinion between Informa, the incumbent telecoms conference organiser, and the alliance of SIM card OEMs. Dan Balaban of Card Technology Magazine has a detailed write-up of the conference politics at play.

The result? The Prague-Berlin flight route was unusually busy and most companies had to send delegates to both conferences (except for the major SIM card manufacturers, who did not attend Informa's event). I attended the SIM Summit, where I chaired two days of the event, and most delegates I spoke to had correspondents at the other conference, so as not to miss out on any important developments. This unfortunate state of affairs will undoubtedly be remembered as the schism of the SIM industry, one which I hope is quickly mended. The SIM industry, already suffering from rapidly declining profit margins and a lack of technological evolution, does not need more turf fighting.

The divide over conferences this past week is only the tip of the iceberg; technological evolution in the SIM industry has also been hampered by a long-term power struggle.

The power struggle behind SIM technology
Since the inception of digital mobile telephony, SIMs have been used as a physical token that authenticates the subscriber to the mobile network. The SIM toolkit interface was adopted in 1997 as a standard technology allowing the SIM to interact with the handset, store contacts on the SIM, present a menu of operator services to the user, and show low-fi text and image popups. A number of technology innovations followed, such as a Javacard application environment, secure SIM storage, more memory (up to 128KB) for storing more contacts, the BIP protocol for SIM communication over TCP/IP channels (GPRS/3G) and the JSR177 API for communication between handset Java apps and SIM Javacard apps.
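To make the plumbing behind all of these technologies a little more concrete, here is a minimal Python sketch (not from the article) of the ISO 7816 APDU command packet that a handset sends to select a file on the SIM; every toolkit and Javacard interaction ultimately travels as such packets. The GSM class byte and the EF_ADN (phonebook) file identifier used below are per the GSM 11.11 specification.

```python
# Build an ISO 7816 SELECT command APDU, the basic unit of
# handset-to-SIM communication. Structure: CLA INS P1 P2 Lc Data.

def build_select_apdu(file_id: int) -> bytes:
    """Build a GSM SELECT command APDU for the given file identifier."""
    cla, ins, p1, p2 = 0xA0, 0xA4, 0x00, 0x00   # GSM class byte, SELECT instruction
    data = file_id.to_bytes(2, "big")           # 2-byte file identifier
    lc = len(data)                              # length of the data field
    return bytes([cla, ins, p1, p2, lc]) + data

EF_ADN = 0x6F3A  # the SIM phonebook (abbreviated dialling numbers), per GSM 11.11
apdu = build_select_apdu(EF_ADN)
print(apdu.hex())  # a0a40000026f3a
```

The SIM replies to each such command with a status word (e.g. 0x9000 for success), which is how toolkit dialogues between handset and card are built up.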

This might sound like a lot of technology, and one which opens the door to a range of usage scenarios and value-added applications. It is. The SIM has enabled many innovative applications to date, such as m-banking introduced by T-Mobile Czech Republic in 1998, automatic device detection as used by ONE Austria, SIM contacts backup on the network as launched by SFR France, and text promotions as exemplified by Celltick's LiveScreen Media solution.

So where's the catch? The problem is that SIM technologies must be implemented by both SIM vendors and handset manufacturers in an interoperable fashion.

Mis-alignment of incentives
Here lies the problem that's been plaguing the SIM industry since the beginning of the decade: the mis-alignment of incentives between SIM card OEMs and handset manufacturers. On one hand, SIM manufacturers have been keen to promote the technological evolution of their SIM cards (and therefore their price tag). On the other hand, handset manufacturers have always been wary of the SIM card becoming established as a control point for value-added services delivered on the handset. As a result, despite operator pressure in favour of SIM standards, handset OEMs have been producing handsets with poor or inconsistent implementations of the SIM toolkit and the other SIM technologies. This power struggle has been particularly evident to operators and has nurtured the demand for SIM test houses.

This state of affairs was not helped by mobile network operators, even though MNOs would be the primary beneficiaries of the SIM's evolution in the technology value chain. The reason is primarily down to organisational psychology. Tier-1 network operators are large, high-inertia and risk-averse organisations with a poor track record for commercialising innovations. GSM operators lacked the vision and leadership to invest in new SIM technology, or even to publicly declare their support for particular SIM technologies. This led to a standstill in the evolution of the role of the SIM.

Reviving SIM card technology
Consequently, up until 2006 SIM technology seemed stale in comparison to handset technology: 128KB of SIM storage compared to several GB of handset storage; tens of kbit/s for SIM-network communication vs thousands of kbit/s for handset-network communication; 5MHz SIM smartcard processors vs 200MHz handset processors; a text-only interface for SIM applications vs a Flash-like interface for handset applications. This was of particular concern to SIM card OEMs, who were keen to upsell the value of the SIM card.

The inflexion point came in 2006. SIM card OEMs, faced with declining profit margins, pulled together to increase the value of the SIM card with a technology breakthrough. In February 2006, virtually all major vendors announced a next-generation SIM card product: Gemplus with .SIM, Axalto with U2 SIM, Oberthur with its GIGantIC card, Giesecke & Devrient with GalaxSIM and Sagem Orga with SIMply XXL.

All next-generation SIM cards featured up to 512MB of storage (thanks to replacing NOR with higher-density NAND memory) and high-speed protocols. This evolution would allow a number of promising scenarios, such as the storage of multimedia files, DRM tokens and encrypted corporate files, and the distribution of operator-customised handset applications through the SIM. Informa's white paper on High Capacity SIM cards, which I wrote last year, provides an extensive analysis of the commercial status of next-generation SIM cards.

This all looked good on paper, but it was again a technology evolution that had to be supported by handset manufacturers. Repeating history, network operators, with the exception of Orange, did not show leadership or invest in this evolution.

The standardisation of this high-speed protocol between SIMs and handsets was discussed at ETSI. Following a multitude of candidate proposals, patent disputes, continual controversies and uncertainty, the participants agreed to standardise on one protocol: USB.

Two years had passed before this consensus was reached in November 2006, but the worst was yet to come. The choice of USB meant a significant re-engineering effort on the part of handset OEMs (Orange had convinced 5 OEMs in 2006 to implement the MMC protocol, but the choice of USB meant that this momentum was stalled). Coupled with the continuing widespread lack of operator support, most industry observers concede that compliant handsets will not appear in the market before 2009, at least not in any sufficient volume.

So what does this mean for the SIM industry? Faced with declining sales, SIM card OEMs went back to the drawing board. The 3GSM 2007 congress saw two new efforts to revive the value of the SIM card: the use of the BIP protocol to deliver new applications, and the use of the SIM within NFC technologies for contactless transaction applications.

New SIM technologies in 2007: The quest for the holy grail
After several generations of the mobile industry, the quest for enhancing the role of the SIM seems like the quest for the holy grail: long, uncertain and without a firm goal in sight.

At 3GSM this year, marketing around high-capacity SIM cards was significantly toned down. Instead, Gemalto (the largest SIM OEM) introduced its line of multimedia-ready SIMs, essentially cards with the same memory capacity but supporting BIP (the bearer independent protocol), which forms part of the ETSI 11.14 standard. The BIP protocol comes in two flavours:

– The BIP client protocol allows the SIM to communicate with the network over 2.5G and 3G data channels, which allows SIM-resident data to be updated at significantly higher speeds than previously possible. The BIP client protocol is widely implemented (all top-5 OEMs except Samsung support it), but it is a point-to-point protocol (unlike cell broadcast), so it cannot be used for mass updates.

– The BIP server protocol allows handset applications to communicate with the SIM and access objects stored in the SIM file system directly. Unfortunately, the BIP server protocol has seen (unsurprisingly) poor support on the part of handset OEMs, with the exception of Sagem, Vitelcom and HTC. A BIP server implementation allows a handset application to load files from the SIM card and enables handset personalisation on SIM insertion (as demonstrated by Abaxia at this year's 3GSM). On-SIM portals can also be realised in this fashion. However, the BIP server protocol is not bidirectional and therefore cannot support scenarios where an application stored on the SIM card is auto-installed onto the handset operating system.
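A rough sketch may help explain how a BIP client session works: the SIM cannot open a data connection by itself, so it issues a sequence of SIM toolkit "proactive commands" that the handset executes on its behalf. The command type values below are from ETSI TS 102 223; the session flow is a simplified illustration of the idea, not a real protocol implementation.

```python
# Proactive command type values per ETSI TS 102 223.
OPEN_CHANNEL, CLOSE_CHANNEL, RECEIVE_DATA, SEND_DATA = 0x40, 0x41, 0x42, 0x43

def bip_client_session(payloads):
    """Return the proactive-command sequence a SIM would issue to exchange
    the given payloads with a server over a packet-data (GPRS/3G) channel."""
    commands = [OPEN_CHANNEL]          # ask the handset to open a data bearer
    for _ in payloads:
        commands.append(SEND_DATA)     # SIM hands data to the handset to transmit
        commands.append(RECEIVE_DATA)  # ...then reads the server's reply back
    commands.append(CLOSE_CHANNEL)     # tear the bearer down
    return commands

seq = bip_client_session([b"update-request"])
print([hex(c) for c in seq])  # ['0x40', '0x43', '0x42', '0x41']
```

The point-to-point nature of BIP is visible here: each session is one channel between one SIM and one server, which is why it cannot substitute for cell broadcast in mass-update scenarios.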

Hooking the SIM onto NFC applications
Perhaps the most talked-about future application for the SIM card is as a control point for handset-based NFC applications, such as contactless payments.

NFC (near field communication) is a wireless standard launched by NXP (formerly Philips Semiconductors) and Sony in 2002. There are already several commercial pilots of NFC-enabled handsets around the world, and ABI Research predicts that around 20% of handsets will ship with NFC capabilities in 2012. NFC technology can be used to make payments, unlock doors and download content, simply by waving the handset in front of a reader.

Clearly, establishing a role within NFC-based applications means big money for mobile operators and SIM card vendors. The GSMA (the association of 700+ operators globally) recently mandated the use of the single wire protocol (SWP) for linking the SIM card to the NFC circuitry within the phone. However, there is no agreed technical framework for determining the role of the SIM in NFC applications. At the same time, there are far too many players claiming a piece of the lucrative contactless payments pie, namely card issuers, contactless ticketing providers, mobile operators, handset manufacturers and SIM card OEMs.

Undoubtedly, the NFC-related hype that surrounds the SIM industry has a long way to go, given that it will take another five years for NFC-enabled handsets to reach critical mass. Perhaps it's wiser to reflect on the many uses that the SIM can be put to utilising not tomorrow's, but today's technology, such as T-Mobile's use of the SIM for m-banking applications supported by 80% of banks in the Czech Republic.

But not for an industry whose survival depends on demand-creation for next-generation technologies.

Andreas

The headaches of being a handset OEM

Some things remain true: markets always shift, and the Lord giveth and the Lord taketh away. In the mobile handset industry we have seen Ericsson hold a 30%+ share of the market, fall into oblivion, and then rise like the phoenix after creating a joint venture with Sony. Does anyone remember the impact of the Vodafone terminal specifications on OEMs less than half a decade ago, for which even Nokia bent over backwards in the end? How come this changed so rapidly? Well, consumers change their minds. The industry, too, shifts between vertical and horizontal structures in a helical pattern. There is always a search for the next killer feature that will lead to a new shift powering market dynamics, but it is seldom a feature that creates that shift in balance; usually it is something completely different.

Consider first an example from another industry: the automotive market. In the last few years that market has undergone a "feature renaissance", the killer features being environmental impact (look at the success of the Toyota Prius) and localization (i.e. built-in GPS). The previous killer feature was segmenting car products into clear value propositions like SUVs, family cars, sports cars, etc. Before that it was hardware: being able to build cars and ship them across the planet.

The shift of the millennium: from hardware to software
In the mobile handset industry we saw the shift in market differentiation from hardware to software some years ago; until the end of 1999 all the big OEMs were more or less focusing on hardware. Technology differentiation was determined by how small you could make the phone, how good a network reception you could achieve, and so on. At the beginning of this millennium a shift began: software became much more important. In 1998 Symbian was formed as a partnership between Ericsson, Nokia, Motorola and Psion. In 1999 J2ME was announced. The demand for software engineers surged.
It wasn't that hardware didn't matter or that it was commoditized. It wasn't that software hadn't mattered earlier either; it is just that software gradually became more important than hardware. It takes years until we notice the difference, as it takes years to build a good software platform.

The next shift: from software to segmentation
We saw a similar shift last year, in 2006. In the third quarter of 2006 the average selling price (ASP) for several handset OEMs decreased considerably. Except for one: Sony Ericsson, which instead increased not only its handset ASP but also its market share. As argued earlier, increasing market share often leads to decreasing ASP, so why was Sony Ericsson an exception?

I would argue that Sony Ericsson found the new differentiator: vertical segments. A vertical segment is really just a product proposition that occupies a niche segment of the market. The more niche and targeted you can make a product, the more valuable the target consumer will find it, and thus the more they are willing to pay for it.

The core handset differentiation shifts over time and eventually sinks under the "value line".

The complexity of creating vertical segments
Designing a product to appeal to a target customer group as well as possible is important as long as that group is big enough to provide a return on investment. Today mobile handset tailoring and customizing is not an easy task and investments are substantial. There are three essential elements to creating a vertical handset proposition:

  • User Research: finding out what the customer segments want and translating this into requirements
  • Supply-Demand Prediction and Logistics Handling: balancing supply and demand in a cost-efficient and operationally responsive way
  • Product Flexing: cost-efficiently creating multiple products according to requirements with minimal impact on time to market, development cost, and bill of materials

The first two elements are competences taught in most marketing classes, but the third is specific to each (non-commoditized) industry. In the case of the mobile handset industry this is the hardest part, as it takes years to platformize handset software and hardware. Nokia has mastered the first two elements, but for some reason the inside of their phones looks the same regardless of whether gaming, business or multimedia drives the phone. Sony Ericsson, on the other hand, was able to balance all three to a level which was in part responsible for its increase in handset ASP last year.

The next wave of differentiation?
Of course there will be a new differentiator once the art of segmentation has been mastered. We are already seeing open source as a threat to those that relied on the traditional model of software development. In markets where no new features can be added to the product, the value lies in design, brand and product marketing, as is the case in eyewear and watches. But surely there must be more features to add to mobile handsets, right?!

So where should we look for the next wave of differentiation? Undoubtedly OEMs will continue improving handset segmentation and user-centered design. However, I would argue that the next differentiating characteristics in mobile handsets will be delivery of True Personalization and the ability to cater to Multi-Sided Markets.

True Personalization
True Personalization is really about making the target segment so small that it becomes more or less one person (and I am not talking about ringtones, themes, mobile charms, and stickers). When Japan introduced number portability, many believed that churn would grow immensely. It didn't, and I think one of the reasons is that DoCoMo had earlier introduced soft walled gardens, like personal email and i-mode services, that users had attached themselves to.

Think about it: if you had to change your email address to move to a Dell, HP or Mac instead of your current IBM/Lenovo, would you change? The OEM that is able to create an identity that resides within your mobile, is easily personalizable by the user, and can be moved to new devices within the same brand will definitely see less churn. I know a lot of people who don't change phones (even within the brand) because it is such a hassle to configure and move bookmarks, contacts, RSS feeds, settings, etc., and some things, like SMS and email, are not even transferable.

When the user is able to micro-segment and truly personalize her own device, she will never switch! Why do you think Nokia created Nokia LifeBlog?

Catering to Multi-Sided Markets
Mobile phones will increasingly resemble platforms, but no one in the manufacturing part of the value chain wants a new Wintel, i.e. a singular platform. The manufacturer-operator battle is clear and a dividing line exists between the two: the players above this line (operators and service providers) want all handsets to be the same for their applications, services, advertisements, etc. The players below (the handset OEMs) don't want to become too platformized and end up like set-top-box manufacturers. (I love asking people what the brand or even the manufacturer of their set-top box is. Many answer TiVo or some other non-manufacturer, little knowing or caring that it is built in Taiwan or China.)

The way forward: handset OEMs are either building services or service platforms of their own, or creating flexible white-label solutions for third parties. Look at Nokia Ad Service and Content Discoverer, as well as Motorola's Screen3. Rumors say that Google is in close talks with LG and Samsung, two hardware-centric manufacturers who should watch out for platformization. Why would Motorola not just use uiOne, and why does Three remove the Nokia Active Standby? Because enabling third parties to monetize the mobile platform while keeping control of the user experience will be a promising post-sales revenue stream.

Thoughts?

Hampus, TAT

Bye Bye Browser

The mobile browser business has been dealt a swift blow within the space of a week: Teleca announced that it "halts investments into renewal of Obigo product", while Openwave is up for sale and is failing at licensing its v7 browser to handset manufacturers. The winner? Open-source browser derivatives based on WebKit (adapted by Apple and Nokia), which should show up on handsets from the likes of Sony Ericsson within 2007.

Browsers beset with challenges
Cumulatively, the Openwave and Obigo browser families have to date claimed over 70% of the mobile browser market, with the Access browser claiming another 20%. Considering that the mobile industry ships one billion phones per year, this is a lot of browsers. Yet these products have been facing multiple challenges:
– Mobile handset middleware commands extremely low prices these days; the lower down the software stack you go, the lower the per-device licensing fees. For example, a single ringtone can command a retail price of $3, the same per-device pricing as the Symbian operating system. Browsers and Java virtual machines can only command perhaps one hundredth of that.
– Mobile browsers continue to be inherently complex software. Browsers that render "street HTML" (i.e. malformed web pages, which are pretty common on the Internet) are notoriously difficult to develop and maintain. Yet software for interpreting and rendering web (HTML) pages is becoming decisively commoditised.
– The innovation and value-add in mobile browsers lies in add-on features such as zoom and intelligent navigation – for example, see Microsoft's Deepfish concept browser and the specs of Nokia's S60 open-source-based browser.
– There are several open source alternatives to commercial browsers, first and foremost the S60 WebKit, which replaces Nokia s previous closed source efforts and is embedded on all S60 3rd edition handsets.
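The "street HTML" problem above can be illustrated with Python's stdlib html.parser (an illustration of the principle, not of any vendor's engine): like a mobile browser, it must recover from malformed markup rather than reject it, emitting events for whatever tags it can find.

```python
# html.parser is deliberately forgiving: mis-nested and unclosed tags
# produce a best-effort event stream instead of a parse error. This is
# the tolerance that makes "street HTML" engines costly to build.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.events = []
    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag))
    def handle_endtag(self, tag):
        self.events.append(("end", tag))

# Mis-nested <b>/<i> and an unclosed <p>: typical "street HTML".
p = TagCollector()
p.feed("<p><b>bold <i>both</b> italic")
print(p.events)
# [('start', 'p'), ('start', 'b'), ('start', 'i'), ('end', 'b')]
```

A real browser engine must go further still, inferring where the missing close tags belong so the page renders sensibly, which is precisely the complexity the bullet above describes.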

What went wrong with the browser business?
Openwave, Obigo and Access, the three main mobile browser vendors, had in the last two years tried to reposition their browsers as application environments, but with limited success.

Openwave has been the market-leading vendor of mobile browsers, with over one billion deployments claimed to date on handsets from BenQ Siemens, Sanyo, Sharp, Sagem, Motorola, LG, and TCL Alcatel. Openwave was the first and most vocal vendor to reposition its browser product as an application environment. In April 2006 the company announced MIDAS, a software platform combining a rendering (ECMAScript) engine with underlying browser and messaging components to deliver customizable applications to mobile operators.

What went wrong? While Openwave was banking on the purchasing power of mobile operators to demand the inclusion of its browser by manufacturers, it chose to sideline its real customers, the handset manufacturers, at its own peril. Handset OEMs, who had previously become disenchanted with Openwave due to the lack of flexibility in Openwave's bundled browser and messaging components, were disincentivised to upgrade to Openwave's v7 browser framework (codenamed Mercury), the basis for MIDAS.

Openwave's strategy drew a favourable response from mobile operators: KDDI endorsed Openwave's Mercury browser for its EZweb services in October 2006, while O2 trialled MIDAS as the basis for a unified messaging client in mid 2006. However, MIDAS saw a poor reception from the all-important handset OEMs, with only 5 out of 288 phone models embedding the Openwave v7 browser according to Openwave's website (last updated in September 2006). This is a disappointing track record considering that v7 was announced in February 2003 by the world's leading mobile browser company.

With the MIDAS platform strategy failing, Openwave repositioned its portfolio into a product strategy, with 13 product announcements at the recent 3GSM conference in Barcelona. The announcements of the Openwave Mobile Widget, MediaCast and the Openwave Personalization and Profiling System are characteristic of the company's turn towards content delivery services. However, this turn came too late; with Openwave's NASDAQ-listed stock having fallen 50% in the past 12 months, the CEO resigned in late March and the company announced it was putting itself up for sale. For a publicly traded company employing 1,300 people across 26 countries, this is a major shake-up. Even more so if you consider that Openwave's decline is a far cry from the year 2000, when the company co-founded the WAP Forum and was instrumental in drafting the WAP specification which spawned the mobile browser business.

Openwave's fortunes bear a close resemblance to Obigo's. In early April, Teleca announced that it would not be making further investments into the renewal of its Obigo software suite. Obigo includes not only a browser but also a media player (SVG, video, audio), messaging (SMS, MMS, EMS, email), a content manager, a download manager and digital rights management. According to the company, Obigo software had shipped on more than 400 handset models and more than 300 million mobile phones as of July 2006, from manufacturers including BenQ Siemens, Panasonic, Pantech, Samsung, Sony Ericsson and Toshiba.

Teleca said that the source code associated with Obigo would be opened up to customers "in order to drive the change from a product to a services model", according to CBR Online. More than 200 Teleca staff, making up the Obigo product unit in Malmö and Lund, have been offered voluntary transfer to Sony Ericsson.

For in-depth reviews of Openwave's MIDAS and Teleca's Obigo, see the free research paper titled Mobile Operating Systems: The New Generation, published in September 2006.

The Winner: Open Source
The demise of the mobile browser business is also the first sign of the disruptive power of collaborative software development models based on open source.

The most vocal advocate of open source browsers has been Nokia. The Finnish OEM had in the past been developing a proprietary browser for S60. However, with the cost of street HTML browser development rising, Nokia tried three options:
– licensing Opera s web browser for its S60-based devices
– investing in the Minimo project, a Mozilla browser branch optimised for mobile devices (which turned out to be too resource-hungry and was abandoned)
– re-developing its own S60 browser based on the WebCore and JavaScriptCore components from Apple's Safari browser (which in turn are based on KDE's Konqueror open source browser project).

In early 2006 Nokia steered towards the third option and announced the S60WebKit, the engine for Nokia's new S60 web browser, which today ships on all S60 3rd edition handsets. The S60WebKit browser offers advanced features such as mouse-based navigation, "page miniatures", visual browser history and AJAX support. Furthermore, Nokia's browser additions are available under the permissive BSD open source license, which allows third parties to use these components for either open or closed source projects with very few limitations.

What next?
The discontinuation of Obigo and the financial troubles of Openwave should see Nokia's WebKit adopted by other tier-1 OEMs such as Sony Ericsson, which should acquire much of Obigo's browser know-how. The browser business should gradually shift to a professional services model, i.e. optimising and developing value-added features on top of an open source browser core, with Teleca best placed to capitalise on this trend. I doubt that Access and Opera will be able to sustain their licensing agreements at current levels, given the popularity of low-cost open-source-based alternatives.

At the same time, this may be a lesson for the PC industry too; had Internet Explorer not been bundled with Windows and offered to PC OEMs for free, it would no doubt have been displaced by Firefox.

Comments, as always, are welcome.