Android rant Tuesday
Someone made the comment to me on Google+ the other day:

I find it ironic that Linux is so loved by developers, whereas Android, arguably the most successful consumer device powered by a Linux foundation so far, is resoundingly ranked as unreasonably difficult to work with by developers.

Here's the treatise I replied with, minus some G+ specific stuff:

First I'll give some of my own relevant background (I should really put this kind of stuff on my About page):

I've been using Linux since 1993 or so. In fact, I still have my SLS Linux 3.5" diskettes in a box somewhere. I'm a developer and software architect by trade, and I have always preferred Linux to the alternatives when it comes to development. If there were new developer tools or frameworks to work with, obtaining and installing them was always relatively straightforward. In fact, I still use Emacs to this day as my only text editor. I find Emacs combined with the command line and the bash shell immensely preferable to anything ever produced by Microsoft; the Linux operating system has always been more stable during daily use and through upgrades than Windows; and being able to directly examine, modify, and improve the internal workings of the OS is a great advantage. To this day I still install Linux on all of my servers, and I have multiple virtual Linux slices running various things.

On the mobile side, I've been doing serious mobile development for most of this year. Not a huge amount of time, but when you have as much experience as I do picking up a new environment is a matter of a week or two. It's simply a matter of comparing the new thing to all of the other things you've done and mastered, and noting where the differences and similarities lie. Everything else is just practicing the Art.



Next, some ground rules:

1) I'm not talking about user experience for the platform. Whether iOS or Android is better from a user's perspective is a matter of personal taste and philosophy. To the extent that this matters to a developer, I would much rather that users of my application have an all-around great experience, but I can only control just so much.

2) I'm talking about the situation as it stands right now. Yes, Jelly Bean is fast and slick. Yes, iOS 6 is likely to increase the iOS fragmentation somewhat (albeit in a very controlled manner). Neither of these matter, because I'm talking about why developers dislike Android right now.

3) I'm going to concentrate on server-side development when it comes to non-mobile Linux, including HTML applications, because this is where most Linux development happens. Very few people, relatively speaking, are developing user-oriented applications for Linux as opposed to web sites, e-commerce, or server processing applications.



But enough of that... the question posed: why is Android considered to be a pain by developers when it's just a Linux core with Java on top?

On the face of things I present a simple answer with complex underlying reasons:

Linux (Android) on mobile devices is a very different animal than Linux on a server or workstation.

That servers or workstations and mobile devices are different things should be trivially obvious at first sight, but the complexity lies in why this difference matters from a developer's perspective.

At the root of the matter is Android's much-discussed fragmentation. However, I will make the case that non-mobile Linux had (and to a certain extent still has) the same issue, and that there are structural differences in the two use cases that make the problem more painful for mobile.



First of all, mobile Android fragmentation:

1) Over 81% of Android devices are still running Android 2.2 or 2.3. Not quite 11% are running what could be considered a modern Android, Ice Cream Sandwich, and Jelly Bean isn't even showing up yet. 2.2 is from May of 2010, and 17% of Android devices are still running it. http://developer.and...boards/index.html

Note that the figures here will change as it is updated constantly; for future generations I'm making note of how it stands today. These numbers come from devices accessing Google Play over the past 14-day period. This is actually a very important point: China's distribution is not reflected in this because Google Play is not available in China. My personal suspicion is that the numbers are worse than shown, as China likely has more older and cheaper devices.

For comparison, by June of 2012 80% of iOS devices were running iOS 5, the latest released version. Ice Cream Sandwich was released around the same time as iOS 5, and as of June had only managed 4% penetration. An older (March 2012), unofficial iOS comparison can be found here: http://pxldot.com/po.../ios-ebb-and-flow

Not only is there the base fragmentation, but each manufacturer (like Samsung or HTC), and sometimes the carrier as well, then takes the device and modifies it even further. The Android OS on a Kindle Fire is radically different from that on a stock Nexus phone, for example, and in fact a Fire user has to go to some lengths to be able to purchase items from the Google Play store on their device. This is where the real problem happens: once the OS has been modified for a particular device/carrier combination, getting the OEM to go back and perform the customization again with a new Android version to provide an upgrade is nearly impossible. Even after Google announced pledges from the major manufacturers to provide upgrades for a particular time period (at the 2011 I/O conference, if I remember correctly), the pledges only lasted until the next time the phone makers actually had to put money behind the upgrades.

Why does OS fragmentation matter? Developers are all about market share. You must get your application on as high a percentage of the available market as possible with the least amount of effort. In the case of Android, this means to hit a useful 80% of the Android market, you must target a 2 year old, creaky, slow, feature-lacking version of Android.
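To make that concrete, here's a minimal sketch (plain Java, with a fade-in animation standing in for any post-2010 feature) of the runtime version juggling this forces on you when your minimum supported version stays back at 2.2 but you'd like to use anything newer:

    import android.os.Build;
    import android.view.View;
    import android.view.animation.AlphaAnimation;

    public class FadeInHelper {
        // Sketch only: gate a newer API behind a runtime version check so the
        // same binary still installs and runs on Android 2.2/2.3 devices.
        public static void fadeIn(View view) {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB_MR1) {
                // ViewPropertyAnimator only exists on API 12+ (Honeycomb MR1).
                view.setAlpha(0f);
                view.animate().alpha(1f).setDuration(300);
            } else {
                // Fallback for the 2.x majority: the older Animation classes.
                AlphaAnimation anim = new AlphaAnimation(0f, 1f);
                anim.setDuration(300);
                view.startAnimation(anim);
            }
        }
    }

Multiply that by every feature added since 2010 and you have a feel for what targeting that creaky 80% actually costs.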

2) Device fragmentation: you may have seen this already, but here's a startling infographic from an Android developer: http://opensignalmap...fragmentation.php

3997 distinct devices, 599 distinct brands... but go down further and look at the screen resolution. Even weighting for frequency there are at least 14 major resolutions. And for each brand and carrier, there are different skins and UI modifications to contend with.

iOS developers have 4 screen resolutions to deal with.

Why does device fragmentation matter? For a developer, it is not possible to test your application's user interface on all of the available Android configurations. Nor is it possible to find a representative sample of available physical devices on which to test without going broke. Emulators are not a substitute for actual device testing, in case you were thinking they are, by the way.
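As a rough illustration (not any particular app's code), this is the kind of runtime interrogation your layout code ends up doing, because the screen you're drawing on could be any of those thousands of width/height/density combinations:

    import android.app.Activity;
    import android.util.DisplayMetrics;

    public class ScreenInfo {
        // Sketch only: ask the device what screen we actually got.
        public static String describe(Activity activity) {
            DisplayMetrics metrics = new DisplayMetrics();
            activity.getWindowManager().getDefaultDisplay().getMetrics(metrics);
            return metrics.widthPixels + "x" + metrics.heightPixels
                    + " at " + metrics.densityDpi + " dpi";
        }
    }

That answer then feeds into which layouts and image assets you ship, and every additional bucket is more design and test work.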

Additionally, many Android devices are amazingly slow. This is due to a combination of old Android 2.x software and slow hardware. You may argue that slow hardware is hardly Android's fault, but much of Android's popularity comes from how cheap the devices are for the end user. In fact, much ado was made recently about the "butter" inside Jelly Bean, so called because the new Android OS is buttery smooth in operation. But between the old Android OS in the wild and the slow, cheap, and popular Android hardware, writing an app that actually performs acceptably is more difficult than for other platforms.

As a result, every Android 2.x device I've tested on has displayed stuttering, choppy animations, and delays. The same app, running on an older iPod Touch or iPad (the gold standards for slowest iOS devices), runs fine. And since relatively few Android users are running on ICS (let alone Jelly Bean) or faster hardware, the lowest common denominator automatically limits the functionality a developer can get away with.
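For what it's worth, the standard defensive move on those slow 2.x devices is to keep anything remotely expensive off the UI thread. A bare-bones sketch, where loadFromNetwork is just a placeholder for real work:

    import android.os.AsyncTask;
    import android.widget.TextView;

    // Sketch only: do slow work on a background thread so animations have a
    // fighting chance on cheap 2.x hardware.
    public class LoadTask extends AsyncTask<String, Void, String> {
        private final TextView target;

        public LoadTask(TextView target) {
            this.target = target;
        }

        @Override
        protected String doInBackground(String... urls) {
            return loadFromNetwork(urls[0]);   // runs on a background thread
        }

        @Override
        protected void onPostExecute(String result) {
            target.setText(result);            // back on the UI thread
        }

        private String loadFromNetwork(String url) {
            return "loaded: " + url;           // stand-in for real I/O
        }
    }

You'd kick it off with something like new LoadTask(myTextView).execute("http://example.com"), and even with that discipline the cheapest handsets will still stutter.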

3) Distribution fragmentation: There are two major Android application stores, but dozens (or more) of carrier, manufacturer, and 3rd party stores. Every one of them has a different in-app purchasing mechanism, if it has one at all, as well as different publishing requirements.

iOS has one store.

Why does distribution fragmentation matter? Every store is a different development experience. You can see https://plus.google....posts/NLQ1fxDMWnS for my previous rant on this subject, so I won't go into details here. Suffice it to say that I have already had to develop two separate versions of my Android application to cover both Amazon and Google Play, and I still don't have any access to the Chinese market after all of that work. Additionally, setting up a store-front presence is a matter of hours for each store, hours which are very precious to me as a small, independent developer. For those of you who enjoy calling developers lazy when they complain about such things, please be generous with the Support Scott And His Family PAC donation slips near the door and I'll stop complaining.
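To sketch what those two separate store builds end up sharing and where they diverge (the interface and class names here are mine, not any store's SDK), you hide each store's purchasing mechanism behind your own abstraction and swap implementations per build:

    // Sketch only: none of this is a real store SDK call. The point is to keep
    // each store's in-app purchasing code behind one interface of your own.
    public interface BillingProvider {
        void purchase(String sku, PurchaseCallback callback);
    }

    interface PurchaseCallback {
        void onSuccess(String sku);
        void onFailure(String sku, String reason);
    }

    class GooglePlayBilling implements BillingProvider {
        public void purchase(String sku, PurchaseCallback callback) {
            // ...wrap Google Play's in-app billing here...
            callback.onSuccess(sku);
        }
    }

    class AmazonBilling implements BillingProvider {
        public void purchase(String sku, PurchaseCallback callback) {
            // ...wrap Amazon's In-App Purchasing API here...
            callback.onSuccess(sku);
        }
    }

    class Billing {
        // Each per-store build of the app wires in its own provider.
        public static BillingProvider forThisBuild(boolean amazonBuild) {
            return amazonBuild ? new AmazonBilling() : new GooglePlayBilling();
        }
    }

That keeps the mess contained to a couple of small classes, but you still build, test, and publish a separate package per store.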



So I think I've established the case that Android is a markedly more complex environment for developers than the alternatives. Why is this different than non-mobile Linux, though?

Server-side development is a completely different endeavor than mobile. I've spent roughly the last 17 years doing heavy enterprise software development, so I have some experience in this area. But there is still a good deal of fragmentation in the Linux server and workstation sphere, and it used to be much worse.

1) Multiple distributions, running multiple kernel versions. This is the equivalent of Android's OS fragmentation. In my past I've run, in a serious daily fashion, SLS, Slackware, Debian, Mythbuntu, SuSE, Fedora, Knoppix, RedHat Enterprise, both desktop and server Ubuntu, and several I've probably forgotten or didn't use much (like Gentoo). According to Wikipedia there are over 600 Linux distributions right now (Android is probably counted in that number).

2) Multiple user interfaces, existing in multiple versions with varying capabilities. KDE and Gnome are the two major ones, but with any of these the user could swap out the window manager for Blackbox, Enlightenment, Xfce, Sawfish/Sawmill, Metacity, and so on ad nauseam. Each one had its own integration points for menu and icon systems. Most people, myself included (oh Sawfish, how I miss thee), loved this aspect of Linux. It was the ultimate Swiss Army knife of user interfaces. Don't like this one? Find another! Or code your own! Yay, anarchy!

3) Multiple distribution mechanisms and packaging systems. RPM, tarball, Debian packages, you name it. There weren't quite as many packaging systems as, say, Android stores or window managers, but as an administrator you might have to do some work to get that software package that was only distributed in RPM format working on your Debian box.

Linux on the server, therefore, was at least as fragmented as Android is, if not much more so. I won't even bother comparing it to the relatively calm waters of Windows (which was no picnic) or OS X.



So then why do developers who loved the Wild West of Linux get frustrated by seemingly the same situation on Android?

1) Most importantly, people run their own servers. And within those servers there is homogeneity, as long as you and/or your system administrators are sane individuals. I run one version of Linux on my virtual slices: Ubuntu server. I have a few minor dot release differences between them, but for the most part all of my servers look very, very similar. If you are a system administrator, more than likely you are using something like Puppet, Chef, SmartFrog, or Ansible to make your servers exactly alike. If not, you should be. Bad admin, no cookie.

Not only do they run their own servers, but typically they are developing for themselves. Known server environment, known tools, homogeneous. I typically use a stack of PostgreSQL, Python and Django, and memcached. Back in my enterprise days it was Oracle, Java, and Spring. The point is, there is The Server Environment, and you develop in that environment. In most enterprises, in fact, you will get your hand rigorously slapped for attempting to step outside of that environment without discussing it with the architects and administrators, because they all have a large stake in that environment in terms of productivity, robustness, and maintenance, among other things.

By stark contrast, when you develop a mobile application, unless you are "lucky" enough to be developing for an enterprise's in-house users with a prescribed device profile, you have no control whatsoever over your operating environment. For iOS, there are only a few devices you need worry about. For Android, well, see above.

So in a server-side Linux environment, you control the fragmentation and can therefore manage it to your liking. On Android, the mobile Linux stack, there is no control and the fragmentation is free to run wild over your best-laid plans.

2) Mobile devices have markedly different capabilities than servers.

A server has 4 basic pieces of "functionality", if they can even be called that: disk, memory, network, and CPU. You write a server application to accept requests and return results, whether that be HTML, JSON, EDIFACT, FIX, or whatever. Requests come in over the wire, your application accesses data on the disk, and then uses CPU and memory to crunch the results.

By contrast, a mobile device can have accelerometers, gyroscopes, multiple types of network access including Bluetooth and near field, cameras, microphones, and on and on. The fragmentation is multiplied by this variability.
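Concretely, every one of those optional capabilities turns into a runtime question rather than an assumption. A minimal sketch:

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.os.Build;

    public class Capabilities {
        // Sketch only: ask before using optional hardware.
        public static boolean hasCamera(Context context) {
            return context.getPackageManager()
                    .hasSystemFeature(PackageManager.FEATURE_CAMERA);
        }

        public static boolean hasBluetooth(Context context) {
            return context.getPackageManager()
                    .hasSystemFeature(PackageManager.FEATURE_BLUETOOTH);
        }

        // NFC only appeared in Android 2.3 (API 9), so even the question
        // needs a version guard on older devices.
        public static boolean hasNfc(Context context) {
            return Build.VERSION.SDK_INT >= 9
                    && context.getPackageManager()
                            .hasSystemFeature(PackageManager.FEATURE_NFC);
        }
    }

A server application never has to ask whether its CPU exists.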

3) Native UI development is different than server-side development. Server applications will run on different configurations with much more resiliency than a native user interface, because things like screen resolution don't matter. Again, you have memory, CPU, disk, and network. While I'm not saying that these don't have to be budgeted as part of your software architecture (far from it!), a user interface is much more sensitive to variations because it's a visual and (on mobile) tactile experience. Humans are much more sensitive to such things than are machines, and if your borders are gapped or your icons are fuzzy due to screen differences, people are going to notice.

But wait, what about HTML, Scott? That's server-side development too! Why yes, yes it is, but you will notice that HTML developers complain endlessly (and with good reason) about the enormous variety of browser capabilities and screen resolutions out there. HTML development is the same as Android development from this fragmentary perspective, while iOS development is more like developing in-house applications on Chrome only.

4) Linux is still not a desktop system, whereas mobile is all about the UI experience. This is bound to be an unpopular opinion amongst the heavy Linux users out there, but I stand by my words. Linux is a desktop for developers only, still to this day. And while it is a great development desktop, it is far removed from the consumer mass-market desktops that run Outlook, iTunes, Photoshop, Word or Pages (don't talk to me about OpenOffice: I gave it up after trying desperately to love it for well over a decade), and so on. Yes, you can run these things in a virtual machine. No, Virginia, you aren't running Linux at that point any longer and you've proven my point for me.

5) App store ratings then put the salt in the wound. Let's say you've bitten the bullet: you've coded for both Amazon and Google Play, you've hit the major resolutions, you support everything from 2.3 up even though you couldn't make use of some important features, and off you go into Android Land to make money.

Then the 5 people running some whack, slow phone with 2.2 start posting negative reviews on your app's page because it doesn't work on their device. It won't matter at that point if you support 80% of the rest of the world. Those negative reviews can completely kill the market for your application.

Yech.



Now, for the most part, fragmentation was and is good for Linux. Innovation and creativity prospered (and still do) in such an environment. However, there was also considerable wasted effort as competing projects reinvented the wheel over and over. While competition is good up to a point, the Linux environment took it to great excess. But then what happened?

1) Enterprises do not like the fragmentation. Enterprises thrive on stability and predictability. Again, if your system administrator is using 8 different Linux distributions (with a dash of OpenBSD and BeOS thrown in for good measure), then you should fire them. They aren't doing your enterprise any favors, as anyone who has worked in or with a professional-class operations department damn well knows. As part of growing up, Linux had to consolidate to give these enterprises what they needed.

2) There are only a few main Linux distributions in enterprises today. RedHat and, well, RedHat. Some Ubuntu, but in my experience 3rd party application developers like Oracle, Progress, IBM, and others will certify on RedHat Enterprise Linux only. Why? Because it's stable and conservative, and RHEL protects those developers from the fragmentary madness of 600+ Linux distributions while still allowing them to support the platform. There's Oracle Unbreakable Linux, but underneath the covers, surprise! It's RHEL.

Developer desktops will range further afield, and in my experience you might typically see Fedora, Ubuntu, or Debian on a developer's desktop these days.

3) Many of the same issues are still present in server-side Linux. I can't tell you how many hours I've spent trying to get Oracle running on my developer workstation due to slightly mismatched library versions or incompatibilities. Not on RHEL, obviously, because it's a complete disaster as a workstation, but it's Linux, right? It should just work... except that the developers don't have the time and money to ensure that their application runs on every single Android, sorry, I mean Linux, environment out there in the wild.




In summary, Android is following the same trajectory, but the mobile vs. server differences make it more painful for development than Linux ever was. As a developer you don't control the end-user's environment, and you have to pick and choose your support profile to make development for the platform economically viable. Developing for such an environment is expensive and time-consuming, and I've seen studies that show that many, many developers simply cannot afford to develop for Android.

Which raises a related point: there are multiple stores for Android, so why develop for Amazon if it's more expensive? Why not just develop for Google Play? Why indeed: Google Play users monetize at 23% of the rate that Apple App Store customers do. Amazon users monetize at 89%. (Source: http://techcrunch.co...than-google-play/) So not only does a developer have to deal with multiple, fragmentary build targets, screen resolutions, and distribution mechanisms, but Android users simply do not pay as much for mobile applications as Apple users do. According to GigaOm, over twice as many iOS users purchase a paid-for app each month as Android users do. Note that the users of Amazon's app store are probably (warning, supposition ahead) significantly fewer than Google's at this point, so the higher monetization rate is offset by the smaller audience.

Say what you will about Apple users, call them overpaid fanbois if you wish, but again, this is a developer's perspective, and I have to feed my family. If a platform a) costs more to develop for, b) has a riskier target environment (remember app ratings), and c) pays less on average per user, why wouldn't I get frustrated with it? Less pay for more work, win win win!




Particularly if you are writing native applications, learning a new development environment to support another platform with this many negatives is a hard sell. Which leads to my final topic:

On the positive side: mitigation.

It is possible to mitigate the expense of mobile development to a certain extent by using a cross-platform development framework such as Flex (my own current choice), Titanium, or PhoneGap. These tools can help shield the developer from learning multiple native development methods, as well as lessen the impact of supporting multiple device resolutions and layouts. My Flex application ran without modification on my Kindle Fire after I developed it on an iPhone, albeit with some layout issues and without any store functionality. This is an impressive feat considering how different iOS and Android are.

However, these tools are predicated on a lowest-common-denominator feature set and by their very nature perform in a sub-optimal fashion on a mobile platform. Because mobile platforms are so different, each toolset has to either support native plugins or cover all of the important features itself, such as stores and notifications. Additionally, these tool developers are affected by the fragmentation just as much as any other developer, so while you may find a tool that supports, for example, Google Play, there's a good chance that it won't support the Amazon store as well. In my personal experience, I had to buy extensions for Flex for each of the three stores I support (Apple, Amazon, and Google Play), as well as additional extensions for some Apple-specific functionality.

PhoneGap, a tool that allows HTML5 applications to be installed and run as native applications on a mobile device, specifically has severe issues on the older Android devices because 2.x WebView is so utterly slow (http://code.google.c...s/detail?id=17352). Here you have a tool which was specifically designed to help fix the problems of mobile fragmentation, only to be brought down by the need to support these old Android devices that make up the majority of the installed base.
Regards,
-scott
Welcome to Rivendell, Mr. Anderson.
The solution seems to be...
You've laid out a compelling case.

The solution seems to be: Sell the device with your app. That's what Amazon has done, and Google seems to be moving that direction in a big way. Depending on Verizon and Samsung and the others to keep their OS up to date seems to be a guaranteed way to strangle the platform for ISVs.

Of course, Amazon is a behemoth. Tiny ISVs can't take the risk of also having a hardware inventory... But if decent up-to-date tablets are available for $50 in the near future, or if Google starts bundling apps with their tablets, then ...

Presumably in the next 10 years things will settle out a bit. Best of luck being on the bleeding edge!

Thanks.

Cheers,
Scott.
Nobody's going to want single-purpose devices
We had phones before that were just phones, and cameras that were just cameras. We're not going back to that.
--

Drew
True, but I wasn't clear what I was suggesting.
Fragmentation and the other issues that malraux outlined are serious and are holding the Android platform back. If a small ISV could say:

"This app is certified to run on the new Widget tablet running Android 4.1 (JB). You can get our app and the Widget for a special bundle price of $x."

It wouldn't be a single-purpose box, it would be a standard Android box, but an up-to-date one that the ISV could target and maybe get a piece of the hardware money as well (if tablets become cheap enough - it obviously wouldn't work especially well for smart phones that depend on phone company subsidies). The Store issues would have to be addressed as well, but if it were a standard JB box then presumably at least Google Play would work out of the box.

I agree it's not practical (at least not for the biggest players), but it is a way to attack the problem. Another way is to somehow get preloaded on new boxes (the DOS/Windows/Office model). Otherwise, the market isn't going to be very big (in a sensible sense) for a few more years (after things shake out some more), and ISVs are going to be struggling in the Android space.

My $0.02.

Cheers,
Scott.
Defeated by the market share issue
It is very unlikely that a small ISV is going to get someone to buy a phone with their Angry Birds clone, and the people who do are a very small install base.
Regards,
-scott
Welcome to Rivendell, Mr. Anderson.
IOW, Google needs to out-Apple itself. :-/
I don't have time for a big, thoughtful response to either of you, but I can see several things Google should do or should have done. Some of these are going to be Very Difficult, however.

* Curate the Google Play store. Yes, this will hurt. The standards don't need to be as tough or as opaque as Apple's (and I don't think they should be at all), but it needs to happen.
* License the "Android" name and vet devices that want to use it. Or something like that. This means that shit Android devices will be told they cannot use the name. At all. They can also use this to help coerce vendors to keep up with the OS upgrades. This is also hard because it will mean an attempt to put the "open source" genie back in the bottle. It was a great sentiment, but it's not working so well.

I'm sure you could think of other points.

Wade.
Just Add Story http://justaddstory.wordpress.com/
On your second point:
Neither of the big new Android handsets in the UK (i.e. the Samsung Galaxy S 3 and the HTC One X) makes any mention of the word "Android" or "Google" anywhere in their current advertising campaigns - at least, not in anything other than the small print.

Be under no illusion: Samsung sells Samsung phones, not Android phones made by Samsung.

It's not just about the SGS3, either; the Galaxy Note's adverts also have a conspicuous lack of Google and Android about them.
Edited by pwhysall July 18, 2012, 09:09:06 AM EDT
Hmm. I hadn't noticed.
You're right: the brands are selling brand loyalty. I've been subject to that myself.

Maybe that genie simply can't be put back in that bottle...

Wade.
Just Add Story http://justaddstory.wordpress.com/
Google is doing #2 to a certain extent
They're now threatening to withhold early access to new versions unless the manufacturer adheres to an upgrade pledge. I doubt it will work.

Google absolutely needs to do some curation, but that doesn't address the fragmentation much, if at all. There's no walled garden, so there will always be competing Android stores. What we'll probably see is Amazon and Play duking it out for the top two spots, with multiple much smaller stores from the manufacturers, carriers, and so on.

There's probably room for a 3rd party in-app purchase provider, but the apps will need to include the libraries themselves or users won't bother. Theoretically you can use Amazon's in-app purchasing for apps sold in the Google store, but only if the user downloads and installs the Amazon store app. Google Play requires Google Mobile to be installed, which is a bit deeper than just an app. If you have a Kindle Fire you have to root the device to get it on there. The vast majority of users, again, aren't going to bother.
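For illustration, before an app can even try to route purchases through Amazon's mechanism it has to check whether the Amazon store client is present at all. A rough sketch; the package name is my recollection and should be treated as an assumption, not gospel:

    import android.content.Context;
    import android.content.pm.PackageManager;

    public class StoreClientCheck {
        // Sketch only; the package name below is an assumption from memory.
        private static final String AMAZON_APPSTORE = "com.amazon.venezia";

        public static boolean isInstalled(Context context, String packageName) {
            try {
                context.getPackageManager().getPackageInfo(packageName, 0);
                return true;
            } catch (PackageManager.NameNotFoundException e) {
                return false;
            }
        }

        public static boolean canTryAmazonPurchasing(Context context) {
            return isInstalled(context, AMAZON_APPSTORE);
        }
    }

If that check fails, you're back to whatever mechanism the hosting store provides, if any.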

The Nook doesn't even have an in-app purchase API for its store, and in fact B&N will reject your app if it implements in-app purchasing or advertising.
Regards,
-scott
Welcome to Rivendell, Mr. Anderson.