January 16, 2018
This past summer, as Hurricane Irma threatened Florida, Tesla Motors provided some of their customers with a free upgrade. It was a software update to some of their vehicles to increase those cars’ battery capacity, making it less likely that their owners would be stranded as they attempted to flee the giant storm.
Not long after, in September, a wave of rumors circulated through social media, claiming that Apple intentionally slowed down old iPhones to increase sales of newer models. The rumors, now largely discredited, appear to have begun with a now-deleted blog post, which misleadingly referenced a three-year-old story in the New York Times. It has since come to light that, although Apple does reduce the speed of some older phones, the goal is not to spur sales. Rather, the goal is to better manage the reduced capacity of an older battery.
Upgrading car battery capacity and rumors about iPhone performance seem like two very different topics. On the face of it, they have little in common besides both involving high-profile technology companies. However, a deeper look reveals that they both revolve around the same core concept: antifeatures.
Antifeatures stand in contrast both to features and to bugs, the two sides of the usual software evaluation dichotomy. A feature is a functionality intended to be useful to the user. A bug is a behavior, usually the result of an error or sloppy programming, that gets in the way of the features. Antifeatures, unlike features, are not useful from the standpoint of the software user. In fact, they typically make the software worse from the standpoint of the user. But, unlike bugs, antifeatures are no accident.
In the case of the Tesla battery, in order to see the role of antifeatures, we need to note one crucial fact about how the upgrade worked. Obviously Tesla cannot send a message through the Internet to physically add to or enlarge the cells of a battery. In order to be upgraded remotely, the battery capacity and the circuitry to utilize it have to be in place already. The “upgrade” occurs when the software, which was originally written to limit the battery capacity to one level, is replaced (or “updated”) to raise this limit. In this example, the upgrade allowed some of Tesla’s less expensive vehicles to perform like more expensive models. That functionality of the software which, before the upgrade, limited the capacity of the battery, is an antifeature. And so Tesla’s upgrade was simply the removal of this antifeature.
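To make the mechanism concrete, here is a minimal sketch, in Python, of how a software-imposed capacity limit might work. All the names and numbers are hypothetical and purely illustrative, not Tesla's actual code; the point is only that the hardware capacity is fixed, and the "upgrade" just raises a configured limit.

```python
# Hypothetical illustration: the full battery is physically present,
# but software enforces a lower usable capacity. An over-the-air
# "upgrade" simply replaces that configured limit.

PHYSICAL_CAPACITY_KWH = 75.0  # capacity actually built into the pack


class BatteryController:
    def __init__(self, software_limit_kwh):
        # The antifeature: a cap below what the hardware supports.
        self.software_limit_kwh = software_limit_kwh

    def usable_capacity(self):
        # The driver never gets more than the configured limit allows.
        return min(PHYSICAL_CAPACITY_KWH, self.software_limit_kwh)

    def apply_update(self, new_limit_kwh):
        # The remote "upgrade": nothing physical changes, only the cap.
        self.software_limit_kwh = new_limit_kwh


pack = BatteryController(software_limit_kwh=60.0)
print(pack.usable_capacity())  # 60.0 before the update
pack.apply_update(new_limit_kwh=75.0)
print(pack.usable_capacity())  # 75.0 once the antifeature is removed
```

Notice that removing the antifeature requires no new functionality at all; the update deletes (or relaxes) code whose only job was to withhold capability the customer's hardware already had.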
In the case of the slow iPhones, whether this is an example of an antifeature depends on the goal of the slowness. If the relevant software functionality was indeed intended to better manage older batteries, then it is not an antifeature. If, however, the aim were to make older phones unpleasant to use (perhaps to motivate new purchases), then the functionality would be a blatant example of an antifeature.
Together, these two examples show us how antifeatures affect our lives as technology users. The Tesla battery example is striking evidence that antifeatures are present in the electronic devices we use. The rumors, revelations, and controversy about iPhone performance prove that, even when antifeatures are not present, we are wary and expect to be confronted with them. Thus, antifeatures affect both our actual interactions with technology and our thinking about it.
Various Types of Antifeatures
As the two examples above already suggest, antifeatures come in many forms across a wide array of devices. To get the concept of antifeatures a little more clearly in view, we can examine a few other well-known types of antifeatures.
Perhaps the most common type of antifeature is adware. Adware is software that presents the user with unrequested advertisements. Adware was not always as common as it is today. Two decades ago, software companies and developers typically made money by selling a copy of their software to the user. With early versions of its web browser, Opera Software was no exception. However, in 2000, Opera created a buzz when it released the newest version of its browser in two editions, a paid edition (which cost $39) and a “free” ad-supported edition, which displayed ads at the top of the screen while users browsed the web in the main panel below. Within a decade, especially with the rise of ads in mobile apps, adware became one of the dominant software monetization strategies, as users of games, weather forecast widgets, and social media apps will surely attest.
Adware counts as an antifeature because the ad-delivery functionality was not added to the software to meet the users’ needs or desires. Indeed, most users will gladly turn off advertisements if given the option (and some users go to great lengths to block advertisements). Most users who put up with adware do so because they consider the other functionality provided by the software to be worth the annoyance of ads.
Closely related to adware is spyware or tracking software, which monitors and reports on how people interact with their electronic devices. Advertising is not simply about getting the right message out there; it is about getting that message to the right people. Hence, businesses make careful use of targeted advertising, and one straightforward targeting strategy is to discover users’ interests, as manifested by how they use their devices. Hence, adware works better when coupled with tracking.
Unsurprisingly, then, tracking counts as an antifeature for the same reason adware does. It is not functionality users prefer in their software, and most would get rid of it if they could (assuming they knew it was there). So, from the user’s standpoint, tracking is not a feature. But tracking is not a bug either, since it is implemented intentionally, not accidentally, by the software developers.
That said, sometimes some of us want our behavior to be tracked. Accordingly, to the extent that tracking makes the software more useful to the person — as in software that logs jogging routes or scrobbles music listening — the tracking functionality is a genuine feature, not an antifeature at all.
Another infamous type of antifeature is known by its advocates as Digital Rights Management and by its critics as Digital Restrictions Management (DRM, either way). DRM technologies are designed to restrict what users can do with copyrighted digital content. To see the purpose and function of DRM, we can compare CDs, which have no DRM, to DVDs, which do. When an audio CD is played in an ordinary CD player, a laser reads the digital information from the CD, and a digital-to-analog converter produces an audio signal which can be amplified for headphones or speakers. But besides translating the digital information directly to analog, there is another option: that digital information can be copied to another medium. Hence, “ripping” a CD is quite simple, which is one reason the practice became widespread during the late 1990s.
When the movie industry and hardware makers were designing the DVD format for digital video, they wanted to prevent people from making illegal copies. In particular, they wanted to prevent people from ripping DVDs the way people can rip CDs. Hence, DVDs were designed to accommodate a kind of DRM, which is called the Content Scramble System (CSS). On DVDs that use CSS, the content is encrypted and can be read only by devices that contain one of the valid decryption keys. Major hardware manufacturers and software makers were given decryption keys so that they could create DVD players and software applications that could deal with CSS. All these hardware and software providers agreed to design their systems so that the video content could only be played, not copied or transmitted. Now, as it actually happened, hackers quickly defeated CSS. Nonetheless, this DRM system has kept many users from doing things like making backups of their movies or copying movies to a laptop to watch while traveling. Thus, CSS, though it serves the interests of copyright holders, counts as an antifeature because it makes the technology less useful to the end-user.
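The basic shape of such a scheme can be sketched in a few lines of Python. This is a toy illustration only: the XOR scrambling below stands in for CSS’s real (and far more elaborate) cipher, and the key values are invented. What it captures is the gating logic: content is scrambled, and only players holding a licensed key can recover it.

```python
# Toy illustration of key-gated playback (not the real CSS cipher).
# Content is scrambled with a key; only licensed players can descramble.

LICENSED_KEYS = {0x2A}  # hypothetical keys issued to approved player makers


def scramble(data: bytes, key: int) -> bytes:
    # Simple XOR stands in for the actual encryption scheme.
    return bytes(b ^ key for b in data)


def play(disc: bytes, player_key: int) -> str:
    if player_key not in LICENSED_KEYS:
        return "<unreadable>"  # unapproved device: no playback at all
    # Licensed players descramble, but are designed only to play,
    # never to copy or retransmit, the recovered content.
    return scramble(disc, player_key).decode()


disc = scramble(b"feature film", 0x2A)
print(play(disc, 0x2A))  # licensed player: feature film
print(play(disc, 0x17))  # unlicensed player: <unreadable>
```

The restriction on the user, note, lives not in the cipher itself but in the agreement that licensed players refuse to copy what they can decrypt; that refusal is the antifeature.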
The use of DRM to secure video has become more sophisticated since the heyday of DVDs. Blu-ray includes several kinds of DRM, and video services such as Netflix use DRM to prevent users from saving and copying the videos they stream. Earlier this year, DRM became an entrenched part of our larger computing and information ecosystem when, after contentious debate, DRM was integrated into the standards for the web. Just like CSS is an antifeature, the software implementations of these other varieties of DRM constitute antifeatures as well.
Definition and Significance of Antifeatures
Already implicit in these examples are the outlines of the concept of an antifeature. We can now give an explicit definition: An antifeature is some functionality that (1) is intentionally implemented, (2) is not intended to benefit the user, and (3) makes the product worse, from the standpoint of the intended user.
This is a broad definition, intended to cover the wide variety of antifeatures. In contrast, writers at the Free Software Foundation have defined antifeatures more narrowly, identifying them as any functionality that developers will charge a fee to remove or omit. According to this definition, Tesla’s limitation of the usable capacity of its car batteries would indeed be a paradigmatic antifeature, since Tesla charges users for the removal of this limitation. For similar reasons, the narrow definition also covers Opera’s original adware browser. However, there are some antifeatures which developers do not allow users to avoid by paying a fee. For example, users cannot typically pay a fee to disable DRM.
Although it is important not to define the class of antifeatures too narrowly, it is equally important not to define them too broadly. For instance, if we were simply to identify antifeatures as functionality that is undesirable to users, then our definition would cover too much. After all, just because some users (or even most users) dislike a particular functionality, it still might be that it was implemented to address the needs or interests of other users. Looking beyond software for a moment, we can find many safety features that fit this profile. For example, users may dislike the safety cable that shuts off a lawnmower when the handle is released, but that does not entail that the safety cable is an antifeature. Hence, antifeatures must be defined partly in terms of the intentions of the developer. Functionality intended to directly serve users does not count as an antifeature.
This issue about the intentions of the developers is key. If users believe that developers’ decisions about software design were guided by the users’ desires and interests, then those users may place their trust in the software. Of course, whether users trust software depends on many further social and technical factors as well. But the crucial point is that trust will be elusive as long as users believe that the software was designed to function in ways that are not for their benefit. Hence, if users are accustomed to encountering antifeatures and expect their software will have antifeatures, then this limits how much they can trust this software. In short, if users expect antifeatures, then they will always be on guard.
On that note, we can return to our earlier example of the slowness of old iPhones. Let us continue to assume that Apple does not decrease the performance of earlier products just to encourage people to buy new ones. Nonetheless, it is a fact that users are prone to suspicion. Indeed, the suspicion in this case was strong enough to prompt researchers to search for evidence of an antifeature. This shows that, even with popular and highly regarded products like the iPhone, users’ trust is fragile. And the more users actually encounter antifeatures in their technology, the more they will suspect their technology includes antifeatures, whether it does or not. This would be an unfortunate consequence for users and developers alike.
Owen King is the NEWEL Postdoctoral Researcher in Ethics, Well-Being, and Data Science in the Department of Philosophy at the University of Twente. His research is primarily focused on well-being, from both theoretical and practical perspectives. He also investigates ethical issues raised by new computing and data technologies.