This is a list of technologies which in some way (perhaps unintentionally) marginalize some users based on gender, race, class, sexual orientation, disability, etc. It provides motivation for having a diverse group of people participate in creating technologies. 

General issues

  • Much software has problems with accessibility, making it unusable for people with visual impairment or other disabilities.
  • A large amount of software is only available in English, and software that has been translated is often available in only a few other languages. Even in high-volume products, existing translations may be faulty to the point of obfuscation.
  • Some software restricts text entry to, e.g., the Latin character set, and much of it supports only a very limited subset even of that.
  • Some software relies on artefacts of the Latin character set for key features (e.g., relying on spaces between words; not all written languages segment words with a separator character at all).
  • Much software is only available for comparatively recent (and thus expensive) hardware and/or platforms.
  • Use of most software assumes at least a moderate standard of literacy.
  • Developing software is even more likely to require knowledge of English or at least command of the Latin character set.
  • Software that uses icons instead of words may still assume a great deal of cultural literacy, usually with dominant Western European and North American cultures and their symbolism. In any event, unless verbal equivalents are given, such designs are inaccessible to blind users and others who rely on screen readers.
  • Much software makes very restrictive assumptions about people's names, some common examples being that everyone has exactly two "important" parts to their name, that everyone's name is written in Latin characters, or that names do not contain "special characters" like hyphens. See for example Who is harmed by a "Real Names" policy? and Falsehoods Programmers Believe About Names; a minimal sketch of such a name validator appears after this list.
  • Much software relies on United States-specific addressing information, such as requiring ZIP codes and selection of a US state of residence. (Even "postal code", which is sometimes used as a substitute, isn't universal. See Falsehoods programmers believe about addresses.) Websites that only accept US credit cards or only ship to US addresses often do not make this plain before the customer has invested considerable energy in assembling a purchase.
  • Increasing amounts of software require either always-on or regularly-on Internet connections, which is either infeasible or enormously expensive in many parts of the world.
  • Increasing amounts of software assume that streaming tens, hundreds, or even thousands of megabytes of data is both fast and so harmless that no sign of it needs to be shown to the user. In much of the world this is not true, and even in some wealthy countries data is too expensive for this.
  • The terms of use for much software and many websites are based on the US legal system and require things such as resolution of disputes in a US court.
  • Many pieces of hardware are designed for average-sized men in Western countries.
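
As an illustration of the name and character-set assumptions above, here is a minimal sketch of the kind of validation logic many registration forms use. The pattern and function below are invented for illustration, not taken from any real product:

    import re

    # A naive validator of the kind described above: it assumes a name has
    # exactly two parts, uses only unaccented Latin letters, and contains
    # no "special characters" such as hyphens or apostrophes.
    NAME_PATTERN = re.compile(r"[A-Za-z]+ [A-Za-z]+")

    def is_valid_name(name: str) -> bool:
        return NAME_PATTERN.fullmatch(name) is not None

    assert is_valid_name("Ada Lovelace")                      # accepted

    # All of these are rejected, although each is a real name form:
    assert not is_valid_name("毛泽东")                         # non-Latin script
    assert not is_valid_name("Björk")                          # mononym with an accented letter
    assert not is_valid_name("Mary-Jane O'Brien")              # hyphen and apostrophe
    assert not is_valid_name("María José García Rodríguez")    # more than two parts

Every rejected string is an ordinary name somewhere in the world; the fault lies in the pattern's assumptions, not in the names.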

Specific examples

Design for certain physical attributes

  • The HP MediaSmart webcam fails to track darker-skinned faces. HP says that it's "built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose." A sketch of how such a contrast check can fail appears after this list.
  • Some body-tracking technologies make assumptions about body shape that don't apply to all bodies. For example, Kinect for Xbox does not recognize people under 40 inches tall. Leap Motion devices distinguish between "tools" and "fingers", where a tool is "thinner, straighter, and longer than a typical finger".
  • Many computer keyboards are designed for people with larger hands, putting people with smaller hands (who are more likely to be women than men) at higher risk for repetitive strain injuries. Likewise, office chairs and desks are often designed for taller people.
  • Most computer keyboards are designed for a specific shoulder width. Those with narrower or wider shoulders are put at higher risk for repetitive strain injuries.
  • Ergonomic assistance programs sometimes fail to account for body differences which do not permit certain postures, and treat any deviation from the ideal norm as a failure of equipment or a user error. For example, a person who physically cannot fully straighten their back or keep their arms at a strict 90 degree angle to their body, might be deemed to be out of compliance with the ergonomic policy.
  • The size and shape of mobile phones presume the user's hands are a certain size. Users with smaller hands will find it difficult or impossible to use the phone one-handed. On average, women have smaller hands than men, and people from developing countries have smaller hands than those from wealthier ones. As mobile phones (especially top-of-the-line models) grow larger with every new generation, the problem continues to worsen. See Zeynep Tufekci, "It's a Man's Phone".
  • Mobile devices that use a capacitive touch screen (including the iPhone and T-Mobile G1) require that the user touch the device with a fingertip or an electrically conductive stylus (sold separately), rather than a fingernail, non-conductive stylus, or other prosthetic. This causes difficulty for people who have long fingernails (mostly women), people whose fingers are bent with arthritis (mostly older people), and people with any other disability that makes it difficult or impossible to touch a fingertip to the screen.
  • Apple's Health app (built in to iOS 8) normalizes the idea of keeping detailed weight and calorie-intake records, which may be triggering for people with eating disorders and related obsessive-compulsive behaviors.
  • Fitbit's Android app requires that the user enter their weight at the time of setup, even if the user only intends to track sleep or steps.
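
HP's description above suggests a fixed contrast threshold between eye and cheek regions. The sketch below shows how any such threshold, tuned on lighter-skinned faces, can silently fail on darker-skinned faces under the same lighting. The regions, values, and threshold are invented for illustration and are not HP's actual parameters:

    # Hypothetical contrast-based face check, loosely following HP's
    # public description. All numbers here are made up for illustration.

    def mean_brightness(pixels):
        """Average brightness of a region (0-255 grayscale values)."""
        return sum(pixels) / len(pixels)

    def face_detected(eye_region, cheek_region, threshold=40):
        """Report a face only if eye/cheek contrast exceeds the threshold."""
        contrast = abs(mean_brightness(cheek_region) - mean_brightness(eye_region))
        return contrast >= threshold

    # Lighter skin, dark eyes: contrast = 180 - 60 = 120, so detected.
    print(face_detected(eye_region=[60] * 100, cheek_region=[180] * 100))  # True

    # Darker skin, same eyes and lighting: contrast = 90 - 60 = 30, so missed.
    print(face_detected(eye_region=[60] * 100, cheek_region=[90] * 100))   # False

The failure is silent: the code behaves correctly for the faces it was tuned on, which is exactly how a homogeneous test population lets a bias ship unnoticed.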

Sexual and gender exclusivity

  • A large proportion of website registration forms ask for the user's gender but provide only the options "female" and "male".
    • Some forms provide the third option "other", which many genderqueer people find just as marginalizing.
  • Facebook, and perhaps other social networking sites, don't allow users who are in open relationships to list themselves as "in a relationship" with more than one person.
  • Computer games with romance/sex plotlines or mechanics often have restrictions on relationships:
    • no, or fewer, options for same-gender relationships
    • not allowing consensual or publicly known relationships with more than one partner.
  • Apple's Health app (built in to iOS 8) does not provide options for tracking menstruation.

Assumptions about technology access

  • Increasing numbers of computer games and gaming systems perform an authenticity check with a central server once a day or more, requiring that the player have an Internet connection reliable enough for the daily phone-home check. Some games sold for home systems even store crucial game data on central servers, meaning the game cannot be played at all without a reliable high-bandwidth connection.
  • All "cloud" services rely to some extent, often a large one, on their users having continual broadband-quality connectivity.
  • Most wiki software (including MediaWiki, which underlies both Wikipedia and this wiki) relies on Internet connectivity at the moment of editing.

Marginalising language and symbols

  • One of the X11, HTML, and CSS color names is "Indian Red".
  • The name of GIMP (GNU Image Manipulation Program) is a derogatory term for people with disabilities.
  • The emoji people in the Unicode Standard are not very diverse. "Emoji users have been clamoring for years for a more diverse palette for the people characters, one that goes beyond the small, vaguely stereotypical subset of man-characters like Man With Turban and Man With Gua Pi Mao." A draft proposal has been published at the Unicode Consortium to add diversity in skin tones by combining color swatches with existing emoji; a sketch of the combining mechanism appears after this list. This is a step toward diversity, but many groups of people are still excluded.
  • Cartoon yellow as a default skin tone is often claimed to be neutral as it does not represent a specific skin tone commonly found on real people, but it still represents lighter skin tones with similar color values.
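
The proposed mechanism works by appending a skin tone modifier code point (U+1F3FB through U+1F3FF, based on the Fitzpatrick scale) after a base emoji: a renderer that understands the sequence shows a single recolored glyph, while an older renderer falls back to the base emoji followed by a color swatch. A minimal sketch:

    # Each skin tone modifier is a separate code point appended after a
    # base emoji; the pair is meant to render as one recolored glyph.

    BASE = "\U0001F44B"  # WAVING HAND SIGN

    SKIN_TONES = {
        "light":        "\U0001F3FB",
        "medium-light": "\U0001F3FC",
        "medium":       "\U0001F3FD",
        "medium-dark":  "\U0001F3FE",
        "dark":         "\U0001F3FF",
    }

    for label, modifier in SKIN_TONES.items():
        print(label, BASE + modifier)

Note that this mechanism only varies skin tone; attributes such as gender presentation, hair, and disability are untouched by it, which is one sense in which many groups remain excluded.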

User interface difficulties

  • Smartphone software varies far more than physical phones did in how people make a call, a basic function. This requires people to re-learn how to make a telephone call with each new phone they get.
  • DVD and Blu-ray players present an entirely different menu system for every disc they play. This makes it almost impossible for a user to insert a disc, follow a fixed, learned series of steps, and see the film or show they wanted to see.
  • Facebook frequently changes its user interface, including for critical tasks such as maintaining privacy. This means that users have to re-learn the steps to accomplish tasks every few months. 

Marginalising decisions made by computers

  • Some online retailers offer different prices based on the city or neighborhood from which a customer is accessing the store. In some cases, this reinforces economic inequality. For example, Staples appeared to offer lower prices to customers within about 20 miles of an OfficeMax or Office Depot store, which meant that lower prices tended to be offered to wealthier areas.
  • In searches for a personal name, Google AdSense displays different ads for names assigned primarily to black or white babies. Names assigned primarily to black babies were more likely to generate ads suggesting that the person has an arrest record ("Trevon Jones, Arrested?...").
  • In some mapping software, Indian reservations are unmarked when viewing the map at the state level. This design decision "impacts the way we understand (or don’t understand) the geographic and social reality of this country." 
  • "Automated decision-making systems like predictive policing or remote welfare eligibility... have become primary decision-makers in public policy". When they make wrong decisions, it's difficult to understand why because "policy algorithms are generally considered corporate intellectual property or are kept under wraps to keep users from developing ways to game the system". (This is a business decision but also a design decision, since the algorithms are designed around security through obscurity.) 
  • Digital redlining: When choosing the audience for ads on Facebook, some financial companies use "strong indicators of users’ socioeconomic status—where they attend school, where they work, who their friends are, and more" to discriminate against certain groups. "The authors of this article saw this firsthand when one of us (Astra) opened a second Facebook account to communicate with Corinthian students: Her newsfeed was overrun by [loan scams]—in stark contrast to the ads for financial services, such as PayPal and American Express, that she normally gets." 

Other

  • A lot of indexing software, and systems that use folksonomy tags, rely on the space character to separate the words, keywords, or tags associated with a piece of writing. Since not all writing systems segment words with spaces, this doesn't support all written languages; see the sketch at the end of this list.
  • Some image editing software provides a filter for skin whitening. In at least one example (a GIMP plugin), the filter is classified under a menu item called "Beautify".
  • Some people with dyslexia find it more difficult than non-dyslexics to program with a plain-text editor. (On the other hand, they may have no trouble programming with an editor that provides visual cues such as syntax highlighting.)
  • Wikipedia provides Simple English as a language/translation separate from "regular" English, rather than linking the two more closely together. When visiting an English Wikipedia page, the Simple English link is easy to miss since it's buried among all the other translations.
  • Web apps such as YouTube, Facebook, and Amazon collect large amounts of content from users. The moderation these companies choose to perform on that content can't be fully automated. The ability to collect and moderate it rests on the assumption that the system can rely on human workers ("data janitors") who are paid low wages, sometimes less than the legal minimum wage.
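
To make the space-separation problem concrete, here is a minimal sketch of the naive tokenization that much indexing and tagging software performs. The Japanese phrase is just one example; the same applies to Chinese, Thai, and other scripts written without word separators:

    def tokenize(text):
        # Assumes every word boundary is marked by a space character.
        return text.split()

    print(tokenize("free and open source software"))
    # ['free', 'and', 'open', 'source', 'software']

    # Japanese is written without spaces between words, so the whole
    # phrase comes back as a single unusable "keyword":
    print(tokenize("自由なオープンソースソフトウェア"))
    # ['自由なオープンソースソフトウェア']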