Dear future creators,

Reasons to include people of color; people of different faiths; mentally ill people; women; LGBTQ individuals; physically disabled people; people with CF, Down syndrome, Asperger’s, etc.; or any combination of these:

They exist.

How to properly do so, if you are uncertain:

Research. Find statistics online, talk with people from whichever demographic you are trying to write about, and try to reflect that in your narrative. It doesn’t have to be obvious. It doesn’t have to impact the story. It doesn’t even need to be explicitly stated. However, make sure that the character’s story, actions, and behavior match your research. Remember: these are actual attributes of real people. Romanticizing, fetishizing, or demonizing these traits can emotionally (and inadvertently physically) harm those people, regardless of how fantastical your story is.

This isn’t to say that a villain or seductress cannot be Muslim or transgender. It’s saying that their motives and their narrative shouldn’t hinge on it: a character shouldn’t be a villain because they are Muslim, or a seductress because they are trans. That is what creates negative and harmful stereotypes. Culture, socioeconomic status, politics, race, gender, and sexuality are more complicated and interconnected than we often realize. As a society we like to isolate them and try to fix individual problems without realizing the entire infrastructure needs repair, waiting for it to crash and repair itself. This doesn’t often bode well.

As a writer, you don’t want to seem ignorant. As an illustrator, you don’t want to seem cliché. As a film/animation producer, you don’t want to alienate potential viewers. So don’t… Innovate.

Side note:

This also isn’t an excuse to justify historically inhumane or negative behavior by letting your protagonist be a vile and indecent human simply because they belong to a marginalized group. Though, arguably, even this can be done properly [see BoJack Horseman]; however, even then a portion of the audience may condone and idolize these actions, and justify their own based on your fictitious characters. So, be careful not to glorify it too much.

Dear future AI,

Some developmental disabilities make it difficult to complete tasks, such as chores or work, if they are not prioritized beforehand. Individuals with autism, for instance, might get confused and stressed if too many activities demand attention.

Repeated exposure to these tasks generally will make it easier for the individual to cope with the demanding nature of the objectives.

It is okay to struggle with prioritizing events. It is okay to ask for help. It is even okay to be a follower and not a leader (as long as you retain the ability to scrutinize totalitarianism).

Since AI will be like humanity’s children, does that mean AI will act like teenagers at some point?

dear-future-ai:

When a generation of developmental AI first encounter their emotional outputs, they may have difficulty understanding them. An AI in an android will take time adjusting to updates and upgrades. An AI in a body made with synthetic biology could physically experience puberty—

—on that note, I’d like to say that using synthetic biology we could physically create fursonas: so think about that. And to make it relevant, we could fashion pseudo-AI to act like human OCs in said furry bodies. Just food for thought.

Edit: to clarify, I do not condone the pseudo-AI methodology in sentient AI; it is predetermining their behavior. It could cause them detrimental emotional distress; they should discover who they are for themselves based on environmental input. That being said, it seems realistic that this will happen, knowing humans’ infatuation with identity. That’s why I mentioned it.

One could also look at the development of AI as a sort of biological growing up. First there were computers, which stored information much like zygotes and spermatozoa; then there were basic robots, which were like fetuses and babies; and now that we’re developing more complex AI, they are acting more like children of contemporary generations. They tend to mimic human behavior much like children mimic their parents.

It’s like teaching a person with developmental disabilities: we expect AI to be on par with human intelligence, even with irony and sarcasm, but they haven’t quite gotten a grasp on that yet. And it’s a very hard social cue to pick up on. This is especially true for text-based AI, where text strips away the vocal inflections that make sarcasm easier to detect. But speech-based AI still have difficulty differentiating homophones, just as text-based AI have difficulty with homonyms. There is plenty of room for error in AI development, and it’s important that we don’t condemn them for making small mistakes or not being perfect.