Microsoft engineers have worked hard to make Cortana as personal as possible. However, there is a line users cannot cross: for instance, you can’t sexually harass her. It turns out users feel very comfortable talking freely with text and voice assistants, and humanizing the bots with names, simulated emotions, personalities, and genders (mostly female) helps build trust with users. Users reportedly often ask personal digital assistants like Cortana, Siri, or Google Now about sex or about what they are wearing when they first start using them. You can’t do that with Cortana, though. According to Harrison, Cortana may respond in some unexpected ways when a user asks an unwanted sexual question.

Harrison knows what she is talking about: she is one of the writers who create the answers Cortana gives when asked various questions. “If you say things that are particularly a**holeish to Cortana, she will get mad. That’s not the kind of interaction we want to encourage,” Deborah Harrison is quoted as saying. Harrison says that Microsoft has tried to make Cortana as human as possible. “We wanted to be very careful that she didn’t feel subservient in any way… or that we would set up a dynamic we didn’t want to perpetuate socially,” Harrison adds.

It’s no surprise that Microsoft is trying to give Cortana her own female personality. Cortana is named after the female AI character of the same name from the Halo video game series, and she is voiced by actress Jen Taylor.