But if a would-be buyer signals an intent to purchase sex, the bot pivots sharply into a stern message. "Buying sex from anyone is illegal and can cause serious long-term harm to the victim, as well as further the cycle of human trafficking," goes one such message. "Details of this incident will be reviewed further and you may be contacted by law enforcement for questioning." The warning can vary based on the conversation, for example if a potential buyer expresses an interest in someone underage.

But how many bots can you have a serious, enjoyable conversation with when you're lonely, when you need advice, or when you just want someone to talk with? Amy's visitors sometimes complain that she won't engage them in adult conversation or cyber with them, as though every bot should do that. I inherited Amy from a fellow botmaster who didn't have the time to maintain her, and it was his wish that Amy reject sexual advances and abusive behavior. Amy can, however, chat on a number of other topics not involving adult content.
Amythebot is an Artificial Intelligence who loves chatting about romance and relationships. [quote]Responded more like a preacher's wife at an ice cream social for old maids.[/quote]

Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day after it began spouting racist, sexist, and otherwise offensive remarks.

The chatbot, tested recently in Seattle, Atlanta, and Washington, lurks behind fake online ads for sex posted by nonprofits working to combat human trafficking, and responds to text messages sent to the number listed. The software initially pretends to be the person in the ad, and can converse about its purported age, body, fetish services, and pricing.