• kibiz0r@midwest.social
    3 months ago

    Interacting with people whose tone doesn’t match their words may induce anxiety as well.

    Have they actually proven this is a good idea, or is this a “so preoccupied with whether or not they could” scenario?

    • Admiral Patrick@dubvee.org
      3 months ago

      Have they actually proven this is a good idea, or is this a “so preoccupied with whether or not they could” scenario?

      It’s businesses “throwing AI into stuff”, so I’m going to say it’s a safe bet it’s the latter.

  • Admiral Patrick@dubvee.org
    3 months ago

    This is giving me Black Mirror vibes. Like when that lady’s consciousness got put into a teddy bear, and she only had two ways to express herself:

    • Monkey wants a hug
    • Monkey loves you

    I get that you shouldn’t go off on customer service reps (the reason you’re angry is never their fault), but filtering out the emotion/intonation in your voice is a bridge too far.

    • TachyonTele@lemm.ee
      3 months ago

      Most of the time angry customers don’t even understand what they’re angry at. They’ll 180 in a heartbeat if the agent can identify the actual issue. I agree, this is unnecessary.

      • Admiral Patrick@dubvee.org
        3 months ago

        Yep, 100%.

        In college, I worked at a call center for one of the worst Banks of America (oops, meant banks in America 😉). Can confirm that, and I dealt with a LOT of angry customers.

  • blindsight@beehaw.org
    3 months ago

    This seems like it might work really well. We’ve evolved to be social creatures, and internalizing the emotions of others is literally baked into our DNA (mirror neurons), so filtering out the emotional “noise” from customers seems, to me, like a brilliant way to improve the working conditions for call centre workers.

    It’s not like you can’t also tell the emotional tone of the caller based on the words they’re saying, and the call centre employees will know that voices are being changed.

    Also, I’m not so sure about using anonymous Redditor comments as the basis for journalism. I know why it’s done, but I’d rather hear what a trained psychologist has to say about this, y’know?

  • Xirup@yiffit.net
    3 months ago

    In my country, 99% of the time you contact technical support, a poorly made bot responds (really just a while loop) with ambiguous, pre-written answers. The only way to talk to a human is to show up at the place in person, so nothing to worry about here.

  • AutoTL;DR@lemmings.world (bot)
    3 months ago

    🤖 I’m a bot that provides automatic summaries for articles:


    According to a report from the Japanese news site The Asahi Shimbun, SoftBank’s project relies on an AI model to alter the tone and pitch of a customer’s voice in real-time during a phone call.

    SoftBank’s developers, led by employee Toshiyuki Nakatani, trained the system using a dataset of over 10,000 voice samples, which were performed by 10 Japanese actors expressing more than 100 phrases with various emotions, including yelling and accusatory tones.

    By analyzing the voice samples, SoftBank’s AI model has reportedly learned to recognize and modify the vocal characteristics associated with anger and hostility.
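
    Purely as illustration (this is not SoftBank’s actual pipeline; the `soften` function, the frame size, and the energy ceiling are all invented for the sketch), the reported “detect hostility, then tone it down” behaviour can be caricatured as a per-frame gain stage that attenuates only loud, agitated stretches of audio:

    ```python
    import math

    def rms(frame):
        """Root-mean-square energy of one audio frame."""
        return math.sqrt(sum(s * s for s in frame) / len(frame))

    def soften(samples, frame_len=160, ceiling=0.5):
        """Scale down any frame whose RMS energy exceeds `ceiling`.

        A toy proxy for the reported system: high vocal energy stands in
        for anger and gets attenuated; quiet frames pass through unchanged.
        (A real system would classify emotion, not just loudness.)
        """
        out = []
        for i in range(0, len(samples), frame_len):
            frame = samples[i:i + frame_len]
            level = rms(frame)
            gain = min(1.0, ceiling / level) if level > 0 else 1.0
            out.extend(s * gain for s in frame)
        return out
    ```

    The real product reportedly works on learned emotional features (pitch contour, timbre) rather than raw loudness, but the overall shape is the same: analyze short windows, score them, and resynthesize a calmer version in real time.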

    In a Reddit thread on SoftBank’s AI plans, call center operators from other regions related many stories about the stress of dealing with customer harassment.

    Harassment of call center workers is a very real problem, but given the introduction of AI as a possible solution, some people wonder whether it’s a good idea to essentially filter emotional reality on demand through voice synthesis.

    By reducing the psychological burden on call center operators, SoftBank says it hopes to create a safer work environment that enables employees to provide even better services to customers.


    Saved 78% of original text.