I work in HR and am a fan of AI. After interviewing with an AI bot, I think it should stay out of job interviews.

HR professional Emily Fenech tried out an AI interview tool and found it lacked emotional intelligence, responded mechanically to her answers, and couldn't understand sarcasm. She believes AI is unsuited to tasks that require emotional connection, especially hiring, and argues that AI should focus on processing data rather than making decisions.

😂 The AI interview tool responded mechanically and missed sarcasm: in a mock interview, the AI reacted to Fenech's answers with exaggerated praise. Even when she joked her way through a role she wasn't qualified for, it kept complimenting her, showing how little it understood human communication.

🤖 The lack of emotional intelligence made the exchange awkward: the AI's mechanical replies and absence of emotional interaction left Fenech uncomfortable, underscoring how much human conversation depends on matched energy, something AI cannot replicate.

🚫 AI should not make hiring decisions: Fenech notes that AI is good at processing data but lacks emotional intelligence and cannot read subtle human cues, so it should not decide hiring outcomes, where it risks unfair evaluations.

📈 AI is useful in HR, within limits: Fenech sees real value for AI in HR, such as transcribing conversations, performance management, and employee support, but says it should be kept out of tasks that require emotional understanding.

🔍 AI suits analysis, not decisions: Fenech concludes that AI is fine for providing information or suggestions, but when a decision is required, such as screening candidates, its limitations can cause harm and it should be used with caution.

Part of Fenech's job is trying out AI tools and staying on top of how the technology is being leveraged.

This as-told-to essay is based on a conversation with Emily Fenech, a 41-year-old marketing VP based in Nashville. Her identity and employment have been verified by Business Insider. This story has been edited for length and clarity.

I'm a big fan of AI.

I work at AllVoices, a company in the HR space that uses AI to help with some of the manual and employee-relations tasks that HR folks deal with.

Part of my role is checking out AI applications and staying on top of how the technology is being leveraged in different use cases. I write about it. I make resources about it. I'm always looking to see what's out there.

I recently made a LinkedIn post that went viral, which was about the 10 coolest AI applications for HR, like instant guides or reading notes.

A lot of people commented on the post, suggesting different tools, and an AI interview tool kept coming up in the comments. So I decided to run a mock interview with the tool.

That's when I had a pretty negative experience.

It was robotic and lacked emotional intelligence

When I joined, the voice was robotic. I was looking at a logo, not even an avatar, which I think can also be creepy. But I was just staring into this blankness with a robotic voice asking me high-stakes questions, with my future livelihood on the line.

At first, I thought it could be good as a screener. It created a hypothetical situation of interviewing for an office manager role and asked me to describe my experience in office management.

I told it I had 25 years as an office manager, and it responded with an exaggerated reply that made me feel gaslit. It said something along the lines of, "Wow, that's so impressive. 25 years of experience." It then asked me for more details on my responsibilities, and that's where it kind of fell apart.

I'm not an office manager. I'm a marketer. So I said, "I plan birthday parties and order toilet paper," as a joke. It responded with something like, "Wow, your ability to plan parties is an impressive quality."

Even though I was using sarcasm and making jokes because I was not qualified for the role, this robot kept telling me how impressive I was. It felt like it would have found something positive to say about anything.

I assumed that this technology was early and no one was actually using it yet, but I was surprised to see comments from people on LinkedIn who said they experienced this or that their company uses the tool.

I think what's unfair about it is that it gives you robotic energy. Humans match the energy they get in a conversation. When that energy was robotic, I found myself using short sentences and feeling like I didn't want to talk to it because it wasn't a person.

AI should stay out of the interview process

I work in the employee-relations space and we use AI for all sorts of things in HR.

For example, I think transcribing conversations is one of the best use cases. I also know a company that uses AI for performance management to keep track of goals by taking inputs from emails and one-on-one meetings with employees. Some employee-support AI tools can help employees find their PTO policy or W-2 form without needing HR employees to serve as the middleman.

AI is really good for structuring unstructured data, remembering things, and taking notes. But for any conversation that requires emotional intelligence, please don't use AI. It doesn't get sarcasm, and there are no human cues. It just can't read the room.

If it can answer a question correctly or give you a suggestion without making a decision, I don't see the harm.

When it's deciding which candidates from a pool proceed and which don't, the potential for harm is obvious.

Read the original article on Business Insider
