TechCrunch News · January 25
The AI industry’s pace has researchers stressed

 

AI researchers may look enviable from the outside, but in reality they face enormous pressure. Fierce industry competition and heavy workloads are taking a toll on mental health. The speed-above-all ethos means research quickly becomes obsolete, and academic exchange has been sacrificed along the way. Some have proposed ways to improve things.

🥱 Competition in the AI industry is fierce, and researchers are under pressure that harms their physical and mental health

🚀 With speed above all, research quickly becomes obsolete and the value of the work is called into question

💔 Academic exchange has been sacrificed in favor of commercialization, contrary to what the industry once promised

💡 Some have proposed ways to improve the work environment, such as more open exchange

To outside observers, AI researchers are in an enviable position. They’re sought after by tech giants. They’re taking home eye-popping salaries. And they’re in the hottest industry of the moment.

But all this comes with intense pressure.

More than half a dozen researchers TechCrunch spoke with, some of whom requested anonymity for fear of reprisals, said the AI industry’s breakneck pace has taken a toll on their mental health. Fierce competition between AI labs has fostered an isolating atmosphere, they say, while the rising stakes have ratcheted up stress levels.

“Everything has changed virtually overnight,” one researcher told me, “with our work — both positive and negative results — having huge impacts as measured by things like product exposure, and financial consequences.”

Just this past December, OpenAI hosted 12 livestreams during which it announced over a dozen new tools, models, and services. Google responded with tools, models, and services of its own in a dizzying array of press releases, social media posts, and blogs. The back-and-forth between the two tech giants was remarkable for its speed — speed that researchers say comes at a steep cost.

Silicon Valley is no stranger to hustle culture. With the AI boom, however, the public endorsement of overwork has reached troubling heights.

At OpenAI, it isn’t uncommon for researchers to work six days a week — and well past quitting time. CEO Sam Altman is said to push the company’s teams to turn breakthroughs into public products on grueling timelines. OpenAI’s ex-chief research officer, Bob McGrew, reportedly cited burnout as one of the reasons he left last September.

There’s no relief to be found at competing labs. The Google DeepMind team developing Gemini, Google’s flagship series of AI models, at one point stepped up from working 100 hours a week to 120 hours to fix a bug in a system. And engineers at xAI, Elon Musk’s AI company, regularly post about working nights that bleed into the wee hours of the morning.

Why the relentless push? AI research today can have a sizeable impact on a company’s earnings. Google parent Alphabet lost some $90 billion in market value over the aforementioned bug, which caused Google’s Gemini chatbot to generate controversial depictions of historical figures.

“One of the biggest pressures is competitiveness,” Kai Arulkumaran, a research lead at AI services provider Araya, said, “combined with rapid timescales.”

Some of this competition plays out very publicly.

On a monthly — and sometimes weekly — basis, AI companies gun to displace one another on leaderboards like Chatbot Arena, which rank AI models across categories like math and coding. Logan Kilpatrick, who leads product for several Google Gemini developer tools, said in a post on X that Chatbot Arena “has had a nontrivial impact on the velocity of AI development.”

Not all researchers are convinced that’s a good thing. The industry moves so fast, they say, that their work risks becoming obsolete before it can even ship.

“This makes many question their work’s value,” Zihan Wang, a robotics engineer working at a stealth AI startup, said. “If there is a huge probability that someone goes faster than me, what is the meaning of what I’m doing?”

Other researchers lament that the focus on productization has come at the expense of academic camaraderie.

“One of the underlying [causes of the stress] is the transition of AI researchers from pursuing their own research agendas in industry to moving to work on [AI models] and delivering solutions for products,” Arulkumaran said. “Industry set up an expectation that AI researchers could pursue academic research in industry, but this is no longer the case.”

Another researcher said that — much to their consternation and distress — open collaboration and discussions about research are no longer the norm in industry, outside of a few AI labs that have embraced openness as a release strategy.

“Now there is increasingly a focus on commercialization, closed-source scaling, and execution,” the researcher said, “without contributing back to the scientific community.”

Some researchers trace the seeds of their anxiety to their AI grad programs.

Gowthami Somepalli, a Ph.D. student studying AI at the University of Maryland, said that research is being published so rapidly, it has become difficult for grad students to distinguish between fads and meaningful developments. That matters a lot, Somepalli said, because she has seen AI companies increasingly prioritize candidates with “extremely relevant experience.”

“A Ph.D. is generally quite an isolating and stressful experience, and a machine learning Ph.D. is particularly challenging due to the field’s rapid progression and the ‘publish or perish’ mentality,” Somepalli said. “It can be especially stressful when many students in your lab are publishing 4 papers while you’re publishing only 1 or 2 papers a year.”

Somepalli said that, after the first two years of her grad program, she stopped taking vacations because she felt guilty about stepping away before she’d published any studies.

“I constantly suffered from impostor syndrome during my Ph.D. and almost dropped out at the end of my first year,” Somepalli said.

So what changes, if any, could foster a less punishing AI work environment? It’s tough to imagine the pace of development slowing at all — not with so much cash at stake.

Somepalli stressed small but impactful reforms, like normalizing open discussion of one’s struggles.

“One of the biggest problems […] is that no one openly discusses their struggles; everyone puts on a brave face,” she said. “I believe [people] might feel better if they could see that others are struggling, too.”

Bhaskar Bhatt, an AI consultant at professional services company EY, says the industry should work to build “robust support networks” to combat feelings of isolation.

“Promoting a culture that values work-life balance, where individuals can genuinely disconnect from their work, is essential,” Bhatt said. “Organizations should foster a culture that values mental well-being as much as innovation, with tangible policies like reasonable work hours, mental health days, and access to counseling services.”

Ofir Press, a postdoctoral researcher at Princeton, proposed fewer AI conferences and weeklong “pauses” on paper submissions so that researchers can take a break from tracking new work. And Raj Dabre, an AI researcher at the National Institute of Information and Communications Technology in Japan, said researchers should be reminded in gentle ways of what’s really important.

“We need to educate people from the beginning that AI is just work,” Dabre said, “and we need to focus on family, friends, and the more sublime things in life.”
