2025 Q3 Pivotal Research Fellowship: Applications Open

 

Applications are now open for the Pivotal 2025 Q3 Research Fellowship, a 9-week, fully funded research program covering technical AI safety and related areas. This year the program uses a mentor-first model, with applicants choosing mentors based on their interests and skills. The fellowship aims to produce high-quality research output and provides fellows with support on multiple fronts.

🎓 The Pivotal 2025 Q3 Research Fellowship runs June 30 – August 29 in London.

👨‍🏫 This year uses a mentor-first model, featuring experienced mentors with deep expertise in their areas.

💪 Fellows receive multiple forms of support, including mentorship, a research environment, and a stipend.

💰 Referring a suitable candidate earns $100; applications close April 9.

Published on March 22, 2025 10:54 AM GMT

We are now accepting applications for the Pivotal 2025 Q3 Research Fellowship, a 9-week, fully funded research program for technical AI safety, AI governance and policy, technical AI governance, and AI-Bio.

Dates: June 30 – August 29, 2025
Location: London, at the London Initiative for Safe AI (LISA)
Deadline: Wednesday, April 9 (23:59 CET)

Mentor-First Model

This year, we’re introducing a mentor-first approach. Instead of selecting fellows first and matching them with mentors later, we are featuring experienced researchers from the outset.

Applicants apply to one or more mentors based on their interests and skills. This structure ensures that research projects are well scoped from the start, with clear guidance from researchers who have thought carefully about which research contributions would be most valuable. You can still opt to be matched with a mentor by Pivotal, although this is no longer our main focus; if you prefer being matched, we also highly recommend applying to the ERA Fellowship!

Find out more about our mentors; many of them are also posting here.

About the Fellowship

The Pivotal Research Fellowship is designed for people who want to contribute to high-quality research in technical AI safety, AI governance, or related areas.

The fellowship centres around producing high-quality research output – most commonly a paper, but also potentially a blog post series, policy report, or another form of meaningful contribution. For particularly strong projects, we’re excited to explore support for research extensions beyond the core fellowship period.

Fellows will be based in London and work in person at the London Initiative for Safe AI, with structured mentorship and opportunities to engage with the broader research community.

Fellows from recent years have joined organisations such as GovAI, the Institute for Progress, and the UK AISI; joined fellowships at MATS, IAPS, and GovAI; published papers at top conferences; and founded AI safety organisations such as KIRA and Prism Evals.

What Fellows Receive

Fellows receive structured mentorship, an in-person research environment at LISA, a stipend, and other forms of support.

This is our 6th research fellowship, and we continue to refine the program to ensure it gives researchers the structure, mentorship, and environment they need to do meaningful work in AI safety.

If you are interested in technical AI safety, AI governance, or AI-bio intersections and are looking for an opportunity to contribute to high-quality research in a structured, ambitious, and supportive environment, we encourage you to apply.

Apply now

Deadline: Wednesday, April 9 (23:59 CET)

If you have any questions, please send us a message.


Recommend Someone & Earn $100

Know someone who might be a great fit? Refer them here and receive $100 for each accepted candidate we reach through your referral.



