Getting started as an independent researcher
Many of the most impactful contributions to AI safety have come from independent researchers: people who aren't employed by a major lab but are doing serious, original work. The ecosystem is set up to support this: there are grants specifically for independent alignment research, coworking spaces that provide community and structure, and visiting researcher programs that give you access to resources without a full-time position.
The typical path looks something like this: complete a fellowship or program to build your skills, develop a concrete research agenda, then apply for funding to support 6-12 months of independent work. Many researchers base themselves at a coworking space like Constellation, LISA, or Meridian for the community and accountability.
The bar for funding is lower than you might think. Grantmakers in this space are actively looking for promising researchers and are willing to take bets on people early in their careers. A clear research question, a credible plan for investigating it, and evidence that you can execute: that's often enough to get started. For a practical guide to the grant writing process, see Marc Sheraton-Sherbourne's Everything I Learnt About Grant Writing Without Going Mad.
Career transition grant
Coefficient Giving: Career Transition
Specifically designed for people transitioning into AI safety work. Provides funding to cover living costs while you ramp up your skills and research. One of the few grants explicitly targeting career changers, so if you're leaving another field for alignment work, this should be your first stop.
Career changers
Living costs
Transition support
Research grant
Long-Term Future Fund (LTFF)
Run by EA Funds, the LTFF is one of the most active funders of independent AI safety research. Grants range from small project funding to full-time research support. They review applications on a rolling basis and are known for being willing to fund early-career researchers with strong proposals.
Major funder
Rolling applications
Flexible scope
Research grant
Future of Life Institute (FLI)
FLI runs grant programs focused on existential risk from AI, including technical safety research and AI governance. Their grants tend to be larger and more structured, making them well-suited for researchers with a defined agenda and track record, or for collaborative projects between multiple researchers.
X-risk focused
Larger grants
Technical & governance
Crowdfunding / Regranting
Manifund
A regranting and crowdfunding platform where you can pitch your research project directly to funders in the AI safety community. More lightweight than a traditional grant application: you create a project page, and regrantors or individual donors can fund it. Good for smaller projects, pilot studies, or supplementary funding.
Regranting
Crowdfunding
Low barrier
Fast turnaround
Cambridge, UK
Meridian: Visiting Researcher Program
Meridian offers a visiting researcher program that gives you desk space, community, and access to the Cambridge AI safety ecosystem for a defined period. A good option if you want to spend time in Cambridge working on your research while connecting with the local safety community.
Desk space
Cambridge community
Temporary
Berkeley, CA
Constellation: Visiting Researchers
Constellation welcomes visiting researchers who are working on AI safety. As the largest safety coworking space, it offers unparalleled access to the Berkeley alignment community, including MATS scholars, independent researchers, and people from major labs. Being at Constellation for even a few weeks can transform your network and research direction.
Major hub
Berkeley community
Networking
Writing a strong grant application
Be specific about what you'll work on and why it matters for AI safety. Grantmakers see many vague proposals; the ones that stand out have a clear research question, a concrete plan for the first few months, and honest reasoning about why this approach might work. If you've done a fellowship or published on the Alignment Forum, reference that work directly.