
Accelerating technical
AI safety careers in APAC

Weekly in-person sessions. Remote expert support. A community of technical peers. All completely free. TARA gives you the skills and credentials to transition into AI safety — without relocating or taking time off.

Applications are now closed for Round 1 2026. Sign up for email updates at the bottom of this page to be notified when Round 2 2026 applications open.


Where you'll find our alumni...


What is TARA?

The Technical Alignment Research Accelerator (TARA) is a 14-week part-time program that builds technical AI safety skills for talent across the Asia-Pacific region (APAC). Twice a year (March-June and September-December), participants work through the ARENA curriculum via full-day Saturday sessions plus 2-7 hours of independent learning during the week.


You'll learn and implement key concepts such as transformer architectures, mechanistic interpretability, reinforcement learning, and model evaluation techniques. Working in groups of 10-20 participants, you'll practise pair programming - where two people collaborate on code, taking turns to write and review - all under the guidance of an experienced Teaching Assistant (TA).


The program is completely free - we cover:

  • An expert TA

  • Saturday lunches

  • Compute credits for coursework and projects

  • Dedicated study spaces in your city

 

For the March 2026 cohort, we're aiming to run in Singapore, Sydney, Melbourne, Brisbane, Manila, Tokyo and Taipei.

Why apply to TARA?

  • Build deep technical understanding. You'll implement core ML and AI safety algorithms from scratch - not just use libraries, but understand how they actually work. This foundation is essential for research and engineering roles in the field.

  • Learn with expert support. Through structured Saturday sessions and ongoing Slack support, you'll receive guidance from an experienced TA on technical concepts and project development.

  • Produce real research. The program culminates in a three-week project where you'll receive detailed guidance from your TA and peers, with the opportunity to produce publication-quality work.

  • Join a regional network. We're aiming to bring together 80 participants across APAC twice a year. Through small group sessions, collaborative projects, and ongoing discussions, you'll build lasting connections with others passionate about AI safety.

TARA v1 outcomes

Inaugural Sydney & Melbourne cohort, 2025

  • 90% finished the program

  • 94% would recommend to a friend

  • 100% satisfied with program management

  • 89% more motivated to pursue AI safety careers

  • 45% were full-time professionals


6-month career outcomes (14 of 19 completers surveyed)

  • 29% secured competitive fellowships (SPAR, LASR Labs, CSIRO Data61, Entrepreneurs First)

  • 29% transitioned to AI safety roles

  • 2 research outputs published

For a full report on TARA v1, follow this link.

"TARA gave me the technical foundation and community I needed to make the leap into AI safety. The structured curriculum, TA support and accountability of working alongside motivated peers made all the difference.

 

Within months of completing the program, I left my role at the Commonwealth Bank to join Arcadia Impact as an Inspect Evals maintainer."

— Scott Simmons, Arcadia Impact

Scott Simmons.jpeg

Who is TARA for?

  • Software engineers and machine learning practitioners aiming to transition into AI safety.

  • Undergraduate and postgraduate students.

  • Technical professionals who can’t commit to full-time programs.


These are examples, not requirements. If you are a strong coder and passionate about AI safety, we encourage you to apply.


Locations and dates

Round 1 2026: 7 March - 13 June


  • We're targeting cohorts in Sydney, Melbourne, Brisbane, Singapore, Manila, Taipei, and Tokyo.

  • Final locations are determined by where our strongest applicants are based - if there's sufficient demand in your city, we'll make it happen.

  • We'll keep applicants updated throughout the selection process. Our goal is to make TARA accessible to talented participants across APAC while maintaining high-quality, in-person study groups.

  • We run two rounds each year.

 

Round 2 2026: Dates TBA


Curriculum overview (high-level structure)

The program follows a progressive learning journey through technical AI safety concepts, based on the ARENA curriculum. Topics include:


  • Neural network foundations and optimisation

  • Transformer architectures

  • Mechanistic interpretability 

  • Reinforcement learning

  • Model evaluations


The final three weeks are dedicated to project work with TA support.


A detailed weekly breakdown can be found here.

Weekly learning structure

Each week follows a consistent pattern designed to maximise learning while accommodating participants' schedules. Start times vary by city: 9:30am Australia, 10:00am Singapore/Taipei/Manila, 11:00am Tokyo.
 
Saturday Sessions (~7.5 hours):

  • Hour 1: TA introduces the week's technical concepts

  • Hours 2-3: Pair programming to implement learnings

  • Midday: Progress update, then lunch

  • Hours 4-7: More pair programming

  • End of day: Progress update and wrap-up

 

Remote TA support is available on demand throughout the day.

 
Independent Learning (2-7 hours during the week):

  • Self-paced study of concepts introduced on the Saturday

  • Ongoing Slack support from TAs and fellow TARA participants


This structure gives participants consistent guidance while leaving flexibility for individual learning styles and work commitments.

What are the entry requirements?

Required:

  • Strong English proficiency (all instruction and support is delivered in English).

  • Motivated to reduce catastrophic risk from AI.

  • Strong Python programming background.

  • Attend at least 11 of 14 Saturday sessions in person (necessary for certification).

  • Arrive on time each Saturday (9:30am Australian cohorts, 10:00am Singapore/Taipei/Manila, 11:00am Tokyo).

  • Commit to at least 2 hours of independent study during the week to embed learnings.

  • Deliver a project presentation (necessary for certification).


Desirable:

  • Basic understanding of deep learning concepts.

  • Working knowledge of linear algebra, probability and statistics.

What can I expect from the application process?

You'll spend about 60-120 minutes completing an application form about your technical background and interest in AI safety. Shortlisted applicants will be invited to a 15-minute interview.
While TARA is free for participants, we use a commitment bond to increase engagement in the program. Here's how it works:

  • You'll put down a deposit when you start, and receive it back once you've attended at least 11 of 14 Saturday sessions and delivered your project presentation.

  • The bond signals serious commitment - both to yourself and to your cohort.

  • In our first cohort, it helped us achieve a 90% participant completion rate.

  • If the bond amount is a barrier for you, let us know in the application form and we can discuss options.

The bond amounts are as follows:

  • Sydney/Melbourne/Brisbane: A$150

  • Manila: ₱1,200

  • Singapore: S$150

  • Taipei: NT$1,250

  • Tokyo: ¥10,000

Key Dates - 2026

  • Participant and TA applications close: Friday 23 January (TA interviews are rolling - applying earlier is better).

  • Selected participants notified: Friday 20 February.

  • City ice-breaker sessions (online): Saturday 7 March.

  • Program starts in-person: Saturday 14 March.

  • Final project presentations: Saturday 13 June, followed by celebration dinners in each city.

Sign up for email updates

Sign up for TARA program updates - new cohorts, application deadlines, and opportunities to get involved.

