
Accelerating AI safety careers in APAC

Weekly in-person sessions. Remote expert support. A community of technical peers. TARA gives you the skills and credentials to transition into AI safety - without relocating or taking time off.

Applications close Friday 23 January (TA interviews are rolling - applying earlier is better)


Where you'll find our alumni...


What is TARA?

TARA is a 14-week part-time program that builds technical AI safety skills for talent across the Asia-Pacific region (APAC). Twice a year (March-June and September-December), participants work through the ARENA curriculum via full-day Saturday sessions (9:30 AM - 5:00 PM) plus 2-7 hours of independent learning during the week.


You'll learn and implement key concepts like transformer architectures, mechanistic interpretability, reinforcement learning, and model evaluation techniques. Working in groups of 10-20 participants, you'll practice pair programming - where two people collaborate on code together, taking turns to write and review - all under the guidance of an experienced Teaching Assistant (TA).


The program is completely free. We cover:

  • An expert TA

  • Saturday lunches

  • Compute credits for coursework and projects

  • Dedicated study spaces in your city

 

For the March 2026 cohort, we're aiming to run in Singapore, Sydney, Melbourne, Brisbane, Manila, Tokyo and Taipei.

Why apply to TARA?

  • Build deep technical understanding. You'll implement core ML and AI safety algorithms from scratch - not just use libraries, but understand how they actually work. This foundation is essential for research and engineering roles in the field.

  • Learn with expert support. Through structured Saturday sessions and ongoing Slack support, you'll receive guidance from an experienced TA on technical concepts and project development.

  • Produce real research. The program culminates in a three-week project where you'll receive detailed guidance from your TA and peers, with the opportunity to produce publication-quality work.

  • Join a regional network. We're aiming to bring together 80 participants across APAC twice a year. Through small group sessions, collaborative projects, and ongoing discussions, you'll build lasting connections with others passionate about AI safety.

TARA v1 outcomes

Inaugural Sydney & Melbourne cohort, 2025

  • 90% finished the program

  • 94% would recommend to a friend

  • 100% satisfied with program management

  • 89% more motivated to pursue AI safety careers

  • 45% were full-time professionals


6-month career outcomes (14 of 19 completers surveyed)

  • 29% secured competitive fellowships (SPAR, LASR Labs, CSIRO Data61, Entrepreneurs First)

  • 29% transitioned to AI safety roles

  • 2 research outputs published

"TARA gave me the technical foundation and community I needed to make the leap into AI safety. The structured curriculum, TA support and accountability of working alongside motivated peers made all the difference.

 

Within months of completing the program, I left my role at Commonwealth Bank to join Arcadia Impact as an Inspect Evals maintainer."

— Scott Simmons, Arcadia Impact


Who is TARA for?

  • Software engineers and Machine Learning practitioners aiming to transition into AI safety.

  • Undergraduate and postgraduate students.

  • Technical professionals who can’t commit to full-time programs.


These are examples, not requirements. If you are a strong coder and passionate about AI safety, we encourage you to apply.


Locations and dates

Round 1 2026: 7 March - 13 June


  • We're targeting cohorts in Sydney, Melbourne, Brisbane, Singapore, Manila, Taipei, and Tokyo.

  • Final locations are determined by where our strongest applicants are based - if there's sufficient demand in your city, we'll make it happen.

  • We'll keep applicants updated throughout the selection process. Our goal is to make TARA accessible to talented participants across APAC while maintaining high-quality, in-person study groups.

  • We run two rounds each year. Round 2 2026 dates TBA.

Curriculum overview (high-level structure)

The program follows a progressive learning journey through technical AI safety concepts, based on the ARENA curriculum. Topics include:


  • Neural network foundations and optimisation

  • Transformer architectures

  • Mechanistic interpretability 

  • Reinforcement learning

  • Model evaluations


The final three weeks are dedicated to project work with TA support.


Detailed weekly breakdown coming soon.

Weekly learning structure

Each week follows a consistent pattern designed to maximise learning while accommodating participants' schedules:
 
Saturday Sessions (9:30 AM - 5:00 PM):

  • Morning: TA introduces the week's technical concepts

  • Late morning: Pair programming to implement learnings

  • 12:30 PM: Progress update, then lunch

  • Afternoon: More pair programming

  • 5:00 PM: Progress update and wrap-up

 

Remote TA support is available on demand throughout the day.

 
Independent Learning (2-7 hours during the week):

  • Self-paced study of concepts introduced on the Saturday

  • Ongoing Slack support from TAs and fellow TARA participants


This rhythm gives participants structured guidance while leaving flexibility for individual learning styles and work commitments.

What are the entry requirements?

Essential:

  • Strong English proficiency (all instruction and support is delivered in English)

  • Motivated to reduce catastrophic risk from AI

  • Strong Python programming background

  • Ability to attend at least 11 of 14 Saturday sessions in person (necessary for certification)

  • Ability to commit to at least 2 hours of independent study during the week to consolidate your learning

 

Desirable:

  • Basic understanding of deep learning concepts

  • Working knowledge of linear algebra, probability and statistics​

What can I expect from the application process?

You'll spend about 30-60 minutes completing an application form about your technical background and interest in AI safety. Shortlisted applicants will be invited to a 15-minute interview.

We use a commitment bond to increase engagement in the program. Here is how it works:

  • You'll put down a deposit when you start, and receive it back once you attend your 11th Saturday session.

  • The bond signals serious commitment - both to yourself and to your cohort.

  • In our first cohort, it helped us achieve a 90% participant completion rate.

  • For reference, Australian participants will be asked to commit a $150 AUD bond. Bonds for other cities will be calibrated to feel similarly significant - enough to signal commitment, but not so much that it creates a barrier.

  • If the bond amount is a barrier for you, let us know in the application form and we can discuss.

Key Dates - 2026

  • Participant and TA applications close: Friday 23 January (TA interviews are rolling - applying earlier is better).

  • Selected participants notified: Friday 20 February.

  • City ice-breaker sessions (online): Saturday 7 March.

  • Program starts in-person: Saturday 14 March.

  • Final project presentations: Saturday 13 June, followed by celebration dinners in each city.

Sign up for email updates

Sign up for TARA program updates - new cohorts, application deadlines, and opportunities to get involved.

