<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:webfeeds="http://webfeeds.org/rss/1.0">
    <channel>
        <title><![CDATA[TA39 Community]]></title>
        <description><![CDATA[TA39 Community]]></description>
        <link>https://community.ta-39.com</link>
        <generator>Bettermode RSS Generator</generator>
        <lastBuildDate>Mon, 04 May 2026 12:18:43 GMT</lastBuildDate>
        <atom:link href="https://community.ta-39.com/rss/feed" rel="self" type="application/rss+xml"/>
        <pubDate>Mon, 04 May 2026 12:18:43 GMT</pubDate>
        <copyright><![CDATA[2026 TA39 Community]]></copyright>
        <language><![CDATA[en-US]]></language>
        <ttl>60</ttl>
        <webfeeds:icon></webfeeds:icon>
        <webfeeds:related layout="card" target="browser"/>
        <item>
            <title><![CDATA[Introducing Revision Rounds: Making Student Revision Across Drafts Visible]]></title>
            <description><![CDATA[Hello educators,

We've just released Revision Rounds in TA39. It's a way to run multi-draft assignments while keeping every draft, feedback, and rubric result connected, making it easier to track how ...]]></description>
            <link>https://community.ta-39.com/basic-2-column-umfls66p/post/introducing-revision-rounds-making-student-revision-across-drafts-visible-jbYENcYH221JQqt</link>
            <guid isPermaLink="true">https://community.ta-39.com/basic-2-column-umfls66p/post/introducing-revision-rounds-making-student-revision-across-drafts-visible-jbYENcYH221JQqt</guid>
            <category><![CDATA[Drafts]]></category>
            <category><![CDATA[Feedback]]></category>
            <category><![CDATA[Revisions]]></category>
            <category><![CDATA[Rubric]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Tue, 14 Apr 2026 07:31:27 GMT</pubDate>
            <content:encoded><![CDATA[<p>Hello educators,</p><p>We've just released <strong>Revision Rounds</strong> in TA39. It's a way to run multi-draft assignments while keeping every draft, feedback, and rubric result connected, making it easier to track how students respond to feedback across drafts.</p><p>Instead of managing revision across separate submissions and scattered comments, you can now:</p><ul><li><p>Keep every draft visible without overwriting earlier work</p></li><li><p>Compare student drafts side by side with feedback and rubric results</p></li><li><p>See how students responded to feedback across revision rounds</p></li><li><p>Choose which draft to grade from</p></li></ul><h2 class="text-xl" data-toc-id="a2d54910-8a2d-4804-a558-7c84c3c502a8" id="a2d54910-8a2d-4804-a558-7c84c3c502a8"><strong>Why this matters</strong></h2><p>Revision is where a lot of learning happens — but once students resubmit, it becomes harder to track what changed. We kept hearing a version of the same thing from teachers running revision-heavy classrooms:</p><blockquote><p>"I know the student revised. I just can't remember exactly what I asked for last week."</p><p>"By Draft 3, the history is scattered across comments, rubric scores, and my own notes."</p><p>"I want to see whether they actually responded to the feedback — not just whether the draft is longer."</p></blockquote><p>Revision Rounds makes that process easier to follow. 
You can review drafts in context, see how students respond to feedback, and focus your attention on what still needs work.</p><hr><h2 class="text-xl" data-toc-id="7bdf78b3-584f-4923-8002-edf20a0cd69e" id="7bdf78b3-584f-4923-8002-edf20a0cd69e"><strong>How it works</strong></h2><ol><li><p>Collect Draft 1 as usual</p></li><li><p>Provide feedback and rubric results</p></li><li><p>Open the next draft when students are ready</p></li><li><p>Review changes using:</p><ul><li><p><strong>Compare with previous</strong> (side-by-side view)</p></li><li><p><strong>How this draft changed</strong> (feedback-aligned summary)</p></li></ul></li><li><p>Post grades from the draft you choose</p></li></ol><p>Each draft remains available, so you can always go back and review earlier work.</p><h2 class="text-xl" data-toc-id="51d362d2-0f7a-4557-a7ef-9e544c0b61eb" id="51d362d2-0f7a-4557-a7ef-9e544c0b61eb"><strong>A few things to know</strong></h2><ul><li><p>Revision Rounds works best when the same criteria carry across drafts</p></li><li><p>You decide when each draft opens — students only see the next draft when you release it, and earlier work always remains available</p></li><li><p>Grading decisions remain fully under teacher control</p></li></ul><h2 class="text-xl" data-toc-id="4a1fa4d5-2bab-484c-99a7-de1ab499adb0" id="4a1fa4d5-2bab-484c-99a7-de1ab499adb0"><strong>Learn more</strong></h2><p>For a deeper look at why visible revision changes what revision cycles can teach — and how to use Revision Rounds more effectively in your classroom:</p><p><a class="text-interactive hover:text-interactive-hovered" rel="noopener noreferrer nofollow" href="https://community.ta-39.com/best-practices/post/why-revision-cycles-work-better-when-both-sides-can-see-what-changed-53T6XKj7cMWrYhu"><em><u>Why Revision Cycles Work Better When Both Sides Can See What Changed</u></em></a></p><h2 class="text-xl" data-toc-id="8e220700-c923-441c-b90f-390aace9bbc9" id="8e220700-c923-441c-b90f-390aace9bbc9"><strong>Try it 
out</strong></h2><p>Revision Rounds is now available in TA39. Start with one assignment and see how it fits into your revision workflow.</p><p>If something doesn't feel right, or if you see a way this could work better for your classroom, we'd truly value your input. Your feedback helps us steer TA39 in the direction it needs to go, grounded in your real-world experience — not assumptions.</p><p>— The TA39 Team</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Why Revision Cycles Work Better When Both Sides Can See What Changed]]></title>
            <description><![CDATA[Most writing teachers already know that students do not grow most from a single final draft. They grow through revision: when they read feedback, sit with it, decide what to change, and try again. ...]]></description>
            <link>https://community.ta-39.com/best-practices-4mkl4fls/post/why-revision-cycles-work-better-when-both-sides-can-see-what-changed-53T6XKj7cMWrYhu</link>
            <guid isPermaLink="true">https://community.ta-39.com/best-practices-4mkl4fls/post/why-revision-cycles-work-better-when-both-sides-can-see-what-changed-53T6XKj7cMWrYhu</guid>
            <category><![CDATA[Drafts]]></category>
            <category><![CDATA[Feedback]]></category>
            <category><![CDATA[Revisions]]></category>
            <category><![CDATA[Rubric]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Tue, 14 Apr 2026 06:58:01 GMT</pubDate>
            <content:encoded><![CDATA[<p>Most writing teachers already know that students do not grow most from a single final draft. They grow through revision: when they read feedback, sit with it, decide what to change, and try again. That cycle of feedback, reflection, and revision is where much of the learning actually happens.</p><p>A first draft, then feedback, then a second draft, sometimes a third: the structure varies, but the principle is consistent. Students need a reason to return to their own work and improve it. The difficult part is usually not designing that revision cycle. It is seeing it clearly once it is underway.</p><p>A student submits a draft. You respond. They revise and resubmit. Then you are looking at a new version of the essay, trying to remember what you asked for last time, whether the student addressed it, and what still needs work. Across a full class, that becomes difficult to hold onto. The judgment is still yours, but the evidence you need is spread across drafts, comments, rubric results, and memory. That is where the real friction begins.</p><h2 class="text-xl" data-toc-id="91442334-fe9b-45a6-8861-47888e0b7665" id="91442334-fe9b-45a6-8861-47888e0b7665"><strong>The real question revision is trying to answer</strong></h2><p>Revision is not just about producing a better final draft. It is about something more specific: whether the student understood the feedback well enough to act on it.</p><p>A student can revise and improve a piece by accident by adding detail, fixing grammar, or reorganizing a paragraph. The draft may be better, but that does not necessarily mean the student understood what the feedback was asking them to do, or that they could apply the same thinking to another assignment.</p><p>The question that matters most in a revision cycle is not simply, "Is this draft better?" 
It is: <strong>Did the student understand the feedback and apply it in revision?</strong> When you can answer that question clearly, revision becomes instructional. It stops being just a step between drafts and becomes evidence of how the student is learning.</p><h2 class="text-xl" data-toc-id="7cf7e481-d33c-4671-8724-166fdac85690" id="7cf7e481-d33c-4671-8724-166fdac85690"><strong>What makes revision hard to track</strong></h2><p>In practice, the challenge is usually not pedagogy. It is visibility.</p><p>You know what you asked for. You know which criteria matter. You know what improvement should look like. But verifying that often means going back to earlier feedback, comparing drafts manually, and deciding whether the revision actually meets the criterion. That is manageable for one student. Across multiple sections, it becomes heavy.</p><p>So something gives: fewer drafts, broader feedback, less specificity. Not because teachers want that, but because the process is harder to sustain at scale. The issue is not judgment. It is that the information needed to apply that judgment is not in one place.</p><h2 class="text-xl" data-toc-id="1fd2dd97-0f6b-4654-b3b3-85cabb8e8d32" id="1fd2dd97-0f6b-4654-b3b3-85cabb8e8d32"><strong>What changes when revision is visible</strong></h2><p>When you can see what the feedback asked for, what the student changed, and what still remains, the revision cycle becomes much more usable.</p><p>For the teacher, it changes the starting point. Instead of beginning with, "Let me reread this and see what I think," you can begin with something much more concrete: "I asked for stronger evidence. I can see you added quotations. The explanations are still brief, so that is the next step." That is a different kind of conversation. It is anchored in visible change.</p><p>For students, revision becomes clearer too. Instead of "make it better," revision becomes "respond to feedback." 
They can see what moved and what did not, often at the level of a specific criterion. That is where the learning becomes more visible for both teacher and student.</p><h2 class="text-xl" data-toc-id="22f3d16f-d532-4db6-8b97-000403f19051" id="22f3d16f-d532-4db6-8b97-000403f19051"><strong>How this works in TA39</strong></h2><p>TA39 supports this through <strong>Revision Rounds</strong>.</p><p>Revision Rounds structures multi-draft assignments so that each submission, each round of feedback, and each set of rubric results stays connected and visible over time. Teachers and students move through <strong>Draft 1, Draft 2, Draft 3</strong>, and later drafts as needed. You decide when a new draft opens, and students only see the next draft when you release it. Earlier submissions, feedback, and rubric results remain available for reference, so nothing gets overwritten. Instead of reconstructing the revision process after the fact, you can review the record of the work as it developed.</p><p>That matters for a simple reason: revision is easier to teach when the history of the work stays intact.</p><p>Two views make this especially useful. <strong>Compare with previous</strong> allows you to open two drafts side by side, including the submission, feedback, and rubric for each version, so you can review the work in context without jumping between screens. <strong>How this draft changed</strong> helps surface revision progress at the rubric-criterion level. It shows what earlier feedback focused on, what appears to have changed in the new draft, and what may still need attention.</p><p>For example:</p><blockquote><p>Previous feedback: Add quotations to support claims. This draft: Quotations were added. What still remains: Explain how each quote supports the claim.</p></blockquote><p>At that point, you are no longer asking, in a vague way, whether the draft feels better. You are asking whether specific expectations were understood and addressed. 
That is a much more useful question.</p><p>It also reflects the reality of classroom revision. Some students revise immediately. Some take longer. Some may stop after an earlier draft they feel good about. The teacher still needs a clear record of what was submitted, what feedback was given, and what changed from one round to the next.</p><h2 class="text-xl" data-toc-id="f9a18aa1-2994-4135-991f-e5239fafd91c" id="f9a18aa1-2994-4135-991f-e5239fafd91c"><strong>A practical way to start</strong></h2><p>If you are trying this for the first time, keep it simple. Pick one assignment and use two drafts.</p><p>Collect the first draft, review the feedback, then open Draft 2 and ask students to revise based on that feedback. When the second draft comes in, look at a few students using How this draft changed. Does it match what you see in the writing? Does it help you identify what improved and what did not? That small test is usually enough to see whether the structure fits your workflow.</p><p>It also works best when the same expectations carry across drafts. When students are revising against consistent criteria, it becomes much easier to see whether feedback was actually understood and applied.</p><h2 class="text-xl" data-toc-id="af1d02ad-8865-4f3f-9650-87fd7cc371a4" id="af1d02ad-8865-4f3f-9650-87fd7cc371a4"><strong>How to use Revision Rounds</strong></h2><p>Using Revision Rounds follows the same logic most teachers already use, just with clearer structure:</p><ol><li><p>Create an assignment and enable Revision Rounds</p></li><li><p>Collect Draft 1 as the initial submission</p></li><li><p>Review feedback and rubric results</p></li><li><p>When students are ready, open Draft 2</p></li><li><p>Ask students to revise and resubmit</p></li><li><p>Review the new submission using comparison or How this draft changed</p></li><li><p>Post grades from the draft you choose</p></li></ol><p>The sequence is familiar. 
What changes is that each step stays visible and connected.</p><h2 class="text-xl" data-toc-id="d8c953a5-7914-49d3-9a1e-aa04153de5c8" id="d8c953a5-7914-49d3-9a1e-aa04153de5c8"><strong>Grading stays under teacher control</strong></h2><p>Revision Rounds supports both formative and summative use. TA39 can provide feedback on every draft, but grades are only posted when you decide to post them. You can grade from the latest draft or select a specific draft, depending on how you want to evaluate the work.</p><p>That flexibility matters because teachers approach revision differently. Some prioritize final performance. Others emphasize growth. Revision Rounds does not impose a model. It gives you a clearer record to apply your own.</p><p>That also means edge cases remain manageable. If a student does not submit a later draft, the earlier work and feedback history are still there. The system keeps the record visible, while grading decisions remain with the teacher.</p><h2 class="text-xl" data-toc-id="bb2f633a-a6a7-4f23-8c2d-0cca745eb89e" id="bb2f633a-a6a7-4f23-8c2d-0cca745eb89e"><strong>What this feature is not</strong></h2><p>Revision Rounds does not provide live drafting support. It does not coach students while they are writing. It does not replace teacher judgment.</p><p>It works after submission, where feedback is given, revision happens, and growth can be observed. A system can surface patterns, organize drafts, and help show where revision appears to have happened in response to feedback. But it still works from what you wrote down: your rubric, your criteria, your feedback. The judgment of what counts, what matters, and what is sufficient remains yours.</p><p>It also does not reduce revision to an automated verdict. Side-by-side review still matters. Teacher reading still matters. 
The point is not to replace close evaluation, but to make the revision process easier to follow and easier to teach from.</p><h2 class="text-xl" data-toc-id="118d39c4-41b8-468f-93c7-db3e36b6f8de" id="118d39c4-41b8-468f-93c7-db3e36b6f8de"><strong>What revision visibility makes possible</strong></h2><p>The deeper value here is not just efficiency. When drafts, feedback, and criteria stay connected, revision becomes easier to teach from.</p><p>You can see which students acted on feedback, which misunderstood it, which avoided it, and which improved in one area but not another. That is not just grading information. It is instructional information. It changes how you conference, how you plan next steps, and how you understand student growth.</p><p>For students, it makes growth more concrete as well. They are not just hearing that their writing improved. They can see what changed, where it changed, and why it mattered.</p><p>That is what revision cycles are meant to do: not just produce better drafts, but make the process of getting there visible enough to learn from.</p><h2 class="text-xl" data-toc-id="ca84ae37-79e0-4b83-8e70-72233082f0ba" id="ca84ae37-79e0-4b83-8e70-72233082f0ba"><strong>Frequently asked questions</strong></h2><p><strong>Can teachers choose which draft to grade from?</strong></p><p>Yes. Teachers can choose which draft to use when posting grades. This allows flexibility to grade the final draft or a specific draft, depending on how the assignment is structured.</p><p><strong>Do students see their earlier drafts and feedback?</strong></p><p>Yes. Earlier drafts, feedback, and rubric results remain available across rounds, so both teachers and students can refer back to prior work as revision continues.</p><p><strong>What does "How this draft changed" show?</strong></p><p>It highlights revision at the rubric level by showing how the new draft relates to earlier feedback, what appears to have improved, and what may still need work. 
It is meant to support review, not replace teacher evaluation.</p><p><strong>Does it show exactly what changed in the student's writing?</strong></p><p>Revision Rounds allows side-by-side comparison of drafts. It is designed to help teachers review changes and responses to feedback, rather than automatically marking every text-level difference.</p><p><strong>What happens if a student does not submit the next draft?</strong></p><p>If a student does not submit a later draft, there is simply no new submission for that round. Earlier drafts and feedback remain available, and teachers can decide how to handle grading based on their classroom policy.</p><p><strong>Does Revision Rounds replace teacher judgment?</strong></p><p>No. Revision Rounds organizes drafts, feedback, and comparisons in one place, but grading decisions and expectations remain fully under teacher control.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[You Can Now Optimize Rubrics with AI in the Rubric Library]]></title>
            <description><![CDATA[Hello educators,

If you have used TA39 rubrics before, you may remember that improving or converting a rubric used to involve a separate utility.

Most teachers are not looking for a standalone rubric-...]]></description>
            <link>https://community.ta-39.com/basic-2-column-umfls66p/post/you-can-now-optimize-rubrics-with-ai-in-the-rubric-library-wUoDP62myFdyOId</link>
            <guid isPermaLink="true">https://community.ta-39.com/basic-2-column-umfls66p/post/you-can-now-optimize-rubrics-with-ai-in-the-rubric-library-wUoDP62myFdyOId</guid>
            <category><![CDATA[Rubric]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Mon, 30 Mar 2026 06:23:35 GMT</pubDate>
            <content:encoded><![CDATA[<p>Hello educators,</p><p>If you have used TA39 rubrics before, you may remember that improving or converting a rubric used to involve a separate utility.</p><p>Most teachers are not looking for a standalone rubric-conversion tool. They are trying to do something simpler: upload the rubric they already have, review it, improve it if needed, and save it for use.</p><p>That is why we changed the workflow.</p><p>You can now do all of that directly inside the <strong>rubric library</strong>. When you upload, import, or create a rubric, TA39 can extract it, help structure it, and let you use <strong>Optimize with AI</strong> as part of the same flow.</p><p>This feature is now available in TA39.</p><figure data-align="center" data-size="original" data-id="9AUqZ0goqb8CUEF8ZinNv" data-version="v2" data-type="image"><img data-id="9AUqZ0goqb8CUEF8ZinNv" src="https://tribe-s3-production.imgix.net/9AUqZ0goqb8CUEF8ZinNv?auto=compress,format"></figure><p></p><h3 class="text-lg" data-toc-id="740c73a7-63ab-426a-8f6b-eec4e6705c83" id="740c73a7-63ab-426a-8f6b-eec4e6705c83">What You Can Now Do</h3><p>You can now:</p><ul><li><p>upload or import an existing rubric directly into the rubric library</p></li><li><p>create a new rubric from scratch in the same place</p></li><li><p>review the extracted rubric inside TA39</p></li><li><p>use <strong>Optimize with AI</strong> to improve clarity and consistency</p></li><li><p>compare the original and revised versions</p></li><li><p>save the version you want to use</p></li></ul><h3 class="text-lg" data-toc-id="42b7c4b2-029d-4f55-a51e-e98115016bf6" id="42b7c4b2-029d-4f55-a51e-e98115016bf6">Why We Made This Change</h3><p>We heard a version of the same thing from teachers again and again:</p><ul><li><p>“I already have a rubric.”</p></li><li><p>“I do not want to use a separate utility first.”</p></li><li><p>“I want to work on the rubric while I am setting it up.”</p></li><li><p>“If something needs improvement, I 
want that built into the main flow.”</p></li></ul><p>That is the main change here.</p><p>Instead of sending teachers to a separate rubric-conversion step, TA39 now supports that work directly in the rubric library UI.</p><h3 class="text-lg" data-toc-id="a092d314-6e1b-4ef1-9ac1-3f94543e7747" id="a092d314-6e1b-4ef1-9ac1-3f94543e7747">What “Optimize with AI” Is For</h3><p><strong>Optimize with AI</strong> is designed for rubrics that are already mostly usable, but could be clearer or easier to apply consistently.</p><p>For example, it can help when:</p><ul><li><p>performance levels sound too similar</p></li><li><p>criteria overlap more than intended</p></li><li><p>score ranges are hard to interpret</p></li><li><p>wording is understandable to a teacher, but not yet clear enough for consistent AI-supported use</p></li></ul><p>The goal is not to replace your judgment. It is to help you improve the rubric you already have without leaving the workflow you are in.</p><figure data-align="center" data-size="original" data-id="6vJFwZo7p3O7xHaNc7502" data-version="v2" data-type="image"><img data-id="6vJFwZo7p3O7xHaNc7502" src="https://tribe-s3-production.imgix.net/6vJFwZo7p3O7xHaNc7502?auto=compress,format"></figure><p></p><h3 class="text-lg" data-toc-id="a768f3d1-c315-41c9-a393-677073d0ae69" id="a768f3d1-c315-41c9-a393-677073d0ae69">How to Get Started</h3><ol><li><p>Open the rubric library</p></li><li><p>Add a rubric by uploading, importing, or creating one</p></li><li><p>Review the extracted rubric</p></li><li><p>Click <strong>Optimize with AI</strong></p></li><li><p>Compare the original and optimized versions</p></li><li><p>Edit if needed, then save</p></li></ol><h3 class="text-lg" data-toc-id="0dbb3091-e429-4ec5-8be9-c7ee53b07ee0" id="0dbb3091-e429-4ec5-8be9-c7ee53b07ee0">Further Reading</h3><p>If you want help getting the most out of this workflow, these posts are a good place to start:</p><ul><li><p><a class="text-interactive hover:text-interactive-hovered" rel="noopener 
noreferrer nofollow" href="https://community.ta-39.com/best-practices/post/rubrics-formatting-guidelines-for-ta39-q0b4AvapKcTbOMh">Rubrics formatting guidelines for TA39</a></p></li><li><p><a class="text-interactive hover:text-interactive-hovered" rel="noopener noreferrer nofollow" href="https://community.ta-39.com/best-practices/post/from-good-enough-to-clear-and-fair-upgrading-rubrics-for-ai-support-CXigTPemrqPxM51">From good enough to clear and fair: upgrading rubrics for AI support</a></p></li></ul><p>The key improvement is not just that TA39 can optimize rubrics with AI.</p><p>It is that rubric import, review, optimization, and saving now happen together in the place teachers already expect to manage rubrics.</p><p>That makes the workflow simpler, more visible, and easier to use.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[You can now build feedback templates more easily in TA39]]></title>
            <description><![CDATA[Feedback templates have been one of the most powerful parts of TA39.

They shape how feedback sounds, how it is structured, how much detail students receive, and what kind of guidance the AI provides. ...]]></description>
            <link>https://community.ta-39.com/basic-2-column-umfls66p/post/you-can-now-build-feedback-templates-more-easily-in-ta39-toqLgjVRG53sHX8</link>
            <guid isPermaLink="true">https://community.ta-39.com/basic-2-column-umfls66p/post/you-can-now-build-feedback-templates-more-easily-in-ta39-toqLgjVRG53sHX8</guid>
            <category><![CDATA[Feedback]]></category>
            <category><![CDATA[Feedback Template]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Fri, 27 Mar 2026 06:54:15 GMT</pubDate>
            <content:encoded><![CDATA[<p>Feedback templates have been one of the most powerful parts of TA39.</p><p>They shape how feedback sounds, how it is structured, how much detail students receive, and what kind of guidance the AI provides. But for many teachers, writing a good template still meant starting from a blank page and translating their feedback instincts into a format the system could follow.</p><p>The new <strong>Feedback Template Builder</strong> is designed to make that easier.</p><figure data-align="center" data-size="full" data-id="avHs81ivNo3G1Uf4quAWp" data-version="v2" data-type="image"><img data-id="avHs81ivNo3G1Uf4quAWp" src="https://tribe-s3-production.imgix.net/avHs81ivNo3G1Uf4quAWp?auto=compress,format"></figure><p></p><p>Instead of asking you to write everything from scratch, the builder starts with the kinds of decisions teachers already make: what kind of assignment this is, how the feedback should sound, how it should be organized, and how visible you want to be in the final response.</p><p>With the new builder, you can:</p><ul><li><p>start from a preset</p></li><li><p>adjust tone, detail, scoring, and sections</p></li><li><p>shape teacher presence and student-facing voice</p></li><li><p>decide how much is AI and how much is you</p></li><li><p>preview what students will actually receive</p></li><li><p>and inspect the underlying blueprint when you want more control</p></li></ul><p>In other words, it is now much easier to build a feedback template that reflects your classroom, your students, and your own feedback style.</p><p>If you want a full walkthrough, we have a detailed guide:</p><blockquote><p><a class="text-interactive hover:text-interactive-hovered" rel="noopener noreferrer nofollow" href="https://community.ta-39.com/best-practices/post/building-your-own-feedback-templates-how-to-shape-ai-feedback-to-fit-2gFfyul6kr0GFBD">How to build your own feedback template, step by step</a></p></blockquote><p>That guide explains how to 
get started, how to shape the structure and tone of your feedback, and how to refine templates over time.</p><p>We hope this makes one of TA39’s most important features easier to use — and easier to trust.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Why Feedback Templates Matter — and How to Build One That Works]]></title>
            <description><![CDATA[By this point, many teachers have seen both sides of AI feedback.

On one hand, it’s fast. It can generate detailed comments in seconds.
On the other hand, it often doesn’t quite sound right—or doesn’t ...]]></description>
            <link>https://community.ta-39.com/best-practices-4mkl4fls/post/building-your-own-feedback-templates-how-to-shape-ai-feedback-to-fit-2gFfyul6kr0GFBD</link>
            <guid isPermaLink="true">https://community.ta-39.com/best-practices-4mkl4fls/post/building-your-own-feedback-templates-how-to-shape-ai-feedback-to-fit-2gFfyul6kr0GFBD</guid>
            <category><![CDATA[Feedback Template]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Mon, 23 Mar 2026 19:00:06 GMT</pubDate>
            <content:encoded><![CDATA[<p>By this point, many teachers have seen both sides of AI feedback.</p><p>On one hand, it’s fast. It can generate detailed comments in seconds.<br>On the other hand, it often doesn’t quite sound right—or doesn’t quite do what you want.</p><p>That’s because generating feedback is not the hard part.</p><p><strong>Getting feedback that is clear, consistent, and actually useful for students is.</strong></p><h2 class="text-xl" data-toc-id="97f3b552-1dbd-48aa-a337-bbaa444d9a8d" id="97f3b552-1dbd-48aa-a337-bbaa444d9a8d">The hidden challenge behind AI feedback</h2><p>Most AI tools rely on a simple idea:</p><blockquote><p>You write a prompt → the AI generates feedback.</p></blockquote><p>In theory, that sounds flexible. In practice, it creates a problem.</p><p>To get consistently good results, you would need to:</p><ul><li><p>write very specific instructions</p></li><li><p>control tone, structure, and level of detail</p></li><li><p>anticipate how the AI will respond</p></li></ul><p>In other words, you would need to do a form of <strong>prompt engineering</strong>.</p><p>But that’s not how teachers naturally think.</p><p>Teachers think in terms of:</p><ul><li><p>“Be encouraging but honest”</p></li><li><p>“Focus on the most important issue”</p></li><li><p>“Use the rubric language”</p></li><li><p>“Keep it clear for this age group”</p></li></ul><p>Translating that into precise AI instructions is where things often break down.</p><h2 class="text-xl" data-toc-id="26e456d6-a04e-4cf8-baa1-f7c9f14321df" id="26e456d6-a04e-4cf8-baa1-f7c9f14321df">From prompts to templates</h2><p>Instead of writing a new prompt every time, TA39 uses <strong>feedback templates</strong>.</p><p>A template is simply:</p><blockquote><p>A reusable set of instructions that tells the AI how to generate feedback.</p></blockquote><p>It controls things like:</p><ul><li><p>tone and voice</p></li><li><p>level of detail</p></li><li><p>structure of the 
response</p></li><li><p>how student work is referenced</p></li><li><p>how improvement is framed</p></li></ul><p>When a template is well designed, the feedback starts to feel like something you would have written yourself.</p><p>When it isn’t, the feedback becomes generic, inconsistent, or difficult for students to use.</p><h2 class="text-xl" data-toc-id="71656e41-6765-4a64-9ba7-787ea0bf48b5" id="71656e41-6765-4a64-9ba7-787ea0bf48b5">How templates connect to feedback structure</h2><p>In the <a class="text-interactive hover:text-interactive-hovered" rel="noopener noreferrer nofollow" href="https://community.ta-39.com/best-practices/post/why-good-feedback-still-doesn-t-improve-student-work----and-how-to-fix-it-jIK2F1K6h4tgODq">previous article</a>, we introduced five feedback structures (feed-forward, criterion-focused, SBI, task–process–self-regulation, and sandwich).</p><p>A feedback template is how you <strong>bring one of those structures into practice consistently</strong>.</p><p>For example:</p><ul><li><p>A revision-focused template will often use a <strong>feed-forward structure</strong></p></li><li><p>A grading template will align with <strong>criterion-focused feedback</strong></p></li><li><p>A precision-focused template may use <strong>SBI</strong></p></li><li><p>A deeper-learning template may include <strong>task, process, and self-regulation prompts</strong></p></li></ul><p>You don’t need to think in terms of theory when building a template.</p><p>But in practice, you are making the same decision:</p><p><strong>What kind of feedback do I want students to receive—and why?</strong></p><p>The template simply ensures that this happens consistently.</p><h2 class="text-xl" data-toc-id="3ab41b77-cc9c-43e6-bd8a-260ed8902444" id="3ab41b77-cc9c-43e6-bd8a-260ed8902444">Why templates matter</h2><p>The quality of feedback isn’t just about wording. 
It’s about <strong>structure and consistency</strong>.</p><p>Well-designed templates do something important:</p><ul><li><p>they prevent ineffective feedback from being generated in the first place</p></li><li><p>they ensure feedback points to specific evidence</p></li><li><p>they keep the focus on a small number of priorities</p></li><li><p>they make next steps explicit</p></li></ul><p>In other words, they don’t just make feedback sound better.<br>They make feedback <strong>work better</strong>.</p><p>They also reduce variation. Instead of starting from scratch each time, you are working from a consistent foundation—while still keeping space for teacher judgment.</p><h2 class="text-xl" data-toc-id="5f758d6b-6508-4317-933b-908d9253a5b6" id="5f758d6b-6508-4317-933b-908d9253a5b6">A shift in thinking</h2><p>Instead of asking:</p><blockquote><p>“What should the AI write?”</p></blockquote><p>You are now asking:</p><blockquote><p><strong>What kind of feedback do I want students to receive—and how should it be structured?</strong></p></blockquote><p>That’s a different level of control.</p><hr><h2 class="text-xl">How to build your own feedback template</h2><p>The good news is that you don’t need to start from a blank page.</p><p>Most teachers already know what useful feedback sounds like. The challenge is turning that judgment into something the AI can follow consistently.</p><p>That’s what the Feedback Template Builder is designed to support.</p><h2 class="text-xl" data-toc-id="66f6be9b-3cb7-4627-90e0-07ed176161a8" id="66f6be9b-3cb7-4627-90e0-07ed176161a8">Step 1: Choose a strong starting point</h2><p>When you open the builder, you’ll see a range of presets. 
These are not rigid templates—they are useful starting points.</p><p>Each preset reflects a different feedback goal.</p><p>For example:</p><ul><li><p>revision-focused → <strong>Revision Guide</strong></p></li><li><p>rubric-based → <strong>Rubric Audit</strong></p></li><li><p>younger students → <strong>Warm &amp; Encouraging</strong></p></li><li><p>quick feedback → <strong>Quick Check</strong></p></li></ul><figure data-align="left" data-size="half" data-id="aRBaVMYwa2bNDIYXx0mSn" data-version="v2" data-type="image"><img data-id="aRBaVMYwa2bNDIYXx0mSn" src="https://tribe-s3-production.imgix.net/aRBaVMYwa2bNDIYXx0mSn?auto=compress,format"></figure><p>You’re not committing to a final version yet. You’re just choosing a place to begin.</p><h2 class="text-xl" data-toc-id="e6ffee12-806c-4151-a32b-d515330ca013" id="e6ffee12-806c-4151-a32b-d515330ca013">Step 2: Adjust the key decisions</h2><p>From there, the builder asks a few simple questions:</p><ul><li><p>Who are your students?</p></li><li><p>How should the feedback sound?</p></li><li><p>How much detail do you want?</p></li><li><p>Should scores be included?</p></li><li><p>What sections should appear?</p></li></ul><p>These are not technical settings. They are <strong>teaching decisions</strong>.</p><figure data-align="left" data-size="best-fit" data-id="fj1xssyBwSYzobdURmSXa" data-version="v2" data-type="image"><img data-id="fj1xssyBwSYzobdURmSXa" src="https://tribe-s3-production.imgix.net/fj1xssyBwSYzobdURmSXa?auto=compress,format"></figure><p>As you adjust them, the template updates in real time. 
You’ll start to notice:</p><ul><li><p>more detail usually creates more structure</p></li><li><p>tone changes how feedback is phrased</p></li><li><p>sections shape what students actually see</p></li></ul><p>Making these decisions explicitly helps create feedback that is more intentional and consistent.</p><h2 class="text-xl" data-toc-id="c7b7e1e5-1874-47d6-b5f5-389ee0e19bbb" id="c7b7e1e5-1874-47d6-b5f5-389ee0e19bbb">Step 3: Add your personal touch</h2><p>This is where your own voice becomes more visible.</p><figure data-align="left" data-size="best-fit" data-id="fkfe1thgaQUdS4jKNPRwv" data-version="v2" data-type="image"><img data-id="fkfe1thgaQUdS4jKNPRwv" src="https://tribe-s3-production.imgix.net/fkfe1thgaQUdS4jKNPRwv?auto=compress,format"></figure><p>You can add short instructions in your own words, such as:</p><ul><li><p>“Focus on 2 key improvements only”</p></li><li><p>“Reference our class discussions where relevant”</p></li><li><p>“Keep feedback encouraging but specific”</p></li></ul><p>You don’t need to phrase this technically. Just write it the way you would say it.</p><p>These choices shape how the feedback feels—and how students respond to it.</p><h2 class="text-xl" data-toc-id="37df114f-9132-40e3-9cde-355c53efce41" id="37df114f-9132-40e3-9cde-355c53efce41">Step 4: Decide how much is AI, and how much is you</h2><p>One of the most useful questions is also one of the most practical:</p><p><strong>How much do you want the AI to do?</strong></p><p>Some teachers:</p><ul><li><p>use AI to draft most of the feedback and then review it</p></li><li><p>guide it more closely with detailed instructions</p></li><li><p>use it mainly for structure, while keeping a stronger personal voice</p></li></ul><p>There’s no single right answer. What matters is that the decision is visible.</p><blockquote><p>A common mistake is trying to include too much. 
Strong templates usually focus on a small number of priorities, rather than covering everything.</p></blockquote><h2 class="text-xl" data-toc-id="c117444d-fde7-4db2-b5e9-45d5c2b355fb" id="c117444d-fde7-4db2-b5e9-45d5c2b355fb">Step 5: Look at the template</h2><p>At this point, you’ll see the full template—the actual instructions being sent to the AI.</p><figure data-align="center" data-size="full" data-id="NsisUp76zP1jjffgRSZlE" data-version="v2" data-type="image"><img data-id="NsisUp76zP1jjffgRSZlE" src="https://tribe-s3-production.imgix.net/NsisUp76zP1jjffgRSZlE?auto=compress,format"></figure><p>You don’t need to edit it yet. Just scan it.</p><p>Over time, many teachers begin to notice patterns:</p><ul><li><p>how tone is expressed</p></li><li><p>how structure is built</p></li><li><p>how instructions shape the output</p></li></ul><p>The system becomes more transparent and easier to control.</p><h2 class="text-xl" data-toc-id="216d1062-2f0a-499c-a634-9a4415d913ac" id="216d1062-2f0a-499c-a634-9a4415d913ac">Step 6: Preview what students will see</h2><p>A template can look good in theory but produce feedback that feels too broad or too generic.</p><p>Before saving, preview the output.</p><figure data-align="center" data-size="full" data-id="I2jgus4y1V9F9tQBTGcbQ" data-version="v2" data-type="image"><img data-id="I2jgus4y1V9F9tQBTGcbQ" src="https://tribe-s3-production.imgix.net/I2jgus4y1V9F9tQBTGcbQ?auto=compress,format"></figure><p>This helps you check:</p><ul><li><p>whether the tone feels right</p></li><li><p>whether the level of detail is appropriate</p></li><li><p>whether the feedback is actually usable</p></li></ul><p>This step often reveals more than rereading the template itself.</p><h2 class="text-xl" data-toc-id="f39c5605-ea8c-4cff-9e3b-b216f62de5a7" id="f39c5605-ea8c-4cff-9e3b-b216f62de5a7">Step 7: Save and refine over time</h2><p>Once the template feels right, save it with a clear name:</p><ul><li><p>“Year 10 English – Essays”</p></li><li><p>“IB – Formative 
Writing”</p></li><li><p>“Quick Homework Feedback”</p></li></ul><p>From there, you can reuse and refine it.</p><p>Your first version doesn’t need to be perfect. Most teachers improve templates over time by:</p><ul><li><p>seeing how students respond</p></li><li><p>adjusting the level of detail</p></li><li><p>tightening the focus</p></li></ul><h2 class="text-xl" data-toc-id="00b8f59e-9c46-49fa-b67d-a6e36bded944" id="00b8f59e-9c46-49fa-b67d-a6e36bded944">A simple check</h2><p>After using your template, ask:</p><p><strong>If a student reads this, will they know exactly what to do next?</strong></p><p>If the answer is unclear:</p><ul><li><p>reduce the number of points</p></li><li><p>make actions more explicit</p></li><li><p>focus on one or two priorities</p></li></ul><p>That one question improves most templates.</p><h2 class="text-xl" data-toc-id="6165fed9-28fc-4cca-88a4-72ed3a358276" id="6165fed9-28fc-4cca-88a4-72ed3a358276">Where teachers still matter most</h2><p>Even with strong templates, teacher judgment remains essential.</p><p>Templates can structure feedback.<br>But they don’t replace:</p><ul><li><p>your knowledge of the class</p></li><li><p>your understanding of student misconceptions</p></li><li><p>your sense of what each student needs next</p></li></ul><p>The most effective approach is:</p><ul><li><p>the template provides a strong starting point</p></li><li><p>the teacher adds context, precision, and judgment</p></li></ul><h2 class="text-xl" data-toc-id="17dc511d-3d7f-4877-92b3-bf52e669303b" id="17dc511d-3d7f-4877-92b3-bf52e669303b">Bringing it together</h2><p>Across all of this, the idea is consistent:</p><ul><li><p>feedback works when students can act on it</p></li><li><p>structure makes feedback usable</p></li><li><p>templates make that structure consistent</p></li></ul><p>The goal is not to control every word the AI writes.<br>It is to ensure that the feedback students receive 
is:</p><ul><li><p><strong>focused</strong></p></li><li><p><strong>structured</strong></p></li><li><p><strong>easy to act on</strong></p></li></ul><p>Once that is in place, everything else becomes easier.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Why Good Feedback Still Doesn’t Improve Student Work — and 5 Structures That Help Fix It]]></title>
            <description><![CDATA[Teachers have always known how to give feedback. We highlight strengths, point out areas for improvement, and try to encourage students while being honest. And now, with AI, generating feedback is ...]]></description>
            <link>https://community.ta-39.com/best-practices-4mkl4fls/post/why-good-feedback-still-doesn-t-improve-student-work----and-how-to-fix-it-jIK2F1K6h4tgODq</link>
            <guid isPermaLink="true">https://community.ta-39.com/best-practices-4mkl4fls/post/why-good-feedback-still-doesn-t-improve-student-work----and-how-to-fix-it-jIK2F1K6h4tgODq</guid>
            <category><![CDATA[Feedback Template]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Mon, 23 Mar 2026 17:58:10 GMT</pubDate>
            <content:encoded><![CDATA[<p>Teachers have always known how to give feedback. We highlight strengths, point out areas for improvement, and try to encourage students while being honest. And now, with AI, generating feedback is faster than ever.</p><p>But there’s a persistent problem: <strong>a lot of feedback still doesn’t lead to better student work</strong>. It may sound helpful. It may look detailed. It may even feel thoughtful. But when students go back to revise, they often still don’t know what to do next.</p><h2 class="text-xl" data-toc-id="6de24cd2-1603-4107-9faa-7a938043b6c8" id="6de24cd2-1603-4107-9faa-7a938043b6c8">The problem is not effort. It is design.</h2><p>In recent years, many tools have focused on helping teachers generate more feedback, faster. That can be useful. But more feedback does not automatically mean more learning.</p><p>Students often struggle with feedback because it is:</p><ul><li><p>too broad</p></li><li><p>too vague</p></li><li><p>too generic</p></li><li><p>too focused on what went wrong instead of what to do next</p></li></ul><p>As a result, students read the comments, but don’t make meaningful changes.</p><h2 class="text-xl" data-toc-id="704d26b2-bbe8-4922-ac39-851818c4bd64" id="704d26b2-bbe8-4922-ac39-851818c4bd64">What effective feedback needs to do</h2><p>Feedback only helps when it enables students to close the gap between where they are and where they need to be. That means good feedback is not only descriptive or evaluative. It is <strong>instructional</strong>.</p><p>Useful feedback tells the student:</p><ul><li><p>what to change</p></li><li><p>where to change it</p></li><li><p>how to improve it</p></li><li><p>what to do next</p></li></ul><p>And it usually doesn’t try to fix everything at once. 
The strongest feedback is:</p><ul><li><p><strong>focused</strong> on a small number of priorities</p></li><li><p><strong>specific</strong> about where the issue appears</p></li><li><p><strong>actionable</strong> in its next steps</p></li><li><p><strong>usable</strong> by the student in revision</p></li></ul><h2 class="text-xl" data-toc-id="7924238d-74e4-4e42-be5a-7bcf69318974" id="7924238d-74e4-4e42-be5a-7bcf69318974">Where AI feedback often falls short</h2><p>AI can make feedback faster, but speed is not the same as usefulness.</p><p>A lot of AI-generated feedback sounds polished and balanced. It may include praise, critique, and rubric language. But often it still:</p><ul><li><p>describes rather than diagnoses</p></li><li><p>summarizes rather than guides</p></li><li><p>comments rather than instructs</p></li></ul><p>In other words, AI feedback is often <strong>linguistically strong but instructionally weak</strong>. It can sound precise without actually helping the student take action.</p><p>The missing piece is not language quality — it is <strong>structure and prioritization</strong>.</p><h2 class="text-xl" data-toc-id="43d9ee2b-ea6f-4bd3-a901-81ae5355b896" id="43d9ee2b-ea6f-4bd3-a901-81ae5355b896">Why structure matters</h2><p>One of the simplest ways to improve feedback is to give it a clearer structure.</p><p>When feedback is unstructured, we often fall into familiar patterns: a compliment, a general criticism, and a broad suggestion. It sounds reasonable, but it often leaves students wondering:</p><p><strong>“What exactly should I change?”</strong></p><p>Structured feedback helps reduce that ambiguity. It encourages you to:</p><ul><li><p>point to specific moments in the work</p></li><li><p>focus on a small number of priorities</p></li><li><p>include clear next steps</p></li></ul><p>It also reduces cognitive overload. 
Instead of trying to interpret a long list of comments, students are guided toward a small number of clear, prioritized actions — which makes feedback far more likely to be used.</p><h2 class="text-xl" data-toc-id="6373aca0-46c4-4f48-b68f-2dbe666a1288" id="6373aca0-46c4-4f48-b68f-2dbe666a1288">Where these structures come from</h2><p>These structures are not arbitrary formats. They are simplified, classroom-ready versions of well-established feedback approaches.</p><ul><li><p><strong>Task–Process–Self-Regulation</strong> comes from one of the most influential feedback models, showing that feedback is most effective when it focuses on the task, the strategy used, and how students monitor their own work.</p></li><li><p><strong>Feed-forward</strong> reflects a core idea in formative assessment: feedback only works when it leads to clear next steps.</p></li><li><p><strong>Criterion-focused feedback</strong> comes from standards-based systems (IB, AP, Cambridge), where performance is defined against explicit criteria.</p></li><li><p><strong>SBI (Situation–Behavior–Impact)</strong> is commonly used in coaching and instructional feedback to improve clarity and precision.</p></li><li><p><strong>The sandwich approach</strong> is widely used in classrooms, though its effectiveness depends on how clearly the improvement point is communicated.</p></li></ul><p>These are best understood not as rigid templates, but as <strong>practical ways to apply research-backed principles in everyday teaching</strong>.</p><h2 class="text-xl" data-toc-id="9d5c993c-bdb2-465a-8ab6-b0f6cfc89792" id="9d5c993c-bdb2-465a-8ab6-b0f6cfc89792">1. 
Feed-forward: focus on what comes next</h2><p>Feed-forward starts with a simple question:</p><p><strong>What should the student do next?</strong></p><p>Instead of spending most of the feedback explaining what was wrong, this structure emphasizes the next step the student can take.</p><p>Examples:</p><ul><li><p>“To strengthen this, your next step is to explain how each image creates meaning.”</p></li><li><p>“After each quote, add one sentence beginning with ‘This suggests…’”</p></li></ul><p>This works especially well for:</p><ul><li><p>drafts and revisions</p></li><li><p>writing tasks</p></li><li><p>building momentum in learning</p></li></ul><p>Why it helps:</p><ul><li><p>it keeps feedback manageable</p></li><li><p>it directs attention toward action</p></li><li><p>it supports immediate revision</p></li></ul><h2 class="text-xl" data-toc-id="933341a5-e39d-4c41-8ceb-a91f615ccf70" id="933341a5-e39d-4c41-8ceb-a91f615ccf70">2. Criterion-focused: make expectations visible</h2><p>Students often struggle to understand how their work aligns with a rubric.</p><p>Criterion-focused feedback addresses each criterion directly. It uses evidence from the student’s work to show what meets expectations and what is needed to improve.</p><p>Examples:</p><ul><li><p>“When you wrote ‘…’, this shows you can identify techniques.”</p></li><li><p>“To reach the next level, you need to explain how those techniques create meaning.”</p></li></ul><p>This works best for:</p><ul><li><p>rubric-based assignments</p></li><li><p>summative tasks</p></li><li><p>exam preparation</p></li></ul><p>Why it helps:</p><ul><li><p>it reduces ambiguity</p></li><li><p>it makes expectations clearer</p></li><li><p>it supports fairness and consistency</p></li></ul><h2 class="text-xl" data-toc-id="cae54a29-214f-4577-acf8-db391484696b" id="cae54a29-214f-4577-acf8-db391484696b">3. SBI: improve precision</h2><p>A common problem in feedback is vagueness. 
SBI improves precision by structuring feedback into three parts:</p><ul><li><p><strong>Situation</strong>: where in the work</p></li><li><p><strong>Behavior</strong>: what the student did</p></li><li><p><strong>Impact</strong>: why it matters</p></li></ul><p>Example:</p><p>“In your third paragraph, you introduce a new idea without linking it back, which makes your argument harder to follow.”</p><p>This works well for:</p><ul><li><p>writing tasks</p></li><li><p>problem-solving</p></li><li><p>targeted feedback on specific moments</p></li></ul><p>Why it helps:</p><ul><li><p>students can locate exactly what you’re referring to</p></li><li><p>it keeps feedback focused on actions, not traits</p></li><li><p>it makes feedback easier to act on</p></li></ul><h2 class="text-xl" data-toc-id="055f7fb2-4758-4699-851c-7c7860d72121" id="055f7fb2-4758-4699-851c-7c7860d72121">4. Task–Process–Self-Regulation: support deeper learning</h2><p>Some feedback helps students fix this piece of work. Other feedback helps them improve across tasks. This structure aims to do both.</p><p>It looks at three levels:</p><ul><li><p><strong>Task</strong> — Is the answer correct? 
What’s missing?</p></li><li><p><strong>Process</strong> — How did the student approach the task?</p></li><li><p><strong>Self-regulation</strong> — How can they check and improve their work next time?</p></li></ul><p>Example:</p><ul><li><p>Task: “Your calculation is correct.”</p></li><li><p>Process: “You used the quadratic formula, but didn’t check for sign errors.”</p></li><li><p>Self-regulation: “Before submitting, check each negative sign.”</p></li></ul><p>This works best for:</p><ul><li><p>maths and science</p></li><li><p>extended writing</p></li><li><p>building independent learners</p></li></ul><p>Why it helps:</p><ul><li><p>it goes beyond correction</p></li><li><p>it builds transferable strategies</p></li><li><p>it supports long-term improvement</p></li></ul><h2 class="text-xl" data-toc-id="09824002-ce39-458a-86f1-ed776a909c8a" id="09824002-ce39-458a-86f1-ed776a909c8a">5. Sandwich: use with care</h2><p>The sandwich structure—praise, improvement, encouragement—is widely used.</p><p>It can help:</p><ul><li><p>support confidence</p></li><li><p>reduce defensiveness</p></li><li><p>make feedback feel more approachable</p></li></ul><p>This works best for:</p><ul><li><p>younger students</p></li><li><p>early drafts</p></li><li><p>students who need encouragement</p></li></ul><p>However, if overused, it can:</p><ul><li><p>blur the main message</p></li><li><p>feel repetitive</p></li><li><p>reduce clarity about what needs to change</p></li></ul><p>Keeping it simple helps: 1–2 strengths, 1 clear area for growth, and specific guidance.</p><h2 class="text-xl" data-toc-id="47e6f682-20f3-4b3b-8520-4cb03b3c6891" id="47e6f682-20f3-4b3b-8520-4cb03b3c6891">Choosing the right structure</h2><p>The most important decision is not how much feedback to give, but <strong>what kind of feedback the situation calls for</strong>.</p><p>A useful way to think about this:</p><p><strong>1. 
What is the goal?</strong></p><ul><li><p>Improve the current draft → <strong>Feed-forward</strong></p></li><li><p>Explain a grade → <strong>Criterion-focused</strong></p></li><li><p>Build long-term thinking → <strong>Task–Process–Self-Regulation</strong></p></li></ul><p><strong>2. What does the student need most right now?</strong></p><ul><li><p>Clarity about a specific issue → <strong>SBI</strong></p></li><li><p>Confidence and encouragement → <strong>Sandwich</strong></p></li></ul><p><strong>3. How complex is the task?</strong></p><ul><li><p>Simple or early-stage → <strong>Feed-forward</strong> or <strong>SBI</strong></p></li><li><p>Complex or multi-step → <strong>Task–Process–Self-Regulation</strong></p></li></ul><p>These structures are not interchangeable. They are <strong>different tools for different instructional moments</strong>.</p><h2 class="text-xl" data-toc-id="f58d50e7-4b14-4f6b-82ac-44ad1f14c818" id="f58d50e7-4b14-4f6b-82ac-44ad1f14c818">Where AI helps — and where teachers matter most</h2><p>With the right structure in place, AI can be very useful. 
It can:</p><ul><li><p>generate a strong first draft</p></li><li><p>improve consistency</p></li><li><p>reduce repetitive writing</p></li><li><p>keep feedback aligned with criteria</p></li></ul><p>But it does not replace teacher judgment.</p><p>Teachers still know:</p><ul><li><p>what has been taught</p></li><li><p>what misconceptions are common</p></li><li><p>how a class has progressed</p></li><li><p>what will motivate a particular student</p></li></ul><p>So AI-generated feedback works best as a starting point, not the final version.</p><h2 class="text-xl" data-toc-id="afdbd83d-d466-422e-9183-d37b415db429" id="afdbd83d-d466-422e-9183-d37b415db429">A simple test</h2><p>A useful question to ask:</p><p><strong>If a student reads this, will they know exactly what to do next?</strong></p><p>If the answer is no, the feedback likely needs:</p><ul><li><p>fewer points</p></li><li><p>more specificity</p></li><li><p>clearer actions</p></li></ul><p>That is often the difference between feedback students read and feedback students actually use.</p><h2 class="text-xl" data-toc-id="25ba63e8-ded2-45a6-855d-e8688c899a95" id="25ba63e8-ded2-45a6-855d-e8688c899a95">A final thought</h2><p>This is why, in TA39, feedback templates are not just formatting tools. They are ways of making feedback more intentional — helping teachers apply structure, focus, and clarity consistently.</p><p>In practice, many teachers combine elements from more than one structure.</p><p>The goal isn’t more feedback.<br>It’s feedback students can actually use.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[From ‘Good Enough’ to Clear and Fair: Upgrading Rubrics for AI Support]]></title>
            <description><![CDATA[You know that feeling when you’re grading the 12th essay in a row and silently thanking Past You for taking the time to write a good rubric?

This post is about that version of you.

And it’s also about ...]]></description>
            <link>https://community.ta-39.com/best-practices-4mkl4fls/post/from-good-enough-to-clear-and-fair-upgrading-rubrics-for-ai-support-CXigTPemrqPxM51</link>
            <guid isPermaLink="true">https://community.ta-39.com/best-practices-4mkl4fls/post/from-good-enough-to-clear-and-fair-upgrading-rubrics-for-ai-support-CXigTPemrqPxM51</guid>
            <category><![CDATA[Rubric]]></category>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Fri, 14 Nov 2025 19:52:00 GMT</pubDate>
            <content:encoded><![CDATA[<p>You know that feeling when you’re grading the 12th essay in a row and silently thanking Past You for taking the time to write a good rubric?</p><p>This post is about that version of you.</p><p>And it’s also about what happens when we bring AI assistants like TA39 into the mix—because suddenly your rubric isn’t just a guide for you. It becomes part of how the AI understands quality, applies standards, and generates feedback.</p><h2 id="d9de78f9-2d5c-4054-9145-6f197da729e2" data-toc-id="d9de78f9-2d5c-4054-9145-6f197da729e2" class="text-xl">Why rubrics matter even more with AI</h2><p>Most teachers already believe in rubrics. We’ve seen how they:</p><ul><li><p>make expectations visible for students</p></li><li><p>support fairer grading</p></li><li><p>help us give faster, more focused feedback</p></li></ul><p>But when you bring AI into the process—whether it’s co-grading, suggesting feedback, or helping you review student work—your rubric stops being just a helpful tool and becomes a much more active part of the system.</p><p>Humans can work around a fuzzy rubric.<br>AI can’t.</p><p>It will follow what’s written, not what you meant.</p><p>If your rubric says “uses evidence effectively,” you and I bring years of classroom experience, content knowledge, and judgment to that phrase.</p><p>An AI doesn’t have your gut. 
It has your words.</p><p>So if those words are vague, overlapping, or open to interpretation, the AI will still do its best—but its “best” may not look much like yours.</p><h2 id="d6cde1be-8ab3-4a10-8ed5-15b6bfb0b5a5" data-toc-id="d6cde1be-8ab3-4a10-8ed5-15b6bfb0b5a5" class="text-xl">What AI is actually doing with your rubric</h2><p>Tools like TA39 are not just “scoring” writing.</p><p>They are doing something more demanding:</p><ul><li><p>reading your rubric as a set of rules</p></li><li><p>matching student work to those rules</p></li><li><p>explaining that reasoning in language a student can understand</p></li><li><p>sometimes aligning evidence or comments to specific parts of the work</p></li></ul><p>That is a lot to ask from a rubric.</p><p>If the rubric is clear, the AI has a much better chance of doing work you will recognize and trust.</p><p>If the rubric is fuzzy, the AI will still produce an answer—but now it is filling in gaps you may not have realized were there.</p><p>That is where problems tend to start.</p><h2 id="ecc720f2-fe39-4812-b989-4974c316afc4" data-toc-id="ecc720f2-fe39-4812-b989-4974c316afc4" class="text-xl">Three quiet ways a “good enough” rubric breaks down with AI</h2><p>Here are a few common patterns.</p><h3 id="7e8193d6-1570-44c2-b3a5-881c917eaf51" data-toc-id="7e8193d6-1570-44c2-b3a5-881c917eaf51" class="text-lg">1. The double-counting problem</h3><p>Imagine a rubric with these two criteria:</p><ul><li><p>Use of Evidence</p></li><li><p>Accuracy &amp; Relevance of Details</p></li></ul><p>On paper, that can look fine.</p><p>In practice, they often end up judging the same thing twice.</p><p>A human teacher may balance that out without even noticing. An AI is more likely to follow the structure literally.</p><p>That means a strong quote might get rewarded in both places. Or one weak detail might affect multiple criteria.</p><p>The issue is not that the AI is doing something wrong. 
The issue is that the criteria overlap more than we may realize when we write them.</p><h3 id="04359dd0-d2e0-4ed7-8937-adcbb0faa2b9" data-toc-id="04359dd0-d2e0-4ed7-8937-adcbb0faa2b9" class="text-lg">2. Vague level descriptions</h3><p>Consider level labels like:</p><ul><li><p>4 – Strong thesis and clear ideas</p></li><li><p>3 – Mostly clear ideas</p></li><li><p>2 – Somewhat unclear ideas</p></li><li><p>1 – Limited clarity</p></li></ul><p>We all know what those feel like when we’re grading.</p><p>But what does “mostly clear” mean in practice?</p><p>Is it about organization?<br>Sentence clarity?<br>The thesis?<br>How much confusion is too much?</p><p>An AI has to pick a definition. And once it does, it will apply that definition very consistently—even if it is not the one you had in mind.</p><p>So you end up with scoring that is consistent in one sense, but not necessarily aligned with your standard.</p><h3 id="3dcdaf48-2230-4102-947e-8070963357a0" data-toc-id="3dcdaf48-2230-4102-947e-8070963357a0" class="text-lg">3. The holistic fog</h3><p>Sometimes a rubric is really a checklist wrapped in a paragraph.</p><p>For example:</p><blockquote><p>Student shows strong writing skills, with good organization, appropriate vocabulary, and clear ideas.</p></blockquote><p>That might sit under one criterion such as “Overall Writing Quality.”</p><p>Humans are often quite good at reading that as a holistic judgment.</p><p>AI can mimic that. But when we also ask it to justify the decision—Why is this a 3 instead of a 4? What in the writing supports that?—the fuzziness becomes harder to ignore.</p><p>Holistic criteria are not wrong. 
But they are harder to use as the main engine for AI-supported feedback.</p><h2 id="35a1ddf9-44f9-4fdd-8856-ac0d5624e535" data-toc-id="35a1ddf9-44f9-4fdd-8856-ac0d5624e535" class="text-xl">What a stronger rubric usually looks like</h2><p>The goal is not to turn your rubric into a legal document.</p><p>The goal is to make your professional judgment more visible and easier to apply consistently—for you, for students, and for the AI.</p><p>A few shifts make a big difference.</p><h3 id="5a07c805-37c0-4adf-bd3e-f57d419a18f7" data-toc-id="5a07c805-37c0-4adf-bd3e-f57d419a18f7" class="text-lg">Make levels more observable</h3><p>Instead of:</p><blockquote><p>Uses evidence effectively</p></blockquote><p>try:</p><blockquote><p>Includes at least 3 pieces of relevant evidence that support the claim, and explains how each piece connects to the argument.</p></blockquote><p>Now:</p><ul><li><p>you know what you are looking for</p></li><li><p>the AI knows what it is looking for</p></li><li><p>the student knows what stronger work looks like</p></li></ul><h3 id="fb531a30-7380-4d27-8c4d-cbaaf4891eb5" data-toc-id="fb531a30-7380-4d27-8c4d-cbaaf4891eb5" class="text-lg">Separate criteria that are doing different jobs</h3><p>If one criterion says:</p><blockquote><p>Organization and language are clear</p></blockquote><p>that is probably two things:</p><ul><li><p>how ideas are structured</p></li><li><p>how clearly those ideas are expressed</p></li></ul><p>When those are separated, the feedback becomes more useful too. 
A student can see whether the issue is with structure, sentence-level clarity, or both.</p><h3 id="db1e92a0-f89e-4bcd-9d43-84e6e2ee52b3" data-toc-id="db1e92a0-f89e-4bcd-9d43-84e6e2ee52b3" class="text-lg">Anchor levels with clearer differences</h3><p>Instead of:</p><ul><li><p>4 – Strong thesis</p></li><li><p>3 – Clear thesis</p></li><li><p>2 – Weak thesis</p></li><li><p>1 – No thesis</p></li></ul><p>you might write:</p><ul><li><p>4 – Thesis is specific, arguable, and clearly placed. It previews the main reasons or points.</p></li><li><p>3 – Thesis states a clear main idea, but may be general or not fully preview the reasons.</p></li><li><p>2 – There is an attempt at a main idea, but it is vague, off-topic, or hard to locate.</p></li><li><p>1 – No identifiable thesis or main idea.</p></li></ul><p>That gives both you and the AI something more concrete to work with.</p><h2 id="375ec97e-61ff-4b94-8158-5689d99fa139" data-toc-id="375ec97e-61ff-4b94-8158-5689d99fa139" class="text-xl">Where TA39 fits in: “Optimize with AI”</h2><p>In TA39, this is exactly the kind of work <strong>Optimize with AI</strong> is designed to support.</p><p>You start with the rubric you already have. Then TA39 can help you review it by identifying places where the wording may be too vague, where levels overlap, or where performance distinctions are harder to apply consistently.</p><p>It can suggest a revised version that keeps your general intent, but makes the structure clearer and the level descriptions more explicit.</p><p>That matters because this feature is not just about formatting. It replaces the old idea of a separate rubric-conversion step with something more useful: a review-and-improvement step inside the rubric workflow itself.</p><p>Think of it less as a judge and more as a rubric coach.</p><p>It does not replace your expertise. 
It helps surface parts of the rubric that may need clearer language before you save and use it.</p><figure data-type="image" data-version="v2" data-id="qdQyFqLOFGXUJmjfReeo3" data-size="original" data-align="center"><img src="https://tribe-s3-production.imgix.net/qdQyFqLOFGXUJmjfReeo3?auto=compress,format" data-id="qdQyFqLOFGXUJmjfReeo3"></figure><p><br><em>Optimize with AI helps identify vague wording, overlapping distinctions, and places where a rubric can be made clearer before you save it.</em></p><p>If you want guidance on that workflow, see <a href="https://community.ta-39.com/best-practices/post/rubrics-formatting-guidelines-for-ta39-q0b4AvapKcTbOMh" rel="noopener noreferrer nofollow" class="text-interactive hover:text-interactive-hovered">Rubrics formatting guidelines for TA39</a>.</p><h2 id="e7775fd0-9a8b-48f6-ae32-699c006bfbe1" data-toc-id="e7775fd0-9a8b-48f6-ae32-699c006bfbe1" class="text-xl">Why this is not about replacing teacher judgment</h2><p>A clearer rubric does not replace your judgment. It documents it.</p><p>It helps when students ask why they received a score. It supports consistency across classes or teaching teams. It gives students a fairer picture of what strong work looks like.</p><p>And yes, it helps an AI assistant work more in line with your standards instead of guessing at them.</p><p>When you tighten a rubric, you are not giving up control. 
You are making your expectations easier to see and easier to apply.</p><h2 id="e0588d92-e18d-4685-8f41-21148db28bc4" data-toc-id="e0588d92-e18d-4685-8f41-21148db28bc4" class="text-xl">A simple starting point</h2><p>If you want to improve a rubric without rewriting the whole thing, start small.</p><p>Pick one criterion you care about most—thesis, evidence, analysis, organization.</p><p>Then ask:</p><ul><li><p>Could two different teachers reasonably agree on what this level means?</p></li><li><p>Is there at least one observable feature a student could point to?</p></li><li><p>Are any of the level descriptions relying too much on words like strong, effective, limited, or sufficient?</p></li><li><p>Would clearer wording make the feedback more useful?</p></li></ul><p>If you are using TA39, run that criterion through <strong>Optimize with AI</strong> and see what it suggests. Treat it the way you would treat feedback from a colleague: accept what helps, revise what needs adjustment, and ignore what does not fit your assignment.</p><p>Do that for one criterion, then another.</p><p>Over time, your rubric becomes more useful not just for AI support, but for students and teachers too.</p><h2 id="dd11378f-ada3-4238-9363-beda214a3f2a" data-toc-id="dd11378f-ada3-4238-9363-beda214a3f2a" class="text-xl">Final thought</h2><p>AI will not fix a fuzzy rubric.</p><p>If anything, it will make the fuzziness easier to notice.</p><p>But when you give it a clear, thoughtful, well-structured rubric—the kind many teachers are already close to writing—it can support some of the best parts of your practice:</p><ul><li><p>clearer expectations</p></li><li><p>more consistent grading</p></li><li><p>more actionable feedback</p></li></ul><p>That work has always mattered.</p><p>What has changed is that tools like TA39 make it easier to see which parts of a rubric are doing that work well—and which parts need a clearer next draft.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[You Can Now Create Exemplars for a Rubric — Powered by AI]]></title>
            <description><![CDATA[Hello educators,

We’ve heard from teachers looking for a way to generate exemplar responses aligned to their rubric and assignment prompt — sometimes to help clarify expectations for students, other ...]]></description>
            <link>https://community.ta-39.com/basic-2-column-umfls66p/post/you-can-now-create-exemplars-for-any-rubric----powered-by-ai-1S4jZwTapdze0XR</link>
            <guid isPermaLink="true">https://community.ta-39.com/basic-2-column-umfls66p/post/you-can-now-create-exemplars-for-any-rubric----powered-by-ai-1S4jZwTapdze0XR</guid>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Thu, 15 May 2025 15:28:33 GMT</pubDate>
            <content:encoded><![CDATA[<p>Hello educators,</p><p>We’ve heard from teachers looking for a way to generate exemplar responses aligned to their rubric and assignment prompt — sometimes to help clarify expectations for students, other times to explore how the AI interprets scoring before grading begins.</p><p>Whether you're modeling what an A-, B+, or “Meets Standard” response looks like, or checking how your rubric is being applied, this feature is now available in TA39.</p><figure data-align="left" data-size="half" data-id="Rl71XG5sXb1jfhl2pyBtr" data-version="v2" data-type="image"><img data-id="Rl71XG5sXb1jfhl2pyBtr" src="https://tribe-s3-production.imgix.net/Rl71XG5sXb1jfhl2pyBtr?auto=compress,format&amp;dl"></figure><h3 class="text-lg" data-toc-id="cdce4bf7-aca1-41e3-8bcc-be4616d01ac8" id="cdce4bf7-aca1-41e3-8bcc-be4616d01ac8">What You Can Now Do</h3><p>You can now generate exemplar responses using an assignment’s rubric and student-facing prompt.</p><p>These are student-style essays designed to reflect the score levels you define — such as “Approaching Standard” or “Exceeds Expectations.”</p><p>You select the target ratings for each rubric criterion. 
TA39 creates a sample response that fits.</p><p>You can review the result, make edits if needed, and save it with the assignment.</p><h3 class="text-lg" data-toc-id="c8cd2453-f4d8-42f6-bd41-7f955a0503ba" id="c8cd2453-f4d8-42f6-bd41-7f955a0503ba">What We’ve Heard from Teachers</h3><p>Teachers have told us they plan to use exemplars:</p><ul><li><p>To help students see grounded examples of writing at different performance levels</p></li><li><p>To check that the rubric and AI interpretation are aligned before any grading begins</p></li><li><p>To support new rubrics or writing units where examples can clarify expectations</p></li></ul><h3 class="text-lg" data-toc-id="1903b239-bfa7-4f6a-8d45-0d42c8e23264" id="1903b239-bfa7-4f6a-8d45-0d42c8e23264">You Can Still Upload Your Own</h3><p>You can continue to upload real student samples if you prefer. Both upload and generation options are supported, and both are stored alongside the assignment for future use or reference.</p><h3 class="text-lg" data-toc-id="b0efa74c-ab67-497a-a802-e13be493825c" id="b0efa74c-ab67-497a-a802-e13be493825c">How to Get Started</h3><ol><li><p>Open any assignment and go to the Exemplars tab</p></li><li><p>Click Generate Exemplars</p></li><li><p>Select the rubric scores you want the sample to reflect</p></li><li><p>TA39 will generate a response that matches your selections</p></li><li><p>Review, edit, and save</p></li></ol><figure data-align="left" data-size="half" data-id="tNzYcerOaUH6VaBPHTRLc" data-version="v2" data-type="image"><img data-id="tNzYcerOaUH6VaBPHTRLc" src="https://tribe-s3-production.imgix.net/tNzYcerOaUH6VaBPHTRLc?auto=compress,format&amp;dl"></figure><p>These exemplars are now part of the assignment and can be reused, shared with students, or compared to actual submissions.</p><h3 class="text-lg" data-toc-id="6353305e-fb8c-4ab6-b16c-64166b0545f1" id="6353305e-fb8c-4ab6-b16c-64166b0545f1">Coming Soon: AI That Learns From Your Exemplars</h3><p>In an upcoming release, TA39 will begin using these exemplars to guide how the AI gives feedback — 
helping it reflect your scoring approach more closely.</p><hr><p><strong>Try it out — generate a few exemplars, see how they align with your rubric, and use what’s helpful for your classroom.</strong></p><p>If something doesn’t feel quite right — or if you have ideas for how this could work better — we’d truly value your input.</p><p>Your feedback helps us steer TA39 in the direction it needs to go, grounded in your real-world experience — not assumptions.</p><p><strong>– The TA39 Team</strong></p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[✨ NEW: TA39 Copilot: Transform Feedback into Meaningful Conversations]]></title>
            <description><![CDATA[Hello Educators,

You told us students often need a safe way to discuss feedback—whether it's curiosity, confusion, or even constructively pushing back on a score.

We recognize that the most valuable ...]]></description>
            <link>https://community.ta-39.com/basic-2-column-umfls66p/post/new-ta39-copilot-transform-feedback-into-meaningful-conversations-t98LSruxg85pprV</link>
            <guid isPermaLink="true">https://community.ta-39.com/basic-2-column-umfls66p/post/new-ta39-copilot-transform-feedback-into-meaningful-conversations-t98LSruxg85pprV</guid>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Thu, 24 Apr 2025 15:25:10 GMT</pubDate>
            <content:encoded><![CDATA[<p><strong>Hello Educators,</strong></p><p>You told us students often need a safe way to discuss feedback—whether it's curiosity, confusion, or even constructively pushing back on a score.</p><p>We recognize that the most valuable learning happens not when feedback is delivered, but when students engage with it. The best teachers have always known that feedback should spark conversation and reflection—a starting point rather than an endpoint.</p><p>Introducing the <strong>TA39 Student Copilot Panel</strong>, an interactive dialogue students can engage with <strong>after reviewing their feedback and grades</strong>. It lets them comfortably ask questions, reflect, and clarify their understanding, giving you instant visibility into their interactions and learning opportunities.</p><figure data-align="center" data-size="full" data-id="0xV1xEB9IqmCz8Tvby3U4" data-version="v2" data-type="image"><img data-id="0xV1xEB9IqmCz8Tvby3U4" src="https://tribe-s3-production.imgix.net/0xV1xEB9IqmCz8Tvby3U4?auto=compress,format&amp;dl"></figure><h2 class="text-xl" data-toc-id="07015d75-3dc0-489d-b7ac-3adb48a78df8" id="07015d75-3dc0-489d-b7ac-3adb48a78df8">🚀 How It Works</h2><p>After students read their feedback and scores, they'll see a simple invitation:</p><p><code>"Shall we talk about it?"</code></p><p>The Copilot engages students with supportive prompts while allowing them to lead the conversation based on their specific needs. 
Students can:</p><ul><li><p>Ask questions about specific feedback</p></li><li><p>Request clarification on improvement areas</p></li><li><p>Discuss their perspective on assessments</p></li><li><p>Get guidance on making meaningful revisions</p></li></ul><p>Conversations stay anchored directly to each student's essay, rubric criteria, and personalized feedback, ensuring every interaction is relevant.</p><h2 class="text-xl" data-toc-id="02687c94-4528-4727-b743-0476af90be98" id="02687c94-4528-4727-b743-0476af90be98">📈 Real-Time Insight for Teachers</h2><p>Your TA39 dashboard now provides:</p><ul><li><p><strong>Visibility</strong> into student participation</p></li><li><p><strong>Conversation archives</strong> readily available for your review</p></li><li><p><strong>Real-time monitoring</strong> to identify where you can step in to help</p></li></ul><h2 class="text-xl" data-toc-id="daf5df8f-ce38-483d-8d95-16b24d5fd4e3" id="daf5df8f-ce38-483d-8d95-16b24d5fd4e3">✅ Classroom Management</h2><ul><li><p><strong>Instant Control:</strong> Pause or disable the panel for any student at any time</p></li><li><p><strong>Automatic Alignment:</strong> Conversations clear if you update or regenerate feedback</p></li></ul><p>Explore the TA39 Copilot Panel today and help students engage more deeply with their feedback while creating more meaningful conversations.</p><p>Warm regards,</p><p><strong>The TA39 Team</strong></p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Organizing Assignments: Folders in TA39]]></title>
            <description><![CDATA[Hello,

You've shared with us that as the number of assignments grows in TA39, it can become difficult to keep them organized — especially with multiple class periods or courses.

We’ve added ...]]></description>
            <link>https://community.ta-39.com/basic-2-column-umfls66p/post/organizing-assignments-folders-in-ta39-5d8F8faYPAXjAQv</link>
            <guid isPermaLink="true">https://community.ta-39.com/basic-2-column-umfls66p/post/organizing-assignments-folders-in-ta39-5d8F8faYPAXjAQv</guid>
            <dc:creator><![CDATA[TA39 Team]]></dc:creator>
            <pubDate>Tue, 25 Mar 2025 22:23:30 GMT</pubDate>
            <content:encoded><![CDATA[<p>Hello,</p><p>You've shared with us that as the number of assignments grows in TA39, it can become difficult to keep them organized — especially with multiple class periods or courses.</p><p>We’ve added support for&nbsp;<strong>Folders</strong>&nbsp;in your&nbsp;<strong>My Assignments</strong>&nbsp;view. This is a small but meaningful change, designed to make your workspace more manageable.</p><h2 class="text-xl" data-toc-id="785bcd87-bc06-4af2-bfb8-f3a7df277d8c" id="785bcd87-bc06-4af2-bfb8-f3a7df277d8c">What You Can Now Do</h2><p><strong>Create Folders</strong><br>Group assignments by class period, unit, or course — whatever structure works best for you.</p><p><strong>Move Assignments</strong><br>From the “...” action on the right side, you can move assignments into folders or create new ones.</p><p><strong>Filter Within Folders</strong><br>Each folder view lets you apply basic filters, like by grade or due date, so it’s easier to find what you need.</p><p><strong>Update or Delete Folders</strong><br>You can rename folders as your course evolves, or delete them when they’re no longer needed.</p><p><strong>See Everything at Once</strong><br>At the top level — outside any specific folder — you can still view all of your assignments in one place, across folders. Nothing is hidden.</p><h3 class="text-lg" data-toc-id="f758b9ea-4b0f-451f-a78a-7c04880456f7" id="f758b9ea-4b0f-451f-a78a-7c04880456f7">Where to Find It</h3><p>Go to&nbsp;<strong>My Assignments</strong>&nbsp;in TA39. You’ll now see a folders panel on the left. Use the folder buttons to organize as needed.</p><p>As always, thank you for sharing your feedback.</p><p>If you have questions or suggestions, we’re here:&nbsp;<a class="text-interactive hover:text-interactive-hovered" rel="noopener noreferrer nofollow" href="mailto:support@ta-39.com"><strong>support@ta-39.com</strong></a></p>]]></content:encoded>
        </item>
    </channel>
</rss>