Accelerating innovation: New generative translation capabilities in Trados Studio 2024

Daniel Brockmann 02 May 2024 6 mins read
This blog was last updated on 12 June.
Since Generative AI entered the scene at the end of 2022, we’ve been presented with endless ways we can leverage LLMs throughout the translation workflow. When exploring potential use cases, I quickly realized an LLM may finally be the key to solving a particular scenario that I have been dreaming of addressing in Trados Studio for a very long time - let me explain…  
 
When working in a CAT tool, you can use a machine translation provider to generate translation suggestions. You can also rely on your CAT tool to highlight any known terms recognized from your termbase in the segment. However, you must then manually apply those terms to the translated output, increasing the time spent on each translation.
 
What if there were a way to automate this process? What if the CAT tool could automatically apply the terms as the machine translation results are populated and then display the enhanced suggestion to you for a final check? This would address one of the well-known challenges (and frustrations) with MT – the fact that it can be inconsistent in its use of terminology – exactly where a termbase can shine and make a difference. Sounds like a dream, right? Well, with the emergence of LLMs, we can now make this a reality.
 

How to leverage Generative AI in Trados  

But before diving into how we’ve enabled this within Trados Studio, I wanted to briefly step back and update you on how we’re integrating generative AI in Trados in general. We believe that everyone should have access to our continuous innovations, especially in the AI space, so we’ve been busy embedding generative AI capabilities throughout our desktop and cloud offerings for all to use.  
 

Generative Translation Engines in the cloud 

As you may know, Generative AI was initially introduced in our cloud platform last year to tune up and turbocharge translation engines. Now known as generative translation engines (GTEs), they combine your established linguistic data with the power of an LLM to provide higher quality translation results from the start. This capability is available for Trados Enterprise and Trados Accelerate users who have custom workflows. You can learn more about this exciting new development in this blog – and applying correct terminology to machine translation results is just one of the many use cases!
 

Generative Translation in Trados Studio 

Now going back to Trados Studio, we’re infusing generative AI here as well, so that translators can leverage this technology in a similar way. Trados Studio has always had three technologies at its core – translation memory, terminology and integrated machine translation. Now that AI is on the scene, there is a fourth technology you can bring to the party, opening up brand-new ways to translate and enhance the quality of your translation output.
 
We had previously released some AI-powered apps for Trados Studio that you may have heard of, such as OpenAI Translator and AI Professional, which originally allowed users to leverage an LLM to refine their work and tackle a wider range of linguistic challenges. However, what those AI apps were missing was the true generative translation capability that was formerly only available in our cloud platform.
 
Now, rather than tacking on AI to complement your existing translation resources, we’re very proud to be launching our new out-of-the-box AI functionality ‘Trados Copilot – AI Assistant’ (or AI Assistant for short) as part of our Trados Studio 2024 release. This innovation brings AI to the heart of Studio, embedding it in its core.  
 
Let’s explore a little more about what it can do: 
 

How does Trados Copilot – AI Assistant work? 

When I first saw a demo of AI Assistant and played with an early beta, I immediately recognized the potential it has for every Trados Studio user. This innovation effectively democratizes access to LLMs and enables everyone across the supply chain to leverage them - no matter your role.  
 
Let’s discuss some of the ways you can use AI Assistant: 
  • Use as a seamlessly integrated translation provider: You can use AI Assistant for initial translation, just as you would with any other machine translation provider. AI Assistant will also preserve formatting and tagging during this process – again bringing it on par with most other MT providers. You can also combine this with features such as LookAhead when translating interactively with it in the editor. 
  • Add an editing prompt when batch-translating: If you use AI Assistant as a translation provider, you can also prompt the LLM to adjust the output in any way you want and it will be applied during pre-translation. This helps you to make changes to translation output on a large scale, rather than implementing changes individually. 
  • Prompt for editing help in the desktop editor as you work: If you prefer to have finer control over how and when AI Assistant is applied to your translation output, you also have the option of applying the prompts interactively on a segment-by-segment basis while you work in the editor. The great thing about working this way is that you can apply the prompt to any translation, whether it was produced by the LLM, an MT provider, delivered as a translation memory match, or translated from scratch.
  • Automatically apply terminology: Excitingly, whether you use the LLM as a translation provider or prompt it for help while working in the editor, you can direct AI Assistant to take any terminology applied to the project into consideration when delivering translation suggestions, making my dream I mentioned above a reality! This works with any terminology provider, whether you’re using a local MultiTerm termbase, a server termbase, a cloud termbase or even a termbase from another provider, such as IATE. We call this new innovation ‘terminology-aware translation’. 
Just a small note to remember: LLMs are still relatively slow, so batch tasks may take significantly longer when you incorporate AI Assistant. It is also important to watch costs and rate limits when using LLMs in batch mode. Still, it can be a great way to pre-apply known terms to any suggestions, for instance.
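Under the hood, an editing prompt in batch mode amounts to wrapping each MT suggestion in the same instruction for the LLM, once per segment. Here is a minimal sketch of that idea – the function name and prompt wording are my own illustration, not Trados internals:

```python
def build_edit_prompt(source: str, mt_suggestion: str, instruction: str) -> str:
    """Combine a source segment, its raw MT output and the user's
    editing instruction into a single prompt for an LLM."""
    return (
        "You are a translation post-editor.\n"
        f"Source segment: {source}\n"
        f"Draft translation: {mt_suggestion}\n"
        f"Revise the draft as follows: {instruction}\n"
        "Return only the revised translation."
    )

# During pre-translation, the same instruction is applied to every segment:
segments = [("Press the power button.", "Drücken Sie den Netzschalter.")]
prompts = [
    build_edit_prompt(src, mt, "Use formal register (Sie).")
    for src, mt in segments
]
```

Each prompt would then be sent to the configured LLM provider; the point is that one instruction scales across the whole batch instead of being applied segment by segment by hand.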
 

Practical example  

To make this more tangible, I thought I would share a theoretical example of how you could use AI Assistant. Let’s take a look at the exciting new use case mentioned above and assume the following:  
 
In your translation, whenever the term photo printer comes up, the latest style guide from your customer stipulates that you have to use the term Fotodruckmaschine in your translation. So, you have stored that term in your local MultiTerm termbase as the preferred translation. 
 
Normally, you would expect any MT provider to translate photo printer as Fotodrucker – a masculine noun, while the now required term Fotodruckmaschine is feminine. When using this term, surrounding words such as articles also need to change to agree with the new gender, along with any other grammatical adjustments.
 
Now consider this sample sentence, translated with machine translation alone:
 
Allow at least 12 cm clearance from the back of the photo printer for the paper to travel.  
 
The MT provider will come back with the following suggestion:  

Lassen Sie mindestens 12 cm Abstand zur Rückseite des Fotodruckers, damit sich das Papier bewegen kann.  
 
So – as expected, the MT uses Fotodrucker. No surprise there. Your termbase stipulates, however, that the term Fotodruckmaschine should be used. Now let’s use AI Assistant to translate this sentence - MT, terminology and the LLM will now work together in tandem to magically enhance the translation and deliver this suggestion for review:  
 
Lassen Sie mindestens 12 cm Abstand zur Rückseite der Fotodruckmaschine, damit sich das Papier bewegen kann.  
 
This screenshot shows how it works – the LLM picks up the MT suggestion and adapts the terminology, returning the enhanced translation to the Editor: 
 
Screenshot showing the LLM adapting the MT suggestion  
 
Can you imagine the power of this combination in your context? MT struggles with terminology consistency and specialist terms, and this can now be solved by bringing your terminology assets to the table. It was always an obvious use case, but not so easy to solve – at least not without exporting and importing terminology or creating a glossary in the MT provider. Now none of this is necessary anymore: simply ensure a termbase is connected to your project and the LLM will do the rest.
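Conceptually, terminology-aware translation injects the termbase entries recognized in the segment into the LLM request alongside the MT suggestion. The sketch below illustrates the idea using the example above – the prompt wording and structure are assumptions for illustration, not the actual Trados implementation:

```python
def terminology_aware_prompt(source: str, mt_suggestion: str, terms: dict) -> str:
    """Ask an LLM to rework an MT suggestion so that it uses the
    preferred target terms, adjusting grammar (e.g. gender) as needed."""
    term_lines = "\n".join(
        f'- "{src}" must be translated as "{tgt}"' for src, tgt in terms.items()
    )
    return (
        "Revise the draft translation so that it uses the required "
        "terminology, adapting articles and grammatical gender where "
        "necessary. Return only the revised translation.\n"
        f"Source: {source}\n"
        f"Draft: {mt_suggestion}\n"
        f"Required terminology:\n{term_lines}"
    )

prompt = terminology_aware_prompt(
    "Allow at least 12 cm clearance from the back of the photo printer "
    "for the paper to travel.",
    "Lassen Sie mindestens 12 cm Abstand zur Rückseite des Fotodruckers, "
    "damit sich das Papier bewegen kann.",
    {"photo printer": "Fotodruckmaschine"},
)
# The assembled prompt would then be sent to the configured LLM provider.
```

Because the termbase entries are supplied at request time, editing a term (say, to Digitalfotodrucker) changes the next request immediately – no glossary export or retraining involved.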
 
What’s also nice about this is that it is very dynamic. Let’s assume that tomorrow your customer says: “We need to call this Digitalfotodrucker from now on.” Well, all you need to do is edit the term in your termbase and the MT/LLM combination will pick the change up right away.
 

What do I need to access AI Assistant? 

As I mentioned previously, AI Assistant is natively available in Trados Studio 2024 and is powered by an LLM. All you need to access generative translation is your Studio 2024 license and an LLM subscription with your provider of choice – you can currently choose between OpenAI and Azure LLMs, with support for more providers planned for the future.
 

What’s next?  

What we covered today is just one of the many use cases for AI Assistant, but there are many more! Give it a go and let us know how you are using it to transform your localization processes.   
 
And that’s not all! AI Assistant isn’t the only way we’re accelerating innovation in Trados - Studio 2024 is packed full of intelligent AI-enabled features that boost productivity and efficiency and help you stay ahead of the curve, including:  
  • Seamless support for Machine Translation Quality Estimation (MTQE) data, improving translation accuracy and minimizing post-editing efforts.  
  • AI-enabled features, such as Smart Help, which delivers intelligent assistance at the click of a button, making it easier than ever to find the information you need.
To learn more about these, and the numerous other AI features available in our latest release, read our what's new in Studio 2024 brochure.
Daniel Brockmann
Author


Principal Product Manager

Daniel is a Principal Product Manager at RWS focused on Trados Studio. He has previously worked as a training and support specialist, sales engineer and documentation specialist, among other roles required to establish the industry's most popular CAT tool. Today, Daniel focuses on continually delivering productivity capabilities for every stakeholder in the translation process.
